FinGPT

AI4Finance-Foundation
About FinGPT

We should NOT expect Wall Street to open-source LLMs or open up APIs, due to FinTech institutions' internal regulations and policies.

We democratize Internet-scale data for financial large language models (FinLLMs) at FinNLP and the FinNLP website.

Blueprint of FinGPT

Disclaimer: We are sharing code for academic purposes under the MIT education license. Nothing herein is financial advice, nor a recommendation to trade real money. Please use common sense and always consult a professional first before trading or investing.


Why FinGPT?

1). Finance is highly dynamic. BloombergGPT retrained an LLM on a mixed dataset of financial and general data sources, which is too expensive (1.3M GPU hours, at a cost of around $5M). It is costly to retrain an LLM every month or every week, so lightweight adaptation is highly favorable in finance. Instead of undertaking the costly and time-consuming process of retraining a model from scratch with every significant change in the financial landscape, FinGPT can be fine-tuned swiftly to align with new data (the cost of adaptation falls significantly, estimated at less than $416 per training run).
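The lightweight adaptation above rests on parameter-efficient methods such as LoRA (used throughout the FinGPT v1–v3 demos below). A minimal numerical sketch, with purely illustrative shapes: instead of updating a full d×d weight matrix, LoRA learns a rank-r update B·A, so only 2·d·r parameters are trained while the pretrained weight stays frozen.

```python
import numpy as np

# Sketch of LoRA (Low-Rank Adaptation): the frozen pretrained weight W is
# augmented with a trainable low-rank update B @ A of rank r << d.
# d and r here are illustrative, not chatglm2/llama2 values.
rng = np.random.default_rng(0)
d, r = 4096, 8

W = rng.standard_normal((d, d))          # frozen pretrained weight
A = rng.standard_normal((r, d)) * 0.01   # trainable down-projection
B = np.zeros((d, r))                     # trainable up-projection (zero init => no change at start)

def adapted_forward(x):
    """Forward pass with the LoRA update: (W + B @ A) @ x."""
    return W @ x + B @ (A @ x)

full_params = d * d
lora_params = 2 * d * r
print(f"trainable fraction: {lora_params / full_params:.4%}")  # ~0.39% of full fine-tuning
```

Training only this small fraction of parameters is what makes weekly or monthly re-adaptation affordable compared with full retraining.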

2). Democratizing Internet-scale financial data is critical, which should allow timely updates (monthly or weekly updates) using an automatic data curation pipeline. But, BloombergGPT has privileged data access and APIs. FinGPT presents a more accessible alternative. It prioritizes lightweight adaptation, leveraging the strengths of some of the best available open-source LLMs, which are then fed with financial data and fine-tuned for financial language modeling.
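To make the "automatic data curation pipeline" concrete, here is a hypothetical sketch of one step: deduplicating scraped headlines and keeping only items inside the current update window. The field names (`headline`, `published`) and the function are illustrative assumptions, not the actual FinNLP pipeline.

```python
from datetime import datetime, timezone

def curate(articles, since):
    """Drop duplicate headlines (case-insensitive) and items older than `since`."""
    seen, kept = set(), []
    for art in articles:
        key = art["headline"].strip().lower()
        if key in seen or art["published"] < since:
            continue  # skip duplicates and stale items
        seen.add(key)
        kept.append(art)
    return kept

articles = [
    {"headline": "Fed holds rates steady",  "published": datetime(2023, 7, 26, tzinfo=timezone.utc)},
    {"headline": "Fed Holds Rates Steady ", "published": datetime(2023, 7, 26, tzinfo=timezone.utc)},
    {"headline": "Q1 earnings recap",       "published": datetime(2023, 4, 1, tzinfo=timezone.utc)},
]
fresh = curate(articles, since=datetime(2023, 7, 1, tzinfo=timezone.utc))
print(len(fresh))  # → 1
```

Running such a step on a weekly or monthly schedule is what allows timely updates without privileged data access.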

3). The key technology is RLHF (Reinforcement Learning from Human Feedback), which is missing in BloombergGPT. RLHF enables an LLM to learn individual preferences (risk-aversion level, investing habits, personalized robo-advising, etc.), which is the "secret" ingredient of ChatGPT and GPT-4.
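The first stage of RLHF trains a reward model on human preference pairs. A minimal sketch of its core, the Bradley–Terry pairwise loss L = −log σ(r_chosen − r_rejected), with plain scalars standing in for reward-model outputs (in practice these come from an LLM head scoring full responses):

```python
import math

def preference_loss(r_chosen, r_rejected):
    """Bradley-Terry preference loss: -log(sigmoid(r_chosen - r_rejected))."""
    return -math.log(1.0 / (1.0 + math.exp(-(r_chosen - r_rejected))))

# A higher reward for the human-preferred answer yields a lower loss, which
# is how the model absorbs individual preferences (e.g. a user's risk level).
print(preference_loss(2.0, 0.5))  # small loss: ranking already correct
print(preference_loss(0.5, 2.0))  # large loss: ranking is inverted
```

The learned reward model then steers the LLM's policy in the reinforcement-learning stage.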

FinGPT Demos

  • FinGPT V3 (Updated on 8/4/2023)

    - The FinGPT v3 series are LLMs fine-tuned with the LoRA method on news and tweets sentiment-analysis data, achieving the best scores on most financial sentiment-analysis datasets.

    - FinGPT v3.1 uses chatglm2-6B as the base model; FinGPT v3.2 uses llama2-7b as the base model.

    • Benchmark Results:

      | Weighted F1 | BloombergGPT | ChatGLM2 | Llama2 | FinGPT v3.1 | v3.1.1 (8bit) | v3.1.2 (QLoRA) | FinGPT v3.2 |
      | ----------- | ------------ | -------- | ------ | ----------- | ------------- | -------------- | ----------- |
      | FPB | 0.511 | 0.381 | 0.390 | 0.855 | 0.855 | 0.777 | 0.850 |
      | FiQA-SA | 0.751 | 0.790 | 0.800 | 0.850 | 0.847 | 0.752 | 0.860 |
      | TFNS | - | 0.189 | 0.296 | 0.875 | 0.879 | 0.828 | 0.894 |
      | NWGI | - | 0.449 | 0.503 | 0.642 | 0.632 | 0.583 | 0.636 |
      | Devices | 512 × A100 | 64 × A100 | 2048 × A100 | 8 × A100 | 8 × A100 | 8 × A100 | 8 × A100 |
      | Time | 53 days | 2.5 days | 21 days | 8 hours | 6.47 hours | 4.15 hours | 8 hours |
      | Cost | $2.67 million | $14,976 | $4.23 million | $262.4 | $212.2 | $136.12 | $262.4 |

**Cost per GPU hour.** For A100 GPUs, the AWS p4d.24xlarge instance, equipped with 8 A100 GPUs, is used as the benchmark to estimate costs (note that BloombergGPT also used p4d.24xlarge). As of July 11, 2023, the hourly rate for this instance stands at $32.77. Consequently, the estimated cost per GPU hour comes to $32.77 divided by 8, approximately **$4.10**. With this value as the reference unit price (1 GPU hour), the estimated BloombergGPT cost is 512 GPUs × 53 days × 24 hours = 651,264 GPU hours × $4.10 = $2,670,182.40.
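The same arithmetic can be applied to every column of the benchmark table. A short sketch using the $4.10 rounded unit price from above (the helper function is ours, for illustration):

```python
def training_cost(gpus, days, price=4.10):
    """Estimated cost = GPUs × days × 24 h × price per GPU-hour ($4.10 ≈ $32.77 / 8)."""
    return gpus * days * 24 * price

bloomberg = training_cost(512, 53)    # 651,264 GPU hours
fingpt_v3 = training_cost(8, 8 / 24)  # 8 GPUs for 8 hours
print(f"${bloomberg:,.2f}")  # → $2,670,182.40
print(f"${fingpt_v3:,.2f}")  # → $262.40
```

This is the roughly four-orders-of-magnitude cost gap between retraining from scratch and lightweight adaptation.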

    - Reproduce the results by running benchmarks; a detailed tutorial is on the way.
    - Fine-tune your own FinGPT v3 model with the LoRA method on only an RTX 3090 with this notebook in 8bit or this notebook in int4 (QLoRA).

  • FinGPT V2

    - Let's train our own FinGPT in the American Financial Market with LLaMA and LoRA (Low-Rank Adaptation)

  • FinGPT V1

    - Let's train our own FinGPT in the Chinese Financial Market with ChatGLM and LoRA (Low-Rank Adaptation)
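A back-of-envelope sketch of why LoRA fine-tuning of a ~6B-parameter model fits on a single RTX 3090 (24 GB). The numbers below (28 layers, hidden size 4096, rank 8, adapters on the query/key/value projections) are illustrative assumptions, not the exact chatglm2-6B configuration:

```python
# Frozen base weights quantized to 8-bit (1 byte/param); LoRA adapters in fp32.
BYTES_8BIT = 1
hidden, layers, rank = 4096, 28, 8

base_params = 6_000_000_000
lora_params = layers * 3 * (2 * hidden * rank)   # q, k, v adapters per layer

base_gb = base_params * BYTES_8BIT / 1e9
lora_gb = lora_params * 4 / 1e9
print(f"frozen base (8-bit): ~{base_gb:.1f} GB")                          # → ~6.0 GB
print(f"LoRA adapters: {lora_params:,} params (~{lora_gb * 1000:.0f} MB)")
```

The frozen quantized base dominates memory, while only a few million adapter parameters need gradients and optimizer state, leaving headroom for activations within 24 GB.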

Understanding FinGPT: An Educational Blog Series

Visit Official Website

https://github.com/AI4Finance-Foundation/FinGPT
