Don't use ChatGPT for financial advice.



ChatGPT and other large language models are good tools for answering multiple-choice tasks that involve strict, precise values. But when an LLM must answer questions that are less precise than multiple-choice questions, it runs into problems. Things like merger contracts are problematic.

Such contracts are always open to different interpretations, and when the data an LLM handles involves many interpretations and much uncertainty, the LLM struggles. Financial information, when we talk about things like the stock market, consists of short numeric data, and an AI can operate on millions of such objects at the same time.

In theory, that should mean the AI cannot make a financial loss. But contracts and juridical texts are different: they depart from the precise world of numbers that the LLMs we call artificial intelligence handle so well.

Problems arise when the AI must find information on topics that are not among the most common ones; it is possible that the data sources it finds are not valid. Another issue is that the terminology of marketing, trade, and the court system is not widely used outside those fields.

That can lead to a situation in which the AI searches the internet for a term like "liquidate", and we know that this word can also mean something that has nothing to do with financial and trade language. Ambiguous words and synonyms can cause problems for the AI.
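To make the ambiguity concrete, here is a minimal Python sketch, not from the article, of how a naive keyword search mixes the financial sense of "liquidate" with its unrelated "kill" sense, and how tagging sources by domain filters out the noise. The toy corpus and domain tags are invented purely for illustration.

```python
# A minimal sketch: a toy corpus and two lookups showing how the
# ambiguity of "liquidate" confuses naive retrieval.
# All documents and domain tags below are invented for illustration.

documents = [
    ("finance", "The court ordered the company to liquidate its assets to pay creditors."),
    ("finance", "The board voted to liquidate the fund after heavy losses."),
    ("fiction", "The villain sent an assassin to liquidate the informant."),
]

def naive_search(term: str) -> list[str]:
    # Plain keyword matching: every sense of the word comes back mixed together.
    return [text for _, text in documents if term in text.lower()]

def domain_search(term: str, domain: str) -> list[str]:
    # Restricting the search to one domain filters out the unrelated senses.
    return [text for d, text in documents if d == domain and term in text.lower()]

print(naive_search("liquidate"))              # returns the "kill" sense too
print(domain_search("liquidate", "finance"))  # only the financial sense
```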

When the AI analyzes some system, the analysis is only as good as the information it can use. The AI requires information that is as trusted and confirmed as what humans rely on. When the AI analyzes court decisions and advises a lawyer on whether to start a court process, it requires all the information about the case and the precedents from similar cases. Without them, the AI probably cannot predict the court's ruling in the case.

The AI is the ultimate tool when it works with specific and exact information. But if that information is manipulated, corrupted, or incomplete, the AI cannot make the right decisions. In its need for confirmed and trusted information, the AI does not differ from humans when it makes decisions.


https://scitechdaily.com/why-you-should-think-twice-before-relying-on-chatgpt-for-financial-advice/

