The Risks and Rewards of ChatGPT Becoming a VC Tool

Though it’s relatively new, ChatGPT is already changing the way nearly every sector does business, and finance is no exception.

Projections suggest that generative AI will have a hand in producing 10% of all global data by 2025, and that by 2030 it could be responsible for a 26% boost to global GDP.

Generative AI has already made its way into finance and investment: first with the launch of BloombergGPT to assist with tasks like sentiment analysis and other financial language work, and then with companies testing generative AI’s fraud-detection and product development capabilities.

The explosive popularity of ChatGPT and generative AI means prompt engineering is quickly becoming its own skilled profession and a valuable resume asset. Since generative AI may very well be an integral part of finance and investment moving forward, we’re likely to see investment experts shift into a role that includes prompt engineering. 

This means it’s time to debug the system and examine the potential risks and rewards so the industry can prepare for this change. Here are some to think about when it comes to ChatGPT and its role in the finance industry.

The risks

1. It could cause more market volatility

The financial markets are volatile enough without the added chaos of overreliance on AI. Picture this scenario: ChatGPT reads some bold or provocative Twitter posts from one or more VCs and folds that information into its market analysis.

If a majority of finance professionals start relying too heavily on this tool for market analysis, it’s easy to see how fast things could spiral out of control. Tech can make even the best of us lazy, and if the finance sector doesn’t maintain independent analysis methods (which keep everyone from forming a hive mind), misinformation can spread like wildfire. Worst-case scenario? Markets collapse under bad recommendations because everyone is giving out the same flawed advice.

That said, financial advisers and hedge fund managers are already using ChatGPT, and this isn’t necessarily a bad thing. When used as part of an expert’s overall “investment toolbox,” ChatGPT is great for generating automated investment strategies, identifying trends and analyzing data. Because it can examine massive data sets in relatively little time, ChatGPT can potentially help financial managers make better market predictions, as long as they treat its output as a supplement to their own methods.

2. We would need a tight control framework

ChatGPT is an incredibly powerful tool, and it will only become more so as it develops. This means that every industry, including finance, must develop a system of checks and balances for its use.

Currently, many firms that deal with sensitive data, including JPMorgan Chase, Amazon, Goldman Sachs and Accenture, have outright banned the use of ChatGPT, while others, like Citadel, have welcomed it. Both sides agree that there must be guidance on how and when it’s acceptable to use ChatGPT tools, particularly since data leaks have already been reported.

For now, corporations and countries alike are more or less on their own when it comes to deciding how to handle generative AI regulations. Many businesses are scrambling to consult with IT, legal and compliance departments to figure out how to proceed. Italy has even issued a temporary ban, citing GDPR violations, and Germany is studying this prospect closely. 

Regulators are already well behind in formulating a legal framework for this technology (maybe they should just ask ChatGPT to write its own guidelines to speed up the process? I wonder how it handles the conflict of interest). But strict controls will need to be in place before the technology sees widespread adoption. This is especially crucial for industries like health care and finance, where cybersecurity is paramount.

3. It could cause more stress for professional investment managers and flood the market with unskilled players

One of the defining features of ChatGPT is that it can only generate outputs based on historical inputs. This means it lacks the human skill and nuance needed to anticipate future events or provide context for data sets.

For instance, a NerdWallet investment reporter asked ChatGPT some financial planning questions about things like taxes and retirement, then asked a seasoned adviser to weigh in on the answers he received. In every instance, ChatGPT was, at best, partially correct, and it consistently left out important pieces of the puzzle that could make a big difference to someone who isn’t a professional and wouldn’t know what was missing.

A two-pronged problem

There are two significant issues that will likely arise when people start getting financial advice from AI.

First, it will likely cause some people to think they “know better” than their investment adviser because ChatGPT has told them something different. There is still a general notion that artificial intelligence is all-powerful and better than human intelligence, even though experts disagree.

Because of this belief, it may be hard for the average person to understand that ChatGPT’s advice is not as accurate or reliable as they think. Professionals may have to argue with clients to keep them from making half-informed choices, which can be stressful and frustrating. 

The second problem is an inevitable influx of amateurs playing at being pros. There are plenty of DIY investors out there with some knowledge of financial markets and a bit of spare lunch money, but they don’t have the training, experience or true expertise to act as a financial planner for other people, or even to make consistently sound investment decisions for themselves.

Unfortunately, ChatGPT will give some percentage of these amateurs an inflated sense of competence because they will mistakenly believe they can rely on the AI tool to be their primary base of knowledge. They will flood the market, branding themselves as professionals and likely causing short-term chaos. 

The upside here? People will quickly get tired of hearing trite generalizations and regurgitations of basic advice they could have found online. The market will eventually correct itself, and seasoned advisers will be the only ones left who can offer real help. 

The rewards

The good news is there are some things we can gain from ChatGPT and its future iterations. Learning a bit of prompt engineering can save time, improve productivity and even give better insights than traditional methods.  

For instance, a recent study found that using ChatGPT as part of a larger decision-making process can improve performance when it comes to predicting stock market movements.
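
To make that concrete, here is a minimal sketch of the kind of prompt-driven headline scoring an analyst might layer onto an existing process. It assumes the OpenAI Python SDK, an OPENAI_API_KEY environment variable and a placeholder model name; the prompt and headline are illustrative only, not the cited study’s actual methodology.

```python
# Minimal sketch: label a news headline's likely effect on a stock via a chat model.
# Assumptions: OpenAI Python SDK installed, OPENAI_API_KEY set, placeholder model name.
import os
from openai import OpenAI

client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

PROMPT = (
    "Answer YES if this headline is good news for the company's stock price, "
    "NO if it is bad news, or UNKNOWN if you cannot tell. "
    "Give the one-word label first, then a one-sentence reason.\n\nHeadline: {headline}"
)

def score_headline(headline: str) -> str:
    """Ask the model for a rough sentiment label on a single news headline."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name; use whatever your account can access
        temperature=0,        # keep the labels as repeatable as possible
        messages=[{"role": "user", "content": PROMPT.format(headline=headline)}],
    )
    return response.choices[0].message.content.strip()

# Hypothetical headline, for illustration only
print(score_headline("Acme Corp beats quarterly earnings estimates and raises guidance"))
```

Even here, the label is a supplement to an analyst’s own view, not a trading signal on its own.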

It also has great potential to make the workload of finance professionals much less tedious. AI is well suited to automating tasks like data entry, report generation and model creation, so letting it take over those routine parts of the job can free professionals to focus on the more value-adding, expertise-driven parts of their work, something Morgan Stanley is already testing.
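
As a rough illustration of the report-generation point, here is a sketch that turns a small, made-up portfolio snapshot into a draft client summary. It assumes the same OpenAI Python SDK setup and placeholder model name as above, and it is not a description of any firm’s actual workflow.

```python
# Rough sketch: draft a plain-English portfolio summary from structured data.
# Assumptions: OpenAI Python SDK installed, OPENAI_API_KEY set, placeholder model name.
import json
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Made-up portfolio snapshot; in practice this would come from your own systems.
positions = [
    {"name": "Fund A", "weight": 0.40, "quarterly_return": 0.06},
    {"name": "Fund B", "weight": 0.35, "quarterly_return": -0.02},
    {"name": "Fund C", "weight": 0.25, "quarterly_return": 0.01},
]

prompt = (
    "Draft a short, plain-English quarterly summary of the portfolio below for a client. "
    "Mention the best and worst performers and keep it under 120 words.\n\n"
    + json.dumps(positions, indent=2)
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)  # a human still reviews the draft before it goes anywhere
```

The point is not that the draft is ready to send, but that a professional starts from a reasonable first pass instead of a blank page.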

Ultimately, generative AI is still in its infancy, so we don’t yet know the scope of what it will be able to do for the finance industry in the long run. However, for investors and finance professionals who already have a strong knowledge base, ChatGPT can be a useful tool.


Mikhail Taver is a career investor, with field-specific education and deep expertise in investments and strategic consulting. He is the founder and managing partner of Delaware-based Taver Capital, an international venture capital fund focused on investing in global artificial intelligence companies. In 20 years of top executive roles with major financial groups and industrial companies, Taver has closed over 250 M&A and private equity deals. He holds CFA, ACMA and CGMA certifications.

Originally published May 12, 2023