
Experts warn: Artificial intelligence may trigger the next financial crisis

WBOY · 2023-08-22

According to an August 15 report on the Spanish "Confidential" website, artificial intelligence (AI) is rapidly changing our society, and its impact on the economy is still partly unknown. Some, however, have warned that the risk of these machine-learning algorithms causing a new financial collapse is real. U.S. Securities and Exchange Commission Chairman Gary Gensler has issued the same warning, saying that "artificial intelligence will be at the core of future financial crises, and regulators will face huge challenges."

In an interview with Bloomberg, Gensler, speaking as head of the regulator, said that the biggest risk AI poses to financial markets is black-box algorithms: systems that run as closed boxes whose internal workings users cannot inspect, which could lead to a loss of control and ultimately trigger a market crash.

Gensler has warned for years that artificial intelligence could pose huge dangers if it is not brought under control in time. In 2020, while still a professor at MIT, he warned in an article that deep learning (a branch of machine learning and thus of artificial intelligence) poses risks to financial stability, and that existing regulatory mechanisms are not ready to deal with this potential threat.

He pointed out that homogeneity in financial markets is one of the dangers of artificial intelligence. It can arise from the so-called "apprenticeship effect", which occurs when the people running these systems come from the same schools, share the same background and have developed a strong affinity with one another. "Few people are trained to build and manage these models, and they often come from similar backgrounds," Gensler explained in the study.

Regulatory controls on AI could themselves create risk, by increasing the likelihood that most companies end up using the same AI models from a handful of providers.

AI also evolves unpredictably, and its behavior is hard to understand, which creates another major risk: discriminatory behavior toward humans. For example, an AI that was not racist yesterday may become racist today without anyone being able to detect and stop it in time. Moreover, its rules are opaque and hard to control, and the lack of retrospective oversight leaves regulators virtually powerless to prevent future disasters.

Gensler warned that models built on the same data sets tend to produce highly correlated predictions, and that this uniformity could trigger herd behavior, as the sketch below illustrates. He also believes that the demand for vast data sources could lead to monopolies that become a "single point of failure" capable of bringing down the entire system.
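To illustrate the correlation point, here is a minimal sketch (not from the article; it uses synthetic data and scikit-learn, and the model choices are arbitrary assumptions): two models of different types trained on the same data typically produce nearly identical out-of-sample predictions, so in a market setting they would tend to buy and sell the same things at the same time.

```python
# Illustrative sketch only: two different model types fed the same training data
# end up making highly correlated predictions, the kind of homogeneity that
# Gensler describes as a herding risk.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.model_selection import train_test_split

# Synthetic "market signal" data shared by both models.
X, y = make_regression(n_samples=2000, n_features=10, noise=5.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Two independently built models, trained on the identical data set.
model_a = GradientBoostingRegressor(random_state=1).fit(X_train, y_train)
model_b = RandomForestRegressor(random_state=2).fit(X_train, y_train)

pred_a = model_a.predict(X_test)
pred_b = model_b.predict(X_test)

# Correlation of out-of-sample predictions: typically very close to 1.0,
# meaning both "firms" would act on almost the same signals at the same time.
print("prediction correlation:", np.corrcoef(pred_a, pred_b)[0, 1])
```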

He also pointed out that even the largest data sets do not yet span a complete financial cycle, because internet usage records, handheld-device data, telematics data, GPS data and smartphone data simply have not been collected for long enough.

Gensler believes that, for supervision to be effective, the capital requirements imposed on financial institutions that use artificial intelligence tools should be raised. Regulators could also require testing of AI systems and restrict or prohibit companies from taking actions their models cannot explain, even though such measures could slow growth. However, he warned that even with these steps it would be difficult to avoid an increase in systemic risk; even if such regulatory rules were implemented, they would still not be up to the task.


(Source: Reference News)


Statement: This article is reproduced from sohu.com. If there is any infringement, please contact admin@php.cn for deletion.