If it cannot support its future vision, will artificial intelligence once again usher in a 'winter'?

Since Alan Turing first posed the question "Can machines think?" in his seminal 1950 paper "Computing Machinery and Intelligence," the development of artificial intelligence has not been smooth sailing, and its goal of "artificial general intelligence" has yet to be achieved.


Even so, the field has made remarkable progress: IBM's Deep Blue defeated the world's best chess player, self-driving cars were born, and Google DeepMind's AlphaGo defeated the world's best Go player. These achievements represent the best research and development results of the past 65 years.

It is worth noting that there were well-documented "AI winters" during this period, which almost completely overturned people's early expectations for artificial intelligence.

One of the factors leading to an AI winter is the gap between hype and actual fundamental progress.

In the past few years, there has been speculation that another artificial intelligence winter may be coming. So what factors may trigger an artificial intelligence ice age?

Cyclic Fluctuations in Artificial Intelligence

An "AI winter" refers to a period in which public interest in artificial intelligence fades and commercial and academic investment in these technologies tapers off.

Artificial intelligence developed rapidly in the 1950s and 1960s, but although there were many advances, they mostly remained academic.

In the early 1970s, people’s enthusiasm for artificial intelligence began to fade, and this gloomy period lasted until around 1980.

During this AI winter, efforts dedicated to developing human-like intelligence in machines began to lose funding.


In the summer of 1956, a group of mathematicians and computer scientists occupied the top floor of the building that housed the Department of Mathematics at Dartmouth College.

For eight weeks, they imagined a new field of research together.

John McCarthy, then a young professor at Dartmouth, coined the term "artificial intelligence" while drafting the proposal for the workshop.

He believed the workshop should explore the hypothesis that "every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it."

At that meeting, researchers roughly sketched out what we know as artificial intelligence today.

The meeting gave birth to the first camp of artificial intelligence scientists. "Symbolism" is an approach to simulating intelligence based on logical reasoning, also known as logicism, the psychological school, or the computer school. Its principles rest mainly on the physical symbol system hypothesis and the principle of bounded rationality, and it long dominated artificial intelligence research.

Its expert systems reached their peak in the 1980s.

In the years after the conference, "connectionism" emerged, attributing human intelligence to the high-level activity of the human brain and emphasizing that intelligence arises from large numbers of simple units running in parallel through complex interconnections.

Starting from neurons, it studies neural network models and brain models, opening up another path for the development of artificial intelligence.

The two approaches have long been considered mutually exclusive, with both sides believing they are on the road to general artificial intelligence.

Looking back over the decades since that meeting, we can see that AI researchers' hopes were often dashed, yet these setbacks did not stop them from developing AI.

Today, even though artificial intelligence is bringing revolutionary changes to industries and has the potential to disrupt the global labor market, many experts are still wondering whether today’s artificial intelligence applications have reached their limits.

As Charles Choi describes in "7 Revealing Ways AIs Fail," the weaknesses of today's deep learning systems are becoming increasingly apparent.

Even so, researchers are not pessimistic about the future of artificial intelligence. We may yet face another AI winter in the near future.

But this may be the moment when inspired artificial intelligence engineers finally lead us into the eternal summer of machine thinking.

An article titled "AI Winter is Coming" by Filip Piekniewski, an expert in computer vision and artificial intelligence, sparked heated discussion online.

The article mainly criticizes the hype around deep learning, arguing that the technology is far from revolutionary and is running into development bottlenecks.

Major companies' interest in artificial intelligence is in fact cooling, and another AI winter may be on the way.

Will the artificial intelligence winter come?

Since 1993, the field of artificial intelligence has made increasingly impressive progress.

In 1997, IBM's Deep Blue became the first computer chess player to defeat reigning world chess champion Garry Kasparov.

In 2005, a Stanford autonomous vehicle drove 131 miles along a desert road without human intervention, winning the DARPA Grand Challenge.

In early 2016, Google DeepMind's AlphaGo defeated the world's best Go player.

In the past twenty years, everything has changed.

In particular, with the booming development of the Internet, the AI industry now has enough image, audio, video, and other data to train neural networks and apply them widely.

But deep learning's ever-expanding success relies on adding more layers to neural networks and on increasing the GPU time used to train them.
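
To make this concrete, here is a minimal Python sketch of why depth is expensive. The layer widths, dataset size, and training schedule below are illustrative assumptions rather than figures from any real system; the point is simply that adding layers inflates both the parameter count and a rough estimate of the multiply-accumulate operations needed for training, which is a crude proxy for GPU time.

```python
# Rough, illustrative estimate of how network depth inflates training cost.
# All widths, dataset sizes, and epoch counts below are made-up assumptions.

def mlp_params(widths):
    """Count weights + biases in a plain fully connected network."""
    return sum(w_in * w_out + w_out for w_in, w_out in zip(widths[:-1], widths[1:]))

def training_macs(widths, examples, epochs):
    """Very rough multiply-accumulate count: ~3x the forward pass
    (forward + backward) per example, times the number of examples seen."""
    forward = sum(w_in * w_out for w_in, w_out in zip(widths[:-1], widths[1:]))
    return 3 * forward * examples * epochs

if __name__ == "__main__":
    examples, epochs = 1_000_000, 10  # hypothetical dataset and schedule
    for hidden in (2, 8, 32):
        widths = [784] + [1024] * hidden + [10]
        print(f"{hidden:>2} hidden layers: {mlp_params(widths):>12,} parameters, "
              f"~{training_macs(widths, examples, epochs):.2e} training MACs")
```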

According to an analysis by the AI research firm OpenAI, the computing power required to train the largest AI systems used to double roughly every two years, but now doubles every three to four months.

As Neil C. Thompson and colleagues write in "Deep Learning's Diminishing Returns," many researchers worry that AI's demand for computing power is on an unsustainable trajectory.

A common problem in early artificial intelligence research was a severe lack of computing power: researchers were limited by hardware rather than by human intelligence or ability.

Over the past 25 years, as computing power has increased dramatically, so have our advances in artificial intelligence.

However, in the face of surging data volumes and increasingly complex algorithms, roughly 20 ZB of new data is generated globally every year, and demand for AI computing power grows about tenfold each year, a rate that far exceeds the performance-doubling cycle of Moore's Law.

We are approaching the theoretical physical limit of the number of transistors that can be installed on a chip.

For example, Intel is slowing the pace at which it introduces new chip manufacturing technologies because it is difficult to keep shrinking transistors while keeping costs down. In short, the end of Moore's Law is approaching.
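
The gap between these growth rates compounds quickly. The short sketch below makes the divergence concrete; the time horizon and the normalization are arbitrary, and the growth figures (a two-year hardware doubling versus roughly tenfold annual demand growth) are simply the rates cited above, used here for illustration only.

```python
# Illustrative comparison: Moore's-Law-style hardware growth vs. the
# roughly 10x-per-year growth in AI training-compute demand cited above.
# The horizon and normalization (both curves start at 1x) are assumptions.

MOORES_LAW_DOUBLING_YEARS = 2.0   # performance doubles about every two years
AI_DEMAND_GROWTH_PER_YEAR = 10.0  # demand grows ~10x per year

def hardware_growth(years):
    return 2.0 ** (years / MOORES_LAW_DOUBLING_YEARS)

def demand_growth(years):
    return AI_DEMAND_GROWTH_PER_YEAR ** years

if __name__ == "__main__":
    print(f"{'year':>4} {'hardware (x)':>14} {'demand (x)':>16} {'shortfall (x)':>16}")
    for year in range(7):
        hw, demand = hardware_growth(year), demand_growth(year)
        print(f"{year:>4} {hw:>14,.1f} {demand:>16,.1f} {demand / hw:>16,.1f}")
```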


Photo source: Ray Kurzweil, DFJ

There are some short-term solutions that will ensure the continued growth of computing power and thus the advancement of artificial intelligence.

For example, in mid-2017, Google announced that it had developed a specialized artificial intelligence chip called "Cloud TPU" that is optimized for the training and execution of deep neural networks.

Amazon is developing its own chip for Alexa, its AI personal assistant, and many startups are trying to adapt chip designs for specialized AI applications.

However, these are only short-term solutions.

What happens when we run out of options for optimizing traditional chip design? Will we see another AI winter? The answer is yes, unless quantum computing can surpass classical computing and provide a more solid foundation.

So far, however, a quantum computer that achieves "quantum supremacy" and outperforms classical computers does not yet exist.

If we reach the limits of classical computing power before true "quantum supremacy" arrives, I am afraid another AI winter lies ahead.

The problems that artificial intelligence researchers are grappling with are increasingly complex and are pushing us toward realizing Alan Turing's vision of artificial general intelligence. However, there is still a lot of work to be done.

At the same time, we are unlikely to realize the full potential of artificial intelligence without the help of quantum computing.

No one can say for sure whether an AI winter is coming.

However, it is important to be aware of the potential risks and pay close attention to the signs so that we can be prepared when it does happen.
