
The cost and sustainability of generative AI


AI is resource-intensive on any platform, including the public cloud. Most AI technologies require a large amount of inference computation, which drives up demand for processors, networking, and storage, and in turn leads to higher electricity bills, infrastructure costs, and carbon footprints.

The rise of generative AI systems such as ChatGPT has brought these issues back into focus. Given how widely the technology is being adopted, and its likely large-scale use by companies, governments, and the general public, the electricity-consumption growth curve is taking on a worrying arc.

Artificial intelligence technology has been around since the 1970s, but because a mature AI system required enormous resources to run, it initially had little commercial impact. I remember that an AI system I designed in my 20s needed more than $40 million in hardware, software, and data center space just to run. Spoiler alert: that project, like many other AI projects, never got off the ground; the business case was not viable.

Cloud computing changed everything. Things that were once out of reach are now cost-effective enough on the public cloud. In fact, as you might have guessed, the rise of the cloud has roughly coincided with the rise of AI over the past 10 to 15 years. My point is that the two are now tightly coupled.

Sustainability and Cost of Cloud Resources

You really don't need to do much research to predict what will happen here. Demand will surge for AI services such as the generative AI systems now attracting attention, as well as for other AI and machine learning systems. The surge will be led by businesses looking for an innovative edge, such as smart supply chains, and even by thousands of college students hoping a generative AI system will write their term papers.

More demand for AI means more demand for the resources those AI systems use, such as public clouds and the services they provide. That demand is likely to be met by more data centers housing power-hungry servers and network equipment.

Public cloud providers, like any other utility provider, will raise prices as demand increases, just as we see seasonal increases in household electricity bills (also driven by demand). When that happens, we usually cut back on usage, for example setting the air conditioner to 24°C instead of 20°C in the summer.

However, higher cloud computing costs may not have the same effect on businesses. Enterprises may find that these AI systems drive certain key business processes and are therefore indispensable. In many cases they may instead try to save money elsewhere in the business, perhaps by reducing headcount to offset the cost of AI systems. It's no secret that generative AI systems are expected to displace many information workers.

What can be done?

What can we do if the resource demands of running AI systems lead to higher computing costs and carbon output? The answer may lie in finding more efficient ways for AI to use resources such as processing, networking, and storage.

For example, sampling pipelines can speed up deep learning by reducing the amount of data processed. Research from MIT and IBM shows that this approach can reduce the resources required to run a neural network on a large data set. However, it also limits accuracy, which may be acceptable for some business use cases but not all.
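To make the idea concrete, here is a minimal sketch of data subsampling in Python with PyTorch. The placeholder dataset, the 10% sampling fraction, and the variable names are illustrative assumptions, not the MIT/IBM pipeline itself; the point is simply that training on fewer examples consumes proportionally fewer compute resources, at some cost in accuracy.

```python
import random
import torch
from torch.utils.data import DataLoader, Subset, TensorDataset

# Illustrative placeholder dataset: 100,000 examples, 20 features, binary labels.
full_dataset = TensorDataset(torch.randn(100_000, 20),
                             torch.randint(0, 2, (100_000,)))

# Keep only 10% of the examples. Fewer samples means fewer forward/backward
# passes per epoch, which translates into less compute, less energy, and a
# smaller cloud bill.
sample_fraction = 0.10
indices = random.sample(range(len(full_dataset)),
                        int(len(full_dataset) * sample_fraction))
train_loader = DataLoader(Subset(full_dataset, indices),
                          batch_size=256, shuffle=True)

# Training then proceeds as usual on the reduced loader; any accuracy loss is
# the trade-off described above.
for features, labels in train_loader:
    pass  # the model's training step would go here
```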

Another approach, already used in other areas of technology, is in-memory computing. This architecture avoids moving data in and out of memory; instead, AI calculations run directly within the memory module, which speeds processing up significantly.

Other methods are also under development, such as changing the physical processors, using coprocessors for AI calculations to increase speed, or adopting next-generation computing models such as quantum computing. Expect plenty of announcements from the big public cloud vendors about how they are tackling these problems.

What should you do?

My advice is certainly not to avoid AI in order to get a lower cloud computing bill or to save the planet. Artificial intelligence is a foundational computing approach from which most businesses can derive significant value.

I recommend entering any AI-enabled or net-new AI development project with a clear understanding of the cost and the impact on sustainability; the two are directly related. You have to make a cost/benefit choice, and that really comes back to the value you can bring to the business relative to the cost and risk. There is nothing new about that calculation.

I do believe that most of these problems will be solved in innovative ways, whether through in-memory computing, quantum computing, or solutions we have not yet seen. Both AI technology providers and cloud providers are keen to make AI more cost-effective and greener. That is good news.

Source: www.cio.com

