AI everywhere: across the edge and sustainably
Artificial intelligence (AI) is being integrated everywhere, creating transformative opportunities across industries.
One such paradigm shift is the convergence of AI and edge computing, which promotes sustainable solutions and innovative applications.
Enterprises can leverage the rapid development of AI to enable hyper-personalization at scale in customer experience (CX) and to apply predictive analytics that transform how they run their services and operations.
5G networks add complementary benefits, most notably higher bandwidth and lower latency for connected devices and services.
PwC (PricewaterhouseCoopers) has released a report setting out the potential of artificial intelligence to help reduce carbon emissions. According to its analysis, applying AI across the four major fields of agriculture, energy, transportation and water could deliver substantial emissions reductions and economic gains by 2030.
In this way, environmental and economic goals can be aligned, particularly through technological progress. At a macroeconomic and social level, more efficient AI scales with business and economic growth and helps create economic and job growth. At a microeconomic level, lowering the cost of deploying and scaling AI lets businesses expand into new services, products and business models, and enables startups to thrive and scale. Achieving this with lower energy consumption also reduces the carbon footprint.
In addition, a group of leading AI scientists has noted that machine learning can help combat climate change across electricity systems, industry, transportation, construction, smart grids, disaster management and other domains. These challenges underscore the importance of scaling up AI efficiently, combining cost and environmental benefits. Energy efficiency is key on both fronts.
The emergence of generative AI, often powered by large language models (LLMs), has sparked a craze. These models employ transformers and self-attention mechanisms, often combined with deep reinforcement learning, to optimize their responses. They are computationally expensive to train and serve, with significant hardware requirements, energy costs and carbon footprints, so reducing those costs is central to deploying them at scale.
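For readers unfamiliar with the mechanism mentioned above, here is a minimal sketch of scaled dot-product self-attention, the core building block of transformers, written in Python with PyTorch. The function name, tensor shapes and weight matrices are illustrative assumptions, not any particular model's implementation.

```python
import math
import torch

def self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d_model); w_q, w_k, w_v: (d_model, d_head) projection matrices."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v          # project tokens to queries/keys/values
    scores = (q @ k.T) / math.sqrt(k.shape[-1])  # pairwise token affinities, scaled
    weights = torch.softmax(scores, dim=-1)      # each row is an attention distribution
    return weights @ v                           # weighted mixture of value vectors

# Tiny usage example: 4 tokens with 8-dimensional embeddings.
d_model = 8
x = torch.randn(4, d_model)
w_q, w_k, w_v = (torch.randn(d_model, d_model) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)    # torch.Size([4, 8])
```

Every token attends to every other token, which is what makes these models powerful but also what drives their quadratic compute cost as sequences grow.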
"Smart" has typically meant devices that are connected to the Internet. Connected devices are becoming genuinely smarter, however, as AI is embedded locally on the devices themselves, such as AI-enabled PCs. In this context, intelligence means the ability to respond meaningfully to the user and personalize the experience, rather than human-level intelligence.
As the Internet of Things scales, applications will demand ultra-low latency, driving the growth of edge computing, which in turn enables real-time responses.
As mentioned above, AI will increasingly run at the edge of the network (edge computing, or simply the edge), where data is processed closer to where it is generated, sometimes on the device itself. This keeps latency very low, delivering real-time responses to users.
The cloud model will continue to apply to data centers, providing essential resources and capacity for storing historical data for analysis. It also enables ongoing algorithm development through a hybrid model, in which AI models are trained on cloud servers and inference runs at the edge, opening further potential for personalization at scale, as sketched below.
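The following is a minimal sketch of that hybrid pattern, assuming PyTorch: a small model is trained "in the cloud" on historical data, then quantized and exported for low-latency inference at the edge. The model architecture, training data and file name are placeholders, not a recommendation for any specific deployment.

```python
import torch
import torch.nn as nn

# 1. "Cloud" stage: train a small model on historical data (stand-in data here).
model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(256, 16)              # placeholder for historical training data
y = torch.randint(0, 2, (256,))
for _ in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

# 2. Edge preparation: dynamic quantization shrinks the model and cuts
#    inference cost, which matters for energy use on constrained devices.
model.eval()
edge_model = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)
torch.save(edge_model.state_dict(), "edge_model.pt")  # artifact shipped to devices

# 3. "Edge" stage: low-latency inference close to where the data is generated.
with torch.no_grad():
    sample = torch.randn(1, 16)       # placeholder for a local sensor reading
    prediction = edge_model(sample).argmax(dim=1)
    print(prediction.item())
```

The design point is the split itself: heavy, energy-intensive training stays in the data center, while the lightweight quantized model answers locally in real time.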