Why edge computing and artificial intelligence strategies must complement each other
Many enterprises have begun exploring edge computing use cases because of its ability to push computing power closer to data sources and closer to end users. At the same time, they may be exploring or implementing artificial intelligence and machine learning, having recognized their ability to automate discovery and surface data-driven insights. But enterprises that don't proactively combine their edge and AI strategies will miss out on the transformative potential of both.
There are clear signs that edge computing and data analytics are converging. Industry forecasts suggest that edge data creation will grow 33% annually through 2025, by which point it will account for more than one-fifth of all data, and that by 2023 data-analytics professionals would devote more than half of their effort to creating and analyzing edge data. Many leaders rate edge solutions as very or extremely important to achieving their organization's mission, and 78% believe the edge will have the greatest impact on AI and ML.
Traditionally, enterprises have had to move remote data to data centers or commercial clouds for analysis and value extraction. In edge environments this is increasingly challenging: data volumes are growing, network access is limited or nonexistent, and decisions increasingly need to be made in real time.
Today, however, the growing availability of compact, low-power chipsets, high-density compute and storage, and mesh networking technologies has laid the foundation for enterprises to deploy AI workloads closer to where data is produced.
To enable edge AI use cases, start by identifying where near-real-time decisions on data would significantly enhance the user experience and advance mission goals. We are seeing a growing number of edge use cases built around next-generation deployable field kits that support law enforcement, cybersecurity, and health investigations. Where investigators once collected data for later processing, these new kits include advanced tools for processing and exploring data in the field.
Next, determine whether large volumes of edge data actually need to be transmitted. If the data can be processed at the remote location, only the results need to be transferred. By moving only a small fraction of the data, you free up bandwidth, reduce costs, and speed decision-making.
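A minimal sketch of this pattern, assuming a hypothetical sensor feed and an illustrative anomaly threshold: raw readings are aggregated locally, and only a compact summary crosses the network.

```python
import json
import statistics

def summarize_window(readings: list[float]) -> dict:
    """Reduce a window of raw sensor readings to a compact result."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
        # Threshold of 100.0 is illustrative; a real deployment would tune it.
        "anomaly_count": sum(1 for r in readings if r > 100.0),
    }

def process_at_edge(raw_readings: list[float]) -> bytes:
    """Run analysis on-site; only the summary is serialized for transmission."""
    summary = summarize_window(raw_readings)
    return json.dumps(summary).encode("utf-8")

# Thousands of raw readings stay local; only a tiny payload is uplinked.
payload = process_at_edge([98.6, 99.1, 142.0, 97.8] * 1000)
print(len(payload), "bytes transmitted instead of the full raw stream")
```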
Leverage loosely coupled edge components to assemble the necessary computing power. A single sensor cannot do the processing on its own, but high-speed mesh networks allow nodes to be connected so that some handle data collection while others handle processing, and so on. ML models can even be retrained at the edge to maintain prediction accuracy over time.
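As a sketch of on-site retraining, scikit-learn's SGDClassifier supports incremental updates via partial_fit, so a model deployed to an edge node can keep learning from locally collected, labeled samples without shipping raw data off-site. The feature shapes and labels below are illustrative stand-ins.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# Model originally trained at the core and shipped to the edge node.
model = SGDClassifier(loss="log_loss")
classes = np.array([0, 1])  # label space must be declared on the first partial_fit

# First batch initializes the model (random data stands in for real features).
X_init = np.random.rand(64, 8)
y_init = np.random.randint(0, 2, 64)
model.partial_fit(X_init, y_init, classes=classes)

def retrain_on_edge(X_batch: np.ndarray, y_batch: np.ndarray) -> None:
    """Update the model incrementally; raw samples never leave the node."""
    model.partial_fit(X_batch, y_batch)

retrain_on_edge(np.random.rand(16, 8), np.random.randint(0, 2, 16))
print("accuracy on a local batch:",
      model.score(np.random.rand(16, 8), np.random.randint(0, 2, 16)))
```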
A best practice for edge AI is infrastructure as code, which lets network and security configurations be managed through configuration files rather than hands-on hardware changes. With infrastructure as code, configuration files capture the infrastructure specification, making configurations easier to change and distribute and ensuring environments are provisioned consistently.
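In practice this is usually done with established tools such as Ansible or Terraform; the sketch below illustrates only the underlying idea in plain Python, with a hypothetical spec and hostname: a declarative desired state is compared against observed state, and only the drift is corrected, so every node converges to the same configuration.

```python
# Desired state lives in version control as a declarative spec (hypothetical).
desired = {
    "firewall": {"allow_ports": [22, 443]},
    "ntp_server": "time.example.internal",  # hypothetical internal host
    "ml_runtime": "onnxruntime==1.17",
}

def read_current_state() -> dict:
    """Placeholder: a real agent would query the node's actual configuration."""
    return {
        "firewall": {"allow_ports": [22]},
        "ntp_server": "time.example.internal",
    }

def reconcile(desired: dict, current: dict) -> list[str]:
    """Compute the changes needed to bring a node to the desired state."""
    actions = []
    for key, value in desired.items():
        if current.get(key) != value:
            actions.append(f"apply {key} -> {value!r}")
    return actions

for action in reconcile(desired, read_current_state()):
    print(action)  # a real agent would execute these idempotently
```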
You can also consider packaging workloads as microservices running in containers, and leverage DevOps capabilities such as CI/CD pipelines and GitOps to automate the iterative deployment of ML models into production environments at the edge, providing write-once, run-anywhere flexibility.
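A GitOps-style loop for edge ML can be as simple as an agent that watches for a newly published model artifact and swaps it in atomically. This sketch assumes a hypothetical registry URL and file layout published by the CI/CD pipeline.

```python
import hashlib
import pathlib
import urllib.request

# Hypothetical artifact endpoint published by the CI/CD pipeline.
MODEL_URL = "https://models.example.internal/fraud-detector/latest.onnx"
ACTIVE = pathlib.Path("/opt/edge/models/active.onnx")  # hypothetical path

def current_digest(path: pathlib.Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest() if path.exists() else ""

def sync_model() -> bool:
    """Fetch the latest model; deploy only if it differs from the active one."""
    with urllib.request.urlopen(MODEL_URL) as resp:
        candidate = resp.read()
    if hashlib.sha256(candidate).hexdigest() == current_digest(ACTIVE):
        return False  # already up to date
    staging = ACTIVE.with_suffix(".staging")
    staging.write_bytes(candidate)
    staging.replace(ACTIVE)  # atomic swap; serving code reloads on change
    return True

if __name__ == "__main__":
    print("model updated" if sync_model() else "model already current")
```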
Seek to use consistent technologies and tools at both the edge and the core. That way, no specialized expertise is required, one-off problems are avoided, and scaling becomes easier.
Organizations from the military to law enforcement to agencies managing critical infrastructure are executing AI at the edge. Consider the International Space Station.
The International Space Station includes an on-board laboratory for conducting research and running experiments. In one example, scientists focused on sequencing the genomes of microorganisms discovered on the station. Genome sequencing generates vast amounts of data, but scientists only need to analyze a portion of it.
In the past, the station transmitted all of this data to ground stations for centralized processing, often many terabytes per sequencing run. At traditional transmission rates, the data could take weeks to reach scientists on Earth. By harnessing the power of edge computing and artificial intelligence, the research is now done directly on the station, with only the results transmitted to the ground, and analysis can be completed the same day.
The system is easy to manage in an environment where space and power are limited. Software updates are pushed to the edge as needed, and ML model training is performed on-site. The system is also flexible enough to handle other types of ML-based analysis in the future.
Combining artificial intelligence and edge computing allows enterprises to perform analytics anywhere. With a common framework from core to edge, AI can be deployed and scaled at remote locations. By placing analytics close to where data is generated and where users interact, decisions are made faster, services are delivered sooner, and missions can extend wherever they are needed.