
How Artificial Intelligence is Bringing New Everyday Work to Data Center Teams

WBOY
2023-03-31 20:38



Artificial intelligence has gone from distant imagination to near-term imperative, driven by breakthrough use cases in generating text, art, and video. It is affecting the way people think about every field, and data center networking is certainly not immune. But what might artificial intelligence mean in the data center? How will people get started?

While researchers may unlock some algorithmic approaches to network control, this does not appear to be the primary use case for artificial intelligence in data centers. The simple fact is that data center connectivity is largely a solved problem.

In hyperscale environments, secret features and micro-optimizations may provide real benefits, but for the mass market, they may not be necessary. If they were critical, the move to the cloud would have been gated on the emergence of tailor-made network solutions; that has not been the case.

If AI is to make a lasting impression, it must be operational. The battleground will be networking practice: the workflows and activities required to run the network. Combined with the industry's 15-year ambition around automation, this actually makes a lot of sense. Can AI provide the technology push needed to finally move the industry from dreaming of operational advantages to actively leveraging automated, semi-autonomous operations?

Deterministic or random?

It seems possible, but the answer to this question is nuanced. At a macro level, data centers have two different operating behaviors: one that is deterministic and leads to known results, and the other that is random or probabilistic.

For deterministic workflows, AI is more than overkill; it is simply unnecessary. With a known architecture, generating the configuration that drives each device does not call for an AI engine. It calls for translation from an architectural blueprint into device-specific syntax.

Configuration can be fully predetermined even in the most complex cases (multi-vendor architectures with varying sizing requirements). There might be nested logic to handle changes in device type or vendor configuration, but nested logic would hardly qualify as artificial intelligence.
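To make that concrete, a minimal sketch of deterministic, blueprint-to-syntax translation might look like the following. The vendor names, blueprint fields, and command formats are illustrative assumptions, not any real product's schema; the point is that this is lookup and templating, not intelligence.

```python
# Deterministic config generation: translate one architectural "blueprint"
# into vendor-specific syntax. Pure lookup and string templating; no AI.
import ipaddress

def render_interface(vendor: str, name: str, ip: str, prefix_len: int) -> str:
    """Emit interface config for a known device type (illustrative syntax)."""
    if vendor == "cisco_ios":
        # IOS-style config expects a dotted-decimal netmask
        mask = ipaddress.ip_network(f"0.0.0.0/{prefix_len}").netmask
        return f"interface {name}\n ip address {ip} {mask}\n no shutdown"
    if vendor == "junos":
        # Junos-style "set" syntax uses prefix-length notation
        return f"set interfaces {name} unit 0 family inet address {ip}/{prefix_len}"
    raise ValueError(f"unknown vendor: {vendor}")

# A tiny multi-vendor "blueprint": the nested logic the article mentions
# is just this per-vendor branch, nothing more.
blueprint = [
    {"vendor": "cisco_ios", "name": "GigabitEthernet0/1",
     "ip": "10.0.0.1", "prefix_len": 24},
    {"vendor": "junos", "name": "ge-0/0/1",
     "ip": "10.0.0.2", "prefix_len": 24},
]

for device in blueprint:
    print(render_interface(**device))
```

Even the multi-vendor case reduces to a branch per device family; everything downstream of the blueprint is fully predetermined.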

But even outside of configuration, many day-two operational tasks don’t require artificial intelligence. For example, take one of the more common use cases where marketers have been using AI for years: resource thresholding. The logic is that AI can determine when critical thresholds such as CPU or memory usage are exceeded and then take some remedial action.

Thresholding is not that complicated. Mathematicians and AI purists might point out that linear regression is not really intelligence; it is fairly rough logic based on trend lines. Importantly, such mechanisms had been showing up in production settings long before artificial intelligence became a fashionable term.
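As an illustration of how rough that logic really is, here is a minimal trend-line sketch: fit a least-squares line to recent utilization samples and project when a limit will be crossed. The sample values and hourly cadence are invented for illustration.

```python
# "AI-flavored" thresholding stripped to its core: ordinary least-squares
# regression over recent CPU samples, projected forward to a threshold.

def hours_until_threshold(samples, threshold):
    """Fit y = slope*t + intercept over sample indices (one per hour)
    and solve for the crossing time. Returns None if the trend is flat
    or declining (no projected crossing)."""
    n = len(samples)
    ts = range(n)
    mean_t = sum(ts) / n
    mean_y = sum(samples) / n
    cov = sum((t - mean_t) * (y - mean_y) for t, y in zip(ts, samples))
    var = sum((t - mean_t) ** 2 for t in ts)
    slope = cov / var
    if slope <= 0:
        return None
    intercept = mean_y - slope * mean_t
    crossing = (threshold - intercept) / slope   # in sample-index units
    return max(0.0, crossing - (n - 1))          # hours from the latest sample

cpu_pct = [52, 55, 57, 61, 63, 66]  # hourly CPU utilization samples (made up)
print(hours_until_threshold(cpu_pct, 90))
```

Everything here is closed-form arithmetic on a trend line; calling it "intelligence" is marketing, not mathematics.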

So, does this mean artificial intelligence has no role? Absolutely not. It does mean that AI is not a requirement for, or even applicable to, everything. But some workflows in the network can and will benefit from AI, and workflows that are probabilistic rather than deterministic are the best candidates.

Troubleshooting as a Potential Candidate

There may be no better candidate for probabilistic workflows than root cause analysis and troubleshooting. When a problem occurs, network operators and engineers engage in a series of activities designed to troubleshoot the problem and hopefully identify the root cause.

For simple problems, the workflow may be scripted. But for anything beyond the most basic problems, the operator applies judgment, choosing the most likely, but not predetermined, path forward. Based on what they know or have learned, they refine their approach, either seeking more information or making an educated guess.
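One way to sketch that kind of probabilistic reasoning is a simple Bayesian update over candidate root causes: start from prior beliefs, then revise them as each diagnostic check returns evidence. The causes, priors, and likelihoods below are invented for illustration, not drawn from real incident data.

```python
# Probabilistic troubleshooting in miniature: prior beliefs over root
# causes, updated by Bayes' rule as evidence arrives.

# Prior probability of each candidate root cause (illustrative numbers)
priors = {"bad_optic": 0.2, "mtu_mismatch": 0.3, "routing_loop": 0.5}

# P(observation | cause) for one check: "interface shows CRC errors"
likelihood_crc = {"bad_optic": 0.9, "mtu_mismatch": 0.4, "routing_loop": 0.05}

def update(beliefs, likelihood):
    """One Bayesian update: multiply each belief by its likelihood,
    then renormalize so the beliefs sum to 1."""
    unnorm = {cause: p * likelihood[cause] for cause, p in beliefs.items()}
    total = sum(unnorm.values())
    return {cause: p / total for cause, p in unnorm.items()}

beliefs = update(priors, likelihood_crc)
best = max(beliefs, key=beliefs.get)
print(best, round(beliefs[best], 2))
```

Each check the operator runs plays the role of one `update` call; the "most likely path forward" is simply whichever cause currently carries the most probability.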

Artificial intelligence can play a role here. We know this because we understand the value of experience during troubleshooting: a new hire, no matter how skilled, will usually perform worse than someone who has been around for a long time. Artificial intelligence can supplement, or even substitute for, that ingrained experience, while recent advances in natural language processing (NLP) help smooth the human-machine interface.

AI starts with data

The best wine starts with the best grapes. Likewise, the best AI will start with the best data. This means that well-instrumented environments will prove the most fertile ground for AI-driven operations. Hyperscalers are certainly further along the AI path than others, thanks in large part to their software expertise. But it also cannot be ignored that, in building their data centers, they placed great importance on real-time collection of information through streaming telemetry and large-scale collection frameworks.

Businesses that want to leverage artificial intelligence should start by examining their current telemetry capabilities. Fundamentally: does the existing architecture help or hinder any serious pursuit? Architects then need to build these operational requirements into the underlying architecture evaluation process. In enterprises, operations are too often an afterthought, handled only after the equipment has passed through the purchasing department. That cannot remain the norm for any data center hoping to one day move beyond simple scripted operations.
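As a rough sketch of what such a collection layer implies, the toy collector below ingests pushed counter samples and keeps time-stamped series that later analysis, AI-driven or otherwise, can consume. The field names are illustrative assumptions, not a real telemetry schema.

```python
# Toy streaming-telemetry store: devices push counter samples (simulated
# here as dicts), and the collector retains per-metric time series.
from collections import defaultdict

class TelemetryStore:
    def __init__(self):
        # (device, metric) -> list of (timestamp, value) points
        self.series = defaultdict(list)

    def ingest(self, sample):
        """Append one pushed sample to its time series."""
        key = (sample["device"], sample["metric"])
        self.series[key].append((sample["ts"], sample["value"]))

    def latest(self, device, metric):
        """Most recent value for a series, or None if never seen."""
        points = self.series[(device, metric)]
        return points[-1][1] if points else None

store = TelemetryStore()
for sample in [
    {"device": "leaf1", "metric": "in_octets", "ts": 0, "value": 1000},
    {"device": "leaf1", "metric": "in_octets", "ts": 1, "value": 1800},
]:
    store.ingest(sample)

print(store.latest("leaf1", "in_octets"))
```

The point of the sketch is the shape of the data, not the storage: whatever the eventual AI workflow, it will need exactly this kind of continuously collected, time-stamped history to learn from.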

Going back to the question of determinism versus randomness, it really shouldn't be framed as an either/or proposition. Both sides have a role to play. Each data center will have a set of deterministic workflows alongside opportunities to do groundbreaking things in the probabilistic space. Both will benefit from data. Therefore, regardless of goals and starting points, everyone should focus on data.

Lower expectations

For most businesses, the key to success is to lower expectations. The future is sometimes defined by grand declarations, but often the grander the vision, the more out of reach it seems.

What if the next wave of progress is driven more by boring innovations than by exaggerated promises? What if reducing trouble tickets and human error were enough to spur action? Aiming at right-sized goals makes progress easier, especially in an environment that lacks the talent to meet everyone's ambitious agenda. So even if the AI trend hits a trough of disillusionment in the coming years, data center operators still have an opportunity to make a meaningful difference to their businesses.


Statement: This article is reproduced from 51cto.com.