
New IT operation and maintenance management requires hard work on both infrastructure and data

王林 · 2024-02-05

In the AI big model era, data gives IT people a "new mission"

Currently, IT staff play the role of operational support in the enterprise. When it comes to operations and maintenance (O&M) management, most of them have a hard time: they handle tedious, high-load, high-risk O&M work every day, yet become "invisible" when it comes to business planning and career development. There is a joking saying in the industry: "Those who only spend money don't deserve a say."

With the popularization of AI large-model applications, data has become a key asset and core competitive advantage for enterprises. In recent years, enterprise data has grown rapidly, from the petabyte level to hundreds of petabytes. Data types have also evolved from structured data in databases to semi-structured and unstructured data such as files, logs, and videos. To meet the needs of business departments, data must be classified and made as accessible as books in a library, and it must be stored in a more secure and reliable way.

IT people are no longer just passive players responsible for building and managing IT resources and ensuring equipment stability.

The new mission of IT people has evolved into providing high-quality data services, making data easy to use, and helping business departments make good use of data!

"Infrastructure" and "data" are very close, but their "management" is far apart

For infrastructure management, a common industry approach is to use AIOps technology: tedious manual daily O&M is turned into automated execution by tools, while intelligent capabilities such as expert systems and knowledge graphs proactively discover system hazards and automatically repair faults. Since generative AI became widespread, new applications such as intelligent customer service and interactive O&M have also emerged.
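The AIOps pattern of turning manual checks into automated detect-and-remediate loops can be sketched minimally as a rule engine. This is an illustrative toy, not any vendor's implementation; the rule names, thresholds, and remediation strings are all hypothetical.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    name: str
    predicate: Callable[[Dict], bool]   # returns True when a hazard is detected
    remediation: Callable[[Dict], str]  # performs/describes the automatic fix

def run_checks(metrics: Dict, rules: List[Rule]) -> List[str]:
    """Evaluate each rule against current metrics; remediate every match."""
    actions = []
    for rule in rules:
        if rule.predicate(metrics):
            actions.append(f"{rule.name}: {rule.remediation(metrics)}")
    return actions

# Hypothetical rules: disk pressure and a memory-leak heuristic.
rules = [
    Rule("disk-pressure",
         lambda m: m["disk_used_pct"] > 85,
         lambda m: "rotated logs and expanded volume"),
    Rule("memory-leak",
         lambda m: m["rss_growth_mb_per_h"] > 200,
         lambda m: "restarted worker pool"),
]
```

In a real AIOps platform the predicates would be learned or drawn from a knowledge graph rather than hard-coded, but the control flow (observe, match, auto-repair) is the same.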

In data management, the industry has leading professional DataOps software vendors such as Informatica and IBM, whose products support data integration, data labeling, data analysis, data optimization, and data marketplace capabilities, serving data analysts, BI analysts, data scientists, and other business teams.

The author's research found that in most enterprises, infrastructure O&M management and data management are currently separated: they are handled by different teams, and their tool platforms do not collaborate effectively. Business data lives in IT infrastructure such as storage, so the two should be integrated; in practice, however, they are managed far apart, and the two teams do not even speak the same language. This usually brings several disadvantages:

1) Inconsistent data sources: Because the two functions belong to different teams and use different tools, business teams usually copy the original data to the data management platform through ETL or similar methods for analysis and processing. This not only wastes storage space but also causes data inconsistency and stale data, which undermines the accuracy of data analysis.

2) Difficult cross-region collaboration: Enterprise data centers are now deployed across multiple cities. When data is transmitted across regions, replication is currently performed mainly at the host layer through DataOps software. This transmission method is not only inefficient but also carries serious security, compliance, and privacy risks.

3) Insufficient system optimization: Optimization today is usually based on infrastructure resource utilization alone. Because the data layout cannot be perceived, global optimization is impossible and data storage costs remain high. The contradiction between limited budget growth and exponential data growth has become the key constraint on accumulating enterprise data assets.

IT people, open up the two channels of "infrastructure" and "data" and start the flywheel of digital intelligence

The author believes that the IT team should manage and optimize "infrastructure" and "data" as one organic whole, achieving unified data visibility, global optimization, and secure circulation, and thereby play the important role of data asset manager.

First, build a unified global view of files. Use technologies such as a global file system and unified metadata management to form a single global view of data across regions, data centers, and device types. On this basis, global optimization strategies can be formulated along dimensions such as hot, warm, cold, duplicate, and expired data, and pushed down to the storage devices for execution. This approach enables true global optimization. Replication at the storage layer, combined with compression and encryption, can typically move data dozens of times faster, so both efficiency and security are guaranteed.
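The hot/warm/cold/duplicate/expired classification described above can be sketched as a small policy function over file metadata. The thresholds (7 and 90 days) and tier names here are illustrative assumptions, not values from the article; a real storage system would derive them from access statistics.

```python
import time

def classify(meta, seen_hashes, now=None):
    """Pick a placement action from access-time, expiry, and content-hash metadata.

    meta: dict with 'content_hash', 'last_access' (epoch secs), 'expires_at' (or None)
    seen_hashes: set used to detect duplicate content across the global view
    """
    now = now if now is not None else time.time()
    if meta.get("expires_at") is not None and meta["expires_at"] < now:
        return "delete"          # expired: reclaim capacity
    if meta["content_hash"] in seen_hashes:
        return "dedupe"          # duplicate content: keep a single physical copy
    seen_hashes.add(meta["content_hash"])
    age_days = (now - meta["last_access"]) / 86400
    if age_days < 7:
        return "hot-tier"        # recently accessed: keep on fast media
    if age_days < 90:
        return "warm-tier"
    return "cold-tier"           # rarely accessed: archive to low-cost storage
```

Because the function only consumes metadata, the same policy can run over a unified global view and emit placement actions for each storage device to execute.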

Second, automatically generate a data catalog from massive unstructured data. Use metadata and enhanced metadata to automatically generate data catalog services so that data can be managed efficiently by category. Based on the catalog, business teams can automatically extract the data that meets their conditions for analysis and processing, instead of manually searching for it like a needle in a haystack. The author's research found that AI recognition algorithms for data annotation are relatively mature, so open frameworks can be used to integrate AI algorithms for different scenarios, automatically analyze file content to form diversified tags, and use those tags as enhanced metadata to improve data management capabilities.
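A minimal sketch of such a catalog: filesystem metadata plus a pluggable tagger whose output becomes enhanced metadata. The extension-based tagger below is a hypothetical stand-in for the AI content-recognition algorithms the article mentions.

```python
def extension_tagger(name):
    """Stand-in for an AI content tagger: a real system would run recognition
    models over file content; this stub tags by file extension only."""
    mapping = {".log": ["log"], ".mp4": ["video"], ".csv": ["tabular"]}
    for ext, tags in mapping.items():
        if name.endswith(ext):
            return tags
    return ["unclassified"]

def build_catalog(files, tagger):
    """files: iterable of (name, size_bytes). Tagger output is stored
    alongside basic metadata as enhanced metadata."""
    return [{"name": n, "size": s, "tags": tagger(n)} for n, s in files]

def query(catalog, tag):
    """Business teams pull matching data by tag instead of searching manually."""
    return [entry["name"] for entry in catalog if tag in entry["tags"]]
```

Swapping in a scenario-specific AI tagger (image recognition, log parsing, PII detection) changes only the `tagger` argument; the catalog and query layers stay the same, which is the point of an open framework.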

At the same time, when data flows across devices, special attention must be paid to data sovereignty, compliance, and privacy. Data in storage devices should be automatically classified, graded by privacy level, and partitioned into domains, and the management software should uniformly govern policies for data access, use, and flow to prevent leakage of sensitive information and private data. These will become basic requirements in future data element trading scenarios. For example, before data flows out of a storage device, compliance and personal-privacy checks must first determine whether the transfer meets policy requirements; otherwise the enterprise faces serious legal and regulatory risks.
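The outbound check described above amounts to a policy gate: deny any flow the policy does not explicitly allow. A minimal sketch, with a hypothetical policy table (the sensitivity classes and destination names are invented for illustration):

```python
# Hypothetical policy table: sensitivity class -> destinations allowed to receive it.
EXPORT_POLICY = {
    "public":   {"*"},                          # may flow anywhere
    "internal": {"dc-beijing", "dc-shanghai"},  # only between company data centers
    "personal": set(),                          # personal data never leaves by default
}

def may_export(record, destination):
    """Gate every outbound flow: default-deny unless policy explicitly allows it."""
    allowed = EXPORT_POLICY.get(record.get("sensitivity"), set())
    return "*" in allowed or destination in allowed
```

The default-deny stance (unknown sensitivity means no export) reflects the article's point that compliance must be established before data moves, not audited afterward.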

The reference architecture is as follows:

[Figure: reference architecture for integrated infrastructure and data management]

According to After conducting research and consulting peer experts, the author found that leading storage vendors in the industry such as Huawei Storage and NetApp have released product solutions for integrated storage and data management. I believe that more vendors will support this in the future.

Grasp both equipment and data, and grasp both firmly; then IT people can play a more important role in the AI era.


Statement: This article is reproduced from 51cto.com.