
After the Dencun upgrade, how to solve the long-term storage and access problems of Ethereum historical data?

WBOY · Original · 2024-06-19

Ethereum State Data Expansion Problems and Solutions

As the Ethereum network has grown in popularity and application demand has increased, its historical and state data have begun to grow rapidly. Ethereum has addressed this problem step by step: from the initial full nodes to light clients, and most recently the Dencun upgrade, which introduced a temporary "blob" data type that nodes automatically prune after a retention period rather than storing forever.

One of Ethereum's long-term goals is to reduce the load on any single chain by implementing sharding, spreading data across multiple chains. EIP-4844, implemented in the Dencun upgrade, is an important step toward fully realizing sharding. EIP-4844 introduces "blobs", a temporary data type that allows Rollups to submit more data to the Ethereum main chain at lower cost. To keep state data growth under control, consensus-layer nodes delete blob data after storing it for roughly 18 days.
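The "roughly 18 days" figure follows directly from the consensus-layer blob retention parameter introduced with Deneb. The TypeScript sketch below shows the arithmetic; the constant names mirror the spec parameters and are used here purely for illustration.

```typescript
// Minimal sketch: where the "~18 days" blob retention figure comes from.
// Constant names follow the consensus-layer Deneb spec parameters.

const SECONDS_PER_SLOT = 12;
const SLOTS_PER_EPOCH = 32;
const MIN_EPOCHS_FOR_BLOB_SIDECARS_REQUESTS = 4096;

const retentionSeconds =
  MIN_EPOCHS_FOR_BLOB_SIDECARS_REQUESTS * SLOTS_PER_EPOCH * SECONDS_PER_SLOT;
const retentionDays = retentionSeconds / 86_400;

console.log(`Blob retention: ~${retentionDays.toFixed(1)} days`); // ~18.2 days
```

After this window, consensus clients are no longer required to serve the blob data, which is exactly the gap the long-term DA projects discussed below aim to fill.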

Beyond improvements to Ethereum itself, projects such as Celestia, Avail, and EigenDA are building solutions to improve data availability. They provide effective short-term data availability (DA), enhancing the real-time operation and scalability of blockchains. However, these solutions do not address applications that need long-term access to historical data, such as dApps that rely on long-term storage of user authentication data or that require historical data for AI model training.

To solve the challenge of long-term data storage in the Ethereum ecosystem, projects such as EthStorage, Pinax, and Covalent have proposed solutions. EthStorage provides long-term DA for Rollups, ensuring that data remains accessible and usable over time. Pinax, The Graph, and StreamingFast have jointly developed a solution for the long-term storage and retrieval of blobs. Covalent's Ethereum Wayback Machine (EWM) is not only a long-term data storage solution but also a complete system for data query and analysis.

As artificial intelligence becomes the mainstream trend in global technology development, its combination with blockchain technology is also regarded as the future development direction. This trend has resulted in a growing need for historical data access and analysis. In this context, EWM demonstrates its unique advantages. EWM provides Ethereum historical data archiving and data processing, allowing users to retrieve complex data structures and conduct in-depth analysis and query of the internal status of smart contracts, transaction results, event logs, etc.

Ethereum Wayback Machine (EWM) Introduction

The Ethereum Wayback Machine (EWM) borrows the concept of the Wayback Machine to preserve historical data on Ethereum so that it can be accessed and verified. The Wayback Machine is a digital archiving project created by the Internet Archive to record and preserve the history of the Internet. It allows users to view archived versions of a website at different points in the past, helping people understand how website content has changed over time.

Historical data is the fundamental reason blockchains exist: it underpins both the technical architecture and the economic model. Blockchain was originally designed to provide a public, unchangeable historical record. Bitcoin, for example, aims to create an immutable, decentralized ledger that records every transaction to ensure transparency and security. Demand for historical data spans many scenarios, yet an efficient and verifiable way to store it is still lacking. As a long-term DA solution, EWM can store data permanently, including blob data, and can address the historical data accessibility issues caused by state expiration and data sharding. EWM focuses on archiving Ethereum's historical data and keeping it accessible over the long term, with support for queries over complex data structures. Next, we explore in detail how EWM achieves this through its data processing pipeline.


EWM's Data Pipeline: Extraction, Refinement, and Indexing

Covalent is a platform that provides users with blockchain data access and query services. It captures and indexes blockchain data and stores it on multiple nodes in its network, enabling reliable storage and fast access. Covalent processes data through the Ethereum Wayback Machine (EWM) to ensure the continued accessibility of historical blockchain data. The EWM pipeline consists of three key steps, sketched in code after the list below: extraction and export, refinement, and indexing and query.

  1. Extraction and export: The first step extracts historical transaction data directly from the blockchain network. It is performed by specialized entities called Block Specimen Producers (BSPs). A BSP's main task is to create and save "block specimens", raw snapshots of blockchain data that serve as canonical representations of the blockchain's historical state; maintaining data integrity and accuracy is the key requirement. Once created, block specimens are uploaded to distributed storage (built on IPFS) and published and verified through the ProofChain contract. This both secures the data and signals to others that it has been stored safely.

  2. Refinement: After extraction, the data is refined by Block Results Producers (BRPs), which transform the raw data into a more useful form. Traditional blockchain data access methods often expose only limited information and make it hard to query complex data structures. By re-executing and transforming the data, BRPs surface richer detail, such as a contract's internal state and a transaction's execution path. By preprocessing and storing the processed data, BRPs also greatly reduce the need to re-run a full node for every query or analysis, which increases query speed and lowers storage and computing costs. At this point, the original "block specimen" has been transformed into a "block result", a form that is easier to query and analyze. This not only speeds up the Covalent network but also opens up more possibilities for further querying and analysis of the data.

  3. Indexing and query: Finally, Query Operators organize the processed data and store it where it can be found quickly. Data is pulled from distributed storage according to the needs of API users, ensuring that both historical and real-time data are available in response to API queries. This lets users efficiently access and use the blockchain data stored in the Covalent network.
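To make the three stages more concrete, here is a deliberately simplified TypeScript sketch of the data shapes involved. The interface and field names are hypothetical illustrations, not Covalent's actual specimen or result schemas.

```typescript
// Hypothetical shapes only -- Covalent's real Block Specimen / Block Result
// schemas are richer; this sketch just illustrates how the three stages relate.

// Stage 1 (BSP): a raw snapshot of one block, pinned to IPFS and attested
// via the ProofChain contract.
interface BlockSpecimen {
  blockNumber: number;
  blockHash: string;
  rawTransactions: string[]; // encoded txs as captured from the node
  ipfsCid: string;           // where the specimen is stored
}

// Stage 2 (BRP): the specimen after re-execution, enriched with decoded
// results that would be expensive to recompute on demand.
interface BlockResult {
  blockNumber: number;
  blockHash: string;
  decodedLogs: { address: string; topics: string[]; data: string }[];
  txStatuses: { txHash: string; success: boolean; gasUsed: number }[];
}

// Stage 3 (Query Operator): an index keyed the way API consumers ask for data.
type ResultIndex = Map<number, BlockResult>;

function indexResults(results: BlockResult[]): ResultIndex {
  const index: ResultIndex = new Map();
  for (const r of results) {
    index.set(r.blockNumber, r);
  }
  return index;
}
```

The point of the separation is that the expensive work (capturing and re-executing blocks) is done once by BSPs and BRPs, while Query Operators only serve pre-computed results.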

Covalent provides a unified GoldRush API that supports fetching historical data from multiple blockchains (such as Ethereum, Polygon, and Solana). The GoldRush API gives developers a one-stop data solution: a single call can return an account's ERC-20 token balances and NFT data, making it easy to build cryptocurrency and NFT wallets (such as Rainbow and Zerion) and greatly simplifying the development process. Accessing DA data through the API consumes credits; different request types fall into different categories (Type A, Type B, Type C, and so on), each with its own credit cost. This revenue is used to support the operator network.
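As a concrete illustration of the "single call" workflow, here is a minimal TypeScript sketch that pulls a wallet's token balances through the unified API. The balances_v2 path, Bearer-token auth, and response field names reflect Covalent's public documentation as the author recalls it and should be treated as assumptions to verify against the current GoldRush docs; the API key and wallet address are placeholders.

```typescript
// Minimal sketch: fetch all ERC-20 token balances for one wallet in a single
// GoldRush/Covalent API call. Endpoint path and auth scheme are assumptions
// based on Covalent's unified API docs -- verify before relying on them.

const API_KEY = "YOUR_GOLDRUSH_API_KEY"; // placeholder
const chain = "eth-mainnet";
const wallet = "0x0000000000000000000000000000000000000000"; // example address

async function getTokenBalances(): Promise<void> {
  const url = `https://api.covalenthq.com/v1/${chain}/address/${wallet}/balances_v2/`;
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${API_KEY}` },
  });
  if (!res.ok) throw new Error(`Covalent API error: ${res.status}`);

  const body = await res.json();
  // Each item is assumed to carry the token contract, symbol, decimals and raw balance.
  for (const item of body.data.items) {
    console.log(item.contract_ticker_symbol, item.balance);
  }
}

getTokenBalances().catch(console.error);
```

Requests like this one are what consume credits under the Type A/B/C pricing categories described above.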


Future Outlook

With the rapid development of AI, the trend of combining AI with blockchain is becoming more and more obvious. Blockchain technology gives AI an immutable, distributed, verifiable data source, enhancing data transparency and trust and making AI models more accurate and reliable in data analysis and decision-making. By analyzing on-chain data, AI can optimize algorithms and predict trends, and then execute complex tasks and transactions directly, significantly improving dApp efficiency and reducing costs. Through EWM, AI models gain access to extensive, complete, and verifiable structured Web3 on-chain data sets. As a bridge between AI models and the blockchain, EWM greatly simplifies data retrieval and use for AI developers.

There are already some AI projects that have integrated Covalent:

  • SmartWhales: a platform that uses AI to optimize copy-trading investment strategies. Copy trading relies on analyzing historical data to identify successful trading patterns and strategies. Covalent provides a comprehensive and detailed blockchain data set, which SmartWhales analyzes to determine which strategies performed well under specific market conditions and recommend them to users.

  • BotFi: a DeFi trading bot. It integrates Covalent's data to analyze market trends and automate trading strategies, executing buy and sell operations automatically as the market changes.

  • Laika AI: Leveraging AI for comprehensive on-chain analysis. The Laika AI platform drives its AI model by integrating structured blockchain data provided by Covalent to help users conduct complex on-chain data analysis.

  • Entendre Finance: Automated DeFi asset management, providing real-time insights and predictive analytics. Its AI leverages Covalent’s structured data to simplify and automate asset management, such as monitoring and managing digital asset holdings, automating specific trading strategies, and more.

EWM is also continuously improving as needs evolve. Covalent engineer Pranay Valson has said that EWM will extend its protocol specifications to support other blockchains such as Polygon and Arbitrum, and will integrate the BSP fork into Ethereum clients such as Nethermind and Besu for broader compatibility and adoption. In addition, EWM will use KZG commitments when processing blob transactions from the beacon chain, improving data storage and retrieval efficiency and reducing storage costs.

