Author: Crypto, Distilled
Compiled by: Deep Tide TechFlow
Crypto and AI: Is it the end of the road?
In 2023, Web3-AI became a hot topic.
But today the space is full of imitators and heavily funded projects with no clear purpose.
Below are the misconceptions to avoid and the key areas worth focusing on.
IntoTheBlock CEO @jrdothoughts recently shared his insights in an article.
He discusses:
a. Core challenges of Web3-AI
b. Over-hyped trends
c. Trends with high potential
I’ve distilled it for you, point by point. Let’s dive in:
The current Web3-AI market is over-hyped and over-funded.
Many projects are out of touch with the actual needs of the AI industry.
This disconnect creates confusion, but it also creates opportunities for those with insight.
(Credit to @coinbase)
The gap between Web2 and Web3 AI is widening for three main reasons:
a. Limited AI research talent
b. Constrained infrastructure
c. Insufficient models, data, and computing resources
Generative AI relies on three major elements: model, data and computing resources.
Currently, no major models are optimized for Web3 infrastructure.
Initial funding supported some Web3 projects that were disconnected from the reality of AI.
Despite all the hype, not all Web3-AI trends are worth paying attention to.
Here are some of the trends that @jrdothoughts believes are the most overrated:
a. Decentralized GPU networks
b. ZK-AI models
c. Proof of inference (Credit to @ModulusLabs)
Decentralized GPU networks promise to democratize AI training.
But the reality is that training large models on decentralized infrastructure is slow and impractical.
This trend has yet to deliver on its lofty promises.
Zero-knowledge AI models look attractive from a privacy standpoint.
But in practice, they are computationally expensive and difficult to interpret.
This makes them less practical for large-scale applications.
(Credit to @oraprotocol)
From the image in the original post:
This approach is still far from practical, especially for use cases like those described by Vitalik. Some examples:
a. The zkML framework EZKL takes about 80 minutes to generate a proof for a 1M-parameter nanoGPT model.
b. According to data from Modulus Labs, zkML overhead is more than 1,000x that of pure computation.
c. According to the EZKL benchmark, RISC Zero’s average proof time on a random forest classification task is 173 seconds.
Proof-of-inference frameworks provide cryptographic proofs for AI outputs.
However, @jrdothoughts believes that these solutions solve problems that do not exist.
Thus, they have limited real-world applications.
While some trends are over-hyped, others have significant potential.
Here are some undervalued trends that may offer real opportunities:
a. AI agents with wallets
b. Crypto funding for AI
c. Small base models
d. Synthetic data generation
Imagine an AI agent with financial capabilities enabled by cryptocurrency.
These agents can hire other agents or stake funds to ensure quality.
Another interesting application is "predictive agents" as mentioned by @vitalikbuterin.
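To make this concrete, here is a minimal Python sketch of an agent that holds a wallet, hires another agent for a task, and stakes funds as a quality bond. The Wallet and Agent classes and their methods are hypothetical illustrations of the idea, not a real on-chain implementation and not something specified in the original article.

```python
from dataclasses import dataclass, field

@dataclass
class Wallet:
    """Hypothetical in-memory wallet; a real agent would sign on-chain transactions."""
    balance: float = 0.0

    def transfer(self, other: "Wallet", amount: float) -> None:
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        other.balance += amount

@dataclass
class Agent:
    name: str
    wallet: Wallet = field(default_factory=Wallet)
    staked: float = 0.0

    def stake(self, amount: float) -> None:
        # Lock funds as a quality bond; in a real protocol this could be slashed.
        self.wallet.balance -= amount
        self.staked += amount

    def hire(self, worker: "Agent", task: str, fee: float, bond: float) -> str:
        # The worker stakes a bond before starting, then gets paid on delivery.
        worker.stake(bond)
        result = worker.perform(task)
        self.wallet.transfer(worker.wallet, fee)   # pay the fee
        worker.wallet.balance += worker.staked     # return the bond
        worker.staked = 0.0
        return result

    def perform(self, task: str) -> str:
        return f"{self.name} completed: {task}"

# Toy usage: one agent pays another for a research task.
alice = Agent("alice", Wallet(balance=10.0))
bob = Agent("bob", Wallet(balance=2.0))
print(alice.hire(bob, "summarize on-chain data", fee=1.0, bond=0.5))
print(alice.wallet.balance, bob.wallet.balance)  # 9.0 and 3.0
```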
Generative AI projects often face funding shortages.
Cryptocurrency’s efficient capital formation methods, such as airdrops and incentives, provide critical financial support for open source AI projects.
These methods help drive innovation.
(Credit to @oraprotocol)
Small base models, such as Microsoft’s Phi model, demonstrate the idea that less is more.
Models with 1B-5B parameters are crucial for decentralized AI, enabling powerful on-device AI solutions.
(Credit to @microsoft)
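As a rough illustration of on-device inference with a small base model, the sketch below loads the roughly 2.7B-parameter microsoft/phi-2 checkpoint with the Hugging Face transformers library and generates text locally. The model choice, prompt, and generation settings are assumptions for illustration; the original article does not prescribe them.

```python
# Requires: pip install transformers torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "microsoft/phi-2"  # ~2.7B parameters, small enough for laptop-class hardware

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)  # add quantization/device_map for smaller devices

prompt = "Explain in one sentence why small language models matter for decentralized AI:"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short completion entirely on the local machine.
outputs = model.generate(**inputs, max_new_tokens=60)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```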
Synthetic data generation
Data scarcity is one of the main obstacles to AI development, and synthetic data generation offers a way to ease that bottleneck.
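One possible approach (assumed here, not prescribed by the article) is to use a small local model to generate labelled synthetic examples. The sketch below produces a few synthetic question records and writes them to a JSONL file; the prompt template, topics, and file name are purely illustrative.

```python
# Requires: pip install transformers torch
import json
from transformers import pipeline

# Reuse a small base model as the generator; any local text-generation model works.
generator = pipeline("text-generation", model="microsoft/phi-2")

seed_topics = ["gas fees", "staking rewards", "wallet security"]
records = []

for topic in seed_topics:
    prompt = f"Write a short question a crypto beginner might ask about {topic}:\nQuestion:"
    out = generator(prompt, max_new_tokens=40, do_sample=True, temperature=0.8)[0]["generated_text"]
    question = out[len(prompt):].strip().split("\n")[0]  # keep only the first generated line
    records.append({"topic": topic, "question": question})

# Persist the synthetic examples for later fine-tuning or evaluation.
with open("synthetic_questions.jsonl", "w") as f:
    for rec in records:
        f.write(json.dumps(rec) + "\n")

print(f"Generated {len(records)} synthetic examples")
```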
Overcoming the hype
The initial Web3-AI boom focused mainly on value propositions that were disconnected from reality.
@jrdothoughts believes the focus should now shift to building solutions that actually work.
As attention shifts, the AI field remains full of opportunities waiting to be spotted by keen eyes.
This article is for educational purposes only and is not financial advice. Many thanks to @jrdothoughts for his valuable insights.