The Impact of DeepSeek on Web3 AI Upstream and Downstream Protocols

BlockBooster
Feb 18, 2025


TL;DR:

  • The emergence of DeepSeek shatters the computing power moat, with computing power optimization led by open-source models becoming the new direction.
  • DeepSeek benefits the model and application layers of the industry’s upstream and downstream, while negatively impacting the computing power protocols in the infrastructure layer.
  • DeepSeek’s positive impact inadvertently bursts the last bubble in the Agent sector, with DeFAI being the most likely to foster new growth.
  • The zero-sum game of project fundraising is likely to come to an end, with community launches + a small amount of VC becoming the new norm for fundraising.

The shock triggered by DeepSeek will have a profound impact on the AI industry's upstream and downstream this year. DeepSeek has enabled consumer-grade GPUs to complete large-model training tasks that previously required high-end GPUs costing tens of thousands of dollars. The first moat of AI development, computing power, is beginning to crumble. With algorithmic efficiency improving at an annual rate of roughly 68% while hardware performance advances at the slower cadence of Moore's Law, the entrenched valuation models of the past three years no longer apply. The next chapter of AI will be opened by open-source models.

Although Web3 AI protocols are completely different from those of Web2, they inevitably bear the impact of DeepSeek. This impact will give rise to entirely new use cases across the Web3 AI upstream and downstream, including the infrastructure layer, middleware layer, model layer, and application layer.

Author: Kevin, Researcher at BlockBooster

Mapping Out the Collaborative Relationships Among Upstream and Downstream Protocols

Through an analysis of technical architecture, functional positioning, and real-world use cases, I have divided the entire ecosystem into four layers: the infrastructure layer, middleware layer, model layer, and application layer, and mapped out their dependencies:

Infrastructure Layer

The infrastructure layer provides decentralized underlying resources (computing power, storage, L1). Among these, computing power protocols include Render, Akash, io.net, etc.; storage protocols include Arweave, Filecoin, Storj, etc.; and L1s include NEAR, Olas, Fetch.ai, etc.

Computing power layer protocols support model training, inference, and framework operations; storage protocols store training data, model parameters, and on-chain interaction records; and L1s, through specialized nodes, optimize data transmission efficiency and reduce latency.

Middleware Layer

The middleware layer serves as a bridge connecting the infrastructure with upper-layer applications, providing framework development tools, data services, and privacy protection. Among these, data labeling protocols include Grass, Masa, Vana, etc.; development framework protocols include Eliza, ARC, Swarms, etc.; and privacy computing protocols include Phala, etc.

The data service layer fuels model training, development frameworks rely on the computing power and storage of the infrastructure layer, and the privacy computing layer protects data security during training and inference.

Model Layer

The model layer is used for model development, training, and distribution, and includes open-source model training platforms such as Bittensor.

The model layer depends on the computing power of the infrastructure layer and the data from the middleware layer; models are deployed on-chain through development frameworks; and the model market channels the training outcomes to the application layer.

Application Layer

The application layer comprises AI products aimed at end users, including Agents such as GOAT, AIXBT, etc., and DeFAI protocols such as Griffain, Buzz, etc.

The application layer calls upon the pre-trained models from the model layer; it relies on the privacy computing provided by the middleware layer; and complex applications require real-time computing power from the infrastructure layer.
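
To summarize the mapping, here is the four-layer dependency structure expressed as a plain Python sketch; the protocol lists are this section's examples, and the dependency edges follow the descriptions above (my encoding, not an official taxonomy).

```python
# Layers and representative protocols, as listed in this section.
ECOSYSTEM = {
    "infrastructure": {
        "compute": ["Render", "Akash", "io.net"],
        "storage": ["Arweave", "Filecoin", "Storj"],
        "l1": ["NEAR", "Olas", "Fetch.ai"],
    },
    "middleware": {
        "data": ["Grass", "Masa", "Vana"],
        "frameworks": ["Eliza", "ARC", "Swarms"],
        "privacy": ["Phala"],
    },
    "model": {"training": ["Bittensor"]},
    "application": {"agents": ["GOAT", "AIXBT"], "defai": ["Griffain", "Buzz"]},
}

# Each layer consumes services from every layer beneath it.
DEPENDS_ON = {
    "application": ["model", "middleware", "infrastructure"],
    "model": ["middleware", "infrastructure"],
    "middleware": ["infrastructure"],
}
```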

DeepSeek May Have a Negative Impact on Decentralized Computing Power

According to sample surveys, about 70% of Web3 AI projects actually invoke OpenAI or centralized cloud platforms, only 15% of projects use decentralized GPUs (such as Bittensor subnetwork models), and the remaining 15% adopt hybrid architectures (with sensitive data processed locally and general tasks handled in the cloud).

The actual usage rate of decentralized computing power protocols is far below expectations and does not match their actual market value. There are three reasons for this low usage rate: Web2 developers migrating to Web3 continue to use their original toolchains; decentralized GPU platforms have not yet achieved a price advantage; and some projects use the term “decentralized” to evade data compliance reviews, while their actual computing power still relies on centralized clouds.

AWS/GCP hold over 90% of the AI computing power market; by comparison, Akash's equivalent computing power is only 0.2% of AWS's. The moats of centralized cloud platforms include cluster management, RDMA high-speed networking, and elastic scaling. Decentralized cloud platforms offer Web3-flavored versions of these technologies, but they have inherent shortcomings: latency (communication delay between distributed nodes is roughly six times that of centralized clouds) and fragmented toolchains (PyTorch/TensorFlow do not natively support decentralized scheduling).

DeepSeek reduces computing power consumption by 50% through sparse training, and dynamic model pruning lets consumer-grade GPUs train 10-billion-parameter models. Short-term market demand forecasts for high-end GPUs have been revised down sharply, and the potential of edge computing has been re-evaluated. Before DeepSeek emerged, the vast majority of protocols and applications in the industry ran on platforms like AWS, with only a handful of use cases deployed on decentralized GPU networks; those use cases valued the price advantage of consumer-grade computing power and were insensitive to latency.
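
For intuition, here is a minimal sketch of magnitude-based pruning in PyTorch, the generic form of the model-pruning idea mentioned above; it uses the standard torch.nn.utils.prune utilities and is not DeepSeek's actual training method.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# A single large linear layer stands in for one transformer sublayer.
layer = nn.Linear(4096, 4096)

# Zero out the 50% of weights with the smallest absolute value; the
# forward pass then runs over a half-sparse weight matrix.
prune.l1_unstructured(layer, name="weight", amount=0.5)

sparsity = float((layer.weight == 0).sum()) / layer.weight.numel()
print(f"weight sparsity: {sparsity:.0%}")  # -> 50%
```

Turning that sparsity into actual compute savings additionally requires sparse-aware kernels or structured pruning, which is exactly the kind of engineering this article credits to DeepSeek.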

This situation may worsen further with the advent of DeepSeek. DeepSeek lowers the barrier for long-tail developers, and low-cost, efficient inference models will spread at an unprecedented pace. In fact, many of the aforementioned centralized cloud platforms, and several countries, have already begun deploying DeepSeek; the sharp reduction in inference costs will spawn a large number of front-end applications with enormous demand for consumer-grade GPUs. Facing this impending market, centralized cloud platforms will launch a new round of competition for users, against the leading platforms as well as countless smaller centralized clouds. The most direct form of competition is price cuts: it is foreseeable that 4090 prices on centralized platforms will drop, which would be a devastating blow to Web3's computing power platforms. When price is the only moat these platforms have left and they too are forced to lower it, the result will be unsustainable for io.net, Render, Akash, and the rest. A price war would destroy their last remaining valuation ceiling, and a downward spiral of falling revenue and user attrition may force decentralized computing power protocols to pivot in a new direction.

The Specific Implications of DeepSeek for Industry Upstream and Downstream Protocols

I believe DeepSeek will affect the infrastructure layer, model layer, and application layer differently. From a positive perspective:

  • The application layer will benefit from a significant reduction in inference costs, enabling teams to keep Agent applications online for extended periods at low cost while completing tasks in real time.
  • At the same time, DeepSeek’s low-cost model overhead can allow DeFAI protocols to form more complex SWARMs, with thousands of Agents assigned to a single use case, where the division of labor for each Agent is extremely fine-grained and clearly defined. This can greatly improve the user experience by preventing user inputs from being incorrectly decomposed and executed by the model.
  • Developers in the application layer can fine-tune open-source models, feeding DeFi-related AI applications with pricing data, on-chain analytics, and protocol governance data without paying exorbitant licensing fees (a minimal fine-tuning sketch follows this list).
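
As a concrete illustration of the last point, here is a minimal LoRA fine-tuning sketch using Hugging Face transformers and peft. The base-model ID is a placeholder assumption, and real use would add a training loop over DeFi data; the point is that only a tiny adapter is trained, which keeps costs consumer-grade.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

BASE_MODEL = "deepseek-ai/deepseek-llm-7b-base"  # placeholder: any open-source base model

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# LoRA freezes the base weights and trains small low-rank adapters,
# so fine-tuning on pricing / on-chain / governance corpora stays cheap.
config = LoraConfig(
    r=8,                 # adapter rank: small -> few trainable parameters
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, config)
model.print_trainable_parameters()  # typically well under 1% of total weights
```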

Regarding negative impacts:

  • The inherent latency of computing power protocols in the infrastructure layer cannot be optimized away.
  • Furthermore, a hybrid network mixing A100 and 4090 GPUs places higher demands on coordination algorithms, which is not a strength of decentralized platforms.

DeepSeek Bursts the Last Bubble in the Agent Sector; New Growth Has Already Begun

Agents are the industry's last hope for AI. The emergence of DeepSeek has loosened the constraints of computing power and points to a future of exploding applications. This should have been a major boon for the Agent sector, but because of the sector's strong correlation with the broader industry, the US stock market, and Federal Reserve policy, the last remaining bubble has burst and its market value has plummeted.

In the wave of AI integration with various industries, technological breakthroughs and market dynamics have always been intertwined. The chain reaction triggered by Nvidia’s market value fluctuations acts like a mirror, revealing the deep-seated dilemmas within the industry’s AI narrative: from On-chain Agents to DeFAI engines, beneath the seemingly complete ecosystem lies the harsh reality of weak technical infrastructure, hollowed-out value logic, and capital-driven dominance. The superficially prosperous on-chain ecosystem harbors hidden issues: large amounts of high-FDV tokens are competing for limited liquidity, outdated assets rely on FOMO-driven sentiment to survive, and developers are trapped in a PVP (player vs. player) competition that consumes innovation. When incremental funding and user growth hit a ceiling, the entire industry falls into the “innovator’s dilemma” — yearning for breakthrough narratives while being unable to escape the shackles of path dependency. This state of division presents a historic opportunity for AI Agents: they represent not only an upgrade to the technological toolbox but also a reconstruction of the value creation paradigm.

Over the past year, more and more teams in the industry have realized that the traditional financing model is failing: the playbook of selling VCs small allocations, retaining high control, and waiting for an exchange listing to pump the price is no longer sustainable. With VCs tightening their pockets, retail investors refusing to buy in, and high barriers to listing on large exchanges, a new playbook better suited to a bear market is emerging: partnering with leading KOLs and a few VCs, distributing a large proportion of tokens to the community, and launching at a low market cap to secure millions in funding.

Innovators like Soon and Pump Fun are paving a new path with "community launches": partnering with leading KOLs to endorse the project, distributing 40%-60% of tokens directly to the community, and launching at a valuation as low as $10 million FDV to raise millions. This model uses KOL influence to build consensus and FOMO, allowing teams to lock in profits early while exchanging high liquidity for market depth. It sacrifices the advantages of short-term control, but teams can repurchase tokens at low prices in a bear market through compliant market-making mechanisms. In essence, this is a shift in the power structure: from a VC-dominated game of pass-the-parcel (institutions buy in, exchanges list the token, retail buys) to a transparent pricing game driven by community consensus, in which the project team and the community form a new symbiotic relationship around the liquidity premium. As the industry enters a cycle of transparency, projects that cling to the old control logic may become remnants of an era swept away by this migration of power.

The market's short-term pain precisely affirms the irreversibility of the long-term technological tide. When AI Agents reduce on-chain interaction costs by two orders of magnitude, and adaptive models continuously optimize the capital efficiency of DeFi protocols, the industry is poised to embrace the long-awaited Massive Adoption. This transformation doesn't rely on concept hype or capital-driven acceleration; it is rooted in the technological penetration power of real demand. Just as the electric revolution didn't stall because light bulb companies went bankrupt, Agents will become the true golden track after the bubble bursts, and DeFAI might be the fertile ground for that new growth. When low-cost inference becomes routine, we may soon see use cases with hundreds of Agents combined into a single Swarm. At equivalent computing power, growing model parameters will ensure that Agents in the open-source era can be fine-tuned more thoroughly, so that even complex user instructions can be decomposed into task pipelines that individual Agents fully execute. With each Agent optimizing its own on-chain operations, overall DeFi protocol activity and liquidity could rise. More complex DeFi products led by DeFAI will emerge, and this is where new opportunities will appear after the previous bubble burst.
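
To make the pipeline idea concrete, here is a minimal sketch of a Swarm decomposing one user instruction into agent-sized steps. The roles and the decomposition are illustrative assumptions; in practice each agent would call a cheap inference endpoint, while here they are stubs so the control flow runs as-is.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Agent:
    role: str
    run: Callable[[str], str]  # takes a task description, returns a result

# Three narrowly scoped agents, each fully owning one step of the pipeline.
pipeline = [
    Agent("price-feed", lambda task: f"[prices for: {task}]"),
    Agent("strategy", lambda task: f"[strategy given {task}]"),
    Agent("executor", lambda task: f"[tx plan for {task}]"),
]

def run_pipeline(user_input: str) -> str:
    result = user_input
    for agent in pipeline:  # each agent's output becomes the next one's input
        result = agent.run(result)
    return result

print(run_pipeline("rebalance my stablecoin LP position"))
```

Fine-grained roles like these are what keeps a user's input from being incorrectly decomposed and executed, the failure mode the positive-impact list above warns about.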

Disclaimer:

This article/blog is provided for informational purposes only. It represents the views of the author(s) and it does not represent the views of BlockBooster. It is not intended to provide (i) investment advice or an investment recommendation; (ii) an offer or solicitation to buy, sell, or hold digital assets, or (iii) financial, accounting, legal, or tax advice. Digital asset holdings, including stablecoins and NFTs, involve a high degree of risk, can fluctuate greatly, and can even become worthless.

You should carefully consider whether trading or holding digital assets is suitable for you in light of your financial condition. Please consult your legal/tax/investment professional for questions about your specific circumstances. Information (including market data and statistical information, if any) appearing in this post is for general information purposes only. While all reasonable care has been taken in preparing this data and graphs, no responsibility or liability is accepted for any errors of fact or omission expressed herein.


Written by BlockBooster

BlockBooster is a leading Asian Web3 venture studio. Its mission is to lead the Web3 industry through strategic investment and incubation of promising projects.
