AI and Blockchain Integration: From Technical Exploration to Industrial Chain Restructuring

The Integration of AI and Blockchain Technology: From Infrastructure to Applications

The artificial intelligence (AI) industry has developed rapidly in recent years and is widely regarded as a key driving force of a new industrial revolution. The emergence of large language models has significantly improved work efficiency across industries; Boston Consulting Group has estimated that tools like GPT could raise overall work efficiency in the United States by roughly 20%. At the same time, the generalization capability of large models is seen as a new software design paradigm: whereas traditional software relied on precise, hand-written code, modern software design increasingly adopts generalized large-model frameworks that support a wider range of input and output modalities. Deep learning has ushered in a new wave of prosperity for the AI industry, and this wave has also spread to the cryptocurrency industry.

This article explores the development history of the AI industry, the classification of its technologies, and the impact of deep learning on the industry. We then analyze the upstream and downstream of the deep learning industry chain, including GPUs, cloud computing, data sources, and edge devices, and survey their current status and trends. Finally, we examine the fundamental relationship between cryptocurrency and the AI industry and map out the cryptocurrency-related AI industry chain.

Development History of the AI Industry

The AI industry began in the 1950s. To realize the vision of artificial intelligence, academia and industry developed different schools of thought in different historical contexts. Modern AI mainly relies on "machine learning", whose core idea is to let machines iterate over data on a task so that system performance keeps improving. The main steps are feeding data into an algorithm, training a model, testing and deploying it, and finally using it to make automated predictions.
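As a concrete illustration of this train-test-predict loop, here is a minimal sketch in Python. The use of scikit-learn and its toy digits dataset is an illustrative assumption, not something prescribed by the article.

```python
# Minimal sketch of the workflow described above: data in, train, test, predict.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

X, y = load_digits(return_X_y=True)                                  # input data
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = LogisticRegression(max_iter=1000)                            # choose an algorithm
model.fit(X_train, y_train)                                          # train the model
print("test accuracy:", model.score(X_test, y_test))                 # test / evaluate
print("prediction for one sample:", model.predict(X_test[:1]))       # automated prediction
```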

There are currently three main schools of machine learning: connectionism, symbolism, and behaviorism, which mimic the human nervous system, human reasoning, and human behavior respectively. Among them connectionism, represented by neural networks, is dominant; its modern form is known as deep learning. A neural network consists of an input layer, an output layer, and multiple hidden layers. As the number of layers and neurons increases, the network can fit more complex, general tasks. Parameters are adjusted by repeatedly feeding in data until the network approaches an optimal state, and the large number of stacked layers is the origin of the word "deep".
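The layered structure described above can be made concrete with a toy forward pass. The layer sizes, random data, and ReLU activation below are illustrative assumptions, not details from the article.

```python
# Toy network with an input layer, one hidden layer, and an output layer.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(1, 4))            # input layer: a single sample with 4 features

W1 = rng.normal(size=(4, 8))           # parameters from input to hidden layer
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 2))           # parameters from hidden to output layer
b2 = np.zeros(2)

h = np.maximum(0, x @ W1 + b1)         # hidden layer with ReLU activation
y = h @ W2 + b2                        # output layer
print(y)

# Training repeatedly feeds in data and nudges W1, b1, W2, b2 (e.g. via gradient
# descent) until the outputs fit the task - the "continuously adjusting parameters"
# step in the text. Stacking more hidden layers is what makes the network "deep".
```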

Deep learning technology has gone through several evolutions, from the earliest neural networks to feedforward networks, RNNs, CNNs, and GANs, eventually developing into modern large models such as GPT, which is built on the Transformer architecture. The Transformer adds encoding components that map multimodal data (such as audio, video, and images) into numerical token representations, allowing a single architecture to fit virtually any type of data.
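To make the idea of "encoding data into numerical representations" tangible, here is a small sketch. The byte-level tokenization and the tiny random embedding table are toy assumptions; real systems use learned tokenizers and far larger embedding matrices.

```python
# Turning raw data into the numeric token vectors a Transformer consumes.
import numpy as np

text = "AI x Crypto"
token_ids = list(text.encode("utf-8"))        # encode raw bytes as integer tokens

vocab_size, d_model = 256, 8                  # toy vocabulary size and embedding width
embedding = np.random.default_rng(0).normal(size=(vocab_size, d_model))

vectors = embedding[token_ids]                # each token becomes a numeric vector
print(vectors.shape)                          # (len(token_ids), d_model)

# Audio, images, and video can likewise be split into patches or frames and embedded
# into the same kind of vector sequence, which is what lets one architecture handle
# many modalities.
```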

The development of AI has gone through three technological waves:

  1. 1960s: The first wave was triggered by symbolist approaches, which tackled general natural language processing and human-computer dialogue problems; expert systems were born in the same period.

  2. 1997: IBM's Deep Blue defeated the world chess champion, marking the second peak of AI technology.

  3. Since 2006: The "three giants" of deep learning, led by Geoffrey Hinton, proposed the concept of deep learning; algorithms gradually evolved, forming the third technological wave and the heyday of connectionism.

In recent years, several landmark events have emerged in the field of AI, including:

  • 2014: Goodfellow proposed the GAN (Generative Adversarial Network)
  • 2015: OpenAI was founded
  • 2016: AlphaGo defeated Lee Sedol
  • 2017: Google released the Transformer algorithm paper
  • 2018: OpenAI released GPT
  • 2020: OpenAI released GPT-3
  • 2022: OpenAI launched ChatGPT, which quickly became popular
  • 2023: OpenAI released GPT-4, further upgrading ChatGPT's capabilities

Deep Learning Industry Chain

Current large language models are based on deep learning with neural networks. Large models represented by GPT have triggered a new wave of AI enthusiasm, sharply increasing market demand for data and computing power. Below we discuss the composition of the deep learning industry chain, along with the current state, supply-demand relationship, and future trends of its upstream and downstream segments.

The training of Transformer-based large language models (LLMs) is mainly divided into three steps (a schematic sketch in code follows the list):

  1. Pre-training: feed in large amounts of data to search for the best parameter values for the neurons. This stage is the most compute-intensive, since the parameters must be iterated over many times.

  2. Fine-tuning: Train with a small amount of high-quality data to improve model output quality.

  3. Reinforcement learning: build a reward model that ranks the model's outputs and use it to iterate the large model's parameters automatically; human evaluation is sometimes involved as well.
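The sketch below shows the order and role of the three stages. All functions are trivial stand-ins (no real model is trained); the names and structure are assumptions made purely for illustration.

```python
# Schematic outline of the three LLM training stages described above.

def pretrain(corpus):
    """Stage 1: fit parameters on a large raw corpus (the most compute-intensive step)."""
    params = {"weights": len(corpus)}          # placeholder for billions of parameters
    return params

def finetune(params, labeled_examples):
    """Stage 2: adjust the pretrained parameters on a small set of high-quality pairs."""
    params["weights"] += len(labeled_examples)
    return params

def reinforcement_learning(params, reward_model, rounds=3):
    """Stage 3: rank candidate outputs with a reward model and iterate the parameters."""
    for _ in range(rounds):
        score = reward_model(params)           # human feedback can also enter here
        params["weights"] += score
    return params

params = pretrain(corpus=["lots", "of", "raw", "text"])
params = finetune(params, labeled_examples=[("question", "good answer")])
params = reinforcement_learning(params, reward_model=lambda p: 1)
print(params)
```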

The performance of the model is determined mainly by three factors: the number of parameters, the amount and quality of data, and computing power. More parameters raise the upper bound of the model's generalization ability. As a common rule of thumb, pre-training a large model requires roughly 6np FLOPs of compute, where n is the number of training tokens and p is the number of parameters.
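Applied to concrete numbers, the rule of thumb looks like this. The token and parameter counts below are the GPT-3 figures used in the next paragraph; treating the result as exact is of course an approximation.

```python
# Back-of-the-envelope use of the 6*n*p rule of thumb (n = training tokens, p = parameters).
n_tokens = 300e9          # ~300 billion training tokens
n_params = 175e9          # 175 billion parameters
flops = 6 * n_tokens * n_params
print(f"approximate pre-training compute: {flops:.2e} FLOPs")   # ~3.15e+23 FLOPs
```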

Early AI training relied mainly on CPUs for compute, but workloads gradually shifted to GPUs such as NVIDIA's A100 and H100. GPUs perform floating-point operations through their Tensor Core units, and their FP16/FP32 throughput (measured in FLOPS) is a key metric of chip performance.

Taking GPT-3 as an example: it has 175 billion parameters and was trained on roughly 300 billion tokens, so a single pre-training run requires about 3.15×10^23 FLOPs. Even the most advanced GPU chips would need years of compute on a single device, which is why training is distributed across large GPU clusters. As model size increases, the demand for computing power grows dramatically.
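A rough conversion of that compute estimate into wall-clock time is sketched below. The A100's ~312 TFLOPS FP16 peak is a published spec; the 40% utilization figure and the 1,000-GPU cluster size are assumptions chosen only to illustrate the scale.

```python
# Converting the ~3.15e23 FLOPs estimate above into GPU time.
total_flops = 3.15e23
a100_peak = 312e12               # A100 FP16 peak, FLOPs per second (published spec)
utilization = 0.4                # assumed effective utilization during training

seconds = total_flops / (a100_peak * utilization)
print(f"single A100:  ~{seconds / 86400 / 365:.0f} years")
print(f"1,000 A100s:  ~{seconds / 86400 / 1000:.0f} days")
```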

Data storage also poses challenges during training. Because GPU memory is limited, data must be moved frequently between storage, host memory, and the GPU, so chip bandwidth becomes a key factor. When training in parallel across multiple GPUs, the data transfer rate between chips also matters greatly. Raw compute is therefore not the only bottleneck; memory and interconnect bandwidth are often more critical.
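Two quick estimates illustrate why. The parameter count comes from the GPT-3 example above; assuming FP16 storage (2 bytes per parameter), ignoring optimizer state and activations, and using NVLink's ~600 GB/s figure are simplifying assumptions for illustration.

```python
# Why a large model does not fit on one GPU, and why interconnect bandwidth matters.
n_params = 175e9
bytes_per_param = 2                               # FP16 weights only
weights_gb = n_params * bytes_per_param / 1e9
print(f"weights alone: ~{weights_gb:.0f} GB")     # ~350 GB vs. 80 GB on a single A100

nvlink_gb_per_s = 600                             # approximate A100 NVLink bandwidth
print(f"one full weight transfer: ~{weights_gb / nvlink_gb_per_s:.2f} s")
```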

The deep learning industry chain mainly includes the following links:

  1. Hardware (GPU) providers: NVIDIA holds a near-monopoly in the high-end AI chip market, while companies such as Google and Intel are also developing their own AI chips.

  2. Cloud service providers: mainly divided into three categories: traditional cloud vendors (such as AWS and Google Cloud), vertical AI cloud providers (such as CoreWeave), and inference-as-a-service providers (such as Together.ai).

  3. Training data source providers: Provide massive, high-quality, specific data for large models and vertical field models.

  4. Database Providers: Mainly vector databases, used for efficient storage and processing of unstructured data.

  5. Edge Devices: Including energy supply and cooling systems to support the operation of large-scale computing clusters.

  6. Applications: Various AI applications developed based on large models, such as dialogue systems, creative tools, etc.

The Relationship Between Cryptocurrency and AI

The core of blockchain technology is decentralization and trustlessness. Bitcoin created a trustless value transfer system, and Ethereum went further, realizing a decentralized, trustless smart contract platform. In essence, a blockchain network is a value network in which every transaction is a value conversion based on the underlying token.

In the traditional internet, enterprise value is mainly reflected through cash flow and the price-to-earnings ratio. In a blockchain ecosystem, the native token (such as ETH) carries the multidimensional value of the network: it can generate staking rewards and also serve as a medium of value exchange, a store of value, and a consumable for network activity. Token economics defines the relative value of what is settled within the ecosystem. Although each dimension is hard to price individually, the token price is a composite reflection of this multidimensional value.

The appeal of tokens lies in their ability to attach value to any function or idea. Token economics redefines and rediscovers how value is perceived, which matters for every industry, including AI. In the AI industry, issuing tokens can reshape value at each stage of the industry chain and incentivize more participants to dig into niche segments. Tokens not only bring cash flow but can also enhance the value of infrastructure through synergy, forming the "fat protocols, thin applications" paradigm.

The immutability and trustlessness of blockchain also have practical significance for the AI industry. They enable applications that require trust, such as guaranteeing that a model does not leak privacy when it uses a user's data. When GPU supply is tight, computing power can be distributed through a blockchain network; when GPUs are replaced by newer generations, idle older devices can still contribute value. These are advantages unique to a global value network.

Overview of AI Industry Chain Projects in the Cryptocurrency Sector

  1. GPU Supply Side:

Major projects include Render and Golem. Render, a relatively mature decentralized infrastructure project, mainly targets video rendering and other non-large-model tasks. As AI demand grows and GPUs keep iterating, demand for shared GPU computing power may increase, creating an opportunity for value discovery for idle GPUs.

  2. Hardware Bandwidth:

A typical project is Meson Network, which aims to build a global bandwidth-sharing network. However, shared bandwidth may be a pseudo-demand: for high-performance computing clusters, the latency of locally stored data is far lower than that of distributed storage.

  3. Data:

Projects such as EpiK Protocol, Synesis One, and Masa provide data services for AI training. Among them, Masa is built on zero-knowledge proof technology and supports privacy-preserving data collection. The advantage of such projects is that they can collect data broadly and use tokens to incentivize users to contribute it.

  4. ZKML (Zero-Knowledge Machine Learning):

These projects use zero-knowledge proof technology to enable privacy-preserving computation and training. Major projects include Modulus Labs and Giza; general-purpose ZK projects such as Axiom and RISC Zero are also worth watching.

  5. AI Applications:

The main idea is to combine AI capabilities with traditional blockchain applications, for example through AI agents. Fetch.AI is a representative project that uses intelligent agents to help users make complex on-chain decisions.

  6. AI Blockchain:

Adaptive networks built specifically for AI models or agents, such as Tensor, Allora, and Hypertensor. These projects typically use reinforcement-learning-like mechanisms in which on-chain evaluators help improve model parameters.

Summary

Although current AI development centers on deep learning, other potential technical paths remain worth watching. Deep learning may not lead to artificial general intelligence, but it is already widely applied in fields such as recommendation systems and has clear practical value.

Blockchain technology and token economics have brought new value definitions and discovery mechanisms to the AI industry. They can reshape the value of various links in the AI industry chain, incentivize more participants, and achieve efficient resource allocation through a global value network.

However, decentralized GPU networks still lag behind in bandwidth and developer tooling, and are currently best suited to non-urgent, small-model training. For large enterprises and mission-critical workloads, traditional cloud platforms retain the advantage.

Overall, the combination of AI and blockchain has practical utility and long-term potential. Token economics can reshape and uncover broader forms of value, while decentralized ledgers can solve trust problems, enabling value to flow and surplus value to be discovered on a global scale. As the technology advances and the ecosystem matures, the integration of AI and blockchain should bring further innovation and opportunities.
