AI and Crypto Assets Integration: Reshaping Value Systems and Industry Chain Patterns
AI x Crypto: From Zero to Peak
Introduction
The artificial intelligence industry has been booming recently and is widely seen as a new industrial revolution. The emergence of large models has significantly improved efficiency across industries, with Boston Consulting estimating that GPT has raised work efficiency in the U.S. by about 20%. The generalization capability of large models is regarded as a new software design paradigm, giving software better performance and broader modality support. Deep learning technology has ushered in the third boom for the AI industry, and this wave has also reached the cryptocurrency industry.
This report will explore in detail the development history of the AI industry, the classification of its technologies, and the impact of deep learning on the industry. It will analyze in depth the deep learning industry chain, covering GPUs, cloud computing, data sources, and edge devices, along with their current status and trends. It will then discuss, in essence, the relationship between cryptocurrency and the AI industry, outlining the landscape of the crypto-related AI industry chain.
The Development History of the AI Industry
The AI industry began in the 1950s. To realize the vision of artificial intelligence, academia and industry have developed various implementation paths under different historical contexts.
Modern artificial intelligence technology primarily adopts "machine learning" methods, which let machines iteratively improve system performance from data. The main steps are feeding data into an algorithm, training a model on that data, testing and deploying the model, and then using it to perform automated prediction tasks.
There are three major schools of machine learning: connectionism, symbolism, and behaviorism, which respectively mimic the human nervous system, thinking, and behavior.
Currently, connectionism, represented by neural networks (also known as deep learning), dominates. A neural network has an input layer, an output layer, and multiple hidden layers. When the number of layers and of neurons (parameters) is large enough, the network can fit complex, general-purpose tasks. By continuously adjusting the parameters, it eventually reaches an optimal state, which is where the "deep" in deep learning comes from.
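To make the structure above concrete, here is a minimal sketch (not from the article) of a feedforward network with one hidden layer, written in plain NumPy; the layer sizes, learning rate, and toy XOR data are arbitrary choices for illustration only.

```python
import numpy as np

# Toy dataset: XOR, a classic task a single-layer model cannot fit.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)

# Input layer (2 units) -> hidden layer (8 units) -> output layer (1 unit).
W1 = rng.normal(scale=0.5, size=(2, 8))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1))
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # Forward pass: each layer applies weights, biases, and a nonlinearity.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: gradients of the squared error, computed by hand.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # "Continuously adjusting parameters": a plain gradient-descent update.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(np.round(out, 2))  # should approach [[0], [1], [1], [0]]
```

Stacking more hidden layers and neurons in exactly this fashion is what puts the "deep" in deep learning.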
Deep learning technology has undergone multiple iterations, evolving from early neural networks to feedforward neural networks, RNNs, CNNs, and GANs, and finally into modern large models such as the Transformer architecture used by GPT. The Transformer is one evolutionary direction of neural networks: it adds a converter that encodes data of various modalities into numerical representations before they are fed into the network, enabling multimodal processing.
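As a rough, hedged illustration of "encoding data into numerical representations," the sketch below shows the usual text path: split the input into tokens, map each token to an integer ID, and look the IDs up in an embedding table to obtain vectors a network can consume. The vocabulary and embedding size are toy values of my own, not from the article.

```python
import numpy as np

# Toy vocabulary: real systems learn one (e.g. via byte-pair encoding)
# with tens of thousands of entries.
vocab = {"<unk>": 0, "ai": 1, "and": 2, "crypto": 3, "reshape": 4, "value": 5}

def tokenize(text: str) -> list[int]:
    """Map each whitespace-separated word to an integer token ID."""
    return [vocab.get(word, vocab["<unk>"]) for word in text.lower().split()]

# Embedding table: one trainable vector per token ID (toy size: 8 dimensions).
rng = np.random.default_rng(0)
embedding_table = rng.normal(size=(len(vocab), 8))

token_ids = tokenize("AI and crypto reshape value")
vectors = embedding_table[token_ids]   # shape: (5 tokens, 8 dims)

print(token_ids)      # [1, 2, 3, 4, 5]
print(vectors.shape)  # (5, 8) -- numeric input ready for the network
```

Other modalities (images, audio, and so on) follow the same pattern through their own encoders, all ending in numeric vectors.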
The development of AI has gone through three technological waves:
In the 1960s, the development of symbolic technology triggered the first wave, which aimed to address general natural language processing and human-computer dialogue. Expert systems were born in the same period.
In 1997, IBM's Deep Blue defeated the world chess champion, marking the second peak of AI technology.
In 2006, the concept of deep learning was proposed, marking the beginning of the third technological wave. Deep learning algorithms have continuously evolved, from RNNs and GANs to Transformers and Stable Diffusion, ushering in a prosperous era for connectionism.
Many iconic events emerged during this third wave.
Deep Learning Industry Chain
Current large language models mainly adopt deep learning methods based on neural networks. The large models led by GPT have sparked a new wave of AI enthusiasm, with a surge in market demand for data and computing power. This section explores the composition and development status of the deep learning algorithm industry chain.
The training of large models is mainly divided into three steps:
Pre-training: feed large amounts of data to search for the optimal parameters; this is the most computationally intensive step.
Fine-tuning: train on a small amount of high-quality data to improve model quality.
Reinforcement learning: build a reward model to score output quality and use it to automatically iterate the model's parameters (a toy sketch of all three stages follows below).
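Here is a minimal sketch of the three stages, using a toy linear model and synthetic data so the flow stays readable; the function names, dataset sizes, and hyperparameters are illustrative assumptions, not a real LLM training recipe.

```python
import numpy as np

rng = np.random.default_rng(0)

def sgd_step(w, X, y, lr=0.01):
    """One gradient-descent step on squared error for a linear model."""
    grad = 2 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

# Stage 1 -- Pre-training: large, noisy corpus; most of the compute happens here.
X_pre = rng.normal(size=(10_000, 4))
y_pre = X_pre @ np.array([1.0, -2.0, 0.5, 3.0]) + rng.normal(scale=1.0, size=10_000)
w = np.zeros(4)
for _ in range(2_000):
    w = sgd_step(w, X_pre, y_pre)

# Stage 2 -- Fine-tuning: a small, high-quality dataset nudges the same parameters.
X_ft = rng.normal(size=(200, 4))
y_ft = X_ft @ np.array([1.0, -2.0, 0.5, 3.0])     # clean labels, no noise
for _ in range(200):
    w = sgd_step(w, X_ft, y_ft, lr=0.005)

# Stage 3 -- Reinforcement learning: a reward model scores candidate outputs,
# and parameters are iterated toward higher-reward behavior (toy hill-climbing).
def reward_model(pred, target):
    """Toy reward: higher when predictions are closer to the target."""
    return -np.mean((pred - target) ** 2)

best_w, best_r = w, reward_model(X_ft @ w, y_ft)
for _ in range(500):
    candidate = best_w + rng.normal(scale=0.01, size=4)   # perturb parameters
    r = reward_model(X_ft @ candidate, y_ft)
    if r > best_r:                                        # keep improvements
        best_w, best_r = candidate, r

print(np.round(best_w, 2))  # should approach [1.0, -2.0, 0.5, 3.0]
```

The point of the sketch is the shape of the pipeline: the same parameters pass through a compute-heavy pre-training stage, a small high-quality fine-tuning stage, and a reward-driven iteration stage.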
The performance of large models is mainly influenced by three aspects: the number of parameters, the quality and quantity of data, and computational power. An empirical formula can be used to estimate the required computational effort.
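The article does not spell out which formula it means; one widely cited rule of thumb (my assumption that this is the intended one) estimates total training compute from the parameter count and the number of training tokens:

$$
C \approx 6 \cdot N \cdot D
$$

where $C$ is the training compute in FLOPs, $N$ the number of model parameters, and $D$ the number of training tokens; the factor of 6 counts roughly 2 FLOPs for the forward pass and 4 for the backward pass per parameter per token.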
Computing power mainly uses GPU chips, such as Nvidia's A100, H100, etc. GPUs perform floating-point calculations through Tensor Core modules, and the chip performance is mainly assessed in terms of FLOPS at FP16/FP32 precision.
Training large models requires massive computing power and storage. Taking GPT-3 as an example: with 175 billion parameters and about 180 billion tokens of training data, a single pre-training run takes roughly 584 days. GPT-4's parameter count and data volume each grew by about 10 times, requiring roughly 100 times the computing power.
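Plugging the article's GPT-3 figures into the rule of thumb above gives a sense of scale. The A100 throughput and the assumption of full utilization below are mine, and actual wall-clock time depends on how many chips run in parallel:

```python
# Rough training-compute estimate using C ≈ 6 * N * D (an approximation).
N = 175e9            # parameters (GPT-3, per the article)
D = 180e9            # training tokens (per the article)
C = 6 * N * D        # total FLOPs for one pre-training run
print(f"{C:.2e} FLOPs")                # ~1.9e23 FLOPs

# Assumed hardware: one NVIDIA A100 at ~312 TFLOPS FP16 (dense), 100% utilization.
a100_flops = 312e12
single_gpu_days = C / a100_flops / 86_400
print(f"{single_gpu_days:,.0f} A100-days on a single chip")  # ~7,000 GPU-days

# Real clusters split this across many GPUs at well below 100% utilization,
# so wall-clock figures such as the 584 days quoted above depend heavily on
# cluster size and utilization.
```

This also shows why the GPT-4 comparison works out the way the article states: 10x parameters and 10x data multiply to roughly 100x the compute.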
The industry chain mainly includes GPU hardware providers, cloud computing services, training data sources, and edge devices.
The Relationship between Crypto and AI
Blockchain technology, aided by ZK, has evolved into a decentralized and trustless paradigm. In essence, a blockchain is a value network in which every transaction is a value conversion based on the underlying token.
Token economics can endow the network with multidimensional value, far exceeding that of traditional corporate securities. Tokens allow any innovation and idea to be assigned value.
In the AI industry, token economics can reshape the value of various links in the supply chain and incentivize more participation. The immutability and trustless characteristics of blockchain technology can also enable some AI applications that require trust.
In summary, token economics promotes value reconstruction and discovery, decentralized ledgers solve trust issues, and value flows back across the global industry chain.
Overview of the Value Chain in the Crypto Industry
GPU Supply Side
Representative projects include Render. The GPU cloud computing power market serves not only AI model training and inference but also traditional rendering tasks, reducing the risk of depending on a single market.
It is expected that the demand for GPU computing power will be approximately $75 billion in 2024, reaching $773 billion by 2032, with a CAGR of 33.86%.
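As a quick sanity check on these figures (using only the numbers quoted above), compounding $75 billion at the stated CAGR for the eight years from 2024 to 2032 does land at roughly $773 billion:

```python
# Verify the quoted growth figures: $75B growing at 33.86% per year, 2024 -> 2032.
start, cagr, years = 75e9, 0.3386, 8
end = start * (1 + cagr) ** years
print(f"${end / 1e9:.0f}B")   # -> $773B, matching the quoted 2032 figure
```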
As GPUs iterate, large numbers of idle GPUs can realize long-tail value in shared networks. However, on-chain GPU sharing faces data transmission bandwidth issues.
Hardware Bandwidth
Representative projects include Meson Network. However, shared bandwidth may be a pseudo-demand, because geographic dispersion introduces latency higher than that of local storage.
Data
Representative projects include EpiK Protocol, Synesis One, Masa, etc. The advantage of Web3 data providers lies in broader data collection channels. Projects in the ZK direction, such as Masa, have good prospects.
ZKML
Uses zero-knowledge proofs and homomorphic encryption technology to achieve privacy-preserving computation and training. Representative projects include Axiom, Risc Zero, Ritual, etc.
AI Applications
Mainly traditional blockchain applications combined with automation and generalization capabilities. AI agents are becoming an important direction, with representative projects such as Fetch.AI.
AI Public Chain
Adaptive networks built specifically for AI models or agents, such as Tensor, Allora, etc. Built on token economics, they can significantly reduce inference costs.
Summary
Although deep learning technology is not the only direction for AI development, it already has practical application scenarios. Token economics can reshape the value of the AI industry chain, and blockchain technology can solve trust issues.
Although GPU-sharing platforms can utilize idle computing power to reduce costs, bandwidth constraints limit them to non-urgent training of small models.
Overall, the combination of AI and Crypto has practical utility: it can reshape value systems, address trust issues, and uncover residual value.