DeepBrain Chain was founded in November 2017 with the vision of building an infinitely scalable, distributed high-performance computing network based on blockchain technology, and of becoming the most important infrastructure of the 5G+AI era.
Humanity is moving into the age of intelligence, and artificial intelligence has been integrated into every aspect of people's lives. Artificial intelligence rests on a troika of deep models, big data (the Internet, sensors, IoT), and high-performance computing (GPUs, FPGAs, special-purpose chips). Individual deep models demand ever more computing power:

- ImageNet image recognition: 1~10 GPUs
- AlphaFold/AlphaFold2: 100~200 GPUs
- BERT language model: 100~200 GPUs; using 1,024 TPUs, training time can be shortened to 76 minutes
- GPT-3 language model (OpenAI): ~1,000 GPUs, 175 billion parameters; a single training run costs millions of dollars
- Multimodal large-scale pre-training model (Beijing Academy of Artificial Intelligence, BAAI): ~2,000 GPUs
The artificial intelligence race is a computing power race: balancing the supply and demand of computing power, and rewarding those who provide it, is an urgent problem. DeepBrain Chain aims to build an infinitely scalable, distributed high-performance computing network through blockchain technology in order to reduce the cost and improve the efficiency of AI computing power worldwide, promote the popularization and democratization of AI computing power, and accelerate the arrival of the era of artificial intelligence.