Startup Unveils World’s Largest AI Computer Chip
A San Francisco startup has revealed what it claims is the world’s first trillion-transistor chip optimised for artificial intelligence.
Cerebras Systems, a California-based startup, has unveiled a computer chip slightly larger than a standard iPad. The Wafer Scale Engine (WSE) is a single high-speed chip designed to power complex artificial intelligence (AI) systems.
The WSE measures 46,225mm², making it 56.7 times larger than the biggest graphics processing unit (GPU), and it has 3,000 times more high-speed, on-chip memory and 10,000 times more memory bandwidth.
While the most powerful desktop CPUs have about 30 processor cores, and powerful GPUs, which are traditionally used for AI workloads, have as many as 5,000, the WSE has 400,000 cores, all linked to each other by high-bandwidth connections.
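The comparisons above can be sanity-checked with simple arithmetic. A minimal sketch, assuming the "biggest GPU" of the time was NVIDIA's V100 at roughly 815mm² of silicon (that figure is our assumption, not stated in the article):

```python
# Back-of-envelope check of the size and core-count comparisons
# quoted in the article.

WSE_AREA_MM2 = 46_225   # Wafer Scale Engine die area, per Cerebras
GPU_AREA_MM2 = 815      # assumed area of the largest contemporary GPU (V100)

WSE_CORES = 400_000     # cores on the WSE
GPU_CORES = 5_000       # "as many as 5,000" cores on a powerful GPU

area_ratio = WSE_AREA_MM2 / GPU_AREA_MM2
core_ratio = WSE_CORES / GPU_CORES

print(f"area ratio: {area_ratio:.1f}x")   # ~56.7x, matching the article
print(f"core ratio: {core_ratio:.0f}x")
```

Under that assumption the 56.7× area figure checks out almost exactly, and the core count works out to 80 times that of the largest GPU.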
The company says the WSE can dramatically accelerate AI development by processing information and producing answers much faster, thereby reducing time-to-insight. This means researchers can test more theories, use more data and solve more problems faster than ever.
Andrew Feldman, founder and CEO of Cerebras Systems, said: “Designed from the ground up for AI work, the Cerebras WSE contains fundamental innovations that advance the state of the art by solving decades-old technical challenges that limited chip size – such as cross-reticle connectivity, yield, power delivery and packaging.”
According to Cerebras Systems: “With 56.7 times more silicon area than the largest graphics processing unit, the WSE provides more cores to do calculations and more memory closer to the cores, so the cores can operate efficiently.
“Because this vast array of cores and memory are on a single chip, all communication is kept on-silicon. This means the WSE’s low-latency communication bandwidth is immense, so groups of cores can collaborate with maximum efficiency, and memory bandwidth is no longer a bottleneck.”
The company has already started shipping the chips to a small number of customers, but it has yet to reveal how much this unique chip costs.
Dr Ian Cutress, senior editor at the news site AnandTech, told BBC News that while the chip would be able to process data much faster, the speed would come at a price.
“One of the advantages of smaller computer chips is they use a lot less power and are easier to keep cool,” he said.
“When you start to deal with bigger chips like this, companies need specialist infrastructure to support them, which will limit who can use it practically.
“That’s why it’s suited for artificial intelligence development as that’s where the big dollars are going at the moment.”