Nvidia CEO Jensen Huang kicked off the artificial intelligence (AI) giant's GTC 2024 conference with a keynote address on Monday, unveiling the next-generation AI chip that the company hopes will keep Nvidia at the forefront of the AI race.
The new Blackwell B200 is the successor to Nvidia's Grace Hopper line of graphics processing units (GPUs), which power data centers and supercomputers and accelerate computing tasks. Nvidia said the new chip can support AI models with a trillion parameters while reducing operating costs and energy consumption.
In a press release, Nvidia said companies including Amazon Web Services, Dell, Google, Meta, Microsoft, OpenAI, Oracle, Tesla and xAI are expected to adopt Blackwell.
"Hopper is fantastic, but we need bigger GPUs," Huang said during his keynote address, introducing Blackwell as a new "very, very big GPU" that binds two silicon dies, each the size of its predecessor chip, so that they function as a single unit.
Huang said that Blackwell's design allows it to easily replace its predecessor, the Grace Hopper chip: "You slide out Hopper, and you push in Blackwell. That's the reason why one of the challenges of ramping is going to be so efficient."
"There are installations of Hoppers all over the world and they could be the same infrastructure, same design, the power, the electricity, the thermals, the software – identical. Push it right back, and so this is a Hopper version for the current HGX configuration," Huang explained.
Huang brought out a "quite expensive" fully functioning board to show the audience, featuring two Blackwell chips and four Blackwell dies connected to a Grace CPU with a "super fast chip-to-chip link."
"What's amazing is this computer is the first of its kind where this much computation, first of all, fits into this small of a place. Second, it's memory coherent, so it feels like they're just one big happy family working on one application together. So everything is coherent within it," Huang said. "There's a lot of terabytes this, and terabytes that, but this is a miracle."
Huang went on to say that Blackwell's inference capability is "off the charts" — about 30 times that of Hopper. Generative AI tools use inference to respond to user queries in chatbots and image-generation programs.
"Blackwell is going to be just an amazing system for generative AI. And in the future data centers are going to be thought of, as I mentioned earlier, as an AI factory. An AI factory's goal in life is to generate revenues. Generate, in this case intelligence in this facility, not generating electricity as in AC generators of the last industrial revolution, in this industrial revolution the generation of intelligence," Huang said. "The excitement of Blackwell is really off the charts."
Huang told CNBC that the Blackwell chip will be priced between $30,000 and $40,000 — although he later clarified that he "wasn't intending on providing a pricing for a chip" and emphasized that the company's focus is on designing chips and integrating them into data centers more broadly, rather than on individual chips.
Nvidia Chief Financial Officer Colette Kress told the outlet that the company will start shipping Blackwell chips "later this year" and noted that there are likely to be supply constraints after its initial launch.
Nvidia's stock has climbed more than 247% over the last year amid surging demand for generative AI tools and the hardware that powers and trains those programs.
The company's stock was trading near $900 in mid-afternoon trading on Tuesday, up about 1.7% on the day. Nvidia's stock hit an all-time high of $926.69 on March 7.