Nvidia has revealed the H200 graphics processor, a significant upgrade over its predecessor, the H100, the chip OpenAI used to train the GPT-4 language model. H100 chips carry a price tag ranging from roughly $25,000 to $40,000. The key improvement in the H200 is its next-generation HBM3e memory, with an impressive 141 GB of capacity.
According to Nvidia, the H200 generates output nearly twice as fast as the H100, as demonstrated in testing with Meta's Llama 2 large language model. The H200 is set to hit the market in the second quarter of 2024, positioning it as a direct competitor to AMD's MI300X graphics processor.
Notably, AMD's comparable chip also offers more memory than its predecessors. Nvidia emphasizes that the H200 is compatible with systems built for the H100, so companies won't need to overhaul their server hardware or software to adopt the new chip.