NVIDIA unveils major advancements in AI technology

Every DGX Cloud instance, according to NVIDIA, is powered by eight of its H100 or A100 GPUs, each with 80GB of VRAM, bringing the node's total memory to 640GB. As you might expect, high-performance storage complements the low-latency fabric connecting the systems. With that much capability on tap, existing DGX customers may find the cloud option more alluring: why spend another $200,000 on a box when a monthly subscription gets you more for less? Oracle Cloud Infrastructure will power DGX Cloud at launch, but NVIDIA says it will expand to Microsoft Azure, Google Cloud, and other providers "soon" after that.

So what should you do with all of that AI horsepower? NVIDIA has also unveiled a simpler way for businesses to build their own large language models (à la ChatGPT) and generative AI. Large corporations including Adobe, Getty Images, and Shutterstock are already using it to create their own LLMs. The service connects directly to DGX Cloud through NeMo, a language-focused offering, and NVIDIA Picasso, an image, video, and 3D service.