Intel and the US Government are working on building the greatest GPT ever

Now, this training run is no sprint; it's a marathon. They're starting with 256 nodes out of the roughly 10,000 in the Aurora supercomputer, scaling up slowly and steadily. ScienceGPT is shaping up to be a heavyweight among large language models, with Intel positioning it to go toe-to-toe with the likes of Google's Bard. Ogi Brkic, Intel's VP and GM for data center and HPC solutions, shared that it's a powerhouse mixing text, code, and scientific results to supercharge research.