Everyone knows that artificial intelligence is changing businesses and technology very quickly. But fewer people understand how much strain AI puts on computer memory and storage. Huawei, a leading tech giant, has released several new solid-state drives built for AI, showing a new way to handle huge amounts of data without depending only on expensive high-bandwidth memory (HBM).
Let us talk about what is new, why it matters, and how the “secret sauce” in Huawei’s drives is supposed to change the game.
What’s all the fuss about MORE memory?
AI models, especially large ones such as those used in speech, image recognition, or translation, need extremely fast access to large amounts of memory to train and run properly. This memory cannot be just standard RAM. High-bandwidth memory, or HBM, is very quick and well suited to AI tasks. The problem is that HBM is expensive, difficult to get, and when used in large amounts it makes the hardware costly and hard to upgrade.
Chinese companies and many others around the world now face more restrictions on getting advanced HBM chips. This makes it imperative to find new ways to deliver high performance using more available technology. This is where Huawei’s new OceanDisk AI SSDs come into play.
What exactly is an AI SSD?
For those of you who do not know, SSD stands for solid-state drive. It is a type of storage that is much faster than old-fashioned hard disks. AI SSDs are a new category, designed to work closely with memory to boost data handling for artificial intelligence tasks. Huawei announced three new models in its OceanDisk series: the OceanDisk EX, the OceanDisk SP, and the OceanDisk LC 560. The company launched these drives with the aim of breaking through the “memory wall” and the “capacity wall,” problems that slow down AI training and make systems more expensive and harder to use.
What is the USP of Huawei’s new AI SSD?
First, the numbers are huge. The LC 560 has a mind-blowing single-drive capacity of 245TB, the largest of any enterprise SSD announced to date. Most people are used to seeing 1TB or even 4TB SSDs in laptops and desktops. For AI training, larger drives mean you can keep massive datasets local, without needing to split data or constantly move files across networks.
The LC 560 offers a read bandwidth of almost 15GB/s, which is very high for an SSD. This means data moves quickly from storage to memory, saving time during AI tasks. The EX model is about “extreme performance,” with random-write performance of up to 1,500K IOPS, latency under 7 microseconds, and endurance of 60 drive writes per day. This matters when you need to read and write large files many times while training artificial intelligence models.
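To get a rough sense of what those figures mean in practice, here is a back-of-envelope calculation. The 245TB capacity and the roughly 15GB/s read bandwidth come from the article; the 10TB shard size is an illustrative assumption, and the math ignores real-world overheads such as file-system and queuing delays.

```python
# Back-of-envelope: idealized time to stream data at a sustained
# sequential read bandwidth. Ignores file-system and queuing overhead.

def read_time_seconds(dataset_bytes: int, bandwidth_bytes_per_s: float) -> float:
    """Time to read `dataset_bytes` at a constant bandwidth, in seconds."""
    return dataset_bytes / bandwidth_bytes_per_s

TB = 10**12
GB = 10**9

# Streaming an entire 245 TB drive at ~15 GB/s:
full_drive = read_time_seconds(245 * TB, 15 * GB)
print(f"Full 245 TB drive at 15 GB/s: {full_drive / 3600:.1f} hours")

# A hypothetical 10 TB training shard, for comparison:
shard = read_time_seconds(10 * TB, 15 * GB)
print(f"10 TB shard at 15 GB/s: {shard / 60:.1f} minutes")
```

Even under these ideal assumptions, draining the whole drive takes about four and a half hours, which shows why bandwidth matters as much as raw capacity.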
The SP model of the OceanDisk line focuses on cost, delivering lower speeds but making the solution affordable for more customers. It is built for “inference scenarios,” meaning it is not for training new models but for running existing models more efficiently. Huawei claims it reduces first-token latency by 75 percent and doubles throughput. In simple words, it can respond to AI requests more quickly and handle more of them with the same hardware.
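To make those two claims concrete, the tiny calculation below applies them to a baseline. Only the percentages come from the article; the baseline latency and request rate are hypothetical numbers chosen for illustration.

```python
# Applying Huawei's claimed improvements for the SP model to a
# HYPOTHETICAL baseline. Only the 75% and 2x figures are from the
# announcement; the baseline numbers are made up for illustration.

baseline_first_token_ms = 800.0   # hypothetical: time to first token
baseline_requests_per_s = 50.0    # hypothetical: served request rate

claimed_latency = baseline_first_token_ms * (1 - 0.75)  # 75% reduction
claimed_throughput = baseline_requests_per_s * 2        # doubled

print(f"First-token latency: {baseline_first_token_ms} ms -> {claimed_latency} ms")
print(f"Throughput: {baseline_requests_per_s} req/s -> {claimed_throughput} req/s")
```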
Now, let us take a look at what this ‘secret sauce’ is all about.
Beyond hardware, Huawei has also introduced a piece of software called DiskBooster. This driver is designed to combine, or “pool,” memory across different types, including HBM, SSD, and DDR (regular RAM). According to the company, DiskBooster can expand pooled memory twentyfold by allowing SSDs and HBM to work together.
This is important because AI workloads often hit the limits of available memory. If memory can be pooled effectively, you can handle larger AI models and tasks with fewer hardware changes. DiskBooster coordinates data across SSDs and HBM, moving what is needed in and out depending on demand.
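The general idea behind this kind of pooling can be sketched with a simple two-tier cache: a small fast tier (think HBM or DRAM) backed by a large slow tier (think SSD), with hot data kept fast and cold data spilled. To be clear, this is not Huawei's DiskBooster API, which has not been published; the class and names below are illustrative assumptions.

```python
# A minimal sketch of memory tiering -- the general concept behind
# pooling HBM, DRAM, and SSD capacity. NOT Huawei's DiskBooster API;
# all names here are hypothetical.
from collections import OrderedDict

class TieredPool:
    """Small fast tier (HBM/DRAM-like) backed by a big slow tier (SSD-like).
    Hot items stay in the fast tier; the coldest spill to the slow tier."""

    def __init__(self, fast_capacity: int):
        self.fast = OrderedDict()   # LRU order: oldest entry first
        self.slow = {}
        self.fast_capacity = fast_capacity

    def put(self, key, value):
        self.fast[key] = value
        self.fast.move_to_end(key)  # mark as most recently used
        self._evict()

    def get(self, key):
        if key in self.fast:            # fast-tier hit
            self.fast.move_to_end(key)
            return self.fast[key]
        value = self.slow.pop(key)      # miss: promote from slow tier
        self.put(key, value)
        return value

    def _evict(self):
        while len(self.fast) > self.fast_capacity:
            cold_key, cold_val = self.fast.popitem(last=False)
            self.slow[cold_key] = cold_val   # spill coldest entry

pool = TieredPool(fast_capacity=2)
pool.put("layer0", "weights0")
pool.put("layer1", "weights1")
pool.put("layer2", "weights2")   # evicts layer0 to the slow tier
print(sorted(pool.fast), sorted(pool.slow))  # ['layer1', 'layer2'] ['layer0']
print(pool.get("layer0"))        # "weights0", promoted back; layer1 spills
```

A real driver adds asynchronous prefetching and works at page granularity, but the demand-driven movement between tiers is the same basic mechanism.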
Another included technology is called “multi-stream.” It aims to cut down write amplification, the extra internal writing that wears out SSDs faster. By controlling how data with different lifetimes is placed on the disk, the drives last longer and deliver better value.
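Write amplification has a standard metric: the ratio of bytes the NAND physically writes to bytes the host actually asked to write, where 1.0 is ideal. The function below computes that ratio; the sample figures are hypothetical, chosen only to show the kind of gap multi-stream placement tries to close.

```python
# Write amplification factor (WAF): physical NAND writes per byte of
# host writes. Multi-stream placement groups data with similar
# lifetimes so garbage collection copies less. Figures are illustrative.

def write_amplification(host_bytes_written: int, nand_bytes_written: int) -> float:
    """WAF = NAND bytes written / host bytes written (ideal = 1.0)."""
    return nand_bytes_written / host_bytes_written

# Mixing short- and long-lived data in one stream forces extra
# garbage-collection copies (hypothetical numbers):
print(write_amplification(100, 350))  # 3.5: the flash wears 3.5x faster

# Separating lifetimes into streams lets blocks free up cleanly:
print(write_amplification(100, 120))  # 1.2: close to ideal
```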
Why does this reveal matter for China?
Because of continued restrictions on buying advanced HBM chips from the United States and other suppliers, Chinese companies like Huawei have had to rethink their approach. Instead of relying only on imported chips, they are putting their faith in domestic NAND flash technology. NAND is what most SSDs use to store data, and making it more effective for AI is a strong step toward self-reliance.
Still, experts say these SSDs are not a full replacement for HBM. Instead, they work as a partial alternative, acting as a supplement in systems where memory and storage must balance each other out. The real solution is to mix several layers of technology so that limitations in one area do not stop progress in AI.