
Samsung takes important step to make HBM4 memory for AI chips


Despite being the world’s biggest memory chip maker, Samsung has fallen behind its smaller rivals Micron and SK Hynix in the race to supply memory for AI chips. It is still unable to supply HBM3E memory for Nvidia’s flagship AI processors. However, the company has taken a critical step toward its future by ordering equipment to make HBM4 memory.

Samsung orders equipment to make 1c DRAM used for HBM4 memory

According to a report from ZDNet, Samsung has begun ordering equipment to manufacture 1c DRAM. The equipment will be used to build a mass production line at the P4 plant on the company’s Pyeongtaek campus.

Samsung is said to be developing HBM4 memory using 1c DRAM, a process node one generation ahead of what its rivals plan to use. Micron and SK Hynix have reportedly decided to use 1b DRAM for their HBM4 chips. Samsung aims to make up for the time it lost in the HBM3 and HBM3E segments over the past few years.

The South Korean chip giant has reportedly started discussions with its partners, and construction of the new production line is expected to be completed in mid-2025.

Meanwhile, the company had been testing 8-layer and 12-layer HBM3E memory with Nvidia but couldn’t supply those chips due to performance issues. Samsung is now reportedly making slight modifications to those chips so it can begin providing HBM3E memory to Nvidia.

SK Hynix is aiming to complete the tape-out process for its HBM4 memory by the fourth quarter of this year. Tape-out refers to handing off the final chip design to the manufacturing team.

Once the tape-out is complete, manufacturing tests can begin, and if everything goes to plan, mass production follows. Since SK Hynix will use the more mature 1b DRAM node for its HBM4 chips, its production could be more stable.

Hopefully, Samsung won’t face the same performance issues with its HBM4 chips that it did with its HBM3 and HBM3E chips. Market analysis firms expect demand for HBM memory to remain high next year, so it is important for Samsung to perform well in this segment.
