Samsung archrival discloses more details about key AI memory tech that could end up in Nvidia's purported H300 — HBM4 will keep the same die density but stack 16 layers, delivering a whopping 1.65TB/s of bandwidth, and will be available in 48GB SKUs
South Korean memory giant SK Hynix has been making a number of big announcements in recent months, including its plans to build the world’s largest chip factory and the creation of a mobile storage chip that could make phones and laptops run faster.
The company has also begun collaborating with Taiwanese semiconductor foundry TSMC to develop and produce the next generation of High Bandwidth Memory, known as HBM4, which will significantly boost HPC and AI performance and could end up in Nvidia's purported H300 GPU.