Samsung missed out on Nvidia’s most expensive AI card but beats Micron to 36GB HBM3E memory — could this new tech power the B100, the successor of the H200?

Samsung says it has developed HBM3E 12H, the industry's first 12-stack HBM3E DRAM, outpacing Micron Technology and potentially setting the stage for the next generation of Nvidia's AI cards.

The South Korean tech giant's HBM3E 12H offers bandwidth of up to 1,280GB/s and an industry-leading capacity of 36GB, more than a 50% improvement over the 8-stack HBM3 8H.
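
For context, here is a quick back-of-the-envelope check of that improvement figure. The HBM3 8H baseline numbers below (24GB capacity and roughly 819.2GB/s bandwidth) are not stated in the article and are assumed from commonly cited HBM3 specifications:

```python
# Rough sanity check of the ">50% improvement" claim.
# Assumed HBM3 8H baseline: 24GB (8 stacks of 3GB dies) and ~819.2GB/s
# (6.4Gbps per pin across a 1,024-bit interface).

hbm3_8h = {"capacity_gb": 24, "bandwidth_gbps": 819.2}   # assumed baseline
hbm3e_12h = {"capacity_gb": 36, "bandwidth_gbps": 1280}  # figures from Samsung

for spec in ("capacity_gb", "bandwidth_gbps"):
    gain = (hbm3e_12h[spec] - hbm3_8h[spec]) / hbm3_8h[spec] * 100
    print(f"{spec}: {hbm3_8h[spec]} -> {hbm3e_12h[spec]} (+{gain:.0f}%)")

# capacity_gb: 24 -> 36 (+50%)
# bandwidth_gbps: 819.2 -> 1280 (+56%)
```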


