Samsung's 8-layer HBM3E chips clear Nvidia's tests for use, sources say.

A version of Samsung Electronics' fifth-generation high bandwidth memory (HBM) chips, or HBM3E, has passed Nvidia's tests for use in its artificial intelligence (AI) processors, three sources briefed on the results said.
 https://image.nostr.build/1aabc7b8674c94c9acd191305c190dcfa6a2fcd0aaf9fec4a205149e38785566.jpg 
The qualification clears a major hurdle for the world's biggest memory chipmaker, which has been struggling to catch up with local rival SK Hynix in the race to supply the advanced memory chips capable of handling generative AI workloads.
 https://yakihonne.s3.ap-east-1.amazonaws.com/ad6a909b8dfd6e278f94881d83dbd5ad5f9260c7502175059b29042e589fb93c/files/1720502056140-YAKIHONNES3.png  
HBM is a type of dynamic random access memory (DRAM) standard first produced in 2013. It involves vertically stacking chips to save space and reduce power consumption. HBM is a crucial component of the graphics processing units (GPUs) used in AI applications, as it helps process the large amounts of data generated by complex tasks.