
How Samsung’s HBM2E Flashbolt is Laying the Foundations for the Tech of Tomorrow

Samsung HBM2E Flashbolt Chip front and back.
The memory and data storage sectors have undergone significant change recently, with the industry having changed more in the past 10 years than in the preceding 25. Even more drastic change is expected as the memory sector continues to innovate to keep pace with game-changing developments like AI. Amid this shifting climate, Samsung's HBM2E Flashbolt stands at the head of the memory pack, offering unprecedented bandwidth and next-level power efficiency, all in a compact, easy-to-use format. Having already been adopted by multiple trailblazing manufacturers and institutions, Samsung's premium high-bandwidth memory solutions are laying the foundations upon which the incredible tech innovations of tomorrow will be built.

Understanding HBM

HBM stands for 'high-bandwidth memory', a premium-performance interface for 3D-stacked SDRAM (synchronous dynamic random-access memory). It maximizes data transfer rates in a small form factor that uses less power and has a substantially wider bus than other DRAM solutions. For high-performance computing applications, industries planning to leverage AI, graphics card vendors and advanced networking applications, HBM provides the data-speed increases that are essential to driving these industries forward.

The original HBM standard has since been followed by HBM2 and HBM2E, which allow more DRAM dies to be utilized per stack, increasing capacities across the board. Samsung's HBM2E Flashbolt offers up to 1.3 times the bandwidth of the company's second-generation HBM2 Aquabolt, for instance, as well as per-pin data transfer speeds that are 33 percent faster than those of previous-generation HBM2 solutions.

Next-level Speed and Capacity, in a Compact Package

The HBM2E Flashbolt is the industry's first third-generation, 16GB HBM2E offering.
It delivers next-level specs: memory bandwidth of up to 410 GB/s and per-pin data transfer speeds that reach 3.2 Gb/s, achieved by vertically stacking eight layers of 10nm-class 16Gb DRAM dies. To give an idea of just how fast the HBM2E Flashbolt is, at its per-pin transfer speed of 3.2 gigabits per second the solution could facilitate the transfer of 82 full-HD movies (410 gigabytes of data) in approximately 17 minutes. These rapid speeds mean the HBM2E Flashbolt is ideally positioned to power next-generation graphics systems going forward. The HBM2E Flashbolt additionally offers double the capacity of its predecessor, the HBM2 Aquabolt, all in a compact, energy-efficient format that is ideally suited to space-constrained designs.

Making Waves Throughout the Industry

The ability to underpin next-generation customer systems with rapid speeds and expansive capacities has already made HBM a hit with clients. Nvidia has chosen HBM2 for its Tesla P100 accelerators, which power data centers, while AMD is using it for its Radeon Instinct accelerators and high-end graphics cards. Intel has adopted the technology for its high-performance graphics solutions for mobile PCs, while Rambus and Northwest Logic have embraced HBM2 technology in their memory controllers and high-performance networking chips. A host of other companies have also harnessed HBM2 memory for their networking systems.

Providing a Foundation for the Future

As exciting as HBM's current applications are, it is the future tech it stands to empower that is truly game-changing. AI has emerged as one of the main sectors set to benefit from the production of more advanced memory solutions, with studies revealing that the scope of deep learning can be significantly enhanced by accelerated processing speeds and increases in the amount of memory a system contains.
What's more, Samsung's high-bandwidth memory interfaces and DRAM solutions for servers accelerate data processing and analysis, which in turn empowers computer vision, natural language processing and a host of other capabilities.

Closely linked to AI is the area of 'big data'. As the amount of data generated globally continues to multiply at a dizzying rate, advanced memory solutions and sophisticated AI systems will prove fundamental to enabling the continued analysis and consumption of this valuable information. Solutions like Samsung's HBM2E Flashbolt are also set to empower the next generation of supercomputers, providing the speed and capacity that will be needed as computers continue to become smarter and more powerful. In the same vein, as the high-performance computing (HPC) sector is charged with delivering things like high-quality live streams in real time, premium memory will be essential in giving those solutions the power to keep up with increasingly demanding workloads.
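The headline figures quoted earlier are internally consistent, and the arithmetic is easy to verify. The sketch below assumes the standard 1,024-bit (1,024-pin) interface per HBM stack defined by the JEDEC HBM2 specification; with that bus width, a 3.2 Gb/s per-pin rate yields the quoted ~410 GB/s of aggregate bandwidth, and moving 410 GB over a single 3.2 Gb/s pin takes roughly 17 minutes.

```python
# Verifying the HBM2E Flashbolt bandwidth figures quoted in the article.
# Assumption: a 1,024-bit-wide interface per stack (JEDEC HBM2 standard).

PIN_RATE_GBPS = 3.2      # per-pin transfer rate, gigabits per second
BUS_WIDTH_BITS = 1024    # I/O pins per HBM stack (JEDEC HBM2 interface width)

# Aggregate stack bandwidth: pins * per-pin rate, converted from bits to bytes.
bandwidth_gb_per_s = PIN_RATE_GBPS * BUS_WIDTH_BITS / 8
print(f"Aggregate bandwidth: {bandwidth_gb_per_s:.1f} GB/s")  # 409.6 GB/s, i.e. ~410 GB/s

# The article's illustration: 82 full-HD movies totalling 410 GB of data.
data_gb = 410

# Over a single 3.2 Gb/s pin, that transfer takes about 17 minutes...
seconds_per_pin = data_gb * 8 / PIN_RATE_GBPS
print(f"Time over one pin: {seconds_per_pin / 60:.1f} minutes")  # ~17.1 minutes

# ...while the full stack at 410 GB/s moves the same data in about one second.
print(f"Time at full stack bandwidth: {data_gb / bandwidth_gb_per_s:.2f} s")
```

This also makes clear why the 17-minute figure describes a single pin: at the full aggregate bandwidth of roughly 410 GB/s, the entire 410 GB payload transfers in about one second.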