What happens in the blink of an eye? Consider everything high-performance computers can accomplish in microseconds. They can analyze business and scientific data for applications ranging from 8K broadcasting and next-generation cloud services to new approaches in deep learning, next-level computer-generated imagery and, now, even the fight against the coronavirus pandemic. At the center of these advancements, blazing-fast HBM2 (High Bandwidth Memory 2) and, in the coming weeks, its successor HBM2E, Flashbolt, are becoming linchpins for AI-enabled data and graphics.
Powering Technology Pioneers
HBM has been attracting a growing field of data-intensive market innovators ever since HBM2 was approved as an industry standard in January 2016. Memory leader Samsung Electronics started manufacturing HBM2 that same month, and is now taking the next major industry leap by rolling out HBM2E.
Industry pioneers are already sampling HBM2E for a variety of high-end AI-based HPC (High-Performance Computing) applications. Their revolutionary efforts will have a lasting impact around the world.
This big statement is inspired by everything we’ve seen HBM2 enable. In March 2020, ZDNet reported that the world’s fastest supercomputer, IBM Summit, was being deployed in the fight against the COVID-19 pandemic. Located at the Oak Ridge National Laboratory, Summit uses HBM2 in its memory arsenal. The supercomputer “has been simulating more than 8,000 compounds to screen for those that are most likely to bind to the main ‘spike’ protein of the coronavirus, rendering it unable to infect host cells.”
Additionally, NVIDIA® has been supplying Tesla® V100 accelerators with HBM2 for use in a number of cutting-edge datacenters to deliver high levels of performance for deep learning and cloud computing. AMD is using HBM2 in its Radeon Instinct™ compute cards to deliver comparable performance for similar applications as well as ultra-scalable rendering systems.

Furthermore, the Intel® Stratix® 10 FPGA incorporates Samsung HBM2 to address a host of end markets and applications, including datacenter, HPC, 8K broadcast, wireline networking and data analytics.
Consumers are getting the benefit of AI-based analytics at the forefront of highly sophisticated health and wealth-building applications. Similarly, thanks to high bandwidth memory solutions, vivid 8K imagery in many areas of gaming, CAD/CAM and educational electronics is becoming commonplace.
When it comes to graphics, HBM2-powered GPUs are doing a stellar job of handling the image complexity of graphics-intensive applications. In fact, thanks to HBM2, digital artists can significantly reduce the time required to render complex scenes. Users are also better able to manipulate 3D objects and navigate demanding workloads more smoothly.
Any way you look at it, data-intensive analytics are embracing the best of high-bandwidth memory to solve today’s challenging problems. This dynamic shift is also ready to occur in the automotive sector, where HBM2E and inference-driven AI are primed to become an essential part of the future of autonomous vehicles.
Future Market Growth
The future for HBM appears bright indeed. Overall, IDC is forecasting the global AI market to approach $98 billion by 2023, with a compound annual growth rate of 28.5%.
With HBM2 and now HBM2E entrenching themselves as the fastest form of DRAM available, AI will be able to make significant inroads across the premium analytics domain. Over the next few years, this highly advanced technological integration will only accelerate as Samsung leads with the best and the most reliable high bandwidth memory around.
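To put the bandwidth claim in rough perspective, peak per-stack throughput can be estimated from the per-pin data rate and HBM’s 1,024-bit interface. The sketch below uses approximate, publicly reported pin speeds for Samsung’s HBM2 (“Aquabolt”, 2.4 Gb/s) and HBM2E Flashbolt (3.2 Gb/s); the exact figures for a given product may differ.

```python
def stack_bandwidth_gbps(pin_speed_gbps: float, bus_width_bits: int = 1024) -> float:
    """Peak per-stack bandwidth in GB/s: pin speed (Gb/s) x bus width (bits) / 8 bits per byte."""
    return pin_speed_gbps * bus_width_bits / 8

# Approximate published pin speeds (assumed here for illustration):
print(stack_bandwidth_gbps(2.4))  # HBM2 "Aquabolt": ~307 GB/s per stack
print(stack_bandwidth_gbps(3.2))  # HBM2E Flashbolt: ~410 GB/s per stack
```

A GPU or FPGA package typically combines several such stacks, which is how aggregate memory bandwidth in the terabyte-per-second range is reached.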
Learn more on our high-bandwidth memory landing page. Also check out a recent Samsung session at Nvidia’s GTC conference.
All product and company names are designations, trademarks™ or registered® trademarks of their respective holders.