What do high-performance systems and ultra-bandwidth solutions have in common?

Latest update: 2020-12-01



Author: Spencer Homan, Global Memory Business Manager at Micron


This spring, as a global pandemic shut down most sports, I watched an interesting development in the entertainment industry: video game company EA Sports' Madden simulated an unprecedented NFL competition (courtesy of Bleacher Report). Every team fielded the best players of all time, and these "best of all time" teams played against each other until one was declared the winner.


This piques my interest as a football fan, and it is equally fascinating from a technical perspective to watch simulated events play out without spectators. As video games have improved over the years, graphics, physics, and realism have advanced to the point where pure simulations can draw extremely high viewership on Twitch. As technologies like graphics, artificial intelligence (AI), and deep learning (DL) continue to improve, I predict we will see more simulated entertainment soon.


Ray tracing makes the experience more realistic


In the fall of 2018, ray tracing finally came to gaming, bringing an unprecedented level of realism. Ray tracing is a rendering technique that produces incredibly realistic lighting effects: it traces the paths of light rays to simulate how light interacts with virtual objects in a computer-generated world.
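To make the idea concrete, here is a minimal sketch of the core calculation a ray tracer performs for each pixel: intersect a ray with an object and shade the hit point based on its angle to a light. All scene values and function names here are hypothetical illustrations, not code from any real renderer.

```python
# Minimal illustrative ray-tracing core: one sphere, one point light, Lambert shading.
# All scene values are hypothetical; a real ray tracer adds shadows, reflections,
# acceleration structures, and traces enormous numbers of rays per frame on the GPU.
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c                 # 'direction' is assumed normalized, so a == 1
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0       # nearer of the two intersection points
    return t if t > 0.0 else None

def shade(origin, direction, center, radius, light):
    """Trace one ray: 0 if it misses, otherwise brightness from the angle to the light."""
    t = ray_sphere_hit(origin, direction, center, radius)
    if t is None:
        return 0.0
    hit = tuple(o + t * d for o, d in zip(origin, direction))
    normal = tuple((h - c) / radius for h, c in zip(hit, center))
    to_light = tuple(l - h for l, h in zip(light, hit))
    length = math.sqrt(sum(v * v for v in to_light))
    to_light = tuple(v / length for v in to_light)
    return max(0.0, sum(n * v for n, v in zip(normal, to_light)))   # Lambert term

# One camera ray aimed straight ahead at a sphere sitting 3 units away.
brightness = shade(origin=(0.0, 0.0, 0.0), direction=(0.0, 0.0, 1.0),
                   center=(0.0, 0.0, 3.0), radius=1.0, light=(5.0, 5.0, 0.0))
print(f"pixel brightness: {brightness:.3f}")
```

A real game repeats this kind of intersection-and-shading work for every pixel, every frame, with far more complex scenes, which is where the bandwidth pressure discussed below comes from.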



Ray tracing has been discussed for years, and with the launch of Nvidia's Turing-based graphics cards the technology finally arrived, albeit at a high cost. When Sony and Microsoft release their next-generation consoles this fall, ray tracing will go mainstream, and by the end of 2021 it will be standard for the masses, along with 4K/HDR displays. People always want better game visuals, and to keep up with these high demands we need memory products that can serve ultra-high-bandwidth applications.


New memory emerges


Memory is at the core of technology, the source that brings every product, application, and innovation to life. Memory system bandwidth is the most critical requirement for many of tomorrow's highly integrated applications. Artificial intelligence, machine learning (ML), deep learning, autonomous driving, high-performance computing (HPC), virtual reality (VR), augmented reality (AR), and next-generation games are no longer just ideas on paper. These applications are increasingly being deployed, and each depends on large volumes of data that must be analyzed quickly and repeatedly, an analysis process that consumes an enormous amount of system bandwidth.
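As a rough illustration of why, here is a back-of-envelope sketch of how quickly "analyze a lot of data, repeatedly" adds up to hundreds of gigabytes per second of memory traffic. The workload numbers are hypothetical, not measurements of any particular system.

```python
# Back-of-envelope estimate of how "analyze lots of data, repeatedly" becomes
# hundreds of GB/s of memory traffic. Workload numbers are hypothetical.
def inference_traffic_gbs(params, bytes_per_param, inferences_per_sec):
    """GB/s of weight data streamed from memory, if weights are re-read every inference."""
    return params * bytes_per_param * inferences_per_sec / 1e9

# A hypothetical 50M-parameter FP16 model served at 2,000 inferences per second.
print(f"AI inference: {inference_traffic_gbs(50e6, 2, 2000):.0f} GB/s")

# A hypothetical 4K game: FP16 RGBA frame buffers touched ~10 times per frame at 60 fps.
frame_bytes = 3840 * 2160 * 8              # one 4K frame at 8 bytes per pixel
print(f"4K rendering: {frame_bytes * 10 * 60 / 1e9:.0f} GB/s")
```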


Micron believes 2020 will be a decisive year for next-generation memory designed for high-bandwidth applications. With the advent of GDDR6X, a new class of memory has emerged: Ultra-Bandwidth Solutions, built for applications with the most demanding bandwidth requirements.



Accelerating intelligence requires greater hardware bandwidth


The world is eager to adopt artificial intelligence to transform data into deep insights. Accelerating intelligence requires AI training and inference to run as fast as possible. Applications such as high-performance computing, professional visualization, automotive, and networking all demand systems with the greatest possible hardware bandwidth and speed.


There has always been a performance gap between GDDR and mainstream memory, and that gap continues to widen. The chart below breaks down I/O and system-level performance by category, highlighting some impressive data points for GDDR6X.
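A useful way to read numbers like these: peak system bandwidth is roughly the per-pin data rate multiplied by the width of the memory bus. The sketch below uses data rates representative of GDDR6 and launch-era GDDR6X graphics cards; they are illustrative figures, not official Micron specifications.

```python
# Peak system bandwidth = per-pin data rate x bus width. The data rates below are
# illustrative of GDDR6 and launch-era GDDR6X cards, not official specifications.
def system_bandwidth_gbs(gbps_per_pin, bus_width_bits):
    """Peak memory bandwidth in GB/s for a given per-pin rate and bus width."""
    return gbps_per_pin * bus_width_bits / 8

print(f"GDDR6   14.0 Gb/s x 256-bit: {system_bandwidth_gbs(14.0, 256):.0f} GB/s")
print(f"GDDR6X  19.5 Gb/s x 384-bit: {system_bandwidth_gbs(19.5, 384):.0f} GB/s")
```

The jump in per-pin data rate is what separates GDDR6X from GDDR6 at the system level, roughly doubling the bandwidth available on a wide graphics memory bus.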



GDDR6X is the latest and greatest generation of memory in the Ultra-Bandwidth Solutions portfolio. For a quick overview, watch the GDDR6X video. To learn more, see the GDDR6X blog post, GDDR6X: Reinventing Memory.