Now that Nvidia has finally lifted the curtain on its consumer Ampere cards, Micron has officially announced its new GDDR6X memory (previously the subject of leaks) and provided technical details on why you should care. In short, it's a much faster memory standard than GDDR6.
What it mostly boils down to is the adoption of four-level pulse amplitude modulation (PAM4), a multi-level signaling technology that's been around for several years and used in areas like high-end networking (think: 200 and 400 gigabit Ethernet solutions).
"Our multilevel signaling innovation in GDDR6X has shattered conventional bandwidth limits, clocking record-breaking speeds," said Tom Eby, senior vice president and general manager of the compute and networking business unit at Micron. "Unlike traditional memory, GDDR6X has unparalleled data rates that can keep pace with gaming innovation and data-hungry applications—setting a new standard for graphics memory."
The adoption of PAM4 effectively doubles the number of signal states on the GDDR6X memory bus. Whereas traditional GDDR6 relies on two signal levels to transmit data as ones or zeroes, capping per-component bandwidth at 64GB/s, PAM4 uses four distinct levels to transmit two bits of data per symbol to and from the memory.
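The encoding idea can be illustrated with a toy sketch: map each pair of bits to one of four amplitude levels, so every transmitted symbol carries two bits instead of one. This is only an illustration of the principle, not Micron's actual line coding; the level values and the Gray-coded bit mapping here are placeholder assumptions.

```python
# Toy illustration of PAM4 vs. two-level (NRZ) signaling.
# NRZ sends 1 bit per symbol; PAM4 maps each 2-bit pair to one of
# four amplitude levels, doubling the bits carried per symbol.
# The level values (0..3) are arbitrary placeholders, not real voltages.

PAM4_LEVELS = {(0, 0): 0, (0, 1): 1, (1, 1): 2, (1, 0): 3}  # Gray-coded pairs
LEVEL_TO_BITS = {level: pair for pair, level in PAM4_LEVELS.items()}

def pam4_encode(bits):
    """Encode an even-length bit sequence into a list of PAM4 levels."""
    assert len(bits) % 2 == 0, "PAM4 consumes bits two at a time"
    return [PAM4_LEVELS[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

def pam4_decode(symbols):
    """Recover the original bit sequence from PAM4 levels."""
    return [bit for s in symbols for bit in LEVEL_TO_BITS[s]]

bits = [1, 0, 0, 1, 1, 1, 0, 0]
symbols = pam4_encode(bits)
assert pam4_decode(symbols) == bits          # lossless round trip
assert len(symbols) == len(bits) // 2        # half the symbols vs. NRZ
```

The Gray coding (adjacent levels differ by one bit) is a common choice in real PAM4 links because it limits the damage of a one-level detection error, though GDDR6X's exact coding details aren't public.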
"As a result, Micron’s GDDR6X dramatically increases memory bandwidth to 84GB/s for each component, translating to system bandwidth of up to 1TB/s—rates once thought impossible," Micron explains.
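The quoted figures are internally consistent, which a quick back-of-the-envelope check shows. The assumptions here (32 data pins per component, a 384-bit bus made of twelve such components) are standard for GDDR6-class memory, not numbers from Micron's announcement.

```python
# Sanity check of Micron's bandwidth figures, assuming a 32-bit-wide
# memory component (standard for GDDR6/GDDR6X).
pins = 32                              # data pins per component
rate_gbps = 21                         # per-pin rate in Gb/s (top GDDR6X speed)
per_component_GBps = pins * rate_gbps / 8
assert per_component_GBps == 84.0      # matches the quoted 84GB/s per component

# A 384-bit bus uses twelve 32-bit components:
system_GBps = 12 * per_component_GBps
assert system_GBps == 1008.0           # roughly the claimed 1TB/s
```

The same arithmetic with GDDR6's 16Gb/s per pin gives 32 × 16 / 8 = 64GB/s, the per-component ceiling mentioned above.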
Two of Nvidia's newly unveiled Ampere cards make the jump to Micron's GDDR6X memory: the GeForce RTX 3090 priced at $1,499 and the GeForce RTX 3080 priced at $699. The GeForce RTX 3070 priced at $499 sticks with regular GDDR6.
For the time being, Micron is mass producing GDDR6X in 8-gigabit (Gb) density, at speeds of 19-21Gb/s per pin. Next year, however, Micron plans to double the density to 16Gb, perhaps paving the way for "Ti" or "Super" upgrades to Nvidia's latest-generation GeForce RTX cards (though that's purely speculation).
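Density here is per component and quoted in gigabits, so a quick conversion shows what the roadmap means in capacity terms. The per-card component counts below are illustrative inferences, not figures from Micron's announcement.

```python
# 8Gb (gigabits) of density per component = 1GB (gigabyte) per component.
gb_per_component = 8 / 8
assert gb_per_component == 1.0

# So a card's GDDR6X capacity implies its component count: a hypothetical
# 10GB card would need ten 8Gb components. Doubling density to 16Gb would
# halve that count, or double capacity with the same number of components.
assert 10 / gb_per_component == 10.0
assert 16 / 8 == 2.0
```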