NVIDIA shows off future of 'AI compute' with silicon photonics, 3D GPU + DRAM stacking

NVIDIA's vision of the future of AI compute: a silicon photonics interposer, GPU 'tiers' that hint at GPU-on-GPU stacking, and 3D stacked DRAM in the future.


NVIDIA's vision of the future of AI compute has been teased: the AI GPU leader sees the use of a silicon photonics interposer, 3D stacked DRAM, and GPU 'tiers' that hint at GPU-on-GPU tech in the future. In a new post on X, Ian Cutress reports that NVIDIA's concept has many different layers: module-level cooling with a cold plate, die-to-die and tier-to-tier electrical connects, 3D stacked fine-grained DRAM, vertical power delivery, integrated silicon photonics, and an advanced package substrate. Cutress adds that NVIDIA teased a silicon photonics interposer with intra-chip and inter-chip SiPh links, 12 SiPh connects with 3 per GPU tile, 4 GPU tiles per tier, GPU 'tiers' as the image above shows, and fine-grained 3D stacked DRAM with 6 stacks per tile.
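
To make the reported figures concrete, here is a minimal back-of-the-envelope sketch in Python that models one GPU 'tier' using only the numbers from Cutress' post (4 GPU tiles per tier, 3 SiPh connects per tile, 6 DRAM stacks per tile). The GpuTier class, its field names, and the idea of tallying totals this way are illustrative assumptions, not anything NVIDIA has published.

```python
# Back-of-the-envelope sketch of one GPU 'tier' as teased by NVIDIA.
# Only the per-tile counts come from the report; the structure is assumed.
from dataclasses import dataclass


@dataclass
class GpuTier:
    gpu_tiles: int = 4       # 4 GPU tiles per tier (reported)
    siph_per_tile: int = 3   # 3 silicon photonics connects per GPU tile (reported)
    dram_per_tile: int = 6   # 6 fine-grained 3D stacked DRAM per tile (reported)

    @property
    def siph_connects(self) -> int:
        # Total SiPh connects across the tier.
        return self.gpu_tiles * self.siph_per_tile

    @property
    def dram_stacks(self) -> int:
        # Total DRAM stacks across the tier.
        return self.gpu_tiles * self.dram_per_tile


tier = GpuTier()
print(f"SiPh connects per tier: {tier.siph_connects}")  # 12, matching the reported total
print(f"DRAM stacks per tier:   {tier.dram_stacks}")    # 24 under these assumptions
```

The 12 SiPh connects line up with 3 per tile across 4 tiles; the 24 DRAM stacks per tier is simply the product of the two reported per-tile numbers and is our extrapolation, not a figure NVIDIA has confirmed.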

I do like the tease that NVIDIA plans to use 3D stacked GPUs: stacking multiple GPUs vertically increases chip density and shrinks the footprint. A "GPU tier" system with up to 4 GPU tiles per tier, stacked vertically, should reduce interconnect latency and will probably use power gating as well. NVIDIA won't just be 3D stacking the GPUs; it would also use 3D-stacked DRAM with 6 stacks per tile.

An interesting and exciting approach. Silicon photonics is still in its infancy, but TSMC has teased its next-gen COUPE technology, which uses silicon photonics packaging and should be ready in 2026. We should expect NVIDIA's future of AI compute to roll out in the years after, sometime between 2027 and 2030.
