• brucethemoose@lemmy.world

When Nvidia talks ‘datacenter’ it is talking almost exclusively about the B200 and H200: gigantic HBM GPUs that will only ever be socketed into servers in sets of 8. That silicon is never used for gaming.

      B200 HGX from ServeTheHome

      https://www.servethehome.com/new-shots-of-the-nvidia-hgx-b200-astera-labs/

      A little bit of 5090 silicon gets vacuumed up for ‘low end’ accelerators, but this is relatively low volume.

      People do rent them for Blender rendering and other GPGPU stuff, but the vast majority is probably being used for AI inference (or, honestly, just hoarded by companies who don’t really understand ML and let them sit there under-utilized :/)


Point I’m making is they’re not talking about gaming GPUs here. Making so many ‘AI’ GPUs does impact how much TSMC capacity can be allocated to gaming, but this announcement has nothing to do with that.

      • DominusOfMegadeus@sh.itjust.works

        So, they’re saying other AI customers will “continue to be a top priority”? That makes sense given your explanation. I care even less now about this statement from Nvidia.