Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

  • elvith@feddit.de · 10 months ago

    I have a 2060 Super with 8GB. The VRAM is currently enough for FHD gaming - or at least isn’t the bottleneck - so 12 GB might be fine for that use case. BUT I’m also toying around with AI models, and some of the current models already ask for 12 GB of VRAM to run the complete model. It’s not that I would never get a 12 GB card as an upgrade, but you can be sure I’d research all the alternatives first - and then it wouldn’t be my first choice but a compromise, as it wouldn’t future-proof me in this regard.

    • AProfessional@lemmy.world · edited · 10 months ago

      Do you think there is a large overlap of people who buy $600-$900 cards and like 1080p?

      My 3080 10GB already runs out of VRAM at 1440p. I would never get <16GB again.

        • elvith@feddit.de · 10 months ago

        Hard to say. I was an early adopter of Full HD and always had the equivalent of an xx80 card. Then I stepped back a bit with the 970, as it was the best upgrade path for me (considering I was only upgrading the GPU, and the CPU would very likely be the bottleneck going forward). I was planning to move to higher resolutions with my next PC. Then my PSU fried my mainboard, CPU, and GPU while COVID and cryptocurrencies caused huge price spikes on almost every component, and I had to pay way too much for the performance I’d get. That’s why I’m running a 2060 Super now and staying on FHD.

        I might consider upgrading the next time I need a new PC, as this left me in an awkward spot: if I want a higher resolution, I need a new monitor. If I buy one, I’d probably need a new GPU, too. And since my CPU would then be the bottleneck of the rig, I should also replace it in the process. Then I might want a new mainboard, as I’m currently only running DDR4 RAM, and so… the best way forward is basically a new PC (I might save some money by keeping my NVMe drive, etc.).

        I’m not sure what I’m going to do in the future. Up until around the GTX 970, you could get a decent rig that played current games in FHD on ultra or very high and would keep doing so for about 1-2 years - and if you dropped to medium-high, probably 4-5 years. You could easily get that for ~900-1,000 bucks (or less). Nowadays, the GPU alone can get close to that price range…

        I get it. 1080p is about 2.07 megapixels, while 1440p is already 3.69 megapixels - that’s roughly 78% more pixels, and thus you need considerably more performance to render it (or rather, to rasterize and shade it). But still… I don’t like these prices.
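        The resolution arithmetic above is easy to verify with a quick sketch (editor's addition, not part of the original comment):

```python
# Pixel counts for the two resolutions mentioned in the thread.
res_1080p = 1920 * 1080   # Full HD: 2,073,600 pixels (~2.07 MP)
res_1440p = 2560 * 1440   # QHD:     3,686,400 pixels (~3.69 MP)

# Relative increase in pixels to rasterize and shade.
increase = res_1440p / res_1080p - 1
print(f"{res_1080p / 1e6:.2f} MP -> {res_1440p / 1e6:.2f} MP, +{increase:.0%}")
```

        The increase comes out to about 78%, not 75% - close enough for the argument either way.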

    • AA5B@lemmy.world · 10 months ago

      Thanks, that was going to be exactly my question. I don’t see anyone choosing low memory for video, but I had no idea what AI needs.

        • elvith@feddit.de · 10 months ago

        You can run Stable Diffusion XL on 8GB of VRAM (to generate images). For beginners there’s e.g. the open-source software Fooocus, which handles quite a lot of the work for you - it sends your prompt to a GPT-2 model (running on your PC) to do some prompt engineering, then uses the result to generate your images, and it generally offers several presets etc. to get going easily.

        Jan (basically open-source software that resembles ChatGPT and lets you use several AI models) can run in 8GB, but only with 3B models or quantized 7B models. They recommend at least 16GB for regular 7B models (which they consider the “minimum usable models”). Then there are larger, more sophisticated models that require even more.
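        Those recommendations line up with a back-of-the-envelope estimate: the model weights alone need roughly parameter count × bytes per parameter, before any KV cache or activation overhead. A rough sketch (editor's addition; real usage is somewhat higher than the weights-only figure):

```python
def weights_gib(params_billion: float, bits_per_param: float) -> float:
    """Approximate VRAM needed just for the model weights, in GiB."""
    total_bytes = params_billion * 1e9 * bits_per_param / 8
    return total_bytes / 2**30

# 7B model at fp16 (16 bits/param): ~13 GiB of weights alone -> wants a 16GB card
print(round(weights_gib(7, 16), 1))
# Same 7B model quantized to 4-bit: ~3.3 GiB -> fits on an 8GB card
print(round(weights_gib(7, 4), 1))
# 3B model at fp16: ~5.6 GiB -> also fits in 8GB
print(round(weights_gib(3, 16), 1))
```

        This is why quantized 7B models squeeze into 8GB while full-precision ones don’t.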

        Jan can also run on the CPU in your regular RAM. Since it’s just chatting with you, it’s not too bad when it spits out words slowly, but a GPU is / would be nice here…