I'd agree if that meant those options were cheaper. You can be a very active gamer and not need more, so why pay more?
He said most people are playing at 1080p, and last month's Steam survey had 55% of users with that as their primary display resolution, so he's right about that. Set aside what's needed for the 4K monitors only 4.5% of users have as their primary display: is 8GB of VRAM really a problem at 1080p?
Absolutely. Why pay more if less is good enough?
They're open about it and give you the option to get more VRAM if you want it. Fine by me.
No one with a 4K monitor will buy them anyway.
Sorry, but I don't understand why this is a controversy. They have a 16GB model; if you need more than 8GB, go and buy that. They aren't forcing your hand or limiting your options.
At least we know Nvidia aren't the only ones being shitty; they just lead the market in it.
The tweet was specifically talking about their $299 card, which also has a 16GB version. OP is shitstirring for clicks.
This is one of the most uninformed comments I’ve read so far.
I shared this vid to try and spread awareness that Frank Azor, AMD's Chief Architect of Gaming Solutions and Gaming Marketing, made that needless bad-faith comment, which holds back the advancement of gaming.
AMD's 9060 XT 16GB released recently, but we've yet to see whether they can actually supply it to consumers at AMD's own stated MSRP of $350 USD, something that went unmet in their previous launch.
For PC building: you walk around this show, Computex, and talk to case manufacturers and cooler manufacturers, and they'll sync up their launches to Nvidia GPU launches because they don't sell things in between at the same velocity. And so if Nvidia launches a GPU and the interest falls off a cliff, because people just feel like they either can't get a card or they get screwed if they get a card, I think it actively damages the hobby.
I remember even when the RTX 3070 came out and I gave it a positive review, I said it was a great value product, because by all the metrics and measurements we had at the time it was a good product. We had very few examples we could point to where 8 gigabytes wasn't enough. Of course, we knew the upcoming competing card had a 16GB VRAM buffer, but that doesn't necessarily make it a valid thing. It's like if that product had a 32GB buffer now, you'd say, "Well, it's got enough VRAM." It's probably nothing.
…
But because we did see it, even when you were looking at dedicated used VRAM, a lot of games were at 7, 7½GB. So you could see it creeping up over the years, from 4, 5, 6, 7; you could see where it was trending, right? Which is why I always find it funny when people say, "Why are we using more than 8 now? 8 should be enough."

- First quote paragraph from Steve Burke of Gamers Nexus; second and third quote paragraphs from Steve Walton of Hardware Unboxed (a sketch of sampling dedicated VRAM usage follows these notes)
- Is Nvidia Damaging PC Gaming? feat. Gamers Nexus
- If you replace Nvidia's name in the quotes above with AMD's, it holds with the same force: AMD would be ruining the gaming landscape for its own benefit at the cost of literally everyone else.
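On Walton's point about watching dedicated used VRAM creep up over the years: here's a minimal sketch of how one might sample it today, using NVIDIA's NVML Python bindings (pynvml). The polling loop and interval are my own illustration, not anything from the video.

```python
# Minimal sketch: sample dedicated VRAM usage while a game runs.
# Uses NVIDIA's NVML bindings (pip install pynvml).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    for _ in range(60):  # sample once per second for a minute
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values are in bytes
        print(f"dedicated VRAM used: {info.used / 2**30:.1f} / {info.total / 2**30:.1f} GiB")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

Logging this while playing a few modern titles is enough to see the 7-plus-GiB numbers Walton describes, even at 1080p.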
Clicks have literally no value to me; what I care about most is trying to inform gamers so that people aren't exploited by bad-faith actors, especially hardware manufacturers, since they dictate the limitations that game developers must work within. I'm not paid by or affiliated with any of the hardware manufacturers.
shitstirring for clicks
"Shitstirring for clicks" literally does nothing for me. It would actually be detrimental to my reputation if I did that.
No one should get a free pass for behaving badly. If Nvidia acts poorly they should be called out. Same for AMD, same for Intel. Zero exceptions.
I can agree that the tweet was completely unnecessary, and the naming is extremely unfair given that both variants carry the exact same brand name. Even their direct predecessor didn't do this.
The claim that AMD could easily sell the 16 GiB variant for 50 dollars less, and that $300 gives "plenty of room," is wildly misleading, and from that I can tell they haven't factored in the BOM (bill of materials) at all.
They flatly state that GDDR6 is cheap, and I'm not sure how they figure that.
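To see why the BOM matters here, a purely illustrative back-of-envelope; both inputs below are assumptions made up for the calculation, not sourced figures:

```python
# Illustrative only: neither constant below is a sourced figure.
ASSUMED_GDDR6_USD_PER_GB = 3.0   # assumed spot price; AMD's actual contract price is unknown
ASSUMED_RETAIL_MULTIPLIER = 2.0  # assumed ratio of retail price delta to BOM delta

extra_gb = 8  # difference between the 8 GB and 16 GB variants
bom_delta = extra_gb * ASSUMED_GDDR6_USD_PER_GB
retail_delta = bom_delta * ASSUMED_RETAIL_MULTIPLIER
print(f"extra memory BOM: ~${bom_delta:.0f}, implied retail delta: ~${retail_delta:.0f}")
```

With these made-up inputs the implied retail gap already lands near $50, and small changes to either assumption swing it well above or below; that's exactly why asserting "plenty of room" without real BOM data is shaky.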
This really feels like AMD’s “don’t you guys have phones” moment.
AMD doesn't know I self-host generative AI models.
8GB is barely sufficient for my needs, and I often have to use multi-GPU modifications to offload parts of my workflow to my smaller GPU, which lacks both the CUDA cores and the VRAM to make an appreciable difference. I've been searching for a used Tesla K80 in my price range to solve this problem.
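For what it's worth, here's a minimal sketch of the kind of multi-GPU offload I mean, using Hugging Face transformers/accelerate to shard one model across a primary and a smaller secondary card. The model ID and per-device memory caps are placeholders for illustration:

```python
# Sketch: shard one model across two unequal GPUs so layers that don't
# fit on the primary card spill onto the smaller one (then CPU RAM).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "some-org/some-7b-model"  # placeholder, not a specific model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",  # let accelerate decide layer placement
    max_memory={0: "7GiB", 1: "3GiB", "cpu": "16GiB"},  # caps per device; adjust to your cards
)

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=20)[0]))
```

Whether this actually helps depends on the secondary card: if it's short on both CUDA cores and VRAM, as in my case, the spilled layers become the bottleneck.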
Could you please help me understand the reason behind this?
Here's a comment I made elsewhere in this same post that provides some details on what's happening:
https://lemmy.ca/comment/16934365