AMD’s new CPU hits 132fps in Fortnite without a graphics card::Also get 49fps in BG3, 119fps in CS2, and 41fps in Cyberpunk 2077 using the new AMD Ryzen 8700G, all without the need for an extra CPU cooler.
That’s pretty damn impressive. AMD is changing the game!
Meh. It’s also a $330 chip…
For that price you can get a 12th gen i3/RX6600 combination which will obliterate this thing in gaming performance.
Your i3 has half the cores. Spending more on GPU and less on CPU gives better fps, news at 11.
So what’s the point of this thing then?
If you just want 8 cores for productivity and basic graphics, you’re better off getting a Ryzen 7 7700, which isn’t gimped by half the cache and less than half the PCIe bandwidth. And for gaming, even the shittiest discrete GPUs of the current generation will beat it if you pair them with a half-decent CPU.
This thing seems to straddle a weird position between gaming and productivity, where it can’t do either really well. At that price point, I struggle to see why anyone would want it.
It’s like that old adage: there are no bad CPUs, only bad prices.
Half the L3, yes. But 24 vs. 16 (available) PCIe lanes isn’t half, and it’s still enough for two SSDs and a GPU; if you actually want I/O, buy a Threadripper. The 8700G also has quite a bit more base clock; the 7700 boosts higher, but you can forget about that number under all-core loads. And about 9 times the raw iGPU TFLOPs.
Oh, those TFLOPs: 4.5 vs. my RX 5500’s 5 (both FP32), yet mine pulls significantly ahead in gaming performance; must be memory bandwidth. Light inference workloads? VRAM certainly won’t be an issue, just add more sticks. Those TFLOPs will also kill BLAS workloads dead, so scientific computing is an option.
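The memory-bandwidth hunch checks out on paper. A quick sketch, assuming DDR5-5200 in dual channel for the APU and the RX 5500’s 128-bit GDDR6 at 14 Gbps per pin (both figures are my assumptions, not from this thread):

```python
# Rough memory-bandwidth comparison (all figures are approximate assumptions).
# Bandwidth = transfer rate * bus width in bytes.

ddr5_dual = 5200e6 * 2 * 8 / 1e9       # DDR5-5200, two 64-bit channels -> GB/s
gddr6_rx5500 = 14e9 * (128 / 8) / 1e9  # 14 Gbps/pin * 128-bit bus -> GB/s

print(ddr5_dual)      # ~83 GB/s, shared between CPU and iGPU
print(gddr6_rx5500)   # ~224 GB/s, dedicated to the GPU
```

Roughly a 2.7x gap even before the CPU takes its cut, which would explain why similar TFLOPs don’t translate into similar fps.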
Can’t find proper numbers right now, but the 7700 should total about half a TFLOP on the CPU and half a TFLOP on the GPU.
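For what it’s worth, that half-a-TFLOP CPU figure is plausible from a back-of-envelope estimate. A minimal sketch, assuming ~4 GHz all-core and 16 FP32 FLOPs per core per cycle (one 256-bit FMA pipe); both inputs are my assumptions:

```python
# Back-of-envelope peak FP32 estimate (inputs are assumptions, not specs).

def cpu_tflops(cores: int, ghz: float, flops_per_cycle: int) -> float:
    """Peak TFLOPs = cores * clock (GHz) * FLOPs issued per core per cycle."""
    return cores * ghz * flops_per_cycle / 1000

# 8 cores, ~4 GHz all-core, 16 FP32 FLOPs/cycle:
print(cpu_tflops(8, 4.0, 16))  # 0.512 -> about half a TFLOP
```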
So, long story short: when you’re a university and a prof says “I need a desktop to run R”, that’s the CPU you want.
24x PCIe 5.0 vs 16x PCIe 4.0
So 8 fewer lanes, and each lane has half the bandwidth = less than half the total PCIe bandwidth.
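The arithmetic, using rounded per-lane figures (roughly 2 GB/s per PCIe 4.0 lane and 4 GB/s per 5.0 lane; per-lane bandwidth about doubles each generation):

```python
# Why "less than half": fewer lanes AND half the per-lane bandwidth.
gen4_per_lane = 2.0  # GB/s per PCIe 4.0 lane, rounded
gen5_per_lane = 4.0  # GB/s per PCIe 5.0 lane, rounded

bw_8700g = 16 * gen4_per_lane  # 16 lanes of 4.0 -> ~32 GB/s
bw_7700 = 24 * gen5_per_lane   # 24 lanes of 5.0 -> ~96 GB/s

print(bw_8700g / bw_7700)  # ~0.333 -> one third of the total bandwidth
```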
But isn’t the point of this post being that the CPU still runs games okay without a dedicated video card?
It’s hardly a useful comparison to pit the CPU on its own against a CPU + video card combo.
If it was the i3 on its own, that might be a different story.
It is a useful comparison if the latter combination is the same price
Is this a pun?