AMD’s new CPU hits 132fps in Fortnite without a graphics card::It also gets 49fps in BG3, 119fps in CS2, and 41fps in Cyberpunk 2077 using the new AMD Ryzen 8700G, all without the need for an extra CPU cooler.
I have routinely been impressed with AMD integrated graphics. For my last laptop I specifically went for one, as it meant I didn’t need a dedicated GPU, which adds significant weight, cost, and power draw.
It isn’t my main gaming rig, of course, but I have had no complaints.
I have a Lenovo ultralight with a 7730U mobile chip in it, which is a pretty mid CPU… it happily plays Minecraft at a full 60fps while using like 10W on the package. I can play Minecraft on battery for like 4 hours. It’s nuts.
AMD does the right thing and uses their full graphics-uArch CUs for the iGPU on a new die, instead of trying to cram some poorly designed iGPU inside the CPU package like Intel does.
AMD’s integrated GPUs have been getting really good lately. I’m impressed at what they are capable of with gaming handhelds and it only makes sense to put the same extra GPU power into desktop APUs. This hopefully will lead to true gaming laptops that don’t require power hungry discrete GPUs and workarounds/render offloading for hybrid graphics. That said, to truly be a gaming laptop replacement I want to see a solid 60fps minimum at at least 1080p, but the fact that we’re seeing numbers close to this is impressive nonetheless.
Common W for AMD
Mind you, it only gets these frame rates at low settings. While this is pretty damn impressive for an APU, it’s still a very niche type of APU at this point and I don’t see it getting all that much traction myself.
The only downside, if integrated graphics becomes the norm, is that you can’t upgrade when the next gen needs a different motherboard. It’s pretty easy to swap from a 2080 to a 3080.
Integrated graphics is already a thing. Intel iGPUs have over 60% market share. This is really competing with Intel and low-end discrete GPUs. Nice to have the option!
Yeah, I know integrated graphics is a thing. And that’s been fine for running a web browser, watching videos, or whatever other low-demand graphical application was needed for office work. Now they’re holding it up against gaming, which typically places large demands on graphical processing power.
The only reason I brought it up is because it’s an if… if people start looking at CPU-integrated graphics as an alternative to expensive GPUs, it makes the upgrade path more costly versus the short-term savings of skipping a good GPU purchase.
Again, if one’s gaming consists of games that aren’t highly demanding, like Fortnite, then upgrades and performance probably aren’t a concern for the user. One could still buy a GPU and add it to the system for more power, assuming the PSU has enough wattage and the case has room.
For a slightly different perspective, I will not game on anything other than a Steam Deck. So this is kind of perfect for me. But I am a long hauler with hardware, so I typically upgrade everything all at once anyway.
And the shared RAM. Games like Star Trek Fleet Command will crash your computer by messing with that; memory leaks galore. Far less crashy with a dedicated GPU. How many other games interact poorly with integrated GPUs?
Could you not just slot in a dedicated video card if you needed one, keeping the integrated as a backup?
Yeah, maybe. I commented on that elsewhere here. If we follow a possible path for integrated graphics, the elimination of a large GPU could result in the computer being sold with a smaller case and a lower-power PSU. Why would you need a full tower when you can have a more compact PC with a sleek NVMe SSD and a smaller motherboard form factor? Now there’s no room to cram a 3080 in the box and no power to drive it.
Again, someone depending on CPU integrated graphics to play Fortnite probably isn’t gonna be looking for upgrade paths. This is just an observation of a limitation imposed on users should CPU integrated graphics become more prominent. All hypothetical at this point.
Or it may end up pushing for longer lifetimes for motherboards.
Oh, oh ok, I thought one of the new Threadrippers was so powerful that the CPU could do all those graphics in software.
It’s gonna take decades to be able to render 1080p CP2077 at an acceptable frame rate with just software rendering.
It’s all software, even the stuff on the graphics cards. Those are the rasterisers, shaders and so on. In fact, graphics cards are extremely good at running these (relatively) simple programs in an absolutely staggering number of threads at the same time, which has been taken advantage of by both Bitcoin mining and neural-net models like GPT and Llama.
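To make that point concrete, here’s a toy sketch (nothing below is a real GPU API; the function names are made up for illustration) showing that a “shader” is just an ordinary small program run once per pixel. A CPU runs the loop serially; a GPU effectively launches one thread per pixel.

```python
def shade(x, y, width, height):
    # Toy "fragment shader": red varies with x, green with y.
    r = int(255 * x / (width - 1))
    g = int(255 * y / (height - 1))
    return (r, g, 0)

def render(width, height):
    # A CPU does this loop one pixel at a time; a GPU runs shade()
    # for millions of pixels in parallel.
    return [[shade(x, y, width, height) for x in range(width)]
            for y in range(height)]

frame = render(4, 2)   # tiny 4x2 "framebuffer"
```

The same data-parallel shape (one simple program, millions of independent invocations) is exactly why GPUs also suit mining and neural-net workloads.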
It’s a shame you’re being downvoted; you’re not wrong. Fixed-function pipelines haven’t been a thing for a long time, and shaders are software.
I still wouldn’t expect a threadripper to pull off software rendering a modern game like Cyberpunk, though. Graphics cards have a ton of dedicated hardware for things like texture decoding or ray tracing, and CPUs would need to waste even more cycles to do those in software.
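That gap can be put in rough numbers. A back-of-envelope sketch, with assumed figures (the core count, clock, and SIMD width are illustrative, not measured, and peak FLOPs are a loose upper bound that ignores the texture and ray-tracing work a CPU would also have to emulate):

```python
# 1080p at 60fps means this many pixel shades per second (ignoring overdraw):
shades_per_sec = 1920 * 1080 * 60

# Assumed big CPU: 64 cores x 4 GHz x 2 ops per FMA x 8-wide AVX.
cpu_peak_flops = 64 * 4.0e9 * 2 * 8
# Roughly the quoted FP32 peak of a mid-range RDNA 3 card (RX 7600).
gpu_peak_flops = 21.75e12

cpu_budget = cpu_peak_flops / shades_per_sec   # peak FLOPs per pixel on CPU
gpu_budget = gpu_peak_flops / shades_per_sec   # peak FLOPs per pixel on GPU
print(f"CPU: ~{cpu_budget:,.0f} FLOPs/pixel, GPU: ~{gpu_budget:,.0f} FLOPs/pixel")
```

Even granting the CPU its theoretical peak, the GPU has several times the arithmetic budget per pixel, plus fixed-function texture and RT hardware that costs the CPU extra cycles to imitate.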
That’s pretty damn impressive. AMD is changing the game!
$US330 for the top 8700G APU with 12 RDNA 3 compute units (compare to 32 RDNA 3 CUs in the Radeon RX 7600). And it only draws 88W at peak load and can be passively cooled (or overclocked).
$US230 for the 8600G with 8 RDNA 3 CUs. It falls about 10-15% short of 8700G performance in games, but shows a much bigger spread in CPU benchmarks (per Tom’s Hardware), so I’m pretty meh on that one.
Given the higher costs of AM5 boards and DDR5 RAM, for about the same money or $100-200 more than an 8700G build, you could combine a cheaper CPU with a better GPU and get way more bang for your buck. But I see the 8700G being a solid option for gamers on a budget, or parents wanting to build younger kids their first cheap-but-effective PC.
I also see this as a lazy man’s solution for building small-form-factor mini-ITX Home Theatre PCs that run silent and don’t need a separate GPU to receive 4K live streams. I’m exactly in this boat right now where I literally don’t wanna fiddle with cramming a GPU into some tiny box, but also don’t want some piece-of-crap iGPU in case I use the HTPC for some light gaming from time to time.
It’ll be a great upgrade for those little NUC-like things, thin laptops, and Steam Deck competitors.
The PlayStation 5 also does this.
deleted by creator
A bit misleading, what is meant is that no dedicated GPU is being used. The integrated GPU in the APU is still a GPU. But yes, AMD’s recent APUs are amazing for folks who don’t want to spend too much to get a reasonable gaming setup.
Wow, it’s almost like that’s why they said you didn’t need a graphics card, instead of saying you didn’t need a GPU!
Reading is difficult for some folk.
Because the title is still vague, and yes GPU and “graphics card” are often used interchangeably by the internet (examples: https://www.hp.com/gb-en/shop/tech-takes/integrated-vs-dedicated-graphics-cards and https://www.ubisoft.com/en-us/help/connectivity-and-performance/article/switching-to-your-pcs-dedicated-gpu/000081045 ).
“New CPU hits 132fps” could wrongly suggest software rendering, which is very different (see for example https://www.gamedeveloper.com/game-platforms/rad-launches-pixomatic----new-software-renderer ) and died more than a decade ago.
Up until the G in 8700G I totally thought ‘software renderer’ and was hella impressed. So yea, totally plausible it could have been described better.
Software rendering hasn’t worked in 99% of games made in the last 15+ years. Only super low-fi hipster stuff would be fine without 3D acceleration.
Which is why the title was momentarily impressive. Was thinking some ‘in the lab’ demo cpu.
I can see Single Board Computers with this in them for powerful TV boxes. Hello emulators and SteamOS‽
That’s what the article says…