The AMD Ryzen 5 2500X and Ryzen 3 2300X CPU Review
by Ian Cutress on February 11, 2019 11:45 AM EST

Gaming: Ashes Classic (DX12)
Seen as the holy child of DirectX12, Ashes of the Singularity (AoTS, or just Ashes) was the first title to actively explore as many of the DirectX12 features as it possibly could. Stardock, the developer behind the Nitrous engine which powers the game, has ensured that the real-time strategy title takes advantage of multiple cores and multiple graphics cards, in as many configurations as possible.
As a real-time strategy title, Ashes is all about responsiveness, during both wide-open shots and concentrated battles. With DirectX12 at the helm, the ability to issue more draw calls per second allows the engine to work with substantial unit depth and effects that other RTS titles had to achieve through combined draw calls, an approach that ultimately made some combined unit structures very rigid.
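To illustrate the draw-call point: under high per-draw overhead, an engine typically folds many units of the same type into one combined (instanced) draw, which is what makes those unit structures rigid; DirectX12's cheaper draw calls make a per-unit submission path viable. The sketch below shows roughly what that per-unit path looks like on a D3D12 command list. It is a minimal, illustrative sketch only; `Unit` and `RecordUnitDraws` are hypothetical names, not Nitrous engine code.

```cpp
// Hypothetical sketch: per-unit draw submission on a D3D12 command list.
// With DX11-era draw overhead, an RTS engine would batch same-type units
// into a single instanced call; DX12's lower per-draw cost makes the
// per-unit path below viable, letting each unit carry unique state.
#include <d3d12.h>
#include <vector>

struct Unit {
    D3D12_GPU_VIRTUAL_ADDRESS constants; // per-unit transform/material data
    UINT indexCount;                     // indices for this unit's mesh
};

void RecordUnitDraws(ID3D12GraphicsCommandList* cmdList,
                     const std::vector<Unit>& units)
{
    for (const Unit& u : units) {
        // Bind a per-draw constant buffer, so every unit can differ.
        cmdList->SetGraphicsRootConstantBufferView(0, u.constants);
        // One draw call per unit rather than one per unit type.
        cmdList->DrawIndexedInstanced(u.indexCount, 1, 0, 0, 0);
    }
}
```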
Stardock clearly understands the importance of an in-game benchmark, ensuring that such a tool was available and capable from day one; with all the additional DX12 features in use, being able to characterize how they affected the title was important for the developer. The in-game benchmark performs a four-minute fixed-seed battle environment with a variety of shots, and outputs a vast amount of data to analyze.
For our benchmark, we run Ashes Classic: an older version of the game before the Escalation update. The reason for this is that it is easier to automate, having no splash screen, but it still offers strong visual fidelity to test.
AnandTech CPU Gaming 2019 Game List

| Game | Genre | Release Date | API | IGP | Low | Med | High |
|------|-------|--------------|-----|-----|-----|-----|------|
| Ashes: Classic | RTS | Mar 2016 | DX12 | 720p Standard | 1080p Standard | 1440p Standard | 4K Standard |
Ashes has dropdown options for MSAA, Light Quality, Object Quality, Shading Samples, Shadow Quality, Textures, and separate options for the terrain. There are several presets, from Very Low to Extreme: we run our benchmarks at the above settings, and take the frame-time output for our average and percentile numbers.
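To make the average and percentile figures concrete, here is a minimal sketch of how such numbers can be derived from a frame-time dump. The input format assumed here (one frame time in milliseconds per line) is for illustration only and is not the actual Ashes output format.

```cpp
// Minimal sketch: derive average FPS and a 95th-percentile figure from a
// stream of frame times. Input format (one value in ms per line) is assumed.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> frameMs;
    double ms;
    while (std::scanf("%lf", &ms) == 1)
        frameMs.push_back(ms);
    if (frameMs.empty()) return 1;

    // Average FPS: total frames over total elapsed time.
    double totalMs = 0;
    for (double t : frameMs) totalMs += t;
    double avgFps = 1000.0 * frameMs.size() / totalMs;

    // 95th-percentile frame time: the time that 95% of frames beat,
    // reported as an FPS figure (higher frame time -> lower FPS).
    std::sort(frameMs.begin(), frameMs.end());
    double p95Ms = frameMs[static_cast<size_t>(0.95 * (frameMs.size() - 1))];
    double p95Fps = 1000.0 / p95Ms;

    std::printf("Average FPS: %.1f\n95th percentile FPS: %.1f\n",
                avgFps, p95Fps);
    return 0;
}
```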
All of our benchmark results can also be found in our benchmark engine, Bench.
[Graphs: Ashes Classic Average FPS and 95th Percentile results at IGP, Low, Medium, and High settings]
At the lowest resolutions, the 2500X has the high ground, but cedes it to the 8350K as the resolution ramps up.
65 Comments
romrunning - Monday, February 11, 2019 - link
It may just be me, but all of the links on the "Pages In This Review" at the bottom of the main page simply return me to the main page.

romrunning - Monday, February 11, 2019 - link
But the drop-down to the specific page works as expected.

evilspoons - Monday, February 11, 2019 - link
It's definitely not just you. I spent a few tries wondering what I was doing wrong and re-read the start of the article until I tried the drop-down menu instead of the links.

Ian Cutress - Monday, February 11, 2019 - link
That's my fault, as the hyperlinks need to be manually added. I had messed up the part of the URL after the /show/13945. It should be fixed now.

Kevin G - Monday, February 11, 2019 - link
I noticed this as well.

IGTrading - Monday, February 11, 2019 - link
Thank you, Ian, for a good review. I completely agree with the conclusion that the 2300X makes perfect sense, but the 2500X is harder to place in the picture...
On the other hand, despite the 2400G and the 2500X having the same TDP, if I look at the graph of full-load power consumption, I can clearly see that the latter has a very generous thermal limit compared with the 2400G, where the thermal envelope seems to be very strictly limited.
This means OEMs will probably be able to use the 2500X for cheaper gaming systems where auto-overclocking is used as a feature, and AMD will thus be able to offer something better for a lower price.
This also allows AMD to push AM4 harder on the market, giving AM4 buyers the opportunity for future upgrades.
So the 2500X will show considerably better performance than the 2400G despite the similar configuration (minus the iGPU), while cannibalizing neither the 2600 nor the 2400G.
If AMD manages to sell more 2500X chips through OEMs, it also builds a future upgrade market for itself, unlike Intel, which will likely push buyers into purchasing new machines.
dromoxen - Monday, February 11, 2019 - link
ppl buying these CPUs are not the sort to be upgrading the CPU.. to most, the computer is a closed box and is upgraded as a whole. I do wonder where all these cores are going.. I mean it's great to have 4/6/8 cores with another 8 hyperthreads, but who is using all that power? Let's make 4 cores the absolute limit, unless you have a Govt permit to purchase more.

GreenReaper - Monday, February 11, 2019 - link
Browsers have been getting a lot better at using multiple cores, and websites surely do enough in the background nowadays to justify the effort.

RadiclDreamer - Tuesday, February 12, 2019 - link
Why would there be any limit on how many cores? What's it to you that I want to transcode movies faster, or multitask more, or anything else? And a government permit to have more? That's just insane.

kaidenshi - Tuesday, February 12, 2019 - link
He's trolling like he always does. Anything to get under someone's skin enough to get a reaction out of them.