maybe for that reason it was cheap...
In these articles, are they also using an APU? Or are they normal CrossFire?
In game, are you sure CrossFire is enabled and running? Set up an on-screen sensor display of both GPUs' utilization.
Have you tried disabling the APU GPU and running solo?
Remember your APU's GPU is using the system RAM, which is massively slower than discrete GPU memory, and yours is only 1833. And that 8GB is shared between the system and the GPU in games, if you're looking for more of a boost.
Is your PSU up to the job?
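If it's any use, here's a rough Python sketch for averaging the per-GPU load columns out of a monitoring log after a session. I'm assuming a HWiNFO/Afterburner-style CSV export, and the file name and column headings below are just guesses, so rename them to whatever your tool actually writes:

```
# Rough sketch: average the per-GPU load columns from a hardware-monitor CSV log.
# The log path and column names are assumptions - adjust to your own tool's export.
import csv

LOG_FILE = "gpu_log.csv"                        # hypothetical log path
COLUMNS = ["GPU1 usage [%]", "GPU2 usage [%]"]  # assumed column headings

def average_usage(path, columns):
    totals = {c: 0.0 for c in columns}
    count = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                values = {c: float(row[c]) for c in columns}
            except (KeyError, TypeError, ValueError):
                continue  # skip rows missing the columns or containing non-numeric junk
            for c, v in values.items():
                totals[c] += v
            count += 1
    return {c: totals[c] / count for c in columns} if count else {}

if __name__ == "__main__":
    for name, avg in average_usage(LOG_FILE, COLUMNS).items():
        print(f"{name}: {avg:.1f}% average load")
```

If one GPU sits near 0% while the other is pegged, CrossFire almost certainly isn't doing anything.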
Hi guys, thanks for replying. We mostly solved the mystery over at the Steam forums: AMD Dual Graphics producing WORSE FPS in Tomb Raider?? :: Tomb Raider General Discussions
The online benchmarks I mentioned were ones like this: AMD A10-7850K Dual Graphics Performance > Dual Graphics Performance - TechSpot
And the massive boosts they reported were at low settings. I was running my Tomb Raider tests roughly between high and ultra, as that's how I play it. Adding low and normal settings to my own benchmark runs produces the following results, which don't in fact contradict the articles I read. They show that the R7 240 in CrossFire with the A10 7850K does perform much better at low settings, with rapidly diminishing returns as you increase settings, until after High it starts to reduce FPS. I still don't really understand the technical reasons for this intuitively strange and very unsatisfying performance curve. I suppose it means that for shooting games, where you want high FPS over graphics, the R7 240 may be worth it. For Tomb Raider, where I want it to look nice and am happy with 30-40FPS, and so like to run it between High and Ultra, dual graphics makes zero or slightly negative difference.
Tomb Raider FPS testing as follows:
A10 alone:
ultra: 21fps avg
high: 28.3fps avg
normal: 33.6fps avg
low: 46.8fps avg
Dual graphics with R7 240
ultra: 17.2fps avg
high: 30fps avg
normal: 37.7fps avg
low: 62.7fps avg
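To put numbers on that curve, here's a quick scratch calculation (Python) on the averages above, showing the percentage change from the A10 alone to dual graphics at each setting:

```
# Percentage change from A10-alone to A10 + R7 240 dual graphics,
# using the Tomb Raider averages listed above.
a10_only = {"ultra": 21.0, "high": 28.3, "normal": 33.6, "low": 46.8}
dual_240 = {"ultra": 17.2, "high": 30.0, "normal": 37.7, "low": 62.7}

for setting in ["low", "normal", "high", "ultra"]:
    change = (dual_240[setting] / a10_only[setting] - 1) * 100
    print(f"{setting:>6}: {change:+.0f}%")

# Prints roughly: low +34%, normal +12%, high +6%, ultra -18%.
```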
So I'm largely considering AMD's Dual Graphics a failed experiment. However, I have ordered an R7 250 off Amazon in order to test that and see how much better it performs, especially at high settings.
Largely reposting this for others' information - a warning perhaps. But I do have one further question:
Am I right that the R7 250 comes only as either 1GB GDDR5 or 2GB DDR3? (i.e. no 2GB GDDR5, and no 1GB DDR3?) Which of those two versions ought to do better in CrossFire at high and ultra settings? I'm not getting stuttering, so I believe that means I'm not running out of VRAM - is that a correct conclusion? In which case the 1GB GDDR5 model should be preferable? Or is there some other factor by which the extra VRAM might outweigh the faster memory speed?
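For what it's worth, the raw bandwidth gap between the two variants can be sketched with simple arithmetic. The clocks below are only rough reference figures (treat them as assumptions and check the actual card's spec sheet), but the size of the gap is the interesting part:

```
# Back-of-envelope memory bandwidth: transfer rate (MT/s) * bus width (bits) / 8 bits-per-byte.
# The transfer rates below are rough reference figures for the two R7 250 variants - an
# assumption, so check your specific card - but they show roughly a 2.5x bandwidth gap.
def bandwidth_gb_s(transfer_mt_s, bus_bits):
    return transfer_mt_s * bus_bits / 8 / 1000  # GB/s

print("R7 250 1GB GDDR5 (~4600 MT/s, 128-bit):", bandwidth_gb_s(4600, 128), "GB/s")  # ~73.6
print("R7 250 2GB DDR3  (~1800 MT/s, 128-bit):", bandwidth_gb_s(1800, 128), "GB/s")  # ~28.8
```

So if you're genuinely not VRAM-limited, the 1GB GDDR5 card should have a clear advantage.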
I'm also curious about the technicalities of this. Why the crossover in performance curves, when the 240 dual graphics curve dips below the A10 just before Ultra settings? I can understand an inferior GPU dragging a superior GPU down. But why is the R7 240 in DG better than the A10 alone at low settings, but worse at ultra settings? Why that discrepancy? What specs are responsible?
Have you tried running the R7 240 alone? I mean not in dual graphics. Is that possible?
Yes that's possible, I'll try it tomorrow in Tomb Raider and see what happens.
See that and tell me, out of curiosity! Perhaps it helps!!
Right, the higher settings dropping FPS that much suggests it might be CPU/GPU bound. Customize your settings in that game and turn off things like PostFX and other intensive effects. In Crimson, turn down Tessellation to x6 or x8; you won't notice anything higher, and the game uses x64. Take a look at what your CPU is doing in game at each setting level.
The memory on your R7 250 holds exactly the same data as the memory the APU is using; even if you had HBM on your 250, the APU would still be using the system RAM, and every time it renders a frame that slows things down. You want to be running 2400MHz RAM for scaling and to better complement the R7 240's speed.
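For scale, dual-channel DDR3 peak bandwidth works out roughly like this (using the standard 1866/2133/2400 speed grades; a rough sketch, not a measurement):

```
# Rough peak bandwidth of dual-channel DDR3 feeding the APU's GPU:
# transfer rate (MT/s) * 8 bytes per channel * 2 channels.
for mt_s in (1866, 2133, 2400):
    print(f"DDR3-{mt_s} dual channel: {mt_s * 8 * 2 / 1000:.1f} GB/s")
# ~29.9, ~34.1, ~38.4 GB/s - all well short of a discrete card's GDDR5,
# and that pool is shared with the CPU too.
```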
So interesting results. I re-did all my tests using Tomb Raider's built-in benchmark, and included a run using the R7 240 alone. And it looks like this (average FPS then min-max):
A10 only, 1200x1024
ultimate: 15 avg, 12-20
ultra: 27 avg, 21-33
high: 34 avg, 25-44
normal: 43 avg, 34-52
low: 62 avg, 46-74
Dual graphics 240, 1200x1024
ultimate: 14 avg, 11-18
ultra: 25 avg, 19-31
high: 37 avg, 28-47
normal: 47 avg, 36-58
low: 73 avg, 56-90
240 only, 1200x1024
ultimate: 14 avg, 10-20
ultra: 25 avg, 19-31
high: 37 avg, 28-48
normal: 47 avg, 36-58
low: 73 avg, 56-86
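A quick scratch comparison of those three runs makes the pattern obvious:

```
# Averages from the built-in benchmark runs above.
a10      = {"ultimate": 15, "ultra": 27, "high": 34, "normal": 43, "low": 62}
dual_240 = {"ultimate": 14, "ultra": 25, "high": 37, "normal": 47, "low": 73}
only_240 = {"ultimate": 14, "ultra": 25, "high": 37, "normal": 47, "low": 73}

for s in ("low", "normal", "high", "ultra", "ultimate"):
    print(f"{s:>8}: dual vs 240-alone {dual_240[s] - only_240[s]:+d} fps, "
          f"dual vs A10-alone {dual_240[s] - a10[s]:+d} fps")
# Dual graphics tracks the 240-alone numbers exactly; the gain over the A10
# shrinks from +11 fps at low to negative at ultra/ultimate.
```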
Zero difference running the 240 alone or in dual graphics. Which isn't what happened in that Techspot article I linked above. They were using a 2GB DDR3 model, whereas mine is 1GB GDDR5. Not sure what that implies.
It's very interesting/strange that the 240 completely takes over in dual graphics, but doesn't itself receive any boost from the A10 at ultra settings. So strange that I re-ran the benchmarks a few times to be sure. Unless I need to restart after selecting Dual Graphics in Crimson? Maybe I should try that just to be sure... Could these results be a sign that I haven't actually been running in Dual Graphics all along??
The RAM speed is an interesting point, regardless. I thought I remembered deciding against getting >1833MHz RAM because after that the price jumped significantly. But looking on Amazon now, I could upgrade to 2400MHz for only £10 more! I'll definitely return that 1833 RAM, get the upgrade and see what that does.
OK, so I benchmarked one more time having made absolutely sure Dual Graphics was running. There's no 'AMD Radeon Dual Graphics Status Icon (where applicable)' that I can see anywhere, but Device Manager shows 'R7 Graphics + R7 200 Dual Graphics', so I think that means it's definitely working. And the FPS came out just the same as above.
The memory of the A10 holds back the 240, but at low settings the processor of the 240 works faster. Give a memory upgrade at dual channel a try, maybe.
Yes, I'm pretty sure you're right. Got the R7 250 today, and it showed even more clearly that the A10, or perhaps rather the 1833 RAM, is severely holding back the graphics card when in CrossFire.
Part of my confusion was that I couldn't see where that Techspot review stated their RAM speed, as they didn't mention it when describing their test rig. I then realised they state right there on the benchmarks that they're using 2133MHz RAM (how did I miss that??). So it's looking very, very likely that RAM speed is the cause of the poor dual graphics performance. My 2400MHz RAM arrives tomorrow, so hopefully I'll have a positive message, and a very useful lesson, to report then!
Can I ask you a question? Where can I find the Tomb Raider benchmark? It's not in the menu of the game.
It's there for me on the main menu. By which I mean, first I get the launcher (Play, Options, Tomb Raider YouTube... etc), then after clicking Play I get what I'm calling the Main Menu, and the 9th item is Start Benchmark. Not there for you?
No, it's not there!! We're talking about Rise of the Tomb Raider, are we?
Nooooooo, Tomb Raider 2013!
Oh! I didn't know that!!
Just thought to double-check that my motherboard can actually handle 2400MHz RAM... (I've got a cold, don't judge my shoddy thought processes!) And Gigabyte's specs say: 'Support for DDR3 2400(OC)/2133/1866/1600/1333 MHz memory modules'. So tomorrow holds some fun BIOS rummaging to work out how to unlock the full 2400MHz! Some tests online have shown diminishing returns for 2400 over 2133, e.g.:
But for dual graphics specifically, it might very much be worth the effort. We'll see.
Oh, and it turns out that 2400MHz Kingston HyperX 2x4GB kits on Amazon are actually cheaper than 2133MHz right now! £42 shipped. Pretty awesome. And that might be why I thought going faster than 1833MHz was going to be prohibitively expensive. So even if you specifically want to run at 2133MHz, get this and just don't overclock the memory controller, and save a few quid while future-proofing a little.
BTW, does anyone know if overclocking the APU memory controller voids its warranty?
Well, I think I give up. Got my 2400MHz RAM. It gave me a little boost in general: about 1-5FPS from ultra to low settings in Tomb Raider. Not quite what I'd hoped. And more importantly, it hasn't changed the weird Dual Graphics behaviour. Dual graphics in Tomb Raider still produces the same results as the graphics card alone (so that's with both the R7 240 and 250). In Bioshock Infinite, dual graphics was worse than the R7 250 alone on all settings.
So in both cases this is completely contrary to what happened in the Techspot article. I even tried running my memory at 2133MHz and at a lower resolution, to match their settings. But still, this happened:
Tomb Raider Low settings (different resolutions, as I couldn't choose 1200x800)
Dual Graphics them (1200x800): 117.7; me (1200x720): 141
250 them: 108; me: 146
A10 them: 69; me: 89
So you can see my A10 result and my 250 result are proportional to theirs. Then their dual graphics result keeps rocketing up, while mine falls back down. And as I say, it's the same in Bioshock Infinite.
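A quick ratio check on those numbers (Python again), just to show what I mean:

```
# "them" = the Techspot figures at 1200x800, "mine" = my runs at 1200x720 with 2133MHz RAM.
them = {"a10": 69, "r7_250": 108, "dual": 117.7}
mine = {"a10": 89, "r7_250": 146, "dual": 141}

for cfg in ("a10", "r7_250", "dual"):
    print(f"{cfg:>7}: mine/them = {mine[cfg] / them[cfg]:.2f}")
# A10-alone and 250-alone both land around 1.3x in my favour, but the dual-graphics ratio
# drops to ~1.2x, because their dual result beats their 250 alone while mine falls below it.
```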
I read this in the comments: 'I had an HP laptop with a 6750 Discrete and AMD A8 with Dual Graphics... it came out pretty much as the benchmarks here. With the Dual Graphics disabled it ran faster 90% of the time. Unless the APU's GPU, and the discrete are clocked the same, with the exact same memory speed (meaning you'd need to get the DDR3 version,) I'm guessing the system needs to wait to sync the two.'
So maybe mismatched RAM frequency is the cause, I thought. But I checked, and the review used GDDR5 versions of both cards, just like I am.
So who knows - maybe there's a setting hidden somewhere, maybe it's a driver issue, maybe it's a hardware issue. I don't know. I'm done. I've spent far too much of the last few days running damned benchmarks and as far as I'm concerned Dual Graphics is a load of chuff. Even if you're getting the results others are, it's only in some games, so best case you're opening yourself up to a world of settings-tweaking pain. In my case, after my own world of benchmarking pain, dual graphics provides zero benefit. I don't have time to keep troubleshooting this, so I'm just going to get a proper graphics card and run that.
... And always remember how much more efficient it would have been to just buy a proper CPU and a proper GPU from the start...