DirectX 12 Unleashes AMD FX Processors in Battlefield 1

Staff

Battlefield™ 1 has now been on the scene for a spell, and we hope y’all are having a blast storming the trenches with powerful Great War weapons like the mighty Kolibri. Between rounds, we’ve been crunching the numbers on the new DirectX® 12 renderer in Battlefield 1’s Frostbite Engine, and AMD FX users are in for a real treat: 30-46% higher framerates!1

Here it is, plain as day:

[Chart: Battlefield 1 average framerates, DirectX 11 vs. DirectX 12, on AMD FX processors]

But… how?

The secret lies in a DirectX® 12 feature called “multi-threaded command buffer recording” (MTCBR), which we covered in detail last year. The short version is pretty straightforward: MTCBR allows a game’s “to-do list”—its geometry, texture, physics, and other requests—to be interpreted and passed to the GPU by multiple CPU cores, rather than just one or two cores as in DirectX® 11.

Because the processor can tackle the to-do list more quickly with DirectX® 12, the flow of information into the graphics card can be accelerated, which helps rendering tasks spend less time waiting around for important bits to appear.
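
For the technically curious, here is a minimal, purely illustrative sketch of the idea in plain C++ (std::thread only, not actual Direct3D 12 API calls): each worker thread records its slice of the frame's to-do list into its own command buffer, and the main thread then submits the buffers in order, much like per-thread DX12 command lists handed to a command queue.

```cpp
// Illustrative sketch of multi-threaded command buffer recording.
// NOT real Direct3D 12 code: plain C++ standing in for the concept.
#include <algorithm>
#include <cstddef>
#include <string>
#include <thread>
#include <vector>

struct DrawRequest { int meshId; };   // one item on the frame's "to-do list"

int main() {
    std::vector<DrawRequest> todo(10000);   // geometry/texture/physics requests for one frame
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());

    // One command buffer per worker thread, so recording needs no locking.
    std::vector<std::vector<std::string>> commandBuffers(workers);
    std::vector<std::thread> threads;

    for (unsigned t = 0; t < workers; ++t) {
        threads.emplace_back([&, t] {
            // Each core records only its own contiguous slice of the list.
            const std::size_t begin = todo.size() * t / workers;
            const std::size_t end   = todo.size() * (t + 1) / workers;
            for (std::size_t i = begin; i < end; ++i)
                commandBuffers[t].push_back("draw mesh " + std::to_string(todo[i].meshId));
        });
    }
    for (auto& th : threads) th.join();

    // The main thread submits the recorded buffers in order, the way a DX12
    // renderer hands its per-thread command lists to the GPU queue in one go.
    std::size_t submitted = 0;
    for (const auto& buf : commandBuffers) submitted += buf.size();
    return submitted == todo.size() ? 0 : 1;
}
```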

In software as in real life: having more hands for a complex job just gets things done a little (or a lot) more quickly. See you on the Battlefield!

Robert Hallock is an evangelist for CPU/APU technologies and IP at AMD. His postings are his own opinions and may not represent AMD’s positions, strategies or opinions. Links to third party sites are provided for convenience and unless explicitly stated, AMD is not responsible for the contents of such linked sites and no endorsement is implied.

FOOTNOTES:

1. Testing conducted by AMD Performance Labs as of 19 October, 2016 on the AMD FX 8370, FX 8350, FX 8300, FX 6350 and FX 6300. Test system: Radeon™ RX 480 GPU, 8GB DDR3-1866, 512GB SanDisk X300 SSD, Windows 10 Pro x64, Radeon™ Software 16.9.2, 1920x1080 resolution, Ultra in-game preset. Average framerates DirectX® 11 vs. 12: AMD FX-8370 (66.9 vs. 86.9), FX-8350 (61.58 vs. 84.89), FX-8300 (58.76 vs. 80.6), FX-6350 (60.03 vs. 80.48), FX-6300 (52.38 vs. 76.24).  PC manufacturers may vary configurations, yielding different results. Results may vary with future drivers. DTV-84
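
As a quick sanity check on the 30-46% headline range, the footnote's averages can be plugged straight into uplift = (DX12 fps / DX11 fps - 1); a small, unofficial C++ snippet that reproduces the range:

```cpp
// Reproducing the headline range from footnote 1:
// uplift = (DX12 average fps / DX11 average fps - 1) * 100.
#include <cstdio>

int main() {
    struct Run { const char* cpu; double dx11, dx12; };
    const Run runs[] = {
        {"FX-8370", 66.90, 86.90},
        {"FX-8350", 61.58, 84.89},
        {"FX-8300", 58.76, 80.60},
        {"FX-6350", 60.03, 80.48},
        {"FX-6300", 52.38, 76.24},
    };
    for (const Run& r : runs)
        std::printf("%s: %+.1f%%\n", r.cpu, (r.dx12 / r.dx11 - 1.0) * 100.0);
    // Prints roughly +29.9%, +37.9%, +37.2%, +34.1% and +45.6%,
    // i.e. the 30-46% range quoted in the post.
    return 0;
}
```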

15 Comments
Adept III

Thanks Robert! You think we'll see multi-GPU DX12 support in Battlefield 1 soon?

Adept II

Has DX12 been patched to work better? No more hitching?

Journeyman III

Hey man, it's great that you are advertising DX12 for this game, but have you actually played BF1 in DX12?

It's very poorly implemented. The frame times and frame rates spike so much on DX12 that it's actually unplayable right now.

I have the same setup you listed: RX 480, SSD, 8 GB RAM running at 1866, FX 6300 @ 4.3 GHz (so technically an FX 6350). The game runs horribly in DX12, and in DX11 its averages are good but the 1% lows are terrible.

Hopefully DICE can fix this in the next update. Also a question, does BF1 running DX12 mode use the ACEs for GCN GPUs?

Adept I

Hm...

I have an 8370 and an RX 480, and the FPS fluctuations when using DX12 are pretty bad. Lots of hitching. Very rarely do I get smooth gameplay. I'm using the latest GPU drivers.

Journeyman III

This seems like a bit of an unfair test, as you used a GPU that is known to benefit substantially from DX12 regardless of the CPU, so it's hard to tell exactly how much of the extra performance is coming from the CPU itself.

Are you able to test a GPU known not to benefit from DX12?

Adept I

The funny part is that DX12 is unplayable right now for almost everyone (broken frametimes and huge FPS drops) on multiplayer. I hope DICE fixes it.

Journeyman III

Comment on DX12/BF1. System:

CPU: FX 6300 at 4.2 GHz locked (cooled with a Thermalright Macho 2, temps under load definitely under 60 °C)

Mobo: ASUS M5A78L-M/USB3, BIOS 2101, added special cooling on the voltage regulators, max PCI-E 2.0 (3.0 not available)

RAM: 16 GB Crucial 1600 at 1728 (base clock 216 MHz)

GFX: XFX 390 DD Black Edition, BIOS 015.049.000.007.000000, not further overclocked

Mon: Asus VG248 (at 144 Hz desktop/games, using the DVI cable that came with it, not DisplayPort)

PSU: XFX 550 W

Storage: Samsung SSDs (EVO 840 + 850), FW up to date; Seagate 2 TB hybrid mechanical HD (not checked for FW update)

OS: Win10 x64 upgraded from Win7 (not a fresh install), stripped down to the bone, updates blocked, cores unparked; Razer Cortex, TeamSpeak, Smart Port Forwarding

Tested gfx drivers: Crimson 16.10.2 and 16.10.3. Settings: power target raised in the driver to +50%, tessellation set to custom/off in the driver

Tested game: BF1 multiplayer in DX11 (fullscreen seems to be faster) and DX12 (borderless seems to be best, and Alt-Tabbing to TeamSpeak in DX12 fullscreen is not so nice). Custom settings used, GPU memory unclamped

Time tested: 112 hrs running DX12 almost all the time; 2 crashes with a DX12 error message so far

Observation:

There is major stuttering going on when DX12 is used for the first time. After a while, once you have "shown DX12 the map," it becomes smooth with no hiccups anymore. This behaviour resembles what happened with Mantle in its early implementations in BF4. The problem is the files in the Documents/Battlefield 1/cache folder: I removed the files, stutter; put them back in, smooth gameplay. I really like DX12 in BF1 because it makes the game much more enjoyable than DX11 on my end; however, getting that cache filled (on an SSD) is a pain. I am not sure which side (AMD/DICE) can fix that issue. Maybe DICE could come up with a DX12 pre-built-cache utility? I don't know, but filling up the cache again after each driver version is not convenient.

Keep it coming, and cheers from Ger.

BTW: Using "renderdevice.usereservedjobthreads 0" helps when filling up the DX12 cache, so something to try out.
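
A rough, unofficial sketch of the backup/restore idea described in the comment above, in C++17; the cache path below is an assumption based on the comment (Documents\Battlefield 1\cache) and should be adjusted for your own user profile.

```cpp
// Rough sketch (not an official tool): save and restore the warmed-up BF1
// DX12 shader cache around driver updates, so you don't have to stutter
// through every map again. Paths are assumptions; adjust them.
#include <filesystem>
#include <iostream>
#include <string>

namespace fs = std::filesystem;

int main(int argc, char** argv) {
    // Assumed cache location based on the comment above; not verified here.
    const fs::path cache  = "C:/Users/you/Documents/Battlefield 1/cache";
    const fs::path backup = "C:/Users/you/Documents/bf1_cache_backup";

    const std::string mode = argc > 1 ? argv[1] : "backup";
    const fs::path& from = (mode == "restore") ? backup : cache;
    const fs::path& to   = (mode == "restore") ? cache  : backup;

    std::error_code ec;
    fs::create_directories(to, ec);
    fs::copy(from, to,
             fs::copy_options::recursive | fs::copy_options::overwrite_existing,
             ec);
    if (ec) {
        std::cerr << mode << " failed: " << ec.message() << '\n';
        return 1;
    }
    std::cout << mode << " done: " << from << " -> " << to << '\n';
    return 0;
}
```

Run it with no argument to copy the warmed-up cache aside, or with "restore" to copy it back after a driver update.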

Adept II

The GPU doesn't matter as long as it's the same one, I think.

Adept III

Hi guys

I don't have the game in question, but perhaps the problem lies elsewhere.

The basic AMD platform is not fully recognized by Windows, and this reverberates through every videogame and other applications.

Try this registry-tweak for FX CPUs...

http://www.mediafire.com/file/bn2s5bewhbbw900/G-RegTweak+FX+v4.0a+Final+%7B64-bit%7D+.zip

See the readme file first.

Adept I

DX12 is not perfect yet, but getting there.

Adept III

I am convinced of what it states; however, software companies continue to crank out DX12 patches for their games, and it would be a shame not to enjoy them fully.

Journeyman III

Hmm, I tried the reg patch on Win10 (creating a restore point first...) and it doesn't do anything on my end. I ran Cinebench R15 and the CPU-Z built-in benchmark and they both came up with the same result. I also searched the registry for "SecondLevelDataCache" and changed that from "0" (autodetect) to "2048" (decimal value), representing the 2 MB second-level cache on my FX 6300. That didn't change anything either. So I think Win10 handles FX CPUs quite well. However, I have the impression that the AMD Crimson drivers don't handle AMD FX CPUs in the optimal way. Sounds absurd, but try running the Intel Compiler Patcher (ICP) 1.0 (Download Intel Compiler Patcher (ICP) - MajorGeeks ) on the extracted AMD Crimson driver, and after patching the unoptimized files let it install (it will complain about the altered files, but continue anyway). I think it helps, but I don't have any numbers, so it may be a voodoo impression on my end; still, it's something to try out IMHO, since it doesn't harm your system in any way. You can always install the original driver again (I do the DDU procedure...).

EDIT: You can also boot Win10 in safe mode and scan/patch the whole Windows directory. The driver files are in "System32" and "SysWOW64". I did that with my complete Win10 installation and encountered no problems at all; however, it is potentially dangerous and you might end up re-installing Win10, although the original files are kept as *.orig (like "amdhcp32.dll.orig").

Adept III

I don't want to go OT, but with those two benchmarks you mentioned I am actually a bit of a special case. I got better scores on my system (FX 6350 @ 4.7 GHz, with CPU/NB @ 2550 MHz) once the BP kicks in (up to 8 points more on CB15).

On Orochi the branch predictor (BP) is very particular (and perhaps risky)... it breaks with the classic K8 and K10 designs. It is based (it seems) on a 'perceptron', which does not rely directly on the code cache. It could therefore improve over time:

http://www.agner.org/optimize/microarchitecture.pdf  (FX on page 33, K8/K10 on page 31).

Try launching the 'Stress CPU' test in CPU-Z to see if the score rises.

Changing the value of "SecondLevelDataCache" could worsen the situation.

I didn't know the story about Crimson being compiled with the Intel compiler... strange...

Anyway, you did not see any benefits in the videogames?

EDIT:

Regarding the 'Compiler Patcher', I was already aware of the matter.

As opposed to CB11.5, CB15 does not seem to take advantage of the Intel compiler, but it could also take advantage of 'AVX' vector instructions.

This code differs between AMD and Intel... it is not identical, so don't exclude that it switches to other, slower SIMD paths.

http://www.polyhedron.com/web_images/intel/productbriefs/3a_SIMD.pdf

It's an old story... It all started with AMD's proposed 'SSE5' instructions, which later turned into XOP...

Journeyman III

Thank you very much for your reply!

I am not into the guts of modern CPUs (the last CPU I programmed in assembler was the Motorola MC68010, no fancy stuff there...). However, instead of running all those benches I gave G-RegTweak FX v4.0 Final another shot, this time in BF1 (btw, I deleted the line "............."="TRUE" from the .reg file). I went on full conquest servers to spots where I know frame drops happen, and it seems better; well, hard to say, it definitely didn't get worse, probably better. I also think the Intel-Compiler-Patched Crimson helps the game. Thanks again for bringing up G-RegTweak FX!

EDIT: This was interesting to read (your link http://www.agner.org/optimize/microarchitecture.pdf  ( FX page 33, K8/K10 page 31))

Journeyman III

The just-released "Fall Patch" for BF1 simply destroys the FPS on the map "Empire's Edge" - no matter whether DX11 or DX12, no matter the GPU brand. I have seen reports that even an i5 4760k is crawling at 30-40 FPS at the G flag now (the fortress with the big tower and the staircase). I guess the problem stems from this statement in the patch notes:

...

"Fixed terrain destruction depth issues on Monte Grappa and Empire's Edge."

...

Obviously that "fix" drives the Frostbite engine code crazy (maybe a recursion gets out of hand?) and produces massive junk CPU load.

Whatever that is - currently it is pointless to do representative benchmarking on BF1. Just wanted to mention this as a warning, keeping my fingers crossed that a new patch comes out soon.