


Threadripper Pro and D4 error


Not sure which forum section to put this in, but I need help with my first AMD build after 25 years of Intels.

My system specs are this:
MB: Asus Pro WS WRX80E-SAGE (latest BIOS from 06-2021)
CPU: AMD Threadripper Pro 3955WX
RAM: 256GB (8*32GB) 3200MHz Kingston ECC
PSU: Corsair AX1600i
NVMe: Seagate Firecuda 520
HDDs: 3* SATA mechanical drives (4 TB, 12 TB, 16 TB)
GPUs: 4* Nvidia 2080Ti
USB: keyboard, mouse, headset, some hub, USB software dongle, Wacom tablet....

And my problem is the GPUs: this AMD/motherboard setup won't let me boot with more than 2 GPUs. As soon as I plug in the 3rd GPU, the POST LCD on the motherboard shows a "d4" error and just sits there with no boot. According to the manual, the d4 error means "not enough PCI-E resources".

I enabled Above 4G Decoding in the BIOS and plugged all 4 PCI-E power cables into the motherboard's power sockets, but still no go.

Some say I need to disable CSM support in the BIOS and enable Fast Boot, but when I disable CSM I can't boot into Windows from my NVMe; it behaves as if the NVMe isn't a boot drive at all and just loops back to the BIOS screen. With CSM disabled I also can't boot or install anything from USB or the DVD drive (like the Acronis True Image boot CD or the Windows installer).

The CPU has 128 PCIe lanes, so there should be plenty of those; my old dual-Xeon with only 80 lanes was able to boot and use these same 4 GPUs just fine, and now that I've upgraded to a newer/faster machine/CPU I can't use more than 2 GPUs for my Octane GPU rendering :(.

Can someone help me here? (I tried Asus support but they are useless; the guy answering is slow to reply and has no idea about hardware in general, let alone what I'm asking.)

Heeeeeeeeelp !


6 Replies

I downloaded your Asus motherboard BIOS PDF. Make sure all 7 of your PCIe slots are set to "x16" in the BIOS:

Screenshot 2021-07-14 172859.png

Most likely they all are by default, but verify anyway if you haven't yet.

You aren't using RAID for your HDDs, are you?

Also, in your BIOS there is a whole section on configuring your PCIe slots:

Screenshot 2021-07-14 172859.png

But I really have no idea how to configure these settings.


According to your motherboard manual, here is the best configuration for running 4 GPUs at the same time:

Screenshot 2021-07-14 172859.png

If you were aware of all this then I am sorry that I repeated it again.




Thanks for the reply and for taking the time to look into this.
Yes, my BIOS is set to x16 on all PCI-E slots (that's the default BIOS setting anyway), but sadly that's not it.


I read that the Threadripper Pro is similar to the server-grade AMD EPYC processor, and you are using a server-type motherboard.

Try posting your question in the AMD Server Gurus forum and see if anyone there can help with your situation:

That AMD forum covers all server-related hardware.


Update: Finally SOLVED


1. I had to install the Microsoft ADK.
2. Then I used an admin console to convert the NVMe boot disk to GPT with the mbr2gpt /convert command.
3. Then I went into the BIOS and disabled CSM (it has to be fully disabled; not even "UEFI & Legacy" works), and then the NVMe can boot Windows with CSM disabled.
4. After that I plugged the GPUs in one by one, installing the Nvidia drivers, and I had to do a driver cleanup (with the DDU tool) in Windows safe mode to get all 4 GPUs installed properly.
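Step 2 above can be sketched as the standard mbr2gpt flow. This is a hedged example, not the poster's exact commands: run it from an elevated command prompt (or PowerShell) inside Windows, note that /allowFullOS is required when running from the full OS rather than WinPE, and note that /disk:0 is an assumption, so check your actual disk number first.

```shell
# Validate first: this makes no changes, it only checks the disk can be converted.
mbr2gpt /validate /disk:0 /allowFullOS

# If validation passes, do the actual MBR-to-GPT conversion.
# Back up the disk before this step; partition-table surgery is risky.
mbr2gpt /convert /disk:0 /allowFullOS
```

After a successful conversion, Windows boots only via UEFI, which is why step 3 (disabling CSM) then works instead of failing.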

Now my GPUs are slotted in slots 1, 2, 3 and 7, and it works with all 4 GPUs.



Good troubleshooting.

I have an Asus motherboard with several internal HDDs. I converted my Windows HDD from MBR to GPT when I was forced to do a clean Windows install, and I also converted all of my internal HDDs to GPT.

But in the BIOS settings I have both UEFI & Legacy mode enabled, otherwise I get no video output during POST on my 4K monitor.

You can't enable UEFI mode in the BIOS with an MBR Windows drive; it won't boot up. You need CSM enabled for an MBR Windows drive to boot up.
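A quick way to check which partition style each drive is using, before touching CSM at all, is from an elevated PowerShell prompt (a sketch; the output columns below are what the cmdlet reports, not specific to this system):

```shell
# Elevated PowerShell: PartitionStyle reads MBR or GPT for each disk.
Get-Disk | Select-Object Number, FriendlyName, PartitionStyle
```

If the Windows disk shows MBR, disabling CSM will stop it from booting until it is converted to GPT.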

I didn't mention this because I assumed your Windows drive was already GPT.

NOTE: You didn't mention which OS you're using, but if it's Windows 10 and you now have UEFI mode enabled, you'll be able to upgrade to Windows 11 later on, as long as TPM 2.0 (either fTPM or hardware TPM) is enabled in the BIOS.
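If you want to check the TPM status from inside Windows before attempting that upgrade, one way (a sketch, both commands are standard Windows tools) is:

```shell
# Elevated PowerShell: shows whether a TPM is present, ready, and enabled.
Get-Tpm

# Alternatively, open the TPM management console, which also shows the spec version (2.0 is required for Windows 11).
tpm.msc
```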

Anyway, glad you got your 4 GPUs working and posted what you did to get them working.

Take care.





Yes, my NVMe was somehow MBR instead of GPT, and I didn't want to reinstall everything again (it took me 3 days to install all the software and transfer licenses/plugins/settings/configs/presets from my old workstation, so reinstalling wasn't an option for me).
I had Legacy enabled in CSM because I couldn't install Windows from my DVD drive otherwise, and apparently that's what formatted my system drive (the NVMe) as MBR instead of GPT.
At the time I didn't consider it an issue, or find it weird, to see the DVD and all the HDDs in the BIOS.
While I was installing everything I had only 1 GPU inserted, in motherboard slot 1, and it all seemed fine; I noticed no issues until I started plugging all the GPUs back in and getting the d4 code :(.

As for BIOS settings, I had to set it to UEFI only. "UEFI and Legacy" did not work for me, so CSM is fully disabled here to make it work.

As for the monitor image during POST, you are right: I don't see the POST screen/BIOS on my slot-7 GPU when using HDMI; it's all just black until Windows. But I have a triple-monitor setup, and when it was a single-monitor setup it somehow worked (though back then, while installing, I had 1 GPU in slot 1, so maybe that's the key part: having the monitor's GPU in slot 1. It's stupid if that's the key, but worth mentioning).

BUT I have a workaround for that with my older monitor and a D-SUB-to-DP converter, so I switch that monitor to its 2nd input if I really want to see the BIOS/POST screen now. It's really stupid, since in my BIOS the display is set to discrete GPU first (instead of the onboard GPU, which I had to turn off with a DIP switch on the motherboard), but I can't pick which slot's GPU should display, so I wouldn't be surprised if it's displaying on one of my 4 GPUs, just not the one I want. My slot 1 and 2 GPUs are on riser extenders and the cards hang from the roof of the case (I made custom aluminum holders), so I can't really plug a monitor into those to check.

But for me it's currently an acceptable workaround (it's not like I enter the BIOS much now that it's set up correctly), and I had the same black screen until Windows during POST on my dual-Xeon Asus Z10PE-D8 WS motherboard, so I wouldn't be surprised if this is some crazy Asus "feature" of those workstation motherboards, where they forgot to include a setting for WHICH discrete GPU/slot shows the image during boot, so it decides automatically, or per PCI/IRQ/port, or by some other logic in multi-GPU setups...