Hey all! I’m tinkering with a Radeon Pro V340L (dual Vega 10 GPUs, 8GB Hynix HBM2 each) that I snagged off eBay. I'm trying to crossflash it to a Vega 56 vBIOS so it plays nice with Windows 10. Since AMD’s official Vega 56 BIOS only works with Samsung memory (and this card uses Hynix), I flashed both GPUs with an ASUS Strix Vega 56 BIOS instead (using ATIFlash 2.93 as well as a CH341A hardware flasher).
Current Status:
GPU #1 works perfectly after flashing.
GPU #2 shows Code 43 in Device Manager. If I disable/re-enable it, Windows claims it’s "working," but it’s clearly not functional.
What I’ve Tried:
Latest Adrenalin + Enterprise drivers: Both fail immediately.
Using modded Radeon ID (R.ID) drivers: Only fully supports GPU #1.
This feels like a driver/software hiccup. I’ve dealt with similar weirdness before (like flashing a FirePro S9300 x2 to a Fury X BIOS), but this one’s got me stumped. I’ll drop some pics below for context. Anyone run into this before? Open to ideas! I also want to note that I'm using my motherboard's integrated graphics for display out; I'm just planning on using the GPUs for compute or passing them through to VMs for 3D rendering. My motherboard is an ASRock Rack EP2C612WS with dual Xeon E5-2699 v3s. It could be a PCIe config issue, but I don't know what other settings to mess with. Thanks!
(Note: the cards only display as a V64 due to the driver. They are flashed with this BIOS, not a Vega 64 one.)
Chipset BIOS Settings
Device Manager
Code 43 on Second GPU
GPU-Z on each GPU
HWiNFO shows all data for one GPU, but not the other?
Task Manager only showing one GPU
Solved!
I have a few of these coming in the mail. There's not much info about them.
Same situation as me. I'm waiting to snag a few more until I can get it working how I'd like. Please let me know how things go for you!
Ok, my plan is a poor man's cloud gaming server. I'm gonna try Windows Server 2016 with 4 Win 11 VMs, with the 2 cards driving the graphics on all 4. I'm going to attempt to use the default outdated drivers. Did you have issues with default drivers?
I wanted to potentially do something similar. I just did a quick Windows 10 Pro install on the system to begin with. No default AMD drivers I found would work. If you can find a driver package you'd want me to try, send it over.
https://www.amd.com/en/resources/support-articles/release-notes/RN-PRO-WIN-19Q2-V340.html
I found version 21 somewhere as well. The release notes don't include a Vulkan version, which I found odd.
Let me reflash them to the default vBIOS and test that driver. Give me maybe ~1 hour.
Driver install gets stuck at "detecting hardware" if selecting express install, or "Checking for new drivers" if selecting custom install.
Just tried the 18.q4.1 version from here and it errors out with:
Well at least you figured out how to get it to work as a Vega 56. We won't be stuck with complete trash, lol. I wish we could get some guy that ran these cards to tell us how. Mine were shipped a couple days ago, so I will start playing with them soon.
Wish I could get them as dual V56's though lol, that would be best case scenario
It's gotta be something to do with the virtualization controller. I've read that they work with the amdgpu open-source Linux driver, and they show up as 2 devices according to the vendor I got mine from. So Proxmox is my next bet for accomplishing my goals with them.
There is definitely an FPGA on the back of the card, possibly for some virtualization weirdness? Idk, they should just work as 2 separate GPUs, but I also had some weirdness when I was using my FirePro S9300 x2s flashed to Fury X BIOSes. I'm not sure overall. There are also some resistors on the card, maybe for identifying the slave/master GPU? I'll upload some pictures of the front and back of the PCB. Sorry if the quality isn't great.
I was just looking: you need to enable SR-IOV in the BIOS.
This made it work? Can you update them to Vega 56 and still use them both? This would make driver support a lot easier.
Yes, it works now. Thank you so much! The forum is rate-limiting my replies, so it wouldn't let me respond. They're both currently working as Vega 56s :)) I just bought another 3 cards.
Yeah, I got rate limited too. They must not have good servers. Can you point me to a guide to flash them? Thank you too; I've been nervous since I purchased something nobody has ever used.
I was able to use ATIFlash 2.93 from TechPowerUp. You need to run CMD as administrator, navigate to the directory, and run "amdvbflash.exe -f -p gpu_num biosfile.rom", where "gpu_num" is the index of the GPU in the system (if you only have one of these cards, that's 0 and 1) and "biosfile.rom" is the ROM you are flashing. The ASUS V56 BIOS I mentioned in my original post works great.
Thank you, and good luck, cheapest best GPUs ever
Unfortunately, after a boot yesterday both GPUs stopped being recognized. Same issue as before. I've been trying various driver reinstalls with no luck so far. Will look into this further. Kinda bummed I might have 4 cards now that can't work fully.
https://www.amd.com/en/resources/support-articles/release-notes/RN-PRO-WIN-21-Q3-10.html
The V340 is not in the release notes, but it is in U0373467.inf. This is one of the newest drivers I've found that's likely compatible.
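If you want to check a driver package yourself, searching its INF for the card's hardware ID is a quick sanity check. A rough sketch below; note that DEV_6864 is an assumed Vega 10 device ID on my part, so substitute whatever Device Manager reports under Details > Hardware Ids for your card.

```shell
# Rough sketch: check whether a driver INF lists your card's hardware ID.
# Real usage:   grep -i "DEV_6864" U0373467.inf
# (or in CMD:   findstr /i DEV_6864 U0373467.inf)
# DEV_6864 is an assumed device ID; read the real one from Device Manager.
# Demo against an inline sample line so this snippet runs anywhere:
echo 'PCI\VEN_1002&DEV_6864' | grep -c 'DEV_6864'   # prints 1 if the ID matches
```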
With R.ID drivers on the default BIOS, only one card shows up yet again. I'll try that driver shortly and get back to you.
What if only one card can work in one system, like you have to pass through the other card? I know in a system where I had a Fury and onboard Vega graphics, I couldn't get both functioning at the same time due to driver issues.
That's most likely due to different GPUs, like some APU/dGPU switching weirdness.
https://www.reddit.com/r/sysadmin/s/9gbG7qERlG
This one has the Code 43 problem too.
That seems to be within a VM when using GPU-P, I'm just trying to get them working properly on the host first without virtualizing.
Hey, just a few questions.
Do you mind telling me how you flashed it?
I know you said you used the ASUS BIOS, but which tool did you use (you listed both ATIFlash and a CH341A)?
Are there 2 BIOSes on the card that need to be flashed?
Just looking for some more in-depth info, because I have 3 of these cards and I can only use them in Linux, but I'd love to try Windows.
Hey there! Sorry I just now saw your message. To address your questions:
You must choose a BIOS to flash that matches the memory on your GPU; in my case, that was Hynix. All of the official AMD Vega 56 BIOSes are for Samsung HBM2 and thus wouldn't boot. The same goes for all Vega 64 BIOSes as well, unfortunately, since they all use Samsung. I chose the ASUS BIOS since it had the highest power limit in TechPowerUp's database, and I could always just lower that limit rather than trying to raise the default limit with regedits or other trickery.
I used ATIFlash 2.93, as it lets you bypass SSID mismatches with the BIOS files. However, at one point after all the flashing, at least one of the EEPROM chips got corrupted. To fix this I used a CH341A with the 3.3 V mod. You shouldn't need to do this, but it's good to know how, and don't be surprised if it's necessary. The process wasn't too difficult.
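For anyone else who ends up with a corrupted chip, the CH341A recovery path roughly looks like this. This is a sketch using flashrom (a common choice for CH341A programmers); the filenames are placeholders, and the commands are echoed here rather than executed, since they need the programmer clipped onto the card's EEPROM:

```shell
# Recovery sketch with a CH341A (3.3 V modded) and flashrom.
# Filenames are placeholders; commands are echoed, not run, because they
# require the programmer physically attached to the EEPROM.
echo "flashrom -p ch341a_spi -r backup.rom"       # 1. dump the chip contents first, just in case
echo "flashrom -p ch341a_spi -w vega56_asus.rom"  # 2. write the known-good ROM (flashrom verifies after writing)
```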
And yes, you need to flash both BIOSes, as there are 2. These GPUs are essentially separate cards with separate power delivery, EEPROMs, etc., just on one PCB. In Windows and overall, this one card functions as 2 separate ones. They just share a PCIe finger and communicate over a PLX PCIe switch. Both GPUs get a full PCIe 3.0 x16 link; it's not bifurcated!
Just a heads up so you don't have the same issue I did, your BIOS either needs to have SR-IOV or it has to support it by default for both GPUs to work.
Best of luck and let me know if you need any further help!
Thanks for the info. Just one more question (sorry): since there are two BIOS chips, how do you make sure ATIFlash does both? Is it one at a time? If so, how do you select them? I've used both ATIFlash and a CH341A before, but this is new to me.
No worries, don't be sorry lol. Yes, you flash each separately, and they show up as 2 different GPUs in ATIFlash. You need to run CMD as administrator, navigate to the directory, and run "amdvbflash.exe -f -p gpu_num biosfile.rom", where "gpu_num" is the index of the GPU in the system (if you only have one of these cards, that's 0 and 1) and "biosfile.rom" is the ROM you are flashing. You can also run "amdvbflash.exe -i" to get a list of the GPUs installed and their info. You run the flash command once for each GPU.
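Putting that together, the full sequence for one card looks roughly like this. The ROM filename is a placeholder, and the commands are echoed rather than executed here, since amdvbflash needs the actual adapters present; run the real ones from an elevated CMD in the ATIFlash directory:

```shell
# Flashing sketch for one V340 (two adapters, indices 0 and 1).
# "vega56_asus.rom" is a placeholder filename; commands are echoed, not
# run, because the real ones need the card installed in the system.
echo "amdvbflash.exe -i"                       # list adapters and their current BIOS info
echo "amdvbflash.exe -f -p 0 vega56_asus.rom"  # flash GPU 0 (-f forces past the SSID mismatch)
echo "amdvbflash.exe -f -p 1 vega56_asus.rom"  # flash GPU 1
```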
Also, did you have any problems with clock stability? @cgavaller2
I haven't done any extensive tests yet; I'm waiting on a better fan to arrive in the mail before I 3D print a shroud for it, repaste the cards, and replace their thermal pads.