I'm writing a homebrew DX12 engine (mainly for fun & demoscene purposes).


For the rasterizing part, everything works as expected on both NVidia and AMD hardware, but for the DXR part I'm struggling to make it work on AMD. (I currently have a Radeon RX 6800 XT for testing (Win10 21H2, latest Adrenalin software), but I've also tried on a different machine with a Radeon RX 6700, with the same effect.)

When I enable full debug layer validation:


((ID3D12Debug3 *)D3D12DebugCtrl)->EnableDebugLayer();
((ID3D12Debug3 *)D3D12DebugCtrl)->SetEnableGPUBasedValidation(TRUE);

I'm unable to create ANY root signature with the D3D12_ROOT_SIGNATURE_FLAG_LOCAL_ROOT_SIGNATURE flag.

It crashes the driver instantly upon calling ID3D12Device5::CreateRootSignature.




When I disable GPU-based validation instead:

((ID3D12Debug3 *)D3D12DebugCtrl)->SetEnableGPUBasedValidation(FALSE);

it creates those signatures just fine, but the driver crashes when I call ID3D12Device5::CreateStateObject.

Nothing is spewed to the debug output; the call stack is as follows (not very informative):


amdxc64.dll!00007fffa70dc153() Unknown
amdxc64.dll!00007fffa70dbf6a() Unknown
amdxc64.dll!00007fffa70cb08c() Unknown
amdxc64.dll!00007fffa70cbb1c() Unknown
amdxc64.dll!00007fffa706d27d() Unknown
amdxc64.dll!00007fffa70a84d8() Unknown
D3D12Core.dll!00007fff899907b3() Unknown
D3D12Core.dll!00007fff8991e33a() Unknown
D3D12Core.dll!00007fff89901b23() Unknown
D3D12Core.dll!00007fff898e4f1c() Unknown
d3d12SDKLayers.dll!00007fffaa38bc55() Unknown
d3d12SDKLayers.dll!00007fffaa360bb6() Unknown
d3d12SDKLayers.dll!00007fffaa344f0c() Unknown
D3D12.dll!00007fff9a5a5737() Unknown
D3D12Core.dll!00007fff898f59c8() Unknown
d3d12SDKLayers.dll!00007fffaa358628() Unknown
> SETISequencer.exe!FRHIStateObject::FRHIStateObject(const FComPointer<FRHIDevice> & Dev, const D3D12_STATE_OBJECT_DESC & Desc, unsigned __int64 _Hash) Line 19 C++
SETISequencer.exe!FRHIDevice::GetCachedStateObject(const D3D12_STATE_OBJECT_DESC & Desc) Line 1854 C++

When I use root signatures defined in the DXIL library, everything works fine; but since my path tracer's material system is node-based, I prefer to have full control over the fragments and generate the local root signatures in C++ code (as the whole logic & binding is done there).
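For context, the DXIL-library route that does work for me looks roughly like this (a sketch only; the subobject and export names, and the root signature string, are illustrative placeholders, not my real ones):

```hlsl
// Local root signature and its association, defined directly in the
// shader library source (lib_6_3+). Names are placeholders.
LocalRootSignature ExampleLocalRootSig =
{
    "RootConstants(num32BitConstants = 4, b0)"
};

SubobjectToExportsAssociation ExampleAssoc =
{
    "ExampleLocalRootSig",  // subobject to associate
    "RayGeneration"         // export(s) it applies to
};
```

This compiles into the library as state-object subobjects, so the runtime picks them up without any CreateRootSignature call on my side — which is exactly what I want to avoid, since my binding logic lives in C++.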

All of this works just fine on NVidia hardware (tested on Windows 10 and 11, and a few different RTX models ranging from a 2060 to a 4090), with the debug runtime & validation both on and off.


Even this very simple state object (dump below) crashes the driver:


| D3D12 State Object 0x000000F9423AD688: Raytracing Pipeline
| [0]: DXIL Library 0x000002D59D414980, 6546 bytes
| [0]: RayGeneration --> RG_3E307212_B6CF74EB_??RayGeneration@@YAXXZ
| [1]: RayHit --> RH_3E307212_B6CF74EB_??RayHit@@YAXURayPayload@@UBuiltInTriangleIntersectionAttributes@@@Z
| [2]: RayHitOcclusion --> RH_3E307212_B6CF74EB_??RayHitOcclusion@@YAXURayPayloadOcclusion@@UBuiltInTriangleIntersectionAttributes@@@Z
| [3]: RayMiss --> RM_3E307212_B6CF74EB_??RayMiss@@YAXURayPayload@@@Z
| [4]: RayMissOcclusion --> RM_3E307212_B6CF74EB_??RayMissOcclusion@@YAXURayPayloadOcclusion@@@Z
| [1]: Hit Group (Hit_3D4802DE_AE444268)
| [0]: Any Hit Import: [none]
| [1]: Closest Hit Import: RH_3E307212_B6CF74EB_??RayHit@@YAXURayPayload@@UBuiltInTriangleIntersectionAttributes@@@Z
| [2]: Intersection Import: [none]
| [2]: Hit Group (Occ_3D4802DE_AE444268)
| [0]: Any Hit Import: [none]
| [1]: Closest Hit Import: RH_3E307212_B6CF74EB_??RayHitOcclusion@@YAXURayPayloadOcclusion@@UBuiltInTriangleIntersectionAttributes@@@Z
| [2]: Intersection Import: [none]
| [3]: Local Root Signature 0x000002D59D37FB10
| [4]: Subobject to Exports Association (Subobject [3])
| [0]: RG_3E307212_B6CF74EB_??RayGeneration@@YAXXZ
| [5]: Raytracing Shader Config
| [0]: Max Payload Size: 16 bytes
| [1]: Max Attribute Size: 8 bytes
| [6]: Subobject to Exports Association (Subobject [5])
| [0]: Hit_3D4802DE_AE444268
| [1]: Occ_3D4802DE_AE444268
| [2]: RG_3E307212_B6CF74EB_??RayGeneration@@YAXXZ
| [3]: RM_3E307212_B6CF74EB_??RayMiss@@YAXURayPayload@@@Z
| [4]: RM_3E307212_B6CF74EB_??RayMissOcclusion@@YAXURayPayloadOcclusion@@@Z
| [7]: Raytracing Pipeline Config
| [0]: Max Recursion Depth: 1
| [8]: Global Root Signature 0x000002D5F56009D0
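For completeness, the dump above corresponds to a creation sequence roughly like this (a simplified sketch using the d3dx12.h state-object helpers; ShaderBlob, LocalRootSig, GlobalRootSig and the export name are placeholders, and the hit groups and remaining exports are omitted for brevity):

```cpp
#include <d3d12.h>
#include "d3dx12.h"

void CreatePipeline(ID3D12Device5 *Device, ID3DBlob *ShaderBlob,
                    ID3D12RootSignature *LocalRootSig, ID3D12RootSignature *GlobalRootSig)
{
    CD3DX12_STATE_OBJECT_DESC Desc(D3D12_STATE_OBJECT_TYPE_RAYTRACING_PIPELINE);

    // DXIL library with the ray generation export
    auto *Lib = Desc.CreateSubobject<CD3DX12_DXIL_LIBRARY_SUBOBJECT>();
    CD3DX12_SHADER_BYTECODE Bytecode(ShaderBlob->GetBufferPointer(), ShaderBlob->GetBufferSize());
    Lib->SetDXILLibrary(&Bytecode);
    Lib->DefineExport(L"RayGeneration");

    // Local root signature (created with the LOCAL flag) + association to the raygen export
    auto *Local = Desc.CreateSubobject<CD3DX12_LOCAL_ROOT_SIGNATURE_SUBOBJECT>();
    Local->SetRootSignature(LocalRootSig);
    auto *Assoc = Desc.CreateSubobject<CD3DX12_SUBOBJECT_TO_EXPORTS_ASSOCIATION_SUBOBJECT>();
    Assoc->SetSubobjectToAssociate(*Local);
    Assoc->AddExport(L"RayGeneration");

    // Shader & pipeline config matching the dump (16-byte payload, 8-byte attributes, depth 1)
    auto *ShaderCfg = Desc.CreateSubobject<CD3DX12_RAYTRACING_SHADER_CONFIG_SUBOBJECT>();
    ShaderCfg->Config(16, 8);
    auto *PipelineCfg = Desc.CreateSubobject<CD3DX12_RAYTRACING_PIPELINE_CONFIG_SUBOBJECT>();
    PipelineCfg->Config(1);

    // Global root signature
    auto *Global = Desc.CreateSubobject<CD3DX12_GLOBAL_ROOT_SIGNATURE_SUBOBJECT>();
    Global->SetRootSignature(GlobalRootSig);

    ID3D12StateObject *StateObject = NULL;
    Device->CreateStateObject(Desc, IID_PPV_ARGS(&StateObject)); // <-- this is where the AMD driver crashes
}
```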


Does anyone have any idea why the driver just bails out? (Unreal Engine 5.0 uses a similar way to bind local root signatures, and it works, so it's not that the driver/card is broken; there seems to be something that needs to be set that I'm obviously missing, but the amount of code UE runs in the middle obfuscates all the steps.)

There is not a single piece of information in the debug output about what's wrong; the DX debug layer thinks everything is fine.

I've also tried Microsoft's D3D12RaytracingLibrarySubobjects example from the DX SDK samples and just added a very simple local root signature to it; the effect is identical: it crashes the driver upon creation of the root signature (when GPU-based validation is enabled) or during state object creation (when it is disabled).

To replicate this behaviour in the Microsoft example, one needs to add this code:


std::vector<CD3DX12_ROOT_PARAMETER1> Parameters; // intentionally empty

CD3DX12_VERSIONED_ROOT_SIGNATURE_DESC RootSignatureDesc;
RootSignatureDesc.Init_1_1((UINT)Parameters.size(), Parameters.data(), 0, NULL, D3D12_ROOT_SIGNATURE_FLAG_LOCAL_ROOT_SIGNATURE);

ID3DBlob *SignatureData = NULL;
ID3DBlob *ErrorData = NULL;
if (FAILED(D3DX12SerializeVersionedRootSignature(&RootSignatureDesc, D3D_ROOT_SIGNATURE_VERSION_1_1, &SignatureData, &ErrorData)))
    return;

ID3D12RootSignature *Result = NULL;
m_dxrDevice->CreateRootSignature(0, SignatureData->GetBufferPointer(), SignatureData->GetBufferSize(), __uuidof(ID3D12RootSignature), (void **)&Result); // <-- here the driver will crash


at the beginning of the D3D12RaytracingLibrarySubobjects::CreateRaytracingPipelineStateObject function, and enable SetEnableGPUBasedValidation in the DeviceResources::InitializeDXGIAdapter function.


I've tried various Agility SDK versions (606 (1.606.3), 600 (1.600.10) and 4 (1.4.10)). I've also tried without the Agility SDK, with plain old DX12; the effect is always the same.
