OpenGL & Vulkan

nmanjofo
Journeyman III

OpenGL: 16-bit shader storage buffer problem

I am implementing a compute shader that uses a 16-bit SSBO, defined as

layout(std430, binding = 10) writeonly buffer _chunkDesc {
    uint16_t chunkDesc[];
};

using

#extension GL_AMD_gpu_shader_int16 : enable
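
For context, a minimal version of the compute shader that writes this buffer looks roughly like this (the workgroup size and the values written are illustrative, not my real code):

#version 450 core
#extension GL_AMD_gpu_shader_int16 : enable

layout(local_size_x = 64) in;

layout(std430, binding = 10) writeonly buffer _chunkDesc {
    uint16_t chunkDesc[];
};

void main()
{
    uint i = gl_GlobalInvocationID.x;
    // Each invocation writes one 16-bit element. Reading the buffer
    // back on the CPU as uint16_t shows every other element as zero,
    // as if each uint16_t occupied a full 32-bit slot.
    chunkDesc[i] = uint16_t(i);
}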

The problem is that when I read the data back on the CPU, the values appear to be laid out as if each element were 32-bit rather than 16-bit:

value, 0, value, 0, value, 0, value, 0, ...

This works as expected on NVIDIA cards (the values are tightly packed, with no interleaved zeros). What am I missing here? The compute shader header is:

#version 450 core
#extension GL_ARB_gpu_shader_int64 : enable
#extension GL_AMD_gpu_shader_int64 : enable
#extension GL_ARB_shader_ballot : enable
#extension GL_NV_gpu_shader5 : enable       // 16-bit types on NVIDIA
#extension GL_AMD_gpu_shader_int16 : enable // 16-bit types on AMD

Tested on an 8 GB RX 480 and a Vega 64 with the latest driver; both give the same result.
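
If it helps frame the question: the layout I expect from uint16_t is what manual packing into a 32-bit buffer would produce. A sketch of that fallback (storeChunk16 is a hypothetical helper here, and the buffer cannot stay writeonly because the atomic reads it):

layout(std430, binding = 10) buffer _chunkDescPacked {
    uint chunkDescPacked[];
};

void storeChunk16(uint index, uint value16)
{
    uint word  = index >> 1;          // two 16-bit values per 32-bit word
    uint shift = (index & 1u) * 16u;  // low or high half of the word
    // atomicOr so two invocations writing the two halves of the same
    // word do not race; assumes the buffer is zero-initialised.
    atomicOr(chunkDescPacked[word], (value16 & 0xFFFFu) << shift);
}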

Thanks for any help.

xhuang
Staff

Hi, would you please share a minimal source code sample and/or binary to help us investigate this issue efficiently?
