Hi AMD Dev Team,
in our application we get "bad" results from glReadPixels when using 10-bit mode.
Hardware:
- Dell M6500
- AMD ATI FirePro M7740
Software:
- Windows 10 Pro x64 10.0.10586
Driver:
- Package: 15.20.1062.1004-150803a1-187674C
- OpenGL: 6.14.10.13399
The color inside the test shaders is calculated with:
gl_FragColor = vec4( color/1024.0, color/1024.0, color/1024.0, 1.0);
or
gl_FragColor = vec4(...);
The pixel value is read with (Delphi code):
glReadPixels( X, Y, 1, 1, GL_RGBA, GL_UNSIGNED_INT_10_10_10_2, @LUIntVar);
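For reference, the "expected output" column below assumes the component layout the OpenGL spec defines for GL_UNSIGNED_INT_10_10_10_2 with GL_RGBA: red in the 10 most significant bits, then green, then blue, then 2 bits of alpha. A minimal Delphi sketch of that packing/unpacking (PackRGB10A2/UnpackRGB10A2 are our own helper names):

function PackRGB10A2(R, G, B, A: Cardinal): Cardinal;
begin
  // GL_UNSIGNED_INT_10_10_10_2 with GL_RGBA:
  // bits 31..22 = red, 21..12 = green, 11..2 = blue, 1..0 = alpha
  Result := ((R and $3FF) shl 22) or
            ((G and $3FF) shl 12) or
            ((B and $3FF) shl 2) or
            (A and $3);
end;

procedure UnpackRGB10A2(Value: Cardinal; out R, G, B, A: Cardinal);
begin
  R := (Value shr 22) and $3FF; // 10-bit red
  G := (Value shr 12) and $3FF; // 10-bit green
  B := (Value shr 2) and $3FF;  // 10-bit blue
  A := Value and $3;            // 2-bit alpha
end;

For example, PackRGB10A2(1, 1, 1, 3) = 4198407, which is the expected value for color = 1 in the table below.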
The output:
color | glReadPixels (UInt32 / bin) | expected output (UInt32 / bin)
--- | --- | ---
0 | 3072 / 00000000000000000000110000000000 | 3 / 00000000000000000000000000000011
1 | 1074793473 / 01000000000100000000110000000001 | 4198407 / 00000000010000000001000000000111
2 | 2149583874 / 10000000001000000000110000000010 | 8396811 / 00000000100000000010000000001011
3 | 3224374275 / 11000000001100000000110000000011 | 12595215 / 00000000110000000011000000001111
4 | 4201476 / 00000000010000000001110000000100 | 16793619 / 00000001000000000100000000010011
vec4(1.0,0.0,0.0,0.0) | 3146748 / 00000000001100000000001111111100 | 4290772992 / 11111111110000000000000000000000
vec4(0.0,1.0,0.0,0.0) | 3222269952 / 11000000000011111111000000000000 | 4190211 / 00000000001111111111000000000011
vec4(0.0,0.0,1.0,0.0) | 1069547523 / 00111111110000000000000000000011 | 4092 / 00000000000000000000111111111100
vec4(0.0,0.0,0.0,1.0) | 3072 / 00000000000000000000110000000000 | 3 / 00000000000000000000000000000011
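As a worked example: decoding the observed value for color = 1 (1074793473) with UnpackRGB10A2 above yields R = 256, G = 256, B = 768, A = 1 instead of the expected R = G = B = 1, A = 3, so the data is not simply in a different component order.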
What might have gone wrong?
Is this on purpose?
Is there an error in the driver?
When using an 8-bit context and fetching the data via GL_UNSIGNED_BYTE, everything is fine.
When using the GL_UNSIGNED_INT_10_10_10_2 type, the output appears shifted/corrupted.
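For comparison, a minimal sketch of the 8-bit readback that works (assuming a current GL context; ReadPixelRGBA8 is our own helper name):

procedure ReadPixelRGBA8(X, Y: Integer; out R, G, B, A: Byte);
var
  LBytes: array[0..3] of Byte;
begin
  // One byte per component; this returns correct values on the same setup.
  glReadPixels(X, Y, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, @LBytes);
  R := LBytes[0];
  G := LBytes[1];
  B := LBytes[2];
  A := LBytes[3];
end;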
Thx for your help
(edited: added vec4() colors and results)