I'm working on a Master's thesis about rendering algorithms and researching Vulkan for my thesis advisor at the same time, and I've run into a bit of trouble. Among the people I know, I've only been able to find a single AMD GPU user. He has a Radeon HD 7800 and told me that my program crashes when trying to run the deferred tiled renderer.
Unfortunately, my university didn't help much either, since the only Vulkan-capable AMD GPUs I've found there are inside the holy iMacs that I wasn't allowed to defile by installing Windows through Boot Camp.
The program runs fine on non-AMD GPUs. So I guess either I'm doing something wrong and AMD GPUs are less lenient about it, or the problem is on that guy's PC. The validation layers only emit a single performance warning and nothing more serious.
I'd be really grateful if someone who owns an AMD GPU could start the program, switch to the 3rd algorithm out of 5 in the dropdown (the GUI is in my native Lithuanian, sorry), and tell me whether it crashes. If it does, I'd really appreciate a stack trace, because I currently have no clue what's going wrong.
While I'd love to share the source code as well, some people at my university don't like it when students do that before the thesis defense, and I'd rather avoid the potential trouble.
Here's a link to my program: https://drive.google.com/drive/folders/0B9GqACYn0ETzajNRSUlReWpGeHM?usp=sharing
Some other potentially useful notes:
- Press Q to quit; hold the right mouse button and use WASD to move the camera
- This is a build with debug symbols, compiled with GCC on Linux and MinGW-w64 on Windows.
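In case it helps anyone willing to test: since the build ships with debug symbols, a backtrace can be captured without an interactive debugging session by running the program under gdb in batch mode. This is only a sketch — `./renderer` is a placeholder, so substitute the actual executable name from the download:

```shell
# Launch the program under gdb; if it crashes, automatically print
# a backtrace for the crashing thread and then for all threads.
# "./renderer" is a placeholder for the real executable name.
gdb --batch \
    -ex run \
    -ex bt \
    -ex "thread apply all bt" \
    ./renderer 2>&1 | tee backtrace.txt
```

On Windows, the gdb bundled with MinGW-w64 accepts the same flags, so the same command should work in an MSYS2 or similar shell. The `tee` just saves a copy of the output to `backtrace.txt` so it's easy to paste here.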