
High GPU present times with simple scenes (Win7, GPUView, ATI Radeon)

Question asked by brianl on Aug 14, 2013

I have an application where low latency is critical. I have a windowed scene of roughly 1800 x 1100 pixels, and each frame I simply clear a 10x10 area of it and call D3D9's Present().

 

Using GPUView, I'm seeing ~5 ms just for the GPU to finish presenting the scene (ATI Radeon HD 4550). This seems strangely high. I don't wait for vsync, and I've tried all the available swap effects (discard, flipex, copy) without seeing any improvement. Running the same test on an NVIDIA card (in a slower machine), my GPU frame times are around 4x faster.
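For reference, the setup described above amounts to roughly the following sketch. Function names, the window handle, and the back-buffer dimensions are assumptions filled in from the post; device creation and error handling are omitted, and D3DSWAPEFFECT_FLIPEX additionally requires an IDirect3D9Ex device on Windows 7.

```cpp
#include <d3d9.h>

// Present parameters matching the scenario: windowed, no vsync wait.
void InitPresentParams(D3DPRESENT_PARAMETERS& pp, HWND hWnd)
{
    ZeroMemory(&pp, sizeof(pp));
    pp.Windowed             = TRUE;
    pp.BackBufferWidth      = 1800;            // approximate window size from the post
    pp.BackBufferHeight     = 1100;
    pp.BackBufferFormat     = D3DFMT_UNKNOWN;  // match the desktop format in windowed mode
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD; // also tried: FLIPEX (9Ex only), COPY
    pp.hDeviceWindow        = hWnd;
    pp.PresentationInterval = D3DPRESENT_INTERVAL_IMMEDIATE; // do not wait for vsync
}

// Per-frame work: clear only a 10x10 region, then present.
void RenderFrame(IDirect3DDevice9* device)
{
    D3DRECT r = { 0, 0, 10, 10 };
    device->Clear(1, &r, D3DCLEAR_TARGET, D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);
    device->Present(NULL, NULL, NULL, NULL); // the ~5 ms of GPU time shows up here in GPUView
}
```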

 

Has anyone else seen similar performance problems? Or do I just have a bad card or driver, I wonder?

 

 

----------------------------

 

Windows 7

 

Machine 1

HP, Intel Core i7 @ 2.8 GHz (quad core)

ATI Radeon HD 4550 (desktop GPU)

Driver Date: 4/24/2013 (beta driver download)

Driver Version: 8.970.100.0

 

Machine 2

MacBook Pro, Intel Core 2 Duo @ 2.4 GHz

NVIDIA GeForce 8600M GT (laptop GPU)

Driver Date: 6/21/2013

Driver Version: 9.18.13.2049
