Hi, I'm new to GPGPU programming, so this may be a little difficult for me to pick up quickly. I'd like to measure how long a single Radeon HD 5870 takes to perform, say, 1,000,000,000,000 (10^12) double-precision (or another floating-point format) operations. Is there any short, simple code I could use as a reference to get started? A rough sketch of what I have in mind is below.
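For concreteness, here's a minimal OpenCL sketch of the kind of microbenchmark I mean (OpenCL, since the 5870 is an AMD card): each work-item runs two independent multiply-add chains on doubles, the host times one kernel launch with a wall-clock timer, and GFLOPS is total flops divided by elapsed time. All error checking is omitted, the kernel name `flops` and the work size / iteration count are arbitrary placeholders I picked, and depending on the driver version the 5870 may expose doubles through `cl_amd_fp64` rather than `cl_khr_fp64`, so please treat this as a starting point rather than a validated benchmark:

```c
#include <stdio.h>
#include <time.h>
#include <CL/cl.h>

static const char *src =
"#pragma OPENCL EXTENSION cl_khr_fp64 : enable\n"
"__kernel void flops(__global double *out, const int iters) {\n"
"    size_t gid = get_global_id(0);\n"
"    double a = (double)gid * 1e-9;\n"
"    double b = a + 0.5;\n"
"    for (int i = 0; i < iters; ++i) {\n"
"        a = a * 0.5 + 0.25;   /* 1 mul + 1 add = 2 flops */\n"
"        b = b * 0.5 + 0.25;   /* independent second chain */\n"
"    }\n"
"    out[gid] = a + b;         /* keep the loop from being dead code */\n"
"}\n";

int main(void)
{
    const size_t global = 1 << 20;  /* number of work-items     */
    const cl_int iters  = 10000;    /* loop trips per work-item */

    cl_platform_id plat;
    cl_device_id   dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);

    cl_context       ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q   = clCreateCommandQueue(ctx, dev, 0, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "flops", NULL);

    cl_mem out = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY,
                                global * sizeof(cl_double), NULL, NULL);
    clSetKernelArg(k, 0, sizeof(cl_mem), &out);
    clSetKernelArg(k, 1, sizeof(cl_int), &iters);

    /* Warm-up launch so JIT compilation and first-touch costs aren't timed. */
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clFinish(q);

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clFinish(q);                    /* block until the kernel is done */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double sec   = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) * 1e-9;
    double total = (double)global * iters * 4.0;  /* 4 flops per loop trip */
    printf("%.4f s elapsed -> %.1f GFLOPS (double)\n", sec, total / sec / 1e9);
    return 0;
}
```

Counting 4 flops per loop trip, one launch here does 2^20 * 10,000 * 4, which is roughly 4.2 * 10^10 flops, so `iters` or `global` would need scaling up by about 25x to reach the full 10^12. For scale, the 5870's theoretical peak is about 544 GFLOPS in double precision (2.72 TFLOPS single), so 10^12 double-precision operations should take on the order of 2 seconds even at peak, and a simple microbenchmark like this will land somewhat below that. Something like `gcc flops.c -lOpenCL` should build it (older glibc may also need `-lrt` for `clock_gettime`).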
Thanks very much!