I have an ATI card and I've been working with a lot of images lately. One particularly messy thing I need to do a fair bit is convert from RGB to RGBA. However, depending on the situation I can work in RGBA instead; I just have to ignore the A channel from the source.
I've just downloaded the SDK and started going through it, and found that I cannot access sub-32-bit elements. I'm not sure if that's because it's only OpenCL 1.0 or just an arbitrary limit of the hardware (RV710).
My question is, is there anything that can be done easily?
I suppose that I could 'extract' out each byte, shifting bits around and then build up the resultant RGBA 32bit int by ORing and shifting.
Does that sound like so much work that it'd hardly be worth trying to do on a GPU? The maths likely won't be all that complicated; Multiply or Average will probably be OK, although I think Screen would be most likely to look best.
I'm trying to find out whether the ATI 5570 still has an RV7XX or the Evergreen chipset (you'd think they'd list that somewhere...?); I think I might upgrade. It's important to me, though, to keep the card price well below $100, since that's my target audience.
Thoughts? Or just throw the RV7XX away? I might try it anyway just for experimenting...