0 Replies Latest reply on Mar 16, 2010 6:31 PM by FGHades

    [DX9] HLSL lerp weird bug, SM3.0 pixel shader



      I'm getting a weird result in some pixel shader output.

      I sample an R16G16 surface four times (using point sampling) to compute a manual bilinear filter. (I can't use hardware bilinear filtering because I'm packing different kinds of data into this surface.) The result is written to an ARGB16F render target.
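      For context, here is a hypothetical sketch of the kind of channel packing that forces point sampling: two unrelated values stored in one 16-bit channel, one in the integer part and one in the fractional part. This is NOT the poster's actual scheme; `PackDepthFade` and `UnpackFade` below are illustrative names only (the post only shows `UnpackFade` being called).

```hlsl
// Hypothetical packing scheme, for illustration only.
// Hardware bilinear would blend the raw packed values and corrupt
// both fields, which is why point sampling + manual filtering is used.
float PackDepthFade(float depth255, float fade01)
{
    // depth255 in [0, 255] goes in the integer part,
    // fade01 in [0, 1) goes in the fractional part.
    return floor(depth255) + saturate(fade01) * (255.0 / 256.0);
}

float UnpackFade(float packedValue)
{
    // Recover only the fade field from the fractional part.
    return frac(packedValue) * (256.0 / 255.0);
}
```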

      Screen : http://img52.imageshack.us/img52/3778/weirdf.jpg

      Shader sample:


       // Four point-sampled taps on the packed surface.
       // o.xw = (dx, 0), o.wy = (0, dy), o.xy = (dx, dy), with o.w = 0.
       float f1 = tex2D(ShadowSampler, _texcoord.xy).g;
       float f2 = tex2D(ShadowSampler, _texcoord.xy + o.xw).g;
       float f3 = tex2D(ShadowSampler, _texcoord.xy + o.wy).g;
       float f4 = tex2D(ShadowSampler, _texcoord.xy + o.xy).g;

       // Unpack the fade value from each tap
       float fV0 = UnpackFade(f1);
       float fV1 = UnpackFade(f2);
       float fV2 = UnpackFade(f3);
       float fV3 = UnpackFade(f4);

       // Bilinear filter on the fade values; d.xy are the fractional weights
       fFade = lerp( lerp( fV0, fV1, d.x ), lerp( fV2, fV3, d.x ), d.y );
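      The snippet relies on `o` (texel offsets) and `d` (fractional weights) being set up elsewhere. A minimal sketch of that setup, assuming `shadowMapWidth`/`shadowMapHeight` uniforms (names are assumptions, not taken from the post):

```hlsl
// Assumed setup for the snippet above; the post does not show it.
float2 texelSize = float2(1.0 / shadowMapWidth, 1.0 / shadowMapHeight);

// Offsets packed so that o.xw = (dx, 0), o.wy = (0, dy), o.xy = (dx, dy).
float4 o = float4(texelSize.x, texelSize.y, 0.0, 0.0);

// Fractional position of the sample inside its texel, used as lerp
// weights. Depending on the sampling convention, a half-texel bias
// (uv * size - 0.5) may be needed here.
float2 d = frac(_texcoord.xy * float2(shadowMapWidth, shadowMapHeight));
```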



      When debugging the weird pixel in PIX, the oC0 register reports the correct computation (i.e. the intended result), but the pixel written to the render target does not match it.

      I can send a PIX capture of this problem. I've tested it on a 2900XT and a 3850; the problem doesn't occur on an NVIDIA 8800 GTS. Maybe someone could confirm that it's a driver problem, or point me to a workaround...

      Thanks for any help!


      Driver Packaging Version 8.702-100202a-095692C-ATI 
      Catalyst™ Version 10.2 
      Provider ATI Technologies Inc. 
      2D Driver Version 
      2D Driver File Path /REGISTRY/MACHINE/SYSTEM/ControlSet001/Control/Class/{4D36E968-E325-11CE-BFC1-08002BE10318}/0000 
      Direct3D Version 
      OpenGL Version 
      Catalyst™ Control Center Version 2010.0202.2335.42270 
      AV Stream (T200) Driver Version