I have a fairly complex fragment shader (written in Cg) that works under the arbfp1 profile but fails to load under the glsl profile on X1x00 cards. Analyzing it with GPU ShaderAnalyzer, I've found that if I comment out the call to an arithmetic random-number-generator routine, the shader loads (24 GPRs, 390 ALU instructions, 33 TEX instructions). But since the analyzer shows N/A when the shader doesn't load, I can't tell which limit is being breached when the RNG code is present.
The RNG routine doesn't use any texture fetches; compiled on its own it takes 34 ALU instructions and 6 GPRs.
Is there any way to find out what exactly causes the shader not to load? glGetProgramInfoLog only reports "Fragment Shader not supported by HW", and the analyzer says the same.
Since the shader works fine when compiled to ARB assembly, I assume there must be a way to write it in GLSL so that it loads as well. Is that correct?