
    Why can a wrong lightmap happen when using gaming display cards?

    neuronum

      I have heard that, compared to professional display cards, gaming display cards may produce a wrong lightmap when using the viewport to view some scenes in 3ds Max.

      How can this happen? Both pro cards and gaming cards run the same viewport render program, and that program, written in the DX11 shader language, should determine everything that happens in the viewport. Yet in practice one card renders correctly while the other does not.
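
      To make concrete what I mean by "the same viewport render program", here is a minimal sketch of a diffuse-lighting pixel shader in HLSL. This is not the actual 3ds Max viewport shader; the entry point name, semantics and constant buffer layout are my own placeholders. The point is only that both a gaming card and a professional card are handed this identical source code:

        // Hypothetical stand-in for a viewport lighting shader (not from 3ds Max).
        cbuffer Light : register(b0)
        {
            float3 g_LightDir;    // placeholder: light direction in world space
            float3 g_LightColor;  // placeholder: light colour
        };

        float4 PS_Diffuse(float3 worldNormal : NORMAL) : SV_Target
        {
            // Simple Lambert term; every card receives exactly this code.
            float nDotL = saturate(dot(normalize(worldNormal), -g_LightDir));
            return float4(g_LightColor * nDotL, 1.0f);
        }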

      It would seem ridiculous for the same program to yield different results when run on different CPUs.

      I presume that, to chase higher frame rates in games, the driver vendors (AMD or Nvidia) override some of the original D3D11 code with more efficient versions, and that this leads to the wrong render result.
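
      As a hypothetical illustration of what I am presuming (the shader and names below are made up, not taken from 3ds Max or any driver): as far as I understand, the HLSL compiler and driver are allowed to fuse a multiply and an add into a single mad instruction unless the value is marked precise, and the rounding of the fused form can differ from the separate operations, so each vendor can legally produce slightly different numbers from the same line of source:

        // Hypothetical shader; identical HLSL source is given to both cards.
        float4 PS_Wrap(float3 n : NORMAL, float3 l : TEXCOORD0) : SV_Target
        {
            float d = dot(normalize(n), normalize(l));

            // One driver may emit a separate multiply then add; another may
            // fuse them into a single mad. Both lowerings are legal, but the
            // last bits of the result can differ between them.
            float wrap = saturate(d * 0.5f + 0.5f);

            return float4(wrap, wrap, wrap, 1.0f);
        }

      Whether such tiny differences could really account for a completely wrong lightmap, rather than just slight noise, is exactly what I am unsure about.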

      Is this true? If not, what is responsible for the different render results?

      [Attached screenshot: 45c6eabb-b8e7-43f1-8acc-bebdc65f58c0.jpg]