General Discussions

Anonymous
Not applicable

Blender, Unreal, and Unity: know which to use and when. Here's how and why.

Okay, so Blender is a 3D modelling tool built on imaging and graphics techniques going back to the 1970s: a camera-based viewpoint with vanishing points, 2D raster output, and surface-level optimizations. Imagine a cube: instead of drawing all six sides, only the side facing the screen gets drawn. That's how you can make a Hollywood, Toy Story, Pixar-style CG cartoon with it, and it lets you use more efficient shadows, less memory, and less hardware work, saving production and editing time and increasing productivity. It works with all the other major 3D software now, and they support the same tricks, but Blender blends this "not quite 3D" shortcut rendering with "real 3D" into a carefully balanced system that looks like real 3D when needed, for lighting, shadows, and everything else.
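The "only draw the side facing the screen" trick described above is the idea behind back-face culling. Here's a minimal Python sketch of it; the function and variable names are made up for illustration, not from Blender's actual API:

```python
# Minimal back-face culling sketch: a face is drawn only if its
# normal points toward the camera, i.e. the dot product between the
# face normal and the view direction is negative.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def is_front_facing(face_normal, view_dir):
    """True if the face points toward the camera and should be drawn."""
    return dot(face_normal, view_dir) < 0

# A unit cube has six face normals; with the camera looking down -Z,
# only the +Z face (the one turned toward the screen) survives culling.
cube_normals = {
    "+X": (1, 0, 0), "-X": (-1, 0, 0),
    "+Y": (0, 1, 0), "-Y": (0, -1, 0),
    "+Z": (0, 0, 1), "-Z": (0, 0, -1),
}
view_dir = (0, 0, -1)  # camera looking along -Z, into the scene
visible = [name for name, n in cube_normals.items()
           if is_front_facing(n, view_dir)]
print(visible)  # → ['+Z']
```

Five of the cube's six faces never get drawn, which is exactly where the memory and shadow-work savings come from.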

In the 90s, Unreal's developers saw what real high-end AMD hardware could render and were amazed, but they knew you'd need an actual AMD computer, or something impossibly powerful and expensive, to do it in hardware; it couldn't be faked cheaply. What they did understand was the diminishing returns of the last 10% of that 100% quality, which many people running fake-it hardware from Intel and Nvidia don't believe in when gaming on ultra. Since Unreal was about marketing cheap games for children and entertainment, their software quickly became "all fun and games", literally. Simulating a universe and reality has effectively infinite, exponential maths and performance demands on hardware, so capping decimal places, shortening numbers, limiting floating-point operations, and truncating everything so it could in theory top out at 95% of "reality" or whatever, is actually trillions of times more efficient, and it let them pretend they were still working with real maths and real numbers. For the most part, since it was all dialled down and turned off for consumers and their underpowered machines and lousy 70s-looking rabbit-ears TVs and monitors, it wasn't a visibly bad move at all, because most people didn't know they could have been gaming in photorealism. This Unreal-engine system of truncating, scaling down, and pretending it's the same was then taken to the absolute extreme by Intel and Nvidia trying to imitate AMD, using the cheapest pseudo-computers and the smallest not-quite-calculators to run the same equations with everything cut down to one decimal place and three-figure numbers or whatever; you get the point.
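The precision-capping idea above can be sketched in a few lines: quantize values to a fixed number of fractional bits, trading the last sliver of accuracy for a much smaller, cheaper representation. This is a toy fixed-point illustration, not how any specific engine actually stores numbers (real engines typically use IEEE 754 half-precision floats for the same reason):

```python
# Quantize a float to the nearest multiple of 2**-frac_bits.
# Fewer fractional bits = fewer "decimal places" = cheaper maths,
# at the cost of a bounded rounding error.

def quantize(x: float, frac_bits: int) -> float:
    """Round x to fixed-point with the given number of fractional bits."""
    scale = 1 << frac_bits
    return round(x * scale) / scale

full = 3.14159265358979
coarse = quantize(full, 8)   # 8 fractional bits, roughly 2-3 decimal digits
print(full, coarse)          # coarse is close, but far cheaper to store

# The error is bounded by half the quantization step:
step = 2 ** -8
assert abs(full - coarse) <= step / 2
```

The "last 10% of quality" argument is visible right here: the quantized value is indistinguishable at a glance, while the representation cost drops dramatically.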

Unity decided to properly scale down or up. You don't limit polygons, floating point, decimal places, resolution, image quality, lighting, shadows, and countless other things by completely different arbitrary amounts until your cheap not-quite-computer can look like it's capable of pretending to function. Instead, Unity used a system where every value in the engine scales up or down with one setting: performance and quality are all linked, essentially one database field. Imagine Unity using something like the Windows performance index score: if your computer's a 10, they give you a 10 in graphics and the whole game looks and runs at max. That was the idea behind it. Unity is built to run on real computers with fewer cores, or whatever you have left of your AMD machine, or the cheapest models on the market at the time, say a non-XT 5500 versus a 5500 XT, versus a 5700 XT or a top-end 7900 XTX. You then see and hear the same content either way; it just varies in quality overall based on your total system's ability, instead of swapping in different model complexity, varying the lights and shadows, or outright disabling the most performance-intensive features the way the Intel and Nvidia approach does.
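That "everything linked to one field" idea can be sketched as a single master quality score driving every setting at once. The setting names and ranges below are entirely made up for illustration; Unity's real equivalent is its `QualitySettings` tiers, which bundle many parameters per tier:

```python
# One master quality value in [0, 1] drives every graphics setting,
# instead of each being tuned independently. All names and ranges
# here are hypothetical.

from dataclasses import dataclass

@dataclass
class GraphicsSettings:
    resolution_scale: float   # fraction of native resolution
    shadow_distance: float    # metres of shadow draw distance
    lod_bias: float           # 0 = full-detail meshes, higher = coarser
    max_lights: int           # dynamic lights rendered per object

def settings_for(quality: float) -> GraphicsSettings:
    """Map one master quality score to all settings at once."""
    q = max(0.0, min(1.0, quality))   # clamp to [0, 1]
    return GraphicsSettings(
        resolution_scale=0.5 + 0.5 * q,   # 50%..100% of native res
        shadow_distance=20 + 130 * q,     # 20 m .. 150 m
        lod_bias=2.0 * (1.0 - q),         # coarser meshes at low quality
        max_lights=int(1 + 7 * q),        # 1..8 lights
    )

# A 10/10 machine runs everything at max; a 5/10 machine sees the
# same content at proportionally lower fidelity across the board.
print(settings_for(1.0))
print(settings_for(0.5))
```

The point of the design is that nothing is ever turned *off*: every machine gets the same scene, just scaled.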

So, now you know what does what, and when to use each.
