
General Discussions

Anonymous
Not applicable

Mobile phones run the same games, like Genshin Impact or Minecraft, at similar FPS to a gaming PC!

In the 1950s computers were the size of a house and cost more than a house.. way more, millions of dollars.

But businesses and banks wanted cheap computers for the small business or home office, something costing a few thousand. So vendors essentially duct-taped a few calculators together into general-purpose machines that were slow and lousy, but tiny, lightweight, and ran on anything.. computery-looking boxes pretending to be computers. The commerce language for that world was COBOL, standardised in 1959 with IBM among its backers. The cheap-and-practical lineage continued when Bell Labs grew a little language called B into C in the early 1970s, and C became the workhorse of that era.

By 1965 the real computers had vastly different designs and systems and specific custom software, and were genuine supercomputers for their day.. a CDC 6600 ran at roughly 10 MHz, yet with its parallel functional units it was a maths monster with low latency for its era. Industry kept moving: bigger word sizes, more RAM, eventually the march toward 32-bit and 64-bit. People cried that C was behind the times and needed to "get with the times", but everyone had just spent years desperately learning C and still sucked at it.. telling them to learn a new thing for new systems that were hundreds of times faster and better felt dumb, since it would take a year or two of learning. So instead they patched more features onto C and called it C++ (the "++" is literally C's increment operator, a joke meaning "C plus one").. and the thought-to-be-dead workhorse for the common people became an undead zombie we can't get rid of to this day. Its default is still one thread on one CPU, so most software runs like pre-Windows-95 era garbage leaning on decades of compatibility layers.. and even Intel's 11th-gen CPUs trace their core design lineage back 15 to 20 years, depending on how you count.

From the 1970s onward, hi-fi sound systems, TVs, microwaves and other things where you'd press buttons and use stored settings needed cheap computery chips that definitely weren't full computers, and many of them had to be as low-power as possible. That push helped drive Reduced Instruction Set Computers (RISC).. the famous ones nowadays are the ARM chips in our phones and in Apple's M1 laptops. Instead of a big PC CPU with a huge, complex instruction set, RISC keeps the instructions simple and small.. basic 1+1=2 toddler stuff, done fast and cheap. Cameras and similar gadgets used chips like these, and Android itself actually started life at Android Inc. as a Linux-kernel operating system for digital cameras, before Google bought it in 2005 and pointed it at phones.. and since the same cheap low-power chips were in mobile phones, Android ran on those too.. which later bit them when its harsh limitations showed. Chrome OS for their Chromebooks was a sort of bandaid fix on the laptop side, but porting Android to Chrome OS or Chrome OS to Android wasn't really doable.. they tried for years and failed.. then decided they'd make their new OS called FUCHSIA, this time on its own Zircon kernel rather than Linux, and they've been working on Fuchsia for years now! Anyway, software people keep polishing those old turds and using preschool maths and functions, bragging how "optimal" and "fast" it is.. and not to mention EASY!.. while cost-cutting budget hardware fakes the rest in software, the way nvidia and intel have probably cut various corners over the years from how I see it. It's long since been a serious problem, because after about 2003 computer hardware went multi-GPU and multi-core, and the order and steps of doing things are altogether physically built different in many places.

Look at the 70's floppy disk drive or the 90's CD-ROMs versus modern NVMe drives.. but game devs still drive them all with C++, a language whose roots go back to the early 80s (and to C in the 70s), and whose default is a single thread on a single core.. which is why code can run almost as awfully on your AMD Threadripper that cost thousands as it does on your mobile phone! Because nobody's ever bothered to learn what a computer looks like and how to use one.. they're taking away our computers altogether! Use modern programming languages and modern APIs, copy files straight through into GPU memory for hardware decoding, and remove all the buffering and other useless delays. Develop using Rust or C# and F# or Dart, and then pair it with VULKAN or DX12.
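As a sketch of the kind of multi-core code modern languages make easy, here's a plain standard-library Rust example splitting a workload across every available core. This is a hypothetical illustration, not from any game engine; `parallel_sum` is just a name made up for the demo:

```rust
use std::thread;

// Split a slice across all available CPU cores and sum it in parallel.
// Scoped threads let each worker borrow its chunk directly, no copying.
fn parallel_sum(data: &[u64]) -> u64 {
    let cores = thread::available_parallelism()
        .map(|n| n.get())
        .unwrap_or(1); // fall back to one core if the query fails
    let chunk = (data.len() + cores - 1) / cores; // ceiling division

    thread::scope(|s| {
        let mut handles = Vec::new();
        for part in data.chunks(chunk.max(1)) {
            // one worker thread per chunk
            handles.push(s.spawn(move || part.iter().sum::<u64>()));
        }
        // join every worker and combine the partial sums
        handles.into_iter().map(|h| h.join().unwrap()).sum()
    })
}

fn main() {
    let data: Vec<u64> = (1..=1_000_000).collect();
    println!("{}", parallel_sum(&data)); // 500000500000
}
```

Swap the sum for decompression or asset parsing and the shape stays the same: the scoped threads mean read-only data needs no locks or reference counting, and the work spreads across however many cores the machine actually has.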

AMD CPUs and GPUs have always been vastly better than nvidia and intel, but people don't know how to use them, is what I believe. I should know.. I myself figured out how to use an AMD GPU to hardware-decode 4K 10-bit HDR video. Every reviewer said the 5700 XT I bought was bad and buggy at decoding HEVC.. and that your $200 tablets and mobile phones and $200 smart TVs, which can decode and play back 4K HEVC Netflix or VP9 YouTube, were far better at it than your $600 dedicated graphics card that chews up monstrous amounts of power and carries 8GB of very fast, very expensive RAM. They tell us these lies so we fail to notice their lies about nvidia, intel, and eww apple, and countless others. When people figure out what an HBM-equipped Radeon VII or other AMD cards are for and how to use them, they'll be slapping themselves in the face.

You see 90Hz and 120Hz high-refresh-rate phones with HQ OLED displays playing 4K or 8K Netflix and YouTube for hours on "long battery life".. then look at the massive 750-watt power supply in your PC. Your 64-core Threadripper runs Cyberpunk 2077 at basically the same speed and FPS as a 3300X or 5600X, 4-core and 6-core CPUs.. games that are supposed to be true multi-core 64-bit software all perform about the same no matter how many extra cores and how much better RAM you throw in, as a generalization. The code and drivers the game uses to access its files are the same old floppy-disk, CD-ROM-controller garbage. PlayStation and Xbox had to kick up a massive media storm before anyone cared about Windows 10 and "DirectStorage technology".. which is literally just saying "delete your ancient I/O software and run modern programming languages and code, you fakers". I complain to everybody but nobody cares. Even on AMD's own website you can basically count on two hands the games that even pretend to pretend to have a single AMD feature.. Explore AMD Featured Games | AMD

Go to the AMD website, click Gaming, and view the featured games. Buy the games that say "FreeSync Premium Pro", press the menu buttons on your TV or monitor (or use your remote) to enable FreeSync, then set up Windows 10 settings and AMD Adrenalin settings to get FreeSync working. Disable V-Sync, disable AA, and configure your game's graphics settings to eliminate things that greatly increase latency, then game away.. low-latency, high-refresh-rate gaming over 120 FPS is super awesome. I highly recommend Dirt 5.. or Gears 5, or Strange Brigade, or Resident Evil 3.. anything listed on the AMD featured games page is decent.
