
Drivers & Software


What are the correct words/terms, flags, and switches to enable AMD CPU/GPU functions like 3DNow!?

For example, in the registry or in a text file I could type in 3dnow (or even a garbled variant like 99999Dnow), and depending on whether it went into the registry or a text file it would work with varying degrees of success. If you look at really ancient code from the Windows 2000 era, when CD-ROM drives were still common, programs often used command-line switches such as -cpu3dn, or -cpue3dn for Enhanced 3DNow!, and how effective each one was could vary too. Software back then also used far more buffering, with dozens of settings controlling how much the PC gated and cached its output, so older systems' defaults felt much closer to zero latency. (For what it's worth, DDR3 and DDR5 both have first-word latencies in roughly the 10-20 ns range; neither generation is anywhere near 0 ns, and DDR5 is not dramatically slower than DDR3 in absolute terms.)
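One thing worth knowing: instruction-set extensions like 3DNow! are not enabled by typing strings into the registry or a config file. The CPU reports which extensions it supports via the CPUID instruction, and each program either uses them or doesn't. A minimal sketch of inspecting those reported flags, assuming Linux (where the kernel exposes the CPUID feature bits in /proc/cpuinfo; on Windows you would have to query CPUID some other way):

```python
# Sketch: listing CPU instruction-set extensions as the kernel reports them.
# Assumes Linux, where /proc/cpuinfo exposes the CPUID feature flags as a
# "flags" line; this is read-only information -- nothing here turns a
# feature on or off.

def cpu_flags(path="/proc/cpuinfo"):
    """Return the set of feature-flag strings for the first CPU listed."""
    try:
        with open(path) as f:
            for line in f:
                if line.startswith("flags"):
                    # Format: "flags\t\t: fpu vme de pse ... sse sse2 ..."
                    return set(line.split(":", 1)[1].split())
    except OSError:
        pass  # not Linux, or /proc not mounted
    return set()

flags = cpu_flags()
print("3dnow" in flags)      # 3DNow! -- AMD dropped it from CPUs after ~2010
print("3dnowext" in flags)   # Enhanced 3DNow!
print("sse2" in flags)       # present on essentially every x86-64 CPU
```

If the flag isn't in that set, no spelling of "3dnow" anywhere on the system can make the instruction exist; if it is in the set, individual programs decide at compile time or run time whether to use it.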

But all of them, depending on the wording used and where it was typed in, varied extraordinarily in their visible output.

You can view the Wikipedia page for every AMD CPU ever made and see a bunch of SSE and other instruction sets/features/extensions, but I have no clue where to go to find out how to officially turn them on.

Also, when making a text file, is naming it engine.ini, settings.ini, config.ini, .cfg, or something else better? Which one is the CORRECT naming convention, the one that works most visibly? Registry keys seem to work when typed almost anywhere, but are there places they MUST be placed to work better? I'm aware of the differences between registry value types like REG_MULTI_SZ and REG_EXPAND_SZ; they look visibly different while gaming or during video playback if you alt-tab, type them in, and watch the change apply.

Even AMD's GPUOpen website doesn't seem to have these things, and I think they're essential. A complete list of compiler flags, and the correct bit depth and quality settings for each AMD device, seem like things you couldn't ship a product without. Yet after years of searching the internet for exactly this, I can only conclude that game and app developers simply never document it.
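On the filename question: there is no OS-level "correct" name. engine.ini versus settings.ini versus config.ini is purely whichever name the specific application was written to open, so renaming a file only does anything if the program actually looks for that name. A sketch using Python's standard configparser (the [Display] section and its keys are made-up illustrative names, not any real standard):

```python
# Sketch: the filename (engine.ini, settings.ini, config.ini, ...) is purely
# the application's choice -- the parser opens whatever path it is given, and
# nothing about the name changes how the contents parse.
# The section and key names below are illustrative, not a real standard.
import configparser
import os
import tempfile

ini_text = """
[Display]
doublebuffer = yes
bitdepth = 32
"""

# Write the same contents under an arbitrary name and read them back.
path = os.path.join(tempfile.mkdtemp(), "engine.ini")
with open(path, "w") as f:
    f.write(ini_text)

cfg = configparser.ConfigParser()
cfg.read(path)
print(cfg["Display"]["bitdepth"])                 # -> 32
print(cfg.getboolean("Display", "doublebuffer"))  # -> True
```

Renaming that file to settings.ini or anything else would parse identically; what matters is only whether the target application calls the equivalent of `read()` on that particular name.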

For example, would one type DIRECTDRAW and DIRECT3D and enable hardware acceleration with the letters HW, or /HW, or H/W, or HWA? Are you supposed to write doublebuffering/YES or doublebuffering no? In one Japanese app I saw doublebuffering/yes, or maybe Go (as in ikou) with the -usedb switch, like -usedb/go or usedb/yes. Do you type 3dnowtrue, or is it 3dnow/YES, 3dnowenabled, or 3dnowON? Typing each variant gives a different result. Is there something to the Intel-style capitalization, like 3DNOWtrue with capitals followed by lower case, in the old FORTRAN/COBOL-era style that the Intel website's formatting and prose guides seem to follow? On AMD's official GPUOpen site, with what is said to be official AMD code like AMDFIDELITYFXsuperresolution2, deleting the letters AMD in front can seem to improve quality, and writing it back out fully as Advanced Micro Devices may improve or worsen various FidelityFX functions. Give it a try and see the difference for yourself.
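Which spelling of a boolean "works" is a property of each program's parser, not of the hardware. As one concrete data point, Python's standard configparser accepts exactly yes/no, true/false, on/off, and 1/0, and rejects everything else (including an invented value like "go") rather than silently enabling anything. A sketch, using made-up section/key names:

```python
# Sketch: boolean spellings are defined by the parser, not the hardware.
# configparser.ConfigParser.getboolean() accepts exactly these spellings:
#   True:  "1", "yes", "true", "on"      False: "0", "no", "false", "off"
# The [Render] section and its keys are illustrative names only.
import configparser

cfg = configparser.ConfigParser()
cfg.read_string("""
[Render]
a = yes
b = true
c = on
d = 1
e = no
""")

for key in "abcd":
    assert cfg.getboolean("Render", key) is True
assert cfg.getboolean("Render", "e") is False

# An unrecognized spelling raises an error instead of doing anything:
cfg.read_string("[Render]\nf = go\n")
try:
    cfg.getboolean("Render", "f")
except ValueError:
    print("unrecognized boolean spelling")  # prints this
```

Another program with a hand-rolled parser may accept only one spelling, or treat the mere presence of the key as "on"; there is no universal convention, which is why the variants in the paragraph above behave so differently from app to app.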

Using DirectSound has massive latency and sounds lousy, when it is supposed to sound far better than vinyl records; otherwise people wouldn't have been throwing out their vinyl the minute Dolby surround sound became a thing. Multimillion-dollar '80s recording studios, with absurdly sensitive recordings from tiny capsule mics that could pick up the air and a breath perfectly, with gain and sensitivity boosted to extreme levels, should let you hear the size and type of the room clear as day, and any nearby walls or large surfaces in the sound space, even when saved to '80s chrome or metal tapes or metal-pressed discs. Plenty of things were already digital in the '80s, especially karaoke machines and recording-studio setups, and master tracks in formats like Sony's DXD, Super Audio CD's DSD, regular Compact Disc, or MiniDisc are never supposed to sound better than the cheapest modern digital playback parts. It's a latency thing.

So using modern software with older wave-output methods, versus the Vista-era WASAPI renderer in exclusive mode with its push and pull models, sounds so much better just for the lower latency. But USB Type-C and HDMI have supposedly been zero latency since day one, so why the heck isn't the whole chain zero latency or better? My "quantum infinity AMD supercomputer" should be negative latency, but I've never been allowed to use it because I have no clue what to type in to turn it on.
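The audible difference between the legacy output paths and WASAPI exclusive mode is mostly buffering, and buffer latency is plain arithmetic: frames divided by sample rate. A sketch (the buffer sizes below are typical illustrative values, not anything mandated by DirectSound or WASAPI):

```python
# Sketch: the latency a buffer contributes is its length in frames divided by
# the sample rate. Buffer sizes here are typical illustrative values, not a
# spec -- real pipelines stack several buffers, so totals add up.

def buffer_latency_ms(frames, sample_rate_hz):
    return frames * 1000.0 / sample_rate_hz

# A large legacy-style buffer:
print(buffer_latency_ms(4096, 44100))  # ~92.9 ms for one 4096-frame buffer

# A small WASAPI-exclusive-style buffer:
print(buffer_latency_ms(480, 48000))   # 10.0 ms
```

This is also why no cable can make the chain "zero latency": even a perfect zero-delay link still sits behind whatever buffers the software stack uses, so shrinking and removing buffers is where the real gains are.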

How do I get obviously-better-than-'80s quality out of my hundreds of dollars of modern gear, decades later, when the cheapest $10 eBay parts vastly out-spec hundreds of dollars' worth of '80s equipment in terms of Class A amplifier circuits, quad DACs, audio signal processing, and the rest? And how do I do the same for my video? You should never have to be jealous of a '90s VHS tape for having more detail and better lighting. It feels like hardware companies are coasting: celebrating support for 1970s-style 30 fps analog TV done up to 10% faster than traditional methods, and leaning on ray-tracing methods first described back in the 1960s, instead of pushing techniques like ray marching and real-time god rays as far as the hardware should allow.
