My GPU usage is about 70%, but I want to get more FPS with 100% usage. How can I fix this? Can somebody help me please?
my system:
RX 6900 XT, avg temp in games 70C (2000 RPM)
CPU: i9 9900KF, avg temp in games 60C
motherboard: ASUS TUF Z390 Gaming Plus WIFI
2TB NVMe SSD (all games installed)
480GB SSD (for Windows)
2x16GB RAM, 3000MHz CL16
Windows 10 64-bit
PSU 800W
Radeon Software Adrenalin 21.9.2
monitor: 2K (2560x1440) 165Hz, DP port
In games using DirectX 12 and Vulkan there are no problems; in games on DirectX 11 and below the video card is underloaded, but not everywhere, depending on the complexity of the scene. Take the same game, Immortals Fenyx Rising: in the open world, FPS and load on RX 6000 or RTX 3000 video cards sit around 70-80% even at 2K resolution, but as soon as you go down into a dungeon, where there are few small details and textures (polygons), the cards work at full load (they are no longer limited by the DirectX 11 API itself and by the central processor). The same thing happens in Assassin's Creed Odyssey (and Origins), but not in Assassin's Creed Valhalla, since it uses DirectX 12 (it can draw very many polygons and textures).

Also, AMD software, namely Adrenalin, does not read the load on the video card correctly; MSI Afterburner does it correctly (but you need to enable the unified GPU usage monitoring mode in its settings beforehand). This is evident in all Far Cry games except part 6, which uses DirectX 12 if I'm not mistaken. See the photo to understand what I'm talking about. In the open world the limitation is DirectX 11; indoors or in Tartaros the limitation is probably the processor (but this is not certain). The resolution is 2K, reduced to 1080p due to upload size limits.
I agree with your response; it is likely that the OP is experiencing this in a DirectX 11 or OpenGL game. But from my experience this issue is more pronounced on Radeon GPUs than Nvidia GPUs in DirectX 11, half of the time by drastic margins.
AMD Radeon does benefit more than Nvidia in games that are implemented with DirectX 12 from scratch, but games that wrap DirectX 12 over DirectX 11 (Deus Ex Mankind Divided, Total War Warhammer, ~Serious Sam 4) perform even worse than their DX11 counterparts on Radeon.
I have both an RX 480 and a GTX 1060 that I test this with.
Edit: as far as my reading has gone on this over the years, it is apparently due to AMD not supporting driver command lists in DirectX 11, whereas Nvidia does; AMD does support deferred contexts in DirectX 11, from what I remember. The gist of it is that DirectX 11 is for the most part single-threaded with Radeon drivers, whereas Nvidia multi-threads the DirectX 11 runtime/driver very well by allowing other CPU cores to build draw calls and hand them to the main submitting CPU thread, while with Radeon drivers one CPU core builds the draw calls and that same core submits them to the GPU.
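To illustrate the gist with a toy model (not real driver code; the per-call costs and thread counts below are made-up, illustrative numbers): if draw-call recording can be spread across worker cores via command lists, the per-frame CPU cost shrinks, but if one core must both build and submit every call, that core caps the frame rate.

```python
# Toy model of DX11 draw-call CPU cost (illustrative numbers only).

def frame_cpu_time_ms(draw_calls, build_us_per_call, submit_us_per_call, recording_threads):
    """CPU time on the critical path for one frame.

    Building can be parallelized across `recording_threads` (command lists);
    submission to the GPU always happens on a single immediate-context thread.
    """
    build_ms = draw_calls * build_us_per_call / 1000 / recording_threads
    submit_ms = draw_calls * submit_us_per_call / 1000
    return build_ms + submit_ms

calls = 8000  # a busy open-world scene
single = frame_cpu_time_ms(calls, build_us_per_call=2.0, submit_us_per_call=0.5, recording_threads=1)
multi = frame_cpu_time_ms(calls, build_us_per_call=2.0, submit_us_per_call=0.5, recording_threads=4)

print(f"single-threaded recording: {single:.1f} ms/frame -> {1000 / single:.0f} FPS cap")
print(f"4-thread recording:        {multi:.1f} ms/frame -> {1000 / multi:.0f} FPS cap")
```

With these toy numbers the single-threaded path is CPU-capped at 50 FPS while the multi-threaded path allows 125 FPS, which matches the pattern of the GPU sitting underutilized in draw-call-heavy open-world scenes.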
Thank you, but isn't Warzone not DX12?!? Because I have this problem in Warzone. Of course, we shouldn't forget the 40GB Warzone update that brought 40 bugs to the game.
I know this is frustrating, but it is not a real problem as long as you are not dropping below 60FPS (this happens to me in Unreal Engine 3 DirectX 9 games, which AMD has had broken since after 17.7.1: I get 20-30FPS at 10% GPU utilization where I used to get 80FPS, and they have not fixed it for 5 years after hundreds of reports). But I am guessing you always get above 80FPS with that configuration.
Furthermore, 165Hz is already very smooth and responsive, even with VSync enabled. What I would suggest if I had that system: set Chill Min = 100FPS and Chill Max = 100 or 120FPS, set Frame Rate Target Control = 100 or 120FPS, and enable Enhanced Sync for the game.
With a frame rate higher than 80FPS, you can do things to improve input response rather than aim for higher FPS (you do not always need higher FPS to get better input response), which is also why AMD developed Radeon Anti-Lag. Therefore, as an alternative to the above, you can set FRTC = 120, enable Anti-Lag, and enable Enhanced Sync. You should also get very good input response then.
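For reference, the frame-time arithmetic behind those caps is just 1000 ms divided by the FPS figure, which shows why the jump from 100 to 165 FPS buys only a few milliseconds per frame:

```python
# Frame time at the caps discussed above: 1000 ms / FPS.
def frame_time_ms(fps):
    return 1000 / fps

for fps in (60, 80, 100, 120, 165):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):.2f} ms per frame")
```

Going from 100 FPS (10.00 ms) to 165 FPS (6.06 ms) saves under 4 ms per frame, which is why reducing queued latency with Anti-Lag or Enhanced Sync can matter more than raw FPS at this level.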
@Geforsikus_2021 Yes, CUDA cores are just the name Nvidia gives to the cores on their GPUs, just as Radeon names their cores stream processors. Furthermore, AMD also adopted hardware-accelerated physics with their development of stream processors years ago, but only some games have used it (for example, TressFX in Tomb Raider and Deus Ex is AMD's "Hairworks", and I think Deus Ex MD also does some cloth physics on the GPU). Apparently, Radeon supported their own type of PhysX through OpenCL, but unfortunately game devs did not use it.
Just wish AMD could improve older DirectX11 performance and after 5 years fix the Unreal Engine 3 DirectX9 bug they have reintroduced.
Yes, then in Global Graphics, look for Frame Rate Target Control and set it to 120FPS, for example, to ensure a consistent frame rate, which will result in fewer jumps in frame rate. Hope this helps; you can also try the Chill method I described if this is not smooth enough.
Edit: I wrote all of this with the assumption that you have a FreeSync monitor.
How do you display monitoring in the game, Adrenalin or MSI Afterburner? I watched a video of a 2K setup with an i9 10900K and an RX 6900 XT video card; in Warzone it loads to 90% ±5%. An i9 9900K should work the same way, certainly not worse. Outdoors you should get 160+ FPS, indoors up to 180-190, at 2K on ultra settings. If the FPS is lower, then there is a problem somewhere.

Do you have the RAM installed in the 2nd and 4th slots of the motherboard? Open Task Manager (Performance => Memory) and check how many slots are used (hover the mouse; it should say slots 2 and 4 for you). If not, the memory is running in single-channel mode. Regarding the settings in AMD Adrenalin, I only enabled AMD FreeSync and limited the frame rate to 115 (120Hz monitor); I didn't touch the rest of the settings.
You also have Windows 11; you shouldn't have upgraded, considering that there are quite a lot of performance problems in games. In Task Manager, hover the mouse over "Slots used: 2 of 4" (as in the screenshot) and a tooltip will show which slots are currently in use. You should have slot 2: 16GB 3000MHz and slot 4: 16GB 3000MHz; if so, then that's not the problem.

Is the problem only in Warzone? Try disabling all options in AMD Adrenalin except FreeSync, and limit to 160 FPS, since you have a 165Hz monitor and FreeSync only works in a range of, for example, 30-165Hz in your case. (You can also leave it uncapped, but then there will be tearing when the FPS goes above 165 frames/sec.) I haven't played Warzone myself; I can install it out of interest, it will be done by morning, and I'll post an FPS photo (I have an i9 9900K + 16GB of RAM + RX 6900 XT + Windows 10 LTSC; I will switch to Windows 11 when the performance problems in games are fixed).

Check whether the Virtualization-based Security component is turned off; it is enabled by default. (To check, type "Core Isolation" in search, open the security settings, turn off Memory Integrity, and restart the PC.) Then try playing Warzone again.
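Why slots 2 and 4 matter: two matched sticks in the right slots run in dual-channel mode, which doubles the theoretical memory bandwidth versus single-channel. A quick back-of-the-envelope for DDR4-3000 (64-bit, i.e. 8-byte, bus per channel; peak theoretical figures, not measured throughput):

```python
# Theoretical DDR4 bandwidth: transfer rate (MT/s) x 8 bytes x channel count.
def ddr4_bandwidth_gbs(mt_per_s, channels):
    return mt_per_s * 8 * channels / 1000  # GB/s

print(f"single-channel DDR4-3000: {ddr4_bandwidth_gbs(3000, 1):.1f} GB/s")
print(f"dual-channel DDR4-3000:   {ddr4_bandwidth_gbs(3000, 2):.1f} GB/s")
```

Running 24 GB/s instead of 48 GB/s can starve a CPU-bound game and show up as exactly this kind of low GPU utilization.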
Yesterday I installed Warzone; it crashes constantly with various errors, though everything is fine in the game menu. Here are some screenshots I could make (ultra settings without RT, 2K). The video card is loaded to the brim at 90+%. What Ubisoft games do you have? Is Assassin's Creed Valhalla installed? How does it behave? I don't have enough online games to compare with yours; I do have Diablo II Resurrected, but I'm more into single-player games.
I did not try running Warzone at 4K (it constantly crashed during the jump from the plane, sometimes after I had already landed, so I deleted it; when I was in training, the game did not crash). My monitor is a 27-inch 2K; I don't see the point of torturing the card at 4K, since I want it to put out 100+ FPS. If I had a 4K 60Hz monitor, then playing at 4K would make sense.

Smart Access Memory is disabled. I turned it on about 2-3 months ago, then turned it off (few games use it well apart from Assassin's Creed Valhalla, Warzone, and Forza 4). To make it work you need to update the BIOS, and the update patches all the CPU vulnerabilities, which drops processor performance, so I went back to the old BIOS and disabled Smart Access Memory.

If you have no problems in other games, it's possible the settings in AMD Adrenalin need to be reset, then turned back on one by one while checking the load on the video card. It may also simply be that the monitoring program is lying: I enabled the unified GPU usage monitoring mode in MSI Afterburner's settings; it used to lie and showed me a fluctuating GPU load (30%, then 50%, then 70%), and now everything is stable. I don't play Warzone; I installed it just out of interest to check how the game runs on my system, tried it, and deleted it because it crashes constantly with different errors, alas.
A higher Field of View will tax your CPU and GPU more, since the CPU will have more draw calls to submit because there are more polygons in the scene. Have you tried to see if there is a difference in performance with the default FOV?
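To get a rough sense of how much more scene a wider FOV takes in: the visible frustum width scales with tan(FOV/2). This is illustrative only; the baseline FOV of 80 is an assumption, and real draw-call counts depend on culling and what is actually in the scene.

```python
import math

def frustum_width_ratio(fov_deg, base_fov_deg=80):
    """Relative horizontal extent of the view frustum vs a baseline FOV."""
    return math.tan(math.radians(fov_deg / 2)) / math.tan(math.radians(base_fov_deg / 2))

print(f"FOV 100 vs 80: {frustum_width_ratio(100):.2f}x wider view")
print(f"FOV 120 vs 80: {frustum_width_ratio(120):.2f}x wider view")
```

So going from FOV 80 to 120 roughly doubles the horizontal extent of the view, and everything newly visible has to be processed by the CPU before the GPU ever sees it.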
Furthermore, AMD does not seem to keep our videos any more. The best alternative is to link your youtube videos in your post with the video attachment option and then selecting "From the Web".
Can I ask, why does it matter that in a CPU-heavy game like Warzone, with lots of other players and units in one map, your GPU utilisation drops from 99%? It's perfectly normal in a game like this, particularly as you decrease resolution and image quality settings.
Your CPU, the game engine, the API used by the game, is only able to render so many FPS depending on the settings you use. As you lower the resolution/image quality settings, this strain is increased on those components and reduced on the graphics card.
As you increase the resolution, the reverse is seen. This is expected and perfectly normal.
Using a faster CPU such as a Ryzen 5000 series and Smart Access Memory will help increase GPU utilisation.
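The behaviour described above can be sketched as a simple min() model: delivered FPS is capped by the slower of the CPU and the GPU, and the GPU's frame cost grows with pixel count. The caps below are toy numbers, not measurements of any particular system.

```python
# Toy bottleneck model: delivered FPS is capped by the slower of CPU and GPU.
def delivered_fps(cpu_fps_cap, gpu_fps_at_1440p, pixels_ratio):
    """pixels_ratio: target resolution pixel count relative to 1440p.

    Returns (delivered FPS, approximate GPU utilisation).
    """
    gpu_fps = gpu_fps_at_1440p / pixels_ratio
    return min(cpu_fps_cap, gpu_fps), min(1.0, cpu_fps_cap / gpu_fps)

resolutions = (
    ("1080p", 1920 * 1080 / (2560 * 1440)),
    ("1440p", 1.0),
    ("4K", 3840 * 2160 / (2560 * 1440)),
)
for label, ratio in resolutions:
    fps, util = delivered_fps(cpu_fps_cap=160, gpu_fps_at_1440p=200, pixels_ratio=ratio)
    print(f"{label}: {fps:.0f} FPS, GPU ~{util:.0%}")
```

Lowering the resolution raises the GPU's potential FPS without moving the CPU cap, so utilisation drops; raising it does the reverse, which is exactly the "normal and expected" pattern described above.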
I watched your videos and the game looks to be running well and smoothly, so what is the issue here?
Here is some Warzone footage I captured using lowest settings at 4K resolution. Call of Duty Warzone - Testing for Stuttering with Low Settings + 6900 XT + 5950X - YouTube
I can test the same at 1440P if required, the only thing that would change is my GPU utilisation would drop a little. But that's normal and expected.
I liked your video, awesome system you have! I would like to weigh in as well and am not speaking for the OP.
I agree with what you said 90%, which is why I suggested to the OP that he experiment with Chill and FRTC to hit the sweet spot between smooth and high enough frame rate, but possibly more responsive gaming, instead of aiming for maximum frame rate.
But I would like to acknowledge that there are strange scenarios, sometimes bugs in our releases (I have seen you guys sometimes have different experimental driver versions than us), that result in not even one thread/core maxing out, yet leave the GPU underutilized, sometimes with the addition of stutters or uneven frame pacing.
But I do agree that your gameplay looks solid with 100% utilization for the most part, and it would be interesting to see if the OP gets the same type of performance at the same settings as yours.
One last note: the AMD Performance Overlay does not always update as fast as RivaTuner Statistics Server does, and therefore does not show major dips as often.
(Surprised face emoji not working)