
3ek0
Adept I

Problem with GPU idle core & memory clock


The problem became noticeable after I changed my CRT monitor to a 24-inch LCD.

I don't have problems with the PSU, CPU, or motherboard, and the drivers are the newest Omega release.
Video adapter: Sapphire Radeon HD 7870

The motherboard and CPU support PCIe 3.0, but the card runs at that speed only in 3D applications.

Under low GPU usage the PCIe link switches down to 1.0 speed.
Default maximum clocks (at 100% GPU load the fan runs at ~80% and the temperature stays below 55 degrees Celsius, extremely cool I would say):

GPU: 1000 MHz

Memory: 1200 MHz (4800 MHz effective)

Default minimum clocks (idle temperature 30-33 degrees Celsius, fan at 20%, ~1100 RPM):

GPU: 300 MHz

Memory: 150 MHz (600 MHz effective)
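A quick note on the "effective" figures: GDDR5 moves data at four times the clock GPU-Z reports, so the effective rate is simply the reported memory clock times four. A one-line sanity check in Python:

    # GDDR5 is quad-pumped: effective rate = reported memory clock x 4
    for mem_clock_mhz in (1200, 150):   # the two GPU-Z readings above
        print(f"{mem_clock_mhz} MHz -> {mem_clock_mhz * 4} MHz effective")
    # 1200 MHz -> 4800 MHz effective
    # 150 MHz -> 600 MHz effective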

Clocks were monitored with GPU-Z.
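If you want to see exactly when the card changes power states, the Sensors tab in GPU-Z can log to a file. Here is a minimal sketch that tallies the clock states from such a log; the file name and column headers are assumptions based on GPU-Z's defaults, so check them against your own log before running it:

    import csv
    from collections import Counter

    # Assumed column names -- verify against the header row of your log file.
    CORE = "GPU Core Clock [MHz]"
    MEM = "Memory Clock [MHz]"

    states = Counter()
    with open("GPU-Z Sensor Log.txt", newline="") as f:
        for row in csv.DictReader(f, skipinitialspace=True):
            try:
                # Round so tiny fluctuations collapse into one state
                states[(round(float(row[CORE])), round(float(row[MEM])))] += 1
            except (KeyError, TypeError, ValueError):
                continue  # skip repeated headers or malformed lines

    for (core, mem), n in states.most_common():
        print(f"core {core} MHz / memory {mem} MHz: {n} samples")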

I can't understand why these numbers are so low given typical graphics usage nowadays: internet, YouTube, etc.

The card can handle some serious games, but at such low clocks even YouTube lags and causes framerate problems.

So my question is: how do I set a higher floor for the default minimum clocks? (For example, core 600 MHz and memory 500 MHz (2000 MHz effective).)

There is no such option in CCC, and I can't find a solution on the web.


11 Replies
Thanny
Miniboss

Re: Problem with GPU idle core & memory clock


Whenever a program uses GPU-accelerated video decoding, the clocks automatically ramp up to a middling value.  If that's not happening, then you're not using hardware video acceleration.  If you're using Flash with YouTube, you need to enable hardware acceleration in the Flash options. 

Keep in mind that this will prevent the GPU from ramping up to full 3D speeds while such a video is playing, which is why most people don't enable that acceleration. Without it, you're relying on your CPU to do all the work, which is why playback would appear slower after moving to a higher screen resolution.

3ek0
Adept I

Re: Problem with GPU idle core & memory clock


Hardware acceleration is not enough, mate. When I open an Ultra HD (4K) video on YouTube, the CPU runs at 60% while the GPU sits at core 450 MHz / memory 1200 MHz with its load only bouncing between 0% and 20%.

You should read my question again: my question is how to set a higher floor for the default minimum clocks. ^^

You know how CPUs have power states / power phases for energy saving? This is the same thing, but applied to the GPU; I just want higher minimum limits. I can't understand why GHz Edition graphics cards have such low limits. They are mostly used by gamers, who of course push them hard, and some even need overclocking.

Thanny
Miniboss

Re: Problem with GPU idle core & memory clock


You're not understanding the situation.  The GPU clock speed at minimum is more than enough to drive a 2D display.  If it's being used for hardware video decoding, then the clocks will rise automatically to the correct speed to allow that.  If it's not being used for hardware decoding, any and all lack of performance is due to the CPU not being able to keep up with the demand placed on it.  It takes a very fast CPU to decode a 4K video and draw it fullscreen without any GPU-accelerated decoding.

Forcing a higher minimum GPU clock speed will accomplish absolutely nothing but wasting electricity.
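To put a rough number on it: dynamic power scales roughly with frequency times voltage squared, so doubling the idle clock roughly doubles idle dynamic power even before any voltage bump. A back-of-the-envelope sketch with assumed figures, not measurements of this card:

    # P_dynamic ~ f * V^2 -- all numbers below are assumptions for illustration
    idle_watts = 15.0        # assumed idle board power at 300 MHz
    f0, f1 = 300, 600        # current vs. proposed minimum core clock (MHz)
    v0, v1 = 0.85, 0.95      # assumed idle vs. raised core voltage (V)

    forced_watts = idle_watts * (f1 / f0) * (v1 / v0) ** 2
    print(f"roughly {forced_watts:.0f} W at idle instead of {idle_watts:.0f} W")
    # -> roughly 37 W at idle instead of 15 W, burned while the desktop just sits there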

3ek0
Adept I

Re: Problem with GPU idle core & memory clock


My CPU is an Intel Core i5-3470 quad-core at 3.2 GHz (fully capable of handling every game nowadays).

And dude, how do I force a higher GPU clock?

Be a good person and give me a correct and helpful answer.

backFireX64
Forerunner

Re: Problem with GPU idle core & memory clock


Alternatively, use the HTML5 player only, which in turn uses only the CPU. My old Core i7 975 at 3.8 GHz can run the following video at 4K rendering resolution with ~30-40% CPU utilization and no lag at all:

Ghost Towns in 8K - YouTube

So, if you have a beefy modern Intel CPU, you should have no problem at all.

For some reason I cannot seem to run it at 8K, though. It just gets stuck.

3ek0
Adept I

Re: Problem with GPU idle core & memory clock


Black screen at 8K here as well.

How do I force a higher GPU clock?

Be a good person and give me a correct and helpful answer.

hardcoregames_
Big Boss

Re: Problem with GPU idle core & memory clock


With more recent video cards, the actual GPU speed, bus speed, etc. will all vary depending on the load.

Running conventional desktop productivity packages is not demanding.

Video playback is not exactly demanding either.

So do not expect your GPU to flinch much until you load up a modern game like GTA V.

3ek0
Adept I

Re: Problem with GPU idle core & memory clock


Far Cry 3/4 push the card to almost 100% load when I play at 60 fps without anti-aliasing.

Yes, exactly. What is the point of having a powerful graphics card if half the time it lags because of power-state switching (triggered every time a new load arrives)?

For example, when I play a lighter game (CS:GO / LoL) and Alt+Tab out, the card's clocks drop from 1000/1200 MHz to 300/150 MHz, and when I switch back to the game they climb back to 1000/1200 MHz (but the card needs a moment to ramp up). On my PC that transition is short, less than a second, but I don't want this power switching all the time, so I just want to set the minimum core and memory clocks to levels higher than 300/150 MHz.

backFireX64
Forerunner

Re: Problem with GPU idle core & memory clock (Accepted Solution)

I am not entirely sure that this is what you are looking for, but give this a try:

[Updated]AMD/ATi 2D Clock Guide | TechPowerUp Forums

I'd stick with what Thanny said earlier though, if I were you ...
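For what it's worth, another route people used to freeze the clocks on HD 7000-series cards was MSI Afterburner's "unofficial overclocking" mode, which can turn PowerPlay off entirely so the card never downclocks. The setting names below are from memory of the options documented in Afterburner's own config comments, so verify them against your version's MSIAfterburner.cfg before trying; they go under the [ATIADLHAL] section:

    UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
    UnofficialOverclockingMode = 2   ; 1 = unofficial mode with PowerPlay, 2 = without PowerPlay (fixed clocks)

Keep Thanny's warning in mind though: with PowerPlay disabled the card idles at full clocks and full power draw.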

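And for anyone who finds this thread while running Linux: the open-source radeon driver exposes a supported knob for forcing the performance level, with no BIOS editing needed (it requires DPM; on older kernels you may need the radeon.dpm=1 boot flag). A minimal sketch, assuming the 7870 is card0; it needs root, and writing "auto" restores the default behaviour:

    from pathlib import Path

    # Valid values for the radeon DPM interface: auto, low, high
    knob = Path("/sys/class/drm/card0/device/power_dpm_force_performance_level")
    print("current level:", knob.read_text().strip())
    knob.write_text("high")   # pin the card at its highest power state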