I have never overclocked anything but I'm interested in learning about it from you! Below are a few questions that I've been wanting to ask for a while:
I used to be an overclocker. Today: "IT'S JUST NOT WORTH IT."
The avenue I would prefer is the gaming industry writing code capable of capitalizing on 12-, 16-, or 24-core processors.
If you have a way of monitoring core usage while playing that game, I think you'd be disgusted by how few cores are actually being used. Here you have this 8- or 12-core CPU and only 2-3 cores being utilized.
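If you want to check this yourself without extra tools, here is a minimal sketch that samples per-core load from `/proc/stat` (Linux-specific; on Windows, Task Manager's per-core graph or a monitoring tool shows the same thing). The sampling interval and output format are purely illustrative:

```python
# Sketch: per-core CPU utilization sampled from /proc/stat (Linux only).
# Run it while a game is active to see how many cores are actually busy.
import time

def snapshot():
    """Return {core_name: (busy_jiffies, total_jiffies)} for each cpuN line."""
    stats = {}
    with open("/proc/stat") as f:
        for line in f:
            parts = line.split()
            if parts and parts[0].startswith("cpu") and parts[0] != "cpu":
                vals = [int(x) for x in parts[1:]]
                idle = vals[3] + vals[4]  # idle + iowait columns
                stats[parts[0]] = (sum(vals) - idle, sum(vals))
    return stats

def per_core_usage(interval=1.0):
    """Percent busy per core, measured over `interval` seconds."""
    before = snapshot()
    time.sleep(interval)
    after = snapshot()
    usage = {}
    for core, (busy0, total0) in before.items():
        busy = after[core][0] - busy0
        total = after[core][1] - total0
        pct = 100.0 * busy / total if total else 0.0
        usage[core] = max(0.0, min(100.0, pct))  # clamp jitter
    return usage

if __name__ == "__main__":
    for core, pct in sorted(per_core_usage().items()):
        print(f"{core}: {pct:5.1f}%")
```

On a lightly threaded game you will typically see a couple of cores pinned high while the rest idle, which is exactly the complaint above.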
@Vynski Thank you for your feedback. I just learned something new. Well, technically I remember that not all cores of a processor are used by games and/or apps/programs, etc. But I think over time, as games get more and more complex, developers will start to utilize all the available cores, don't you think?
Of course I feel the same. But they are sure taking their sweet time about it. They were still only using one core when quad-core CPUs arrived on the scene. Are there any games that utilize more than 2 cores today?
It's disgusting that I have a 12-core/24-thread processor and most production software, like AUTODESK's, isn't taking advantage of multi-core processors. I can remember rendering a VIZ creation on a 2-core CPU when a 30-second scene took overnight to render.
You know, I was chatting with a colleague yesterday about our new Red Team Overclockers Group. I was reminded there are other types of apps that could potentially benefit from OC'ing. For example, rendering tools, compilers, etc.
So that I'm clear, pardon my ignorance, will overclocking a CPU improve Autodesk's performance?
The issue with games and writing game code is that if you have a process that can be highly parallelized, it is better to just shunt that process to the GPU. GPUs are already really good at running code in parallel, so on the CPU side, if you have a process you can split into 16, 24, or 32 threads, maybe that process should just run on the GPU.
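For a concrete picture of what "a process you can split into N threads" looks like, here is a hedged Python sketch of fanning a workload out across worker threads. The `heavy` function is a made-up stand-in for per-chunk work (say, physics for one batch of objects); a real engine would use native threads or a job system, or, per the point above, push work this parallel to the GPU:

```python
# Sketch: fan one workload out across N worker threads.
# `heavy` is an illustrative stand-in for per-chunk work. Note that
# CPython's GIL limits CPU-bound speedup in this toy; the point is the
# fan-out pattern, not Python-level performance.
import math
from concurrent.futures import ThreadPoolExecutor

def heavy(n):
    # Burn some CPU: sum of square roots up to n.
    return sum(math.sqrt(i) for i in range(n))

def run_parallel(chunks, workers=4):
    # Split the chunks across `workers` threads and collect results in order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(heavy, chunks))

results = run_parallel([50_000] * 16)
print(f"{len(results)} chunks processed")
```

If the work really does decompose this cleanly into independent chunks, that is the signal it may belong on the GPU instead of on 16+ CPU threads.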
That happens less in scientific and productivity applications, since the software writers don't assume you have a high-end GPU. Whereas in gaming, the coders know it's there, so why not use it?
I've heard the theory that the lack of core use by games is because the vast majority of people in the Steam hardware survey are still using 4 cores. Developers want to sell games to as many people as possible, not something that only runs well on a 12-core that most don't have.
I agree. A part of developing and publishing new games is based on "trends" (data). From what I recall, Steam is a great source for such data, so I'm not surprised.
If overclocking is no longer valuable (performance increases, etc.), I still believe it can be a great hobby. I know tons of folks simply enjoy tweaking the hardware and shooting for world records.
Overclocking today has been made fairly easy thanks to hardware manufacturers embracing the practice. Not all silicon is the same, and as a manufacturer you have to take the lowest common denominator and set your default settings there. I personally find it fun to see what my chips are capable of. The performance gains may not be what they once were, but they are still gains, and it's nice to get every bit of performance out of the money you spent. I also see it as a way to extend the life of my hardware as it gets more dated and programs or games become harder to run.
Why do you overclock your CPU and/or GPU in the first place? (To meet the min. req. of a specific game so that you can play it? Or just because you want to boost the overall performance of your rig?)
- Feels like even with a 6800 XT & a 3900X you need to boost them to play at 1440p / high-to-ultra settings and hopefully get 144 Hz+ frame rates.
What are the first steps you take to overclock a processor? (Aside from buying the right CPU and/or GPU, do you use a specific software tool?)
- Ryzen Master; it's very hit-and-miss. I found the perfect boost setting, so it's easy to dial in, but when it acts up it's a nightmare. I tried to do it via the BIOS, but it's just too messy.
Is the BIOS always something you use to overclock?
See the last answer. I have an ASUS X570-E, and trying to find guides and specifics for making changes is quite difficult. A lot of the time, when you find a video it's outdated and several BIOS versions behind the latest, so some things may have been moved/removed/renamed since.
Is liquid nitro a must?
Unless you are trying to get into the top 10 of extreme overclocking... or to put it another way: no!
How do you change, set, and monitor the voltages?
CPU voltage is changed via Ryzen Master, GPU via Adrenalin. I don't really monitor them after the fact.
Except video games, what are other reasons for overclocking?
None for me, I have my OC settings for gaming and then revert to 'STFU' settings for work/normal use.
Maybe EXTREME YOUTUBING!! I could see people maybe needing to boost the CPU for work stuff, alright, but there's nothing I do at work that needs what I have.
GPU: Red Devil Limited Edition RX 6800 XT
CPU: Ryzen 9 3900X
CPU Cooler: ARCTIC Liquid Freezer II 280
MB: Asus ROG Strix X570-E
MEM: Corsair Vengeance RGB Pro 32 GB (2 x 16 GB) DDR4-3200
PSU: Corsair HX Platinum 1000 W 80+ Platinum
CASE: Fractal Design Meshify S2
Mon: HP OMEN 27i 27" 1440p 165Hz x 2
@Nagrenol About your last answer...I thought that some people OC their rigs for mining for bitcoin, folding at home, etc. Correct me if I'm wrong, but having a CPU clocked to a higher frequency will directly impact the time needed to fold or mine, right?
- Super limited knowledge on mining, but from what I recall, mining did start on CPUs originally, though it was generally seen as a terrible way to do it. It's pretty much all done via GPUs now, so I don't think it matters for mining. I'm sure you can 'optimize' the CPU for mining with some 'OC'-type tweaks, but I don't think it's a priority.
- I've honestly no idea what folding is.. I mean I don't even like folding shirts!
Very informative, thanks.
Why do you overclock your CPU and/or GPU in the first place? (To meet the min. req. of a specific game so that you can play it? Or just because you want to boost the overall performance of your rig?)
- To maximize the performance of my rig and to get every bit of performance I paid for.
What are the first steps you take to overclock a processor? (Aside from buying the right CPU and/or GPU, do you use a specific software tool?)
- Overclocking has become a bit more complex now that manufacturers ship factory overclocks. I personally just stick to the BIOS to adjust settings and then run appropriate benchmarks and stability software to make sure everything is stable. With my current processor I am just using PBO for the moment.
Is the BIOS always something you use to overclock?
- For the CPU yes. For the GPU I will use whatever software best fits. (with AMD I just use Wattman now)
Is liquid nitro a must?
- LN2 is for extreme overclocking; for us day-to-day guys, we try to get the best we can on air or a closed-loop system. So no, LN2 isn't a must.
How do you change, set, and monitor the voltages?
- I change and set voltages in the BIOS and then monitor via Ryzen Master.
Except video games, what are other reasons for overclocking?
- Other than video games, I typically overclock to see the best benchmark results I can get out of my system.
GPU: Red Devil RX 6950 XT
CPU: Ryzen 9 5950X
CPU Cooler: Corsair H170i Elite Capellix
MB: Asus ROG Crosshair VIII Dark Hero
MEM: 32GB (2x16) G.Skill Trident Z Neo F4-3800C14D-32GTZN
PSU: Corsair HX1000i 80 Plus Platinum
CASE: Corsair 7000D
Mon: Samsung Odyssey G7 27" 1440P - 240hz
Still remember my first OC: it was on an Intel Pentium 166 MHz, using the board jumpers to get it to 200 MHz. I was too young to even know that was overclocking; I seriously just thought I'd gotten a free 200 MHz CPU.
Why do you overclock your CPU and/or GPU in the first place?
To boost the overall performance of the system and learn more about how a particular microarchitecture behaves.
What are the first steps you take to overclock a processor?
Aside from buying the right CPU and GPU, I'd say buying an appropriate motherboard also counts. I personally prefer one that comes with a functional, well-laid-out BIOS with ample tuning options and additional debug aids like a 7-segment display for POST codes.
Is the BIOS always something you use to overclock?
Yes.
Is liquid nitrogen a must?
Nope. Ambient cooling has its own challenges.
How do you change, set, and monitor the voltages?
Set from the BIOS, monitor using tools like HWINFO. Maybe HSMP can be improved for monitoring purposes?
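On Linux, the sensors that tools like HWiNFO read on Windows are exposed under `/sys/class/hwmon`, so basic voltage monitoring needs nothing but the standard library. A hedged sketch; which sensors actually appear (and how accurate they are) depends entirely on your board's Super I/O chip and driver support, and on boxes without readable sensors this prints nothing:

```python
# Sketch: read voltage sensors from the Linux hwmon sysfs tree.
# "inN_input" files report millivolts; availability varies by board
# and driver, so an empty result just means no exposed sensors.
from pathlib import Path

def read_voltages():
    readings = {}
    root = Path("/sys/class/hwmon")
    if not root.exists():
        return readings
    for hwmon in root.glob("hwmon*"):
        name_file = hwmon / "name"
        name = name_file.read_text().strip() if name_file.exists() else hwmon.name
        for v in hwmon.glob("in*_input"):
            try:
                mv = int(v.read_text().strip())
            except (OSError, ValueError):
                continue  # sensor unreadable; skip it
            readings[f"{name}/{v.stem}"] = mv / 1000.0  # convert to volts
    return readings

for sensor, volts in read_voltages().items():
    print(f"{sensor}: {volts:.3f} V")
```

Figuring out which `inN` channel is actually Vcore usually takes a trip through the motherboard's sensor chip documentation, which is part of why a GUI tool is the common choice.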
Except video games, what are other reasons for overclocking?
With the right settings, there are gains to be had for a variety of workloads such as data compression-decompression, rendering, code compilation, etc. And, of course, microarchitectural analysis.
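For the compression case, a quick way to see whether a clock change actually moves the needle is a repeatable throughput test. A minimal sketch using zlib; the payload, compression level, and round count are arbitrary illustrative choices; run it at stock and again with your OC applied and compare the MB/s figures:

```python
# Sketch: measure zlib compression throughput, so an OC's effect on a
# compression workload can be compared before/after. Payload, level,
# and round count are arbitrary; just keep them fixed between runs.
import time
import zlib

def compress_throughput(payload, rounds=20, level=6):
    start = time.perf_counter()
    for _ in range(rounds):
        zlib.compress(payload, level)
    elapsed = time.perf_counter() - start
    return (len(payload) * rounds / elapsed) / 1e6  # MB of input per second

payload = bytes(range(256)) * 4096  # ~1 MiB of mildly repetitive data
mbps = compress_throughput(payload)
print(f"zlib level-{6} throughput: {mbps:.1f} MB/s")
```

Single-threaded, clock-sensitive workloads like this tend to scale almost linearly with core frequency, which makes them handy sanity checks for whether an overclock is doing anything.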
The first thing I do with any new Ryzen CPU is lap it. I have a setup with various grits of sandpaper taped to a mirror. It takes a bit of elbow grease, but it's worth it to get the integrated heat spreader level. Because of the construction of Ryzen processors, every single one I've had has been noticeably higher at all 4 corners: you'll get to bare copper on all 4 corners before the Ryzen logo is barely worn off.
Oh, and I overclock just because it's fun to see how far you can push a CPU. To be honest, with PBO it's mostly unnecessary at this point.
Been OC'ing from the jump.
Why not?
It's fun!
Remember the Celeron 300A? I got some of them to double or even triple stock clock speeds.
Then I ascended to AMD's...
AMD has always been the most budget-friendly option for overclockers.
PRO TIP
Do your research; AMD overclocking is very motherboard-specific.
And, as always, RTFM.
I pity the foo who don't.
Probably dating myself a bit... but my first foray into overclocking with AMD was with the K5 series. Wow, where has the time gone. Crazy homemade cooling contraptions and fiddling with jumpers for bus clock, multipliers, and voltage... going through every SIMM you could scrape up to find that perfect combo... crossing your fingers, eyes, and toes hoping it boots at all, or doesn't just melt the socket and let all of the magic smoke out. Often only to get a few MHz more.
It can be a fun hobby, but with newer components it is really a waste of time. The gains to be had are very minimal, and it's not "free": your parts will run hotter, use more electricity, and require more cooling. For the tiny gains seen these days, I don't even bother anymore. Every time I overclock my GPU it crashes. Is it worth the hours I would need to spend to get it stable? For me, no. The last chip I was able to get significant gains from overclocking was a Phenom II quad-core. Anyway, if you do dive in, enjoy.