Intel 13th Gen CPUs

One of the unfortunate things about the latest several generations of CPUs and GPUs is that Intel, Nvidia, etc., have used up virtually all the overclocking potential right in stock form.

The penalty is the insane amounts of wasted power pumped through their chips just for a few more fps. That, in turn, means you need ever more powerful power supplies and cooling, which wastes even more power and generates a lot more noise. Costs for all this are higher, and longevity is also reduced.

The consumer suffers, but hey, in that battle between Intel and AMD, or Nvidia and AMD, there's no price too high for the consumer to pay as long as the companies can claim some trivial and unnoticeable 3 fps higher performance for all that wasted cash, power, heat and noise!

And if you are an overclocking enthusiast (at least short of the LN2 types), there's just so little left to be gained without serious instability risk that it's just not worth the time and effort anymore. I certainly don't bother.
 
  • Deleted member 197115

One of the unfortunate things about the latest several generations of CPUs and GPUs is that Intel, Nvidia, etc., have used up virtually all the overclocking potential right in stock form.
And why is it "unfortunate" for a consumer that manufacturers can push it out of the box and guarantee the best performance AND stability?
 
And why is it "unfortunate" for a consumer that manufacturers can push it out of the box and guarantee the best performance AND stability?

Because they have thrown efficiency under the bus due to pressure from their marketing droids. In general, power could be reduced by, say, 20% with an indistinguishable reduction in performance (outside of benchmark obsessing).

The "unfortunate" penalties include excessive power use and power bills, more demanding power supply requirements, more fans and cooling requirements, hotter temperatures, likely lower component lifespans and the costs of all that.

Of course, if you don't care about your power bill, or the extra carbon spewed into the atmosphere by the tens or hundreds of millions of computers out there, all burning more power than they could if efficiency was a goal, etc., then great, you've got an extra 1-3 fps.

Der8auer did a video on this topic recently, demonstrating that the supposedly lower-power "quiet" switch on graphics cards does nothing at all except lower fan speeds (and raise temperatures!). The switch is a lie, then, since the "gaming" or "performance" setting does nothing at all to change performance either.
 
Yeah, I'd really like to have an efficiency mode for all these products.
And I don't just mean a power limit. More like a different power ramp.

I don't want my CPU to be locked at 50 W when watching a video already hits 40 W.

I'd like it to not boost as aggressively, even without a hard limit.
That would make watching a video hit only 20 W, while still allowing 100 W when needed.

It's difficult to determine when which is needed though, of course..
 
  • Deleted member 197115

Of course, if you don't care about your power bill, or the extra carbon spewed into the atmosphere by the tens or hundreds of millions of computers out there, all burning more power than they could if efficiency was a goal, etc., then great, you've got an extra 1-3 fps.
Don't care about cows burping and farting either; with a gaming PC I want the best performance. Plus, the frequency boost with increased power demand comes only when required by load.
 
Plus frequency boost with increased power demand comes only when required by load.
Yes, power scales linearly with frequency. But it also scales with the square of the voltage used. So if you run an overclock at, say, 1.4 V instead of 1.1 V, you will use roughly 60% more power, even at idle.
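
The voltage-squared claim is easy to sanity-check with the standard dynamic-power approximation for CMOS logic, P ≈ C·V²·f. This is only a sketch; the constant C and real chip behavior (leakage, binning) vary, and the numbers below are illustrative, not measurements:

```python
# Rough dynamic-power model for CMOS logic: P ~ C * V^2 * f.

def relative_power(v_new, v_old, f_new=1.0, f_old=1.0):
    """Return the power ratio implied by a voltage/frequency change."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Overclock at 1.4 V vs. stock 1.1 V, same clock:
print(f"{relative_power(1.4, 1.1):.2f}x")  # 1.62x, i.e. roughly 60% more power
```

Running the same clock at the higher voltage alone already costs about 62% more dynamic power, which is where the "60% more, even at idle" figure comes from.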

We are now in a world where CPUs use 350+ W, graphics cards use 450+ W (or even 600!) and total power approaches or even exceeds 1 kW. It's more than a little crazy, even if you don't pay for your own power and don't mind the heat turning your room into a sauna.

Apple has been spanking Intel and AMD in efficiency with their M1 and M2 chips. Meanwhile, Intel and AMD are locked in a marketing battle in which their products' inefficiency is deemed the consumer's problem.
 
I so agree! Give us back the old "turbo" switch! Quieter, less heat, and so on, but able to be turned up to 11 when I want that too!
My idea of a good implementation would be these things:

1. Lots of GPUs nowadays come with a hardware switch for quiet vs. performance mode.
It doesn't make any difference though..
Make this make a massive difference!
My 3080 has awful coil rattling at stock and uses 300 W at only 60% GPU load, even with the Nvidia power setting at "normal" ("optimal" and "adaptive" are the older names).
That happens basically as soon as the driver pushes the card beyond around 1500 MHz, the threshold where the boost algorithm takes over.

When undervolting and limiting it to 1700 MHz, it only uses 170-230 W!
However, if the application is a power-hungry one, like the FurMark donut torture test, it still uses up to 330 W.

The 4000-series cards do a better job at this, though. But my friend who just upgraded his 1070 to a 4070 Ti still easily reduced its 250 W to 160 W without losing any performance.

I'd happily invest the 30 minutes it would take the manufacturers to put such a profile + fan curve into the quiet-mode vBIOS.


Make something similar for CPUs!
Just put a little switch next to the debug LEDs/reset/power buttons, or at least make it an option on the first BIOS main page.


2. Make profile-based software.
Similar to the Nvidia control panel with per-application 3D settings, create BIOS profiles that quickly change the power limit and the voltage-vs-frequency curve.
And since my 7600X varies between 4-20 W for the memory controller depending on the SoC voltage, clocks and timings of the RAM, make this dynamic too.
The capability is even there, but it can't be used with an XMP profile or custom settings.
My Surface Pro 8 happily switches the RAM clocks.

Just store 2-3 profiles and make them switch automatically depending on the application, probably with a checkbox for "only when in the foreground" or "always if the process is running".

Even though it's important enough for me to write all this, I can't be bothered to restart my PC, go into the BIOS, load a different profile, wait the 40 seconds for the RAM to be tested, and only then start a game.
And there's simply no software that can do what I'd like. I'd even pay for it..
Yeah I know, we Germans are crazy ;)
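
The profile-switching idea could be prototyped in userspace. A minimal sketch follows; the profile values, process names, and the mapping are made-up examples, and actually applying a limit would need vendor tooling or the BIOS, which this does not attempt:

```python
# Sketch: pick a power/voltage profile based on which applications are running.
# All profile contents and process names below are hypothetical examples.
PROFILES = {
    "quiet":       {"power_limit_w": 65,  "curve_offset_mv": -30},
    "balanced":    {"power_limit_w": 105, "curve_offset_mv": -15},
    "performance": {"power_limit_w": 142, "curve_offset_mv": 0},
}

# Map process names to profiles; anything unlisted falls back to "quiet".
APP_PROFILES = {"acc.exe": "performance", "chrome.exe": "quiet"}

def choose_profile(running_processes):
    """Return the most demanding profile requested by any running app."""
    order = ["quiet", "balanced", "performance"]
    best = "quiet"
    for proc in running_processes:
        name = APP_PROFILES.get(proc.lower(), "quiet")
        if order.index(name) > order.index(best):
            best = name
    return best

print(choose_profile(["explorer.exe", "ACC.exe"]))  # performance
print(choose_profile(["chrome.exe"]))               # quiet
```

A real version would poll the process list (and honor a "foreground only" checkbox) and then hand the chosen limits to whatever mechanism the platform exposes.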

Don't care about cows burping and farting either
Yeah that's a whole different topic and can't be tackled by a few pieces of software.
The part of the world population that eats more than 200 g of meat every day is the "problem", and you won't change that by putting a software developer on it for half a day.
with gaming PC I want the best performance.
I agree and for gaming only it's really not an issue imo.
But I'm also working at my PC in quite a small room with constant air exchange with the rest of the apartment.
So having some convenient tools and maybe a "preset choice" would be a nice thing to have.

I'd guess most people would happily sacrifice 3% for half the power consumption.
Plus frequency boost with increased power demand comes only when required by load.
Yes and no. As I mentioned, my 3080 can easily save 100 W with an fps limiter active or while being CPU limited, without any difference at all.

And the 13900K can easily save 50% while being GPU limited, without any difference.

Modern hardware does quite well when being below 50% load but between that and full load, it's not really optimized yet...
 
Yes, power scales linearly with frequency. But it also scales with the square of the voltage used. So if you run an overclock at, say, 1.4 V instead of 1.1 V, you will use roughly 60% more power, even at idle.

We are now in a world where CPUs use 350+ W, graphics cards use 450+ W (or even 600!) and total power approaches or even exceeds 1 kW. It's more than a little crazy, even if you don't pay for your own power and don't mind the heat turning your room into a sauna.

Apple has been spanking Intel and AMD in efficiency with their M1 and M2 chips. Meanwhile, Intel and AMD are locked in a marketing battle in which their products' inefficiency is deemed the consumer's problem.
I left the Apple ecosystem a while ago. I was in it because early-2000s corporates were not keen on you having Linux connected to their LAN, and it was a good way to have Unix on your laptop with all its tools, as well as MS Office, without an IT security guy questioning why Linux was connected to the network.

Now I have Linux on everything but my gaming machine, and I look back at that Apple hardware very impressed: a fast, cool laptop with power efficiency that I think many AMD/Intel laptop users would be drooling over. I see that Linus Torvalds is using one daily, so hopefully the wrinkles will be sorted out soon.

Of course not an option for running our sims on but looks great for a laptop.
 
I left the Apple ecosystem a while ago. I was in it because early-2000s corporates were not keen on you having Linux connected to their LAN, and it was a good way to have Unix on your laptop with all its tools, as well as MS Office, without an IT security guy questioning why Linux was connected to the network.

Now I have Linux on everything but my gaming machine, and I look back at that Apple hardware very impressed: a fast, cool laptop with power efficiency that I think many AMD/Intel laptop users would be drooling over. I see that Linus Torvalds is using one daily, so hopefully the wrinkles will be sorted out soon.

Of course not an option for running our sims on but looks great for a laptop.

I use Linux for gaming too - if it doesn't run on Proton (very rare), I don't play it! It works extremely well, at least on the Plasma desktop, which I use - not sure about others.

My only exception is sims, but only because my Simucube doesn't yet work on Linux. I hear that's finally in development, and I'll move that over too as soon as it works.

I don't even need it, but I kind of want one of those M2 Pro Mac Minis!
 
I use Linux for gaming too - if it doesn't run on Proton (very rare), I don't play it! It works extremely well, at least on the Plasma desktop, which I use - not sure about others.

My only exception is sims, but only because my Simucube doesn't yet work on Linux. I hear that's finally in development, and I'll move that over too as soon as it works.

I don't even need it, but I kind of want one of those M2 Pro Mac Minis!
There is something appealing about a powerful and very small form factor desktop. On my desktop i am just in browsers, emacs/vim, consoles, tiling WM. The idea of linux in a tiny box suits me down to the ground :)
 
A few CPU temperature questions.

I recently installed a 13900K. I have a 360mm AIO attached to the CPU.

  • There are many CPU temperatures that can be tracked. Is tracking the "package" temperature good enough to understand what is going on?
  • If I play a game like ACC, what level of CPU temperatures do you see on a 13900K?
  • If I track temperatures while playing ACC, I see temps as high as 93°C, does this make sense?
  • When does thermal throttling kick in? Is it 100°C? Which temperature sensor is used to initiate thermal throttling?
Thanks.
 
A few CPU temperature questions.

I recently installed a 13900K. I have a 360mm AIO attached to the CPU.

  • There are many CPU temperatures that can be tracked. Is tracking the "package" temperature good enough to understand what is going on?
  • If I play a game like ACC, what level of CPU temperatures do you see on a 13900K?
  • If I track temperatures while playing ACC, I see temps as high as 93°C, does this make sense?
  • When does thermal throttling kick in? Is it 100°C? Which temperature sensor is used to initiate thermal throttling?
Thanks.
I don't have a 13900K, but 93°C for a game looks pretty high to me, even if it's ACC. I'd expect that to be in the 70s, especially with a decent AIO.
 
Not sure anyone really cares, but the plots below show my CPU (13900K) temps for a 10-minute race in ACC as a function of time. I also checked whether the CPU goes into thermal throttling at any point. It does not. The CPU "Package" and "Max" plots look very similar. The average CPU temp is 10-15°C lower.

This was only a 10-minute race, but it looks like the system had come close to thermally stabilizing by the end of it.

The main reason for doing this was to see if my CPU was ever throttling. I was happy to see it was not.
Attached images: Package.JPG, Core Max.JPG, Avg.JPG, IMG_3795.jpg
 
I count 10 fans; guesses for seemingly excessive CPU temps are some combination of
  • AIO has lost fluid - relatively unlikely
  • AIO has developed gunk - known to occur with specific models
  • AIO pump is not intimate with CPU
  • thermal paste between AIO pump and CPU is inadequate
  • too many fans are either pressurizing or exhausting, confounding flow
  • case vents are inadequate when covers are in place
    Gamers Nexus' review showed 10 degree drop by removing front cover,
    tested with GTX 1080 and i7-6700K
 
@BillyBobSenna could you show us the cpu package power draw for the 10 minute session in your graphs?

That's the real question here.

How much heat does the cooler have to take on?

Depending on that, you could think about setting a power limit without really losing any fps.

I count 10 fans; guesses for seemingly excessive CPU temps are some combination of
  • AIO has lost fluid - relatively unlikely
  • AIO has developed gunk - known to occur with specific models
  • AIO pump is not intimate with CPU
  • thermal paste between AIO pump and CPU is inadequate
  • too many fans are either pressurizing or exhausting, confounding flow
  • case vents are inadequate when covers are in place
    Gamers Nexus' review showed 10 degree drop by removing front cover,
    tested with GTX 1080 and i7-6700K
You forgot:
- The mobo thinks the temps are fine and the fans aren't spinning up

- the mobo would like to spin up the fans but they aren't correctly controlled


Overall though:
These modern CPUs easily spike up to very high temps and they are designed to do so.
If they aren't throttling and the average, while playing, is okay, everything is fine!

Whenever I test stuff like this, I use hwinfo64 and reset the min/max/avg while already on track and then tab out and instantly take a screenshot before quitting the session.

Otherwise the min and avg will be lower than while gaming, obv.

The 3 graphs are completely sufficient too though, since you can see everything yourself.

The graphs look absolutely normal to me for a non-high-airflow case, which the case shown is.
The intakes blow in at 90° and it doesn't look like the front is mesh at all.

Unlike the Fractal Torrent with high speed flow straight from front-bottom to rear-top.

A few CPU temperature questions.

I recently installed a 13900K. I have a 360mm AIO attached to the CPU.

  • There are many CPU temperatures that can be tracked. Is tracking the "package" temperature good enough to understand what is going on?
  • If I track temperatures while playing ACC, I see temps as high as 93°C, does this make sense?
  • When does thermal throttling kick in? Is it 100°C? Which temperature sensor is used to initiate thermal throttling?
Thanks.
I had an i7 2600K, an i5 10600K and now a 7600X.
Two friends had a 9600K.
All overclocked and tested, etc.
But no 13th gen...

Anyway:
Package is usually okay, and I always use it as my controller sensor.
However, one should always have a look at the core temps too, to make sure they aren't throttling.

From what I know, throttling is done per core depending on that core's temp.
I'd think the package temp can cause throttling too, but it's more of a failsafe, since you won't see a 99°C package without the cores already melting...

You did all that and I see nothing to be worried about.

Intel core temps can shoot up from 30°C to 99°C within a few tenths of a second.
They shoot to full load during a loading screen or just from opening Chrome.
You pump a lot of energy into that core, and the temp sensor reacts faster than the heat can move from the core into the rest of the CPU, into the heatspreader, and then into the cooler.

So the package makes more sense, as long as the cores aren't getting out of control.

To keep these core temp spikes below 80°C, you'd have to constantly cool the rest of the CPU.
Basically, make the rest of the CPU the "cooler" for the individual cores.
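
That per-core throttling logic can be sketched as a simple check. The 100°C TjMax is the commonly cited figure for these chips; the temperature readings themselves would come from a tool like HWiNFO and are mocked here:

```python
TJ_MAX_C = 100  # a 13900K core throttles when it reaches TjMax

def throttling_cores(core_temps_c, tj_max_c=TJ_MAX_C):
    """Return the indices of cores at or above the throttle point."""
    return [i for i, t in enumerate(core_temps_c) if t >= tj_max_c]

# Mocked readings: one core spiking during a loading screen.
temps = [62, 71, 100, 68, 93, 88]
print(throttling_cores(temps))  # [2]
```

Which matches the advice above: watch the package temp for fan control, but check per-core maxima when deciding whether anything is actually throttling.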


I personally have my fan spin-up reaction time set to 10 seconds.
But I also put a power limit in place.

So my CPUs won't shoot up that high in the first place.

Quite a long text, sorry. Can't get my thoughts perfectly together this week but wanted to give a reply.
 
@BillyBobSenna could you show us the CPU package power draw for the 10-minute session in your graphs?
Below are the power levels for the CPU and GPU. I used HWiNFO64 to get this data. I did not collect data from the CPU cooler; it might be interesting to see what the cooler is doing.
 

Attachments: CPU Power.JPG, GPU Power.JPG
Below are the power levels for the CPU and GPU. I used HWINFO64 to get this data. I did not collect the data from the CPU cooler. It may be interesting to see what the cooler is doing.
At 150-200 W, a 360 mm AIO that's not running at max speed will just barely keep the CPU from throttling in peaks.
Only with a high-airflow case will the temps be lower.
But as long as it's not throttling, who cares!

Totally normal :thumbsup:

If you want to keep it cooler or just more efficient, you could set a power limit of 80 W and check whether you see any difference.
I highly doubt it!
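
A quick way to judge such a power-limit experiment is to compare average fps and package power before and after. The numbers below are placeholders to show the arithmetic, not measurements:

```python
def perf_per_watt(avg_fps, avg_power_w):
    """Efficiency of a run: frames per second per watt of package power."""
    return avg_fps / avg_power_w

def fps_cost_percent(fps_before, fps_after):
    """How much performance the power limit costs, in percent."""
    return (fps_before - fps_after) / fps_before * 100

# Hypothetical comparison: stock vs. an 80 W limit.
stock   = perf_per_watt(142, 180)  # ~0.79 fps/W
limited = perf_per_watt(138, 80)   # ~1.73 fps/W
print(f"fps lost: {fps_cost_percent(142, 138):.1f}%")  # fps lost: 2.8%
```

If the fps cost comes out in the low single digits while efficiency roughly doubles, as in this made-up example, the limit is almost certainly worth keeping.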
 