How to choose the size of computer power supply?
When buying computer parts, many users focus only on core components such as the CPU and graphics card and pay little attention to the power supply; some simply reuse whatever unit they already have. Yet the power supply's importance is self-evident: it feeds every component and is the basis of stable operation. So how do you choose the size of a computer power supply? Below, PHP editor Xigua explains how to pick one. I hope it helps; friends who are interested should not miss it.
Simply put, it depends on peak power. For example, suppose your CPU has a TDP of 45W, your graphics card a TDP of 150W, the motherboard and memory together peak at around 20W, and a hard disk draws 5W at most; then leave a little extra margin for peripherals. For such a configuration, even assuming every component runs at full load at the same time, a power supply rated at 300W is enough.
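Sketching that estimate in code (the wattages are just the example figures above; the 30% headroom factor is my own assumption, not something from the article):

```python
# Rough full-load power estimate for the example build above.
# All wattages are the illustrative figures from the text, not measurements.
components_w = {
    "CPU (TDP)": 45,
    "Graphics card (TDP)": 150,
    "Motherboard + memory": 20,
    "Hard disk": 5,
}

total_w = sum(components_w.values())       # ~220 W if everything peaks at once
headroom = 1.3                             # assumed ~30% margin for peripherals and aging
recommended_rating_w = total_w * headroom  # ~286 W -> a quality 300 W unit suffices

print(f"Estimated full load: {total_w} W")
print(f"Suggested rated power: at least {recommended_rating_w:.0f} W")
```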
In fact, during normal use it is essentially impossible for every component to run at full load at the same time. Actual consumption may be only about 30W at idle and only around 200W while gaming.
So from that perspective, simply buying an honestly rated 300W power supply, one whose label is not inflated, looks like a perfectly sound plan. But!
Once your configuration reaches a certain level, the power supply can no longer be chosen this way. This is hard to explain in the abstract, so I will use my own machine as an example:
1. The CPU is overclocked: the 5960X has a default TDP of 140W and a default full-load core voltage of around 1.1V.
According to measurements on overseas review sites, at stock frequency and full load the 5960X draws about 110W through the CPU 12V input, the so-called CPU 12V rail. At the ~1.1V core voltage, the package current works out to roughly 100A.
(The 12V input feeds the CPU package, i.e. the cores, through the motherboard's voltage regulator module.) Since that current flows at 1.1V, referred back to the 12V side it is only about 10A.
That level is something any decent power supply that meets Intel's specifications can handle; an ordinary unit rated at 300W, for example, offers a 12V output current of 18A to 22A.
But things change once the CPU is overclocked. For example, I currently run the cores at 4.5GHz and the Uncore at 4GHz, with a core voltage of 1.35V and an Uncore voltage of 1.4V.
At that point the CPU's full-load power consumption approaches 300W, with close to 200A flowing at the core voltage. Referred to 12V, the power supply's 12V output must be able to deliver at least 25A just for the CPU.
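A minimal sketch of that arithmetic, using the stock (110W at ~1.1V) and overclocked (~300W at 1.35V) figures quoted above and ignoring VRM conversion losses:

```python
def cpu_currents(package_power_w: float, core_voltage_v: float) -> tuple[float, float]:
    """Return (current at the core voltage, current drawn from the 12 V input).

    VRM conversion losses are ignored to keep the arithmetic simple.
    """
    core_current_a = package_power_w / core_voltage_v
    rail_current_a = package_power_w / 12.0
    return core_current_a, rail_current_a

# Stock 5960X: ~110 W at ~1.1 V -> roughly 100 A at the core, under 10 A from the 12 V input
print(cpu_currents(110, 1.1))
# Overclocked: ~300 W at ~1.35 V -> on the order of 200 A at the core, ~25 A from the 12 V input
print(cpu_currents(300, 1.35))
```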
Take the specification label of a random power supply as an example; this unit is rated at 400W.
Suppose only the CPU in my machine draws a lot of power and the graphics card draws almost nothing. On paper, a 400W unit should have no trouble handling a 300W CPU plus peripherals, right? The problem lies in the 12V output current.
First, this power supply has a single 12V rail that can deliver only 21A. In other words, even if every other component in the machine drew nothing at all, this unit could not feed a CPU with a full-load draw of about 300W.
Since it cannot guarantee power to the CPU, the only option is to choose a power supply whose 12V output comfortably exceeds 25A.
This kind of power supply is generally rated above 500W.
PS. Full CPU load is very easy to reach in practice, for example during video rendering or video encoding.
2. After the CPU come the graphics cards: I have four Titan Xs, each with a stock TDP of 250W.
Measured full-load power under a stress test is about 240W per card, and typical in-game consumption is about 200W.
Going by maximum consumption, one card needs close to 20A from the 12V supply, so four cards need at least 80A.
Add the overclocked CPU described above, and the 12V supply must deliver at least 105A in total.
Once a machine reaches this level of configuration, you basically have no other choice than a power supply rated at 1200W or more. Yet for most domestic-brand 1200W units, merely surviving full load is already an achievement; meaningful headroom is out of the question, because that figure is what the stress test actually measures for such a system, and on top of that I have overclocked as well.
In other words, for such a system the 12V output actually needs to supply around 120A, which theoretically calls for a power supply rated at more than 1500W.
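Putting the CPU and graphics card figures together (the numbers are the author's measurements quoted above; this is a sketch of the arithmetic, not a sizing tool):

```python
CPU_12V_A = 25            # overclocked 5960X, ~300 W referred to 12 V
GPU_FULL_LOAD_W = 240     # measured stress-test draw per Titan X
NUM_GPUS = 4

gpu_current_a = NUM_GPUS * GPU_FULL_LOAD_W / 12.0   # ~80 A for four cards
total_12v_a = CPU_12V_A + gpu_current_a             # ~105 A before further overclocking

# With the additional overclocking the author mentions, the figure rises to
# roughly 120 A, i.e. about 1440 W on the 12 V side -> a 1500 W-class unit in theory.
print(f"Combined 12 V current: ~{total_12v_a:.0f} A "
      f"(~{total_12v_a * 12:.0f} W on the 12 V side)")
```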
At the moment I am using an HCP1200, whose six 12V rails can deliver a combined 99A.
But fortunately, Delta was very conscientious as the OEM: whether in my own stress tests or in professional reviews abroad, the unit's real output capability is somewhere just past 1600W. (⊙﹏⊙)b
That is the only reason this power supply did not blow up under such a load.

After all of the above, what I actually want to say is this: for an ordinary user, it is fine to simply estimate the full-load power and buy a power supply rated somewhat above it. But the scientific approach is to buy a unit that can supply sufficient current. In other words, look mainly at the 12V output current printed on the power supply's nameplate and check whether it covers the current the CPU and graphics card draw at full load, rather than looking only at the rated wattage.
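As a final illustration of that nameplate check (the helper name and the 20% margin are my own hypothetical choices, not anything specified in the article):

```python
def psu_is_adequate(combined_12v_a: float, load_12v_a: float, margin: float = 1.2) -> bool:
    """Compare a nameplate's combined 12 V amperage against the estimated load.

    `margin` is an assumed 20% headroom factor so the unit is not run at its limit.
    """
    return combined_12v_a >= load_12v_a * margin

# A single-rail 400 W unit rated 21 A on 12 V vs. a ~25 A overclocked CPU:
print(psu_is_adequate(21, 25))    # False -> cannot even feed the CPU alone
# The HCP1200's ~99 A combined 12 V vs. the ~105 A system estimate:
print(psu_is_adequate(99, 105))   # False -> short on paper, saved by Delta's real headroom
```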