In the past, hardcore gamers would routinely overclock their graphics cards. However, overclocking no longer delivers the performance boost it once did, so it's not as popular as it used to be. With the gains shrinking year after year, the question is whether gamers should still overclock their graphics cards at all. Considering the risks involved, the answer is no.
This article explains why gamers should not overclock their GPUs.
Note: This article is subjective and solely reflects the writer's opinions.
What happened to overclocking?
Overclocking used to mean something back in the day. Gamers would risk their PC components to push their GPUs and gain a 20% - 30% performance uplift in multiple titles.
However, modern graphics cards barely provide a 2% - 5% performance gain, which is too little to mean anything. So, what happened to overclocking? To understand this, you must know how GPUs and overclocking worked in the past.
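To put those percentages in perspective, here is a quick back-of-the-envelope comparison. This is a rough sketch with an assumed 60 FPS baseline; the gain ranges are the figures quoted above, not measurements from any specific card.

```python
# Rough illustration of why a 2% - 5% overclock barely registers,
# compared to the 20% - 30% gains of older GPU generations.
# The baseline FPS and gain ranges are illustrative assumptions.

baseline_fps = 60

old_gains = (0.20, 0.30)   # typical overclocking uplift back in the day
new_gains = (0.02, 0.05)   # typical uplift on modern, factory-tuned cards

for label, (low, high) in (("Then", old_gains), ("Now", new_gains)):
    lo_fps = baseline_fps * (1 + low)
    hi_fps = baseline_fps * (1 + high)
    print(f"{label}: {baseline_fps} FPS -> {lo_fps:.0f}-{hi_fps:.0f} FPS")

# Then: 60 FPS -> 72-78 FPS
# Now:  60 FPS -> 61-63 FPS
```

An extra one to three frames at 60 FPS is small enough to be hard to notice in actual gameplay, which is exactly why modern overclocks feel so underwhelming.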
Back in the day, stability was the core focus of graphics card manufacturers. Any instability or crashes could lead gamers to rethink where to purchase their next GPU from. No manufacturer wanted to lose potential customers, so they shipped their GPUs with conservative clocks and kept the voltages under strict limits to rule out crashes.
In other words, GPU manufacturers sacrificed some performance for the sake of stability. However, gamers would simply unlock the voltage after purchase and give the core and memory clocks a decent push, which often delivered a significant increase in FPS.
These days, however, GPU manufacturers are in stiff competition with each other. The higher the FPS, the better their graphics card looks in the eyes of consumers, so stability is no longer as important as it used to be. Higher FPS is the priority now.
For this reason, modern graphics cards leave the factory with clocks and voltages pushed about as high as they can safely go, leaving little headroom for anyone to overclock. For a mere 5% FPS increase, you would need to push the core voltage much higher than before.
Doing so significantly increases the heat on the core, the VRMs (voltage regulator modules), and the VRAM. The heat from the GPU core alone is often too much for the stock fans to handle, and they can't be changed without significant modification.
In short, gaining any more performance would require pushing the voltages significantly, which poses a serious risk of damaging the GPU die itself. That is why graphics card manufacturers like Nvidia have locked GPU voltages.
Without the ability to raise the voltage, the overclocking headroom is extremely narrow. You now barely get a 2% - 5% increase in FPS, which makes overclocking hard to justify for most gamers.
Pros and cons of overclocking the GPU
Here are the pros and cons of overclocking a GPU:
Pros
- 2% - 5% increase in average FPS
- Higher 0.1% low FPS
- Fewer stutters, especially with VRAM overclocks
Cons
- Risk of damaging the core, VRMs, or VRAM
- Temperature increase, which could lead to throttling
- If throttling occurs, the FPS could end up even lower than it was before overclocking
- Artifacting, corruption, or lines on the screen
- May draw more power than your existing PSU can handle
Who should overclock their graphics card?
While there are a couple of pros to overclocking a graphics card, the cons far outweigh them. The added risk and power consumption are simply not worth the small performance gain, so it's hard to recommend that gamers overclock their graphics cards. However, if performance is all you care about and even an extra 2 FPS matters to you, then go for it.
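If you do decide to push your card, at least keep an eye on temperatures and clock speeds while running a benchmark so you can catch throttling or overheating early. Below is a minimal monitoring sketch, assuming an Nvidia GPU with the nvidia-smi utility available on the system; the exact query fields supported can be checked with nvidia-smi --help-query-gpu.

```python
# Minimal sketch: poll nvidia-smi while a game or benchmark runs,
# so a throttling or overheating overclock is easy to spot.
# Assumes an Nvidia GPU and that nvidia-smi is on the PATH.
import subprocess
import time

QUERY = "temperature.gpu,clocks.current.graphics,power.draw"

def read_gpu_stats() -> str:
    result = subprocess.run(
        ["nvidia-smi",
         f"--query-gpu={QUERY}",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

if __name__ == "__main__":
    # Sample once per second for a minute. Watch for rising temperatures
    # or core clocks dropping below the target you set.
    for _ in range(60):
        print(read_gpu_stats())
        time.sleep(1)
```

If the core clock keeps dipping as the temperature climbs, the card is throttling, and the overclock is likely costing you FPS rather than adding it.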