GPU Overclocking Guide

Some people are unsure how to overclock their GPU. Quite frankly, the GPU is the easiest component in a system to overclock. Let's start from the beginning.

The first thing I recommend is to grab a few useful programs and the latest NVIDIA WHQL drivers. The programs I recommend are EVGA Precision 1.6.1 (overclocking/monitoring), GPU-Z 0.3.3 (monitoring/GPU info), and ATItool (artifact testing).

ForceWare 182.50 WHQL Driver
WinXP 32 bit:
WinXP 64 bit:
WinVista 32 bit:
WinVista 64 bit:

Proper uninstallation/installation of video drivers (Vista; in XP it should be similar):
*Download and install Driver Sweeper or something similar.
*Right-click Computer and click Device Manager (on the left).
*Select your display device (video card) and uninstall the video device software.
*Delete driver folder (C:\NVIDIA\WinVista64\1xx.xx)
*Reboot into Safe Mode (repeatedly hit F8 during the boot sequence).
*Run Driver Sweeper or your equivalent. Only clean the Display driver.
*Download and install your desired driver.

When installing new drivers, make sure all anti-virus software is disabled.

EVGA Precision 1.6.1 Download:

GPU-Z 0.3.3 Download:

ATItool 0.26 Download:

Driver Sweeper 1.5.5 Download:

For those of you who don’t know, artifacts occur when your GPU is overclocked too high and/or your temperatures are too high (but not high enough to force a shutdown). If you get artifacting right out of the box with good temperatures, and you used the proper method of installing new (and different) NVIDIA WHQL drivers, I suggest you RMA your GPU. Make sure your temperatures stay below 105 degrees Celsius; that is the maximum most cards can handle. It is best to stay below 90 degrees Celsius, since artifacts can occur well before 105.

For this guide I’ll be using my primary video card as an example, an MSI GTX 260. This is the vanilla (default) version, which ships at the factory default clocks.

Factory Settings:
Core Clock: 576 MHz
Shader Clock: 1242 MHz
Memory Clock: 999 MHz (1998 MHz effective data rate)
Fan Speed: Auto (40%)

Uh oh, we ran into another confusing term here.

Effective data rate? What does that mean? My card was supposed to come with 1998 MHz memory, but it is shown as 999 MHz!

High-end video cards use double data rate memory, usually GDDR3 (several ATI cards use GDDR5 as well). Multiply the memory clock EVGA Precision/GPU-Z shows by two to get the effective data rate, which is usually what is advertised.

999 MHz x 2 = 1998 MHz
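The math is trivial, but here is a quick sketch of the relationship (the function name is mine, not from any tool):

```python
def effective_data_rate(memory_clock_mhz, transfers_per_clock=2):
    """Advertised (effective) memory rate in MHz.

    Double data rate memory like GDDR3 transfers data on both the
    rising and falling edge of the clock, hence the factor of 2.
    """
    return memory_clock_mhz * transfers_per_clock

print(effective_data_rate(999))  # prints 1998
```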

Now onto the overclocking. I open up EVGA Precision 1.6.1, GPU-Z 0.3.3, and ATItool 0.26. My core clock is 576 MHz, shader clock is 1242 MHz, and memory clock is 999 MHz, just as it should be. Core/shaders are linked and the fan speed is Auto, or 40%. Notice that with EVGA Precision 1.6.1 you can select other video cards to overclock separately. The monitoring graph can be detached from the main window and expanded to show additional information. Very useful. Do not enable Apply at startup until you’ve ensured stability!

Let’s get started. Unlink the core/shaders, set the fan speed to manual, and raise it a little. You can push the sliders up, or enter a number directly to get your desired clock speeds. Overclock in small increments, no more than 25 MHz at a time for the core clock. The shader clock should run at least double the core clock so you won’t have any issues. The memory clock should be raised in small increments too, no more than 25 MHz. After the first small overclock, select the scan for artifacts feature in ATItool. So first I unlinked the core/shaders and set the fan speed to 60% (it went to 61% on its own).

I increased the core clock from 576 MHz to 600 MHz, the shaders from 1242 MHz to 1280 MHz, and the memory clock from 999 MHz to 1025 MHz. Yes, that is a little more than 25 MHz per increment, but I’ve overclocked this card before and know what it is capable of.

Run ATItool for a little while (several minutes) and see if you run into any artifacts. I didn’t, so let’s move on.

Let’s bump it up to GTX 260 SC speeds: 626/1350/1053 (core/shaders/memory). I’m making pretty large jumps because I’ve overclocked this card before; for everyone else I recommend 5 MHz increments.
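If you want to plan your steps ahead of time, here is a small sketch that lists every 5 MHz increment between your current clock and a target (the helper and its numbers are just an illustration, not part of any tool):

```python
def clock_steps(current_mhz, target_mhz, step_mhz=5):
    """List each intermediate clock to test, in step_mhz increments.

    Run an artifact scan at every step before moving to the next;
    if artifacts appear, fall back to the last stable value.
    """
    steps = list(range(current_mhz + step_mhz, target_mhz, step_mhz))
    steps.append(target_mhz)  # always finish exactly on the target
    return steps

# e.g. stepping the core from 576 MHz toward GTX 260 SC speed (626 MHz)
print(clock_steps(576, 626))
# prints [581, 586, 591, 596, 601, 606, 611, 616, 621, 626]
```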

Run ATItool. Still no artifacts, so let’s move on.

Bumped up to FTW speeds. No errors yet.

Run ATItool. I ramped the fan speed up to 60% to keep the temperatures down. Still no artifacts, so let’s move on.

I ran into an error. What do I do?
Lower the speeds a little bit. If you get the “nvlddmkm stopped responding and has recovered” error, try lowering the memory clock first. If it persists, lower the core clock. If it still persists, lower the shader clock too. Drop them back to the last successful clocks you had, then move forward again in smaller steps. Sometimes the same error can cause a BSOD instead. These blue screens are pretty detailed and usually say something along the lines of “nvlddmkm stopped responding and has failed to recover.” Follow the same procedure.

I can’t overclock as high as others who have the same card. Why is this?
Not all cards are the same, some overclock further than others. Water cooling your GPU can help you reach higher speeds.

How much does GPU overclocking affect performance?
In games, the difference isn’t that noticeable unless you reach a very high overclock. In GPU heavy benchmarks such as 3DMark Vantage you’ll get a nice increase in your GPU score and even your overall score if you’ve achieved a decent overclock.

EVGA offers SC, SSC, and FTW models. What is the difference?
These are factory overclocked cards, guaranteed to work above the default speeds. In my opinion the SC/SSC models aren’t worth it when there is an FTW model for that card; from what I’ve read, these are the parts that failed to reach the FTW spec, but I’m not too sure on this. The FTW cards are priced very high, but they also have very nice overclocks on them. The same goes for the SSC on cards that have no FTW model.

I’ve run into my max overclock. When you do, you can post it here. :slight_smile:

Final Settings:
Core Clock: 760 MHz
Shader Clock: 1379 MHz
Memory Clock: 1080 MHz
Fan Speed: 60%
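For the curious, the percentage gain over factory clocks (using the numbers from this guide) works out like this:

```python
factory = {"core": 576, "shader": 1242, "memory": 999}   # MHz
final   = {"core": 760, "shader": 1379, "memory": 1080}  # MHz

for domain in factory:
    gain = (final[domain] - factory[domain]) / factory[domain] * 100
    print(f"{domain}: +{gain:.1f}%")
# prints:
# core: +31.9%
# shader: +11.0%
# memory: +8.1%
```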

Where can I find specs for my older card for my older PC?

It’s an NVIDIA EVGA 6200.

(yes I checked the box, most of the specs aren’t there)

EVGA Precision will show you your clocks.