I reinstalled my PC yesterday, and of course I also had to reinstall my video card drivers. I installed ForceWare v93.71 (the latest version of nVidia's video card drivers), followed by the (unofficial) Coolbits 2.0 Registry Hack. This registry hack unlocks some features in the ForceWare drivers that are normally hidden. After applying it, you can change your video card's core and memory clock speeds, among other things.
A normal GeForce 7800 GTX has clock speeds of 430 / 1200 (core / memory). My XFX GeForce 7800 GTX has clock speeds of 450 / 1250 (standard for the XFX version). The XFX GeForce 7800 GTX XXX version has clock speeds of 490 / 1300 (standard for the XFX XXX version).
Of course, the nVidia control panel (after having installed Coolbits 2.0) always told me that my video card's clock speeds are 450 / 1250.
But not today. For some reason, the nVidia control panel now tells me my clock speeds are 450 / 2500. In other words, the reported memory clock is exactly twice as fast. I'm about to stress-test my video card while monitoring the temperature, to see whether it actually runs hotter (which would tell me whether the clock speeds really are higher). If it gets too hot, I'll obviously have to downclock to 450 / 1250.
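One possible reading (an assumption on my part, not confirmed by the driver release notes): DDR-type memory such as the GDDR3 on this card transfers data on both clock edges, so a tool can report either the base figure or the doubled "effective" data rate. The 1250 → 2500 jump is exactly that factor of two, which would mean the new driver is just labeling the same memory speed differently rather than the card actually running faster. A minimal sketch of the arithmetic (the helper name `effective_ddr_rate` is made up for illustration):

```python
# Sketch of the DDR doubling hypothesis: if one driver version reports
# the base memory figure and another reports the effective (double data
# rate) figure, the two readings differ by exactly 2x.

def effective_ddr_rate(base_mhz: float) -> float:
    """Double-data-rate figure for a given base memory clock figure."""
    return base_mhz * 2

old_reading = 1250  # what ForceWare 91.33 showed
new_reading = effective_ddr_rate(old_reading)
print(new_reading)  # 2500.0, matching what ForceWare 93.71 shows
```

If this is what's happening, the card's real memory speed is unchanged and the temperature stress test should come back normal.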
The thing is: I want to know why the clock speeds are showing up incorrectly.
The only thing I know is this: before I reinstalled my PC, I had the 91.33 beta drivers installed.
So to sum it up:
Before:
ForceWare v91.33 beta
Coolbits 2.0
Clock speeds: 450 / 1250 (= correct)
After:
ForceWare v93.71 non-beta
Coolbits 2.0
Clock speeds: 450 / 2500 (= incorrect)
Why...?