Albrig
08-28-2009, 06:50 PM
Many of you are probably experiencing a problem with GPU shadows in one form or another; those not experiencing it (whatever 'it' may be) are likely not using GPU shadows at all, for various reasons, and are happily going along with it all.

This problem is not likely to improve when GU54 arrives. If anything, it is going to give EQ2 users an all-new high in GPU headaches, and for SoE that is not a good thing.

The reason I know this is going to happen is that unless CPU shadows are abandoned (a bad move when multi-core CPUs are pretty much mandatory anyway), GPU shadows are going to do what they do best to graphics cards... burn them up.

I logged in to EQ2 a few weeks ago to see the progress on GPU shadows (not impressed, really, but GU54, I hope, should change that), and the one thing I noticed is how quickly an 8800GT heats up to 80°C within a few minutes (from 44°C idle) and then keeps climbing until you get graphical errors, which at first glance look like a driver problem, or a programming one. They're not.

The GPU is heating up because of the following:
-- for nVidia, it's usually because the cards are overclocked to keep up with ATI
-- for ATI, it's usually because they have high base clocks (better design) to balance their god-awful drivers

With nVidia, the problems come from the majority of manufacturers doing a pretty stupid thing: they insist on keeping the fan spinning slowly at all times and letting the GPU heat up to 90°C before the fan kicks up to high speed to control the heat output. ATI's fans don't let things get that far out of hand, but their sound output has to be heard to be believed.

The nVidia 8800GT (the baseline I use for what most users have, GPU-capability-wise) suffers really badly from a poorly configured fan and too high a clock speed (even though its clocks are sometimes several hundred MHz lower than ATI's).

For this, I use NiBiTor 5.0. This utility reads the BIOS of your GPU so you can save it (as a backup copy), make the changes described below, and write them out to a new BIOS file. In my case, I was surprised to find out my 8800GT is an EVGA (I got a... well, you know, unbadged one from overclockers.co.uk).

Select Tools > Read BIOS > Select Device; your GPU will appear.
Select Tools > Read BIOS > Read into NiBiTor; its values will now appear in NiBiTor.

You will now see lots and lots of information and configuration options for the GPU. The only values we need to change are the ones described below, and I repeat... N-O-T-H-I-N-G else (pretend that echoed several times in your head).

In the case of this particular 8800GT, to reduce the heat we need to reduce the clock speeds. The stock clocks of a standard 8800GT are 600MHz core, 1500MHz shader and 900MHz GDDR3 memory, and the core voltage is 1.05V.

Before we start, let me get one thing straight. A) GPU manufacturers, and the original designers, overclock their GPUs as far as they will go when competition is in effect. B) Competition is in effect, so good luck.

Further, the defaults all GPUs ship with are A) too high and B) demand a high core voltage for a gigantic, monstrous piece of silicon.

Onward...

Clockrates
Click on the Clockrates tab. What we want to do is reduce the core voltage of the GPU, and to do that, you have to reduce the clocks.
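Before we change anything, here's a rough feel for why dropping both the clock and the voltage matters so much. This is just a back-of-the-envelope Python sketch using the usual rule of thumb that dynamic power scales roughly with frequency times voltage squared; that approximation is mine (it isn't something NiBiTor or nVidia tells you), and the target numbers are the ones we set in the next step.

```python
# Back-of-the-envelope heat estimate using the common CMOS rule of thumb:
# dynamic power scales roughly with frequency x voltage^2.
# Stock values are the standard 8800GT figures from this post; the targets
# are the ones flashed below. The scaling law is only an approximation.

def relative_power(freq_mhz, volts, ref_freq_mhz, ref_volts):
    """Dynamic power relative to the reference clock/voltage (P ~ f * V^2)."""
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

stock_core, stock_v = 600, 1.05    # stock 8800GT core clock (MHz) and voltage (V)
target_core, target_v = 400, 0.95  # the values we set in the next step

ratio = relative_power(target_core, target_v, stock_core, stock_v)
print(f"Core dynamic power at 400MHz / 0.95V: roughly {ratio:.0%} of stock")
# Prints roughly 55% of stock -- which is why the card stops cooking itself.
```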
So, change those clocks from their standard defaults to: 400MHz core, 1200MHz shader and 800MHz memory. The reason I have chosen this configuration is that it is nicely stepped: 400, 800, 1200. It should therefore give a certain (unknown) amount of balanced transitional performance between the clocks inside the GPU.

Voltages
Click on the Voltages tab (Exact mode). Now we can reduce the core voltage to 0.95V. Together with the lower shader and pixel fillrates from the reduced clocks, this cuts the heat output of the GPU by a significant margin. To highlight: this is a SIGNIFICANT DROP in the amount of juice supplied to the GPU. Even though it's only 0.1V, in nano-electrical terms that is a pretty B-I-G difference.

Temperatures
Click on the Temperatures tab and then Fanspeed IC. It is highly likely that the fan needs to be configured, because when I checked the EVGA settings they were utterly wrong and meaningless for the default clock speeds of this particular GPU. Sorry EVGA. Just plain stupid. If you want me to show proof of their original settings, just ask. Hilarious.

From top left and down, change whatever is there to the following:
TCrit 65°
Thyst 2°
Automatic Speed
Tmin 52°
Min Duty cycle 60%
Trange (slope) 20°

These should change, or appear as, from top right and down:
PWM mode: High freq
Thigh 70°
TOperating 50°
TLow 40°

What this does:
The fan will start at 60%.
If the temperature exceeds 60°, the fan speed will increase with the temperature and reach 100% at 72°.
After the temperature drops below 58°, it will run at 60% again.
If the temperature exceeds 73°, the fan will run at 100% until it drops below 71°.

Now we need to save this BIOS configuration: File > Save BIOS. Give it a name. Save it.

Create a Win98-bootable USB memory stick using HP's HPUSBFW utility. Copy nvflash (from nVidia) to the stick, along with the BIOS file you just saved in NiBiTor (and the original backup copy).

The files on the USB memory stick should look like this:
bootlog.prv
bootlog.txt
cwsdpmi
nvflash
xxx.rom
original.rom

Now, as long as your motherboard can boot from USB, you're all set to go. Win98 should boot from the memory stick. Type: nvflash xxx.rom. Wait for it to complete. Do not interrupt it or do ANYTHING. Don't do this during a lightning storm. Reboot.

You should be very surprised at the performance, and at how well you can run, say, the Titan Quest demo at maximum detail without burning a large hole in your GPU. I dare you to try that before re-configuring your GPU this way and not be shocked by the result... For some reason, the Titan Quest demo seems highly capable of killing a GPU. It sailed right past 90°C when I tried it with the default settings, and I smelt it before I noticed the graphics screwing up. With this configuration it barely touches 68°C at an internal case temperature of 33°C.

It will idle about 20°C lower than normal too. Run the GPU-Z utility and check that everything is as you would expect.

This, in my vast knowledge of all things tech, will give improved performance, even by the very nature of reducing performance. It will make sense in the end. Well, at least in GPU terms.

Best regards.
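P.S. If you want to sanity-check the fan behaviour on paper before you flash anything, here is a minimal Python sketch of the curve as I've summarised it above (60% floor, linear ramp to 100% at 72°). It models my description of the behaviour, not the fan controller's actual registers, and it ignores the hysteresis steps.

```python
# Rough model of the fan curve described above: 60% minimum duty cycle,
# linear ramp starting at 60 degC, full speed at 72 degC.
# This is my reading of the post's summary, not the controller's register logic.

RAMP_START = 60.0   # deg C: fan begins ramping above this
FULL_SPEED = 72.0   # deg C: fan reaches 100% here
MIN_DUTY = 60.0     # percent: minimum duty cycle

def fan_duty(temp_c: float) -> float:
    """Fan duty cycle (%) for a given GPU temperature, ignoring hysteresis."""
    if temp_c <= RAMP_START:
        return MIN_DUTY
    if temp_c >= FULL_SPEED:
        return 100.0
    frac = (temp_c - RAMP_START) / (FULL_SPEED - RAMP_START)
    return MIN_DUTY + frac * (100.0 - MIN_DUTY)

for t in (50, 60, 66, 72, 80):
    print(f"{t} degC -> {fan_duty(t):.0f}% fan")
```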