
View Full Version : GPU shadows and GPU temp, maybe GPU problems since etc


Albrig
08-28-2009, 06:50 PM
Many of you are probably experiencing a problem with GPU shadows in one form or another; the ones probably not experiencing it (whatever 'it' may be) are likely not using GPU shadows at all, for various reasons, and happily going along with it all.

This problem is not likely to improve when GU54 arrives. If anything, it is going to give the users of EQ2 an all-new high in GPU headaches; and for SoE, that is not a good thing. The reason I know this is going to happen is that unless CPU shadows are abandoned (a bad move when multi-core CPUs are pretty much mandatory anyway), GPU shadows are going to do what they do best of all to graphics cards... burn them up.

I logged in to EQ2 a few weeks ago to see the progress on GPU shadows (not impressed really, but GU54, I hope, should change that), and the one thing I noticed is how quickly an 8800GT heats up to 80c in a few minutes (from 44c idle) and then keeps going up to the point where you get graphical errors; at first glance these appear to be a driver problem, or a programming one. They're not.

The GPU is heating up because of the following:
- For nVidia, it's usually because they're overclocked to keep up with ATI.
- For ATI, it's usually because they have high base clocks (better design) to balance their god-awful drivers.

With nVidia, the problems come from the majority of manufacturers doing a pretty stupid thing: they have some kind of bent for making the fan spin slowly at all times and letting the GPU heat up to 90c before the fan kicks in to high speed and controls the heat output. With ATI, the fans don't let it get that far out of hand, but their sound output has to be heard to be believed.

The nVidia 8800GT (the base measure I use for the GPU capabilities most users have) suffers really badly from a poorly configured fan and too high a clock speed (even when those clocks are sometimes several hundred MHz lower than ATI's).

For this test, I use NiBiTor 5.0. This utility reads the BIOS of your GPU so that you can save it (a backup copy) and then write your changes out to a new BIOS file. In my case, I was surprised to find out my 8800GT is an EVGA (I got a... well, you know, unbadged one from overclockers.co.uk).

Select Tools > Read BIOS > Select Device; your GPU will appear.
Select Tools > Read BIOS > Read into NiBiTor; the values will now appear in NiBiTor.

When you do this, you will see lots and lots of information/configuration about the GPU. The only values we need to change here are the ones that already appear in their boxes related to the following, and I repeat... N-O-T-H-I-N-G else (pretend that echoed several times in your head).

In the case of this particular 8800GT, to reduce the heat we need to reduce the clock speed. The original clock speed of the standard 8800GT is 600MHz core, 1500MHz for the shader clock and 900MHz for the DDR3 memory. The core voltage of this GPU is [1.05v].

Before we start, let me get one thing straight. A) GPU manufacturers, and the original designers, overclock their GPUs as far as they will go if competition is in effect. B) Competition is in effect, so good luck.

FURTHER

The defaults all GPUs use are A) too high and B) need a high core voltage for a gigantic and monstrous piece of silicon.

Onward...

Clockrates

Click on the Clockrates tab. What we want to do is reduce the core voltage of the GPU, and to do that, you have to reduce the clocks.
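As a rough illustration of why dropping the clocks lets you drop the voltage (and therefore the heat): dynamic power in a chip scales roughly with frequency times voltage squared. This little Python sketch is my own back-of-the-envelope, not anything from NiBiTor; it compares the stock 600MHz / 1.05v core with the reduced values set in the steps below.

# Back-of-the-envelope only: dynamic power in CMOS logic scales roughly
# with frequency * voltage^2. Real GPUs also have leakage and memory
# power, so treat this as an illustration, not a measurement.

def dynamic_power_ratio(f_new_mhz, v_new, f_old_mhz, v_old):
    """Relative core dynamic power after a clock/voltage change."""
    return (f_new_mhz / f_old_mhz) * (v_new / v_old) ** 2

# Stock 8800GT core vs. the reduced values used in this post.
ratio = dynamic_power_ratio(400, 0.95, 600, 1.05)
print(f"Core dynamic power: roughly {ratio:.0%} of stock")  # ~55%

Roughly 55% of the stock core's dynamic power, in other words - not an exact figure, but it shows why the card ends up running so much cooler.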
So change the defaults to: 400MHz core, 1200MHz shader and 800MHz DDR3 memory. The reason I have chosen this configuration is that it is nicely stepped: 400, 800, 1200 - therefore it will provide a certain (unknown) amount of balanced transitional performance in terms of clock speed internally to the GPU.

Voltages

Click on the Voltages tab (Exact mode). Now we can reduce the core voltage to [0.95v]. This will reduce the heat output of the GPU by a significant margin across the shader and pixel fillrate. To highlight: this is a SIGNIFICANT DROP in the amount of juice supplied to the GPU. Even though it's just 0.1v, in nano-electrical terms that is a pretty B-I-G difference.

Temperatures

Click on the Temperatures tab and then Fanspeed IC. It is highly likely that the fan needs to be configured, because when I checked the EVGA settings, they were utterly wrong and meaningless for the default clock speeds of this particular GPU. Sorry EVGA. Just plain stupid. If you want me to show proof of their original settings, just ask. Hilarious.

From top left and down, change whatever is there to the following:
TCrit 65°
Thyst 2°
Automatic Speed
Tmin 52°
Min Duty Cycle 60%
TRange (slope) 20°

These should change, or appear as, from top right and down:
PWM mode: High freq
THigh 70°
TOperating 50°
TLow 40°

What this does:
The fan will start at 60%.
If the temperature exceeds 60°, it will increase with the temperature and reach 100% at 72°.
After the temperature drops below 58°, it will run at 60% again.
If the temperature exceeds 73°, the fan will run at 100% until it drops below 71°.

Now we need to save this BIOS configuration: File > Save BIOS. Give it a name. Save it.

Create a Win98 bootable USB memory stick (use HP's HPUSBFW utility program). Copy nvflash (from nVidia) to the USB memory stick, and copy the BIOS file you just saved in NiBiTor (plus the original backup copy).

The files on the USB memory stick should look like this:
bootlog.prv
bootlog.txt
cwsdpmi
nvflash
xxx.rom
original.rom

Now, as long as your motherboard can boot from USB, you're all set to go. Win98 should boot from the USB memory stick. Now type: nvflash xxx.rom

Wait for it to complete. Do not interrupt it or do ANYTHING. Don't do this during a lightning storm. Reboot.

You should be very surprised at the performance and how well you can run, say, the demo of Titan Quest set to maximum detail, without burning a large hole in your GPU. I dare you to try that on the default settings first - the shock you get before you re-configure your GPU this way is something else... For some reason, the Titan Quest demo seems highly capable of killing a GPU. It sailed right past 90c when I tried it with the default settings, and I smelt it before I noticed the graphics screwing up. With this configuration it barely touches 68c at an internal case temperature of 33c.

It will idle 20c lower than normal too. Run the GPU-Z utility and check everything is as you would expect.

This, in my vast knowledge of all things tech, will give improved performance, even by the very nature of you reducing performance. It will make sense in the end. Well, at least in GPU terms.

Best regards.
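To make the fan behaviour described above a bit more concrete, here's a small Python sketch of the intended curve (a 60% floor, ramping to 100% between roughly 60° and 72°). It's only a model of what those settings are meant to produce - the real fan controller derives its curve from Tmin, the slope and the hysteresis values, so the exact shape may differ, and the hysteresis isn't modelled here.

# Rough model of the fan behaviour described above (60% floor, linear
# ramp to 100% between ~60C and ~72C). Illustration only - the actual
# Fanspeed IC builds its curve from Tmin / TRange / Thyst, and the
# 2-degree hysteresis is not modelled.

def fan_duty_percent(gpu_temp_c, t_ramp_start=60.0, t_full=72.0, min_duty=60.0):
    """Approximate duty cycle (%) for a given GPU temperature."""
    if gpu_temp_c <= t_ramp_start:
        return min_duty
    if gpu_temp_c >= t_full:
        return 100.0
    # Linear ramp between the floor and full speed.
    span = t_full - t_ramp_start
    return min_duty + (100.0 - min_duty) * (gpu_temp_c - t_ramp_start) / span

for t in (50, 60, 66, 72, 80):
    print(f"{t}C -> {fan_duty_percent(t):.0f}% duty")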

TSR-DanielH
08-28-2009, 07:13 PM
Greetings,

Generally speaking, I would not recommend these steps unless you are an advanced user. Please keep in mind that doing these steps incorrectly can result in hardware damage. If you still decide that you want to try this, then you will be assuming all the risk associated with the procedure.

Thank you

Albrig
08-28-2009, 07:29 PM
I should have mentioned this. Yes, ask an advanced-user friend if you can. There's little risk or danger doing this yourself - anyone can do it - but mistakes could be made if you don't check carefully what you are changing. A mistake, in that case, means either a dead graphics card or one that doesn't even start - so flash back to the original and you will be fine.

However, I am confident you'll not be at any risk whatsoever - it's like following instructions that specifically tell you not to insert your hand into a blender while it is in operation - and it will greatly benefit users with an nVidia 8800GT or higher-end GPU experiencing heating issues; which they will, as it's impossible to avoid.

Remember, please copy the original BIOS rom file to the flash memory stick, because if you do have problems, you can flash it back.

In most of the tests I tried, even if you stupidly type in 700 for the core instead of 400, or use 1.15v instead of 1.05v, the GPU BIOS will most likely just fail to boot. It will not kill the GPU, but you never know for sure what could happen. It certainly won't risk damaging your PC or anything else within it.

I'm very thorough.

This change will be a great benefit to anyone who wants to try it.

Albrig
08-29-2009, 07:02 PM
How I managed to start this thread and then have this article pop up is the biggest coincidence that has happened in my time on the internet:

heat - http://www.eurogamer.net/articles/digitalfoundry-system-failure-article

The way to resolve the problems in the above link is to reduce the clocks and reduce the core GPU voltage. But amazingly, they don't mention this; nor has anyone involved with the subject tried to do it. It would solve every single problem.

You could argue about the reduction in performance, but my argument is that reducing the clocks would cost perhaps a 5% drop in performance, because at 1280x720 and lower resolutions the effect would be nil.

For instance, if anyone does look at the OP in this thread, there will be questions about how this would affect performance. The answer is: at resolutions up to 1680x1050, negligible; at 1920x1200, probably by a very large amount. So unless 1920x1200 is your only acceptable resolution, you won't notice; if it is, you either accept reduced speed or reduce detail levels.

The fact of the matter is that GPU manufacturers are causing primary problems across a spectrum, rather than isolated incidents of 'things going wrong for some reason' elsewhere.
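For anyone wanting the rough arithmetic behind that resolution claim, here's a quick Python sketch - my own illustration with round numbers, not a benchmark from the OP.

# Quick arithmetic behind the resolution argument above - illustration,
# not a benchmark. Per-frame pixel load scales with resolution, and the
# reduced 400MHz core has about two thirds of the stock 600MHz fill rate.

resolutions = {
    "1280x720":  1280 * 720,    # 921,600 pixels
    "1680x1050": 1680 * 1050,   # 1,764,000 pixels
    "1920x1200": 1920 * 1200,   # 2,304,000 pixels
}

clock_ratio = 400 / 600  # reduced core clock vs. stock

for name, pixels in resolutions.items():
    rel = pixels / resolutions["1920x1200"]
    print(f"{name}: {pixels:,} pixels per frame ({rel:.0%} of 1920x1200's load)")

print(f"Reduced core fill rate: roughly {clock_ratio:.0%} of stock")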