kitty Posted December 31, 2010

So I decided I wanted to give my computer a little extra juice to play around with while running games. I run a decent setup with an E6600 Core 2 Duo, 4GB RAM, and an ATI 5770. I also run two monitors, one at 1920x1200 at 60Hz and another with my desktop extended onto it at 1280x1024 at 60Hz.

So here's my problem: the system runs fine on default settings in AMD's CCC Overdrive utility. Manual fan control on or off works fine too. However, adjusting either clock setting, even adjusting it lower, causes display corruption on my second monitor. When the screens are showing still images the display is fine; the corruption occurs when I scroll in a window, like in Firefox, and when I use more graphics power, like in games. I've played Borderlands (max settings) on my main monitor and it runs fine with the higher clock speeds, but the second monitor still has problems. I run HDMI to my main and DVI to my secondary, if that makes any difference.

EDIT: The display corruption during Borderlands happens when I'm loading a new area and not during actual gameplay, even when my fps drops pretty low. My processor is also overclocked to 3.4GHz. It also still does it when clocked lower than default.

EDIT2: Well, I had Borderlands alt-tabbed for about 30 minutes and the display corruption is only on the main monitor now, but still only during the loading screen and during scrolling of windows. Drivers are updated to the latest version as well.

EDIT3: When I connect both monitors using DVI, the display corruption is on my main monitor.
prophetik music Posted January 5, 2011

(quoting kitty's post above)

i've found that ati cards don't really overclock very well. corruption usually is a result of heat, but it's weird that it's happening no matter what. buy nvidia? i don't know what to tell you. i've only overclocked an ATI once - a 4870, back in the day - and it was terribly difficult to get stable. i was only on one monitor, too.
i'd be inclined to point to the secondary monitor as part of the issue - but again, i'm just speculating.
kitty Posted January 5, 2011 Author

Seriously, I'm only going nvidia from now on. I only went ATI this time because it was relatively cheap for the performance. Yeah, and it's weird that it isn't heat, because it runs at 50 degrees Celsius under load.
prophetik music Posted January 5, 2011

unpossible. most graphics cards IDLE at 50C. i'm willing to bet your temp sensors are toast.
kitty Posted January 6, 2011 Author

I used both the CCC and a third-party temperature tool to look at the temps, so it would have to be the temp sensors themselves that are bad, if those are hardware-based. You'll have to excuse my ignorance.
Pyrion Posted January 6, 2011

i've found that ati cards don't really overclock very well.

I figure it would help to adjust the GPU's core voltage in proportion to the overclock being done, except that isn't something CCC lets you do in its GUI itself. You have to create a CCC profile, then edit its XML entry to alter the core voltage (all CCC entries in XML are multiplied by 100, I believe), and then reapply the CCC profile. The profiles live in:

C:\Users\%username%\AppData\Local\ATI\ACE\Profiles

For instance, my 4890 underclock (by 50MHz) to get it to run stable in UE3 games looks like this:

    <Feature name="CoreClockTarget_0">
      <Property name="Want_0" value="80000" />
      <Property name="Want_1" value="80000" />
      <Property name="Want_2" value="80000" />
    </Feature>
    <Feature name="MemoryClockTarget_0">
      <Property name="Want_0" value="97500" />
      <Property name="Want_1" value="97500" />
      <Property name="Want_2" value="97500" />
    </Feature>
    <Feature name="CoreVoltageTarget_0">
      <Property name="Want_0" value="1313" />
      <Property name="Want_1" value="1313" />
      <Property name="Want_2" value="1313" />
    </Feature>

Also, for underclocking I had to keep the core voltage at stock for it to remain stable, which suggests to me that the 4890 is just a stock-overclocked 4870. I tried raising the voltage a bit for running at stock clocks and it still didn't take (worth a shot though if the card can handle running at stock, which mine can't).

And yeah, I'm forcing 3d clocks all the time. Fixes a bug that crops up in some games where the GPU downclocks itself to 2d rates when there's little rendering activity going on.
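If you'd rather not hand-edit the profile every time, the tweak above can be scripted. This is just a minimal sketch, not a real CCC tool: the feature names and the x100 scaling come straight from the snippet above, but the helper function and the in-memory XML fragment are my own for illustration (a real script would load and save the profile file from the Profiles folder instead).

```python
import xml.etree.ElementTree as ET

# Fragment modeled on the CCC profile entries quoted above.
# On a real system the profile lives under
# C:\Users\%username%\AppData\Local\ATI\ACE\Profiles
PROFILE_XML = """
<Profile>
  <Feature name="CoreClockTarget_0">
    <Property name="Want_0" value="80000" />
    <Property name="Want_1" value="80000" />
    <Property name="Want_2" value="80000" />
  </Feature>
  <Feature name="MemoryClockTarget_0">
    <Property name="Want_0" value="97500" />
    <Property name="Want_1" value="97500" />
    <Property name="Want_2" value="97500" />
  </Feature>
</Profile>
"""

def set_target(root, feature_name, mhz, scale=100):
    """Set every Want_* property of a feature; CCC stores MHz * scale."""
    feature = root.find(f".//Feature[@name='{feature_name}']")
    for prop in feature.findall("Property"):
        prop.set("value", str(int(mhz * scale)))

root = ET.fromstring(PROFILE_XML)
set_target(root, "CoreClockTarget_0", 750)    # underclock core to 750 MHz
set_target(root, "MemoryClockTarget_0", 975)  # leave memory at 975 MHz
```

After patching, you'd write the tree back out and reapply the profile in CCC, same as with a hand edit.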
kitty Posted January 6, 2011 Author

Yeah, I was at a loss for how to adjust the core voltage since CCC didn't let me. My brief reading of the interwebs suggested that I needed to adjust it in proportion to the clock speeds, and I was perplexed as to why CCC won't let you do just that. I'll try this out with underclocking to see if it helps.
kitty Posted January 8, 2011 Author

Well, doing the profile thingy to adjust the core voltage didn't have any effect on the voltage that I can tell. I applied a profile with altered core voltage settings and then saved a new profile with those settings; the voltage settings in the new profile were still default. However, I did force the card to run at max clock settings 100% of the time and the corruption ceased. Are there any downsides or things I should know about if I'm running my card at max settings all the time? Or are the dynamic clock settings just a power-saving thing?

EDIT: So I'm running stable at 928MHz GPU and 1298MHz memory with it forced at max. That's a 6% boost over the already overclocked stock settings.
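For what it's worth, the quoted 6% figure checks out if the card's factory-overclocked baseline was around 875 MHz core / 1225 MHz memory; those baseline numbers are my assumption (the post doesn't state them), used here only to show the arithmetic:

```python
def oc_boost_pct(new_mhz, baseline_mhz):
    """Percent increase of a clock over its baseline."""
    return (new_mhz / baseline_mhz - 1.0) * 100.0

# Baselines below are assumed factory-OC clocks, not stated in the post.
core_boost = oc_boost_pct(928, 875)    # ~6.1%
mem_boost = oc_boost_pct(1298, 1225)   # ~6.0%
```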
Pyrion Posted January 12, 2011

I applied a profile with altered core voltage settings and then saved a new profile with those settings. The voltage settings in the new profile were still default.

Why are you saving a new profile? Just use the profile with the altered settings.
kitty Posted January 12, 2011 Author

I saved a new one just to see if the settings were being recognized by the CCC at all. It works perfectly the way you said it does, but I was just wondering whether it's actually getting the altered voltage or not. Apparently, when saving the current settings, it either reverts to the default voltage or doesn't recognize the other settings.