ATI 5770 Overclocking


kitty

So I decided I wanted to give my computer a little extra juice to play around with while running games. I run a decent setup with an E6600 Core 2 Duo, 4 GB of RAM, and an ATI 5770. I also run two monitors: one at 1920x1200 at 60Hz and another, with my desktop extended onto it, at 1280x1024 at 60Hz.

So here's my problem: the system runs fine on default settings in AMD's CCC Overdrive utility. Manual fan control works fine too, whether on or off. However, adjusting either clock setting, even adjusting it lower, causes display corruption on my second monitor. When the screens show still images the display is fine; the corruption occurs when I scroll in a window, like in Firefox, and when I use more graphics power, like in games. I've played Borderlands (max settings) on my main monitor and it runs fine at the higher clock speeds, but the second monitor still has problems. I run HDMI to my main and DVI to my secondary, if that makes any difference.

EDIT: The display corruption during Borderlands happens when I'm loading a new area, not during actual gameplay, even when my fps drops pretty low. My processor is also overclocked, to 3.4 GHz. It also still happens when the card is clocked lower than default.

EDIT2: Well I had Borderlands alt-tabbed for about 30 minutes and the display corruption is only on the main now but still only during the loading screen and during scrolling of windows. Drivers are updated to the latest version as well.

EDIT3: When I connect both monitors using DVI the display corruption is on my main monitor.


I've found that ATI cards don't really overclock very well. Corruption is usually a result of heat, but it's weird that it's happening no matter what.

Buy Nvidia? :< I don't know what to tell you. I've only overclocked an ATI once - a 4870, back in the day - and it was terribly difficult to get stable. I was only on one monitor, too. I'd be inclined to point to the secondary monitor as part of the issue - but again, I'm just speculating.


I've found that ATI cards don't really overclock very well.

I figure it would help to adjust the GPU's core voltage in proportion to the overclock, except that isn't something CCC lets you do in its GUI. You have to create a CCC profile, edit its XML entry to alter the core voltage (all CCC values in the XML are multiplied by 100, I believe), and then reapply the profile. The profiles live in:

C:\Users\%username%\AppData\Local\ATI\ACE\Profiles

For instance, my 4890 underclock (by 50 MHz, to get it to run stable in UE3 games) looks like this:

<Feature name="CoreClockTarget_0">
    <Property name="Want_0" value="80000" />
    <Property name="Want_1" value="80000" />
    <Property name="Want_2" value="80000" />
</Feature>
<Feature name="MemoryClockTarget_0">
    <Property name="Want_0" value="97500" />
    <Property name="Want_1" value="97500" />
    <Property name="Want_2" value="97500" />
</Feature>
<Feature name="CoreVoltageTarget_0">
    <Property name="Want_0" value="1313" />
    <Property name="Want_1" value="1313" />
    <Property name="Want_2" value="1313" />
</Feature>
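The XML surgery can also be scripted. A minimal sketch, assuming a profile shaped like the snippet above; `set_feature` is a hypothetical helper, not part of any ATI tooling, and the clock targets are stored as MHz × 100:

```python
import xml.etree.ElementTree as ET

def set_feature(root, feature_name, raw_value):
    """Set every Want_* property of the named Feature to raw_value."""
    for feature in root.iter("Feature"):
        if feature.get("name") == feature_name:
            for prop in feature.iter("Property"):
                if prop.get("name", "").startswith("Want_"):
                    prop.set("value", str(raw_value))

# Toy profile mirroring the structure CCC writes out.
profile = ET.fromstring("""
<Profile>
  <Feature name="CoreClockTarget_0">
    <Property name="Want_0" value="80000" />
    <Property name="Want_1" value="80000" />
    <Property name="Want_2" value="80000" />
  </Feature>
</Profile>
""")

# Raise the core clock target to 850 MHz (stored as 850 * 100 = 85000).
set_feature(profile, "CoreClockTarget_0", 850 * 100)
print(ET.tostring(profile, encoding="unicode"))
```

You'd then write the modified tree back over the profile file in the Profiles folder above and reapply the profile in CCC.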

Also for underclocking I had to keep the core voltage at stock for it to remain stable, which suggests to me that the 4890 is just a stock-overclocked 4870. I tried raising the voltage a bit for running at stock clocks and it still didn't take (worth a shot though if the card can handle running at stock, which mine can't).

And yeah, I'm forcing 3D clocks all the time. It fixes a bug that crops up in some games where the GPU downclocks itself to 2D rates when there's little rendering activity going on.


Yeah, I was at a loss for how to adjust the core voltage since CCC didn't let me. My brief reading of the interwebs suggested that I needed to adjust it in proportion to the clock speeds, and I was perplexed as to why CCC won't let you do just that.

I'll try this out with underclocking to see if it helps.


Well, doing the profile thing to adjust the core voltage didn't have any effect on the voltage that I can tell. I applied a profile with altered core-voltage settings and then saved a new profile with those settings; the voltage settings in the new profile were still at default.

However, I did force the card to run at max clock settings 100% of the time, and the corruption ceased. Are there any downsides or things I should know about if I'm running my card at max settings all the time? Or are the dynamic clock settings just a power-saving thing?

EDIT: So I'm running stable at 928 MHz GPU and 1298 MHz memory with it forced at max. That's a 6% boost over the already-overclocked stock settings.
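For what it's worth, the quoted ~6% figure checks out if the card's factory clocks were 875 MHz core / 1225 MHz memory; that's an assumption, since the post doesn't state them:

```python
# Hypothetical factory (vendor-overclocked) clocks in MHz -- an assumption,
# since the post doesn't give the card's shipping clocks.
stock_core, stock_mem = 875, 1225
oc_core, oc_mem = 928, 1298

core_gain = (oc_core / stock_core - 1) * 100   # ~6.1%
mem_gain = (oc_mem / stock_mem - 1) * 100      # ~6.0%
print(f"core +{core_gain:.1f}%, memory +{mem_gain:.1f}%")
```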


I saved a new one just to see if the settings are being recognized by CCC at all. It works exactly the way you said, but I was wondering whether it's actually applying the altered voltage or not. Apparently, when saving the current settings, it either reverts to the default voltage or doesn't recognize the other settings.

