Pyrion

Members
  • Posts

    627
  • Joined

  • Last visited

Everything posted by Pyrion

  1. *logs back in*

    *checks member profile, looks at URL*

    *laughs*

    1. Pyrion

      Also, wtf happened to my post count? :(

  2. I wonder if Camtasia itself is trying too hard (or not hard enough) to compress the video, forcing the game's framerate below the recording setting (which is what this is starting to sound like). Either it wastes too much CPU compressing the video (forcing the game to step its framerate down further), or it barely compresses at all and lags the hard drive to the point that, again, the game has to slow down. What compression settings are you using in Camtasia, if any? Cuz FRAPS doesn't have anything along those lines; its codec does just barely enough compression to manage a realtime recording, but spits out 4GB AVI chunks that can contain anywhere from 10-20 minutes of video down to a few minutes at most, depending on what's being recorded.
  3. Yeah, pay the licensing fee, get FRAPS, record at your leisure. Although it'll still limit your framerate, at 60fps it's not really all that noticeable. Even 30fps is fine for most games (although not all...). Thing to bear in mind is that two of the key factors for recording video at high framerates are the resolution you're recording at and the speed of your hard drive. Not necessarily capacity (although that helps too); both the sequential write speed of your hard drive and the spindle rotation speed are huge factors. All of my AudioSurf recordings are recorded at 60fps (previously 960x600 (1920x1200 half-res), and now 1280x720) on a 150GB WD Raptor, preferred over a 1TB WD Caviar Black, for precisely these reasons. EDIT: What also helps is to record to a drive that doesn't also contain the game you're playing (as loading game resources will interrupt the recording) or the Windows pagefile (as any activity there will also interrupt the recording).
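     To put rough numbers on why drive speed matters, here's a back-of-the-envelope bandwidth estimate (plain Python; the 24-bit-per-pixel figure is my own assumption for illustration, not anything FRAPS documents):

```python
# Rough uncompressed capture bandwidth: width * height * 3 bytes (24bpp) * fps
def capture_mb_per_s(width, height, fps, bytes_per_pixel=3):
    """Uncompressed video write load in MB/s."""
    return width * height * bytes_per_pixel * fps / 1e6

full = capture_mb_per_s(1920, 1200, 60)  # full-res 1920x1200 @ 60fps
half = capture_mb_per_s(960, 600, 60)    # half-res: a quarter of the pixels
print(round(full), round(half))          # 415 104 (MB/s, before any compression)
```

     Even with FRAPS' light compression on top of that, halving the resolution in each dimension cuts the write load to a quarter, which is why recording at 960x600 instead of 1920x1200 is so much easier on the drive.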
  4. It just irritates the hell out of me how panicky people get whenever the terms "nuclear" or "radiation" show up in a news article; it almost instantly elicits a comparison to Hiroshima or Chernobyl, often with the obligatory mushroom cloud photo. The news media is almost certainly unreliable here, because inciting panic is a great way to boost ratings, little things like facts be damned. Yes, I know, people are stupid (the Pearl Harbor/global warming pics proved that easily enough), but there's no excuse for it anymore.
  5. In terms of actual radiation release, this so far doesn't come anywhere close to being comparable to Chernobyl. We're talking the difference between 300 Sv/hour and 0.4 Sv/hour (400 mSv, at its highest on Tuesday). (EDIT: And this is assuming they actually mean millisieverts and not microsieverts - the comparison gets even more absurd if they mean microsieverts.) Best part is that the inverse square law applies, so actual exposure drops off dramatically the farther away you get from the source. Again, for this to be "worse than Chernobyl," the same sort of circumstances and (lack of) precautionary measures from Chernobyl would have to apply as well.
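     The inverse-square falloff is easy to sketch (plain Python; the dose rates are just the figures quoted above, and the 1-meter reference distance is an arbitrary assumption for illustration):

```python
def dose_rate_at(rate_at_ref, ref_dist_m, dist_m):
    """Inverse square law: dose rate scales with (ref_dist / dist)**2."""
    return rate_at_ref * (ref_dist_m / dist_m) ** 2

# 0.4 Sv/h measured at a hypothetical 1 m reference point:
print(dose_rate_at(0.4, 1, 10))  # ~0.004 Sv/h at 10 m -- a hundredfold drop
print(300 / 0.4)                 # 750.0 -- the peak-rate ratio quoted above
```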
  6. Meltdowns don't create fallout. The ultimate consequence of a total meltdown is the molten radioactive slag hitting the underground water table. Chernobyl isn't an exception to the rule either - Chernobyl was in fact a partial meltdown (as was TMI), but its graphite core was exposed to the outside air and ignited. Something similar would have to happen here for the above map to have some semblance of relevance.
  7. Just get another wireless router (set to router mode rather than gateway mode), put it in the switch's place, and use the same SSID but a different channel (say, if the main router is using channel 6, use channel 1 or 11 for the new router), just to rule out the two interfering with each other if you put a device between them.
  8. Also, this makes for a good read on the subject of Steam and indie developers.
  9. And once you recoup the production costs, everything past that point, regardless of Valve's cut, is pure profit, and you stand to make more of it per sale than you would if each sale is divided up amongst multiple middlemen. Plus you don't have to administer the distribution system, Valve (and their many hosting partners) does that for you as part of the deal.
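     As a toy illustration of that break-even arithmetic (plain Python; the 25% distribution cut and the dollar figures are hypothetical examples, not actual Steam terms):

```python
def net_profit(units_sold, price, dist_cut, production_cost):
    """Developer take-home: per-sale revenue after the distributor's cut,
    minus the fixed production cost. Everything past break-even is pure profit."""
    return units_sold * price * (1 - dist_cut) - production_cost

# Hypothetical: $20 game, 25% distributor cut, $100k production cost
print(net_profit(10_000, 20.0, 0.25, 100_000))  # 50000.0
```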
  10. There's nothing stopping you from doing it, except of course that you likely will gain nowhere near the level of product exposure that you would via signing with a well-known and established online distribution system like Steam. You'd have to do all the advertising yourself, basically.
  11. Part of the problem is that in order for the digital distribution-only model to work, you either have to create the system itself or sign with an online publisher (Valve/Steam, Stardock/Impulse, IGN/D2D, etc.) and tie your game to that particular service almost exclusively. It's more an antitrust concern than anything else, particularly in the case of Valve's "SteamWorks" API, wherein all the functionality for handling things like DLC, game updates, achievements, etc. is provided to game developers for the low price of absolutely free, but with the caveat that using it means the game absolutely has to use Steam. So you end up with brick-and-mortar stores and Steam's direct competitors having to sell licenses and downloads for a game that still requires their competitor's service to install and use. The obvious implication is that for each license sold in those alternative channels, the developer loses money equivalent to the difference they'd make if they just signed with Steam exclusively.
  12. "Lost profits." I hate that phrase. One game pirated is one sale that likely will never happen so long as the game costs money in the first place, so why bother trying to appease the pirates by dropping the price? Fuck them. Set the price where you want it, drop the copy protection schemes, stop fucking over your paying customers, and ignore the pirates cuz they aren't gonna give you their money anyway. http://arstechnica.com/gaming/news/2008/03/pc-game-developer-has-radical-message-ignore-the-pirates.ars
  13. There's actually a semi-hidden (at least for anyone who plays through this game "normally") Really Bad ending for someone who plays a psychopath out to kill everyone. That, incidentally, was my first playthrough and I haven't picked up the game since.
  14. Then it's an issue of semantics. Your base statement was that ME2 "had an entirely different engine," which it didn't, cuz in context, "engine" without any additional qualifiers means "game engine." ME2 is a UE3 game, same as ME1. If your point is that Bioware changed the nature of combat, sure they did that, so what? If they wanted to, they could've overhauled the nature of combat in the game, left everything else alone, and called it a "patch." The sheer amount of new content produced on top of what they had in ME1 is why it was released as a new game. It doesn't make any business sense to take that much work and make it a download-only release for a product already released and (by ME2's release date) several years old. It would be as dumb as Microsoft taking all the changes that went into Windows 7 and packaging them as a service pack for Windows Vista.
  15. "Gameplay engine?" What the hell is a "gameplay engine?" It's UE3 regardless. I'm starting to get the impression that "gameplay engine" is a term you made up. Regardless, changing the way certain classes are played doesn't mean they'd absolutely have to release a new game just for that. Hell, Valve changed the very nature of the Pyro in TF2, yet they didn't have to duplicate content and release a whole new game. ME2 was released as a separate game simply because the sheer amount of new content meant that if they were to sell it as DLC, it'd be a multi-gigabyte download costing the equivalent of $50 to recoup their costs per unit sale, and at that point they might as well just make new box art for it and call it a new game. DLC isn't supposed to add such a significant amount of new content over the base game in the first place.
  16. Mass Effect 1 and Mass Effect 2 were both Unreal Engine 3 games. The sheer amount of new content developed for Mass Effect 2, combined with the fact that the series was always meant to be a trilogy, is why ME2 was released as a sequel game on its own merits rather than as DLC.
  17. Pendulum - The Island (both parts, gapless, 9:30) Also Pixelectronic's 8bit version of it, which is hilarious.
  18. Why are you saving a new profile? Just use the profile with the altered settings.
  19. I figure it would help to adjust the GPU's core voltage proportional to the overclock being done, except that isn't something CCC lets you do in its GUI itself; you have to create a CCC profile, edit its XML entry to alter the core voltage (all CCC entries in XML are multiplied by 100, I believe), and then reapply the CCC profile.

     C:\Users\%username%\AppData\Local\ATI\ACE\Profiles

     Like for instance my 4890 underclock (by 50MHz) to get it to run stable in UE3 games looks like this:

     <Feature name="CoreClockTarget_0">
       <Property name="Want_0" value="80000" />
       <Property name="Want_1" value="80000" />
       <Property name="Want_2" value="80000" />
     </Feature>
     <Feature name="MemoryClockTarget_0">
       <Property name="Want_0" value="97500" />
       <Property name="Want_1" value="97500" />
       <Property name="Want_2" value="97500" />
     </Feature>
     <Feature name="CoreVoltageTarget_0">
       <Property name="Want_0" value="1313" />
       <Property name="Want_1" value="1313" />
       <Property name="Want_2" value="1313" />
     </Feature>

     Also for underclocking I had to keep the core voltage at stock for it to remain stable, which suggests to me that the 4890 is just a stock-overclocked 4870. I tried raising the voltage a bit for running at stock clocks and it still didn't take (worth a shot though if the card can handle running at stock, which mine can't). And yeah, I'm forcing 3D clocks all the time. It fixes a bug that crops up in some games where the GPU downclocks itself to 2D rates when there's little rendering activity going on.
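     If you're changing several Want_* values at once, scripting the edit beats doing it by hand. A minimal sketch using Python's standard xml.etree on a fragment shaped like the one above (the x100 clock scaling is the post's own observation; applying it programmatically like this is my assumption, not a documented CCC workflow):

```python
import xml.etree.ElementTree as ET

# A fragment shaped like the CCC profile entries above (clocks stored x100)
FRAGMENT = """<Feature name="CoreClockTarget_0">
  <Property name="Want_0" value="80000" />
  <Property name="Want_1" value="80000" />
  <Property name="Want_2" value="80000" />
</Feature>"""

def set_clock_targets(feature_xml, new_mhz):
    """Rewrite every Want_* property to new_mhz * 100."""
    root = ET.fromstring(feature_xml)
    for prop in root.findall("Property"):
        prop.set("value", str(new_mhz * 100))
    return ET.tostring(root, encoding="unicode")

updated = set_clock_targets(FRAGMENT, 750)  # e.g. underclock the core to 750 MHz
```

     You'd still save the result back into the profile file under the Profiles folder above and reapply it in CCC.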
  20. You do realize that you are effectively downloading them to disk just to view them inline anyways?
  21. Bit late, but better late than never. What's your downstream SNR/power level and upstream power level? Downstream SNR should be anywhere between 25dB and 55dB, downstream power level should be between -12dBmV and 12dBmV (closer to 0 is best), and upstream power level follows the same rule as downstream SNR (between 25dBmV and 55dBmV). Anything outside these margins and the modem will tend to continuously reboot itself (as the signal is either "too loud" or "not loud enough"). This doesn't necessarily mean the modem is defective; chances are it's the line itself that's defective, and usually this is fixed by running a clean split off the main line direct to the modem with new cable. I'm running an SB5101 on Cox as well.
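     Those margins boil down to a quick check (plain Python; the numeric limits are the rule-of-thumb ranges above, not official DOCSIS spec values):

```python
def signal_ok(down_snr_db, down_power_dbmv, up_power_dbmv):
    """Check cable modem signal levels against the rough margins above."""
    return (25 <= down_snr_db <= 55            # downstream SNR, dB
            and -12 <= down_power_dbmv <= 12   # downstream power, dBmV (0 ideal)
            and 25 <= up_power_dbmv <= 55)     # upstream power, dBmV

print(signal_ok(36, -3, 44))   # True: healthy line
print(signal_ok(20, -15, 58))  # False: modem will likely keep rebooting
```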
  22. And here's my take on it: 1.14MB, same settings as the last one. I'm using a really old version of JASC Animation Shop (before they were bought out by Corel) in case you're wondering. AS 3.04. Still does the job just fine. 63-color Octree palette, color substitution by dithering, duplicate frames removed, 50ms delay between frames (after the first one, which is 2.5sec, since leaving it at 50ms makes it damn near impossible to read).
  23. Just threw this together: EDIT #2, fixed the dithering, slightly larger filesize though still within your acceptable margins: