Massive Assault Official Forum
   
 Post subject: The Missing Options (GFX)
PostPosted: Mon Mar 08, 2004 6:53 am 
Sea Wolf

Joined: Sat Feb 21, 2004 1:26 pm
Posts: 821
Karma: 0
I love this game, but there are a few options missing... now to be fair, these options, which should be in EVERY 3D game, are missing from almost every 3D game (figures, huh?). And for a small developer it's understandable, but I still think they should be application-selectable.

What am I talking about? The following:

Triple Buffering: The problem with V-Sync is that when your FPS falls short of the monitor's refresh rate (e.g. refresh rate 85, but FPS 75), the FPS of your game is instantly halved, because the card has to wait for the next refresh before it can show a finished frame. Considering what a drain the trees appear to be on pans, I'd say this is a lot. Yet, in my Massive Assault experience, you cannot play without V-Sync; the tearing is just too noticeable. Triple Buffering is the answer. For the cost of a little extra video memory, you can ensure that your frame rate never halves. More and more games support V-Sync, but without Triple Buffering it is often worthless (some games, like UT2003, only support it via the INI file).
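
(For illustration only: a minimal sketch of what an application-level triple buffering + V-Sync switch might look like, assuming a Direct3D 9 renderer. I have no idea what API Massive Assault actually uses, and the function below is hypothetical.)

Code:
// Hedged sketch: requesting triple buffering plus V-Sync in Direct3D 9.
// With two back buffers the card can keep rendering while a finished frame
// waits for the vertical blank, so a missed refresh no longer halves the FPS.
#include <d3d9.h>

void SetupPresentParams(D3DPRESENT_PARAMETERS& pp, HWND hWnd)
{
    ZeroMemory(&pp, sizeof(pp));
    pp.Windowed             = FALSE;
    pp.BackBufferWidth      = 1024;
    pp.BackBufferHeight     = 768;
    pp.BackBufferFormat     = D3DFMT_X8R8G8B8;
    pp.BackBufferCount      = 2;                       // 2 back buffers + front buffer = triple buffering
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow        = hWnd;
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE; // V-Sync on, so no tearing
}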

Anti-Aliasing: One of the two biggest features of graphics cards these days. In the past, cards have never been able to run this without making games unplayable, but that has now changed, especially in the case of ATI cards. Problem is, you have to force it, globally, via the GFX card's control panel. This is clunky: it means everything has AA applied to it, even things that shouldn't really need it. It means you can't change the level on a per-application basis as your frame rate demands... all in all, it should be an application setting, yet the number of games which offer it as such is a joke! This should have happened five years ago, but it still hasn't.

In my experience, AA doesn't make a big difference to Massive Assault... probably won't matter anywhere but on the base of cities (AA smoothes out jaggies... all those straight lines that suffer from the step effect, AA makes them look like straight lines... once you've used AA, you'll never go back, trust me).

After that waffle, let me simply say I think AA support should be offered; it should be in every app and MA shouldn't be one of those "lazy" apps. It is, however, the least important option to my mind. Hell, one of the reasons to support it is so that people can disable it and save valuable FPS. The tricky element here is that nVidia and ATI offer rather different settings, but it's hardly a huge number of options, it's just a value to pass to the card.
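
(Again just an illustration, assuming Direct3D 9: exposing AA in-game is largely a matter of asking the card which multisample levels it supports and passing the player's choice at device creation, instead of relying on a global control-panel override. The function name is made up for the example.)

Code:
// Hedged sketch: enumerate the AA (multisample) levels this card/format supports
// so the options menu can list them, whatever nVidia or ATI happen to offer.
#include <d3d9.h>
#include <vector>

std::vector<D3DMULTISAMPLE_TYPE> EnumerateAALevels(IDirect3D9* d3d, D3DFORMAT backBufferFmt)
{
    std::vector<D3DMULTISAMPLE_TYPE> supported;
    for (int samples = 2; samples <= 16; ++samples)
    {
        D3DMULTISAMPLE_TYPE type = static_cast<D3DMULTISAMPLE_TYPE>(samples);
        DWORD quality = 0;
        if (SUCCEEDED(d3d->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                                      backBufferFmt, FALSE /* fullscreen */,
                                                      type, &quality)))
        {
            supported.push_back(type);   // e.g. 2x, 4x, 6x... whatever this card exposes
        }
    }
    return supported;
}

// Later, when filling D3DPRESENT_PARAMETERS:
//   pp.MultiSampleType = chosenLevel;  // the value the player picked in the game's own options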

Lastly, Anisotropic Filtering. Back in the day we had texture aliasing: when you moved forward down a tunnel you'd see the textures literally crawl down the walls towards you. Next came mip mapping, a process which divides the distance into various stages, the first being the most detailed, with increasing levels of dithering and blur used on each one after... when you play a flight sim with only mip mapping (bilinear and trilinear) you'll find a detailed patch underneath you, becoming a blur very quickly the further towards the horizon you look.

Anisotropic Filtering is a process which allows for detailed textures over a much larger radius and reduces the mipmapping effect, while also eliminating texture crawl. In a game like Massive Assault with a sky viewpoint, this is a very useful effect as it stops all the terrain on the far side of the island looking like a green smudge.

Again, this is another feature which should be handled at the application level. If done through a GFX control panel, AF is applied to everything, even things which don't need it, not to mention that this then becomes the AF level for EVERY application, regardless of its needs. You should be able to choose whether to use AF and the level of AF (the choices currently are 2, 4, 6, 8 and 16... 16 is only on ATI Radeon 9xxx cards). You should also be able to choose between Bilinear and Trilinear filtering (with Bilinear you can see where each mipmap ends and the next begins, while with Trilinear the levels fade into each other so you can't see the transition when scrolling around).
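
(One more hedged sketch, still assuming Direct3D 9 and a hypothetical helper function: letting the game, not the driver panel, choose the AF level and bilinear vs. trilinear mip filtering, clamped to what the card reports it can do.)

Code:
// Hedged sketch: apply the player's anisotropy and mip-filter choices per texture stage.
#include <d3d9.h>
#include <algorithm>

void ApplyTextureFiltering(IDirect3DDevice9* dev, DWORD requestedAniso, bool trilinear)
{
    D3DCAPS9 caps;
    dev->GetDeviceCaps(&caps);
    DWORD aniso = std::min(requestedAniso, caps.MaxAnisotropy);  // clamp: ask for 16x, get what the card allows

    for (DWORD stage = 0; stage < 8; ++stage)   // first 8 texture stages
    {
        if (aniso > 1)
        {
            dev->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
            dev->SetSamplerState(stage, D3DSAMP_MAXANISOTROPY, aniso);
        }
        else
        {
            dev->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        }
        dev->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        // Trilinear blends between mip levels; bilinear (point mip filter) shows a hard step at each boundary.
        dev->SetSamplerState(stage, D3DSAMP_MIPFILTER, trilinear ? D3DTEXF_LINEAR : D3DTEXF_POINT);
    }
}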

That's it: Triple Buffering, Anti-Aliasing and Anisotropic Filtering. All three should be configurable within almost every game on the planet, but aren't. In about five years, when other options have taken on their role, no doubt they will be :).

I'd like to see Massive Assault take the lead in this respect and allow me to configure my games as I like them, rather than forcing clunky global settings to be applied in a way they were not intended, and in a way which is less efficient for frames per second.


 Post subject:
PostPosted: Mon Mar 08, 2004 12:56 pm 
Veteran

Joined: Fri Feb 27, 2004 2:54 pm
Posts: 85
Karma: 0
"One of the two biggest features of graphics cards these days. In the past, cards have never been able to run this without making games unplayable, but that has now changed, especially in the case of ATI cards"

You can force it in the ATI control panel, too, with any Radeon 9000 series card. My roommate has a 9700 and has the option to force all levels of AA and all levels of filtering, just like I do on my GeForce FX.

Me.


 Post subject:
PostPosted: Mon Mar 08, 2004 3:18 pm 
Sea Wolf

Joined: Sat Feb 21, 2004 1:26 pm
Posts: 821
Karma: 0
That's my point, forcing is NOT a good solution. As ATI driver developers have said, the only reason that control panel even exists is because game developers are not picking up their end. The game, and not the card, should be telling the card what to do. When I have a one-size-fits-all solution, how well do you think that's working?

It's a nonsense and it shouldn't be done that way.


 Post subject: Re: The Missing Options (GFX)
PostPosted: Mon Mar 08, 2004 3:56 pm 
Sea Wolf

Joined: Sat Dec 13, 2003 3:06 am
Posts: 1338
Karma: 1

Location: USA
Quitch wrote:
That's it, Triple Buffering, Anti-Aliasing and Antistropic Filtering. All three should be configurable within almost every game on the planet, but aren't.


Just give a game player these features once and they're spoiled, can't live without them again ;) j/k

_________________
Founder of The New World Order, and moderator for the Andromeda Clan War.

NWO website:
http://www.freewebs.com/massiveassault-nwo/index.htm

Clan War website:
http://www.massiveassault.com/clans/nwo/ClanWar


 Post subject:
PostPosted: Mon Mar 08, 2004 4:53 pm 
Veteran

Joined: Fri Feb 27, 2004 2:54 pm
Posts: 85
Karma: 0
"That's my point, forcing is NOT a good solution, as ATI driver developers have said, the the only reason that control panel even exists is because game developers are not picking up their end. The game, and not the card, should be telling the card what to do. When I have a one size fits all solution, how well do you think that's working? "

Well, that's ATI's developers shifting blame to another source, and it's a crock.

The fact of the matter is that ATI's war with nVidia has caused the graphics cards to advance so quickly that game developers CAN'T keep up, even if they wanted to. They don't add AA into a game because they shouldn't need to. You can force AA to max ALL THE TIME and it will rarely dog any game on your card, period. Simply because game developers have to put a ton more work into a game engine to make it take advantage of all the nifty things the cards can do than the chipset manufacturers need to put in to make the cards go faster.

Look at CPUs for computers. Even gamers don't need a 2.5GHz CPU. Nothing can support or keep up with that kind of speed. It happened because of AMD's war with Intel, not because the software needs it.

To take a few examples in the game world...

Look how long Half-Life2 took to develop. Or Doom3 (is that out yet?). Or FarCry. Or Splinter Cell. Or Raven Shield. Or (and this isn't a joke) Duke Nukem Forever.

These are the few games that really test what a video card can do. It takes a developer LONGER to develop those games than it takes ATI/nVidia to come out with their "next generation" GPU. Then ATI and nVidia point the blame at game developers for not keeping up with them, when they have a new GPU with more featuresets (that you can force the card to use) and more speed than a lot of people's processors, coming out on a yearly, sometimes semi-yearly, basis.

Your graphics card isn't going to be bothered by most games, especially if you have an FX. Just force everything to max and go with it. ;)

That's what I did, and on an AMD 2000+ (which is pretty outdated by way of processors. . .), there's not a single game that I have that dips below 30fps, except for SWG. And that is mostly a Ping issue.

Me.


 Post subject:
PostPosted: Mon Mar 08, 2004 4:54 pm 
Veteran

Joined: Fri Feb 27, 2004 2:54 pm
Posts: 85
Karma: 0
Something to add:

In the time that HL2 and Doom3 (for instance) have been in development, we have seen 4 generations of graphics accelerators from ATI/nVidia. ;)

Me.


 Post subject:
PostPosted: Mon Mar 08, 2004 7:09 pm 
Sea Wolf

Joined: Sat Feb 21, 2004 1:26 pm
Posts: 821
Karma: 0
Asmodeous wrote:
"That's my point, forcing is NOT a good solution, as ATI driver developers have said, the the only reason that control panel even exists is because game developers are not picking up their end. The game, and not the card, should be telling the card what to do. When I have a one size fits all solution, how well do you think that's working? "

Well, that's ATI's developers shifting blame to another source, and it's a crock.

The fact of the matter is that ATI's war with nVidia has caused the graphics cards to advance so quickly that game developers CAN'T keep up, even if they wanted to. They don't add AA into a game because they shouldn't need to. You can force AA to max ALL THE TIME and it will rarely dog any game on your card, period. Simply because game developers have to put a ton more work into a game engine to make it take advantage of all the nifty things the cards can do than the chipset manufacturers need to put in to make the cards go faster.


No, it's not. A graphics card is there for the application to do 3D graphics; it's there to do what the APPLICATION tells it to do. Global settings are nothing but a stopgap for what the application should be doing itself. Are you telling me that games like Unreal Tournament 2003 didn't know about AA, despite AA being ancient before UT2003 was a glimmer in the developer's eye?

It's a nonsense. Developers are trying to stick with the status quo because it works and lessens the support they need to offer. Global AA support is absolute crap. AA can have such a huge effect on FPS that while you might get away with 6x in one game, you'll need 2x in another.

It's like telling me that a global resolution is "just fine".

Quote:
Look at CPUs for computers. Even gamers don't need a 2.5GHz CPU. Nothing can support or keep up with that kind of speed. It happened because of AMD's war with Intel, not because the software needs it.


Actually, many games already push CPUs far beyond what they can handle. Unreal Tournament 2003, for example, at a resolution of 1024x768, is bottlenecked by the CPU, not the GPU. Half-Life 2 is another example where the CPU, not the GPU, is the bottleneck.

Games push CPUs a lot harder than people realise. The new-found love of both AI and physics is taking its toll, and some people have even toyed with the idea of passing some of this work to the GPU, such is the load on the CPU.

Quote:
To take a few examples in the game world...

Look how long Half-Life2 took to develop. Or Doom3 (is that out yet?). Or FarCry. Or Splinter Cell. Or Raven Shield. Or (and this isn't a joke) Duke Nukem Forever.


Yet, amazingly, they still manage to come out supporting new techniques like Bump Mapping. Honestly, this is a feeble argument. AA and AF are two very old techniques and the lack of support is a travesty.

Quote:
These are the few games that really test what a video card can do.


Nonsense, there are a ton of them. Games like (and I use my favourite example) UT2003 easily push the graphics card WHEN combined with AA and AF, and if you're not using AA and AF then you don't need to buy a high-end graphics card anymore.

Imagine if you hadn't been able to select resolution. You'd have been appalled, wouldn't you? Well, AA and AF are the new settings to play with. Resolution isn't the most important setting in this day and age, yet it is supported and those two aren't. Ridiculous beyond belief.

Quote:
It takes a developer LONGER to develop those games than it takes ATI/nVidia to come out with their "next generation" GPU. Then ATI and nVidia point the blame at game developers for not keeping up with them, when they have a new GPU with more featuresets (that you can force the card to use) and more speed than a lot of people's processors, coming out on a yearly, sometimes semi-yearly, basis.


Again, a nonsense. Developers know exactly what's coming up and have access to samples long before we do. But what has this got to do with anything? Do you know how long AA and AF have been in existence? No one could claim with a serious face that AA and AF have just sprung on them and they didn't have time to take them into account.

Quote:
Your graphics card isn't going to be bothered by most games, especially if you have an FX. Just force everything to max and go with it. ;)


An FX is going to be challenged by any game using 2.0 Pixel Shaders (every DX 9 game then) since its support of them is dire (and hence why the high-end card of choice right now is an ATI card).

An FX can't run above 4x AF reliably in most games.

An FX struggles to give decent FPS with AA above 2x.

You're stuck in the world of resolution, and do you know why? It's because applications have always offered a resolution switch above everything. The world has moved on, applications haven't. It's a sad state of affairs, and worse still that anyone could think that developers are doing a great job in this regard and that global settings are a good thing. A GOOD THING for crying out loud!

Quote:
That's what I did, and on an AMD 2000+ (which is pretty outdated by way of processors. . .), there's not a single game that I have that dips below 30fps, except for SWG. And that is mostly a Ping issue.

Me.


Anyone serious about FPS games wants a frame rate of 60, but you're rather missing the point. What's that GFX card doing? Giving you frame rates? Turn it down to 800x600 and get it over with. No, graphics cards are about getting you to an acceptable rate with a high quality image.

1. Just as a global resolution would be highly stupid, so is global AA and AF.

2. AA and AF have been around forever; no developer can use the excuse that they're new, or outside the development cycle. If that pile of nonsense were true, and GFX cards were really that far ahead, we'd be able to buy two-generation-old cards and run games fine. You can't, and games manage to incorporate new features all the time... just look at the latest MA patch for goodness' sake!


 Post subject:
PostPosted: Mon Mar 08, 2004 9:21 pm 
Veteran

Joined: Fri Feb 27, 2004 2:54 pm
Posts: 85
Karma: 0
"Actually, many games already push CPUs far beyond what they can handle, Unreal Tournament 2003 for example, at a resolution of 1024x768, is bottlenecked by the CPU, not the GPU. Half-Life 2 is an example of where the GPU is not the bottleneck, rather the CPU is. "

Then that's bad optimization on their part, to be honest. At least for UT2k3. In HL2, it makes sense, because the logic patterns they use for the AI are INCREDIBLY processor intensive; it's trying to act as human-like as possible.

The fact that UT2k3 has CPU-based issues at high resolutions leads one to believe not enough work is being done by the GPU. The GPU in my FX5700 is running two procs at 450MHz to do NOTHING but resolution, framerate, and graphics. Please abuse it. So far it is completely untaxed, whereas my CPU is getting overtaxed doing sub-processing that the video card should be doing with all the spare cycles it has lying around. :)

"Yet, amazingly, they still manage to come out supporting new techniques like Bump Mapping. Honestly, this is a feeble argument. AA and AF are two very old techniques and the lack of support is a travesty."

Bump Mapping isn't "new". Bump mapping has been around since Glide, but no one used it until OpenGL and Glide 2.0 (which came about with Pixel Shader 2.0 and 1.1) because the optimization in the cards was so horrible that the framerate hit was catastrophic.

"Nonsense, there are a ton of them. Games like (and I use my favourite example) UT2003 easily push the graphics card WHEN combined with AA and AF, and if you're not using AA and AF then you don't need to buy a high-end graphics card anymore."

Pah. My system, BEFORE I got my FX, when I was using a GF3, ran UT2k3 with all the goodies on at 1024x768 at 35 fps stable, occasional dips to 30, and occasional rises as high as 45.

"Imagine if you hadn't been able to select resolution. You've have been appalled wouldn't you?"

Not if it was at least 1024x768 or 1280x1024. :)

"Again, a nonsense. Developers know exactly what's coming up and have access to samples long before we do. But what has this got to do with anything? Do you know how long AA and AF have been in existence? No one could claim with a serious face that AA and AF have just sprung on them and they didn't have time to take them into account. "

Again, you can globally force those settings on. My GPU has 8x AA on all the time, if something is causing it to drag, I'll dip it to Quincunx, which looks nearly as good with less work.

"An FX is going to be challenged by any game using 2.0 Pixel Shaders (every DX 9 game then) since its support of them is dire (and hence why the high-end card of choice right now is an ATI card)."

I haven't seen that to be a problem, and I run a decent amount of DX9 supported games. Yeah, my roomie gets about 10-15fps faster than I do, but so what? The human eye can't perceive the difference, for most people. If I don't see the screen hitch, I don't care.

"You're stuck in the world of resolution, and do you know why? It's because applications have always offered a resolution switch above everyting."

You ASSume much, young Padawan.

"Anyone serious about FPS games wants a frame rate of 60, but you're rather missing the point. What's that GFX card doing? Giving you frame rates? Turn it down to 800x600 and get it over with. No, graphics cards are about getting you to an acceptable rate with a high quality image."

Incorrect.

The only FPS games where framerate makes a difference are Doom-style games, where your ability to hit someone before they hit you is directly dependent upon the framerate of the game. Game developers are already realizing how absolutely stupid this concept is, and fewer and fewer games are taking framerate into account for such things as time passes. The Doom3 engine does not rely on the framerate to judge such situations like the Doom2 engine did, as id Software learned from their mistake.
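
(A generic illustration, not anything from id's actual code: "not relying on the framerate" usually means scaling game logic by the elapsed time each frame, so the simulation behaves the same at 35fps as at 120fps. The names below are made up.)

Code:
// Hedged sketch of frame-rate-independent movement using a per-frame delta time.
#include <chrono>

struct Player { float x = 0.0f; float speed = 5.0f; };  // speed in units per second

void GameLoop(Player& p, bool& running)
{
    using clock = std::chrono::steady_clock;
    auto last = clock::now();
    while (running)
    {
        auto now = clock::now();
        float dt = std::chrono::duration<float>(now - last).count();  // seconds since last frame
        last = now;

        p.x += p.speed * dt;   // same distance per real second, whatever the framerate
        // Render(p); ...
    }
}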

The human eye, on average, perceives the world at approximately 35 frames a second. Granted, some people will be able to see as high as 40, some as low as 25, but the mean is 35fps. If the framerate of a video game is higher than your eye is able to perceive, your eye drops the frames in the interim, ergo the image looks smooth and fluid, like reality. If the framerate drops below the level that your eye runs at, you get that hitching that people hate, from the video card dropping frames that your eye should perceive.

I turn all the detail on in every game I have. That's WHY I bought a GeForceFX. I want everything to be on at at LEAST 1024x768 (I rarely go above 1280x1024, because it really is pointless, and jumping the AA from 2x to 4, 6, or 8x has a better effect on the image quality at a lesser framerate hit), and I want a framerate high enough that I can't see frames missing.

My target number? It's around 30/35. If it hits 30, I can see it. It's blatantly obvious to me that frames are missing. If it hits 35ish, I can't notice a difference no matter how many more frames you throw in there, unless the screen "tears" because of no V-Sync.

I also think that V-Sync is the spawn of satan, and games shouldn't have to rely on that to not destroy the image quality.

Having 60+ fps doesn't do me a bit of good, much less having the framerate match my monitor's refresh, which is about 120Hz at 1024x768, and 100Hz at 1280x1024. I don't see a difference in the image quality of a higher framerate. All I see is a number that doesn't look any different to me, quality wise, than a number just about half as big.

I'd rather have all detail on and have a heavily realistic image at 35fps than have a crappy image with everything maxed at 120fps. It just strikes me as more useful.

You're making a lot of assumptions about me, man. ;)

"If that pile of nonsense were true, and GFX cards were really that far ahead, we'd be able to buy two generation old cards and run games fine. You can't,"

Since when? My GeForce3 (two generations behind) ran UT2k3 with everything on at 1024x768 at 35fps. Seems to me like you can run them fine.

Me.


 Post subject:
PostPosted: Tue Mar 09, 2004 5:12 am 
Sea Wolf

Joined: Sat Feb 21, 2004 1:26 pm
Posts: 821
Karma: 0
Quote:
Then that's bad optimization on their part, to be honest. At least for UT2k3. In HL2, it makes sense, because the logic patterns they use for the AI are INCREDIBLY processor intensive; it's trying to act as human-like as possible.


Please, explain to me HOW this is bad optimisation, and not factors like physics and shadows. Please, entertain me. Better yet, explain to me how you know what the AI in an unreleased game is doing.

Quote:
The fact that UT2k3 has CPU-based issues at high resolutions leads one to believe not enough work is being done by the GPU. The GPU in my FX5700 is running two procs at 450MHz to do NOTHING but resolution, framerate, and graphics. Please abuse it. So far it is completely untaxed, whereas my CPU is getting overtaxed doing sub-processing that the video card should be doing with all the spare cycles it has lying around.


Put your resolution to 1600x1200 and try maxxing AA and AF. You will cripple your frame rate. If the GPU isn't doing enough, that's your fault.

Quote:
Bump Mapping isn't "new". Bump mapping has been around since Glide, but no one used it until OpenGL and Glide 2.0 (which came about with Pixel Shader 2.0 and 1.1) because the optimization in the cards was so horrible that the framerate hit was catastrophic.


Pixel Shader 2.0 didn't exist until DX9, and that IS new, graphics wise. But that's my point, Bump Mapping ISN'T new, but it still gets supported, unlike AA and AF.

Quote:
Pah. My system, BEFORE I got my FX, when I was using a GF3, ran UT2k3 with all the goodies on at 1024x768 at 35 fps stable, occasional dips to 30, and occasional rises as high as 45.


Well, 30fps is not a good frame rate for an FPS game. Myself, I can't play below 60.

Quote:
"Imagine if you hadn't been able to select resolution. You've have been appalled wouldn't you?"

Not if it was at least 1024x768 or 1280x1024.


So you'd like your desktop resolution to be your game resolution? I doubt that, but we'll pass on it for now.

Quote:
Again, you can globally force those settings on. My GPU has 8x AA on all the time, if something is causing it to drag, I'll dip it to Quincunx, which looks nearly as good with less work.


Considering the tricks that nVidia has pulled over the last year, what you see in your control panel doesn't tend to be what you see on screen. Not to mention that nVidia AA is *terrible*.

Quote:
I haven't seen that to be a problem, and I run a decent amount of DX9 supported games. Yeah, my roomie gets about 10-15fps faster than I do, but so what? The human eye can't perceive the difference, for most people. If I don't see the screen hitch, I don't care.


That's because those games don't give the FX 2.0 pixel shaders. Developers have pretty much given up on the FX range as DX9 cards, and instead send them DX8 tasks to do.

Quote:
Incorrect.

The only FPS games where framerate makes a difference are Doom-style games, where your ability to hit someone before they hit you is directly dependent upon the framerate of the game. Game developers are already realizing how absolutely stupid this concept is, and fewer and fewer games are taking framerate into account for such things as time passes. The Doom3 engine does not rely on the framerate to judge such situations like the Doom2 engine did, as id Software learned from their mistake.


Don't quote urban myth as fact. The human eye can distinguish between FPS faaaar above 30fps.

Quote:
I also think that V-Sync is the spawn of satan, and games shouldn't have to rely on that to not destroy the image quality.


What are you talking about? What the hell can an application do to get around the fact that the monitor can only refresh so fast???

Global settings have never been a good idea, they never worked well, and they never will work well. Know why? Because no two games are alike. You seem quite content to have global settings... good for you, but that doesn't change the fact that some apps need AA, some don't, some need more, some need less. Global settings do nothing but force AA onto an app, and that's just wasting frame rates. It's stupid.

Honestly, some of the stuff you're throwing my way boggles the mind, but if it makes you happy, good for you.

P.S. You have my sympathy for the FX.


 Post subject:
PostPosted: Tue Mar 09, 2004 5:40 am 
Veteran

Joined: Fri Feb 27, 2004 2:54 pm
Posts: 85
Karma: 0
"Please, explain to me HOW this is bad optimisation, and not factors like physics and shadows. Please, entertain me. Better yet, explain to me how you know what the AI in an unreleased game is doing."

Simple: because factors like the physics are not affected by the resolution. Ergo, if the CPU bottleneck doesn't show up below 1024x768, it's due to graphics issues, not the AI/physics engine. Those are driven by skeletal points in the various critters, and the skeletal points don't change with resolution.

Shadows I'll give you, but the CPU can hand those over to the GPU.

And let's not even talk about how easy it is to obtain a beta version of a game, first off. And secondly, the AI in HL set a new standard for AI in a video game. The developers, not the marketing team, have talked about little else but the levels of advancement they have been working on for the AI.

So if the first point doesn't satisfy, the second point should.

"Put your resolution to 1600x1200 and try maxxing AA and AF. You will cripple your frame rate. If the GPU isn't doing enough, that's your fault. "

The improvement between 1280x1024 and 1600x1200 is minimal in comparison with other problems it causes.

One of them being that any text is too small to read from the distance I sit at.

"Pixel Shader 2.0 didn't exist until DX9, and that IS new, graphics wise. But that's my point, Bump Mapping ISN'T new, but it still gets supported, unlike AA and AF."

You also can't force something like Bump Mapping globally and expect it to work... so do you blame the game developers, or blame the video cards' driver developers for giving you the option?

"Well, 30fps is not a good frame rate for an FPS game. Myself, I can't play below 60."

Sorry to hear that, I can.

"So you'd like your desktop resolution to be your game resolution? I doubt that, but we'll pass on it for now."

Doubt it all you like, oddly, my desktop runs at the same resolution I play most games in...

"Considering the tricks that nVidia has pulled over the last year, what you see in your control panel doesn't tend to be what you see on screen. Not to mention that nVidia AA is *terrible*."

I see no difference in quality from ATI, only in performance. It's also not a hardware issue, it's a driver issue.

"That's because those games don't give the FX 2.0 pixel shaders. Developers have pretty much given up on the FX range as DX9 cards, and instead send them DX8 tasks to do."

That's funny, the image looks the same... The loss of 10-15fps doesn't faze me, and I still get the same image quality.

"Don't quote urban myth as fact. The human eye can distinguish between FPS faaaar above 30fps."

The eye is also heavily limited by the processor behind it: even if the eye sees a higher framerate, the brain doesn't necessarily register it.

"What are you talking about? What the hell can an application do to get around the fact that the monitor can only refresh so fast???"

Reduce the dependency on framerate. :)

"Global settings have never been a good idea, they never worked well, and they never will work well. Know why? Because no two games are alike. You seem quite content to have global settings... good for you, but that doesn't change the fact that some apps need AA, some don't, seem need more, some need less. Global settings do nothing but force AA onto an app, and that's just wasting frame rates. It's stupid."

That's the glory of global settings: you can change them... whenever. Both ATI and nVidia have little tools that sit on your taskbar that let you change the AA/AF to whatever level you want with two clicks. I'm sorry if that's too much for you and all, but it's there. They come with every video card.

The point is it's not the developer's fault you have them, but if the developers have the ability to avoid doing something, because you can do it manually, then they can save time on that and put it into other things... like making the game better. Good Graphics don't make a Good Game. There are better things they can do with their time, especially when they're given deadlines. If I have to click two or three times more to enjoy a game better, so be it; it's not like it wastes any of my time.

And as for wasting framerates, framerates don't make the world go around. You can operate quite smoothly on lower framerates, and if you seriously have a problem with V-Sync, all I can say is get a better monitor or force the settings higher.

I mean, sheez, a nice ViewSonic monitor, like the one I have, needs to bust 100+ Hz in order to tear. 60fps doesn't hit that. Or even 85 if you wanna go for some gusto. And there's no palpable reason, save for maybe Doom2, why you would want/need to have a game running at 90fps or higher. You want to talk mind-boggling?

I'm glad you see a huge difference between 35fps and 60fps. I, personally, don't. The game operates the same for me without any issue, and I don't find myself any more "skilled" because I have more frames than my mind seems willing to take in. Maybe it fills the rest in itself, or whatever, but it's not exactly amazing stuff.

"P.S. You have my sympathy for the FX."

I don't see why, the hardware on it is excellent. Yeah, it has driver issues, but they'll grind that stuff out. I'm not a big fan of ATI, so whatever. You won't, however, see me knockin' 'em... Both companies make a great product, and both have their strengths and weaknesses. I, personally, prefer the image I get out of my card, even though it doesn't run quite as fast (or cold!) as my roommate's.

Me.

