View Full Version : SM 3.0 vs Anti-Aliasing

08-03-2005, 10:42 PM
Question about this. I got my GeForce 6800 card and it works wonderfully. I can run the game at 1024x768 with SM 3.0 and everything else maxed, or 1024x768 with AA at 4x and everything else maxed. Both work great. What's the difference? Maybe I'm just not in the right areas of the game yet to see it. I have gone into dimly lit areas and brightly lit areas and changed the settings back and forth, and I don't see the difference. Is SM 3.0 supposed to be superior to anti-aliasing, or vice versa? Thanks in advance.

08-03-2005, 10:55 PM
I think you mean HDR, or High Dynamic Range. That has nothing to do with SM 3.0. In simple terms, HDR here relies on NVIDIA's support for floating-point filtering and blending, which works independently of SM 3.0 [yes, I know it's under the shader options, but trust me, it has nothing to do with it]. The choice between HDR and AA generally comes down to personal taste. I play the game with HDR enabled [I like AA, but I'm not in love with it], and what it does is improve lighting in certain situations. You can notice subtle brightness shifts as you move between areas of differing intensity. It's not as pronounced as Far Cry's HDR effect, which more or less tries to recreate a camera [Far Cry doesn't use tone mapping, as far as I know], but the effect is there. BTW, on a GeForce 6800-series card you will generally take more of a performance hit with HDR enabled than with 4xAA. By all accounts you should be able to run the game at 1280x1024 with at least 2xAA, unless 1024x768 is as high as your monitor will go.

So like I said, HDR or AA, it's all down to personal taste.
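The "improved lighting" described above comes from tone mapping: compressing HDR radiance values, which can exceed 1.0, into the 0..1 range a monitor can display. Here is a toy sketch using the classic Reinhard operator; this is purely illustrative and not the game's actual shader code.

```python
# Toy tone-mapping sketch: map HDR luminance (can exceed 1.0) into
# the displayable [0, 1) range. Reinhard operator; illustrative only.

def reinhard_tonemap(luminance: float, exposure: float = 1.0) -> float:
    """Map an HDR luminance value to displayable [0, 1)."""
    scaled = luminance * exposure
    return scaled / (1.0 + scaled)

# A dim surface and a light source 50x brighter both stay displayable,
# instead of the bright light simply clipping to pure white.
for hdr in (0.2, 1.0, 10.0, 50.0):
    print(f"HDR {hdr:5.1f} -> LDR {reinhard_tonemap(hdr):.3f}")
```

The key property is that the curve never reaches 1.0, so very bright sources keep some gradation instead of blowing out.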

08-03-2005, 11:04 PM
I guess. I know I have the option in the advanced graphics settings of Shader 1.0 with AA available, or Shader 3.0, which automatically disables the AA choices. So you are saying the Shader 3.0 option is actually HDR?

08-03-2005, 11:23 PM
The Shader 3.0 option contains the ability to enable HDR, but HDR has nothing to do with the shader model itself. It sounds confusing, but trust me.

You cannot use HDR and AA at the same time. It is a limitation of the 6800 and 7800 series of cards. It would require far too much bandwidth, and even if it were possible it would reduce the frame rate to a slideshow. If you want to use AA, then disable HDR with Tone Mapping.
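The bandwidth argument is easy to see with a rough back-of-envelope calculation: an FP16 color buffer is twice the size of an 8-bit one, and 4x multisampling multiplies the stored samples by four. The numbers below are illustrative only, counting just the color buffer and ignoring depth, compression, and actual bus traffic.

```python
# Rough back-of-envelope: color-buffer size for HDR (FP16) vs LDR (8-bit)
# render targets, with and without 4x multisampling. Illustrative only.

def color_buffer_bytes(width, height, bytes_per_channel, samples=1):
    channels = 4  # RGBA
    return width * height * channels * bytes_per_channel * samples

w, h = 1024, 768
ldr      = color_buffer_bytes(w, h, 1)             # 8-bit RGBA, no AA
ldr_4xaa = color_buffer_bytes(w, h, 1, samples=4)
hdr      = color_buffer_bytes(w, h, 2)             # FP16 RGBA, no AA
hdr_4xaa = color_buffer_bytes(w, h, 2, samples=4)  # the combination NV40 lacked

for name, b in [("LDR", ldr), ("LDR + 4xAA", ldr_4xaa),
                ("FP16 HDR", hdr), ("FP16 HDR + 4xAA", hdr_4xaa)]:
    print(f"{name:16s} {b / 2**20:6.1f} MiB")
```

Combining FP16 with 4xAA costs eight times the plain LDR buffer in storage (and a corresponding hit in fill-rate traffic), which is why it was treated as impractical on that hardware generation.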

08-03-2005, 11:47 PM
Check the new nvidia demos. aa + hdr. no prob. all a matter of coding.

08-03-2005, 11:56 PM
Originally posted by Dojomann:
Check the new nvidia demos. aa + hdr. no prob. all a matter of coding.

Yes and no. The new NVIDIA demos use software AA, which is not a viable option in games. Games like Splinter Cell use full hardware support for AA when HDR is not enabled. The software option requires some CPU cycles.

EDIT: I may be wrong about the CPU cycles. It probably uses the vertex or pixel shaders to create the smooth transitions, which is still effectively a software method. It's like ATI's 9x00 implementation of TruForm: it uses the vertex shader, whereas the 8500 had full hardware support for it. Either way, I will have to look more into this.

Here is a snippet from Beyond3D

Link (http://www.beyond3d.com/forum/showpost.php?p=517707&postcount=1)

You can also see it cleared up in an interview with David Kirk on the HDR and software AA issue.

Link 2 (http://www.bit-tech.net/bits/2005/07/11/nvidia_rsx_interview/1.html)

08-04-2005, 12:29 AM
Ah, thanks for clearing that up. I knew something was fishy. Still, I think today's processors are up to it, for some games anyway. I wouldn't be surprised if NVIDIA added this feature to their drivers. Would be nice for people with dual-core processors especially, eh?

08-04-2005, 06:12 AM
I have been playing around with the settings using an ATI card and the newest patch. When I enable HDR I notice a few things. If I move the camera so Sam's head is in the way of a light source, the brightness of the entire room changes, which is kind of strange. I noticed the shimmering effect around light sources is not as noticeable, and the light seems tighter somehow. Some things seem brighter than normal, and the frame rate is about 10 FPS slower than with 4x AA. Of course the edges of objects are all jagged, since AA doesn't work.

Personally I use AA. I find the on-the-fly light changes unrealistic, and the frame-rate hit and jaggies everywhere make the game look and perform worse. I do like what it does to the light shimmer, but that's not enough for me to enable it, and if I wanted things to look brighter I would adjust the brightness and contrast on my monitor without a hit to performance.
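The whole-room brightness shift when Sam's head occludes a light is typical of auto-exposure: the renderer ties its exposure to the scene's average luminance, so removing one very bright region from view brightens everything else. A toy sketch of the idea, purely illustrative and not the game's actual implementation:

```python
# Toy auto-exposure sketch: exposure is derived from average scene
# luminance, so occluding a bright light raises the exposure and the
# dim walls render noticeably brighter. Illustrative only.

def auto_exposure(pixels, key=0.18):
    avg = sum(pixels) / len(pixels)
    return key / avg  # brighter scene average -> lower exposure

room = [0.1] * 90                  # dim walls
with_light = room + [20.0] * 10    # a visible HDR light source
occluded   = room + [0.1] * 10     # the light blocked by Sam's head

print("exposure with light visible:", auto_exposure(with_light))
print("exposure with light blocked:", auto_exposure(occluded))
```

Real implementations smooth the adaptation over time, but a fast-moving occluder can still swing the exposure visibly, which matches the effect described above.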

While we are on the subject, I was wondering what everyone thinks about the other SM 3.0 options, like soft shadows and the mapping option.
(I've done some editing after further exploration of these effects.)
In my opinion the soft shadows make the shadows look lighter, but they seem less defined, which makes the overall image look less sharp in some situations but more realistic in others.

Parallax mapping seems to make certain walls look more 3D, or have more depth. I notice no difference in some areas, but in others the difference is obvious.

Turning on soft shadows and parallax mapping doesn't affect my FPS noticeably, but I am using V-sync, so they may be causing slight differences that aren't noticeable when playing the game on my setup. It is nice to see the game without the banding problems of SM 1.1, though. I think my FPS went up a little as well, but I haven't tested it yet. Many report that FPS went down when switching from SM 1.1 to SM 2.0, but I know my lows did not get any lower, because I checked with Fraps.

It's nice to have the HDR option, but I prefer AA over HDR. HDR is unplayable at the settings I play at anyway. If they make HDR work with AA at playable frame rates while implementing it properly, it would be welcome, but that won't be for a while.

08-04-2005, 11:02 AM
Originally posted by the_sextein:
I have been playing around with the settings using an ATI card and the newest patch. When I enable HDR I notice a few things...
I thought that HDR is exclusive to Nvidia... http://forums.ubi.com/images/smilies/blink.gif

08-04-2005, 11:50 AM
Well, after a bit of playing around, I found the best settings for me are HDR with everything enabled or on high and AF at 8x. I use 1024x768, not because my monitor won't handle higher resolutions, but because I get better frame rates at 1024x768, averaging about 35 FPS. I think my CPU doesn't allow the full potential of the graphics card, but next year I am going to build an Athlon 64 system, so this is fine for now. Thanks for the help, guys.

08-04-2005, 11:53 AM
No. What is exclusive to NVIDIA is their licensing of the OpenEXR storage format used for HDR. NVIDIA didn't create and doesn't own OpenEXR, but its cards have hardware support for floating-point filtering and blending, which gives image quality comparable to Hollywood film work, particularly ILM's. Any card capable of SM 2.0 can do HDR via a ping-pong method, which involves doing the blending manually in the shader rather than in hardware via the FP filtering and blending the NVIDIA 6800 series supports. SM 2.0 HDR also has limitations: it is limited to 16-bit integer precision, whereas the FP path has 16-bit floating-point precision as well as other features such as floating-point alpha blending and superior-quality HDR [not that I notice the difference, mind]. So HDR is not exclusive to NVIDIA cards, but FP filtering and blending is at the moment. Hell, it's even possible to do HDR on DirectX 8.1 hardware, but it's limited, and in a gaming situation the implementation would not be an effective one, probably reducing the frame rate to single digits... if that. I'll link you to a demo that uses DirectX 8.1 - i.e. SM 1.0, 1.1, 1.3, 1.4 - to achieve HDR.
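The ping-pong method mentioned above can be sketched roughly like this: when the hardware can't blend into a floating-point render target, the shader samples the previous result from one buffer and writes the blended value into a second, then the two buffers swap roles for the next pass. The sketch below stands in for that with plain Python lists as "textures"; it is illustrative only, not real shader code.

```python
# Toy "ping-pong" blending sketch: manual blending between two buffers,
# standing in for what FP16 hardware blending does in one step.

def blend(dst: float, src: float, alpha: float) -> float:
    # Standard "over" blend, computed manually in the shader.
    return src * alpha + dst * (1.0 - alpha)

def ping_pong_pass(read_buf, write_buf, incoming, alpha):
    # Read last frame's result from one buffer, write into the other.
    for i, src in enumerate(incoming):
        write_buf[i] = blend(read_buf[i], src, alpha)
    return write_buf, read_buf  # swap roles for the next pass

buf_a = [0.0] * 4                # previously rendered result
buf_b = [0.0] * 4                # scratch target
layer1 = [2.0, 2.0, 2.0, 2.0]    # HDR values can exceed 1.0
layer2 = [8.0, 0.0, 8.0, 0.0]

read, write = ping_pong_pass(buf_a, buf_b, layer1, alpha=0.5)
read, write = ping_pong_pass(read, write, layer2, alpha=0.5)
print(read)  # accumulated result after two blended layers
```

Each extra transparent layer costs a full extra pass plus a buffer swap, which is why this path is so much slower than native FP blending.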

EDIT: For that particular HDR demo you will need either an ATI Radeon 8500 or above, or an NVIDIA GeForce3 or above.

Link (http://www.daionet.gr.jp/~masa/mshdribl/index.html)

08-04-2005, 02:25 PM
When I first cranked up SCCT a couple of weeks ago, I played with the options. I found HDR to be da shiz (don't want to get too technical) but was a framerate killa. I'm using a 6600GT.

If I had my choice between HDR and AA, I'd probably go with HDR (if I could). The SC series is so dark that the jaggies don't matter as much as in "bright" games - where it really annoys me.

My sweet spot is 10x7 res with 2xAA, 8xAF, SM 3.0. I noticed about a 10% drop in frame rate going from SM 1.1 to SM 3.0, but that doesn't add up to a lot when you're averaging only 30 FPS in outdoor scenes.

08-04-2005, 03:06 PM
I personally like AA better, it's a shame they can't work concurrently.

08-04-2005, 03:12 PM
Originally posted by Spekkio9:
I personally like AA better, it's a shame they can't work concurrently.
Give NVIDIA a few months and I'm sure such an option will be available soon (I'm not sure, but I think I heard somewhere the 7800 GTX already supports HDR & anti-aliasing... by using a different kind of AA)! http://forums.ubi.com/groupee_common/emoticons/icon_wink.gif

08-04-2005, 03:17 PM
That's nice, but what the 7800gtx does doesn't help me with a 9800 pro http://forums.ubi.com/images/smilies/16x16_smiley-indifferent.gif

08-04-2005, 03:28 PM
Sweet link JaillumMalord! I like the tea cups there.

08-04-2005, 04:44 PM
I love what HDR does to the game, i would choose it over AA in a heartbeat but i have a 6600GT, so its not really playable.

I do have High Quality Soft Shadows enabled, and those look fantastic.

I play at 1024x768 res, all settings enabled, and SM 3.0.

P4 2.53 GHZ
1 GB PC 2700 DDR RAM
Geforce 6600 GT AGP

08-04-2005, 05:18 PM
Originally posted by Spekkio9:
That's nice, but what the 7800gtx does doesn't help me with a 9800 pro http://forums.ubi.com/images/smilies/16x16_smiley-indifferent.gif
I'm still wondering why ATI's SM 3.0 video card isn't out yet, because they planned to release it in April or May so people like you could finally benefit from the shader in SCCT (the graphics look great, and with HDR & Tone Mapping enabled even greater)! I mean, just look at NVIDIA's latest "flagship", it's almost the next generation of video cards! That's going to be a very tough time for ATI! http://forums.ubi.com/groupee_common/emoticons/icon_frown.gif Well, I always preferred NVIDIA (and still do), but that doesn't mean I don't care about the "suffering" ATI users! http://forums.ubi.com/groupee_common/emoticons/icon_wink.gif

08-04-2005, 05:24 PM
Not saying you did. All I'm saying is that it doesn't matter what NVidia's NEXT card can do, or what ATI'S NEXT card can do, because I don't own those cards.

I bought the 9800 pro because in Feb of 2004, that was the best card for a reasonable price on the market. The GeForce 6 series was not even out yet, and the 6600gt wasn't even on the horizon as far as I knew.

Given a choice of a card to buy TODAY, I would go with NVidia. However, I'm not in the market for a GPU just yet. When I am in the market for one, I'll re-evaluate what each company has to offer and choose whichever one fits my criteria.

08-04-2005, 08:21 PM
NVIDIA impressed everyone when it made the 7800 GTX available the same day it was announced. That's never been done before, and now ATI is feeling the pressure: with NVIDIA's next generation out and strong, many people already own a 7800 GTX, and ATI's R520 (I think) chip hasn't even had a paper launch, let alone an actual hardware launch.

08-04-2005, 10:51 PM
Yeah, if I was in the market for a new card I would buy a 7800GTX for sure. The new transparency AA options are very cool as well. Both NVIDIA and ATI had problems getting their cards out last time, and it was nice to see NVIDIA just push the 7800GTX onto the market like they did. As for ATI, I believe their new card is going to be a while longer because of poor yields. Hopefully they get things under control.

As for the HDR thing with ATI: it's true that all of these supposedly SM 3.0 features can be done on ATI's current SM 2.0b cards, and done just as fast if not faster due to ATI's faster pixel-shader performance. NVIDIA does own the rights to FP16 blending, but nobody is using the highest-quality HDR, because cards can't even run the low-quality version very well. Every game that has come out in the lifetime of the 6800 Ultra could have been done the same on X800 hardware, and with faster performance. It all comes down to who pays who.

I hear that Nvidia has a new card waiting to come out if ATI hits them back hard so it should be interesting to see. I probably won't upgrade my card for another year so I will have to see who is currently king at that point.

Right now I am playing the game at 1600x1200 with 16x AF and 4x AA. I am using SM 2.0 with parallax mapping and soft shadows, with all other options turned on. HDR is the only thing I leave off. Currently my game runs at a solid 30 FPS, but if I turn off the 4x AA and turn on HDR, my FPS drops to a solid 20 FPS, which is not good enough for me. I still think HDR needs to be implemented a little better, and I feel it is way too slow on anything but a 7800GTX. I am not sure if AA and HDR will work together in the future, but I certainly hope so. Right now the technique NVIDIA is using in their web-site demos cannot be done in real time in a game. Hopefully they will figure something out.

08-05-2005, 10:39 AM
Well I always prefered NVidia
I was too, until the NVIDIA FX series came around - *ouch*. At that time, I picked up an ATI 9500 Pro and it lasted me longer than any video card I've owned.

I agree when it comes time to getting a new card, evaluate both companies as they've both proven they can deliver the goods.

Where's everyone getting the dough for 7800GTXs? Out selling...SCAG...if you get my drift. http://forums.ubi.com/groupee_common/emoticons/icon_biggrin.gif

BTW, since I've been away from the SC forums for so long, is this the same forum as the original one from when SC first came out? Back then, all I can remember were whiners and arguments and people getting banned. Now everyone seems a heck of a lot more "cool" here. It seems like a different set of people.

08-05-2005, 04:50 PM
Wow... I didn't know this would start such a discussion. I am happy with my new video card. Much better than the Radeon 9200 junk I took out of my computer. Anybody want to buy it? LMAO! Actually, the Radeon 9200 has been a very good card for me. Not fast, but reliable. Overclocked easily.