Badly coded games. Do they make these on purpose?



Airmail109
10-29-2007, 02:41 PM
Seen some benchmarks for a Q6600 system with an 8800 GTX:

Unreal III maxed out: runs at 80-100 fps average

Crysis maxed out: runs at about 18-24 fps average

Unreal III looks just as good as Crysis. Makes you wonder.

Skunk_438RCAF
10-29-2007, 02:43 PM
Yeah, but all those leaves that you can wipe your virtual butt with must eat a lot of CPU power.

Low_Flyer_MkIX
10-29-2007, 02:45 PM
I always forget this one. What's the max fps the human eyeball can cope with?

SlickStick
10-29-2007, 02:46 PM
    Originally posted by Aimail101:
    Seen some benchmarks for a Q6600 system with an 8800 GTX:

    Unreal III Maxxed Out: Runs from 80-100fps average

    Crysis Maxxed Out: Runs at about 18-24fps average

    Unreal III looks just as good as crysis. Makes you wonder.

One thing I'll add is that when new games come out, video card drivers may not be well-optimized for specific aspects of a specific game. Many times in hardware review articles, they note this, and then they later test a subsequent version of drivers that produces better results.

May or may not be the issue here, but I thought I'd throw that out there.

SlickStick
10-29-2007, 02:47 PM
    Originally posted by Low_Flyer_MkIX:
    I always forget this one. What's the max fps the human eyeball can cope with?

I believe it's said that the human eye can only detect differences up to 30 FPS (I've also read 30-60 FPS), but for me, more FPS makes for much smoother and more fluid gameplay, especially when there is a lot going on at one time (smoke, fire, tracers, over cities, etc.).

Airmail109
10-29-2007, 02:50 PM
    Originally posted by SlickStick:
        Originally posted by Low_Flyer_MkIX:
        I always forget this one. What's the max fps the human eyeball can cope with?

    I believe it's 30FPS, but more FPS makes for much smoother and more fluid gameplay.

Yeah, I'll find the article that debunks the myth that 28 fps is fine to play with because the eye only does 28 fps; something about the fact that a 28 fps eye and a 28 fps screen aren't synchronized...

SlickStick, Unreal III just came out with amazingly optimized performance.

Crysis seemingly has not, so they have no excuse lol

Low_Flyer_MkIX
10-29-2007, 02:51 PM
I look forward to it. A particular interest of mine.

Capt.LoneRanger
10-29-2007, 02:51 PM
24 FPS would be enough for the human eye to perceive fluid motion from single pictures.

But you wouldn't like to play a game at 24 FPS.

SlickStick
10-29-2007, 02:54 PM
    Originally posted by Aimail101:
    Slickstick Unreal III just came out with amazing optimized performance

I hear what you are saying, but there have been many cases in the past where NVIDIA performed better than ATI for certain games, for instance, and vice-versa.

Then, they optimize their drivers better for the most popular game features and things tend to even up. It may also be the way the game is graphically coded, I'm just sharing what I've read.

I wonder how Crysis performs with an ATI vid card?

Airmail109
10-29-2007, 02:56 PM
    Originally posted by SlickStick:
        Originally posted by Aimail101:
        Slickstick Unreal III just came out with amazing optimized performance

    I hear what you are saying, but there have been many cases in the past where NVIDIA performed better for certain games over ATI and vice-versa.

    Then, they optimize their drivers better for the most popular game features and things tend to even up. It may also be the way the game is graphically coded, I'm just sharing what I've read.

If it's the way the game's graphically coded (the engine), then it stinks lol

Airmail109
10-29-2007, 02:59 PM
http://www.daniele.ch/school/30vs60/30vs60_1.html

"This is the second toughest part of this article. TV and Movies are easy to understand, and the technology behind it is also easy to understand. Computers and the way games are projected to us is a lot more complex (the most complex is the actual physiology /neuro-ethology of the visual system).
First off, the hardware used for visualization (namely the monitor) is a very fine piece of equipment. It has a very small dot pitch (distance between phosphors) and the phosphors themselves are very fine, so we can get exquisite detail. We set the refresh rates at over 72 Hz for comfort (flicker free). This makes a very nice canvas to display information on, unfortunately because it is so fine it can greatly magnify flaws in the output of a video card. We will get into refresh in the section on the human eye.
Let us start with how a scene or frame is set up by the computer. Each frame is put together in the frame buffer of the video card and is then sent out through the RAMDAC to the monitor. That part is very easy, nothing complex there (except the actual setup of the frame). Now each frame is perfectly rendered and sent to the monitor. It looks good on the screen, but there is something missing when that action gets fast. So far, programmers have been unable to make motion blur in these scenes. When a game runs at 30 fps, you are getting 30 perfectly rendered scenes. This does not fool the eye one bit. There is no motion blur, so the transition from frame to frame is not as smooth as in movies. 3dfx put out a demo that runs half the screen at 30 fps, and the other half at 60 fps. There is a definite difference between the two scenes, with the 60 fps looking much better and smoother than the 30 fps.
The lack of motion blur with current rendering techniques is a huge setback for smooth playback. Even if you could put motion blur into games, it really is not a good idea whatsoever. We live in an analog world, and in doing so, we receive information continuously. We do not perceive the world through frames. In games, motion blur would cause the game to behave erratically. An example would be playing a game like Quake II, if there was motion blur used, there would be problems calculating the exact position of an object, so it would be really tough to hit something with your weapon. With motion blur in a game, the object in question would not really exist in any of the places where the "blur" is positioned. So we have perfectly drawn frames, so objects are always able to be calculated in set places in space. So how do you simulate motion blur in a video game? Easy, have games go at over 60 fps! Why? Read the section on the human eye.
Variations in frame rate also contribute to games looking jerky. In any game, there is an average frame rate. Rates can be as high as the refresh rate of your monitor (70+), or it can go down in the 20's to 30's. This can really affect the visual quality of the game, and in fast moving ones can actually be detrimental to your gameplaying performance. One of the great ideas that came from the now defunct Talisman project at Microsoft was the ability to lock frame rates (so the rate goes neither above or below a certain framerate). In the next series of graphics cards, we may see this go into effect."
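The frame-rate locking idea mentioned at the end of that excerpt boils down to a simple pacing loop. Here is a minimal sketch in Python; the 72 fps target and the render_frame stand-in are illustrative assumptions only, not anything taken from a real engine:

```python
import time

TARGET_FPS = 72                        # the article's flicker-free figure; purely illustrative
FRAME_BUDGET = 1.0 / TARGET_FPS        # ~13.9 ms per frame

def run_capped(render_frame, frames=300):
    """Render a fixed number of frames, sleeping off leftover time so the rate never exceeds TARGET_FPS."""
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                              # stand-in for one game update + draw
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)      # sleep off the remainder of the frame budget
        # Note: a cap like this only stops the rate going *above* the target;
        # if render_frame() itself blows the budget, the fps still drops, which
        # is the half of "locking" that is much harder to guarantee.

if __name__ == "__main__":
    run_capped(lambda: None, frames=72)             # dummy workload; runs for roughly one second
```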

"Here is where things get a little interesting, and where we will see that humans can perceive up to 60+ fps.
Light is focused onto the retina of the eye by the lens. Light comes in a steady stream and not pulses (ok, so this is a little wrong, but we are not talking about the dual nature of light, where it acts as both a particle -photon- and a wave). Again, we live in an analog world, where information is continuously streamed to us. The retina interprets light in several ways with two types of cells. Rods and Cones make up the receiving cells for light. Intensity, color, and position (relative to where the cell is on the retina) is the information transmitted by the retina to the optic nerve, which then sends that info to the Visual Cortex for it to be translated to our conscious self (whoa, went from science to philosophy in one step!).
Rods are the simpler of the two cell types, as it really only interprets position and intensity. Rods are essentially color blind, and are referred to as transmitting in black and white. The black and white is not really true, but rather it is just intensity of the light hitting the cell. Rods are also very fast due to the basic nature of intensity. The amount of neurotransmitter released is basically the amount of light that is stimulating the rod. The more light, the more neurotransmitter. Rods are also much more sensitive than cones. How is this proven? We know by microscopic examination of the retina shows that there is a much greater concentration of rods on the outer edges. A simple experiment that you can do yourself is to go out on a starry night and look at the stars out of your peripheral vision. Pick out a faint star from your periphery and then look at it directly. It should disappear, and when you again turn and look at it from the periphery, it will pop back into view.
Cones are the second cell type, and these are much more complex. There are three basic parts to them that absorb different wavelengths of light and release differing amounts of different neurotransmitters depending on the wavelength and intensity of that light. Basically there are three receptors in a cone that absorb red, green, and blue wavelengths of light. Each of these receptors release a different neurotransmitter for the color, with differing amounts of the neurotransmitter depending on the intensity of the wavelength. Purple is a combination of blue and red, so the red and blue receptors would release differing amounts of neurotransmitter, while the green wouldn't release any. This information then passes onto your visual cortex and we "see" purple. Cones are much more inefficient than rods due to their more complex nature. They also are a little slower to react to changes in light and are also not as sensitive as rods (see above experiment). Cones are what largely make up the center of the retina and fovea (focal point of the retina).
The optic nerve is the highway from which information is passed from the eye to the visual cortex in the brain. This nerve is just a pathway, and does no processing on its own. Its bandwidth is actually really huge, so a lot of information can be passed on. Nerve impulses also travel at over 200 mph to the brain, so it is nearly instantaneous for information to be received from the eye (since the optic nerve is only about 2 cm to 3 cm long).
The visual cortex is where all the information is put together. Humans only have so much room in the brain, so there are some tricks it uses to give us the most information possible in the smallest, most efficient structure. One of these tricks is the property of motion blur. We cannot get away from the phenomena because it is so important to the way we perceive the world. In the visual cortex we can theorize the existence of what I call the motion blur filter. Because the eye can only receive so much information, and the visual cortex can only process so much of that, there needs to be a way to properly visualize the world. This is where it gets tough.
Take for example a fast moving object. The faster it goes, the more it blurs (be it a snowflake or a train). Why does this happen? Let's take the example of a snowflake. At any time it has a fixed position in the universe, no matter what speed it goes at (unless it starts to get relativistic, then we go into some strange physics, but something that is not applicable to what we are talking about). Lets say at 5 mph, we see the snowflake in perfect detail as it falls to the ground. Now we hop into a car and go 55 mph. Can we see the detail of the snowflake? No, it is just a streak to us. Has the snowflake changed itself? Of course not. If we had a really fast camera with a fast shutter speed, it would see the snowflake in perfect detail. Now due to the speed in which our eyes/visual cortex can process information, we cannot see the snowflake in detail. A bird such as an eagle would be able to see more detail and not so much of a streak because it only has rods (it is color blind) and the distance from the eyes to its highly specialized visual cortex is 1/16th the distance of ours. This leads to more information being pumped into the visual cortex. So what would look like a streak to us would look like a fast moving snowflake to the eagle.
If we didn't have the ability to produce motion blur, we would see the snowflake pop in and out of existence at high speeds. We would first see it one place, then it would disappear and pop into existence several feet beyond depending on the direction it is going. Is this a good thing? No, we would have a hard time figuring out the direction of the snowflake and have many problems with perceiving movement in three dimensional space. With motion blur we get the impression of continuity where our hardware cannot distinguish fine detail while the object is moving at high speeds.
Contrary to the belief that we cannot distinguish anything over 30 fps, we can actually see and recognize speeds up to 70+ fps. How can you test this? You can quickly do this with your monitor at home. Set the refresh rate to 60 Hz and stare at it for a while. You can actually see the refreshes and it is very tiring to your eyes. Now if we couldn't see more than 30 fps, why is it that flicker free is considered to be 72 Hz (refreshes per second). You can really tell if the refresh is below 72 by turning your head and looking at the screen through your peripheral vision. You can definitely see the screen refreshes then (due to rods being much more efficient and fast).

Conclusion
We as humans have a very advanced visual system. While some animals out there have sharper vision, there is usually something given up with it (for eagles there is color, for owls it is the inability to move the eye in its socket). We can see in millions of colors (women can see up to 30% more colors than men, so if a woman doesn't think your outfit matches, she is probably right, go change), we have highly movable eyes, and we can perceive up to and over 60 fps. We have the ability to focus as close as an inch, and as far as infinity, and the time it takes to change focus is faster than the fastest, most expensive auto-focusing camera out there. We have a field of view that encompasses almost 170 degrees of sight, and about 30 degrees of fine focus. We receive information constantly and are able to decode it very quickly.
So what is the answer to how many frames per second should we be looking for? Anything over 60 fps is adequate, 72 fps is maximal (anything over that would be overkill). Framerates cannot drop though from that 72 fps, or we will start to see a degradation in the smoothness of the game. Don't get me wrong, it is not bad to play a game at 30 fps, it is fine, but to get the illusion of reality, you really need a frame rate of 72 fps. What this does is saturate the pipeline from your eyes to your visual cortex, just as reality does. As visual quality increases, it really becomes more important to keep frame rates high so we can get the most immersive feel possible. While we still may be several years away from photographic quality in 3D accelerators, it is important to keep the speed up there.
Looks like 3dfx isn't so full of it."
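Putting the article's numbers in frame-time terms makes the conclusion easier to see. A back-of-the-envelope sketch in Python, using only the rates quoted above:

```python
# Per-frame time budget at the rates the quoted article discusses.
for fps in (24, 30, 60, 72):
    print(f"{fps:>2} fps -> {1000.0 / fps:5.1f} ms per frame")

# 24 fps ->  41.7 ms per frame   (film, which gets away with it thanks to motion blur)
# 30 fps ->  33.3 ms per frame
# 60 fps ->  16.7 ms per frame   (the article's "adequate" rate)
# 72 fps ->  13.9 ms per frame   (the article's "maximal", flicker-free rate)
```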

SlickStick
10-29-2007, 03:00 PM
    Originally posted by Aimail101:
        Originally posted by SlickStick:
            Originally posted by Aimail101:
            Slickstick Unreal III just came out with amazing optimized performance

        I hear what you are saying, but there have been many cases in the past where NVIDIA performed better for certain games over ATI and vice-versa.

        Then, they optimize their drivers better for the most popular game features and things tend to even up. It may also be the way the game is graphically coded, I'm just sharing what I've read.

    If its the way the games graphically coded (the engine) then it stinks lol

Here's a link where they use Crysis to benchmark video cards. The 8800 series leads the way. Be glad you're not on ATI at the moment for that game, hehe.

Crysis Performance (http://www.firingsquad.com/hardware/nvidia_geforce_8800_gt_performance/page17.asp)

Also, I might not use the term "badly coded". There is a lot going on. From what I've seen of those demo screenies (which I hear are DX9), that is a very good-looking game.

I'm not usually one for FPS games, but I may have to give that badboy a try one day.

Stiletto-
10-29-2007, 03:00 PM
This is a good question and an interesting subject, Low Flyer, as it is very important to get smooth, consistent frames in a simulation. I want to say the human eye sees something like 28-30 frames per second, but as we all know, this is kind of apples and oranges since the human eye doesn't actually see in frames.
If you take a game and run it at 30 frames it might look decent enough; Grand Prix Legends is capped at 30 frames no matter how fast your machine is, so 30 frames is definitely a playable number. But if you double that to 60 frames, you can definitely notice a difference, as things look even smoother, and after that it gets harder to notice. Once you go below 25 or so, you start a real downward trend toward the unplayable level.

Whirlin_merlin
10-29-2007, 03:08 PM
Purely anecdotal, but I'm sure my gunnery % increases when I pass the magic 40 fps mark.

Low_Flyer_MkIX
10-29-2007, 03:09 PM
Interesting stuff. So the 'holy grail' of video game rendering would be to enable 'motion blur' while retaining the ability to pinpoint multiple objects as if rendered clearly. Ever going to be possible?

Thanks, chaps.

K_Freddie
10-29-2007, 03:09 PM
I think it all depends on the efficiency of the 'engine' they use and also how they optimise things. Not to mention the development platform and the coding language.

For some time there's been a trend to use a basic engine and do the rest in high-level stuff. Makes sense financially, but it's nonsense technically. So if you're into 3D engine making... there's a market for you.

Airmail109
10-29-2007, 03:11 PM
The guys who made Crysis have shot themselves in the foot unless they can double the performance through patches and drivers.

A card that can add 50 percent more fps onto that game through sheer brute force won't be out for another 6 months lol

SlickStick
10-29-2007, 03:11 PM
    Originally posted by Whirlin_merlin:
    Purly anecdotal but I'm sure my gunnery % increase when I pass the magic 40fps mark.

I know for sure my gunnery, as well as my situational awareness, improves when 1946 is running smoothly with no choppiness or stutters, at 40-50 FPS or higher.

Airmail109
10-29-2007, 03:11 PM
    Originally posted by Low_Flyer_MkIX:
    Interesting stuff. So the 'holy grail' of video game rendering would be to enable 'motion blur' while retaining the ability to pinpoint multiple objects as if rendered clearly. Ever going to be possible?

    Thanks, chaps.

No. Motion blur in games is awful, and I mean REALLY awful; try it on some of the console games sometime.

You WILL hate it lol

For example, you can't track something clearly in the top right-hand corner whilst moving your weapon in an FPS.

Davinci..
10-29-2007, 04:04 PM
Agreed! The more fps the better. The whole "you don't need more than 30 fps" thing comes from people thinking "the minimum the eye needs" and "the max it can see" are the same thing, which they certainly are not.

I used to be quite the Quake player years ago, and because everything there happened so fast, I could quite literally tell if I was on a server that had its fps capped at 90, 80, or the 70s, just by walking around on the server (I wouldn't even play on servers with 60, as it just wasn't enough, but anything above 90 I couldn't notice). For example, at 30 fps, if you were to make a 360° turn in 1 second, the scene would rotate in 12° chunks, which your eye easily sees (the arithmetic is sketched after this post). Now at 90 fps, you're giving your eye (and brain) a lot more information to work with.
Here's what I mean by things moving fast in Quake 2:
http://www.mmshare.co.uk/en/download.php?id=A2B1D5E47

There's no way you would have enough images/info to aim/track/engage targets at 30 fps, at that speed (how fast things are happening).

The more "motion" involved in a scene, the bigger the gaps become between frames. This is not the case with film, because of motion blur, which effectively "ties" frames together. The more motion, the bigger the tie, which is why you don't see it, unlike in a video game (where you do).

This is something that always bothered the hell out of me about IL-2. Why could I go play Call of Duty 1 with settings maxed (really nice pixel-shaded water), with a ton of stuff going on, at 90 fps, but I load up Perfect mode in IL-2 and get 30 fps?
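Davinci's 360° example is just division, but the numbers are worth seeing side by side. A tiny sketch in Python, with the one-second full turn taken from the post above:

```python
# Angular step between consecutive frames for a constant-rate turn;
# bigger steps are what read as "chunky" motion.
TURN_DEG_PER_SEC = 360.0              # a full turn in one second, as in the post above

for fps in (30, 60, 90):
    step = TURN_DEG_PER_SEC / fps     # degrees the view jumps from one frame to the next
    print(f"{fps:>2} fps -> {step:4.1f} degrees per frame")

# 30 fps -> 12.0 degrees per frame   (the 12-degree chunks described above)
# 60 fps ->  6.0 degrees per frame
# 90 fps ->  4.0 degrees per frame
```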

Haigotron
10-29-2007, 04:19 PM
I heard the UT3 beta demo only carries medium-resolution textures and minimal models (all the characters use the same model) to lower the download size; maybe that's why you're getting clearly better results for UT3 vs. Crysis?

rafaellorena
10-29-2007, 04:59 PM
I tested Crysis... on Very High I got 30-40 fps... the difference between UT and Crysis is the vegetation... it's like Armed Assault... in urban or open areas I got 50 fps... when I go to a dense tree area the fps drops to 15... it's lots of sprites and textures... the problem is there... in UT it's more open areas... detail in buildings... but not a dense thing like Crysis.

Airmail109
10-29-2007, 05:30 PM
    Originally posted by Haigotron:
    I heard the UT3 beta demo, only carries medium resolution textures, and minimal models (all the characters are the same model) to lower the download size, maybe that's why you're getting clearly better results for UT3 vs. Crysis?

No, this guy's benchmarks were from the full game.

Rafaellorena! Are you running SLI lol, or an 8800 Ultra? The reason I ask is that the Ultra in benchmarks makes about 35 fps on Very High settings at certain resolutions with no AA or AF.

Still, I think it's insane to release a game that can't be enjoyed properly, the way it was intended to be played, on release date. It's almost as stupid as a film company spending multi-millions on a film such as Lord of the Rings in colour, and then releasing it when the only cinemas are black and white.

VW-IceFire
10-29-2007, 06:00 PM
    Originally posted by Low_Flyer_MkIX:
    I always forget this one. What's the max fps the human eyeball can cope with?

It's not a hard number from what I've been reading, but for all intents and purposes, around 60 fps on a computer screen is an ideal range to have. Film is fine at 24 fps, but that's because film is blurred from frame to frame, forming a composite in your mind that looks very clear and smooth.

This is still debated amongst the experts as far as I can tell. Part of the issue is that we're dealing with several complex systems, ranging from the projection on the screen (be it from film, or digital from a computer game, or from an HD camera, or whatever) to the whole issue of eyes and the brain. Our eyes and brain form a pretty complex system on their own.

It's also worth noting that with computer games the average fps is not a good indicator, because a game that averages 30 fps is at some point going under 30 fps (it's never flat), and that's not good. Pumping out an average of 80 fps is great, so long as that means your hardware is good enough to hold 30 fps minimum (see the quick numbers after this post).

But just remember 60 fps... that's what game developers on consoles shoot for. 30 fps is OK, but in computer games you never want to be going under 30 fps, as that's when it looks choppy.
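IceFire's point about averages hiding dips is easy to show with numbers. A minimal sketch in Python; the frame times are made up purely for illustration, not measured from any game:

```python
# Hypothetical frame times in milliseconds for a short run: mostly fast, with a few spikes.
frame_times_ms = [12, 13, 12, 14, 12, 45, 13, 12, 50, 12, 13, 12, 40, 12, 13]

avg_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)    # average over the whole run
worst_fps = 1000.0 / max(frame_times_ms)                        # the single slowest frame

print(f"average: {avg_fps:.0f} fps, worst frame: {worst_fps:.0f} fps")
# -> average: 53 fps, worst frame: 20 fps
# The average looks comfortable, yet the 40-50 ms spikes are exactly the
# dips under 30 fps that show up on screen as stutter.
```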

rafaellorena
10-30-2007, 03:01 PM
    Originally posted by Aimail101:
        Originally posted by Haigotron:
        I heard the UT3 beta demo, only carries medium resolution textures, and minimal models (all the characters are the same model) to lower the download size, maybe that's why you're getting clearly better results for UT3 vs. Crysis?

    Rafaellorena! Are you running SLI lol or a an 8800 Ultra? Reason why I ask is because the Ultra in benchmarks makes about 35 fps on very High settings under certain resoloutions and no AA or AF


Using AF x4 and AA x8... GPU benchmark... my system is overclocked to 650/2020 (GTS 640MB).

M_Gunz
10-30-2007, 03:56 PM
    Originally posted by Aimail101:
    The guys who made Crysis have shot themselves in the foot unless they can double the performance through patches and drivers

    A card that can add 50 percent more fps onto that game through sheer brute force wont be out for another 6 months lol

It might not be an issue to those who can stand not running full-out graphics until later.

Low_Flyer_MkIX
10-30-2007, 05:50 PM
    Originally posted by VW-IceFire:
        Originally posted by Low_Flyer_MkIX:
        I always forget this one. What's the max fps the human eyeball can cope with?

    Its not a hard number from what I've been reading but for intensive purposes a general range of 60fps on a computer screen is an ideal range to have. Film is good for 24fps but thats because film is blurred from frame to frame forming a composite in your mind that looks very clear and smooth.

    This is a discussion that is still debated amongst the experts as far as I can tell. Part of the issue is that we're dealing with several complex systems ranging from the projection of the screen (be it from film or digital from a computer game or from a HD camera or whatever) and then to the whole issue of eyes and the brain. Our eyes and brain form a pretty complex system on their own.

    Its also worth noting that with computer games the average fps is not a good indicator because that means that in a game that gets an average of 30 fps is at some point going under 30 fps (as its never flat) and thats not good. Pumping out a game at an average of 80 fps is great so long as that means that your hardware is good enough to do 30 fps minimum.

    But just remember 60 fps...thats what game developers on consoles shoot for. 30 fps is ok but in computer games you never want to be going under 30 fps as thats when it looks choppy.

RGR that.

roybaty
10-30-2007, 08:16 PM
I've played the Crysis demo with medium settings, 1440x900 (no AA, weird texture/rendering issue in DX9 with AA enabled) with no significant issues aside from AI enemies that take 4-5 bullets to take down.

Take a look at my specs below.

Now, I have heard a lot about Vista, DX10, and NVIDIA cards having issues. I can't bring myself to blow $200 to upgrade to Vista Ultimate, so it's not an issue for me until Vista comes down to a reasonable price.