
View Full Version : Why is there no perfect mode on DirectX?



WIFC_JG10_nc
03-22-2005, 12:37 AM
Could someone enlighten me?

JunkoIfurita
03-22-2005, 04:57 AM
The game was coded for and intended to run under OpenGL - one of the last of that beautiful generation of games not dictated by M$.

DirectX support is included, but it runs as a 'wrapper' (an emulator, basically) around the OpenGL renderer, and as such it isn't perfect - certain aspects of rendering can't be implemented, such as Perfect landscape and any of the Pixel Shader 3.0 water settings.
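
To give a rough idea of what that means in practice, here's a toy sketch only - every name in it is invented for illustration, it's not the actual renderer code:

// Illustrative only: the game keeps making OpenGL-style calls, and a
// translation layer re-expresses each one as a Direct3D-style call.
// All types and functions here are made up; the real renderer is far
// more involved.
#include <cstdio>

// Stand-in for a Direct3D-like backend (hypothetical).
struct D3DDevice {
    void Clear(float r, float g, float b) {
        std::printf("D3D: Clear(%.1f, %.1f, %.1f)\n", r, g, b);
    }
    void DrawTriangles(int count) {
        std::printf("D3D: DrawPrimitive(TRIANGLELIST, %d)\n", count);
    }
};

// Stand-in for the OpenGL-style interface the game engine was written against.
class GLWrapper {
public:
    explicit GLWrapper(D3DDevice& dev) : dev_(dev) {}

    // Each "OpenGL" entry point gets translated into the closest D3D call.
    void glClearColor(float r, float g, float b) { dev_.Clear(r, g, b); }
    void glDrawTriangles(int count)              { dev_.DrawTriangles(count); }

private:
    D3DDevice& dev_;   // the extra layer the native OpenGL path doesn't need
};

int main() {
    D3DDevice device;
    GLWrapper gl(device);          // the game engine only ever sees "OpenGL"
    gl.glClearColor(0.2f, 0.4f, 0.8f);
    gl.glDrawTriangles(1000);      // anything with no clean D3D equivalent
                                   // simply doesn't make it through
}

The point is that the game only ever speaks 'OpenGL', and anything the translation layer can't express cleanly just gets dropped - which is why some of the eye candy goes missing under DX.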

To draw a broad (and probably imperfect) parallel: when you run a PSX game, say Final Fantasy 7, in an emulator on the PC, you can't see the on-screen menus - this is because the emulated renderer doesn't support the 'foreground' layer where the menus sit.

Seeing as all the cards support OpenGL, I don't think the development team saw it as a particularly high priority to keep tweaking the DirectX emulation until it was perfect in every respect.

So just run in OpenGL and you'll be fine - you're not losing anything after all, and gaining a lot of eyecandy :D


Abbuzze
03-22-2005, 05:19 AM
DirectX was made by MS for games. OpenGL was made by Silicon Graphics for rendering software!
It's for professional work. For example, when the first NVidia graphics cards with T&L appeared, DirectX games didn't benefit.

But because OpenGL had this from the beginning, games and rendering software really gained speed.

In my eyes this was the real breakthrough for NVidia, because it was the first graphics card for gaming and work!
3dfx Voodoo was just for gaming...

WIFC_JG10_nc
03-22-2005, 05:36 AM
Thanks for the explanations :)

XyZspineZyX
03-22-2005, 06:01 AM
Hello,

Is it also possible that you are playing in 16-bit colour mode and not in 32-bit? Please check it, it can happen...
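
If memory serves, you can check this in the game's conf.ini - section and key names below are from memory and may differ slightly in your install:

[window]
width=1024
height=768
ColourBits=32    ; 32-bit colour; a value of 16 here would force 16-bit mode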

Sensei

Slammin_
03-22-2005, 07:30 AM
quote (Originally posted by Haddock55):
Hello,

Is it also possible that you are playing in 16-bit colour mode and not in 32-bit? Please check it, it can happen...

Sensei

There is no Perfect Mode when using DX. Period.

Capt._Tenneal
03-22-2005, 09:36 AM
Since DirectX emulates the OpenGL code, can't DX "emulate" Perfect mode? I'm a layman when it comes to these things too.

VW-IceFire
03-22-2005, 11:07 AM
quote (Originally posted by Capt._Tenneal):
Since DirectX emulates the OpenGL code, can't DX "emulate" Perfect mode? I'm a layman when it comes to these things too.

I'm sure it could...but there's really no point. The graphics engine was natively programmed for OpenGL. All of the graphics calls to the video card are made via OpenGL and through to the hardware. OpenGL is like a conduit between the game engine and the video card.

Add in a Direct3D wrapper mode and you have four layers instead of three. First the game engine, then OpenGL (which the game engine understands), then a wrapper that interprets the OpenGL calls and converts them to Direct3D calls, and then the hardware. By adding an extra stage you add a performance hit. Some cards do Direct3D much better than OpenGL, and some cards don't do OpenGL at all - that's rare these days, but it surely wasn't when the game was developed.
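
Very hand-wavy sketch of where that extra hop sits - all the types below are made up, it's only meant to show the layering:

// Toy illustration: native path is engine -> OpenGL -> driver,
// wrapped path is engine -> OpenGL -> wrapper -> Direct3D -> driver.
// Everything here is invented; it only shows where the added per-call cost lives.
#include <cstdio>

struct Driver {                    // stands in for the video card driver
    void Submit(const char* cmd) { std::printf("driver: %s\n", cmd); }
};

struct OpenGLPath {                // native path: one translation step
    Driver& drv;
    void glCall(const char* cmd) { drv.Submit(cmd); }
};

struct D3DPath {                   // backend used by the wrapper
    Driver& drv;
    void d3dCall(const char* cmd) { drv.Submit(cmd); }
};

struct WrappedPath {               // wrapped path: one *extra* translation step
    D3DPath& d3d;
    void glCall(const char* cmd) {
        // convert the OpenGL-style request into a Direct3D-style one first
        d3d.d3dCall(cmd);
    }
};

int main() {
    Driver drv;
    OpenGLPath native{drv};
    D3DPath d3d{drv};
    WrappedPath wrapped{d3d};

    native.glCall("draw landscape");   // three layers total
    wrapped.glCall("draw landscape");  // four layers: same result, more work per call
}

Multiply that extra hop by the thousands of calls made every frame and you can see where the performance hit comes from.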

I'm willing to bet that emulating OpenGL to Direct3D pixel shader modes is akin to a nightmare of epic proportions.

Hoatee
03-22-2005, 01:15 PM
DirectX ain't perfect.

Capt._Tenneal
03-22-2005, 01:33 PM
I just noticed your new sig, Icefire. Is that going to be the name of your Tempest campaign once we get 4.0? :)