PDA

View Full Version : Not All Ubi's Fault



Warwick709
12-06-2006, 01:28 AM
Hi guys, like most of you on this forum I'm having problems with slow frame rates. I have an XFX 7800 GTX 256 MB card, Corsair 2 GB TwinX 2048-4000PT, and an AMD FX-55 overclocked to 2.8 GHz, so as you can see it's a pretty high-end system. I too am averaging about 25 FPS. However, I borrowed a 7950 GX2 from a friend; it has two processors on the one card so you can run SLI, and it has 1 GB of RAM. Lo and behold, it's running at over 50 FPS average. I think we might be complaining about nothing; it's definitely the card that lets the side down. My friend said I can use the 7950 until I finish the game. It's an absolute pleasure to play; the graphics are sensational.

S.a.S-Akbari
12-06-2006, 02:06 AM
Oh god no, do video cards have to adjust and optimize themselves to a game now? No. It's the developer's job to optimize a game to get everything out of every card.


-----------------------------------------------------------------------
"I used to be a hero, Now I'm a no lifer, I am a WoW Geek" - Akbari, the
day he started World of Warcraft -- WoW addict till June 2006 R.I.P.

"salami akbari", "Amalahad Akbari" -- LoGiCaL_ - Proof nubs do get older then 2 years.

Xfire (http://www.xfire.com/profile/akbari) -- Thanks Dojomann for the siggeh!
http://img201.imageshack.us/img201/562/akbarisnewsig2ed5.jpg

SoulGrind
12-06-2006, 02:52 AM
Agreed. It's got nothing to do with the gaming community; the fault lies entirely with Ubi. The game wasn't optimized very well. I went back to SCCT and it runs like a dream. Hell, I even think it looks better, simply because I can run it with everything maximized. Can't do that with this game.

_____________________________
http://www.users.on.net/~torskin/sc.jpg
Death, is only the beginning.

noir-colombia-
12-06-2006, 03:17 AM
See the mouse lag fix; I'm sure it will solve all your problems.

Rhythmin
12-06-2006, 05:07 AM
Originally posted by Warwick709:
Hi guys, like most of you on this forum I'm having problems with slow frame rates. I have an XFX 7800 GTX 256 MB card, Corsair 2 GB TwinX 2048-4000PT, and an AMD FX-55 overclocked to 2.8 GHz, so as you can see it's a pretty high-end system. I too am averaging about 25 FPS. However, I borrowed a 7950 GX2 from a friend; it has two processors on the one card so you can run SLI, and it has 1 GB of RAM. Lo and behold, it's running at over 50 FPS average. I think we might be complaining about nothing; it's definitely the card that lets the side down. My friend said I can use the 7950 until I finish the game. It's an absolute pleasure to play; the graphics are sensational.

I would say your PC is midrange. I have a Conroe E6600, an X1800XT 512 MB, and 2 GB of DDR2, and that's midrange PC hardware.

splinterjunkie
12-06-2006, 05:41 AM
I'd have to agree with you...

http://forums.ubi.com/eve/forums/a/tpc/f/3651075192/m/1871022905

So far, over 2/3 of the owners are running this game with little to no trouble. The remaining 1/3 are having a real hard time. Makes you wonder... the same number of people have zero problems as those who can't get the game to run at all. Slightly more people have almost no problems compared to those having to restart every hour (or less). And then there's the majority, stuck in the middle with our mid- to low-end computers...

http://i48.photobucket.com/albums/f226/Performance_nut/SCforumsig.jpg

Warwick709
12-06-2006, 06:01 AM
Thanks dude. What do people expect? With every new game that comes out, of course the graphics are going to be a lot more demanding on the card. If they left the graphics at Chaos Theory spec, this game would never advance visually. Imagine if we were still looking at SC1 graphics; look how basic they are compared to even Pandora Tomorrow. Sure, you'd run at 65-70 FPS, but the game would be stuck in a time warp while all other games advanced. I say good on Ubi for giving us a better visual experience with every new SC that comes out.

S.a.S-Akbari
12-06-2006, 06:08 AM
Chaos Theory had good graphics, but did it lack in FPS? No, it didn't. Chaos Theory was a good combination of superb graphics and FPS, and on top of that you could choose which shader model you wanted to use.

I applaud Ubi for upping the graphics in every installment, but I also expect them to be able to support it. There is no good in eye-candy graphics when your screen starts to stutter just looking at a wall, or the FPS drops overall.

Double Agent didn't give us such great graphics. Sure, they were a tad better than Chaos Theory's, but I'd rather have smoother performance with Chaos Theory graphics than be stuck with nearly identical graphics at much worse performance. Like... a LOT worse.



Warwick709
12-06-2006, 06:18 AM
I agree to a point, but then we have the graphics card issue again. How long did the people who bought cheap and nasty 7600 cards think they would last before a demanding game came out? And now they can't play it. Remember the old saying: pay peanuts, get monkeys. It seems most of the people with issues here are the ones with these cheap cards.

braiog
12-06-2006, 10:54 AM
If I understand correctly, it goes like this (please let me know if I'm wrong):

The major chipset players are nVidia and ATI. The major game APIs are Microsoft's DirectX and the open-source OpenGL.

nVidia and ATI make chips that use their own coding to power the technology the API supports. Let's take simple anti-aliasing: nVidia uses its own algorithms to produce AA on its cards, as does ATI. Proof of how proprietary this is: nVidia's "Quad-AA," said to give 4x AA with the performance hit of 2x.

So the hardware itself is hit and miss. Couple that with the fact that game programmers have to CONDITIONALLY code to support features the hardware MAY or MAY NOT have (if they coded the game to use Pixel Shader 3.0 but the card only supports 2.0, the game fails and the consumer is pissed).

Even if the hardware supports 3.0, it may not mesh well with DirectX (the API), which causes problems too. All this is topped off by raw power: the more powerful a CPU/GPU/memory/etc. you have, the better the performance you get.

Since Microsoft/DirectX does not hold hardware developers, and in turn software developers, to a standard, there's no common level to build and code to, so the chances for failure are high.
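The conditional coding described above can be sketched roughly like this: a renderer queries the card's reported shader model and picks the best path it can actually run, falling back instead of failing outright. This is a minimal illustration, not any game's real code; the names (`ShaderPath`, `pick_shader_path`) are invented, and a real engine would query API capability bits (e.g. D3D caps) rather than take a bare version number.

```cpp
#include <cassert>

// Hypothetical render paths, most capable first.
enum class ShaderPath { SM3, SM2, SM1, FixedFunction };

// Pick the best path the reported pixel shader major version supports,
// falling back instead of refusing to run. The failure mode described
// above (SM3-only game on an SM2 card) is what happens when a game
// hard-requires the first branch and skips the rest.
ShaderPath pick_shader_path(int major_shader_version) {
    if (major_shader_version >= 3) return ShaderPath::SM3;
    if (major_shader_version >= 2) return ShaderPath::SM2;
    if (major_shader_version >= 1) return ShaderPath::SM1;
    return ShaderPath::FixedFunction;
}
```

Double Agent shipping as SM3-only amounts to keeping the first branch and writing none of the fallbacks below it.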

http://www.aaronmartone.com/misc/braiog_sig.jpg
Death by Lethal Injection?

PC does NOT mean a Ported Console version.

lochang19
12-06-2006, 05:40 PM
I get what you're saying, braiog, and I'm sure you're right. But other companies manage to make games work on (let's say, for argument's sake) 80% or more of consumer computers that meet the requirements (or recommended requirements). If Ubi can't make stable, somewhat bug-free games, or at least patch them a few times until the MAJOR problems are gone, then Ubi apparently isn't putting in the required effort.

I mean, they screwed over the Xbox 360 people fairly badly too, and there every user has the SAME platform!

http://www.sloganizer.net/en/style5,LoChang.png (http://www.sloganizer.net/en/)

exsilentioverum
12-06-2006, 06:00 PM
Originally posted by braiog:
If I understand correctly, it goes like this (please let me know if I'm wrong):

The major chipset players are nVidia and ATI. The major game APIs are Microsoft's DirectX and the open-source OpenGL.

nVidia and ATI make chips that use their own coding to power the technology the API supports. Let's take simple anti-aliasing: nVidia uses its own algorithms to produce AA on its cards, as does ATI. Proof of how proprietary this is: nVidia's "Quad-AA," said to give 4x AA with the performance hit of 2x.

So the hardware itself is hit and miss. Couple that with the fact that game programmers have to CONDITIONALLY code to support features the hardware MAY or MAY NOT have (if they coded the game to use Pixel Shader 3.0 but the card only supports 2.0, the game fails and the consumer is pissed).

Even if the hardware supports 3.0, it may not mesh well with DirectX (the API), which causes problems too. All this is topped off by raw power: the more powerful a CPU/GPU/memory/etc. you have, the better the performance you get.

Since Microsoft/DirectX does not hold hardware developers, and in turn software developers, to a standard, there's no common level to build and code to, so the chances for failure are high.

For some reason your explanation reminded me of something. Back with Doom 3, there was a performance issue, IIRC, with the way certain rendering was done on ATI video cards. Someone found the section of code in one of the resource files that handled that specific rendering method for the ATI cards and found a way to force it down a different path. That plays right into your comment about conditional coding. There were individual paths for a number of different ATI chipsets; the hack forced it down one particular path that wouldn't run into the problem. The on-screen graphics degradation was practically unnoticeable, but it yielded something like a 10-15% performance increase. Now, I might be remembering this incorrectly, and the problem may actually have been in the video card drivers and eventually patched by ATI. I'm sure we're running into situations like this with SCDA.
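The shape of that hack can be sketched as follows. This is a hypothetical illustration of the idea, not Doom 3's actual code: an engine normally maps the detected chipset to a tuned render path, and the tweak amounts to an override that forces every chipset down one known-good path, skipping detection. All chipset and path names here are made up for the example.

```cpp
#include <map>
#include <string>

// Hypothetical chipset-to-render-path dispatch. The optional override
// models the community hack: when set, detection is bypassed and every
// card is forced down the one path that avoids the slow code.
std::string select_render_path(const std::string& chipset,
                               const std::string& forced_path = "") {
    if (!forced_path.empty()) return forced_path;  // the "hack" in action
    static const std::map<std::string, std::string> paths = {
        {"R300", "ati_r300"},
        {"R420", "ati_r420"},
        {"NV40", "nv_nv40"},
    };
    auto it = paths.find(chipset);
    return it != paths.end() ? it->second : "generic_arb";  // safe fallback
}
```

The trade-off is exactly what the post describes: the forced path may skip a chipset-specific optimization or effect (slight visual degradation), in exchange for dodging the code that caused the slowdown.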

My guess is that we're also running into the problem of converting code written for the unified shader model (an entirely different beast, in both hardware and software, from what we have on the PC) into something that works well with a PC shader model. Given the rather compressed schedule (or so I believe; I'm guessing UbiShang spent the brunt of its time on the X360 code and left the PC port for the tail end of the dev cycle), it wouldn't have been feasible for them to do much more than work out a baseline method for converting the routines written for the unified model. That would unfortunately explain the performance issues: the code has been ported, and they probably threw in some simple tweaks, but no real hardcore optimizations were implemented. I'd wager that also governed the choice to support SM3 only, given the time it would take to develop the conditional code to work with SM1/2-level cards.

riz_k_biznez
12-06-2006, 08:15 PM
Originally posted by Warwick709:
I agree to a point, but then we have the graphics card issue again. How long did the people who bought cheap and nasty 7600 cards think they would last before a demanding game came out? And now they can't play it. Remember the old saying: pay peanuts, get monkeys. It seems most of the people with issues here are the ones with these cheap cards.


WTF? Are you being serious? I have a self-built system: an AMD Athlon 64 X2 4800+, 4 GB of DDR2 RAM, and two 7600 GTs in SLI, and you're saying it isn't Ubi's fault that the game won't run because I have "cheap" graphics cards? Do you work for Ubi? C'mon, my system owns any game I throw at it. I average 70 FPS in GRAW, for goodness sake! Ubi needs to put butter on their fingers so it'll be easier to get their fingers out of their asses and fix the damn game!

http://img412.imageshack.us/img412/6858/sigki0.jpg

Warwick709
12-06-2006, 11:02 PM
70 FPS in GRAW is a pretty piss-poor effort, dude, especially running SLI. That's what I get with my 7800 GTX, a single card, not SLI. Your 7600s are OK if you're playing Solitaire or Chess. Get yourself a real card, not those Mickey Mouse pretend cards like your 7600s.

Warwick709
12-06-2006, 11:06 PM
As for your 4800: you'd get better speeds running a single FX-55 or FX-57. The game doesn't utilize dual core anyway, so your second core is a waste.