Yeah, asking if one of the best graphics cards on the market will max out a game is just asking to get trolled.
It depends on your definition of "maxing the game". As I posted in another thread (not going to repeat it here, apparently discussing rigs is bragging), I can run FC2 with everything at max on my GTX-580, but if I push AA above 4x, the frame rate dips below 60FPS when the action gets busy.
For me, 60FPS is golden, but people commonly accept anything above 30FPS as playable.
What about those who run their monitors at 75Hz? Technically they need 75FPS in order to drop no frames. Same goes for 120Hz of course: you need 120FPS to keep your monitor "fed". Do you actually SEE missing frames at 60 or 75FPS? The missing frames only show themselves when your character is turning fairly rapidly; otherwise you might never know it.
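If it helps, here's the back-of-the-envelope version as a quick Python sketch. It ignores vsync quantization and frame-time variance, so don't take it as how any real driver behaves; it just shows why a faster panel needs more FPS to stay "fed":

[code]
# Back-of-the-envelope: how many refreshes per second end up re-displaying
# the previous frame when the GPU can't keep pace with the monitor.
# Ignores vsync quantization and frame-time variance; only meant to show
# why a 75Hz or 120Hz panel needs more FPS to stay "fed".

def stale_refreshes_per_second(fps, refresh_hz):
    """Refreshes per second that repeat the previous frame."""
    return max(0.0, refresh_hz - min(fps, refresh_hz))

for hz in (60, 75, 120):
    for fps in (30, 60, 75, 120):
        stale = stale_refreshes_per_second(fps, hz)
        print(f"{hz}Hz panel at {fps}FPS -> ~{stale:.0f} repeated refreshes/sec")
[/code]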
So, by MY definition, the 580 won't "max" FC2, and let me make this clear: moving to 8xAA gives a very noticeable improvement in the outline of the large leaves, tires, and other round-ish features.
Is it worth it? If my card could do it, of course I would set it to 8xAA, even 16xAA, but it can't handle it and preserve my personal frame rate target. But I have my doubts if even the new GTX-680 could run FC2 with everything (including 16xAA) at max. Bottom line is AA is very taxing, and if you understand what it is doing, it's easy to see why it takes serious graphics power to pull it off.
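To give a feel for why, here's a rough sizing sketch. The numbers are my own illustrative assumptions (1920x1200, 32-bit color, 32-bit depth/stencil), not measurements from FC2 or any particular card, but the principle holds: with N-sample MSAA the color and depth buffers store N samples per pixel, so the memory, and the bandwidth to fill and resolve it, grows roughly with the sample count:

[code]
# Rough framebuffer sizing: with N-sample MSAA the color and depth buffers
# store N samples per pixel, so memory (and the bandwidth needed to fill
# and resolve it) scales roughly with the sample count.
# Assumed numbers: 1920x1200, RGBA8 color (4 bytes), D24S8 depth (4 bytes).

WIDTH, HEIGHT = 1920, 1200
BYTES_PER_SAMPLE = 4 + 4   # color + depth/stencil per sample

for samples in (1, 4, 8, 16):
    megabytes = WIDTH * HEIGHT * samples * BYTES_PER_SAMPLE / 2**20
    print(f"{samples:>2}x: ~{megabytes:,.0f} MB of multisampled color+depth")
[/code]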
So what does this mean for FC3? No clue. If the reworked Dunia engine is faster and more efficient, maybe the 580 will be enough. The way this stuff tends to go, each new generation of games isn't happy with merely good hardware; it might need the top cards in SLI or Crossfire. If your standards are as high as mine (meaning the word "everything" truly means EVERYTHING), some might be surprised that FC3 could end up needing more than a GTX-580; in fact, history shows it's likely.
I feel safe to say you can expect a tolerable gaming experience with FC3 and a 580. "Maxed", as in can't set anything at all any higher? Doubtful.
You sir, are a graphics snob. I am jealous.
I have an excuse for that. I spent about a dozen years designing high-end graphics cards, though nothing I ever designed would be considered anything but a joke in this day and age. In fact, NVidia was one of the companies that convinced my company to leave that market. Design was no longer based on brainpower and adding features no one else had, but on building cards around the latest offerings from the major graphics chip providers, and doing so cheaper with a reduced time to market.
Sort of a cheaper/faster/better, but realistically missing the "better", because most designs are "canned" from the chip manufacturer; if you stray too far from the reference design and it doesn't work or works poorly, you have no one to blame but yourself. In essence, they took the brains out of introducing new products, and engineers like to design new things, not copy someone else's design template. I know of three "completely different" vendors' products that were shipped out of the exact same Taiwanese factory; their "engineering" was convincing the factory to make sure theirs rolled out first. Now of course most everything is made in China, but it's the same game: build it as fast and as cheap as possible, but try to convince the customer "yours" is superior.
But I never kicked the addiction. There was a time I would meet with early game programmers to try to improve performance, let them know about new capabilities we were offering, and so on. That all ended when the chipset guys started making strides. NVidia killed my graphics career, but I still use their products!
Engineers should have pursued optimization through software rather than hardware (where possible), like Euclideon is doing, but of course you can't squeeze much money out of consumers by simply patenting software... those are the laws of marketing...
That was always the case; the software folks were doing everything they could, while the hardware folks were providing incrementally better platforms on which to run. The real heroes are the driver folks: without adequate hooks into the hardware, new designs were impotent.
Seems like you might be saying new designs were introduced simply to extract more money from the "unfortunate customer"? That would have been news to us as designers; we thought we were taking advantage of new ideas and capabilities to offer better products for everyone, including ourselves.
I do understand part of that perception, though, as each product was entering the market while its improved sibling was already in the works. From the customer's perspective this could look like a never-ending stream of enhanced products, where if you missed one, you were behind the times. This of course is true at some level, but businesses don't stay in business long by waiting for each 2nd, 3rd, or 4th evolution to become mainstream; we were in the pressure-cooker constantly trying to keep up.
Every time you introduced something, you had a few months at best before clones were being produced that offered virtually the same product as yours, but with none of the design investment overhead, which meant it could be sold for a song. Even when we went to custom silicon and thought we were "safe" from the clones, we found out in short order there were "engineers" whose entire purpose in life was to deconstruct your silicon and clone that too.
I believe most people's perception is that engineers would bang something out, then sit back and watch the money roll in. In my experience, just when you were reaching the critical points in a design phase, ready to release to manufacturing, we'd get wind of the latest advancements (internal to the company or not), so it was a balancing act to push out the old while sinking your teeth into the new. There was never any "sitting back", just a constant push forward.
So, the customer's perception that buying something new was buying something already headed for obsolescence was true; we knew it because our designs were already obsolete before manufacturing had a chance to ramp up. The result, taken in the big picture, is that each step provided a clear-cut improvement in the product. We were never able to take advantage of things the way the automotive industry could, for example. They could design new fenders and a front grille, maybe a new shape to the tail lights, and call it "the new year's model". Meanwhile, all of the tech companies were at each other's throats; we were doing everything we could to distance ourselves from the competition in order to capture market share for as long as we could before the clones came in to steal our efforts essentially for "free".
Fast paced, high pressure, always pushing the cutting edge. And just when you thought it was time to take a breather, "the company's future rides on your ability to get this to market in less time than humanly possible". Glad I'm NOT doing that anymore, truth be told.
Lol, you're in for a shock. I only just hold 60 frames with 2x 580s in SLI, and that's without HDAO or MSAA enabled; with HDAO+4xAA it knocks down to 30-40, and adding 8xAA takes me into the 20s, and that's with 99% usage on both cards! The game is very nice looking but uses a lot of techniques that can cripple even high-end cards! Sometimes these games like to be used as benchmarks, so they want high-end tech to struggle; they get a lot of press coverage this way! Look at how well Metro 2033 sold, or S.T.A.L.K.E.R.?! Both ridiculously bad games, but they brought high-end gaming rigs to their knees and still do to this day!
Better get a second card bought, or at least a GTX 680 or ATI 7970, to get a decent frame rate!
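Those numbers make more sense if you convert them to frame times, since render costs add up in milliseconds, not in FPS. Quick Python sketch using my figures above (the per-effect costs are inferred from those figures, so treat them as rough illustrations, not measurements):

[code]
# Convert FPS into frame time (ms): render costs add linearly in frame time,
# not in FPS. Per-effect costs below are inferred from the figures above,
# so treat them as rough illustrations rather than measurements.

def frame_ms(fps):
    return 1000.0 / fps

baseline = frame_ms(60)   # ~16.7 ms, no HDAO or MSAA
hdao_4x  = frame_ms(35)   # ~28.6 ms, HDAO + 4xAA (midpoint of 30-40 FPS)
hdao_8x  = frame_ms(25)   # ~40.0 ms, HDAO + 8xAA ("into the 20s")

print(f"HDAO+4xAA adds ~{hdao_4x - baseline:.1f} ms per frame")
print(f"stepping 4xAA -> 8xAA adds another ~{hdao_8x - hdao_4x:.1f} ms")
[/code]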