
View Full Version : Graphics Card for Unity



Altair1789
09-19-2014, 01:27 AM
My current rig is:

CPU: AMD FX-4300
RAM: 8GB DDR3
The GPU I've had my eye on for a while is the Zotac GeForce GTX 560 Ti
Screen resolution: 1280x1024


I'd like to play AC Unity at medium-high settings with decent fps (30+ish). What graphics card would be my best bet on a $110 budget? Also, I heard of Ubisoft partnering with Nvidia, and this is a noob question, but is using the Zotac GeForce going to give me the benefits of their partnership? As in, what's the difference between Nvidia GeForce and Zotac GeForce? They're not too different, right?

UbiVolfbane
09-19-2014, 09:16 PM
Unfortunately, we do not have any information regarding the system requirements for AC Unity at this time. I ask that you check back closer to release, keep an eye on the official website, and check back regularly here: http://assassinscreed.ubi.com/en-US/home/index.aspx . We'll do our best to keep you updated! Thanks. :)

Altair1789
09-19-2014, 10:23 PM
Unfortunately, we do not have any information regarding the system requirements for AC Unity at this time. I ask that you check back closer to release, keep an eye on the official website, and check back regularly here: http://assassinscreed.ubi.com/en-US/home/index.aspx . We'll do our best to keep you updated! Thanks. :)

Thanks for the information. I know system requirements aren't out, and that means any advice I get would be pure speculation, but that's fine with me (which is why I made the thread).

Megas_Doux
09-19-2014, 10:53 PM
Unfortunately, we do not have any information regarding the system requirements for AC Unity at this time. I ask that you check back closer to release, keep an eye on the official website, and check back regularly here: http://assassinscreed.ubi.com/en-US/home/index.aspx . We'll do our best to keep you updated! Thanks. :)

Thanks bro!

thewhitestig
09-20-2014, 04:32 PM
What graphics card would be my best bet on a $110 budget?

Get a second hand 750 Ti.

BlastThyName
09-20-2014, 06:30 PM
An R7 265 would be a great option as well.

Altair1789
09-20-2014, 06:58 PM
Get a second hand 750 Ti.

Even when playing at 1280x1024 resolution? I can't really find a used 750 Ti in my price range either; any other suggestions?


An R7 265 would be a great option as well.

I kinda want to get an Nvidia card since Ubisoft's been working with Nvidia to make Unity better for people with Nvidia cards

thewhitestig
09-20-2014, 10:05 PM
Even when playing at 1280x1024 resolution? I can't really find a used 750 Ti in my price range either; any other suggestions?



I kinda want to get an Nvidia card since Ubisoft's been working with Nvidia to make Unity better for people with Nvidia cards

Used 650 Ti Boost. I believe you can find those for a lot less money.

Altair1789
09-20-2014, 10:58 PM
Used 650 Ti Boost. I believe you can find those for a lot less money.

This is just a question, but what would be the benefits of buying a model with a higher number? Don't higher-numbered models have support for different things found in games? Like, if I used the Zotac GeForce GTX 560 Ti, would it lack support for things that make certain games run better, which a normal Nvidia GeForce GTX 650 might have? Essentially, would buying an Nvidia card with a higher number be better than the Zotac card, even if the Zotac GeForce 560 Ti is better in a lot of ways? This question might not be that clear because I'm pretty bad at explaining this kind of stuff.

thewhitestig
09-20-2014, 11:40 PM
This is just a question, but what would be the benefits of buying a model with a higher number? Don't higher-numbered models have support for different things found in games? Like, if I used the Zotac GeForce GTX 560 Ti, would it lack support for things that make certain games run better, which a normal Nvidia GeForce GTX 650 might have? Essentially, would buying an Nvidia card with a higher number be better than the Zotac card, even if the Zotac GeForce 560 Ti is better in a lot of ways? This question might not be that clear because I'm pretty bad at explaining this kind of stuff.

The first number (ex. 5xx) is the generation of that product (although sometimes they re-skin old GPUs under new names).
The second number is the class of that product (ex. GTX 280). 7s and 8s tend to be the high-end products, 5s and 6s are mid-range, and anything below that is low end. Of course these numbers do not represent the relative performance of a product across generations. You should always look at benchmarks when comparing different kinds of GPUs, especially GPUs from different generations. Within a single generation though (ex. the 700 series), a higher number always means a faster card. Ti models tend to be an improved (or non-cut) version of the normal model (ex. 660 Ti, 780 Ti). In the past they used a 5 instead of a Ti (ex. GTX 285, GTX 465).
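As a toy illustration of the naming scheme described above (purely illustrative; `decode_model` is a made-up helper, not an official Nvidia tool, and real lineups have exceptions like rebrands):

```python
# Decode an Nvidia-style model number like "GTX 650 Ti" into the parts
# described above: generation digit, tier digit, rough market segment.
def decode_model(number: int, ti: bool = False) -> dict:
    generation = number // 100          # e.g. 6 for the 600 series
    tier = (number // 10) % 10          # e.g. 5 -> mid range
    if tier >= 7:
        segment = "high end"
    elif tier >= 5:
        segment = "mid range"
    else:
        segment = "low end"
    return {"generation": generation, "tier": tier,
            "segment": segment, "improved_variant": ti}

print(decode_model(650, ti=True))
# A GTX 650 Ti: generation 6, tier 5, mid range, improved variant
```

Note the caveat above still applies in code form: the tier digit only orders cards *within* a generation, so you can't compare a 560 and a 650 by name alone.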

About your question of whether it's better to buy an old high-end card instead of a newer lower-end card: I would always go with the newer lower-end card, simply because newer GPUs tend to have lower power consumption/thermal output (usually true, not always) and better driver support.
The 560 Ti and the 650 Ti Boost have pretty close performance numbers; however, the temperatures and the power draw are completely different.
560 Ti - 147W peak, 170W TDP
650 Ti Boost - 104W peak, 134W TDP
I don't have temperature numbers, but judging by the power draw, the 650 Ti Boost would run much cooler, although running low temps is not always particularly important. Lower temperatures do increase the lifespan of the GPU, but most of them are made to withstand temps of up to 100C, and some even 110C, so there isn't much of a problem there.
Here's a benchmark of Crysis to compare the two cards. http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_650_Ti_Boost/images/crysis_1920_1200.gif

Always, I repeat, ALWAYS use gaming benchmarks to compare GPUs. Compare them in different games in order to get a better overall picture of their performance. Sometimes a game will favor AMD over Nvidia or the opposite (ex. Splinter Cell: Blacklist hates AMD GPUs). You can also use synthetics, but they aren't always as reliable as gaming benchmarks; after all, you're buying the GPU to play games, not run synthetics. You can also look at compilation charts that average out the results of many games combined. Here's one: http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_650_Ti_Boost/images/perfrel.gif
The source I'm using for these charts is http://www.techpowerup.com/ . In my opinion it's by far the best source of information about different PC tech, because they go very in-depth in their reviews. Charts, graphs, everything you would want is there. A geek's paradise.
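The "average over many games" idea behind those compilation charts can be sketched in a few lines of Python (the fps numbers below are invented for illustration, and review sites may use a different averaging method):

```python
import math

# Normalize each game's fps to a baseline card, then average the ratios.
# A geometric mean is used so one outlier game can't dominate the result.
baseline = {"Crysis": 40.0, "BF3": 55.0, "Far Cry 3": 38.0}   # card A fps
candidate = {"Crysis": 44.0, "BF3": 60.0, "Far Cry 3": 45.0}  # card B fps

ratios = [candidate[g] / baseline[g] for g in baseline]
geomean = math.prod(ratios) ** (1 / len(ratios))
print(f"Card B is ~{(geomean - 1) * 100:.0f}% faster on average")
```

The point stands either way: a single game's bar chart tells you less than an average across many titles.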

I went on far too long and I might have created more questions than I answered, so feel free to ask anything that's on your mind in order to clear things up. I'll do my best to answer your questions.

Fatal-Feit
09-20-2014, 11:42 PM
The 650 Ti Boost is a decent card. It's great if you don't mind 30 fps in medium settings. You can probably get away with high settings if you turn off anti-aliasing.

Altair1789
09-20-2014, 11:57 PM
The first number (ex. 5xx) is the generation of that product (although sometimes they re-skin old GPUs under new names).
The second number is the class of that product (ex. GTX 280). 7s and 8s tend to be the high-end products, 5s and 6s are mid-range, and anything below that is low end. Of course these numbers do not represent the relative performance of a product across generations. You should always look at benchmarks when comparing different kinds of GPUs, especially GPUs from different generations. Within a single generation though (ex. the 700 series), a higher number always means a faster card. Ti models tend to be an improved (or non-cut) version of the normal model (ex. 660 Ti, 780 Ti). In the past they used a 5 instead of a Ti (ex. GTX 285, GTX 465).

About your question of whether it's better to buy an old high-end card instead of a newer lower-end card: I would always go with the newer lower-end card rather than the older higher-end card, simply because the new GPU would have lower power consumption/thermal output (usually true, not always) and better driver support.
The 560 Ti and the 650 Ti Boost have pretty close performance numbers; however, the temperatures and the power draw are completely different.
560 Ti - 147W peak, 170W TDP
650 Ti Boost - 104W peak, 134W TDP
I don't have temperature numbers, but judging by the power draw, the 650 Ti Boost would run much cooler, although running low temps is not always particularly important. Lower temperatures do increase the lifespan of the GPU, but most of them are made to withstand temps of up to 100C, and some even 110C, so there isn't much of a problem there.
Here's a benchmark of Crysis to compare the two cards. http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_650_Ti_Boost/images/crysis_1920_1200.gif

Always, I repeat, ALWAYS use gaming benchmarks to compare GPUs. Compare them in different games in order to get a better overall picture of their performance. Sometimes a game will favor AMD over Nvidia or the opposite (ex. Splinter Cell: Blacklist hates AMD GPUs). You can also use synthetics, but they aren't always as reliable as gaming benchmarks; after all, you're buying the GPU to play games, not run synthetics. You can also look at compilation charts that average out the results of many games combined. Here's one: http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_650_Ti_Boost/images/perfrel.gif
The source I'm using for these charts is http://www.techpowerup.com/ . In my opinion it's by far the best source of information about different PC tech, because they go into lots of detail with tons of charts.

I went on far too long and I might have created more questions than I answered, so feel free to ask anything that's on your mind in order to clear things up. I'll do my best to answer your questions.

So would an EVGA Nvidia GeForce GTX 650 Ti give me better results than a Zotac GeForce GTX 560 Ti in a game like Unity, where the company favors Nvidia? Right now I can't decide what to get that will let me run Unity with okay settings and fps.


I found a decently priced GeForce GTX 650 Ti Boost 2GB, but I used http://gpuboss.com/ to compare it to the Zotac card and it said the Zotac card was the better option. Should I get the 650 and not trust the website?


The 650 Ti Boost is a decent card. It's great if you don't mind 30 fps at medium settings. You can probably get away with high settings if you turn off anti-aliasing.


Even if I run the game at 1280x1024?

thewhitestig
09-21-2014, 12:24 AM
So would an EVGA Nvidia GeForce GTX 650 Ti give me better results than a Zotac GeForce GTX 560 Ti in a game like Unity, where the company favors Nvidia? Right now I can't decide what to get that will let me run Unity with okay settings and fps.


I found a decently priced GeForce GTX 650 Ti Boost 2GB, but I used http://gpuboss.com/ to compare it to the Zotac card and it said the Zotac card was the better option. Should I get the 650 and not trust the website?




Even if I run the game at 1280x1024?

Well, first of all, their Passmark score is wrong. They're confusing the Boost version with the non-Boost version, as is evident here: http://www.videocardbenchmark.net/compare.php?cmp[]=2177&cmp[]=2479&cmp[]=18 Secondly, they don't have enough benchmarks included to give a proper assessment. Don't trust that site. Always do your own research.

http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_650_Ti_Boost/images/bf3_1920_1200.gifhttp://tpucdn.com/reviews/NVIDIA/GeForce_GTX_650_Ti_Boost/images/farcry3_1920_1200.gifhttp://tpucdn.com/reviews/NVIDIA/GeForce_GTX_650_Ti_Boost/images/ac3_1920_1200.gifhttp://tpucdn.com/reviews/NVIDIA/GeForce_GTX_650_Ti_Boost/images/cod_bo2_1920_1200.gif

The 650 Ti Boost is undoubtedly the faster card. Plus it has 2 gigs of memory instead of 1. In the past this didn't matter much, because games didn't max out the memory anyway, so vram amount was barely worth talking about. But with games now becoming very memory intensive, those 2 gigs will make a difference, because they'll let you turn the settings up a little higher without stuttering. If your game needs more vram than your card actually has, the game engine starts using system memory and the hard drive page file, which are both horrendously slow. That low memory bandwidth causes microstuttering, which basically means frames stay on screen longer than they should, or the screen outright freezes. This happens all the time in Watch Dogs when you crank up the settings on a card that does not have enough vram.
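A rough back-of-the-envelope version of that vram argument (all numbers here are invented for illustration; real games track memory use far more precisely):

```python
# Sketch of the vram-overflow problem: once the working set exceeds the
# card's memory, the excess spills to system RAM / page file -> stutter.
CARD_VRAM_MB = 2048  # e.g. a 2 GB 650 Ti Boost

def fits_in_vram(framebuffers_mb: float, textures_mb: float,
                 meshes_mb: float) -> bool:
    needed = framebuffers_mb + textures_mb + meshes_mb
    return needed <= CARD_VRAM_MB

# At 1280x1024 with 32-bit color, one buffer is 1280*1024*4 bytes = 5 MB,
# so even triple buffering is cheap; textures are what dominate vram.
fb = 1280 * 1024 * 4 / 2**20 * 3   # three buffers, 15 MB total

print(fits_in_vram(fb, textures_mb=1500, meshes_mb=300))  # True: fits in 2 GB
print(fits_in_vram(fb, textures_mb=2200, meshes_mb=300))  # False: spills, stutters
```

This also shows why the low 1280x1024 resolution helps: the framebuffers themselves are tiny, so it's the texture setting that decides whether the card overflows.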

Long story short, go for the 650 Ti Boost. It's better.

Fatal-Feit
09-21-2014, 01:15 AM
Even if I run the game at 1280x1024?

I used to own the card and played at 1080p. You should have fewer problems at your resolution.

Altair1789
09-21-2014, 01:22 AM
Well, first of all, their Passmark score is wrong. They're confusing the Boost version with the non-Boost version, as is evident here: http://www.videocardbenchmark.net/compare.php?cmp[]=2177&cmp[]=2479&cmp[]=18 Secondly, they don't have enough benchmarks included to give a proper assessment. Don't trust that site. Always do your own research.

http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_650_Ti_Boost/images/bf3_1920_1200.gifhttp://tpucdn.com/reviews/NVIDIA/GeForce_GTX_650_Ti_Boost/images/farcry3_1920_1200.gifhttp://tpucdn.com/reviews/NVIDIA/GeForce_GTX_650_Ti_Boost/images/ac3_1920_1200.gifhttp://tpucdn.com/reviews/NVIDIA/GeForce_GTX_650_Ti_Boost/images/cod_bo2_1920_1200.gif

The 650 Ti Boost is undoubtedly the faster card. Plus it has 2 gigs of memory instead of 1. In the past this didn't matter much, because games didn't max out the memory anyway, so vram amount was barely worth talking about. But with games now becoming very memory intensive, those 2 gigs will make a difference, because they'll let you turn the settings up a little higher without stuttering. If your game needs more vram than your card actually has, the game engine starts using system memory and the hard drive page file, which are both horrendously slow. That low memory bandwidth causes microstuttering, which basically means frames stay on screen longer than they should, or the screen outright freezes. This happens all the time in Watch Dogs when you crank up the settings on a card that does not have enough vram.

Long story short, go for the 650 Ti Boost. It's better.


I used to own the card and played at 1080p. You should have fewer problems at your resolution.

Alright, thanks for all the help, everyone. I'm going to get the 650 Ti *boost

thewhitestig
09-21-2014, 01:32 AM
Alright, thanks for all the help, everyone. I'm going to get the 650 Ti

*Boost

Fatal-Feit
09-21-2014, 01:46 AM
Make sure it's the 650 Ti *Boost.

-----

You are welcome. :o

Altair1789
09-21-2014, 05:57 PM
This doesn't have anything to do with AC Unity, but what power supply is needed for an MSI GeForce GTX 650 Ti Boost Overclocked? I googled it but couldn't find anything :/

Fatal-Feit
09-21-2014, 06:23 PM
A 430W 80 Plus Bronze PSU should be enough.

thewhitestig
09-21-2014, 06:25 PM
This doesn't have anything to do with AC Unity, but what power supply is needed for an MSI GeForce GTX 650 Ti Boost Overclocked? I googled it but couldn't find anything :/

450-500W would be enough.
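For anyone curious where numbers like that come from, here's a rough sizing sketch (the `recommend_psu_watts` helper and its margins are my own illustrative assumptions, not a vendor formula; always check the card maker's recommendation too):

```python
# Sum the component TDPs, add margin for efficiency losses, capacitor
# aging, and transient spikes, then round up to a common PSU size.
def recommend_psu_watts(cpu_tdp: int, gpu_tdp: int,
                        other: int = 75, headroom: float = 1.3) -> int:
    """other covers board, RAM, drives, fans; headroom is the safety margin."""
    total = (cpu_tdp + gpu_tdp + other) * headroom
    return int(-(-total // 50) * 50)   # ceil to the nearest 50 W

# FX-4300 (~95W TDP) + 650 Ti Boost (~134W TDP):
print(recommend_psu_watts(95, 134))   # 400
```

That lands in the same ballpark as the 430-500W figures above; the spread comes from how much headroom you want for overclocks or a future upgrade.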

strigoi1958
09-21-2014, 09:50 PM
Sorry for hijacking this thread, but a question for the graphics wizards out there... I'm going to buy a GTX 980 (or a 970, which is quite close when factory overclocked, at 1920x1080), hopefully with Unity or Far Cry 4 bundled... both would be nice :-).

But I'm a bit confused.

The GTX 780 and the R9 290X seem to have more of (what I considered) the important points: more transistors, processors, texture units, and the two I thought most important, memory bus width and max bandwidth... :confused: I thought that 256-bit bus was a bottleneck for vram, meaning more than 3GB was overkill,
but the 980 appears to outperform the 780 and the 290X, and I'm not sure why??? Is it to do with the number of ROP units?

Spikey1989
09-21-2014, 10:10 PM
Sorry for hijacking this thread, but a question for the graphics wizards out there... I'm going to buy a GTX 980 (or a 970, which is quite close when factory overclocked, at 1920x1080), hopefully with Unity or Far Cry 4 bundled... both would be nice :-).

But I'm a bit confused.

The GTX 780 and the R9 290X seem to have more of (what I considered) the important points: more transistors, processors, texture units, and the two I thought most important, memory bus width and max bandwidth... :confused: I thought that 256-bit bus was a bottleneck for vram, meaning more than 3GB was overkill,
but the 980 appears to outperform the 780 and the 290X, and I'm not sure why??? Is it to do with the number of ROP units?

The 980 is close to 2x faster than the 680. The 980 uses Maxwell (GPU: GM204), while the 780 uses Kepler (GPU: GK110). Maxwell is faster with MFAA (a new AA mode), and there are also many new features on the 980.

The 970 has somewhat lower specs but runs on the same GPU.
The 290X I can't really compare it to; I haven't read up much on it :/

Hope this helps.

/Spikey

thewhitestig
09-21-2014, 10:19 PM
Sorry for hijacking this thread, but a question for the graphics wizards out there... I'm going to buy a GTX 980 (or a 970, which is quite close when factory overclocked, at 1920x1080), hopefully with Unity or Far Cry 4 bundled... both would be nice :-).

But I'm a bit confused.

The GTX 780 and the R9 290X seem to have more of (what I considered) the important points: more transistors, processors, texture units, and the two I thought most important, memory bus width and max bandwidth... :confused: I thought that 256-bit bus was a bottleneck for vram, meaning more than 3GB was overkill,
but the 980 appears to outperform the 780 and the 290X, and I'm not sure why??? Is it to do with the number of ROP units?

Those things you mentioned are important, but at the end of the day you measure a video card by its real-world performance. Nvidia has managed to create a very efficient architecture that not only consumes less electricity but is also able to do more computation with what at first might seem like less capable hardware. But it's not less capable. It's definitely not.
http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_980/images/watchdogs_1920_1080.gifhttp://tpucdn.com/reviews/NVIDIA/GeForce_GTX_980/images/farcry3_1920_1080.gif
http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_980/images/ac4_2560_1600.gifhttp://tpucdn.com/reviews/NVIDIA/GeForce_GTX_980/images/bf4_1920_1080.gif

strigoi1958
09-21-2014, 10:58 PM
Thanks, Spikey and Stig. So although it has less, it works faster... hmm, the Palit 970 JetStream looks good... and it offers a 5% increase over the reference card.

Altair1789
09-22-2014, 12:33 AM
Alright, so what about the fact that my motherboard is pcie 2.0 and the card is 3.0, will that be a problem?

YazX_
09-22-2014, 11:48 AM
Alright, so what about the fact that my motherboard is pcie 2.0 and the card is 3.0, will that be a problem?

No, it's backward compatible, but you will be bottlenecking your card, since PCI-E 3.0 doubles the bandwidth of PCI-E 2.0.

thewhitestig
09-22-2014, 12:30 PM
No, it's backward compatible, but you will be bottlenecking your card, since PCI-E 3.0 doubles the bandwidth of PCI-E 2.0.

He won't be bottlenecking anything, since most graphics cards today do not use the full bandwidth of PCI-E 3.0. A GTX 680 does not take advantage of PCI-E 3.0: http://www.hardocp.com/article/2012/07/18/pci_express_20_vs_30_gpu_gaming_performance_review#.VCAHNvmSxMA And he will be running a 650 Ti Boost, which has only a fraction of the performance of a 680, so PCI-E 2.0 would not be a problem. Not at all.

http://www.hardocp.com/images/articles/1341219566Ew8yr7oTVd_8_2_l.gif
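For reference, the spec-sheet arithmetic behind the "doubles the bandwidth" claim (a sketch; real-world throughput comes in a bit lower than these theoretical per-direction figures):

```python
# PCI-E link bandwidth per direction: transfer rate per lane times the
# line-coding efficiency (2.0 uses 8b/10b, 3.0 uses 128b/130b).
def pcie_bandwidth_gbs(gen: int, lanes: int = 16) -> float:
    specs = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}  # (GT/s per lane, efficiency)
    gt_s, eff = specs[gen]
    return gt_s * eff * lanes / 8   # divide by 8: bits -> bytes, in GB/s

print(pcie_bandwidth_gbs(2))   # 8.0 GB/s
print(pcie_bandwidth_gbs(3))   # ~15.75 GB/s
```

So 3.0 x16 is roughly double 2.0 x16 on paper; the benchmark linked above shows why that spare headroom goes unused by a single mid-range card.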

YazX_
09-22-2014, 02:05 PM
He won't be bottlenecking anything, since most graphics cards today do not use the full bandwidth of PCI-E 3.0. A GTX 680 does not take advantage of PCI-E 3.0: http://www.hardocp.com/article/2012/07/18/pci_express_20_vs_30_gpu_gaming_performance_review#.VCAHNvmSxMA And he will be running a 650 Ti Boost, which has only a fraction of the performance of a 680, so PCI-E 2.0 would not be a problem. Not at all.


Well, I hadn't noticed that he'll be getting a 650 Ti Boost, maybe it's on a previous page or something, but in that case you're right, it won't bottleneck it. But I'll tell you something from real experience: I upgraded to a GTX 770 a while back running on PCI-E 2.0. That card is heavily overclocked, running at 1280 MHz core and 7.5 GHz memory. When I upgraded my system to PCI-E 3.0, I noticed a 15-20% performance boost. Then I went SLI; it would have been a huge bottleneck if I had stayed on PCI-E 2.0, since each card runs at x8, and PCI-E 2.0 x8 is like PCI-E 3.0 x4.

So I would say it depends on the game and the setup you have. With new and more powerful GPUs, I believe they can saturate all PCI-E 3.0 x16 lanes.

Altair1789
09-22-2014, 08:03 PM
Great, thanks for helping a computer noob like me


(edit) Alright, probably my last question: is http://www.newegg.com/Product/Product.aspx?Item=N82E16814487024 <- this card worth getting over the GTX 650 Ti Boost (if I get the money together before the cheap ones sell out), and if so, will it have problems with my current setup and the PCI-E 2.0 slots?

playlisting
09-23-2014, 05:09 PM
Thanks, Spikey and Stig. So although it has less, it works faster... hmm, the Palit 970 JetStream looks good... and it offers a 5% increase over the reference card.

I'd recommend waiting for the 8GB versions to come out before buying, unless you upgrade annually. Games are requiring more and more VRAM (just look at Watch_Dogs). 8GB may seem like huge overkill now, but in 3-4 years' time, when 4GB users are limited by their VRAM, we 8GB users will still have loads of breathing room.

Fatal-Feit
09-23-2014, 05:20 PM
I'd recommend waiting for the 8GB versions to come out before buying, unless you upgrade annually. Games are requiring more and more VRAM (just look at Watch_Dogs). 8GB may seem like huge overkill now, but in 3-4 years' time, when 4GB users are limited by their VRAM, we 8GB users will still have loads of breathing room.

Good point, but in 3-4 years he will need to upgrade his card anyway. I don't think the 970 will support more than a few years of high settings that require more than 2-4GB of VRAM.

thewhitestig
09-23-2014, 06:54 PM
Great, thanks for helping a computer noob like me


(edit) Alright, probably my last question: is http://www.newegg.com/Product/Product.aspx?Item=N82E16814487024 <- this card worth getting over the GTX 650 Ti Boost (if I get the money together before the cheap ones sell out), and if so, will it have problems with my current setup and the PCI-E 2.0 slots?

Their performance is about the same. But buying new from the store is always better than buying used, because you'll have a warranty and such. That said, going AMD in this specific price bracket would be a better decision, because they'll give you more performance for a lower price. Yes, you lose out on the Nvidia-exclusive features (in Ubi games) like TXAA, but you won't be using those anyway on a mid-range card, so it's all good. Here's what I'm talking about: http://www.newegg.com/Product/Product.aspx?Item=N82E16814150719

And here are some benches:
http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_750_Ti/images/ac4_1920_1080.gifhttp://tpucdn.com/reviews/NVIDIA/GeForce_GTX_750_Ti/images/bf4_1920_1080.gifhttp://tpucdn.com/reviews/NVIDIA/GeForce_GTX_750_Ti/images/farcry3_1920_1080.gif


I'd recommend waiting for the 8GB versions to come out before buying, unless you upgrade annually. Games are rapidly requiring more VRAM (just look at Watch_Dogs). The 8GB may seem like huge overkill now, but in 3-4 years time when 4GB users are limited by their VRAM, us 8GB users will still have loads of breathing room.

The 8-gig versions may or may not come; there hasn't been any official confirmation yet, only rumors. And by the time games start using 8GB of vram, the 970 would not be able to play those games at the settings that require such high amounts of vram anyway. Or he could always go AMD, who do put more vram on their boards. They're announcing new products on the 25th, so you should all keep an eye on that. But he would miss out on the exclusive Nvidia features that seem to be in every one of the new Ubisoft games.

By the way, AMD's gaming scientist (which is basically a made-up title, lol) Richard Huddy said that in a few years 16GB will become the standard for video cards. So there might be truth to what you said; they're probably expecting vram requirements for next-gen ports to blast past 8GB. But still, any GPU from today would probably not be able to play a game from 2018 (one with an 8GB requirement for ultra settings) at ultra settings.

Altair1789
09-23-2014, 08:04 PM
So, the only thing that makes an Nvidia card worth using with Ubisoft games is the extra super-high settings? This may sound weird, but they don't have any special drivers or anything like that?

thewhitestig
09-23-2014, 08:19 PM
So, the only thing that makes an Nvidia card worth using with Ubisoft games is the extra super-high settings? This may sound weird, but they don't have any special drivers or anything like that?

Some people might argue that Nvidia has better-optimized drivers for its partners (like Ubisoft), but benchmarks prove otherwise. There is no real difference between AMD and Nvidia in terms of performance; AMD GPUs at a particular price point perform exactly as they should. Watch Dogs, which was another Nvidia-partnered game, actually performed better (or equal) on an R9 290X than on a GTX 780 Ti due to the higher amount of vram, even though the 780 Ti has more raw horsepower. Of course there's one exception, Splinter Cell: Blacklist, which for some reason hates AMD GPUs. But that's an exception to the rule. There is also other stuff to look at, like frametimes and frame variance, but those things don't differ either. So yeah, the advantage of Nvidia GPUs in Ubi games is that you get stuff like PhysX, TXAA, different GameWorks features, PCSS (soft shadows for Far Cry 4) and so on. But the visual difference those things make is negligible at best. TXAA is actually worse than MSAA because it blurs the image way too much, even though it handles things like ropes better (on the ships of AC4). But I would personally take MSAA over TXAA any day, because MSAA gives me that clear, unblurred look.

Fatal-Feit
09-23-2014, 08:21 PM
So, the only thing that makes an Nvidia card worth using with Ubisoft games is the extra super-high settings? This may sound weird, but they don't have any special drivers or anything like that?

Well, Nvidia also gets longer hands-on time with the product. That means these games are usually better optimized for their cards, e.g. Watch_Dogs.

Regarding special stuff, they do: they have features like PhysX particles and TXAA.

Fatal-Feit
09-23-2014, 08:24 PM
Some people might argue that Nvidia has better-optimized drivers for its partners (like Ubisoft), but benchmarks prove otherwise. There is no real difference between AMD and Nvidia in terms of performance; AMD GPUs at a particular price point perform exactly as they should. Watch Dogs, which was another Nvidia-partnered game, actually performed better (or equal) on an R9 290X than on a GTX 780 Ti due to the higher amount of vram, even though the 780 Ti has more raw horsepower. Of course there's one exception, Splinter Cell: Blacklist, which for some reason hates AMD GPUs. But that's an exception to the rule. There is also other stuff to look at, like frametimes and frame variance, but those things don't differ either.

Watch_Dogs was poorly optimized for AMD cards. I believe they've patched it by now, but there was an outrage during launch.


https://www.youtube.com/watch?v=Oxtel5strVs

Altair1789
09-23-2014, 08:52 PM
Hmm... I might just play it safe and go with Nvidia, fearing Ubisoft might do the same thing with Unity. It'll probably just come down to how risky I'm feeling the day I order a card, or to when they release system requirements / any info about how long AMD has had with the game. But does that only matter for newly released cards, or will a card that came out before the game see a difference depending on how long a company has had the game?

thewhitestig
09-23-2014, 09:00 PM
Watch_Dogs was poorly optimized for AMD cards. I believe they've patched it by now, but there was an outrage during launch.

This was actually when Nvidia had a beta driver prepared and AMD had no driver. After AMD released a driver, everything went fine. Obviously Watch Dogs after release was not fine at all, but for all intents and purposes I would call it fine, because both vendors were suffering from the same stuttering issues, so it evens out. AMD less than Nvidia though, especially on Ultra textures. Anyway, take a look at this.
http://www.hardocp.com/images/articles/14083370007cx2pU3ZI8_8_2_l.gif

The R9 290 and the 780 usually trade blows in benchmarks, but in the case of Watch Dogs the AMD card outperforms the Nvidia one. Plus it has better frame consistency; look at the minimum fps on the 780. So really, it's very debatable whether Nvidia has any optimization advantage over AMD. Yes, they may release a driver 3 days earlier, but... whatevs, man... 3 days is nothing compared to the lifespan of the game.


Hmm... I might just play it safe and go with Nvidia, fearing Ubisoft might do the same thing with Unity. It'll probably just come down to how risky I'm feeling the day I order a card, or to when they release system requirements / any info about how long AMD has had with the game. But does that only matter for newly released cards, or will a card that came out before the game see a difference depending on how long a company has had the game?

I didn't quite understand your question. I believe the answer is no, but please try to rephrase it a little. Usually it doesn't matter whether your GPU just got released or is a year-old model; the driver package contains optimizations for both new and old GPUs. The very old ones, though, do not get much attention: stuff like the AMD 4000/5000 series and the GTX 200/400s. Forward compatibility is only an issue with low-level APIs like Mantle. The latest R9 285, for example, could not play BF4 under Mantle at a playable framerate because the drivers had not yet been optimized for the Tonga GPU. That might be an issue with DX12 too: a new GPU gets released and all the old games simply don't work with it, because the optimization responsibility is more or less moved from the GPU vendor to the game developer. And since game developers don't have the resources to constantly update their games to work with new GPUs, some old games might be left broken. Don't forget that I'm talking about low-level APIs like Mantle/DX12 here. DX11 does not have that forward-compatibility issue, because it has a very thick layer of abstraction that does not allow developers to "hard code" to the GPU, and the optimization responsibility falls more or less on the GPU vendor rather than the game developer.

Altair1789
09-23-2014, 09:22 PM
Oh, OK, I understand now. I thought the difference in performance when a company gets time with the game versus when it doesn't (like AMD vs Nvidia for Watch_Dogs) came down to the actual card; I didn't know about the drivers they release.

thewhitestig
09-23-2014, 09:29 PM
Oh, OK, I understand now. I thought the difference in performance when a company gets time with the game versus when it doesn't (like AMD vs Nvidia for Watch_Dogs) came down to the actual card; I didn't know about the drivers they release.

Well, there is a difference, because Nvidia did release a Watch Dogs-ready driver before the game's release date, so they had an edge at launch. But after everything settled down, both vendors had optimized drivers. But you know how it is. Both of these companies have their partnerships and they advertise themselves like crazy, "Nvidia: The Way It's Meant to Be Played", "AMD Gaming Evolved", even though in most cases there is no difference in performance. Or when there is, it's negligible.

strigoi1958
09-23-2014, 11:30 PM
I'd recommend waiting for the 8GB versions to come out before buying, unless you upgrade annually. Games are rapidly requiring more VRAM (just look at Watch_Dogs). The 8GB may seem like huge overkill now, but in 3-4 years time when 4GB users are limited by their VRAM, us 8GB users will still have loads of breathing room.

I tend to change my card after 2 to 2.5 years and then the whole system after another year and a bit.... I've got a Palit GTX 670 Jetstream which runs games brilliantly, but it doesn't have DSR or VXGI, and an old i5 3570K which was shipped at 4.5 GHz (I quickly set that back to the stock 3.4 as the O/C offered me nothing), and my SSD drive is constantly getting close to full, so maybe a new system might be in order.... :D
On a side note, even if Watch Dogs could utilize 80GB of VRAM it still wouldn't have made my top 50 games ;) but the potential exists for the next games in the series to be brilliant.

And as technology marches on I never like to wait....

Fatal-Feit
09-23-2014, 11:51 PM
Well, there is a difference cause Nvidia did release a Watch Dogs ready driver before the release date of the game. So they had an edge on release. But after everything settled down, both vendors had optimized drivers. But you know how it is. Both of these companies have their partnerships and they advertise themselves like crazy "Nvidia the way it's meant to be played" "AMD Gaming Evlolved", even though in most cases there is no difference in performance. Or when there is, it's negligible.

Both AMD and Nvidia reminds me a lot of Playstation and Xbox. Well, last-gen, at least.


This was actually because Nvidia had a beta driver prepared and AMD had no driver. After AMD released a driver, everything went fine. Obviously Watch Dogs after release was not fine at all, but for all intents and purposes I would call it fine, because both vendors were suffering from the same stuttering issues, so it evens out. AMD less than Nvidia, though, especially on Ultra textures. Anyway, take a look at this.
http://www.hardocp.com/images/articles/14083370007cx2pU3ZI8_8_2_l.gif

The R9 290 and the 780 usually trade blows in benchmarks, but in the case of Watch Dogs the AMD card outperforms the Nvidia one. Plus it has better frame consistency; look at the minimum fps on the 780. So really, it's very debatable whether Nvidia has any optimization advantage over AMD. Yes, they may release a driver 3 days earlier, but... whatevs man... 3 days is nothing compared to the lifespan of the game.

That's strange. In all the other benchmarks and videos I've seen from websites and channels (after the patch), the Nvidia cards are usually ahead of AMD. Same after a quick Google search.

That said, I agree about AMD's consistency with its frames. It's also apparent in the video. A stable lower FPS is arguably better than an unstable high FPS.
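The frame-consistency point can be made concrete with a quick sketch. The frame-time numbers below are made up purely for illustration (not from any benchmark): the second card renders most frames faster, yet its worst frame is far slower, which is exactly what a high average with a poor minimum FPS looks like.

```python
# Hypothetical per-frame render times in milliseconds -- illustrative only.
def fps_stats(frame_times_ms):
    """Return (average FPS, worst-case FPS) from a list of frame times."""
    avg = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    worst = 1000 / max(frame_times_ms)
    return avg, worst

steady = [20, 21, 20, 22, 20, 21]   # consistent ~48 FPS
spiky = [8, 9, 8, 9, 8, 60]         # higher average, but one bad stutter

avg_a, min_a = fps_stats(steady)    # ~48 avg, ~45 minimum
avg_b, min_b = fps_stats(spiky)     # ~59 avg, ~17 minimum
```

The "spiky" card wins on average FPS yet feels worse in play, which is why reviewers report minimums and frame-time plots alongside averages.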

BlastThyName
09-24-2014, 10:27 PM
I'm sure the GPU brand won't make much of a difference. Hardware-wise AMD and Nvidia are evenly matched; the drivers will differentiate the two.

Altair1789
09-25-2014, 12:40 AM
So an EVGA GeForce GTX 750 Ti won't be bottlenecked by that much in a PCIe 2.0 slot, right?

Yozziee
09-25-2014, 04:14 AM
The difference between PCIe 2.0 and 3.0 is so minimal it won't affect your FPS one bit; you may just get a slightly lower score in 3DMark by a few points.
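For anyone curious why the gap barely matters in practice, the sketch below works out the theoretical x16 slot bandwidth from the published PCIe per-lane rates (5 GT/s with 8b/10b encoding for 2.0, 8 GT/s with 128b/130b for 3.0). A mid-range card like the 750 Ti rarely comes close to saturating even the 2.0 figure.

```python
LANES = 16  # a full-length x16 slot

# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding -> 500 MB/s usable per lane
pcie2_gbs = 5e9 * (8 / 10) / 8 * LANES / 1e9    # ~8.0 GB/s for x16

# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding -> ~985 MB/s usable per lane
pcie3_gbs = 8e9 * (128 / 130) / 8 * LANES / 1e9  # ~15.75 GB/s for x16
```

So 3.0 roughly doubles the pipe, but a card that only moves a GB/s or two across the bus won't notice the difference.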

BlastThyName
09-25-2014, 10:57 AM
So an EVGA GeForce gtx 750 Ti won't be bottlenecked by that much in a PCIE 2.0 slot, right?
No worries to have.

Altair1789
09-25-2014, 04:56 PM
Awesome, thanks everybody

Altair1789
09-26-2014, 03:21 AM
Alright, so one last question about whether or not this card will fit in my case. This computer has the same case as mine: http://www.amazon.com/gp/aw/d/B00HCZJ7GO/ref=redir_mdp_mobile?ref_=twister_B00HNENNR8 and the motherboard is the MSI 760GMA-P34. I looked at my graphics card and saw that there wasn't much space between the somewhat thin card and another round thingy, so can anyone verify that the card won't be too fat to fit? Thanks

thewhitestig
09-26-2014, 09:49 AM
Alright so one last question about whether or not this card will fit in my case- this computer has the same case as mine: http://www.amazon.com/gp/aw/d/B00HCZJ7GO/ref=redir_mdp_mobile?ref_=twister_B00HNENNR8 the motherboard is the MSI 760GMA-P34. I looked at my graphics card and saw that there wasn't much space between the somewhat thin card and another round thingy. so can anyone verify that the card wont be too fat to fit? Thanks

You should measure the clearance of your case. But even the longest 750 Tis do not exceed 25-26cm. You could also get a shorter version with a less expensive cooler if your case does not support long video cards. This one would fit in any case.
http://images10.newegg.com/ProductImageCompressAll300/14-487-025-16.jpg

AherasSTRG
09-26-2014, 10:09 AM
May I bump in with some info on the VRAM issue?

Shadow of Mordor suggests 3GB of VRAM for High textures and 6GB of VRAM for Ultra textures. So a high-VRAM card is highly future-proof.

And before anyone butts in saying how the "next-gen" consoles brought about these requirements, I will also add that High and Ultra textures are PC-exclusive features. The console versions utilise low-medium textures, which can easily be replicated by a 2GB mid-range video card.

thewhitestig
09-26-2014, 10:16 AM
May I bump in with some info on the VRAM issue?

Shadow of Mordor suggests 3GBs of VRAM for High Textures and 6GBs of VRAM for Ultra Textures.

6GB for Ultra? Wow, let's hope these textures are worth it, because the overall look of the game is certainly not impressive at all. Or at least not impressive compared to other next-gen games in its class like Assassin's Creed Unity, Dragon Age: Inquisition and The Witcher 3.

AherasSTRG
09-26-2014, 10:22 AM
6GB for Ultra? wow, let's hope these textures are worth it, cause the overall look of the game is certainly not impressive at all. Or at least not impressive compared to other next gen games in it's class like Assassin's Creed Unity, Dragon Age: Inquisition and The Witcher 3.

I think the Ultra Textures are meant for those that want to play the game at 1440p and above.

Also, agreed. All the aforementioned games look much more impressive graphically. However, I bet all of the above will run at 30 FPS (or lower) on consoles. In Mordor, however, it seems like the framerate is going to be unlocked. Waiting for the Digital Foundry analysis.

strigoi1958
09-26-2014, 09:52 PM
Won't DSR give us an image somewhere in between 1080p and 4K? I'm reluctant to go to 4K at present because of cost and FPS, but will DSR be like 1440p?

Altair1789
09-26-2014, 10:40 PM
You should measure the clearance of your case.But even the longest 750 Ti's do not exceed 25-26cm. You could also get a shorter version with a less expensive cooler if your case does not support long videocards. This one would fit in any case.
http://images10.newegg.com/ProductImageCompressAll300/14-487-025-16.jpg

I'm getting this one: http://www.amazon.com/gp/product/B00IDG3IDO/ref=olp_product_details?ie=UTF8&me= , isn't it the same?

thewhitestig
09-26-2014, 10:50 PM
I'm getting this one: http://www.amazon.com/gp/product/B00IDG3IDO/ref=olp_product_details?ie=UTF8&me= , isn't it the same?

It'll fit. This is a very tiny card. There are even low profile versions of the 750 Ti.

Altair1789
09-27-2014, 03:12 AM
It is done, I purchased the card. Thank you to everyone who helped me with all my questions :)

thewhitestig
09-27-2014, 11:37 AM
It is done, I purchased the card. Thank you to everyone who helped me with all my questions :)

Now don't forget to update us on how well it performs. :cool:

BlastThyName
09-27-2014, 03:32 PM
And before anyone butts in saying how the "next-gen" consoles brought about these requirements, I will also add that High and Ultra textures are PC-exclusive features. The console versions utilise low-medium textures
Source please. High requires 3GB which is within reach of the PS4.

thewhitestig
09-27-2014, 05:39 PM
Source please. High requires 3GB which is within reach of the PS4.

http://www.neogaf.com/forum/showthread.php?t=901841

BlastThyName
09-27-2014, 06:00 PM
http://www.neogaf.com/forum/showthread.php?t=901841
I knew about the 6GB for ultra but I was asking specifically about this part :


I will also add that High and Ultra textures are PC-exclusive features. The console versions utilise low-medium textures
Source, please.
Where did you read that ? How do you know what settings consoles use ?

thewhitestig
09-27-2014, 06:44 PM
How do you know what settings consoles use ?

I'm also interested to see Aheradrim put up a source, but considering how the differences in texture quality between the PS4, XB1 and the PC in Watch Dogs all come down to resolution, I'm inclined to believe that the consoles can do High textures in Shadow of Mordor.

XB1: http://images.eurogamer.net/2013/articles//a/1/6/8/0/8/0/5/XO_014.jpg.jpg
PS4: http://images.eurogamer.net/2013/articles//a/1/6/8/0/8/0/5/PS4_014.jpg.jpg
PC: http://images.eurogamer.net/2013/articles//a/1/6/8/0/8/0/5/PC_014.jpg.jpg

strigoi1958
09-28-2014, 12:40 AM
I'd recommend waiting for the 8GB versions to come out before buying, unless you upgrade annually. Games are rapidly requiring more VRAM (just look at Watch_Dogs). The 8GB may seem like huge overkill now, but in 3-4 years time when 4GB users are limited by their VRAM, us 8GB users will still have loads of breathing room.

I've read that DX12 might include shared tile streaming, where much more texture data is stored in your system RAM and streamed quickly to the video RAM... if this is effective, it probably means cards with smaller VRAM would still be a good choice.
I wonder if it works like a RAMDISK?

YazX_
09-28-2014, 01:23 AM
I've read that DX 12 might include shared tile streaming where much more texture data is stored in your system ram and streamed quickly to the video ram... if this is effective it probably means cards with smaller vram would still be a good choice.
I wonder if it works like a RAMDISK ?

This is already available in DX 11.2, which is exclusive to Windows 8.1, and it's called Tiled Resources. DX12 builds on that and introduces Volume Tiled Resources. The only game I know of that uses Tiled Resources is Ryse: Son of Rome on the Xbox One; it's coming to PC on October 10th, and I'm not sure if it will use it on PC.

Here is the DX 11.2 Tiled Resources demo:

http://channel9.msdn.com/Blogs/Windows-Blog/Tiled-Resources-for-DirectX-in-Windows-81

Fatal-Feit
09-28-2014, 07:19 AM
I'm also interested to see Aheradrim put up a source, but considering how the the differences in texture quality between the PS4, XB1 and the PC in Watch Dogs all come down to resolution, I'm inclined to believe that the consoles can do High textures on Shadow of Mordor.

XB1: http://images.eurogamer.net/2013/articles//a/1/6/8/0/8/0/5/XO_014.jpg.jpg
PS4: http://images.eurogamer.net/2013/articles//a/1/6/8/0/8/0/5/PS4_014.jpg.jpg
PC: http://images.eurogamer.net/2013/articles//a/1/6/8/0/8/0/5/PC_014.jpg.jpg

According to Digital Foundry and one of Watch_Dogs' developers on Twitter, the consoles' textures are equivalent to the High setting. Ultra is PC-only.

The biggest difference is in the water, apparently.

strigoi1958
09-28-2014, 10:50 AM
this is already available in Dx 11.2 which is exclusive to windows 8.1 and its called Tiled Resources, DX 12 builds on that and introduces Volume Tiled Resources, the only game i know that uses Tiled Resources is Ryse: Son of Rome on Xbone and coming to PC on October 10th, and i'm not sure if it will use it on PC.

here is the Dx 11.2 Tiled Resources Demo:

http://channel9.msdn.com/Blogs/Windows-Blog/Tiled-Resources-for-DirectX-in-Windows-81

Oh dear, so it looks like I'm going to have to buy a card with more VRAM, as I'd rather cut my arm off with a blunt rusty knife than use Windows 8.1 :D I wish Microsoft would just let it die.... They made a mistake trying to take on Android, then made another trying to adapt the software for PC in order to convince us to buy Windows-based mobiles, and now they're adding features people want to 8.1 but excluding Windows 7... so sad.
I hope a RAMdisk can do the same on the fly.

thewhitestig
09-28-2014, 11:03 AM
The biggest difference are in the water, apparently.

Oh no. There is quite the difference between high and ultra. http://www.hardocp.com/article/2014/08/18/watch_dogs_performance_image_quality_review/12#.VCfczfmSxMA

YazX_
09-28-2014, 11:19 AM
Oh dear so it looks like I'm going to have to buy a card with more Vram as I'd rather cut my arm off with a blunt rusty knife than use windows 8.1 :D I wish microsoft would just let it die.... they made a mistake trying to take on android and then made another trying to adapt the software for PC in order to convince us to buy windows based mobiles and now they're adding features people want to 8.1 but excluding windows 7... so sad
I hope a ramdisk can do the same on the fly.

Actually I had the same feeling about Win 8.1, but it works very well after trying it out. I installed some third-party software to bring the Start Menu back and I can use it pretty much like Windows 7. Even without the Start Menu, Win 8.1 is different from Win 8; give it a try in a virtual machine sometime and you will find you can use it pretty much like Windows 7.

strigoi1958
09-28-2014, 12:10 PM
I've asked MS if VTR in DX12 will be available with Win 7. It's sad we should have to use 3rd-party software to bypass an operating system made for handheld devices just to make it usable :(.

If VTR is not accessible in Win 7, I'm hoping a RAMdisk will do the same.... if it is.... I've seen a nice system:

Case: Cooler Master Storm Enforcer (ATX)
Intel Core i7 5930K (6x 3.50GHz)
Asus X99-S - X99, USB 3.0, SATA3
16GB Corsair Vengeance DDR4 2800
4GB Nvidia GeForce GTX 970
240GB Kingston V300 SSD, SATA 6Gb/s (450MB/s read, 450MB/s write)
3TB SATA3 Hard Drive (UDMA600)
750W Corsair CX750M Modular
Corsair Hydro H55 Liquid Cooler
Windows 7 Home Premium (64-bit)

strigoi1958
09-28-2014, 12:30 PM
Just found a link to Win 9 features.... MS have accepted that Win 8 left us with a bitter taste in our mouths and have made Win 9 more PC than handheld (I think there will be 2 versions), so PC gaming can continue for me :D

AherasSTRG
09-28-2014, 04:49 PM
What source are we talking about? Shadow of Mordor using medium textures on the PS4? I think it was in one of the reviews posted on Friday, PC Gamer's review most probably. If I find it again, I'll post it.

EDIT: Now I remember. The guy over at IGN reviewed the game on the PC and the PS4. And he said that the PC version had some extra graphical enhancements, including high and ultra texture settings.

EDIT #2: And in all seriousness, people: if the developers are confident enough to let the PS4 version ship with an unlocked framerate, then the game will be no match for a mid-range PC on console settings.

Fatal-Feit
09-29-2014, 12:12 AM
Oh no. There is quite the difference between high and ultra. http://www.hardocp.com/article/2014/08/18/watch_dogs_performance_image_quality_review/12#.VCfczfmSxMA

You and your sources, man. I freaking love them.

Never leave.

thewhitestig
09-29-2014, 11:42 AM
EDIT #2: And in all seriousness, people: If the developers are confident enough to allow the PS4 version to ship with an unlocked framerate, then the game will be no match for a mid-range PC on console settings.

Too bad they are capping the framerate at 100 for the PC. Quite a random number they chose right there, but at least they did not cap it at 60fps. The devs said that during an interview at the Nvidia GAME24 event.


You and your sources, man. I freaking love them.

Never leave.

Haha, thanks. :) I always try to be very thorough when I present information. :p

AherasSTRG
09-30-2014, 09:45 AM
So, after days of lurking on YouTube, I found a video about the performance of the GTX 770 on the first map of Shadow of Mordor. At absolute max (High textures), the stock GTX 770 can easily maintain 50 to 60 FPS (55+ most of the time). Although the requirement for High textures is 3GB of VRAM, the 2GB version of the 770 manages quite well with no hiccups at all. I doubt there is any significant difference in performance if you run the game on the 4GB version of the GTX 770, but I am open to all sources. The framerate drops to the 40s when Talion is in a critical state, probably due to the edge-of-screen blood effect.

As I see it, with the graphics quality on High (which I assume is the console-quality setting), maintaining 60 FPS would be a walk in the park for the GTX 770.


http://www.youtube.com/watch?v=PxvoyvEn-cU

Later today, I will update this post with the performance of the game on a GTX 760 Windforce 3X OC Edition.

The reason why I am talking about Shadow of Mordor's performance is because I am confident that it is going to help people decide on what GPU they should buy.

GeorgeLazu
09-30-2014, 10:56 AM
Hey guys, I'm on a tight budget and I want to buy a second-hand 750 Ti [2GB]. Do you think it'll work OK on medium settings @900p? Or at least at 720p? CPU: FX-8350, 8GB RAM.

When i say OK, i mean at least 30FPS

Fatal-Feit
09-30-2014, 11:40 AM
Hey guys, i'm on a tight budget and i want to buy a second hand 750ti [2GB]. Do you think it'll work ok on medium settings @900p? Or at least at 720p. CPU: FX-8350, 8GB RAM.

When i say OK, i mean at least 30FPS

...Maybe. I'm not so sure about second-hand GPUs. Perhaps at 720p, but you might experience dips below 18-20fps in certain areas. This is Ubisoft we're talking about; the frame-rates are never stable.

GeorgeLazu
09-30-2014, 12:46 PM
...Maybe. I'm not so sure about second hand GPUs. Perhaps on 720p, but you might experience dips below 18-20fps in certain areas. This is Ubisoft we're talking about. The frame-rates are never stable.

It should be fine; it's been used since March 2014 and it has a 3-year warranty. Should anything go wrong, I am covered. And in 2 years I plan to upgrade the GPU. The CPU, however, I hope doesn't need upgrading for at least 3 years.

Voyager456
09-30-2014, 01:04 PM
It should be fine, it's used since march 2014 and it has 3 years warranty. Should anything go wrong, i am covered. And in 2 years i plan to upgrade the GPU. CPU however i hope doesn't need upgrading for at least 3 years
You're gonna be fine, probably a mixture of medium and high settings at 1080p, 30 FPS.

YazX_
09-30-2014, 01:16 PM
Hey guys, i'm on a tight budget and i want to buy a second hand 750ti [2GB]. Do you think it'll work ok on medium settings @900p? Or at least at 720p. CPU: FX-8350, 8GB RAM.

When i say OK, i mean at least 30FPS

750 Ti SLI is like 15% faster than a single GTX 770, so I think you will be fine @ 1920x1080; at 900p I believe you will be able to run it at high settings with 45+ FPS.

Anykeyer
09-30-2014, 02:20 PM
So, after days of lurking in youtube, I found a video about the performance of the GTX 770 on the first map of Shadow of Mordor. At absolute max (High textures), the stock GTX 770 can easily maintain 50 to 60 FPS (55+ most of the time). Although the requirement for High textures is 3GBs of VRAM, the 2GB version of the 770 manages itself quite well with no hiccups at all. I doubt there is any significant difference in performance if you run the game on the 4GB version of the GTX 770, but I am open to all sources. The framerate drops to 40s when Talion is in critical state. Probably due to the edge-of-screen blood effect.


I'm curious, does it really use 6GB of VRAM on Ultra with the Ultra texture pack?

YazX_
09-30-2014, 03:09 PM
Im curios, does it really use 6gb vram on ultra with ultra textures pack?

Me too. I really doubt it; maybe 4GB max, but I expect it to require 3GB.

AherasSTRG
09-30-2014, 06:46 PM
Shadow of Mordor on a GTX 760 Windforce 3X OC Edition (2 GBs of VRAM) / Intel i5 3750:

High Preset: 70+
Very High Preset: 60-80
Playing with everything on High (including textures): 55-60 FPS

Great optimisation. I haven't downloaded the Ultra Textures yet, but the 3GB VRAM requirement for High textures is bollocks.

thewhitestig
10-01-2014, 12:12 AM
but the 3GB VRAM requirement for High textures is bollocks.

Yes it is. With a mixture of low, medium and high settings, and the textures set to medium, I barely push 850-900MB.

AherasSTRG
10-01-2014, 01:19 AM
Indeed, Stig. Is it just me, or does the ******* game not support Triple Buffering?

thewhitestig
10-01-2014, 01:44 AM
Indeed. Stig. Is this just me or the ******* game does not support Triple Buffering?

Well, apparently so. I don't see an option either. BTW, that benchmark tool is inconsistent at best. Sometimes I get 55fps and other times I get 85fps, and that's because of the ridiculous 250+ framerate that the benchmark starts up with. Very high framerates are quite random: sometimes it might hit 200fps and other times 300fps, and that greatly skews the final result of the benchmark. I did 10 runs and the results were all over the place. Not a reliable benchmark tool at all. But the framerate during gameplay stays very stable.

Oh, and another thing. I don't believe that even the extra texture pack they're gonna release is really gonna push 6 gigs, even at high resolutions. They're lying out of their asses about that, and I would not put it past them considering how they overstated the minimum system requirements. They said that the game would require an i5 750 as minimum spec. Really? I'm running it with a CPU that's 35% slower and 2 generations behind, and nevertheless my GPU is getting fully utilized at 98-99% with no bottlenecks. On medium settings I'm getting around 60fps with my 6870. Usually that GPU gets heavily bottlenecked by my C2Q Q8200 in newer games like BF4, Watch Dogs and AC4, which require quite a lot of CPU horsepower. But not here. It all runs nice and dandy.

Altair1789
10-01-2014, 09:09 PM
Just a quick update and huge thanks to all those that helped- the card runs AC4 at high-ultra settings with no problem at all. So glad I got this one, thank you all very much for the help :)

Fatal-Feit
10-02-2014, 05:59 AM
I'm glad it's working well for you, NYJetsFan117.

Speaking of graphic cards, I managed to find another 970 in stock. Can't wait to test it out on Monday.

I don't expect it to run AC:IV maxed out with TXAA x4, but it should at least be around 40-60fps with SMAA. Blah... I wish Ubisoft's optimization were as good as Shadow of Mordor's.

strigoi1958
10-02-2014, 10:39 PM
Shadow of Mordor on a GTX 760 Windforce 3X OC Edition (2 GBs of VRAM) / Intel i5 3750:

High Preset: 70+
Very High Preset: 60-80
Playing with everything on High (including textures): 55-60 FPS

Great optimisation. I haven't downloaded the Ultra Textures yet, but the 3GB VRAM requirement for High textures is bollocks.

Thanks.... because of that I bought it and it runs brilliantly on my GTX 670 2GB. I let it auto-configure and it's a mix of high/ultra at 1080p; it benchmarked at a lowest of 42 and an average of 65 to 75, so I'm very happy. I can wait until the initial clamour for 970s is over and maybe a Ti version comes out or a game is thrown in.

It's a nice game, but not quite AC-series beautiful.

AherasSTRG
10-04-2014, 04:17 PM
Thanks.... because of that I bought it and it runs brilliantly on my gtx 670 2gb. I let it auto configure and it is a mix of high/ ultra at 1080p and it benchmarked lowest 42 and avg 65 to 75 so very happy. I can wait until the initial clamour for 970's are over and maybe a Ti version comes out or a game is thrown in.

It's a nice game but not quite AC series beautiful.

Glad to be of help. There are 3 things you should note:
1. There are some performance issues here and there on the second map on Nvidia cards (you won't have a constant 60 FPS, but rather 50-60 :P).
2. Motion Blur does not work.
3. If Vsync caps your framerate at 30, alt-tab out of the game and then back in.

Issues 1 and 2 are known to the developers and will be taken care of in a patch, which is soon to come.

strigoi1958
10-06-2014, 02:01 AM
I didn't get any stuttering, just a little tearing running through the dark wet bit where the ghuls are. I don't normally turn motion blur on as it kinda makes me groggy :D
I guess Vsync would have stopped the tearing, but it was so little I didn't worry. Good game; I've all but finished it and have just gone back to map 1 to clear all the bits before going back to fight Sauron.
I wasn't going to buy it till I upgraded my card or system, but seeing your post really surprised me. Glad I got it to help pass the time before Unity and Far Cry 4.

SixKeys
10-06-2014, 04:54 AM
Guys, I need some advice. I just bought a new Nvidia GTX 770 card, but after doing some digging, I'm not sure if my power supply is sufficient (400 W). Nvidia's site says minimum PSU for the 700 series is 600W but I'm seeing a lot of conflicting opinions. Some people say Nvidia is just covering their asses and that the 770 doesn't require near that much power, others are saying go with the recommendation to be safe.

Do you think 400W is sufficient for my new card?

Specs:

Windows 7 64-bit Home Premium
Intel Core i5
CPU 750 @ 2.67GHz (4 CPUs), ~2.7GHz
8192MB RAM

Fatal-Feit
10-06-2014, 05:48 AM
Guys, I need some advice. I just bought a new Nvidia GTX 770 card, but after doing some digging, I'm not sure if my power supply is sufficient (400 W). Nvidia's site says minimum PSU for the 700 series is 600W but I'm seeing a lot of conflicting opinions. Some people say Nvidia is just covering their asses and that the 770 doesn't require near that much power, others are saying go with the recommendation to be safe.

Do you think 400W is sufficient for my new card?

Specs:

Windows 7 64-bit Home Premium
Intel Core i5
CPU 750 @ 2.67GHz (4 CPUs), ~2.7GHz
8192MB RAM

Hold on, you JUST bought a GTX 770? If you can return it, I highly recommend you do. Go and spend your money on a GTX 970: you will get the performance of a 780/780 Ti for the price of a 770. It's also quieter and requires less wattage (500W).

------------

400W with a 770 should work, but you might run into some problems, especially in the long run. I recommend picking up a 500W minimum power supply, to be safe.

SixKeys
10-06-2014, 07:45 AM
Hold on, you JUST bought a GTX 770? If you can return that, I highly recommend it. Go and spend your money on a GTX 970. You will get the performance of a 780/780 Ti for the price of a 770. It's also quieter and requires less wattage (500).

------------

400W with a 770 should work, but you might run into some problems, especially in the long run. I recommend picking up a 500W minimum power supply, to be safe.

Where I am, the 970 is going for almost 100 euros more than the 770. I'd rather save the extra money for a new power supply if people think it's necessary.

Anykeyer
10-06-2014, 09:11 AM
Do you think 400W is sufficient for my new card?

Specs:

Windows 7 64-bit Home Premium
Intel Core i5
CPU 750 @ 2.67GHz (4 CPUs), ~2.7GHz
8192MB RAM

No. It's dangerous to run this system on 400W. It may work under low to medium load, but you can expect all kinds of failures under heavy load. A failing PSU can damage other components, sometimes even burn them, voiding the warranty. So DON'T.

AherasSTRG
10-06-2014, 09:31 AM
Yeah, I agree with the people above. I remember that when I bought the 760, I also bought a 600W PSU and to this day, I still want a 700W to be even safer.

YazX_
10-06-2014, 10:51 AM
Guys, I need some advice. I just bought a new Nvidia GTX 770 card, but after doing some digging, I'm not sure if my power supply is sufficient (400 W). Nvidia's site says minimum PSU for the 700 series is 600W but I'm seeing a lot of conflicting opinions. Some people say Nvidia is just covering their asses and that the 770 doesn't require near that much power, others are saying go with the recommendation to be safe.

Do you think 400W is sufficient for my new card?

Specs:

Windows 7 64-bit Home Premium
Intel Core i5
CPU 750 @ 2.67GHz (4 CPUs), ~2.7GHz
8192MB RAM

CPU Max TDP = 95W
GTX 770 Reference = 230W

These two draw 325W. The other components don't draw as much, but let's say 70W. I don't know which 770 brand and edition you got, since most of them come factory-overclocked and draw more power, so 400W is somewhat sufficient but not recommended at all: you'd be pulling max power from it all the time when gaming/under full load, which causes PSU overheating problems, and it might die very soon. In addition, if your PSU is not a well-known brand, it will be a problem, since it won't actually be able to deliver 400W.

Could it cause some damage to your PC?

99% no, since I've had PSU failures (blowing out) a lot of times in the past (with cheap Chinese ones), but if I were you I wouldn't risk it; get a well-known brand with an 80 PLUS Gold rating.
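The power budget above, written out as a quick sketch. The TDP figures are the ones quoted in the post; the 70W allowance for the rest of the system is a rough assumption, and a factory-overclocked 770 would push the total higher.

```python
# Rough full-load power budget for the system being discussed.
cpu_tdp_w = 95     # Core i5 750 max TDP
gpu_tdp_w = 230    # reference GTX 770 TDP
other_w = 70       # board, RAM, drives, fans (rough allowance)

total_w = cpu_tdp_w + gpu_tdp_w + other_w     # ~395 W at full load
headroom = (400 - total_w) / 400              # ~1% spare on a 400 W unit
```

Roughly 1% of headroom on a 400W unit is why the thread's consensus is to step up to a quality 500-600W supply.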

Anykeyer
10-06-2014, 12:56 PM
Well, I had a PSU fail and burn the motherboard's ATX connector in the process (wasn't my PC :cool:), so the possibility exists.
Also, old PSUs usually don't provide the necessary power through the +12V rail. Those 400W total could easily mean only 250W on +12V, but the latest hardware takes 99% of its power from +12V and not the other rails. Newer standards, and PSUs made according to them, have a different balance between rails and give much more +12V power even with the same total rating.

Just found some pics.
Old-standard (and crappy) PSU:
http://cdn.buzznet.com/assets/users10/lancer/default/enlight-400w-psu--large-msg-11445310444-2.jpg
400W total with only 180W on +12V.
New:
http://www.pcper.com/files/imagecache/article_max_width/review/2013-09-21/5-Nameplate.jpg
The full 550W is available through +12V.
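The same point as a sketch, using the rail figures from the two labels above. The ~325W combined CPU+GPU load reuses the TDP numbers quoted earlier in the thread, and for simplicity it assumes both parts draw entirely from +12V, which is roughly true for modern hardware.

```python
# Combined CPU + GPU load, assumed to be fed almost entirely from +12V.
load_12v_w = 95 + 230          # ~325 W

old_psu_12v_w = 180            # "400 W" unit with a weak +12V rail
new_psu_12v_w = 550            # modern 550 W unit, full output on +12V

old_ok = load_12v_w <= old_psu_12v_w   # rail overloaded on the old unit
new_ok = load_12v_w <= new_psu_12v_w   # comfortable margin on the new one
```

So two PSUs with similar total wattage on the label can behave very differently; the +12V line on the nameplate is the number that matters for a gaming build.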

SixKeys
10-06-2014, 07:26 PM
Thanks for the feedback, guys. What brands/models would you recommend for a new PSU?

YazX_
10-06-2014, 10:20 PM
Well, I had PSU failing and burning motherboard ATX connector in the process (wasnt mine PC :cool:) so the possibility exists.
Also old PSUs usually dont provide necessary power through +12V rail, Those 400W total could easily mean only 250W for +12V, but latest hardware take 99% of all power from +12V and not other power rails. Newer standards and PSUs made according to them have different balance between rails and give much more +12V power even with the same total rating.

Just found some pics:
old standard (and crappy) PSU:
http://cdn.buzznet.com/assets/users10/lancer/default/enlight-400w-psu--large-msg-11445310444-2.jpg
400W total with only 180W +12V.
New:
http://www.pcper.com/files/imagecache/article_max_width/review/2013-09-21/5-Nameplate.jpg
full 550W is available through +12V

Yah, true, that's why I said 99%. Guess your friend was one of that 1% :), but I wouldn't even risk it, to be honest.


Thanks for the feedback, guys. What brands/models would you recommend for a new PSU?

The best PSU manufacturers are Seasonic and Super Flower; however, most PSUs are manufactured by OEMs and branded with other names like Cooler Master, EVGA, Corsair,... Personally I would pick the EVGA SuperNOVA G2 (MAKE SURE IT IS G2, not G1), which is covered by a 10-year warranty and manufactured by Super Flower:

http://www.amazon.com/EVGA-SuperNOVA-80PLUS-Certified-220-G2-0750-XR/dp/B00IKDETOW/ref=sr_1_1?ie=UTF8&qid=1412630217&sr=8-1&keywords=EVGA+SuperNOVA+750+G2+Power

Review:

http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=380

Of course there are other good alternatives, and you can also select a higher wattage, but this is what I would recommend for your budget.

SixKeys
10-07-2014, 10:15 PM
I think I found something that should fit my needs and my budget. Thanks for the help.

Fatal-Feit
10-08-2014, 12:41 AM
...So, without going into too much detail, I got the GTX 970, but I have to settle for good old Windows Vista instead of Windows 8 from now on. I'm pretty upset, because despite its flaws I kind of like Win8. The sleek look of the UI and the performance increase for gaming were great. :(

This may not be the right place to ask, but if any of you guys have an extra product key you could bargain for a fair price, do PM me. =/

It feels like hell right now.

thewhitestig
10-08-2014, 02:24 PM
...So, without going into too much detail, I got the GTX 970, but I have to settle for good old Windows Vista instead of Windows 8 from now on. I'm pretty upset, because despite its flaws I kind of like Win8. The sleek look of the UI and the performance increase for gaming were great. :(

This may not be the right place to ask, but if any of you guys have an extra product key you could bargain for a fair price, do PM me. =/

It feels like hell right now.

Why don't you try out the Windows 10 technical preview for a little bit? http://windows.microsoft.com/en-us/windows/preview

Altair1789
10-08-2014, 11:08 PM
1 more question-

Is there a big difference in performance between 1280x720 and 1600x900? I was considering buying a 1600x900 screen, but I'm not sure if I should

YazX_
10-08-2014, 11:52 PM
1 more question-

Is there a big difference in performance between 1280x720 and 1600x900? I was considering buying a 1600x900 screen, but I'm not sure if I should

As you know, the most taxing factor for performance is resolution. FHD (1920x1080) has roughly twice the pixel count of 1280x720, and 1600x900 sits in between: about 56% more pixels than 1280x720 (with 1080p about 44% above 1600x900 in turn). Sorry, I haven't followed this thread closely enough to know what graphics card you have, but as far as I recall it's a 750 Ti; I believe you will be fine on a medium/high mix at that resolution.

The 750 Ti can sustain 30 FPS @ 1920x1080 on high/medium settings, which is very good for its price. Look at this bench:

http://www.anandtech.com/bench/product/1130?vs=1037
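The pixel arithmetic is easy to check yourself (plain Python, nothing game-specific):

```python
# Per-frame pixel counts: GPU load scales roughly with pixels rendered.

def pixels(w, h):
    return w * h

base = pixels(1280, 720)                      # 921,600 px
for w, h in [(1280, 720), (1600, 900), (1920, 1080)]:
    count = pixels(w, h)
    print(f"{w}x{h}: {count:,} px ({count / base:.2f}x of 720p)")
# 1600x900 is 1.5625x the pixels of 720p; 1920x1080 is 2.25x.
```

So 1600x900 really does land roughly halfway between 720p and 1080p in raw work per frame.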

Altair1789
10-09-2014, 12:14 AM
As you know, the most taxing factor for performance is resolution. FHD (1920x1080) has roughly twice the pixel count of 1280x720, and 1600x900 sits in between: about 56% more pixels than 1280x720 (with 1080p about 44% above 1600x900 in turn). Sorry, I haven't followed this thread closely enough to know what graphics card you have, but as far as I recall it's a 750 Ti; I believe you will be fine on a medium/high mix at that resolution.

The 750 Ti can sustain 30 FPS @ 1920x1080 on high/medium settings, which is very good for its price. Look at this bench:

http://www.anandtech.com/bench/product/1130?vs=1037


Interesting, thank you. I know I should just wait for the minimum system requirements, but I'm bad at being patient, so here's a question that tries to sidestep pure speculation. I have a 1680x1050 screen that I'm not normally using. I tried it out and adjusted the settings to give around 40 FPS in an area with a lot of grass and vegetation (Cape Bonavista), assuming Unity will let me use similar settings at around 30 FPS (does that seem accurate?). I then went to Havana and got 60 FPS, but with an annoying problem that may be related to either the screen or the settings: when I'd run, the shadows would flicker; when I'd do a leap of faith, the screen would zoom in and out weirdly; and buildings would have a strange fading effect as I got close to them. So could this be the screen, or the game with these settings:

Resolution: 1600x900
Environment quality: Normal
Texture quality: High
AA: MSAA 2x
Shadow quality: Low
Reflection quality: High
Motion blur: Off
Ambient occlusion: HBAO +
God rays: Off
Volumetric fog: On
v-sync: Off

So basically (based on speculation): does getting 40 FPS in a very nature-heavy area of Cape Bonavista and 60 FPS in Havana mean I can play Unity at around 30 FPS? And why are the buildings acting weird with colors when I move closer to them?

Also note that I had Fraps running.

Also, just in case it's the screen: my normal 1280x1024 screen is a Sony SDM-HS75P, and the 1680x1050 screen is an Envision G2219w1.

fitch40
10-09-2014, 07:37 AM
If you're in the market for a new video card, the GTX 980 is a great deal.

Brief review of the new graphics card: http://pcgamerhome.com/nvidia-geforce-gtx-980-complete-specifications/

YazX_
10-09-2014, 11:36 AM
Interesting, thank you. I know I should just wait for the minimum system requirements, but I'm bad at being patient, so here's a question that tries to sidestep pure speculation. I have a 1680x1050 screen that I'm not normally using. I tried it out and adjusted the settings to give around 40 FPS in an area with a lot of grass and vegetation (Cape Bonavista), assuming Unity will let me use similar settings at around 30 FPS (does that seem accurate?). I then went to Havana and got 60 FPS, but with an annoying problem that may be related to either the screen or the settings: when I'd run, the shadows would flicker; when I'd do a leap of faith, the screen would zoom in and out weirdly; and buildings would have a strange fading effect as I got close to them. So could this be the screen, or the game with these settings:

Resolution: 1600x900
Environment quality: Normal
Texture quality: High
AA: MSAA 2x
Shadow quality: Low
Reflection quality: High
Motion blur: Off
Ambient occlusion: HBAO +
God rays: Off
Volumetric fog: On
v-sync: Off

So basically (based on speculation): does getting 40 FPS in a very nature-heavy area of Cape Bonavista and 60 FPS in Havana mean I can play Unity at around 30 FPS? And why are the buildings acting weird with colors when I move closer to them?

Also note that I had Fraps running.

Also, just in case it's the screen: my normal 1280x1024 screen is a Sony SDM-HS75P, and the 1680x1050 screen is an Envision G2219w1.

The cause could be your screen, since it doesn't do that on your other one. Best to try other games on the Envision screen and see if it acts the same.

strigoi1958
10-09-2014, 09:01 PM
...So, without going into too much detail, I got the GTX 970, but I have to settle for good old Windows Vista instead of Windows 8 from now on. I'm pretty upset, because despite its flaws I kind of like Win8. The sleek look of the UI and the performance increase for gaming were great. :(

This may not be the right place to ask, but if any of you guys have an extra product key you could bargain for a fair price, do PM me. =/

It feels like hell right now.

Have you got Win 8.1 yet? I'm looking at a GTX 970, and Yaz mentioned its ability to use the latest DX11 features for tile streaming... also, I'd like to know what DSR and VXGI are like... I expect Unity will make use of them all.

Fatal-Feit
10-09-2014, 10:01 PM
Have you got Win 8.1 yet? I'm looking at a GTX 970, and Yaz mentioned its ability to use the latest DX11 features for tile streaming... also, I'd like to know what DSR and VXGI are like... I expect Unity will make use of them all.

Not yet. I had Windows 8.1 and loved it. As of now, I'm still on Vista. Windows 10 looks promising, but there isn't a lot of support for it right now (including Uplay, so no AC for me).

----------------

I haven't been able to test out the 970 yet, so no feedback on DSR or VXGI, but they do look promising.

Altair1789
10-09-2014, 10:17 PM
The cause could be your screen, since it doesn't do that on your other one. Best to try other games on the Envision screen and see if it acts the same.

I've concluded that it's probably not the screen, so I guess the main question now is: what can I use to test my FPS in Black Flag that will give me a decent sense of how my PC will run Unity?

Either way, it's not that long until system requirements are out (I think/hope).

strigoi1958
10-09-2014, 10:17 PM
The improvement DSR gives to the graphics looks awesome in the demos, but I'd always prefer a real user's opinion.

YazX_
10-09-2014, 10:32 PM
I've concluded that it's probably not the screen, so I guess the main question now is: what can I use to test my FPS in Black Flag that will give me a decent sense of how my PC will run Unity?

Either way, it's not that long until system requirements are out (I think/hope).

This is hard to answer, since we don't know what improvements and features Unity has, especially with the new engine they created. According to Ubisoft, the game should now scale well across all CPU cores, which can make a big difference in performance, among other things we're not yet aware of.

Even when the official requirements come out, it's hard to predict how well Unity will run based on AC4. However, did you pre-order, or are you planning to?

Altair1789
10-09-2014, 11:40 PM
This is hard to answer, since we don't know what improvements and features Unity has, especially with the new engine they created. According to Ubisoft, the game should now scale well across all CPU cores, which can make a big difference in performance, among other things we're not yet aware of.

Even when the official requirements come out, it's hard to predict how well Unity will run based on AC4. However, did you pre-order, or are you planning to?

Thanks for the information. I preordered; unless a computer that plays Black Flag on high-to-ultra at 1280x720 can't play Unity, I'll definitely be able to play the game when it comes out. It's just a matter of whether or not I can play on a better screen, which is no big deal for me either way. I'm really glad to hear about the CPU stuff, though. I hope they optimize Unity for PC :(

Anykeyer
10-10-2014, 07:58 AM
I'm looking at a gtx 970 and Yaz mentioned its ability to use the latest DX11 for tile streaming... also I'd like to know what the dsr and vxgi is like..... I expect Unity will make use of them all

Tiled resources have to be supported by games, and none of them do. VXGI is just another attempt at ray tracing that will never make it into actual games; it's not even fully hardware-accelerated. It looks more like something for 3ds Max and Maya users.
DSR below 4x is pointless: lower settings produce a blurry image, and I prefer the original 1x resolution to that. Even 4x is not that good; in many cases the difference is barely noticeable. The bad part is that it's usually older games with lower details that you can run at 4x and still hold 60 FPS. A good exception is Skyrim, which runs at 60 FPS with 4x and looks noticeably better for it. Newer games would benefit the most, but a single GTX 980 can't run them at 4x without a noticeable performance impact. 4x DSR in AC3 with performance data on screen:

http://youtu.be/IndY2r8fj-Q
(it's my video, btw)
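For anyone unsure what "4x" means here: DSR factors multiply the pixel count, not each axis, so the per-axis scale is the square root of the factor. A quick sketch (`dsr_resolution` is just my own helper assuming that behavior):

```python
import math

def dsr_resolution(native_w, native_h, factor):
    """Render resolution for a DSR factor; the factor scales pixel count,
    so each axis scales by sqrt(factor). 4x of 1080p is 2160p."""
    scale = math.sqrt(factor)
    return round(native_w * scale), round(native_h * scale)

print(dsr_resolution(1920, 1080, 4))   # (3840, 2160)
print(dsr_resolution(1920, 1080, 2))   # in-between size; tends to look soft
```

That's why 4x is the clean case: it's an exact 2x per axis, so every native pixel averages a neat 2x2 block, while fractional factors have to resample unevenly and come out blurrier.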

strigoi1958
10-11-2014, 07:12 PM
That looks awesome when the settings are increased to 2160p.
I'm sure games will take advantage of tiled resources in the future. Either that or RAM disks.

I hope VXGI does make it into games; it would make immersion just that tiny bit better.

Spikey1989
10-11-2014, 10:11 PM
Tiled resources have to be supported by games, and none of them do. VXGI is just another attempt at ray tracing that will never make it into actual games; it's not even fully hardware-accelerated. It looks more like something for 3ds Max and Maya users.
DSR below 4x is pointless: lower settings produce a blurry image, and I prefer the original 1x resolution to that. Even 4x is not that good; in many cases the difference is barely noticeable. The bad part is that it's usually older games with lower details that you can run at 4x and still hold 60 FPS. A good exception is Skyrim, which runs at 60 FPS with 4x and looks noticeably better for it. Newer games would benefit the most, but a single GTX 980 can't run them at 4x without a noticeable performance impact. 4x DSR in AC3 with performance data on screen:

It looks awesome...

I have bought the MSI 970 Gaming. Needed a new one; the 660 has done its duty for me. Going "all-in" :)

I have to ask: what program are you using to keep track of the current CPU, GPU, RAM, etc. readings while playing in that video?

/Spikey

Anykeyer
10-12-2014, 08:01 AM
It's MSI Afterburner. The recording is done with it too, using the same NVENC encoder that Nvidia's own ShadowPlay uses, so there's no FPS loss; what you see is the exact performance of a single reference GTX 980 board at default clocks.

YazX_
10-12-2014, 09:46 PM
It's MSI Afterburner. The recording is done with it too, using the same NVENC encoder that Nvidia's own ShadowPlay uses, so there's no FPS loss; what you see is the exact performance of a single reference GTX 980 board at default clocks.

Perhaps you could overclock it, since it overclocks like a champ, and post back with readings?

On another note, I find it really strange that the card is hitting 80 degrees. Maybe I'm mistaken, but it seems like it's throttling, as it's set to prioritize temperature over power.

Anykeyer
10-13-2014, 06:54 AM
My previous GTX 770 Lightning operated between 60 and 70 degrees in all heavy games, but the GTX 980 is a reference card. Refs always hit their turbo thermal limit (the default is 79 degrees; it can be raised to 91) under full load. At least the 980 remains relatively silent, unlike the reference 780 Ti. In that video, the GPU LOAD numbers are: TDP (aka power limit), GPU, VPU (video encoder/decoder), local memory bandwidth, and PCIe bus. As you can see, it's almost always at 99% GPU load. No game without DSR is that heavy; the card actually often drops its clocks below 1 GHz even in WD and Crysis for another reason: full boost isn't necessary to keep a constant 60 FPS. I don't plan to use DSR in games that don't hit 60 FPS with 4x anyway. To get that kind of performance in all modern games you'd need at least 3-way SLI, and that's simply not an option for me (too much heat, too much noise, no place for a sound card, not to mention the cost).
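The throttling behavior described above can be pictured with a toy model. The 13 MHz-per-degree slope here is made up purely for illustration (real GPU Boost steps clocks in fixed bins based on temperature and power headroom); the 1126/1216 MHz figures are the reference GTX 980 base/boost clocks:

```python
# Toy model: the card holds max boost until it reaches its thermal target,
# then sheds clocks to sit at the limit, but never drops below base clock.

def boost_clock(base_mhz, max_boost_mhz, temp_c, temp_limit_c=79):
    if temp_c < temp_limit_c:
        return max_boost_mhz                     # full boost available
    # Made-up linear slope for illustration only:
    throttled = max_boost_mhz - 13 * (temp_c - temp_limit_c)
    return max(base_mhz, throttled)              # base clock is the floor

print(boost_clock(1126, 1216, temp_c=70))   # cool: full boost
print(boost_clock(1126, 1216, temp_c=85))   # at the limit: throttled
```

The point is just that a reference cooler parked at its 79-degree target isn't malfunctioning; sitting exactly at the limit is what the boost algorithm is designed to do.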

YazX_
10-13-2014, 05:09 PM
That explains the temps. Anyway, I still believe we are 2-3 generations away from a proper 4K experience on a single card in a decent price range, unless Nvidia/AMD surprise us with the next generation of cards, which is unlikely.

AherasSTRG
10-14-2014, 02:35 PM
The Evil Within just came out today and, as far as I can see, the system requirements were once again a lie. I'm still holding my breath, though, because I like Bethesda and I think (thought) they were an honest company.

The whole thing has me totally confused as to where my GFX card actually stands.

Altair1789
10-14-2014, 07:59 PM
The Evil Within just came out today and, as far as I can see, the system requirements were once again a lie. I'm still holding my breath, though, because I like Bethesda and I think (thought) they were an honest company.

The whole thing has me totally confused as to where my GFX card actually stands.

I'd give Bethesda another chance. After all, they made Skyrim, Fallout 3, and Fallout: New Vegas.

hood3dassassin5
10-14-2014, 08:52 PM
Alright, I need the forum's opinion on this. I plan on buying Unity for PC, but I don't know whether or not my PC will be able to support it. Here are its specs:

Processor: AMD A8-5500 APU with Radeon(tm) HD Graphics, 3.20 GHz
Installed Memory (RAM): 8.00GB (7.40GB usable)
System Type: 64-bit Operating System, x64-based processor

If anyone could help I would gladly appreciate it.

Green_Reaper
10-14-2014, 11:08 PM
Alright, I need the forum's opinion on this. I plan on buying Unity for PC, but I don't know whether or not my PC will be able to support it. Here are its specs:

Processor: AMD A8-5500 APU with Radeon(tm) HD Graphics, 3.20 GHz
Installed Memory (RAM): 8.00GB (7.40GB usable)
System Type: 64-bit Operating System, x64-based processor

If anyone could help I would gladly appreciate it.

Do you have a dedicated video card?

Fatal-Feit
10-15-2014, 05:58 AM
Alright, I need the forum's opinion on this. I plan on buying Unity for PC, but I don't know whether or not my PC will be able to support it. Here are its specs:

Processor: AMD A8-5500 APU with Radeon(tm) HD Graphics, 3.20 GHz
Installed Memory (RAM): 8.00GB (7.40GB usable)
System Type: 64-bit Operating System, x64-based processor

If anyone could help I would gladly appreciate it.

Is that a retail build? My sister used to own one just like that.

Anykeyer
10-15-2014, 08:40 AM
ACU on an A8-5500 will be barely playable, if playable at all.


I'd give Bethesda another chance. After all, they made Skyrim, Fallout 3, and Fallout: New Vegas.
There is Bethesda Softworks (the publisher) and there is Bethesda Game Studios (the developer). They didn't develop New Vegas (the most overrated and lowest-quality Fallout game) or The Evil Within.

Altair1789
10-15-2014, 08:08 PM
ACU on an A8-5500 will be barely playable, if playable at all.


There is Bethesda Softworks (the publisher) and there is Bethesda Game Studios (the developer). They didn't develop New Vegas (the most overrated and lowest-quality Fallout game) or The Evil Within.

Ah.. I seem to have my Bethesdas mixed up

hood3dassassin5
10-22-2014, 12:38 AM
Is that a retail build? My sister used to own one just like that.

Yeah

Fatal-Feit
10-22-2014, 04:40 PM
Yeah

Your PC should, but at low settings. I recommend installing a graphics card, preferably the 750 Ti. It's about $130 and should be able to play Unity at high settings with a playable framerate.

Altair1789
10-22-2014, 07:48 PM
Here's where I got my 750 Ti: http://www.newegg.com/Product/Product.aspx?Item=N82E16814487024
It's a decent price, and the card has been working very well, if you're considering buying one.

hood3dassassin5
10-23-2014, 11:20 AM
Thanks for the help. While it seems like a good graphics card and all, I just can't afford it right now (too busy saving for Unity and Rogue, lol).

LeftistHominid
10-23-2014, 08:57 PM
Is it me, or do the official specs not make sense? A GTX 760 is powerful enough to run Black Flag at really high settings, but is then insufficient for Unity?

D_cover
10-23-2014, 08:59 PM
Well, the minimum is a GTX 680, so I would think so, on low settings.

DarkSolitudeX
10-23-2014, 09:02 PM
Well, the minimum is a GTX 680, so I would think so, on low settings.

Benchmark-wise, a GTX 760 can't touch a 680, though.

strigoi1958
10-23-2014, 09:16 PM
I bet it will run it... plus, once the drivers are updated, it will run quite reasonably. I have a 670, and the only reason I didn't buy a 680 was that at 1080p the 680 was only 3 FPS faster, which was irrelevant because I v-sync anyway... the 760 must be slightly ahead of or behind the 670, so that's my view.

Johny-Al-Knox
10-23-2014, 09:17 PM
IMO, minimum and recommended specs should go away. They don't mean anything; they don't say what settings, resolution, or FPS the game will run at.

You will probably be fine? I guess wait for benchmarks.

Don't preorder.

jeffies04
10-23-2014, 10:21 PM
Is that your only limitation, you think? If you have the money to risk, I'd say try it. If you don't, then I second what was said above: wait for other folks' experiences. I believe these requirements are probably a little exaggerated, but I wouldn't want anybody to throw away $60 counting on that alone.

FuqUBic
10-24-2014, 12:33 AM
I bet it will run it... plus, once the drivers are updated, it will run quite reasonably. I have a 670, and the only reason I didn't buy a 680 was that at 1080p the 680 was only 3 FPS faster, which was irrelevant because I v-sync anyway... the 760 must be slightly ahead of or behind the 670, so that's my view.

The 760 is below the 670 :) I reckon our cards can run it easily on high; most game companies have been exaggerating requirements lately.

LeftistHominid
10-24-2014, 02:59 AM
When I first read the official-yet-bogus GTX 680 requirement, I was thoroughly annoyed.

Now that I remind myself I was somehow able to get my laptop (i7-2670QM 2.2 GHz and GT 55M 2GB) to pull off Black Flag at middle-ish settings, I am going to keep my preorder.

Also, remember the "official spec" lists the same AMD 8-core for both its recommended and minimum requirements.

Green_Reaper
10-24-2014, 06:02 AM
Until I see a legit official list, not a "leaked official" one, I don't buy that enthusiast-level cards are the minimum and recommended specs. What would it take to run dual-monitor setups and 4K, then? Like 10 R9 290s / Nvidia Titans?

D.I.D.
10-24-2014, 07:35 AM
760 is below the 670 :)

No, it isn't. They're pretty much identical in performance.

As a general rule of thumb, if you take the number of this year's Nvidia card and deduct 10 from it, you get its equivalent in the next family. This year is unusual in that the 970 matches the 780 Ti rather than the 780 (NB: anyone confused by this, there was no 800 series!), making it definitely the better buy over the 980, which doesn't justify its extra cost.

D.I.D.
10-24-2014, 07:44 AM
Benchmark wise a GTX760 can't touch a 680 though

Again, "can't touch" is putting it a bit strongly. It's a step behind the 770, which is the better match for the 680, but we're in danger here of giving people the impression that cards which are actually really good are somehow worthless.

To anyone reading: do not instantly assume your card is garbage based on the specs or marginal differences in ratings. Maybe your card really won't handle this game, but wait for the game to find out. If you were going to buy the game anyway, you won't be spending anything more, and all this might turn out to have been a storm in a teacup for you.

If you ran Black Flag at a satisfactory level, maybe all you'll need to do is lower the resolution or switch off AA to still have a great-looking game. Pre-emptively buying a new graphics card for this game is a really bad idea, unless you were already intending to buy one.

FCvlam
10-24-2014, 08:57 AM
Hey, very excited about this game and preordered it. I just want to know if I will be able to run the game at least on medium graphics.
My rig is:
GeForce GT 640, 4GB RAM
i7-3770 CPU @ 3.4GHz
8 GB RAM
and Windows 8.1

My only concern is my graphics card, and I don't want to waste money on a game I won't be able to run.

nultma
10-24-2014, 09:08 AM
Hey, very excited about this game and preordered it. I just want to know if I will be able to run the game at least on medium graphics.
My rig is:
GeForce GT 640, 4GB RAM
i7-3770 CPU @ 3.4GHz
8 GB RAM
and Windows 8.1

My only concern is my graphics card, and I don't want to waste money on a game I won't be able to run.

You've already spent the money, though? Honestly, if it's under the minimum requirements, you're better off waiting until the game comes out and people start posting their actual experiences. It could be a situation where the requirements are grossly inflated; it might not be.

D.I.D.
10-24-2014, 09:20 AM
Hey, very excited about this game and preordered it. I just want to know if I will be able to run the game at least on medium graphics.
My rig is:
GeForce GT 640, 4GB RAM
i7-3770 CPU @ 3.4GHz
8 GB RAM
and Windows 8.1

My only concern is my graphics card, and I don't want to waste money on a game I won't be able to run.

The 640 is not really a gaming card: great for display tasks and web/video browsing, but not for fast 3D work. The [x]50 cards and above are the gaming market, and there's a big gap between the performance of the sub-50s and the above-50s.

Try looking on eBay and comparing second-hand prices for high-end 600s and 700s (i.e. 680, 770, 780, 780 Ti) with the prices offered by shops for the same. You might be able to pick up a bargain now that people's attention is on the new 900s. A 900-series card will be more future-proof, though, and it'll be cheaper to run due to being less power-hungry. If the difference is not that great, get a 970.

Check out shops with reduced-price stock, too. Sometimes you'll get a great deal on a refurbished or ex-demo card, and the shop's guarantee makes it risk-free.

fashric
10-24-2014, 09:25 AM
Hey, very excited about this game and preordered it. I just want to know if I will be able to run the game at least on medium graphics.
My rig is:
GeForce GT 640, 4GB RAM
i7-3770 CPU @ 3.4GHz
8 GB RAM
and Windows 8.1

My only concern is my graphics card, and I don't want to waste money on a game I won't be able to run.

You should honestly cancel your pre-order and wait to see the benchmarks of the game. You are well under the recommended GPU requirement. Also, what is the point of a GT 640 having 4GB VRAM? It will be performance-choked long before it could ever use 4GB.

ABigFatCow
10-24-2014, 09:34 AM
Dear Ubisoft: on behalf of around 90% of the gamers around here, we all want to play at ultra, 60 FPS, full HD, etc. Most of us have pretty decent hardware that runs games at our desired settings, but I would like you to know that not all of us can afford a Titan Black in quad SLI plus liquid cooling, overclocked by at least 500 MHz, to get the desired outcome. Therefore, the solution is to optimize the game, at least a little, so it runs.

FCvlam
10-24-2014, 11:22 AM
Thanks. My graphics card is over a year old now; I got it in March 2013. I have no problem running Battlefield 4 on high graphics, but then again, BF4 is almost a year old. Luckily, the game launches on the 12th of November but isn't released until the 14th here in South Africa, so I will keep an eye on everyone's experiences, and hopefully I will have no problem running the game.

fashric
10-24-2014, 12:02 PM
Thanks. My graphics card is over a year old now; I got it in March 2013. I have no problem running Battlefield 4 on high graphics, but then again, BF4 is almost a year old. Luckily, the game launches on the 12th of November but isn't released until the 14th here in South Africa, so I will keep an eye on everyone's experiences, and hopefully I will have no problem running the game.

What resolution are you running at? The benchmarks for this card say it barely gets above 30 FPS at medium settings in BF3 @ 1280x1024, so I seriously doubt you can run BF4 at high settings with a playable framerate. Are you sure it's the GT 640 you have?

Leo_2301
10-24-2014, 12:42 PM
Can I run this game with an Intel Core i7-4770K and a Zotac Nvidia GeForce GTX 650 2GB? :confused:

FCvlam
10-24-2014, 01:03 PM
What resolution are you running at? The benchmarks for this card say it barely gets above 30 FPS at medium settings in BF3 @ 1280x1024, so I seriously doubt you can run BF4 at high settings with a playable framerate. Are you sure it's the GT 640 you have?

Yes, I'm sure. It's the Gigabyte GeForce GT 640 with 4GB RAM: 2GB dedicated video memory and 2GB shared system memory. Running BF4 on high graphics @ 1366x768, 59.79Hz.

Threat999
10-24-2014, 01:44 PM
Can I run it with a GTX 860M, an i7-4710MQ, and 8GB RAM?

Izack124
10-24-2014, 05:13 PM
Graphics check: will this card work? AMD Radeon HD 7700 series, 4GB.

Altair1789
10-24-2014, 09:20 PM
Graphics check: will this card work? AMD Radeon HD 7700 series, 4GB.

I think it might work; I don't trust these system requirements at all.

blogger360
10-25-2014, 07:17 AM
Can I run it with an i5-3450, a GTX 760, and 8GB RAM?

pleb87
10-25-2014, 10:17 AM
Laptops aren't officially supported, but you might be in luck. To be honest, I think everyone needs to chill and wait till the game comes out and we get some benchmark results. I would not be surprised if it turns out you can run the game on Low with half the minimum requirements!

strigoi1958
10-25-2014, 02:56 PM
I think this is the reason we haven't seen ACU bundled with Nvidia cards. If Nvidia started selling ACU bundled with new GTX 660s, 750s, and 670s, it would make nonsense of the minimum spec... and it would also lose sales of 970s.

Personally, I like the 970, and its price is incredibly low compared to cards like the Titan and 780 Ti. It is incredibly good value for money (and I never say that about GFX cards). I wish everyone could have one, and IF Nvidia had bundled ACU with the 970, even more would have sold.

In 20 days' time, I expect people will have YouTube videos showing ACU running on a GTX 650 Ti or something similar...

oliacr
10-25-2014, 04:54 PM
in 20 days time people will have youtube videos showing ACU running on a gtx 650ti or something similar I expect.....

I really hope you are right, man. It seems the same way to me. Sometimes this minimum requirement seems as surreal as the recommended one, and sometimes I'm afraid it's real :D :D

RaulO4
10-25-2014, 05:12 PM
If ACU starts getting bundled with the 900 series, I will cancel my 970 order... then re-buy it.

I was not going to get AC until it hit something like a $20 sale, so if I get it free, then why not... a man can dream, right?

FCvlam
10-26-2014, 08:38 AM
I think they have a lot of supported graphics cards that they haven't listed yet. I can't imagine that they'd make a game requiring a $150-$200 graphics card just to run minimum graphics.

AviM.OE
10-26-2014, 11:44 AM
Hey guys, I am planning to buy a GTX 760 (Zotac); will it run AC Unity at least on medium? :confused:

Thanks

Mr_Shade
10-26-2014, 12:22 PM
http://forums.ubi.com/showthread.php/937186-Assassin-s-Creed-Unity-Minimum-amp-Recommended-PC-Requirements

MINIMUM:

OS: Windows 7 SP1, Windows 8/8.1 (64-bit operating system required)
Processor: Intel Core i5-2500K @ 3.3 GHz or AMD FX-8350 @ 4.0 GHz
Memory: 6 GB RAM
Graphics: NVIDIA GeForce GTX 680 or AMD Radeon HD 7970 (2 GB VRAM)
Hard Drive: 50 GB available space
Sound Card: DirectX 9.0c compatible sound card with latest drivers
Additional Notes: Windows-compatible keyboard and mouse required, optional controller




RECOMMENDED:

OS: Windows 7 SP1, Windows 8/8.1 (64-bit operating system required)
Processor: Intel Core i7-3770 @ 3.4 GHz or AMD FX-8350 @ 4.0 GHz or better
Memory: 8 GB RAM
Graphics: NVIDIA GeForce GTX 780 or AMD Radeon R9 290X (3 GB VRAM)
Hard Drive: 50 GB available space
Sound Card: DirectX 9.0c compatible sound card with latest drivers


Peripherals Supported
Windows-compatible keyboard and mouse required, optional controller

Multiplayer
256 kbps or faster broadband connection

Additional Notes
Supported video cards at the time of release: NVIDIA GeForce GTX 680 or better, GeForce GTX 700 series; AMD Radeon HD7970 or better, Radeon R9 200 series
Note: Laptop versions of these cards may work but are NOT officially supported.

Winyboy
10-26-2014, 12:30 PM
And can you specify the resolution? :)

oliacr
10-26-2014, 12:33 PM
and can you specify the resolution? :)

No info on that.

FCvlam
10-26-2014, 01:08 PM
http://forums.ubi.com/showthread.php/937186-Assassin-s-Creed-Unity-Minimum-amp-Recommended-PC-Requirements

MINIMUM:

OS: Windows 7 SP1, Windows 8/8.1 (64-bit operating system required)
Processor: Intel Core i5-2500K @ 3.3 GHz or AMD FX-8350 @ 4.0 GHz
Memory: 6 GB RAM
Graphics: NVIDIA GeForce GTX 680 or AMD Radeon HD 7970 (2 GB VRAM)
Hard Drive: 50 GB available space
Sound Card: DirectX 9.0c compatible sound card with latest drivers
Additional Notes: Windows-compatible keyboard and mouse required, optional controller




RECOMMENDED:

OS: Windows 7 SP1, Windows 8/8.1 (64-bit operating system required)
Processor: Intel Core i7-3770 @ 3.4 GHz or AMD FX-8350 @ 4.0 GHz or better
Memory: 8 GB RAM
Graphics: NVIDIA GeForce GTX 780 or AMD Radeon R9 290X (3 GB VRAM)
Hard Drive: 50 GB available space
Sound Card: DirectX 9.0c compatible sound card with latest drivers


Peripherals Supported
Windows-compatible keyboard and mouse required, optional controller

Multiplayer
256 kbps or faster broadband connection

Additional Notes
Supported video cards at the time of release: NVIDIA GeForce GTX 680 or better, GeForce GTX 700 series; AMD Radeon HD7970 or better, Radeon R9 200 series
Note: Laptop versions of these cards may work but are NOT officially supported.

This is probably a stupid question, but please let me know if I can run the game on these specs:
i7-3770 @ 3.4GHz
8GB RAM
Windows 8.1
Gigabyte GeForce GT 640 4GB RAM (2GB dedicated and 2GB shared)

I can run most recent games on high/ultra graphics with no problem.

oliacr
10-26-2014, 01:19 PM
This is probably a stupid question, but please let me know if I can run the game on these specs:
i7-3770 @ 3.4GHz
8GB RAM
Windows 8.1
Gigabyte GeForce GT 640 4GB RAM (2GB dedicated and 2GB shared)

I can run most recent games on high/ultra graphics with no problem.

No one knows. You'd better wait till release before ordering this game. If you already have, I suggest you cancel.

strigoi1958
10-26-2014, 02:16 PM
I think the 640 might be a little low. But definitely cancel the pre-order until you know for sure.

YazX_
10-26-2014, 02:22 PM
Hey guys, since the OP got his answer in this thread and the discussion has drifted to system requirements, I'm locking this one and redirecting to the thread below, as there is no need for two threads on the same discussion. Please use the link below to discuss system requirements further:

http://forums.ubi.com/showthread.php/891165-Assassin-s-Creed-Unity-System-Requirements