KillzEmAllGod

ATI or Nvidia


Man, not sure what brand I want for a computer I'll be getting in 2 months.

ATI so far has DX 11 cards, but Nvidia seems to have better quality.

Nvidia doesn't have DX 11 cards yet as far as I know. I had a bad time with my old computer's ATI card, mainly on the driver side, which I eventually sorted out.

Anyway, this might be a little early to ask, because who knows, quantum computers could come out in that time.

348,627 views 163 replies
Reply #26

ATI.  :)

 

Reply #27

Until recently I was pretty big on Nvidia. But the last few months they've had terrible drivers. I'm still using ones from September because newer ones cause me performance losses, and the most recent ones actually fried cards outright until Nvidia pulled them (the driver would shut the fans off and overheat the card).

Also, Nvidia's got nothing to match ATI performance-wise right now, and their upcoming card is a disaster in the making, really. It's so hot that it requires a specially certified case just to keep it cool, and performance is worse than ATI's current lineup except in highly contrived synthetic benchmarks that have no application to real-world games.

My last few cards have all been Nvidia, but if I were to be buying one right now, I'd go ATI. For whatever reason, Nvidia just can't do anything right lately. Hopefully they turn it around for next year.

Reply #28

Quoting jongalt26, reply 25
Ask this question again in April and you will have a more definitive answer. Benchmarks for the new Nvidia card will be released. Keep in mind that the new Nvidia card is going to be $700, and it will be compared to ATI's $700 dual-GPU 5970. The nice thing is that you are planning on waiting, and the answer will most likely be obvious in a couple of months.

They will release budget versions in the future. Personally I love the fact that they release the top-end part first and then follow up with the less expensive ones. This gives us early adopters a longer lifespan on the component, and we don't have to wait any additional time for it to be released.

If you are playing in that price range for a video card then the questions will be:

1.  Which card has the best benchmarks and in-game framerates? (My money is on the 5970... I doubt a single-GPU card will be better, although it may be close; see info below.) Then that brings us to the 2nd question:

2.  To PhysX or not to PhysX.

This is the position I'm currently in. I had a card burn out on me back in October, so I got an inexpensive Nvidia 'interim' card. The 5000 series of ATI cards had been announced but was not readily available, and Nvidia hadn't delayed the release of Fermi at that point. I figured that I could use the interim card for PhysX if I went with Fermi, or I could stick it in an HTPC if I went with the ATI 5000 series. I've been waiting to play Batman: AA until I could fully enable PhysX, lol.

I've only used Nvidia cards in my own computers since the TNT days of yore. I've used ATI cards for some of the systems I've built at work, my wife's laptop, etc. ATI's cards are good; Nvidia's cards have just been a bit better whenever I was purchasing.

As an FYI in regards to SLI/CrossFire: it is best to buy the single fastest card that you can afford. For argument's sake, let's assume you have a $400 budget for your vid card. The one $400 video card will outperform (2) $200 video cards; e.g., the ATI 5970 will outperform (2) 5830s.

 

In any case I found 'some' ***benchmarks***

The GTX 480 beats the 5970 by:

29% in Call of Duty: Modern Warfare 2
61% in Borderlands
25% in Far Cry 2
28% in Resident Evil 5
18% in Media

***HOWEVER***, I do not believe those numbers at all and suspect that the tests were tweaked or adjusted to favor the Nvidia card. One main reason is that the card isn't being released to the major hardware reviewers. The GF100 is just 11 days from release. If this card were the 5970 destroyer that Nvidia claims, then they would have this card at AnandTech, HardOCP, Tom's, and similar sites for benchmarking. In Nvidia's defense, though, most of the time their cards have outperformed ATI's.

Here's an interesting article.

http://www.semiaccurate.com/2010/02/20/semiaccurate-gets-some-gtx480-scores/

good luck bro!

j

 

Why would you compare the GTX 480 to the 5970? For that matter, why would you quote anything from SemiAccurate? I don't know if it matters, but someone credible just said that the NDA gets lifted 26 MAR, partners will have cards by 29 MAR, and the retail embargo runs until 06 APR. At least we will have real numbers before we start throwing money around.

Reply #29

Why would you compare the GTX 480 to the 5970?

Ummm, for those that lack a keen insight for the obvious, I would say it's because they are essentially the same price. Why would I compare a $700 Nvidia card to a $400 ATI card? Also, the 480 should damn well blow away a 5870, given the significant price difference.

For that matter, why would you quote anything from SemiAccurate?

They were not kind to Nvidia at all. It was meant as a juxtaposition with the benchmarks I provided just before the article: the benchmarks say one thing and look far superior (hence suspect), while SemiAccurate says another. At this point I am more inclined to place my wagers on SemiAccurate's article. I sure hope I lose that bet. Even if they are only 42.0% accurate, the performance difference isn't as great as what Nvidia is touting.

I don't know if it matters, but someone credible just said that the NDA gets lifted 26 MAR, partners will have cards by 29 MAR, and the retail embargo runs until 06 APR. At least we will have real numbers before we start throwing money around.

I would expect real third-party numbers after the card is released. I truly hope you are correct about the NDA, so that we get accurate benchmarks before it hits retail. I think that if the card were that superior, the NDA would have been lifted already. Numerous sources have stated that the cards will be in short supply, so I'll personally want the ability to make an informed decision on release day. I don't want to delay ordering and end up on a back-order list.

I really want the Nvidia card for its expanded GPGPU (and PhysX) capabilities, but since my system is gaming first, video editing second, I'll go with the card that has better gaming numbers. I'm keeping my fingers crossed, that's for sure.

 

 

Reply #30

I have no idea why it keeps double posting.....

 

 

Reply #31

I would go with Nvidia. It's somewhat more expensive in many cases; however, it games perfectly, AND if you ever want to get involved with Folding@home, or you really need those CUDA cores for programs that can use them (engineering programs, like AutoCAD perhaps), then Nvidia is definitely the way to go.

 

Plus, DirectX 11 is used in how many programs right now? I can only think of one game (and it's not my type of game anyway), so Fermi will definitely be on time by any important measure. The only reason folks would go ATI just for DirectX 11 is... well, I don't know, lol. It's not really rational, unless you really want to play that one little racing game with ALL of its fancy graphics turned on.

 

Plus, that Eyefinity stuff: especially if you use the "wonderful" six-screen setup. Besides the exorbitant price of such a setup, the bezels at the edges of the screens REALLY get in the way of the game. They are a real eyesore, a thick black line right smack in the middle of your view, and the game's details DO get hidden behind them.

Reply #32

Quoting rothdave1, reply 28

Why would you compare the GTX 480 to the 5970? For that matter, why would you quote anything from SemiAccurate? I don't know if it matters, but someone credible just said that the NDA gets lifted 26 MAR, partners will have cards by 29 MAR, and the retail embargo runs until 06 APR. At least we will have real numbers before we start throwing money around.

Probably because Charlie was the only one reporting Nvidia's "bumpgate" problems correctly; nobody else did until after the fact. He's got a pretty good track record of getting through Nvidia's reality distortion field, especially now that they're only giving review cards to their most loyal shills. The major trustworthy hardware sites aren't getting them yet. Wonder why that is?

Fermi's not a good card unless you want to do well in Folding@Home... and most people don't buy hardware for that.

Reply #33

Tridus, I agree bro, Folding@home is a nice benefit. In addition, though, Fermi offers PhysX and the CUDA API, as well as C++ support, which I think will help software development significantly. I use some video editing software that will greatly benefit from Fermi...

Lord Vale: as of now I see no performance difference at work between the ATI and Nvidia workstation cards in relation to AutoCAD (the Civil 3D and Architecture versions). We even have a couple of old systems running standard gaming cards, and I don't see a difference. The gaming cards don't use the CAD-signed drivers, but AutoCAD offloads some of the processing to the cards anyway. We don't use high-end workstation cards, though, since we don't really use 3ds Max or Revit.

Reply #34

Good thing the most graphic intense game I play is Command & Conquer Generals.

Reply #35

Quoting Lord-Vale3, reply 31
Plus, that Eyefinity stuff: especially if you use the "wonderful" six-screen setup. Besides the exorbitant price of such a setup, the bezels at the edges of the screens REALLY get in the way of the game. They are a real eyesore, a thick black line right smack in the middle of your view, and the game's details DO get hidden behind them.

That's right, but 3D isn't going to be any better.

Reply #36

I've used both ATI and Nvidia... surely your decision should be based on price and features, rather than on who's best. Prior to my current card (an ATI HD 5870) I used a 9800 GTX; both have served me well, and both were in my price range at the time of purchase.

Good luck

Reply #37

Personally, I like Nvidia. They have always seemed to be more compatible, in an all-around sense.

Yet, my nephew usually has ATI cards in his computer, and he is the more serious gamer.

We both have Intel CPUs.

Guess what? He never comes to me for graphics card related problems.

 

I say, go with whatever is fastest at the time you are buying - if you have the budget. No matter what, it will still be respectable in a year or two.

And right now that is the ATI 5970, if you want a single-card (or comparable dual-card) solution for under $700.

Just be sure you have a case, CPU and PSU that can accommodate it. Nothing less than a full tower, i7, and at least 1000W.

 

Reply #38

Quoting Moosetek13, reply 37
Just be sure you have a case, CPU and PSU that can accommodate it. Nothing less than a full tower, i7, and at least 1000W.

Where do people get these power numbers? NOTHING any normal person is doing uses anywhere close to 1000W. Even ATI only recommends 750W for a 5970, and the vast majority of people are buying more mid-range cards that draw far less than that.

Reply #39

Where do people get these power numbers? NOTHING any normal person is doing uses anywhere close to 1000W.

Ah, but you see... that's where the logic falls down.

 

 

 

 

 

 

 

 

 

 

None of us is normal....;)

Reply #40

Um, why do you need cars with 7-litre engines when my 1.6-litre will do 120 mph?

It's what Rolls-Royce calls 'waftability'. My 1200W PSU is exactly that ;)

Reply #41

Where do people get these power numbers? NOTHING any normal person is doing uses anywhere close to 1000W. Even ATI only recommends 750W for a 5970, and the vast majority of people are buying more mid-range cards that draw far less than that.

The card itself can draw nearly 300W, and it can easily be overclocked to take nearly 400W.

Add in the rest of the system.

If you want to run your PSU at near capacity much of the time, go ahead. Me? I'd rather keep it at a safer level.

Reply #42

Hrmm, I'm curious to see how you came up with your numbers.

I spec'd a system with an i7 860, (6) sticks of RAM, (2) 10k RPM drives, (2) SSDs, (1) 5970, (2) DVD burners, (2) 120mm LED fans, (2) 120mm fans, (1) PCIe 4x card, and (2) powered USB ports.

Configured for 100% loading, which means everything running at peak capacity, I calculated a total of 512 watts.

Assuming overclocking adds a max of 100 watts (not likely at all), peak power would be about 600 watts. I don't like to run my PSUs over 80% of max capacity, so that would require 720 watts; a 750 would do. But then again you get output drop at higher temps, and while the 80% figure should cover the efficiency rating and some drop, I would throw in another 100 watts, which should also give the PSU a longer lifetime. 850 watts will satisfy any need for a single-processor computer with (1) 5970, including overclocking. You could even run (2) 5970s quite easily and bump their clocks up a bit as well.
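The sizing logic above can be written out as a quick back-of-the-envelope script. All the wattages are this post's ballpark estimates, not measured values, so treat it as a sketch rather than a real sizing tool:

```python
# Back-of-the-envelope PSU sizing, following the reasoning in the post above.
# All wattages are this thread's estimates, not measurements.

peak_system_w = 512       # spec'd i7/5970 build with everything at 100% load
overclock_margin_w = 100  # generous overclocking allowance
psu_load_ceiling = 0.80   # don't run the PSU above 80% of its rating
aging_margin_w = 100      # extra headroom for heat-related droop and PSU aging

peak_draw_w = round(peak_system_w + overclock_margin_w, -2)  # ~600 W worst case
min_rating_w = peak_draw_w / psu_load_ceiling                # 750 W
recommended_w = min_rating_w + aging_margin_w                # 850 W

print(f"peak draw: ~{peak_draw_w} W")
print(f"minimum PSU rating at 80% load: {min_rating_w:.0f} W")
print(f"with aging margin: {recommended_w:.0f} W -> an 850 W unit fits")
```

The 80% ceiling is applied by dividing the peak draw by 0.8, which is slightly more conservative than adding 20% on top of the load; either way an 850W unit comes out with room to spare.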

 

As far as the waftability goes: well, a Rolls-Royce cruising down the street garners the attention of the social elite (i.e. hot girls) who want to go for a ride in your ride. It also gets you a lot of free Grey Poupon mustard.

A 1200-watt power supply's waftability gets you the attention of some fantastic dust bunnies.

Also, please keep in mind that 1200 watts is getting close to the maximum allowable load on a 20-amp circuit (where the ceiling is 1600 watts for new construction). Most homes (and businesses, for that matter) don't have a dedicated home-run circuit for their computer alone; they share the 20 amps with whatever else is connected to the circuit. (Haha, luckily I did two dedicated home runs for my office when I finished my basement. How's that for geek cred?)

 

In any case, while a 1200-watt PSU is nice to have, it's not really needed...
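The circuit-capacity side can be checked the same way. Under the US NEC's 80% rule for continuous loads, a 20A/120V branch circuit tops out around 1,920W sustained, somewhat above the 1,600W figure quoted above (which may assume a different code or circuit). A rough sketch with nominal values:

```python
# Rough US branch-circuit headroom check (NEC-style 80% continuous-load rule).
# Nominal figures only; local codes, voltages, and breaker sizes vary.

volts = 120               # nominal US outlet voltage
breaker_amps = 20         # a common general-purpose circuit; bedrooms are often 15 A
continuous_factor = 0.80  # continuous loads limited to 80% of the breaker rating

circuit_w = volts * breaker_amps              # 2400 W absolute maximum
continuous_w = circuit_w * continuous_factor  # 1920 W sustained ceiling

psu_rating_w = 1200  # the PSU under discussion (it rarely draws its full rating)
print(f"sustained ceiling: {continuous_w:.0f} W")
print(f"headroom beside a fully loaded {psu_rating_w} W PSU: {continuous_w - psu_rating_w:.0f} W")
```

Even on these numbers, a fully loaded 1200W supply plus a monitor, printer, and whatever else shares the circuit would leave little margin, which is the post's underlying point.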

Reply #43

Also, please keep in mind that 1200 watts is getting close to the maximum allowable load on a 20-amp circuit (where the ceiling is 1600 watts for new construction). Most homes (and businesses, for that matter) don't have a dedicated home-run circuit for their computer alone; they share the 20 amps with whatever else is connected to the circuit. (Haha, luckily I did two dedicated home runs for my office when I finished my basement. How's that for geek cred?)

What country are you in? The standard ring main here is 32 amps with a maximum of 7200W. As my cooker and shower are on separate spurs, I can safely turn on everything else in the house (should I wish to do so...).

Reply #44

Quoting jongalt26, reply 29
Why would you compare the GTX 480 to the 5970?

Ummm, for those that lack a keen insight for the obvious, I would say it's because they are essentially the same price. Why would I compare a $700 Nvidia card to a $400 ATI card? Also, the 480 should damn well blow away a 5870, given the significant price difference.

For that matter, why would you quote anything from SemiAccurate?

They were not kind to Nvidia at all. It was meant as a juxtaposition with the benchmarks I provided just before the article: the benchmarks say one thing and look far superior (hence suspect), while SemiAccurate says another. At this point I am more inclined to place my wagers on SemiAccurate's article. I sure hope I lose that bet. Even if they are only 42.0% accurate, the performance difference isn't as great as what Nvidia is touting.

I don't know if it matters, but someone credible just said that the NDA gets lifted 26 MAR, partners will have cards by 29 MAR, and the retail embargo runs until 06 APR. At least we will have real numbers before we start throwing money around.

I would expect real third-party numbers after the card is released. I truly hope you are correct about the NDA, so that we get accurate benchmarks before it hits retail. I think that if the card were that superior, the NDA would have been lifted already. Numerous sources have stated that the cards will be in short supply, so I'll personally want the ability to make an informed decision on release day. I don't want to delay ordering and end up on a back-order list.

I really want the Nvidia card for its expanded GPGPU (and PhysX) capabilities, but since my system is gaming first, video editing second, I'll go with the card that has better gaming numbers. I'm keeping my fingers crossed, that's for sure.

 

 

I still don't see the rationale for comparing a single-GPU card to a dual-GPU card that's significantly underclocked to meet official PCIe specs. I'd wait for a GTX 495 to compare against the 5970, and GTX 480 vs 5870, etc. That said, no one knows prices. People that attempt to be credible have the GTX 480 at 400-450 and the GTX 470 at 325-350 (I think these are USD, but my German is a combination of listening to my grandparents 20 years ago and Google Translate). That said, it's all bullcrap and guessing. What's not guessing, though, is that the official reviews getting done in that ten-day window are coming from cards Nvidia sends to reviewers, not from EVGA, Sapphire, ASUS, etc. like usual, and that concerns me a little bit. I really hope the 470 is in that price range and has some solid overclock potential without requiring a 1200-watt PSU and a 360 rad.

Reply #45

Just for info, the new ATI 5970 has a max power consumption of 300 watts... so if you have 2 of those suckers you'll probably need a 1000-watt PSU (600 watts for them and 400 for everything else), but 1200 is still unwarranted.

Reply #46

Quoting rothdave1, reply 44

I still don't see the rationale for comparing a single-GPU card to a dual-GPU card that's significantly underclocked to meet official PCIe specs. I'd wait for a GTX 495 to compare against the 5970, and GTX 480 vs 5870, etc. That said, no one knows prices. People that attempt to be credible have the GTX 480 at 400-450 and the GTX 470 at 325-350 (I think these are USD, but my German is a combination of listening to my grandparents 20 years ago and Google Translate). That said, it's all bullcrap and guessing. What's not guessing, though, is that the official reviews getting done in that ten-day window are coming from cards Nvidia sends to reviewers, not from EVGA, Sapphire, ASUS, etc. like usual, and that concerns me a little bit. I really hope the 470 is in that price range and has some solid overclock potential without requiring a 1200-watt PSU and a 360 rad.

I was under the impression that the GTX 480 was going to run around $700. I apologize if that's not the case... (it's also currently on pre-order at Sabre PC for $700, but I don't necessarily trust the price or the pre-order). If it's in euros, 450 EUR = about $614. The reason for the comparison is that I want the best bang for my $700. If the 480 is indeed $450 or so (great news to me!), then I would compare it to the 5870, which is in the same price range. I would even take a small performance hit on the Nvidia card (in comparison) to get the additional features I want (as mentioned before).

 

Like you, I'm concerned about the selected reviewers, unless they are the ones we visit regularly, i.e. MaximumPC, Tom's, Anand, HardOCP, etc.

j

Reply #47

Quoting Fuzzy, reply 43
What country are you in? The standard ring main here is 32 amps with a maximum of 7200W. As my cooker and shower are on separate spurs, I can safely turn on everything else in the house (should I wish to do so...).

That would be the US. Circuit breakers for household outlets are generally 15 amps (~1500W), and that's what household wiring is rated for. A single breaker is usually shared between several outlets, so if your computer has a really high current draw and shares the circuit with other appliances, you could overload it. Of course, I don't think I'd want to pay for the power draw of such a computer. Realistically, computers draw less than their ratings; engineers usually pad ratings 50% to 100% for safety reasons. My computer isn't particularly high-power, but it draws around 200W altogether when it's in gaming mode, at least according to the gauge on my UPS unit. If I went by the ratings, it should be twice that.

Most EU countries run 220V, which lets you power a lot more on the same circuit.

Reply #48

Wait for Fermi to drop on the 26th, even if you are planning on getting an ATI card, because ATI may drop their prices then.

Based on the resources each card dedicates to gaming, Fermi SHOULD be faster than the HD 5xxx series, but it will also PROBABLY be more expensive (think of the GT200 vs. HD 4xxx series pricing differences), and Nvidia has stated it will run very hot.

If you can afford to wait a few months, then I think a card like a GT 450 or 460 would be the best value, barring any refresh from ATI.

Reply #49

Quoting jongalt26, reply 46

I was under the impression that the GTX 480 was going to run around $700. I apologize if that's not the case... (it's also currently on pre-order at Sabre PC for $700, but I don't necessarily trust the price or the pre-order). If it's in euros, 450 EUR = about $614. The reason for the comparison is that I want the best bang for my $700. If the 480 is indeed $450 or so (great news to me!), then I would compare it to the 5870, which is in the same price range. I would even take a small performance hit on the Nvidia card (in comparison) to get the additional features I want (as mentioned before).

Like you, I'm concerned about the selected reviewers, unless they are the ones we visit regularly, i.e. MaximumPC, Tom's, Anand, HardOCP, etc.

j

As far as wattages go, I understand your point, but you have to remember that a 1200W PSU at 87 percent efficiency is dropping almost 200 watts. Running a 5970 at 5870 speeds nets you about 390 watts of usage. My 5870s draw about 275 watts at 1k core / 1250 memory. The i7 in the other room draws about 260 watts at 4.2 GHz. The Phenom 965 I'm using right now draws about 225 at 4.0. An OC'd i7 with a pair of GTX 480s, assuming they do run close to 300W stock, may with a mild overclock run close to needing 1000W, so a 1200W PSU may be needed. The article, by the by, had a dollar sign in front of the numbers and was talking about a US release, but didn't explicitly say USD, so I don't know.
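One nuance on that first figure: a PSU's efficiency applies to the power it is actually delivering, not to its nameplate rating, and an ATX rating is DC output capacity. So a 1200W unit at 87% efficiency only loses on the order of 180W when fully loaded; at a more typical 600W load the loss is roughly half that. A quick illustration with made-up round numbers:

```python
# Efficiency losses scale with the actual load, not with the PSU's rating.
# Illustrative numbers only: a 1200 W, 87%-efficient unit.

def wall_draw_w(dc_load_w: float, efficiency: float) -> float:
    """AC power pulled from the outlet to deliver a given DC load."""
    return dc_load_w / efficiency

eff = 0.87
for dc_load in (600.0, 1200.0):  # a typical gaming load vs. the full rating
    ac = wall_draw_w(dc_load, eff)
    print(f"{dc_load:.0f} W DC load -> {ac:.0f} W from the wall "
          f"({ac - dc_load:.0f} W lost as heat)")
```

(Real supplies are also less efficient at very low and very high loads than at mid-load, which a single fixed efficiency number glosses over.)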

Reply #50

That is true, my 300 watts per 5970 was non-overclocked. But why would you overclock? If you have one of these beast machines there is no point until you find something it can't handle, which should be at least a few years down the road.