GTX 680 reviews (finally!) - Nvidia has a winner on their hands
The Kepler reviews are now up on all the major review sites. I noticed that Newegg even has a few PNY models in stock ($499 USD).

It seems Nvidia has a winner on their hands. Not only does the 680 trump the 7970 in most situations, it does so while being significantly more efficient. In particular, the 680's shader performance is seriously impressive; it allows the card to match the dual-GPU GTX 590 in games like BF3.

New features include adaptive vsync and the ability to run games in surround with a single GTX 680.
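
For anyone wondering what adaptive vsync actually does: with regular vsync, missing the refresh rate quantizes you straight down to 30 fps (then 20, then 15), while adaptive vsync keeps vsync on only while you can hold the refresh rate and drops it the moment you can't, trading a little tearing for much smoother frame pacing. A rough sketch of the concept (my own Python pseudo-logic, not NVIDIA's actual driver code):

[code]
REFRESH_HZ = 60.0  # assumed display refresh rate

def present(frame_time_s, swap_buffers):
    """Choose vsync per frame based on whether we can hold the refresh rate."""
    current_fps = 1.0 / frame_time_s
    if current_fps >= REFRESH_HZ:
        # Fast enough: sync to the refresh so there is no tearing.
        swap_buffers(vsync=True)
    else:
        # Too slow: syncing would lock us to 30/20/15 fps, so swap
        # immediately and accept a little tearing instead.
        swap_buffers(vsync=False)
[/code]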

I have to say, Nvidia has done a great job with GK104. Yeah, it's not the beast we'd expect from Nvidia at launch, but it still easily takes the single-GPU crown.

Let's get to the reviews...

[url="http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review"][b]AnandTech[/b][/url]

[url="http://www.techradar.com/reviews/pc-mac/pc-components/graphics-cards/nvidia-geforce-gtx-680-1072796/review"][b]Techradar[/b][/url]

[url="http://www.guru3d.com/article/geforce-gtx-680-review/"][b]Guru3d[/b][/url]

[url="http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/"][b]TechPowerUp[/b][/url]

[url="http://hothardware.com/Reviews/NVIDIA-GeForce-GTX-680-Review-Kepler-Debuts/"][b]HotHardware[/b][/url]

[url="http://www.bit-tech.net/hardware/2012/03/22/nvidia-geforce-gtx-680-2gb-review/1"][b]Bit-Tech[/b][/url]

[url="http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/52616-nvidia-geforce-gtx-680-2gb-review.html"][b]HardwareCanucks[/b][/url]

EVGA E758 A1 X58 // Core i7 920@4Ghz // OCZ Platinum DDR3 1600 // EVGA GTX 670 SLI // Seasonic X Series Gold 1050w // Corsair 800D // Dual Dell Ultrasharp U2410 displays // Dell Ultrasharp 2408WFP

#1
Posted 03/22/2012 04:54 PM   
and 2 more sites


Hardware heaven
[url="http://www.hardwareheaven.com/reviews/1452/pg1/nvidia-geforce-gtx-680-kepler-graphics-card-review-introduction.html"]http://www.hardwareh...troduction.html[/url]

Alienbabeltech
[url="http://alienbabeltech.com/main/?p=28910"]http://alienbabeltec...m/main/?p=28910[/url]




Overall, I'm not so impressed, not at that price.

Both AMD's and NV's 28nm parts should be next-gen mid-range, not high-end, for what they offer, at least compared to the last high-end generation.


GK110 & AMD Tenerife, here I come :biggrin:

GPU: Inno3D GTX 580 1536MB OC
Monitor: EIZO Foris FS2333 [HDMI]
CPU: Intel i7 4770K @ 4.7GHz [1.274v] | Corsair H90 custom P/P fans
Mainboard: Asus Z87-Deluxe [Bios 1802]
Ram:Crucial Balistix Elite 16GB CL10/1T @ 2400MHz [1.65v]
Sound: X-FI Titanium HD [Pcie SB1270]
HDD: Intel SSD 330 180Gb; Samsung F3 1Tb; Hitachi 500Gb
PSU: Chieftec NiTRO88+ 650w [52A]
OS: Windows 8.1.1 Pro WMC x64



#2
Posted 03/22/2012 05:46 PM   
The only benchmarks I'm interested in are the NVIDIA Surround ones. My 580s will already max out every game out there on a single display, but I want to play BF3 in Surround, which three 580s can't do.

intel 3960x @4.1Ghz

3 x 680 2gb water cooled

3 x 42inch fullhd @6080x1080

32 gigs gskill XMP @2133

win 7 64bit

#3
Posted 03/22/2012 06:11 PM   
To me, the most impressive features are GPU Boost and Frame Rate Targeting, which help lower power consumption.

Also, you can do Surround + 1 monitor on a single GTX 680.
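
Conceptually, the frame rate target is just a limiter: cap output at, say, 60 fps and let the card idle instead of burning power on frames your monitor never shows. A minimal sketch of the idea (assumed logic on my part, not the actual driver implementation):

[code]
import time

TARGET_FPS = 60.0
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds allowed per frame

def run_frame(render):
    """Render one frame, then idle away whatever is left of the budget."""
    start = time.perf_counter()
    render()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        # The GPU finished early; this idle time is where GPU Boost can
        # drop clocks and voltage instead of rendering unseen frames.
        time.sleep(FRAME_BUDGET - elapsed)
[/code]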

eVGA Z68 SLI | Intel Core i5-3570K @ 4.5 GHz | Corsair Hydro Series H80i
8GB G.Skill Sniper Series DDR3 | EVGA GTX 580 | OCZ ZX 1000W
OCZ Agility3 120 GB SSD + SanDisk Ultra 120GB SSD
Samsung UN55D6000 + Samsung T240
Win8 Pro x64 / WEI - 8.0 - 8.0 - 8.1 - 8.1 - 7.8
3DMark11 - P7545 / Vantage (PPU disabled) - P28859
F@H Team: 142900

#4
Posted 03/22/2012 06:13 PM   
Really need to start reading through all these reviews! For those still harbouring the 'it should have been a mid-range GPU' argument, I still think the point of Kepler hasn't fully sunk in. You know what I think happened here (pure conjecture, by the way)...

[list]
[*]Fermi performance was awesome, but many noted they were not happy with the size of the GPU, the amount of power required, and the amount of heat produced. Meanwhile people praised AMD for their improved efficiency.
[*]In designing Kepler, the focus is of course to increase performance across the board, but also improve upon those performance per watt figures.
[*]Things carry on happily this way.
[*]Someone leaks GPU die sizes, and people start jumping to conclusions: "it's small, so it must be mid-range". The assumption spreads until someone coins the '660 Ti' based on 560 Ti size comparisons.
[*]AMD release their 7900 series, and someone from NVIDIA comments that they are pleasantly surprised (now it becomes very clear why).
[*]GTX 680 branding gets leaked, and based on the aforementioned comments and the leaked die size, people assume the 660 Ti has suddenly become a 680.
[*]Speculation ensues.
[/list]

Of course you could argue that it would have helped if NVIDIA had publicised what was coming, but as you all know this is not how things are done - by either vendor. Consider that you are also implying that NVIDIA would/could completely change their chosen direction, branding, marketing, PCB design etc. in such a short space of time :blink:

Kepler gives us the new fastest GPU in the world by a good margin, with a raft of new features (including Surround off a single GPU), but also an amazing improvement in performance per watt - twice the performance per watt per SM, as mentioned in the linked article. The conclusion in that release sums it up well:

[quote]Gamers told us they want GPUs that are cooler, quieter, and more power efficient. So we re-designed the architecture to do just that. The GeForce GTX 680 consumes less power than any flagship GPU since the GeForce 8800 Ultra, yet it outperforms every GPU we or anyone else have ever built.[/quote]

And it comes in at the same price point as (or, in many cases, cheaper than) the AMD equivalent on launch day... moderator title completely aside, I think it's an awesome piece of work.

Official GeForce Forums Benchmarking Leaderboards
NVIDIA SLI Technology: A Canine's Guide

Corsair Obsidian 350D mATX, Asus Maximus VI GENE Z87 mATX, Intel Core i7-4770k @ 4.40GHz, Corsair H110, Corsair Dominator Platinum 16GB (4x4GB) @ 2400MHz, 1x OCZ Vertex 4 256GB, 1x WD Scorpio Black 750GB, 2x WD Caviar Black 1TB, EVGA GeForce GTX 780Ti Superclock, Enermax 1250W Evolution, Windows 8 64bit.

Logitech G9x, Razer Black Widow Ultimate, Logitech G930, 2x Eizo EV2333W.

Twitter | Steam

#5
Posted 03/22/2012 09:12 PM   
[quote name='jimbonbon' date='22 March 2012 - 04:12 PM' timestamp='1332450742' post='1386476']
Kepler gives us the new fastest GPU in the world by a good margin, with a raft of new features (including Surround off a single GPU), but also an amazing improvement in performance per watt... moderator title completely aside, I think it's an awesome piece of work.
[/quote]

Amen, preacher. That is all.

Case: Antec 1200

PSU: Corsair AX850 850W

CPU: Intel Core i7 860 @4GHz 1.352v cooled w. Xigmatek Thor's Hammer

RAM: 2x4GB G. Skill Sniper Series DDR3-1866 @1910MHz 9-10-9-28 2T 1.5v 2:10 FSB:DRAM

MB: ASUS Maximus III Formula Intel P55

HD: WD Velociraptor 160GB (Boot), WD Caviar Black 750GB (all my junk)

OD: ASUS Blu-Ray

GPU: 2 x EVGA GeForce GTX 670 FTW

Sound Card: Creative X-Fi Titanium HD

OS: Windows 7 Home Premium 64-bit

Peripherals: Logitech G510 keyboard, Logitech G500 laser mouse, Logitech Z-5500 5.1 speakers, Gateway FPD2485W 24'' monitor


#6
Posted 03/22/2012 10:44 PM   
I agree wholeheartedly, jimbonbon. I truly don't understand what the naysayers are complaining about. The GTX 680 trumps the dual-GPU GTX 590 in games like BF3 while consuming around 190W. On top of that, the new features the 600 series brings to the table are most welcome. I think this is a GPU we should all be excited about.

I'm happy with the direction Nvidia is going here. They've developed a lean GPU that's wicked fast. What's not to like? I'll admit that I would have liked to see the price a bit lower, but considering the competition I think it's fair.
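
As a quick sanity check on the efficiency angle, here's some back-of-envelope perf-per-watt math. I'm assuming the published board powers (195W for the GTX 680, 365W for the GTX 590, if memory serves) and the rough BF3 parity claimed above:

[code]
GTX680_TDP_W = 195.0  # published board power, as I recall
GTX590_TDP_W = 365.0
relative_perf = 1.0   # assume roughly equal BF3 fps, per the reviews

perf_per_watt_ratio = relative_perf * GTX590_TDP_W / GTX680_TDP_W
print(f"~{perf_per_watt_ratio:.1f}x the perf/watt of the GTX 590")  # ~1.9x
[/code]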

EVGA E758 A1 X58 // Core i7 920@4Ghz // OCZ Platinum DDR3 1600 // EVGA GTX 670 SLI // Seasonic X Series Gold 1050w // Corsair 800D // Dual Dell Ultrasharp U2410 displays // Dell Ultrasharp 2408WFP

#7
Posted 03/22/2012 11:18 PM   
Nicely said, jimbonbon. I never liked adding anything to that long-overdue speculation thread; I hardly even looked at it. Posting speculation is a waste of time, and arguing about something none of us knew anything about is even worse.
That being said, I stick to my view that before the GTX 680's release, the HD 7970 was without question the ideal choice among what was available; I would have recommended it to anyone without hesitation.
But now things are nicer :tongue:

#8
Posted 03/22/2012 11:58 PM   
I hate prices in NZ though.
I assume it will be $499 all over the world, right?
Here it will launch at 999 NZD.
Converting 499 USD to NZD, I get [url="http://www.xe.com/ucc/convert/?Amount=499&From=USD&To=NZD"]this[/url].

So even if I buy it from the US and pay for shipping, I still pay far less than buying it here in NZ.
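
To put rough numbers on it (the exchange rate is my assumption, roughly what xe.com was showing around launch, and the shipping figure is a guess):

[code]
USD_PER_NZD = 0.82   # assumed March 2012 rate -- check xe.com for the real one
NZ_GST = 0.15        # 15% GST, if customs collects it on the import
SHIPPING_NZD = 50.0  # rough guess for US-to-NZ courier shipping

price_nzd = 499.0 / USD_PER_NZD                   # ~608 NZD at that rate
landed = price_nzd * (1 + NZ_GST) + SHIPPING_NZD  # ~750 NZD all-in
print(f"Imported: ~{landed:.0f} NZD vs 999 NZD locally")
[/code]

Even with GST and shipping, importing comes out well under the local price.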

#9
Posted 03/23/2012 12:35 AM   
That's nuts... but seriously, that really sucks. Prices here range from $499-534 (with MIR, lol).
#10
Posted 03/23/2012 01:25 AM   
I'm still waiting for Folding performance numbers.

I see 4GB models will be out in about 2 months.
Love looking at EVGA's lineup.




Join the NVidia Forum Team - Please Help Medical Research - Folding@home


#11
Posted 03/23/2012 01:29 AM   
[quote name='chiz' date='23 March 2012 - 03:28 AM' timestamp='1332469729' post='1386630']
Nvidia reps should really consider the ramifications of treating their customers as if they're uninformed or ignorant. In the end, it just makes you look dishonest. Is that the image you really want to put out there? Better to just own up to it: LilK performed better than expected, Tahiti sucked, and BigK isn't quite ready to be released, so we get a mid-range part selling at $500 flagship prices.

There's about 100 bits of evidence all pointing to GK104 initially targeting mid-range performance SKUs but being bumped up to prime time as a result of Tahiti's poor showing. Here's a nice summary; many of the breadcrumbs were provided by Nvidia themselves, and they didn't do a very good job of sweeping the evidence under the rug. But I guess that's what happens when you spend 2+ years developing a product line and then decide to change gears in the 24th hour before launch.

http://www.techpowerup.com/forums/showthread.php?t=162901

That said, the only thing not to like about lilKepler is its pricing. $400-$450 would've made a lot more sense and been a better reflection of its performance improvement over last-gen's GTX 580, but at the very least it provides a glimpse of what is to come with BigK. Should be great; it will be nice to see what Nvidia's flagship is capable of if their mid-range performs this well! :thumbup:
[/quote]

BigK (as you call it) may actually be ready for prime time, but the green team sees no need to release it at this point. If this is the case, I really can't blame them from a business standpoint (that doesn't mean I like it though). Either way, we don't know exactly what's going on behind the scenes, so this is all just speculation.

What we know for sure is that the 680 chews through modern games with ease. Call it "mid-range" all you want, but this GPU offers high-end performance.

EVGA E758 A1 X58 // Core i7 920@4Ghz // OCZ Platinum DDR3 1600 // EVGA GTX 670 SLI // Seasonic X Series Gold 1050w // Corsair 800D // Dual Dell Ultrasharp U2410 displays // Dell Ultrasharp 2408WFP

#12
Posted 03/23/2012 02:47 AM   
[quote name='slamscaper' date='22 March 2012 - 10:47 PM' timestamp='1332470864' post='1386636']
BigK (as you call it) may actually be ready for prime time, but the green team sees no need to release it at this point. If this is the case, I really can't blame them from a business standpoint (that doesn't mean I like it though). Either way, we don't know exactly what's going on behind the scenes, so this is all just speculation.

What we know for sure is that the 680 chews through modern games with ease. Call it "mid-range" all you want, but this GPU offers high-end performance.
[/quote]
BigK's release schedule has without a doubt been impacted by Tahiti's poor showing. Regardless of when it could've or would've been ready, it's undoubtedly going to come out later than intended, because Tahiti sucks and lilK is now a GTX 680.

As for performance, there's no doubt the 680 provides exceptional performance for a mid-range part, but held to the same standards as previous next-gen flagship SKUs on a new process node in this flagship price range, it's the worst price:performance shift since the 9800 GTX. You can map out the increases yourself: the GTX 680 is ~35-40% faster than the GTX 580. Compare the GTX 480 to the GTX 285, or the GTX 280 to the 8800 GTX, and you'll see exactly what I'm talking about.

People don't seem to be considering the long-term ramifications here, though. GK104 mid-range selling as high-end sets a terrible precedent and lets Nvidia keep selling mid-range ASICs at high-end prices as long as they meet the (ever-decreasing) expectations of how much of an improvement makes upgrading worthwhile. It seems that nowadays +35-40% over 16-24 months is enough to command the same price premium, when it has always been +50% or more in the past. Bottom line: you get less for your money in the way of improvement and innovation than you did in the past. A shame, really; Moore's Law is dead in the GPU space now as well (I guess JHH will have to take that bullet point off his next slide deck).
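
To make that concrete: if the flagship price stays flat from one generation to the next, you can back out what the GTX 680 "should" cost to stay on the historical perf-per-dollar trajectory. The uplift figures below are this post's claims, not measured data:

[code]
FLAGSHIP_PRICE = 499.0     # GTX 580 and GTX 680 launch price
HISTORICAL_UPLIFT = 0.50   # "+50% or more" per generation, per the post above
GTX680_UPLIFT = 0.375      # midpoint of the claimed ~35-40% over the GTX 580

# Price where the 680's perf/$ matches the historical trajectory:
# (1 + GTX680_UPLIFT) / fair_price == (1 + HISTORICAL_UPLIFT) / FLAGSHIP_PRICE
fair_price = FLAGSHIP_PRICE * (1 + GTX680_UPLIFT) / (1 + HISTORICAL_UPLIFT)
print(f"Perf/$-parity price: ~${fair_price:.0f}")  # ~$457
[/code]

Which lands right around the $400-$450 range suggested above.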

-=HeliX=- Mod 3DV Game Fixes
My 3D Vision Games List Ratings

Intel Core i7 920 D0 @4.0GHz 1.29V | EVGA X58 Classified 760 | Win7 x64 Ultimate | Antec 620 LC
Galaxy GeForce GTX 670 GC SLI | Asus VG278H 120Hz LCD + 3D Vision 2 | 12GB Samsung DDR3 1600 35nm
Intel X25-M G2 160GB SSD | WD Black 3x1.5TB RAID 0 | Kingston HyperX 2x128GB SSD RAID 0
Sony STR-DG1000 A/V Receiver | Polk Audio RM6880 7.1 | LG Blu-Ray
Auzen X-Fi HT HD | Logitech G700/G110/G27 | CM HAF X Blue | Antec TPQ-1200W

#13
Posted 03/23/2012 03:10 AM   
Indeed, it would be pointless to opt for GTX 580s at this point, even considering current prices. GTX 680s are sold out, which isn't surprising. We'll see how the drivers shape up in the near future anyway.


#14
Posted 03/23/2012 03:13 AM   
[quote name='Tepescovir' date='22 March 2012 - 12:11 PM' timestamp='1332439911' post='1386390']
The only benchmarks I'm interested in are the NVIDIA Surround ones. My 580s will already max out every game out there on a single display, but I want to play BF3 in Surround, which three 580s can't do.
[/quote]

I want to see Surround numbers on two 680s myself, specifically in comparison to 580s & 7970s, because the results I found show the 7970s beating the 580s by 45% at 1080p x3 (5760x1080) in BF3 with Ultra settings, HBAO and no AA, with post-process AA on High. I'd love to see how the 680s do here, as it's a BIG selling point for me since I run that resolution and would really like to run this game maxed out or nearly maxed out (since no AA doesn't count as maxed out).

So if/when anyone gets two of these things, and you have Surround & BF3, PLEASE post a run-through on a map or something.

I'm going to do the same when I get my next set of cards, and I'll gladly do it with the 580s as well, while I still have them at least.
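
For context on why Surround numbers matter so much, the raw pixel math alone tells the story (simple arithmetic; real scaling is rarely this clean):

[code]
single = 1920 * 1080      # 2,073,600 pixels
surround = 5760 * 1080    # 6,220,800 pixels
print(surround / single)  # 3.0 -- triple the fill and shading work

# Naive estimate: a rig averaging 90 fps at 1080p has only ~30 fps of
# headroom at 5760x1080, assuming purely resolution-bound scaling.
[/code]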




Help fight Cancer, Alzheimer's and Parkinson's Disease by donating unused CPU and GPU power to Stanford University's Research Folding@Home projects:

Simplest method is to setup the FAH v7 client with this Windows Installation Guide

#15
Posted 03/23/2012 06:29 AM   