GTX 680 reviews (finally!) Nvidia has a winner on their hands
All the major review sites have now posted their Kepler reviews. I noticed Newegg even has a few PNY models in stock ($499 USD).

It seems Nvidia has a winner on their hands. Not only does the 680 trump the 7970 in most situations, it does so while being considerably more efficient. In particular, the 680's shader performance is hugely impressive, allowing it to match the dual-GPU GTX 590 in games like BF3.

New features include adaptive vsync and the ability to run games in surround with a single GTX 680.
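
(If you're wondering how adaptive vsync works: roughly, the driver only enforces vsync when you're at or above the panel's refresh rate. A little Python sketch of the policy, purely illustrative rather than Nvidia's actual driver code:)

[code]
# Illustrative sketch of the adaptive vsync policy (not Nvidia's
# actual driver logic, just the idea).
REFRESH_RATE_HZ = 60  # display refresh rate (assumed 60 Hz panel)

def vsync_enabled(current_fps: float) -> bool:
    # Classic vsync locks the frame rate to integer divisors of the
    # refresh rate, so dipping below 60 fps means a jarring drop to 30.
    # Adaptive vsync keeps vsync on at or above the refresh rate (no
    # tearing) and switches it off below it (some tearing, no stutter).
    return current_fps >= REFRESH_RATE_HZ
[/code]

Below the refresh rate you accept a bit of tearing instead of the jarring 60-to-30 fps drop classic vsync forces.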

I have to say, Nvidia has done a great job with GK104. Yeah, it's not the big-die beast we'd normally expect from Nvidia at launch, but it still easily takes the single-GPU crown.

Let's get to the reviews...

[url="http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review"][b]AnandTech[/b][/url]

[url="http://www.techradar.com/reviews/pc-mac/pc-components/graphics-cards/nvidia-geforce-gtx-680-1072796/review"][b]Techradar[/b][/url]

[url="http://www.guru3d.com/article/geforce-gtx-680-review/"][b]Guru3d[/b][/url]

[url="http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/"][b]TechPowerUp[/b][/url]

[url="http://hothardware.com/Reviews/NVIDIA-GeForce-GTX-680-Review-Kepler-Debuts/"][b]HotHardware[/b][/url]

[url="http://www.bit-tech.net/hardware/2012/03/22/nvidia-geforce-gtx-680-2gb-review/1"][b]Bit-Tech[/b][/url]

[url="http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/52616-nvidia-geforce-gtx-680-2gb-review.html"][b]HardwareCanucks[/b][/url]

#1
Posted 03/22/2012 04:54 PM   
And two more sites:


[url="http://www.hardwareheaven.com/reviews/1452/pg1/nvidia-geforce-gtx-680-kepler-graphics-card-review-introduction.html"][b]Hardware Heaven[/b][/url]

[url="http://alienbabeltech.com/main/?p=28910"][b]Alienbabeltech[/b][/url]




Overall I'm not so impressed, not at that price.

Both AMD's and NV's 28nm parts should be next-gen mid-range, not high-end, for what they offer, at least compared to the last high-end generation.

GK110 & AMD Tenerife, here I come :biggrin:


#2
Posted 03/22/2012 05:46 PM   
The only benchmarks I'm interested in are the Nvidia Surround ones. My 580s already max out every game out there on a single display, but I want to play BF3 in Surround, which three 580s can't do.


#3
Posted 03/22/2012 06:11 PM   
To me, the most impressive features are GPU Boost and Frame Rate Targeting, which help lower power consumption.
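
(Conceptually, a frame rate target is just a render loop that sleeps away whatever is left of its frame budget. Here's a minimal sketch, with render_frame() as a hypothetical stand-in for the real draw work:)

[code]
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds available per frame

def render_frame():
    pass  # hypothetical stand-in for the actual draw calls

while True:
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    # Sleep off the unused frame budget instead of rendering frames
    # the display can't show; the idle GPU clocks down, which is
    # where the power savings come from.
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
[/code]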

Also, you can do surround + 1 monitor on one GTX 680.


#4
Posted 03/22/2012 06:13 PM   
Really need to start reading through all these reviews! For those still harbouring the 'it should have been a mid-range GPU' argument, I still think the point of Kepler hasn't totally become clear. You know what I think happened here (pure conjecture, by the way)...

[list]
[*]Fermi performance was awesome, but many noted they were not happy with the size of the GPU, the amount of power required, and the amount of heat produced. Meanwhile people praised AMD for their improved efficiency.
[*]In designing Kepler, the focus is of course to increase performance across the board, but also improve upon those performance per watt figures.
[*]Things carry on happily this way.
[*]Someone leaks GPU die sizes, and people start jumping to conclusions: "It's small, so it must be mid-range". This assumption spreads until someone coins the '660 Ti' name based on 560 Ti die-size comparisons.
[*]AMD release their 7900 series, and someone from NVIDIA comments that they are pleasantly surprised (now it becomes very clear why).
[*]GTX 680 branding gets leaked, and people assume, based on the aforementioned comments and the leaked die size, that the 660 Ti has suddenly become a 680.
[*]Speculation ensues.
[/list]

Of course you could argue that it would have helped if NVIDIA had publicised what was coming, but as you all know this is not how things are done - by either vendor. Consider that you are also implying that NVIDIA would/could completely change their chosen direction, branding, marketing, PCB design etc. in such a short space of time :blink:

Kepler gives us the new fastest GPU in the world by a good margin, with a raft of new features (including surround off a single GPU), but also an amazing improvement in performance per watt - twice the performance per watt per SM, as mentioned in the linked article. The conclusion in that release sums it up well:

[quote]Gamers told us they want GPUs that are cooler, quieter, and more power efficient. So we re-designed the architecture to do just that. The GeForce GTX 680 consumes less power than any flagship GPU since the GeForce 8800 Ultra, yet it outperforms every GPU we or anyone else have ever built.[/quote]

And it comes in at the same price point (or cheaper in many cases) than the AMD equivalent on launch day... moderator title completely aside, I think it's an awesome piece of work.

Community Moderator - not employed by NVIDIA.

#5
Posted 03/22/2012 09:12 PM   
[quote name='jimbonbon' date='22 March 2012 - 04:12 PM' timestamp='1332450742' post='1386476']
Really need to start reading through all these reviews! For those still harbouring the 'it should have been a mid-range GPU' argument, I still think the point of Kepler hasn't totally become clear. You know what I think happened here (pure conjecture, by the way)...

[list]
[*]Fermi performance was awesome, but many noted they were not happy with the size of the GPU, the amount of power required, and the amount of heat produced. Meanwhile people praised AMD for their improved efficiency.
[*]In designing Kepler, the focus is of course to increase performance across the board, but also improve upon those performance per watt figures.
[*]Things carry on happily this way.
[*]Someone leaks GPU die sizes, and people start jumping to conclusions: "It's small, so it must be mid-range". This assumption spreads until someone coins the '660 Ti' name based on 560 Ti die-size comparisons.
[*]AMD release their 7900 series, and someone from NVIDIA comments that they are pleasantly surprised (now it becomes very clear why).
[*]GTX 680 branding gets leaked, and people assume, based on the aforementioned comments and the leaked die size, that the 660 Ti has suddenly become a 680.
[*]Speculation ensues.
[/list]

Of course you could argue that it would have helped if NVIDIA had publicised what was coming, but as you all know this is not how things are done - by either vendor. Consider that you are also implying that NVIDIA would/could completely change their chosen direction, branding, marketing, PCB design etc. in such a short space of time :blink:

Kepler gives us the new fastest GPU in the world by a good margin, with a raft of new features (including surround off a single GPU), but also an amazing improvement in performance per watt - twice the performance per watt per SM, as mentioned in the linked article. The conclusion in that release sums it up well:

[quote]Gamers told us they want GPUs that are cooler, quieter, and more power efficient. So we re-designed the architecture to do just that. The GeForce GTX 680 consumes less power than any flagship GPU since the GeForce 8800 Ultra, yet it outperforms every GPU we or anyone else have ever built.[/quote]

And it comes in at the same price point (or cheaper in many cases) than the AMD equivalent on launch day... moderator title completely aside, I think it's an awesome piece of work.
[/quote]

Amen, preacher. That is all.


#6
Posted 03/22/2012 10:44 PM   
I agree wholeheartedly, jimbonbon. I truly don't understand what the naysayers are complaining about. The GTX 680 trumps the dual-GPU GTX 590 in games like BF3 while consuming around 190W of juice. On top of that, the new features the 600 series brings to the table are most welcome. I think this is a GPU we should all be excited about.

I'm happy with the direction Nvidia is going here. They've developed a lean GPU that's wicked fast. What's not to like? I'll admit I would have liked to see the price a bit lower, but considering the competition I think it's fair.

#7
Posted 03/22/2012 11:18 PM   
Nicely said, jimbonbon. I never liked adding anything to that long-overdue speculation thread; I hardly even looked at it. It's a waste of time, really, posting speculation and, worse, arguing about something none of us knew anything about.
That being said, I still stick to my view: before the GTX 680's release, the HD 7970 was without question the ideal choice over what was available at the time, and I would have recommended it to anyone without hesitation.
But now things are nicer :tongue:

#8
Posted 03/22/2012 11:58 PM   
I hate prices in NZ though.
I assume it will be like this all over the world, right?
Here it will launch at NZ$999, while converting US$499 to NZD I get [url="http://www.xe.com/ucc/convert/?Amount=499&From=USD&To=NZD"]this[/url].
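
(Rough numbers, assuming a hypothetical exchange rate of about 1.22 NZD per USD: 499 x 1.22 comes to roughly NZ$609. Even allowing, say, NZ$100 for shipping and import costs, that's around NZ$709 against NZ$999 locally.)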

So if I buy it from the US and pay shipping to get it sent to me, I still pay far less than buying it here in NZ.

#9
Posted 03/23/2012 12:35 AM   
That's nuts... but seriously, that really sucks. Prices here range from $499-534 (with MIR, lol).
#10
Posted 03/23/2012 01:25 AM   
I'm still waiting for Folding performance numbers.

I see 4GB models will be out in about two months.
Love looking at the EVGA lineup.
#11
Posted 03/23/2012 01:29 AM   
[quote name='chiz' date='23 March 2012 - 03:28 AM' timestamp='1332469729' post='1386630']
Nvidia reps should really consider the ramifications of treating their customers as if they're uninformed or ignorant. In the end, though, it just makes you look dishonest; is that the image you really want to put out there? Better to just own up to it: LilK performed better than expected, Tahiti sucked, and BigK isn't quite ready and is yet to be released, so we get a mid-range part selling at $500 flagship prices.

There's about 100 bits of evidence all pointing to GK104 initially targeting mid-range performance SKUs but getting bumped up to prime time as a result of Tahiti's poor showing. Here's a nice summary; many of the breadcrumbs are provided by Nvidia, they just didn't do a very good job of sweeping the evidence under the rug. But I guess that's what happens when you spend 2+ years developing a product line and then decide to change gears at the eleventh hour before launch.

http://www.techpowerup.com/forums/showthread.php?t=162901

That said, the only thing not to like about lilKepler is its pricing. $400-$450 would've made a lot more sense and been a better reflection of its performance improvement over last-gen's GTX 580, but at the very least it provides a glimpse of what is to come with BigK. Should be great; it will be nice to see what Nvidia's flagship is capable of if their mid-range performs this well! :thumbup:
[/quote]

BigK (as you call it) may actually be ready for prime time, but the green team sees no need to release it at this point. If this is the case, I really can't blame them from a business standpoint (that doesn't mean I like it though). Either way, we don't know exactly what's going on behind the scenes, so this is all just speculation.

What we know for sure is that the 680 chews through modern games with ease. Call it "mid-range" all you want, but this GPU offers high-end performance.

#12
Posted 03/23/2012 02:47 AM   
[quote name='slamscaper' date='22 March 2012 - 10:47 PM' timestamp='1332470864' post='1386636']
BigK (as you call it) may actually be ready for prime time, but the green team sees no need to release it at this point. If this is the case, I really can't blame them from a business standpoint (that doesn't mean I like it though). Either way, we don't know exactly what's going on behind the scenes, so this is all just speculation.

What we know for sure is that the 680 chews through modern games with ease. Call it "mid-range" all you want, but this GPU offers high-end performance.
[/quote]
I'm sure BigK's release schedule has been impacted by Tahiti's poor showing. Regardless of when it could've or would've been ready, it's undoubtedly going to come out later than intended because Tahiti sucks and lilK is now a GTX 680.

As for performance, there's no doubt the 680 provides exceptional performance for a mid-range part, but when held to the same standards as previous next-gen flagship SKUs on a new process node in this flagship price range, it's the worst price:performance shift since the 9800GTX. You can map out the increases yourself: the GTX 680 is ~35-40% faster than the GTX 580. Compare the GTX 480 to the GTX 285, or the GTX 280 to the 8800GTX, and you'll see exactly what I'm talking about.
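
(Quick arithmetic on that, using launch MSRPs from memory, so treat them as approximate: the GTX 580 and GTX 680 both launched at $499, so a ~35-40% performance uplift is also only a ~35-40% perf-per-dollar uplift at the flagship tier, where the ~50%+ jumps of past generations bought you ~1.5x or better for the same money.)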

People don't seem to be considering the long-term ramifications here, though. GK104 mid-range selling as high-end sets a terrible precedent and allows Nvidia to continue selling mid-range ASICs at high-end prices as long as they meet the (ever-decreasing) standards and expectations of how much of an improvement makes upgrading worthwhile. It seems nowadays +35-40% over 16-24 months is enough of an increase to command the same price premiums, when it's always been +50% or more in the past. Bottom line is this: you get less for your money in the way of improvement and innovation than you did in the past. Shame really; Moore's Law is dead in the GPU space now as well (I guess JHH will have to take that bullet point off his next slide deck).

#13
Posted 03/23/2012 03:10 AM   
Indeed, it would be pointless to opt for GTX 580s at this point, even considering the current prices. GTX 680s are sold out, not surprising though; we'll see how the drivers work out in the near future anyway.

#14
Posted 03/23/2012 03:13 AM   
[quote name='Tepescovir' date='22 March 2012 - 12:11 PM' timestamp='1332439911' post='1386390']
The only benchmarks I'm interested in are the Nvidia Surround ones. My 580s already max out every game out there on a single display, but I want to play BF3 in Surround, which three 580s can't do.
[/quote]

I want to see Surround numbers on two 680s myself, specifically in comparison to 580s and 7970s, because the results I found show the 7970s beating 580s by 45% at 3x1080p (5760x1080) in BF3 with Ultra settings, HBAO, no AA, and Post Process AA on High. I'd love to see how the 680s do here, as it's a BIG selling point for me since I run that resolution and would really like to run this game maxed out or nearly maxed out (since no AA doesn't count as maxed out).
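
(The raw pixel math shows why Surround numbers matter so much here: 5760x1080 is ~6.22 million pixels per frame versus ~2.07 million at 1920x1080, i.e. exactly triple the fill and shading work before any AA multiplies it further.)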

So if/when anyone gets two of these things, and you have Surround and BF3, PLEASE post a run-through on a map or something.

I'm going to do the same when I get my next set of cards, and I'll gladly do it with the 580s as well, at least as long as I still have them.

#15
Posted 03/23/2012 06:29 AM   