Memory Interface at 256 bit? Why not a wider memory interface like the 580?
Hello,

I was looking at the 680 and noticed it only has a 256-bit memory interface, compared to the 580's 384-bit. Being the newer high-end card, shouldn't it stay at least the same rather than actually getting narrower?

#1
Posted 04/03/2012 06:08 AM   
[quote name='Beast666' date='03 April 2012 - 02:08 AM' timestamp='1333433299' post='1391293']
Being the newer high-end card, shouldn't it stay at least the same rather than actually getting narrower?
[/quote]

Memory on the 680 is clocked higher, so bandwidth remained about the same as 580.
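The arithmetic is easy to check with the usual bus-width × data-rate formula. A quick sketch (the effective GDDR5 data rates below are the published figures, assumed here):

```python
def bandwidth_gbps(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times effective data rate."""
    return bus_width_bits / 8 * effective_clock_mhz / 1000

gtx_580 = bandwidth_gbps(384, 4008)  # 384-bit bus @ 4008 MHz effective
gtx_680 = bandwidth_gbps(256, 6008)  # 256-bit bus @ 6008 MHz effective

print(f"GTX 580: {gtx_580:.1f} GB/s")  # 192.4 GB/s
print(f"GTX 680: {gtx_680:.1f} GB/s")  # 192.3 GB/s
```

So the narrower bus is almost exactly offset by the 50% higher memory clock.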

"This is your code. These are also your bugs. Really. Yes, the API runtime and the
driver have bugs, but this is not one of them. Now go fix it already." -fgiesen

#2
Posted 04/04/2012 02:30 AM   
[quote name='Beast666' date='02 April 2012 - 10:08 PM' timestamp='1333433299' post='1391293']I was looking at the 680 and noticed it only has a 256-bit memory interface, compared to the 580's 384-bit.[/quote]

The general consensus is that the 680 was developed to be the 660, but it performed well enough [u]in games[/u] that it became the 680. And that's why it seems a bit different from previous flagship parts (i.e., just 2x6-pin power, 256-bit memory, restrictive overclocking controls, an unimpressive VRM setup, reduced GPGPU performance, etc.).

#3
Posted 04/04/2012 05:31 AM   
[quote name='xorbe' date='04 April 2012 - 07:31 AM' timestamp='1333517500' post='1391767']
The general consensus is that the 680 was developed to be the 660, but it performed well enough [u]in games[/u] that it became the 680. And that's why it seems a bit different from previous flagship parts (i.e., just 2x6-pin power, 256-bit memory, restrictive overclocking controls, an unimpressive VRM setup, reduced GPGPU performance, etc.).
[/quote]

Those are only rumors. NVIDIA stated early on that they were going to focus on increasing performance per watt, and I don't know how you find it hard to overclock; I think it's very easy.

OS:Windows 8.1 64-bit CPU: Intel Core i7-3930K - EK HF Supreme Nickel/Acetal Memory: 16GB Corsair Dominator GT CL9 2133mhz @2400mhz Motherboard: Rampage IV Formula Graphics: SLI 3x GTX 680 - EK Nickel/Acetal Waterblocks Sound: Asus Xonar Xense Storage: 2x Samsung 840 Pro 512GB, 2x Samsung 840 Pro 256GB, NAS for backups and various other files. Network: Intel 82579V Gigabit, (BBB 1Gbit fiber), Monitor: Samsung S27B971D 27" 2560x1440 PSU: Corsair AX1200 Others: Mad Catz Strike 7, Mad Catz MMO7 Mouse. Using 2x XTX 360 Radiator with 3+3 Gentle Typhoon 1850rpm push+pull, Aquaero 5 Pro.

#4
Posted 04/04/2012 05:37 AM   
[quote name='nvtweakman' date='03 April 2012 - 08:30 PM' timestamp='1333506646' post='1391716']
Memory on the 680 is clocked higher, so bandwidth remained about the same as 580.
[/quote]

Even if they were able to clock it higher, why not use that same tech to clock a 384-bit interface higher instead, getting a real performance increase rather than just matching the last card?

#5
Posted 04/04/2012 04:19 PM   
We dramatically increased the memory clock speeds of the GeForce GTX 680 compared to the GeForce GTX 580, which makes up for the reduction in the number of lanes on the memory controller.

Please send me a PM if I fail to keep up on replying in any specific thread, or leave driver feedback: Driver Feedback

#6
Posted 04/04/2012 06:35 PM   
The lower 256-bit memory interface was a surprise to me after the 384-bit one on the GTX 580, but so far it doesn't appear to have impacted performance in any way on my system, even with lots of anti-aliasing at 1920x1200. And the memory on my card overclocks easily to 6.8 GHz, which increases the memory bandwidth to 218 GB/sec, more than enough for any game I throw at it. In the reviews I've read, the GTX 680 often bests the 384-bit AMD HD 7970, which has a memory bandwidth of 264 GB/sec, so the 256-bit memory interface certainly doesn't impair the card. Of course, the question remains as to whether the GTX 680 would fare better with a 384-bit memory interface and the same memory clock, and the answer is: yes, but only at ridiculously high resolutions or with multi-display Surround gaming.
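For what it's worth, those figures are internally consistent with the usual width × data-rate formula (clock speeds taken from the post above):

```python
def bandwidth_gbps(bus_width_bits: int, effective_clock_mhz: float) -> float:
    # bits / 8 = bytes per transfer; times MT/s gives MB/s, divide by 1000 for GB/s
    return bus_width_bits / 8 * effective_clock_mhz / 1000

print(bandwidth_gbps(256, 6800))  # overclocked GTX 680: 217.6, i.e. the ~218 GB/sec quoted
print(bandwidth_gbps(384, 5500))  # HD 7970: 264.0 GB/sec
```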

Intel Core i7-4770K @ 3.5 GHz | Noctua NH-D14 cooler | ASUS Z87 Deluxe motherboard v1707 | 16 GB Corsair Vengeance LP 1,600 MHz DDR3 memory | 3 GB EVGA NVIDIA GeForce GTX 780 graphics | Creative X-Fi Titanium HD audio | 256 GB Crucial M4 SSD (boot) v070H | 2 x 2 TB Seagate Barracuda SATAIII hard drives | 1 x 1 TB Samsung Spinpoint F3 SATAII hard drive | 2 x Samsung SH-B123L BD-ROM/DVD rewriters | Cooler Master HAF 932 Advanced case | Cooler Master Silent Pro M850 PSU | Windows 8.1 Pro with Media Center (64-bit)

#7
Posted 04/04/2012 08:32 PM   
[quote name='Daz1967' date='04 April 2012 - 02:32 PM' timestamp='1333571533' post='1392042']
The lower 256-bit memory interface was a surprise to me after the 384-bit one on the GTX 580, but so far it doesn't appear to have impacted performance in any way on my system, even with lots of anti-aliasing at 1920x1200. And the memory on my card overclocks easily to 6.8 GHz, which increases the memory bandwidth to 218 GB/sec, more than enough for any game I throw at it. In the reviews I've read, the GTX 680 often bests the 384-bit AMD HD 7970, which has a memory bandwidth of 264 GB/sec, so the 256-bit memory interface certainly doesn't impair the card. Of course, the question remains as to whether the GTX 680 would fare better with a 384-bit memory interface and the same memory clock, and the answer is: yes, but only at ridiculously high resolutions or with multi-display Surround gaming.
[/quote]

I see what you're saying, but don't people generally buy the x80 card in a series specifically because they plan to use ridiculously high resolutions or multi-display setups? I mean, since NVIDIA was able to clock the 256-bit memory so high, and they already have a 384-bit design, why not clock that the same way, since it's the flagship of the 600 series?

#8
Posted 04/07/2012 07:03 PM   
We all know that the GTX 680 won't be the best-performing Kepler part. Why use six memory controllers to do what you can do with four? And it sells.

"This is your code. These are also your bugs. Really. Yes, the API runtime and the
driver have bugs, but this is not one of them. Now go fix it already." -fgiesen

#9
Posted 04/07/2012 10:52 PM   
If I recall correctly, the 280 used a 512-bit interface, so a wider bus doesn't necessarily mean better.

#10
Posted 04/11/2012 07:15 PM   