GTX 260 and GTX 280 SLI Investigation
[b] The GTX 260 and GTX 280[/b]


[b]Introduction:[/b] It's been a while since we have gotten a true single-GPU upgrade to the aging Geforce 8800GTX. We have seen incremental refreshes such as the 9800GTX, which brought 8800GTX performance to the masses, as well as multi-GPU cards such as the 9800GX2, which brought SLI to one board. While the 9800GX2 delivered great performance, it had its drawbacks, such as relying on AFR for performance and being limited to 512 megs of memory, a step back from the 768 megs available on the 8800GTX. Today Nvidia is launching the Geforce GTX 260 and Geforce GTX 280.

[b]What's in a name?[/b]: Nvidia took some criticism in the past for its confusing naming convention with the G92 cards, namely the Geforce 8800GT and Geforce 9800GTX. Going by names alone it was difficult to determine what card you were actually getting. To simplify this, Nvidia has adopted a new naming scheme: Geforce GTX XXX. The idea is that GTX represents the high end and the number represents the performance expectations. We'll see if this new naming convention helps consumers and OEMs make easier choices.


[b]GTX 280[/b]

[url="http://chrisray.soterial.com/GT200/SystemCards/GTX280front.jpg"][img]http://chrisray.soterial.com/GT200/Thumbnail/GTX280frontthumb.jpg[/img][/url]

[url="http://chrisray.soterial.com/GT200/SystemCards/GTX280connector.jpg"][img]http://chrisray.soterial.com/GT200/Thumbnail/GTX280connectorthumb.jpg[/img][/url]

[b]GTX 260[/b]

[url="http://chrisray.soterial.com/GT200/SystemCards/GTX260front.jpg"][img]http://chrisray.soterial.com/GT200/Thumbnail/GTX260frontthumb.jpg[/img][/url]

[url="http://chrisray.soterial.com/GT200/SystemCards/GTX260back.jpg"][img]http://chrisray.soterial.com/GT200/Thumbnail/GTX260connectorthumb.jpg[/img][/url]

[b]First Thoughts[/b]: Just looking at the cards, they are identical: same cooler, same VGA/DVI connectors. But under the hood they can be quite different. The GTX 260 has part of its shader/ROP/texture units disabled in comparison to the GTX 280, and it only requires two 6-pin connectors, while the GTX 280 requires one 6-pin and one 8-pin connector. Let's take a closer look at the technical specs to see the differences in detail.

[b]The GT200 Architecture[/b]

[url="http://chrisray.soterial.com/GT200/Graphs/GT200arch.png"][img]http://chrisray.soterial.com/GT200/Thumbnail/GT200archthumb.png[/img][/url]

[img]http://chrisray.soterial.com/GT200/Graphs/GPUspecs.png[/img]

[img]http://chrisray.soterial.com/GT200/Graphs/GPUClocks.png[/img]

[b]Specs Explored[/b]: At a glance you can see why these chips carry 1.4 billion transistors. The shader count has been massively increased and the ROPs have been doubled compared to the Geforce 9800GTX. The GTX 280 features 1 gigabyte of video memory. The GTX 260 has had two of its clusters disabled; as a result, its stream processors, texture mapping units, ROPs, and memory bus have been cut slightly, and it features 896 megs of memory. The GTX 260's position is very similar to that of the 8800GTS 640 at the Geforce 8800GTX launch. The Geforce GTX 280 has 141 gigs of bandwidth while the Geforce GTX 260 carries a more conservative 112 gigs.
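As a sanity check, the quoted bandwidth figures fall straight out of bus width and memory clock. A quick illustrative sketch, assuming the widely reported launch memory clocks of 1107 MHz (GTX 280) and 999 MHz (GTX 260), both GDDR3 and therefore double-pumped:

```python
def bandwidth_gbs(bus_bits, mem_clock_mhz):
    # peak bandwidth: bus width (bits) * 2 transfers/clock (GDDR3) * clock,
    # divided by 8 bits/byte, expressed in GB/s
    return bus_bits * (mem_clock_mhz * 2) * 1e6 / 8 / 1e9

print(round(bandwidth_gbs(512, 1107), 1))  # GTX 280: 141.7 GB/s
print(round(bandwidth_gbs(448, 999), 1))   # GTX 260: 111.9 GB/s
```

Both results line up with the 141 and 112 gig figures above.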

[b]Dual Issue Shader[/b]: With the Geforce 8800 and Geforce 9800 series, each shader was capable of performing one MADD per clock, and under certain circumstances an additional MUL. However, most of the time this MUL went unexposed. With the GTX 200 series, the MAD + MUL issue for each unit is fully exposed under all circumstances. This can lead to a clock-for-clock performance improvement from the shaders. The range seems to vary from 3-15% depending on the type of shader being rendered, but those numbers are based on a few limited tests I saw. It's hard to quantify how much this will matter across a wide range of titles.
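To put a theoretical number on what the exposed MUL is worth: each SP can retire a MAD (2 flops) plus a MUL (1 flop) per shader clock. Assuming the commonly quoted 240 SPs and 1296 MHz shader clock for the GTX 280 (spec-sheet figures, not measured here), the peak moves from roughly 622 to 933 GFLOPS; real shaders land well short of either ceiling, hence the modest 3-15% seen in practice.

```python
def peak_gflops(sp_count, shader_clock_mhz, flops_per_clock):
    # theoretical peak: SPs * shader clock (MHz) * flops issued per SP per clock
    return sp_count * shader_clock_mhz * flops_per_clock / 1000.0

print(peak_gflops(240, 1296, 2))  # MAD only: 622.08 GFLOPS
print(peak_gflops(240, 1296, 3))  # MAD + MUL exposed: 933.12 GFLOPS
```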

[b]Geometry Shader[/b]: Internal Buffer structures have been upsized by 6x. This should provide a significant improvement to streamout performance.

[b]Register Space and Double Precision:[/b] The register space on the G200 shader array has been doubled. This should allow for better performance in heavy shader utilization situations as well as open up support for Double Precision.

[img]http://chrisray.soterial.com/GT200/Thumbnail/purevideoHD.png[/img]

[b]Video[/b]: The Geforce GTX 260/280 cards have the same PureVideo HD functionality as the G92 architecture, and they add a special power-saving mode for DVD playback.
#1
Posted 06/06/2008 04:28 PM   
[b]GT200 Power Requirements and Power Saving[/b]

[b]Power Requirements:[/b] The GTX 280 requires one six-pin connector and one eight-pin connector. Using an adaptor is not recommended unless you can ensure your PSU's molex lines are capable of delivering 150 watts. The GTX 260 requires two six-pin connectors, similar to prior high-end cards. The GTX 280 consumes a maximum of 240 watts while the GTX 260 consumes 192 watts at full load.
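Those connector requirements line up with the PCI-SIG power budgets: 75 W from the x16 slot, 75 W per six-pin, and 150 W from an eight-pin. A quick tally (spec ceilings, not measured draw) shows why the GTX 280 needs the eight-pin:

```python
# PCI-SIG power limits in watts (spec figures, not measurements)
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

gtx280_budget = SLOT + SIX_PIN + EIGHT_PIN  # 300 W ceiling vs. 240 W max draw
gtx260_budget = SLOT + 2 * SIX_PIN          # 225 W ceiling vs. 192 W max draw
print(gtx280_budget, gtx260_budget)
```

Two six-pins would cap the GTX 280 at 225 W, below its 240 W maximum, which is exactly why an underspecced molex-to-eight-pin adaptor is a risk.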

[b]Power Saving Tech:[/b] The GTX 200 series features a lot of advanced power-saving tech, powering the card down in 2D/desktop mode. When idle, the clocks are significantly throttled down, along with changes to the vcore. The result is significantly lower power consumption at idle than previous high-end hardware.

[img]http://chrisray.soterial.com/GT200/Graphs/GT200powermodes.png[/img]

[b]GTX 280 2D Clocks[/b]

[img]http://chrisray.soterial.com/GT200/Graphs/lowpowerGTX280.png[/img]

[b] GTX 260 2D Clocks[/b]

[img]http://chrisray.soterial.com/GT200/Graphs/lowpowerGTX260.png[/img]

[b]Thoughts[/b]: These idle power strides are significant for pretty much any gamer who uses their computer for tasks besides gaming. High-end SLI users will really appreciate this, as a 3-way SLI setup will only consume 75 watts at idle versus the 140-200 of other 3-way SLI setups. That's a significant amount of heat NOT dumped into your room environment. The GTX 200 series also supports Hybrid Power like the 9800GX2 and 9800GTX, which can completely power down the GPU with a supporting Nvidia chipset.

[b]Important Note[/b]: The GTX 260 used in this review was an engineering sample. Newer models of the card contain a BIOS with a slightly faster fan RPM, which will affect idle/load temperatures.


[b]CUDA: What does it mean for me?[/b]

[b]Opening Thoughts[/b]: I wanted this section to be larger, but unfortunately I have been unable to obtain the applications I needed to fully discuss it. We've been hearing about GPU computing for a while, and CUDA has been talked about since the Geforce 8800GTX was first announced. I'll be the first to admit that back then it was mostly a yawner because it didn't really affect gaming in any serious way. Things are changing now, however. There are dozens of apps out there which support CUDA. Many of them probably will not affect the average joe gamer, obviously, but CUDA is providing a real alternative to supercomputing solutions right now.


[url="http://chrisray.soterial.com/GT200/Cuda/cudaapps.png"][img]http://chrisray.soterial.com/GT200/Thumbnail/cudaappsthumb.png[/img][/url]

[b]Thoughts[/b]: Most of these apps probably aren't going to be of much interest to the gamer or most end users. One notable app was able to encode 1080p video to iPod video format at 18x the speed of a quad-core CPU; Nvidia demo'd encoding a 2-hour movie in 38 minutes. The most notable thing CUDA will probably bring is....
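Working the encode numbers backwards, an 18x speedup over a 38-minute GPU encode implies roughly an 11-hour run on the quad-core. A quick check, using only the figures quoted above:

```python
gpu_minutes = 38   # demoed GPU encode time for the 2-hour movie
speedup = 18       # claimed advantage over a quad-core CPU

cpu_hours = gpu_minutes * speedup / 60
print(round(cpu_hours, 1))  # about 11.4 hours for the same encode on the CPU
```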

[b]GTX 280[/b]

[url="http://chrisray.soterial.com/GT200/Graphs/GTX280folding.png"][img]http://chrisray.soterial.com/GT200/Thumbnail/GTX280foldingthumb.jpg[/img][/url]


[b]GTX 260[/b]

[url="http://chrisray.soterial.com/GT200/Graphs/GTX260folding.png"][img]http://chrisray.soterial.com/GT200/Thumbnail/GTX260foldingthumb.jpg[/img][/url]

[b]Folding At Home CUDA Client[/b]: At long last there's a Folding@home client for Nvidia hardware. I wanted to test the client more, but I unfortunately got it late in the review cycle. The GTX 260 averages about 450 ns while the GTX 280 averages about 510 ns. Both took about 26-32 minutes to complete a project. The client is still beta but should be released soon. I want to emphasize that all DX10 Geforce owners can experience Folding@home now. This has been a long time coming, and while it's a great step for showing the potential of CUDA, it's even more important that this will benefit others and save lives in the process.


[b]PhysX[/b]: With Nvidia's recent buyout of Ageia, they have been put in a pretty interesting position. The Ageia API has already been ported to CUDA in a matter of months, versus the year and a half Ageia took to write it all in assembly. All Geforce cards since the 8 series are going to support it. While no games are currently available that support it, Nvidia demos presented a lot of PhysX-enabled software coming in the next year. Some of it looked very interesting, and a lot of it impacted gameplay in a way that the game just wouldn't have been the same without it. The PhysX API is compatible with the Xbox 360, PS3, and PC, and it looks like it may have a positive influence on the gaming industry.

[b]Final Thought[/b]: Both CUDA and PhysX have the potential to make a GPU purchase more than just a GPU purchase. It's one of those things that will have to be watched closely in the coming months, because that potential could change the way we perceive graphics cards.
#2
Posted 06/07/2008 02:16 AM   
[b] Test Setup and Configurations[/b]

[img]http://chrisray.soterial.com/GT200/Graphs/systemspecs.png[/img]

[b] GTX 280 SLI[/b]

[url="http://chrisray.soterial.com/GT200/SystemCards/GTX280SLI.jpg"][img]http://chrisray.soterial.com/GT200/Thumbnail/280SLIthumb.jpg[/img][/url]

[b] GTX 260 SLI[/b]

[url="http://chrisray.soterial.com/GT200/SystemCards/GTX2603way.jpg"][img]http://chrisray.soterial.com/GT200/Thumbnail/2603waySLIthumb.jpg[/img][/url]




[b] Crysis Testing Medium[/b]


[img]http://chrisray.soterial.com/GT200/Graphs/crysismedium.png[/img]

[b]Performance Thoughts:[/b] There's nothing particularly of interest here. The GTX 260 and GTX 280 both pull ahead of the 9800GX2. Crysis is not a particularly good scaler for SLI, and the 9800GX2 doesn't perform very well here. Unfortunately, Crysis also seems heavily CPU dependent at these settings, and SLI reaches a hard CPU bottleneck. It should be noted that this game's SLI performance will scale better as your CPU clock speeds increase.

[b] Crysis Testing High[/b]


[img]http://chrisray.soterial.com/GT200/Graphs/Crysishigh.png[/img]


[b]Performance Thoughts:[/b] These settings tell a slightly different story. The 9800GX2 and 9800GTX solutions fall behind the GTX 260 and GTX 280 rather significantly, most likely due to running out of video memory with only 512 megs on board. At these settings, significant stuttering was observed from the 9800GTX and 9800GX2 configurations because they lack the memory to run this resolution. As with medium settings, Crysis SLI scaling can be very CPU dependent; raising your CPU clocks will increase relative performance. The GTX 260 SLI, GTX 280, and GTX 280 SLI provide the most consistent experience. 3-way SLI in this case provides very little benefit.


[b]Crysis Very High[/b]


[img]http://chrisray.soterial.com/GT200/Graphs/crysisveryhigh.png[/img]

[color="#008000"]
[b]Performance Thoughts:[/b] First, I apologize for not including other cards; the 9800GTX and 9800GX2 struggled with AA enabled to the point that they were unable to complete the benchmark at very high. It is also important to note that these benches were added late in the test phase, so they were run on an [b]Intel[/b] system. Here we actually see very good SLI scaling due to the extreme pixel bottlenecks, with the extra available memory preventing the cards from becoming memory bound. In this case, the extra memory available to the GTX 280/260 cards is very useful. None of the single-card solutions were very playable, however. Once you turn on 2-way SLI, both systems reached playable settings with 16xAA enabled, an impressive feat. The 3-way SLI system manages to once again pull away from the other configurations tested.

[b]Note:[/b] DX10 mode struggled to properly enable on my HDTV, so 1680x1050 was the resolution tested.

[/color]


[b]Bioshock Testing[/b]

[img]http://chrisray.soterial.com/GT200/Graphs/Bioshockgraph.png[/img]

[b]Performance Thoughts:[/b] The GTX 260 and GTX 280 do exceptionally well at 16xAA and 16xQ settings. The superior bandwidth of the GTX 280 allows it to pull significantly ahead of the 9800GX2 with 16xQ enabled, while the GTX 260 is able to match the 9800GX2 in raw FPS. The dual-issue shader capabilities may also be playing a role in the performance here. SLI scaling couldn't be better, with the GTX 260 3-way SLI setup holding a significant lead over the other solutions.
#3
Posted 06/07/2008 05:29 AM   
[b] Unreal Tournament 3 Performance[/b]

[img]http://chrisray.soterial.com/GT200/Graphs/UT3graph.png[/img]

[b]Performance Thoughts[/b]: Using the Deimos map we can see decent multi-GPU scaling, though nowhere near as impressive as Bioshock. Regardless of scaling, the GTX 280 is able to pull ahead of the 9800GX2 by a few percent, while the GTX 260 remains slightly behind it. Two GTX 280s in SLI are also able to maintain their lead over the Quad 9800GX2 setup, while the GTX 260 drags a little behind. Once 3-way SLI is enabled, the GTX 260 pulls ahead of all configurations tested. Due to the constant crashing of the 512-meg cards with 16xQ enabled, 16xQ results were excluded from this comparison.


[b] Lost Planet Results[/b]


[img]http://chrisray.soterial.com/GT200/Graphs/lostplanetbench.png[/img]

[b]Performance Thoughts:[/b] First, the demo did not like the HDTV used for testing, so a standard 1680x1050 widescreen resolution was used. To increase GPU load, 16xQ was selected. Also, unlike prior tests, these results were recorded not with Fraps but with the built-in benchmarking utility. The Lost Planet benchmark has built-in multi-GPU support and scales exceedingly well in SLI configurations. The GTX 280 is once again able to pull ahead of the 9800GX2 while the GTX 260 stays a little behind. In SLI mode, the GTX 280 holds the same lead over Quad SLI. The GTX 260 comes up behind the 9800GX2 configuration in both single and SLI mode but pulls ahead of the Quad 9800GX2 and GTX 280 SLI setups with 3-way SLI enabled.

[b]Fear Results[/b]

[img]http://chrisray.soterial.com/GT200/Graphs/fearresults.png[/img]

[b]Performance Thoughts[/b]: With FEAR I was unable to provide 16xQ results because, like UT3, the game kept crashing on the 512-meg solutions with it enabled. Nonetheless, I set maximum details with soft shadows enabled for this testing. The GTX 260 and GTX 280 follow similar trends to Lost Planet and Bioshock as far as overall performance goes; nothing particularly interesting or out of the ordinary, all things considered. The GTX 260 3-way solution once again provides the highest framerates.

[b] Neverwinter Nights 2 [/b]

[img]http://chrisray.soterial.com/GT200/Graphs/NWN2.png[/img]

[b]Performance Thoughts[/b]: Neverwinter Nights 2 also does not like the HDTV resolution, so 1680x1050 was the resolution of choice. The in-game AA settings were also limited, to only 8x multisampling. With this title, two interesting things can be noted. One, it is a largely shader-bound title where normal maps are applied to every surface in the game. Two, the game seems to suffer memory limitations at maximum settings, particularly with high shadows enabled. This allows the GTX 260 and GTX 280 SLI solutions to pull above the Quad 9800GX2 in performance. The GTX 260 3-way SLI also shows strong signs of the application becoming CPU limited with so much GPU power.

[b] Age of Conan[/b]:

[img]http://chrisray.soterial.com/GT200/Graphs/AoCgraph.png[/img]

[b]Performance Thoughts[/b]: Age of Conan is the latest MMORPG on the market. It makes use of multiple cores, SM 3.0, and HDR. Future revisions are supposed to support DX10, but these tests were conducted with the launch DX9 renderer. This title shows performance similar to Unreal Tournament 3, though the performance spread between cards is small. The 9800GX2, GTX 260, and GTX 280 are able to provide playable performance with a single card, but truly comfortable performance only comes from the SLI solutions, with the GTX 280 pulling ever so slightly ahead of the Quad 9800GX2. Like previous tests, the 3-way GTX 260 SLI solution does pull ahead of the other SLI solutions.

[b]Note:[/b] 16xQ results were not included due to the 512-meg cards being unable to perform, exhibiting texture errors and crashes.
#4
Posted 06/07/2008 06:40 PM   
[b] Final Thoughts and Conclusion[/b]

[b]Quick Note[/b]: Since a lot of configurations were tested here with multiple levels of SLI, I am going to try to discuss them at their various price and performance levels.


[b] The GTX 280[/b]

[img]http://chrisray.soterial.com/GT200/Thumbnail/GTX280connectorthumb.jpg[/img]

[b]GTX 280 Versus 9800GX2:[/b] There's no doubt about it: the GTX 280 is the better card here, for multiple reasons. Not only does it provide 1 gigabyte of framebuffer memory for the card to access, it also consistently outperforms the 9800GX2 without the need for AFR rendering. This is important for those who are sensitive to AFR inter-frame latency issues. When the geometry shader becomes more of an issue, the GTX 280 will have a significant advantage as well. The GTX 280 also does not have artificial performance barriers, such as 512 megs of memory, hamstringing it from achieving its full potential, and it consumes significantly less power while idling. The downsides are that at load the GTX 280 runs hotter and consumes more power than the 9800GX2.

[b]Final Note:[/b] For those wanting the single most powerful GPU Nvidia has to offer, this is it. The retail price of this card is not cheap, however: 650 dollars.

[b]GTX 280 SLI[/b]: GTX 280 SLI versus Quad 9800GX2 is pretty much the same story. The only difference is that here both solutions use AFR and profiles to provide performance boosts. However, with the power behind GTX 280 SLI, it never finds its rendering performance hamstrung by the 512 meg limit the 9800GX2 suffers. This allows it to pull far ahead of the 9800GX2 with 16xQ and higher levels of AA/resolutions enabled. However, at a 1300 dollar asking price, the GTX 280 does have to contend with another solution: the GTX 260 3-way SLI, which I'll get to in a moment. The 9800GX2 also would occasionally interfere with LED/PWR connectors on the bottom slot, while the GTX 280 cooler is designed not to do this.
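To see why 512 megs becomes a hard wall at 16xQ, a rough framebuffer estimate helps. The function and sample counts below are my own simplification, not Nvidia's figures:

```python
# Back-of-the-envelope estimate (not a vendor number): a multisampled render
# target stores roughly one color and one depth/stencil value per stored
# sample. Real drivers add coverage data, compression, and alignment, so
# treat this as a lower bound on memory pressure.

def msaa_target_mib(width, height, stored_samples, color_bytes=4, depth_bytes=4):
    per_pixel = stored_samples * (color_bytes + depth_bytes)
    return width * height * per_pixel / (1024 ** 2)

# 2560x1600 with 8 stored color/Z samples (about what 16xQ keeps, before
# its extra coverage samples):
print(round(msaa_target_mib(2560, 1600, 8)))  # → 250 (MiB)
```

Roughly 250 MiB for the render target alone leaves a 512 meg card almost nothing for textures and geometry, which is consistent with the 16xQ crashes noted earlier, while the 896 meg and 1 gig cards keep plenty of headroom.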

[b]The GTX 260[/b]

[img]http://chrisray.soterial.com/GT200/Thumbnail/GTX260connectorthumb.jpg[/img]

[b]GTX 260 versus 9800GX2[/b]: The GTX 260 finds itself in an interesting position. Usually it falls slightly behind the 9800GX2 in raw FPS, but one must never forget that the 9800GX2 relies on profiles and AFR for its performance. When you take these things into account, it's difficult to recommend the 9800GX2 over a single GTX 260. The GTX 260 requiring only two 6-pin connectors versus the 9800GX2's power requirements is also a compelling argument. The GTX 260 does only have 896 megs of memory, but that is still far and above the 512 meg limitation suffered by the 9800GX2, which allows it to pull ahead in certain situations such as Crysis at High settings. The GTX 260 retails for 400 dollars, which puts the 9800GX2 in a difficult place. I am guessing the 9800GX2 will eventually be phased out.

[b]GTX 260 SLI versus 9800GX2 Quad SLI[/b]: The recommendation in this case echoes itself. Requiring only 6-pin connectors, the GTX 260 solution will be open to more power supplies. Like the 9800GX2 Quad solution, it does rely on AFR rendering and SLI profiles. The GTX 260 is a little slower in some situations, but the additional power features and shader capabilities of the GTX 260 make it a more future-proof solution. As with the GTX 280 cooler, you will never have to worry about the bottom card hitting the LED/PWR connectors that the 9800GX2 would tend to squish up against.

[b] GTX 260 3-way SLI versus GTX 280 SLI[/b]: This choice is quite a bit more interesting. The Geforce GTX 260 retails for 400 dollars versus the 650 of the GTX 280. In a 3-way SLI configuration you are paying a net total of 1200 dollars and getting superior performance most of the time, while the GTX 280 SLI setup costs 1300 dollars. Another factor to consider is that the GTX 260 only requires two six-pin connectors rather than the eight-pin requirement of the GTX 280. The downsides are that the GTX 260 3-way SLI solution consumes more power at load and idle and will require good airflow and case setup. You will also need a 3-way capable SLI motherboard. Unfortunately I was unable to test the GTX 260 SLI.
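The dollar math above, using the retail prices quoted in this review, works out as:

```python
# Cost comparison from the retail prices quoted above:
# $400 per GTX 260, $650 per GTX 280.
GTX_260, GTX_280 = 400, 650

configs = {
    "GTX 260 3-way SLI": 3 * GTX_260,
    "GTX 280 2-way SLI": 2 * GTX_280,
}
for name, cost in configs.items():
    print(f"{name}: ${cost}")
# GTX 260 3-way SLI: $1200
# GTX 280 2-way SLI: $1300
```

So the 3-way 260 setup undercuts GTX 280 SLI by 100 dollars, trading that saving against higher total power draw and the need for a 3-way capable board.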

[b] Overall Thoughts and Conclusions[/b]

[img]http://chrisray.soterial.com/GT200/SystemCards/GT200medusademo.jpg[/img]

[b] Lasting Thoughts[/b]: The GT200 is an interesting architecture. It's a 1.4 billion transistor beast, yet it is a single GPU capable of delivering performance beyond last year's multi-GPU solutions. That alone is an impressive feat and should not be ignored. However, other than performance enhancement features and improved power efficiency, there isn't a whole lot "new" to the GTX 200 series that isn't seen on prior DX10 Geforce cards. 9800GX2 users are unlikely to run out and replace their cards immediately unless they are aiming for a 3-way SLI solution for the highest level of performance. More performance isn't a bad thing, though: as newer titles come out and start really stressing the current cards, these newer cards are really going to help keep frame rates and resolutions up. There are, however, some other considerations when looking at an Nvidia card purchase, and they are not exclusive to the GTX 200 series: CUDA and PhysX. Nvidia is talking loudly about the importance of PhysX and CUDA, and there are some interesting apps taking advantage of CUDA now. With PhysX it's slightly different. While no current games support PhysX, there are several upcoming A titles and B titles which have PhysX support built in. Nvidia demoed several of these games at editors' day. If Nvidia can deliver on the PhysX promises they are making, then gaming on a DX10 Geforce card could become quite interesting in the future.

[b]Non-GPU-Related CPU Note[/b]: Nvidia has been making a big deal about the "optimized" PC lately, stating that you should buy a PC with a good GPU AND a good CPU for optimal performance. In this case I have to give a small nod to AMD. While the Phenom isn't delivering the fastest CPU experience, it was more than capable of providing an excellent gaming experience even on a 3-way GTX 260 solution. While this may not be the most "balanced" approach, it does lend some credibility to the idea that a decent CPU paired with powerful graphics cards can provide an excellent gaming solution. While some may question the choice of using a Phenom in this investigation, I felt it gave an interesting perspective.

#5
Posted 06/08/2008 08:05 PM   
Misc Crysis


Very High 16xAA/16xAF GTX 280 SLI


[img]http://chrisray.soterial.com/GT200/Crysis/veryhigh280sli16x.jpg[/img]



Very High 16xAA/16xAF GTX 260 3 Way SLI

[img]http://chrisray.soterial.com/GT200/Crysis/veryhigh2603way16x.jpg[/img]


[b] 3DMARK Vantage[/b]


This is the Q6600 Rig with the GTX 260 3 Way at stock GPU clocks

[url="http://chrisray.soterial.com/GT200/Graphs/vantage3way.png"][img]http://chrisray.soterial.com/GT200/Thumbnail/vantage3waythumb.png[/img][/url]


Phenom 9600 2.4 Ghz and GTX 280 SLI Stock GPU Clocks

[url="http://chrisray.soterial.com/GT200/Graphs/phenomvantage.png"][img]http://chrisray.soterial.com/GT200/Thumbnail/phenomvantagethumb.png[/img][/url]
#6
Posted 06/16/2008 01:25 AM   
Nice, Nice and nice.

Now to see the price tag for these cards. I just got the final stages sorted for WCing my mobo & ultras.

Like you said, not all cards are compared; I would be interested to see the performance over the 8800U.

#7
Posted 06/16/2008 12:16 PM   
I'm a little disappointed with the single card Crysis very high benchmark results, but I suppose that's due to the resolution being a bottleneck more than the card itself. Everything else though:

Kick ASS!

Nice job with the review Chris! You covered exactly what I wanted.

Edit: Just realised why the Crysis benchmarks were so low: you used the 175 drivers; no onboard PhysX.

In that case, as 3D Game Man would say: "This is a 100% kick-ass review."

Another edit: The Inquirer, the "Let's tear nVidia down for everything they do" guys of the internet, have done a review of their own (terrible in comparison to yours, Chris!) using the new drivers with onboard PhysX support. They are amazing! And for comparison:

GTX-280 single
Chris's: Crysis, very high, 4xAA, 1680x1050: 23fps
Inquirer: Crysis, very high, 4xAA, 1680x1050: 37fps

i5 3570K | Z77 Extreme4 | 16GB Corsair Vengeance | GTX 670
Enermax Platimax 600W | Auzentech X-Fi Forte | QPad MK-80 | Logitech Performance MX | Logitech Z906 5.1 THX | Dell U2412HM
128GB Samsung 830 (OS, apps) | 1TB Samsung F1 (Games) | Corsair 600T (Silver) | Windows 7 Ultimate 64-Bit

Phobya Xtreme 200 + Alphacool 120 + 240 | Swiftech MCP 655 | XSPC Raystorm | XSPC GTX 670 | Sycthe GT AP-12

#8
Posted 06/16/2008 12:29 PM   
Awesome Chris; thanks! I'm browsing through it reading snippets as I'm at work lol. You didn't have 177.34?


My name is Legion, for we are many

Sig: BluSOUL | HAFX | Asrock P67 Fatal1ty Pro | Intel Core i7-2600K | Corsair H80 | G.Skill 1333MHz CL7 8GB | EVGA GeForce GTX 770 4GB Dual Bios ACX | Seagate 500GB SATA 3 & Seagate 2TB SATA 2 | Thermaltake Toughpower 750W W0116 | Philips 273E LH 27" | Logitech X-540 5.1 | Windows 7 Home Premium x64 SP1 |

#9
Posted 06/16/2008 12:44 PM   
Excellent review. I think I'll jump for a single GTX 280 myself, then SLI later this year.

#10
Posted 06/16/2008 12:59 PM   
[quote name='F.E.A.R.' post='182951' date='Jun 16 2008, 01:44 PM']Awesome Chris; thanks! I'm browsing through it reading snippets as I'm at work lol. You didn't have 177.34?[/quote]

Ok, sorry everyone that I whined on about Very High DX10, but I suspected this is where 2x and Tri would suddenly lose the system I/O bottleneck and make our jaws hit the floor (and it did!).

I suspect we will see some benchmarks at 1920x1200 and even XHD that suddenly make Tri 280 show its teeth soon.

As you can see, even with current drivers there is pretty much 100% scaling at 1600 res in Very High. I also suspect the new ATI cards will implode at this res at 16xQ, so the 1920x1200 and XHD benchmarks we can get done soon will just be rubbing salt in the eye for ATI.

Hell the GTX 280 makes Crysis DX10 very high playable!

Before we OC the CPU or GPUs ;).



3D RIG:I7 2600k 4.9 Ghz, P8P67 PRO, Fort02 SE, MSi GTX 680 2GB, 4GB DDR3, 60GB Vertex 2E SSD (W7 64 Boot) OCZ Revo 2 x2 PCIE SSD 240 GB, G930 7.1.

2D: MSi 6970 2GB-QX9650-160GB SSD-TJ109. DISPLAYS:2D 120" 21:9 100HZ DLP 0.1MS, 3D 50" 400HZ Passive 0.5 MS 3D LED - 24" 120HZ 2MS Active N3D LCD, all 1080p

#11
Posted 06/16/2008 01:14 PM   
Chris:

- When is that PhysX driver coming? An Nvidia employee told me within a month and that was 4 weeks ago...
- How do the powersaving options respond to GPU OC's?
- Does the 2D mode still work with Vista's native 3D - or am I wrong here?
- Is it a dual core chip or not?

I defo want one of these cards and maybe...just maybe...2...even with my 16 x 10 screen it will be awesome, especially as the drivers improve!


My name is Legion, for we are many

Sig: BluSOUL | HAFX | Asrock P67 Fatal1ty Pro | Intel Core i7-2600K | Corsair H80 | G.Skill 1333MHz CL7 8GB | EVGA GeForce GTX 770 4GB Dual Bios ACX | Seagate 500GB SATA 3 & Seagate 2TB SATA 2 | Thermaltake Toughpower 750W W0116 | Philips 273E LH 27" | Logitech X-540 5.1 | Windows 7 Home Premium x64 SP1 |

#12
Posted 06/16/2008 01:33 PM   
As a single card.... GREAT!

If your board has more than one PCIe x16.... I do not understand what Nvidia is trying to do. The 8800GTS G92 in SLI should rival a GTX 260 setup. The G92's will cost less than half as much. Does Nvidia think that it's Bugatti or something? Or is this a ploy to force people to purchase Nforce boards and run SLI setups on G92's?

I hate to say it, but I'm ready to take my chances with a 4870. (or maybe an x2)



8130p @ 5.1Ghz : GTX 570 x2 - cuda workstation



i5-2500k @ 4.4Ghz : GTX 570 - itx media pc

#13
Posted 06/16/2008 01:38 PM   
[quote name='Luke' post='182967' date='Jun 16 2008, 02:38 PM']As a single card.... GREAT!

If your board has more than one PCIe x16.... I do not understand what Nvidia is trying to do. The 8800GTS G92 in SLI should rival a GTX 260 setup. The G92's will cost less than half as much. Does Nvidia think that it's Bugatti or something? Or is this a ploy to force people to purchase Nforce boards and run SLI setups on G92's?

I hate to say it, but I'm ready to take my chances with a 4870. (or maybe an x2)[/quote]


At lower res, as we have been saying for months, 8800-anything SLI, even 9800 SLI, would provide insane firepower for most gamers.

The GTX 280, with its price tag and spec, is about the high end for now, like the 8800 GTX was. At 1600 res with high AA/AF up to XHD, the GTX 280/260, single or SLI, are untouchable. So if your monitor is 1600+ res now, or will be if you buy a new screen soon, you have a simple choice: run 8800/9800 at high res and spare the AF/AA, and even game settings, in modern titles like AoC/Crysis and others coming; or buy the GTX 260/280, single or SLI, and have up to 2 years of running everything maxed out with insane AA/AF. The 8800 GTX did until Crysis.

Now, the new ATI cards might provide amazing value and firepower for 1600 res and below (but erm, so does 8800 GT SLI!). And monitors are getting higher in res, larger in size, faster in response times, better in brightness, and cheaper by the second. A decent 2ms 24" 1920x1200 screen is half the price of a GTX 280, so not a cost issue for most high-end PC gamers.

So if you're going to buy a shiny new screen, suddenly anything but a 9800GX2, 9800 GTX SLI, or 8800 GTX/Ultra SLI starts to be a bit crap at 1920x1200 and above, and until now they all sucked if you started to ramp AF/AA, especially in modern titles like Crysis and now AoC.

A GTX 260 single card offers better value than almost any other combo at 1920x1200.

GTX 260 SLI offers the best price for real performance at 1920x1200 with high game settings and fairly high AA/AF,

where the GTX 280 offers the same for extreme AA/AF at 1920x1200 or anything to do with XHD.

Now that the GTX 280 is here and XHD is finally for the masses, monitor prices will crash and it will go mass market.

16xQ AA playable in Crysis Very High DX10 at 1600 res takes just 2x GTX 260s, a third more than a single GTX 280, at stock speeds with early drivers. Good luck with the red army; by the time they get their drivers to work, the next Nvidia card will be here.

Remember, 95% of the benchies shown today were on stock CPUs and stock GPUs.

With better drivers and people pushing 790i boards, DDR3 RAM, and quad cores, we may see 35-40 FPS at 16xQ at 1920x1200 in Crysis Very High with GTX 280 dual SLI.

Not to mention CUDA and Nvidia PhysX moving into new games or being retro-added!



3D RIG:I7 2600k 4.9 Ghz, P8P67 PRO, Fort02 SE, MSi GTX 680 2GB, 4GB DDR3, 60GB Vertex 2E SSD (W7 64 Boot) OCZ Revo 2 x2 PCIE SSD 240 GB, G930 7.1.

2D: MSi 6970 2GB-QX9650-160GB SSD-TJ109. DISPLAYS:2D 120" 21:9 100HZ DLP 0.1MS, 3D 50" 400HZ Passive 0.5 MS 3D LED - 24" 120HZ 2MS Active N3D LCD, all 1080p

#14
Posted 06/16/2008 02:26 PM   
[quote name='Luke' post='182967' date='Jun 16 2008, 02:38 PM']As a single card.... GREAT!

If your board has more than one PCIe x16.... I do not understand what Nvidia is trying to do. The 8800GTS G92 in SLI should rival a GTX 260 setup. The G92's will cost less than half as much. Does Nvidia think that it's Bugatti or something? Or is this a ploy to force people to purchase Nforce boards and run SLI setups on G92's?[/quote]

I have to agree; even the single card solution doesn't give the "groundbreaking revolutionary performance" I was hoping for. It isn't that far ahead of the GX2, and the prices are just silly for that extra few FPS...

The EVGA GeForce GTX 280 SSC Edition here in the UK is going for £563.99, which is $1,107.53 for a single GPU?? You gotta be kidding me. I wouldn't mind paying that if the card stood up to the reputation that preceded it; I have to say I am disappointed.

I am thankful to Chris for this review though; I can see he spent a lot of time on it, and it's a lot more comprehensive than others I have read so far.

[quote]I hate to say it, but I'm ready to take my chances with a 4870. (or maybe an x2)[/quote]

I have never owned an ATI card; this might be my first time....

#15
Posted 06/16/2008 02:26 PM   