GTX 295 Quad SLI Preview.
[b] GTX 295 Quad SLI[/b]


[url="http://chrisray.soterial.com/295/card/295dual.jpg"][img]http://chrisray.soterial.com/295/card/295dualthumb.jpg[/img][/url]


[b]Introduction:[/b] Following in the tradition of the Geforce 7950GX2 and the Geforce 9800GX2, Nvidia is releasing another dual-GPU, dual-PCB graphics card, dubbed the Geforce GTX 295. Both the Geforce 7950GX2 and 9800GX2 received criticism, each for different reasons. I think when moving forward with GPU designs it's important to look back and see what we have learned.

[b]7950GX2 Problems:[/b] The 7950GX2 had an uphill battle to face given the rendering limitations of Windows XP. Under Windows XP, DirectX 9.0c could not queue more than 3 frames at a given time, so 3-way AFR was in fact not possible; the card defaulted to a hybrid of AFR and SFR rendering. This was partially explained in my [url="http://forums.slizone.com/index.php?showtopic=18781"]9800GX2 Quad SLI scaling guide[/url]. It was corrected under Windows Vista, which came out at the end of the card's lifespan.

[b] 2006 SLI Rendering[/b]

[url="http://chrisray.soterial.com/295/graphs/oldslichart.jpg"][img]http://chrisray.soterial.com/295/graphs/oldslichartthumb.png[/img][/url]

[b]2008/2009 SLI rendering[/b]

[url="http://chrisray.soterial.com/295/graphs/newslichart.jpg"][img]http://chrisray.soterial.com/295/graphs/newslichartthumb.png[/img][/url]
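For the curious, the difference between the two scheduling modes in the charts above can be sketched in a few lines of Python. This is purely illustrative — the function names and the fixed-strip split are my own simplification, not Nvidia's actual driver logic:

```python
# Illustrative sketch of the two SLI work-distribution modes discussed above.
# This is a simplification for clarity, not Nvidia's driver implementation.

def afr_schedule(num_frames, num_gpus):
    """Alternate Frame Rendering: frame i is rendered whole by GPU i % num_gpus."""
    return {frame: frame % num_gpus for frame in range(num_frames)}

def sfr_schedule(frame_height, num_gpus):
    """Split Frame Rendering: a single frame is cut into horizontal strips,
    one per GPU. Real drivers rebalance strip sizes per frame based on load;
    this version uses fixed, equal strips."""
    strip = frame_height // num_gpus
    bounds = []
    for gpu in range(num_gpus):
        top = gpu * strip
        bottom = frame_height if gpu == num_gpus - 1 else top + strip
        bounds.append((gpu, top, bottom))
    return bounds

# Under Windows XP, DirectX 9.0c could queue at most 3 frames, so pure
# 4-way AFR was impossible and Quad setups fell back to a hybrid mode.
# Under Vista the deeper frame queue allows full 4-way AFR.
print(afr_schedule(8, 4))      # 4-way AFR: frames round-robin across 4 GPUs
print(sfr_schedule(1050, 2))   # 2-way SFR: top/bottom halves of a 1050-line frame
```

With four GPUs, `afr_schedule` hands frames 0-3 to GPUs 0-3 and then wraps, which is exactly why frame-queue depth matters: each GPU needs to be working several frames ahead of the display.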



[b]9800GX2 Short Lifespan[/b]: The 9800GX2 solved this problem by using Nvidia's N-way SLI under Windows Vista. Vista provided the ability to SLI more than 2 GPUs without having to resort to split frame rendering, which is grossly incompatible with the way today's games render. Under these new conditions the 9800GX2 scaled great: 4-way AFR allowed the GPUs' potential to be realized. However, it was often hampered by its 512 megs of memory, preventing it from reaching that potential. It was an artificially frustrating situation where you couldn't turn AA/AF up because the memory subsystem was too small to allow for it at the highest resolutions, so you were often left making texture/AA compromises in order to get optimal scaling. To this day the 9800GX2 scales well so long as you make these memory compromises.
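To see why 512 megs became such a wall, here is a rough back-of-the-envelope estimate of render-target memory at various resolutions and MSAA levels. The 8-bytes-per-sample figure (32-bit color plus 32-bit depth/stencil) is my own simplifying assumption, and real hardware compresses and allocates additional surfaces, so treat the numbers as a trend rather than exact usage:

```python
# Back-of-the-envelope render-target memory estimate. Assumptions (mine,
# not measured): 4 bytes color + 4 bytes depth/stencil per sample, no
# compression, single color buffer. Real usage differs, but the trend
# shows why a 512 MB card choked at high resolution with MSAA.

def rendertarget_mb(width, height, msaa_samples, bytes_per_sample=8):
    """Approximate render-target footprint in MB for one MSAA surface."""
    return width * height * msaa_samples * bytes_per_sample / (1024 ** 2)

for (w, h), aa in [((1680, 1050), 4), ((2560, 1600), 4), ((2560, 1600), 8)]:
    mb = rendertarget_mb(w, h, aa)
    print(f"{w}x{h} {aa}xMSAA: ~{mb:.0f} MB before textures and geometry")
```

At 2560x1600 with 8x samples the render targets alone approach 250 MB under these assumptions, leaving precious little of a 512 meg pool for textures — hence the AA/texture compromises described above.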

[b]Today and the Geforce GTX 295[/b]: The big question that lies in front of us is whether GTX 295 Quad SLI solves these problems. Taking a look at the specs....


[img]http://chrisray.soterial.com/295/graphs/specschart.png[/img]


[b] Specs at a Glance[/b]: As a single-GPU design, the Geforce GTX 295 sits somewhere between the Geforce GTX 280 and the Geforce GTX 260. It carries all the shader and texturing power of the GTX 280 (minus some minor clock speed differences) but has one fewer memory partition and 4 fewer ROPs than the Geforce GTX 280. This means each GPU has an effective 896 megs of video RAM available to it while rendering 3D scenes. It is still a step down in memory and bus width from the GTX 280, but it does carry significantly more memory than the 9800GX2, which should alleviate most memory bottlenecks until you reach 8x multisampling at 2560x1600. Otherwise it's the same GT200 (x2) that we have come to know and love from Nvidia. It has all the power-saving technology for idle/desktop use seen on the GTX 260/280 variant cards, for a quiet 2D experience.
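Peak memory bandwidth falls straight out of bus width times effective memory data rate, which is where the "one fewer memory partition" shows up. The sketch below uses the commonly quoted reference memory clocks (2214 MT/s for the GTX 280, 1998 MT/s for the GTX 295; partner boards shipped variations), so take the exact figures as approximate:

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
# Clock figures are the quoted reference specs; partner boards varied.

def bandwidth_gbs(bus_bits, effective_mts):
    """Peak bandwidth in GB/s for a bus of bus_bits at effective_mts MT/s."""
    return (bus_bits / 8) * effective_mts * 1e6 / 1e9

gtx280 = bandwidth_gbs(512, 2214)          # 8 partitions x 64-bit
gtx295_per_gpu = bandwidth_gbs(448, 1998)  # 7 partitions x 64-bit (one disabled)
print(f"GTX 280:           {gtx280:.1f} GB/s")
print(f"GTX 295 (per GPU): {gtx295_per_gpu:.1f} GB/s")
```

That works out to roughly a 21% per-GPU bandwidth deficit versus the GTX 280, which, as the benchmarks here suggest, rarely translates into a visible performance gap at these settings.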

[b]Quiet? Let's Talk About the Cooler[/b]: Nvidia claims the card can dissipate up to 289 watts of heat, which is about a 40% improvement over the 9800GX2. However, there are some things you need to be aware of. The card itself has a full back grate for sending hot air out the back, unlike the 9800GX2, which barely had any space in the back for air exhaust. Despite this improvement to airflow, not all of the air reaches the back: the side heatsink grates push air out toward the sides of the card, meaning some heat is dumped into your case instead of going out the back. Compared to a GTX 280 or GTX 260, less air is exhausted out the rear of your case and some ends up inside it. This is a minor complaint, but something case builders should be aware of when they are building their GTX 295 or Quad SLI computers.

[url="http://chrisray.soterial.com/295/card/295back.jpg"][img]http://chrisray.soterial.com/295/card/295backthumb.jpg[/img][/url]

[url="http://chrisray.soterial.com/295/card/295side.jpg"][img]http://chrisray.soterial.com/295/card/295sidethumb.jpg[/img][/url]

[b]OK, So How Loud Is It?[/b]: When the PC first powers on, the fan boots at about 40% speed. Very quickly, once the BIOS boots up, the card throttles down to nearly inaudible levels. When gaming under a stressful environment the noise levels can pick up, and they do so gradually as the temperature rises. The dynamic fan control is quite good. As far as overall noise is concerned, it is very similar to the 9800GX2. In 2D mode you'll probably never hear it, as Nvidia has excellent thermal and power management on their GT200 cards.


[b] Power[/b]: Just like the 9800GX2 and Geforce GTX 280, this card requires one 8-pin and one 6-pin power connector. This means you will need a PSU that is at least compatible with the 9800GX2 or GTX 280. Using a Molex-to-8-pin adapter is not recommended.

#1
Posted 12/28/2008 03:37 AM   
[b]Multi-Monitor. About Time?[/b]: Yes, multi-monitor support has been here for a few months now. I just felt like including it in this preview to demonstrate that it is here and it's working. It works in multi-GPU or Quad SLI mode, allowing basic spanning of the desktop to another screen. You can set primary/secondary displays and clone your single monitor: all the functionality you'd expect from multi-monitor drivers. Despite the long wait, it's finally here and it works well.


[url="http://chrisray.soterial.com/295/card/monitor1.jpg"][img]http://chrisray.soterial.com/295/card/monitor1thumb.jpg[/img][/url]


[url="http://chrisray.soterial.com/295/card/monitor2.png"][img]http://chrisray.soterial.com/295/card/monitor2thumb.jpg[/img][/url]


[b] Test Setup and Configuration:[/b] A few changes have been made to my test setup. Unfortunately my HDTV is not functioning as it should in DX10, and I am still looking for a workaround to the problem. Rather than submit a bunch of inconsistent results between 1920/1680/1600, I chose one resolution and stuck with it. While a 22-inch display is not always optimal, I chose GPU-limited settings to ensure scaling. At the end I will also clarify and make recommendations regarding Quad SLI and scaling to ensure everyone understands the nature of the tests and how they were performed. I have also stuck with a Q6600 @ 3.2 GHz overclock on a 780i motherboard. While not the fastest setup currently available, I do not believe the results were hurt by this testbed.


[img]http://chrisray.soterial.com/295/card/GTX295systemthumb.jpg[/img]



[b] Test Setup[/b]

Q6600 @ 3.2 GHz
Nforce 780i @ 1419 FSB
4 gigs Corsair XMS2 @ 800 MHz
Windows Vista 64-bit SP1
Samsung SyncMaster 2233RZ
Forceware 180.87


[b] Synthetics and PhysX[/b]


[b]3Dmark Vantage PoM Test[/b]


[img]http://chrisray.soterial.com/295/graphs/vantagepom.png[/img]


[b] Thoughts[/b]: This test uses the shader units to accelerate lighting and ray-traced effects. I'm not 100% satisfied with it as a test for judging shader performance, and I will be looking into other apps for future shader tests. However, you can see that the test is quite hard on all 3 of these cards. I used only 1 board to show just how close a single GTX 295 GPU's shader performance is to a GTX 280.



[b]Unreal Tournament 3 PhysX[/b]

[img]http://chrisray.soterial.com/295/graphs/utphysics.png[/img]


[b]Thoughts[/b]: UT3 PhysX is probably another test that I will not be using in the near future, as a lot of new games are coming out with PhysX support. The game shifts from being CPU PhysX-limited to GPU PhysX, but the GPUs are so fast at PhysX processing that the test once again becomes CPU-bound for basic 3D rendering.


[b] GRAW 2 Physics[/b]

[img]http://chrisray.soterial.com/295/graphs/grawphysics.png[/img]


[b] Thoughts[/b]: GRAW2 is a little different than Unreal Tournament 3, as the PhysX load is actually quite high in this application. Once it's enabled you can see the faster cards taking their lead. Multi-GPU PhysX scaling doesn't exist, as only one card does PhysX processing in multi-GPU/SLI configurations. However, the game does show some scaling with normal rendering as well.

#2
Posted 12/28/2008 07:12 AM   
[b] Gaming Benchmarks[/b]

[b]Note[/b]: I have made a few changes to the benchmarks from previous setups. I have begun to include both 16x and 16xQ by popular request. Once again I want to remind everyone that the settings I use are specifically designed with SLI in mind, and may make single GPUs appear worse than they actually are for more mainstream gaming and settings.


[b]Crysis Very High[/b]


[img]http://chrisray.soterial.com/295/graphs/crysisveryhigh.png[/img]


[b]Thoughts[/b]: The GTX 295 is able to put out some respectable numbers here, pulling in just behind the GTX 280 SLI setup and ahead of the GTX 260 SLI setup. The most notable claim to fame goes to the Quad SLI setup. Can we game @ 1680x1050 with 16xAA/16xAF and Very High settings in Crysis on the new Quad GTX 295 setup? A new political phrase comes to mind.... [b]"Yes We Can"[/b].


[b] Unreal Tournament 3[/b]


[img]http://chrisray.soterial.com/295/graphs/ut316x.png[/img]


[img]http://chrisray.soterial.com/295/graphs/ut16xQ.png[/img]

[b]Thoughts[/b]: The GTX 295 pulls ahead of the GTX 280 and GTX 260 by a significant margin with 16xAA enabled, but ends up in a jumbled mess of CPU limitations with the other SLI setups. With 16xQ enabled, the load shifts dramatically back toward the GPU and the results land where you'd expect, with the GTX 295 trailing the GTX 280 SLI by a small margin and pulling quite a bit ahead in Quad SLI.


[b] Far Cry 2[/b]

[img]http://chrisray.soterial.com/295/graphs/farcry216x.png[/img]

[img]http://chrisray.soterial.com/295/graphs/farcry216xQ.png[/img]

[b]Thoughts[/b]: Far Cry 2 shows a very similar performance trend to Unreal Tournament 3. With standard 16x, the SLI systems all bunch up in a CPU-limited situation. 16xQ changes this and allows the Quad SLI setup to pull ahead, but even with 16xQ enabled the Quad SLI system hits the same 90 FPS barrier. Overall, a GTX 295 will provide a very comfortable Far Cry 2 experience.

#3
Posted 12/28/2008 10:44 PM   
[b] Lost Planet[/b]


[img]http://chrisray.soterial.com/295/graphs/lostplanet16x.png[/img]


[img]http://chrisray.soterial.com/295/graphs/lostplanet16xQ.png[/img]



[b]Thoughts[/b]: Lost Planet has always been a fairly predictable SLI scaler. Even at moderate settings the GTX 295 pulls far ahead of the GTX 280 and only trails the GTX 280 SLI by a bit. Quad SLI pulls way ahead of the pack, with nothing here that even compares to it.


[b]Bioshock[/b]


[img]http://chrisray.soterial.com/295/graphs/bioshockdx10.png[/img]


[b]Thoughts[/b]: Bioshock is another game that has always been a strong SLI scaler; being based upon the Unreal engine, this is to be expected. The Quad SLI setup is in an entirely different category of performance. To be fair, every setup here was able to run the test. I did not include 16xQ results due to inconclusive numbers in my benchmarks, and will look into it in the future.




[b]Fallout 3[/b]:

[img]http://chrisray.soterial.com/295/graphs/fallout16x.png[/img]


[img]http://chrisray.soterial.com/295/graphs/fallout16xQ.png[/img]


[b]Thoughts[/b]: Fallout 3 is a new title in my testing suite, and it has proven surprisingly GPU-limited with transparency supersampling enabled. To run full 16xAA, a GTX 295 or SLI setup was needed; with 16xQ I was only able to get a playable gaming experience on the Quad SLI setup. Note: if you stick to basic transparency multisampling, the game is more playable across a variety of setups.



[b]Neverwinter Nights 2[/b]


[img]http://chrisray.soterial.com/295/graphs/nwn16xQ.png[/img]


[b] Thoughts [/b]: Despite its age, Neverwinter Nights 2 remains a fairly GPU-limited title. Every terrain surface and texture is covered with normal maps, and the game is loaded with alpha textures for trees and foliage. As a result, enabling transparency supersampling can take quite a dramatic toll on performance. The GTX 295 Quad SLI was able to maintain playable performance at even the most difficult settings in this game, while single GPUs were unable to maintain playable framerates. As with Fallout 3, if you disable transparency SS the game becomes a lot more playable on a single GTX 280 or GTX 260, and the benefits of SLI/Quad SLI diminish somewhat.

#4
Posted 12/28/2008 10:50 PM   
[b] Overall Thoughts and Conclusion[/b]


[url="http://chrisray.soterial.com/295/card/295dual.jpg"][img]http://chrisray.soterial.com/295/card/295dualthumb.jpg[/img][/url]






[b] As a Single Card[/b]: The Geforce GTX 295 is an interesting piece of hardware. Much of its performance can be had by SLIing 2 Geforce GTX 260s or 2 Geforce GTX 280s. However, that doesn't make it a bad card: it opens the performance level of dual Geforce GTX 260s to the masses without SLI motherboards or the newest Intel platform. I am very happy to see the inclusion of the full GTX 280 shader and TMU counts. This will allow the GTX 295 to stay competitive as more shader-intensive games come out, not to mention PhysX and CUDA acceleration. As seen in some tests, each GPU on the GTX 295 is only marginally slower than a Geforce GTX 280. On to the issue of the slower memory bus and fewer ROPs: after taking a close look at the performance differences between Geforce GTX 280 SLI and the Geforce GTX 295, the ROP/bandwidth deficit doesn't seem to be affecting performance much. The performance difference is very small, and I do not believe bandwidth is a limiting factor in the GT200 design, at least not at these levels of performance. Memory capacity, however, is cut by 128 megs compared to Nvidia's old high-end offering. For the most part this should be a non-issue for most users, but at the roughest settings, 2560x1600 with 8x multisampling, you will undoubtedly notice performance drop-offs compared to a 280 SLI setup. At that resolution, 8x or 16x CSAA might prove a better experience. The final point to consider is that this "is" a multi-GPU setup, so it will have some of the same flaws and benefits inherent to all multi-GPU setups. For those who don't want to deal with SLI, a single GTX 285 might be more up your alley. More on that card later..


[b]As Quad SLI Multi-GPU[/b]: Quad SLI is, as always, a high-end niche product. However, in comparison to 3-way SLI it is far easier to set up and much friendlier on your power and heat management. That being said, those with dual GTX 280s looking to boost their performance might consider just buying another 280 card, as the performance differences here won't be enormous. The downside to that upgrade path is the heat and power requirements compared to the GTX 295, and it will likely lag slightly behind in performance. For those who want to use only 2 PCIe slots, two Geforce GTX 295 cards are extremely fast and much easier to manage than your typical 3-way SLI setup. For those buying Quad SLI, please keep in mind that your performance results will vary greatly with GPU load. Unless you are trying to achieve the best framerates at the highest IQ settings or are using a very high-res monitor, you will not see the performance benefits right away. Before committing to Quad SLI, make sure you understand what games you currently play and how GPU-limited you actually are. Despite being easier to set up, Quad SLI is just like 3-way SLI in that your scaling is very dependent on your current bottleneck.


[b] Pros[/b]

- Very fast single-card multi-GPU product
- Can be Quad SLI'd easily for super high-end performance
- Quiet in 2D mode, with very low power usage while browsing the desktop or watching DVDs
- Uses the same profiles as a normal SLI setup; scaling should work well depending on how GPU-limited you are

[b] Cons[/b]

- Under extreme load it can be a little louder than your typical single GPU; comparable to the 9800GX2
- 896 megs of memory (versus 1 gig) can be a bottleneck under certain extreme circumstances, such as 2560x1600 with 8x multisampling

#5
Posted 12/28/2008 10:50 PM   
In before it gets mentioned: I know the resolutions I tested were not exactly "the best" in the industry, and I'm well aware of this. As such, I strongly urge people to look at other websites testing these cards at other resolutions as well. The reason I couldn't provide them is that my HDTV is having serious compatibility issues with most games under DirectX 10. This is a problem I am looking into, but I simply could not come up with an effective workaround in the meantime. I do apologize, but please keep in mind that I do what I can with what I have, and I don't get paid big money for these previews, so a big new shiny monitor just wasn't in the budget.

Chris

#6
Posted 01/07/2009 08:19 AM   
Well, looking at those charts the GTX 295 falls just a tiny bit behind GTX 280s SLIed, but considering cost the GTX 295 is easier on the pocket.

Also, we understand Chris that not many monitors are available to you, but we greatly appreciate you still dedicating your time to give us this preview.
Thanks a lot Chris.

#7
Posted 01/08/2009 08:24 AM   
3-way 280 is typically 25% slower than 4-way 285s. People who already have GTX 280 SLI may want to consider that, or not upgrade at all. The 280 in multi-GPU is still a damn good setup.

#8
Posted 01/08/2009 08:32 AM   
Very nice Chris, thank you for your review. As always very thorough and uber niceness.

Now go get some sleep. lol

-lycan

All the reviews keep confirming the 295 is not the best at 30" WQXGA when you want eye candy.

-lycan

#9
Posted 01/08/2009 08:44 AM   
[quote name='ChrisRay' post='256319' date='Dec 28 2008, 04:37 PM'][b]Quiet? Let's talk about the cooler[/b]: Nvidia claims the card can dissipate up to 289 watts of heat, which is about a 40% improvement over the 9800GX2. However, there are some things you need to be aware of. The card itself has a full back grate for sending hot air out the back, which is different from the 9800GX2, which barely had any space in the back for air exhaust. However, despite this improvement to airflow, not all of the air reaches the back. The side heatsink grates will push air out towards the side of the card, meaning there will be some heat dumped into your case versus going out the back. As far as pushing heat out the back of your case, not as much air will reach the back compared to a GTX 280 or GTX 260, and some will get exhausted into your case. This is a minor complaint but something case builders should be aware of when they are building their GTX 295 or Quad SLI computers.[/quote]
This is where having a good case with good air flow comes in.

Chris, you know how in the 3DMark Vantage test the GTX 280 beats the GTX 295?
Now, that is because of the GTX 280 having a higher shader clock, right?
And you said single board, so does that mean only one of the GPUs is being used there rather than both?
I don't quite understand that one.

#10
Posted 01/08/2009 08:54 AM   
Very nice review! :) Thanks for taking the time to do it.

It seems to be a great performer for just a single card. I'm tempted to get one and just use my current 260 as a PhysX beast.

#11
Posted 01/08/2009 09:03 AM   
[quote name='g3n3r41xan' post='260522' date='Jan 8 2009, 12:54 AM']This is where having a good case with good air flow comes in.

Chris, you know how in the 3DMark Vantage test the GTX 280 beats the GTX 295?
Now, that is because of the GTX 280 having a higher shader clock, right?
And you said single board, so does that mean only one of the GPUs is being used there rather than both?
I don't quite understand that one.[/quote]


Correct. Keep in mind the Vantage test is only run with one GPU; it's not using multi-GPU. I did this test only to compare the shader performance of a single GPU. If it were multi-GPU, it would scale as an SLI setup would. It was simply an educational test.

#12
Posted 01/08/2009 09:15 AM   
[quote name='ChrisRay' post='260528' date='Jan 8 2009, 10:15 PM']Correct. Keep in mind the Vantage test is only run with one GPU; it's not using multi-GPU. I did this test only to compare the shader performance of a single GPU. If it were multi-GPU, it would scale as an SLI setup would. It was simply an educational test.[/quote]
Oh, I see.
I mean, I have never owned a dual-GPU card before, but what you have done here is nice.
Another possibly silly question that I sometimes ask:
did you have to disable one GPU on the GTX 295 and then test it, or can the shader test be done individually on each GPU while they are both running?
Sorry for asking so many questions Chris, but I'm sort of supercharged to ask these, lol.

#13
Posted 01/08/2009 09:27 AM   
You can do the shader test with only one GPU enabled, two GPUs enabled, three GPUs (with Quad), or all four GPUs enabled. You just have to set the profile accordingly. In this case I simply disabled all forms of SLI/multi-GPU rendering to perform this test.

Chris

#14
Posted 01/08/2009 09:55 AM   
[quote name='ChrisRay' post='260003' date='Jan 7 2009, 09:19 AM']In before it gets mentioned: I know the resolutions I tested were not exactly "the best" in the industry, and I'm well aware of this. As such I strongly urge people to look at other websites doing tests at those resolutions as well. The reason I couldn't provide them is that my HDTV is having serious compatibility issues with most games under DirectX 10. This is a problem I am looking into, but I simply could not come up with an effective workaround in the meantime. I do apologize for this, but please keep in mind that I do what I can with what I have, and I don't get paid big money for these previews, so a big new shiny monitor just wasn't in the budget.

Chris[/quote]

Actually Chris, you showed quad with high AA/AF scales great even at 1600 res! In theory it should only scale better at 1080p, 1200p, and 1600p and/or triple-head resolutions.

And barring GTA4 it's almost impossible to hit a VRAM barrier at 1600 res, and with 6x-64x AA it will look pretty bloody good ;)


#15
Posted 01/08/2009 10:04 AM   