OPS - a shot in the right direction, but improvement urgently needed
I love the OPS concept, I really do, as it attempts to ease some GPU buying and scoping decisions. However, I have to say that the values provided seem extremely far off and most likely not a true reflection for games with more changing environment elements than traditional linear games. In my case, it almost seems these are basically just marketing values.

The OPS for World of Warcraft: Cataclysm is a good example. The values described here just do not make any sense, and frankly I challenge NVIDIA to prove that the FPS values can even be considered averages. The FPS value might be applicable while standing in a fairly quiet area of the game world, but as soon as you get other players around you, casting effects, etc., those FPS values take a massive dive. What is being left out here is that 100% of players will be confronted with environments with a lot of activity, whether it be random dungeons, raids, battlegrounds, group questing, moving around major cities, etc.

Below is the listed specification of the reference machine for the OPS:

Hardware configuration: NVIDIA GeForce GTX 580, Intel Core i7 940 (2933 MHz), Intel X58, 3 GB RAM
Software configuration: NVIDIA 266.58 Driver, World of Warcraft Cataclysm, Windows 7 Enterprise

These are my computer's specifications:

Hardware configuration: NVIDIA GeForce GTX 580, Intel Core i7 850 (2800 MHz), Intel P55, 16 GB RAM
Software configuration: NVIDIA 296.10 Driver, World of Warcraft Cataclysm, Windows 7 Enterprise

I have used the FPS reading from the 1920x1200 resolution setting, as it is the closest match for my 1920x1080 resolution. My FPS drops way below 30 fps when raiding, moving around heavily populated major cities, or even in battlegrounds. Furthermore, playing the game at barely 30 fps versus 60 fps is noticeable. There is absolutely nothing wrong with my computer in terms of hardware or software - the GPU simply cannot sustain the level of detail described in the OPS throughout one's playing time in the scenarios described previously.

I would suggest that the FPS values rather be posted as averages, or as expected minimum and maximum values. Furthermore, I presume that vertical sync is disabled, as the FPS would normally be capped at one's screen refresh rate - in my case 60 fps. Turning it off causes huge tearing issues, which can ruin the gaming experience, but the OPS guide does not take this into consideration. Quoting max FPS with no vertical sync is similar to measuring speaker performance by how loud the speakers can go without factoring in the distortion caused.
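
To make this concrete, something as simple as the following would turn a capture of per-frame render times into the kind of summary I mean. This is only a minimal sketch in Python; the log file name and its format (one frame time in milliseconds per line) are assumptions for illustration, not anything NVIDIA publishes.

[code]
# Minimal sketch: summarize per-frame times (milliseconds, one value per line)
# into min / average / max FPS plus a rough "1% low" figure.
# The file name and format are illustrative assumptions.

def summarize_frametimes(path="frametimes_ms.txt"):
    with open(path) as f:
        frametimes = [float(line) for line in f if line.strip()]
    fps = sorted(1000.0 / ft for ft in frametimes)   # instantaneous FPS per frame, ascending
    n = len(fps)
    return {
        "min_fps": fps[0],
        "avg_fps": n * 1000.0 / sum(frametimes),     # frames rendered / seconds elapsed
        "max_fps": fps[-1],
        "1pct_low_fps": fps[max(0, n // 100 - 1)],   # rough 1% low - the number you actually feel in a raid
    }

if __name__ == "__main__":
    print(summarize_frametimes())
[/code]

A minimum and a 1% low captured during a raid or a flyover of a busy capital city would tell a buyer far more than a single headline number.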

#1
Posted 03/21/2012 10:40 AM   
[quote name='1-0-1' date='21 March 2012 - 06:40 AM' timestamp='1332326458' post='1385740']
I love the OPS concept, I really do, as it attempts to ease some GPU buying and scoping decisions. However, I have to say that the values provided seem extremely far off and most likely not a true reflection for games with more changing environment elements than traditional linear games. In my case, it almost seems these are basically just marketing values.

The OPS for World of Warcraft: Cataclysm is a good example. The values described here just do not make any sense, and frankly I challenge NVIDIA to prove that the FPS values can even be considered averages. The FPS value might be applicable while standing in a fairly quiet area of the game world, but as soon as you get other players around you, casting effects, etc., those FPS values take a massive dive. What is being left out here is that 100% of players will be confronted with environments with a lot of activity, whether it be random dungeons, raids, battlegrounds, group questing, moving around major cities, etc.

Below is the listed specification of the reference machine for the OPS:

Hardware configuration: NVIDIA GeForce GTX 580, Intel Core i7 940 (2933 MHz), Intel X58, 3 GB RAM
Software configuration: NVIDIA 266.58 Driver, World of Warcraft Cataclysm, Windows 7 Enterprise

These are my computer's specifications:

Hardware configuration: NVIDIA GeForce GTX 580, Intel Core i7 850 (2800 MHz), Intel P55, 16 GB RAM
Software configuration: NVIDIA 296.10 Driver, World of Warcraft Cataclysm, Windows 7 Enterprise

I have used the FPS reading from the 1920x1200 resolution setting, as it is the closest match for my 1920x1080 resolution. My FPS drops way below 30 fps when raiding, moving around heavily populated major cities, or even in battlegrounds. Furthermore, playing the game at barely 30 fps versus 60 fps is noticeable. There is absolutely nothing wrong with my computer in terms of hardware or software - the GPU simply cannot sustain the level of detail described in the OPS throughout one's playing time in the scenarios described previously.

I would suggest that the FPS values rather be posted as averages, or as expected minimum and maximum values. Furthermore, I presume that vertical sync is disabled, as the FPS would normally be capped at one's screen refresh rate - in my case 60 fps. Turning it off causes huge tearing issues, which can ruin the gaming experience, but the OPS guide does not take this into consideration. Quoting max FPS with no vertical sync is similar to measuring speaker performance by how loud the speakers can go without factoring in the distortion caused.
[/quote]

[i]"Optimal Playable Settings" is controlled and contributed to by the team at GeForce.com.

You have to understand that when ANY website or tester generates performance averages, there are both hardware and software variables that contribute to performance differences and that are not always foreseeable or accurately captured. An MMORPG is a very difficult genre in which to gather statistics that can be applied universally. There are far too many factors that contribute to performance gains or losses: server occupancy, the number of spells being cast in a given area, and so on. There are literally countless variables that would need to be accumulated to draw a NEAR accurate assessment of performance; given the dynamic atmosphere of an MMORPG such as WoW, that is practically impossible with the manpower of GeForce.com. By contrast, a single-player title with static or linear gameplay, such as Kingdoms of Amalur: Reckoning, is a far easier title to generate accurate performance statistics for. You should view all game benchmarks as rough statistics based on hardware properties, as they are never fully bulletproof. [/i]

-Hooks

QUOTE (The Professor @ Oct 31 2010, 04:59 AM)

*Jeremy Clarkson face*



So we must hand it over to our tame PC tweaker. Some say he sticky tapes a block of uranium to his dinner before eating it and that he sucks moisture out of ducks. All we know is, he's called Hooks.



"Eye of the Storm" Window Mod Tutorial <> "Inside Crysis 2" <> Top Tier Water-Blocks 2011 <> SSD Unlimited Storage Tutorial

#2
Posted 03/21/2012 11:49 AM   
Fair enough, however having a huge difference between the OPS and real-life values clearly indicates that some additional testing elements should be included. For starters, maybe worst-case gameplay scenarios, which most of the time can easily be reproduced at any given moment: e.g. go into one of the major cities during peak hours, do an LFR run, fly over the many water sections in the game, etc. That alone would give a good reflection of how much the card can take under the suggested settings. 100 fps as an average is extremely optimistic, given that the underlying game is of such a dynamic nature.

I forgot to mention that I am running my setup from an SSD, but I still cannot get anywhere close to the average OPS FPS, as those figures appear to come from static testing (e.g. logging in at a quiet area of the game). As soon as the servers are up I can back this up with some real numbers to show that the OPS for MMOs, and specifically for World of Warcraft, is overly optimistic, which creates a false pretence when making buying decisions or simply drawing comparisons.

Is there a way to find out how the average FPS value was obtained - the in-game FPS counter (which does not average values over a longer period), an in-game add-on, third-party software (e.g. FRAPS), etc.?
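
For example, if the figure came from a FRAPS-style frametimes log, reproducing the average would look roughly like the sketch below. The file name and column layout are assumptions on my part (a header row plus one cumulative timestamp in milliseconds per frame); I don't know what GeForce.com actually used.

[code]
import csv

# Rough sketch: derive average FPS from a FRAPS-style frametimes log.
# Assumed layout: header row, then one cumulative timestamp (ms) per frame.

def average_fps(path="frametimes.csv"):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))[1:]            # skip the header row
    stamps = [float(row[-1]) for row in rows]     # cumulative milliseconds per frame
    elapsed_s = (stamps[-1] - stamps[0]) / 1000.0
    return (len(stamps) - 1) / elapsed_s          # frames rendered over the whole capture
[/code]

Run over a 30-second quiet-area capture versus a ten-minute battleground capture, the same method would show exactly how far apart the "quiet" and "busy" averages really are.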

#3
Posted 03/21/2012 12:24 PM   
[quote name='1-0-1' date='21 March 2012 - 08:24 AM' timestamp='1332332660' post='1385758']
Fair enough, however having a huge difference between the OPS and real-life values clearly indicates that some additional testing elements should be included. For starters, maybe worst-case gameplay scenarios, which most of the time can easily be reproduced at any given moment: e.g. go into one of the major cities during peak hours, do an LFR run, fly over the many water sections in the game, etc. That alone would give a good reflection of how much the card can take under the suggested settings. 100 fps as an average is extremely optimistic, given that the underlying game is of such a dynamic nature.

I forgot to mention that I am running my setup from an SSD, but I still cannot get anywhere close to the average OPS FPS, as those figures appear to come from static testing (e.g. logging in at a quiet area of the game). As soon as the servers are up I can back this up with some real numbers to show that the OPS for MMOs, and specifically for World of Warcraft, is overly optimistic, which creates a false pretence when making buying decisions or simply drawing comparisons.

Is there a way to find out how the average FPS value was obtained - the in-game FPS counter (which does not average values over a longer period), an in-game add-on, third-party software (e.g. FRAPS), etc.?
[/quote]
[i]
First, the central processing unit feeds the graphics processing unit the information to render; your i7 850 is a very heavy bottleneck for a GTX 580 (GF110) class GPU, especially operating at such a low frequency. Although World of Warcraft may not be the most graphically intensive title on the market, it does rely on information being distributed from the CPU (especially when more entities are on screen).

Second, a Solid State Drive has very little "active" gain in gaming; it mainly speeds up reading the data stored on the drive, which improves loading times and the cache samples required when entering a new zone or when initially launching a title.

If you base purchasing decisions upon game performance, you are doing it all wrong. You should purchase hardware based on hardware performance as a whole, meaning compute properties and fill rates. For example, the Fermi GTX 580 3GB version received countless positive reviews surrounding the additional VRAM option, when the GF110 refresh already has substantial gaming advantages over its predecessor, the GTX 480. Therefore, if you encounter VRAM limitations due to engine demands or 3D Vision/Surround, that would be the wisest investment for a graphics processing unit. Similarly, if you are seeing an identifiable CPU bottleneck from said unit, a heavily overclockable CPU such as the 2600K or 3960X would provide a substantial advantage in eliminating the bottleneck while feeding information to the graphics solution(s) more appropriately.

I have been building enthusiast-class gaming PCs for a long time, and the fact that you have an i7 850 feeding a GTX 580 is clearly a bottleneck, especially at the frequency you indicated. I'm not saying this is the definitive reason behind the differences from the GeForce.com performance charts relating to WoW, but in your case it is for overall system performance. [/i]

-Hooks

QUOTE (The Professor @ Oct 31 2010, 04:59 AM)

*Jeremy Clarkson face*



So we must hand it over to our tame PC tweaker. Some say he sticky tapes a block of uranium to his dinner before eating it and that he sucks moisture out of ducks. All we know is, he's called Hooks.



"Eye of the Storm" Window Mod Tutorial <> "Inside Crysis 2" <> Top Tier Water-Blocks 2011 <> SSD Unlimited Storage Tutorial

#4
Posted 03/21/2012 12:51 PM   
I can't say that I fully agree with you, Hooks (which is very rare). I can't imagine that a stock i7 850 would severely bottleneck a single GTX 580. The i7 850 is roughly equivalent to an i7 920, and the latter CPU still does very well in the newest games. Even at stock frequencies, the 920 still benches well in games compared to the newest SB CPUs.

In my own experience with the i7 920, I've found that OC'ing it does not significantly increase my framerate when running games on a single GTX 480/580. Of course, some games are more CPU-bound than others, but generally all LGA 1366 / LGA 1156 i7 CPUs can easily feed the fastest single GPUs. SLI and Crossfire configurations are a different story, though, as heavy CPU OCs are required to achieve optimal GPU scaling in that scenario.

WoW is definitely a CPU-bound game, so the OP would no doubt see a performance boost from OC'ing the CPU. However, I don't think that boost would be significant. Perhaps the OP has a point about the OPS settings being inflated a bit.

@ the OP: I strongly recommend giving D3DOverrider a shot in this scenario. D3DOverrider is a small utility that's included with RivaTuner; you don't actually have to use RivaTuner to access it, though. This utility allows you to force triple buffering in DirectX games, which goes a long way toward combating the negative side effects of vsync. When vsync is enabled in standard "double buffering" mode, the GPU has to wait for the monitor to refresh before it can render the next frame. This can cause severe performance degradation, especially when the GPU cannot maintain a framerate equal to the monitor's refresh rate (60 fps for a 60 Hz monitor). Triple buffering gives the GPU "breathing room", so to speak, allowing it to render ahead even if the monitor is not ready. When it works correctly, triple buffering lets you achieve the same performance you would get without vsync, albeit with a 60 fps cap.

I've had a very good experience with D3DOverrider and I'd recommend it to anyone who runs their games with vsync enabled (I can't stand tearing myself). SLI/Crossfire users need not apply, though, as TB will not work with 2-way SLI configs. Another drawback to forcing TB with this utility is that it can add input lag. This can be alleviated by capping the framerate to a value slightly below the monitor's refresh rate (e.g. 57 fps on a 60 Hz monitor), as sketched below. Inspector can now be used to cap the framerate of virtually any game, so this is another utility I strongly recommend.
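
To show why a cap just below the refresh rate helps, here's a minimal sketch of a software frame limiter. The 57 fps target and the sleep-then-spin timing are illustrative choices only; this is not how D3DOverrider or Inspector implement their caps internally.

[code]
import time

TARGET_FPS = 57.0                   # just under a 60 Hz refresh, as suggested above
FRAME_BUDGET = 1.0 / TARGET_FPS     # roughly 17.5 ms per frame

def run_capped(render_frame, frames=600):
    """Call render_frame() no faster than TARGET_FPS so the queue of
    pre-rendered frames (and the input lag it causes) stays short."""
    deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        deadline += FRAME_BUDGET
        spare = deadline - time.perf_counter()
        if spare > 0.002:
            time.sleep(spare - 0.002)   # sleep off most of the spare time
        while time.perf_counter() < deadline:
            pass                        # spin for the last couple of milliseconds for accuracy
[/code]

Because the game never produces frames faster than the display can consume them, the buffers added by triple buffering never sit full, which is where much of the extra input lag tends to come from.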

EVGA E758 A1 X58 // Core i7 920 @ 4 GHz // OCZ Platinum DDR3 1600 // EVGA GTX 670 SLI // Seasonic X Series Gold 1050w // Corsair 800D // Dual Dell Ultrasharp U2410 displays // Dell Ultrasharp 2408WFP

#5
Posted 03/21/2012 07:00 PM   
Just a quick reply, as I am a little occupied with work and such. Thanks for the input, you two, but I did chuckle at Righthooks' comment that my i7 is a bottleneck. Needless to say, that is absurd on so many levels, as my CPU utilization is never over 70% during gameplay. I think the demographic the current OPS settings are aimed at is people like Righthooks. Please do not take this personally or as an insult, but the OPS values are an extended marketing gimmick, and simply blaming my FPS on my i7 is either a technical oversight on your part or a sign that your impression of synthetic versus real system performance is far off.

I would love for anybody to provide me with proof that the advertised average FPS is 100 during average play of World of Warcraft, as per the OPS. Don't get me wrong, I am happy with the performance of my GTX 580, but posting wildly enthusiastic OPS values makes me wonder whether any real testing was done.
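
When I post those numbers, I will log per-core load next to the FPS as well, so there is no argument about whether any single core is being maxed out. Below is a minimal sketch of how that could be captured; psutil here is purely an illustration on my part, not something used in the OPS testing.

[code]
import time
import psutil  # assumption: the psutil package is installed

# Sketch: log per-core CPU load once per second alongside an FPS capture,
# so a single pegged core is visible even when total utilization looks low.

def log_cpu(duration_s=600, path="cpu_per_core.csv"):
    cores = psutil.cpu_count()
    with open(path, "w") as out:
        out.write("elapsed_s," + ",".join(f"core{i}" for i in range(cores)) + "\n")
        start = time.time()
        while time.time() - start < duration_s:
            per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # blocks ~1 s per sample
            out.write(f"{time.time() - start:.1f}," + ",".join(f"{c:.0f}" for c in per_core) + "\n")
[/code]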

#6
Posted 04/16/2012 08:31 PM   