Using FurMark and other Stress Tests with GeForce graphics cards
FurMark is an application designed to stress the GPU by maximizing power draw well beyond any real-world application or game. In some cases, this could lead to a slowdown of the graphics card due to hitting over-temperature or over-current protection mechanisms. These protection mechanisms are designed to ensure the safe operation of the graphics card. Using FurMark or other applications to disable these protection mechanisms can result in permanent damage to the graphics card and void the manufacturer's warranty.
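For anyone who would rather watch these protection mechanisms kick in than guess, here is a minimal monitoring sketch (an assumption on my part: it uses the pynvml Python bindings for NVML and supposes your GPU and driver expose temperature, power and clock readings). Run it in one window while a stress test runs in another; if the graphics clock drops while the load stays constant, that is the slowdown described above.

[code]
# Minimal NVML polling sketch (assumes the pynvml package is installed and
# the driver exposes these counters; power readings are not available on
# every GPU). Press Ctrl+C to stop.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)   # degrees C
        watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0                       # reported in milliwatts
        clock = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)     # current graphics clock, MHz
        print("%3d C  %6.1f W  %4d MHz" % (temp, watts, clock))
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
[/code]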

Please send me a PM if I fail to keep up on replying in any specific thread or leave a driver feedback: Driver Feedback

#1
Posted 02/28/2011 07:25 PM   
[quote name='ManuelG' date='28 February 2011 - 02:25 PM' timestamp='1298921123' post='1199976']
Furmark is an application designed to stress the GPU by maximizing power draw well beyond any real world application or game. In some cases, this could lead to slowdown of the graphics card due to hitting over-temperature or over-current protection mechanisms. These protection mechanisms are designed to ensure the safe operation of the graphics card. Using Furmark or other applications to disable these protection mechanisms can result in permanent damage to the graphics card and void the manufacturer's warranty.
[/quote]
I only use what is suggested by EVGA (their own software), and I use Futuremark software. FurMark is blacklisted; just like you said, it can cause damage to your GPU, which is not good.

#2
Posted 02/28/2011 08:12 PM   
So what you are saying is that the cooling solution cannot cope with the graphics card being maxed out?

I understand that it is not something that is commonly run on a graphics card, but what if you wish to benchmark against others who are running FurMark?

It is disappointing that the new GeForce cards are unable to run a benchmark that is designed to run on graphics cards. Very disappointing.

_NVLDDMKM problems_ | _Problems getting a driver for a laptop graphics card_ | _What PSU do I need?_

[quote name='The Professor' date='11 August 2011 - 10:33 AM' timestamp='1313055223' post='1277858']
I think Qazax is a pretty cool guy. eh kills aleins and doesnt afraid of anything.
[/quote]

#3
Posted 02/28/2011 09:11 PM   
[quote name='Qazax' date='01 March 2011 - 07:11 AM' timestamp='1298927507' post='1200030']
so what you are saying, is the cooling solution cannot cope with the graphics card being maxed out?

I understand that it is not something that is commonly ran on a graphics card, but what if you wish to benchmark against others who are running furmark?

It is disappointing that the new geforce cards are unable to run a benchmark that is designed to run on graphics cards. very disappointing.


[/quote]

No, it is more the fact that even under a 100% gaming or Folding load, the card does not run under the same kind of load that FurMark puts it under.

In Memory of Chris "ChrisRay" Arthington, 1982-2010

CPU:Intel i7 920 @ 3.8(D0), Mainboard:Asus Rampage II Gene, Memory:12GB Corsair Vengeance 1600
Video:EVGA Geforce GTX 680+ 4GB, Sound:Creative XFI Titanium Fatal1ty Pro, Monitor:BenQ G2400WD
HDD:500GB Spinpoint F3, 1TB WD Black, 2TB WD Red, 1TB WD Black
Case:NZXT Guardian 921RB, PSU:Corsair 620HX, OS:Windows 7 SP1

#4
Posted 03/01/2011 03:40 PM   
Ouch!

Am I right in assuming that this disclaimer is geared more towards the GTX 590 (dual-GPU Fermi)? The GTX 590 pulls over [b]400W[/b] when fully loaded in FurMark, which is quite concerning, as I cannot see a lot of PSUs holding up to that amount of sustained stress.

However, the ATi Radeon HD 6990 also pulls approximately [b]400W[/b] when fully loaded in FurMark, so they are both pretty intense on the PSU.
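(A rough sanity check on what 400W means for the power supply, assuming essentially all of it is drawn from the 12V inputs: 400W / 12V is roughly 33A on the 12V rail for the graphics card alone. For comparison, the PCI Express specifications rate the slot for 75W and each 8-pin connector for 150W, so a card fed by the slot plus two 8-pin connectors is nominally a 375W part; FurMark-style loads can push past that figure.)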

On a lighter note, the BFG single-PCB GTX 295 I have reaches [b]82-84C[/b] in FurMark after an hour running at 1920x1200 with 16x FSAA.

The hottest I have seen during gaming (Crysis with 2x FSAA at the same resolution) was 74C.

I am not saying it is impossible for other things to load the card as much as FurMark does, but nVidia are treading on thin ice with this.

Imagine if Intel decided tomorrow that nobody was allowed to run their linpack_xeon64 CPU torture test, or banned Prime95?!

Stress testing is performed for a reason (stability testing to the point where you are 100% sure your cards will work under normal circumstances).

I do not wish to troll or flame or get involved in squabbles or fights, but is this a potential admission from nVidia that:

1) Their cards are not really able to withstand a 100% sustained workload, or
2) nVidia do not have confidence in a lot of the PSUs out on the market, and are therefore protecting the end user from PSU and system damage?

I would hope number 2 is the reason behind nVidia's disclaimer and overvolt/power protection.

The reason for this is that a lot of the PSUs out there are not really up to the grade of running premium components; nVidia are, in a way, protecting themselves by putting these limitations into the drivers and onto their cards.

HOWEVER

It does make me somewhat concerned...

I will now consider removing FurMark from my test suite. ATi have had this stance against FurMark for a while, so it is not just nVidia (FYI, ATi built FurMark protection into the 5000 series and newer cards, and also put FurMark detection into their drivers for their other cards).

John

MSI GTX 580 3GB Lightning XE , Factory Overclocked 832Mhz Core, 1664Mhz Shader and 4200Mhz Memory

Intel Core 2 Extreme QX9650

ASUS Rampage Extreme X48 Motherboard

8GB of DDR3 @ 1333Mhz CL8

NEC 24WMGX3

Windows 7 x64 with Service Pack 1

Creative Labs X-Fi Fatal1ty

#5
Posted 03/01/2011 06:41 PM   
John, that Intel burn-in utility (for the life of me the name escapes me) is not recommended to be run constantly because it accelerates the degradation of the core. The same goes for FurMark.

It's also an unrealistic measurement for finding core stability on cards that sync the clock domains, as you can have a stable (or unstable) shader clock while the slower transistors hit their limit and are never exposed in FurMark.

The best games for stability testing an overclock would have to be Crysis and Oblivion.

Both are overly sensitive to unstable shader clocks, and generate enough non-shader graphics work to load the slower parts of the core as well.

In Memory of Chris "ChrisRay" Arthington, 1982-2010

CPU:Intel i7 920 @ 3.8(D0), Mainboard:Asus Rampage II Gene, Memory:12GB Corsair Vengeance 1600
Video:EVGA Geforce GTX 680+ 4GB, Sound:Creative XFI Titanium Fatal1ty Pro, Monitor:BenQ G2400WD
HDD:500GB Spinpoint F3, 1TB WD Black, 2TB WD Red, 1TB WD Black
Case:NZXT Guardian 921RB, PSU:Corsair 620HX, OS:Windows 7 SP1

#6
Posted 03/02/2011 06:01 AM   
I understand what you are saying, Sora.

However, I do not overclock any components, and for what it is worth I have not used any "hacks" or methods to counteract nVidia's or ATi's application detection of FurMark when doing any of my testing on cards.
I have just always considered it a good method of testing the cooling solution and how noisy that cooling solution is.

Now that nVidia has issued this warning I will need to find something else....

FYI, due to the current state of the drivers (any Forceware driver beyond 195.62), Crysis [b]cannot[/b] be used as a form of stability testing because of game-breaking bugs which cause GTX 295 graphics cards (and the GTX 260 and 285) to slow down to [b]1.1-1.4[/b] FPS in certain areas.

Fortunately, Forceware [b]267.24 BETA[/b] addressed the majority of the graphical LOD, texture, shadow and water flickering when SLI was enabled, but the slowdown bugs and some flickering still remain.
I do not have Oblivion.

It seems that, for now, stability testing is quite a difficult thing to do, as we all need to agree on something which all vendors are happy with.

John

MSI GTX 580 3GB Lightning XE , Factory Overclocked 832Mhz Core, 1664Mhz Shader and 4200Mhz Memory

Intel Core 2 Extreme QX9650

ASUS Rampage Extreme X48 Motherboard

8GB of DDR3 @ 1333Mhz CL8

NEC 24WMGX3

Windows 7 x64 with Service Pack 1

Creative Labs X-Fi Fatal1ty

#7
Posted 03/02/2011 08:06 AM   
EVGA OC-Scanner is approved for Fermi; for anything older than that, I'd have to say throw everything you have at it.

In Memory of Chris "ChrisRay" Arthington, 1982-2010

CPU:Intel i7 920 @ 3.8(D0), Mainboard:Asus Rampage II Gene, Memory:12GB Corsair Vengeance 1600
Video:EVGA Geforce GTX 680+ 4GB, Sound:Creative XFI Titanium Fatal1ty Pro, Monitor:BenQ G2400WD
HDD:500GB Spinpoint F3, 1TB WD Black, 2TB WD Red, 1TB WD Black
Case:NZXT Guardian 921RB, PSU:Corsair 620HX, OS:Windows 7 SP1

#8
Posted 03/02/2011 10:46 AM   
I ran FurMark last year for about 10 minutes, then uninstalled it.

It made my card hotter in 10 minutes in winter than a 3-hour session of Crysis on the hottest day of summer, and that just ain't right.

GPUs are made for a purpose, and FurMark stresses them far beyond what any game ever will.

#9
Posted 03/03/2011 05:58 AM   
Hmm, I seem to recall some people complaining about the Final Fantasy XIV beta causing their cards to run very hot, as hot as FurMark and, some claimed, even hotter. If the cooling solution on the graphics card can't withstand the heat generated by plain software such as FurMark, then Nvidia's reference design is cutting corners to save money by providing inadequate cooling. FurMark is just software, so any software or game has the potential to make a graphics card run very hot, either intentionally or due to bugs or poor coding. Cooling is something that should be over-designed, especially when some people are playing intensive games (like the FF XIV beta) in tropical countries without air conditioning, as I do. Nvidia is trying to cover up their inadequate or cheap cooling designs by making statements that FurMark (or similar software) should not be run on their cards.

CPU: Intel Core i5-2550K @4.4GHz
Mainboard: MSI Z77A-GD43 (Intel Z77 chipset)
Graphics: MSI N660Ti PE 2GD5/OC (GeForce GTX 660 Ti @1019MHz)
RAM: 2 x 4GB Visipro PC3-12800 (1.5V @933MHz)
OS: Windows 7 Ultimate 64-bit Service Pack 1
PSU: Seasonic Eco 600W SS-600BT Active PFC T3
Monitor: Asus VX239H (23" Full HD AH-IPS LED Display)

#10
Posted 03/03/2011 09:57 AM   
[quote]Nvidia is trying to cover up their inadequate or cheap cooling designs by making statements that FurMark (or similar software) should not be run on their cards[/quote]

Troll elsewhere.

In Memory of Chris "ChrisRay" Arthington, 1982-2010

CPU:Intel i7 920 @ 3.8(D0), Mainboard:Asus Rampage II Gene, Memory:12GB Corsair Vengeance 1600
Video:EVGA Geforce GTX 680+ 4GB, Sound:Creative XFI Titanium Fatal1ty Pro, Monitor:BenQ G2400WD
HDD:500GB Spinpoint F3, 1TB WD Black, 2TB WD Red, 1TB WD Black
Case:NZXT Guardian 921RB, PSU:Corsair 620HX, OS:Windows 7 SP1

#11
Posted 03/03/2011 10:41 AM   
I am limited as to what I can say at the moment, but regarding FurMark I think nVidia have more than thermal concerns.
The power draw and the stress/strain on the power supply, VRMs and other electrical components are immense when using FurMark (far more than with any game).
What we see from the current generation of cards are fairly power-hungry ~250W-300W single GPUs.
nVidia's upcoming dual-GPU behemoth can consume [u]more[/u] than [b]400W.[/b]
This would test even the most powerful and efficient high-quality Tier 1 power supplies.
John

MSI GTX 580 3GB Lightning XE , Factory Overclocked 832Mhz Core, 1664Mhz Shader and 4200Mhz Memory

Intel Core 2 Extreme QX9650

ASUS Rampage Extreme X48 Motherboard

8GB of DDR3 @ 1333Mhz CL8

NEC 24WMGX3

Windows 7 x64 with Service Pack 1

Creative Labs X-Fi Fatal1ty

#12
Posted 03/03/2011 11:59 AM   
[quote name='Sora' date='03 March 2011 - 05:41 PM' timestamp='1299148914' post='1201514']
Troll elsewhere.
[/quote]
It's called constructive criticism, but it seems like Nvidia doesn't need any since it has such feisty defenders as Sora here.

Anyway, thank goodness for custom coolers so people don't have to get stuck with Nvidia reference coolers. If your card uses a reference cooler design or your system uses a barely adequate PSU, then please listen to Nvidia and don't run FurMark on your system unless you like to play with fire. :)

CPU: Intel Core i5-2550K @4.4GHz
Mainboard: MSI Z77A-GD43 (Intel Z77 chipset)
Graphics: MSI N660Ti PE 2GD5/OC (GeForce GTX 660 Ti @1019MHz)
RAM: 2 x 4GB Visipro PC3-12800 (1.5V @933MHz)
OS: Windows 7 Ultimate 64-bit Service Pack 1
PSU: Seasonic Eco 600W SS-600BT Active PFC T3
Monitor: Asus VX239H (23" Full HD AH-IPS LED Display)

#13
Posted 03/03/2011 12:07 PM   
Ital
Even though nVidia's dual-GPU Fermi (the 590) has had its clock rate reduced to 620MHz (from 772MHz on the GTX 580), it still consumes an awful lot of power. Even a decent 900W+ (Gold) PSU would struggle to cope if the OCP were disabled.
I cannot really elaborate too much, but in a power draw test a well-ventilated and sufficiently powered GTX 580 CAN consume up to [b]300W[/b] in FurMark (this is with all of nVidia's FurMark detection circumvented). With the GTX 590 (dual GPU) it is more than this, significantly more even.
The issue per se may not be the wattage, as a decent Gold PSU could in theory just about cope; the issue is a combination of the current, the stress on the PSU itself (as a LOT of the draw is placed onto the 12V rail) and the stress on the VRMs on the nVidia card as well.
Then of course we have the increase in temperature.

I understand nVidia's stance on this; it is just an eye-opener to see that GPUs are getting too powerful, and in some respects are perhaps ahead of their time compared to other system components. I am just a bit concerned as to what test suite is sufficient to push hardware to a warranted limit for stability-testing purposes.

Suffice to say, Sora has been helpful with his suggestions of Crysis and the EVGA OC Scanner. My only concern is that nVidia have serious driver bugs in Crysis which render stability testing on G200-based systems impossible; however, Manuel has passed the bug report on to the relevant departments, so hopefully Crysis will soon be a viable option once again for G200 users wanting to stability test.
John

MSI GTX 580 3GB Lightning XE , Factory Overclocked 832Mhz Core, 1664Mhz Shader and 4200Mhz Memory

Intel Core 2 Extreme QX9650

ASUS Rampage Extreme X48 Motherboard

8GB of DDR3 @ 1333Mhz CL8

NEC 24WMGX3

Windows 7 x64 with Service Pack 1

Creative Labs X-Fi Fatal1ty

#14
Posted 03/03/2011 06:05 PM   
[quote name='JohnZS' date='03 March 2011 - 06:59 AM' timestamp='1299153593' post='1201559']
What we see here from the current generation of cards are fairly power hungry ~250W-300W single GPU's.
[/quote]

People have seen over 400 watts of draw during FurMark (before the MOSFETs exploded) with a highly overclocked GTX 570 and voltage protection disabled. GeForce 590 power draw under the right conditions would wake Nikola Tesla from his grave.

Motherboard: Gigabyte Z77 UD5H + 2500k

Ram: 8g (4g x 2 Samsung 30nm MV-3V4G3)

GPU: MSI 570GTX reference card - stock

PSU: SeaSonic S12 Energy Plus SS-650HT 650W

Sound: Creative Audigy 2 ZS PCI

HD: OCZ Agility 60g, Intel RST AHCI driver, Win7 x64

Mouse: G9x, Sensei, + more

One LCD attached, DVI mode, DPC Latency = 4

#15
Posted 03/04/2011 03:39 AM   