GTX 600 Series Bad for video editing / encoding?
I've been hearing some pretty bad things about the 600 series. I use Adobe Premiere and Sony Vegas for video editing and rendering. Everyone says the change in architecture from Fermi to Kepler (500 to 600 series) is good for gamers and terrible for video editing. Supposedly this is because DirectCompute and GPGPU performance, a strength of the 500 series, is now clearly secondary. I've been told that the GPGPU power of the 500 series has been stripped out of the 600 series. Many 600-series cards are marketed as having over 1,000 CUDA cores, but for video encoding and editing each of those cores delivers only a fraction of the compute power of a 500-series core. Performance reports make me wonder whether the software simply isn't taking full advantage of the new architecture, or whether the reports are true and NVIDIA wants to split its lineup into gaming cards and separate high-end, compute-heavy cards for video editing. That would mean the GTX 670 or 660 Ti is a great gaming card but won't hold its own against a GTX 570 for Premiere or Vegas rendering and video encoding/transcoding. The 580 seems to blow away the 670 for rendering MP4 video files. Can anyone speak to this? Right now all the recommendations are to go AMD for compute power, or to the 500 series, which is at the end of its life.

I'm also trying to determine how the 570 and the 660 Ti/670 compare on render times versus an i7-2600K CPU at 3.5 GHz. I'd really appreciate it if someone could give me insight into the value of the GTX 600 series for video editing only, not gaming.
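For anyone comparing cards directly, here's a minimal CUDA sketch (just an illustration on my part, assuming you have the CUDA toolkit installed and can compile with nvcc; it's not tied to Premiere or Vegas) that prints what each card actually reports: compute capability, SM count, and clock. A Fermi SM has 32 or 48 cores depending on the chip, while a Kepler SMX has 192 at a lower clock, which is why raw CUDA-core counts aren't comparable across generations.

// compile with: nvcc devinfo.cu -o devinfo
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA devices found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp p;
        cudaGetDeviceProperties(&p, i);
        // Fermi reports compute capability 2.x, Kepler 3.x; cores per SM differ
        // (GF110: 32, GF114: 48, GK104: 192), so the SM count alone isn't the whole story.
        printf("%s: compute %d.%d, %d SMs, %d MHz\n",
               p.name, p.major, p.minor, p.multiProcessorCount, p.clockRate / 1000);
    }
    return 0;
}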

#1
Posted 11/27/2012 04:19 AM   
I don't know much about editing and rendering, but if video converting counts, I see a HUGE difference between the 560 and the 680! (HUGE!)

ºi7 4790K, ºAsus Z97-A, º2x8GB GBXM Ripjaws X, ºMSI GTX970 Gaming, ºCorsair GS700, ºCM 690 III, ºLG E2241T, ºWindows 10 EN x64

#2
Posted 11/27/2012 07:50 AM   
I'm with you, and it appears the link has been removed. It showed render times, or "Blender benchmarks." Across the board, the GTX 570 was besting the GTX 670 and 680. On the surface, it appears the 600 series is a gaming card that moves away from compute tasks like video encoding and rendering. Since I don't have the money to test everything myself, I was trying to find answers.

#3
Posted 11/27/2012 05:22 PM   
I'm sad about this too. I now own a GTX 660 Ti. In games it performs well, but in CUDA applications it seems to be almost the same as my old GTX 260 :(

#4
Posted 01/03/2013 10:12 AM   
Sad... my GTX 560 Ti is about to die on me (after a good two-year run); I'm getting all sorts of graphical glitches. But I also encode a lot, and I just bought a GTX 660, so I guess I'm not going to jump up and down much.

#5
Posted 10/10/2013 01:26 AM   
It's true. I didn't know about this at all when I purchased the GTX 760 a few months back. Here I was thinking I had 1152 CUDA cores, compared to the 336 in my GTX 460 (SLI). Man, was I pissed. Sure, the new card was great for playing the new games Far Cry 3 and Crysis 3, but not so great for my video-converting software. The CUDA option was no longer available and I couldn't select it in some of my apps. I documented this immediately afterward and posted the video; at the time I had no idea it was because of Fermi vs. Kepler: https://www.youtube.com/watch?v=gdv4WICVNgE

I did a review on Amazon stating that it was great for gaming, but that if anyone was buying this card for video encoding it wasn't worth it, because the video software doesn't support the Kepler architecture. I ended up getting a bunch of "not helpful" votes on the review, something like 1 helpful and 12 not helpful. I was like, WTF?! I deleted the review, since I didn't see how it wasn't helpful. If I had known I wouldn't be able to use the new card to speed up my video conversions, I wouldn't have bought it.

(Just an FYI: with the GTX 760, I played Far Cry 3 on the highest [Ultra] setting and Crysis 3 on the highest settings as well. The games didn't even stutter. With the GTX 460 SLI, the best setting I could muster was Medium.)

Funny... the GTX 760 ended up being just a paperweight for a couple of months, until I put it into my living-room PC a few weeks ago. It's a better card for watching movies than the GT 430, though that wasn't too bad for a $40 card (it's frickin' sad how it's being utilized now). I just hated seeing the $270 760 sitting in the box, unused. I quit playing Crysis and Far Cry, so what was the point? Fallout 3 plays great on the 460 SLI and is the only game I play over and over again.

#6
Posted 10/11/2013 02:18 PM   
Some software implements device-ID checks and locks down support if an unknown card is detected.
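A purely made-up sketch of what that kind of check can look like (the card list and function name are hypothetical, not taken from any actual converter): the app compares the reported GPU name against a whitelist baked in at release time and greys out its CUDA option for anything it doesn't recognize, even though CUDA itself would work.

#include <cstdio>
#include <cstring>
#include <cuda_runtime.h>

// Hypothetical whitelist of cards the app "knew about" when it shipped.
static const char* kKnownGpus[] = { "GeForce GTX 460", "GeForce GTX 560 Ti", "GeForce GTX 580" };
static const int kKnownCount = sizeof(kKnownGpus) / sizeof(kKnownGpus[0]);

// Returns true only if the GPU's reported name matches the whitelist,
// so a newer card loses the CUDA option regardless of actual capability.
bool cudaOptionEnabled(int device) {
    cudaDeviceProp p;
    if (cudaGetDeviceProperties(&p, device) != cudaSuccess)
        return false;
    for (int i = 0; i < kKnownCount; ++i)
        if (strstr(p.name, kKnownGpus[i]) != NULL)
            return true;
    return false;
}

int main() {
    printf("CUDA acceleration %s\n", cudaOptionEnabled(0) ? "enabled" : "greyed out");
    return 0;
}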



In Memory of Chris "ChrisRay" Arthington, 1982-2010

Specs:
CPU:Intel Xeon x5690 @ 4.2Ghz, Mainboard:Asus Rampage III Extreme, Memory:48GB Corsair Vengeance LP 1600
Video:EVGA Geforce GTX 1080 Founders Edition, NVidia Geforce GTX 1060 Founders Edition
Monitor:ROG PG279Q, BenQ BL2211, Sound:Creative XFI Titanium Fatal1ty Pro
SSD:Crucial MX300 275, Crucial MX300 525, Crucial MX200 250
HDD:500GB Spinpoint F3, 1TB WD Black, 2TB WD Red, 1TB WD Black
Case:NZXT Phantom 820, PSU:Seasonic X-850, OS:Windows 7 SP1
Cooler: ThermalRight Silver Arrow IB-E Extreme

WIP:
CPU:Intel Xeon x5660, Mainboard:Asus Rampage II Gene, Memory:16GB Corsair Vengeance 1600 LP
Video:EVGA Geforce GTX 680+ 4GB, Palit Geforce GTX 550ti
Monitor:Pending, Sound:Pending
SSD:Pending
HDD:Pending
Case:NZXT Guardian 921RB, PSU:Corsair 620HX, OS:Windows 7 SP1
Cooler: ThermalRight True Spirit 120M

#7
Posted 10/12/2013 09:33 AM   
IMO, using the GPU to encode video is a bad idea, speaking as someone who deals with video encoding on the fly. The few times I've tested GPU encoding, it either didn't help at all (and even increased CPU load) or the output looked like crap. I have to worry about video quality, or my customers get really unhappy.

CPU: Intel Core i7 4770k
Motherboard: MSI z87-Plus
Ram: G.SKILL Ripjaws Series DDR3 SDRAM DDR3 2400 (8x4GB) 32GB
GPU: eVGA Nvidia GTX1080 ACX 3.0
Sound: Creative Labs Sound Blaster X-Fi Fatal1ty Platinum Edition
PSU: OCZ Gold Series 850 watts
OS: Windows 10 Pro

#8
Posted 10/12/2013 10:02 AM   
Yeah, most CUDA encoders don't do proper motion estimation or GOP handling.



In Memory of Chris "ChrisRay" Arthington, 1982-2010

Specs:
CPU:Intel Xeon x5690 @ 4.2Ghz, Mainboard:Asus Rampage III Extreme, Memory:48GB Corsair Vengeance LP 1600
Video:EVGA Geforce GTX 1080 Founders Edition, NVidia Geforce GTX 1060 Founders Edition
Monitor:ROG PG279Q, BenQ BL2211, Sound:Creative XFI Titanium Fatal1ty Pro
SSD:Crucial MX300 275, Crucial MX300 525, Crucial MX200 250
HDD:500GB Spinpoint F3, 1TB WD Black, 2TB WD Red, 1TB WD Black
Case:NZXT Phantom 820, PSU:Seasonic X-850, OS:Windows 7 SP1
Cooler: ThermalRight Silver Arrow IB-E Extreme

WIP:
CPU:Intel Xeon x5660, Mainboard:Asus Rampage II Gene, Memory:16GB Corsair Vengeance 1600 LP
Video:EVGA Geforce GTX 680+ 4GB, Palit Geforce GTX 550ti
Monitor:Pending, Sound:Pending
SSD:Pending
HDD:Pending
Case:NZXT Guardian 921RB, PSU:Corsair 620HX, OS:Windows 7 SP1
Cooler: ThermalRight True Spirit 120M

#9
Posted 10/13/2013 08:58 AM   
I downgraded to GeForce driver 335.23 and CUDA encoding is now enabled in my video converters.
With GeForce 340 and above, the CUDA feature is disabled (by the driver, not the software). Go figure.
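If anyone wants to see what a given driver exposes before and after a change like that, here's a minimal sketch using the CUDA runtime API. It only shows one thing that can shift with a driver swap (the CUDA version the driver supports versus what the app was built against); it doesn't explain exactly what NVIDIA changed in 340.

#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int driverVer = 0, runtimeVer = 0;
    // CUDA version the installed display driver supports vs. the runtime this app links against.
    cudaDriverGetVersion(&driverVer);
    cudaRuntimeGetVersion(&runtimeVer);
    printf("Driver supports CUDA %d.%d, app built against CUDA %d.%d\n",
           driverVer / 1000, (driverVer % 1000) / 10,
           runtimeVer / 1000, (runtimeVer % 1000) / 10);
    return 0;
}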

#10
Posted 10/05/2014 08:00 AM   