Need to disable HDCP on Quadro FX 580
Hello all,

I'm running into an obstacle with an implementation for my work, and I was hoping that some of you may be able to help.

The What:
I work for a digital signage company, and we use H.264 streaming devices to replicate a desktop image out to multiple endpoints.
In this case, the video card is an NVIDIA Quadro FX 580.
The output connection is DisplayPort out to HDMI.
The 'monitor' device is an Exterity e3532 (HDMI in, H.264 back out over the network).
The platform is Windows Server 2008 R2 Enterprise x64.

The How:
The digital signage application runs on the Server 2008 R2 machine, pipes the output over DisplayPort-to-HDMI via the FX 580, and the Exterity device converts it to a network stream.

The Huh?:
I've actually got this setup five times over (5 unique screens needed = 5 servers). All day long I've watched them obtain a 'sync' (the Exterity actually picks up the signal and streams it) and then lose it again. I'm not sure what triggers it to catch or drop the signal; it has been intermittent throughout my testing. The Exterity device's log says it cannot play the HDCP-protected signal. I'm not certain how HDCP works, but judging by the intermittent drops, I'm guessing it only performs an occasional HDCP 'check'.


So... I know the Exterity devices will play the signal; HDCP is just killing it. I'm hoping someone can tell me how to disable HDCP on these cards. I've seen other threads refer to an overscan workaround that incidentally disables HDCP, so I feel certain it's possible.


Please let me know if you need any additional information, and thanks in advance for your help.

-DK

Posted 03/29/2011 08:18 PM   