DVI-HDMI, comp won't detect TV
First, some system specs:

NVIDIA System Information report created on: 07/19/2009 15:36:29
System name: SPENCER-PC

[Display]
Processor: Intel® Core(tm)2 CPU T7400 @ 2.16GHz (2161 MHz)
Operating System: Windows 7 Ultimate, 32-bit
DirectX version: 10.0
GPU processor: GeForce Go 7900 GS
Driver version: 179.48
Core clock: 375 MHz
Memory clock: 507 MHz (1014 MHz data rate)
Memory interface: 256-bit
Total available graphics memory: 1023 MB
Dedicated video memory: 256 MB
System video memory: 0 MB
Shared system memory: 767 MB
Video BIOS version: 5.71.22.28.01
IRQ: 16
Bus: PCI Express x16

[Components]

nvCplUIR.dll 2.2.275.00 NVIDIA Control Panel
nvCpl.cpl 2.2.275.00 NVIDIA Control Panel Applet
nvCplUI.exe 2.2.275.00 NVIDIA Control Panel
nvViTvSR.dll 7.15.11.7948 NVIDIA Video and TV Server
nvViTvS.dll 7.15.11.7948 NVIDIA Video and TV Server
nvDispSR.dll 7.15.11.7948 NVIDIA Display Server
NVMCTRAY.DLL 7.15.11.7948 NVIDIA Media Center Library
nvDispS.dll 7.15.11.7948 NVIDIA Display Server
NVCPL.DLL 7.15.11.7948 NVIDIA Compatible Windows Vista Display driver, Version 179.48
nvGameSR.dll 7.15.11.7948 NVIDIA 3D Settings Server
nvGameS.dll 7.15.11.7948 NVIDIA 3D Settings Server


OK, so I just got an RCA 40" 1080p TV. The owner's manual does not provide a breakdown of acceptable resolutions for each input connection, but from my research it should be able to display 1920x1080 @60-85Hz. I already had a VGA cable that I had previously used to connect my laptop to my old 32" 720p TV, and I got a great 1280x720 picture from it. When I connected the analog VGA to my new TV, I learned that not only couldn't I send a 1080p signal through analog (I would get a 'test failed' message with anything over 720 horizontal lines), I couldn't pass a widescreen (16:9 or 16:10) resolution of any size through it either.
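For what it's worth, a TV advertises its supported modes to the computer in its EDID block. Here's a toy sketch of how the 18-byte "detailed timing descriptor" inside an EDID decodes into a mode. The sample bytes are the standard 1080p60 descriptor, not ones read from this RCA set (on Windows the real EDID lives in the registry under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY):

```python
# Toy parser for one EDID "detailed timing descriptor" (DTD) -- the 18-byte
# block a display uses to advertise a supported mode over VGA/DVI/HDMI.
# Sample bytes: the standard 1080p60 descriptor (pixel clock 148.50 MHz).

DTD_1080P60 = bytes([
    0x02, 0x3A,  # pixel clock, LSB first, in units of 10 kHz -> 14850 -> 148.50 MHz
    0x80,        # horizontal active, low 8 bits
    0x18,        # horizontal blanking, low 8 bits
    0x71,        # high nibbles: h-active (0x7) | h-blank (0x1)
    0x38,        # vertical active, low 8 bits
    0x2D,        # vertical blanking, low 8 bits
    0x40,        # high nibbles: v-active (0x4) | v-blank (0x0)
    0x58, 0x2C, 0x45, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x1E,
])

def parse_dtd(d):
    clock_hz = (d[0] | d[1] << 8) * 10_000
    h_active = d[2] | (d[4] & 0xF0) << 4
    h_blank  = d[3] | (d[4] & 0x0F) << 8
    v_active = d[5] | (d[7] & 0xF0) << 4
    v_blank  = d[6] | (d[7] & 0x0F) << 8
    # refresh rate = pixel clock / (total pixels per frame, blanking included)
    refresh = clock_hz / ((h_active + h_blank) * (v_active + v_blank))
    return h_active, v_active, round(refresh, 1)

print(parse_dtd(DTD_1080P60))  # -> (1920, 1080, 60.0)
```

If the TV's EDID doesn't list a mode, the driver generally refuses to offer it, which may explain the missing widescreen modes over VGA.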

Crawling through forum after forum, the general consensus seemed to be that you need to run DVI-to-HDMI or component. My GFX card has S-Video, VGA, and DVI connections, so I bought an "HDMI to DVI" cable. The result: the NVIDIA Control Panel will not detect the TV, and the TV shows a "no signal" blue screen. Yes, I have tried the 2nd and 3rd HDMI inputs as well; same result. When I restart my computer, I do see the TV go black for a second or two and then return to the blue screen.

I took the HDMI-DVI cable back to the electronics store (which shall remain anonymous) and was told that "HDMI to DVI" is a one-way cable and that I needed one labeled "DVI to HDMI". Does that sound right? It seemed strange to me. They didn't have one on hand, so they set me up with a DVI/HDMI adapter plug and an HDMI cable instead. I hooked this up, tried all 3 HDMI inputs on the TV, and still got the same result.

I have also already tried updating my NVIDIA driver.

My computer will still recognize the TV over the analog VGA connection, but it will not give me widescreen resolutions or 1080 lines.

I haven't found anything in all my searching to suggest that my GFX card cannot send a signal out through DVI to HDMI.


ANY BRIGHT IDEAS?


*edit* -By the way, I don't know if this is worth anything, but the largest resolution the NVIDIA Control Panel will let me apply to my TV through analog is 1600x1200, which doesn't fit the TV's aspect ratio. The next size down, 1440x900 @85Hz, does fit, but if I can pass 1440x900 through analog, why can't I even get 1920x1080 to test?
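To keep the shapes straight, here's a quick check of the aspect ratios of the modes I've been juggling (note 1440x900 only "fits" in the sense that 16:10 is close to the panel's 16:9; it isn't native):

```python
from math import gcd

# Reduce each mode to its simplest aspect ratio. Only 1920x1080 is
# actually 16:9, the shape of a 1080p panel; 1440x900 is 16:10 (8:5).
for w, h in [(1600, 1200), (1440, 900), (1920, 1080), (1728, 1080), (1920, 1200)]:
    g = gcd(w, h)
    print(f"{w}x{h} -> {w // g}:{h // g}")
```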

Any way I can get a native-resolution picture from my comp to my TV would be fantastic. I'm all out of ideas; does anybody have any solutions, or am I trying to do something that's not possible with my GFX card?

#1
Posted 07/19/2009 08:53 PM   
One more thing, and I'm not 100% on this, but the DVI port on my comp looks like a DVI-D dual-link connector, while the adapter plug I'm using looks like DVI-D single link (it doesn't have the extra pins in the center).
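If it helps anyone: single vs. dual link shouldn't be the blocker here. A quick back-of-the-envelope check, assuming the standard CEA-861 1080p60 timing (2200x1125 total pixels including blanking):

```python
# Single-link DVI tops out at a 165 MHz pixel clock; dual link adds a
# second set of TMDS pairs (the extra center pins) for higher clocks.
# 1080p60 needs only 148.5 MHz, so a single-link plug is enough for it.

SINGLE_LINK_LIMIT_MHZ = 165.0

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

clk = pixel_clock_mhz(2200, 1125, 60)  # CEA-861 1080p60 total timing
print(clk, clk <= SINGLE_LINK_LIMIT_MHZ)  # -> 148.5 True
```

So a single-link adapter is electrically capable of 1080p60; the "no signal" problem is likely something else.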

#2
Posted 07/19/2009 10:02 PM   
Also, I tried testing 1728x1080 and 1920x1200 (both are 8:5 aspect ratio) through analog and still got 'custom mode test failed'.

Still cannot get any signal at all from DVI to any of my HDMI ports.

#3
Posted 07/19/2009 10:05 PM   
Here's a link to the spec sheet for the TV I'm trying to connect to:

[url="http://tv.rca.com/RCATV/Assets/PDFs/L40FHD41_V6_Spec.pdf"]http://tv.rca.com/RCATV/Assets/PDFs/L40FHD41_V6_Spec.pdf[/url]

#4
Posted 07/19/2009 11:17 PM   
A DVI-to-HDMI cable should do the trick for you if the display sends the proper handshake and such. Of course, if your laptop doesn't support HDCP and the display requires it for digital connections like DVI/HDMI, you are simply out of luck.
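For the curious, the key agreement behind that handshake can be sketched as a toy Blom-style scheme. This is grossly simplified (real HDCP 1.x uses 40 keys of 56 bits each plus further authentication steps), and the sizes and KSVs below are made up for illustration:

```python
import random

# Toy model of HDCP 1.x key agreement: the licensing authority holds a
# secret symmetric matrix A; a device with key-selection vector (KSV) v
# is issued the key vector A*v. Each side then sums its own keys at the
# bit positions where the OTHER side's KSV has a 1, so both compute
# u^T * A * v -- the same shared secret, with A never sent on the wire.

N, MOD = 8, 2**16          # toy sizes; real HDCP uses 40 keys of 56 bits
rng = random.Random(1)

# secret symmetric matrix, known only to the licensing authority
A = [[0] * N for _ in range(N)]
for i in range(N):
    for j in range(i, N):
        A[i][j] = A[j][i] = rng.randrange(MOD)

def issue_keys(ksv):
    # device key vector: A times the KSV bit-vector, modulo 2^16
    return [sum(A[i][j] * ksv[j] for j in range(N)) % MOD for i in range(N)]

def shared_key(my_keys, their_ksv):
    # sum my keys where the other side's KSV bit is set
    return sum(k for k, b in zip(my_keys, their_ksv) if b) % MOD

ksv_tx = [1, 0, 1, 1, 0, 0, 1, 0]   # transmitter's KSV (made up)
ksv_rx = [0, 1, 1, 0, 1, 0, 0, 1]   # receiver's KSV (made up)
keys_tx = issue_keys(ksv_tx)
keys_rx = issue_keys(ksv_rx)

km_tx = shared_key(keys_tx, ksv_rx)
km_rx = shared_key(keys_rx, ksv_tx)
print(km_tx == km_rx)  # -> True: both ends derive the same secret
```

If either end lacks the licensed keys, the handshake fails and the source can refuse to output, which matches the "no signal" symptom over DVI/HDMI.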

#5
Posted 07/20/2009 05:44 PM   
Wow, I just read up on this HDCP stuff. Thanks for the pointer, and I think you're right; I bought my laptop in January 2007.

This HDCP business is total bulls***! :angry:

I'm really surprised that no one has come up with a handy little software workaround for this yet.

#6
Posted 07/21/2009 04:02 AM   