Can ANYONE set a 1366 x 768 resolution?
I have a dual 8800 GT rig, displayed by a Westinghouse 27" 720p/1080i HDTV.

If I hook it all up with my VGA cable, all seems fine, although slightly fuzzy (text), and it runs at 1360 x 768.

When I hook it up with the DVI cable, the picture is MUCH clearer and sharper (and colorful), however the 1360 x 768 resolution causes a portion of the screen to be cut off on the left side of the TV...missing maybe an inch or so of desktop.

I can of course set it to run at 720p, but that lowers the resolution to 1280 x 720. When I try 1080i, it looks like garbage, of course.

I have messed around with custom timings to set it at 1366 x 768 to remove that missing portion on the left, which only ended in failure and frustration. I always revert back to the VGA cable.
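
For anyone else fighting custom timings, the arithmetic behind a custom mode looks roughly like this (the blanking numbers below are illustrative guesses, not a validated modeline; real values come from the VESA GTF/CVT formulas):

```python
# Rough arithmetic behind a custom 1366 x 768 @ 60 Hz mode.
# The blanking values are illustrative assumptions, NOT a tested modeline.
h_active, v_active, refresh = 1366, 768, 60
h_blank = 160   # hypothetical horizontal blanking (porches + sync), in pixels
v_blank = 30    # hypothetical vertical blanking, in lines

h_total = h_active + h_blank
v_total = v_active + v_blank

# Pixel clock = total pixels per frame * frames per second
pixel_clock_mhz = h_total * v_total * refresh / 1e6
print(f"{h_total} x {v_total} total -> {pixel_clock_mhz:.2f} MHz pixel clock")

# One common explanation for the 1360-vs-1366 oddity: many GPUs want the
# horizontal resolution to be a multiple of 8, and 1366 isn't.
print(1366 % 8, 1360 % 8)  # 6 0
```

Drivers that reject 1366 will often accept 1360 (or 1368) for exactly that divisibility reason.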

Has anyone accomplished this 1366 resolution, or has anyone been able to remove that screen cut-off?

I'm considering buying a DVI to HDMI cable and hooking it up that way, but am hesitant that I will have the same issue.

If anyone can throw me a bone here, please tell me about this 1366 issue, and also whether it's worth running the computer through the TV's HDMI input.

The picture is certainly better when viewed through a DVI hookup...but missing that little bit of screen is unacceptable. Thanks.

#1
Posted 03/06/2008 05:10 PM   
You'll have the overscan issue with the DVI>HDMI also. It has to do with HDTVs specifically.

Try component. That's what I have to use. You'll be able to resize the desktop correctly in the Control Panel. You can also pick 1440 x 900 resolution while maintaining your 1080i connection, which looks great.

Of course using 1920 x 1080 @ 1080i is going to look bad. Your TV doesn't actually support it natively.

I'm in the same boat so....
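
To put a number on the overscan being described (the 3% figure here is an assumption; real sets crop anywhere from roughly 2 to 5%):

```python
# Estimate how many desktop pixels a typical overscan margin hides.
# The 3% figure is an assumption; actual HDTV overscan varies by set.
width, height = 1366, 768
overscan = 0.03  # hypothetical fraction of each dimension cropped

hidden_w = round(width * overscan)
hidden_h = round(height * overscan)
visible = (width - hidden_w, height - hidden_h)
print(f"~{hidden_w} columns and ~{hidden_h} rows cropped -> visible {visible}")
```

An inch or so missing from a 27" screen is in that same ballpark, which is consistent with plain overscan rather than a broken mode.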

#2
Posted 03/06/2008 06:06 PM   
[quote name='SmilesLikeJoker' date='Mar 6 2008, 10:06 AM']You'll have the overscan issue with the DVI>HDMI also.  It has to do with HDTV's specifically.

Try component.  That's what I have to use.  You'll be able to resize the desktop correctly in the Control Panel.  You can also pick 1440 x 900 resolution while maintaining your 1080i connection, which looks great.

Of course using 1920 x 1080 @ 1080i is going to look bad.  Your tv doesn't actually support it natively. 

I'm in the same boat so....
[right][snapback]338645[/snapback][/right]
[/quote]


So just to clarify...I should use the DVI to Component cable to hook up to my TV?

And even though my TV's max resolution is 1366 x 768 (manual states this) I'll still be able to select 1440 x 900 resolution and it will look good?

#3
Posted 03/06/2008 06:41 PM   
[quote name='Przybocki' date='Mar 6 2008, 01:41 PM']So just to clarify...I should use the DVI to Component cable to hook up to my TV?

And even though my TV's max resolution is 1366 x 768 (manual states this) I'll still be able to select 1440 x 900 resolution and it will look good?
[right][snapback]338669[/snapback][/right]
[/quote]
Use the component dongle-thing that should have come with your card. That's the HDTV-out.

Yea....it's weird. 1440 x 900 actually works pretty well. At least I think so.... If you hook up with component, it should be one of the resolutions you can pick. When you first boot, you'll be at 1920 x 1080. First, resize your desktop from the Control Panel....it will actually stick this time. Last, change your resolution. You'll be at 1440 x 900 while maintaining the HD connection @ 1080i.

My HDTV has the same native resolution.

#4
Posted 03/06/2008 08:26 PM   
[quote name='SmilesLikeJoker' date='Mar 6 2008, 12:26 PM']Use the component dongle-thing that should have come with your card.  That's the HDTV-out. 

Yea....it's weird. 1440 x 900 actually works pretty well.  At least I think so....  If you hook up with component, it should be one of the resolutions you can pick.  When you first boot, you'll be at 1920 x 1080.  First, resize your desktop from the Control Panel....it will actually stick this time.  Last, change your resolution.  You'll be at 1440 x 900 while maintaining the HD connection @ 1080i.

My HDTV has the same native resolution.
[right][snapback]338711[/snapback][/right]
[/quote]


Thanks, I'll give that a try tonight when I get home. So that's a pretty good resolution? Do many games support that rez?

#5
Posted 03/06/2008 08:40 PM   
[quote name='Przybocki' date='Mar 6 2008, 03:40 PM']Thanks, I'll give that a try tonight when I get home. So that's a pretty good resolution? Do many games support that rez?
[right][snapback]338725[/snapback][/right]
[/quote]
I haven't had problems using it. You'll have to play around with each game. Some do....some don't. All the games I'm actually playing atm have no issue with it.

#6
Posted 03/06/2008 08:45 PM   
[quote name='SmilesLikeJoker' date='Mar 6 2008, 01:06 PM']You'll have the overscan issue with the DVI>HDMI also.  It has to do with HDTV's specifically.

Try component.  That's what I have to use.  You'll be able to resize the desktop correctly in the Control Panel.  You can also pick 1440 x 900 resolution while maintaining your 1080i connection, which looks great.

Of course using 1920 x 1080 @ 1080i is going to look bad.  Your tv doesn't actually support it natively. 

I'm in the same boat so....
[right][snapback]338645[/snapback][/right]
[/quote]

First of all, the TV in question here is a 720p/1080i HDTV, meaning it supports HDTV resolutions of 1280*720 progressive (60Hz) and 1920*1080 interlaced (30Hz). So neither one should look bad at all. And using the PC input at 1360*768 will give you the clearest and sharpest picture, although the DVI will have much better color and contrast. The reason is that your native resolution is 1360*768, so when you set your desktop to this, the card makes and sends information on a per-pixel basis without the card or TV scaling the image. Just think of it this way: at 1360*768, your card has specific information for what each pixel on your TV should be doing. Anything else is just scaled to fit your native resolution. The 1280*720 (720p) signal should also look good over HDMI, but doesn't because of driver issues with the 8-series GPUs that haven't been resolved yet. Pay close attention to how clear text appears, not images, when changing connections.
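
That per-pixel argument can be made concrete with a little integer math; this is a sketch of the mapping problem only, not what any real scaler chip does:

```python
# Why a non-native resolution looks soft: 1280 source columns have to be
# stretched across 1366 panel columns, so almost every panel pixel lands
# between two source pixels and must be interpolated (blended).
src_w, panel_w = 1280, 1366

# Count panel columns that line up exactly with a source column.
# Panel column x maps to source position x * src_w / panel_w, which is
# an integer only when (x * src_w) is divisible by panel_w.
exact = sum(1 for x in range(panel_w) if (x * src_w) % panel_w == 0)
print(f"{exact} of {panel_w} panel columns map exactly; "
      f"{panel_w - exact} need blending")
```

Only a couple of columns line up exactly; everything else gets blended, which is the softness you see whenever the desktop isn't at the panel's native resolution.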

Smiles, do you even have an HDTV? How could you say "of course 1920 x 1080 @ 1080i is going to look bad"? You understand that 1080i is what makes these TVs look so good and is standard in the world of high definition, don't you?

Component supports the same resolutions as HDMI, but HDMI and DVI offer lossless uncompressed digital video, drastically increasing color accuracy and reproduction, while at the same time improving overall image quality because the signal isn't converted to analog before being displayed. This is especially effective on modern HDTVs, as they are digital displays and benefit from a digital signal.
You should be able to resize the desktop with either component or HDMI as both support HDTV signals.

A direct connection between the video card and TV is required (look in your manual) for proper function. If you use the HDMI input on the TV you should get the resize desktop option in the nVidia control panel.

"1440 x 900 resolution while maintaining your 1080i connection"
This is because the second number, 900, is higher than 720, which is the maximum for 720p. That it looks better is only due to the fact that 1920*1080 is too high for a PC signal; things become so small that distortion occurs as the card tries to show fine detail or even text over such small screen areas.
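
Spelling out that bounds check (this is just the arithmetic in the discussion, nothing vendor-specific):

```python
# Check which HD transport formats a 1440 x 900 desktop fits inside.
# 720p and 1080i frame sizes are the standard broadcast dimensions.
modes = {"720p": (1280, 720), "1080i": (1920, 1080)}
desktop = (1440, 900)

for name, (w, h) in modes.items():
    fits = desktop[0] <= w and desktop[1] <= h
    print(f"{name}: {'fits' if fits else 'too big'}")
# 1440 x 900 overflows 720p on both axes but fits inside the 1080i
# envelope, which is why the card keeps the 1080i connection for it.
```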

Of course 1080i is going to look bad?!?! Yeah, but ONLY for a PC signal. 1080i HDTV is HIGHER definition than 720p HDTV, and therefore looks better. Native resolution only determines at what resolution an image can be displayed progressively, or entirely all at once in one frame. Higher resolutions can still be displayed (theoretically, up to twice the native) as an interlaced signal. This is because only half the image is displayed at any one instant. These "halves" are displayed one after the other so fast humans can't tell. So, despite 720p being progressive, 1080i still looks more life-like because these halves contain almost as much picture data as a whole image does for 720p.
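
Those "halves" are interlaced fields; here's a toy sketch of the idea (illustrative only, real deinterlacing is far more involved):

```python
# A 1080-line interlaced frame is sent as two fields: one carrying the
# even-numbered scanlines, the other the odd-numbered ones.
def split_fields(frame):
    """Split a list of scanlines into (even_field, odd_field)."""
    return frame[0::2], frame[1::2]

frame = list(range(1080))      # scanlines 0..1079
even, odd = split_fields(frame)
print(len(even), len(odd))     # 540 540

# Each 1080i field actually carries more pixels than a whole 720p frame,
# which is the "almost as much picture data" point above:
print(1920 * 540, 1280 * 720)  # 1036800 921600
```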

Przybocki: you will want to use your VGA cable and stay with 1360*768. nVidia has issues with the 8-series GPUs and LCD HDTV displays. The problem is that if you use a digital type connection (i.e. DVI, HDMI, or any combination thereof with the right cables), there is distortion in the image due to forces unknown (scaling conflicts, really). Look at text as you move it across the screen, left and right. Or how a series of dots, like the vertical ones used in the nVidia control panel connecting all the options, appears to "break" in a regular pattern. If your sharpness is set all the way down or your eyesight is not that good, you may not even see it, but many people do, and it is a known issue.

I hope that clears up some misconceptions about HDTVs, native resolutions, and your 8800 GTs. And you can get color and brightness close to that of HDMI/DVI; just play with the sliders in the manage desktop color options in the nVidia CP. Good luck.

#7
Posted 03/06/2008 09:37 PM   
[quote name='EddieDoo-Wop' date='Mar 6 2008, 01:37 PM']First of all, the TV in question here is a 720p/1080i HDTV, meaning it supports HDTV resolutions of 1280*720 progressive (60Hz) and 1920*1080 interlaced (30Hz). So neither one should look bad at all. And using the PC input at 1360*768 will give you the clearest and sharpest picture, although the DVI will have much better color and contrast. The reason is that your native resolution is 1360*768, so when you set your desktop to this, the card makes and sends information on a per-pixel basis without the card or TV scaling the image. Just think of it this way: at 1360*768, your card has specific information for what each pixel on your TV should be doing. Anything else is just scaled to fit your native resolution. The 1280*720 (720p) signal should also look good over HDMI, but doesn't because of driver issues with the 8-series GPUs that haven't been resolved yet. Pay close attention to how clear text appears, not images, when changing connections.

Smiles, do you even have an HDTV? How could you say "of course 1920 x 1080 @ 1080i is going to look bad.?"  You understand that 1080i is what makes these tv's look so good and is standard in the world of high definition, don't you?

Component supports the same resolutions as HDMI, but HDMI and DVI offer lossless uncompressed digital video, drastically increasing color accuracy and reproduction, while at the same time improving overall image quality because the signal isn't converted to analog before being displayed.  This is especially effective on modern HDTV's, as they are digital displays and benefit from a digital signal.
You should be able to resize the desktop with either component or HDMI as both support HDTV signals.

A direct connection between the video card and TV is required (look in your manual) for proper function.  If you use the HDMI input on the TV you should get the resize desktop option in the nVidia control panel.

"1440 x 900 resolution while maintaining your 1080i connection"
This is because the second number, 900, is higher than 720, which is the maximum for 720p. That it looks better is only due to the fact that 1920*1080 is too high for a PC signal; things become so small that distortion occurs as the card tries to show fine detail or even text over such small screen areas.

Of course 1080i is going to look bad?!?! Yeah, but ONLY for a PC signal. 1080i HDTV is HIGHER definition than 720p HDTV, and therefore looks better. Native resolution only determines at what resolution an image can be displayed progressively, or entirely all at once in one frame. Higher resolutions can still be displayed (theoretically, up to twice the native) as an interlaced signal. This is because only half the image is displayed at any one instant. These "halves" are displayed one after the other so fast humans can't tell. So, despite 720p being progressive, 1080i still looks more life-like because these halves contain almost as much picture data as a whole image does for 720p.

Przybocki: you will want to use your VGA cable and stay with 1360*768. nVidia has issues with the 8-series GPUs and LCD HDTV displays. The problem is that if you use a digital type connection (i.e. DVI, HDMI, or any combination thereof with the right cables), there is distortion in the image due to forces unknown (scaling conflicts, really). Look at text as you move it across the screen, left and right. Or how a series of dots, like the vertical ones used in the nVidia control panel connecting all the options, appears to "break" in a regular pattern. If your sharpness is set all the way down or your eyesight is not that good, you may not even see it, but many people do, and it is a known issue.

I hope that clears up some misconceptions about HDTVs, native resolutions, and your 8800 GTs. And you can get color and brightness close to that of HDMI/DVI; just play with the sliders in the manage desktop color options in the nVidia CP. Good luck.
[right][snapback]338762[/snapback][/right]
[/quote]

Well, this is much more info...thanks.

I know that I tried using the Component cables before, and as I remember it wasn't an improvement. Which is why I always go back to the VGA cable.

I also bought a high-end VGA cable (very thick). What a difference it made from the skinny cable that came with an older monitor.

With the VGA, everything except the text looks great. I wouldn't even say the text looks "bad", but it's not as sharp and clear as when viewed through DVI, so I wasn't sure if I was doing something wrong.

So what's the problem with Nvidia then? Why can't they get this issue fixed? Really aggravating.

Thanks again for the detailed answer.

#8
Posted 03/06/2008 09:59 PM   
[quote name='EddieDoo-Wop' date='Mar 6 2008, 04:37 PM']Smiles, do you even have an HDTV? How could you say "of course 1920 x 1080 @ 1080i is going to look bad.?"  You understand that 1080i is what makes these tv's look so good and is standard in the world of high definition, don't you?[/quote]
Umm.....you answered your own question below.

[quote name='EddieDoo-Wop' date='Mar 6 2008, 04:37 PM']Of course 1080i is going to look bad?!?! yeah, but ONLY for a PC signal.[/quote]That's what it looks like the OP was trying to do. The real question is do you have an HDTV that you are using for a computer monitor?

I don't think so because of this statement:
[quote name='EddieDoo-Wop' date='Mar 6 2008, 04:37 PM']You should be able to resize the desktop with either component or HDMI as both support HDTV signals.[/quote] Because if you did you would know that the 'Resize Desktop' does not work when connected via DVI or DVI>HDMI. That's the issue and has been the issue for over a year. The only confirmed way to get rid of the overscan is to use component, which only supports up to 1080i. You would also know that when you immediately connect with component your resolution is 1920 x 1080 and that you can't even read the text because, you said it yourself, it looks awful. You would also know that you have the ability to correctly resize your desktop and change to lower resolutions, such as 1440 x 900, while your video card still recognizes the HD connection as 1080i. Not 720p, but 1080i. This is dealing strictly with HDTVs that are being used for a computer monitor with NVidia cards.

Oh yea....since you still have the 1080i connection, you can actually use a DVD-ROM to automatically upscale standard DVDs to a 1080i signal. That's something you can't do on VGA.

EDIT: Here's a link to the specs on my HDTV.
[url="http://www.lcdtvbuyingguide.com/sony-lcd-tv/sony-kdl40s2000.html"]http://www.lcdtvbuyingguide.com/sony-lcd-t...kdl40s2000.html[/url]




No one has to listen to me. I'm just some stupid white trash redneck that doesn't have a clue what I'm talking about


.....that has his sh*t working right.
EDIT: Good luck to the OP :thumbup: I hope you find what makes you happy.

EDIT II: Yes....I took some offense to the post. I apologize to Eddie for ranting. We're both here just trying to help. And yes....the VGA input will pick up your native resolution without any issues whatsoever. It's analog instead of digital....that's all.
[quote name='EddieDoo-Wop' date='Mar 6 2008, 04:37 PM']Smiles, do you even have an HDTV? How could you say "of course 1920 x 1080 @ 1080i is going to look bad.?"¦nbsp; You understand that 1080i is what makes these tv's look so good and is standard in the world of high definition, don't you?

Umm.....you answered your own question below.



[quote name='EddieDoo-Wop' date='Mar 6 2008, 04:37 PM']Of course 1080i is going to look bad?!?! yeah, but ONLY for a PC signal.That's what it looks like the OP was trying to do. The real question is do you have a HDTV that you are using for a computer monitor?



I don't think so because of this statement:

[quote name='EddieDoo-Wop' date='Mar 6 2008, 04:37 PM']You should be able to resize the desktop with either component or HDMI as both support HDTV signals. Because if you did you would know that the 'Resize Desktop' does not work when connected DVI or DVI>HDMI. That's the issue and has been the issue for over a year. The only confirmed way to get rid of the overscan is to use component, which only supports up to 1080i. You would also know that when you immediately connect with component your resolution is 1920 x1080 and that you can't even read the text because, you said it yourself, it looks awful. You would also know that you have to ability to correctly resize your desktop and change to lower resolutions, such as 1440 x 900, while your video card still recognizes the HD connection as 1080i. Not 720p, but 1080i. This is dealing strictly with HDTV's that are being used for a computer monitor with NVidia cards.



Oh yea....since you still have the 1080i connection, you can actually use a dvd rom to automatically upscale standard dvd's to 1080i signal. That's something you can't do on VGA.



EDIT: Here's a link to the specs on my HDTV.

http://www.lcdtvbuyingguide.com/sony-lcd-t...kdl40s2000.html









No one has to listen to me. I'm just some stupid white trash redneck that doesn't have a clue what I'm talking about

.....that has the sh*t working right.

EDIT: Good luck to the OP :thumbup: I hope you find what makes you happy.



EDIT II: Yes....I took some offense to the post. I apologize to Eddie for ranting. We're both here just trying to help. And yes....the VGA input will pick up your native resolution without any issues whatsoever. It's analog instead of digital....that's all.

#9
Posted 03/06/2008 10:46 PM   
[quote name='SmilesLikeJoker' date='Mar 6 2008, 05:46 PM']No one has to listen to me. I'm just some stupid white trash redneck that doesn't have a clue what I'm talking about[/quote]

Then git oph yer sistur and git to gewgaling. :haha:

Axis

[CPU] QX6700 @ 3.50ghz with TR Ultra 120 Extreme (XP Pro)
[MB] DFI LP LT X48
[GPU] MSI GTX260
[Mem] Corsair Dominators 2x2gig PC2 8500
[HDD] 3X Seagate SATA- 180, 250, and 320
[PSU] OCZ GamersXtreme GsxSli 600w
[Case] CoolerMaster CM690

#10
Posted 03/07/2008 07:55 AM   
[quote name='axis' date='Mar 7 2008, 02:55 AM']Then git oph yer sistur and git to gewgaling. :haha:

Axis[/quote]
:haha:

#11
Posted 03/07/2008 12:07 PM   
[quote name='SmilesLikeJoker' date='Mar 6 2008, 02:46 PM']Umm.....you answered your own question below.

That's what it looks like the OP was trying to do. The real question is: do you have an HDTV that you are using for a computer monitor?

I don't think so, because of this statement: Because if you did, you would know that 'Resize Desktop' does not work when connected over DVI or DVI>HDMI. That's the issue, and it has been the issue for over a year. The only confirmed way to get rid of the overscan is to use component, which only supports up to 1080i. You would also know that when you first connect with component your resolution is 1920 x 1080, and that you can't even read the text because, you said it yourself, it looks awful. You would also know that you have the ability to correctly resize your desktop and change to lower resolutions, such as 1440 x 900, while your video card still recognizes the HD connection as 1080i. Not 720p, but 1080i. This is dealing strictly with HDTVs that are being used as a computer monitor with NVidia cards.

Oh yea....since you still have the 1080i connection, you can actually use a DVD-ROM to automatically upscale standard DVDs to a 1080i signal. That's something you can't do on VGA.

EDIT: Here's a link to the specs on my HDTV.
[url="http://www.lcdtvbuyingguide.com/sony-lcd-tv/sony-kdl40s2000.html"]http://www.lcdtvbuyingguide.com/sony-lcd-tv/sony-kdl40s2000.html[/url]

No one has to listen to me. I'm just some stupid white trash redneck that doesn't have a clue what I'm talking about
.....that has the sh*t working right.

EDIT: Good luck to the OP :thumbup: I hope you find what makes you happy.

EDIT II: Yes....I took some offense to the post. I apologize to Eddie for ranting. We're both here just trying to help. And yes....the VGA input will pick up your native resolution without any issues whatsoever. It's analog instead of digital....that's all.[/quote]

Hey, I appreciate any and all help. Whatever works as far as I'm concerned.

My main concern/point was that I read on NVidia's site that their drivers let you set up the custom resolution of 1366 x 768...provided that your TV advertises that resolution in its "EDID," I think it's called. Well, most actual HDTVs do not.

Oh, and has anyone installed the new "beta" drivers (169.44)? I did, and it added an icon on my desktop to play Portal for free for a time. I love how we have to deal with advertising now with our drivers.

If I didn't own two 8800 GTS's and two 8800 GT's...I'd switch to ATI. But I forked out a fortune for Nvidia stuff, and I need to get my use out of it.
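For what it's worth, the 1360-vs-1366 oddity usually comes down to alignment: many GPUs and drivers only accept modes whose horizontal resolution is a multiple of 8, and 1366 isn't one, so you get offered 1360 (or sometimes 1368) instead. A quick sketch of that rounding (the helper name is mine, just for illustration):

```python
# Sketch: why a "1366 x 768" panel often shows up as 1360 x 768.
# Assumption: the driver rounds the width to a multiple of 8,
# which is a common hardware alignment requirement.

def nearest_mode_widths(width, step=8):
    """Return the nearest step-aligned widths at or below/above `width`."""
    down = (width // step) * step          # round down to multiple of step
    up = down if down == width else down + step  # round up if not aligned
    return down, up

print(nearest_mode_widths(1366))  # -> (1360, 1368)
print(nearest_mode_widths(1360))  # -> (1360, 1360), already aligned
```

So even with a perfect EDID, a driver with that restriction can't produce a true 1366-wide mode; it has to pick 1360 and let the TV scale, which is where the fuzz and the cut-off edge come from.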

#12
Posted 03/07/2008 01:42 PM   
[quote name='Przybocki' date='Mar 7 2008, 08:42 AM']Hey, I appreciate any and all help. Whatever works as far as I'm concerned.

My main concern/point was that I read on Nvidia's site that their drivers work for you in that you can set up the custom resolution of 1366 x 768...provided that your TV comes with that "EDID" I think it's called. Well, most actual HDTV's do not.

Oh, and has anyone installed the new "beta" drivers (169.44)? I did, and it added an icon on my desktop to play Portal for free for a time. I love how we have to deal with advertising now with our drivers.

If I didn't own two 8800 GTS's and two 8800 GT's...I'd switch to ATI. But I forked out a fortune for Nvidia stuff, and I need to get my use out of it.
[/quote]
I hear ya on the money issue :haha: NVidia rox on gaming imo, but lacks in some other areas. I don't regret my purchase, but sometimes I get aggravated :haha:

That's the first I've heard about Portal with the beta drivers....

#13
Posted 03/07/2008 04:18 PM   
Just to point out, it's not 1360x768, or 1360x766.

It's 1360x765.
It says so right here on my shiny new widescreen.

#14
Posted 03/07/2008 04:54 PM   
[quote name='squall_leonhart' date='Mar 7 2008, 08:54 AM']just to point out, its not 1360x768, or 1360x766.

its 1360x765
it says so right here on my shiny new Wide screen.[/quote]

Well, my widescreen says "1360 x 768" and not "765", so obviously YOURS is different!

#15
Posted 03/07/2008 05:57 PM   