LinHES Forums
http://forum.linhes.org/

Changing from 720p to 1080p
http://forum.linhes.org/viewtopic.php?f=23&t=21905
Page 1 of 1

Author:  tscholl [ Thu Nov 25, 2010 11:03 am ]
Post subject:  Changing from 720p to 1080p

I am running an Nvidia 8400 PCI video card connected to the PC input of a Samsung 720p plasma TV.

If I upgrade my TV to a Samsung 1080p 120Hz LCD TV, I'm wondering whether I should still use the PC input, or whether it would be better to switch to the Nvidia's DVI output and use a DVI-to-HDMI adapter into the HDMI connector on the TV. The only issue I can think of if I go the HDMI route is how to handle the audio.

Or should I just get a new card? If so, I'm looking for suggestions as to which one.

Any suggestions as to what would be the best approach would be appreciated.

Author:  jzigmyth [ Thu Nov 25, 2010 3:03 pm ]
Post subject:  Re: Changing from 720p to 1080p

tscholl wrote:
The only issue I can think of if I go the HDMI route is how to handle the audio.
I run 1080p from an AGP Nvidia FX5200 DVI out to HDMI on a Sony 46" TV. On the Sony, one of the HDMI ports has two stereo RCA jacks associated with it for the audio. I ran a stereo patch cable from the sound card to these jacks and it all works just fine.

Check your TV to see if it has an HDMI input with audio jacks next to it. Some do and some don't. You might have to select separate audio in a TV menu somewhere.

Author:  tscholl [ Thu Nov 25, 2010 5:04 pm ]
Post subject: 

Thanks for the reply. I'll have to check the new set and see if that option is available.

Author:  mattbatt [ Thu Nov 25, 2010 9:38 pm ]
Post subject: 

The TV might also have a digital audio input, if of course your box has digital out.

Author:  tscholl [ Fri Nov 26, 2010 11:17 am ]
Post subject: 

What changes, if any, have to be made to xorg.conf to switch from the VGA output to the DVI output?

Author:  jzigmyth [ Fri Nov 26, 2010 11:47 am ]
Post subject: 

If you shut down, disconnect your VGA cable, connect your DVI-to-HDMI cable, and restart, it may just work. Otherwise, for starters, you need to change the "ConnectedMonitor" option from "CRT" (what the driver calls the VGA output) to "DFP". You may also need an appropriate "mode line" for the new resolution.


Code:
Section "Monitor"
    Option "ConnectedMonitor" "DFP"
EndSection


Also, make sure you can ssh into your machine from another one before messing with anything; that way, if your display doesn't work at all, you can still recover your xorg.conf over ssh.
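
If you do end up needing a mode line, something like the standard CEA-861 timing for 1080p at 60 Hz is a reasonable starting point (a sketch, untested on your hardware; verify it against what your TV actually accepts, and note the mode name is just a label):
Code:
Section "Monitor"
    Option   "ConnectedMonitor" "DFP"
    # Standard CEA-861 1080p/60 timing; pixel clock in MHz
    Modeline "1920x1080_60" 148.50 1920 2008 2052 2200 1080 1084 1089 1125 +hsync +vsync
EndSection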

Author:  mattbatt [ Fri Nov 26, 2010 10:30 pm ]
Post subject: 

Why did you say DFP? Mine is DVI.

Author:  Martian [ Fri Nov 26, 2010 10:40 pm ]
Post subject: 

mattbatt wrote:
Why did you say DFP? Mine is DVI.


[D]igital [F]lat [P]anel

Author:  jzigmyth [ Sat Nov 27, 2010 8:20 am ]
Post subject: 

mattbatt wrote:
Why did you say DFP? Mine is DVI.
DVI is not listed as a valid value for "ConnectedMonitor" in the driver I am using (96.xx); "DFP" is the value listed for selecting the DVI port.

Here is an excerpt from http://us.download.nvidia.com/XFree86/L ... dix-d.html
Quote:
Option "ConnectedMonitor" "string"

Allows you to override what the NVIDIA kernel module detects is connected to your video card. This may be useful, for example, if you use a KVM (keyboard, video, mouse) switch and you are switched away when X is started. In such a situation, the NVIDIA kernel module cannot detect what display devices are connected, and the NVIDIA X driver assumes you have a single CRT.

Valid values for this option are "CRT" (cathode ray tube), "DFP" (digital flat panel), or "TV" (television); if using TwinView, this option may be a comma-separated list of display devices; e.g.: "CRT, CRT" or "CRT, DFP".

It is generally recommended to not use this option, but instead use the "UseDisplayDevice" option.

NOTE: anything attached to a 15 pin VGA connector is regarded by the driver as a CRT. "DFP" should only be used to refer to digital flat panels connected via a DVI port.

Default: string is NULL (the NVIDIA driver will detect the connected display devices).
Option "UseDisplayDevice" "string"

When assigning display devices to X screens, the NVIDIA X driver by default assigns display devices in the order they are found (looking first at CRTs, then at DFPs, and finally at TVs). This option can be used to override this assignment. For example, if both a CRT and a DFP are connected, you could specify:

Option "UseDisplayDevice" "DFP"

to make the X screen use the DFP, even though it would have used a CRT by default.

Note the subtle difference between this option and the "ConnectedMonitor" option: the "ConnectedMonitor" option overrides what display devices are actually detected, while the "UseDisplayDevice" option controls which of the detected display devices will be used on this X screen.
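
For example, to steer the X screen onto the DVI-connected panel, you would put something like this in the Screen section (a sketch; the identifiers are placeholders for whatever your xorg.conf already uses):
Code:
Section "Screen"
    Identifier   "Screen0"
    Device       "Device0"
    Monitor      "Monitor0"
    DefaultDepth 24
    Option       "UseDisplayDevice" "DFP"
EndSection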

Author:  tscholl [ Sat Nov 27, 2010 8:17 pm ]
Post subject: 

jzigmyth

Thanks for the clarification on the:
Code:
Option "UseDisplayDevice" "DFP"

line in xorg.conf. Should I also specify all 3 expected modes in the Display section?
Code:
    SubSection     "Display"
        Depth       24
        Modes      "1920x1080" "1280x720" "720x480"
    EndSubSection

Author:  jzigmyth [ Sun Nov 28, 2010 5:51 pm ]
Post subject: 

I only specify the 1080p mode that I use. I let mythtv convert everything to 1080p before sending it to the TV.
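
So the Display subsection ends up as simple as this (a sketch based on your example, keeping the 24-bit depth):
Code:
    SubSection     "Display"
        Depth       24
        Modes      "1920x1080"
    EndSubSection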

Author:  tjc [ Tue Nov 30, 2010 10:15 pm ]
Post subject: 

See my Samsung thread under Tier 1 hardware; I cover a lot of the details around switching there: http://knoppmyth.net/phpBB2/viewtopic.php?t=19668 I just added my xorg.conf for R6 to that thread.

Like jzigmyth I run a DVI to HDMI cable and a stereo audio cable from the box to the DVI-2 cluster on the back of my LN46A650.

Author:  manicmike [ Thu Dec 09, 2010 8:05 pm ]
Post subject: 

tscholl wrote:
Code:
    SubSection     "Display"
        Depth       24
        Modes      "1920x1080" "1280x720" "720x480"
    EndSubSection


You have two better options than "1920x1080".

I would go with this line:
Code:
        Modes       "nvidia-auto-select"


That way the nvidia driver does a bit of probing, and if your display supports 1080p it will try that first. If it fails, it goes to the next highest resolution in its list, and so on.
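
In context, that makes the Display subsection (same structure as your earlier example, just with the auto-select mode):
Code:
    SubSection     "Display"
        Depth       24
        Modes      "nvidia-auto-select"
    EndSubSection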

Mike

Author:  Martian [ Fri Dec 10, 2010 9:57 am ]
Post subject: 

manicmike wrote:
tscholl wrote:
Code:
    SubSection     "Display"
        Depth       24
        Modes      "1920x1080" "1280x720" "720x480"
    EndSubSection


You have two better options than "1920x1080".

I would go with this line:
Code:
        Modes       "nvidia-auto-select"


That way the nvidia driver does a bit of probing, and if your display supports 1080p it will try that first. If it fails, it goes to the next highest resolution in its list, and so on.

Mike


Be careful with this: some 1080 panels are 1920x1080, others are 1920x1200. Many 720 panels are actually 1366x768 or 1360x768, and others are 1280x768. In some cases the "auto-select" resolution does NOT match the actual panel resolution. At best this results in output that is not pixel-matched to the panel; at worst it results in no display.

Example: I just purchased a new (low-end) 720p TV. The specs claim the panel is 1366x768; however, it was unable to display the autodetected resolution. Instead I had to use a 1360x768 mode line, and then everything worked perfectly. I'm guessing this isn't very common on higher-end sets, but I thought I'd share my experience.
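
For reference, a 1360x768 mode line can look like this; these numbers are the standard VESA DMT timing for 1360x768 at 60 Hz, not necessarily the exact ones I used, so treat it as a starting point and verify against your own set:
Code:
Section "Monitor"
    # VESA DMT 1360x768 @ 60 Hz; pixel clock in MHz
    Modeline "1360x768_60" 85.50 1360 1424 1536 1792 768 771 777 795 +hsync +vsync
EndSection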

Martian
