tscholl
Posted: Thu Nov 25, 2010 11:03 am
Joined: Mon Apr 10, 2006 3:48 pm
Posts: 997
Location: Lexington, Ky
I am running an Nvidia 8400 PCI video card connected to the PC input of a Samsung 720p plasma TV.
If I upgrade my TV to a Samsung 1080p 120 Hz LCD TV, I'm wondering whether I should still use the PC input, or whether it would be better to switch to the Nvidia card's DVI output and use a DVI-to-HDMI adapter into the HDMI connector on the TV. The only issue I can think of with the HDMI route is how to handle the audio.
Or should I just get a new card? If so, any suggestions as to which one?
Any suggestions as to the best approach would be appreciated.
jzigmyth
Posted: Thu Nov 25, 2010 3:03 pm
Joined: Thu Mar 02, 2006 5:42 pm
Posts: 410
Location: middleton wi usa atsc
tscholl wrote: The only issue if I go to the HDMI route I can think of is how do I handle the audio?

I run 1080p from an AGP Nvidia FX5200 DVI out to HDMI on a Sony 46" TV. On the Sony, one of the HDMI ports has two stereo RCA jacks associated with it for the audio. I ran a stereo patch cable from the sound card to these jacks and it all works just fine.
Check your TV to see if it has an HDMI input with audio jacks next to it. Some do and some don't. You might have to select separate audio in a TV menu somewhere.
tscholl
Posted: Thu Nov 25, 2010 5:04 pm
Joined: Mon Apr 10, 2006 3:48 pm
Posts: 997
Location: Lexington, Ky
Thanks for the reply. I'll have to check the new set and see if that option is available.
mattbatt
Posted: Thu Nov 25, 2010 9:38 pm
Joined: Tue Aug 15, 2006 11:14 am
Posts: 1343
Location: Orlando FL
The TV might also have a digital audio input, if of course your box has a digital audio output.
_________________ My System
tscholl
Posted: Fri Nov 26, 2010 11:17 am
Joined: Mon Apr 10, 2006 3:48 pm
Posts: 997
Location: Lexington, Ky
What changes, if any, have to be made to xorg.conf to switch from the VGA output to the DVI output?
jzigmyth
Posted: Fri Nov 26, 2010 11:47 am
Joined: Thu Mar 02, 2006 5:42 pm
Posts: 410
Location: middleton wi usa atsc
If you shut down, disconnect your VGA cable, connect your DVI-to-HDMI cable, and restart, it may just work. Otherwise, for starters, you need to change the "ConnectedMonitor" option from "CRT" (the driver's name for the VGA output) to "DFP". You may also need an appropriate mode line for the new resolution.
Code:
Section "Monitor"
    Option "ConnectedMonitor" "DFP"
EndSection
Also, make sure you can ssh into your machine from another one before messing with anything; that way, if the display doesn't work at all, you can recover your xorg.conf over ssh.
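For orientation, here is a minimal sketch of how those pieces might fit together in a full xorg.conf. This is an assumption-laden example, not anyone's actual file: the nvidia driver README places driver options such as "ConnectedMonitor" in the Device (or Screen) section, and the Identifier names here are invented.

```
# Sketch only: layout per the nvidia driver README.
# "nvidia0" and "Screen0" are made-up identifiers.
Section "Device"
    Identifier "nvidia0"
    Driver     "nvidia"
    Option     "ConnectedMonitor" "DFP"
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "nvidia0"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1920x1080"
    EndSubSection
EndSection
```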
mattbatt
Posted: Fri Nov 26, 2010 10:30 pm
Joined: Tue Aug 15, 2006 11:14 am
Posts: 1343
Location: Orlando FL
Why did you say DFP? Mine is DVI.
Martian
Posted: Fri Nov 26, 2010 10:40 pm
Joined: Wed Feb 08, 2006 6:13 pm
Posts: 480
Location: IN
mattbatt wrote: Why did you say DFP? mine is DVI
[D]igital [F]lat [P]anel
_________________ ABIT NF-M2 nView | Athlon 64 X2 3800+ | 2GB DDR2 800 | HDHomerun | GeForce 6150 (onboard) | WD 640 GB SATA HD | DVD-RW (sata) | StreamZap IR receiver with Logitech Harmony remote
Vizio 37" LCD HDTV (1080p)
jzigmyth
Posted: Sat Nov 27, 2010 8:20 am
Joined: Thu Mar 02, 2006 5:42 pm
Posts: 410
Location: middleton wi usa atsc
mattbatt wrote: Why did you say DFP? mine is DVI

DVI is not listed as an argument for "ConnectedMonitor" in the driver that I am using (96.xx); DFP is listed as the argument to use to select the DVI port. Here is an excerpt from http://us.download.nvidia.com/XFree86/L ... dix-d.html

Quote: Option "ConnectedMonitor" "string"
Allows you to override what the NVIDIA kernel module detects is connected to your video card. This may be useful, for example, if you use a KVM (keyboard, video, mouse) switch and you are switched away when X is started. In such a situation, the NVIDIA kernel module cannot detect what display devices are connected, and the NVIDIA X driver assumes you have a single CRT.
Valid values for this option are "CRT" (cathode ray tube), "DFP" (digital flat panel), or "TV" (television); if using TwinView, this option may be a comma-separated list of display devices; e.g.: "CRT, CRT" or "CRT, DFP".
It is generally recommended to not use this option, but instead use the "UseDisplayDevice" option.
NOTE: anything attached to a 15 pin VGA connector is regarded by the driver as a CRT. "DFP" should only be used to refer to digital flat panels connected via a DVI port.
Default: string is NULL (the NVIDIA driver will detect the connected display devices).

Option "UseDisplayDevice" "string"
When assigning display devices to X screens, the NVIDIA X driver by default assigns display devices in the order they are found (looking first at CRTs, then at DFPs, and finally at TVs). This option can be used to override this assignment. For example, if both a CRT and a DFP are connected, you could specify:
Option "UseDisplayDevice" "DFP"
to make the X screen use the DFP, even though it would have used a CRT by default.
Note the subtle difference between this option and the "ConnectedMonitor" option: the "ConnectedMonitor" option overrides what display devices are actually detected, while the "UseDisplayDevice" option controls which of the detected display devices will be used on this X screen.
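To make that subtle difference concrete, here is a hedged sketch (identifiers invented) showing both options side by side in a Device section, with comments restating what the excerpt above says each one does:

```
Section "Device"
    Identifier "nvidia0"
    Driver     "nvidia"
    # Overrides detection: tells the driver a digital flat panel
    # is attached even if it cannot see one (e.g. behind a KVM switch).
    Option "ConnectedMonitor" "DFP"
    # Selects which of the *detected* display devices this X screen uses.
    Option "UseDisplayDevice" "DFP"
EndSection
```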
tscholl
Posted: Sat Nov 27, 2010 8:17 pm
Joined: Mon Apr 10, 2006 3:48 pm
Posts: 997
Location: Lexington, Ky
jzigmyth,
Thanks for the clarification on the
Code:
Option "UseDisplayDevice" "DFP"
line in xorg.conf. Should I also specify all 3 expected modes in the Display section?
Code:
SubSection "Display"
    Depth 24
    Modes "1920x1080" "1280x720" "720x480"
EndSubSection
jzigmyth
Posted: Sun Nov 28, 2010 5:51 pm
Joined: Thu Mar 02, 2006 5:42 pm
Posts: 410
Location: middleton wi usa atsc
I only specify the 1080p mode that I use. I let mythtv convert everything to 1080p before sending it to the TV.
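In xorg.conf terms, that single-mode approach might look like the following sketch (the Depth value is an assumption carried over from the subsection quoted earlier in the thread):

```
SubSection "Display"
    Depth 24
    Modes "1920x1080"
EndSubSection
```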
tjc
Posted: Tue Nov 30, 2010 10:15 pm
Joined: Thu Mar 25, 2004 11:00 am
Posts: 9551
Location: Arlington, MA
See my Samsung thread under Tier 1 hardware; I cover a lot of the details around switching there: http://knoppmyth.net/phpBB2/viewtopic.php?t=19668 I just added my xorg.conf for R6 to the thread.
Like jzigmyth I run a DVI to HDMI cable and a stereo audio cable from the box to the DVI-2 cluster on the back of my LN46A650.
manicmike
Posted: Thu Dec 09, 2010 8:05 pm
Joined: Sun Aug 28, 2005 7:07 pm
Posts: 821
Location: Melbourne, Australia
tscholl wrote:
Code:
SubSection "Display"
    Depth 24
    Modes "1920x1080" "1280x720" "720x480"
EndSubSection

You have two better options than "1920x1080". I would go with this line:
Code:
Modes "nvidia-auto-select"
That way the nvidia driver does a bit of probing, and if your display supports 1080p it will try it first. If that fails, it goes to the next highest resolution in its list, and so on.
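Dropped into the Display subsection discussed above, that line would look like this (a sketch; the Depth value is again an assumption):

```
SubSection "Display"
    Depth 24
    Modes "nvidia-auto-select"
EndSubSection
```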
Mike
_________________ ********************* LinHES 7.4 Australian Dragon *********************
Martian
Posted: Fri Dec 10, 2010 9:57 am
Joined: Wed Feb 08, 2006 6:13 pm
Posts: 480
Location: IN
manicmike wrote: You have two better options than "1920x1080". I would go with this line:
Code:
Modes "nvidia-auto-select"
That way then nvidia driver does a bit of probing and if your display supports 1080p it will try it first. If it fails it goes to the next highest resolution in its list, etc.
Be careful with this: some 1080 panels are 1920x1080, others are 1920x1200. Many 720 panels are actually 1366x768 or 1360x768, and others are 1280x768. In some cases the "auto-select" resolution does NOT match the actual panel resolution. At best this results in output that is not pixel-matched to the panel; at worst it results in no display.
Example: I just purchased a new (low-end) 720p TV. The specs claim the panel is 1366x768; however, it was unable to display the autodetected resolution. Instead I had to use a 1360x768 mode line, and then everything worked perfectly. I'm guessing this isn't very common in higher-end sets, but I thought I'd share my experience.
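For anyone hitting the same thing, here is a sketch of what such a workaround can look like. The Identifier is invented, and the timing numbers are illustrative output of `cvt 1360 768 60`; generate your own modeline rather than copying these values.

```
Section "Monitor"
    Identifier "TV"
    # Illustrative 1360x768 @ 60 Hz timings; regenerate them with
    # `cvt 1360 768 60` on your own system instead of trusting these.
    ModeLine "1360x768" 84.75  1360 1432 1568 1776  768 771 776 798  -hsync +vsync
EndSection
```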
Martian