LinHES Forums - http://forum.linhes.org/

Perplexing Problems with X, ATI DVI and HDTV
http://forum.linhes.org/viewtopic.php?f=2&t=2755
Author: kd4pbs [ Thu Oct 21, 2004 9:27 am ]
Post subject: Perplexing Problems with X, ATI DVI and HDTV
First, let me say a big, HUGE "Thank you guys!!!" to everyone involved in the KnoppMyth project. Also, this forum is an excellent repository of information. I used it to figure out why my new Hauppauge 250 card could not see anything but static on the tuner input (tuner=39 fixed me up!)

Now, the problem. I've seen some other people with what sounds like the same problem, but never found an answer. It's not directly related to KnoppMyth, but I have noticed there are a lot of smart people around here. I'm using an ATI Radeon 8500 with DVI output, and have a Mitsubishi HDTV with DVI input.

- X displays on the TV, but at a resolution of 1920x540p.
- I have made a modeline that is set for 960x540p.
- MythTV looks like doo-doo when displayed at the 1920x540p resolution that X is running.
- No matter what I put into the XF86Config file, it still displays at 1920x540. This is even when I have only one modeline (960x540) and only one corresponding screen setting referring to that modeline. I have tried everything short of taking out all modelines and screen settings, which one would think would make X fail totally.
- Looking at the XF86 log, I verified that X is using the XF86Config file that I'm editing, not some other one from somewhere on the system.
- Looking further in the XF86 log, I see that X is detecting my monitor, and that my monitor is (incorrectly) reporting that the max resolution it is capable of is 1920x540, along with its associated scan frequencies. The monitor's native max resolution is actually 1920x1080i, but I can't seem to make X run at that resolution either.
- Looking even further down the XF86 log, I see that it has pulled a "1920x540" modeline out of its ass somewhere, since it reports that it's using this modeline even though I don't have a "1920x540" modeline in my config file. Grepping through my system for "1920x540" only finds reference to it in my XF86 log and... get this... the X binary itself.
- I have removed all traces of telling X to use DDC or any other type of automagic display capability discovery (at least I think I did).
- If I boot the same configuration using a DVI-SVGA adaptor and an SVGA monitor, I get the desired scan rate and resolution for the 960x540p picture (I get the mode from the modeline that I specified in my XF86Config file). If I then disconnect the DVI-SVGA adaptor and connect the DVI cable going to my HDTV (without rebooting - I know, I'm living dangerously here), the mode I specified stays in effect.

So, to make a long story longer, does anyone know what's going on here? It would appear that the X server I'm using is ignoring the modelines I told it to use, and is instead using a "1920x540" modeline that is internal to the X binary, probably put in there at compile time. How do I tell X to ignore what it thinks I want and do what I tell it to do? X seems to be behaving almost as badly as Windoze, trying to tell me what I need instead of letting me tell it what I need!
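For reference, a minimal XF86Config sketch of the kind of Monitor/Screen setup described above, with a single 960x540p modeline. The identifiers and timing numbers here are illustrative assumptions, not the poster's actual configuration:

Code:
Section "Monitor"
    Identifier  "MitsubishiHDTV"       # hypothetical name
    HorizSync   30-34                  # must bracket ~33.8 kHz for 540p
    VertRefresh 59-61
    # Illustrative 960x540p timings: 37.26 MHz pixel clock,
    # ~33.75 kHz horizontal, ~60 Hz vertical.
    Modeline "960x540" 37.26 960 976 1008 1104 540 542 548 563
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "Radeon8500"          # hypothetical name
    Monitor      "MitsubishiHDTV"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "960x540"                # the only mode offered to X
    EndSubSection
EndSection

With a config like this and no other modes listed, X should only ever pick 960x540 - which is why the forced 1920x540 mode pointed to something outside the config file (the EDID probe, as it turned out below).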
Author: kd4pbs [ Fri Oct 22, 2004 9:28 pm ]
Well, I found the problem myself. I bought a different computer and video card, because I figured the ATI was the problem. While reading through the installation README for the Nvidia FX5200 I got, I stumbled across this...

Code:
Option "IgnoreEDID" "boolean"

    Disable probing of EDID (Extended Display Identification Data) from
    your monitor. Requested modes are compared against values gotten from
    your monitor EDIDs (if any) during mode validation. Some monitors are
    known to lie about their own capabilities. Ignoring the values that
    the monitor gives may help get a certain mode validated. On the other
    hand, this may be dangerous if you do not know what you are doing.
    Default: Use EDIDs.

Yep, the Mitsubishi was sending its EDID properties and forcing X to override the Monitor section. I haven't done it yet, but this should take care of my problems. Now if I can get XvMC and this doggone Nvidia working properly...
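In practice that option goes in the Device section of XF86Config. A sketch, assuming the nvidia driver and a made-up identifier:

Code:
Section "Device"
    Identifier "NvidiaFX5200"          # hypothetical name
    Driver     "nvidia"
    # Ignore the TV's bogus EDID so the modelines in the
    # Monitor section are used for mode validation instead.
    Option     "IgnoreEDID" "true"
EndSection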
Author: kd4pbs [ Sun Oct 24, 2004 11:06 am ]
I found that KnoppMyth doesn't install an Nvidia driver by default, which is why XvMC wasn't working. Problem solved.

Now I have another issue... The interlaced mode that I'm running my TV with (1920x1080i) causes the whole image to shift up and down several pixels at the field rate. It looks as if the interlaced mode is shifting several lines instead of the half line it's supposed to shift every field. I seem to remember reading something about someone else having this problem with an Nvidia card. Now to find where I read it... Thanks for everyone's help!

-Matt
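For anyone following along, an interlaced mode is declared in XF86Config with the "Interlace" flag on the modeline. A sketch using the standard 1920x1080i@60 broadcast timings (the common ATSC numbers, not necessarily what this particular set wants):

Code:
# 74.25 MHz pixel clock, 33.75 kHz horizontal, 60 fields/sec.
# "Interlace" tells X that the vertical total spans two fields.
Modeline "1920x1080i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 Interlace +HSync +VSync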