First, let me say a big, HUGE "Thank you guys!!!" to everyone involved in the KnoppMyth project. You guys have put together a very nice package. I futzed with installing Myth on a RH9 system, and after becoming impatient I found KnoppMyth. Let me say (I'm sure y'all know), you've made the single easiest MythTV package out there. Very well done, indeed!
Also, this forum is an excellent repository of information. I used it to figure out why my new Hauppauge 250 card could not see anything but static on the tuner input (tuner = 39 fixed me up!)
Now, on to the problem. I've seen some other people with what sounds like the same issue, but I never found an answer. It's not directly related to KnoppMyth, but I've noticed there are a lot of smart people around here.
I'm using an ATI Radeon 8500 with DVI output, and have a Mitsubishi HDTV with DVI input.
-X displays on the TV, but at a resolution of 1920x540p.
-I have made a modeline that is set for 960x540p (a sketch of the relevant config follows this list).
-MythTV looks like doo-doo when displayed at the 1920x540p resolution that X is running at.
-No matter what I put into the XF86Config file, it still displays at 1920x540. This happens even when I have only one modeline (960x540) and only one corresponding Screen setting referring to that modeline (see the config sketch after this list). I have tried everything short of taking out all modelines and screen settings, which one would think would make X fail totally.
-Looking at the XF86 log, I verified that X is using the XF86Config file that I'm editing, not some other one from somewhere on the system.
-Looking further in the XF86 log, I see that X is detecting my monitor, and that the monitor is (incorrectly) reporting that the max resolution it's capable of is 1920x540, along with its associated scan frequencies. The monitor's native max resolution is actually 1920x1080i, but I can't seem to make X run at that resolution either.
-Looking even further down the XF86 log, I see that it has pulled a "1920x540" modeline out of its ass somewhere, since it reports that it's using this modeline even though I don't have a "1920x540" modeline in my config file. Grepping through my system for "1920x540" only finds references to it in my XF86 log and... get this... the X binary itself (the exact commands I used are after this list).
-I have removed all traces of telling X to use DDC or any other type of automagic display-capability discovery (at least, I think I did; the options I tried are sketched after this list).
-If I boot the same configuration using a DVI-SVGA adaptor and an SVGA monitor, I get the desired scan rate and resolution for the 960x540p picture (I get the mode from the modeline I specified in my XF86Config file). If I then disconnect the DVI-SVGA adaptor and connect the DVI cable going to my HDTV (without rebooting; I know, I'm living dangerously here), I get nothing on the HDTV, presumably because my video card initialized in analog RGB mode instead of digital DVI (the "adaptor" does not actually convert a digital signal to an analog one, it just tells the video card to send analog RGB signals instead of digital DVI signals). One thing I haven't verified here is going RGBHV into my HDTV at this point, just to confirm the set operates at this frequency. I'm 99% sure that it does, however, since I used PowerStrip in windoze at one point to drive the set at 960x540p, and even 1920x1080i, via the DVI connector. I also checked the H & V refresh rates with an O-scope and verified the scan rate was within the range my TV wants to see.
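For reference, here is a stripped-down sketch of the relevant XF86Config sections, roughly what I've been testing with. The identifiers are made up for this post, and the modeline numbers shown are illustrative ones worked out for a ~33.75 kHz horizontal scan (half-width 1080i timing); my actual values may differ slightly:

    Section "Monitor"
        Identifier  "HDTV"
        HorizSync   33.0 - 34.0
        VertRefresh 59.0 - 61.0
        # 960x540p: 37.125 MHz dot clock, 33.75 kHz hsync, ~60 Hz vsync
        Modeline "960x540" 37.125  960  996 1084 1100   540  542  548  562
    EndSection

    Section "Screen"
        Identifier   "Screen0"
        Device       "Radeon8500"
        Monitor      "HDTV"
        DefaultDepth 24
        SubSection "Display"
            Depth 24
            Modes "960x540"   # the ONLY mode listed, yet X still runs 1920x540
        EndSubSection
    EndSection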
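These are the sorts of commands I used to check the log (paths are per the standard XFree86 layout; adjust if yours lives elsewhere):

    # confirm which config file the server actually read
    grep -i "config file" /var/log/XFree86.0.log

    # hunt for where the mystery mode comes from
    grep -ri "1920x540" /etc/X11 /var/log/XFree86.0.log

    # the only other hit on the whole system: the X binary itself
    strings /usr/X11R6/bin/XFree86 | grep "1920x540"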
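As for killing DDC, the knobs I threw into the Device section were along these lines. I'm not certain all of these are honored by the radeon driver (or that I've found every relevant option), so treat this as a sketch of what I tried rather than a definitive list:

    Section "Device"
        Identifier "Radeon8500"
        Driver     "radeon"
        # attempts to turn off the automagic display probing; I'm not
        # sure every one of these is recognized by this driver version
        Option "NoDDC"
        Option "NoDDC1"
        Option "NoDDC2"
    EndSection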
So, to make a long story longer, does anyone know what's going on here? It would appear that the X server I'm using is ignoring the modelines I told it to use, and is instead using a "1920x540" modeline that is internal to the X binary, probably put in there at compile time.
How do I tell X to ignore what it thinks I want and do what I actually want? X seems to be behaving almost as badly as Windoze here, trying to tell me what I need instead of letting me tell it what I need!
