Hi all,
I'm having a problem with my 720p OTA channels after upgrading my video card from an FX 5200 to a 6200. Here's what's going on:
My complete hardware configuration is in my signature below. After installing the 6200, I am running MythTV at my monitor's native resolution, 1080p, using a modeline taken from the monitor's EDID. The monitor is a Westinghouse LVM-37w1.

Watching 1080i OTA channels (CBS, NBC, PBS) is no problem and they look great. However, whenever I change channels from a 1080i channel (or select "Watch TV" from the MythTV main menu) and land on a 720p OTA broadcast (FOX, ABC), the video is not quite smooth: it jerks slightly (kind of like a strobe effect, but not as severe) and the sound is out of sync. I can work around the problem by changing from that 720p station to another 720p station, at which point the video becomes smooth and the audio syncs up again. While watching either 720p station (the first with the slight jerkiness or the second with the smooth picture), top reports a very high load, between 2.9 and 3.5, and CPU usage is at about 75%, with mythfrontend and XFree86 gobbling up most of it. I am not commercial flagging on the fly, nor is the box doing any extra transcoding, commercial flagging, or recording on another input.

I am using the libmpeg2 decoder and I am not deinterlacing my OTA content. OpenGL is NOT selected; I have tried OpenGL as well as the Standard and XvMC decoders, but without any success.

By comparison, my DISH Network/PVR-150 input reports much more normal numbers for SDTV: top shows a load around 1.0 and CPU usage of 15-20%. I have set my DISH source to use kernel deinterlacing with the help of Marc Aronson's bash script; really, I only borrowed the mysql commands from that script to add kernel deinterlacing to my DISH source, and I verified the change by inspecting the channel table with webmin. DMA is enabled on my IDE hard drives, following the instructions on the wiki.
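For reference, the deinterlacing change and the DMA setup boil down to something like the commands below. The sourceid, database credentials and drive device are just placeholders from my setup, and I'm going from memory on the exact mysql statement, so don't take it verbatim:
Code:
# set the kernel deinterlace filter on every channel belonging to the
# DISH source (sourceid=2 is only an example -- check your videosource table)
mysql -u mythtv -p mythconverg \
  -e "UPDATE channel SET videofilters='kerneldeint' WHERE sourceid=2;"

# enable DMA on the IDE drive, then re-query to confirm "using_dma = 1 (on)"
hdparm -d1 /dev/hda
hdparm -d /dev/hda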
These are the Monitor, Device and Screen sections from my XFree86Config-4:
Code:
Section "Monitor"
Identifier "Monitor0"
Option "DPMS" "true"
VendorName "WDE"
ModelName "WDE3701"
HorizSync 30 - 80 # DDC-probed
VertRefresh 50 - 75 # DDC-probed
ModeLine "1280x720p" 73.78 1280 1312 1592 1624 720 735 742 757
ModeLine "1920x1080i" 77.60 1920 1952 2240 2272 1080 1104 1110 1135 interlace
ModeLine "1920x1080p" 138.5 1920 1968 2000 2080 1080 1082 1087 1111 +hsync -vsync
EndSection
Section "Device"
Option "hw_cursor" "1"
Option "NoLogo" "1"
Identifier "Card0"
# The following line is auto-generated by KNOPPIX mkxf86config
Driver "nvidia"
VendorName "All"
BoardName "All"
# BusID "PCI:1:0:0"
EndSection
Section "Screen"
Identifier "Screen0"
Device "Card0"
Monitor "Monitor0"
DefaultColorDepth 24
SubSection "Display"
Depth 24
Modes "1920x1080p" "1920x1080i" "1280x720p"
EndSubSection
EndSection
My XFree86.0.log, showing that the 1080p mode is validated:
Code:
# startx -- -logverbose 6
# more XFree86.0.log
<<cut>>
(II) NVIDIA(0): --- Building ModePool for WDE LVM-37w1 (DFP-0) ---
(II) NVIDIA(0): Validating Mode "1920x1080p":
(II) NVIDIA(0): 1920 x 1080 @ 60 Hz
(II) NVIDIA(0): Mode Source: EDID
(II) NVIDIA(0): Pixel Clock : 138.50 MHz
(II) NVIDIA(0): HRes, HSyncStart : 1920, 1968
(II) NVIDIA(0): HSyncEnd, HTotal : 2000, 2080
(II) NVIDIA(0): VRes, VSyncStart : 1080, 1082
(II) NVIDIA(0): VSyncEnd, VTotal : 1087, 1111
(II) NVIDIA(0): H/V Polarity : +/-
(II) NVIDIA(0): Mode is valid.
<<cut>>
Any ideas as to what may be causing the strobe/jerkiness when tuning to a 720p broadcast?
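If it would help, I can also capture a frontend log while changing from a 1080i channel to one of the 720p channels. I was planning to run something like the following (assuming the playback verbose flag is available in my MythTV version):
Code:
# run the frontend with playback logging and keep a copy of the output
mythfrontend -v playback 2>&1 | tee /tmp/mythfrontend-720p.log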