PostPosted: Sat Aug 19, 2006 8:13 pm 
Hi all,
I'm having a problem with my 720p OTA channels after upgrading my video card from an FX 5200 to a 6200. Here's what's going on:

My complete hardware configuration is in my signature below. After installing the 6200, I am running MythTV at my monitor's native resolution, 1080p, with a modeline taken from the monitor's EDID. The monitor is a Westinghouse LVM-37w1.

Watching 1080i OTA channels (CBS, NBC, PBS) is no problem and they look great. However, whenever I change channels from a 1080i channel (or select "Watch TV" from the MythTV main menu) and land on a 720p OTA broadcast (FOX, ABC), the video is not quite smooth: it jerks slightly (kind of like a strobe effect, but not as severe) and the sound is out of sync. I can work around it by changing from that 720p station to another 720p station, at which point the video becomes smooth and the audio syncs up again. While watching a 720p station, either the first one with the slight jerkiness or the second one with the smooth picture, top reports a very high load, between 2.9 and 3.5, and CPU usage is at about 75%, with mythfrontend and XFree86 gobbling up the bulk of the power. I am not commercial flagging on the fly, nor is the box doing any extra transcoding, commercial flagging, or recording from another input.

I am using the libmpeg2 decoder and I am not deinterlacing my OTA content. OpenGL is NOT selected, and I've tried OpenGL as well as Standard and XvMC decoding, but without any success. My DISH Network/PVR-150 card reports much more normal numbers for SDTV: top shows a load around 1.0 and CPU usage of 15-20%. I have set my DISH source to use kernel deinterlacing with the help of Marc Aronson's bash script; really, I only borrowed the mysql commands from that script to add kernel deinterlacing to my DISH source, and I verified the change by inspecting the channel table with webmin. DMA is enabled on my IDE hard drives, following the instructions on the wiki.
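For reference, this is roughly what I did for those last two items. The SQL is only a sketch: it assumes the per-channel filter lives in the channel table's videofilters column (which is where Marc's script appears to write it), that my DISH source happens to be sourceid 2, and that the login is the usual mythtv/mythtv default, so adjust all of that for your own setup. The hdparm commands are the usual DMA checks the wiki describes.
Code:
# Sketch of the channel-table tweak (column name, sourceid and login are assumptions)
mysql -u mythtv -pmythtv mythconverg \
      -e "SELECT sourceid, channum, name, videofilters FROM channel;"
mysql -u mythtv -pmythtv mythconverg \
      -e "UPDATE channel SET videofilters = 'kerneldeint' WHERE sourceid = 2;"

# DMA check on the IDE drives, per the wiki
hdparm -d /dev/hda     # should report "using_dma = 1 (on)"
hdparm -d1 /dev/hda    # enable it if it is off
hdparm -tT /dev/hda    # quick buffered/cached read timing sanity check

The SELECT is just there so you can see what is in the table before touching anything; webmin shows the same data.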

These are the Monitor, Device and Screen sections from my XFree86Config-4:
Code:
Section "Monitor"
        Identifier      "Monitor0"
        Option  "DPMS"  "true"
        VendorName      "WDE"
        ModelName       "WDE3701"
        HorizSync 30 - 80 # DDC-probed
        VertRefresh 50 - 75 # DDC-probed
        ModeLine "1280x720p"    73.78 1280 1312 1592 1624 720 735 742 757
        ModeLine "1920x1080i"   77.60 1920 1952 2240 2272 1080 1104 1110 1135 interlace
        ModeLine "1920x1080p"   138.5 1920 1968 2000 2080 1080 1082 1087 1111 +hsync -vsync
EndSection
Section "Device"
        Option          "hw_cursor"     "1"
        Option          "NoLogo"        "1"
        Identifier      "Card0"
# The following line is auto-generated by KNOPPIX mkxf86config
        Driver      "nvidia"
        VendorName  "All"
        BoardName   "All"
#       BusID       "PCI:1:0:0"
EndSection
Section "Screen"
        Identifier "Screen0"
        Device     "Card0"
        Monitor    "Monitor0"
        DefaultColorDepth 24
        SubSection "Display"
                Depth     24
                Modes "1920x1080p" "1920x1080i" "1280x720p"
        EndSubSection
EndSection


My XFree86.0.log, showing that my mode is validated:
Code:
 # startx -- -logverbose 6
# more XFree86.0.log
<<cut>>
(II) NVIDIA(0): --- Building ModePool for WDE LVM-37w1 (DFP-0) ---
(II) NVIDIA(0):   Validating Mode "1920x1080p":
(II) NVIDIA(0):     1920 x 1080 @ 60 Hz
(II) NVIDIA(0):     Mode Source: EDID
(II) NVIDIA(0):       Pixel Clock      : 138.50 MHz
(II) NVIDIA(0):       HRes, HSyncStart : 1920, 1968
(II) NVIDIA(0):       HSyncEnd, HTotal : 2000, 2080
(II) NVIDIA(0):       VRes, VSyncStart : 1080, 1082
(II) NVIDIA(0):       VSyncEnd, VTotal : 1087, 1111
(II) NVIDIA(0):       H/V Polarity     : +/-
(II) NVIDIA(0):     Mode is valid.
<<cut>>


Any ideas as to what may be causing the strobe/jerkiness when tuning to a 720p broadcast?

_________________
Mike
My Hardware Profile


PostPosted: Mon Aug 21, 2006 12:57 pm 
Just a little update on things... First, I realized that I never said why I went from an FX 5200 to a 6200: I could not get 1080i or 1080p (the monitor's native resolution) to work over DVI on the FX 5200. I know some have reported that they can, but I just could not get it done.

I tried re-installing the 8762 nvidia drivers, as well as rolling the drivers back as far as I could with a 6200 (the 7xxx-series nvidia driver, I believe), with no luck. I adjusted the AGP settings in my mobo's BIOS, increasing the aperture from 64MB to 128MB, without success. I tried different modelines (1080p, 1080i and 720p), also without success. I think I may very well just do a good backup of my files/settings and then perform an auto upgrade to wipe the root partition. Maybe something got corrupted when I swapped the card out... I dunno. I'm apparently not smart enough to figure it out! :wink:
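Before I wipe anything, I'll probably double-check what the box actually thinks it is running. A rough sketch of the checks I have in mind; the /proc paths are what the nvidia driver exposes on my box, so treat them as an assumption if your setup differs:
Code:
# Which nvidia kernel module is actually loaded (should show 1.0-8762 after the reinstall)
cat /proc/driver/nvidia/version

# AGP rate / fast writes / SBA as the driver negotiated them
cat /proc/driver/nvidia/agp/status

# What X itself logged about AGP and the aperture
grep -i agp /var/log/XFree86.0.log

If the AGP status doesn't show the rate I expect, that would at least point at the BIOS/aperture side rather than at the modelines.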

_________________
Mike
My Hardware Profile

