LinHES Forums
http://forum.linhes.org/

1080p with XFX5200
http://forum.linhes.org/viewtopic.php?f=5&t=12158
Page 1 of 2

Author:  jmacmythtv [ Sat Oct 14, 2006 6:36 am ]
Post subject:  1080p with XFX5200

Hi everyone,

I am struggling to get 1080p running on my system. The X server will start, but the image is blurred vertically - as though the image is jumping up and down. This happens on both the desktop and in the MythFrontend. I have tried (among others) the following modelines, all with the same result:

Modeline "1920-1080-myth" 148.5 1920 2008 2048 2200 1080 1084 1094 1124 -hsync -vsync
Modeline "1920x1080-59.94p" 148.352 1920 1960 2016 2200 1080 1082 1088 1125 +hsync -vsync
ModeLine "ATSC-1080-60p" 148.5 1920 1960 2016 2200 1080 1082 1088 1125

Some other possibly pertinent XFree86-4 stuff:

Section "Monitor"
    Identifier "LVM-42w2"
    Option "DPMS" "true"
    HorizSync 30 - 80
    VertRefresh 50.0 - 75.0
    DisplaySize 650 365
EndSection

The monitor is a Westinghouse LVM-42w2 and is hooked up via DVI. Other specs:

MoBo: Asus A8V (VIA K8T800Pro, VIA VT8237)
Proc: AMD Athlon 64 3700
Cooler: Zalman CNPS7700
Case: Silverstone LC14M
Mem: 2 x 512MB Corsair
PS: Antec NEO HE 430
Coax: 2 x WinTV PVR-250
Video: XFX AGP GeForce 5200
Optical: 2 x NEC DVD-RW ND-3550A
HD: 2 x 250GB SATA Maxtor SL250SO

Any thoughts would be much appreciated!

Author:  ed.gatzke [ Sat Oct 14, 2006 7:23 am ]
Post subject:  results???

If you get this working, please make sure to post the solution here. I was looking at upgrading to that Westinghouse (or maybe the 37-inch 1080p version).

Anybody have luck with any of the 1080p Westinghouse LCDs?

Author:  thornsoft [ Sat Oct 14, 2006 8:11 am ]
Post subject: 

It's definitely a modeline problem. You're outputting something that the monitor can't really handle.
I had better luck switching to component, as my TV (a 3-year-old Sony RPTV) wouldn't handle 1080 via DVI from Myth. It would handle it fine from a STB, and I could sort of get a stable picture with Windows XP, but that, too, jumped around a bit and was hard to look at. By switching to VGA/component (using a converter box from mythic.tv), everything cleared right up.

Author:  jmacmythtv [ Sun Oct 15, 2006 7:52 am ]
Post subject: 

Thanks guys,

I will try using the VGA connectors and continue experimenting with the modelines.

Could Nvidia driver version differences explain why others have made these modelines work with this monitor/video card?

Also, does anyone know the significance of the +/- hsync and vsync flags at the end of the modelines?

Thanks!

Author:  MisoSoup777 [ Sun Oct 15, 2006 10:38 am ]
Post subject: 

I don't believe that the 5200 has a high enough pixel clock to do 1080p. I believe it maxes out at 135 MHz; your modelines need 148.5 MHz.
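That 148.5 figure falls straight out of the mode timings: the dot clock a modeline needs is htotal x vtotal x refresh. A quick sanity check (plain Python, just arithmetic), using the totals from the "ATSC-1080-60p" modeline quoted above:

```python
# Dot clock needed by a modeline = htotal * vtotal * refresh.
# Totals taken from the "ATSC-1080-60p" modeline in this thread.
htotal, vtotal, refresh_hz = 2200, 1125, 60
clock_mhz = htotal * vtotal * refresh_hz / 1e6
print(clock_mhz)  # 148.5 -- more than a 135 MHz card limit can drive
```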

Author:  jmacmythtv [ Sun Oct 15, 2006 1:49 pm ]
Post subject: 

That would be too bad... but what am I missing in this post, where someone claims to have made it work? Are there different 5200 cards?

http://mysettopbox.tv/phpBB2/viewtopic. ... ight=1080p

Author:  tjc [ Sun Oct 15, 2006 7:09 pm ]
Post subject: 

Different core clock rates (230 MHz to 250 MHz), different bus interfaces (PCI or AGP), different memory interfaces (64- or 128-bit wide), memory clocks from 333 MHz to 400 MHz, different RAMDAC speeds (350-400 MHz), ... You can see this by hitting any big shopping site and comparing the specs side by side. Max resolutions are listed as anything from 2048x1536@85Hz down to 2048x1536@60Hz.

Author:  mihanson [ Sun Oct 15, 2006 8:59 pm ]
Post subject: 

jmacmythtv:
I have a Westinghouse LVM-37w1. I too had trouble getting a 1080p modeline working with a FX5200. However, your EDID [i]should[/i] spit out modelines (including 1080p if it's supported) that you can test out. Just do this to "see" them:
[code]Ctrl-Alt-F1
# /etc/init.d/gdm stop
# startx - --logverbose 6
Ctrl-Alt-F1
Ctrl-C
# pico /var/log/XFree86.0.log[/code]
and you'll find the timings in there. FWIW, I could not get 1080i or 1080p working with my FX5200. I had to upgrade to a 6200. Even then I had some troubles with stability. I [i]think[/i] the stability issue was a MythTV problem and was fixed with the release of MythTV 0.20, but I don't know for sure, as I've yet to upgrade to 0.20. I also think you'll need one of the most recent 8xxx series nvidia drivers. If you use the "nvidia-auto-select" option in the screen section of XF86Config-4, the nvidia driver should use the "best" validated mode from your monitor's EDID. You may want to try messing with the nvidia X options. If you're having trouble with non-60Hz modes, try adding:
[code]Option "ModeValidation" "AllowNon60HzDFPModes"[/code]
See the nvidia README for your driver, e.g. http://download.nvidia.com/XFree86/Linu ... dix-d.html
Hope this helps.

Author:  jmacmythtv [ Mon Oct 16, 2006 5:52 am ]
Post subject: 

Thanks guys... great info! I had no idea that there were so many differences between "FX5200" video cards - although it explains some of the price variation.

FWIW, I was able to get 1080i working with this card - this is what I am currently running. When I get home I will post the modeline for future reference and play around with the EDID to find the details on what this particular card can handle.

Thanks again!

Author:  jmacmythtv [ Mon Oct 16, 2006 4:36 pm ]
Post subject: 

After checking out the log files in more detail, I found that the above modelines are listed in the valid modeline section. (BTW mihanson, I could not startx with that log option, and therefore only found the max pixel clock listed as 400 - referencing the display, I believe. Do you have to be local to run the startx command? I am running it from a remote ssh session.)

Others prompt this kind of warning:

(WW) (1600x1200,LVM-42w2) mode clock 162MHz exceeds DDC maximum 150MHz

which makes me think that maybe my card has a 150 MHz and not a 135 MHz max dot clock?

Also, in some of my searching I found this neat online modeline generator:

http://xtiming.sourceforge.net/cgi-bin/xtiming.pl

Interestingly, if I choose a 51 Hz refresh, with the other details for my monitor, I come up with this mode:

Modeline "1920x1080@51" 147.23 1920 1952 2504 2536 1080 1103 1112 1135

For some reason, however, my log file won't accept this; it claims the v refresh is out of range.

Any thoughts?
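One way to see what's going on is to work out the rates the rejected modeline actually produces: hsync = dot clock / htotal, and vertical refresh = dot clock / (htotal x vtotal). A minimal sketch, using only the numbers from the "1920x1080@51" modeline above. Note that 51.15 Hz sits inside the configured 50-75 VertRefresh window, so if X still rejects it, the effective range must be coming from somewhere else (e.g. the monitor's EDID):

```python
# Rates implied by:
# Modeline "1920x1080@51" 147.23 1920 1952 2504 2536 1080 1103 1112 1135
clock_hz = 147.23e6
htotal, vtotal = 2536, 1135

hsync_khz = clock_hz / htotal / 1e3          # horizontal scan rate
vrefresh_hz = clock_hz / (htotal * vtotal)   # vertical refresh rate

print(round(hsync_khz, 2))    # 58.06 -- inside the 30-80 kHz HorizSync range
print(round(vrefresh_hz, 2))  # 51.15 -- fails if a 60 Hz EDID minimum applies
```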

Author:  mihanson [ Mon Oct 16, 2006 5:08 pm ]
Post subject: 

Whoops. :oops: Sorry about that jmac... It should read:

[code]# startx -- -logverbose 6[/code]
And the log should produce some info like this:
[code]
<<cut>>
(II) NVIDIA(0): --- Building ModePool for WDE LVM-37w1 (DFP-0) ---
(II) NVIDIA(0): Validating Mode "1920x1080":
(II) NVIDIA(0): 1920 x 1080 @ 60 Hz
(II) NVIDIA(0): Mode Source: EDID
(II) NVIDIA(0): Pixel Clock : 138.50 MHz
(II) NVIDIA(0): HRes, HSyncStart : 1920, 1968
(II) NVIDIA(0): HSyncEnd, HTotal : 2000, 2080
(II) NVIDIA(0): VRes, VSyncStart : 1080, 1082
(II) NVIDIA(0): VSyncEnd, VTotal : 1087, 1111
(II) NVIDIA(0): H/V Polarity : +/-
(II) NVIDIA(0): Mode is valid.
<<cut>>
[/code]
Yes, you do need to be local to do this. It needs to be done from a console (Ctrl-Alt-F1).

[quote](WW) (1600x1200,LVM-42w2) mode clock 162MHz exceeds DDC maximum 150MHz

which makes me think that maybe my card has a 150mhz and not 135mhz max dot clock? [/quote]

DDC is a synonym for EDID. That warning is telling you that your MONITOR'S max pixel clock is 150 MHz. 135 MHz is a limitation of the FX5200 video card. I've had the same problem.
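In other words, the usable dot clock is bounded by both ends of the link: the card's transmitter and the monitor's EDID maximum. A tiny sketch with the figures quoted in this thread (not authoritative specs):

```python
# Effective ceiling = min(card limit, monitor limit). Figures are the ones
# quoted in this thread: FX5200 over DVI (135 MHz) and the LVM-42w2's
# EDID maximum (150 MHz).
card_max_mhz = 135.0
monitor_max_mhz = 150.0
mode_clock_mhz = 148.5  # the 1080p60 modelines above

ceiling_mhz = min(card_max_mhz, monitor_max_mhz)
print(mode_clock_mhz <= ceiling_mhz)  # False: 148.5 MHz won't fit over DVI here
```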

Author:  tony_c [ Tue Oct 17, 2006 2:07 pm ]
Post subject: 

Hi, how much memory is on your XFX 5200 GeForce card? I read somewhere that the FX5200 w/ 128MB will not do 1920x1080 over DVI (only over VGA), and you'll need the 256MB version to get 1920x1080. Not sure if this is your problem though...

Author:  jmacmythtv [ Sat Oct 21, 2006 6:28 am ]
Post subject: 

Hi Guys,

Sorry for the delay in replying - I was away this week.

The card has 256MB of memory, so I should be OK on this front.

Starting X with verbose logging gave me a couple of new lines that I hadn't seen:

(II) NVIDIA(0): Manufacturer's mask: 0
(II) NVIDIA(0): Supported Future Video Modes:
(II) NVIDIA(0): #0: hsize: 1152 vsize 864 refresh: 75 vid: 20337
(II) NVIDIA(0): #1: hsize: 1280 vsize 1024 refresh: 60 vid: 32897
(II) NVIDIA(0): #2: hsize: 1280 vsize 720 refresh: 60 vid: 49281
(II) NVIDIA(0): #3: hsize: 1280 vsize 1280 refresh: 60 vid: 129
(II) NVIDIA(0): Supported additional Video Mode:
(II) NVIDIA(0): clock: 148.5 MHz Image Size: 930 x 520 mm
(II) NVIDIA(0): h_active: 1920 h_sync: 2008 h_sync_end 2052 h_blank_end 2200 h_border: 0
(II) NVIDIA(0): v_active: 1080 v_sync: 1084 v_sync_end 1089 v_blanking: 1125 v_border: 0
(II) NVIDIA(0): Supported additional Video Mode:
(II) NVIDIA(0): clock: 74.2 MHz Image Size: 930 x 520 mm
(II) NVIDIA(0): h_active: 1920 h_sync: 2008 h_sync_end 2052 h_blank_end 2200 h_border: 0
(II) NVIDIA(0): v_active: 540 v_sync: 542 v_sync_end 547 v_blanking: 562 v_border: 0
(II) NVIDIA(0): Ranges: V min: 60 V max: 75 Hz, H min: 30 H max: 80 kHz, PixClock max 150 MHz

Any ideas?

Thanks!

Author:  jmacmythtv [ Sat Oct 21, 2006 7:20 am ]
Post subject: 

Further Update:

ATSC-1080-60p works perfectly through the VGA cable:

ModeLine "ATSC-1080-60p" 148.5 1920 1960 2016 2200 1080 1082 1088 1125
Although I haven't tried the others that I was having problems with, I'm sure they would work as well.

The problem where

Modeline "1920x1080@51" 147.23 1920 1952 2504 2536 1080 1103 1112 1135

claimed the v refresh was out of range was because, somewhere in my troubleshooting, I had commented out the "Ignore EDID true" line - and EDID was overriding my VertRefresh.

Another strange thing that I noticed: I had set up "separate modes for GUI and video", using 1920x1080 for the GUI and 720x480 for video.

When using the interlaced modeline with the DVI cable and playing video, my TV was registering 1280x1024p?!

Now that I am using the modeline for 1920x1080p, it correctly registers 720x480p during video. I am really happy to have gotten the correct modeline working, but I had gotten used to seeing my TV at 1280x1024 - a format that looked perfect when I used the "Fill" feature on both Myth and my TV.

Anybody know why certain modelines only work through certain interfaces? Is it the monitor, video card, or cable?

Anyway, thanks everybody for all your help! I will keep posting if I figure anything else out.

Author:  Ramon2007 [ Sun Oct 22, 2006 1:15 pm ]
Post subject: 

I am quite sure the differences in stream decoding ability between FX5200 cards are based on the bus size ONLY, i.e. 64-bit or 128-bit.
