LinHES Forums
http://forum.linhes.org/

New Nvidia graphics drivers.
http://forum.linhes.org/viewtopic.php?f=5&t=1871
Page 2 of 6

Author:  tanzania [ Sat Jul 03, 2004 8:07 am ]
Post subject: 

Well, I don't know how far along R5 is, but these drivers are well worth looking into. The interlaced modes are just great for TV-out, with no need for deinterlacing anymore (good for weaker machines), and the overscan and anti-flicker controls are great as well, as is the nvidia-settings tool.

As for the GLX problem, I guess the Debian gurus here will solve it anyway (maybe we need kernel 2.6 for this).


BTW, how can I run a script before mythfrontend autostarts? I want to run nvidia-settings -l before mythfrontend.

Thanks

Author:  tjc [ Sat Jul 03, 2004 11:22 am ]
Post subject: 

tanzania wrote:
BTW, how can I run a script before mythfrontend autostarts? I want to run nvidia-settings -l before mythfrontend.

Just add it to your /home/mythtv/.fvwm/.fvwm2rc out near the end.
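
For illustration, a minimal sketch of what that addition could look like, assuming the stock .fvwm2rc launches mythfrontend from its InitFunction (if the mythfrontend line is already there, just add the nvidia-settings line above it):

AddToFunc InitFunction
+ "I" Exec exec nvidia-settings -l
+ "I" Exec exec mythfrontend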

Most of the things that it controls can also be done through settings in your /etc/X11/XF86Config-4 or through the Xv controls, which has the advantage of reducing your "part count". The biggest advantage of the tool is being able to interactively tune the video output. 8-)
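
For example, the TV-out tweaks that nvidia-settings exposes interactively can also be pinned down in the Device section of /etc/X11/XF86Config-4; a sketch with illustrative values (see the driver README for the full option list):

Section "Device"
    Identifier "NVIDIA"
    Driver     "nvidia"
    Option     "TVOutFormat" "SVIDEO"   # or "COMPOSITE"
    Option     "TVOverScan"  "0.7"      # 0.0 (none) to 1.0 (maximum)
EndSection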

Author:  SSChicken [ Sat Jul 03, 2004 5:00 pm ]
Post subject:  Re: works great

xmichael wrote:
I'm still sad that NVidia hasn't done anything to provide "flicker reduction" features, like their Windows counterpart...


Update the drivers and then run nvidia-settings, and there is a flicker-reduction control exactly like the one in Windows. It works very well and makes program-info text much more readable.

**Edit: Oops, it's nvidia-settings, not nvidia-configure.

Author:  red321 [ Sun Jul 04, 2004 8:45 am ]
Post subject: 

Hate to rain on everyone's parade.

I have pulled my Matrox card out, put my 440MX back in, loaded the new drivers, and played for hours with modelines, etc.

I am using DVB feeds, which are interlaced in nature. As far as I can see there is no benefit in interlacing over the original drivers. The improvement seems to be interlaced modes for the VGA output; it does not seem to help with passing interlaced content seamlessly through the card to the S-Video output. So this is great for those with an HDTV with a VGA input, but for us old-fashioned people with S-Video or composite, there is nothing to be gained on the interlacing front. :(

Author:  tjc [ Sun Jul 04, 2004 10:49 am ]
Post subject: 

Are you running the display at 640x480 (or the PAL native resolution)?

Author:  red321 [ Sun Jul 04, 2004 1:31 pm ]
Post subject: 

I am running at 800x600 50 Hz interlaced. Unfortunately the new drivers refuse the 720x576 modeline that worked perfectly well with previous drivers, and mark it as "not a valid TV-out mode". [Tell that to the whole of Europe apart from France, Australia... :lol: :lol:]
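
For reference, the sort of 720x576 PAL interlaced modeline the older drivers accepted looks roughly like this (a sketch with the standard 13.5 MHz PAL timings; exact numbers vary per setup):

ModeLine "720x576" 13.50 720 744 808 864 576 581 586 625 -HSync -VSync Interlace
# 13.5 MHz / 864 total pixels = 15.625 kHz line rate, which over
# 625 lines interlaced gives the PAL 50 Hz field rate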

I was running the Myth GUI at 720x576 and sizing the picture to that, and then using overscan, which does work :D, to fill the screen.

I have just seen there is a similar thread running on the Myth mailing list. Seems I am not alone in pointing out that the emperor needs a new tailor 8)

Playing films and anything else that is frame-based works great without kerneldeint, but it always has. A good test is sports or fast-panning TV studio shows.

Author:  ceenvee703 [ Sun Jul 04, 2004 4:33 pm ]
Post subject: 

I had always heard that nVidia's TV-out did not properly pass an interlaced signal through untouched. I saw as much when trying to do PVR-type stuff under Windows... although the nVidia software claimed to support interlaced passthrough, I never saw it working on sports and live TV shows.

It seemed that only certain boards (Matrox, possibly?) support passthrough of an interlaced signal through their TV outs.

Author:  TransAmGore [ Mon Jul 05, 2004 2:58 pm ]
Post subject: 

red321 wrote:
Hate to rain on everyone's parade.

I have pulled my Matrox card out, put my 440MX back in, loaded the new drivers, and played for hours with modelines, etc.

I am using DVB feeds, which are interlaced in nature. As far as I can see there is no benefit in interlacing over the original drivers. The improvement seems to be interlaced modes for the VGA output; it does not seem to help with passing interlaced content seamlessly through the card to the S-Video output. So this is great for those with an HDTV with a VGA input, but for us old-fashioned people with S-Video or composite, there is nothing to be gained on the interlacing front. :(


Maybe you had your settings wrong. I just updated to the new Nvidia drivers; I have a GeForce4 MX card and use S-Video out. By switching the output mode to either of the new HD480 modes (instead of the NTSC-M mode), I get awesome, flicker-free output on my Toshiba 27" TV. And XvMC video playback is MUCH better than it ever was with the old drivers. I had previously given up on XvMC and was using software decoding with the kerneldeint filter active.
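
For anyone wanting to try the same thing, that output-mode switch is presumably just the TVStandard option in the Device section of XF86Config-4 (a sketch, not TransAmGore's actual config):

Section "Device"
    Identifier "NVIDIA"
    Driver     "nvidia"
    Option     "TVStandard" "HD480i"    # or "HD480p"; replaces "NTSC-M"
EndSection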

The video playback quality with these new drivers and XvMC enabled is vastly superior to software decoding with the standard deinterlacing filter included with MythTV, and it's almost as good (maybe even exactly as good) as using the kerneldeint filter. Only now the processor load has dropped drastically during video playback. There's no distracting ghosting effect that I can see, and even my girlfriend who knows nothing about such things commented on how nice the video output was.

As for the anti-flicker setting, it seems to be enabled whenever you use any of the HDxxxx output modes; I didn't have to run nvidia-settings to get the flicker to go away. All thin horizontal lines in the settings screens are now rock-steady and smooth, and all fonts are much easier to read. The menus are now comparable in quality to my Bell ExpressVu satellite receiver's menu screens: clear, sharp, and no flicker whatsoever.

One note: I did have to comment out the GLX module for the new driver to work. I don't use GLX-enabled stuff anyway, so I didn't bother fixing that Debian-related problem.
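
For anyone hitting the same GLX problem, the workaround is presumably just commenting out the module load in the Module section of XF86Config-4 (a sketch):

Section "Module"
    # Load "glx"    # disabled; re-enable once matching GLX libraries are installed
EndSection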

Author:  red321 [ Tue Jul 06, 2004 9:20 am ]
Post subject: 

Can you post your XF86Config-4? I would be very happy to be wrong on this one :lol:, but my understanding agrees with this thread:

http://www.gossamer-threads.com/lists/m ... sers/75519

I am using PAL with interlaced content, and it will not accept any 576 modelines as valid.

If it works for you I am happy, but I would be even happier if it works for me :wink:

Author:  lynchaj [ Thu Jul 08, 2004 5:06 am ]
Post subject:  nvidia-settings

Hi,
I am running R4 with the new Nvidia 6106 drivers. When I start an xterm and then start nvidia-settings, the program opens in a window at the bottom of the screen, so most of the window is off-screen. I can only see about the top inch of the nvidia-settings window.

I have tried clicking and dragging the window to move it, resize it, or maximize it, with no luck. I can get it to minimize, but I can never get the window positioned so that I can actually use nvidia-settings.

Is anyone else having this problem? How do I fix it? Help!

THANKS!

Andrew Lynch

Author:  tanzania [ Thu Jul 08, 2004 2:22 pm ]
Post subject: 

GLX problem solved: apt-get update followed by apt-get install nvidia-glx works now. It seems it just took some time until the packages were released.

Author:  tjc [ Thu Jul 08, 2004 4:11 pm ]
Post subject:  Re: nvidia-settings

lynchaj wrote:
Hi,
I am running R4 with the new Nvidia 6106 drivers. When I start an xterm and then start nvidia-settings, the program opens in a window at the bottom of the screen, so most of the window is off-screen. I can only see about the top inch of the nvidia-settings window.


At a guess, you have a virtual screen size that is bigger than the actual screen size. Look at your X logs (in /var/log/XFree86.0.log) to see what the actual resolution is; you may even find a virtual vs. actual message in there. Fix your /etc/X11/XF86Config-4 to use that same resolution.
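
Concretely, the thing to check is presumably that the Display subsection in XF86Config-4 does not declare a Virtual size larger than the mode in use; a sketch:

Section "Screen"
    SubSection "Display"
        Depth  24
        Modes  "640x480"
        # Virtual 800 600   <- a line like this would make the desktop
        #                      larger than the visible 640x480 screen
    EndSubSection
EndSection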

Author:  tjc [ Thu Jul 08, 2004 4:16 pm ]
Post subject: 

tanzania wrote:
GLX problem solved: apt-get update followed by apt-get install nvidia-glx works now. It seems it just took some time until the packages were released.
Did you do your initial install via apt-get? What was your package list?

I used the Nvidia script (with the results previously mentioned), and am wondering if I should rip it out and take the apt-get route...

Author:  lynchaj [ Thu Jul 08, 2004 6:59 pm ]
Post subject:  virtual vs. actual size

Is this what you are referring to in XFree86.0.log?

(**) NVIDIA(0): Validated modes for display device TV-0:
(**) NVIDIA(0): Mode "640x480": 25.2 MHz, 31.5 kHz, 59.9 Hz
(II) NVIDIA(0): Virtual screen size determined to be 640 x 480
(==) NVIDIA(0): DPI set to (75, 75)

I fiddled around with this a bit more and ended up starting a second X server with

startx -- :1

On that display, plain old icewm is the window manager, and (ahhhh...) no more funky window management.

I got nvidia-settings to run and write a config file. I'll just edit that with vi from now on and skip the nvidia-settings GUI.
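
For reference, nvidia-settings writes its config to ~/.nvidia-settings-rc by default, and nvidia-settings -l reloads it; the entries are plain attribute=value lines, roughly like these (illustrative attribute names and values, not my actual file):

# ~/.nvidia-settings-rc (excerpt)
0/SyncToVBlank=1
0/TVOverScan[TV-0]=20
0/TVFlickerFilter[TV-0]=127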

Thanks!

Andrew Lynch

Author:  tjc [ Thu Jul 08, 2004 8:20 pm ]
Post subject:  Re: virtual vs. actual size

lynchaj wrote:
Is this what you are referring to in XFree86.0.log?

(**) NVIDIA(0): Validated modes for display device TV-0:
(**) NVIDIA(0): Mode "640x480": 25.2 MHz, 31.5 kHz, 59.9 Hz
(II) NVIDIA(0): Virtual screen size determined to be 640 x 480
(==) NVIDIA(0): DPI set to (75, 75)

Yes, but it looks like it's right, unless fvwm has a bigger virtual size...
lynchaj wrote:
On that display, plain old icewm is the window manager, and (ahhhh...) no more funky window management.

"Plain old icewm"! My how times have changed. The fvwm project was started to be a lightweight and minimal virtual window manager based on twm. It's still pretty svelte compared to a KDE or Gnome environment.
