
All times are UTC - 6 hours





Post new topic Reply to topic  [ 12 posts ] 
PostPosted: Fri Dec 02, 2005 10:53 am 
Offline
Joined: Thu Dec 01, 2005 9:05 am
Posts: 1
Everyone raves about the PVR-350's S-Video signal quality versus the "classic" Nvidia FX5200-vintage cards' S-Video, which is important since I still have a non-HDTV CRT set. However, how does a slightly more modern card, like my eVGA Nvidia-based 6200TC, compare? (I find it OK but somewhat blurry, for the record, even after working with XF86Config-4 and nvidia-settings.) I ask because I bought a PVR-350 but would rather return it unopened if using it wouldn't give me an improved picture, especially since it'd free up room for another card.


PostPosted: Fri Dec 02, 2005 6:13 pm 
Offline
Joined: Thu Mar 25, 2004 11:00 am
Posts: 9551
Location: Arlington, MA
The 350 is built specifically to output to a TV, and the TV output on a regular video card is generally an afterthought. I would not expect the 6200 to work any better on this count than the 5200.

While my FX5200 card works far better than the MX4 built into an Nforce4 IGP I first tried, I suspect that is mostly a result of having "in spec." voltage levels on the S-Video connector.


PostPosted: Sat Dec 03, 2005 4:30 pm 
Offline
Joined: Fri Aug 26, 2005 9:54 pm
Posts: 617
Simply put, there is no other card (that I know of) which will compete with the PVR-350's TV-out. Each TV frame is made up of two fields. One field is made of scan lines 1, 3, 5, 7, ... The next field is scan lines 2, 4, 6, 8, ... Field two actually takes place 1/60th of a second after field one. All video cards with TV-out are incapable of displaying fields correctly. The PVR-350 does display them correctly when playing back MPEG footage through the hardware playback feature.

All normal video cards render an entire frame at once (both fields at the same time). They then display each field one after the other for 1/60th of a second each. In reality, the second field should be based on the video feed 1/60th of a second after the first field.

This is a general explanation, but in short, the PVR-350 handles the fields of each frame better.
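To make the field structure concrete, here is a toy sketch (not tied to any real driver or capture API) of how one interlaced frame splits into two fields that belong to different moments in time:

```python
# Toy model of NTSC interlacing: each ~1/30 s frame is two fields
# captured 1/60 s apart. Line numbers below are 0-based, so the
# "odd field" (lines 1,3,5,... in 1-based terms) is index 0,2,4,...

FIELD_PERIOD = 1 / 60  # seconds between successive fields

def split_into_fields(frame_lines):
    """Split a list of scan lines into (field1, field2) by row parity.

    field1 holds lines 1,3,5,7,... (1-based); field2 holds lines
    2,4,6,8,..., which on a real camera were captured FIELD_PERIOD
    seconds later than field1.
    """
    field1 = frame_lines[0::2]
    field2 = frame_lines[1::2]
    return field1, field2

frame = [f"line{n}" for n in range(1, 9)]  # 8-line toy frame
f1, f2 = split_into_fields(frame)
print(f1)  # ['line1', 'line3', 'line5', 'line7']
print(f2)  # ['line2', 'line4', 'line6', 'line8']
```

The point of the post above is that a normal video card renders both halves from the same instant, so field2 is temporally wrong by half a frame period.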


PostPosted: Sat Dec 03, 2005 9:19 pm 
Offline
Joined: Mon May 10, 2004 8:08 pm
Posts: 1891
Location: Adelaide, Australia
There was an interesting thread on mythtv-dev that discussed a possible playback implementation that would cause video cards to display the interlaced fields correctly on a TV. http://www.gossamer-threads.com/lists/mythtv/dev/160286

It took me a while to work out what the guy was going on about, but I believe his idea would actually work quite well.

It's a shame no one gave the guy the help he needed to implement his idea.


PostPosted: Sun Dec 04, 2005 12:23 am 
Offline
Joined: Mon Oct 06, 2003 10:38 am
Posts: 4978
Location: Nashville, TN
Actually, some of the nvidia drivers will do interlaced output. I've never been able to find a good 480i modeline, but the 1080i output is incredible on an HDTV, and doesn't require any sort of deinterlacing even when playing back 480i video. Too bad 1080i has been broken in the last several nvidia releases. Here's hoping they get it fixed for the 8xxx release.
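As a sanity check on what a 1080i modeline implies, the refresh rates fall straight out of the timing numbers. The figures below are the commonly published 1080i timing (74.25 MHz pixel clock, 2200x1125 total raster); exact porch values vary between sources, so treat the modeline itself as illustrative rather than driver-verified:

```python
# Refresh rate implied by a typical 1080i modeline, e.g.:
#   Modeline "1920x1080i" 74.25 1920 2008 2052 2200 1080 1084 1094 1125 Interlace
# (illustrative; the key totals are the 74.25 MHz clock, 2200, and 1125)

pixel_clock = 74.25e6   # Hz
h_total = 2200          # total pixel clocks per scan line, incl. blanking
v_total = 1125          # total lines per full frame, incl. blanking

frame_rate = pixel_clock / (h_total * v_total)  # full frames per second
field_rate = frame_rate * 2                     # interlaced: two fields per frame

print(f"{frame_rate:.1f} Hz frames, {field_rate:.1f} Hz fields")
# -> 30.0 Hz frames, 60.0 Hz fields
```

The same arithmetic is a quick way to vet any candidate 480i modeline: clock / (htotal * vtotal) should come out near 29.97 or 30 Hz with the Interlace flag set.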

_________________
Have a question? Search the forum and have a look at the KnoppMythWiki.

Xsecrets


PostPosted: Sun Dec 04, 2005 3:07 am 
Offline
Joined: Mon May 10, 2004 8:08 pm
Posts: 1891
Location: Adelaide, Australia
All cards with an S-Video output do interlaced output. It's getting the software to display the interlaced fields in the correct order that is the problem. Bob sometimes gets it right; sometimes it doesn't.


PostPosted: Sun Dec 04, 2005 7:21 am 
Offline
Joined: Fri Aug 26, 2005 9:54 pm
Posts: 617
Xsecrets wrote:
Actually, some of the nvidia drivers will do interlaced output. I've never been able to find a good 480i modeline, but the 1080i output is incredible on an HDTV, and doesn't require any sort of deinterlacing even when playing back 480i video. Too bad 1080i has been broken in the last several nvidia releases. Here's hoping they get it fixed for the 8xxx release.

The problem is the way the NTSC spec is written compared to the way video cards do interlacing. Like I said before, the two fields in a frame are supposed to be separated by 1/60th of a second on a TV. But video cards render the entire framebuffer into memory at once. Then a RAMDAC (random access memory digital-to-analogue converter) comes along and grabs the scan lines for field one, then field two, from the same framebuffer.

Interlaced output works great on HDTVs because they are equipped with better inputs. An RGB input on an HDTV can read the sync signal from the VGA output on the video card. If you had the same input on your normal TV, you could run the video card at 720x240 at 60Hz. That would produce almost the correct fields for NTSC video. Even better would be to render every other frame half a scan line higher or lower.

Unfortunately, the way the TV-out is wired on video cards, it needs a full 720x480 framebuffer before the RAMDAC can start outputting the first field of any frame (all other resolutions get resampled to 720x480 before the RAMDAC has at it).

With HDTVs becoming more and more popular, I would be very surprised if any company spent time improving the TV-out of their cards to enable correct field rendering on standard definition sets. The improvement just isn't large enough for the few of us who want the best quality. But that is why I got a PVR-350.


PostPosted: Sun Dec 04, 2005 3:18 pm 
Offline
Joined: Mon May 10, 2004 8:08 pm
Posts: 1891
Location: Adelaide, Australia
If you update the display every 1/60th of a second, and only update the odd lines on one pass and the even lines on the next, it won't matter what order the RAMDAC grabs the lines; it will always display them in the correct order. This is what the guy in the mythtv-dev thread was suggesting. The closest we get at the moment is the bob deinterlacer, but because it updates all lines with the odd field and then all lines with the even field, half of the time the RAMDAC will display the fields in the wrong order.
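Here is a rough sketch of that field-at-a-time idea, simplified and hypothetical (no real framebuffer or driver API, 0-based row numbers): keep one persistent framebuffer and, every 1/60th of a second, overwrite only the rows belonging to the newly decoded field. Then whenever the RAMDAC scans out either field, those rows are at most one field-period old:

```python
# Field-at-a-time update: write one field's lines into a persistent
# framebuffer, leaving the other field's lines untouched. Adjacent
# rows therefore always come from temporally adjacent fields, no
# matter when the RAMDAC starts reading.

def update_field(framebuffer, field_lines, parity):
    """Write one field into the framebuffer.

    parity 0 -> even rows (0, 2, 4, ...), parity 1 -> odd rows.
    Rows of the other parity are left as they were.
    """
    for i, line in enumerate(field_lines):
        framebuffer[2 * i + parity] = line
    return framebuffer

fb = [None] * 8
update_field(fb, ["t0-l0", "t0-l2", "t0-l4", "t0-l6"], parity=0)  # field at t = 0
update_field(fb, ["t1-l1", "t1-l3", "t1-l5", "t1-l7"], parity=1)  # field at t = 1/60
print(fb)
# Even rows hold the t=0 field, odd rows the t=1/60 field --
# exactly the pairing an interlaced TV expects.
```

Bob, by contrast, writes *all* rows from a single field on each pass, which is why its field ordering is only right half the time relative to the RAMDAC's scan-out.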


PostPosted: Sun Dec 04, 2005 4:50 pm 
Offline
Joined: Fri Aug 26, 2005 9:54 pm
Posts: 617
Greg Frost wrote:
If you update the display every 1/60th of a second, and only update the odd lines on one pass and the even lines on the next, it won't matter what order the RAMDAC grabs the lines; it will always display them in the correct order.

That would produce a perfectly good picture in the framebuffer. But the RAMDAC is not capable of translating that onto the S-Video/composite port on your video card. And I have never heard of a standard definition TV that has RGB (VGA) inputs that can sync at 60Hz.


PostPosted: Sun Dec 04, 2005 6:18 pm 
Offline
Joined: Mon May 10, 2004 8:08 pm
Posts: 1891
Location: Adelaide, Australia
The trick is ensuring that the vertical resolution matches that of your TV standard (480 lines for NTSC, 576 for PAL). That way, when the RAMDAC produces a scan line, it is taking it directly from one of the horizontal lines of pixels in the framebuffer. One full scan of the screen would be updated from the even lines and the next from the odd lines. If there are flicker controls on the S-Video output, you would need to turn them off (since they will blend in lines from the other field).
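The resolution-matching requirement can be sketched in a few lines (a toy model, 0-based rows): when the framebuffer height equals the standard's active line count, each field's scan-out reads a disjoint set of rows one-to-one, with no rescaling to blend the fields together:

```python
# With a framebuffer height equal to the TV standard's active line
# count (480 for NTSC, 576 for PAL), each scan-out field maps
# one-to-one onto framebuffer rows of a single parity -- no scaler
# in between, so nothing mixes lines from the two fields.

ACTIVE_LINES = 480  # NTSC; use 576 for PAL

def rows_for_field(parity):
    """Framebuffer row indices the RAMDAC reads for one field
    (parity 0 = even rows, parity 1 = odd rows)."""
    return list(range(parity, ACTIVE_LINES, 2))

even, odd = rows_for_field(0), rows_for_field(1)
assert len(even) == len(odd) == ACTIVE_LINES // 2      # 240 lines per NTSC field
assert set(even).isdisjoint(odd)                       # fields never share a row
assert set(even) | set(odd) == set(range(ACTIVE_LINES))  # together they cover all rows
```

If the framebuffer were any other height, the TV encoder's scaler would resample across rows, mixing the two time-offset fields and defeating the whole scheme.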


PostPosted: Tue Dec 20, 2005 9:02 pm 
Offline
Joined: Sat Oct 22, 2005 8:19 am
Posts: 23
tjc wrote:
The 350 is built specifically to output to a TV, and the TV output on a regular video card is generally an afterthought. I would not expect the 6200 to work any better on this count than the 5200.


The first MythTV box I put together was made primarily with old hardware I either borrowed or had lying around. The video card was an MX4000 with S-Video out.

I can't comment on either the 5200 or the PVR350, but I can say that the 6200 I'm currently using certainly outclasses the older generation Nvidia card I had.

Some of the 6200s (mine included) have a component video out "dongle", and this is where the difference lies. After connecting a component cable and switching to a 480p modeline, I immediately saw a stark, stark difference in picture quality -- obviously for the better.

But on the other hand... if I were strictly judging quality apples to apples (S-Video), I'd have to say that the 4000 and 6200 seemed about the same to me.


PostPosted: Tue Dec 20, 2005 9:23 pm 
Offline
Joined: Thu Mar 25, 2004 11:00 am
Posts: 9551
Location: Arlington, MA
I will say that the quality difference between the MX4 built into the Nforce2 IGP in my original machine and the FX5200 card I'm using now was very significant, even though both were S-Video output. I put it down to more "in spec." signal levels on the FX5200, possibly because it also supports DVI and thus has better line drivers.

