Question for video geeks (like me)



I capture HD broadcasts over FireWire from my HD DVR, a Scientific Atlanta 8300HD. I've made captures with both my real Mac (dual 2 GHz G5) and my Hackintosh, and when I play them back on my HDTV the result is the same: jerky motion. It's not like it's dropping frames; it's more as if the fields are reversed, so wherever there's motion there's a trail. This happens in VLC, and also if I author an HD DVD with DVD Studio Pro and play it back in Apple DVD Player, unless I select De-interlace, in which case the trail disappears but playback isn't as smooth as it should be.


At first I thought it could be the NVIDIA drivers on the Hackintosh, so I played the same files on the G5 through a high-quality DVI-to-HDMI cable, and it shows the same problem. The G5 has an ATI Radeon 9650 Pro with 256 MB of RAM; the Hackintosh had an MSI card based on the NVIDIA 7600GS chipset with HDMI. To rule out the card, I swapped in an eVGA NVIDIA 7600GT (no HDMI, just two DVI outputs). The result is the same.


I even tried swapping the fields in Final Cut Pro, but I still get that jerky motion. A really weird thing about this HD content is that I can't determine the field order reliably: research on the net tells me it should be upper field first, and After Effects has a method for detecting the real field order, but with this content it fails.
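For anyone following along, here's a toy Python sketch of the idea behind field-order detection (this is not the After Effects method or any real API; every name and the "motion estimation" are made up for illustration): order the fields both ways and see which temporal ordering gives smoother motion.

```python
def split_fields(frame):
    """Split a frame (list of rows) into (top field, bottom field)."""
    return frame[0::2], frame[1::2]

def jitter(positions):
    """Count direction reversals in a sequence of object positions.
    The correct field order moves steadily; the wrong one zig-zags."""
    deltas = [b - a for a, b in zip(positions, positions[1:])]
    return sum(1 for d1, d2 in zip(deltas, deltas[1:]) if d1 * d2 < 0)

def guess_field_order(frames, locate):
    """Guess 'top' or 'bottom' first by sequencing the fields both ways
    and picking the ordering with less motion jitter. `locate` is a toy
    stand-in for real motion estimation."""
    tff, bff = [], []
    for frame in frames:
        top, bottom = split_fields(frame)
        tff += [locate(top), locate(bottom)]
        bff += [locate(bottom), locate(top)]
    return 'top' if jitter(tff) <= jitter(bff) else 'bottom'

# Synthetic clip: a single bright pixel moving one step right per field.
def field(pos, rows=2, width=12):
    return [[1 if x == pos else 0 for x in range(width)] for _ in range(rows)]

def weave(first, second):
    """Interleave two fields into one frame, `first` on the even lines."""
    frame = []
    for a, b in zip(first, second):
        frame += [a, b]
    return frame

locate = lambda f: f[0].index(1)
tff_frames = [weave(field(2 * i), field(2 * i + 1)) for i in range(3)]  # top field first
bff_frames = [weave(field(2 * i + 1), field(2 * i)) for i in range(3)]  # bottom field first
```

With clean synthetic motion this always picks the right order; on real captures (cadence breaks, static scenes) a heuristic like this can fail, which may be why the After Effects test gives up on this content.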


When I use the card in Windows XP the result is different, but still not what I want. Windows doesn't show the stuttery motion, but it never shows true interlaced 1080i either; it's always de-interlaced. If I force the player (VLC, WinDVD, whatever) to play interlaced video from an interlaced source, I get what you always get when playing interlaced video on a computer monitor: the "comb" lines are visible, because the monitor is progressive by nature and isn't made to display interlaced content. The same happens if I play a DVD with true 29.97 fps interlaced content that came from a video camera rather than film.
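To make the comb effect concrete: each interlaced frame weaves together two fields captured 1/60 s apart, so on anything moving, the even and odd lines disagree. A minimal sketch with toy data (nothing here comes from any real player):

```python
def weave(top_field, bottom_field):
    """Interleave two fields into one frame, the way a progressive
    display shows interlaced video when nothing de-interlaces it."""
    frame = []
    for t, b in zip(top_field, bottom_field):
        frame += [t, b]
    return frame

def comb_score(frame):
    """Count pixel disagreements between adjacent lines — a crude
    stand-in for the visible 'comb' on a progressive monitor."""
    return sum(sum(1 for a, b in zip(r1, r2) if a != b)
               for r1, r2 in zip(frame, frame[1:]))

def field(pos, rows=4, width=8):
    """Toy field: each row has a single bright pixel at `pos`."""
    return [[1 if x == pos else 0 for x in range(width)] for _ in range(rows)]

# Static scene: both fields agree, adjacent lines match, no comb.
static = weave(field(3), field(3))
# Moving scene: the object moved between the two field captures.
moving = weave(field(3), field(4))
```

Only motion produces the comb, which matches what you see: static shots look fine, and the artifacts ride along with whatever moves.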


What also puzzles me is that when I play HD videos on the Hackintosh or the Mac in VLC with Bob de-interlacing selected (the mode that looks best to me of all of them), it still skips entire frames every now and then. Could that be because the video card isn't fast enough? I'd think a 7600 is at least fast enough to display interlaced HD content, but maybe I'm wrong. And when I play the interlaced content without de-interlacing, at times it plays back perfectly, in the correct field order, but eventually the jerky motion starts again, then it goes back to playing smoothly, and so on.
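Since Bob came up: what bob de-interlacing does is split each frame into its two fields and line-double each into a full frame, so one interlaced frame becomes two progressive ones and the full 50/60 fields-per-second motion is preserved. A minimal sketch (naive row duplication; real implementations interpolate the missing lines):

```python
def bob_deinterlace(frame):
    """Split a woven frame into its two fields and line-double each,
    turning one interlaced frame into two progressive frames."""
    top, bottom = frame[0::2], frame[1::2]

    def line_double(field):
        out = []
        for row in field:
            out += [row, list(row)]  # duplicate each line; real bob interpolates
        return out

    return line_double(top), line_double(bottom)

# One 4-line frame: lines 0 and 2 are the top field, 1 and 3 the bottom.
frame = [[1], [2], [3], [4]]
first, second = bob_deinterlace(frame)
```

Note that bob doubles the output frame rate, and at 1080i that roughly doubles the scaling work per second — which might (this is a guess, not a measurement) have something to do with the occasional skipped frames on a mid-range card.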


Well, anyway, if any video enthusiast reading this can enlighten me, I'd greatly appreciate it.


