Avoiding the time_increment_bits problem when encoding MPEG4 videos with bad headers to Ogg Theora
There has been some debate lately about the migration of YouTube to [[HTML5]], and whether they (i.e., YouTube’s owner, Google) should support [[H.264]] or [[Theora]] as the standard codec for the upcoming <video> tag. See, for example, how the FSF asks for support for Theora.
The thing is, I discovered [[x264]] not so long ago, and I thought it was a “free version” of H.264. I began using it to reencode the medium-to-low quality videos I keep (e.g., movies and series), and the resulting quality/file size ratio stunned me. I could reencode most material downloaded from, e.g., p2p sources to 2/3 of its size, while keeping the copy indistinguishable from the original to the naked eye.
However, after realizing that x264 is just a free implementation of the proprietary H.264 codec, and in the wake of the H.264/Theora debate, I decided to give Ogg Theora a go. I expected a fair competitor to H.264, although still noticeably behind in quality/size ratio, and that is what I found. I for one do not mind needing a 10% larger file to attain the same quality if it means using free formats, so I decided to adopt Theora for everyday reencoding.
After three paragraphs of introduction, let’s get to the point, which is that when reencoding some files with [[ffmpeg2theora]] I would get the following error:
% ffmpeg2theora -i example_video.avi -o output.ogg
[avi @ 0x22b7560]Something went wrong during header parsing, I will ignore it and try to continue anyway.
[NULL @ 0x22b87f0]hmm, seems the headers are not complete, trying to guess time_increment_bits
[NULL @ 0x22b87f0]my guess is 15 bits ;)
[NULL @ 0x22b87f0]looks like this file was encoded with (divx4/(old)xvid/opendivx) -> forcing low_delay flag
Input #0, avi, from 'example_video.avi':
  Metadata:
    Title           : example_video.avi
  Duration: 00:44:46.18, start: 0.000000, bitrate: 1093 kb/s
    Stream #0.0: Video: mpeg4, yuv420p, 624x464, 23.98 tbr, 23.98 tbn, 23.98 tbc
    Stream #0.1: Audio: mp3, 48000 Hz, 2 channels, s16, 32 kb/s
[mpeg4 @ 0x22b87f0]hmm, seems the headers are not complete, trying to guess time_increment_bits
[mpeg4 @ 0x22b87f0]my guess is 16 bits ;)
[mpeg4 @ 0x22b87f0]hmm, seems the headers are not complete, trying to guess time_increment_bits
[mpeg4 @ 0x22b87f0]my guess is 16 bits ;)
[mpeg4 @ 0x22b87f0]looks like this file was encoded with (divx4/(old)xvid/opendivx) -> forcing low_delay flag
    Last message repeated 1 times
[mpeg4 @ 0x22b87f0]warning: first frame is no keyframe
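Incidentally, those warnings come from the ffmpeg libraries (libavformat/libavcodec) that ffmpeg2theora is built on, so if you just want to check whether a given file has this kind of broken header, probing it with plain ffmpeg (no output file, just the input) should print similar complaints:

% ffmpeg -i example_video.avi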
I searched the web for solutions, but to no avail. Usually pasting literal error messages into Google yields good results, but in this case I only found developer forums where the bug was discussed. What I could not find were simple instructions on how to avoid it in practice.
Well, here goes my simple solution: pass the file through [[MEncoder]] first. Where the following fails:

% ffmpeg2theora -i input.avi -o output.ogg
the following succeeds (filtered.avi being just a straight stream copy of input.avi made with MEncoder):

% mencoder -ovc copy -oac copy input.avi -o filtered.avi
% ffmpeg2theora -i filtered.avi -o output.ogg
My guess is that MEncoder basically takes the “raw” video data in input.avi and copies it into filtered.avi (which ends up being exactly the same video), rebuilding sane headers in the process.
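If you have a whole directory of troublesome AVIs, the two steps are easy to wrap in a small shell loop. This is only a sketch (the _filtered.avi temporary name and the .ogg extension are my own choices here; adapt names and options to your setup):

#!/bin/sh
# For each AVI given on the command line: remux it with MEncoder
# (pure stream copy, so no quality is lost), feed the cleaned-up
# copy to ffmpeg2theora, and remove the temporary file on success.
for f in "$@"; do
    tmp="${f%.avi}_filtered.avi"
    out="${f%.avi}.ogg"
    mencoder -ovc copy -oac copy "$f" -o "$tmp" &&
    ffmpeg2theora -i "$tmp" -o "$out" &&
    rm -f "$tmp"
done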