Badwrong Ways To Discover Your Encoding Is Bad And Wrong

I originally wanted to write a paragraph or two about how I’ve changed the pc5 hardsub releases to add to the episode 21 release post, but I ended up writing a lot more. Super long, extra derpy post after the jump.

tl;dr We may have been releasing broken SD encodes for the past 7 months, and now they are probably fixed.

Recently, Magenta and I started simulwatching our respective encodes just before releasing to check for any errors we may have caused or that our QCers missed (and to force me to actually watch stuff I work on). I realized while watching this ep that I had made a mistake encoding it. I promised when we released Precure 5 – 08 that future Precure 5 encodes would be easier for older computers to decode, but, again, I forgot to enable the so-called “ez-decode” settings. Strangely, I only noticed because mpc-hc was not using DXVA to decode the video and our blag’s lovely animated banner was causing ffdshow to lag. The normal encode settings should be DXVA-compatible, but for some reason, DXVA wasn’t being used. So I began investigating.

I did a quick encode using the ez-decode settings and spot-checked it for encoder errors. Everything seemed fine and DXVA was working. I let the OP run for some reason and had to pause it when I saw this:
Precure 5 broken decoding with DXVA

I figured it might be a DXVA-related problem and considered releasing it anyway since _really_ old computers probably don’t have a DXVA-compatible graphics card anyway. I debated with myself for a few moments then decided to try some other decoders in the name of science. CoreAVC also produced broken output, but not in the same places:
Precure 5 broken decoding with CoreAVC

ffdshow, on the other hand, did not break at all. Again, I considered the notion of simply telling users with slow hardware to just use ffdshow, but I realized how silly this was considering my own computer can’t decode 720p h.264 in realtime and I rely on DXVA for exactly this reason. I tried asking for help from some more experienced encoders, but the general consensus was that supporting hardware decoders (and people who use them) is somehow a bad idea.

I compared the normal encode settings with the ez-decode variant and isolated the differences. Through trial, error, and a long process of elimination, I found the cause of the broken decoder output. The culprit was --bframes 0, implied by the --profile baseline option. When we first decided to make ez-decode settings, we were trying to encode files that would be comparable to xvid’s playback requirements. We came up with settings that could be played back on a Wii at the time. Only a few months before we started working on Heartcatch, the Wii couldn’t play h.264 at all, so we considered this an accomplishment. The feature of the baseline profile that actually makes it easier to decode is --no-cabac, which disables an advanced entropy coding algorithm that is more efficient, but much more computationally expensive. That setting can be enabled independently, but somewhere along the line I thought that supporting the iPod classic without conversion would be a good idea, so I looked up its specs. We decided not to do iPod-compatible encodes in the end, but the baseline option stuck anyway. That is how I ended up using baseline profile without properly understanding what it does. Since then, I never tested the SD files because Magenta always encodes them.
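To make the distinction concrete, here is a rough sketch of the two approaches on the x264 command line. The input/output names and the exact flag values are illustrative, not our actual settings:

```shell
# Old ez-decode approach: baseline profile. This implies --no-cabac,
# but it ALSO forces --bframes 0 (and disables 8x8 DCT and weighted
# prediction), which is more than we actually wanted.
x264 --profile baseline --level 3 -o old-ezdecode.mkv input.avs

# New approach (sketch): main profile with CABAC disabled by hand,
# keeping B-frames for compression. Level and VBV flags are advertised
# so hardware decoders know what to expect (14000 kbps is the
# Level 3.1 maximum bitrate for Main profile).
x264 --profile main --no-cabac --bframes 3 \
     --level 3.1 --vbv-maxrate 14000 --vbv-bufsize 14000 \
     -o new-ezdecode.mkv input.avs
```

The point is that --no-cabac is the part that actually buys easy decoding; baseline profile drags the rest along whether you want it or not.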

After researching the differences between baseline and main profiles, I’ve come up with new settings that should still be decodable on slower computers while gaining some compression from main profile’s features. I benchmarked the old ez-decode settings against the new ones using timecodec.exe (part of the Haali Media Splitter package) and found that the new settings are actually slightly faster (I used CoreAVC only as a reference decoder; the new settings are likely faster relative to the old ones under ffdshow as well). As for compression, Precure 5 – 21 was 20MB bigger with the old settings than with the new ones. I can’t say whether the breakage was caused by a recent change in x264 or whether our releases have been broken this whole time, but I have tested files encoded with the new settings with DXVA, CoreAVC, and ffdshow, and they all decoded properly, in real time, and without broken output.
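If you want to double-check what profile and level a finished encode actually got flagged with (useful after this kind of mix-up), MediaInfo or ffprobe will tell you. For example, with ffprobe (the filename is a placeholder):

```shell
# Print the H.264 profile and level of the first video stream.
# A Main@L3.1 encode should report profile=Main, level=31.
ffprobe -v error -select_streams v:0 \
        -show_entries stream=profile,level \
        -of default=noprint_wrappers=1 episode.mkv
```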

7 thoughts on “Badwrong Ways To Discover Your Encoding Is Bad And Wrong”

  1. I’ve had this problem as well using Quicktime; I’m glad you got it all sorted out. I didn’t really care as I usually re-encode for my iPod Touch, and after the re-encode it all works fine.

    • I’m not sure about the maximum resolution on iPod Touch, but the new SD encodes might be compatible without conversion because the Touch supports up to Main@Level 3.1 which is what we’re using now.

      • Well, it all works without re-encoding; I’m jailbroken and can play 720p, but I like re-encoding to 720×404.

  2. You should NOT use CoreAVC as the reference decoder; it’s well known to be very buggy and to break randomly with new x264 features.

    You should make your encodes compatible with DXVA and ffdshow-tryouts’s FFmpeg-based H.264 decoder. Neither decoder broke when CoreAVC couldn’t handle weightp encodes. CoreAVC is a buggy decoder and its usage should not be encouraged.

    • At the time of testing, timecodec wouldn’t let me use ffdshow for some reason and I didn’t feel like rebooting. When I said “reference,” I only meant that I used the same decoder to benchmark all 3 H.264 profiles (High, Main, and Baseline) to see relative decoding times. Obviously, real decode time will depend on each user’s hardware and other processes (e.g., leaving the animated gif in the background was causing my decoder to lag until I closed it).

      Semi-officially, I’ve mentioned that we target the latest CCCP, but to be honest, we haven’t done thorough testing against it in a long time even though we’ve changed x264 versions and encode settings multiple times. That DXVA works should be considered a happy coincidence. Level flagging and vbv are not required by ffdshow, but adding them does not affect ffdshow users negatively.

    • Probably not. Nobody’s complained about broken decoding all this time, so if I hadn’t made this post, it’s likely we could have made the switch silently and no one would have noticed. I’ll re-encode PC5 08 when we get around to redoing 01–07, but that’s probably the only one. This is sort of a good consequence of not doing a v2 for PC5 08 when the slow decoding on older computers was brought up: it avoids a v3 later on.
