Imagine sitting down to watch an episode of the HBO hit series Game of Thrones—and hardly being able to understand anything. That’s the case for non-native English speakers and for many of the 36 million deaf or hard-of-hearing Americans. HBO doesn’t expect its viewers to have a knowledge of High Valyrian; that’s why it takes care to offer subtitles so viewers understand exactly how Daenerys intends to free the slaves of Essos.
If only most online streaming companies took as much care in everyday captioning.
Machine transcription is responsible for much of today’s closed-captioning and subtitling of broadcast and online streaming video. It can’t register sarcasm, context, or word emphasis. It can’t capture the cacophonous sound of multiple voices speaking at once, essential for understanding an angry crowd of protestors or a cheering stadium. It just types what it registers. Imagine watching the classic baseball comedy Major League and only hearing the sound of one fan shouting from the stands. Or only catching every other line of lightning-fast dialogue when watching reruns of the now-classic sitcom 30 Rock.
As of April 30, streaming video companies are required to provide closed captioning on all programming. There’s no doubt that we’re in a better place than we were even five years ago, when streaming video companies weren’t required to closed-caption any of their content. But there is still a long way to go in improving the accuracy of subtitles. Netflix and Amazon Prime users have bemoaned the quality of the streaming companies’ closed captions, citing nonsense words, transcription errors, and endless “fails.” These companies blame the studios for not wanting to pay for accurate captions, but excuses aren’t flying with paying streaming video subscribers.