A little while back I wrote a blog post about audio description for TV shows and films, and audio navigation on DVD menus. Things like that really help people who are visually impaired. But if you have partial or complete hearing loss, then that kind of feature isn’t much use. Instead, the equivalent form of assistance for such people is subtitles and captions, which display a text transcript of what people are saying and what sounds can be heard. And they also make a huge difference. In fact, experimenting with subtitles on Youtube has earned me a shoutout on a fellow blogger’s channel, which I’m very flattered about. If you’ve come here because of that video, which I’ll mention later, then hi! 🙂
On TV, a lot of programmes are subtitled of course. On digital TVs and DVD players, there’s often an option in the menus somewhere to bring it up, or even a dedicated button on the remote if you’re lucky. But in the old days, on analogue TV channels here in the UK, subtitles were provided using the various teletext services, on page 888. Presumably that number was picked as it was both easy to remember and well out of the way of other teletext content, which took up the lower hundred numbers. I still remember old services like the BBC’s Ceefax, ITV’s Oracle and Channel 4’s 4-Tel with fondness from my childhood – that was the closest thing many of us had to the internet in those days, certainly in my house. One of the things I particularly liked was the Bamboozle! quiz game on Channel 4, which used the red, green, yellow and blue Fastext buttons to let you choose your answers, and occasionally replaced the quiz with a Knightmare-style adventure game. Happy days!
Anyway, it is also worth noting that subtitles aren’t just for people with hearing loss. They’re very useful for language translation as well. For instance, there’s been an increasing trend for foreign dramas to be shown on channels like BBC4, which use subtitles to translate things into English. These subtitles are built into the recording in those cases, so you don’t have to select them from any menus. I haven’t been watching those programmes, because staring at the screen to read subtitles for ages is hard on the eyes, and I’ll miss other things that are happening on screen if I’m reading those all the time. But for other people, those programmes have proven very popular.
I don’t mind TV shows or films with occasional subtitles in them, as long as they’re on screen long enough and with good enough size and contrast for me to read them. But an entire programme would be too frustrating. Audio dubbing – where an English voice talks over the foreign one – would be easier for lengthy content, even if it may look strange seeing someone’s lips moving without it actually matching what’s being said. With eyesight like mine, I wouldn’t necessarily notice those anomalies too much unless I was deliberately looking for them.
Subtitles don’t always go right though, especially when they’re generated live. For TV broadcasts like the news, for example, special voice recognition systems may sometimes be used to automatically generate the captions – and if you’ve seen those, or Youtube’s auto-generated subtitles, you’ll know how dodgy the process can be. For the hard of hearing, it’s very frustrating not to see accurate results, and it’s something that still needs work, though it’s a bit better than it used to be.
Apart from that serious aspect though, some of the mistakes can be quite amusing as well, admittedly. There are countless news articles highlighting some of the best mistakes on TV, and there even appears to be a blog by someone who posts subtitling errors regularly. Try turning on auto-generated subtitles for many Youtube videos and it often gets quite bizarre as well. I’ve seen the phrase “visually impaired” come out as “sheep head” in one of Emily Davison’s videos for example! That would be a very different type of disability altogether. You wouldn’t be able to pull the wool over anyone’s eyes with that one, only your own! Sorry, that was shear cheek making a baaaaad joke like that. I’m fleecing ewe of proper content now. 😉
So the fastest solutions aren’t always the best ones. And, ultimately, the only really reliable way to get accurate subtitles is for a human to type them. So it’s great that Youtube is allowing people to do that.
I don’t know how long they’ve had the feature for community contributions, as I only found out about it after reading Fashioneyesta’s post about subtitling her own clips. But when you’re so into videos and blogs about visual impairment, it is easy to overlook the fact that the hard of hearing, and those who speak other languages, may still find your content interesting and useful as well, as long as they’re able to understand it. So it’s great that it seems to be getting highlighted more lately.
With that in mind, I’m now adding subtitles to all the videos on my channel, in case they’re useful to anybody. The method I use is called “Transcribe and auto-sync”, which is very quick and works well. For this, Youtube puts the video on the left, and a single text box on the right, so that you can watch the video and type everything that you hear at the same time. You don’t have to divide it up, as Youtube will sort that out for you later.
If you have a script written for your video already, you can just paste the text straight into the box. Or, instead, you can type along with the video as you listen to it. What I like about this is that Youtube will pause whenever you type, so you don’t have to keep clicking anything to stop it and catch up. It will simply resume playing a second after you stop typing. And there is also a button to jump back 5 seconds if you didn’t quite catch something. Youtube also saves your text regularly as a Draft, so you don’t have to transcribe it all at once or worry about losing it. You can exit when you’ve done enough, then later on you can go back into your Creator Studio and pick up your draft from the Community Subtitling area.
Once you’ve typed everything and clicked the Set Timings button, Youtube will throw you back to your drafts list and get to work marrying the text up with the audio. It doesn’t tell you when it’s finished, but if you give it a few minutes and come back a bit later, you should find it’s done. The results are very accurate too. I still watched it back and made a couple of very minor tweaks, since you can edit what it gives you, but really I could have left it as it was.
The alternative is to manually type each segment and create the timings yourself, instead of Youtube doing it for you. It takes longer, and it won’t be the easiest thing to do if you can’t see properly. But it does give you very precise control.
Again, you get the video on the left of the screen, but this time you can create lots of separate text segments on the right. As you add them, they are displayed below the video, so you can see where they start and end in relation to the ‘waveform’ (a graph of the audio levels throughout the video). You can move the text segments, and change their durations by dragging their edges here, or you can type the timings manually next to the text on the right.
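Incidentally, timed segments like these are essentially what a finished subtitle file contains behind the scenes. As a rough sketch (the wording and timings below are my own made-up illustration, not anything Youtube actually shows you), here’s how a list of segments might be written out in the common SRT subtitle format, which Youtube also accepts as an uploaded caption file:

```python
# A rough sketch: turning timed caption segments into SRT-formatted text.
# SRT is a widely used subtitle format that Youtube accepts for uploads.
# The segment texts and timings here are invented for illustration only.

def format_timestamp(seconds):
    """Format a time in seconds as the SRT style HH:MM:SS,mmm."""
    hours, rem = divmod(int(seconds), 3600)
    minutes, secs = divmod(rem, 60)
    millis = round((seconds - int(seconds)) * 1000)
    return f"{hours:02d}:{minutes:02d}:{secs:02d},{millis:03d}"

def segments_to_srt(segments):
    """Turn a list of (start, end, text) tuples into SRT text."""
    blocks = []
    for i, (start, end, text) in enumerate(segments, start=1):
        blocks.append(
            f"{i}\n{format_timestamp(start)} --> {format_timestamp(end)}\n{text}"
        )
    return "\n\n".join(blocks) + "\n"

# Hypothetical segments, including a non-speech cue in square brackets
segments = [
    (0.0, 2.5, "Hello, and welcome back to the channel."),
    (2.5, 4.0, "[Mother laughs off-camera]"),
    (4.0, 7.2, "Today I want to talk about subtitles."),
]

print(segments_to_srt(segments))
```

Each numbered block is one text segment with its start and end time – exactly the pieces you’re juggling in Youtube’s manual editor, just in plain-text form.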
Typing the text for the video requires you to hit Enter on the keyboard after each segment to add it to the list. And there are other keyboard shortcuts as well – Shift+Left and Shift+Right will take you backwards or forwards 5 seconds, Shift+Space will pause and unpause the video (though it will also pause for you when you start typing), and Shift+Enter will add a line break within the current text segment. You can also edit the text afterwards too of course. How easy this process is for people who can’t see, I’m not sure – I imagine setting the timings could be rather fiddly. But maybe technology like VoiceOver is helpful there, I don’t know.
In any case, that manual method is also used for Community Contributions, where Youtubers can open up their videos for other people to add subtitles to them, in any language. I think that’s a really great idea, and I’ve turned the feature on for my videos. I’ll be very surprised if it gets used, but if anyone wants to translate my content into languages other than English using that method, feel free to do so.
And that brings me back to Emily Davison at Fashioneyesta, because she’s turned on community contributions for some of her Talking Disability clips, and has explained the process of adding subtitles in her blog post that I mentioned earlier. She’s subtitling some of her other clips herself too, but this allows her viewers to chip in on some of the most important disability-related ones.
So, as a fan of her work, I’ve now added captions to 3 of them – The Top 10 Misconceptions of Visual Impairment Part 1 and Part 2, and The Scary Things About Being Visually Impaired. They’re all great clips which I can relate to a lot and highly recommend to others. There’s lots of good humour among the more serious points that get raised, so they strike a good balance and aren’t patronising. They’re just enjoyable ways of getting some important points across.
As well as typing what Emily was saying, I was also noting when Emily’s mother was laughing or saying things off-camera, or when Emily was talking as another person (when relating an experience she’d had, or pretending to be her dog), and other little things like that. Yet I was careful not to be too literal – for instance, if people say things such as “Erm”, “like” or “you know” a lot, then there’s no point including them every single time. The main content of the sentence is what matters. Occasionally it might be useful to include extra little things, but not very often.
I subtitled those clips as a gesture of goodwill, and I may do one or two more as time allows. I wasn’t after anything in return, it’s just lovely to be able to give something back as a fan, given the huge amount of time and effort Emily puts into producing so much great content. Her videos always stand up to multiple viewings, so it wasn’t a chore rewatching them.
Nevertheless, Emily has very generously given me mentions on Twitter and Facebook, added me to the Transcriber page on her blog, and has now given me a shoutout on her latest vlog (the relevant segment starts at 8:49). All of which is very flattering! So if you happen to read this far into this rambling post Emily, thank you ever so much, that’s extremely kind of you! I’m just happy to help! 😀
So if anybody else wants to help Emily with one or two clips – or indeed any other Youtubers who have requested assistance with captioning – please do consider doing so. Even if it won’t improve the viewing experience for you personally, or even if it seems like a tedious job to you, rest assured you’ll be helping a lot of other people to enjoy the same content that you do. It will increase each video’s potential audience significantly, which in turn could lead to more subscribers for that channel. It’s not often that you get to give something substantial back to a Youtube channel that you enjoy, so it’s a good opportunity.
Likewise, if any other Youtubers I follow want a bit of help with subtitling a clip or two, I’d be happy to take a look if it helps you out a little bit. I can’t do loads, obviously, but I’m willing to look at the occasional one or two here and there. Every little helps, as a certain supermarket chain will tell you. And on that note, I’ll leave it there, this post is more than long enough. Thanks for reading, and thanks again to Emily for the shoutouts! 🙂