I am in no doubt that Ofcom’s work on measuring live subtitling quality is a very good thing.

There is a need to improve accuracy, reduce latency, cut the volume of transmission faults, enhance the presentational style and minimise the number of pre-recorded programmes that carry live subtitles. It’s also worth saying that every subtitler, every subtitling company and every broadcaster that I’ve ever worked with supports all of these aims, and they spend time and money trying to achieve them. We all want to make the UK’s record in this area demonstrably world-leading, and Ofcom’s effectiveness as a regulator plays a part in achieving that goal.

It’s important to understand what Ofcom’s most recent report represents. It is a set of statistics describing the accuracy, speed and latency of subtitles on a small sample of live TV. It is the first of four reports which will be used to inform future policy in this area. It is not, at this stage, a judgment on good or bad; it is a snapshot of what is. It looks at 11 hours of TV. To put that in context, Red Bee alone subtitles about 165 hours of live TV output every day in the UK. As these reports are six-monthly, the Red Bee samples represent 0.0002% of our total live output in that period.

That’s not to say the results are misleading, just that it is impossible to make judgments or draw conclusions about practices at this stage. The press reaction focused on the latency issue, highlighting the fact that the median latency in this sample was 5.6 seconds, well above “the maximum recommended 3 seconds in Ofcom’s guidelines”. So what conclusion should we draw from this? That subtitlers are working too slowly and we need to crack the whip? That broadcasters are wilfully ignoring Ofcom’s guidance? Or that the 3-second recommendation is an analogue-era bit of wishful thinking that does not reflect the current reality of live subtitling, and the long and complex process of turning audio into text and displaying it on a digital TV screen?

Having said that, I am enormously optimistic about the challenges above. Ofcom’s report (on page 26) highlights the “considerable improvement (in accuracy) from the last analysis…and it bears witness to the effort made by access service providers and subtitlers to increase the quality of live subtitles on TV, despite the challenging nature of the job”. Quite right too, and this effort continues day in, day out. Accuracy will continue to get better. The other issues – transmission losses, presentation and pre-recorded programmes done live – all require collaboration between different parts of the TV production and distribution chain, but this collaboration is happening. I’m confident that, now we have a baseline, the next three reports will show progress in each of these areas. I should caution that, in my opinion, latency is the most intractable of these issues, but I know that a lot of thought is going into how incremental improvements can be made, and I would expect to see progress in this area as well.

So I don’t think the reaction at this stage should be criticism or disappointment. Along with Ofcom, we should all gather more evidence, seek to understand the issues as best we can and redouble our existing efforts to implement improvements. We should also applaud the progress on accuracy made by subtitlers, and the continuing willingness of many UK broadcasters to go above and beyond the regulations in the volume of subtitling they provide and to remain open to scrutiny and challenge on the subject of quality. The frustration of poor-quality subtitling, whatever the cause, is well understood and keenly felt by those involved in the industry, and there continues to be real determination across the board to tackle it. Ofcom’s reports will undoubtedly help to focus minds and encourage progress.

David Padmore, Director, Access Services.