Ofcom seminar about usable and accessible smartphone and TV apps

11 June 2013

Apps are an increasingly important way in which consumers use equipment such as smartphones, tablets and internet-connected TVs. This holds true generally, but some people stand to benefit particularly from these developments in technology.

Ofcom has a duty to encourage the availability of easy-to-use consumer equipment, and we have stretched this definition to include apps given their growing importance. On 17 April 2013 we organised an event, hosted by BT, to talk about usable and accessible apps. The event was chaired by Jo Connell, Chair of the Communications Consumer Panel, and attended by app developers, TV equipment manufacturers, usability and accessibility specialists and organisations representing older and disabled people.

Our BT host, Liz Williams, said that more than half of UK adults are now using a smartphone, nearly a fifth are using tablets and more and more people are choosing to access live TV or on-demand content via an app.

Chris Taylor, Director of Consumer Policy at Ofcom, said that discussions with mobile network operators and other interested parties had thrown up a key theme: apps are increasingly important. It is no longer enough to have an accessible handset, because it is what you do with the handset that matters. And in broadcasting, both live TV and video on demand are increasingly being delivered by apps.

All the speakers' slide presentations are listed under 'Related documents' at the end of this page.

Robin Christopherson, AbilityNet

Robin spoke about the blurring of the distinction between usability and accessibility, and the importance of making accessibility a core feature rather than a bolt-on. He praised Apple for its inclusive design: the iPhone has high contrast, larger text and built-in drivers for digital hearing aids and braille displays, and everything you can do on the touchscreen can also be done using a keyboard. Features designed for disabled people can bring wider benefits; for example, turning on the screen curtain, which blanks the display for screen-reader users, doubles the battery life. And many technologies that benefit blind people help everyone, for example being able to use Siri while driving.

Robin demonstrated Fleksy, which enables someone to type accurately on a virtual keyboard even if they can't see it, and Talking Goggles, which correctly identified various objects including a tin of Vaseline, a debit card and a picture of the Eiffel Tower.

Smartphones were the next big step after PCs, and can also replace some high-cost specialist devices designed for disabled users. The next big step may be wearable technology that intelligently observes its surroundings and pushes useful information to the user, for example that there are road works in the street. A camera could also be useful to someone on the autistic spectrum, because it could 'read' people's facial expressions and provide feedback, for example if their date is feeling uncomfortable.

Robin shared links to all the apps and videos he demonstrated.

John Paton and Paul Porter, RNIB

John and Paul's presentation was about connected TVs, companion apps and video on demand. John spoke about the need for open application programming interfaces (APIs) to allow different programs to talk to each other. A few models of television come with built-in text-to-speech, but with an open API you can also use the text-to-speech built into a smartphone or tablet. An API can allow the user to change the channel, replicate the menu of options on a secondary device, and access from a smartphone any information the TV already has, such as the electronic programme guide (EPG). Open APIs also allow third parties to add value to a product at no cost to the original developer. RNIB's challenge to companies that prefer to keep their APIs in-house was that doing so means taking responsibility for the accessibility of the device, either by building accessibility in or by providing it through a companion app.
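
As a rough illustration of how a companion app might use such an open API, here is a short Swift sketch. The address, endpoint and JSON shape are invented for the example and are not taken from any real television's API; the point is that once the TV exposes its functions, the phone's own accessible interface, including its built-in text-to-speech, can drive them.

    import Foundation

    // Hypothetical companion-app client for a TV exposing an open HTTP API
    // on the home network. Address, path and JSON shape are assumptions
    // made up for this sketch, not a real product's interface.
    struct TVRemoteClient {
        let baseURL = URL(string: "http://192.168.1.20:8080")!

        // Ask the TV to change channel, returning the programme name from
        // its EPG so the phone can speak it with its own text-to-speech
        // rather than relying on the TV's display.
        func setChannel(_ number: Int) async throws -> String {
            var request = URLRequest(url: baseURL.appendingPathComponent("channel"))
            request.httpMethod = "POST"
            request.setValue("application/json", forHTTPHeaderField: "Content-Type")
            request.httpBody = try JSONEncoder().encode(["number": number])

            let (data, _) = try await URLSession.shared.data(for: request)
            let reply = try JSONDecoder().decode([String: String].self, from: data)
            return reply["programme"] ?? "Unknown programme"
        }
    }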

John set out some things that make an app accessible, including

  • Good size and contrast
  • Meaningful names - calling a button BTN 1392 is not meaningful or memorable
  • Correct reading order, for example volume up next to volume down.

Paul, who is blind, gave a powerful demonstration of the difference between good and bad design. Using a well-designed app, he was able to find the time and platform of the next train from King's Cross to Stevenage in seconds. The other app he demonstrated, however, was almost impossible for him to use, not least because all the buttons were labelled 'button'.
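
The 'meaningful names' point maps directly onto labelling facilities the platforms already provide. As a minimal Swift/UIKit sketch (the controls and wording are invented for the example), this is all it takes to give a screen reader something better to announce than 'button':

    import UIKit

    // Give controls meaningful spoken names. Without an explicit label,
    // VoiceOver falls back to whatever it can infer, often just "button",
    // which is what made the badly designed app unusable.
    func configureVolumeControls(volumeUp: UIButton, volumeDown: UIButton) {
        volumeUp.isAccessibilityElement = true
        volumeUp.accessibilityLabel = "Volume up"           // announced instead of "button"
        volumeUp.accessibilityHint = "Increases the volume" // optional extra context

        volumeDown.isAccessibilityElement = true
        volumeDown.accessibilityLabel = "Volume down"
        volumeDown.accessibilityHint = "Decreases the volume"

        // Reading order follows the view hierarchy, so keeping volume up
        // and volume down adjacent there keeps them adjacent for screen
        // reader users too.
    }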

Gareth Ford Williams and Ian Pouncey, BBC

One of the best-known apps is iPlayer, and Gareth and Ian spoke about their work integrating accessibility into it. iPlayer can be used on around 600 devices, which makes integrating accessibility a challenge.

  • British Sign Language (BSL) content is broadcast as part of the relevant programme so no technical adaptations are needed for it to be available on iPlayer. However, the BBC created a content category on iPlayer, making it easy for users to find this content.
  • Subtitles are now available on iPlayer on over 90% of devices. However, there are new challenges with handheld devices, for example delivering legible subtitles on a phone.
  • Audio description has been the most challenging service to integrate, because on VOD it doesn't work in the same way as on broadcast television.

Because developers don't always have information about the devices people are using to view content, for example the screen size, it is necessary to create websites that respond and adapt.

The BBC's accessibility standards and guidelines were published in 2006 and are being updated. The BBC is doing more user testing and sharing the results; it is also rewriting its training courses and has a new course about web accessibility.

Ben Foster, Patient Services

Ben demonstrated the new app from www.patient.co.uk that enables GP appointments to be booked, prescriptions to be renewed and messages to be exchanged securely. Patient Access started off as a desktop website; however, feedback from GPs was that many patients didn't have home broadband or PCs but did have smartphones.

The app is mainstream, but is likely to be particularly valuable for people who have difficulty using the phone. It is free to download, and shows available appointments, which can be booked by clicking on them. Around half the GP practices in the UK have already signed up for this service.

Michael Day, BT

Ofcom has recently mandated major improvements to the text relay system used by deaf and speech-impaired people to use the telephone, and Michael spoke about the app that BT has commissioned to allow Text Relay to be used on smartphones and tablets.

Next Generation Text Relay (NGTR) will replace the current text relay service in April 2014. It will be available on a range of mainstream devices such as smartphones and tablets, rather than needing one party to have a textphone. Text-to-text calls, which currently require both parties to have a textphone, will also be possible via the app.

With NGTR both parties will be able to speak, hear (if they have some hearing), type and read text at the same time. There will be no need to flip between modes, so people who have understandable speech will be able to speak their end of the call and read the captions without having to switch modes. This should make calls much more natural.

NGTR will require a phone call, which could be fixed, mobile or VoIP, and an internet connection, which could be broadband, Wi-Fi or mobile internet. Whichever connection is used for the text, it will be the phone call that is charged, not the app. (However, although NGTR will work with VoIP, VoIP providers will not be obliged to provide it.)

The app will augment the telephone call by providing a simple text channel. Once configured to be associated with the user's phone number or numbers (e.g. home/work/mobile), it won't be necessary to use different logins for different numbers. The 'connect' button on the screen will connect the user to NGTR, and the service then looks to see if there is a live call for any of the associated numbers.

The app will have a pop-up numerical keypad. This will help compliance with payment card industry security standards, by allowing users to enter their own card numbers rather than having them repeated by the relay assistant. It will also be possible to store phrases in the app, so that regularly used text, such as your address, can be sent quickly. For 999 calls, the ability to locate the caller will be the same as on a voice call.

For outgoing calls from deaf people, announcements such as 'dialling' and 'ringing' will appear in the app. The associated phone call will be used just for voice.

For incoming calls, the deaf person's phone will ring (or flash or vibrate, according to how they have set it up). When the phone rings, the user opens the app, which checks whether there is a call via the relay service. If the app doesn't indicate a call, that is because it has not come via NGTR. If the app recognises an NGTR call, the user can pick up the phone. If the call is text-only it can be answered in the app and the phone call will be released; otherwise the phone call and the internet connection will both stay open.
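
For illustration only, the incoming-call decision described above might be sketched in Swift as follows; the type and function names are hypothetical, as the internals of the NGTR app were not shown at the seminar.

    // Sketch of the incoming-call logic; all names are hypothetical.
    enum RelayCallStatus {
        case none          // no live relay call for any registered number
        case textOnly      // answer in the app; the voice call is released
        case voiceAndText  // answer the phone; both connections stay open
    }

    func answerIncomingCall(status: RelayCallStatus) {
        switch status {
        case .none:
            // The call has not come via NGTR, so it is an ordinary
            // voice call: answer it on the phone as normal.
            print("Ordinary call: answer on the phone")
        case .textOnly:
            print("Text-only call: answer in the app; phone call released")
        case .voiceAndText:
            print("Relay call: pick up the phone and keep the app connected")
        }
    }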

Ideally, users will be able to use a single smartphone to make and answer calls and handle the text. All handsets that have been tested so far work when using Wi-Fi, but it may be that some older handsets do not work when using both a voice call and mobile broadband. BT is continuing to carry out testing, and will publish a list of which handsets can and can't do this. An API will be available; this will enable the NGTR app to work with other software including the BT SmartTalk app.

Michael confirmed that there was nothing to stop businesses and services with call centres from using the app if the call handler is in front of a screen and keyboard. This would remove the need for a relay assistant on the call, making it potentially quicker and more private. However, he was sure that the relay service would be needed for years to come.

[Ofcom note: existing users of Text Relay will continue to be able to use the service as they do now using textphones, but will need to use the service in the way described above in order to benefit from the enhanced features that are being introduced.]

Ben Shirley, University of Salford

Ben reported on research commissioned by Ofcom into how speech recognition could be integrated with VoIP and installed as a smartphone app.

The research was designed to assess whether automatic speech recognition was yet sufficiently well developed to make voice calls accessible to deaf and hard of hearing people. The rationale was to provide a text service for telephone calls where an intermediary wasn't necessarily wanted, for example conversations with family and friends. The speech recognition software had to be installed at the hearing party's end of the call.

The ideal speech recognition experience would be to walk up to a machine that hadn't been trained to your voice, speak and be understood, but it doesn't work like that yet. For anything more than simple commands there are issues with latency, which increase with the length of input.

At the outset of the project there was a draft ETSI standard which set an acceptable error rate for speech recognition of 10%. During the research, the performance of some 'best in class' speech recognition engines was assessed against that standard to see whether it was a useful benchmark. The final phase of the research, added in response to requests from participants, allowed them to use the software in their homes for three months.
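
For readers unfamiliar with how such error rates are scored: word error rate is conventionally the word-level edit distance (substitutions, insertions and deletions) between the recognised text and a reference transcript, divided by the length of the reference. The Swift sketch below assumes that convention; the seminar did not specify the researchers' exact scoring method.

    // Word error rate: word-level edit distance divided by the number of
    // words in the reference transcript. The scoring convention is this
    // sketch's assumption, not the researchers' published method.
    func wordErrorRate(reference: String, hypothesis: String) -> Double {
        let ref = reference.lowercased().split(separator: " ").map(String.init)
        let hyp = hypothesis.lowercased().split(separator: " ").map(String.init)
        guard !ref.isEmpty else { return hyp.isEmpty ? 0 : 1 }
        guard !hyp.isEmpty else { return 1 }

        // Standard dynamic-programming edit distance, counted over words.
        var d = Array(repeating: Array(repeating: 0, count: hyp.count + 1),
                      count: ref.count + 1)
        for i in 0...ref.count { d[i][0] = i }
        for j in 0...hyp.count { d[0][j] = j }
        for i in 1...ref.count {
            for j in 1...hyp.count {
                let cost = ref[i - 1] == hyp[j - 1] ? 0 : 1
                d[i][j] = min(d[i - 1][j] + 1,        // deletion
                              d[i][j - 1] + 1,        // insertion
                              d[i - 1][j - 1] + cost) // substitution
            }
        }
        return Double(d[ref.count][hyp.count]) / Double(ref.count)
    }

Against the draft ETSI figure, a transcript would pass if the rate is at most 0.10, i.e. an accuracy of at least 90%.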

The first issue identified in research with users was that speech recognition doesn't work on a word by word basis; it waits for the end of a sentence in order to make sense of it, so there is always an inherent delay. There is also a training requirement for the speech recognition engine.

Six readers with varying accents were used to evaluate the speech recognition engines, three male and three female. Each user trained each engine for 15-20 minutes.

Two pieces of text were used for evaluation. The first was a set of directions to the test venue: by following the recognised output you could work out exactly where you would have ended up, which gave a tangible representation of how accurate the engine was. The second piece of text was a fictional conversation with an aunt abroad.

With the first piece of text, for the first speech recognition engine the average accuracy was about 80% for Subjects 2-6 and 96% for Subject 1. For the second engine, the average was 85-86%, but Subject 1 achieved 95%. Across the three engines, Subjects 2-6 got 85-90%, with Subject 1 getting 96%, 95% and 99%.

Subject 1 was the one participant already accustomed to using speech recognition engines. The engines being tested weren't trained to his voice any more than to the other subjects', so the only explanation for the difference is his familiarity with the technology. The best results came where the user, as well as the speech recognition engine, was effectively being trained.

As a comparison, the researchers also tested the engines with voices to which they hadn't been trained, with very poor results. They also tested earlier incarnations of speech recognition engines: one that came with an earlier version of Windows was 30-40% less accurate than the later ones. The engines tested towards the end of the study were consistently scoring 85, 95 and 96 per cent.

Nine pairs of participants completed the trial in users' homes. Technical support was provided, including home visits, and data was gathered through focus groups, interviews and questionnaires. All participants were positive about the performance of the speech recognition engine, and all used the software several times a week, often for calls to friends and family in other countries. The biggest reason given for not using the software for more calls was that they didn't want to bother people by asking them to install it on their machines; the software has to run at the hearing person's end of the call.

Most users said they would like to use this as a web-based service, for example to contact NHS Direct. This would only require the software to be trained to the voices of one or two people in the relevant call centre, and an organisation that wanted the custom would have an incentive to do so. About half the users said they wouldn't use the service for banking, perhaps because of fears of accidentally transferring 10,000 instead of 10.

One additional participant didn't have a computer, so the team put the software on a relative's PC and installed the Google Talk app on his mobile phone. With Google Talk you can speak and type without switching mode of operation, and the set-up worked fine with Google Talk on the mobile phone and the Salford software at the other end of the call.

Overall, the software worked well for contact with friends and family, and there was support for using it to contact businesses and public services. We don't know when it will be possible to do the speech recognition on the mobile device itself rather than at the hearing person's end of the call, but devices are becoming faster and more powerful and the technology continues to improve.

Sabine Lobning, Mobile Manufacturers' Forum

The MMF is an association of mobile phone manufacturers, representing 80% of the handset market and a majority of the global network providers. The Forum works on health and safety, accessibility, preventing counterfeiting and the harmonisation of standards.

Sabine spoke about the expansion of the Global Accessibility Reporting Initiative (GARI) website to cover apps as well as handsets. The GARI website allows users to search for handsets with features appropriate for them, for example good compatibility with hearing aids. There are currently 700 handsets listed, and the template covers over 100 accessibility features. Information is available in 13 languages. Some countries require accessibility features to be reported, and the website is a tool for doing that. The website is free to use, both for developers and for users.

The website is being expanded to cover tablets and apps, and will be re-launched in July 2013. The MMF is keen to work with other organisations such as disability groups and regulators to publicise the site and the relaunch.

Panel discussion

The event closed with a panel discussion, touching on:

  • The need for open APIs
  • The 'internet of things'
  • Intellectual property rights
  • The take-up of smartphones by older citizens
  • The general state of play with web and app accessibility
  • The work done by Apple to make its products accessible
  • The fact that app developers are often very responsive to feedback from users and very often read the reviews of their products
  • The difference between apps for smartphones and for tablets
  • The benefits of user testing
  • Websites becoming more mobile platform-friendly.

Feedback on Twitter

Many participants used Twitter to post live comments. One participant, an iOS developer, tweeted:

  • Really humbling experience watching a blind person try to navigate his way around a lazily-written iPhone app #accessibleapps


  • It would have taken an hour to add the accessibility hints - something I'll do from now on #accessibleapps

Related documents

Seminar on accessible apps (PDF, 79.7 KB)

John Paton (PDF, 244.8 KB)

Gareth Ford Williams (PDF, 165.4 KB)

Draft BBC Mobile Accessibility Standards and Guidelines (PDF, 1.2 MB)

Ben Foster (PDF, 747.4 KB)

Michael Day (PDF, 1.0 MB)

Ben Shirley (PDF, 220.7 KB)

Sabine Lobning (PDF, 2.2 MB)