From Julia Child to mobile devices: How one group cooks up accessible media technology

The National Center for Accessible Media helps develop tools, standards and policies so people with disabilities can access new technology and content

ITworld | Mobile & Wireless, accessibility, Android

Media Access Mobile

Botkin and NCAM’s Manager of Business and Sales, Peter Villa, took me into their lab to demonstrate and talk about one of their latest accessible technologies, Media Access Mobile (MAM). In a nutshell, MAM is a complete system for displaying closed captions, foreign language subtitles and DVS on mobile devices. The vision, said Villa, is to use MAM in “a bizarre space like a museum, that’s not traditional, for video artists who don’t want open captions to ever grace their image.” But it could also potentially be applied to other places where captions aren’t traditionally available. “We get a lot of interest from live theater people,” said Villa. “They thought there could be some application in their space.”

MAM was first used publicly at IBM’s THINK Exhibit in New York City in September 2011. “It was for... a big multimedia exhibit. They needed a way to share languages and captions over a localized wi-fi network,” said Villa. “Closed captions have been around for a long time and we’re always looking for ways to repurpose them. We got to thinking about that data and streaming it, over the web,” said Botkin. The solution was to display the captioning and description data in the browser of mobile devices, so no special app or software is required on the part of the end user.
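The client-side idea described above can be sketched simply. This is an illustrative model, not NCAM’s actual code: assume caption cues arrive at the browser as (start, end, text) records over the local wi-fi network, and the client picks whichever cue is active at the current playback time.

```python
# Illustrative sketch (not NCAM's actual implementation): a caption
# lookup of the kind a browser client would do, modeled in Python.
# Cues are (start_sec, end_sec, text) records; the client displays
# whichever cue covers the current playback time.

def active_cue(cues, t):
    """Return the caption text active at time t (seconds), or None."""
    for start, end, text in cues:
        if start <= t < end:
            return text
    return None

cues = [
    (0.0, 2.5, "Here's looking at you, kid."),
    (2.5, 5.0, "Play it, Sam."),
]

print(active_cue(cues, 1.0))  # Here's looking at you, kid.
print(active_cue(cues, 6.0))  # None
```

Because the lookup is just timed text, the same cue stream can carry captions, subtitles in any language, or description text, which is what makes the multi-threaded streaming mentioned below possible.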

MAM was based on existing tools and utilities, some of which had been prototyped earlier during a project for the Whitney Museum. But to bring it all together, additional development work was required. While NCAM already had technology to support the streaming of captions, additional work was needed to support the multi-threaded streaming of captions, descriptions and subtitles. 

An additional complication in the development of MAM was the need to serve captions and descriptions for media that didn’t have timecodes built in. “After speaking to some museums we realized that a lot of their artists are just delivering a DVD for a multimedia exhibit. They just burn something on a disk and say ‘Here it is.’ How do you synchronize something with that?” said Botkin. The solution was to use an off-the-shelf DVD player that outputs a relative time code, and to serve captions based on that code. 
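The relative-timecode workaround can be sketched as follows. This is a hypothetical model of the idea, not NCAM’s code: the DVD player reports only elapsed time since playback began, and caption timestamps are authored against that same zero point, so synchronization reduces to tracking one offset.

```python
# Illustrative sketch: syncing captions to a relative timecode, as
# described for DVDs with no embedded timecode. The player's clock
# starts at zero when playback begins; captions are timed against
# that same zero point.

import time

class RelativeClock:
    """Tracks elapsed playback time, like a DVD player's relative timecode."""

    def __init__(self):
        self.start = None

    def play(self, now=None):
        # Record the moment playback begins (timecode zero).
        self.start = time.monotonic() if now is None else now

    def elapsed(self, now=None):
        # Seconds since playback began; feed this to the caption lookup.
        current = time.monotonic() if now is None else now
        return current - self.start

clock = RelativeClock()
clock.play(now=100.0)            # playback starts
print(clock.elapsed(now=103.5))  # 3.5
```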

After about a two-month coding push, MAM was ready for its public debut last fall.


Media Access Mobile

ITworld/Keith Shaw

In the NCAM lab, Botkin and Villa showed me a working version of the system that grew out of that work. As we talked, Casablanca played on a big-screen TV, being made accessible via MAM simultaneously to deaf, hard-of-hearing and visually impaired users, as well as to viewers comfortable in a range of languages other than English, all via a mobile device. While it was at it, MAM was also driving Rear Window Captioning in the lab. For visually impaired users with an iPhone, audio descriptions can be read aloud by VoiceOver.

In terms of technical specs, MAM requires a server to run its software, which can be an inexpensive PC. In the NCAM lab, they ran it off a $250 netbook. A dedicated wi-fi network is also required, so a wireless router is needed. The MAM software consists of a .NET application that reads the caption, description and subtitle data and passes it to the caption server, which is written in Java. The captioning data itself is read from a file, so no database is required, though, as Botkin said, “If you wanted to caption multiple exhibits from it, it might be best to integrate it with a CMS.” The mobile code to display the captions is JavaScript and HTML5.

Normally, precomposed captioning/descriptive data files are required for MAM. These files, if not already available, can be created by WGBH’s Caption Center (much as they do for TV and movies). However, MAM can also support the display of live captions, generated on the fly by stenocaptioners. 

MAM can serve captions and descriptions based on a supplied SMPTE timecode, a pseudo-time code (such as a relative one from a DVD player) or even be manually driven, if need be. “If you’ve got something so free-form that the timing changes from performance to performance, you could have someone drive it from, say, a lighting board,” said Botkin.
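For the SMPTE case, serving captions amounts to converting the supplied timecode into seconds and looking up the matching cue. A minimal sketch, assuming a non-drop-frame timecode at a known frame rate (the article doesn’t describe MAM’s actual handling):

```python
# Illustrative sketch: converting an SMPTE timecode (HH:MM:SS:FF) to
# seconds so captions can be looked up against it. Assumes a
# non-drop-frame timecode at a known frame rate; MAM's actual
# timecode handling isn't documented.

def smpte_to_seconds(tc, fps=30):
    """Convert 'HH:MM:SS:FF' non-drop-frame timecode to seconds."""
    h, m, s, f = (int(part) for part in tc.split(':'))
    return h * 3600 + m * 60 + s + f / fps

print(smpte_to_seconds("01:02:03:15"))  # 3723.5
```

A pseudo-timecode or a manual trigger (Botkin’s lighting-board example) would simply feed this same lookup from a different clock source.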

The browser display can be customized to look how the client wants or, as Botkin said, “It could be wrapped into a more complicated end user device that has wayfinding integrated into it,” which would be of extra use to the blind. 

As for the total cost of implementing the system, Villa says that, soup-to-nuts, including consulting, captioning and hardware, MAM can be up and running for less than $8,000. “We would hope closer to $5,000,” said Villa. For that money, says Botkin, “You get closed captions, and you get multi-language subtitles and you can do DVS. You get a lot of bang for the buck.” 

To put it another way:

PC to run MAM: $250
Wireless router: $25
Displaying “Here’s looking at you, kid” in Hebrew on my iPhone: priceless
