Hollywood's elite data centers must deliver star IT performances

By John Brandon, CIO

One of the great challenges for Technicolor is that, on any given movie, the work is spread across geographic areas. Sound work might be completed in London, for example, while the final rendering is done in Los Angeles. As a result, the distribution network must run smoothly, with no dropped frames. This is compounded by the fact that there are often many versions of the same film--producers and directors want to compare different versions of scenes and choose the one they like.

As you can imagine, the content for upcoming films also has to be secure. Technicolor uses a secure transmission system from Aspera to make sure no one can steal, say, the third installment of the Dark Knight franchise. Instead of being audited by a banking regulator, Technicolor opens its IT infrastructure to inspection by studios, who make sure individual workers can't offload a film to a personal drive or transmit files over the network without strict clearance.

One lesson for any IT shop, says Davis, is to let broadband carriers bid on services. This has the distinct advantage of lowering costs and providing flexibility for projects. Davis also takes the need for low latency very seriously. A single movie can run to 8 TB per file--double that for a 3D movie. Where you locate your data centers has a major impact on latency. (Fortunately, in Los Angeles, there is plenty of dark fiber that Technicolor can tap at any time for extra speed.)
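Those file sizes make the bandwidth question concrete. A rough sketch of the math: the 8 TB and 16 TB figures come from the article, while the link speeds and the 80% usable-throughput assumption are illustrative, not Technicolor's actual numbers.

```python
# Back-of-envelope: how long does a full film file take to move
# between sites? The 8 TB (2D) and 16 TB (3D) sizes are from the
# article; link speeds and the 0.8 efficiency factor are assumptions.

def transfer_hours(size_tb: float, link_gbps: float, efficiency: float = 0.8) -> float:
    """Estimated wall-clock hours to move size_tb terabytes over a
    link of link_gbps, at the given fraction of usable throughput."""
    size_bits = size_tb * 1e12 * 8              # decimal TB -> bits
    usable_bps = link_gbps * 1e9 * efficiency   # usable bits per second
    return size_bits / usable_bps / 3600

for size in (8, 16):                 # 2D film, 3D film
    for link in (1, 10, 100):        # assumed link speeds in Gbps
        print(f"{size} TB over {link:>3} Gbps: {transfer_hours(size, link):6.1f} h")
```

At an assumed 1 Gbps, an 8 TB file takes roughly a day to move, which is why dark fiber at 10 or 100 Gbps matters for this kind of workflow.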

Technicolor also uses fabric virtualization technology from Xsigo. This lets the company connect its data centers as though over one unified connection rather than 40 separate ones, which simplifies provisioning and reduces the number of hardware switches and ports. The network linking all of the data centers is thus defined primarily in software, keeping transmissions smooth.

Livestream: Boosting Scalability with 15 PB NAS Filers

Livestream, as its name implies, is known for live streaming events. The company can process many terabytes of data for a single event, such as Whitney Houston's funeral. Livestream also handles user authentication and other back-end responsibilities. In many cases, the level of interest surprises the company--for example, 100,000 users signed on almost immediately, with no prior warning, to watch Houston's funeral.


Nicholas Tang, vice president of development operations at Livestream, says the data processing needs can be jaw-dropping. The site might have 200,000 concurrent users viewing the same stream, and the data center has to process that material at speeds up to 1 Gbps to keep the video smooth. In most industries, IT executives can manage perceptions of data accuracy and work around downtime quietly--but in the video world, everyone notices a dropped frame.
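The fan-out implied by those numbers is easy to estimate. The 200,000 concurrent-viewer figure is from the article; the per-viewer bitrates below are assumptions, chosen as typical of SD-to-HD web streams, not Livestream's actual encoding settings.

```python
# Rough aggregate-egress estimate for a live event. Viewer count
# is from the article; per-viewer bitrates are assumed values.

def aggregate_gbps(viewers: int, bitrate_mbps: float) -> float:
    """Total egress bandwidth in Gbps to serve `viewers` simultaneous
    streams at `bitrate_mbps` each."""
    return viewers * bitrate_mbps / 1000  # Mbps -> Gbps

for bitrate in (0.5, 1.5, 3.0):  # assumed Mbps per viewer
    total = aggregate_gbps(200_000, bitrate)
    print(f"200,000 viewers @ {bitrate} Mbps each -> {total:,.0f} Gbps egress")
```

Even at a modest assumed 1.5 Mbps per stream, serving 200,000 viewers means hundreds of Gbps of egress, which is why the origin data center processes the feed once and a distribution tier handles the fan-out.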


Originally published on CIO.