Linux leaps to supercomputer

Computerworld –

Houston-based Conoco Inc. announced last Wednesday that it has built and deployed a huge, new Linux-based supercomputer to analyze massive amounts of seismic data gathered in the process of exploring for oil and gas.

The Intel-based geophysical computer -- which boasts enough storage capacity to house the entire U.S. Library of Congress -- was built during the past two years, primarily by an 80-person internal information technology and engineering team headed by Alan Huffman, manager of the $27 billion energy company's Seismic Imaging Technology Center. The Linux-based system cost one-tenth the price of a conventional supercomputer.

The system has already been used to analyze seismic data from the North Sea and the Gulf of Mexico, where Conoco recently discovered oil and is drilling two deepwater wells.

Conoco isn't the first oil and gas company to implement its seismic software on Linux, according to Stacey Quandt, an analyst at Cambridge, Mass.-based Giga Information Group Inc. New York-based competitor Amerada Hess Corp. also implemented a Linux-based seismic research system on a 64-computer system from Round Rock, Texas-based Dell Computer Corp.

Still, the Conoco implementation is very significant in that it signals a "continuation of the trend" within the oil and gas industry of companies willing to run mission-critical supercomputing applications on "a commodity operating system," Quandt said.

The new supercomputing system integrates Linux and Intel Corp.'s cluster chip architecture with advanced tape robotics, 10 terabytes of hard-disk storage and Conoco's own proprietary seismic software.

The new system has been designed so that it's accessible from almost any Conoco substation via a company intranet. This task involved re-engineering Conoco's proprietary seismic software to operate in Linux with an XML-compatible, Java-based user interface.

"We jumped on Linux because it had the flexibility to customize to our needs. The software re-engineering is quite a significant component to switching over [from a conventional supercomputing system]," Huffman said.

"We've also designed the hardware so we can break away a minicluster of 43 or 64 CPUs, so a geophysicist can process data on-site," Huffman added. "If a scientist is sitting in central Asia and has a bunch of tapes to be analyzed, he can bring a minicluster and process it right there."

The tapes that geophysicists analyze contain sound waves recorded in the field and used to build an image of the subsurface of the Earth, similar to the way physicians use ultrasound data to build a physical picture of a body part.
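A toy sketch of that idea (illustrative only, with made-up numbers, and not Conoco's imaging software): many noisy recordings of the same echo are stacked so the reflection stands out, which is the basic step behind turning field traces into a picture of the subsurface.

    # Toy illustration of stacking: repeated noisy recordings of one echo are
    # averaged so the reflection emerges above the recording noise.
    import numpy as np

    n_traces, n_samples = 50, 200
    reflector_sample = 120                       # arrival time of the echo, in samples

    rng = np.random.default_rng(0)
    traces = rng.normal(0.0, 1.0, (n_traces, n_samples))  # recording noise
    traces[:, reflector_sample] += 5.0                     # the common reflection

    stacked = traces.mean(axis=0)                # stacking suppresses the noise
    print("strongest sample:", int(np.argmax(np.abs(stacked))))  # prints ~120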

"The bottom line is this: We control costs, we control the data, then we can focus the technical efforts of our geophysical team on developing the best imaging software necessary to make the very best decisions," Huffman said.

Last week, Prudential Securities Inc. in New York issued a report ranking Conoco first in exploration and production results for last year.
