high performance computing

  • Thanks to the NSA, quantum computing may some day be in the cloud

    Posted January 6, 2014 - 1:07 pm

    The NSA is spending some $80 million in basic research on quantum computing, money that may ultimately help commercialize quantum computing for the private sector.
  • HPCC Takes on Hadoop's Big Data Dominance

    Posted February 12, 2013 - 9:31 pm

    Hadoop is everywhere, but a strong competitor is making headway in the market. It's all thanks to years of production use and a solid big data pedigree that powers the billion-dollar LexisNexis database.
  • China moves to beat U.S. in exascale computing

    Posted November 20, 2012 - 4:36 pm

    U.S. efforts to develop the next-generation high performance computing platform are lagging, which could give China an opening to develop an exascale system first.
  • Exascale unlikely before 2020 due to budget woes

    Posted November 19, 2012 - 4:57 pm

The U.S. Dept. of Energy, which builds the world's largest supercomputers, is now targeting 2020 to 2022 for an exascale system, two to four years later than earlier expectations.
  • High-performance computing turns to apps to cut cost and frustration

    Posted November 14, 2012 - 12:44 pm

    Steve Jobs was right about apps in more ways than perhaps he ever knew. The concept of using apps to make software easily available and affordable to large numbers is arriving in high performance computing.
  • 2010's GSA conference scandal hits SC12

    Posted November 14, 2012 - 12:43 pm

GSA excesses at a Las Vegas conference two years ago prompted a cutback in U.S. government travel budgets -- and that means fewer federal researchers at this year's supercomputing conference.
  • Amazon hails era of 'utility supercomputing'

    Posted March 11, 2012 - 8:18 am

    Cloud computing giant Amazon Web Services is heralding the era of utility supercomputing, whereby massive computational resources and storage requirements can be accessed on demand.
  • U.S. HPC lead in danger

    Posted December 6, 2011 - 5:00 pm

    The DOE's exascale program is stalled, and its prospects appear bleak at a time of budget-cutting fever.
  • Breakthroughs bring the next two major leaps in computing power into sight

    Posted September 9, 2011 - 1:15 pm

    Modelling the unimaginably complex environment of a quantum computer, and figuring out how to make graphene work just like silicon while still being faster, lighter and thinner, bring far closer to practicality two stages of computer development that until now have been theoretical.
  • IBM finds a way to completely ruin computers

    Posted August 19, 2011 - 12:11 pm

    IBM's SyNAPSE processors are designed to mimic the human brain's ability to analyze several streams of data at once and switch topics quickly, a huge advantage for systems designed for long-term monitoring, but carried too far HEY, LOOK, A BIRD!!
  • Clemson IT team embraces call to be entrepreneurial

    Posted August 15, 2011 - 6:41 am

    Five years ago Clemson University named James Bottum chief information officer and gave him the mandate to overhaul the school's IT infrastructure and build out a high performance computing environment. The goal: catapult the school into a leading research university and help attract faculty and students.
  • Russia steps up game in supercomputing

    Posted July 25, 2011 - 6:07 am

    Russia's profile in supercomputing is being raised thanks to a Moscow-based company and a Russian president who sees high-performance computing as critical to the nation's future.
  • JP Morgan supercomputer offers risk analysis in near real-time

    Posted July 12, 2011 - 9:45 am

    JP Morgan is now able to run risk analysis and price its global credit portfolio in near real-time after implementing application-led high-performance computing (HPC) capabilities developed by Maxeler Technologies.
  • IDC: High-performance computing poses challenges for SMBs

    Posted June 20, 2011 - 12:22 pm

    Cloud computing could help boost the use of high-performance computing (HPC) among small and medium-size businesses, but there are hurdles that have to be overcome before that can happen, IDC said on Monday during a presentation at the International Supercomputing Conference in Hamburg, Germany.
  • IBM pitches HPC load balancing as high-performance cloud

    Posted June 9, 2011 - 12:59 pm

    IBM is pitching 'cloud' management software for high-performance computing that looks like a great HPC workload balancer, but suffers from being labelled with the wrong buzzword.
  • Wave of Big Data could swamp corporate IT

    Posted March 10, 2011 - 2:59 pm

    Big Data requires more than just lots of storage. Data sets that big break every app they run through, bog down networks and servers, and raise IT's Excedrin bills.
  • Thunderbolt hard drives should make data-transfer scream

    Posted March 1, 2011 - 12:52 pm

    LaCie, Western Digital and others are working on products using the screaming-fast Thunderbolt data-transfer technology, but they won't ship until summer.
  • Microsoft offers "HPC" on Azure

    Posted November 18, 2010 - 7:04 pm

    Proteomic research is huge and complex, but many of the calculations that enable it are small statistical computations well suited to massively parallel processing across clusters of smallish PC hardware.
  • HPC experts look forward to exascale

    Posted November 18, 2010 - 6:15 pm

    Do you have any idea how fast you could play COD Black Ops on that?
