Listening to the way people in the world of technology talk about Alan Turing, it's difficult to believe that the English computer scientist isn't more of a household name.
"The man challenged everyone's thinking," says Vint Cerf, Google's chief Internet evangelist, in an interview with Network World. "He was so early in the history of computing, and yet so incredibly visionary about it."
Cerf -- who is president-elect of the Association for Computing Machinery and general chair of that organization's effort to celebrate the upcoming 100th anniversary of Turing's birth on June 23 -- says that it's tough to overstate the importance of Turing's role in shaping the world of modern computing.
"Alan had such a broad impact on so many aspects of computer science," says Cerf. "The deep notion of computability is so fundamental to everything we do in computing."
"People ... have done computing for thousands of years," says Moshe Vardi, a distinguished computer science professor at Rice University who is working closely with Cerf on the upcoming ACM celebrations. "But the theory of computing really started in the 20th century, and Turing is one of the foremost -- if not the foremost -- parents of the theory of computing."
Both Vardi and Cerf -- who are influential figures in computer science in their own rights -- cited the idea of computability, or whether a problem can be solved by an algorithm at all, as a foundational concept for the development of modern computers.
"Businesses don't even realize how much they rely on [the idea]," says Vardi. "Today, when you have people doing algorithmic trading on Wall Street, they are following in Turing's footsteps. ... There is a line of development, each one following the other, that led from the question 'what is computable?' to the world you see around us today."
Who was Turing?
Alan Turing was born on June 23, 1912, in London. After studying at King's College, Cambridge, and becoming a fellow at the age of 22, he did some of his most important conceptual work in inventing what he called the "a-machine," better known to the world as the Turing machine. This hypothetical device -- which reads and writes symbols on a tape of theoretically infinite length, acting on them according to a built-in table of rules -- is crucially important to the development of computational theory.
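The idea is simple enough that it can be captured in a few lines of code. The sketch below is a hypothetical illustration (the function and program names are our own, not Turing's notation): a transition table maps the machine's current state and the symbol under the head to a symbol to write, a head movement, and a next state.

```python
# A minimal sketch of a single-tape Turing machine simulator (illustrative only).
# The program maps (state, symbol) -> (symbol to write, head move, next state).

def run_turing_machine(program, tape, state="start", blank="_", max_steps=10_000):
    """Run the machine until no rule applies (halt) or max_steps is reached."""
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = tape.get(head, blank)
        if (state, symbol) not in program:  # no matching rule: the machine halts
            break
        write, move, state = program[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    # Read the written portion of the tape back as a string, left to right
    return "".join(tape[i] for i in range(min(tape), max(tape) + 1)
                   if tape.get(i, blank) != blank)

# Example program: flip every bit, moving right until a blank cell is reached
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
}

print(run_turing_machine(flip, "1011"))  # -> 0100
```

Despite its simplicity, this model can express any computation a modern computer can perform -- which is exactly why it became the foundation of the theory of computability.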
The work for which Turing is probably best known to the public, however, is his central role in cracking German military codes during World War II.
Turing's earlier research stood him in good stead at the ultra-secret Government Code and Cypher School, located at Bletchley Park. Along with contributing enormously to the war effort by providing detailed intelligence on German communications, his work at GC&CS presaged the development of the rudimentary computers he would design after hostilities ended.
While he had already created a digital multiplier during a spell at Princeton University before the war, the design of the Automatic Computing Engine, or ACE, was to have far greater effects on the development of the computer -- providing a basis for a whole generation of devices, including the Bendix G-15.
"The very first machine that I ever got to really program was called a Bendix G-15 computer," recalls Cerf.
Turing went on to make many other important advances in several fields. The well-known Turing Test -- which holds, broadly, that a machine which is indistinguishable from a human in normal conversation can be said to have achieved artificial intelligence -- is his work, as is a well-known hypothesis on pattern formation in biology.
His contributions, however, were cut short by his untimely death. Turing had been persecuted for his homosexuality by the British government, and agreed to undergo chemical castration rather than face jail after being convicted of gross indecency in 1952. He was found dead, the victim of apparently self-inflicted cyanide poisoning, on June 8, 1954.
In 2009, then-U.K. Prime Minister Gordon Brown issued an official apology to Turing on behalf of the British government.
Nevertheless, Turing's memory is the driving force behind a landmark celebration in the scientific and computing communities. While a Turing Award -- "the Nobel Prize in computing," according to Vardi -- has been handed out every year since 1966 by the ACM, this year's presentation, on June 16, will be different (See "Judea Pearl, a big brain behind AI, wins 2011 Turing Award").
The group, Vardi says, is trying to bring all living Turing Award winners together in celebration of Alan Turing's life and work. The event will be held June 15-16 at the Palace Hotel in San Francisco.
He predicts that the group will have about 32 Turing Award winners on hand, which he described as "an amazing opportunity." In order to ensure that attendees have a chance to hear from all of the award winners, several panel discussions featuring multiple prize winners will be held.
The response has been well beyond what the ACM initially expected. Vardi says that the group estimated they would get roughly 250 or 300 people to attend; there are now more than 1,000 attendees registered.
"We kept going to the hotel, 'We need more room! We need more room!'" he says.
Nor are the ACM's celebrations the only ones of their kind. The University of Manchester is planning its own conference for June 22-25, and a 100th birthday party will be held in Cambridge on June 16 at Turing's alma mater, King's College, among many more.
Part of the reason for the massive scope of the celebrations, according to Vardi, is the nature of the 100-year anniversary.
"Now, with the perspective of 100 years, we understand how important [Turing's work] was in a way that, even 20 years ago, we couldn't understand," he says.
Email Jon Gold at email@example.com and follow him on Twitter at @NWWJonGold.
This story, "Tech world preps to honor 'Father of Computer Science' Alan Turing, as centenary nears" was originally published by Network World.