Google Search gets smarter. But is it better?

Do we want Google stereotyping us, and giving us a version of the Internet based on that stereotype?


Google this week quietly improved search input, spell check, and auto-correct by applying available contextual knowledge to results.

Google is now offering region-specific versions of Google Suggest. So, for example, typing in "sub" might prompt Google Suggest to offer subway fare and station information in New York City, submarine base info in New London, and sandwich chain stores in Dallas.

Google also says people often misspell names but add contextual information when searching for people. A user might misspell "Elgan" as "Elgin" yet include "brilliant blogger" in the query, which lets Google immediately identify the target even with the name misspelled. Such contextual clues can also be combined with location data to narrow down the right result.

Google also says it will improve spelling auto-correction by eliminating the suggestion step. Currently, if you misspell a word, Google asks at the top of the results page, "Did you mean...," followed by the suggested correct spelling. Going forward, it will simply assume it knows the correct answer and include search results for the corrected query. Google says it will only show corrected results when it is "highly confident" you misspelled accidentally.
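The policy described here can be sketched as a simple decision rule: auto-apply a correction only when confidence clears a high bar, otherwise fall back to a "Did you mean...?" suggestion. This is a minimal illustrative sketch, assuming a corrector that already scores candidate spellings; the function name, threshold, and scores are hypothetical, not Google's actual implementation.

```python
# Hypothetical confidence-thresholded auto-correction, assuming a spelling
# model has already scored candidate corrections for the query.

def correct_query(query, candidates, threshold=0.9):
    """Return (query_to_run, suggestion_to_show).

    candidates maps each possible correction to the model's confidence
    (0.0 to 1.0) that the user intended it.
    """
    if not candidates:
        return query, None
    best, confidence = max(candidates.items(), key=lambda kv: kv[1])
    if confidence >= threshold:
        # "Highly confident": search the corrected query directly,
        # with no suggestion banner.
        return best, None
    # Not confident enough: run the original query and merely
    # offer the best candidate as a "Did you mean...?" suggestion.
    return query, best

# A badly misspelled query with one strong candidate is auto-corrected;
# an ambiguous one (it could be a name, per the "Elgin"/"Elgan" example
# above) keeps the original query and shows a suggestion instead.
print(correct_query("restarant", {"restaurant": 0.97}))
print(correct_query("elgin", {"elgan": 0.55}))
```

The threshold is the whole debate in miniature: set it high and fringe queries survive; set it low and everyone gets funneled toward the statistically likely query.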

These changes make search smarter, but do they make it better?

Are we starting to head down a road where everyone has his or her own personal Internet? It's easy to come up with examples where smart search and spell-check results would definitely be helpful. But isn't there some value in a search engine that makes no assumptions about what you're looking for?

The late Carl Sagan, the astronomer, astrophysicist, and author, used to tell the story of how, as a child, he asked a librarian for a book about stars. She returned with a book about celebrities rather than one about the celestial fireballs the future astronomer was really interested in.

Isn't that what Google's increasingly smart search engine does? Make assumptions about us based on what it knows about average people? Is this a dumbing down of search? Doesn't it channel the fringe into the mainstream?

Little Carl Sagan is just some kid, so there's no way he'd be interested in astronomy, right?

When some other kid types in "bart," Google's smart search will return all kinds of links related to the cartoon character Bart Simpson. But what if he's really interested in Portuguese explorer Bartolomeu Dias?

What if an entrepreneur is trying to brand a new smoothie company with a deliberately misspelled name, such as "jooce"? Will Google just automatically redirect to competitor Jamba Juice?

These are probably bad examples. The point is that there may be dangerous and unintended outcomes we cannot foresee.

Do we want a non-objective search engine -- especially if users believe it to be objective? Do we want Google Search to make assumptions for the convenience of average users at the expense of non-average users?
