Even the Smithsonian is foggy on mobile-app privacy

Leafsnap identifies plants using the iPhone camera, sends photos to a central database

You don't usually think of the Smithsonian Institution as being particularly digital, or particularly contemporary, despite its efforts to promote its association with both Stephen Colbert and The Simpsons.

Online you're more likely to find The Mystery of the Singing Mice or Who Had the Best Civil War Facial Hair? than something that reflects the tremendous change in communications, art and computing over the last 10 or 15 years.

Various groups within the institution are trying, however – in one case so successfully that it makes the whole institution seem as current as the most recent iPhone scandal.

And even though some of those efforts seem a little behind the times, others hit the institution's educational mission and the capabilities of contemporary technology just perfectly.

Ten Unforgettable Web Memes, for example, seems more like the kind of thing you'd see on Cracked.com (and not as one of that site's surprisingly funny bits) than something to be heavily promoted as clickbait on Smithsonian.com.

It makes Smithsonian magazine seem a little bit desperate, though it does make one nostalgic for the mild shame of having once laughed at I Can Has Cheezburger? or thought Rickrolling was less stale than the song into which it has breathed unnatural life.

The iPhone app Leafsnap, on the other hand, hits both of the Smithsonian's targets – the need to educate, and the need to do it in ways that will engage a population more eager to see fictional displays come to life in Night at the Museum II (http://www.imdb.com/title/tt1078912/) than to see the real ones in person.

Created by a group of programmers and botanists at Columbia University and the University of Maryland, Leafsnap takes algorithms developed for facial recognition and applies them to images of plants, making it easier to put a name to the grasses, bushes, trees and other flora that are easier to identify as obstacles on your hike, or as things to mow on Saturday, than as individual species.

It uses the iPhone's camera to take an image of the plant, then compares it to the shapes of others in its database before listing possible identifications ranked according to the closeness of the match.
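In spirit, that kind of ranked matching can be sketched in a few lines. The example below is a toy illustration only, not Leafsnap's actual code: it assumes each leaf outline has been boiled down to a handful of numbers (a "shape descriptor"), and the species names and values are invented.

# Toy illustration of ranking leaf matches by shape similarity.
# The descriptors and species data are made up; the real app uses
# far more sophisticated curvature-based features.
import math

# Hypothetical "database": species name -> shape descriptor.
LEAF_DB = {
    "Sugar maple":    [0.82, 0.31, 0.55],
    "White oak":      [0.47, 0.66, 0.29],
    "American beech": [0.58, 0.22, 0.74],
}

def distance(a, b):
    """Euclidean distance between two shape descriptors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def rank_matches(photo_descriptor, db=LEAF_DB):
    """Return candidate species, closest match first."""
    scored = [(distance(photo_descriptor, desc), name)
              for name, desc in db.items()]
    return [name for _, name in sorted(scored)]

# Descriptor extracted from the user's photo (hypothetical values).
print(rank_matches([0.80, 0.30, 0.60]))  # ['Sugar maple', ...]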

Originally designed as a tool for experts, Leafsnap has been adapted with data from Smithsonian’s National Museum of Natural History as a potential replacement for field guides used by non-scientists.

The free app has images and identification data for trees in the Northeast, but will eventually include data covering the rest of the country as well.

It may also help scientists track the changing patterns or locations of individual species. In addition to helping hikers identify a plant, the app sends images of the plants back to a central database, along with the location, date and time of each identification.
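The kind of record that gets sent home with each identification might look roughly like the sketch below. The field names, encoding and values here are assumptions for illustration, not Leafsnap's actual upload format.

# Sketch of an observation record bundling the photo with the
# location and time of the identification (hypothetical format).
import base64
import json
from datetime import datetime, timezone

def build_observation(photo_bytes, latitude, longitude):
    """Bundle the leaf photo with where and when it was taken."""
    return {
        "photo": base64.b64encode(photo_bytes).decode("ascii"),
        "latitude": latitude,
        "longitude": longitude,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# Example: a (truncated) JPEG taken near the National Mall.
record = build_observation(b"\xff\xd8\xff\xe0", 38.8913, -77.0261)
print(json.dumps(record, indent=2))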

For botanists, it could deliver data from the equivalent of a field team tens of thousands strong, at no cost.

For leaf peepers, it could help differentiate between wild plants that are safe to eat and those that would cause weeks of skin rash for the incautious.

There is no indication what other kinds of data Leafsnap might collect, or what rights, if any, its users have to know what it's doing.

In that way it's similar to the functions in the iPhone and iPad that track users without securing the data, and without even letting them know whether their data is being collected.

Highly detailed, location-aware, demonstrably useful, but ambiguous in the way it treats users and the data they may consider private: That's as up to date as you can get.
