Created by a group of programmers and botanists at Columbia University and the University of Maryland, Leafsnap applies algorithms developed for facial recognition to images of plants. The goal is to make it easier to name the grasses, bushes, trees and other flora that most of us recognize only as obstacles on a hike or things to mow on Saturday, not as individual species.
The app uses the iPhone's camera to capture an image of the plant, then compares its shape to others in a database before listing possible identifications, ranked by the closeness of the match.
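The matching step can be pictured as nearest-neighbor ranking over shape features. This is only a minimal sketch: the feature vectors and species values below are made up for illustration, and Leafsnap's actual descriptors (and matching algorithm) are more sophisticated.

```python
import math

# Hypothetical shape descriptors: each species is summarized by a short
# feature vector (e.g. contour-curvature statistics). These numbers are
# invented for illustration, not Leafsnap's real data.
DATABASE = {
    "Acer rubrum":       [0.82, 0.31, 0.55],
    "Quercus alba":      [0.40, 0.72, 0.18],
    "Betula papyrifera": [0.60, 0.45, 0.50],
}

def rank_matches(query, database):
    """Rank candidate species by Euclidean distance to the query's features."""
    scored = sorted(
        (math.dist(query, features), species)
        for species, features in database.items()
    )
    return [species for _, species in scored]

# A photographed leaf reduced to the same (made-up) feature space:
print(rank_matches([0.80, 0.35, 0.52], DATABASE))
# → ['Acer rubrum', 'Betula papyrifera', 'Quercus alba']
```

The closest vector comes first, which is what lets the app present a ranked list of candidates rather than a single guess.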
Originally designed as a tool for experts, Leafsnap has been adapted, with data from the Smithsonian's National Museum of Natural History, into a potential replacement for the field guides used by non-scientists.
The free app has images and identification data for trees in the Northeast, but will eventually include data covering the rest of the country as well.
It may also help scientists track the changing ranges of individual species. In addition to helping hikers identify a plant, the app sends each image back to a central database, along with the location, date and time of the identification.
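The report described above amounts to a simple observation record. The sketch below shows what such a record might look like; the field names and values are assumptions for illustration, not Leafsnap's actual schema.

```python
import datetime
import json

# A hypothetical crowd-sourced observation: the image reference plus the
# location, date and time of the identification.
observation = {
    "image_id": "leaf_0001.jpg",
    "species_guess": "Acer rubrum",
    "latitude": 40.8075,    # example coordinates, near Columbia University
    "longitude": -73.9626,
    "timestamp": datetime.datetime(2011, 5, 9, 14, 30).isoformat(),
}

# Serialized for upload to a central database.
payload = json.dumps(observation)
print(payload)
```

Aggregated over many users, records like this are what would let botanists map where and when a species is being observed.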
For botanists, it could deliver data from the equivalent of a field team tens of thousands strong, at no cost.
For leaf peepers, it could help differentiate between wild plants that are safe to eat and those that would cause weeks of skin rash for the incautious.
There is no indication of what other kinds of data Leafsnap might collect, or what rights, if any, its users have to know what it is doing.
In that way it is similar to the functions in the iPhone and iPad that track users without securing the data, and without even letting them know whether their data is being collected.
Highly detailed, location-aware, demonstrably useful, but ambiguous in the way it treats users and the data they may consider private: That's as up to date as you can get.