Before Web 2.0, before mashups, before FreeOurData.org.uk and other pleas, before the Internet itself, things used to be so much simpler for geo data. You were either an end user who accessed the data as a map, or you were a GIS professional who accessed the data via a (frequently very expensive and very specialised) Geographical Information System. But now we have geo data, lots of geo data, some of it free, some of it far from free, both in terms of usage and cost, and a fundamental problem has replaced the paucity of data.
Everyone wants free, open, high quality geo data and no one wants to pay for it. But it’s not quite that simple.
The recent acquisitions of Tele Atlas and Navteq, the two big global geo data providers, by TomTom and Nokia respectively, show the inherent value in owning data. But owning the data isn’t enough any more, as the market for licensing the data is a shrinking one, despite the phenomenal growth of the satnav market, both in car and on mobile handsets. Why is the market shrinking? Because no one wants to pay for it, at least not directly.
TomTom, primarily a hardware vendor, are differentiating into the software and data market and seem to be concentrating on the PND usage of the data, although we’ve yet to see how the outlay necessary to acquire Tele Atlas, coupled with the overall economic downturn, will affect their 2009 earnings. Their Q1 2009 report somewhat dryly notes that “market conditions were challenging” and that “we are making clear progress with the transformation of Tele Atlas into a focused business to business digital content and services production company”. There may be other aspirations at play here but, for now at least, the company is keeping quiet.
Nokia, also primarily a hardware vendor, this time of mobile handsets, are likewise moving away from their roots and into a wider market, presumably in an attempt to stop the encroachment of upstarts such as HTC, Apple and RIM into Nokia’s traditionally strong smartphone heartland. Again, Nokia have yet to make a public play in this arena, but all the composite elements are in place to enable this to happen.
Taking the opposite route, Google, which started off as a software player, is now moving into the data market by gathering high quality geo and mapping data under the smokescreen of gathering Street View imagery. This has allowed them to gather sufficient data to supplant Tele Atlas as a data provider, at least in the continental United States.
All three companies are either making or have the prospect of making determined plays in the location space, and all three of them have ways of leveraging the value inherent in their data. Google have their unique users, their search index and a vast amount of advertising inventory; TomTom their satnav customers; Nokia their handset customers, albeit one level removed, with the Mobile Network Operators as an uneasy partner and intermediary.
So what of the open data providers? It’s important to remember here that open doesn’t always mean free; it means the ability to create derived works and to use the data in ways that the originator may not have immediately foreseen. True, a lot of open data is free, but even then it’s free in the Free Software Foundation’s sense of the word.
“Free (software) is a matter of liberty, not price. To understand the concept, you should think of free as in free speech, not as in free beer.”
The poster child of open geo data is OpenStreetMap, the “free editable map of the world”. Founded in 2004 by Steve Coast, OSM has enjoyed phenomenal growth in users and in contributions of data that can be used anywhere and by anyone, and it espouses the values of free as in speech and free as in beer. As with all community or crowd-sourced collaborative projects, OSM’s challenge is to sustain that growth and, once complete coverage of a region is reached, to keep that coverage fresh, current and valid. We’ll leave aside the fact that complete coverage is an extremely subjective concept and means many things to many people.
OSM’s coverage is traditionally strongest in urban regions, and one of its other key challenges is to match the expectations of the user community who consume the data rather than those who create it. Both internationalisation of the data and expansion out of the urban conurbations will potentially prove challenging in the years to come. That’s not to say OSM isn’t a significant player in this space, and the quality of the data, though varying and in some places duplicated, is for the majority of use cases good enough. This was backed up by research undertaken by Muki Haklay of UCL, which answered the perennial question of “how good is OSM data?” with a pithy “good enough”.
Attempts to capitalise on and monetise the success and data corpus of OSM through the venture-capital-funded Cloudmade have yet to deliver on their promise; beyond a set of APIs, Cloudmade has announced the loss of their OpenStreetMap Community Ambassadors and the closure of their London office. All of which lends credence to the view that simply owning the data isn’t enough.
So how to solve the dichotomy of geo data? Everyone wants it, but no one is willing to pay for it, with the exception of the big players: the Googles, the Yahoos and the Microsofts of the world. Meanwhile, control of the proprietary data sources has centralised into TomTom and Nokia, both of whom are well placed to capitalise on their data assets but neither of whom has yet delivered on that promise.
Maybe the answer is twofold. Firstly, develop an open attribution model whereby the provenance of an atom of data can be tagged and preserved; this would remove a lot of the prohibitions on creating derived works, as the original data provenance could still be maintained. Secondly, allow limited usage of proprietary data at varying levels of granularity, accuracy and currency, thus creating a freemium model for the data and stimulating developer involvement in donating data to the community as a whole.
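To make the attribution idea a little more concrete, here is a minimal sketch of what a provenance-tagged atom of geo data might look like. Everything in it, the field names, the licences, the `derive` helper, is purely illustrative and assumed for the sake of the example; it is not drawn from any existing standard or API.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Provenance:
    """Who captured or modified this atom of data, and under what terms."""
    source: str    # e.g. "Ordnance Survey", "OpenStreetMap contributor" (illustrative)
    licence: str   # e.g. "CC-BY-SA", "proprietary/freemium" (illustrative)
    captured: str  # ISO 8601 date of the observation or edit

@dataclass
class GeoAtom:
    """A single attributable unit of geo data: one point plus its full history."""
    lat: float
    lon: float
    attributes: dict = field(default_factory=dict)
    provenance: List[Provenance] = field(default_factory=list)

    def derive(self, source: str, licence: str, captured: str, **changes) -> "GeoAtom":
        """Create a derived atom while preserving the existing provenance chain."""
        return GeoAtom(
            lat=changes.get("lat", self.lat),
            lon=changes.get("lon", self.lon),
            attributes={**self.attributes, **changes.get("attributes", {})},
            provenance=self.provenance + [Provenance(source, licence, captured)],
        )

# Usage: an original survey point, later enriched by a community contributor.
original = GeoAtom(51.5074, -0.1278, {"name": "Trafalgar Square"},
                   [Provenance("Ordnance Survey", "proprietary", "2008-06-01")])
derived = original.derive("OpenStreetMap contributor", "CC-BY-SA", "2009-05-12",
                          attributes={"tourism": "attraction"})
```

The point of carrying the chain rather than a single source field is that any downstream consumer can see, and credit, every hand the atom has passed through, which is what would let a derived work honour the original provider’s terms.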
It’s too early to see whether this will come to pass or whether an already tight hold on the data will become tighter still.
2 Comments
An interesting loop between your opening thrust regarding the arcane world of GIS professionals and their rarefied use of GI to produce the users’ map, the remaining reach of your analysis, which is primarily focused on creating value from what has until very recently been “PND data”, and your closing ideas on granularity.
It may be small beer in the multi-billion dollar world of the major search engines, let alone TomTom and Nokia and their battle for differentiation, but it certainly won’t be a revelation to most that there are people willing to pay directly for high quality geographic content!
The perhaps uncomfortable dichotomy will, I think, persist; many of these users, in the main ‘professionals’, have demands that for the time being are not met by PND-type data. No surprise there really. I know “I would say that”!
The continuing acceleration in the volumes of n-dimensional geodata, driven among other things by Google’s terrestrial LIDAR sensors, their SketchUp building maker development and a flurry of other technologies, will of course change this perspective a little, or even a lot! Though, last time I looked, Google’s UTOS don’t look to give much back to their contributors, so the control sentiment may well persist.
The atomic provenance model in some ways rebrands metadata as a mechanism by which all parts of the GI value chain, from initial capture through to complex derivation and analysis, can be recognised (though I am not sure this would overcome the derivation prohibitions so readily). Much of this capacity already exists (such is the nature of GI data capture, validation, storage etc.) and, while it has proven a ‘dry’ area, it is rife with some of the nomenclature of an open future.
However the data collectors choose to play this out, it is evident already that some recognise the validity of extending the implications of the provenance/metadata approach to a freemium model. I would hazard that the coming years will see considerably more play in this area, both in terms of data liberation at one end and the evolution of a variety of willingly paid-for, indirectly and perhaps even directly, products and services at the other. As you suggest, and as with current geo products, there will have to be some basis for differentiating these. Traditionally this has taken place through the lens or proxy of scale, but enhanced atomic attribution and open metadata open up non-scale-based possibilities…
This is a great synopsis Gary, and blame it on my background as a geologist, but I’d expand your argument to a trichotomy (is that English?) and include guv data. You allude to it, and it’s a storm of controversy perhaps best discussed elsewhere, but it’s a key component IMHO. A blog I lost track of even suggested how the US guv missed an opportunity, when stateside at least it may have opened data to the likes of Google, who may well outflank them with crowdsourcing, for example parcel data. Come to think of it actually, is that “the story behind the story”, that guv data may become irrelevant over time? That keeps Dangermond up at night, the same way Google Docs kept Gates, and now Allen, up at night…