How will the geospatial data market evolve over the next ten years?

by Matt Ball on June 18, 2010


The ability to collect, distribute and access geospatial data continues to improve in terms of speed and precision of collection, timeliness of delivery, and affordability. In addition to these expected improvements, brought about by the rising importance of geospatial data and advancements in geotechnology, there are a number of profound changes that will greatly impact the geospatial data market in the years to come.

The changes underway are a combination of increased individual data collection, more open government, the explosion of sensor data, technology that automatically extracts information from data, and the ability to synthesize information from many sources. While there are growing business opportunities in each of these areas, there are also significant disruptions taking place that threaten the business models of many established geospatial data organizations. The next ten years will bring many changes, but they will also bring greater empowerment of the GIS user, given the amount of available data, much of it free.

Volunteered Map Data

The advent of crowd-sourced geospatial data, such as OpenStreetMap (OSM), continues to gain momentum and could easily displace much of the market for commercial geospatial data. At present, the coverage of OSM data nearly matches that of the commercial providers, and in some cases it surpasses them in accuracy and level of detail. By 2020, the market for navigation data will be much smaller, but there will still be a place for highly accurate, trusted-source data.
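Part of what makes volunteered data so accessible is that OSM publishes its map as plain XML: each node carries a latitude, longitude and volunteer-contributed tags, so a few lines of standard-library code can read it. The sketch below uses a tiny, made-up OSM fragment (the cafe node is invented for illustration, not real OSM data):

```python
import xml.etree.ElementTree as ET

# A tiny, made-up fragment in OpenStreetMap's XML format: nodes carry
# lat/lon attributes, and <tag> children hold volunteer-contributed data.
OSM_SNIPPET = """
<osm version="0.6">
  <node id="1" lat="51.5074" lon="-0.1278">
    <tag k="amenity" v="cafe"/>
    <tag k="name" v="Example Cafe"/>
  </node>
  <node id="2" lat="51.5080" lon="-0.1290"/>
</osm>
"""

def extract_tagged_nodes(osm_xml):
    """Return (id, lat, lon, tags) for every node in an OSM XML document."""
    root = ET.fromstring(osm_xml)
    nodes = []
    for node in root.findall("node"):
        tags = {t.get("k"): t.get("v") for t in node.findall("tag")}
        nodes.append((node.get("id"),
                      float(node.get("lat")),
                      float(node.get("lon")),
                      tags))
    return nodes
```

Because the format is this simple, any consumer, commercial or volunteer, can parse, check and extend the same data.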

Mobile platforms are of increasing importance for data collection. The improving location precision of these devices will help greatly in both the collection of accurate geospatial data and the delivery of helpful location-aware applications. Mobile platforms are quickly dwarfing all other computing platforms in number and in pace of innovation. This trend will continue to the point where we carry less powerful computing platforms but enjoy much greater connectivity to each other and to the details that interest us.

Open Government

The growing government transparency movement, with more open data and the advent of application-development contests such as “Code for America,” is placing the emphasis on what can be done with government data to improve decision making and offer greater services to constituents. The shift is away from services for citizens and toward collaboration with citizens.

The increased involvement of citizens with both the data and the services offered will mean a vested partnership in assuring the quality and accuracy of data. With more people accessing and using the data, this exposure will drive quality improvements, particularly if the crowd has a means to conduct quality control and submit updates. With an open data approach there will be less data drudge work, and the time and effort freed will enable governments to apply greater analytics to make sense of inputs and to predict and chart future courses.

Automated Collection and Extraction

Machine learning and automated extraction tools that pull information from data are on the rise. The ability to derive different data products from raw imagery or other sensor inputs will become a focus area for many. Users will have the capability to use and tune data inputs for their own purposes, and data providers will concentrate on creating more diverse data products. The more specialized data products that can be derived, the more value there is in the sensors themselves.
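A long-standing example of deriving a data product from raw imagery is the Normalized Difference Vegetation Index (NDVI), which turns red and near-infrared bands into a vegetation map. The sketch below assumes the bands arrive as plain nested lists of reflectance values (a simplification of real raster formats), and the 0.3 threshold is an illustrative choice rather than a standard:

```python
# NDVI = (NIR - Red) / (NIR + Red); values near 1 indicate healthy vegetation.
def ndvi(red, nir):
    """Normalized Difference Vegetation Index for a single pixel."""
    total = nir + red
    return (nir - red) / total if total else 0.0

def extract_vegetation_mask(red_band, nir_band, threshold=0.3):
    """Derive a boolean vegetation mask from raw red / near-infrared bands,
    given as nested lists of reflectance values (rows of pixels)."""
    return [[ndvi(r, n) > threshold for r, n in zip(row_r, row_n)]
            for row_r, row_n in zip(red_band, nir_band)]
```

The same raw bands could feed many other derived products, which is the sense in which the sensor itself gains value as extraction improves.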

The demand for this specialized data is already high, but interest will really take off once real-time information can feed sophisticated programs that monitor and react to data inputs autonomously and adaptively. The “app for that” mentality could easily evolve into a “data for that” capability, with software developers orchestrating different data feeds to create custom solutions.

Synthesis and Fusion

Geospatial data interoperability plays a huge role in the ability to pull together a variety of data, particularly when moving toward real-time delivery. The more normalized the data are to each other, the faster and richer the synthesis of information.
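A minimal sketch of what such normalization looks like in practice: two hypothetical feeds describe the same kind of observation with different field names, and each is mapped onto one common schema before fusion (all field names here are invented for illustration):

```python
# Two hypothetical incoming feeds that describe the same kind of observation
# with different field names; each is mapped onto one common schema.
def normalize_feed_a(record):
    """Source A uses 'latitude' / 'longitude' / 'obs' (invented names)."""
    return {"lat": record["latitude"], "lon": record["longitude"],
            "value": record["obs"]}

def normalize_feed_b(record):
    """Source B uses 'y' / 'x' / 'reading' (invented names)."""
    return {"lat": record["y"], "lon": record["x"],
            "value": record["reading"]}

def fuse(*feeds):
    """Merge already-normalized records into one list sorted by location."""
    merged = [rec for feed in feeds for rec in feed]
    merged.sort(key=lambda rec: (rec["lat"], rec["lon"]))
    return merged
```

Once every source speaks the common schema, the fusion step itself is trivial; the cost of synthesis lives almost entirely in the normalization.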

Experts such as today’s geospatial technologists will evolve into more active software developers, but also into synthesizers of data. The huge volumes of available data will require skilled technicians to verify, aggregate and analyze this information for rich insights. The connectivity of the Web will feed these specialists, and organizations and governments might simply subscribe to regular data-scrubbing and synthesis services. The ability to craft solutions that return reliable results and improve organizational efficiency means these knowledge workers will be in high demand.


Rollo June 18, 2010 at 9:39 am
Matt Ball June 18, 2010 at 10:19 am

That report is definitely a worthy reference: http://www.agi.org.uk/foresight/
I hadn’t read it closely, but it’s nice to see that it echoes a lot of our coverage on V1 Magazine. One of the conclusions in the report that I’ve been hearing for years is that “spatial isn’t special.” I just don’t buy that line of thinking as it is a specialized domain that will continue to have value for a long time to come. I agree that IT will take greater ownership of the tools and capabilities, but those that can think and analyze spatially are still a special bunch and they’ll continue to want to be a community.

Justin C. houk June 18, 2010 at 1:38 pm

hey Matt,

This is great coverage Matt. I’m curious what your take is on the way cloud computing will impact the data market. It ties into many places as infrastructure? It’s too app centric for a data discussion?

Matt Ball June 18, 2010 at 1:45 pm

Thanks for the query Justin. I probably could have called out the cloud there more explicitly as the source for most data feeds. I view it as enabling infrastructure, and an important component to get GISers away from time-consuming maintenance tasks and onto more meaningful work.

Chris Grayson June 18, 2010 at 11:10 pm

Hi Matt,

I was directed to this post from an article by Marshall Kirkpatrick at Read Write Web. I have actually loosely followed you on Twitter as I include you on my “Virtual Reality and Virtual Worlds” Twitter List… but I digress.

I would be curious to get your perspective on a video I produced for my blog, GigantiCo. The title of the video is:

Mobile Augmented Reality, the Ultimate Game Changer:
http://tinyurl.com/34laagg

In the mid-portion of the video, I cover some of the same territory that you focus on here in this article. I think it may interest you, and I would be interested to hear your thoughts.

Very truly,
Chris

Matt Ball June 20, 2010 at 8:19 am

Chris, nice job on the video! You cover a lot of good ground there, and have tipped me off to several interesting companies that I was unaware of.

