This column is sponsored by ESRI
Approaches to spatial data infrastructure (SDI) vary widely across these continents, and particularly from country to country in North America. One overriding difference in data policy that still has broad play across regions is the fee versus free model for data collected by government. The United States has long made federal data available for free, and has cultivated a strong geospatial industry based on that openness.
There appears to be more traction for the open data model of late across the world. Just this week, Geoff Zeiss reported on efforts in Australia and the United Kingdom to quantify the net benefits of free versus fee-based geospatial data to the overall economy. Both studies found that free and open data would have a greater positive impact on the economy than the restrictive licensing and cost-recovery models that currently exist in some countries.
Setting a Firm Foundation
While the United States established a policy of open data sharing among agencies and the public, the federated rather than centralized approach has long meant data collection redundancy and a lack of a strong foundation. While the U.S. Geological Survey owned the mapping mandate with the national topographic map set, the mandate became splintered by different agency interests when digital map data came to the forefront.
To be fair, there was no easy means for coordinated and centralized efforts when this paradigm shift occurred. It made sense for agencies to forge their own paths with their own mainframe computers, given that sharing national-scale data at the time was nearly impossible. That momentum has been very hard to shift, even with the advent of the Internet as an enabler for collaboration.
The problem in the developed world is that geospatial data exists in many formats and for many purposes. It’s a very complex task to normalize and set the foundation after a great deal of work has already begun, and a number of different stakeholders have become entrenched with their own approaches.
Canada jump-started its GeoConnections program in 1999 with Internet distribution and data normalization as key components for building a better national data set. This approach, coupled with an emphasis on partnerships with government and industry, has provided good traction for the creation of the Canadian Spatial Data Infrastructure, and is serving as a model for other nations. The USGS does have the right idea for the National Map, along the same partnering and distribution model as GeoConnections, but it's lacking a real mandate.
Critical to the success of broad geospatial initiatives is a spirit of cooperation, collaboration and common purpose. From the perspective of the United States, these three ingredients have existed at different points in time, and haven’t come together yet to forge much progress on SDI that would take us to the next level.
Of late, there's been more traction in the United States toward ensuring that no two federal players pay for the same data. There's now a growing trend to extend collaboration from the federal government down to the states in such initiatives as Imagery for the Nation and Transportation for the Nation, where all stakeholders could share in data collected once and distributed many times. The fact that so many states and agencies are behind this effort gives great hope for better data collaboration in the next administration.
The United States federal partners have long collaborated to establish data standards and outreach programs via the Federal Geographic Data Committee. This effort to set baseline standards for data across agencies and the country has effectively established best practices that have gained much respect around the world. While the committee's standards creation process has been strong, a lack of authority to force change has hampered its ability to forge progress.
The common purpose has nowhere been more evident than immediately after 9/11, when it became overwhelmingly obvious that geospatial data was critical for disaster response. There was good movement immediately after this event to elevate the cause of standards and interoperability, but this momentum has largely faded, due in large part to the confusion and chaos of combining many agencies, including FEMA, into the behemoth Homeland Security Department.
Data Availability and Quality
As outlined above, the open data policy that has existed in the United States has only gone so far in solving cross-institutional redundancy and distribution issues. This contrasts with the standardization and high quality of the data created by the United Kingdom's Ordnance Survey, which operates solely on a cost-recovery basis with strict licensing restrictions. Somewhere between the two lies a happy medium where we can have both high quality and openness.
I see some great promise in Europe's INSPIRE program to forge this new paradigm, but I'm also daunted by the complexity of the data normalization problem that it faces. In light of other normalization efforts that haven't garnered broad adherence, such as the common Euro currency, there's a tough road ahead.
The objectives of North American and European geospatial initiatives are in alignment, as are those of most other parts of the world. The issue for major progress comes down to political will and the simplification of complex problems. Perhaps climate change and the fate of our planet will provide the rallying point we all need to get beyond our current bottlenecks.
Read what Jeff Thurston has to say on this subject here.