Geospatial Web Services and Cross-Boundary Information Sharing During Disasters

Aftermath of Hurricane Katrina. Image: aradisecreative

Introduction: the value of “random access” to diverse Web resources at each stage of disaster management

During a disaster, everyday life is thrown into chaos as once-remote, generalized risks crescendo into a flurry of immediate, specific and often unforeseen life-threatening risks that are almost unimaginable in ordinary times. Physical boundaries and familiar landscapes are abruptly transformed into strange, broken new forms, and normal personal and institutional roles, relationships and behaviors change suddenly.

We can’t avoid most disasters, but we can reduce risks and losses by improving the ability of planners, managers, responders and impacted citizens to publish, discover, assess, access and use geospatial data across institutional boundaries in each stage of disaster management:

1. In planning, we use spatial information to run what-if scenarios, calculate anticipated consequences, and consider best actions, and we think about who is likely to need what information for what purpose in each stage of disaster.

2. In preparedness, we organize our information infrastructure and information systems to maximize the likelihood that each player in each stage of a disaster can get the information they need.

3. In mitigation, we do things such as rehearse roles and test communication; gather information and run models to make decisions about how to protect communication channels or where to strengthen levees or locate field hospitals; and inform citizens about what they will be asked to do.

4. In response, we use our information systems to maximize real-time situational awareness, weigh decisions, seek consensus and compliance, send warnings, send orders, launch rescues, etc.

5. In recovery, we bring relief, assess damage, clean up, rebuild, reconstruct and analyze events, and note the effectiveness or failures of work done in stages 1 through 4 to improve our management of the next disaster.

This is the main point of the 2007 National Academy of Sciences study, “Successful Response Starts with a Map:” that geospatial information should be an essential part of all aspects of emergency and disaster management. The same point was made in a June 2005 report of the National Science and Technology Council Committee on Environment and Natural Resources: “Grand Challenges for Disaster Reduction — A Report of the Subcommittee on Disaster Reduction”. In that report, Grand Challenge #1 is to “Provide hazard and disaster information where and when it is needed”.

This paper is mainly about the second stage of disaster management, information system preparedness: organizing our information systems to maximize the likelihood that each player in each stage of a disaster can get the information they need.

2. Organizing information using 21st Century technologies

We need to organize our information systems so that every participant has role-appropriate “random access” to information across institutional boundaries. No centralized “disaster data center” can meet all the needs of planners, first responders, and rescue and clean-up teams, because most of the geospatial data valuable during a disaster is created and used by many organizations over long periods of time for a wide range of non-disaster applications. During each stage of disaster management, disaster managers may want access to data from local fire departments, municipal and county governments, police departments, health departments, hospital-based and for-profit ambulance services, volunteer organizations, public health agencies, federal agencies, civil engineering firms, architects, utilities, transportation planners and providers, sewage treatment plants, fuel and chemical companies, shippers, and other sources. Most of this data is collected during routine operations to support routine operations, but it becomes critical during disasters.

If there were a centralized disaster data center, the data would be full of dangerously out-of-date content – unless the content had been constantly updated by all the disaster management partners. Because getting routine updates from dozens or hundreds of partners is a difficult task to manage, developing communications and data systems that can give emergency management agencies access to any of the distributed databases makes much more sense.

Integrating these diverse systems so that they are interoperable is important, and it can now be accomplished far more easily than in the past. Over the last two decades, the larger Information and Communication Technology industry has moved with increasing rapidity toward integration through open consensus standards. This is the Web paradigm: any client or server node in the network that exposes an interface based on an open standard can communicate with any server or client that exposes an interface based on the same open standard. These open “application programming interfaces” (APIs) are an extremely efficient way to integrate systems, and consensus standards organizations are the most efficient way to introduce users’ application requirements into the standard-setting process. The industry is moving beyond proprietary computing platforms in which vendors control the APIs.

In the geospatial technology world, the Open Geospatial Consortium (OGC) is the main open consensus standards organization. The OGC works closely with many other standards organizations, such as ISO/TC 211 (the international, government-sponsored geospatial standards group), the Internet Engineering Task Force (IETF), IEEE 1451 (a “smart sensors” standards group), and the Organization for the Advancement of Structured Information Standards (OASIS). The OGC has a memorandum of understanding with the COMCARE Emergency Response Alliance.

Communities of organizations that adapt their information systems to the open Web paradigm need to agree to buy systems that implement certain standards, but this is not difficult or objectionable because most of the system vendors implement the standards. The benefits come not just from shared APIs, but also from the flexibility of Web-based approaches to encoding and translating data. XML-based encodings, such as the OGC’s OpenGIS Geography Markup Language (GML) Encoding Standard, an XML grammar for expressing geographical features, make it much easier than before to resolve problems that result from the different data formats and different data schemas used by different information communities.
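
As a rough illustration of what such an XML encoding looks like (a minimal sketch only: the “Shelter” feature type, its property names and the coordinate values below are invented for this example, not drawn from any published GML application schema), a simple GML-style feature could be built with ordinary XML tooling:

# A minimal sketch of encoding a geographic feature in GML-style XML.
# The "Shelter" feature type and its properties are hypothetical; real
# communities define their own GML application schemas.
import xml.etree.ElementTree as ET

GML_NS = "http://www.opengis.net/gml"
ET.register_namespace("gml", GML_NS)

def shelter_feature(name, lon, lat, capacity):
    """Return an XML element describing one emergency shelter."""
    feature = ET.Element("Shelter")
    ET.SubElement(feature, "name").text = name
    ET.SubElement(feature, "capacity").text = str(capacity)
    point = ET.SubElement(feature, f"{{{GML_NS}}}Point", srsName="EPSG:4326")
    # GML 2-style coordinate string: "lon,lat"
    ET.SubElement(point, f"{{{GML_NS}}}coordinates").text = f"{lon},{lat}"
    return feature

doc = shelter_feature("Superdome", -90.0812, 29.9511, 26000)
print(ET.tostring(doc, encoding="unicode"))

Because the encoding is self-describing XML, a receiving system from a different information community can map the feature into its own schema rather than reverse-engineering a proprietary binary format.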

3. A standards-facilitated multi-party Katrina website at Telascience

Soon after Katrina hit New Orleans, a small group of individuals working in government, academia, and industry implemented an initial emergency mapping portal based on open standards and open source software. The portal provided access to a diverse and distributed collection of data provided by the National Oceanic and Atmospheric Administration (NOAA), the US Geological Survey (USGS), the US Naval Research Laboratory (NRL), the National Reconnaissance Office (NRO), and other agencies, companies and organizations. The American Red Cross, the National Institute of Urban Search and Rescue (NIUSR) and other relief agencies used the data portal and map viewing application in their efforts.

Figure 1. Map provided by the Telascience Katrina portal showing the carcinogen sites map layer from the US National Institutes of Health’s National Institute of Environmental Health Sciences (NIEHS).

The Katrina.Telascience.org project (Figure 1) was a direct result of the vision of using servers with open interfaces that implement the OpenGIS Web Map Server (WMS) Interface Standard as data sources for “web enabled seamless mosaic viewers” based on open standards.

Immediately after Hurricane Katrina passed over New Orleans, NOAA used an airborne camera to acquire a set of 1,500 JPEG images of the storm-damaged Gulf Coast. A volunteer converted them to GeoTIFF form so they could be served through a WMS interface. Another important addition to Telascience’s Katrina spatial data clearinghouse was a collection of datasets from the US National Institutes of Health’s National Institute of Environmental Health Sciences (NIEHS). The data included map layers such as Hazardous Air Pollutants; Metals and Metal Compounds; Chemicals Industry Facilities; OSHA Carcinogens; and Persistent, Bioaccumulative, and Toxic Chemicals. The Telascience server was soon providing 46 Katrina-related data layers through the open WMS interface and through the browser application.
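
For readers unfamiliar with WMS, the GetMap operation is simply an HTTP query with standardized parameters, so any conformant client can request a map image from any conformant server. The sketch below is illustrative only; the server URL and layer name are placeholders, not the actual Telascience endpoints:

# A minimal sketch of a WMS 1.1.1 GetMap request; the endpoint and layer
# name are placeholders, not the actual Telascience Katrina services.
from urllib.parse import urlencode
from urllib.request import urlretrieve

WMS_ENDPOINT = "https://example.org/wms"  # hypothetical server

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "noaa_katrina_imagery",   # hypothetical layer name
    "SRS": "EPSG:4326",
    "BBOX": "-90.3,29.8,-89.8,30.2",    # lon/lat box over New Orleans
    "WIDTH": "800",
    "HEIGHT": "640",
    "FORMAT": "image/png",
}

# Any WMS-conformant server will answer the same request structure.
urlretrieve(WMS_ENDPOINT + "?" + urlencode(params), "katrina_map.png")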

One of the lessons learned from Katrina was that any data that might be useful in disaster management ought to be available on a server or servers with open interfaces. At a minimum, these should be interfaces that implement the WMS specification for access to map images. Also, the data and its server should be discoverable by means of standards-based, XML-encoded metadata (conforming to the ISO 19115 metadata standard) registered in a catalog that conforms to the OpenGIS Catalogue Services Specification. Geospatial One-Stop and USGS’s The National Map provide such catalogs. When critical data is accessible via open interfaces and is discoverable through a catalog that anyone can use, the data has much more value in efforts to save lives, property, and ecosystems. The technology for this life-saving, money-saving disaster mitigation infrastructure is already available; it just needs to be deployed at local, state and federal levels. Section 9 below addresses the issue of institutional cooperation.
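
To make the catalogue idea concrete, the sketch below issues a keyword query against an OGC Catalogue Service for the Web (CSW) endpoint; the server URL is a placeholder and the search term is chosen only for illustration:

# A minimal sketch of discovering data through an OGC CSW catalogue using
# a simple GetRecords key-value-pair request. The endpoint is hypothetical.
from urllib.parse import urlencode
from urllib.request import urlopen

CSW_ENDPOINT = "https://example.org/csw"  # hypothetical catalogue

params = {
    "service": "CSW",
    "version": "2.0.2",
    "request": "GetRecords",
    "typeNames": "csw:Record",
    "resultType": "results",
    "elementSetName": "summary",
    "constraintLanguage": "CQL_TEXT",
    "constraint_language_version": "1.1.0",
    "constraint": "AnyText LIKE '%flood%'",  # free-text search for flood data
}

with urlopen(CSW_ENDPOINT + "?" + urlencode(params)) as resp:
    # The response carries ISO 19115 / Dublin Core metadata records as XML.
    print(resp.read()[:500])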

4. Sensor Webs: The German tsunami warning system for the Indian Ocean

The OGC has developed Sensor Web Enablement (SWE) standards that enable developers to make all types of sensors, transducers and sensor data repositories discoverable, accessible and usable via the Web. The German organization 52North provides a complete set of SWE-based services under the GPL license. This open source software is being used in a number of real-world systems, including a monitoring and control system for the Wupper River watershed in Germany and the Advanced Fire Information System (AFIS), a wildfire monitoring system in South Africa.

One of several research projects using 52North’s software is the German Indonesian Tsunami Early Warning System (GITEWS), a 35 million euro project of the German Aerospace Center (DLR) and the GeoForschungsZentrum Potsdam (GFZ), Germany’s National Research Centre for Geosciences. GITEWS uses SWE services as a front end for sharing tsunami-related information among the various components of the GITEWS software itself, drawing on real-time sensors, simulation models, and other data sources.
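
To give a flavor of what an SWE service looks like to a client, the sketch below issues a Sensor Observation Service (SOS) GetObservation request. It is a generic illustration under assumed names: the endpoint, offering and observed property are placeholders, not those of the GITEWS or 52North deployments:

# A minimal sketch of querying an OGC Sensor Observation Service (SOS)
# for observations. The endpoint, offering and observed property are
# hypothetical placeholders.
from urllib.parse import urlencode
from urllib.request import urlopen

SOS_ENDPOINT = "https://example.org/sos"  # hypothetical SWE service

params = {
    "service": "SOS",
    "version": "1.0.0",
    "request": "GetObservation",
    "offering": "SEA_LEVEL",  # hypothetical offering id
    "observedProperty": "urn:ogc:def:phenomenon:OGC:sea_water_height",
    "responseFormat": 'text/xml;subtype="om/1.0.0"',
}

with urlopen(SOS_ENDPOINT + "?" + urlencode(params)) as resp:
    # The response is an Observations & Measurements (O&M) XML document.
    print(resp.read()[:500])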

5. The ocean community’s collaborative model

The ocean observation community has made great progress in advancing the use of standards for publishing, discovery, access and use of geospatial data from widely distributed and diverse sources. In January 2007, members of the OGC launched the Ocean Science OGC Interoperability Experiment (Oceans IE) to study implementations of OGC Web Services (OWS) and SWE standards (and complementary standards from organizations including the ISO, IEEE, and OASIS) being used by the international ocean-observing community. More than a dozen ocean observation initiatives are using these standards, including regional application-oriented organizations such as the Gulf of Maine Ocean Observing System (GoMOOS), the open-source, community-oriented OOSTethys initiative, and others such as:

– In the US, the NOAA Integrated Ocean Observing System (IOOS) program, which recently decided to adopt the OGC’s SWE standards as the basis for sensor interoperability.

– The SURA Coastal Ocean Observing and Prediction (SCOOP) program and the related community initiative OPENIOOS.ORG.

– Interoperable GMES Services for Environmental Risk Management in Marine and Coastal Areas of Europe (InterRisk).

The organizations involved represent a developing federation of regional and global partners contributing to ever-larger pools of data and information for use by resource managers, ecologists, climatologists, educators, mariners, search-and-rescue teams, emergency response teams, and public health officials. This data network will clearly be useful in managing ocean disasters.

6. Disaster scenarios in OGC interoperability initiatives

The OGC Interoperability Program (IP) facilitates testbeds, pilot projects and interoperability experiments in which standards are created and tested, and most of these have relevance to disaster management. For example, current OGC interoperability initiatives include the following activities:

• Architecture / Engineering / Construction / Owner / Operator (AECOO) Testbed — a unique initiative that brings together expertise from OGC, the buildingSMART Alliance, buildingSMART International, and major technology providers from the AEC community to advance building information model (BIM) standards. This is a key technology area for OGC in support of 3D and 4D urban modeling and analysis. As we work together with partner standards organizations and their members, we will improve our ability to address increasingly complex information integration issues with open standards. In urban disasters, online access to data developed over the lifetime of buildings and capital projects (bridges, airports, fuel depots, etc.) can be a critical asset.

• OGC Web Services, Phase 6 (OWS-6) — OWS-6 brought together 10 sponsor organizations and 32 participating organizations to advance standards in five major technology focus areas or “threads”, including:

1) Sensor Web Enablement (SWE)

2) Geo Processing Workflow (GPW)

3) Aeronautical Information Management (AIM)

4) Decision Support Services (DSS)

• Delhi Transit Routing Interoperability Pilot — This initiative will demonstrate best practices and standards enabling interoperability among diverse information resources used for transportation routing in the context of transportation planning for the 2010 Commonwealth Games in Delhi, India. Transportation planning and emergency routing are important elements of disaster management.

• The Geo-interface for Atmosphere, Land, Earth, and Ocean netCDF Interoperability Experiment (GALEON IE) has produced reports that will improve sharing of information important in responding to major storm events, tsunamis and earthquakes.

• The recent Empire Challenge 08 (EC08) OGC Pilot was a multivendor demonstration of the “chaining” of Web services in an Intelligence, Surveillance and Reconnaissance (ISR) scenario. The demonstration involved off-the-shelf software that implemented standard interfaces and encodings to task and control an airborne or spaceborne imaging device, collect the data, orthorectify it, and display it in a single frame, all in near real time. Such capabilities are important during forest fires and floods.

7. ORCHESTRA in Europe

ORCHESTRA, a major European “integrated project” under IST-FP6, was undertaken to improve technical interoperability for risk management. The project developed a service-oriented architecture for risk management based on open standards, together with a software infrastructure for enabling risk management services. The ORCHESTRA Architecture (Caballero et al. 2007) is a platform-neutral specification based on Web service specifications of the ISO, OGC, W3C and OASIS. Three ORCHESTRA pilots were organized to provide practical tests and demonstrations of what could be accomplished using this architecture. One goal was to define workflows that combine several services into one value-added service chain that achieves a certain goal, as shown in Figure 2.

Note that service chaining involves discovery, access and use not only of distributed data servers, but also of distributed servers offering processing services.
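
The sketch below is one way to picture such a chain in code: a catalogue query yields a data service, the data service is accessed, and the result is handed to a processing service. It is purely illustrative; the endpoints and helper functions are hypothetical and are not part of the ORCHESTRA specification:

# A purely illustrative sketch of a service chain: discover a data service
# in a catalogue, fetch data from it, then pass the result to a processing
# service. All endpoints and helper names are hypothetical.
from urllib.request import urlopen

def discover(catalogue_url, keyword):
    """Query a catalogue service and return the URL of a matching data service."""
    # In a real chain this would be a CSW GetRecords request; here it is stubbed.
    return "https://example.org/wfs"

def fetch(data_service_url):
    """Access the data service (e.g., a WFS GetFeature request)."""
    return urlopen(data_service_url).read()

def assess_risk(processing_service_url, data):
    """Send the data to a processing service (e.g., an OGC WPS Execute request)."""
    return urlopen(processing_service_url, data=data).read()

# The chain itself: discovery -> access -> processing, each step a Web service.
flood_data_url = discover("https://example.org/csw", "flood extent")
flood_data = fetch(flood_data_url)
report = assess_risk("https://example.org/wps", flood_data)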

8. Relevance to GEOSS

The Global Earth Observation System of Systems (GEOSS) is being developed by the Group on Earth Observations (GEO), a partnership of 126 governments and international organizations. Through agreements on technical standards and through institutional coordination on policies, the governments are making available on Web servers a very large collection of geospatial data, including live sensor data and stored geodata of many kinds. The goal is to improve understanding and capacity in dealing with challenges involving disasters, health, energy, climate, water, weather, ecosystems, agriculture and biodiversity.

Figure 2. In one ORCHESTRA scenario, a user first finds data and services by means of a Catalogue Service containing metadata about available data and services, then creates a service chain and deploys it to a Service Chain Access Service (SCAS), which deploys the service chain as a multi-component Risk Assessment Service. (Figure from the ORCHESTRA project.)

Development of GEOSS interoperability is led by the GEO Architecture and Data Committee. As part of this committee, the OGC leads a core task to develop the GEOSS initial operating capability. The OGC leads the GEOSS Architecture Implementation Pilot (AIP), a multi-year OGC Interoperability Initiative, which has brought together technical contributions from over 120 organizations.

The Disasters Working Group in the GEOSS AIP is working from a disaster management scenario that includes use cases involving a wide variety of players. One objective of the scenario is to understand and improve the links between global systems (based on weather satellites) and local/regional systems (based on low/high resolution satellite and aerial images and in situ sensors). Integration of local/regional data sets with global capabilities is a core requirement of the GEOSS architecture.

9. The role of standards in promoting institutional interoperability

Technology often changes faster than the institutions it serves. Technical interoperability is an accomplished fact: it provides not only communication of instructions between diverse systems, but also unprecedented ease in accommodating semantic differences between different information communities’ data models. The third kind of interoperability to address in a distributed information environment is institutional interoperability.

The obstacles that stand in the way of information sharing during disasters have much to do with the willingness of data owners to share their data. Developers of data in the public and private sectors resist publishing and sharing their data for many good reasons as well as some bad ones. The resulting barriers not only create problems during a disaster; they also impede the development of necessary interoperability and cooperation during “routine” periods between disasters. Disaster management organizations and coalitions need to address a wide range of legal and commercial issues associated with the collection, distribution and use of geospatial data, including liability, privacy, security and intellectual property rights (IPR) in spatial data. These issues are complicated by the issue of data provenance, that is, accounting for the source, history, integrity, currency and reliability of “original” data sets as well as the component layers of a composite data layer.

The volunteer effort that Katrina precipitated resulted not only in collections of software and data, but also in organizational connections that will be helpful when (not if) other hurricanes cause devastation along the Gulf Coast. But even where data sharing agreements are in place, the lack of a Web services rights management infrastructure will delay access to information.

“Geospatial Rights Management”, or “GeoRM,” is a work in progress. As Figure 3 illustrates, GeoRM necessarily addresses a wide range of requirements.

Figure 3. Many issues need to be addressed in the digital “locks and keys” of a rights management services architecture. (Image courtesy of Ordnance Survey, developed as part of the ORCHESTRA Project.)

Government, private sector, NGO and university members have been working in the OGC GeoRM Domain Working Group to study use cases and define requirements for GeoRM standards. The participants have developed the Geospatial Digital Rights Management Reference Model (GeoDRM RM), an abstract specification for the management of digital rights in the area of geospatial data and services. The GeoDRM RM defines specific requirements for managing IP rights by controlling geodata distribution and use. These must accommodate factors such as:

• The business of the organization (i.e., the motivations of commercial, public-sector, and academic organizations to make their geodata available)

• The type of data and media formats (e.g., physical, electronic, text, graphic, audio, video, vector, raster, observation, etc.)

• The content distribution channels (e.g., size of content, network bandwidth, types of end devices)

• The types and granularity of intellectual property rights to be managed and the contractual obligations for their use (e.g., unlimited distribution, license to use, license to reuse parts, limited distribution, sensitive/classified, etc.).

The GeoDRM RM has been approved by the OGC membership, who are now using it to develop draft OpenGIS® interface and encoding standards that will enable diverse systems to participate in transactions involving data and services, but under the control of owner-configured rights management mechanisms. The GeoRM Domain Working Group seeks to provide a trusted infrastructure for purchasing, managing and protecting rights to online digital content, while exploiting mainstream DRM approaches, technologies and standards wherever possible.

Organizations must have the ability to license resources whether or not the resources are charged for. Thus the GeoRM Domain Working Group adheres to a clear separation between licensing and pricing models. The intention is to create a system that can accommodate all data sharing agreements: creative commons models, public library models, emergency access, purchases, licenses, control of derivative products, and so on.
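
A toy sketch of what that separation might look like in software follows; the class names and fields are invented for illustration, and the GeoDRM Reference Model itself is an abstract specification, not code:

# A toy sketch of keeping the license (what a user may do) separate from
# the pricing model (what, if anything, they pay). Names are invented for
# illustration and are not drawn from the GeoDRM Reference Model.
from dataclasses import dataclass

@dataclass
class License:
    dataset: str
    permitted_uses: tuple      # e.g., ("view", "download", "derive")
    emergency_override: bool   # free access during a declared emergency

@dataclass
class PricingModel:
    dataset: str
    fee_per_use: float         # 0.0 covers creative-commons / public-library cases

def may_access(license_: License, use: str, emergency: bool) -> bool:
    """Access is governed by the license alone, regardless of pricing."""
    return use in license_.permitted_uses or (emergency and license_.emergency_override)

levee_license = License("levee_inspections", ("view",), emergency_override=True)
levee_pricing = PricingModel("levee_inspections", fee_per_use=25.0)
print(may_access(levee_license, "download", emergency=True))  # True during a disaster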

10. Conclusion

The vision of “the network is the computer” is becoming a reality through open standards. Widely distributed information resources of all kinds, including complex geospatial data and geoprocessing resources, can be published, discovered, evaluated, accessed and put to use by our digital systems. This maturation of the Web and Web services has tremendous positive implications for disaster management. The underlying Internet is designed to “route around failure” and to benefit from the “distributedness” of network nodes, and designers of disaster information systems should exploit the potential inherent in this decentralized resource model.

The sewer system managers are naturally the best keepers of their sewer system data, and the same can be said for the many other keepers of local or specialized data. All of this data can be immediately accessible to disaster management authorities if the local experts have deployed systems that implement open interfaces. Data and services, of course, need to be accompanied by descriptive standard metadata, and they should be registered in online catalogues that use standard catalogue service interfaces. This model enables easy construction of “data roll-ups”, central portals and fail-safe systems involving such things as remote mirrored databases.

There are other benefits as well. Software that implements open standards cuts down on the time it takes to find crucial information, combine different information layers, and change and share information on remote servers. Committing to standards gives purchasers more choices: new “best of breed” software components can seamlessly replace old ones. Also, standards-based solutions reduce the costs associated with custom design and integration, and they enable the separation of data and presentation, so that systems can use the same data to provide different map styles for different user communities.

Disaster managers who understand these arguments often understand, too, that committing to interoperability is the first step in moving others toward that same commitment. This is especially valuable for organizations that serve both emergency responders and other clients within government and the private sector.