OGC netCDF: Powerful Tool for Science
- Published on Friday, 14 March 2014 15:41
- Ben Domenico and Stefano Nativi
In order to unleash the full potential of Geographical Information Systems (GIS), better interoperability is needed between traditional GIS and a community we sometimes refer to as the "fluid Earth systems" or FES, which pertains mostly to oceanography and the atmospheric sciences. This issue is becoming increasingly important as FES observations and forecasts approach the high spatial resolutions (less than a few kilometers) characteristic of the GIS realm. The challenge is to enable practitioners in each realm to continue using the powerful tools available through their traditional applications while allowing for integration of data and applications between the two by means of standard, web-based interfaces. A powerful tool known as "netCDF" may bridge the gap.
What is netCDF?
The network Common Data Form (netCDF) is a data model and a collection of access libraries for array-oriented scientific data. Originally developed by the University Corp. for Atmospheric Research (UCAR), netCDF has been formally recognized by U.S. government standards bodies and is now a de facto standard used around the world. For example, output datasets from climate models used in the Fifth Assessment Report of the Intergovernmental Panel on Climate Change must be submitted in netCDF format, using the associated Climate and Forecast (CF) metadata conventions (CF-netCDF). Stakeholders include search engine developers, GIS vendors, the geosciences research and education community, international government agencies that distribute and use global Earth observations and forecasts, mass-market software vendors interested in geographically based applications, and consumers. Representatives of the netCDF community chartered the Open Geospatial Consortium (OGC) CF-netCDF Standards Working Group in 2009 to make CF-netCDF an international standard.
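A minimal sketch of what a CF-netCDF dataset looks like, written in netCDF's CDL text notation, may help make this concrete. The file name, dimension sizes, and variable names here are illustrative only, not drawn from any real dataset; the attributes follow the CF conventions mentioned above:

```
netcdf example {            // illustrative CDL sketch of a CF-netCDF file
dimensions:
    time = UNLIMITED ;
    lat = 73 ;
    lon = 144 ;
variables:
    double time(time) ;
        time:units = "hours since 2014-01-01 00:00:00" ;
        time:standard_name = "time" ;
    float lat(lat) ;
        lat:units = "degrees_north" ;
    float lon(lon) ;
        lon:units = "degrees_east" ;
    float tas(time, lat, lon) ;
        tas:standard_name = "air_temperature" ;
        tas:units = "K" ;

// global attributes:
    :Conventions = "CF-1.6" ;
}
```

The CF conventions standardize the metadata (standard names, units, coordinate attributes) so that generic clients can interpret any conforming file without dataset-specific knowledge.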
To a GIS practitioner, a dataset is a collection of static features such as roads, lakes, or plots of land whose boundaries are geometrically referenced to the Earth’s surface by means of a spatial reference system. The features are discrete objects with attributes which can be stored and manipulated conveniently in a database. Examples of data types in Earth sciences related GIS data collections are:
• hydrological data (streamflow and water quality)
• output of flood and landslide models
• land use and surface characteristics
• soil moisture
A typical example of a GIS-rendered map is shown in Figure 1. It is easy to see how the items in the legend can be stored as tables in a relational database system and rendered as “layers” on a map.
Fluid Earth Systems (FES) Realm
In contrast to the GIS world, the FES world is characterized by a set of parameters (e.g., pressure, temperature, wind speed) that vary as continuous functions in four-dimensional space and time. The behavior of the parameters in space and time is governed by a set of partial differential equations. Data values are simply discrete samples of these continuous functions. Examples of common data types in Earth sciences data collections related to the fluid Earth include radar, satellite radiances and derived products, output of numerical forecast models (which predict weather, climate, ocean circulation, storm surge, and many other parameters), weather station observations, and vertical soundings (from balloons, aircraft, and ocean depth probes).
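The idea of discrete samples of a continuous field can be sketched in a few lines of Python. The field function and the grid spacing below are made up purely for illustration; real model output would come from solving the governing equations, but the resulting dataset has the same shape, a 4-D array indexed by time, level, latitude, and longitude:

```python
# Sketch (illustrative only): an FES dataset as discrete samples of a
# continuous field T(t, z, lat, lon) on a regular 4-D grid.
import math

times = [0.0, 6.0, 12.0]                      # forecast hours
levels = [1000.0, 500.0]                      # pressure levels, hPa
lats = [float(v) for v in range(-90, 91, 30)] # 7 latitudes
lons = [float(v) for v in range(0, 360, 60)]  # 6 longitudes

def temperature(t, z, lat, lon):
    """A made-up smooth field standing in for real model output (K)."""
    return (288.0
            - 0.0065 * (1000.0 - z)               # cooler at higher levels
            - 30.0 * math.sin(math.radians(lat)) ** 2  # cooler toward poles
            + 0.01 * t)                           # slow drift in time

# The dataset is just the field sampled at the grid points:
grid = [[[[temperature(t, z, la, lo) for lo in lons]
          for la in lats]
         for z in levels]
        for t in times]

print(len(grid), len(grid[0]), len(grid[0][0]), len(grid[0][0][0]))  # 3 2 7 6
```

A GIS feature table has no natural place for such a hypercube; this mismatch is exactly the interoperability gap the article describes.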
A visualization of the output of a numerical weather forecast model is shown in Figure 2. The visualization illustrates the true multidimensionality of the dataset, with the jet stream rendered as a “solid” time-varying object in three spatial dimensions. Additionally, other variables such as temperature and pressure for a given elevation are shown as a surface and line contours in both vertical and horizontal cross-sections. The map background in this particular image was generated from GIS shapefiles.
Why the OGC?
Developing and maintaining the various standards components of netCDF in the OGC provides a number of benefits. The OGC gives the netCDF community an organized international forum and a well-managed formal consensus process, as well as facilities for prototyping, field testing, conformance testing, outreach, and communication with other standards bodies whose standards and activities may have significance for the FES world. The OGC's intellectual property rights policies protect against future proprietary claims on the standards. Perhaps most importantly, working within the OGC provides an opportunity to influence and make the best use of OGC standards that are part of the larger standards platform supporting interoperability and resource sharing within the netCDF community.
GALEON: an ongoing Interoperability Experiment
The OGC GALEON IE (Geo-interface for Air, Land, Earth, Oceans NetCDF Interoperability Experiment) is an ongoing netCDF activity in the OGC Interoperability Program. Its goal is to provide an environment for rapid prototyping and testing of various combinations of software components that implement officially adopted or experimental interface and encoding standards. The GALEON experiments have resulted in many recommendations for modifications to OGC standards. Over the years, an impressive group of organizations and individuals have actively participated in GALEON. These include:
• IIA-CNR (Stefano Nativi, Lorenzo Bigagli)
• International University Bremen (Peter Baumann)
• George Mason University (Liping Di, Wenli Yang)
• CadCorp (Martin Daly, Frank Warmerdam)
• Exelis Visual Information Solutions UK (formerly RSI UK) (David Burridge, Norman Barker)
• Exelis Visual Information Solutions Inc. (formerly RSI Boulder) (Louis Genduso)
• NERC Natural Environment Research Council/British Atmospheric Data Center (Dominic Lowe, Andrew Woolf)
• Unidata/UCAR (Ben Domenico, John Caron)
• Interactive Instruments (Clemens Portele)
• NASA Geospatial Interoperability Office (John Evans)
• Washington University, St. Louis (Stefan Falke, Rudolf B. Husar)
• PFEL, Pacific Fisheries Environmental Laboratory (Roy Mendelssohn)
• NCDC, National Climatic Data Center (Glenn Rutledge).
In FES, netCDF is just one of many technologies. Hundreds of servers communicate via a complex set of client-server protocols that have evolved in the FES community over the last decade. These servers make a wide variety and large volume of data available to existing client applications, many of which implement the OGC Web Coverage Service (WCS) Interface Standard. A key aim of GALEON is to expand the usefulness of these servers by adding a standards-based interface to provide a gateway so that clients that implement WCS can access the datasets.
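To make the gateway idea concrete, a WCS client retrieves a data subset by issuing a GetCoverage request. The sketch below builds such a request in WCS 1.0 key-value-pair form; the server host and coverage name are hypothetical, though the request parameters themselves are standard WCS 1.0 keywords:

```python
# Sketch: constructing a WCS 1.0 GetCoverage request against a
# hypothetical THREDDS Data Server endpoint (host and coverage made up).
from urllib.parse import urlencode

base = "http://example.org/thredds/wcs/forecast.nc"
params = {
    "service": "WCS",
    "version": "1.0.0",
    "request": "GetCoverage",
    "coverage": "air_temperature",   # hypothetical coverage identifier
    "crs": "EPSG:4326",
    "bbox": "-130,20,-60,55",        # lon_min,lat_min,lon_max,lat_max
    "time": "2014-03-14T12:00:00Z",
    "format": "NetCDF3",
}
url = base + "?" + urlencode(params)
print(url)
```

The point of the gateway is that a generic GIS client can issue exactly this kind of request without knowing anything about the THREDDS catalogs or OPeNDAP protocols behind it.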
Figure 5 depicts an application schematic of this gateway implementation as it was envisioned at the time GALEON was initiated. Since then, development work has integrated the underlying THREDDS/OPeNDAP services into a package called the THREDDS Data Server, which has a rudimentary WCS interface built in.
Since GALEON was initially proposed, a number of interesting and potentially productive ideas have arisen. Some of these found their way into the initial use cases, some have been undertaken as the experiment progressed, and others are just now being formulated. Ideas, interoperability strategies, recommendations for standards revisions and enhancements, and recommended best practices for interoperability are being documented in OGC Engineering Reports that are vetted among CF-netCDF SWG members and posted for public comment on the OGC website.
The GALEON IE experiments have led to increased use of CF-netCDF as an encoding standard in its own right, independent of any one web services protocol such as WCS.
Recent achievements of the CF-netCDF SWG include:
• a general roadmap for CF-netCDF standardization
• adoption of the CF convention extension to the "netCDF Core"
• a document specifying the CF-netCDF encoding for the OGC WCS 2.0 Interface Standard, which will likely lead to a new encoding model for the OGC Web Service Common Interface Standard
• semantics to address discipline-specific content
• a mapping of the CF-netCDF data model onto the ISO 19123 coverage schema
• a discussion paper on netCDF conventions for handling uncertainty information
Managing uncertainty: a universal need
The cross-fertilization that happens in the OGC has not only benefited the netCDF community, but also other communities within the OGC. The netCDF Uncertainty Conventions (netCDF-U) Discussion Paper could be the beginning of something extremely important for the geospatial world at large. This paper proposes a set of conventions for managing uncertainty information within the netCDF-3 data model and format. These conventions could provide a basis for other candidate OGC standards that would address the critical need in many application domains for robust provision of provenance information, including uncertainty information.
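One way uncertainty can ride along with the data, illustrated here with the CF mechanism of ancillary variables rather than the netCDF-U conventions themselves (whose details differ), is to store an error field alongside each data field and link the two through metadata. The names and sizes below are illustrative only:

```
netcdf uncertain_example {      // illustrative CDL, CF-style
dimensions:
    lat = 73 ;
    lon = 144 ;
variables:
    float tas(lat, lon) ;
        tas:standard_name = "air_temperature" ;
        tas:units = "K" ;
        tas:ancillary_variables = "tas_error" ;
    float tas_error(lat, lon) ;
        tas_error:standard_name = "air_temperature standard_error" ;
        tas_error:units = "K" ;
}
```

The netCDF-U Discussion Paper proposes a richer, systematic version of this pattern so that uncertainty information can be carried through an entire processing chain.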
Ben Domenico, outreach coordinator at the Unidata Program Center, chair of the OGC CF-netCDF 1.0 Standards Working Group.
Stefano Nativi, head of Florence division of the National Research Council of Italy – Institute of Atmospheric Pollution Research, co-chair of the OGC CF-netCDF 1.0 Standards Working Group.