Hyrax Data Server Installation and Configuration Guide

Appendices

Appendix C: Hyrax Handlers

11.C.1. CSV Handler
Introduction

This handler serves Comma-Separated Values (CSV) data. For many kinds of files, only very minor modifications to the data files are needed. If you have very complex ASCII data (e.g., data with headers), take a look at the FreeForm handler, too.

Data File Configuration

Given a simple CSV data file, such as would be written out by Excel, add a single line at the start that provides a name and OPeNDAP datatype for each column. Just as the data values in a given row are separated by a comma, so are the column names and types. Here is a small example data file with the added name<type> configuration row.

"Station<String>","latitude<Float32>","longitude<Float32>","temperature_K<Float32>","Notes<String>"

"CMWM",-34.7,23.7,264.3,

"BWWJ",-34.2,21.5,262.1,"Foo"

"CWQK",-32.7,22.3,268.4,

"CRLM",-33.8,22.1,270.2,"Blah"

"FOOB",-32.9,23.4,269.69,"FOOBAR"

Supported OPeNDAP Datatypes

The CSV handler supports the following DAP2 simple types: Int16, Int32, Float32, Float64, String.

Dataset Representation

The CSV handler represents the columns in the dataset as arrays with the named dimension record. For example, the sample data shown above will be represented in DAP2 by this handler as:

Dataset {
        String Station[record = 5];
        Float32 latitude[record = 5];
        Float32 longitude[record = 5];
        Float32 temperature_K[record = 5];
        String Notes[record = 5];
    } temperature.csv;

This is in contrast to the FreeForm handler that would represent these data as a Sequence with five columns.

For each column, the corresponding Array in the OPeNDAP dataset has one attribute named type whose string value is one of Int16, …, String. However, see below for information on how to add custom attributes to a dataset.
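For illustration, the DAS for the sample dataset above would include per-column type attributes along these lines (a sketch; the exact layout comes from the server):

  Attributes {
      Station {
          String type "String";
      }
      latitude {
          String type "Float32";
      }
  }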

Known Problems

There are no known problems.

Configuration Parameters

Configuring the Handler

This handler has no specific configuration parameters.

Configuring Datasets

There are two ways to add custom attributes to a dataset. First, you can use an ancillary attribute file in the same directory as the dataset. Alternatively, you can use NcML to add new attributes, variables, etc. See the NcML Handler documentation for more information on that option. Here we will describe how to set up an ancillary attribute file.

Simple Attribute Definitions

For any OPeNDAP dataset, it is possible to write an ancillary attributes file like the following. If that file is named dataset.das, then whenever Hyrax reads dataset it will also read those attributes and return them when asked.

Attributes {
       Station {
          String buoy_type "flashing";
          Byte Age 53;
       }
       Global {
           String DateCompiled "11/17/98";
           String Conventions "CF-1.0", "CF-1.6";
       }
    }

The format of this file is very simple: each variable in the dataset may have a collection of attributes, each of which consists of a type, a name, and one or more values. In the above example, the variable Station will have the additional attributes buoy_type and Age with the respective types and values. Note that datasets may also define global attributes - information about the dataset as a whole - by adding a section with a name that doesn't match the name of any variable in the dataset. In this example, I used Global for this (because it's obvious) but I could have used foo. Also note that the attribute Conventions has two values, CF-1.0 and CF-1.6.
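A quick way to check that Hyrax has picked up an ancillary attribute file is to request the dataset's DAS response and look for the added attributes. The host, port, and path in this sketch are hypothetical:

  curl http://localhost:8080/opendap/data/csv/temperature.csv.das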

11.C.2. GeoTIFF, GRIB2, JPEG2000 Handler
Introduction

This handler will serve data stored in files that can be read using the GDAL GIS library, including GeoTIFF, JPEG2000 and GRIB2.

Dataset Representation

These are all GIS datasets, so DAP2 responses will contain Grid variables with latitude and longitude maps. For DAP4, the responses will be coverages with latitude and longitude domain variables.

Known Problems

Often a bare GeoTIFF, JPEG2000, or GRIB2 file contains none of the metadata that make the data useful to people who are not already familiar with the particular dataset. Thus, in most cases some extra work will have to be done, using either NcML or an ancillary DAS file, to add metadata to the dataset.

Configuration Parameters

None.

11.C.3. The HDF4 Handler
Introduction

This release of the server supports HDF 4.2 and can read any file readable using that version of the API. It also supports reading/parsing HDF-EOS attribute information and provides some special mappings for HDF-EOS files depending on the handler's build options.

Mappings Between the HDF4 Data Model and DAP2 Data Types

SDS

This is mapped to a DAP2 Grid (if it has a dimension scale) or an Array (if it lacks one).

Raster image

This is read via the HDF 4.0 General Raster interface and is mapped to Array. Each component of a raster is mapped to a new dimension labeled accordingly. For example, a 2-dimensional, 3-component raster is mapped to an m x n x 3 Array.

Vdata

This is mapped to a Sequence, each element of which is a Structure. Each subfield of the Vdata is mapped to an element of the Structure. Thus a Vdata with one field of order 3 would be mapped to a Sequence of 1 Structure containing 3 base types. Note: Even though these appear as Sequences, the data handler does not support applying relational constraints to them. You can use the array notation to request a range of elements.

Attributes

HDF attributes on SDSs and rasters are mapped directly to DAP attributes (HDF does not yet support Vdata attributes). File attributes (for both SDS and raster) are mapped as attributes of a DAP variable called "HDF_GLOBAL" (by analogy to the way DAP handles netCDF global attributes, i.e., attaching them to "NC_GLOBAL").

Annotations

HDF file annotations are mapped in the DAP to attribute values of type "String" attached to the fake DAP variable named "HDF_ANNOT". HDF annotations on objects are currently not read by the server.

Vgroups

Vgroups are mapped directly to Structures.

Mappings for the HDF-EOS Data Model

This needs to be documented.

Special Characters in HDF Identifiers

A number of non-alphanumeric characters (e.g., space, #, +, -) used in HDF identifiers are not allowed in the names of DAP objects, object components, or in URLs. The HDF4 data handler therefore deals internally with translated versions of these identifiers. For the translation, the WWW convention of escaping such characters by replacing them with "%" followed by the hexadecimal value of their ASCII code is used. For example, "Raster Image #1" becomes "Raster%20Image%20%231". These translations should be transparent to users of the server (but they will be visible in the DDS, DAS, and in any applications which use a client that does not translate the identifiers back to their original form).

Known Problems

Handling of Floating Point Attributes

Because the DAP software encodes attribute values as ASCII strings there will be a loss of accuracy for floating point attributes. This loss of accuracy is dependent on the version of the C++ I/O library used in compiling/linking the software (i.e., the amount of floating point precision preserved when outputting to ASCII is dependent on the library). Typically it is very small (e.g., at least six decimal places are preserved).

Handling of Global attributes

  • The server will merge the separate global attributes for the SD and GR interfaces with any file annotations into one set of global attributes. These will then be available through any of the global attribute access functions.
  • If the client opens a constrained dataset (e.g., in SDstart), any global attributes of the unconstrained dataset will not be accessible because the constraint creates a "virtual dataset" which is a subset of the original unconstrained dataset.

How to Install CF-enabled HDF4 Handler Correctly

The first step in using the HDF4 handler with the CF option is to install the handler correctly, because it has three different options. We'll call them default, generic, and hdfeos2 for convenience.

  • default: This option gives the same output as the legacy handler.
  • generic: This option gives the output that meets the basic CF conventions regardless of HDF4 and HDF-EOS2 products. Some HDF4 products can meet the extra CF conventions while most HDF-EOS2 products will fail to meet the extra CF conventions.
  • hdfeos2: This option treats HDF-EOS2 products differently so that their output follows not only the basic CF conventions but also the extra CF conventions. For HDF4 products, the output is the same as with the generic option.

Pick the Right RPM Instead of Building from Source

If you use a Linux system that supports the RPM package manager and you have superuser privileges, the easiest way to install the HDF4 handler is to use the RPMs provided on the OPeNDAP, Inc. website.

The OPeNDAP download website provides two RPMs --- one with HDF-EOS and one without HDF-EOS. You should pick the one with HDF-EOS if you want to take advantage of the extra CF support provided by the handler. If you pick the one without HDF-EOS, please make sure that the H4.EnableCF key is set to "true" in the h4.conf file. See section 3.1 below for the full usage.

Here are the two basic commands for removing and adding RPMs (a concrete example follows the list):

  • Remove any existing RPM package using 'rpm -e <package_name>'.
  • Install a new RPM package using 'rpm -i <package_name.rpm>'.
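For example, upgrading the handler package might look like the following; the package file name here is illustrative:

  rpm -e hdf4_handler
  rpm -i hdf4_handler-3.11.0-1.x86_64.rpm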

1) Download and install the latest "libdap", "BES", and "General purpose handlers (aka dap-server)" RPMs first from

  http://opendap.org/download/hyrax

2) Download and install the latest "hdf4_handler" RPM from

  http://opendap.org/download/hyrax

3) (Optional) Configure the handler after reading section 3 below.

4) (Re)start the BES server.

  %/usr/bin/besctl (re)start

Build With the HDF-EOS2 Library If You Plan to Support HDF-EOS2 Products

If you plan to build the handler yourself instead of using RPMs, and you want to support HDF-EOS2 products, please consider installing the HDF-EOS2 library first. Then build the handler, specifying --with-hdfeos2=/path/to/hdfeos2-install-prefix during the configuration stage, as shown below:

  ./configure --with-hdf4=/usr/local --with-hdfeos2=/usr/local/

Although the HDF-EOS2 library is not required to clean up the dataset names and attributes that the CF conventions require, visualization will fail for most HDF-EOS2 products without the HDF-EOS2 library. Therefore, we strongly recommend using the --with-hdfeos2 configuration option if you plan to serve NASA HDF-EOS2 data products. The --with-hdfeos2 configuration option affects only the output for HDF-EOS2 files (including hybrid files), not pure HDF4 files.

As long as the H4.EnableCF key is set to true as described in section 3.1 below, the HDF4 handler will generate output that conforms to the basic CF conventions even if the HDF-EOS2 library is not specified with the --with-hdfeos2 configuration option. In that case, all HDF-EOS2 objects will be treated as pure HDF4 objects.
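As a sketch, assuming both HDF4 and HDF-EOS2 are installed under /usr/local, a complete build would look like:

  ./configure --with-hdf4=/usr/local --with-hdfeos2=/usr/local
  make
  make install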

Please see the INSTALL document for step-by-step instructions on building the handler.

Configuration Parameters

CF Conventions and How They Relate to the New HDF4 Handler

Before we discuss the usage further, it’s very important to know what the CF conventions are. The CF conventions precisely define metadata that provide a description of physical, spatial, and temporal properties of the data. This enables users of data from different sources to decide which quantities are comparable, and facilitates building easy-to-use visualization tools with maps in different projections.

Here, we define the two levels of meeting the CF conventions: basic and extra.

  • Basic: CF conventions have basic (syntactic) rules in describing the metadata itself correctly. For example, dimensions should have names; certain characters are not allowed; no duplicate variable dimension names are allowed.
  • Extra: All physical, spatial, and temporal properties of the data are correctly described so that visualization tools (e.g., IDV and Panoply) can pick them up and display datasets correctly with the right physical units. A good example is the use of "units" and "coordinates" attributes.

If you look at NASA HDF4 and HDF-EOS2 products, they are very diverse in how they describe themselves, and they fail to meet the CF conventions in many ways. Thus, the HDF4 handler aims to meet the conventions by correcting the OPeNDAP attribute (DAS), description (DDS), and data outputs on the fly. Although we tried our best to implement the "extra" level of CF support, some products make that inherently difficult; in those cases we settled for the basic level.

BES Keys in h4.conf

You can control HDF4 handler’s output behavior significantly by changing key values in a configuration file called "h4.conf".

If you used RPMs, you can find the h4.conf file in /etc/bes/modules/. If you built one, you can find the h4.conf file in {prefix}/etc/bes/modules.

The following BES keys are newly added in the h4.conf file. The default configuration value for each key is given in parentheses.

H4.EnableCF (true)

If this key's value is false, the handler will behave the same as the default handler. The output will not follow the basic CF conventions, and object and attribute names will not be corrected to follow them. Most NASA products then cannot be visualized by visualization tools that follow the CF conventions, such as IDV and Panoply.

The rest of the keys below rely on this option: it must be set to "true" for the other keys to take effect. Thus, this is the most important key to turn on.
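In h4.conf the key is set like any other BES key; for example:

  H4.EnableCF=true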

H4.EnableMODAPSFile (false)

When this key is set to true and the HDF-EOS2 library is used, an extra HDF file handle (obtained by calling SDstart) is opened at the beginning of building the DAS, DDS, and Data responses. This may be useful for a server that mounts its data over the network. If you are not sure about your server settings, leave this key set to false or comment it out. By default this key is turned off.

H4.EnableSpecialEOS (true)

When this key is turned on, the handler will process AIRS level 3 version 6 products and MOD08_M3-like products in a speedy way by taking advantage of the special data structures in those two kinds of products. Using this key currently requires building with the HDF-EOS2 library, although the library will not actually be called. With this key on, HDF-EOS2 files that provide dimension scales for all dimensions may also be handled more quickly. By default, this key is set to true.

H4.DisableScaleOffsetComp (true)

Some NASA HDF4 (MODIS, etc.) products do not follow the CF rules for packing data. To avoid confusing OPeNDAP clients, the handler may adopt one of the following two approaches:

  1. Apply the scale and offset computation to each individual data point when the product's scale and offset rule doesn't follow CF.
  2. If possible, transform the scale and offset rule to the CF rule.

Since approach 1 may degrade the performance of fetching large amounts of data because of the heavy computation, we recommend approach 2, which is selected by setting this key to true. By default, this key should always be true.

H4.EnableCheckScaleOffsetType (false)

When this key is turned on, the handler will check whether the datatypes of scale_factor and offset are the same, as required by CF. If they don't share the same datatype, the handler will make the datatype of offset the same as that of scale_factor.

Since we haven't yet found datatype inconsistencies between scale_factor and offset, and in order not to affect performance, this key is set to false by default.

H4.EnableHybridVdata (true)

If this key's value is false, additional Vdatas, such as "Level 1B Swath Metadata" in the LAADS MYD021KM product, will not be processed or made visible in the DAS/DDS output. Those additional Vdatas were added directly using the HDF4 APIs; the HDF-EOS2 APIs cannot access them.

H4.EnableCERESVdata (false)

Some CERES products (CER_AVG, CER_ES4, CER_SRB, and CER_ZAVG; see the descriptions in HDFSP.h) have many SDS fields and some Vdata fields. Correspondingly, the DDS and DAS responses may be very long, and the performance of accessing such products with visualization clients may be greatly affected. It may even choke netCDF-Java clients.

To avoid such cases, by default we do not map Vdatas to DAP for these products. Users can turn on this key to see the Vdata information of some CERES products. This key does not affect access to other products.

H4.EnableVdata_to_Attr (true)

If this key's value is false, small vdata datasets will be mapped to arrays in the DDS output instead of attributes in the DAS.

If this key's value is true, a vdata is mapped to attributes if it has 10 or fewer records.

For example, the DAS output of TRMM data 1B21 will show vdata as an attribute:

  DATA_GRANULE_PR_CAL_COEF {
           String hdf4_vd_desc "This is an HDF4 Vdata.";
           Float32 Vdata_field_transCoef -0.5199999809;
           Float32 Vdata_field_receptCoef 0.9900000095;
           Float32 Vdata_field_fcifIOchar 0.000000000, 0.3790999949, 0.000000000,
           -102.7460022, 0.000000000, 24.00000000, 0.000000000, 226.0000000, 0.000000000,
           0.3790999949, 0.000000000, -102.7460022, 0.000000000, 24.00000000, 0.000000000,
           226.0000000;
       }

H4.EnableCERESMERRAShortName (true)

If this key's value is false, the object name will be prefixed with the vgroup path and the fullpath attribute will not be printed in the DAS output. This key only affects the NASA CERES and MERRA products we support.

For example, the DAS output for the Region_Number dataset

    Region_Number {
            String coordinates "Colatitude Longitude";
            String fullpath "/Monthly Hourly Averages/Time And Position/Region Number";
       }

becomes

   Monthly_Hourly_Averages_Time_And_Position_Region_Number {
            String coordinates "Monthly_Hourly_Averages_Time_And_Position_Colatitude 
              Monthly_Hourly_Averages_Time_And_Position_Longitude";
       }

in CER_AVG_Aqua-FM3-MODIS_Edition2B_007005.200510.hdf.

H4.DisableVdataNameclashingCheck (true)

If this key's value is false, the handler will check whether any vdata has the same name as an SDS. We haven't found such a case in NASA products, so it's safe to disable this check to improve performance.

H4.EnableVdataDescAttr (false)

If this key's value is true, the handler will generate the vdatas' attributes. By default it is turned off, because most NASA hybrid products do not seem to store important information in vdata attributes. If you serve pure HDF4 files, it is recommended to set this key to true so that users can see all the data. This key does not change the behavior triggered by the H4.EnableVdata_to_Attr key in section 3.3, except for the vdata attributes of small vdatas that are mapped to attributes in the DAS instead of arrays in the DDS. That is, if this key is turned off, only the attributes of those small vdatas are omitted from the DAS output, not the values of the vdatas themselves. If a vdata doesn't have any attributes or field attributes, the description

       String hdf4_vd_desc "This is an HDF4 Vdata.";

will not appear in the attribute container for that vdata even when the key is true. The attribute container of the vdata will always appear, regardless of this key's value.

H4.EnableCheckMODISGeoFile (false)

For MODIS swath data products that use the dimension map: if this key's value is true and a MODIS geolocation product such as MOD03 is present in the same directory as the swath product, the geolocation values will be retrieved using the geolocation fields in the MOD03/MYD03 file instead of being interpolated according to the dimension map formula.

We feel this is a more accurate approach, since additional corrections may have been applied to the geolocation values stored in those files [1], although a case study we performed shows that the differences between the interpolated values and the values stored in the geolocation file are very small.

For example, when the handler serves the file

       "MOD05_L2.A2010001.0000.005.2010005211557.hdf"

it will first look for the geolocation file

       "MOD03.A2010001.0000.005.2010003235220.hdf"

in the SAME DIRECTORY as the MOD05_L2 file.

Please note that the "A2010001.0000" in the middle of the name is the "Acquisition Date" of the data so the geo-location file name should have exactly the same string. Handler uses this string to identify if a MODIS geo-location file exists or not.

This feature works only with HDF-EOS2 MODIS products. It does not work for pure HDF4 MODIS products like MOD14 that require the MOD03 geolocation product. That is, putting the MOD03 file in the same directory as MOD14 will not affect the handler's DAS/DDS/DDX output for the MOD14 product.

[1] http://modis.gsfc.nasa.gov/data/dataprod/nontech/MOD0203.php

H4.CacheDir (no longer supported)

The HDF4 handler used to support caching its response objects, but that feature has been removed due to problems with it and with datasets where multiple SDS objects had arrays with the same names. This parameter is now ignored. Note that no error message is generated if your h4.conf file includes it; it is simply ignored by Hyrax 1.7 and later.

11.C.4. The HDF5 Handler
Introduction

This release of the server supports HDF5 files written using any version of the HDF5 API. The handler should be built/linked with version 1.8.x of the API.

Mappings Between the HDF5 Data Model and DAP2 Data Types

The mapping between the HDF5 and HDF-EOS5 data model and DAP2 is documented in a NASA Technical Note (ESDS-RFC-017). This note is quite detailed; a summary from its appendix is provided below.

Special Characters in HDF Identifiers

A number of non-alphanumeric characters (e.g., space, #, +, -) used in HDF identifiers are not allowed in the names of DAP objects, object components, or in URLs. The HDF5 data handler therefore deals internally with translated versions of these identifiers. For the translation, the WWW convention of escaping such characters by replacing them with "%" followed by the hexadecimal value of their ASCII code is used. For example, "Raster Image #1" becomes "Raster%20Image%20%231". These translations should be transparent to users of the server (but they will be visible in the DDS, DAS, and in any applications which use a client that does not translate the identifiers back to their original form).

Known Problems

Handling of Floating Point Attributes

Because the DAP software encodes attribute values as ASCII strings there will be a loss of accuracy for floating point attributes. This loss of accuracy is dependent on the version of the C++ I/O library used in compiling/linking the software (i.e., the amount of floating point precision preserved when outputting to ASCII is dependent on the library). Typically it is very small (e.g., at least six decimal places are preserved).

Configuration Parameters

H5.EnableCF

This is an option to support NASA HDF5/HDF-EOS5 data products. HDF5/HDF-EOS5 data products do not follow the CF conventions, but the hdf5_handler can make them follow the conventions if you turn on this option. The key benefit of this option is to allow OPeNDAP visualization clients to display remote data seamlessly. Visit the HDF-EOS site for more details.

H5.IgnoreUnknownTypes

Ignore variables that use data types the handler cannot process. In practice this means 64-bit integers. DAP2 does not support the 64-bit integer type; using this parameter (i.e., setting its value to yes or true) means that 64-bit integer variables are ignored and the rest of the variables in the file can be read. The default value of this parameter (no or false) configures the handler to return an error when a 64-bit integer variable is found.
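For example, to have the handler skip 64-bit integer variables rather than return an error, set the key in the handler's configuration file (h5.conf, which lives alongside h4.conf in the BES modules directory):

  H5.IgnoreUnknownTypes=true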

H5.EnableCheckNameClashing

This option checks whether there are duplicate variable names once group information is removed and certain characters are replaced with the underscore character. Although this check is required to guarantee that no two variable names are the same, the operation is quite costly and can degrade performance significantly. If you are certain that your HDF5 data won't have any duplicate names, you can turn this off to improve the server's performance. For the NASA HDF5/HDF-EOS5 products we tested (AURA OMI/HIRDLS/MLS/TES, MEaSUREs SeaWiFS/Ozone, Aquarius, GOSAT/acos, SMAP), we did not find any name clashing, so the name clashing check seems unnecessary and this option is turned off by default. The handler will check name clashing for any untested products regardless of whether this option is turned on or off.

H5.EnableAddPathAttr

When this option is turned off, the HDF5 handler will not insert the fullnamepath and origname attributes into the DAS output. For example, DAS output like this:

     temperature {
            String units "K";
            String origname "temperature";
            String fullnamepath "/HDFEOS/GRIDS/GeoGrid/Data Fields/temperature";
            String orig_dimname_list "XDim ";
          }

…will change to…

     temperature {
            String units "K";
            String orig_dimname_list "XDim ";
         }

H5.EnableDropLongString

The netCDF-Java client cannot handle strings larger than 32767 bytes and will throw an error when such a variable is encountered. Thus, the HDF5 handler needs to hide such long string variables from the DAS and DDS output. Setting this option to true ensures that netCDF-Java-based OPeNDAP visualization clients such as IDV and Panoply can visualize the other, more important variables.
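For example, in the same configuration file:

  H5.EnableDropLongString=true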

H5.DisableStructMetaAttr

When this option is true, the StructMetadata attribute will not be generated in the DAS output.

HDF5 and HDF-EOS to DAP Type Mappings

  1. The complete set of mappings for the types in the HDF5 and HDF-EOS5 data model:

  • 8-bit unsigned integer → Byte
  • 8-bit signed integer → Int16
  • 16-bit unsigned integer → UInt16
  • 16-bit signed integer → Int16
  • 32-bit unsigned integer → UInt32
  • 32-bit signed integer → Int32
  • 64-bit unsigned integer → N/A (results in an error unless H5.IgnoreUnknownTypes is true; see above)
  • 64-bit signed integer → N/A (results in an error unless H5.IgnoreUnknownTypes is true; see above)
  • 32-bit floating point → Float32
  • 64-bit floating point → Float64
  • String → String
  • Object/region reference → URL
  • Compound → Structure. An HDF5 compound can be mapped to DAP2 provided that its base members (excluding object/region references) can be mapped to DAP2.
  • Dataset → variable. An HDF5 dataset can be mapped to DAP2 provided that its datatype can be mapped to DAP2.
  • Attribute → attribute. An HDF5 attribute can be mapped to DAP2 provided that its datatype can be mapped to DAP2 and the data is either a scalar or a one-dimensional array.
  • Group → naming convention. A special attribute HDF5_ROOT_GROUP is used to represent the HDF5 group structure; the absolute path of the HDF5 dataset is used as the DAP2 variable name. An HDF5 group can be mapped to DAP2 provided that the file structure is a tree.
  • HDF-EOS5 grid w/1-D projection → Grid. The latitude and longitude are encoded according to CF.
  • HDF-EOS5 grid w/2-D projection → Arrays. Data variables are mapped to DAP2 Arrays; DAP2 Arrays are generated for latitude and longitude (following CF); a coordinates attribute is added to each variable giving the names of the coordinate variables (following CF).
  • HDF-EOS5 Swath → Arrays. Follows the same prescription as the HDF-EOS5 2-D grids.

11.C.5. The NetCDF Handler
Introduction

There are several versions of the netCDF software for reading and writing data, and using those different versions it is possible to make several different kinds of data files. For the most part, netCDF strives to maintain compatibility, so that any older file can be read using any newer version of the library. To ensure that the netCDF handler can read (almost) any valid netCDF data file, you should make sure to use the latest version of the netCDF library when you build or install the handler.

However, as of netCDF 4, there are some new data model components in netCDF that are hard to represent in DAP2 (hence the 'almost' in the preceding paragraph). If the handler, as of version 3.10.x, is linked with netCDF 4.1.x or later, you will be able to read any netCDF file that fits the 'classic' model of netCDF (as defined by Unidata’s documentation) which essentially means any file that uses only data types present in the netCDF 3.x API but with the addition that these files can employ both internal compression and chunking.

The new data types present in the netCDF data model present more of a challenge. However, as of version 3.10.x, the Hyrax data handler will serve most of the new cardinal types and the more commonly used 'user defined types'.

Mappings Between NetCDF Version 4 Data Model and DAP2 Data Types

All of the cardinal types in the netCDF 4 data model map directly to types in DAP2 except for the following:

NC_BYTE

There is no 'signed byte' type in DAP2 so these map to an unsigned byte or signed Int16, depending on the value of the option NC.PromoteByteToShort (see below where the configuration parameters are described).

NC_CHAR

There is no 'character' type in DAP2, so these map to DAP Strings of length one. Character arrays with N dimensions in netCDF map to String arrays with N-1 dimensions in DAP.

NC_INT64, NC_UINT64

DAP2 does not support 64-bit integers (support will be added in the next version of the protocol).

Mappings for netCDF 4’s User Defined types

In the netCDF documentation, types such as Compound (which is effectively C's struct type), etc., are called User Defined types. Unlike the cardinal types, netCDF 4's user defined types don't always have a simple mapping to DAP2's types. However, the most important of the user defined types, NC_COMPOUND, does map directly to DAP2's Structure. Here's how the user defined types are mapped by the handler as of version 3.10:

NC_COMPOUND

This maps directly to a DAP2 Structure. The handler works with both compound variables and attributes. For attributes, the handler only recognizes scalar and vector (one-dimensional) compounds. For variables, scalar and array compounds are supported, including compounds within compounds and compounds with fields that are arrays.

NC_VLEN

Not supported

NC_ENUM

Supported so long as the 'base type' is not a 64-bit integer. We add extra attributes to help the downstream user: DAP2_OriginalNetCDFBaseType with the value NC_ENUM, and DAP2_OriginalNetCDFTypeName with the name of the type from the file (enums in netCDF are user-defined types, so they have names set by the folks who wrote the file). We also add two attributes that provide information about the integral values and their names (e.g., Clear = 0, Cumulonimbus = 1, Stratus = 2, …, Missing = 255): DAP2_EnumValues and DAP2_EnumNames.

NC_OPAQUE

This type is mapped to an array of Bytes (so a scalar NC_OPAQUE becomes a one-dimensional array in DAP2). If a netCDF file contains an array (with M dimensions) of NC_OPAQUE variables, then the DAP response will contain a Byte array with M+1 dimensions. In addition, the handler adds an attribute DAP2_OriginalNetCDFBaseType with the value NC_OPAQUE and DAP2_OriginalNetCDFTypeName with the name of the type from the file to the Byte variable, so that savvy clients can see what's going on. Even though the DAP2 object for an NC_OPAQUE is an array, it cannot be subset (but arrays of NC_OPAQUEs can be subset, with the restriction that the M+1 dimensional DAP2 Byte array can only be subset in the original NC_OPAQUE's M dimensions).

NetCDF 4’s Group

The netCDF handler currently reads only from the root group.

Configuration parameters

IgnoreUnknownTypes

When the handler reads a type that it does not recognize, it will normally signal an error and stop processing. Setting this parameter to true will cause it to silently ignore the unknown type (an error message may be written to the bes log file).

Accepted values: true|yes|false|no; default: false.

Example:

NC.IgnoreUnknownTypes=true

ShowSharedDimensions

Include shared dimensions as separate variables. This feature is included to support older clients based on the netCDF library. Some versions of the library depend on the shared dimensions appearing as variables at the 'top' of the file.

Clients that announce to the server that they understand newer versions of the DAP (3.2 and up) won't need these extra variables, while older ones likely will. In the 3.10.0 version of the handler, the DAP version that clients announce they can accept determines how the handler responds, unless this parameter is set, in which case the value in the configuration file overrides that default behavior.

Accepted values: true|yes|false|no; default: false.

Example:

NC.ShowSharedDimensions=false

PromoteByteToShort

This option first appears in Hyrax 1.8; version 3.10.0 of the netcdf_handler.

Note: Hyrax version 1.8 ships with this turned on in the netcdf handler’s configuration file, even though the default for the option is off.

Use this option to promote DAP2 Byte variables and attributes to Int16, noting that Byte is unsigned and Int16 is signed, so this is a way to preserve the sign of netCDF’s signed Byte data type.

For netCDF4 files, this option behaves the same, except that NC_OPAQUE variables are externalized as DAP Bytes regardless of the option's value; their Byte attributes, on the other hand, are promoted to Int16 when the option is true.

Backstory: In netCDF the Byte data type is signed, while in DAP2 it is unsigned. For data (i.e., variables) this often makes no real difference, because byte data are often read from the network and dumped into an array where their sign is interpreted (correctly or not) by the client software - in other words, byte data are often a special case. However, this is, strictly speaking, wrong. In addition, and maybe more importantly, for attributes the values are interpreted by the server and represented in ASCII (and sent to the client as text), so the sign is interpreted by the server and the resulting text is converted into a binary value by the client; the simple trick of letting the default C types handle the value's sign won't work. One way around this incompatibility is to promote Byte in DAP2 to Int16, which is a signed type.

Accepted values: true|yes|false|no; default: false (the server's original behavior).

Example:

NC.PromoteByteToShort=true

NetCDF to DAP Type Mappings

  1. The complete set of mappings for the types in the netCDF 4 data model (new types that are not supported are noted as such):

  • NC_BYTE (8-bit signed integer) → dods_byte (8-bit unsigned integer) or dods_int16 (16-bit signed integer). The DAP2 type is unsigned; this mapping can be changed so that a netCDF Byte maps to a DAP2 Int16, which preserves the netCDF Byte's sign bit (see the NC.PromoteByteToShort configuration parameter).
  • NC_UBYTE (8-bit unsigned integer) → dods_byte (8-bit unsigned integer).
  • NC_CHAR (8-bit unsigned integer) → dods_str (variable length character string). Treated as character data; arrays are treated specially (see text).
  • NC_SHORT (16-bit signed integer) → dods_int16 (16-bit signed integer).
  • NC_USHORT (16-bit unsigned integer) → dods_uint16 (16-bit unsigned integer).
  • NC_INT (32-bit signed integer) → dods_int32 (32-bit signed integer).
  • NC_UINT (32-bit unsigned integer) → dods_uint32 (32-bit unsigned integer).
  • NC_INT64 (64-bit signed integer) → none; not supported.
  • NC_UINT64 (64-bit unsigned integer) → none; not supported.
  • NC_FLOAT (32-bit floating point) → dods_float32 (32-bit floating point).
  • NC_DOUBLE (64-bit floating point) → dods_float64 (64-bit floating point).
  • NC_STRING (variable length character string) → dods_str (variable length character string). In DAP2 it is impossible to distinguish this from an array of NC_CHAR.
  • NC_COMPOUND (a user defined type similar to C's struct) → dods_structure (a DAP Structure, similar to C's struct).
  • NC_OPAQUE (a BLOB data type) → dods_byte (an array of bytes). The handler adds two attributes (DAP2_OriginalNetCDFBaseType with the value NC_OPAQUE and DAP2_OriginalNetCDFTypeName with the type's name) that provide information for savvy clients; see the text above for subsetting details.
  • NC_ENUM (similar to C's enum) → dods_byte, …, dods_uint32 (any integral type). The handler chooses an integral type depending on the type used in the netCDF file. It adds the DAP2_OriginalNetCDFBaseType and DAP2_OriginalNetCDFTypeName attributes as with NC_OPAQUE, and also DAP2_EnumNames and DAP2_EnumValues. Enums with 64-bit integer base types are not supported.
  • NC_VLEN (variable length arrays) → none; not supported.

11.C.6. The SQL Handler
Introduction
Note: This handler is not included with the source or binary versions of Hyrax we distribute as our official releases. You must download the software and build it yourself at this time.

This handler will serve data stored in a relational database if that database is configured to be accessed using ODBC. The handler has been tested using both the unixODBC and iODBC driver managers, on Linux and OS X respectively. While our testing has been limited to the MySQL and Postgres database servers, the handler is not specific to either of those servers; it should work with any database that can be accessed using an ODBC driver.

The handler can be configured to combine information from several tables and provide access to it as a single dataset, including performing the full range of SQL operations. At the same time, the SQL database server is never exposed to the web using this handler, so the database contents are safe.

Mappings Between the ODBC Data Types and DAP2 Data Types

The SQL handler maps the datatypes defined by SQL onto types defined by DAP. In most cases the mapping is obvious. Here we document each of the supported SQL types and their corresponding DAP type. Note that any type not listed here causes a fatal runtime error. That is, if you include in the [select] part of the dataset file the name of a column with an unsupported datatype, the handler will return an error saying SQL Handler: The datatype read from the Data Source is not supported. The problem type code is: <type code>.

Table 7. The mapping between ODBC and DAP datatypes

  • SQL_C_CHAR → Str
  • SQL_C_SLONG, SQL_C_LONG → Int32
  • SQL_C_SHORT → Int16
  • SQL_C_FLOAT → Float32
  • SQL_C_DOUBLE → Float64
  • SQL_C_NUMERIC → Int32
  • SQL_C_DEFAULT → Str
  • SQL_C_DATE, SQL_C_TIME, SQL_C_TIMESTAMP, SQL_C_TYPE_DATE, SQL_C_TYPE_TIME, SQL_C_TYPE_TIMESTAMP → Str
  • SQL_C_BINARY, SQL_C_BIT → Int16
  • SQL_C_SBIGINT, SQL_C_UBIGINT → Int32
  • SQL_C_TINYINT, SQL_C_SSHORT, SQL_C_STINYINT → Int16
  • SQL_C_ULONG, SQL_C_USHORT → Int32
  • SQL_C_UTINYINT → Int32

Table 8. The mapping between SQL and ODBC datatypes

  • SQL_CHAR, SQL_VARCHAR, SQL_LONGVARCHAR
  • SQL_WCHAR, SQL_WVARCHAR, SQL_WCHAR
  • SQL_DECIMAL, SQL_NUMERIC

Known Problems

It's not exactly a problem, but the configuration of this handler depends on correctly configuring the ODBC driver, and those drivers vary by operating system and implementation. This does not simplify the configuration of this component of the server!

Configuration Parameters

Configuring the ODBC Driver

To configure the handler, the handler itself must be told which tables, or parts of tables, should be accessed, and the ODBC driver must be configured. In general, ODBC drivers are pretty easy to configure and, while each driver has its idiosyncrasies, most of the setup is the same for any driver/database combination. Both unixODBC and iODBC use two configuration files: /etc/odbcinst.ini and /etc/odbc.ini. The driver should have documentation on these files and their setup. There is one parameter you will need to know to make use of the SQL handler: in the odbc.ini file, the parameter database is used to reference the actual database that is matched to a particular Data Source Name (DSN). You will need to know the DSN, since programs that use ODBC to access a database use the DSN and not the name of the database. In addition, there is a user and password parameter set defined for a particular DSN; the SQL handler will likely need those too (NB: this might not actually be needed 9/9/12).

What the configuration files look like on OSX:

odbcinst.ini

[ODBC Drivers]
MySQL ODBC 5.1 Driver = Installed
psqlODBC              = Installed
[ODBC Connection Pooling]
PerfMon    = 0
Retry Wait =

[psqlODBC]
Description = PostgreSQL ODBC driver
Driver      = /Library/PostgreSQL/psqlODBC/lib/psqlodbcw.so
[MySQL ODBC 5.1 Driver]
Driver = /usr/local/lib/libmyodbc5.so

This file holds information about the database name and the Data Source Name (DSN). Here it’s creatively named 'test'.

odbc.ini:

[ODBC Data Sources]
data_source_name = test
[ODBC]
Trace         = 0
TraceAutoStop = 0
TraceFile     =
TraceLibrary  =
[test]
Description = MySQL test database
Trace       = Yes
TraceFile   = sql.log
Driver      = MySQL ODBC 5.1 Driver
Server      = localhost
User        = jimg
Password    =
Port        = 3306
DATABASE    = test
Socket      = /tmp/mysql.sock

Configuring the Handler

SQL.CheckPoint

Checkpoints in the SQL handler are phases of the database access process where error conditions can be tested for and reported. If these are activated using the SQL.CheckPoint parameter and an error is found, then a message will be printed in the bes.log and an exception will be thrown. There are five checkpoints supported by the handler:

  CONNECT: 1 (fatal error)
  CLOSE: 2
  QUERY: 3
  GET_NEXT: 4 (recoverable error)
  NEXT_ROW: 5

The default for the handler is to test for and report all errors:

SQL.CheckPoint=1,2,3,4,5

Configuring Datasets

One aspect of the SQL handler that sets it apart from other handlers is that the datasets it serves are not files or collections of files. Instead, they are values read from one or more tables in a database. The handler uses one file for each dataset it serves; we call these dataset files. Within a dataset file there are several sections that define which Data Source Name (DSN) to use (recall that the DSN is set in the odbc.ini file, which maps the DSN to a particular database, user, and password), which tables to read, how to combine them, which columns to select, and which other constraints should be applied when retrieving the values from the database server. As a data provider, you should plan on having a dataset file for each dataset you want people to access, even if they all come from the same table.

A dataset file has five sections:

section

This is where the DSN and other information are given

select

Here the arguments passed to select are given. This may be * or the names of columns, just as with an SQL SELECT statement.

from

The names of the tables. This is just like the FROM part of an SQL SELECT statement.

where

You’re probably seeing a pattern by now: SELECT … FROM … WHERE

other

Driver-specific parameters

Each of the sections is denoted by starting a line in the dataset file with its name in square brackets such as:

[section]

or

[select]

Information in the section Part of the Dataset File

There are six parameters that may be set in the [section] part of the dataset file:

api

Currently this must be odbc

server

The DSN.

user, pass, dbname, port

Unused. These are detected by the code, however, and can be used by a new submodule that connects to a database using a scheme other than ODBC. For example, if you were to specialize the connection mechanism so that it used a database’s native API, these keywords could be used to set the database name, user, etc., in place of the ODBC DSN. In that case the value of api would need to be the base name of the new connection specialization.

Note that a dataset file may have several [section] parts, each of which lists a different DSN. This provides a failover capability, so that if the same information (or information similar enough to be accessible using the same SQL statement) exists both locally and remotely, both sources can be given. For example, suppose that your institution maintains a database with many thousands of observations and you want to serve a subset of those. You have a copy of those data on your own computer too, but you would rather have people access the data from the institution's high performance hardware. You can list both DSNs, knowing that the first one listed will get preference.

The select Part

This part lists the columns to include, as you would write them in an SQL SELECT statement. Each column name has to be unique. You can use aliases (defined in the preamble of the dataset file) to give different names to two columns from different database tables that have the same name. For example, you could define aliases like these:

table1.theColumn as col1
table2.theColumn as col2

and then use col1, col2 in the select part of the dataset file.

The from and where Parts

Each of these parts is simply substituted into the SQL statement and passed to the database, just as you would expect. Note that you do not include the actual words FROM or WHERE, just the contents of those parts of the SQL statement.

The other Part

Entries in this part should be of the form key = value, one per line. They are taken as a group and passed to the ODBC driver. Use this section to provide any parameters that are specific to a particular driver.

Using Variables

The dataset files also support 'variables' that can be used to define a name once and then use it repeatedly, simply by using the variable name instead. If you later decide to read from a different table, only the variable definition needs to be changed. Variables are defined at the beginning of the dataset file, before the section part. The syntax for a variable definition is simple: define $variable$ = value, one per line (the $ characters are literal, as is the word define). To reference a variable, use $variable$ wherever you would otherwise use a literal.
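For example, a variable can hold a table name that is used in several places; the table and column names in this sketch are hypothetical:

  define $table$ = wind_08_2010
  [section]
  api=odbc
  server=MySQL_DSN
  [select]
  id, wind_chill
  [from]
  $table$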

Some Example Dataset Files

[section]
#  Required.
api=odbc
# This is the name of the configured DSN
server=MySQL_DSN
[select]
# The attribute list to query
# NOTE: The order used here will be kept in the results
id, wind_chill, description
[from]
# The table to use can be a complex FROM clause
wind_08_2010
[where]
# This is an optional constraint that will be applied to ALL
# the requests and can be used to limit the shared data.
id<100
Note: The following two descriptions of the File Out NetCDF code need to be combined.
11.C.7. NetCDF file responses
Introduction

The File Out NetCDF module provides the ability to return OPeNDAP DataDDS objects as netCDF files. The module takes an OPeNDAP DataDDS, translates the attributes, data structure, and data into a netCDF file, and streams the resulting file back to the caller. Currently, simple types, arrays, structures, and grids are supported. Sequences are not yet supported.

Services Handled

This module does not handle any services but adds to an existing service.

Services Provided

The module provides an additional format for the dap service's dods command. The format is used to specify a "returnAs" format. Typically you will see responses in the dap2 format; this module adds the option of returning the OPeNDAP data object as a netCDF file.

How to Use the Module

When the module is installed, the fonc.conf file is placed in the BES etc/bes/modules directory and is automatically loaded by the BES at startup. There is one configuration option that you may want to change for this module: the FONc.Tempdir parameter in the fonc.conf configuration file tells the module where to store the netCDF files generated by the module until each file is streamed back to the caller. The default value for this parameter is the /tmp directory. You should change this to a location with plenty of disk space/quota that is owned by the user the BES runs as.

FONc.Tempdir=/tmp

Other BES keys that can be used to control the handler’s behavior:

FONc.UseCompression=true

Use compression when making netCDF4 files; true by default

FONc.ChunkSize=4096

The default chunk size when making netCDF4 files, in KBytes (4k by default)

FONc.ClassicModel=true

When making a netCDF4 file, use only the 'classic' netCDF data model; true by default.

The next time the BES is started it will load this module. And, once it is installed, the OLFS will know that it can use this module to transform your data. Next to a dataset you will see the list of data products provided for that dataset; this will include a link for File Out NetCDF.

If not using the OLFS to serve your data, for example if using the bescmdln, you would run a command file that would look something like this:

<?xml version="1.0" encoding="UTF-8"?>
<request reqID="some_unique_value" >
    <setContext name="dap_format">dap2</setContext>
    <setContainer name="c" space="catalog">data/nc/fnoc1.nc</setContainer>
    <define name="d">
    <container name="c" />
    </define>
    <get type="dods" definition="d" returnAs="netcdf"/>
</request>
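Such a command file can then be run with the bescmdln client. This is a sketch that assumes the BES is listening on localhost port 10022 and that the command file above was saved as getdata.bescmd; -i names the command file to run and -f the file in which to save the response:

  bescmdln -h localhost -p 10022 -i getdata.bescmd -f fnoc1.nc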
11.C.8. Background on Returning NetCDF
General Questions and Assumptions
Note: This appendix holds general design information that we used when first implementing the Hyrax netCDF response. The fundamental problem that needs to be solved in the software is to map the full spectrum of OPeNDAP datasets to the netCDF 3 and 4 data models.
  • What version of netCDF will this support? Hyrax supports returning both Version 3 and 4 netCDF files.
  • Should I traverse the data structure to see if there are any sequences? Yes. An initial version should note their presence and add an attribute noting that they have been elided.

How to Flatten Hierarchical Types

For a structure such as:

Structure {
        Int32 x;
        Int32 y;
    } Point;

…represent that as:

Point.x
Point.y

Explicitly including the dot seems ugly and like a kludge, but it means that the new variable name can be fed back into the server to get the data. That is, a client can look at the name of the variable and figure out how to ask for it again without knowing anything about the translation process.

Because this is hardly a lossless scheme (a variable might have a dot in its name…), we should also add an attribute that contains the original name of the variable and records that it is the result of a flattening operation: that the parent variable was a Structure, Sequence, or Grid, and that its name was xyz. Given that, it should be easy to sort out how to make a future request for the data in the translated variable.

This in some way obviates the need for the dot, but it’s best to use it anyway.

Attributes of Flattened Types/Variables

If the structure Point has attributes, those should be copied to both of the new variables (Point.x and Point.y). It's redundant, but this way the recipient of the file gets all of the information held in the original data source. (6 January 2009 (PST): Added based on email from Patrick.)

The names of the attributes should be Point.name for any attributes of the structure Point, and just the name of the attribute for the variables x and y. So, if x has attributes a1 and a2, and Point has attributes a1 and a3, then the new variable Point.x will have attributes a1, a2, Point.a1, and Point.a3.

Extra Data To Be Included

For a file format like netCDF it is possible to include information about the source data using its original data model as expressed in DAP. We could then describe where each variable in the file came from. This would be a good thing if we can do it in a lightweight way. It would also be good to add an attribute to each variable that names where in the original data it came from, so that client apps and users don't have to work too hard to sort out what was changed to make the file.

Information About Specific Types

Strings

  • Add a dimension representing the maximum length of the string, named varname_len.
  • For a scalar there will be a dimension for the length, and the value is written using nc_put_vara_text with type NC_CHAR.
  • For arrays, add an additional dimension for the maximum length; the values are written using nc_put_vara_text with type NC_CHAR (see the CDL sketch after this list).
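In netCDF-3 CDL, the result for a five-element DAP String array whose longest value is nine characters might look like this (the names here are assumed for illustration):

  dimensions:
      station = 5 ;
      station_name_len = 9 ;
  variables:
      char station_name(station, station_name_len) ;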

7 January 2008 (MST) Received message from Russ Rew

Yes, that’s fine and follows a loose convention for names of string-length dimensions for netCDF-3 character arrays. For netCDF-4, of course, no string-length dimension is needed, as strings are supported as a netCDF data type.

Structures

  • Flatten
  • Prepend the name of the structure, followed by a dot, to the variable name. Keep track of the nesting, as there might be embedded structures, grids, et cetera.

18 December 2008 (PST) James Gallagher

I would use a dot even though I know that dots in variable names are, in general, a bad idea. If we use underscores then it may be hard for clients to form a name that can be used to access values from a server based on the information in the file.

Grid

  • Flatten.
  • Use the name of the grid for the array of values
  • Prepend the name of the grid plus a dot to the names of each of the map vectors.

21 December 2008 (PST) James Gallagher

A more sophisticated version might look at the values of two or more grids that use the same names and have the same type (e.g., Float64 lon[360]) and, if they are the same, make them shared dimensions.

More information about Grid translation

The idea here is that each of the map vectors will become an array with one dimension, the name of the dimension the same as the name of the variable (be careful about nested maps, see flatten). Then the grid array of values uses the same dimensions as those used in the map variables.

If there are multiple grids, then they either use the same map variables and dimensions or they use different variables with different dimensions. In other words, if one grid has a map called x with dimension x, and another grid has a map called x, then it had better be the same variable with the same dimension and values. If not, it's an error; the second grid should be using a map called y that gets written out as variable y with dimension y.

  1. Read the dap spec on grids and see if this is the convention.
  2. Read the netcdf component guide (section 2.2.1 and 2.3.1)

coads_climatology.nc (4 grids, same maps and dimensions)

Dataset {
        Grid {
          Array:
            Float32 X[TIME = 12][COADSY = 90][COADSX = 180];
          Maps:
            Float64 TIME[TIME = 12];
            Float64 COADSY[COADSY = 90];
            Float64 COADSX[COADSX = 180];
        } X;
        Grid {
          Array:
            Float32 Y[TIME = 12][COADSY = 90][COADSX = 180];
          Maps:
            Float64 TIME[TIME = 12];
            Float64 COADSY[COADSY = 90];
            Float64 COADSX[COADSX = 180];
        } Y;
        Grid {
          Array:
            Float32 Z[TIME = 14][COADSY = 75][COADSX = 75];
          Maps:
            Float64 TIME[TIME = 14];
            Float64 COADSY[COADSY = 75];
            Float64 COADSX[COADSX = 75];
        } Z;
        Grid {
          Array:
            Float32 T[TIME = 14][COADSY = 75][COADSX = 90];
          Maps:
            Float64 TIME[TIME = 14];
            Float64 COADSY[COADSY = 75];
            Float64 COADSX[COADSX = 90];
        } T;
    } coads_climatology.nc;

Array

  • write_array appears to be working just fine.
  • If array of complex types?

16:43, 8 January 2008 (MST) Patrick West

DAP allows array dimensions to have no names, but netCDF does not allow this. If a dimension name is empty, create one using the name of the variable + "_dim" + dim_num. So, for example, if the array a has three dimensions and none of them have names, the names will be a_dim1, a_dim2, a_dim3 (see the CDL sketch below).
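A CDL sketch of this naming rule for a hypothetical three-dimensional array a with unnamed dimensions:

  dimensions:
      a_dim1 = 10 ;
      a_dim2 = 20 ;
      a_dim3 = 30 ;
  variables:
      float a(a_dim1, a_dim2, a_dim3) ;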

Sequences

  • For now throw an exception
  • To translate a Sequence, there are several cases to consider:
    • A Sequence of simple types only (which means a one-level sequence): translate to a set of arrays using a name-prefix flattening scheme.
    • A nested Sequence (otherwise with only simple types) should first be flattened to a one-level Sequence, and then that should be translated as above.
    • A Sequence with a Structure or Grid should be flattened by recursively applying the flattening logic to the components.

21 December 2008 (PST) James Gallagher

Initial version should elide [sequences] because there are important cases where they appear as part of a dataset but not the main part. We can represent these as arrays easily in the future.

Attributes

  • Global Attributes?
    • For a single-container DDS (no embedded Structure), just write out the global attributes to the netCDF file.
    • For a multi-container DDS (multiple files, each in an embedded Structure), take the global attributes from each of the containers and add them as global attributes to the target netCDF file. If a value already exists for the attribute, discard the duplicate; if not, add the value to the attribute, since attributes can have multiple values.
  • Variable Attributes
    • This is the way attributes should be stored in the DAS. In the entry class/structure there is a vector of strings, each holding one value of the attribute. If the attribute is a list of 10 int values, the vector holds 10 strings, each representing one of the int values.
    • What about attributes for structures? Should these attributes be created for each of the variables in the structure? So, if there is a structure Point with variables x and y, do the attributes of Point become attributes of Point.x and Point.y? Or does each variable in the structure carry its own attributes? 6 January 2009 (PST) James Gallagher: See above under the information about hierarchical types.
    • For multi-container datasets there will be a Structure for each container, and each of these containers will have global attributes.
    • Attribute containers should be treated just like Structures: the attributes are flattened with dot-separated names. For example, if attribute a is a container holding attributes b and c, we create attributes a.b and a.c for that variable.
    • Attributes with multiple string values are combined into one value, with a newline character appended to the end of each individual value (see the sketch below).
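A minimal sketch of these last two rules (hypothetical Python helpers, for illustration only):

    def flatten_attrs(attrs, prefix=""):
        # Flatten attribute containers using dot-separated names.
        flat = {}
        for name, value in attrs.items():
            full = prefix + name
            if isinstance(value, dict):  # an attribute container
                flat.update(flatten_attrs(value, full + "."))
            else:
                flat[full] = value
        return flat

    def join_strings(values):
        # Combine multiple string values, a newline at the end of each.
        return "".join(v + "\n" for v in values)

    attrs = {"a": {"b": "1", "c": "2"}, "Conventions": ["CF-1.0", "CF-1.6"]}
    flat = flatten_attrs(attrs)
    flat["Conventions"] = join_strings(flat["Conventions"])
    print(flat)  # {'a.b': '1', 'a.c': '2', 'Conventions': 'CF-1.0\nCF-1.6\n'}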

Added Attributes

14 January, 2009 Patrick West

This feature will not be added as part of [Hyrax] 1.5, but in a future release.

After doing some kind of translation (with constraints, aggregation, file out, whatever), we need to add information to the resulting data product that tells how this result was produced: the version of the software, the version of the translation (file out), the version of the aggregation engine, and so on. How do we do that?

The idea might be not to put all of this information in, say, the GLOBAL attributes section of the data product, or in the attributes of the OPeNDAP data product (DDX, DataDDX, whatever), but instead to include a URI pointing to this information. Perhaps this information is stored at OPeNDAP as provenance information for the different software components. Perhaps the provenance information for this data product is stored locally and referenced in the data product, and that local provenance in turn references the software-component provenance.

http://www.opendap.org/provenance?id=xxxxxx

might be something referenced in the local provenance. The local provenance would keep track of…

  • containers used to generate the data product
  • constraints (server side functions, projections, etc…)
  • aggregation handler and command
  • data product requested
  • software component versions

Peter Fox mentions that we need to be careful with this sort of thing (storing provenance information locally), as this was tried with log information; referencing this kind of information is dangerous.

Support for CF

If we can recognize and support files that contain CF-compliant information, we should strive to make sure that the netCDF files this module builds from those files are also CF-compliant. This will have a number of benefits, most of which are likely unknown right now because adoption of CF is not complete. One example: ArcGIS understands CF, so returning a netCDF file that follows CF provides a way to get information from our servers directly into that application without any modification to the app itself.

Here’s a link to information about CF: http://cfconventions.org/.

11.C.9. Returning GeoTiff and JPEG2000
Introduction

The File Out GDAL module provides the ability to return various kinds of GIS data files as responses from Hyrax. The handler currently supports returning GeoTIFF and JPEG2000 files. Not every dataset served by Hyrax can be returned as a GIS dataset, either because it lacks latitude/longitude information or because it is not organized so that the latitude and longitude values can be recognized by this module.

Most GIS data include information about their coordinate reference systems, but how that information is encoded can vary widely. This handler looks for geographical information that follows the CF-1.4 standard for grid mappings and projections (http://cfconventions.org/Data/cf-conventions/cf-conventions-1.6/build/cf-conventions.html#grid-mappings-and-projections; note that the link is actually to the CF-1.6 standard, as the CF-1.4 site from LLNL is no longer available). It will recognize either the EPSG:4047 or WGS84 Geographical Coordinate System (GCS), and it provides an option to set the default GCS using a configuration parameter (described below).

Services Handled

This module does not handle any services itself; it adds a format to an existing service.

Services Provided

The module adds formats to the dap service’s dods command, where the format is specified with the "returnAs" attribute. Specifically, it adds the ability to return the OPeNDAP data object as a GeoTIFF or JPEG2000 file.

How to Use the Module

Once the module is installed, the fong.conf file resides in the BES etc/bes/modules directory and is automatically loaded by the BES at startup. There is one configuration option you may want to change: the FONg.Tempdir parameter in the fong.conf configuration file tells the module where to store the files it generates until each file is streamed back to the caller. The default value is the /tmp directory. You should change this to a location with plenty of disk space/quota that is owned by the user the BES runs as.

FONg.Tempdir=/tmp

The next time the BES is started, it will load this module, and the OLFS will know that it can use the module to transform your data. You can get GeoTIFF or JPEG2000 responses for applicable datasets by appending the extension .tiff or .jp2 to the dataset’s OPeNDAP URL, as in the sketch below.
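For example, a client can fetch a GeoTIFF response over plain HTTP (a sketch; the host and dataset path here are placeholders, so adjust them for your server):

    # Fetch a GeoTIFF rendering of a dataset from a Hyrax server.
    import urllib.request

    url = "http://localhost:8080/opendap/data/nc/coads_climatology.nc.tiff"
    with urllib.request.urlopen(url) as resp, open("coads.tiff", "wb") as out:
        out.write(resp.read())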

If you are not using the OLFS to serve your data (for example, if you are using bescmdln), you would run a command file that looks something like this:

<?xml version="1.0" encoding="UTF-8"?>
    <request reqID="some_unique_value" >
        <setContext name="dap_format">dap2</setContext>
        <setContainer name="c" space="catalog">data/nc/coads_climatology.nc</setContainer>
        <define name="d">
        <container name="c" />
        </define>
        <get type="dods" definition="d" returnAs="tiff"/>
    </request>

In addition to setting the directory where the response file is initially built, you can use the FONg.default_gcs configuration parameter to set the default Geographical Coordinate System (GCS) for the handler. This GCS will be used when the dataset’s metadata provides GCS information that the handler cannot recognize.
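For example (the value shown is illustrative; WGS84 is one of the two GCSs the handler recognizes):

FONg.default_gcs=WGS84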

11.C.10. JSON Responses
Overview

With funding from the Australian Bureau of Meteorology, we have developed prototype JSON data and metadata responses for Hyrax. After reviewing some of the existing JSON encodings for DAP content, we chose to implement two prototype encodings.

The first, and most likely the more useful, is based on the w10n specification as realized by the good folks at JPL. This encoding uses an abstract model to capture the structure of the dataset and its metadata. In this model the properties of the JSON object are drawn from a controlled vocabulary, so clients using these responses can always "know" what to look for in each returned object: no matter what dataset is being accessed, the client has a consistent mechanism for extracting variable names and values.

The second encoding uses an "instance" representation model in which the dataset’s variable names become the properties of the returned object. This means that each dataset potentially has a different set of properties, so client software must be written to navigate each dataset. For data providers with large sets of homogeneous holdings, this representation allows the quick development of targeted clients that can work with these data. However, because the variable names from the dataset become JSON properties, there is no guarantee that those properties will be usable as JavaScript identifiers, since variable names in DAP datasets have few content restrictions. Because of this, the second representation probably doesn’t have the required flexibility to become an official JSON representation for the DAP.
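The difference matters for client code. A tiny sketch (the JSON fragments are hand-made for illustration; full responses appear later in this section):

    import json

    # w10n abstract model: a fixed, controlled vocabulary of properties.
    w10n = json.loads('{"leaves": [{"name": "COADSX", "shape": [180]}], "nodes": []}')
    names = [leaf["name"] for leaf in w10n["leaves"]]  # works for any dataset

    # Instance model: the dataset's variable names are the properties.
    inst = json.loads('{"COADSX": {"units": ["degrees_east"]}}')
    units = inst["COADSX"]["units"]  # the client must know the variable name

    print(names, units)  # ['COADSX'] ['degrees_east']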

The intention is to develop this work (in particular the w10n representation) into a DAP4 extension that defines the JSON representation for the DAP4 data and metadata responses.

Details

Data Type Transform

w10n

The w10n data model views the world as a directed graph of nodes and leaves. This view starts at the catalog level and continues into the structure of the datasets.

  • Only leaves are allowed to have data.
  • Both nodes and leaves have metadata (attributes).
  • Leaf data must be transmittable as either a single value, or an N-dimensional array of values, of a simple type. The w10n simple types include f, a floating-point value.

This means that only DAP arrays of simple types and instances of simple types may be represented as leaves. Everything else must be a node.

Since the DAP data model can also be seen as a directed graph, the mapping is nearly complete.

  • The type spaces of the simple types supported by the two models may not match completely. The (partial) mapping so far:

Simple Types Type Map

    DAP Type    w10n Type
    --------    ---------
    Byte
    Int16
    UInt16
    Int32
    UInt32
    Float32     f
    Float64
    String
    Url

(Needed: a complete type list from w10n. Section 5.2.2 of the w10n spec identifies the type property for the leaf response, but no listing of the allowed values is presented. We expect to get this information from JPL by 08/18/2014, at which point I will complete this section and update the code to reflect the mapping stated here.)

Unmapped Types

  • The DAP allows arrays of complex types like Structures and Grids. No w10n representation for these is offered.

Navigation

W10n defines a navigation component that allows the user to traverse the directed graph of a collection of dataset holdings on the server. This work focuses not on implementing the collection-navigation aspects of the w10n standard but on the JSON data and metadata representations. Thus, DAP request URLs (and, alternatively, HTTP Accept headers sent by the requesting client) are used to solicit JSON-encoded responses from the server. DAP constraint expressions (i.e., query strings), used in the regular DAP manner in conjunction with the DAP URL, have their typical effects on the result: subsetting by index, selection of variables, and subsetting by value (where supported) control which variables, and which parts of variables, are returned in the response.

Installation

The JSON functionality is implemented as components of the OLFS and the BES. The OLFS JSON code is built into the OLFS and is not an add-on. In the BES, the JSON support is contained in the fileout_json module, which is now part of the Hyrax trunk (the shrew project) and part of the Hyrax-1.9 release branch. The next minor release of the Hyrax server will contain the JSON response capability.

From Subversion Trunk

From the Hyrax-1.9 Release Branch

Soliciting the JSON Response

Let datasetUrl=http://54.84.172.19:8080/opendap/data/nc/coads_climatology.nc

DAP4 Requests

Using the DAP4 URLs to request both the DMR and the Data responses in a JSON encoding.

NB: Currently what is returned is really a JSON encoding of the DAP2 data (.dods) and metadata (.ddx) objects. When Hyrax has full DAP4 support, these responses will return JSON versions of the DAP4 DMR and Data objects.

DAP4 w10n JSON Metadata request

datasetUrl.dmr.json

DAP4 w10n JSON Data request

datasetUrl.dap.json

DAP4 Instance Object Metadata request

datasetUrl.dmr.ijsn

DAP4 Instance Object Data request

datasetUrl.dap.ijsn

DAP2 requests

DAP2 w10n JSON Data request
Entire Dataset

datasetUrl.json

Just the variable named "COADSX"

datasetUrl.json?COADSX

DAP2 Instance Object JSON Data request
Entire Dataset

datasetUrl.ijsn

Just the variable named "COADSX"

datasetUrl.ijsn?COADSX
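A client might retrieve and use one of these responses like so (a sketch using the w10n data request shown above; the URL is the example server used throughout this section):

    import json
    import urllib.request

    datasetUrl = "http://54.84.172.19:8080/opendap/data/nc/coads_climatology.nc"
    with urllib.request.urlopen(datasetUrl + ".json?COADSX") as resp:
        doc = json.load(resp)

    coadsx = doc["leaves"][0]  # COADSX is the only leaf in this response
    print(coadsx["name"], coadsx["shape"], coadsx["data"][:5])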

Examples

Dataset - coads_climatology.nc

(I’m putting in the DAP2 dataset descriptions for now; the DAP4 versions will follow.)

DDS

Here is the DDS for the grid dataset, our friend coads_climatology.nc:

Dataset {
        Float64 COADSX[COADSX = 180];
        Float64 COADSY[COADSY = 90];
        Float64 TIME[TIME = 12];
        Grid {
          Array:
            Float32 SST[TIME = 12][COADSY = 90][COADSX = 180];
          Maps:
            Float64 TIME[TIME = 12];
            Float64 COADSY[COADSY = 90];
            Float64 COADSX[COADSX = 180];
        } SST;
        Grid {
          Array:
            Float32 AIRT[TIME = 12][COADSY = 90][COADSX = 180];
          Maps:
            Float64 TIME[TIME = 12];
            Float64 COADSY[COADSY = 90];
            Float64 COADSX[COADSX = 180];
        } AIRT;
        Grid {
          Array:
            Float32 UWND[TIME = 12][COADSY = 90][COADSX = 180];
          Maps:
            Float64 TIME[TIME = 12];
            Float64 COADSY[COADSY = 90];
            Float64 COADSX[COADSX = 180];
        } UWND;
        Grid {
          Array:
            Float32 VWND[TIME = 12][COADSY = 90][COADSX = 180];
          Maps:
            Float64 TIME[TIME = 12];
            Float64 COADSY[COADSY = 90];
            Float64 COADSX[COADSX = 180];
        } VWND;
    } coads_climatology.nc;

DAS

Attributes {
        COADSX {
            String units "degrees_east";
            String modulo " ";
            String point_spacing "even";
        }
        COADSY {
            String units "degrees_north";
            String point_spacing "even";
        }
        TIME {
            String units "hour since 0000-01-01 00:00:00";
            String time_origin "1-JAN-0000 00:00:00";
            String modulo " ";
        }
        SST {
            Float32 missing_value -9.99999979e+33;
            Float32 _FillValue -9.99999979e+33;
            String long_name "SEA SURFACE TEMPERATURE";
            String history "From coads_climatology";
            String units "Deg C";
        }
        AIRT {
            Float32 missing_value -9.99999979e+33;
            Float32 _FillValue -9.99999979e+33;
            String long_name "AIR TEMPERATURE";
            String history "From coads_climatology";
            String units "DEG C";
        }
        UWND {
            Float32 missing_value -9.99999979e+33;
            Float32 _FillValue -9.99999979e+33;
            String long_name "ZONAL WIND";
            String history "From coads_climatology";
            String units "M/S";
        }
        VWND {
            Float32 missing_value -9.99999979e+33;
            Float32 _FillValue -9.99999979e+33;
            String long_name "MERIDIONAL WIND";
            String history "From coads_climatology";
            String units "M/S";
        }
        NC_GLOBAL {
            String history "FERRET V4.30 (debug/no GUI) 15-Aug-96";
        }
        DODS_EXTRA {
            String Unlimited_Dimension "TIME";
        }
    }

DDX

<?xml version="1.0" encoding="ISO-8859-1"?>
    <Dataset name="coads_climatology.nc" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
     xsi:schemaLocation="http://xml.opendap.org/ns/DAP/3.2# http://xml.opendap.org/dap/dap3.2.xsd" 
     xmlns:grddl="http://www.w3.org/2003/g/data-view#" grddl:transformation=
     "http://xml.opendap.org/transforms/ddxToRdfTriples.xsl" xmlns="http://xml.opendap.org/ns/DAP/3.2#" 
     xmlns:dap="http://xml.opendap.org/ns/DAP/3.2#" dapVersion="3.2" xmlns:xml=
     "http://www.w3.org/XML/1998/namespace" xml:base="http://54.84.172.19:8080/opendap/data/nc/coads_climatology.nc">
        <Attribute name="NC_GLOBAL" type="Container">
            <Attribute name="history" type="String">
                <value>FERRET V4.30 (debug/no GUI) 15-Aug-96</value>
            </Attribute>
        </Attribute>
        <Attribute name="DODS_EXTRA" type="Container">
            <Attribute name="Unlimited_Dimension" type="String">
                <value>TIME</value>
            </Attribute>
        </Attribute>
        <Array name="COADSX">
            <Attribute name="units" type="String">
                <value>degrees_east</value>
            </Attribute>
            <Attribute name="modulo" type="String">
                <value> </value>
            </Attribute>
            <Attribute name="point_spacing" type="String">
                <value>even</value>
            </Attribute>
            <Float64/>
            <dimension name="COADSX" size="180"/>
        </Array>
        <Array name="COADSY">
            <Attribute name="units" type="String">
                <value>degrees_north</value>
            </Attribute>
            <Attribute name="point_spacing" type="String">
                <value>even</value>
            </Attribute>
            <Float64/>
            <dimension name="COADSY" size="90"/>
        </Array>
        <Array name="TIME">
            <Attribute name="units" type="String">
                <value>hour since 0000-01-01 00:00:00</value>
            </Attribute>
            <Attribute name="time_origin" type="String">
                <value>1-JAN-0000 00:00:00</value>
            </Attribute>
            <Attribute name="modulo" type="String">
                <value> </value>
            </Attribute>
            <Float64/>
            <dimension name="TIME" size="12"/>
        </Array>
        <Grid name="SST">
            <Array name="SST">
                <Attribute name="missing_value" type="Float32">
                    <value>-9.99999979e+33</value>
                </Attribute>
                <Attribute name="_FillValue" type="Float32">
                    <value>-9.99999979e+33</value>
                </Attribute>
                <Attribute name="long_name" type="String">
                    <value>SEA SURFACE TEMPERATURE</value>
                </Attribute>
                <Attribute name="history" type="String">
                    <value>From coads_climatology</value>
                </Attribute>
                <Attribute name="units" type="String">
                    <value>Deg C</value>
                </Attribute>
                <Float32/>
                <dimension name="TIME" size="12"/>
                <dimension name="COADSY" size="90"/>
                <dimension name="COADSX" size="180"/>
            </Array>
            <Map name="TIME">
                <Attribute name="units" type="String">
                    <value>hour since 0000-01-01 00:00:00</value>
                </Attribute>
                <Attribute name="time_origin" type="String">
                    <value>1-JAN-0000 00:00:00</value>
                </Attribute>
                <Attribute name="modulo" type="String">
                    <value> </value>
                </Attribute>
                <Float64/>
                <dimension name="TIME" size="12"/>
            </Map>
            <Map name="COADSY">
                <Attribute name="units" type="String">
                    <value>degrees_north</value>
                </Attribute>
                <Attribute name="point_spacing" type="String">
                    <value>even</value>
                </Attribute>
                <Float64/>
                <dimension name="COADSY" size="90"/>
            </Map>
            <Map name="COADSX">
                <Attribute name="units" type="String">
                    <value>degrees_east</value>
                </Attribute>
                <Attribute name="modulo" type="String">
                    <value> </value>
                </Attribute>
                <Attribute name="point_spacing" type="String">
                    <value>even</value>
                </Attribute>
                <Float64/>
                <dimension name="COADSX" size="180"/>
            </Map>
        </Grid>
        <Grid name="AIRT">
            <Array name="AIRT">
                <Attribute name="missing_value" type="Float32">
                    <value>-9.99999979e+33</value>
                </Attribute>
                <Attribute name="_FillValue" type="Float32">
                    <value>-9.99999979e+33</value>
                </Attribute>
                <Attribute name="long_name" type="String">
                    <value>AIR TEMPERATURE</value>
                </Attribute>
                <Attribute name="history" type="String">
                    <value>From coads_climatology</value>
                </Attribute>
                <Attribute name="units" type="String">
                    <value>DEG C</value>
                </Attribute>
                <Float32/>
                <dimension name="TIME" size="12"/>
                <dimension name="COADSY" size="90"/>
                <dimension name="COADSX" size="180"/>
            </Array>
            <Map name="TIME">
                <Attribute name="units" type="String">
                    <value>hour since 0000-01-01 00:00:00</value>
                </Attribute>
                <Attribute name="time_origin" type="String">
                    <value>1-JAN-0000 00:00:00</value>
                </Attribute>
                <Attribute name="modulo" type="String">
                    <value> </value>
                </Attribute>
                <Float64/>
                <dimension name="TIME" size="12"/>
            </Map>
            <Map name="COADSY">
                <Attribute name="units" type="String">
                    <value>degrees_north</value>
                </Attribute>
                <Attribute name="point_spacing" type="String">
                    <value>even</value>
                </Attribute>
                <Float64/>
                <dimension name="COADSY" size="90"/>
            </Map>
            <Map name="COADSX">
                <Attribute name="units" type="String">
                    <value>degrees_east</value>
                </Attribute>
                <Attribute name="modulo" type="String">
                    <value> </value>
                </Attribute>
                <Attribute name="point_spacing" type="String">
                    <value>even</value>
                </Attribute>
                <Float64/>
                <dimension name="COADSX" size="180"/>
            </Map>
        </Grid>
        <Grid name="UWND">
            <Array name="UWND">
                <Attribute name="missing_value" type="Float32">
                    <value>-9.99999979e+33</value>
                </Attribute>
                <Attribute name="_FillValue" type="Float32">
                    <value>-9.99999979e+33</value>
                </Attribute>
                <Attribute name="long_name" type="String">
                    <value>ZONAL WIND</value>
                </Attribute>
                <Attribute name="history" type="String">
                    <value>From coads_climatology</value>
                </Attribute>
                <Attribute name="units" type="String">
                    <value>M/S</value>
                </Attribute>
                <Float32/>
                <dimension name="TIME" size="12"/>
                <dimension name="COADSY" size="90"/>
                <dimension name="COADSX" size="180"/>
            </Array>
            <Map name="TIME">
                <Attribute name="units" type="String">
                    <value>hour since 0000-01-01 00:00:00</value>
                </Attribute>
                <Attribute name="time_origin" type="String">
                    <value>1-JAN-0000 00:00:00</value>
                </Attribute>
                <Attribute name="modulo" type="String">
                    <value> </value>
                </Attribute>
                <Float64/>
                <dimension name="TIME" size="12"/>
            </Map>
            <Map name="COADSY">
                <Attribute name="units" type="String">
                    <value>degrees_north</value>
                </Attribute>
                <Attribute name="point_spacing" type="String">
                    <value>even</value>
                </Attribute>
                <Float64/>
                <dimension name="COADSY" size="90"/>
            </Map>
            <Map name="COADSX">
                <Attribute name="units" type="String">
                    <value>degrees_east</value>
                </Attribute>
                <Attribute name="modulo" type="String">
                    <value> </value>
                </Attribute>
                <Attribute name="point_spacing" type="String">
                    <value>even</value>
                </Attribute>
                <Float64/>
                <dimension name="COADSX" size="180"/>
            </Map>
        </Grid>
        <Grid name="VWND">
            <Array name="VWND">
                <Attribute name="missing_value" type="Float32">
                    <value>-9.99999979e+33</value>
                </Attribute>
                <Attribute name="_FillValue" type="Float32">
                    <value>-9.99999979e+33</value>
                </Attribute>
                <Attribute name="long_name" type="String">
                    <value>MERIDIONAL WIND</value>
                </Attribute>
                <Attribute name="history" type="String">
                    <value>From coads_climatology</value>
                </Attribute>
                <Attribute name="units" type="String">
                    <value>M/S</value>
                </Attribute>
                <Float32/>
                <dimension name="TIME" size="12"/>
                <dimension name="COADSY" size="90"/>
                <dimension name="COADSX" size="180"/>
            </Array>
            <Map name="TIME">
                <Attribute name="units" type="String">
                    <value>hour since 0000-01-01 00:00:00</value>
                </Attribute>
                <Attribute name="time_origin" type="String">
                    <value>1-JAN-0000 00:00:00</value>
                </Attribute>
                <Attribute name="modulo" type="String">
                    <value> </value>
                </Attribute>
                <Float64/>
                <dimension name="TIME" size="12"/>
            </Map>
            <Map name="COADSY">
                <Attribute name="units" type="String">
                    <value>degrees_north</value>
                </Attribute>
                <Attribute name="point_spacing" type="String">
                    <value>even</value>
                </Attribute>
                <Float64/>
                <dimension name="COADSY" size="90"/>
            </Map>
            <Map name="COADSX">
                <Attribute name="units" type="String">
                    <value>degrees_east</value>
                </Attribute>
                <Attribute name="modulo" type="String">
                    <value> </value>
                </Attribute>
                <Attribute name="point_spacing" type="String">
                    <value>even</value>
                </Attribute>
                <Float64/>
                <dimension name="COADSX" size="180"/>
            </Map>
        </Grid>
        <blob href="cid:"/>
    </Dataset>

DMR

Coming Soon…

w10n JSON (Abstract Model)

Metadata Responses

Single Variable Selection

DAP4 Request URL

datasetUrl.dmr.json?dap4.ce=COADSX

Response
{
      "name": "coads_climatology.nc",
      "attributes": [
        {
          "name": "NC_GLOBAL",
          "attributes": [
            {"name": "history", "value": ["FERRET V4.30 (debug/no GUI) 15-Aug-96"]}
          ]
        },
        {
          "name": "DODS_EXTRA",
          "attributes": [
            {"name": "Unlimited_Dimension", "value": ["TIME"]}
          ]
        }
      ],
      "leaves": [
        {
          "name": "COADSX",
          "type": "f",
          "attributes": [
            {"name": "units", "value": ["degrees_east"]},
            {"name": "modulo", "value": [" "]},
            {"name": "point_spacing", "value": ["even"]}
          ],
          "shape": [180]
        }
      ],
      "nodes": []
    }

Entire Dataset

DAP4 Request URL

datasetUrl.dmr.json

Response
{
      "name": "coads_climatology.nc",
      "attributes": [
        {
          "name": "NC_GLOBAL",
          "attributes": [
            {"name": "history", "value": ["FERRET V4.30 (debug/no GUI) 15-Aug-96"]}
          ]
        },
        {
          "name": "DODS_EXTRA",
          "attributes": [
            {"name": "Unlimited_Dimension", "value": ["TIME"]}
          ]
        }
      ],
      "leaves": [
        {
          "name": "COADSX",
          "type": "f",
          "attributes": [
            {"name": "units", "value": ["degrees_east"]},
            {"name": "modulo", "value": [" "]},
            {"name": "point_spacing", "value": ["even"]}
          ],
          "shape": [180]
        },
        {
          "name": "COADSY",
          "type": "f",
          "attributes": [
            {"name": "units", "value": ["degrees_north"]},
            {"name": "point_spacing", "value": ["even"]}
          ],
          "shape": [90]
        },
        {
          "name": "TIME",
          "type": "f",
          "attributes": [
            {"name": "units", "value": ["hour since 0000-01-01 00:00:00"]},
            {"name": "time_origin", "value": ["1-JAN-0000 00:00:00"]},
            {"name": "modulo", "value": [" "]}
          ],
          "shape": [12]
        }
      ],
      "nodes": [
        {
          "name": "SST",
          "attributes": [],
          "leaves": [
            {
              "name": "SST",
              "type": "f",
              "attributes": [
                {"name": "missing_value", "value": [-9.99999979e+33]},
                {"name": "_FillValue", "value": [-9.99999979e+33]},
                {"name": "long_name", "value": ["SEA SURFACE TEMPERATURE"]},
                {"name": "history", "value": ["From coads_climatology"]},
                {"name": "units", "value": ["Deg C"]}
              ],
              "shape": [12,90,180]
            },
            {
              "name": "TIME",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["hour since 0000-01-01 00:00:00"]},
                {"name": "time_origin", "value": ["1-JAN-0000 00:00:00"]},
                {"name": "modulo", "value": [" "]}
              ],
              "shape": [12]
            },
            {
              "name": "COADSY",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_north"]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [90]
            },
            {
              "name": "COADSX",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_east"]},
                {"name": "modulo", "value": [" "]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [180]
            }
          ],
          "nodes": []
        },
        {
          "name": "AIRT",
          "attributes": [],
          "leaves": [
            {
              "name": "AIRT",
              "type": "f",
              "attributes": [
                {"name": "missing_value", "value": [-9.99999979e+33]},
                {"name": "_FillValue", "value": [-9.99999979e+33]},
                {"name": "long_name", "value": ["AIR TEMPERATURE"]},
                {"name": "history", "value": ["From coads_climatology"]},
                {"name": "units", "value": ["DEG C"]}
              ],
              "shape": [12,90,180]
            },
            {
              "name": "TIME",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["hour since 0000-01-01 00:00:00"]},
                {"name": "time_origin", "value": ["1-JAN-0000 00:00:00"]},
                {"name": "modulo", "value": [" "]}
              ],
              "shape": [12]
            },
            {
              "name": "COADSY",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_north"]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [90]
            },
            {
              "name": "COADSX",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_east"]},
                {"name": "modulo", "value": [" "]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [180]
            }
          ],
          "nodes": []
        },
        {
          "name": "UWND",
          "attributes": [],
          "leaves": [
            {
              "name": "UWND",
              "type": "f",
              "attributes": [
                {"name": "missing_value", "value": [-9.99999979e+33]},
                {"name": "_FillValue", "value": [-9.99999979e+33]},
                {"name": "long_name", "value": ["ZONAL WIND"]},
                {"name": "history", "value": ["From coads_climatology"]},
                {"name": "units", "value": ["M/S"]}
              ],
              "shape": [12,90,180]
            },
            {
              "name": "TIME",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["hour since 0000-01-01 00:00:00"]},
                {"name": "time_origin", "value": ["1-JAN-0000 00:00:00"]},
                {"name": "modulo", "value": [" "]}
              ],
              "shape": [12]
            },
            {
              "name": "COADSY",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_north"]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [90]
            },
            {
              "name": "COADSX",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_east"]},
                {"name": "modulo", "value": [" "]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [180]
            }
          ],
          "nodes": []
        },
        {
          "name": "VWND",
          "attributes": [],
          "leaves": [
            {
              "name": "VWND",
              "type": "f",
              "attributes": [
                {"name": "missing_value", "value": [-9.99999979e+33]},
                {"name": "_FillValue", "value": [-9.99999979e+33]},
                {"name": "long_name", "value": ["MERIDIONAL WIND"]},
                {"name": "history", "value": ["From coads_climatology"]},
                {"name": "units", "value": ["M/S"]}
              ],
              "shape": [12,90,180]
            },
            {
              "name": "TIME",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["hour since 0000-01-01 00:00:00"]},
                {"name": "time_origin", "value": ["1-JAN-0000 00:00:00"]},
                {"name": "modulo", "value": [" "]}
              ],
              "shape": [12]
            },
            {
              "name": "COADSY",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_north"]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [90]
            },
            {
              "name": "COADSX",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_east"]},
                {"name": "modulo", "value": [" "]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [180]
            }
          ],
          "nodes": []
        }
      ]
    }

Data Responses

Single Variable Selection

DAP4 Request URL

datasetUrl.dap.json?dap4.ce=COADSX

DAP2 Request URL

datasetUrl.json?COADSX

Response
{
      "name": "coads_climatology.nc",
      "attributes": [
        {
          "name": "NC_GLOBAL",
          "attributes": [
            {"name": "history", "value": ["FERRET V4.30 (debug/no GUI) 15-Aug-96"]}
          ]
        },
        {
          "name": "DODS_EXTRA",
          "attributes": [
            {"name": "Unlimited_Dimension", "value": ["TIME"]}
          ]
        }
      ],
      "leaves": [
        {
          "name": "COADSX",
          "type": "f",
          "attributes": [
            {"name": "units", "value": ["degrees_east"]},
            {"name": "modulo", "value": [" "]},
            {"name": "point_spacing", "value": ["even"]}
          ],
          "shape": [180],
          "data": [21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 55, 
          57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89, 91, 93, 95, 
          97, 99, 101, 103, 105, 107, 109, 111, 113, 115, 117, 119, 121, 123, 125, 127, 129, 
          131, 133, 135, 137, 139, 141, 143, 145, 147, 149, 151, 153, 155, 157, 159, 161, 
          163, 165, 167, 169, 171, 173, 175, 177, 179, 181, 183, 185, 187, 189, 191, 193, 
          195, 197, 199, 201, 203, 205, 207, 209, 211, 213, 215, 217, 219, 221, 223, 225, 
          227, 229, 231, 233, 235, 237, 239, 241, 243, 245, 247, 249, 251, 253, 255, 257, 
          259, 261, 263, 265, 267, 269, 271, 273, 275, 277, 279, 281, 283, 285, 287, 289, 
          291, 293, 295, 297, 299, 301, 303, 305, 307, 309, 311, 313, 315, 317, 319, 321, 
          323, 325, 327, 329, 331, 333, 335, 337, 339, 341, 343, 345, 347, 349, 351, 353, 
          355, 357, 359, 361, 363, 365, 367, 369, 371, 373, 375, 377, 379]
        }
      ],
      "nodes": []
    }

Entire Dataset

DAP4 Request URL

datasetUrl.dap.json

DAP2 Request URL

datasetUrl.json

Response
{
      "name": "coads_climatology.nc",
      "attributes": [
        {
          "name": "NC_GLOBAL",
          "attributes": [
            {"name": "history", "value": ["FERRET V4.30 (debug/no GUI) 15-Aug-96"]}
          ]
        },
        {
          "name": "DODS_EXTRA",
          "attributes": [
            {"name": "Unlimited_Dimension", "value": ["TIME"]}
          ]
        }
      ],
      "leaves": [
        {
          "name": "COADSX",
          "type": "f",
          "attributes": [
            {"name": "units", "value": ["degrees_east"]},
            {"name": "modulo", "value": [" "]},
            {"name": "point_spacing", "value": ["even"]}
          ],
          "shape": [180],
          "data": [21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 55, 
          57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89, 91, 93, 95, 
          97, 99, 101, 103, 105, 107, 109, 111, 113, 115, 117, 119, 121, 123, 125, 127, 
          129, 131, 133, 135, 137, 139, 141, 143, 145, 147, 149, 151, 153, 155, 157, 159, 
          161, 163, 165, 167, 169, 171, 173, 175, 177, 179, 181, 183, 185, 187, 189, 191, 
          193, 195, 197, 199, 201, 203, 205, 207, 209, 211, 213, 215, 217, 219, 221, 223, 
          225, 227, 229, 231, 233, 235, 237, 239, 241, 243, 245, 247, 249, 251, 253, 255, 
          257, 259, 261, 263, 265, 267, 269, 271, 273, 275, 277, 279, 281, 283, 285, 287, 
          289, 291, 293, 295, 297, 299, 301, 303, 305, 307, 309, 311, 313, 315, 317, 319, 
          321, 323, 325, 327, 329, 331, 333, 335, 337, 339, 341, 343, 345, 347, 349, 351, 
          353, 355, 357, 359, 361, 363, 365, 367, 369, 371, 373, 375, 377, 379]
        },
        {
          "name": "COADSY",
          "type": "f",
          "attributes": [
            {"name": "units", "value": ["degrees_north"]},
            {"name": "point_spacing", "value": ["even"]}
          ],
          "shape": [90],
          "data": [-89, -87, -85, -83, -81, -79, -77, -75, -73, -71, -69, -67, -65, -63, 
          -61, -59, -57, -55, -53, -51, -49, -47, -45, -43, -41, -39, -37, -35, -33, -31, 
          -29, -27, -25, -23, -21, -19, -17, -15, -13, -11, -9, -7, -5, -3, -1, 1, 3, 5, 
          7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 
          47, 49, 51, 53, 55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 
          87, 89]
        },
        {
          "name": "TIME",
          "type": "f",
          "attributes": [
            {"name": "units", "value": ["hour since 0000-01-01 00:00:00"]},
            {"name": "time_origin", "value": ["1-JAN-0000 00:00:00"]},
            {"name": "modulo", "value": [" "]}
          ],
          "shape": [12],
          "data": [366, 1096.49, 1826.97, 2557.45, 3287.94, 4018.43, 4748.91, 5479.4, 
          6209.88, 6940.36, 7670.85, 8401.33]
        }
      ],
      "nodes": [
        {
          "name": "SST",
          "attributes": [],
          "leaves": [
            {
              "name": "SST",
              "type": "f",
              "attributes": [
                {"name": "missing_value", "value": [-9.99999979e+33]},
                {"name": "_FillValue", "value": [-9.99999979e+33]},
                {"name": "long_name", "value": ["SEA SURFACE TEMPERATURE"]},
                {"name": "history", "value": ["From coads_climatology"]},
                {"name": "units", "value": ["Deg C"]}
              ],
              "shape": [12,90,180],
              "data": [[[-1e+34, -1e+34, -1e+34, … (many values skipped for brevity),  
              -1e+34, -1e+34, -1e+34]]]
            },
            {
              "name": "TIME",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["hour since 0000-01-01 00:00:00"]},
                {"name": "time_origin", "value": ["1-JAN-0000 00:00:00"]},
                {"name": "modulo", "value": [" "]}
              ],
              "shape": [12],
              "data": [366, 1096.49, 1826.97, 2557.45, 3287.94, 4018.43, 4748.91, 5479.4, 
              6209.88, 6940.36, 7670.85, 8401.33]
            },
            {
              "name": "COADSY",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_north"]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [90],
              "data": [-89, -87, -85, -83, -81, -79, -77, -75, -73, -71, -69, -67, -65, 
              -63, -61, -59, -57, -55, -53, -51, -49, -47, -45, -43, -41, -39, -37, -35, 
              -33, -31, -29, -27, -25, -23, -21, -19, -17, -15, -13, -11, -9, -7, -5, -3, 
              -1, 1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 
              39, 41, 43, 45, 47, 49, 51, 53, 55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 
              77, 79, 81, 83, 85, 87, 89]
            },
            {
              "name": "COADSX",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_east"]},
                {"name": "modulo", "value": [" "]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [180],
              "data": [21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 
              55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89, 91, 
              93, 95, 97, 99, 101, 103, 105, 107, 109, 111, 113, 115, 117, 119, 121, 123, 
              125, 127, 129, 131, 133, 135, 137, 139, 141, 143, 145, 147, 149, 151, 153, 
              155, 157, 159, 161, 163, 165, 167, 169, 171, 173, 175, 177, 179, 181, 183, 
              185, 187, 189, 191, 193, 195, 197, 199, 201, 203, 205, 207, 209, 211, 213, 
              215, 217, 219, 221, 223, 225, 227, 229, 231, 233, 235, 237, 239, 241, 243, 
              245, 247, 249, 251, 253, 255, 257, 259, 261, 263, 265, 267, 269, 271, 273, 
              275, 277, 279, 281, 283, 285, 287, 289, 291, 293, 295, 297, 299, 301, 303, 
              305, 307, 309, 311, 313, 315, 317, 319, 321, 323, 325, 327, 329, 331, 333, 
              335, 337, 339, 341, 343, 345, 347, 349, 351, 353, 355, 357, 359, 361, 363, 
              365, 367, 369, 371, 373, 375, 377, 379]
            }
          ],
          "nodes": []
        },
        {
          "name": "AIRT",
          "attributes": [],
          "leaves": [
            {
              "name": "AIRT",
              "type": "f",
              "attributes": [
                {"name": "missing_value", "value": [-9.99999979e+33]},
                {"name": "_FillValue", "value": [-9.99999979e+33]},
                {"name": "long_name", "value": ["AIR TEMPERATURE"]},
                {"name": "history", "value": ["From coads_climatology"]},
                {"name": "units", "value": ["DEG C"]}
              ],
              "shape": [12,90,180],
              "data": [[[-1e+34, -1e+34, -1e+34, … (many values skipped for brevity),  
              -1e+34, -1e+34, -1e+34]]]
            },
            {
              "name": "TIME",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["hour since 0000-01-01 00:00:00"]},
                {"name": "time_origin", "value": ["1-JAN-0000 00:00:00"]},
                {"name": "modulo", "value": [" "]}
              ],
              "shape": [12],
              "data": [366, 1096.49, 1826.97, 2557.45, 3287.94, 4018.43, 4748.91, 5479.4, 
              6209.88, 6940.36, 7670.85, 8401.33]
            },
            {
              "name": "COADSY",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_north"]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [90],
              "data": [-89, -87, -85, -83, -81, -79, -77, -75, -73, -71, -69, -67, -65, 
              -63, -61, -59, -57, -55, -53, -51, -49, -47, -45, -43, -41, -39, -37, -35, 
              -33, -31, -29, -27, -25, -23, -21, -19, -17, -15, -13, -11, -9, -7, -5, -3, 
              -1, 1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 
              39, 41, 43, 45, 47, 49, 51, 53, 55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 
              77, 79, 81, 83, 85, 87, 89]
            },
            {
              "name": "COADSX",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_east"]},
                {"name": "modulo", "value": [" "]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [180],
              "data": [21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 
              55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89, 91, 
              93, 95, 97, 99, 101, 103, 105, 107, 109, 111, 113, 115, 117, 119, 121, 123, 
              125, 127, 129, 131, 133, 135, 137, 139, 141, 143, 145, 147, 149, 151, 153, 
              155, 157, 159, 161, 163, 165, 167, 169, 171, 173, 175, 177, 179, 181, 183, 
              185, 187, 189, 191, 193, 195, 197, 199, 201, 203, 205, 207, 209, 211, 213, 
              215, 217, 219, 221, 223, 225, 227, 229, 231, 233, 235, 237, 239, 241, 243, 
              245, 247, 249, 251, 253, 255, 257, 259, 261, 263, 265, 267, 269, 271, 273, 
              275, 277, 279, 281, 283, 285, 287, 289, 291, 293, 295, 297, 299, 301, 303, 
              305, 307, 309, 311, 313, 315, 317, 319, 321, 323, 325, 327, 329, 331, 333, 
              335, 337, 339, 341, 343, 345, 347, 349, 351, 353, 355, 357, 359, 361, 363, 
              365, 367, 369, 371, 373, 375, 377, 379]
            }
          ],
          "nodes": []
        },
        {
          "name": "UWND",
          "attributes": [],
          "leaves": [
            {
              "name": "UWND",
              "type": "f",
              "attributes": [
                {"name": "missing_value", "value": [-9.99999979e+33]},
                {"name": "_FillValue", "value": [-9.99999979e+33]},
                {"name": "long_name", "value": ["ZONAL WIND"]},
                {"name": "history", "value": ["From coads_climatology"]},
                {"name": "units", "value": ["M/S"]}
              ],
              "shape": [12,90,180],
              "data": [[[-1e+34, -1e+34, -1e+34, … (many values skipped for brevity),  
              -1e+34, -1e+34, -1e+34]]]
            },
            {
              "name": "TIME",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["hour since 0000-01-01 00:00:00"]},
                {"name": "time_origin", "value": ["1-JAN-0000 00:00:00"]},
                {"name": "modulo", "value": [" "]}
              ],
              "shape": [12],
              "data": [366, 1096.49, 1826.97, 2557.45, 3287.94, 4018.43, 4748.91, 5479.4, 
              6209.88, 6940.36, 7670.85, 8401.33]
            },
            {
              "name": "COADSY",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_north"]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [90],
              "data": [-89, -87, -85, -83, -81, -79, -77, -75, -73, -71, -69, -67, -65, 
              -63, -61, -59, -57, -55, -53, -51, -49, -47, -45, -43, -41, -39, -37, -35, 
              -33, -31, -29, -27, -25, -23, -21, -19, -17, -15, -13, -11, -9, -7, -5, -3, 
              -1, 1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 
              39, 41, 43, 45, 47, 49, 51, 53, 55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 
              77, 79, 81, 83, 85, 87, 89]
            },
            {
              "name": "COADSX",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_east"]},
                {"name": "modulo", "value": [" "]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [180],
              "data": [21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 
              55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89, 91, 
              93, 95, 97, 99, 101, 103, 105, 107, 109, 111, 113, 115, 117, 119, 121, 123, 
              125, 127, 129, 131, 133, 135, 137, 139, 141, 143, 145, 147, 149, 151, 153, 
              155, 157, 159, 161, 163, 165, 167, 169, 171, 173, 175, 177, 179, 181, 183, 
              185, 187, 189, 191, 193, 195, 197, 199, 201, 203, 205, 207, 209, 211, 213, 
              215, 217, 219, 221, 223, 225, 227, 229, 231, 233, 235, 237, 239, 241, 243, 
              245, 247, 249, 251, 253, 255, 257, 259, 261, 263, 265, 267, 269, 271, 273, 
              275, 277, 279, 281, 283, 285, 287, 289, 291, 293, 295, 297, 299, 301, 303, 
              305, 307, 309, 311, 313, 315, 317, 319, 321, 323, 325, 327, 329, 331, 333, 
              335, 337, 339, 341, 343, 345, 347, 349, 351, 353, 355, 357, 359, 361, 363, 
              365, 367, 369, 371, 373, 375, 377, 379]
            }
          ],
          "nodes": []
        },
        {
          "name": "VWND",
          "attributes": [],
          "leaves": [
            {
              "name": "VWND",
              "type": "f",
              "attributes": [
                {"name": "missing_value", "value": [-9.99999979e+33]},
                {"name": "_FillValue", "value": [-9.99999979e+33]},
                {"name": "long_name", "value": ["MERIDIONAL WIND"]},
                {"name": "history", "value": ["From coads_climatology"]},
                {"name": "units", "value": ["M/S"]}
              ],
              "shape": [12,90,180],
              "data": [[[-1e+34, -1e+34, -1e+34, … (many values skipped for brevity),  
              -1e+34, -1e+34, -1e+34]]]
            },
            {
              "name": "TIME",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["hour since 0000-01-01 00:00:00"]},
                {"name": "time_origin", "value": ["1-JAN-0000 00:00:00"]},
                {"name": "modulo", "value": [" "]}
              ],
              "shape": [12],
              "data": [366, 1096.49, 1826.97, 2557.45, 3287.94, 4018.43, 4748.91, 5479.4, 
              6209.88, 6940.36, 7670.85, 8401.33]
            },
            {
              "name": "COADSY",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_north"]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [90],
              "data": [-89, -87, -85, -83, -81, -79, -77, -75, -73, -71, -69, -67, -65, 
              -63, -61, -59, -57, -55, -53, -51, -49, -47, -45, -43, -41, -39, -37, -35, 
              -33, -31, -29, -27, -25, -23, -21, -19, -17, -15, -13, -11, -9, -7, -5, -3, 
              -1, 1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 
              39, 41, 43, 45, 47, 49, 51, 53, 55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 
              77, 79, 81, 83, 85, 87, 89]
            },
            {
              "name": "COADSX",
              "type": "f",
              "attributes": [
                {"name": "units", "value": ["degrees_east"]},
                {"name": "modulo", "value": [" "]},
                {"name": "point_spacing", "value": ["even"]}
              ],
              "shape": [180],
              "data": [21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 
              55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89, 91, 
              93, 95, 97, 99, 101, 103, 105, 107, 109, 111, 113, 115, 117, 119, 121, 123, 
              125, 127, 129, 131, 133, 135, 137, 139, 141, 143, 145, 147, 149, 151, 153, 
              155, 157, 159, 161, 163, 165, 167, 169, 171, 173, 175, 177, 179, 181, 183, 
              185, 187, 189, 191, 193, 195, 197, 199, 201, 203, 205, 207, 209, 211, 213, 
              215, 217, 219, 221, 223, 225, 227, 229, 231, 233, 235, 237, 239, 241, 243, 
              245, 247, 249, 251, 253, 255, 257, 259, 261, 263, 265, 267, 269, 271, 273, 
              275, 277, 279, 281, 283, 285, 287, 289, 291, 293, 295, 297, 299, 301, 303, 
              305, 307, 309, 311, 313, 315, 317, 319, 321, 323, 325, 327, 329, 331, 333, 
              335, 337, 339, 341, 343, 345, 347, 349, 351, 353, 355, 357, 359, 361, 363, 
              365, 367, 369, 371, 373, 375, 377, 379]
            }
          ],
          "nodes": []
        }
      ]
    }

Instance Model JSON

Metadata Responses

Single Variable Selection

DAP4 Request URL

datasetUrl.dmr.ijsn?dap4.ce=COADSX

Response
{
     "name": "coads_climatology.nc",
     "NC_GLOBAL": {
       "history": ["FERRET V4.30 (debug/no GUI) 15-Aug-96"]
     },
     "DODS_EXTRA": {
       "Unlimited_Dimension": ["TIME"]
     },
     "COADSX":  {
       "units": ["degrees_east"],
       "modulo": [" "],
       "point_spacing": ["even"]
     }
    }

Entire Dataset

DAP4 Request URL

datasetUrl.dmr.ijsn

Response
{
     "name": "coads_climatology.nc",
     "NC_GLOBAL": {
       "history": ["FERRET V4.30 (debug/no GUI) 15-Aug-96"]
     },
     "DODS_EXTRA": {
       "Unlimited_Dimension": ["TIME"]
     },
     "COADSX":  {
       "units": ["degrees_east"],
       "modulo": [" "],
       "point_spacing": ["even"]
     },
     "COADSY":  {
       "units": ["degrees_north"],
       "point_spacing": ["even"]
     },
     "TIME":  {
       "units": ["hour since 0000-01-01 00:00:00"],
       "time_origin": ["1-JAN-0000 00:00:00"],
       "modulo": [" "]
     },
     "SST": {
      "SST":  {
        "missing_value": [-9.99999979e+33],
        "_FillValue": [-9.99999979e+33],
        "long_name": ["SEA SURFACE TEMPERATURE"],
        "history": ["From coads_climatology"],
        "units": ["Deg C"]
      },
      "TIME":  {
        "units": ["hour since 0000-01-01 00:00:00"],
        "time_origin": ["1-JAN-0000 00:00:00"],
        "modulo": [" "]
      },
      "COADSY":  {
        "units": ["degrees_north"],
        "point_spacing": ["even"]
      },
      "COADSX":  {
        "units": ["degrees_east"],
        "modulo": [" "],
        "point_spacing": ["even"]
      }
     },
     "AIRT": {
      "AIRT":  {
        "missing_value": [-9.99999979e+33],
        "_FillValue": [-9.99999979e+33],
        "long_name": ["AIR TEMPERATURE"],
        "history": ["From coads_climatology"],
        "units": ["DEG C"]
      },
      "TIME":  {
        "units": ["hour since 0000-01-01 00:00:00"],
        "time_origin": ["1-JAN-0000 00:00:00"],
        "modulo": [" "]
      },
      "COADSY":  {
        "units": ["degrees_north"],
        "point_spacing": ["even"]
      },
      "COADSX":  {
        "units": ["degrees_east"],
        "modulo": [" "],
        "point_spacing": ["even"]
      }
     },
     "UWND": {
      "UWND":  {
        "missing_value": [-9.99999979e+33],
        "_FillValue": [-9.99999979e+33],
        "long_name": ["ZONAL WIND"],
        "history": ["From coads_climatology"],
        "units": ["M/S"]
      },
      "TIME":  {
        "units": ["hour since 0000-01-01 00:00:00"],
        "time_origin": ["1-JAN-0000 00:00:00"],
        "modulo": [" "]
      },
      "COADSY":  {
        "units": ["degrees_north"],
        "point_spacing": ["even"]
      },
      "COADSX":  {
        "units": ["degrees_east"],
        "modulo": [" "],
        "point_spacing": ["even"]
      }
     },
     "VWND": {
      "VWND":  {
        "missing_value": [-9.99999979e+33],
        "_FillValue": [-9.99999979e+33],
        "long_name": ["MERIDIONAL WIND"],
        "history": ["From coads_climatology"],
        "units": ["M/S"]
      },
      "TIME":  {
        "units": ["hour since 0000-01-01 00:00:00"],
        "time_origin": ["1-JAN-0000 00:00:00"],
        "modulo": [" "]
      },
      "COADSY":  {
        "units": ["degrees_north"],
        "point_spacing": ["even"]
      },
      "COADSX":  {
        "units": ["degrees_east"],
        "modulo": [" "],
        "point_spacing": ["even"]
      }
     }
    }

Data Responses

Single Variable Selection

DAP4 Request URL

datasetURL.dap.ijsn?dap4.ce=COADSX

DAP2 Request URL

datasetURL.ijsn?COADSX

Response
{
     "name": "coads_climatology.nc",
     "COADSX":  [21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 55, 
     57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89, 91, 93, 95, 97, 
     99, 101, 103, 105, 107, 109, 111, 113, 115, 117, 119, 121, 123, 125, 127, 129, 131, 
     133, 135, 137, 139, 141, 143, 145, 147, 149, 151, 153, 155, 157, 159, 161, 163, 165, 
     167, 169, 171, 173, 175, 177, 179, 181, 183, 185, 187, 189, 191, 193, 195, 197, 199, 
     201, 203, 205, 207, 209, 211, 213, 215, 217, 219, 221, 223, 225, 227, 229, 231, 233, 
     235, 237, 239, 241, 243, 245, 247, 249, 251, 253, 255, 257, 259, 261, 263, 265, 267, 
     269, 271, 273, 275, 277, 279, 281, 283, 285, 287, 289, 291, 293, 295, 297, 299, 301, 
     303, 305, 307, 309, 311, 313, 315, 317, 319, 321, 323, 325, 327, 329, 331, 333, 335, 
     337, 339, 341, 343, 345, 347, 349, 351, 353, 355, 357, 359, 361, 363, 365, 367, 369, 
     371, 373, 375, 377, 379]
    }
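
The data responses can be handled the same way. Here is a minimal Python sketch (again with a hypothetical dataset URL) that retrieves the COADSX values as a plain list:

import json
import urllib.request

# Hypothetical dataset URL; point this at your own Hyrax server.
dataset_url = "http://localhost:8080/opendap/coads_climatology.nc"

# DAP2-style request for the COADSX data as instance model JSON.
with urllib.request.urlopen(dataset_url + ".ijsn?COADSX") as response:
    data = json.load(response)

coadsx = data["COADSX"]                    # a flat list of 180 longitudes
print(len(coadsx), coadsx[0], coadsx[-1])  # 180 21 379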

Entire Dataset

DAP4 Request URL

datasetURL.dap.ijsn

DAP2 Request URL

datasetURL.ijsn

Response
{
     "name": "coads_climatology.nc",
     "COADSX":  [21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 55, 
     57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89, 91, 93, 95, 97, 
     99, 101, 103, 105, 107, 109, 111, 113, 115, 117, 119, 121, 123, 125, 127, 129, 131, 
     133, 135, 137, 139, 141, 143, 145, 147, 149, 151, 153, 155, 157, 159, 161, 163, 165, 
     167, 169, 171, 173, 175, 177, 179, 181, 183, 185, 187, 189, 191, 193, 195, 197, 199, 
     201, 203, 205, 207, 209, 211, 213, 215, 217, 219, 221, 223, 225, 227, 229, 231, 233, 
     235, 237, 239, 241, 243, 245, 247, 249, 251, 253, 255, 257, 259, 261, 263, 265, 267, 
     269, 271, 273, 275, 277, 279, 281, 283, 285, 287, 289, 291, 293, 295, 297, 299, 301, 
     303, 305, 307, 309, 311, 313, 315, 317, 319, 321, 323, 325, 327, 329, 331, 333, 335, 
     337, 339, 341, 343, 345, 347, 349, 351, 353, 355, 357, 359, 361, 363, 365, 367, 369, 
     371, 373, 375, 377, 379],
     "COADSY":  [-89, -87, -85, -83, -81, -79, -77, -75, -73, -71, -69, -67, -65, -63, 
     -61, -59, -57, -55, -53, -51, -49, -47, -45, -43, -41, -39, -37, -35, -33, -31, -29, 
     -27, -25, -23, -21, -19, -17, -15, -13, -11, -9, -7, -5, -3, -1, 1, 3, 5, 7, 9, 11, 
     13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 
     55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89],
     "TIME":  [366, 1096.49, 1826.97, 2557.45, 3287.94, 4018.43, 4748.91, 5479.4, 6209.88, 
     6940.36, 7670.85, 8401.33],
     "SST": {
      "SST":  [[[-1e+34, -1e+34, -1e+34, … (Many values omitted for brevity), -1e+34, 
      -1e+34, -1e+34]]],
      "TIME":  [366, 1096.49, 1826.97, 2557.45, 3287.94, 4018.43, 4748.91, 5479.4, 6209.88, 
      6940.36, 7670.85, 8401.33],
      "COADSY":  [-89, -87, -85, -83, -81, -79, -77, -75, -73, -71, -69, -67, -65, -63, 
      -61, -59, -57, -55, -53, -51, -49, -47, -45, -43, -41, -39, -37, -35, -33, -31, -29, 
      -27, -25, -23, -21, -19, -17, -15, -13, -11, -9, -7, -5, -3, -1, 1, 3, 5, 7, 9, 11, 
      13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 
      55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89],
      "COADSX":  [21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 55, 
      57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89, 91, 93, 95, 97, 
      99, 101, 103, 105, 107, 109, 111, 113, 115, 117, 119, 121, 123, 125, 127, 129, 131, 
      133, 135, 137, 139, 141, 143, 145, 147, 149, 151, 153, 155, 157, 159, 161, 163, 165, 
      167, 169, 171, 173, 175, 177, 179, 181, 183, 185, 187, 189, 191, 193, 195, 197, 199, 
      201, 203, 205, 207, 209, 211, 213, 215, 217, 219, 221, 223, 225, 227, 229, 231, 233, 
      235, 237, 239, 241, 243, 245, 247, 249, 251, 253, 255, 257, 259, 261, 263, 265, 267, 
      269, 271, 273, 275, 277, 279, 281, 283, 285, 287, 289, 291, 293, 295, 297, 299, 301, 
      303, 305, 307, 309, 311, 313, 315, 317, 319, 321, 323, 325, 327, 329, 331, 333, 335, 
      337, 339, 341, 343, 345, 347, 349, 351, 353, 355, 357, 359, 361, 363, 365, 367, 369, 
      371, 373, 375, 377, 379]
     },
     "AIRT": {
      "AIRT":  [[[-1e+34, -1e+34, -1e+34, … (Many values omitted for brevity), -1e+34, 
      -1e+34, -1e+34]]],
      "TIME":  [366, 1096.49, 1826.97, 2557.45, 3287.94, 4018.43, 4748.91, 5479.4, 6209.88, 
      6940.36, 7670.85, 8401.33],
      "COADSY":  [-89, -87, -85, -83, -81, -79, -77, -75, -73, -71, -69, -67, -65, -63, 
      -61, -59, -57, -55, -53, -51, -49, -47, -45, -43, -41, -39, -37, -35, -33, -31, -29, 
      -27, -25, -23, -21, -19, -17, -15, -13, -11, -9, -7, -5, -3, -1, 1, 3, 5, 7, 9, 11, 
      13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 
      55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89],
      "COADSX":  [21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 55, 
      57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89, 91, 93, 95, 97, 
      99, 101, 103, 105, 107, 109, 111, 113, 115, 117, 119, 121, 123, 125, 127, 129, 131, 
      133, 135, 137, 139, 141, 143, 145, 147, 149, 151, 153, 155, 157, 159, 161, 163, 165, 
      167, 169, 171, 173, 175, 177, 179, 181, 183, 185, 187, 189, 191, 193, 195, 197, 199, 
      201, 203, 205, 207, 209, 211, 213, 215, 217, 219, 221, 223, 225, 227, 229, 231, 233, 
      235, 237, 239, 241, 243, 245, 247, 249, 251, 253, 255, 257, 259, 261, 263, 265, 267, 
      269, 271, 273, 275, 277, 279, 281, 283, 285, 287, 289, 291, 293, 295, 297, 299, 301, 
      303, 305, 307, 309, 311, 313, 315, 317, 319, 321, 323, 325, 327, 329, 331, 333, 335, 
      337, 339, 341, 343, 345, 347, 349, 351, 353, 355, 357, 359, 361, 363, 365, 367, 369, 
      371, 373, 375, 377, 379]
     },
     "UWND": {
      "UWND":  [[[-1e+34, -1e+34, -1e+34, … (Many values omitted for brevity), -1e+34, 
      -1e+34, -1e+34]]],
      "TIME":  [366, 1096.49, 1826.97, 2557.45, 3287.94, 4018.43, 4748.91, 5479.4, 6209.88, 
      6940.36, 7670.85, 8401.33],
      "COADSY":  [-89, -87, -85, -83, -81, -79, -77, -75, -73, -71, -69, -67, -65, -63, 
      -61, -59, -57, -55, -53, -51, -49, -47, -45, -43, -41, -39, -37, -35, -33, -31, -29, 
      -27, -25, -23, -21, -19, -17, -15, -13, -11, -9, -7, -5, -3, -1, 1, 3, 5, 7, 9, 11, 
      13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 
      55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89],
      "COADSX":  [21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 55, 
      57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89, 91, 93, 95, 97, 
      99, 101, 103, 105, 107, 109, 111, 113, 115, 117, 119, 121, 123, 125, 127, 129, 131, 
      133, 135, 137, 139, 141, 143, 145, 147, 149, 151, 153, 155, 157, 159, 161, 163, 165, 
      167, 169, 171, 173, 175, 177, 179, 181, 183, 185, 187, 189, 191, 193, 195, 197, 199, 
      201, 203, 205, 207, 209, 211, 213, 215, 217, 219, 221, 223, 225, 227, 229, 231, 233, 
      235, 237, 239, 241, 243, 245, 247, 249, 251, 253, 255, 257, 259, 261, 263, 265, 267, 
      269, 271, 273, 275, 277, 279, 281, 283, 285, 287, 289, 291, 293, 295, 297, 299, 301, 
      303, 305, 307, 309, 311, 313, 315, 317, 319, 321, 323, 325, 327, 329, 331, 333, 335, 
      337, 339, 341, 343, 345, 347, 349, 351, 353, 355, 357, 359, 361, 363, 365, 367, 369, 
      371, 373, 375, 377, 379]
     },
     "VWND": {
      "VWND":  [[[-1e+34, -1e+34, -1e+34, … (Many values omitted for brevity), -1e+34, 
      -1e+34, -1e+34]]],
      "TIME":  [366, 1096.49, 1826.97, 2557.45, 3287.94, 4018.43, 4748.91, 5479.4, 6209.88, 
      6940.36, 7670.85, 8401.33],
      "COADSY":  [-89, -87, -85, -83, -81, -79, -77, -75, -73, -71, -69, -67, -65, -63, 
      -61, -59, -57, -55, -53, -51, -49, -47, -45, -43, -41, -39, -37, -35, -33, -31, -29, 
      -27, -25, -23, -21, -19, -17, -15, -13, -11, -9, -7, -5, -3, -1, 1, 3, 5, 7, 9, 11, 
      13, 15, 17, 19, 21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 
      55, 57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89],
      "COADSX":  [21, 23, 25, 27, 29, 31, 33, 35, 37, 39, 41, 43, 45, 47, 49, 51, 53, 55, 
      57, 59, 61, 63, 65, 67, 69, 71, 73, 75, 77, 79, 81, 83, 85, 87, 89, 91, 93, 95, 97, 
      99, 101, 103, 105, 107, 109, 111, 113, 115, 117, 119, 121, 123, 125, 127, 129, 131, 
      133, 135, 137, 139, 141, 143, 145, 147, 149, 151, 153, 155, 157, 159, 161, 163, 165, 
      167, 169, 171, 173, 175, 177, 179, 181, 183, 185, 187, 189, 191, 193, 195, 197, 199, 
      201, 203, 205, 207, 209, 211, 213, 215, 217, 219, 221, 223, 225, 227, 229, 231, 233, 
      235, 237, 239, 241, 243, 245, 247, 249, 251, 253, 255, 257, 259, 261, 263, 265, 267, 
      269, 271, 273, 275, 277, 279, 281, 283, 285, 287, 289, 291, 293, 295, 297, 299, 301, 
      303, 305, 307, 309, 311, 313, 315, 317, 319, 321, 323, 325, 327, 329, 331, 333, 335, 
      337, 339, 341, 343, 345, 347, 349, 351, 353, 355, 357, 359, 361, 363, 365, 367, 369, 
      371, 373, 375, 377, 379]
     }
    }

11.C.11. The Gateway Module
Introduction

The Gateway Service provides interoperability between Hyrax and other web services. Using the Gateway module, Hyrax can be used to access and subset data served by other web services so long as those services return the data in a form Hyrax has been configured to serve. For example, if a web service returns data using HDF4 files, then Hyrax, using the gateway module, can subset and return DAP responses for those data.

Special Options Supported by the Handler

Limiting Access to Specific Hosts

Because this handler behaves like a web client, some special options must be configured for it to work. As distributed, the handler is limited to accessing only the local host; this prevents misuse (where your copy of Hyrax might be used to access all kinds of other sites). The gateway’s configuration file contains a 'whitelist' of allowed hosts, and only hosts on that list will be accessed by the gateway.

Gateway.Whitelist

provides a list of URLs of the form protocol://host.domain:port that will be passed through the gateway module. If a request is made to access a web service not listed on the whitelist, Hyrax returns an error. Note that a whitelist entry can be more specific than just a hostname - in principle it can limit access to a particular set of requests to a particular web service (see the additional example below).

Example:

Gateway.Whitelist=http://test.opendap.org/opendap
Gateway.Whitelist+=http://opendap.rpi.edu/opendap
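
A whitelist entry may also include a path component to limit access to a single collection under a host. The path in this example is hypothetical:

Gateway.Whitelist+=http://test.opendap.org/opendap/data/nc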

Recognizing Responses

Gateway.MimeTypes

provides a list of mappings from data handler modules to MIME types. When the remote service returns a response, if that response carries one of the listed MIME types (e.g., application/x-hdf5), the gateway will process it using the named handler (e.g., h5). If the service does not include this information, the gateway will try other ways to determine how to work with the response.

These are the default types:

Gateway.MimeTypes=nc:application/x-netcdf
Gateway.MimeTypes+=h4:application/x-hdf
Gateway.MimeTypes+=h5:application/x-hdf5

Network Proxies and Performance Optimizations

There are four parameters used to configure a proxy server for the gateway. Nominally the proxy is used as a cache, so that files do not have to be repeatedly fetched from the remote service; that’s why we consider this a 'performance' feature. We have tested the handler with Squid because it is widely used on both Linux and OS/X and because, in addition to its proxy capabilities, it is often used as a cache. A proxy can also be used to navigate firewalls.

Gateway.ProxyProtocol

The protocol(s) the proxy supports. Nominally this should be http.

Gateway.ProxyHost

The host on which the proxy server runs. Often you will want to use localhost for this.

Gateway.ProxyPort

The port the proxy listens on. Squid defaults to 3128 (the SquidMan example below uses 3218).

Gateway.NoProxy

Provide a regular expression that describes URLs that should not be sent to the proxy. This is particularly useful when the gateway runs on the same host that stages the service it accesses. In these cases, a proxy/cache like Squid may not process 'localhost' URLs unless its configuration is tweaked quite a bit (and there may be no performance advantage to having the proxy/cache store extra copies of files that are already on the host). This parameter was added in version 1.1.0.

Gateway.ProxyProtocol=
Gateway.ProxyHost=
Gateway.ProxyPort=
Gateway.NoProxy=

Using Squid

Squid makes a great cache for the gateway. In our testing we have used Squid only for services running on port 80.

Squid is a powerful tool and it is worth looking at its web page.

Squid and Dynamic Content

Squid follows the HTTP/1.1 specification to determine what to cache and for how long. However, you may want to force Squid to ignore some of the information supplied by certain web services (or to apply different defaults when the standard information is not present). If you are working with a web server that does not include caching-control headers in its responses but does have 'cgi-bin' or '?' in its URLs, here’s how to override Squid’s default behavior, which is to never cache items returned from a 'dynamic' source (i.e., one with 'cgi-bin' or '?' in the URL). The value below will cause Squid to cache responses from a dynamic source for 1440 minutes unless a response includes an Expires: header telling the cache to behave differently.

In the squid configuration file, find the lines:

# refresh patterns (squid-recommended)
refresh_pattern ^ftp:       1440    20% 10080
refresh_pattern ^gopher:    1440    0%  1440
refresh_pattern -i (/cgi-bin/|\?) 0 0%  0
refresh_pattern .       0   20% 4320

And change the third refresh_pattern so that it reads as follows (the three numeric fields are the minimum age in minutes, the percent of the object’s age during which it is considered fresh, and the maximum age in minutes):

refresh_pattern -i (/cgi-bin/|\?) 1440  20% 10080

How can I tell if a service sends Cache Control headers?

A simple way to check is to request a resource from the service and inspect the response headers for Cache-Control, Expires, and related fields.
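
For example, this short Python sketch prints the headers Squid uses to make caching decisions. The URL is only an example; substitute the service you are testing (and if the service does not answer HEAD requests, a normal GET will show the same headers):

import urllib.request

# Example URL only; substitute the service you want to test.
url = "http://test.opendap.org/opendap/data/nc/coads_climatology.nc.das"

request = urllib.request.Request(url, method="HEAD")
with urllib.request.urlopen(request) as response:
    # Squid keys its caching decisions off headers like these.
    for name in ("Cache-Control", "Expires", "Last-Modified", "ETag"):
        print(name, ":", response.headers.get(name))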

Using Squid on OS/X

If you’re using OS/X to run Hyrax, the easiest Squid port to use is SquidMan. We tested SquidMan 3.0 (Squid 3.1.1). Run the SquidMan application and, under Preferences… General, set the port to something like 3218, the cache size to something big (16GB), and the Maximum object size to 256M. Click 'Save' and you’re almost done.

Now in the gateway.conf file, set the proxy parameters like so:

Gateway.ProxyProtocol=http
Gateway.ProxyHost=localhost
Gateway.ProxyPort=3218
Gateway.NoProxy=http://localhost.*

…assuming you’re running both Squid and Hyrax on the same host.

Restart the BES and you’re all set.

To test, make some requests using the gateway (http://localhost/opendap/gateway) and click on SquidMan’s 'Access Log' button to see the caching at work. The first access, which fetches the data, will say DIRECT/<ip number> while cache hits will be labeled NONE/-.

Squid, OS/X and Caching Dynamic Content

By default, SquidMan does not cache dynamic content that lacks cache-control headers in the response. To edit the squid.conf template and make the change to the refresh_pattern described above, do the following:

  1. Under Preferences… choose the 'Template' tab and scroll to the bottom of the text;
  2. Edit the refresh_pattern line, replacing "0 0% 0" with "1440 20% 10080"; and
  3. 'Save' and then 'Stop Squid' and 'Start Squid' (note the helpful status messages in the 'Start/Stop' window).


Known Problems

For version 1.0.1 of the gateway, we know about the following problems:

  1. Squid does not cache requests to localhost, but our use of the proxy server does not bypass requests to localhost. Thus, using the gateway to access data from a service running on localhost will fail when Squid is in use, since the gateway will route the request to the proxy (i.e., Squid), where it will generate an error.
  2. Not using a caching proxy server will result in poor performance.

11.C.12. Gateway Service
Gateway Service Overview

The Gateway Service provides Hyrax with the ability to apply DAP constraint expressions and server-side functions to data available through any network URL. This is accomplished by encoding the data source URL into the DAP request URL supplied to the Gateway Service. The Gateway Service decodes the URL and uses the BES to retrieve the remote data resource and transmit the appropriate DAP response back to the client. The system employs a whitelist to control which data systems the BES will access.

A Data Service Portal (DSP), such as Mirador, will:

  • Provide the navigation/search/discovery interface to the data source.
  • Generate the data source URLs.
  • Encode the data source URLs.
  • Build a regular DAP query as the DAP dataset ID.
  • Hand this to the client (via a link or similar element in the DSP interface)

BES Gateway Module

The Gateway Module handles the gathering of the remote data resource and the construction of the DAP response.

The Gateway Module:

  • Evaluates the data source URL against a whitelist to determine access permission
  • Retrieves the remote data source
  • Determines the data type by:
    • Using data type information supplied by other parts of the server
    • Using the HTTP Content-Disposition header
    • Applying the BES.TypeMatch string to the last term in the path section of the data source URL (see the sketch below)
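
For illustration, BES.TypeMatch entries pair a handler module name with a regular expression that is matched against the resource name. These patterns are examples only, not the defaults shipped with any particular module:

BES.TypeMatch=nc:.*\.nc(\.gz|\.bz2)?$;
BES.TypeMatch+=h4:.*\.(hdf|HDF)$;
BES.TypeMatch+=h5:.*\.h5$;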

The BES will not persist the data resources beyond the scope of each request.

OLFS Gateway Service

The Gateway Service is responsible for:

  • Decoding the incoming dataset URLs.
  • Building the request for the BES.
  • Returning the response from the BES to the client.

Encoding Data Source URLs

The data source URLs need to be encoded in the DAP data request URL that is used to access the Gateway Service.

There are many ways the data source URL could be encoded in this context; a prototype encoding is described below.

Prototype Encoding

As a prototype encoding we’ll use a hex ASCII encoding, in which each character of the data source URL is expressed as its hexadecimal value using ASCII characters.

Here is hexEncoder.tgz (sig), a gzipped tar file containing a Java application that can perform the encoding and decoding duties from the command line. Give it a whirl - it’s a Java application in a jar file, and there is a bash script (hexEncode) that should launch it.

The source code for the EncodeDecode Java class used by hexEncode is available here: http://scm.opendap.org/svn/trunk/olfs/src/opendap/gateway/EncodeDecode.java

Example 1. Encoding a simple URL

stringToHex(http://www.google.com) → 687474703a2f2f7777772e676f6f676c652e636f6d

hexToString(687474703a2f2f7777772e676f6f676c652e636f6d) → http://www.google.com
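
The transformation is easy to reproduce in other languages. Here is a minimal Python sketch of the prototype encoding; the function names mirror the stringToHex/hexToString operations shown above, and the gateway URL in the final comment is only an illustration of how the encoded string might be used:

def string_to_hex(s):
    # Express each character of the URL as its two-digit hexadecimal value.
    return s.encode("ascii").hex()

def hex_to_string(h):
    # Recover the original URL from the hex ASCII encoding.
    return bytes.fromhex(h).decode("ascii")

encoded = string_to_hex("http://www.google.com")
print(encoded)                  # 687474703a2f2f7777772e676f6f676c652e636f6d
print(hex_to_string(encoded))   # http://www.google.com

# The encoded string is then embedded in the DAP request URL handled by the
# Gateway Service, e.g. (hypothetical form):
#   http://localhost/opendap/gateway/<encoded>.dds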

Last Updated: Sep 24, 2019 at 4:13 PM EDT