
[CF-metadata]

From: Bob Bane <bob.bane>
Date: Tue, 3 Aug 2004 10:23:21 -0400

The increment and offset are scalars. In HDF-EOS, by convention either both are positive (meaning a smaller geolocation dimension maps to a larger data dimension) or both are negative (a larger geolocation dimension maps to a smaller data one - I think they threw that one in for generality, not that they ever expect to see it used).

If you have an offset of 1 and an increment of 2, then

Geoloc[0] has the value for Data[1] (0 * 2 + 1)
Geoloc[1] has the value for Data[3] (1 * 2 + 1)
Geoloc[2] has the value for Data[5] (2 * 2 + 1)
etc. ...
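
Concretely, here's a little Python/numpy sketch of that index mapping (the array values are made up, just to illustrate the convention):

    import numpy as np

    # Hypothetical 1-D swath: 10 data points, geolocation every 2nd point
    # starting at data index 1 (increment = 2, offset = 1, as above).
    increment, offset = 2, 1
    data = np.arange(10)                                 # Data[0..9]
    geoloc = np.array([10.1, 10.3, 10.5, 10.7, 10.9])    # Geoloc[0..4]

    for i in range(len(geoloc)):
        j = i * increment + offset    # data index that Geoloc[i] describes
        print(f"Geoloc[{i}] = {geoloc[i]} locates Data[{j}] = {data[j]}")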

This gets nasty when you want to geolocate (say) Data[2] - I was going to punt and do a linear interpolation, but my coworkers say I should consider some sort of curve fit over more points.
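
If we do punt and interpolate linearly, the expansion is basically one call to np.interp (again just a sketch, not what the converter currently does; note np.interp clamps at the ends, so Data[0] would simply reuse Geoloc[0], which may or may not be acceptable):

    import numpy as np

    increment, offset = 2, 1
    geoloc = np.array([10.1, 10.3, 10.5, 10.7, 10.9])    # Geoloc[0..4]
    n_data = 10

    # Data indices actually described by the geolocation values: [1, 3, 5, 7, 9]
    mapped = offset + increment * np.arange(len(geoloc))

    # Linearly interpolate a geolocation value for every data index.
    full_geoloc = np.interp(np.arange(n_data), mapped, geoloc)
    print(full_geoloc)    # full_geoloc[2] lands halfway between Geoloc[0] and Geoloc[1]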

Also, this is just the one-dimensional case - two-D will be most common, and the HDF-EOS API supports N-dimensions. My head hurts...
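
For the 2-D case the same trick can be applied per axis; here's a sketch using scipy's RegularGridInterpolator (all shapes, increments, and offsets here are hypothetical):

    import numpy as np
    from scipy.interpolate import RegularGridInterpolator

    # Hypothetical 2-D swath: data is 10 x 20, geolocation is 5 x 10,
    # with increment = 2 and offset = 1 along each data dimension.
    inc, off = (2, 2), (1, 1)
    geo_lat = np.random.rand(5, 10)     # stand-in geolocation field

    # Data indices described by each geolocation row/column
    rows = off[0] + inc[0] * np.arange(geo_lat.shape[0])   # [1, 3, 5, 7, 9]
    cols = off[1] + inc[1] * np.arange(geo_lat.shape[1])   # [1, 3, ..., 19]

    # Bilinear interpolation onto the full data grid; fill_value=None lets
    # it extrapolate for data indices outside the mapped range (e.g. row 0).
    interp = RegularGridInterpolator((rows, cols), geo_lat,
                                     bounds_error=False, fill_value=None)
    ii, jj = np.meshgrid(np.arange(10), np.arange(20), indexing="ij")
    full_lat = interp(np.stack([ii, jj], axis=-1))
    print(full_lat.shape)    # (10, 20) -- same shape as the data variable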

    - Bob



-----Original Message-----
From: Steve Hankin [mailto:Steven.C.Hankin at noaa.gov]
Sent: Mon 8/2/2004 1:21 PM
To: Bob Bane
Cc: cf-metadata at cgd.ucar.edu
Subject: Re: [CF-metadata]
 
Hi Bob,

I'll get the conversation going, because I believe that several of the
principals are off on vacation ...

Can you clarify the key statement

     "Swaths let you map data variables to geolocation variables,
     but they also allow the geolocation dimensions to be different
     sizes than the data dimensions - the mapping between the
     dimensions contains offset and increment values so that, in the
     one-dimensional case:

     Geoloc(i) is mapped to Data(i * increment + offset)"

In the example that you have given ("Data(i * increment + offset)") the
transformation does not appear to have changed the dimensionality --
there is still the single index "i". Are "increment" or "offset" arrays
rather than scalars?

    thanks - steve

======================================================================

Bob Bane wrote:

> I'm in the middle of updating a program that converts HDF-EOS files to
> netCDF. The old version is available here:
>
> http://hdfeos.gsfc.nasa.gov/hdfeos/details.cfm?swID=84
>
> It converts HDF-EOS5 files to netCDF, and attempts to preserve COARDS
> compliance (if the original file had COARDS-compatible metadata, so
> will the new one).
>
> The updated version is going to handle both HDF-EOS2 and HDF-EOS5
> (which already works), and I now want to make it CF-compatible. The
> auxiliary coordinate variables conventions in particular look like a
> good fit for the HDF-EOS Swath datatype, except for one thing.
>
> Swaths let you map data variables to geolocation variables, but they
> also allow the geolocation dimensions to be different sizes than the
> data dimensions - the mapping between the dimensions contains offset
> and increment values so that, in the one-dimensional case:
>
> Geoloc(i) is mapped to Data(i * increment + offset)
>
> As near as I can tell from the CF spec, auxiliary coordinate variables
> must have the same size dimensions as their corresponding data
> variables, so there's no way to translate a Swath to CF-compliant
> netCDF without expanding and interpolating the geolocation variables to
> make them match the data dimensions. Does this make sense, or am I
> missing something?
>
> - Bob
>
>
--
Steve Hankin, NOAA/PMEL -- Steven.C.Hankin at noaa.gov
7600 Sand Point Way NE, Seattle, WA 98115-0070
ph. (206) 526-6080, FAX (206) 526-6744