Hi Jonathan, Ethan,
> Dear Ethan
>
> I think that the current rules are a good compromise between the needs of
> people who write and analyse data, and the needs of developers of analysis
> and other software. The former group of people would like CF to be modified
> fairly rapidly, when they are about to start producing data from a project,
> and they want that data to have proper metadata. As you will have seen from
> previous discussions, our discussions are too slow as it is sometimes. Hence
> we decided the rules so that changes could be made, but marked as provisional.
Indeed. I think the 4-plus-year gap between CF 1.0 and CF 1.1 - according
to the date stamps on the documents - says it all. Perhaps the recent
flurry of CF proposal activity in part reflects a general desire to
'play catch-up'.
>
> For provisional changes to become permanent depends on at least two
> applications supporting them. That requires some development effort to be
> invested. CF doesn't have staff resources of its own to commit to it. I think
> the most likely applications to make changes first are the cf-checker and
> libcf. It will be interesting to see how long it takes for the changes so far
> agreed to be implemented in these or other applications.
>
> I fear that if we followed this approach:
>
> > 2) Don't add changes to the upcoming version of the specification
> > document "until at least two applications have successfully interpreted
> > the test data".
>
> development of CF would effectively be halted altogether. It would be
> impossible for writers of data to agree changes to the CF standard on a
> short enough timescale. Consequently they would bypass CF, and write and
> analyse data with their own metadata conventions, and the usefulness of CF
> in providing a common standard would be undermined.
I agree 100% with this. If, as a community, we set the barrier to
progress [of the CF conventions] too high then people will necessarily
devise local, incompatible solutions - not out of willfulness, but
simply to meet project deadlines.
>
> Applications don't have to keep entirely up to date, do they? I think the
> value of the Conventions attribute should be that it is easy to be clear
> about what conventions are being implemented in data and metadata.
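Agreed. In practice that boils down to a single global attribute; the
version string below is only illustrative:

```
// global attribute advertising the conventions in force
:Conventions = "CF-1.2" ;
```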
>
> I agree about the test data. We should construct a file which contains
> some test data for the changes of CF 1.2. (The changes of CF 1.1 did not
> introduce any new attribute.) We'll need a place to deposit such files.
> As moderator of that ticket, I'll discuss it with Phil and Velimir.
I can produce some simple test files for the changes in CF 1.2. But the
question of what constitutes application conformance is, I suggest, not
easily defined. For instance, I could create a noddy netCDF file with
two new grid mapping attributes, as follows:
float temperature(t, z, lat, lon) ;
    temperature:grid_mapping = "crs" ;
char crs ;
    crs:grid_mapping_name = "latitude_longitude" ;
    crs:semi_major_axis = 92389234. ; // new at CF 1.2
    crs:semi_minor_axis = 78682347. ; // new at CF 1.2
And I could read this file today using, say, ncdump and ncview, which
clearly doesn't tell us much. Yet a proposer of a given CF change cannot
force the hands of software developers to produce compliant software
within a particular time frame, if at all. In some (many?) circumstances
I think we have to take it as an act of faith that a particular update
to the CF convention will be advantageous. Plus I believe that the
robustness of the CF peer review and challenge mechanisms is sufficient
to ensure that those updates will be advantageous.
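As a straw man, here is roughly what I have in mind by "recognition" as
opposed to merely "the file opens". The function and structure below are
entirely my own invention (a parsed file is modelled as plain dicts, not
a real netCDF reader), and the attribute names are just those from the
noddy example above:

```python
# Straw-man sketch of a CF 1.2 "recognition" check. Everything here
# (names, structure) is hypothetical, not an agreed conformance test.

# The ellipsoid attributes proposed at CF 1.2, per the example above.
CF12_ELLIPSOID_ATTRS = {"semi_major_axis", "semi_minor_axis"}

def check_grid_mapping(variables):
    """Return a list of problems found in the grid mapping metadata.

    `variables` maps variable name -> dict of that variable's attributes.
    An empty list means this (very weak) check passed.
    """
    problems = []
    for name, attrs in variables.items():
        gm = attrs.get("grid_mapping")
        if gm is None:
            continue  # not a mapped variable
        if gm not in variables:
            problems.append(f"{name}: grid_mapping variable '{gm}' missing")
            continue
        gm_attrs = variables[gm]
        if "grid_mapping_name" not in gm_attrs:
            problems.append(f"{gm}: no grid_mapping_name attribute")
        # The new CF 1.2 attributes must be numeric, not strings.
        for attr in CF12_ELLIPSOID_ATTRS & set(gm_attrs):
            if not isinstance(gm_attrs[attr], (int, float)):
                problems.append(f"{gm}:{attr} should be numeric, not "
                                f"{type(gm_attrs[attr]).__name__}")
    return problems

# The noddy file from above, with numeric ellipsoid values:
noddy = {
    "temperature": {"grid_mapping": "crs"},
    "crs": {
        "grid_mapping_name": "latitude_longitude",
        "semi_major_axis": 92389234.0,   # new at CF 1.2
        "semi_minor_axis": 78682347.0,   # new at CF 1.2
    },
}
print(check_grid_mapping(noddy))  # → []
```

Of course this only tests that an application notices the attribute
names, not that it does anything sensible with the values - which is
rather my point: "two applications support it" is a social judgement as
much as a mechanical one.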
Regards,
Phil
Received on Thu May 08 2008 - 04:53:44 BST