v2.3 (19 Dec 2019)
This document explains the changes made to Iris for this release.
Support for CF 1.7
We have introduced several changes that contribute to Iris’s support for the CF Conventions, including some CF 1.7 additions. These are described below.
Additionally, the conventions attribute, added by Iris when saving to NetCDF, has been updated to CF-1.7.
Climatological Coordinate Support
Iris can now load, store and save NetCDF climatological coordinates. Any cube time
coordinate can be marked as a climatological time axis using the boolean property
climatological. The climatological bounds are stored in the coordinate’s
bounds property.
When an Iris climatological coordinate is saved in NetCDF, the NetCDF
coordinate variable will be given a ‘climatology’ attribute, and the
contents of the
bounds property are written to a NetCDF boundary variable
called ‘<coordinate-name>_bounds’. These are in place of a standard
‘bounds’ attribute and accompanying boundary variable. See below
for an example adapted from CF conventions:
dimensions:
  time=4;
  bnds=2;
variables:
  float temperature(time,lat,lon);
    temperature:long_name="surface air temperature";
    temperature:cell_methods="time: minimum within years time: mean over years";
    temperature:units="K";
  double time(time);
    time:climatology="time_climatology";
    time:units="days since 1960-1-1";
  double time_climatology(time,bnds);

data:  // time coordinates translated to date/time format
  time="1960-4-16", "1960-7-16", "1960-10-16", "1961-1-16" ;
  time_climatology="1960-3-1", "1990-6-1", "1960-6-1", "1990-9-1",
                   "1960-9-1", "1990-12-1", "1960-12-1", "1991-3-1" ;
If a climatological time axis is detected when loading NetCDF -
indicated by the format described above - the climatological property
of the Iris coordinate will be set to True.
New Chunking Strategy
Iris now makes better choices of Dask chunk sizes when loading from NetCDF files: if a file variable has small, specified chunks, Iris will now choose Dask chunks that are a whole multiple of these, up to a default target size.
This is particularly relevant to files with an unlimited dimension, which previously could produce a large number of small chunks. This had an adverse effect on performance.
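The idea behind the strategy can be sketched in plain Python (a simplified illustration, not Iris’s actual implementation; the function name and target value are invented):

```python
def expand_chunk(file_chunk, dim_length, target):
    """Return a Dask chunk size that is a whole multiple of the file's
    stored chunk size: as large as possible without exceeding the target
    (or the dimension length)."""
    multiple = max(1, target // file_chunk)
    return min(file_chunk * multiple, dim_length)

# A variable stored with tiny chunks of 10 along an unlimited dimension of
# length 100_000, aiming at a target of 25_000 elements per chunk:
print(expand_chunk(10, 100_000, 25_000))  # -> 25000, i.e. 2500 file chunks
```

Merging many small file chunks into one Dask chunk in this way avoids the scheduling overhead of a very large task graph.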
In addition, Iris now takes its default chunk size from the default configured
in Dask itself, i.e. the value of dask.config.get('array.chunk-size').
Several statistical operations can now be performed lazily, taking advantage of the performance improvements offered by Dask.
Cube data equality testing (and hence cube equality) now uses a more relaxed tolerance: this means that some cubes may now test ‘equal’ that previously did not. Previously, Iris compared cube data arrays using
abs(a - b) < 1.e-8
We now apply the default operation of
numpy.allclose() instead, which is equivalent to
abs(a - b) < (1.e-8 + 1.e-5 * abs(b))
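For illustration, the difference between the two criteria can be seen directly with NumPy (the array values here are invented):

```python
import numpy as np

a = np.array([1.0, 100.0])
b = a + np.array([5.0e-7, 5.0e-4])  # tiny absolute and relative offsets

# Old criterion: a strict absolute tolerance of 1e-8 -> arrays compare unequal.
old_equal = bool(np.all(np.abs(a - b) < 1.0e-8))

# New criterion: numpy.allclose defaults (rtol=1e-5, atol=1e-8), i.e.
# abs(a - b) <= 1e-8 + 1e-5 * abs(b) -> arrays compare equal.
new_equal = bool(np.allclose(a, b))

print(old_equal, new_equal)  # -> False True
```

The relative term scales the tolerance with the magnitude of the data, so large-valued arrays are no longer declared unequal over negligible differences.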
Added support to render HTML for CubeList in Jupyter Notebooks and JupyterLab.
Loading CellMeasures with integer values is now supported.
iris.coord_systems.VerticalPerspective can now be saved to and loaded from NetCDF files.
Iris now supports standard name modifiers. See Appendix C, Standard Name Modifiers for more information.
iris.cube.Cube.remove_cell_measure() now also allows removal of a cell measure by its name (previously it only accepted a CellMeasure object).
The iris.analysis.RMS aggregator now supports a lazy calculation. However, the “weights” keyword is not currently supported by this, so a weighted calculation will still return a realised result, and force realisation of the original cube data.
Iris now supports NetCDF Climate and Forecast (CF) Metadata Conventions 1.7 (see the CF 1.7 Conventions Document for more information).
Updated standard name support to CF standard name table version 70, 2019-12-10
Updated UM STASH translations to metarelate/metOcean commit 448f2ef, 2019-11-29
Cube equality of boolean data is now handled correctly.
Fixed a bug where cell measures were incorrect after a cube transpose() operation. Previously, this resulted in cell measures that were no longer correctly mapped to the cube dimensions.
Previously, AuxCoord disregarded masked points and bounds, as did DimCoord. The fix permits an AuxCoord to contain masked points/bounds, and a TypeError exception is now raised when attempting to create or set the points/bounds of a DimCoord with arrays containing missing points.
var_name properties will now only allow valid NetCDF name tokens to reference the corresponding NetCDF variable name. Note that names with a leading underscore are not permitted.
Rendering a cube in Jupyter will no longer crash for a cube with attributes containing
NetCDF variables which reference themselves in their cell_measures attribute can now be read.
quiver() now handles circular coordinates.
The names of cubes loaded from abf/abl files have been corrected.
Fixed a bug in UM file loading, where any landsea-mask-compressed fields (i.e. with LBPACK=x2x) would cause an error later, when realising the data.
iris.cube.Cube.collapsed()now handles partial collapsing of multidimensional coordinates that have bounds.
Fixed a bug in the PROPORTION aggregator, where cube data in the form of a masked array with array.mask=False would cause an error, possibly only later when the values were actually realised. (Note: since netCDF4 version 1.4.0, this is a common form for data loaded from netCDF files.)
Fixed a bug where plotting a cube with an iris.coord_systems.LambertConformal coordinate system would result in an error. This would happen if the coordinate system was defined with one standard parallel, rather than two. In these cases, a call to as_cartopy_crs() would fail.
iris.cube.Cube.aggregated_by() now gives correct values in points and bounds when handling multidimensional coordinates.
Fixed a bug in the iris.cube.Cube.collapsed() operation, which caused the unexpected realisation of any attached auxiliary coordinates that were bounded. It now correctly produces a lazy result and does not realise the original attached AuxCoords.
Iris now supports Proj4 up to version 5, but not yet 6 or beyond, pending fixes to some cartopy tests.
Iris now requires Dask >= 1.2 to allow for improved coordinate equality checks.
Adopted a new colour logo for Iris.
Added a gallery example showing how to concatenate NEMO ocean model data, see Load a Time Series of Data From the NEMO Model.
Added an example for loading Iris cubes for Constraining on Time in the user guide, demonstrating how to load data within a specified date range.
Added notes to the
iris.load()documentation, and the user guide Loading Iris Cubes chapter, emphasizing that the order of the cubes returned by an iris load operation is effectively random and unstable, and should not be relied on.