Support loading Iris cubes from NetCDF files using the CF conventions for metadata interpretation.

See: the NetCDF User’s Guide and the netCDF4 Python module.

Also: the CF Conventions.

iris.fileformats.netcdf.loader.CHUNK_CONTROL = <iris.fileformats.netcdf.loader.ChunkControl object>#

The global ChunkControl object providing user-control of Dask chunking when Iris loads NetCDF files.

class iris.fileformats.netcdf.loader.ChunkControl(var_dim_chunksizes=None)[source]#

Bases: _local

Provide user control of Dask chunking.

The NetCDF loader is controlled by the single instance of this: the CHUNK_CONTROL object.

A chunk size can be set for a specific (named) file dimension, when loading specific (named) variables, or for all variables.

When a selected variable is a CF data-variable, which loads as a Cube, then the given dimension chunk size is also fixed for all variables which are components of that Cube, i.e. any Coord, CellMeasure, AncillaryVariable etc. This can be overridden, if required, by variable-specific settings.

For this purpose, MeshCoord and Connectivity are not Cube components, and chunk control on a Cube data-variable will not affect them.
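The precedence described above (variable-specific settings override Cube-wide ones, which in turn override the all-variables default) can be sketched with a plain dictionary lookup. The storage layout below, with `"*"` standing for all-variables settings, is an illustrative assumption, not Iris's internal representation:

```python
# Illustrative sketch of the chunk-size lookup precedence described above.
# settings[var_name][dim_name] -> chunk size; "*" applies to all variables.
settings = {
    "*": {"time": 120},                 # applies to every variable
    "air_temperature": {"time": 10},    # variable-specific override
}

def chunksize_for(var_name, dim_name, default):
    """Return the controlled chunk size for one dimension of one variable."""
    for key in (var_name, "*"):  # specific setting wins over the global one
        size = settings.get(key, {}).get(dim_name)
        if size is not None:
            return size
    return default  # uncontrolled dimension: fall back to normal behaviour

print(chunksize_for("air_temperature", "time", 360))      # -> 10
print(chunksize_for("latitude_longitude", "time", 360))   # -> 120
print(chunksize_for("air_temperature", "depth", 33))      # -> 33
```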

class Modes(value, names=<not given>, *values, module=None, qualname=None, type=None, start=1, boundary=None)[source]#

Bases: Enum

Enumeration of the available chunk-control modes.

AS_DASK = 3#

DEFAULT = 1#

FROM_FILE = 2#
classmethod __contains__(value)#

Return True if value is in cls.

value is in cls if: 1) value is a member of cls, or 2) value is the value of one of the cls’s members.

classmethod __getitem__(name)#

Return the member matching name.

classmethod __iter__()#

Return members in definition order.

classmethod __len__()#

Return the number of members (no aliases)


as_dask()[source]#

Rely on Dask Array to control chunk sizes.

This function acts as a context manager, for use in a with block.


from_file()[source]#

Ensure the chunk sizes are loaded in from NetCDF file variables.

Raises:

KeyError – If any NetCDF data variables - i.e. those that become Cubes - do not specify chunk sizes.

This function acts as a context manager, for use in a with block.
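Both of these methods temporarily switch the controller's mode for the duration of a with block, restoring the previous mode on exit. A minimal stdlib-only sketch of that pattern, using a hypothetical `mode` attribute rather than Iris's actual internals:

```python
import threading
from contextlib import contextmanager
from enum import Enum

class Modes(Enum):
    DEFAULT = 1
    FROM_FILE = 2
    AS_DASK = 3

class MiniChunkControl(threading.local):
    """Toy stand-in for ChunkControl: only the mode-switching behaviour."""

    def __init__(self):
        self.mode = Modes.DEFAULT

    @contextmanager
    def as_dask(self):
        # Temporarily let Dask decide chunk sizes, restoring the old
        # mode even if the with-block raises.
        old, self.mode = self.mode, Modes.AS_DASK
        try:
            yield
        finally:
            self.mode = old

control = MiniChunkControl()
with control.as_dask():
    assert control.mode is Modes.AS_DASK
assert control.mode is Modes.DEFAULT  # restored on exit
```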

set(var_names=None, **dimension_chunksizes)[source]#

Control the Dask chunk sizes applied to NetCDF variables during loading.

Parameters:

  • var_names (str or list of str, default=None) – Apply the dimension_chunksizes controls only to these variables, or when building Cube from these data variables. If None, settings apply to all loaded variables.

  • **dimension_chunksizes (dict of {str: int}) – Kwargs specifying chunksizes for dimensions of file variables. Each key-value pair defines a chunk size for a named file dimension, e.g. {'time': 10, 'model_levels':1}. Values of -1 will lock the chunk size to the full size of that dimension.


This function acts as a context manager, for use in a with block.

>>> import iris
>>> from iris.fileformats.netcdf.loader import CHUNK_CONTROL
>>> with CHUNK_CONTROL.set("air_temperature", time=180, latitude=-1):
...     cube = iris.load(iris.sample_data_path(""))[0]

When var_names is present, the chunk size adjustments are applied only to the selected variables. However, for a CF data variable, this extends to all components of the (raw) Cube created from it.

Un-adjusted dimensions have chunk sizes set in the ‘usual’ way. That is, according to the normal behaviour of iris._lazy_data.as_lazy_data(), which is: chunk size is based on the file variable chunking, or full variable shape; this is scaled up or down by integer factors to best match the Dask default chunk size, i.e. the setting configured by dask.config.set({'array.chunk-size': '250MiB'}).
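The "scaled up or down by integer factors" behaviour can be illustrated for a single dimension: choose the integer factor of the dimension length whose resulting chunk lands closest to the byte target. This is a simplified one-dimensional sketch, not iris._lazy_data.as_lazy_data() itself:

```python
def best_chunk(dim_len, itemsize, target_bytes):
    """Choose a chunk length for one dimension: divide the full length by
    the integer factor whose chunk size lands nearest the byte target."""
    best, best_err = dim_len, abs(dim_len * itemsize - target_bytes)
    for factor in range(2, dim_len + 1):
        chunk = -(-dim_len // factor)  # ceiling division
        err = abs(chunk * itemsize - target_bytes)
        if err < best_err:
            best, best_err = chunk, err
    return best

# 8-byte values, aiming at ~800-byte chunks:
print(best_chunk(1000, 8, 800))  # -> 100 (factor of 10)
print(best_chunk(50, 8, 800))    # -> 50 (whole dimension already under target)
```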

iris.fileformats.netcdf.loader.load_cubes(file_sources, callback=None, constraints=None)[source]#

Load cubes from a list of NetCDF filenames/OPeNDAP URLs.

Parameters:

  • file_sources (str or list) – One or more NetCDF filenames, OPeNDAP URLs, or open datasets to load from.

  • callback (function, optional) – Function which can be passed on to iris.io.run_callback().

  • constraints (optional)

Return type:

Generator of loaded NetCDF iris.cube.Cube.
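Because load_cubes returns a generator, cubes materialise only as they are consumed. A hedged sketch of that consumption pattern, with a stub generator standing in for the real loader (the dict "cubes", the constraint test, and the callback signature are placeholders for illustration only):

```python
def load_cubes(file_sources, callback=None, constraints=None):
    """Stub with the same shape as the real loader: yield one 'cube'
    per source, filtered by constraints and post-processed by callback."""
    for source in file_sources:
        cube = {"name": source}  # placeholder for a real iris.cube.Cube
        if constraints is not None and cube["name"] not in constraints:
            continue  # constrained out: never yielded
        if callback is not None:
            # Iris callbacks take (cube, field, filename).
            cube = callback(cube, None, source)
        yield cube

cubes = list(load_cubes(["a.nc", "b.nc"], constraints=["a.nc"]))
print([c["name"] for c in cubes])  # -> ['a.nc']
```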