
v3.6 (18 May 2023)

This document explains the changes made to Iris for this release (view all changes).

v3.6 Release Highlights

We’re so excited about our recent support for delayed saving of lazy data to netCDF (PR #5191) that we’re celebrating this important step change in behaviour with its very own dedicated release 🥳

By using iris.save(..., compute=False) you can now save to multiple NetCDF files in parallel. See the new compute keyword in iris.fileformats.netcdf.save(). This can share and re-use any common (lazy) result computations, and it makes much better use of resources during any file-system waiting (i.e., it can use such periods to progress the other saves).

Usage example:

import dask
import iris

# Create output files with delayed data saving.
delayeds = [
    iris.save(cubes, filepath, compute=False)
    for cubes, filepath in zip(output_cubesets, output_filepaths)
]
# Complete saves in parallel.
dask.compute(*delayeds)

This advance also brings another substantial benefit: NetCDF saves can now use a Dask.distributed scheduler, so with Distributed you can parallelise the saves across a whole cluster. Previously, NetCDF saving only worked with a “threaded” scheduler, limiting it to a single CPU.
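Because the delayed objects returned by iris.save(..., compute=False) are ordinary Dask graphs, they run under whichever scheduler is active when dask.compute() is called. The sketch below illustrates this scheduler independence using plain dask.delayed stand-ins for the saves; fake_save and the file names are purely illustrative, not part of the Iris API. With a real dask.distributed.Client created beforehand, the same dask.compute(*delayeds) call would dispatch the work to the cluster's workers instead.

```python
import dask

# Illustrative stand-in for one delayed netCDF save.
@dask.delayed
def fake_save(filepath):
    return f"saved {filepath}"

delayeds = [fake_save(name) for name in ("a.nc", "b.nc")]

# With no Client, this uses a local scheduler; with a
# dask.distributed.Client active, the very same call runs
# the saves on the cluster's workers.
results = dask.compute(*delayeds)
print(results)  # ('saved a.nc', 'saved b.nc')
```

The key design point is that Iris hands back standard Dask objects rather than managing its own parallelism, so any scheduler Dask supports can drive the saves.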

We’re so keen for the community to leverage the benefits of this new feature that we’ve brought this release forward by several months. As a result, this minor release of Iris is intentionally light in content. However, there are still some other goodies for you to enjoy, detailed in the sections below.

As always, get in touch with us on GitHub, particularly if you have any feedback with regards to delayed saving, or have any issues or feature requests for improving Iris. Enjoy!

v3.6.1 (26 June 2023)

v3.6.1 Patches

📢 Announcements

Welcome and congratulations to @sloosvel who made their first contribution to Iris! 🎉

The patches in this release of Iris include:

Features

  1. @rcomer rewrote broadcast_to_shape() so it now handles lazy data. This pull-request has been included to support PR #5341. (PR #5307) [pre-v3.7.0]

🐛 Bugs Fixed

  1. @stephenworsley fixed convert_units() to allow unit conversion of lazy data when using a Distributed scheduler. (Issue #5347, PR #5349)

  2. @schlunma fixed a bug in the concatenation of cubes with aux factories which could lead to a KeyError due to dependencies that have not been properly updated. (Issue #5339, PR #5340)

  3. @schlunma fixed a bug which realized all weights during weighted aggregation. Now weighted aggregation is fully lazy again. (Issue #5338, PR #5341)

🚀 Performance Enhancements

  1. @sloosvel improved concatenate_cube() and concatenate() to ensure that lazy auxiliary coordinate points and bounds are not realized. This change now allows cubes with high-resolution auxiliary coordinates to concatenate successfully whilst using a minimal in-core memory footprint. (Issue #5115, PR #5142)

Note that the contribution labelled pre-v3.7.0 above is part of the forthcoming Iris v3.7.0 release, but needed to be included in this patch release.

📢 Announcements

  1. @bjlittle added the community Contributor Covenant code of conduct. (PR #5291)

✨ Features

  1. @pp-mo and @lbdreyer supported delayed saving of lazy data, when writing to the netCDF file format. See delayed netCDF saves. Also with significant input from @fnattino. (PR #5191)

  2. @rcomer tweaked binary operations so that dask arrays may safely be passed to arithmetic operations and mask_cube(). (PR #4929)
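The Dask side of that contract can be seen without Iris itself: arithmetic on a dask array builds a new lazy graph rather than computing values immediately, which is what allows cube arithmetic and mask_cube() to accept such arrays safely. A minimal sketch (the array shape and chunking here are arbitrary examples, not anything Iris requires):

```python
import dask.array as da
import numpy as np

# A small lazy array; shape and chunks are arbitrary for illustration.
lazy = da.arange(6, chunks=3).reshape(2, 3)

# Arithmetic produces another lazy dask array, not a computed result.
result = lazy * 2 + 1
print(type(result).__name__)  # Array (still lazy)

# Values are only realised on request.
print(result.compute())
```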

🐛 Bugs Fixed

  1. @rcomer enabled automatic replacement of a Matplotlib Axes with a Cartopy GeoAxes when the Axes is on a SubFigure. (Issue #5282, PR #5288)

💣 Incompatible Changes

  1. N/A

🚀 Performance Enhancements

  1. N/A

🔥 Deprecations

  1. N/A

🔗 Dependencies

  1. @rcomer and @bjlittle (reviewer) added testing support for Python 3.11. (PR #5226)

  2. @rcomer dropped support for Python 3.8, in accordance with the NEP29 recommendations. (PR #5226)

  3. @trexfeathers introduced the libnetcdf !=4.9.1 and numpy !=1.24.3 pins. (PR #5274)

📚 Documentation

  1. @tkknight migrated to sphinx-design over the legacy sphinx-panels. (PR #5127)

  2. @tkknight updated the make target for help and added livehtml to auto generate the documentation when changes are detected during development. (PR #5258)

  3. @tkknight updated the Installing a Development Version from a Git Checkout instructions to use pip. (PR #5273)

  4. @tkknight removed the legacy custom sphinx extensions that generate the API documentation, in favour of a less complex approach via sphinx-apidoc. (PR #5264)

  5. @trexfeathers re-wrote the Releases documentation for clarity, and wrote a step-by-step Release Do-Nothing Script for the release process. (PR #5134)

  6. @trexfeathers and @tkknight added a dark-mode friendly logo. (PR #5278)

💼 Internal

  1. @bjlittle added the codespell pre-commit git-hook to automate spell checking within the code-base. (PR #5186)

  2. @bjlittle and @trexfeathers (reviewer) added a check-manifest GitHub Action and pre-commit git-hook to automate verification of assets bundled within an sdist and binary wheel of our scitools-iris PyPI package. (PR #5259)

  3. @rcomer removed a now redundant copying workaround from Resolve testing. (PR #5267)

  4. @bjlittle and @trexfeathers (reviewer) migrated setup.cfg to pyproject.toml, as motivated by PEP-0621. (PR #5262)

  5. @bjlittle adopted pypa/build recommended best practice to build a binary wheel from the sdist. (PR #5266)

  6. @trexfeathers enabled on-demand benchmarking of Pull Requests; see here. (PR #5286)