ruff compliance for D205. #5681

Merged
26 commits merged on Jan 12, 2024

Changes from all commits
ff9813f
ruff complliance for D205 (wip)
tkknight Jan 6, 2024
501f581
wip
tkknight Jan 6, 2024
029a819
Merge branch 'SciTools:main' into ruff_D205_2
tkknight Jan 6, 2024
4e7f4f6
wip
tkknight Jan 8, 2024
685f055
Merge remote-tracking branch 'upstream/main' into ruff_D205_2
tkknight Jan 8, 2024
c69e064
wip
tkknight Jan 8, 2024
9659dfc
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Jan 8, 2024
035916d
various minor fixes.
tkknight Jan 8, 2024
50c7b28
Merge branch 'ruff_D205_2' of github.com:tkknight/iris into ruff_D205_2
tkknight Jan 8, 2024
8009a6d
fix doctest.
tkknight Jan 8, 2024
2271da2
gallery noqa and minor fixes.
tkknight Jan 8, 2024
734fc14
removed comments
tkknight Jan 8, 2024
13ba33d
Merge branch 'main' into ruff_D205_2
tkknight Jan 10, 2024
6818f85
Merge branch 'main' into ruff_D205_2
tkknight Jan 11, 2024
8efd461
Update lib/iris/_data_manager.py
tkknight Jan 12, 2024
c752826
Update lib/iris/_lazy_data.py
tkknight Jan 12, 2024
31e8b36
Update lib/iris/_merge.py
tkknight Jan 12, 2024
12d1c1a
Update lib/iris/_representation/cube_printout.py
tkknight Jan 12, 2024
c128914
Update lib/iris/_representation/cube_printout.py
tkknight Jan 12, 2024
ad4a8c2
Update lib/iris/analysis/_interpolation.py
tkknight Jan 12, 2024
2364ca6
Update lib/iris/coords.py
tkknight Jan 12, 2024
7bb2a21
Update lib/iris/experimental/ugrid/mesh.py
tkknight Jan 12, 2024
03d03c7
Apply suggestions from code review
tkknight Jan 12, 2024
a0d3cb8
Merge remote-tracking branch 'upstream/main' into ruff_D205_2
tkknight Jan 12, 2024
d168d07
minor tweaks.
tkknight Jan 12, 2024
5af2c97
Merge branch 'main' into ruff_D205_2
tkknight Jan 12, 2024
1 change: 0 additions & 1 deletion .ruff.toml
@@ -29,7 +29,6 @@ lint.ignore = [
     "D102", # Missing docstring in public method
     # (D-3) Temporary, before an initial review, either fix ocurrences or move to (2).
     "D103", # Missing docstring in public function
-    "D205", # 1 blank line required between summary line and description
     "D401", # First line of docstring should be in imperative mood: ...
 
     # pyupgrade (UP)
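For readers unfamiliar with the rule being enabled above: D205 requires exactly one blank line between a docstring's summary line and the rest of its description. A minimal sketch of the pattern — the functions and the toy checker below are hypothetical illustrations, not ruff's real implementation or code from this PR:

```python
def non_compliant():
    """Summary line that runs straight into
    the description with no separating blank line.
    """


def compliant():
    """Summary line on its own.

    Description starts after exactly one blank line.
    """


def has_summary_gap(docstring: str) -> bool:
    """Toy check for the D205 pattern: the second line must be blank."""
    lines = docstring.strip().splitlines()
    if len(lines) < 2:
        return True  # a one-line docstring cannot violate D205
    return lines[1].strip() == ""


print(has_summary_gap(non_compliant.__doc__))  # False -> D205 would flag this
print(has_summary_gap(compliant.__doc__))      # True  -> compliant
```

This is why most hunks in this PR follow the same shape: a new short summary line, a blank line, then the original wording demoted to the description.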
9 changes: 6 additions & 3 deletions benchmarks/asv_delegated_conda.py
@@ -2,8 +2,9 @@
 #
 # This file is part of Iris and is released under the BSD license.
 # See LICENSE in the root of the repository for full licensing details.
-"""ASV plug-in providing an alternative :class:`asv.plugins.conda.Conda`
-subclass that manages the Conda environment via custom user scripts.
+"""ASV plug-in providing an alternative :class:`asv.plugins.conda.Conda` subclass.
+
+Manages the Conda environment via custom user scripts.
 """
@@ -42,7 +43,9 @@ def __init__(
         requirements: dict,
         tagged_env_vars: dict,
     ) -> None:
-        """Parameters
+        """__init__.
+
+        Parameters
         ----------
         conf : Config instance
4 changes: 3 additions & 1 deletion benchmarks/benchmarks/__init__.py
@@ -36,7 +36,9 @@ def disable_repeat_between_setup(benchmark_object):
 
 
 class TrackAddedMemoryAllocation:
-    """Context manager which measures by how much process resident memory grew,
+    """Measures by how much process resident memory grew, during execution.
+
+    Context manager which measures by how much process resident memory grew,
     during execution of its enclosed code block.
 
     Obviously limited as to what it actually measures : Relies on the current
4 changes: 3 additions & 1 deletion benchmarks/benchmarks/aux_factory.py
@@ -16,7 +16,9 @@ class FactoryCommon:
     # * make class an ABC
     # * remove NotImplementedError
     # * combine setup_common into setup
-    """A base class running a generalised suite of benchmarks for any factory.
+    """Run a generalised suite of benchmarks for any factory.
+
+    A base class running a generalised suite of benchmarks for any factory.
     Factory to be specified in a subclass.
 
     ASV will run the benchmarks within this class for any subclasses.
4 changes: 3 additions & 1 deletion benchmarks/benchmarks/coords.py
@@ -23,7 +23,9 @@ class CoordCommon:
     # * make class an ABC
     # * remove NotImplementedError
     # * combine setup_common into setup
-    """A base class running a generalised suite of benchmarks for any coord.
+    """Run a generalised suite of benchmarks for any coord.
+
+    A base class running a generalised suite of benchmarks for any coord.
     Coord to be specified in a subclass.
 
     ASV will run the benchmarks within this class for any subclasses.
9 changes: 5 additions & 4 deletions benchmarks/benchmarks/cperf/equality.py
@@ -8,8 +8,11 @@
 
 
 class EqualityMixin(SingleDiagnosticMixin):
-    r"""Uses :class:`SingleDiagnosticMixin` as the realistic case will be comparing
-    :class:`~iris.cube.Cube`\\ s that have been loaded from file.
+    r"""Use :class:`SingleDiagnosticMixin` as the realistic case.
+
+    Uses :class:`SingleDiagnosticMixin` as the realistic case will be comparing
+    :class:`~iris.cube.Cube`\\ s that have been loaded from file.
+
     """
 
     # Cut down the parent parameters.
@@ -23,9 +26,7 @@ def setup(self, file_type, three_d=False, three_times=False):
 
 @on_demand_benchmark
 class CubeEquality(EqualityMixin):
-    r"""Benchmark time and memory costs of comparing LFRic and UM
-    :class:`~iris.cube.Cube`\\ s.
-    """
+    r"""Benchmark time & memory costs of comparing LFRic & UM :class:`~iris.cube.Cube`\\ s."""
 
     def _comparison(self):
         _ = self.cube == self.other_cube
3 changes: 2 additions & 1 deletion benchmarks/benchmarks/cperf/load.py
@@ -10,7 +10,8 @@
 @on_demand_benchmark
 class SingleDiagnosticLoad(SingleDiagnosticMixin):
     def time_load(self, _, __, ___):
-        """The 'real world comparison'
+        """The 'real world comparison'.
+
         * UM coords are always realised (DimCoords).
         * LFRic coords are not realised by default (MeshCoords).
 
1 change: 1 addition & 0 deletions benchmarks/benchmarks/cperf/save.py
@@ -14,6 +14,7 @@
 @on_demand_benchmark
 class NetcdfSave:
     """Benchmark time and memory costs of saving ~large-ish data cubes to netcdf.
+
     Parametrised by file type.
 
     """
4 changes: 3 additions & 1 deletion benchmarks/benchmarks/cube.py
@@ -28,7 +28,9 @@ class ComponentCommon:
     # * make class an ABC
     # * remove NotImplementedError
     # * combine setup_common into setup
-    """A base class running a generalised suite of benchmarks for cubes that
+    """Run a generalised suite of benchmarks for cubes.
+
+    A base class running a generalised suite of benchmarks for cubes that
     include a specified component (e.g. Coord, CellMeasure etc.). Component to
     be specified in a subclass.
 
4 changes: 3 additions & 1 deletion benchmarks/benchmarks/experimental/ugrid/__init__.py
@@ -15,7 +15,9 @@
 
 
 class UGridCommon:
-    """A base class running a generalised suite of benchmarks for any ugrid object.
+    """Run a generalised suite of benchmarks for any ugrid object.
+
+    A base class running a generalised suite of benchmarks for any ugrid object.
     Object to be specified in a subclass.
 
     ASV will run the benchmarks within this class for any subclasses.
20 changes: 14 additions & 6 deletions benchmarks/benchmarks/experimental/ugrid/regions_combine.py
@@ -2,14 +2,18 @@
 #
 # This file is part of Iris and is released under the BSD license.
 # See LICENSE in the root of the repository for full licensing details.
-"""Benchmarks stages of operation of the function
+"""Benchmarks stages of operation.
+
+Benchmarks stages of operation of the function
 :func:`iris.experimental.ugrid.utils.recombine_submeshes`.
 
 Where possible benchmarks should be parameterised for two sizes of input data:
+
 * minimal: enables detection of regressions in parts of the run-time that do
   NOT scale with data size.
+
 * large: large enough to exclusively detect regressions in parts of the
   run-time that scale with data size.
 
 """
 import os
@@ -193,10 +197,13 @@ def track_addedmem_compute_data(self, n_cubesphere):
 
 
 class CombineRegionsSaveData(MixinCombineRegions):
-    """Test saving *only*, having replaced the input cube data with 'imaginary'
+    """Test saving *only*.
+
+    Test saving *only*, having replaced the input cube data with 'imaginary'
     array data, so that input data is not loaded from disk during the save
     operation.
 
+
     """
 
     def time_save(self, n_cubesphere):
@@ -219,6 +226,7 @@ def track_filesize_saved(self, n_cubesphere):
 
 class CombineRegionsFileStreamedCalc(MixinCombineRegions):
     """Test the whole cost of file-to-file streaming.
+
     Uses the combined cube which is based on lazy data loading from the region
     cubes on disk.
     """
12 changes: 9 additions & 3 deletions benchmarks/benchmarks/generate_data/ugrid.py
@@ -14,7 +14,9 @@
 
 
 def generate_cube_like_2d_cubesphere(n_cube: int, with_mesh: bool, output_path: str):
-    """Construct and save to file an LFRIc cubesphere-like cube for a given
+    """Construct and save to file an LFRIc cubesphere-like cube.
+
+    Construct and save to file an LFRIc cubesphere-like cube for a given
     cubesphere size, *or* a simpler structured (UM-like) cube of equivalent
     size.
 
@@ -54,7 +56,9 @@
 
 
 def make_cube_like_2d_cubesphere(n_cube: int, with_mesh: bool):
-    """Generate an LFRIc cubesphere-like cube for a given cubesphere size,
+    """Generate an LFRIc cubesphere-like cube.
+
+    Generate an LFRIc cubesphere-like cube for a given cubesphere size,
     *or* a simpler structured (UM-like) cube of equivalent size.
 
     All the cube data, coords and mesh content are LAZY, and produced without
@@ -155,7 +159,9 @@ def _external(xy_dims_, save_path_):
 
 
 def make_cubesphere_testfile(c_size, n_levels=0, n_times=1):
-    """Build a C<c_size> cubesphere testfile in a given directory, with a standard naming.
+    """Build a C<c_size> cubesphere testfile in a given directory.
+
+    Build a C<c_size> cubesphere testfile in a given directory, with a standard naming.
     If n_levels > 0 specified: 3d file with the specified number of levels.
     Return the file path.
 
3 changes: 1 addition & 2 deletions benchmarks/benchmarks/import_iris.py
@@ -31,8 +31,7 @@
 class Iris:
     @staticmethod
     def _import(module_name, reset_colormaps=False):
-        """Have experimented with adding sleep() commands into the imported
-        modules.
+        """Have experimented with adding sleep() commands into the imported modules.
 
         The results reveal:
 
6 changes: 5 additions & 1 deletion benchmarks/benchmarks/sperf/combine_regions.py
@@ -195,7 +197,9 @@ def track_addedmem_compute_data(self, n_cubesphere):
 
 @on_demand_benchmark
 class SaveData(Mixin):
-    """Test saving *only*, having replaced the input cube data with 'imaginary'
+    """Test saving *only*.
+
+    Test saving *only*, having replaced the input cube data with 'imaginary'
     array data, so that input data is not loaded from disk during the save
     operation.
 
@@ -217,8 +219,10 @@ def track_filesize_saved(self, n_cubesphere):
 
 @on_demand_benchmark
 class FileStreamedCalc(Mixin):
     """Test the whole cost of file-to-file streaming.
+
     Uses the combined cube which is based on lazy data loading from the region
     cubes on disk.
+
     """
 
     def setup(self, n_cubesphere, imaginary_data=False, create_result_cube=True):
6 changes: 4 additions & 2 deletions benchmarks/benchmarks/sperf/equality.py
@@ -9,8 +9,10 @@
 
 @on_demand_benchmark
 class CubeEquality(FileMixin):
-    r"""Benchmark time and memory costs of comparing :class:`~iris.cube.Cube`\\ s
-    with attached :class:`~iris.experimental.ugrid.mesh.Mesh`\\ es.
+    r"""Benchmark time and memory costs.
+
+    Benchmark time and memory costs of comparing :class:`~iris.cube.Cube`\\ s
+    with attached :class:`~iris.experimental.ugrid.mesh.Mesh`\\ es.
 
     Uses :class:`FileMixin` as the realistic case will be comparing
     :class:`~iris.cube.Cube`\\ s that have been loaded from file.
5 changes: 3 additions & 2 deletions docs/gallery_code/general/plot_SOI_filtering.py
@@ -1,4 +1,5 @@
-"""Applying a Filter to a Time-Series
+"""
+Applying a Filter to a Time-Series
 ==================================
 
 This example demonstrates low pass filtering a time-series by applying a
@@ -17,7 +18,7 @@
 Trenberth K. E. (1984) Signal Versus Noise in the Southern Oscillation.
 Monthly Weather Review, Vol 112, pp 326-332
 
-""" # noqa: D400
+""" # noqa: D205, D212, D400
 
 import matplotlib.pyplot as plt
 import numpy as np
5 changes: 3 additions & 2 deletions docs/gallery_code/general/plot_anomaly_log_colouring.py
@@ -1,4 +1,5 @@
-"""Colouring Anomaly Data With Logarithmic Scaling
+"""
+Colouring Anomaly Data With Logarithmic Scaling
 ===============================================
 
 In this example, we need to plot anomaly data where the values have a
@@ -22,7 +23,7 @@
 and :obj:`matplotlib.pyplot.pcolormesh`).
 See also: https://en.wikipedia.org/wiki/False_color#Pseudocolor.
 
-""" # noqa: D400
+""" # noqa: D205, D212, D400
 
 import cartopy.crs as ccrs
 import matplotlib.colors as mcols
5 changes: 3 additions & 2 deletions docs/gallery_code/general/plot_coriolis.py
@@ -1,11 +1,12 @@
-"""Deriving the Coriolis Frequency Over the Globe
+"""
+Deriving the Coriolis Frequency Over the Globe
 ==============================================
 
 This code computes the Coriolis frequency and stores it in a cube with
 associated metadata. It then plots the Coriolis frequency on an orthographic
 projection.
 
-""" # noqa: D400
+""" # noqa: D205, D212, D400
 
 import cartopy.crs as ccrs
 import matplotlib.pyplot as plt
5 changes: 3 additions & 2 deletions docs/gallery_code/general/plot_cross_section.py
@@ -1,10 +1,11 @@
-"""Cross Section Plots
+"""
+Cross Section Plots
 ===================
 
 This example demonstrates contour plots of a cross-sectioned multi-dimensional
 cube which features a hybrid height vertical coordinate system.
 
-""" # noqa: D400
+""" # noqa: D205, D212, D400
 
 import matplotlib.pyplot as plt
 
24 changes: 12 additions & 12 deletions docs/gallery_code/general/plot_custom_aggregation.py
@@ -1,4 +1,5 @@
-"""Calculating a Custom Statistic
+"""
+Calculating a Custom Statistic
 ==============================
 
 This example shows how to define and use a custom
@@ -11,7 +12,7 @@
 over North America, and we want to calculate in how many years these exceed a
 certain temperature over a spell of 5 years or more.
 
-""" # noqa: D400
+""" # noqa: D205, D212, D400
 
 import matplotlib.pyplot as plt
 import numpy as np
@@ -27,25 +28,24 @@
 # Note: in order to meet the requirements of iris.analysis.Aggregator, it must
 # do the calculation over an arbitrary (given) data axis.
 def count_spells(data, threshold, axis, spell_length):
-    """Function to calculate the number of points in a sequence where the value
+    """Calculate the number of points in a sequence.
+
+    Function to calculate the number of points in a sequence where the value
     has exceeded a threshold value for at least a certain number of timepoints.
 
     Generalised to operate on multiple time sequences arranged on a specific
     axis of a multidimensional array.
 
-    Args:
-
-    * data (array):
+    Parameters
+    ----------
+    data : array
         raw data to be compared with value threshold.
-
-    * threshold (float):
+    threshold : float
         threshold point for 'significant' datapoints.
-
-    * axis (int):
+    axis : int
         number of the array dimension mapping the time sequences.
         (Can also be negative, e.g. '-1' means last dimension)
-
-    * spell_length (int):
+    spell_length : int
        number of consecutive times at which value > threshold to "count".
 
     """
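The hunk above also converts the parameter list from the old `Args:` style to numpydoc format (`Parameters` heading, dashed underline, `name : type` entries). A minimal sketch of the resulting layout, using a hypothetical function rather than the gallery's `count_spells`:

```python
def count_exceedances(data, threshold):
    """Count points strictly above a threshold.

    Parameters
    ----------
    data : iterable of float
        Raw data to be compared with the threshold.
    threshold : float
        Cut-off above which a point is counted.

    Returns
    -------
    int
        Number of points strictly above ``threshold``.
    """
    return sum(1 for x in data if x > threshold)


print(count_exceedances([0.5, 1.5, 2.5], 1.0))  # 2
```

Note the numpydoc form drops the blank lines between parameter entries, which is why this hunk removes one more line than it adds.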