DOCS: numpydocs 5 #5715

Merged: 1 commit, Jan 25, 2024
12 changes: 6 additions & 6 deletions lib/iris/_concatenate.py
@@ -488,7 +488,7 @@ def _coordinate_differences(self, other, attr, reason="metadata"):
attr : str
The _CubeSignature attribute within which differences exist
between `self` and `other`.
- reason : str
+ reason : str, default="metadata"
The reason to give for mismatch (function is normally, but not
always, testing metadata)

@@ -854,22 +854,22 @@ def register(
axis : optional
Seed the dimension of concatenation for the :class:`_ProtoCube`
rather than rely on negotiation with source-cubes.
- error_on_mismatch : bool, optional
+ error_on_mismatch : bool, default=False
If True, raise an informative error if registration fails.
- check_aux_coords : bool, optional
+ check_aux_coords : bool, default=False
Checks if the points and bounds of auxiliary coordinates of the
cubes match. This check is not applied to auxiliary coordinates
that span the dimension the concatenation is occurring along.
Defaults to False.
- check_cell_measures : bool, optional
+ check_cell_measures : bool, default=False
Checks if the data of cell measures of the cubes match. This check
is not applied to cell measures that span the dimension the
concatenation is occurring along. Defaults to False.
- check_ancils : bool, optional
+ check_ancils : bool, default=False
Checks if the data of ancillary variables of the cubes match. This
check is not applied to ancillary variables that span the dimension
the concatenation is occurring along. Defaults to False.
- check_derived_coords : bool, optional
+ check_derived_coords : bool, default=False
Checks if the points and bounds of derived coordinates of the cubes
match. This check is not applied to derived coordinates that span
the dimension the concatenation is occurring along. Note that
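For context, the check_* flags documented above surface on the public `CubeList.concatenate` path. A minimal usage sketch (the file pattern is hypothetical, and keyword availability depends on the Iris version in use):

    import iris

    # Load the per-file 'raw' cubes without merging/concatenating them.
    cubes = iris.load_raw("air_temp_part_*.nc")  # hypothetical file pattern
    # Concatenate, insisting that auxiliary coordinates match off the
    # concatenation axis.
    result = cubes.concatenate(check_aux_coords=True)
    print(result)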
2 changes: 1 addition & 1 deletion lib/iris/_data_manager.py
@@ -273,7 +273,7 @@ def copy(self, data=None):

Parameters
----------
- data :
+ data : optional
Replace the data of the copy with this data.

Returns
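A minimal sketch of the copy behaviour documented above, using the private `DataManager` directly (internal API, shown for illustration only; replacement data must be shape-compatible):

    import numpy as np
    from iris._data_manager import DataManager  # private/internal API

    dm = DataManager(np.arange(4.0))
    dm_copy = dm.copy(data=np.zeros(4))  # copy, but with replacement data
    print(dm_copy.data)                  # -> [0. 0. 0. 0.]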
8 changes: 4 additions & 4 deletions lib/iris/_lazy_data.py
@@ -72,12 +72,12 @@ def _optimum_chunksize_internals(
Pre-existing chunk shape of the target data.
shape : tuple of int
The full array shape of the target data.
- limit : int
+ limit : int, optional
The 'ideal' target chunk size, in bytes. Default from
:mod:`dask.config`.
dtype : np.dtype
Numpy dtype of target data.
- dims_fixed : list of bool
+ dims_fixed : list of bool, optional
If set, a list of values equal in length to 'chunks' or 'shape'.
'True' values indicate a dimension that can not be changed, i.e. that
element of the result must equal the corresponding value in 'chunks' or
@@ -232,14 +232,14 @@ def as_lazy_data(
This will be converted to a :class:`dask.array.Array`.
chunks : list of int, optional
If present, a source chunk shape, e.g. for a chunked netcdf variable.
- asarray : bool, optional
+ asarray : bool, default=False
If True, then chunks will be converted to instances of `ndarray`.
Set to False (default) to pass passed chunks through unchanged.
dims_fixed : list of bool, optional
If set, a list of values equal in length to 'chunks' or data.ndim.
'True' values indicate a dimension which can not be changed, i.e. the
result for that index must equal the value in 'chunks' or data.shape.
- dask_chunking : bool, optional
+ dask_chunking : bool, default=False
If True, Iris chunking optimisation will be bypassed, and dask's default
chunking will be used instead. Including a value for chunks while dask_chunking
is set to True will result in a failure.
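A minimal sketch of `as_lazy_data` with a seeded chunk shape (internal helper, shown for illustration; Iris may rescale the chunks toward its optimum chunk size, and the exact keyword set varies between versions):

    import numpy as np
    from iris._lazy_data import as_lazy_data  # internal helper

    data = np.arange(12.0).reshape(3, 4)
    lazy = as_lazy_data(data, chunks=(1, 4))  # seed a source chunk shape
    print(lazy.chunks)                        # dask chunk layout actually chosen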
18 changes: 9 additions & 9 deletions lib/iris/_merge.py
@@ -68,7 +68,7 @@ class _CoordMetaData(
bounds_dtype :
The bounds data :class:`numpy.dtype` of an associated coordinate.
None otherwise.
- kwargs:
+ kwargs :
A dictionary of key/value pairs required to create a coordinate.

"""
@@ -162,11 +162,11 @@ class _CoordPayload(namedtuple("CoordPayload", ["scalar", "vector", "factory_def

Parameters
----------
- scalar:
+ scalar :
A :class:`_ScalarCoordPayload` instance.
- vector:
+ vector :
A :class:`_VectorCoordPayload` instance.
- factory_defns:
+ factory_defns :
A list of :class:`_FactoryDefn` instances.

"""
@@ -637,7 +637,7 @@ def _separable(name, indexes):

Returns
-------
- tupl
+ tuple
A tuple containing the set of separable and inseparable
candidate dimensions.

@@ -777,7 +777,7 @@ def _is_dependent(dependent, independent, positions, function_mapping=None):
positions :
A list containing a dictionary of candidate dimension key to
scalar value pairs for each source-cube.
- function_mapping : optional, default=None
+ function_mapping : optional
A dictionary that enumerates a valid functional relationship
between the dependent candidate dimension and the independent
candidate dimension/s.
@@ -1052,7 +1052,7 @@ def derive_space(groups, relation_matrix, positions, function_matrix=None):
positions :
A list containing a dictionary of candidate dimension key to
scalar value pairs for each source-cube.
- function_matrix : optional, default=None
+ function_matrix : optional
The function mapping dictionary for each candidate dimension that
participates in a functional relationship.

@@ -1186,7 +1186,7 @@ def merge(self, unique=True):

Parameters
----------
- unique :
+ unique : bool, default=True
If True, raises `iris.exceptions.DuplicateDataError` if
duplicate cubes are detected.

@@ -1294,7 +1294,7 @@ def register(self, cube, error_on_mismatch=False):
cube :
Candidate :class:`iris.cube.Cube` to be associated with
this :class:`ProtoCube`.
- error_on_mismatch :bool, optional, default=False
+ error_on_mismatch :bool, default=False
If True, raise an informative
:class:`~iris.exceptions.MergeError` if registration fails.

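For context, the `unique` flag documented above is also exposed on the public `CubeList.merge`. A minimal sketch (hypothetical file name):

    import iris

    raw = iris.load_raw("fields_per_timestep.nc")  # e.g. one 2-D field per time value
    merged = raw.merge(unique=True)  # raises DuplicateDataError if duplicates exist
    print(merged)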
2 changes: 1 addition & 1 deletion lib/iris/analysis/_area_weighted.py
@@ -29,7 +29,7 @@ def __init__(self, src_grid_cube, target_grid_cube, mdtol=1):
The :class:`~iris.cube.Cube` providing the source grid.
target_grid_cube : :class:`~iris.cube.Cube`
The :class:`~iris.cube.Cube` providing the target grid.
- mdtol : float, optional
+ mdtol : float, default=1
Tolerance of missing data. The value returned in each element of
the returned array will be masked if the fraction of masked data
exceeds mdtol. mdtol=0 means no missing data is tolerated while
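A minimal sketch of the public regridding path that uses this scheme, showing `mdtol` (file names are hypothetical; both grids need bounded, contiguous horizontal coordinates):

    import iris
    from iris.analysis import AreaWeighted

    src = iris.load_cube("source_grid.nc")  # hypothetical
    tgt = iris.load_cube("target_grid.nc")  # hypothetical
    # Mask a target cell if more than half of its area comes from masked source data.
    regridded = src.regrid(tgt, AreaWeighted(mdtol=0.5))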
6 changes: 3 additions & 3 deletions lib/iris/analysis/_interpolation.py
@@ -123,7 +123,7 @@ def get_xy_coords(cube, dim_coords=False):
----------
cube : :class:`iris.cube.Cube`
An instance of :class:`iris.cube.Cube`.
- dim_coords : bool, optional, default=False
+ dim_coords : bool, default=False
Set this to True to only return dimension coordinates. Defaults to
False.

@@ -482,7 +482,7 @@ def _points(self, sample_points, data, data_dims=None):
The data to interpolate - not necessarily the data from the cube
that was used to construct this interpolator. If the data has
fewer dimensions, then data_dims must be defined.
- data_dims : optional, default=None
+ data_dims : optional
The dimensions of the given data array in terms of the original
cube passed through to this interpolator's constructor. If None,
the data dimensions must map one-to-one onto the increasing
@@ -574,7 +574,7 @@ def __call__(self, sample_points, collapse_scalar=True):
A list of N iterables, where N is the number of coordinates
passed to the constructor.
[sample_values_for_coord_0, sample_values_for_coord_1, ...]
- collapse_scalar : bool, optional
+ collapse_scalar : bool, default=True
Whether to collapse the dimension of the scalar sample points
in the resulting cube. Default is True.

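For context, `collapse_scalar` reaches users through `Cube.interpolate`. A minimal sketch (hypothetical file name):

    import iris
    from iris.analysis import Linear

    cube = iris.load_cube("air_temperature.nc")
    # collapse_scalar defaults to True, so the length-1 latitude dimension is dropped.
    result = cube.interpolate([("latitude", 51.5)], Linear())
    print(result.summary(shorten=True))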
8 changes: 4 additions & 4 deletions lib/iris/analysis/_regrid.py
@@ -384,7 +384,7 @@ def __init__(self, src_grid_cube, target_grid_cube, weights=None):
The :class:`~iris.cube.Cube` providing the source grid.
tgt_grid_cube : :class:`~iris.cube.Cube`
The :class:`~iris.cube.Cube` providing the target grid.
- weights : optional, default=None
+ weights : optional
A :class:`numpy.ndarray` instance that defines the weights
for the grid cells of the source grid. Must have the same shape
as the data of the source grid.
@@ -636,9 +636,9 @@ def _regrid(
A 2-dimensional array of sample X values.
sample_grid_y :
A 2-dimensional array of sample Y values.
- method: str, optional
+ method: str, default="linear"
Either 'linear' or 'nearest'. The default method is 'linear'.
- extrapolation_mode : str, optional
+ extrapolation_mode : str, default="nanmask"
Must be one of the following strings:

* 'linear' - The extrapolation points will be calculated by
@@ -656,7 +656,7 @@

Returns
-------
- NumPu array
+ NumPy array
The regridded data as an N-dimensional NumPy array. The lengths
of the X and Y dimensions will now match those of the sample
grid.
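A minimal sketch of the public regrid call that drives these method/extrapolation_mode options (hypothetical file names; 'nanmask' fills points outside the source grid with NaN and masks them):

    import iris
    from iris.analysis import Linear

    src = iris.load_cube("source.nc")  # hypothetical
    tgt = iris.load_cube("target.nc")  # hypothetical
    regridded = src.regrid(tgt, Linear(extrapolation_mode="nanmask"))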
8 changes: 4 additions & 4 deletions lib/iris/analysis/cartography.py
@@ -205,7 +205,7 @@ def _xy_range(cube, mode=None):
----------
cube :
The cube for which to calculate xy extents.
- mode : optional, default=None
+ mode : optional
If the coordinate has bounds, set this to specify the
min/max calculation.
Set to iris.coords.POINT_MODE or iris.coords.BOUND_MODE.
@@ -397,7 +397,7 @@ def area_weights(cube, normalize=False):
----------
cube : :class:`iris.cube.Cube`
The cube to calculate area weights for.
- normalize : bool, optional, default=False
+ normalize : bool, default=False
If False, weights are grid cell areas. If True, weights are grid
cell areas divided by the total grid area.

@@ -605,10 +605,10 @@ def project(cube, target_proj, nx=None, ny=None):
An instance of the Cartopy Projection class, or an instance of
:class:`iris.coord_systems.CoordSystem` from which a projection
will be obtained.
- nx : optional, default=None
+ nx : optional
Desired number of sample points in the x direction for a domain
covering the globe.
- ny : optional, default=None
+ ny : optional
Desired number of sample points in the y direction for a domain
covering the globe.

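A minimal sketch of `area_weights` and `project` as documented above (hypothetical lat/lon cube; `area_weights` needs bounded horizontal coordinates):

    import cartopy.crs as ccrs
    import iris
    import iris.analysis.cartography as icart

    cube = iris.load_cube("air_temperature.nc")  # hypothetical
    cube.coord("latitude").guess_bounds()
    cube.coord("longitude").guess_bounds()

    weights = icart.area_weights(cube, normalize=True)  # cell areas / total area
    projected, extent = icart.project(cube, ccrs.Robinson(), nx=400, ny=200)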
13 changes: 5 additions & 8 deletions lib/iris/analysis/geometry.py
@@ -163,17 +163,14 @@ def geometry_area_weights(cube, geometry, normalize=False):
This function does not maintain laziness when called; it realises data.
See more at :doc:`/userguide/real_and_lazy_data`.

- Args:
-
- * cube (:class:`iris.cube.Cube`):
+ Parameters
+ ----------
+ cube : :class:`iris.cube.Cube`
A Cube containing a bounded, horizontal grid definition.
- * geometry (a shapely geometry instance):
+ geometry : shapely geometry instance
The geometry of interest. To produce meaningful results this geometry
must have a non-zero area. Typically a Polygon or MultiPolygon.
-
- Kwargs:
-
- * normalize:
+ normalize : bool, default=False
Calculate each individual cell weight as the cell area overlap between
the cell and the given shapely geometry divided by the total cell area.
Default is False.
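A minimal sketch of `geometry_area_weights` with the keywords documented above (hypothetical cube; note the function realises the cube's data):

    import iris
    import iris.analysis.geometry
    from shapely.geometry import Polygon

    cube = iris.load_cube("air_temperature.nc")  # hypothetical bounded lat/lon cube
    cube.coord("latitude").guess_bounds()
    cube.coord("longitude").guess_bounds()

    region = Polygon([(-10, 50), (2, 50), (2, 60), (-10, 60)])  # lon/lat box
    # Overlap fraction per cell rather than absolute overlap area.
    weights = iris.analysis.geometry.geometry_area_weights(cube, region, normalize=True)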