Run pydocstyle in pre-commit #6382

Merged (2 commits, Dec 12, 2022)

8 changes: 8 additions & 0 deletions .pre-commit-config.yaml
@@ -35,6 +35,14 @@ repos:
args: [--rcfile=.pylintrc]
files: ^pymc/
exclude: (?x)(pymc/_version.py)
- repo: https://github.com/PyCQA/pydocstyle
rev: 6.1.1
hooks:
- id: pydocstyle
args:
- --ignore=D100,D101,D102,D103,D104,D105,D107,D200,D202,D203,D204,D205,D209,D212,D213,D301,D400,D401,D403,D413,D415,D417
Member:

The convention arg is missing; we should be using numpy as the convention (which should remove the contradictory warnings happening at the same time).

Member:

e.g. D212 and D213 make no sense together.

Member (Author):

Just tried it, but with --convention one can't pass --ignore, and then we'd have hundreds of errors that would all need to be fixed in one go (unrealistic).

Member:

let's keep ignore then
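
A minimal sketch (not part of this PR) of the two layouts behind the codes mentioned above: D212 requires a multi-line docstring summary to start on the first line, while D213 requires it to start on the second line, so any multi-line docstring can satisfy only one of them.

    def first_line_summary():
        """Summary starts on the first line.

        Satisfies D212 but violates D213.
        """


    def second_line_summary():
        """
        Summary starts on the second line.

        Satisfies D213 but violates D212.
        """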

files: ^pymc/
exclude: ^pymc/tests/
- repo: https://github.com/MarcoGorelli/madforhooks
rev: 0.3.0
hooks:
2 changes: 1 addition & 1 deletion pymc/blocking.py
@@ -79,7 +79,7 @@ def rmap(
"""Map 1D concatenated array to a dictionary of variables in their original spaces.

Parameters
==========
----------
array
The array to map.
start_point
1 change: 0 additions & 1 deletion pymc/distributions/continuous.py
@@ -1593,7 +1593,6 @@ class LogNormal(PositiveContinuous):

Examples
--------

.. code-block:: python

# Example to show that we pass in only ``sigma`` or ``tau`` but not both.
1 change: 0 additions & 1 deletion pymc/distributions/discrete.py
@@ -1549,7 +1549,6 @@ class OrderedLogistic:

Examples
--------

.. code-block:: python

# Generate data for a simple 1 dimensional example problem
24 changes: 13 additions & 11 deletions pymc/distributions/dist_math.py
@@ -355,17 +355,19 @@ def grad(self, inp, grads):
def random_choice(p, size):
"""Return draws from categorical probability functions

Args:
p: array
Probability of each class. If p.ndim > 1, the last axis is
interpreted as the probability of each class, and numpy.random.choice
is iterated for every other axis element.
size: int or tuple
Shape of the desired output array. If p is multidimensional, size
should broadcast with p.shape[:-1].

Returns:
random sample: array
Parameters
----------
p : array
Probability of each class. If p.ndim > 1, the last axis is
interpreted as the probability of each class, and numpy.random.choice
is iterated for every other axis element.
size : int or tuple
Shape of the desired output array. If p is multidimensional, size
should broadcast with p.shape[:-1].

Returns
-------
random_sample : array

"""
k = p.shape[-1]
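The reworked docstring above says that, for multidimensional p, numpy.random.choice is iterated over every leading axis; a small self-contained sketch of that behaviour (illustrative only, not PyMC's implementation):

    import numpy as np

    p = np.array([[0.1, 0.9],
                  [0.6, 0.4]])              # last axis holds the class probabilities
    k = p.shape[-1]
    flat_p = p.reshape(-1, k)
    draws = np.array([np.random.choice(k, p=row) for row in flat_p])
    draws = draws.reshape(p.shape[:-1])     # one draw per leading element, e.g. array([1, 0])
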
3 changes: 1 addition & 2 deletions pymc/distributions/distribution.py
@@ -413,7 +413,7 @@ def _get_measurable_outputs_symbolic_random_variable(op, node):
@node_rewriter([SymbolicRandomVariable])
def inline_symbolic_random_variable(fgraph, node):
"""
This optimization expands the internal graph of a SymbolicRV when obtaining the logp
Optimization that expands the internal graph of a SymbolicRV when obtaining the logp
graph, if the flag `inline_logprob` is True.
"""
op = node.op
@@ -828,7 +828,6 @@ class CustomDist:

Examples
--------

Create a CustomDist that wraps a black-box logp function. This variable cannot be
used in prior or posterior predictive sampling because no random function was provided

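The CustomDist example being edited here is collapsed in this view; a hedged sketch of the kind of usage the surrounding text describes, with illustrative names and values that are not taken from the PR:

    import pymc as pm

    def black_box_logp(value, mu):
        # An unnormalised Gaussian log-density supplied as a plain Python function.
        return -0.5 * (value - mu) ** 2

    with pm.Model():
        mu = pm.Normal("mu", 0.0, 1.0)
        pm.CustomDist("obs", mu, logp=black_box_logp, observed=[0.1, -0.3, 0.2])

As the docstring notes, without a random function such a variable cannot be used in prior or posterior predictive sampling.
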
1 change: 0 additions & 1 deletion pymc/distributions/multivariate.py
@@ -796,7 +796,6 @@ class OrderedMultinomial:

Examples
--------

.. code-block:: python

# Generate data for a simple 1 dimensional example problem
1 change: 0 additions & 1 deletion pymc/distributions/transforms.py
@@ -240,7 +240,6 @@ class Interval(IntervalTransform):

Examples
--------

Create an interval transform between -1 and +1

.. code-block:: python
6 changes: 4 additions & 2 deletions pymc/distributions/truncated.py
@@ -30,8 +30,10 @@


class TruncatedRV(SymbolicRandomVariable):
"""An `Op` constructed from an PyTensor graph that represents a truncated univariate
random variable."""
"""
An `Op` constructed from an PyTensor graph
that represents a truncated univariate random variable.
"""

default_output = 1
base_rv_op = None
3 changes: 1 addition & 2 deletions pymc/gp/util.py
@@ -188,7 +188,7 @@ def plot_gp_dist(
):
"""A helper function for plotting 1D GP posteriors from trace

Parameters
Parameters
----------
ax: axes
Matplotlib axes.
@@ -213,7 +213,6 @@

Returns
-------

ax: Matplotlib axes
"""
import matplotlib.pyplot as plt
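A hedged usage sketch for the helper touched above, based on the parameters listed in its docstring; the data here is synthetic and purely illustrative:

    import matplotlib.pyplot as plt
    import numpy as np
    from pymc.gp.util import plot_gp_dist

    x = np.linspace(0, 1, 50)
    # Fake "posterior" draws: 200 noisy samples of a sine curve, shape (n_samples, len(x)).
    samples = np.sin(2 * np.pi * x) + 0.1 * np.random.randn(200, 50)
    fig, ax = plt.subplots()
    plot_gp_dist(ax, samples, x)
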
2 changes: 1 addition & 1 deletion pymc/logprob/abstract.py
@@ -194,7 +194,7 @@ def assign_custom_measurable_outputs(
`factorized_joint_logprob`.

Parameters
==========
----------
node
The node to recreate with a new cloned `Op`.
measurable_outputs_fn
4 changes: 2 additions & 2 deletions pymc/logprob/joint_logprob.py
@@ -95,7 +95,7 @@ def factorized_joint_logprob(


Parameters
==========
----------
rv_values
A ``dict`` of variables that maps stochastic elements
(e.g. `RandomVariable`\s) to symbolic `Variable`\s representing their
@@ -111,7 +111,7 @@
etc.)

Returns
=======
-------
A ``dict`` that maps each value variable to the log-probability factor derived
from the respective `RandomVariable`.

6 changes: 3 additions & 3 deletions pymc/logprob/rewriting.py
@@ -67,7 +67,7 @@


class NoCallbackEquilibriumDB(EquilibriumDB):
r"""This `EquilibriumDB` doesn't hide its exceptions.
r"""An `EquilibriumDB` that doesn't hide its exceptions.

By setting `failure_callback` to ``None`` in the `EquilibriumGraphRewriter`\s
that `EquilibriumDB` generates, we're able to directly emit the desired
@@ -102,7 +102,7 @@ class PreserveRVMappings(Feature):
def __init__(self, rv_values: Dict[TensorVariable, TensorVariable]):
"""
Parameters
==========
----------
rv_values
Mappings between random variables and their value variables.
The keys of this map are what this `Feature` keeps updated.
@@ -130,7 +130,7 @@ def update_rv_maps(
original value variables.

Parameters
==========
----------
old_rv
The random variable whose mappings will be updated.
new_value
6 changes: 3 additions & 3 deletions pymc/logprob/scan.py
@@ -79,7 +79,7 @@ def convert_outer_out_to_in(
r"""Convert outer-graph outputs into outer-graph inputs.

Parameters
==========
----------
input_scan_args:
The source `Scan` arguments.
outer_out_vars:
@@ -253,7 +253,7 @@ def get_random_outer_outputs(
"""Get the `MeasurableVariable` outputs of a `Scan` (well, its `ScanArgs`).

Returns
=======
-------
A tuple of tuples containing the index of each outer-output variable, the
outer-output variable itself, and the inner-output variable that
is an instance of `MeasurableVariable`.
@@ -329,7 +329,7 @@ def create_inner_out_logp(value_map: Dict[TensorVariable, TensorVariable]) -> Te

@node_rewriter([Scan])
def find_measurable_scans(fgraph, node):
r"""Finds `Scan`\s for which a `logprob` can be computed.
r"""Find `Scan`\s for which a `logprob` can be computed.

This will convert said `Scan`\s into `MeasurableScan`\s. It also updates
random variable and value variable mappings that have been specified for
6 changes: 3 additions & 3 deletions pymc/logprob/transforms.py
@@ -295,7 +295,7 @@ def __init__(
):
"""
Parameters
==========
----------
values_to_transforms
Mapping between value variables and their transformations. Each
value variable can be assigned one of `RVTransform`, or ``None``.
@@ -514,7 +514,7 @@ def __init__(self, args_fn: Callable[..., Tuple[Optional[Variable], Optional[Var
"""

Parameters
==========
----------
args_fn
Function that expects inputs of RandomVariable and returns the lower
and upper bounds for the interval transformation. If one of these is
@@ -660,7 +660,7 @@ def _create_transformed_rv_op(
also behaving exactly as it did before.

Parameters
==========
----------
rv_op
The `RandomVariable` for which we want to construct a `TransformedRV`.
transform
8 changes: 4 additions & 4 deletions pymc/logprob/utils.py
@@ -62,7 +62,7 @@ def walk_model(
By default, these walks will not go past ``MeasurableVariable`` nodes.

Parameters
==========
----------
graphs
The graphs to walk.
walk_past_rvs
@@ -104,12 +104,12 @@ def replace_rvs_in_graphs(
This will *not* recompute test values.

Parameters
==========
----------
graphs
The graphs in which random variables are to be replaced.

Returns
=======
-------
A ``tuple`` containing the transformed graphs and a ``dict`` of the
replacements that were made.
"""
@@ -154,7 +154,7 @@ def rvs_to_value_vars(
This will *not* recompute test values in the resulting graphs.

Parameters
==========
----------
graphs
The graphs in which to perform the replacements.
initial_replacements
2 changes: 1 addition & 1 deletion pymc/math.py
@@ -196,7 +196,7 @@ def kron_matrix_op(krons, m, op):
r"""Apply op to krons and m in a way that reproduces ``op(kronecker(*krons), m)``

Parameters
-----------
----------
krons : list of square 2D array-like objects
D square matrices :math:`[A_1, A_2, ..., A_D]` to be Kronecker'ed
:math:`A = A_1 \otimes A_2 \otimes ... \otimes A_D`
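For reference, what op(kronecker(*krons), m) in the docstring above means, written out with plain NumPy for op being matrix multiplication (an illustrative sketch of the reference computation, not what kron_matrix_op does internally):

    import numpy as np
    from functools import reduce

    A1 = np.array([[1.0, 0.0], [0.0, 2.0]])
    A2 = np.array([[3.0, 1.0], [1.0, 3.0]])
    m = np.arange(4.0).reshape(4, 1)
    A = reduce(np.kron, [A1, A2])   # A = A1 ⊗ A2, shape (4, 4)
    result = A @ m                  # the product kron_matrix_op is documented to reproduce
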
8 changes: 3 additions & 5 deletions pymc/model.py
@@ -153,7 +153,7 @@ class ContextMeta(type):
"""

def __new__(cls, name, bases, dct, **kwargs): # pylint: disable=unused-argument
"Add __enter__ and __exit__ methods to the class."
"""Add __enter__ and __exit__ methods to the class."""

def __enter__(self):
self.__class__.context_class.get_contexts().append(self)
@@ -457,7 +457,6 @@ class Model(WithMemoization, metaclass=ContextMeta):

Examples
--------

How to define a custom model

.. code-block:: python
@@ -1356,7 +1355,7 @@ def make_obs_var(
"""Create a `TensorVariable` for an observed random variable.

Parameters
==========
----------
rv_var
The random variable that is observed.
Its dimensionality must be compatible with the data already.
@@ -1808,7 +1807,7 @@ def point_logps(self, point=None, round_vals=2):


class BlockModelAccess(Model):
"""This class can be used to prevent user access to Model contexts"""
"""Can be used to prevent user access to Model contexts"""

def __init__(self, *args, error_msg_on_access="Model access is blocked", **kwargs):
self.error_msg_on_access = error_msg_on_access
@@ -1829,7 +1828,6 @@ def set_data(new_data, model=None, *, coords=None):

Examples
--------

This example shows how to change the shape of the likelihood to correspond automatically with
`x`, the predictor in a regression model.

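The set_data example itself is collapsed in this view; a minimal sketch (assumed, not the PR's own code) of the pattern the sentence above describes, tying the likelihood's shape to the predictor x so it resizes with the data:

    import numpy as np
    import pymc as pm

    with pm.Model() as model:
        x = pm.MutableData("x", np.array([1.0, 2.0, 3.0]))
        beta = pm.Normal("beta", 0.0, 1.0)
        # Using shape=x.shape lets the likelihood grow or shrink with x.
        pm.Normal("y", mu=beta * x, sigma=1.0, shape=x.shape)

    with model:
        pm.set_data({"x": np.array([1.0, 2.0, 3.0, 4.0])})
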
1 change: 1 addition & 0 deletions pymc/model_graph.py
@@ -199,6 +199,7 @@ def get_plates(self, var_names: Optional[Iterable[VarName]] = None) -> Dict[str,

Just groups by the shape of the underlying distribution. Will be wrong
if there are two plates with the same shape.

Returns
-------
dict
2 changes: 1 addition & 1 deletion pymc/ode/__init__.py
@@ -1,5 +1,5 @@
"""
This submodule contains tools used to perform inference on ordinary differential equations.
Contains tools used to perform inference on ordinary differential equations.

Due to the nature of the model (as well as included solvers), ODE solution may perform slowly.
Another library based on PyMC--sunode--has implemented Adams' method and BDF (backward differentation formula) using the very fast SUNDIALS suite of ODE and PDE solvers.
4 changes: 1 addition & 3 deletions pymc/ode/ode.py
@@ -42,7 +42,6 @@ class DifferentialEquation(Op):

Parameters
----------

func : callable
Function specifying the differential equation. Must take arguments y (n_states,), t (scalar), p (n_theta,)
times : array
@@ -57,7 +56,6 @@

Examples
--------

.. code-block:: python

def odefunc(y, t, p):
@@ -108,7 +106,7 @@ def __init__(self, func, times, *, n_states, n_theta, t0=0):
self._output_sensitivities = {}

def _system(self, Y, t, p):
r"""This is the function that will be passed to odeint. Solves both ODE and sensitivities.
r"""The function that will be passed to odeint. Solves both ODE and sensitivities.

Parameters
----------
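Based on the constructor signature shown in the hunk above, a hedged usage sketch; the ODE and parameter counts are illustrative, not taken from the PR:

    import numpy as np
    from pymc.ode import DifferentialEquation

    def odefunc(y, t, p):
        # Exponential decay: dy/dt = -p[0] * y[0]
        return -p[0] * y[0]

    times = np.arange(0.0, 10.0, 0.5)
    ode_model = DifferentialEquation(func=odefunc, times=times, n_states=1, n_theta=1, t0=0)
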
8 changes: 4 additions & 4 deletions pymc/pytensorf.py
@@ -203,7 +203,7 @@ def walk_model(
"""Walk model graphs and yield their nodes.

Parameters
==========
----------
graphs
The graphs to walk.
stop_at_vars
@@ -235,12 +235,12 @@ def _replace_rvs_in_graphs(
This will *not* recompute test values.

Parameters
==========
----------
graphs
The graphs in which random variables are to be replaced.

Returns
=======
-------
Tuple containing the transformed graphs and a ``dict`` of the replacements
that were made.
"""
@@ -296,7 +296,7 @@ def rvs_to_value_vars(
This will *not* recompute test values in the resulting graphs.

Parameters
==========
----------
graphs
The graphs in which to perform the replacements.
apply_transforms