Commit c68211d

nada-adel-mohamady, glemaitre, chkoar
authored
DOC fix some typo in miscellaneous.rst (#893)
Co-authored-by: Guillaume Lemaitre <[email protected]>
Co-authored-by: Christos Aridas <[email protected]>
1 parent a9470dc commit c68211d

File tree: 1 file changed (+5, -4 lines)


doc/miscellaneous.rst

Lines changed: 5 additions & 4 deletions
@@ -34,7 +34,7 @@ to retain the 10 first elements of the array ``X`` and ``y``::
 >>> np.all(y_res == y[:10])
 True
 
-In addition, the parameter ``validate`` control input checking. For instance,
+In addition, the parameter ``validate`` controls input checking. For instance,
 turning ``validate=False`` allows to pass any type of target ``y`` and do some
 sampling for regression targets::
 
@@ -51,7 +51,7 @@ sampling for regression targets::
 75.46571114, -67.49177372, 159.72700509, -169.80498923,
 211.95889757, 211.95889757])
 
-We illustrate the use of such sampler to implement an outlier rejection
+We illustrated the use of such sampler to implement an outlier rejection
 estimator which can be easily used within a
 :class:`~imblearn.pipeline.Pipeline`:
 :ref:`sphx_glr_auto_examples_applications_plot_outlier_rejections.py`
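The ``validate=False`` pattern this hunk documents can be sketched without imbalanced-learn itself: a plain ``(X, y) -> (X_res, y_res)`` resampling function applied to a regression target. The function name ``outlier_rejection`` and the 3-sigma rule below are illustrative assumptions, not the library's; in imbalanced-learn the same function would be wrapped as ``FunctionSampler(func=outlier_rejection, validate=False)`` so a continuous ``y`` is accepted.

```python
# Pure-NumPy sketch of the resampling function described in the changed
# docs. Hypothetical helper: in imbalanced-learn it would be passed to
# FunctionSampler(func=outlier_rejection, validate=False).
import numpy as np

def outlier_rejection(X, y):
    """Keep only samples whose target lies within 3 standard deviations."""
    mask = np.abs(y - y.mean()) <= 3 * y.std()
    return X[mask], y[mask]

rng = np.random.RandomState(42)
X = rng.normal(size=(100, 2))
y = rng.normal(size=100)
y[:3] += 100  # inject three obvious target outliers
X_res, y_res = outlier_rejection(X, y)
# the three injected outliers are rejected; the 97 inliers remain
```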
@@ -69,10 +69,11 @@ will generate balanced mini-batches.
 TensorFlow generator
 ~~~~~~~~~~~~~~~~~~~~
 
-The :func:`~imblearn.tensorflow.balanced_batch_generator` allow to generate
+The :func:`~imblearn.tensorflow.balanced_batch_generator` allows to generate
 balanced mini-batches using an imbalanced-learn sampler which returns indices.
 
 Let's first generate some data::
+
 >>> n_features, n_classes = 10, 2
 >>> X, y = make_classification(
 ... n_samples=10_000, n_features=n_features, n_informative=2,
@@ -96,7 +97,7 @@ balanced::
 ... random_state=42,
 ... )
 
-The ``generator`` and ``steps_per_epoch`` is used during the training of the
+The ``generator`` and ``steps_per_epoch`` are used during the training of a
 Tensorflow model. We will illustrate how to use this generator. First, we can
 define a logistic regression model which will be optimized by a gradient
 descent::
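The core idea behind the generator these hunks document can be sketched in plain NumPy, with no TensorFlow or imbalanced-learn dependency: draw mini-batch *indices* so that each class is equally represented. The function name ``balanced_batches`` and its signature are illustrative assumptions; the real :func:`~imblearn.tensorflow.balanced_batch_generator` additionally yields the corresponding sample arrays for a TensorFlow training loop and delegates the index selection to an imbalanced-learn sampler.

```python
# Conceptual sketch (assumed helper, not the library API): an infinite
# generator of index batches with equal per-class counts, the idea
# behind balanced mini-batch generation for an imbalanced dataset.
import numpy as np

def balanced_batches(y, batch_size, rng):
    """Yield index arrays containing batch_size // n_classes samples per class."""
    classes = np.unique(y)
    per_class = batch_size // len(classes)
    by_class = {c: np.flatnonzero(y == c) for c in classes}
    while True:
        # sample with replacement so minority classes can fill their quota
        batch = np.concatenate(
            [rng.choice(by_class[c], size=per_class) for c in classes]
        )
        rng.shuffle(batch)
        yield batch

rng = np.random.RandomState(0)
y = np.array([0] * 90 + [1] * 10)  # 9:1 class imbalance
gen = balanced_batches(y, batch_size=32, rng=rng)
idx = next(gen)
# each batch holds 16 samples of class 0 and 16 of class 1
```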
