@@ -34,7 +34,7 @@ to retain the 10 first elements of the array ``X`` and ``y``::
>>> np.all(y_res == y[:10])
True
- In addition, the parameter ``validate`` control input checking. For instance,
+ In addition, the parameter ``validate`` controls input checking. For instance,
turning ``validate=False`` allows to pass any type of target ``y`` and do some
sampling for regression targets::
@@ -51,7 +51,7 @@ sampling for regression targets::
75.46571114, -67.49177372, 159.72700509, -169.80498923,
211.95889757, 211.95889757])
- We illustrate the use of such sampler to implement an outlier rejection
+ We illustrated the use of such a sampler to implement an outlier rejection
estimator which can be easily used within a
:class:`~imblearn.pipeline.Pipeline`:
:ref:`sphx_glr_auto_examples_applications_plot_outlier_rejections.py`
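The idea behind such a function-based sampler can be sketched without imbalanced-learn: a resampling function receives ``X`` and ``y`` and returns only the samples that pass an outlier test. The helper below is a hypothetical illustration using a simple z-score rule, not the IsolationForest-based estimator from the referenced example.

```python
import numpy as np

def outlier_rejection(X, y, threshold=3.0):
    """Keep only samples whose features all lie within `threshold`
    standard deviations of the column mean (a simple z-score rule)."""
    z = np.abs((X - X.mean(axis=0)) / X.std(axis=0))
    keep = (z < threshold).all(axis=1)
    return X[keep], y[keep]

rng = np.random.RandomState(42)
X = rng.normal(size=(100, 2))
X[0] = [10.0, 10.0]          # plant an obvious outlier
y = rng.randint(0, 2, size=100)

X_res, y_res = outlier_rejection(X, y)
print(X_res.shape[0] < X.shape[0])  # the planted outlier was rejected
```

A function with this signature is exactly what a resampling step in a pipeline needs: it shrinks ``X`` and ``y`` consistently at fit time.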
@@ -69,10 +69,11 @@ will generate balanced mini-batches.
TensorFlow generator
~~~~~~~~~~~~~~~~~~~~
- The :func:`~imblearn.tensorflow.balanced_batch_generator` allow to generate
+ The :func:`~imblearn.tensorflow.balanced_batch_generator` allows generating
balanced mini-batches using an imbalanced-learn sampler which returns indices.
Let's first generate some data::
+
>>> n_features, n_classes = 10, 2
>>> X, y = make_classification(
... n_samples=10_000, n_features=n_features, n_informative=2,
@@ -96,7 +97,7 @@ balanced::
... random_state=42,
... )
- The ``generator`` and ``steps_per_epoch`` is used during the training of the
+ The ``generator`` and ``steps_per_epoch`` are used during the training of a
Tensorflow model. We will illustrate how to use this generator. First, we can
define a logistic regression model which will be optimized by a gradient
descent::
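The balancing behind such a generator can be sketched with NumPy alone: each mini-batch draws the same number of indices from every class, so the model sees balanced batches even when the dataset is imbalanced. The function name ``balanced_batch_sketch`` below is hypothetical; the real :func:`~imblearn.tensorflow.balanced_batch_generator` also returns a ``steps_per_epoch`` value and delegates the index selection to an imbalanced-learn sampler.

```python
import numpy as np

def balanced_batch_sketch(X, y, batch_size=32, random_state=0):
    """Yield mini-batches with equal class counts by sampling the
    same number of indices from each class, with replacement."""
    rng = np.random.RandomState(random_state)
    classes = np.unique(y)
    per_class = batch_size // len(classes)
    while True:
        idx = np.concatenate([
            rng.choice(np.flatnonzero(y == c), size=per_class, replace=True)
            for c in classes
        ])
        rng.shuffle(idx)
        yield X[idx], y[idx]

# imbalanced toy data: 90 samples of class 0, 10 of class 1
X = np.arange(100, dtype=float).reshape(100, 1)
y = np.array([0] * 90 + [1] * 10)

gen = balanced_batch_sketch(X, y, batch_size=32)
X_batch, y_batch = next(gen)
print(np.bincount(y_batch))   # 16 samples of each class
```

Passing such a generator to a training loop (e.g. via ``model.fit``) is what the ``generator`` and ``steps_per_epoch`` pair above achieves.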