README.rst (7 additions, 7 deletions)
@@ -62,7 +62,7 @@ To install directly from the github repository :
Let us start with a basic imputation problem. Here, we generate one-dimensional noisy time series.

-.. code:: sh
+.. code-block:: python

    import matplotlib.pyplot as plt
    import numpy as np
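
The hunk is truncated after the imports. As a rough illustration only (not the README's actual snippet), generating such a one-dimensional noisy series with NumPy/pandas might look like the following; the length, frequency and noise level are made-up values.

```python
import numpy as np
import pandas as pd

# Illustrative only: a smooth signal plus Gaussian noise, indexed by date.
n_samples = 1000
t = np.arange(n_samples)
X = pd.DataFrame(
    {"signal": np.sin(2 * np.pi * t / 100) + np.random.normal(scale=0.1, size=n_samples)},
    index=pd.date_range("2020-01-01", periods=n_samples, freq="D"),
)
```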
@@ -75,7 +75,7 @@ Let us start with a basic imputation problem. Here, we generate one-dimensional
For this demonstration, let us create artificial holes in our dataset.

-.. code:: sh
+.. code-block:: python

    from qolmat.utils.data import add_holes
    plt.rcParams.update({'font.size': 18})
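
The hunk stops right after these two lines, so the actual ``add_holes`` call is not shown. As a generic illustration of the idea (plain pandas/NumPy, not the library call), punching MCAR holes into a DataFrame could look like this:

```python
import numpy as np

# Illustrative only: mask 10% of the entries completely at random (MCAR),
# reusing the DataFrame X from the sketch above.
mask = np.random.rand(*X.shape) < 0.1
X_with_holes = X.mask(mask)
```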
@@ -101,7 +101,7 @@ For this demonstration, let us create artificial holes in our dataset.
To impute missing data, there are several methods that can be imported with ``from qolmat.imputations import imputers``.
The creation of an imputation dictionary will enable us to benchmark the various imputations.

-.. code:: sh
+.. code-block:: python

    from sklearn.linear_model import LinearRegression
    from qolmat.imputations import imputers
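
Only the imports are visible in the hunk. A sketch of what such a dictionary of imputers might look like follows; the class names and the ``estimator`` argument are assumptions to be checked against the qolmat documentation.

```python
from sklearn.linear_model import LinearRegression
from qolmat.imputations import imputers

# Hypothetical imputer names and arguments; check the qolmat docs for the exact API.
dict_imputers = {
    "median": imputers.ImputerMedian(),
    "regressor": imputers.ImputerRegressor(estimator=LinearRegression()),
}
```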
@@ -146,7 +146,7 @@ The creation of an imputation dictionary will enable us to benchmark the various
It is possible to define a parameter dictionary for an imputer with three pieces of information: min, max and type. The aim of the dictionary is to determine the optimal parameters for data imputation. Here, we call this dictionary ``dict_config_opti``.

-.. code:: sh
+.. code-block:: python

    search_params = {
        "RPCA_opti": {
@@ -157,7 +157,7 @@ It is possible to define a parameter dictionary for an imputer with three pieces
Then with the comparator function in ``from qolmat.benchmark import comparator``, we can compare the different imputation methods. This **does not use knowledge of the missing values**, but relies on data masking instead. For more details on how imputers and the comparator work, please see the following `link <https://qolmat.readthedocs.io/en/latest/explanation.html>`_.

-.. code:: sh
+.. code-block:: python

    from qolmat.benchmark import comparator
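
Only the import is shown in the hunk. A usage sketch, assuming a ``Comparator`` object that takes the imputer dictionary, the columns to evaluate and the optimization config; the argument names below are assumptions, not copied from the documentation.

```python
from qolmat.benchmark import comparator

# Hypothetical call; argument names are assumptions.
comparison = comparator.Comparator(
    dict_imputers,                   # imputers to benchmark
    ["signal"],                      # columns on which to evaluate the imputation
    dict_config_opti=search_params,  # hyperparameter search space defined above
)
results = comparison.compare(X_with_holes)
print(results)
```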
@@ -175,7 +175,7 @@ Then with the comparator function in ``from qolmat.benchmark import comparator``
examples/benchmark.md (16 additions, 9 deletions)
@@ -344,8 +344,11 @@ plt.show()
In this section, we present an MLP model for data imputation using Keras, which can be installed with "pip install tensorflow".

```python
-from qolmat.imputations import imputers_keras
-import tensorflow as tf
+from qolmat.imputations import imputers_pytorch
+try:
+    import torch.nn as nn
+except ModuleNotFoundError:
+    raise PyTorchExtraNotInstalled
```
For the MLP model, we work on a dataset that corresponds to weather data with missing values. We add MCAR missing values on the features "TEMP", "PRES" and other features with NaN values. The goal is to impute the missing values for the features "TEMP" and "PRES" with a deep learning method. We add features to take into account the seasonality of the dataset, and a feature for the station name.
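
As a side illustration of the seasonality features mentioned above (not code from the example), cyclical time features are often encoded with sine/cosine transforms; the DataFrame and column names below are hypothetical.

```python
import numpy as np
import pandas as pd

# Hypothetical frame with a DatetimeIndex standing in for the weather data.
df = pd.DataFrame(
    {"TEMP": np.random.randn(365)},
    index=pd.date_range("2020-01-01", periods=365, freq="D"),
)

# Encode the day of year as two cyclical (seasonality) features.
day_of_year = df.index.dayofyear
df["time_sin"] = np.sin(2 * np.pi * day_of_year / 365.25)
df["time_cos"] = np.cos(2 * np.pi * day_of_year / 365.25)
```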
@@ -363,13 +366,17 @@ For the example, we use a simple MLP model with 3 layers of neurons.
Then we train the model without grouping on the stations.
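
For reference, a plain 3-layer PyTorch MLP of the kind the hunk header describes could be sketched as below; the layer sizes are arbitrary placeholders, and this is not the estimator actually built in examples/benchmark.md.

```python
import torch.nn as nn

# Illustrative 3-layer MLP; input and hidden sizes are arbitrary placeholders.
estimator = nn.Sequential(
    nn.Linear(in_features=10, out_features=64),
    nn.ReLU(),
    nn.Linear(64, 32),
    nn.ReLU(),
    nn.Linear(32, 1),
)
```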