117 | 117 | "source": [
118 | 118 | "## 1.1 Dataset\n",
119 | 119 | "\n",
120 | | - "We will build understanding of bias and uncertainty by training a neural network for a simple 2D regression task: modeling the function $y = x^3$. We will use `CAPSA` to analyze this dataset and the performance of the model. Noise and missing-ness will be injected into the dataset.\n",
| 120 | + "We will build understanding of bias and uncertainty by training a neural network for a simple 2D regression task: modeling the function $y = x^3$. We will use Capsa to analyze this dataset and the performance of the model. Noise and missing-ness will be injected into the dataset.\n",
121 | 121 | "\n",
122 | 122 | "Let's generate the dataset and visualize it:"
123 | 123 | ]

131 | 131 | "outputs": [],
132 | 132 | "source": [
133 | 133 | "# Get the data for the cubic function, injected with noise and missing-ness\n",
| 134 | + "# This is just a toy dataset that we can use to test some of the wrappers on\n",
134 | 135 | "def gen_data(x_min, x_max, n, train=True):\n",
135 | 136 | " if train: \n",
136 | 137 | " x = np.random.triangular(x_min, 2, x_max, size=(n, 1))\n",

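For reference, a minimal sketch of a dataset generator like the one this cell starts to define. Only the triangular sampling of the training inputs is taken from the hunk above; the test-split sampling, the noise scale, the data ranges, and the plotting code are assumptions, not the notebook's exact implementation.

```python
import numpy as np
import matplotlib.pyplot as plt

def gen_data(x_min, x_max, n, train=True):
    if train:
        # Triangular sampling clusters points around the mode (x = 2),
        # so the edges of the domain are under-represented ("missing-ness").
        x = np.random.triangular(x_min, 2, x_max, size=(n, 1))
        # Cubic target with additive Gaussian noise (noise scale is an assumption).
        y = x ** 3 + np.random.normal(0, 1.5, size=(n, 1))
    else:
        # Evenly spaced, noise-free points for evaluation (assumed).
        x = np.linspace(x_min, x_max, n).reshape(n, 1)
        y = x ** 3
    return x.astype(np.float32), y.astype(np.float32)

x_train, y_train = gen_data(-4, 4, 2000, train=True)
x_test, y_test = gen_data(-4, 4, 500, train=False)

plt.scatter(x_train, y_train, s=1, label="train (noisy, unevenly sampled)")
plt.plot(x_test, y_test, color="r", label="$y = x^3$")
plt.legend()
plt.show()
```

Plotting the training split makes the injected bias visible: samples pile up near the mode of the triangular distribution while the tails of the domain are sparse.
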
211 | 212 | ")\n",
212 | 213 | "\n",
213 | 214 | "# Train the model for 30 epochs using model.fit().\n",
214 | | - "loss_history = dense_NN.fit(x_train, y_train, epochs=30, verbose=0)"
| 215 | + "loss_history = dense_NN.fit(x_train, y_train, epochs=30)"
215 | 216 | ]
216 | 217 | },
217 | 218 | {

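The `fit` call changed in this hunk trains a plain dense regression network. The network definition sits outside the diff context, so the layer sizes, optimizer, and learning rate below are assumptions; only the `fit(x_train, y_train, epochs=30)` call mirrors the diff (dropping `verbose=0` restores Keras's default per-epoch progress output). `x_train` and `y_train` are as in the dataset sketch above.

```python
import tensorflow as tf

# Small fully connected regression network (architecture is an assumption;
# the notebook defines dense_NN outside this hunk).
dense_NN = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(1,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1),
])

dense_NN.compile(optimizer=tf.keras.optimizers.Adam(5e-3), loss="mse")

# Train the model for 30 epochs using model.fit(), as in the diff.
loss_history = dense_NN.fit(x_train, y_train, epochs=30)
```
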
319 | 320 | ")\n",
320 | 321 | "\n",
321 | 322 | "# Train the wrapped model for 30 epochs.\n",
322 | | - "loss_history_bias_wrap = bias_wrapped_dense_NN.fit(x_train, y_train, epochs=30, verbose=0)\n",
| 323 | + "loss_history_bias_wrap = bias_wrapped_dense_NN.fit(x_train, y_train, epochs=30)\n",
323 | 324 | "\n",
324 | 325 | "print(\"Done training model with Bias Wrapper!\")"
325 | 326 | ]

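A sketch of how `bias_wrapped_dense_NN` in the last hunk is typically built with Capsa's histogram-based bias wrapper. The wrapper class name (`HistogramWrapper`) and the compile settings are assumptions about the Capsa API and may differ from the notebook; only the `fit(x_train, y_train, epochs=30)` call and the final print mirror the diff.

```python
import capsa
import tensorflow as tf

# Wrap the base network with Capsa's bias wrapper (class name and default
# arguments are assumptions; check the Capsa version used by the notebook).
bias_wrapped_dense_NN = capsa.HistogramWrapper(dense_NN)

# Wrapped models compile and train like ordinary Keras models.
bias_wrapped_dense_NN.compile(
    optimizer=tf.keras.optimizers.Adam(5e-3),
    loss="mse",
)

# Train the wrapped model for 30 epochs, as in the diff.
loss_history_bias_wrap = bias_wrapped_dense_NN.fit(x_train, y_train, epochs=30)

print("Done training model with Bias Wrapper!")
```

After training, the wrapped model's predictions come with a per-sample bias estimate derived from how densely each input region was represented in the training data.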