
Commit 29be689

Regenerate subclassing guide.
1 parent c05e0c7 commit 29be689

File tree

2 files changed: +49 −41 lines changed


guides/ipynb/making_new_layers_and_models_via_subclassing.ipynb

Lines changed: 6 additions & 3 deletions
@@ -43,7 +43,7 @@
 "source": [
 "## The `Layer` class: the combination of state (weights) and some computation\n",
 "\n",
-"One of the central abstraction in Keras is the `Layer` class. A layer\n",
+"One of the central abstractions in Keras is the `Layer` class. A layer\n",
 "encapsulates both a state (the layer's \"weights\") and a transformation from\n",
 "inputs to outputs (a \"call\", the layer's forward pass).\n",
 "\n",
@@ -416,7 +416,7 @@
 "        self.rate = rate\n",
 "\n",
 "    def call(self, inputs):\n",
-"        self.add_loss(self.rate * tf.reduce_sum(inputs))\n",
+"        self.add_loss(self.rate * tf.reduce_mean(inputs))\n",
 "        return inputs\n",
 ""
 ]
@@ -427,6 +427,9 @@
 "colab_type": "text"
 },
 "source": [
+"Notice that `add_loss()` can take the result of plain TensorFlow operations.\n",
+"There is no need to call a `Loss` object here.\n",
+"\n",
 "These losses (including those created by any inner layer) can be retrieved via\n",
 "`layer.losses`. This property is reset at the start of every `__call__()` to\n",
 "the top-level layer, so that `layer.losses` always contains the loss values\n",
@@ -1212,4 +1215,4 @@
 },
 "nbformat": 4,
 "nbformat_minor": 0
-}
+}

guides/md/making_new_layers_and_models_via_subclassing.md

Lines changed: 43 additions & 38 deletions
@@ -22,7 +22,7 @@ from tensorflow import keras
 ---
 ## The `Layer` class: the combination of state (weights) and some computation
 
-One of the central abstraction in Keras is the `Layer` class. A layer
+One of the central abstractions in Keras is the `Layer` class. A layer
 encapsulates both a state (the layer's "weights") and a transformation from
 inputs to outputs (a "call", the layer's forward pass).
 
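For orientation, the abstraction described in the corrected sentence can be shown as a minimal `Layer` subclass, along the lines of the guide's own `Linear` example (a sketch assuming TensorFlow 2.x; class and variable names are illustrative):

```python
import tensorflow as tf
from tensorflow import keras

class Linear(keras.layers.Layer):
    """A Layer couples state (weights) with a computation (the call)."""

    def __init__(self, units=4):
        super().__init__()
        self.units = units

    def build(self, input_shape):
        # State: a weight matrix and a bias, created once the input shape is known.
        self.w = self.add_weight(
            shape=(input_shape[-1], self.units),
            initializer="random_normal",
            trainable=True,
        )
        self.b = self.add_weight(
            shape=(self.units,), initializer="zeros", trainable=True
        )

    def call(self, inputs):
        # Computation: the transformation from inputs to outputs (forward pass).
        return tf.matmul(inputs, self.w) + self.b

x = tf.ones((2, 3))
y = Linear(units=4)(x)
print(y.shape)  # (2, 4)
```

Because the two input rows are identical, the two output rows are identical too, which is exactly the pattern in the `tf.Tensor` outputs shown in the hunks below.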
@@ -63,8 +63,8 @@ print(y)
 <div class="k-default-codeblock">
 ```
 tf.Tensor(
-[[ 0.01103698  0.03099662 -0.1009444   0.10721317]
- [ 0.01103698  0.03099662 -0.1009444   0.10721317]], shape=(2, 4), dtype=float32)
+[[-0.02134706 -0.11407568 -0.06567862 -0.03393517]
+ [-0.02134706 -0.11407568 -0.06567862 -0.03393517]], shape=(2, 4), dtype=float32)
 
 ```
 </div>
@@ -103,8 +103,8 @@ print(y)
 <div class="k-default-codeblock">
 ```
 tf.Tensor(
-[[-0.09724902  0.04435382  0.06548684  0.1264643 ]
- [-0.09724902  0.04435382  0.06548684  0.1264643 ]], shape=(2, 4), dtype=float32)
+[[-0.0213856  -0.05269931  0.04779436  0.02541557]
+ [-0.0213856  -0.05269931  0.04779436  0.02541557]], shape=(2, 4), dtype=float32)
 
 ```
 </div>
@@ -293,11 +293,14 @@ class ActivityRegularizationLayer(keras.layers.Layer):
         self.rate = rate
 
     def call(self, inputs):
-        self.add_loss(self.rate * tf.reduce_sum(inputs))
+        self.add_loss(self.rate * tf.reduce_mean(inputs))
         return inputs
 
 ```
 
+Notice that `add_loss()` can take the result of plain TensorFlow operations.
+There is no need to call a `Loss` object here.
+
 These losses (including those created by any inner layer) can be retrieved via
 `layer.losses`. This property is reset at the start of every `__call__()` to
 the top-level layer, so that `layer.losses` always contains the loss values
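The prose added in this hunk — that `add_loss()` can take the result of plain TensorFlow operations, with no `Loss` object involved — can be checked directly; a minimal sketch assuming TensorFlow 2.x (the `rate` value is arbitrary):

```python
import tensorflow as tf
from tensorflow import keras

class ActivityRegularizationLayer(keras.layers.Layer):
    def __init__(self, rate=1e-2):
        super().__init__()
        self.rate = rate

    def call(self, inputs):
        # add_loss() is fed a plain op result (tf.reduce_mean), not a Loss
        # object. With reduce_mean (as of this commit) the penalty tracks the
        # mean activation rather than the sum, so it no longer grows with
        # batch size.
        self.add_loss(self.rate * tf.reduce_mean(inputs))
        return inputs

layer = ActivityRegularizationLayer(rate=1e-2)
_ = layer(tf.ones((2, 4)))
print(layer.losses)  # one scalar: 1e-2 * mean(ones) = 0.01
```

Calling the layer populates `layer.losses`, which is reset at the start of every `__call__()` as described above.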
@@ -353,7 +356,7 @@ print(layer.losses)
 
 <div class="k-default-codeblock">
 ```
-[<tf.Tensor: shape=(), dtype=float32, numpy=0.0023243506>]
+[<tf.Tensor: shape=(), dtype=float32, numpy=0.0021371832>]
 
 ```
 </div>
@@ -406,10 +409,10 @@ model.fit(np.random.random((2, 3)), np.random.random((2, 3)))
 
 <div class="k-default-codeblock">
 ```
-1/1 [==============================] - 0s 131ms/step - loss: 0.1269
-1/1 [==============================] - 0s 45ms/step - loss: 0.0274
+1/1 [==============================] - 0s 95ms/step - loss: 0.1557
+1/1 [==============================] - 0s 47ms/step - loss: 0.0044
 
-<keras.callbacks.History at 0x1643af310>
+<keras.callbacks.History at 0x12e57e760>
 
 ```
 </div>
@@ -465,7 +468,7 @@ print("current accuracy value:", float(layer.metrics[0].result()))
 
 <div class="k-default-codeblock">
 ```
-layer.metrics: [<keras.metrics.BinaryAccuracy object at 0x161505450>]
+layer.metrics: [<keras.metrics.metrics.BinaryAccuracy object at 0x12e2d36d0>]
 current accuracy value: 1.0
 
 ```
@@ -491,9 +494,9 @@ model.fit(data)
 
 <div class="k-default-codeblock">
 ```
-1/1 [==============================] - 0s 240ms/step - loss: 0.9455 - binary_accuracy: 0.0000e+00
+1/1 [==============================] - 0s 219ms/step - loss: 1.0331 - binary_accuracy: 0.0000e+00
 
-<keras.callbacks.History at 0x1644acd50>
+<keras.callbacks.History at 0x12e5f5cd0>
 
 ```
 </div>
@@ -839,28 +842,30 @@ for epoch in range(epochs):
 
 <div class="k-default-codeblock">
 ```
+Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/mnist.npz
+11490434/11490434 [==============================] - 1s 0us/step
 Start of epoch 0
-step 0: mean loss = 0.3431
-step 100: mean loss = 0.1273
-step 200: mean loss = 0.1001
-step 300: mean loss = 0.0897
-step 400: mean loss = 0.0847
-step 500: mean loss = 0.0812
-step 600: mean loss = 0.0790
-step 700: mean loss = 0.0774
-step 800: mean loss = 0.0762
-step 900: mean loss = 0.0751
+step 0: mean loss = 0.3169
+step 100: mean loss = 0.1252
+step 200: mean loss = 0.0990
+step 300: mean loss = 0.0891
+step 400: mean loss = 0.0842
+step 500: mean loss = 0.0809
+step 600: mean loss = 0.0787
+step 700: mean loss = 0.0771
+step 800: mean loss = 0.0760
+step 900: mean loss = 0.0750
 Start of epoch 1
-step 0: mean loss = 0.0748
-step 100: mean loss = 0.0742
-step 200: mean loss = 0.0737
-step 300: mean loss = 0.0732
-step 400: mean loss = 0.0728
-step 500: mean loss = 0.0724
-step 600: mean loss = 0.0721
-step 700: mean loss = 0.0718
-step 800: mean loss = 0.0716
-step 900: mean loss = 0.0713
+step 0: mean loss = 0.0747
+step 100: mean loss = 0.0740
+step 200: mean loss = 0.0735
+step 300: mean loss = 0.0730
+step 400: mean loss = 0.0727
+step 500: mean loss = 0.0723
+step 600: mean loss = 0.0720
+step 700: mean loss = 0.0717
+step 800: mean loss = 0.0715
+step 900: mean loss = 0.0712
 
 ```
 </div>
@@ -884,7 +889,7 @@ Epoch 1/2
 Epoch 2/2
 938/938 [==============================] - 2s 2ms/step - loss: 0.0676
 
-<keras.callbacks.History at 0x164668d90>
+<keras.callbacks.History at 0x12e6e48b0>
 
 ```
 </div>
@@ -936,13 +941,13 @@ vae.fit(x_train, x_train, epochs=3, batch_size=64)
 <div class="k-default-codeblock">
 ```
 Epoch 1/3
-938/938 [==============================] - 2s 1ms/step - loss: 0.0746
+938/938 [==============================] - 2s 2ms/step - loss: 0.0748
 Epoch 2/3
-938/938 [==============================] - 1s 1ms/step - loss: 0.0676
+938/938 [==============================] - 2s 2ms/step - loss: 0.0676
 Epoch 3/3
-938/938 [==============================] - 1s 1ms/step - loss: 0.0676
+938/938 [==============================] - 2s 2ms/step - loss: 0.0675
 
-<keras.callbacks.History at 0x16469fc50>
+<keras.callbacks.History at 0x12e6ab7f0>
 
 ```
 </div>

0 commit comments
