@@ -22,7 +22,7 @@ from tensorflow import keras
---
## The `Layer` class: the combination of state (weights) and some computation

- One of the central abstraction in Keras is the `Layer` class. A layer
+ One of the central abstractions in Keras is the `Layer` class. A layer
encapsulates both a state (the layer's "weights") and a transformation from
inputs to outputs (a "call", the layer's forward pass).

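To make the idea concrete, here is a minimal sketch of such a layer. The `Linear` name, the fixed `units`/`input_dim` arguments, and the direct use of `tf.Variable` are illustrative assumptions, not necessarily the exact code used in this guide:

```python
import tensorflow as tf
from tensorflow import keras


class Linear(keras.layers.Layer):
    """State: the variables `w` and `b`. Computation: `matmul(inputs, w) + b`."""

    def __init__(self, units=4, input_dim=2):
        super().__init__()
        # State (weights), created when the layer is instantiated.
        w_init = tf.random_normal_initializer()
        self.w = tf.Variable(
            initial_value=w_init(shape=(input_dim, units), dtype="float32"),
            trainable=True,
        )
        self.b = tf.Variable(
            initial_value=tf.zeros(shape=(units,), dtype="float32"),
            trainable=True,
        )

    def call(self, inputs):
        # Computation (the forward pass), from inputs to outputs.
        return tf.matmul(inputs, self.w) + self.b


x = tf.ones((2, 2))
y = Linear(units=4, input_dim=2)(x)
print(y.shape)  # (2, 4); both rows are identical because both input rows are ones
```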
@@ -63,8 +63,8 @@ print(y)
<div class="k-default-codeblock">
```
tf.Tensor(
- [[ 0.01103698 0.03099662 -0.1009444 0.10721317]
- [ 0.01103698 0.03099662 -0.1009444 0.10721317]], shape=(2, 4), dtype=float32)
+ [[-0.02134706 -0.11407568 -0.06567862 -0.03393517]
+ [-0.02134706 -0.11407568 -0.06567862 -0.03393517]], shape=(2, 4), dtype=float32)

```
</div>
@@ -103,8 +103,8 @@ print(y)
<div class="k-default-codeblock">
```
tf.Tensor(
- [[-0.09724902 0.04435382 0.06548684 0.1264643]
- [-0.09724902 0.04435382 0.06548684 0.1264643]], shape=(2, 4), dtype=float32)
+ [[-0.0213856 -0.05269931 0.04779436 0.02541557]
+ [-0.0213856 -0.05269931 0.04779436 0.02541557]], shape=(2, 4), dtype=float32)

```
</div>
@@ -293,11 +293,14 @@ class ActivityRegularizationLayer(keras.layers.Layer):
        self.rate = rate

    def call(self, inputs):
-         self.add_loss(self.rate * tf.reduce_sum(inputs))
+         self.add_loss(self.rate * tf.reduce_mean(inputs))
        return inputs

```

+ Notice that `add_loss()` can take the result of plain TensorFlow operations.
+ There is no need to call a `Loss` object here.
+
These losses (including those created by any inner layer) can be retrieved via
`layer.losses`. This property is reset at the start of every `__call__()` to
the top-level layer, so that `layer.losses` always contains the loss values
@@ -353,7 +356,7 @@ print(layer.losses)

<div class="k-default-codeblock">
```
- [<tf.Tensor: shape=(), dtype=float32, numpy=0.0023243506>]
+ [<tf.Tensor: shape=(), dtype=float32, numpy=0.0021371832>]

```
</div>
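As a sketch of how these pieces fit together (the `OuterLayer` wrapper and the `1e-2` rate below are assumptions made for illustration), nesting the regularization layer inside another layer shows how losses created during the forward pass surface on the outer layer, and how they are cleared at the start of each new call:

```python
import tensorflow as tf
from tensorflow import keras


class ActivityRegularizationLayer(keras.layers.Layer):
    def __init__(self, rate=1e-2):
        super().__init__()
        self.rate = rate

    def call(self, inputs):
        # A plain tensor is a valid argument to add_loss(); no Loss object needed.
        self.add_loss(self.rate * tf.reduce_mean(inputs))
        return inputs


class OuterLayer(keras.layers.Layer):
    def __init__(self):
        super().__init__()
        self.activity_reg = ActivityRegularizationLayer(1e-2)

    def call(self, inputs):
        return self.activity_reg(inputs)


layer = OuterLayer()
assert len(layer.losses) == 0  # never called, so no losses yet
_ = layer(tf.zeros((1, 1)))
assert len(layer.losses) == 1  # one loss value, created during the call above
_ = layer(tf.zeros((1, 1)))
assert len(layer.losses) == 1  # losses were reset at the start of the new __call__()
```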
@@ -406,10 +409,10 @@ model.fit(np.random.random((2, 3)), np.random.random((2, 3)))

<div class="k-default-codeblock">
```
- 1/1 [==============================] - 0s 131ms/step - loss: 0.1269
- 1/1 [==============================] - 0s 45ms/step - loss: 0.0274
+ 1/1 [==============================] - 0s 95ms/step - loss: 0.1557
+ 1/1 [==============================] - 0s 47ms/step - loss: 0.0044

- <keras.callbacks.History at 0x1643af310>
+ <keras.callbacks.History at 0x12e57e760>

```
</div>
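For the manual equivalent of what `fit()` does with these losses, a sketch along the following lines would work; the small functional model, the extra `Dense` layer, and the SGD optimizer are assumptions chosen only to keep the example self-contained:

```python
import tensorflow as tf
from tensorflow import keras

# Assumed model: a Dense layer followed by the ActivityRegularizationLayer defined above.
inputs = keras.Input(shape=(3,))
x = keras.layers.Dense(3)(inputs)
outputs = ActivityRegularizationLayer(1e-2)(x)
model = keras.Model(inputs, outputs)

optimizer = keras.optimizers.SGD(learning_rate=1e-3)
loss_fn = keras.losses.MeanSquaredError()

x_batch, y_batch = tf.random.normal((2, 3)), tf.random.normal((2, 3))
with tf.GradientTape() as tape:
    preds = model(x_batch, training=True)
    loss = loss_fn(y_batch, preds)
    # Add the extra losses created during this forward pass (e.g. via add_loss()).
    loss += sum(model.losses)
grads = tape.gradient(loss, model.trainable_weights)
optimizer.apply_gradients(zip(grads, model.trainable_weights))
```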
@@ -465,7 +468,7 @@ print("current accuracy value:", float(layer.metrics[0].result()))

<div class="k-default-codeblock">
```
- layer.metrics: [<keras.metrics.BinaryAccuracy object at 0x161505450>]
+ layer.metrics: [<keras.metrics.metrics.BinaryAccuracy object at 0x12e2d36d0>]
current accuracy value: 1.0

```
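For context on the output above, a layer can track metric values much like it tracks losses, via `add_metric()`; the `LogisticEndpoint` layer below is a sketch of how such an accuracy metric could be created and then read back through `layer.metrics` (the layer name and the all-ones inputs are illustrative assumptions):

```python
import tensorflow as tf
from tensorflow import keras


class LogisticEndpoint(keras.layers.Layer):
    def __init__(self, name=None):
        super().__init__(name=name)
        self.loss_fn = keras.losses.BinaryCrossentropy(from_logits=True)
        self.accuracy_fn = keras.metrics.BinaryAccuracy()

    def call(self, targets, logits, sample_weights=None):
        # Compute and track the training-time loss value.
        loss = self.loss_fn(targets, logits, sample_weights)
        self.add_loss(loss)
        # Compute and track the accuracy as a metric.
        acc = self.accuracy_fn(targets, logits, sample_weights)
        self.add_metric(acc, name="accuracy")
        # Return the inference-time prediction.
        return tf.nn.softmax(logits)


layer = LogisticEndpoint()
targets, logits = tf.ones((2, 2)), tf.ones((2, 2))
_ = layer(targets, logits)
print("layer.metrics:", layer.metrics)
print("current accuracy value:", float(layer.metrics[0].result()))
```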
@@ -491,9 +494,9 @@ model.fit(data)

<div class="k-default-codeblock">
```
- 1/1 [==============================] - 0s 240ms/step - loss: 0.9455 - binary_accuracy: 0.0000e+00
+ 1/1 [==============================] - 0s 219ms/step - loss: 1.0331 - binary_accuracy: 0.0000e+00

- <keras.callbacks.History at 0x1644acd50>
+ <keras.callbacks.History at 0x12e5f5cd0>

```
</div>
@@ -839,28 +842,30 @@ for epoch in range(epochs):

<div class="k-default-codeblock">
```
+ Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/mnist.npz
+ 11490434/11490434 [==============================] - 1s 0us/step
Start of epoch 0
- step 0: mean loss = 0.3431
- step 100: mean loss = 0.1273
- step 200: mean loss = 0.1001
- step 300: mean loss = 0.0897
- step 400: mean loss = 0.0847
- step 500: mean loss = 0.0812
- step 600: mean loss = 0.0790
- step 700: mean loss = 0.0774
- step 800: mean loss = 0.0762
- step 900: mean loss = 0.0751
+ step 0: mean loss = 0.3169
+ step 100: mean loss = 0.1252
+ step 200: mean loss = 0.0990
+ step 300: mean loss = 0.0891
+ step 400: mean loss = 0.0842
+ step 500: mean loss = 0.0809
+ step 600: mean loss = 0.0787
+ step 700: mean loss = 0.0771
+ step 800: mean loss = 0.0760
+ step 900: mean loss = 0.0750
Start of epoch 1
- step 0: mean loss = 0.0748
- step 100: mean loss = 0.0742
- step 200: mean loss = 0.0737
- step 300: mean loss = 0.0732
- step 400: mean loss = 0.0728
- step 500: mean loss = 0.0724
- step 600: mean loss = 0.0721
- step 700: mean loss = 0.0718
- step 800: mean loss = 0.0716
- step 900: mean loss = 0.0713
+ step 0: mean loss = 0.0747
+ step 100: mean loss = 0.0740
+ step 200: mean loss = 0.0735
+ step 300: mean loss = 0.0730
+ step 400: mean loss = 0.0727
+ step 500: mean loss = 0.0723
+ step 600: mean loss = 0.0720
+ step 700: mean loss = 0.0717
+ step 800: mean loss = 0.0715
+ step 900: mean loss = 0.0712

```
</div>
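The per-step logging above follows the usual `GradientTape` training-loop pattern. Here is a self-contained sketch of that pattern with a stand-in model and random data; the guide itself trains a variational autoencoder on MNIST instead, so the model, dataset, and optimizer below are assumptions made only for illustration:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# Stand-in model and data, chosen only to keep the sketch self-contained.
model = keras.Sequential(
    [keras.layers.Dense(64, activation="relu"), keras.layers.Dense(784)]
)
dataset = tf.data.Dataset.from_tensor_slices(
    np.random.random((256, 784)).astype("float32")
).batch(64)

optimizer = keras.optimizers.Adam(learning_rate=1e-3)
loss_fn = keras.losses.MeanSquaredError()
loss_metric = keras.metrics.Mean()

epochs = 2
for epoch in range(epochs):
    print("Start of epoch %d" % (epoch,))
    for step, x_batch in enumerate(dataset):
        with tf.GradientTape() as tape:
            reconstructed = model(x_batch, training=True)
            loss = loss_fn(x_batch, reconstructed)
            loss += sum(model.losses)  # include any losses registered via add_loss()
        grads = tape.gradient(loss, model.trainable_weights)
        optimizer.apply_gradients(zip(grads, model.trainable_weights))
        loss_metric(loss)
        if step % 100 == 0:
            print("step %d: mean loss = %.4f" % (step, loss_metric.result()))
```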
@@ -884,7 +889,7 @@ Epoch 1/2
Epoch 2/2
938/938 [==============================] - 2s 2ms/step - loss: 0.0676

- <keras.callbacks.History at 0x164668d90>
+ <keras.callbacks.History at 0x12e6e48b0>

```
</div>
@@ -936,13 +941,13 @@ vae.fit(x_train, x_train, epochs=3, batch_size=64)
<div class="k-default-codeblock">
```
Epoch 1/3
- 938/938 [==============================] - 2s 1ms/step - loss: 0.0746
+ 938/938 [==============================] - 2s 2ms/step - loss: 0.0748
Epoch 2/3
- 938/938 [==============================] - 1s 1ms/step - loss: 0.0676
+ 938/938 [==============================] - 2s 2ms/step - loss: 0.0676
Epoch 3/3
- 938/938 [==============================] - 1s 1ms/step - loss: 0.0676
+ 938/938 [==============================] - 2s 2ms/step - loss: 0.0675

- <keras.callbacks.History at 0x16469fc50>
+ <keras.callbacks.History at 0x12e6ab7f0>

```
</div>