
Commit 3036ccd

docs cost
1 parent 99fb07a

File tree

1 file changed (+7 -7 lines)


docs/modules/cost.rst

Lines changed: 7 additions & 7 deletions
@@ -16,13 +16,13 @@ TensorLayer provides a simple way to create you own cost function. Take a MLP be
 
 .. code-block:: python
 
-  network = tl.InputLayer(x, name='input_layer')
-  network = tl.DropoutLayer(network, keep=0.8, name='drop1')
-  network = tl.DenseLayer(network, n_units=800, act = tf.nn.relu, name='relu1')
-  network = tl.DropoutLayer(network, keep=0.5, name='drop2')
-  network = tl.DenseLayer(network, n_units=800, act = tf.nn.relu, name='relu2')
-  network = tl.DropoutLayer(network, keep=0.5, name='drop3')
-  network = tl.DenseLayer(network, n_units=10, act = tl.activation.identity, name='output_layer')
+  network = InputLayer(x, name='input_layer')
+  network = DropoutLayer(network, keep=0.8, name='drop1')
+  network = DenseLayer(network, n_units=800, act = tf.nn.relu, name='relu1')
+  network = DropoutLayer(network, keep=0.5, name='drop2')
+  network = DenseLayer(network, n_units=800, act = tf.nn.relu, name='relu2')
+  network = DropoutLayer(network, keep=0.5, name='drop3')
+  network = DenseLayer(network, n_units=10, act = tl.activation.identity, name='output_layer')
 
 The network parameters will be ``[W1, b1, W2, b2, W_out, b_out]``,
 then you can apply L2 regularization on the weights matrix of first two layer as follow.
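
The hunk ends just before the regularization code that the last context line points to. As a rough sketch only, not part of this commit, that follow-on code would look something like the snippet below under the TensorFlow 1.x and TensorLayer 1.x APIs of this era; the tl.cost.cross_entropy and tf.contrib.layers.l2_regularizer calls and the 0.001 scale are assumptions here, and network and y_ carry over from the MLP snippet in the diff.

    # Sketch only; not part of commit 3036ccd. Assumes TF 1.x and TensorLayer 1.x.
    import tensorflow as tf
    import tensorlayer as tl

    y = network.outputs
    # Cross-entropy cost from TensorLayer's cost module.
    cost = tl.cost.cross_entropy(y, y_, name='xentropy')
    # network.all_params is ordered [W1, b1, W2, b2, W_out, b_out], so the
    # weight matrices of the first two Dense layers sit at indices 0 and 2.
    cost = cost + tf.contrib.layers.l2_regularizer(0.001)(network.all_params[0]) \
                + tf.contrib.layers.l2_regularizer(0.001)(network.all_params[2])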
