@@ -100,15 +100,51 @@ For evaluating and testing, disable all dropout layers as follows.
 For more details, please read the MNIST examples on Github.
 
 
-Understand Dense layer
---------------------------
+Customized layer
+-----------------
+
+A simple layer
+^^^^^^^^^^^^^^^
+
+To implement a custom layer in TensorLayer, you will have to write a Python class
+that subclasses ``Layer`` and implements the ``outputs`` expression.
+
+The following is an example implementation of a layer that multiplies its input by 2:
+
+.. code-block:: python
+
+    class DoubleLayer(Layer):
+        def __init__(
+            self,
+            layer=None,
+            name='double_layer',
+        ):
+            # check layer name (fixed)
+            Layer.__init__(self, name=name)
+
+            # the input of this layer is the output of the previous layer (fixed)
+            self.inputs = layer.outputs
+
+            # operation (customized)
+            self.outputs = self.inputs * 2
+
+            # get stuff from the previous layer (fixed)
+            self.all_layers = list(layer.all_layers)
+            self.all_params = list(layer.all_params)
+            self.all_drop = dict(layer.all_drop)
+
+            # update layer (customized)
+            self.all_layers.extend([self.outputs])
+
+
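The bookkeeping in ``DoubleLayer`` can be exercised without TensorFlow. The sketch below uses a minimal stand-in ``Layer`` base class and a hypothetical ``InputLayer``, both heavily simplified for illustration (the real TensorLayer classes also manage name scopes and TensorFlow variables), with plain Python lists standing in for tensors:

```python
# Minimal stand-ins for illustration only -- NOT the real TensorLayer classes.
class Layer:
    def __init__(self, name='layer'):
        self.name = name

class InputLayer(Layer):
    def __init__(self, inputs, name='input'):
        Layer.__init__(self, name=name)
        self.outputs = inputs
        # start the bookkeeping lists that every layer copies and extends
        self.all_layers = []
        self.all_params = []
        self.all_drop = {}

class DoubleLayer(Layer):
    def __init__(self, layer=None, name='double_layer'):
        Layer.__init__(self, name=name)
        # the input of this layer is the output of the previous layer (fixed)
        self.inputs = layer.outputs
        # operation (customized): a list comprehension mimics the
        # elementwise `self.inputs * 2` that a tensor would compute
        self.outputs = [x * 2 for x in self.inputs]
        # copy bookkeeping from the previous layer (fixed)
        self.all_layers = list(layer.all_layers)
        self.all_params = list(layer.all_params)
        self.all_drop = dict(layer.all_drop)
        # register this layer's output (customized)
        self.all_layers.extend([self.outputs])

net = InputLayer([1, 2, 3])
net = DoubleLayer(layer=net)
print(net.outputs)  # [2, 4, 6]
```

Because ``DoubleLayer`` adds no trainable variables, ``all_params`` stays empty; only ``all_layers`` grows.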
+Your Dense layer
+^^^^^^^^^^^^^^^^^^^
 
 Before creating your own TensorLayer layer, let's have a look at the Dense layer.
 It creates a weight matrix and a bias vector if they do not exist, then implements
 the output expression.
 At the end, as a layer with parameters, we also need to append the parameters to ``all_params``.
 
-
 .. code-block:: python
 
     class MyDenseLayer(Layer):
@@ -146,42 +182,6 @@ At the end, as a layer with parameter, we also need to append the parameters int
             self.all_layers.extend([self.outputs])
             self.all_params.extend([W, b])
 
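Conceptually, the dense layer computes ``activation(inputs * W + b)`` and registers ``W`` and ``b`` as trainable parameters. The following NumPy sketch shows just that computation (``dense_forward`` is a hypothetical helper for illustration, not part of the TensorLayer API):

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_forward(inputs, n_units, act=lambda x: np.maximum(x, 0.0)):
    # create a weight matrix and a bias vector, as the layer does on first use
    n_in = inputs.shape[-1]
    W = rng.normal(scale=0.1, size=(n_in, n_units))
    b = np.zeros(n_units)
    # the output expression: activation(inputs * W + b)
    outputs = act(inputs @ W + b)
    # a real layer would append W and b to all_params here
    return outputs, [W, b]

x = np.ones((4, 8))                      # batch of 4 samples, 8 features each
y, params = dense_forward(x, n_units=16)
print(y.shape)  # (4, 16)
```

The two returned arrays correspond to what ``MyDenseLayer`` appends to ``all_params``, which is how the optimizer later finds them.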
-Your layer
------------------
-
-A simple layer
-^^^^^^^^^^^^^^^
-
-To implement a custom layer in TensorLayer, you will have to write a Python class
-that subclasses ``Layer`` and implements the ``outputs`` expression.
-
-The following is an example implementation of a layer that multiplies its input by 2:
-
-.. code-block:: python
-
-    class DoubleLayer(Layer):
-        def __init__(
-            self,
-            layer=None,
-            name='double_layer',
-        ):
-            # check layer name (fixed)
-            Layer.__init__(self, name=name)
-
-            # the input of this layer is the output of the previous layer (fixed)
-            self.inputs = layer.outputs
-
-            # operation (customized)
-            self.outputs = self.inputs * 2
-
-            # get stuff from the previous layer (fixed)
-            self.all_layers = list(layer.all_layers)
-            self.all_params = list(layer.all_params)
-            self.all_drop = dict(layer.all_drop)
-
-            # update layer (customized)
-            self.all_layers.extend([self.outputs])
-
 
 Modifying Pre-train Behaviour
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^