docs/tutorials/3-neural_network/convolutiona_neural_network (1 file changed, +1 -4 lines)

@@ -197,9 +197,6 @@ There are different types of variance-scaling initializers. The one we
 used in is the one proposed by the paper `Understanding the difficulty
 of training deep feedforward neural
 networks <http://jmlr.org/proceedings/papers/v9/glorot10a/glorot10a.pdf> `__
-and provided by the TensorFlow. is the one proposed by the paper
-`Understanding the difficulty of training deep feedforward neural
-networks <http://jmlr.org/proceedings/papers/v9/glorot10a/glorot10a.pdf> `__
 and provided by the TensorFlow.

 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
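The initializer discussed in the hunk above is the Glorot/Xavier variance-scaling scheme shipped with TensorFlow. Below is a minimal sketch of attaching it to a layer, assuming the TensorFlow 1.x API; the placeholder shape and layer parameters are illustrative and not taken from the tutorial's own architecture.

.. code:: python

    import tensorflow as tf

    # Xavier/Glorot initializer: a variance-scaling scheme that sizes the
    # initial weights by the layer's fan-in and fan-out (Glorot & Bengio, 2010).
    initializer = tf.contrib.layers.xavier_initializer(uniform=True)

    # Hypothetical input: batches of 32x32 RGB images (not the tutorial's data).
    images = tf.placeholder(tf.float32, shape=[None, 32, 32, 3])

    # A convolutional layer whose kernel is drawn from the Xavier distribution.
    conv1 = tf.layers.conv2d(
        inputs=images,
        filters=32,
        kernel_size=[5, 5],
        padding="same",
        activation=tf.nn.relu,
        kernel_initializer=initializer,
    )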
@@ -489,7 +486,7 @@ get back to it in another post.
 
 The image summaries are created which has the duty of
 visualizing the input elements to the summary tensor. These elements here
-are 3 random images from the train data. In The outputs of different layers will be fed to the relevant summary tensor.
+are 3 random images from the train data. In the outputs of different layers will be fed to the relevant summary tensor.
 Finally, some scalar summaries are created in order
 to track the *training convergence * and *testing performance *. The
 collections argument in summary definitions is a supervisor which direct
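For reference, a minimal sketch of the summary setup this hunk describes, assuming the TensorFlow 1.x summary API; the tensor names and the "train"/"test" collection keys are illustrative assumptions rather than the tutorial's own code.

.. code:: python

    import tensorflow as tf

    # Hypothetical tensors to be summarized (not the tutorial's variables).
    images = tf.placeholder(tf.float32, shape=[None, 32, 32, 3], name="images")
    loss = tf.placeholder(tf.float32, shape=[], name="loss")
    accuracy = tf.placeholder(tf.float32, shape=[], name="accuracy")

    # Image summary: visualize up to 3 images from the current batch.
    tf.summary.image("input_images", images, max_outputs=3, collections=["train"])

    # Scalar summaries: track training convergence and testing performance.
    tf.summary.scalar("train_loss", loss, collections=["train"])
    tf.summary.scalar("test_accuracy", accuracy, collections=["test"])

    # The collections argument directs each summary into a named group, so the
    # training and testing summary ops can be merged and written separately.
    train_summary_op = tf.summary.merge_all(key="train")
    test_summary_op = tf.summary.merge_all(key="test")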