!!! summary ""
    Version 1.5.3 of DynaML, released August 13, 2017, adds TensorFlow integration to the library and reorganises its modules.


## Additions

### TensorFlow Integration

**Package** `dynaml.tensorflow`

#### Inception v2

The [_Inception_](https://www.cs.unc.edu/~wliu/papers/GoogLeNet.pdf) architecture, proposed by Google, is an important building block of _convolutional neural network_ architectures used in vision applications.



DynaML now offers the Inception cell as a computational layer.

```scala
import io.github.mandar2812.dynaml.pipes._
import io.github.mandar2812.dynaml.tensorflow._
import org.platanios.tensorflow.api._

//Create a ReLU activation, given a string name/identifier.
val relu_act = DataPipe(tf.learn.ReLU(_))

//Learn 10 filters in each branch of the inception cell.
val filters = Seq(10, 10, 10, 10)

//3 input channels; apply batch normalisation after each convolution.
val inception_cell = dtflearn.inception_unit(
  3, filters, relu_act,
  use_batch_norm = true)(layer_index = 1)
```

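Since the cell is a layer, it composes with other layers via the `>>` operator used by the underlying tensorflow_scala `tf.learn` API. The sketch below is illustrative only: the `Flatten`/`Linear` layer names and signatures follow tensorflow_scala conventions of that era and are assumptions, not part of this release's documented API.

```scala
import io.github.mandar2812.dynaml.pipes._
import io.github.mandar2812.dynaml.tensorflow._
import org.platanios.tensorflow.api._

val relu_act = DataPipe(tf.learn.ReLU(_))
val filters  = Seq(10, 10, 10, 10)

//Stack two inception cells, then flatten and read out with a linear layer.
//The second cell sees 40 input channels: the first cell concatenates its
//4 branches of 10 filters each.
val architecture =
  dtflearn.inception_unit(3, filters, relu_act)(layer_index = 1) >>
  dtflearn.inception_unit(40, filters, relu_act)(layer_index = 2) >>
  tf.learn.Flatten("Flatten") >>
  tf.learn.Linear("Output", 10)
```

Note the channel bookkeeping: an inception cell's output depth is the sum of its branch filter counts, so downstream layers must be sized accordingly.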

### Library Organisation

- Removed the `dynaml-notebook` module.

## Bugfixes


## Changes
