Commit 737fc11

Merge pull request #1282 from rstudio/update-applications
Update applications

- New applications:
  - MobileNet V3: `application_mobilenet_v3_large()`, `application_mobilenet_v3_small()`
  - ResNet: `application_resnet101()`, `application_resnet152()`, `resnet_preprocess_input()`
  - ResNet V2: `application_resnet50_v2()`, `application_resnet101_v2()`, `application_resnet152_v2()` and `resnet_v2_preprocess_input()`
  - EfficientNet: `application_efficientnet_b{0,1,2,3,4,5,6,7}()`
- Many existing `application_*()`'s gain argument `classifier_activation`, with default `'softmax'`.
  Affected: `application_{xception, inception_resnet_v2, inception_v3, mobilenet, vgg16, vgg19}()`
2 parents: 74a0f0c + d9b32b5; commit 737fc11

21 files changed: +1527 additions, -365 deletions
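For orientation, the new application constructors follow the same calling pattern as the existing `application_*()` functions. A minimal sketch (assuming a working keras/TensorFlow installation; the argument values below are illustrative and not part of this commit):

library(keras)

# Instantiate one of the newly added applications with ImageNet weights.
# `classifier_activation` is the new argument described in the commit
# message; "softmax" is its stated default, written out here only for
# illustration.
model <- application_efficientnet_b0(
  weights = "imagenet",
  include_top = TRUE,
  classifier_activation = "softmax"
)

summary(model)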

NAMESPACE

Lines changed: 17 additions & 0 deletions
@@ -93,14 +93,29 @@ export(application_densenet)
 export(application_densenet121)
 export(application_densenet169)
 export(application_densenet201)
+export(application_efficientnet_b0)
+export(application_efficientnet_b1)
+export(application_efficientnet_b2)
+export(application_efficientnet_b3)
+export(application_efficientnet_b4)
+export(application_efficientnet_b5)
+export(application_efficientnet_b6)
+export(application_efficientnet_b7)
 export(application_inception_resnet_v2)
 export(application_inception_v3)
 export(application_mobilenet)
 export(application_mobilenet_v2)
+export(application_mobilenet_v3_large)
+export(application_mobilenet_v3_small)
 export(application_nasnet)
 export(application_nasnetlarge)
 export(application_nasnetmobile)
+export(application_resnet101)
+export(application_resnet101_v2)
+export(application_resnet152)
+export(application_resnet152_v2)
 export(application_resnet50)
+export(application_resnet50_v2)
 export(application_vgg16)
 export(application_vgg19)
 export(application_xception)
@@ -548,6 +563,8 @@ export(regularizer_l1)
 export(regularizer_l1_l2)
 export(regularizer_l2)
 export(reset_states)
+export(resnet_preprocess_input)
+export(resnet_v2_preprocess_input)
 export(run_dir)
 export(save_model_hdf5)
 export(save_model_tf)
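As a usage note (not part of the diff): the two newly exported preprocessing helpers pair with their respective model families, since ResNet and ResNet V2 expect different input scaling. A rough sketch, assuming an image batch `x` of shape (1, 224, 224, 3) with values in [0, 255]:

library(keras)

# Hypothetical input batch: 1 image, 224 x 224 pixels, 3 channels, 0-255.
x <- array(runif(1 * 224 * 224 * 3, min = 0, max = 255),
           dim = c(1, 224, 224, 3))

# Each model family gets its matching preprocessing function.
x_v1 <- resnet_preprocess_input(x)      # for application_resnet50/101/152()
x_v2 <- resnet_v2_preprocess_input(x)   # for application_resnet*_v2()

model <- application_resnet50_v2(weights = "imagenet")
preds <- predict(model, x_v2)   # 1 x 1000 class probabilities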

NEWS.md

Lines changed: 26 additions & 11 deletions
@@ -15,34 +15,49 @@
   "Working with RNNs".
 
 - New layers:
-  - `layer_additive_attention()`
-  - `layer_conv_lstm_1d()`
+  - `layer_additive_attention()`
+  - `layer_conv_lstm_1d()`
   - `layer_conv_lstm_3d()`
-
+
 - `layer_lstm()` default value for `recurrent_activation` changed from `"hard_sigmoid"` to `"sigmoid"`.
 
 - `layer_cudnn_gru()` and `layer_cudnn_lstm()` are deprecated. `layer_gru()` and `layer_lstm()` will
   automatically use CuDNN if it is available.
 
 - New vignette: "Transfer learning and fine-tuning".
 
+- New applications:
+  - MobileNet V3: `application_mobilenet_v3_large()`, `application_mobilenet_v3_small()`
+  - ResNet: `application_resnet101()`, `application_resnet152()`, `resnet_preprocess_input()`
+  - ResNet V2: `application_resnet50_v2()`, `application_resnet101_v2()`,
+    `application_resnet152_v2()` and `resnet_v2_preprocess_input()`
+  - EfficientNet: `application_efficientnet_b{0,1,2,3,4,5,6,7}()`
+
+- Many existing `application_*()`'s gain argument `classifier_activation`, with default `'softmax'`.
+  Affected: `application_{xception, inception_resnet_v2, inception_v3, mobilenet, vgg16, vgg19}()`
+
 - New function `%<-active%`, a ergonomic wrapper around `makeActiveBinding()`
   for constructing Python `@property` decorated methods in `%py_class%`.
 
 - `bidirectional()` sequence processing layer wrapper gains a `backwards_layer` arguments.
 
-- Global pooling layers `layer_global_{max,average}_pooling_{1,2,3}d()` gain a
+- Global pooling layers `layer_global_{max,average}_pooling_{1,2,3}d()` gain a
   `keepdims` argument with default value `FALSE`.
 
-- Signatures for layer functions are in the process of being simplified.
-  Standard layer arguments are moving to `...` where appropriate (and will need to be provided as named arguments).
-  Standard layer arguments include: `input_shape`, `batch_input_shape`, `batch_size`, `dtype`, `name`, `trainable`, `weights`.
-  Layers updated: `layer_global_{max,average}_pooling_{1,2,3}d()`, `time_distributed()`, `bidirectional()`.
-
-- All the backend function with a shape argument `k_*(shape =)` that now accept a
+- Signatures for layer functions are in the process of being simplified.
+  Standard layer arguments are moving to `...` where appropriate
+  (and will need to be provided as named arguments).
+  Standard layer arguments include:
+  `input_shape`, `batch_input_shape`, `batch_size`, `dtype`,
+  `name`, `trainable`, `weights`.
+  Layers updated:
+  `layer_global_{max,average}_pooling_{1,2,3}d()`,
+  `time_distributed()`, `bidirectional()`.
+
+- All the backend function with a shape argument `k_*(shape =)` that now accept a
   a mix of integer tensors and R numerics in the supplied list.
 
-- `k_random_uniform()` now automatically coerces `minval` and `maxval` to the output dtype.
+- `k_random_uniform()` now automatically casts `minval` and `maxval` to the output dtype.
 
 # keras 2.6.1
 
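Two of the NEWS items above are easiest to see in code. A brief sketch (assuming a working keras installation; the input shape and the layer name are invented for the example):

library(keras)

inputs <- layer_input(shape = c(32, 32, 3))

# `keepdims = TRUE` keeps the pooled spatial axes as size-1 dimensions,
# so the output shape is (batch, 1, 1, 3) rather than (batch, 3).
# `name` is one of the standard layer arguments that now travel through
# `...`, so it must be supplied as a named argument.
pooled <- inputs %>%
  layer_global_average_pooling_2d(keepdims = TRUE, name = "gap")

model <- keras_model(inputs, pooled)
summary(model)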