Commit c363106
Author: Wenbing Li

Add the missing keras activation parameter in the conv layers. (#217)

* Add the missing keras activation parameter in the conv layers.
* Remove exp activation.
* The sudden release of onnx 1.4 looks like a disaster.

1 parent c2b6b59, commit c363106

File tree: 4 files changed, +12 −4 lines changed

.azure-pipelines/linux-conda-CI.yml (3 additions, 0 deletions)

@@ -15,10 +15,12 @@ jobs:
     matrix:
       Python27:
         python.version: '2.7'
+        ONNX_PATH: onnx==1.2.3
       # Python35:
       #   python.version: '3.5'
       Python36:
         python.version: '3.6'
+        ONNX_PATH: onnx==1.3.0
     maxParallel: 3

   steps:
@@ -36,6 +38,7 @@ jobs:
       conda install -c conda-forge cmake
       conda install -c conda-forge openmpi
       conda install -c conda-forge tensorflow
+      pip install $(ONNX_PATH)
       pip install -r requirements.txt
       pip install -r requirements-dev.txt
       test '$(python.version)' != '2.7' && pip install onnxruntime

.azure-pipelines/win32-conda-CI.yml (1 addition, 1 deletion)

@@ -24,7 +24,7 @@ jobs:

       Python36:
         python.version: '3.6'
-        ONNX_PATH: onnx
+        ONNX_PATH: onnx==1.3.0
         KERAS: keras
         COREML_PATH: git+https://github.com/apple/coremltools

onnxmltools/convert/common/_apply_operation.py (1 addition, 1 deletion)

@@ -378,7 +378,7 @@ def apply_prelu(scope, input_name, output_name, container, operator_name=None, s
     container.add_node('PRelu', [input_name, slope_tensor_name], output_name, op_version=7, name=name)


-def apply_elu(scope, input_name, output_name, container, operator_name=None, alpha=None):
+def apply_elu(scope, input_name, output_name, container, operator_name=None, alpha=1.0):
     _apply_unary_operation(scope, 'Elu', input_name, output_name, container, operator_name, alpha=alpha)
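The default change from `alpha=None` to `alpha=1.0` aligns the helper with the usual ELU definition: both Keras's ELU activation and the ONNX Elu operator default `alpha` to 1.0, whereas `None` is not a valid attribute value. A minimal reference sketch of the function the operator computes (illustrative only, not code from this repo):

```python
import math

def elu(x, alpha=1.0):
    # Reference ELU with the alpha=1.0 default the patch adopts;
    # Keras's ELU and the ONNX Elu operator both use this default.
    # elu(x) = x for x > 0, alpha * (exp(x) - 1) otherwise.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)
```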
onnxmltools/convert/keras/operator_converters/Dense.py (7 additions, 2 deletions)

@@ -7,13 +7,18 @@
 from keras.activations import get as _get_activation
 from keras.layers import Dense
 from ....proto import onnx_proto
-from ...common._apply_operation import apply_sigmoid, apply_softmax, apply_identity, apply_relu, apply_add
+from ...common._apply_operation import apply_sigmoid, apply_softmax, apply_identity, apply_add
+from ...common._apply_operation import apply_relu, apply_elu, apply_selu, apply_tanh, apply_hard_sigmoid
 from ...common._registration import register_converter


 _activation_map = {_get_activation('sigmoid'): apply_sigmoid,
                    _get_activation('softmax'): apply_softmax,
                    _get_activation('linear'): apply_identity,
-                   _get_activation('relu'): apply_relu}
+                   _get_activation('relu'): apply_relu,
+                   _get_activation('elu'): apply_elu,
+                   _get_activation('selu'): apply_selu,
+                   _get_activation('tanh'): apply_tanh,
+                   _get_activation('hard_sigmoid'): apply_hard_sigmoid}


 def convert_keras_dense(scope, operator, container):
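The Dense.py change extends a lookup table keyed by Keras activation functions, so the converter can dispatch each layer's activation to the matching `apply_*` helper instead of special-casing a few names. The pattern can be sketched in isolation as follows (a standalone illustration with scalar stand-ins for the real converters; the names here are hypothetical, not the repo's actual API):

```python
import math

# Scalar stand-ins for the apply_* converter helpers.
def apply_relu(x):    return max(x, 0.0)
def apply_tanh(x):    return math.tanh(x)
def apply_sigmoid(x): return 1.0 / (1.0 + math.exp(-x))

# Dispatch table: activation name -> handler, mirroring _activation_map.
_activation_map = {'relu': apply_relu,
                   'tanh': apply_tanh,
                   'sigmoid': apply_sigmoid}

def convert_activation(name, x):
    # Look up the handler; unsupported activations fail loudly,
    # which is the behavior a converter wants for unmapped layers.
    try:
        return _activation_map[name](x)
    except KeyError:
        raise ValueError("Unsupported activation: " + name)
```

Adding support for a new activation then only requires one new dict entry, which is exactly what this commit does for elu, selu, tanh, and hard_sigmoid.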
