# Tensorflow Backend and Frontend for ONNX
[![Build Status](https://travis-ci.org/onnx/onnx-tensorflow.svg?branch=master)](https://travis-ci.org/onnx/onnx-tensorflow)

## To convert models between Tensorflow and ONNX:

### Use CLI:
Tensorflow -> ONNX: `onnx-tf convert -t onnx -i /path/to/input.pb -o /path/to/output.onnx`

ONNX -> Tensorflow: `onnx-tf convert -t tf -i /path/to/input.onnx -o /path/to/output.pb`

### Convert programmatically:

[Tensorflow -> ONNX](https://github.com/onnx/onnx-tensorflow/blob/master/example/tf_to_onnx.py)
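The linked Tensorflow -> ONNX example boils down to roughly the following sketch (based on an earlier revision of this README; `input_path` and `output_path` are placeholders for your own files):

```python
def infer_graph_outputs(graph_def):
    """Return names of nodes whose output is not consumed by any other node.

    These are a reasonable guess at the graph's outputs when they are not
    known in advance.
    """
    nodes, node_inputs = set(), set()
    for node in graph_def.node:
        nodes.add(node.name)
        node_inputs.update(node.input)
    return sorted(nodes - node_inputs)


def convert_tf_to_onnx(input_path, output_path):
    # Deferred imports so the sketch reads without the packages installed.
    from tensorflow.core.framework import graph_pb2
    from onnx_tf.frontend import tensorflow_graph_to_onnx_model

    # Load the frozen Tensorflow graph from a .pb file.
    graph_def = graph_pb2.GraphDef()
    with open(input_path, "rb") as f:
        graph_def.ParseFromString(f.read())

    # Convert to an ONNX model and serialize it.
    model = tensorflow_graph_to_onnx_model(
        graph_def, infer_graph_outputs(graph_def), ignore_unimplemented=True)
    with open(output_path, "wb") as f:
        f.write(model.SerializeToString())
```

Note that the output names are inferred here as a convenience; if you know your graph's output node names, pass them directly instead.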
[ONNX -> Tensorflow](https://github.com/onnx/onnx-tensorflow/blob/master/example/onnx_to_tf.py)
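The linked ONNX -> Tensorflow example amounts to roughly this (a sketch from an earlier revision of this README; paths are placeholders):

```python
def convert_onnx_to_tf(input_path, output_path):
    # Deferred imports so the sketch reads without the packages installed.
    import onnx
    from onnx_tf.backend import prepare

    # Load the ONNX model and wrap it in a Tensorflow backend representation.
    onnx_model = onnx.load(input_path)
    tf_rep = prepare(onnx_model)

    # Serialize the Tensorflow graph to a .pb file.
    tf_rep.export_graph(output_path)
```

For example, `convert_onnx_to_tf("model.onnx", "model.pb")` writes a graph you can load back with standard Tensorflow tooling.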
## To do inference on an ONNX model using the Tensorflow backend:
```
import onnx
from onnx_tf.backend import prepare

output = prepare(onnx.load(input_path)).run(input)
```