# TPU SavedModel Export API for TF2.x

Status         | Accepted
:------------- | :-----------------------------------------------------------
**RFC #**      | [171](https://github.com/tensorflow/community/pull/171)
**Author(s)**  | Zhuoran Liu ([email protected]), Youlong Cheng ([email protected])
**Sponsor**    | Jonathan Hseu ([email protected])
**Updated**    | 2020-11-06

## Objective

whatever reason, e.g. CPU computations can be parallelized, users don’t have
enough TPU resources, etc. In this case, there has to be a way for them to tell
SavedModel that only ‘dense’ is to run on TPU.</center>

## Design

The general idea is to let users store a function-alias mapping when saving a
model, so that when they later use downstream graph transformation tools to
rewrite the model for TPU serving, they can refer to the functions to be
rewritten by alias.

This alias-mapping mechanism is needed because a single tf.function can
generate many ConcreteFunctions. If a downstream tool wants to refer to all
concrete functions generated by a single tf.function, it can use the
`function_aliases` argument to store a map from the alias name to all of the
concrete function names.
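
The fan-out from one alias to several concrete functions can be illustrated
with a small pure-Python sketch (no TensorFlow involved; `concrete_functions_of`,
`build_function_aliases`, and the `__inference_*` names are all made up for
illustration, not part of the proposal):

```python
# Illustrative only: a single tf.function traced with two different input
# signatures yields two ConcreteFunctions; both should resolve to one alias.
concrete_functions_of = {
    'func': ['__inference_func_42', '__inference_func_57'],
}

def build_function_aliases(aliases):
  """Expand {alias: function name} into the stored map
  {concrete function name: alias}."""
  mapping = {}
  for alias, fn_name in aliases.items():
    for concrete_name in concrete_functions_of[fn_name]:
      mapping[concrete_name] = alias
  return mapping

build_function_aliases({'my_func': 'func'})
# {'__inference_func_42': 'my_func', '__inference_func_57': 'my_func'}
```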

### Major changes

+   For `tf.saved_model.SaveOptions`: a new field `function_aliases` is added,
    allowing users to specify aliases for the functions they may want to have
    rewritten by external graph transformation tools (the TPU Inference
    Converter in this case);
+   For `MetaInfoDef` in `MetaGraphDef` in `SavedModel`: a new field
    `function_aliases` is added, storing a mapping from FunctionDef names to
    their aliases.

### User-facing API

Users can give the `FunctionDef`s they may later want to rewrite for TPU
inference an alias when saving the model:

```python
import tensorflow as tf

class MyModel(tf.Module):

  @tf.function
  def func(self):
    ...

  @tf.function
  def serve(self):
    ...
    self.func()

model = MyModel()
signatures = {
    'serving_default': model.serve.get_concrete_function(),
}
options = tf.saved_model.SaveOptions(function_aliases={
    'my_func': model.func,
})
tf.saved_model.save(model, export_dir, signatures, options)
```

They can then leverage a model conversion tool to convert the CPU model for
TPU serving:

```python
MyModelConversionTool(input_saved_model, output_saved_model,
                      function_alias='my_func')
```

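On the consumer side, such a converter resolves an alias back to the concrete
functions that carry it by scanning the stored map. A minimal pure-Python
sketch (no TensorFlow; `functions_for_alias` and the `__inference_*` names are
hypothetical, chosen for illustration):

```python
def functions_for_alias(function_aliases, alias):
  """Return all concrete-function names tagged with `alias`.

  `function_aliases` mirrors the saved mapping:
  concrete function name -> alias.
  """
  return sorted(name for name, a in function_aliases.items() if a == alias)

# Hypothetical contents of a SavedModel's alias map.
saved_aliases = {
    '__inference_func_42': 'my_func',
    '__inference_func_57': 'my_func',
    '__inference_serve_77': 'serve_fn',
}
functions_for_alias(saved_aliases, 'my_func')
# ['__inference_func_42', '__inference_func_57']
```
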
## Alternative Design Considered

### Caveat