3 files changed: +2 -4 lines changed

text_classification/hierarchical/deploy/paddle_serving

@@ -135,9 +135,7 @@ python3 embedding_insert.py
| ------------ | ------------ |
| 10 million records | 12min24s |

- In addition, Milvus provides a visual management UI for conveniently browsing the data; it can be installed from [Milvus Enterprise Manager](https://zilliz.com/products/em).
-
- ![](../../img/mem.png)
+ In addition, Milvus provides a visual management UI for conveniently browsing the data; it can be installed from [Milvus Enterprise Manager](https://github.com/zilliztech/attu)


Run the recall script:
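The new link points at Attu, Milvus's open-source management UI. As a hedged illustration (not taken from this diff), Attu is commonly started as a Docker container pointed at the Milvus server; the image tag, host port, and the `<milvus_host>` placeholder below are assumptions to adjust for the actual deployment:

```shell
# Illustrative sketch: run the Attu web UI in Docker against an existing Milvus instance.
# Replace <milvus_host> with an address reachable from inside the container;
# 19530 is Milvus's default gRPC port.
docker run -d --name attu \
  -p 8000:3000 \
  -e MILVUS_URL=<milvus_host>:19530 \
  zilliz/attu:latest

# The management UI is then served at http://localhost:8000
```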
@@ -47,7 +47,7 @@ pip install faster_tokenizer

When deploying the service with Paddle Serving, the saved inference model needs to be converted into a format that Serving can easily deploy.

- Use the installed paddle_serving_client to convert the static-graph parameter model into the Serving format. See [Static Graph Model Export](../../README.md) for how to use the [static-graph export script](export_model.py) to turn the trained model into a static-graph model; set the model path --dirname to match your actual setup.
+ Use the installed paddle_serving_client to convert the static-graph parameter model into the Serving format. See [Static Graph Model Export](../../README.md) for how to use the [static-graph export script](../../export_model.py) to turn the trained model into a static-graph model; set the model path --dirname to match your actual setup.

```shell
python -m paddle_serving_client.convert --dirname ../../export --model_filename float32.pdmodel --params_filename float32.pdiparams
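# Illustrative sketch, not part of the original diff: paddle_serving_client.convert
# also accepts --serving_server and --serving_client to choose the output directories
# for the converted server-side and client-side configs; the directory names below
# are placeholders and should be adjusted to the actual project layout.
python -m paddle_serving_client.convert --dirname ../../export \
    --model_filename float32.pdmodel \
    --params_filename float32.pdiparams \
    --serving_server serving_server \
    --serving_client serving_client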