1 parent 3a76ea0 commit 4203d4f
README.md
@@ -88,6 +88,8 @@ To serve NLP models through a container, run the following commands:
export MODEL_PACKAGE_FULL_PATH=<PATH/TO/MODEL_PACKAGE.zip>
export CMS_UID=$(id -u $USER)
export CMS_GID=$(id -g $USER)
+# NOTE: use this if you wish to save models locally (i.e. run without the MLflow component)
+# export MLFLOW_TRACKING_URI="file:///tmp/mlruns/"
docker compose -f docker-compose.yml up -d <model-service>
```
Then the API docs will be accessible at localhost on the mapped port specified in `docker-compose.yml`. The container runs
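For reference, a minimal sketch of the full command sequence the updated README describes. `<PATH/TO/MODEL_PACKAGE.zip>` and `<model-service>` are placeholders to substitute, the `MLFLOW_TRACKING_URI` line is only needed when running without the MLflow tracking component, and the final `docker compose ps` check is an added verification step, not part of the README itself.

```bash
# Point the service at the packaged model and pass the host user/group IDs
export MODEL_PACKAGE_FULL_PATH=<PATH/TO/MODEL_PACKAGE.zip>
export CMS_UID=$(id -u $USER)
export CMS_GID=$(id -g $USER)

# Optional: log runs to a local file store instead of an MLflow tracking server
export MLFLOW_TRACKING_URI="file:///tmp/mlruns/"

# Start the chosen model service in the background
docker compose -f docker-compose.yml up -d <model-service>

# Verify the service is running and see which host port it maps to
docker compose -f docker-compose.yml ps <model-service>
```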