Commit 4203d4f

Add optional workaround for running with mlflow component

1 parent: 3a76ea0

File tree

1 file changed: +2 −0 lines changed


README.md

Lines changed: 2 additions & 0 deletions
````diff
@@ -88,6 +88,8 @@ To serve NLP models through a container, run the following commands:
 export MODEL_PACKAGE_FULL_PATH=<PATH/TO/MODEL_PACKAGE.zip>
 export CMS_UID=$(id -u $USER)
 export CMS_GID=$(id -g $USER)
+# NOTE: use if you wish to save models locally (i.e. run without the mlflow component)
+# export MLFLOW_TRACKING_URI="file:///tmp/mlruns/"
 docker compose -f docker-compose.yml up -d <model-service>
 ```
````
Then the API docs will be accessible at localhost on the mapped port specified in `docker-compose.yml`. The container runs
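The environment setup from the diff can be sketched as a small standalone shell snippet. The variable names (`CMS_UID`, `CMS_GID`, `MLFLOW_TRACKING_URI`) and the `file:///tmp/mlruns/` path come from the diff itself; the trailing `echo` is added here only to make the result visible:

```shell
# Give the container the invoking user's UID/GID so files written to mounted
# volumes are owned by that user rather than root.
export CMS_UID=$(id -u $USER)
export CMS_GID=$(id -g $USER)

# Optional workaround from this commit: point MLflow at a local file store
# so the stack can run without the mlflow tracking component.
export MLFLOW_TRACKING_URI="file:///tmp/mlruns/"

echo "UID=$CMS_UID GID=$CMS_GID tracking=$MLFLOW_TRACKING_URI"
```

Because `MLFLOW_TRACKING_URI` is a standard MLflow environment variable, any model service in the compose stack that inherits it will log runs under `/tmp/mlruns/` instead of calling out to a tracking server.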

0 commit comments
