
Commit 395b47b

Update how-to-batch-scoring-script.md
1 parent 39dc88d commit 395b47b

File tree

1 file changed (+21, -1 lines)

articles/machine-learning/how-to-batch-scoring-script.md

Lines changed: 21 additions & 1 deletion
@@ -33,7 +33,7 @@ The scoring script must contain two methods:

#### The `init` method

- Use the `init()` method for any costly or common preparation. For example, use it to load the model into a global object. This function will be called once at the beginning of the process. Your model's files will be available in an environment variable called `AZUREML_MODEL_DIR`. Use this variable to locate the files associated with the model.
+ Use the `init()` method for any costly or common preparation. For example, use it to load the model into a global object. This function will be called once at the beginning of the process. Your model's files will be available in an environment variable called `AZUREML_MODEL_DIR`. Use this variable to locate the files associated with the model. Notice that some models may be contained in a folder. See [how to find out which folder your model uses](#using-models-that-are-folders).

```python
def init():
```
@@ -154,6 +154,26 @@ For an example about how to achieve it see [Text processing with batch deploymen

Your deployment configuration controls the size of each mini-batch and the number of workers on each node. Take them into account when deciding whether to read the entire mini-batch to perform inference. When running multiple workers on the same instance, keep in mind that memory is shared across all the workers. Usually, increasing the number of workers per node should be accompanied by a decrease in the mini-batch size or by a change in the scoring strategy (if data size remains the same).
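One way to keep peak memory low when several workers share a node, as the paragraph suggests, is to score the mini-batch one file at a time instead of loading it whole. A sketch of a `run()` method doing that (the "inference" here is a placeholder; real code would call the loaded model):

```python
import os


def run(mini_batch):
    """Score one mini-batch; `mini_batch` is a list of input file paths.

    Iterating file by file keeps only one file's data in memory at a
    time, which matters when workers on the same node share memory.
    """
    results = []
    for file_path in mini_batch:
        # Placeholder "inference": record the file name and its size.
        # A real scoring script would parse the file and call the model.
        size = os.path.getsize(file_path)
        results.append(f"{os.path.basename(file_path)},{size}")
    return results
```

Reading the entire mini-batch at once can instead be faster for vectorized models, which is the trade-off the paragraph describes.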
+ ### Using models that are folders
+
+ The environment variable `AZUREML_MODEL_DIR` is typically used in the `init()` function to load the model. However, some models may contain their files inside a folder, and you may need to account for that when reading the files in this variable. You can identify the folder where your MLflow model is placed as follows:
+
+ 1. Go to [Azure Machine Learning portal](https://ml.azure.com).
+ 1. Go to the section __Models__.
+ 1. Select the model you are trying to deploy and click on the tab __Artifacts__.
+ 1. Take note of the folder that is displayed. This folder was indicated when the model was registered.
+
+ :::image type="content" source="media/how-to-deploy-mlflow-models-online-endpoints/mlflow-model-folder-name.png" lightbox="media/how-to-deploy-mlflow-models-online-endpoints/mlflow-model-folder-name.png" alt-text="Screenshot showing the folder where the model artifacts are placed.":::
+
+ Then you can use this path to load the model:
+
+ ```python
+ model_path = os.path.join(os.environ["AZUREML_MODEL_DIR"], "model")
+ ```
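If you prefer not to hard-code the folder name found in the portal, a small helper (an illustrative assumption, not part of the article) can resolve it at runtime by inspecting the contents of `AZUREML_MODEL_DIR`:

```python
import os
from typing import Optional


def resolve_model_path(model_dir: str, folder: Optional[str] = None) -> str:
    """Return the path to the model files inside AZUREML_MODEL_DIR.

    If `folder` is given, join it directly. Otherwise, when the directory
    contains exactly one subfolder, assume the model lives there; if not,
    fall back to the directory itself (files at the top level).
    """
    if folder is not None:
        return os.path.join(model_dir, folder)
    subdirs = [e for e in os.listdir(model_dir)
               if os.path.isdir(os.path.join(model_dir, e))]
    if len(subdirs) == 1:
        return os.path.join(model_dir, subdirs[0])
    return model_dir
```

The single-subfolder heuristic is a guess that works for the common layout shown above; checking the folder name in the portal remains the reliable method.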
## Next steps

* [Troubleshooting batch endpoints](how-to-troubleshoot-batch-endpoints.md).
