Now you have everything you need: the data inputs, the model, the output and your inference script. Let's build the batch inference pipeline containing ParallelRunStep.
### Prepare the environment
First, specify the dependencies for your script. This lets you install pip packages and configure the environment. Always include the **azureml-core** and **azureml-dataprep[pandas, fuse]** packages.
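For reference, a dependency set like this serializes to an ordinary conda environment specification. A rough sketch (the environment name and Python version here are illustrative assumptions, not from this tutorial):

```yaml
# Illustrative conda environment matching the pip packages above;
# the name and python version are assumptions.
name: batch-inference-env
dependencies:
  - python=3.8
  - pip:
    - azureml-core
    - azureml-dataprep[pandas, fuse]
```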
If you use a custom Docker image, make sure conda is installed in it.
```python
from azureml.core.environment import Environment
from azureml.core.conda_dependencies import CondaDependencies
from azureml.core.runconfig import DEFAULT_GPU_IMAGE