---
title: "Hello World - Python BentoML"
linkTitle: "Python BentoML"
weight: 1
type: "docs"
---

A simple machine learning model with API serving, written in Python using
[BentoML](https://github.com/bentoml/BentoML). BentoML is an open source
framework for high-performance ML model serving that supports all major machine
learning frameworks, including Keras, TensorFlow, PyTorch, fastai, and XGBoost.

This sample walks you through the steps of creating and deploying a machine learning
model using Python. It uses BentoML to package a classifier model trained
on the Iris dataset, then creates a container image and
deploys the image to Knative.

A Knative deployment guide for BentoML is also available in the
[BentoML documentation](https://docs.bentoml.org/en/latest/deployment/knative.html).

## Before you begin

- A Kubernetes cluster with Knative installed. Follow the
  [installation instructions](../../../../docs/install/README.md) if you need to
  create one.
- [Docker](https://www.docker.com) installed and running on your local machine,
  and a Docker Hub account configured (Docker Hub is used as the container registry).
- Python 3.6 or above installed and running on your local machine.
  - Install the `scikit-learn` and `bentoml` packages:

    ```shell
    pip install scikit-learn
    pip install bentoml
    ```

## Recreating sample code

Run the following code on your local machine to train a machine learning model and deploy it
as an API endpoint with Knative Serving.

1. BentoML creates a model API server via a prediction service abstraction. In
   `iris_classifier.py`, it defines a prediction service that requires a scikit-learn
   model, asks BentoML to figure out the required pip dependencies, and also defines an
   API, which is the entry point for accessing this machine learning service.

   {{% readfile file="iris_classifier.py" %}}

2. In `main.py`, it uses the classic
   [iris flower data set](https://en.wikipedia.org/wiki/Iris_flower_data_set)
   to train a classification model that can predict the species of an iris flower from
   given data, and then saves the model to local disk with BentoML.

   {{% readfile file="main.py" %}}

   Run the `main.py` file to train and save the model:

   ```shell
   python main.py
   ```

3. Use the BentoML CLI to check the saved model's information:

   ```shell
   bentoml get IrisClassifier:latest
   ```

   Example:

   ```shell
   > bentoml get IrisClassifier:latest
   {
     "name": "IrisClassifier",
     "version": "20200305171229_0A1411",
     "uri": {
       "type": "LOCAL",
       "uri": "/Users/bozhaoyu/bentoml/repository/IrisClassifier/20200305171229_0A1411"
     },
     "bentoServiceMetadata": {
       "name": "IrisClassifier",
       "version": "20200305171229_0A1411",
       "createdAt": "2020-03-06T01:12:49.431011Z",
       "env": {
         "condaEnv": "name: bentoml-IrisClassifier\nchannels:\n- defaults\ndependencies:\n- python=3.7.3\n- pip\n",
         "pipDependencies": "bentoml==0.6.2\nscikit-learn",
         "pythonVersion": "3.7.3"
       },
       "artifacts": [
         {
           "name": "model",
           "artifactType": "SklearnModelArtifact"
         }
       ],
       "apis": [
         {
           "name": "predict",
           "handlerType": "DataframeHandler",
           "docs": "BentoService API",
           "handlerConfig": {
             "orient": "records",
             "typ": "frame",
             "input_dtypes": null,
             "output_orient": "records"
           }
         }
       ]
     }
   }
   ```

4. Test run the API server. BentoML can start an API server from the saved model. Use
   the BentoML CLI to start an API server locally and test it with the `curl` command:

   ```shell
   bentoml serve IrisClassifier:latest
   ```

   In another terminal window, make a `curl` request with sample data to the API server
   and get the prediction result:

   ```shell
   curl -v -i \
     --header "Content-Type: application/json" \
     --request POST \
     --data '[[5.1, 3.5, 1.4, 0.2]]' \
     127.0.0.1:5000/predict
   ```
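
The `curl` payload is simply a JSON-encoded list of feature rows, one inner list per
flower to classify. As a minimal sketch, the same request can be composed with the Python
standard library (assuming the dev server from the step above is running on
`127.0.0.1:5000`; pass the prepared request to `urllib.request.urlopen` to actually send it):

```python
import json
from urllib import request

# One row per flower: [sepal_length, sepal_width, petal_length, petal_width]
rows = [[5.1, 3.5, 1.4, 0.2]]
payload = json.dumps(rows).encode("utf-8")

# Mirrors the curl flags above: a POST with a JSON body
req = request.Request(
    "http://127.0.0.1:5000/predict",
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(req.get_method(), req.full_url)
```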

## Building and deploying the sample

BentoML supports creating an API server Docker image from its saved model directory; a
Dockerfile is automatically generated when saving the model.

1. To build an API model server Docker image, replace `{username}` with your Docker Hub
   username and run the following commands:

   ```shell
   # jq might not be installed on your local system; follow the jq installation
   # instructions at https://stedolan.github.io/jq/download/
   saved_path=$(bentoml get IrisClassifier:latest -q | jq -r ".uri.uri")

   # Build the container image on your local machine
   docker build -t {username}/iris-classifier $saved_path

   # Push the image to the Docker registry
   docker push {username}/iris-classifier
   ```

2. In `service.yaml`, replace `{username}` with your Docker Hub username, and then deploy
   the service to Knative Serving with `kubectl`:

   {{% readfile file="service.yaml" %}}

   ```shell
   kubectl apply --filename service.yaml
   ```

3. Now that your service is created, Knative performs the following steps:

   - Creates a new immutable revision for this version of the app.
   - Performs network programming to create a route, ingress, service, and load
     balancer for your application.
   - Automatically scales your pods up and down (including to zero active
     pods).
|
| 165 | +4. Run the following command to find the domain URL for your service: |
| 166 | +
|
| 167 | + ```shell |
| 168 | + kubectl get ksvc iris-classifier --output=custom-columns=NAME:.metadata.name,URL:.status.url |
| 169 | +
|
| 170 | + NAME URL |
| 171 | + iris-classifier http://iris-classifer.default.example.com |
| 172 | + ``` |
| 173 | +
|
| 174 | +5. Replace the request URL with the URL return in the previous command, and execute the |
| 175 | + command to get prediction result from the deployed model API endpoint. |
| 176 | +
|
| 177 | + ```shell |
| 178 | + curl -v -i \ |
| 179 | + --header "Content-Type: application/json" \ |
| 180 | + --request POST \ |
| 181 | + --data '[[5.1, 3.5, 1.4, 0.2]]' \ |
| 182 | + http://iris-classifier.default.example.com/predict |
| 183 | +
|
| 184 | + [0] |
| 185 | + ``` |
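
The `[0]` in the response body is a predicted class index, one entry per input row. The
index ordering follows the targets of scikit-learn's built-in iris dataset, so it can be
mapped back to a species name; a small illustrative sketch:

```python
# Target ordering from scikit-learn's iris dataset: 0, 1, 2
species = ["setosa", "versicolor", "virginica"]

prediction = [0]  # example response body from the /predict endpoint
names = [species[i] for i in prediction]
print(names)  # → ['setosa']
```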

## Removing the sample app deployment

To remove the application from your cluster, delete the service record:

```shell
kubectl delete --filename service.yaml
```