
Commit c6bc3a5

Merge pull request #78773 from tsikiksr/master
Add schema sample to support consumption from Power BI
2 parents e4c4522 + 0b7d875 commit c6bc3a5


2 files changed: +50 -0 lines changed


articles/machine-learning/service/how-to-consume-web-service.md

Lines changed: 8 additions & 0 deletions
@@ -484,3 +484,11 @@ The results returned are similar to the following JSON document:
```JSON
[217.67978776218715, 224.78937091757172]
```

## Consume the service from Power BI

Power BI supports consumption of Azure Machine Learning web services to enrich the data in Power BI with predictions.

To generate a web service that can be consumed from Power BI, the schema must support the format that Power BI requires. [Learn how to create a Power BI-supported schema](https://docs.microsoft.com/azure/machine-learning/service/how-to-deploy-and-where#Example-script-with-dictionary-input-Support-consumption-from-Power-BI).

Once the web service is deployed, it can be consumed from Power BI dataflows. [Learn how to consume an Azure Machine Learning web service from Power BI](https://docs.microsoft.com/power-bi/service-machine-learning-integration).
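
As a quick check before connecting from Power BI, you can call the deployed service directly from Python. The following is a minimal sketch, assuming an entry script that uses the dictionary (DataFrame) input schema from the linked article; the scoring URI, key, and column names are placeholders, and the exact payload depends on the schema you defined:

```python
import json
import requests

# Placeholder values: replace with your service's scoring URI and (if authentication
# is enabled) its key. The column names must match your entry script's input schema.
scoring_uri = "<your-scoring-uri>"
api_key = "<your-api-key>"

# With the dictionary (DataFrame) input schema, each row is sent as a
# <column name: value> record under the top-level "data" key.
payload = {
    "data": [
        {
            "input_name_1": 5.1,
            "input_name_2": "value2",
            "input_name_3": 3
        }
    ]
}

headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer " + api_key  # Omit this header if auth is disabled
}

response = requests.post(scoring_uri, data=json.dumps(payload), headers=headers)
print(response.json())
```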

articles/machine-learning/service/how-to-deploy-and-where.md

Lines changed: 42 additions & 0 deletions
@@ -183,6 +183,48 @@ def run(data):
        return error
```

#### Example script with dictionary input (Support consumption from Power BI)

The following example demonstrates how to define the input data as a <key: value> dictionary by using a DataFrame. This method is supported for consuming the deployed web service from Power BI ([learn more about how to consume the web service from Power BI](https://docs.microsoft.com/en-us/power-bi/service-machine-learning-integration)):

```python
import json
import pickle
import numpy as np
import pandas as pd
import azureml.train.automl
from sklearn.externals import joblib
from azureml.core.model import Model

from inference_schema.schema_decorators import input_schema, output_schema
from inference_schema.parameter_types.numpy_parameter_type import NumpyParameterType
from inference_schema.parameter_types.pandas_parameter_type import PandasParameterType

def init():
    global model
    model_path = Model.get_model_path('model_name')  # Replace 'model_name' with your actual model name, if needed
    # Deserialize the model file back into a scikit-learn model
    model = joblib.load(model_path)

input_sample = pd.DataFrame(data=[{
    "input_name_1": 5.1,       # This is a decimal-type sample. Use the data type that reflects this column in your data
    "input_name_2": "value2",  # This is a string-type sample. Use the data type that reflects this column in your data
    "input_name_3": 3          # This is an integer-type sample. Use the data type that reflects this column in your data
}])

output_sample = np.array([0])  # This is an integer-type sample. Use the data type that reflects the expected result

@input_schema('data', PandasParameterType(input_sample))
@output_schema(NumpyParameterType(output_sample))
def run(data):
    try:
        result = model.predict(data)
        # You can return any data type, as long as it is JSON-serializable
        return result.tolist()
    except Exception as e:
        error = str(e)
        return error
```
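
For context, the following is a minimal deployment sketch for an entry script like the one above. It assumes the script is saved as score.py, the model is already registered in the workspace, and a conda specification named myenv.yml (a hypothetical file name) lists the script's dependencies; deployment targets and options are covered elsewhere in this article.

```python
# A minimal deployment sketch, not the only way to deploy. Names such as score.py,
# myenv.yml, and 'powerbi-sample-service' are placeholders.
from azureml.core import Workspace
from azureml.core.model import InferenceConfig, Model
from azureml.core.webservice import AciWebservice

ws = Workspace.from_config()
model = Model(ws, name='model_name')  # The registered model that init() loads

inference_config = InferenceConfig(entry_script='score.py',
                                   runtime='python',
                                   conda_file='myenv.yml')
deployment_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)

service = Model.deploy(ws, 'powerbi-sample-service', [model],
                       inference_config, deployment_config)
service.wait_for_deployment(show_output=True)
print(service.scoring_uri)
```

The scoring URI printed at the end is the endpoint that Power BI (or any other client) calls once the service is running.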
For more example scripts, see the following examples:

* Pytorch: [https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/training-with-deep-learning/train-hyperparameter-tune-deploy-with-pytorch](https://github.com/Azure/MachineLearningNotebooks/blob/master/how-to-use-azureml/training-with-deep-learning/train-hyperparameter-tune-deploy-with-pytorch)
