# Whisper Torchserve

This Truss lets you run a Whisper model using [TorchServe](https://pytorch.org/serve/) as the backend.

## Deployment

Before deployment:

1. Make sure you have a [Baseten account](https://app.baseten.co/signup) and [API key](https://app.baseten.co/settings/account/api_keys).
2. Install the latest version of Truss: `pip install --upgrade truss`

With `whisper/whisper-torchserve` as your working directory, you can deploy the model with:

```
truss push
```

Paste your Baseten API key if prompted.

For more information, see the [Truss documentation](https://truss.baseten.co).

## Model Inputs

The model takes one input:
- __audio__: An audio file, encoded as a base64 string (see the sketch below)
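
For example, here is a minimal sketch of building that input in Python; the local file `sample.wav` is a hypothetical placeholder for your own audio file:

```python
import base64

# Read the raw audio bytes and encode them as base64 text for the JSON body.
with open("sample.wav", "rb") as f:
    audio_b64 = base64.b64encode(f.read()).decode("utf-8")

payload = {"audio": audio_b64}
```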

## A few things to note

TorchServe requires a compiled `.mar` file in order to serve the model. This [README](https://github.com/pytorch/serve/blob/master/model-archiver/README.md) gives a brief explanation of how to generate the file. Once the `.mar` file is generated, place it in the `data/model_store` directory. The `data/` directory also contains a TorchServe configuration file called `config.properties`, which looks something like this:

```
inference_address=http://0.0.0.0:8888
batch_size=4
ipex_enable=true
async_logging=true

models={\
  "whisper_base": {\
    "1.0": {\
      "defaultVersion": true,\
      "marName": "whisper_base.mar",\
      "minWorkers": 1,\
      "maxWorkers": 2,\
      "batchSize": 4,\
      "maxBatchDelay": 500,\
      "responseTimeout": 24\
    }\
  }\
}
```

Here you can specify the `batchSize` as well as the name of your `.mar` file using `marName`. When TorchServe starts, it looks for the file named by `marName` inside the `data/model_store` directory.
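
If you have not produced the archive yet, here is a minimal sketch using the `torch-model-archiver` CLI from the README linked above. The `whisper_base.pt` checkpoint and `handler.py` handler names are hypothetical placeholders; substitute your own files:

```
torch-model-archiver \
  --model-name whisper_base \
  --version 1.0 \
  --serialized-file whisper_base.pt \
  --handler handler.py \
  --export-path data/model_store
```

The archive name is derived from `--model-name`, so this produces `data/model_store/whisper_base.mar`, matching the `marName` in the config above.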

## Invoking the model

Here is an example in Python:

```python
import base64

import requests


def wav_to_base64(file_path):
    """Read an audio file and return its contents as a base64 string."""
    with open(file_path, "rb") as wav_file:
        binary_data = wav_file.read()
        return base64.b64encode(binary_data).decode("utf-8")


# Replace <model-id> with your model's ID and BASETEN-API-KEY with your API key.
resp = requests.post(
    "https://model-<model-id>.api.baseten.co/development/predict",
    headers={"Authorization": "Api-Key BASETEN-API-KEY"},
    json={"audio": wav_to_base64("/path/to/audio-file/60-sec.wav")},
)

print(resp.json())
```

Here is a sample output:

```json
{"output": "Let me make it clear. His conduct is unacceptable. He's unfit. And be careful of what you're gonna get. He doesn't care for the American people. It's Donald Trump first. This is what I want people to understand. These people have... I mean, she has no idea what the hell the names of those provinces are, but she wants to send our sons and daughters and our troops and our military equipment to go fight it. Look at the blank expression. She doesn't know the names of the provinces. You do this at every debate. You say, no, don't interrupt me. I didn't interrupt you."}
```