Commit 9584b65

add gdb
1 parent 1fb9e34 commit 9584b65

File tree

1 file changed: +7 −3 lines changed


articles/machine-learning/how-to-inference-server-http.md

Lines changed: 7 additions & 3 deletions
```diff
@@ -156,6 +156,8 @@ There are two ways to use Visual Studio Code (VS Code) and [Python Extension](ht
 1. Start debugging session in VS Code. Select "Run" -> "Start Debugging" (or `F5`).
 
 - **Attach mode**: start the AzureML Inference HTTP Server in a command line and use VS Code + Python Extension to attach to the process.
+    > [!NOTE]
+    > If you're using a Linux environment, first install the `gdb` package by running `sudo apt-get install -y gdb`.
   1. Add the following configuration to `launch.json` for that workspace in VS Code:
 
      **launch.json**
```
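The full `launch.json` snippet the article refers to is not shown in this diff. As a sketch only (not the article's exact snippet), a standard VS Code Python attach configuration using the built-in process picker looks like this; `${command:pickProcess}` is the stock VS Code variable that prompts for the process ID mentioned in the steps below:

```json
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Python: Attach using Process Id",
            "type": "python",
            "request": "attach",
            "processId": "${command:pickProcess}"
        }
    ]
}
```

With an entry like this, starting the debug session opens a picker where you supply the server's process ID from the CLI logs.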
````diff
@@ -173,11 +175,13 @@ There are two ways to use Visual Studio Code (VS Code) and [Python Extension](ht
      ]
    }
    ```
-1. Start the inference server using CLI (`azmlinfsrv --entry_script score.py`).
-1. Start debugging session in VS Code.
+2. Start the inference server using CLI (`azmlinfsrv --entry_script score.py`).
+3. Start debugging session in VS Code.
    1. In VS Code, select "Run" -> "Start Debugging" (or `F5`).
-   1. Enter the process ID using the logs (from the inference server) displayed in the CLI.
+   2. Enter the process ID of the `azmlinfsrv` (not the `gunicorn`) using the logs (from the inference server) displayed in the CLI.
    :::image type="content" source="./media/how-to-inference-server-http/debug-attach-pid.png" alt-text="Screenshot of the CLI which shows the process ID of the server":::
+   > [!NOTE]
+   > If you're using a Linux environment, install the `gdb` package.
 
 In both ways, you can set [breakpoint](https://code.visualstudio.com/docs/editor/debugging#_breakpoints) and debug step by step.
 
````
