Commit 5fe84d4

feat: Use custom tool nodes for auth handling (#551)
This PR updates the existing workflow to replace the prebuilt tool node with a new custom tool node. The new node handles tool auth by reading auth headers from the `RunnableConfig` provided by LangGraph: it inspects the auth requirements of the underlying core tool within the `ToolboxTool`, and if the tool requires authentication, it dynamically creates an authenticated copy of the tool by attaching the necessary auth token getters via the `add_auth_token_getter` API. That authenticated tool instance is used for the call and then discarded. The same auth handling logic has also been applied to the node responsible for ticket insertion.

> [!NOTE]
> The functionality introduced in these custom nodes will be abstracted into the `ToolboxTool` itself in an upcoming release of `toolbox-langchain` ([#291](googleapis/mcp-toolbox-sdk-python#291)). This will simplify the workflow in the future by handling authentication directly within the tool.
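The per-call auth flow described above can be sketched roughly as follows. This is an illustrative stand-in, not the PR's actual code: the `ToolboxTool` stub, the `custom_tool_node` function, and the `auth_headers` config layout are all assumptions; only the copy-returning behavior of `add_auth_token_getter` mirrors the real `toolbox-langchain` API.

```python
# Minimal sketch of the custom tool node's auth handling (illustrative only).
# `ToolboxTool` here is a stub standing in for the real toolbox-langchain class.
class ToolboxTool:
    def __init__(self, name, required_auth=(), token_getters=None):
        self.name = name
        self.required_auth = list(required_auth)
        self.token_getters = dict(token_getters or {})

    def add_auth_token_getter(self, service, getter):
        # The real API returns an authenticated *copy*; the original is untouched.
        getters = dict(self.token_getters)
        getters[service] = getter
        return ToolboxTool(self.name, self.required_auth, getters)

    def invoke(self, args):
        missing = [s for s in self.required_auth if s not in self.token_getters]
        if missing:
            raise PermissionError(f"missing auth token getters: {missing}")
        return f"{self.name} called with {args}"


def custom_tool_node(tool, args, config):
    """Hypothetical node body: read auth headers from the RunnableConfig,
    attach token getters to a temporary copy if auth is required, invoke
    the copy, then let it be discarded."""
    headers = config.get("configurable", {}).get("auth_headers", {})
    if not tool.required_auth:
        return tool.invoke(args)
    authed = tool
    for service in tool.required_auth:
        token = headers.get(service)
        authed = authed.add_auth_token_getter(service, lambda t=token: t)
    return authed.invoke(args)
```

The ticket-insertion node would follow the same pattern, differing only in which tool it wraps.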
1 parent: ba039fd · commit: 5fe84d4

26 files changed: +1492 −1848 lines changed

DEVELOPER.md

Lines changed: 69 additions & 127 deletions
````diff
@@ -2,128 +2,114 @@
 
 ## Before you begin
 
-1. Make sure you've [setup and initialized your
-   Database](./README.md#setting-up-your-database).
+1. Make sure you've [setup
+   Toolbox](README.md#launch-the-toolbox-server-choose-one).
 
 1. Install Python 3.11+
 
 1. Install dependencies. We recommend using a virtualenv:
 
     ```bash
-    pip install -r retrieval_service/requirements.txt -r llm_demo/requirements.txt
+    pip install -r requirements.txt
    ```
 
 1. Install test dependencies:
 
    ```bash
-    pip install -r retrieval_service/requirements-test.txt -r llm_demo/requirements-test.txt
+    pip install -r requirements-test.txt
    ```
 
-## Run the app locally
+## Run the App
 
-### Running the retrieval service
+### Setup Database
 
-1. Change into the service directory:
+To setup the datasource to run with Toolbox, follow [these
+steps](README.md#one-time-database--tool-configuration).
 
-    ```bash
-    cd retrieval_service
-    ```
-
-1. Open a local connection to your database by starting the [AlloyDB Auth Proxy][alloydb-proxy] or [Cloud SQL Auth Proxy][cloudsql-proxy] or a [SSH tunnel][tunnel] to your AlloyDB instance (for non-cloud postgres such as AlloyDB Omni).
+### Setup Toolbox
 
-1. You should already have a [`config.yml` created with your database config][config]. Continue to use `host: 127.0.0.1` and `port: 5432`, unless you instruct the proxy to listen or the SSH tunnel to forward to a different address.
+To setup Toolbox (locally or on Cloud Run), follow [these
+steps](README.md#launch-the-toolbox-server-choose-one).
 
+### Run Agent App
 
-1. To run the app using uvicorn, execute the following:
-
-    ```bash
-    python run_app.py
-    ```
+1. Configure the Toolbox URL
 
-### Running the frontend
+   You need to set the `TOOLBOX_URL` environment variable to point to your
+   running Toolbox server. Choose the option below that matches your setup.
 
-1. Change into the demo directory:
+   #### **Option A:** If you are running Toolbox locally
+   Set the `TOOLBOX_URL` environment variable to your local server, which is
+   typically running on port `5000`.
 
    ```bash
-   cd llm_demo
+   export TOOLBOX_URL="http://localhost:5000"
   ```
 
-1. To use a live retrieval service on Cloud Run:
-
-   1. Set Google user credentials:
+   #### **Option B:** If you are running Toolbox on Cloud Run
+   1. First, authenticate your gcloud CLI with your user credentials:
 
       ```bash
      gcloud auth login
     ```
 
-   1. Set `BASE_URL` environment variable:
+   1. Next, set the `TOOLBOX_URL` environment variable by fetching your live
+      service's URL:
 
      ```bash
-      export BASE_URL=$(gcloud run services describe retrieval-service --format 'value(status.url)')
+      export TOOLBOX_URL=$(gcloud run services describe toolbox --format 'value(status.url)')
      ```
 
-   1. Allow your account to invoke the Cloud Run service by granting the [role Cloud Run invoker][invoker]
+   1. Allow your account to invoke the Cloud Run service by granting the [role
+      Cloud Run invoker][invoker]
+
+</details>
 
 1. [Optional] Turn on debugging by setting the `DEBUG` environment variable:
 
    ```bash
   export DEBUG=True
  ```
 
-1. Set orchestration type environment variable:
-
-   | orchestration-type        | Description                             |
-   |---------------------------|-----------------------------------------|
-   | langchain-tools           | LangChain tools orchestrator.           |
-   | vertexai-function-calling | VertexAI Function Calling orchestrator. |
-
-   ```bash
-   export ORCHESTRATION_TYPE=<orchestration-type>
-   ```
-
 1. To run the app using uvicorn, execute the following:
 
    ```bash
   python run_app.py
  ```
 
-   Note: for hot reloading of the app use: `python run_app.py --reload`
+1. View the app in your browser at http://localhost:8081.
 
-1. View app at `http://localhost:8081/`
+   > [!TIP]
+   > For hot-reloading during development, use the `--reload` flag:
+   > ```bash
+   > python run_app.py --reload
+   > ```
 
 ## Testing
 
 ### Run tests locally
 
-1. Change into the `retrieval_service` directory
-1. Open a local connection to your database by starting the [AlloyDB Auth Proxy][alloydb-proxy] or [Cloud SQL Auth Proxy][cloudsql-proxy] or a [SSH tunnel][tunnel] to your AlloyDB instance (for non-cloud postgres such as AlloyDB Omni).
-1. Set environment variables (different provider requires different environment variables):
+The unit tests for this application mock the API calls to the MCP Toolbox, so
+you do not need a live database or a running Toolbox instance to run them.
 
-   | Datastore |
-   |----------------------------------------|
-   | [AlloyDB](./docs/datastore/alloydb.md#test-environment-variables) |
-   | [Cloud SQL for Postgres](./docs/datastore/cloudsql_postgres.md#test-environment-variables) |
-   | [Cloud SQL for MySQL](./docs/datastore/cloudsql_mysql.md#test-environment-variables) |
-   | [Non-cloud Postgres (e.g. AlloyDB Omni)](./docs/datastore/postgres.md#test-environment-variables) |
-
-1. Run pytest to automatically run all tests:
-
-   ```bash
-   pytest
-   ```
+```bash
+pytest
+```
 
 ### CI Platform Setup
 
-Cloud Build is used to run tests against Google Cloud resources in test project: extension-demo-testing.
+Cloud Build is used to run tests against Google Cloud resources in test project:
+`extension-demo-testing`.
+
 Each test has a corresponding Cloud Build trigger, see [all triggers][triggers].
 
 #### Trigger Setup
 Create a Cloud Build trigger via the UI or `gcloud` with the following specs:
 
 * Event: Pull request
 * Region:
-  * us-central1 - for AlloyDB to connect to private pool in VPC
-  * global - for default worker pools
+  * `us-central1` - for AlloyDB to connect to private pool in VPC
+  * `global` - for default worker pools
 * Source:
   * Generation: 1st gen
   * Repo: GoogleCloudPlatform/genai-databases-retrieval-app (GitHub App)
````
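The revised "Run tests locally" section in the hunk above notes that the unit tests mock the API calls to the MCP Toolbox rather than contacting a live server. A minimal sketch of that mocking pattern, using hypothetical `load_tool`/`invoke` names rather than the repo's real test code:

```python
from unittest.mock import MagicMock

def run_agent_step(client, query):
    # Hypothetical app code under test: load a tool from the Toolbox
    # client and invoke it with the user's query.
    tool = client.load_tool("search_flights")
    return tool.invoke({"query": query})

def test_run_agent_step_without_live_toolbox():
    # Stand in for the Toolbox client, so no server or database is needed.
    client = MagicMock()
    client.load_tool.return_value.invoke.return_value = {"flights": []}
    result = run_agent_step(client, "SFO to JFK")
    client.load_tool.assert_called_once_with("search_flights")
    assert result == {"flights": []}
```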
````diff
@@ -134,99 +120,55 @@ Create a Cloud Build trigger via the UI or `gcloud` with the following specs:
 * Location: Repository (add path to file)
 * Substitution variables:
   * Add `_DATABASE_HOST` for non-cloud postgres
-* Service account: set for demo service to enable ID token creation to use to authenticated services
+* Service account: set for demo service to enable ID token creation to use to
+  authenticated services
 
 #### Project Setup
 
 1. Follow instructions to setup the test project:
-   * [Set up and configure database](./README.md#setting-up-your-database)
-   * [Instructions for deploying the retrieval service](./docs/deploy_retrieval_service.md)
-1. Setup Cloud Build triggers (above)
+   * [Set up and configure database](README.md#one-time-database--tool-configuration)
+   * [Instructions for Toolbox setup](README.md#launch-the-toolbox-server-choose-one)
+1. Setup Cloud Build triggers ([above](#trigger-setup))
 
-##### Setup for retrieval service
+##### Setup for Toolbox
 
 1. Create a Cloud Build private pool
 1. Enable Secret Manager API
-1. Create secret, `db_user` and `db_pass`, with your database user and database password defined here:
-
-   | provider |
-   |----------------------------------------|
-   | [AlloyDB](./docs/datastore/alloydb.md#create-a-alloydb-cluster) |
-   | [Cloud SQL for Postgres](./docs/datastore/cloudsql_postgres.md#create-a-cloud-sql-for-postgresql-instance) |
-   | [Cloud SQL for MySQL](./docs/datastore/cloudsql_mysql.md#create-a-cloud-sql-for-mysql-instance) |
-   | [Non-cloud Postgres (e.g. AlloyDB Omni)](./docs/datastore/postgres.md#create-a-alloydb-cluster) |
+1. Create secret, `db_user` and `db_pass`, with your database user and database password defined [here](https://googleapis.github.io/genai-toolbox/resources/sources/).
 
 1. Allow Cloud Build to access secret
-1. Add role Vertex AI User (roles/aiplatform.user) to Cloud Build Service account. Needed to run database init script.
+1. Add role Vertex AI User (`roles/aiplatform.user`) to Cloud Build Service
+   account. Needed to run database init script.
 
-##### Setup for demo service tests
+##### Setup for Agent App
 
-1. Add roles Cloud Run Admin, Service Account User, Log Writer, and Artifact Registry Admin to the demo service's Cloud Build trigger service account.
+Add roles `Cloud Run Admin`, `Service Account User`, `Log Writer`, and `Artifact
+Registry Admin` to the demo service's Cloud Build trigger service account.
 
-#### Run tests with Cloud Build
-
-* Run Demo Service integration test:
-
-   ```bash
-   gcloud builds submit --config llm_demo/int.tests.cloudbuild.yaml
-   ```
-
-* Run retrieval service unit tests:
-
-   | provider |
-   |----------------------------------------|
-   | [AlloyDB](./docs/datastore/alloydb.md#run-tests) |
-   | [Cloud SQL for Postgres](./docs/datastore/cloudsql_postgres.md#run-tests) |
-   | [Cloud SQL for MySQL](./docs/datastore/cloudsql_mysql.md#run-tests) |
-   | [Non-cloud Postgres (e.g. AlloyDB Omni)](./docs/datastore/postgres.md#run-tests) |
-
-   Note: Make sure to setup secrets describe in [Setup for retrieval service](#setup-for-retrieval-service)
-
-#### Trigger
-
-To run Cloud Build tests on GitHub from external contributors, ie RenovateBot, comment: `/gcbrun`.
-
-#### Code Coverage
-Please make sure your code is fully tested. The Cloud Build integration tests are run with the `pytest-cov` code coverage plugin. They fail for PRs with a code coverage less than the threshold specified in `retrieval_service/coverage/.<test>-coveragerc`. If your file is inside the main module and should be ignored by code coverage check, add it to the `omit` section of `retrieval_service/coverage/.<test>-coveragerc`.
-
-Check for code coverage report any Cloud Build integration test log.
-Here is a breakdown of the report:
-- `Stmts`: lines of executable code (statements).
-- `Miss`: number of lines not covered by tests.
-- `Branch`: branches of executable code (e.g an if-else clause may count as 1 statement but 2 branches; test for both conditions to have both branches covered).
-- `BrPart`: number of branches not covered by tests.
-- `Cover`: average coverage of files.
-- `Missing`: lines that are not covered by tests.
-
-## LLM Evaluation
-
-[Optional] Export detailed metric table with row-specific scores by setting the `EXPORT_CSV` envrionment variable:
+#### Run Integration Tests
 
 ```bash
-export EXPORT_CSV=True
+gcloud builds submit --config integration.cloudbuild.yaml
 ```
 
-Set `CLIENT_ID` to run evaluation that require authentication:
+> [!NOTE]
+> Make sure to setup secrets described in [Setup for Toolbox](#setup-for-toolbox)
 
-```bash
-export CLIENT_ID=<retrieve CLIENT_ID from GCP credentials>
-```
+#### Trigger
 
-To run LLM system evaluation, execute the following:
+To run Cloud Build tests on GitHub from external contributors, ie RenovateBot,
+comment: `/gcbrun`.
 
-```bash
-python llm_demo/run_evaluation.py
-```
-
-To view metrics, visit [GCP dashboard][vertex-ai-experiments].
+#### Code Coverage
+Please make sure your code is fully tested.
 
 ## Versioning
 
-This app will be released based on version number MAJOR.MINOR.PATCH:
+This app will be released based on version number `MAJOR.MINOR.PATCH`:
 
-- MAJOR: Breaking change is made, requiring user to redeploy all or some of the app.
-- MINOR: Backward compatible feature change or addition that doesn't require redeploying.
-- PATCH: Backward compatible bug fixes and minor updates
+- `MAJOR`: Breaking change is made, requiring user to redeploy all or some of the app.
+- `MINOR`: Backward compatible feature change or addition that doesn't require redeploying.
+- `PATCH`: Backward compatible bug fixes and minor updates
 
 [alloydb-proxy]: https://cloud.google.com/alloydb/docs/auth-proxy/connect
 [cloudsql-proxy]: https://cloud.google.com/sql/docs/mysql/sql-proxy
````
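As a quick illustration of the `MAJOR.MINOR.PATCH` policy in the Versioning section above, a version bump could be classified like this (the helper below is illustrative only, not part of the repo):

```python
# Illustrative helper: classify an upgrade under the MAJOR.MINOR.PATCH policy.
def upgrade_kind(current: str, target: str) -> str:
    cur = [int(part) for part in current.split(".")]
    tgt = [int(part) for part in target.split(".")]
    if tgt[0] != cur[0]:
        return "MAJOR"  # breaking: may require redeploying all or some of the app
    if tgt[1] != cur[1]:
        return "MINOR"  # backward compatible feature change, no redeploy needed
    return "PATCH"      # backward compatible bug fixes and minor updates
```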
