Description
This release includes several new features and updates that require documentation changes. This issue summarizes the recommended documentation updates to reflect the changes between ADK Python releases v1.21.0 and v1.22.0.
Compare link: google/adk-python@v1.21.0...v1.22.0
1. Create new documentation for the Pub/Sub tool.
Doc file: docs/tools/google-cloud/pubsub.md
Proposed Change:
---
hide:
  - toc
---

# Google Cloud Pub/Sub tools
The Google Cloud Pub/Sub toolset allows agents to interact with Google Cloud Pub/Sub, a real-time messaging service. Agents can use these tools to publish, pull, and acknowledge messages from Pub/Sub topics and subscriptions.
## Set up
Before using the Pub/Sub toolset, you need to:
- Enable the Pub/Sub API in your Google Cloud project.
- Create a Pub/Sub topic to publish messages to.
- Create a Pub/Sub subscription to pull messages from.
- Grant the appropriate IAM roles to the service account your agent uses. The service account needs the `roles/pubsub.editor` role to publish messages and the `roles/pubsub.subscriber` role to pull and acknowledge messages.

## Usage
To use the Pub/Sub toolset, import and initialize it in your agent's code:

```python
from google.adk.tools.pubsub import PubSubToolset

pubsub_tools = PubSubToolset()
```

The toolset includes the following tools:
### `publish_message`

Publish a message to a Pub/Sub topic.

| Parameter | Type | Description |
|---|---|---|
| `topic_name` | `str` | The name of the Pub/Sub topic (e.g., `projects/my-project/topics/my-topic`). |
| `message` | `str` | The content of the message to publish. |
| `attributes` | `dict[str, str]` | (Optional) A dictionary of attributes to attach to the message. |
| `ordering_key` | `str` | (Optional) The ordering key for the message. |

Returns: A dictionary containing the `message_id` of the published message.
### `pull_messages`

Pull one or more messages from a Pub/Sub subscription.

| Parameter | Type | Description |
|---|---|---|
| `subscription_name` | `str` | The name of the Pub/Sub subscription (e.g., `projects/my-project/subscriptions/my-sub`). |
| `max_messages` | `int` | The maximum number of messages to pull. Defaults to 1. |
| `auto_ack` | `bool` | Whether to automatically acknowledge the messages after they are pulled. Defaults to False. |

Returns: A dictionary containing a list of the pulled messages. Each message is a dictionary with the following keys: `message_id`, `data`, `attributes`, `ordering_key`, `publish_time`, and `ack_id`.
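Together with `acknowledge_messages` below, this return shape supports a pull-then-ack flow. The sketch simulates the returned dictionaries with plain Python to show how `ack_id` values flow from `pull_messages` into the `ack_ids` parameter of `acknowledge_messages`; the top-level `messages` key and all payload values are illustrative assumptions, not confirmed API output.

```python
# Simulated pull_messages result, using the documented per-message keys
# (message_id, data, attributes, ordering_key, publish_time, ack_id).
# The top-level "messages" key and the values are illustrative.
pull_result = {
    "messages": [
        {
            "message_id": "1",
            "data": "hello",
            "attributes": {"source": "demo"},
            "ordering_key": "",
            "publish_time": "2024-01-01T00:00:00Z",
            "ack_id": "ack-1",
        },
    ]
}

# Collect the ack_ids after processing each message; these are what
# acknowledge_messages expects in its ack_ids parameter.
ack_ids = [msg["ack_id"] for msg in pull_result["messages"]]
```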
### `acknowledge_messages`

Acknowledge one or more messages on a Pub/Sub subscription. This removes the messages from the subscription so they won't be pulled again.

| Parameter | Type | Description |
|---|---|---|
| `subscription_name` | `str` | The name of the Pub/Sub subscription. |
| `ack_ids` | `list[str]` | A list of acknowledgment IDs to acknowledge. These are obtained from the `pull_messages` tool. |

Returns: A dictionary with the status of the operation (`SUCCESS` or `ERROR`).
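As a usage sketch, the toolset can be passed to an agent like any other ADK tool. The agent name, model identifier, and instruction below are illustrative assumptions, not part of the toolset's API:

```python
from google.adk.agents import Agent
from google.adk.tools.pubsub import PubSubToolset

# Illustrative agent wiring; adjust the name, model, and instruction
# to your own environment.
pubsub_agent = Agent(
    name="pubsub_agent",
    model="gemini-2.0-flash",  # assumed model identifier
    instruction=(
        "Use the Pub/Sub tools to publish messages, and to pull and "
        "acknowledge messages from subscriptions, when asked."
    ),
    tools=[PubSubToolset()],
)
```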
Reasoning:
The v1.22.0 release includes a new Pub/Sub tool that needs to be documented.
Reference: src/google/adk/tools/pubsub/pubsub_toolset.py
2. Create a new guide for database schema migration.
Doc file: docs/sessions/migration.md
Proposed Change:
---
hide:
  - toc
---

# Database Schema Migration
Starting with version 1.22.0, the ADK uses a new, more robust database schema for the `DatabaseSessionService`. If you have been using `DatabaseSessionService` with a previous version of the ADK, you will need to migrate your database to the latest schema to continue using your existing session data.

## Overview
The migration process involves running a script that reads data from your existing database, transforms it to the new schema, and writes it to a new database. The migration is not done in-place, which means you will need to provide a separate destination database for the migrated data. This approach ensures that your existing data is not at risk during the migration process.
## How to Migrate
The migration is performed using the `upgrade` function from the `google.adk.sessions.migration.migration_runner` module. Here's a step-by-step guide:

1. Create a new, empty database to serve as the destination for the migrated data.

2. Run the migration script. The following Python script demonstrates how to use the `upgrade` function:

```python
from google.adk.sessions.migration import migration_runner

# The SQLAlchemy URL of your existing database.
source_db_url = "sqlite+aiosqlite:///path/to/your/old/database.db"

# The SQLAlchemy URL of the new, empty database for the migrated data.
dest_db_url = "sqlite+aiosqlite:///path/to/your/new/database.db"

# Run the migration.
migration_runner.upgrade(source_db_url, dest_db_url)
```

3. Update your application configuration. After the migration is complete, update your application to use the new database URL (`dest_db_url`).

## Important Notes
- In-place migration is not supported. You must provide a different `dest_db_url` than `source_db_url`.
- The destination database must be empty. The migration script will create the necessary tables in the destination database.
- The migration process is sequential. If your database schema is several versions behind, the migration script will apply the necessary migrations in sequence.
- Temporary databases may be used. For complex migrations, the script may use temporary SQLite databases to store intermediate results. These temporary files are automatically cleaned up after the migration is complete.
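The first note above can be enforced with a small guard before calling `upgrade`. The `check_migration_urls` helper below is a hypothetical convenience, not part of the ADK; only the `migration_runner.upgrade` call it precedes is the documented API:

```python
def check_migration_urls(source_db_url: str, dest_db_url: str) -> None:
    """Fail fast if source and destination would mean an in-place migration."""
    if source_db_url == dest_db_url:
        raise ValueError(
            "In-place migration is not supported: dest_db_url must "
            "differ from source_db_url."
        )

source_db_url = "sqlite+aiosqlite:///path/to/your/old/database.db"
dest_db_url = "sqlite+aiosqlite:///path/to/your/new/database.db"

check_migration_urls(source_db_url, dest_db_url)
# With the guard satisfied, run the documented migration:
# from google.adk.sessions.migration import migration_runner
# migration_runner.upgrade(source_db_url, dest_db_url)
```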
Reasoning:
The v1.22.0 release introduces database schema versioning and a migration process. Users who have been using DatabaseSessionService with a previous version of the ADK will need a guide on how to migrate their data to the latest schema.
Reference: src/google/adk/sessions/migration/migration_runner.py
3. Add the new Pub/Sub tool to the main tools page.
Doc file: docs/tools/index.md
Current state:
Google Cloud tools

- **Apigee API Hub**: Turn any documented API from Apigee API hub into a tool
- **API Registry**: Dynamically connect with Google Cloud services as MCP tools
- **Application Integration**: Link your agents to enterprise apps using Integration Connectors
- **BigQuery Agent Analytics**: Analyze and debug agent behavior at scale
- **BigQuery Tools**: Connect with BigQuery to retrieve data and perform analysis
- **Bigtable Tools**: Interact with Bigtable to retrieve data and execute SQL
- **GKE Code Executor**: Run AI-generated code in a secure and scalable GKE environment
- **Spanner Tools**: Interact with Spanner to retrieve data, search, and execute SQL
- **MCP Toolbox for Databases**: Connect over 30 different data sources to your agents
- **Vertex AI RAG Engine**: Perform private data retrieval using Vertex AI RAG Engine
- **Vertex AI Search**: Search across your private, configured data stores in Vertex AI Search
Proposed Change:
Google Cloud tools

- **Apigee API Hub**: Turn any documented API from Apigee API hub into a tool
- **API Registry**: Dynamically connect with Google Cloud services as MCP tools
- **Application Integration**: Link your agents to enterprise apps using Integration Connectors
- **BigQuery Agent Analytics**: Analyze and debug agent behavior at scale
- **BigQuery Tools**: Connect with BigQuery to retrieve data and perform analysis
- **Bigtable Tools**: Interact with Bigtable to retrieve data and execute SQL
- **GKE Code Executor**: Run AI-generated code in a secure and scalable GKE environment
- **Pub/Sub**: Publish, pull, and acknowledge messages from Pub/Sub topics and subscriptions
- **Spanner Tools**: Interact with Spanner to retrieve data, search, and execute SQL
- **MCP Toolbox for Databases**: Connect over 30 different data sources to your agents
- **Vertex AI RAG Engine**: Perform private data retrieval using Vertex AI RAG Engine
- **Vertex AI Search**: Search across your private, configured data stores in Vertex AI Search
Reasoning:
The new Pub/Sub tool should be easily discoverable by users browsing the available tools.
Reference: src/google/adk/tools/pubsub/pubsub_toolset.py
4. Update DatabaseSessionService documentation to include information about schema migration.
Doc file: docs/sessions/session.md
Current state:
`DatabaseSessionService` (Supported in ADK: Python v0.1.0, Go v0.1.0)

- How it works: Connects to a relational database (e.g., PostgreSQL, MySQL, SQLite) to store session data persistently in tables.
- Persistence: Yes. Data survives application restarts.
- Requires: A configured database.
- Best for: Applications needing reliable, persistent storage that you manage yourself.

```python
from google.adk.sessions import DatabaseSessionService

# Example using a local SQLite file:
# Note: The implementation requires an async database driver.
# For SQLite, use 'sqlite+aiosqlite' instead of 'sqlite' to ensure
# async compatibility.
db_url = "sqlite+aiosqlite:///./my_agent_data.db"
session_service = DatabaseSessionService(db_url=db_url)
```

**Async Driver Requirement**

`DatabaseSessionService` requires an async database driver. When using SQLite, you must use `sqlite+aiosqlite` instead of `sqlite` in your connection string. For other databases (PostgreSQL, MySQL), ensure you're using an async-compatible driver (e.g., `asyncpg` for PostgreSQL, `aiomysql` for MySQL).
Proposed Change:
`DatabaseSessionService` (Supported in ADK: Python v0.1.0, Go v0.1.0)

- How it works: Connects to a relational database (e.g., PostgreSQL, MySQL, SQLite) to store session data persistently in tables.
- Persistence: Yes. Data survives application restarts.
- Requires: A configured database.
- Best for: Applications needing reliable, persistent storage that you manage yourself.

```python
from google.adk.sessions import DatabaseSessionService

# Example using a local SQLite file:
# Note: The implementation requires an async database driver.
# For SQLite, use 'sqlite+aiosqlite' instead of 'sqlite' to ensure
# async compatibility.
db_url = "sqlite+aiosqlite:///./my_agent_data.db"
session_service = DatabaseSessionService(db_url=db_url)
```

**Async Driver Requirement**

`DatabaseSessionService` requires an async database driver. When using SQLite, you must use `sqlite+aiosqlite` instead of `sqlite` in your connection string. For other databases (PostgreSQL, MySQL), ensure you're using an async-compatible driver (e.g., `asyncpg` for PostgreSQL, `aiomysql` for MySQL).

**Database Schema Migration**

Starting with version 1.22.0, the ADK uses a new database schema for the `DatabaseSessionService`. If you have been using `DatabaseSessionService` with a previous version, you will need to migrate your database to the latest schema. For more information, see the [Database Schema Migration Guide](/adk-docs/sessions/migration/).
Reasoning:
Users of DatabaseSessionService should be aware of the new schema versioning and migration process to ensure a smooth upgrade and avoid data loss.
Reference: src/google/adk/sessions/database_session_service.py