
Found docs updates needed from ADK python release v1.21.0 to v1.22.0 #77

@xuanyang15

Description

This release includes several new features and updates that require documentation changes. This issue summarizes the recommended updates to reflect the changes between ADK Python releases v1.21.0 and v1.22.0.

Compare link: google/adk-python@v1.21.0...v1.22.0

1. Create new documentation for the Pub/Sub tool.

Doc file: docs/tools/google-cloud/pubsub.md

Proposed Change:


---
hide:
  - toc
---

Google Cloud Pub/Sub tools

The Google Cloud Pub/Sub toolset allows agents to interact with Google Cloud Pub/Sub, a real-time messaging service. Agents can use these tools to publish, pull, and acknowledge messages from Pub/Sub topics and subscriptions.

Set up

Before using the Pub/Sub toolset, you need to:

  1. Enable the Pub/Sub API in your Google Cloud project.
  2. Create a Pub/Sub topic to publish messages to.
  3. Create a Pub/Sub subscription to pull messages from.
  4. Grant the appropriate IAM roles to the service account your agent uses. The service account needs the roles/pubsub.editor role to publish messages and the roles/pubsub.subscriber role to pull and acknowledge messages.
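The setup steps above can be performed with the `gcloud` CLI. The commands below are a sketch, assuming a topic named `my-topic`, a subscription named `my-sub`, and a service account `agent-sa@my-project.iam.gserviceaccount.com` (substitute your own project and resource names):

```shell
# Enable the Pub/Sub API in the current project.
gcloud services enable pubsub.googleapis.com

# Create a topic and a subscription attached to it.
gcloud pubsub topics create my-topic
gcloud pubsub subscriptions create my-sub --topic=my-topic

# Grant the agent's service account the roles needed to publish
# and to pull/acknowledge messages.
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:agent-sa@my-project.iam.gserviceaccount.com" \
  --role="roles/pubsub.editor"
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:agent-sa@my-project.iam.gserviceaccount.com" \
  --role="roles/pubsub.subscriber"
```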

Usage

To use the Pub/Sub toolset, import and initialize it in your agent's code:

```python
from google.adk.tools.pubsub import PubSubToolset

pubsub_tools = PubSubToolset()
```

The toolset includes the following tools:

publish_message

Publish a message to a Pub/Sub topic.

| Parameter | Type | Description |
|---|---|---|
| `topic_name` | `str` | The name of the Pub/Sub topic (e.g., `projects/my-project/topics/my-topic`). |
| `message` | `str` | The content of the message to publish. |
| `attributes` | `dict[str, str]` | (Optional) A dictionary of attributes to attach to the message. |
| `ordering_key` | `str` | (Optional) The ordering key for the message. |

Returns: A dictionary containing the message_id of the published message.
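Topic names passed to `publish_message` are fully qualified. The helper below is hypothetical (not part of the toolset) and only illustrates the expected name format and argument shape:

```python
def topic_path(project_id: str, topic_id: str) -> str:
    """Build the fully qualified topic name expected by publish_message."""
    return f"projects/{project_id}/topics/{topic_id}"

# Example arguments an agent might pass to publish_message:
args = {
    "topic_name": topic_path("my-project", "my-topic"),
    "message": "order-created",
    "attributes": {"source": "checkout", "priority": "high"},
}
```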

pull_messages

Pull one or more messages from a Pub/Sub subscription.

| Parameter | Type | Description |
|---|---|---|
| `subscription_name` | `str` | The name of the Pub/Sub subscription (e.g., `projects/my-project/subscriptions/my-sub`). |
| `max_messages` | `int` | The maximum number of messages to pull. Defaults to `1`. |
| `auto_ack` | `bool` | Whether to automatically acknowledge the messages after they are pulled. Defaults to `False`. |

Returns: A dictionary containing a list of the pulled messages. Each message is a dictionary with the following keys: message_id, data, attributes, ordering_key, publish_time, and ack_id.

acknowledge_messages

Acknowledge one or more messages on a Pub/Sub subscription. This removes the messages from the subscription so they won't be pulled again.

| Parameter | Type | Description |
|---|---|---|
| `subscription_name` | `str` | The name of the Pub/Sub subscription. |
| `ack_ids` | `list[str]` | A list of acknowledgment IDs to acknowledge, obtained from the `pull_messages` tool. |

Returns: A dictionary with the status of the operation (SUCCESS or ERROR).
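The three tools compose into a simple pull-process-acknowledge loop. The sketch below uses a hard-coded response in the shape documented for `pull_messages` (the payload values are made up) to show how `ack_id`s flow from `pull_messages` into `acknowledge_messages`:

```python
# A response in the documented shape of pull_messages (values are illustrative).
pull_result = {
    "messages": [
        {"message_id": "1", "data": "hello", "attributes": {}, "ack_id": "ack-1"},
        {"message_id": "2", "data": "world", "attributes": {}, "ack_id": "ack-2"},
    ]
}

# Process each message, collecting ack IDs for messages handled successfully.
ack_ids = []
for msg in pull_result["messages"]:
    print(f"processing {msg['message_id']}: {msg['data']}")
    ack_ids.append(msg["ack_id"])

# These IDs would then be passed to acknowledge_messages as its ack_ids
# parameter so the messages are not redelivered.
```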

Reasoning:
The v1.22.0 release includes a new Pub/Sub tool that needs to be documented.

Reference: src/google/adk/tools/pubsub/pubsub_toolset.py

2. Create a new guide for database schema migration.

Doc file: docs/sessions/migration.md

Proposed Change:


---
hide:
  - toc
---

Database Schema Migration

Starting with version 1.22.0, the ADK uses a new, more robust database schema for the DatabaseSessionService. If you have been using DatabaseSessionService with a previous version of the ADK, you will need to migrate your database to the latest schema to continue using your existing session data.

Overview

The migration process involves running a script that reads data from your existing database, transforms it to the new schema, and writes it to a new database. The migration is not done in-place, which means you will need to provide a separate destination database for the migrated data. This approach ensures that your existing data is not at risk during the migration process.

How to Migrate

The migration is performed using the upgrade function from the google.adk.sessions.migration.migration_runner module.

Here's a step-by-step guide:

  1. Create a new, empty database to serve as the destination for the migrated data.

  2. Run the migration script. The following Python script demonstrates how to use the upgrade function:

    from google.adk.sessions.migration import migration_runner
    
    # The SQLAlchemy URL of your existing database.
    source_db_url = "sqlite+aiosqlite:///path/to/your/old/database.db"
    
    # The SQLAlchemy URL of the new, empty database for the migrated data.
    dest_db_url = "sqlite+aiosqlite:///path/to/your/new/database.db"
    
    # Run the migration.
    migration_runner.upgrade(source_db_url, dest_db_url)
  3. Update your application configuration. After the migration is complete, update your application to use the new database URL (dest_db_url).

Important Notes

  • In-place migration is not supported. You must provide a different dest_db_url than source_db_url.
  • The destination database must be empty. The migration script will create the necessary tables in the destination database.
  • The migration process is sequential. If your database schema is several versions behind, the migration script will apply the necessary migrations in sequence.
  • Temporary databases may be used. For complex migrations, the script may use temporary SQLite databases to store intermediate results. These temporary files are automatically cleaned up after the migration is complete.
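The sequential behavior noted above can be pictured as a chain of per-version upgrade steps, each transforming the schema from one version to the next. The sketch below is purely conceptual (the real `migration_runner` operates on SQLAlchemy databases, not dicts, and these step names are invented for illustration):

```python
# Each step upgrades a toy schema dict from version N to version N + 1.
def v1_to_v2(schema: dict) -> dict:
    return {**schema, "version": 2, "added_in_v2": True}

def v2_to_v3(schema: dict) -> dict:
    return {**schema, "version": 3, "added_in_v3": True}

# Map each source version to the step that upgrades away from it.
STEPS = {1: v1_to_v2, 2: v2_to_v3}

def upgrade(schema: dict, target_version: int) -> dict:
    """Apply migration steps in sequence until the target version is reached."""
    while schema["version"] < target_version:
        schema = STEPS[schema["version"]](schema)
    return schema

migrated = upgrade({"version": 1}, target_version=3)
```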

Reasoning:
The v1.22.0 release introduces database schema versioning and a migration process. Users who have been using DatabaseSessionService with a previous version of the ADK will need a guide on how to migrate their data to the latest schema.

Reference: src/google/adk/sessions/migration/migration_runner.py

3. Add the new Pub/Sub tool to the main tools page.

Doc file: docs/tools/index.md

Current state:

Google Cloud tools

Apigee

Apigee API Hub

Turn any documented API from Apigee API hub into a tool

Cloud API Registry

API Registry

Dynamically connect with Google Cloud services as MCP tools

Apigee Integration

Application Integration

Link your agents to enterprise apps using Integration Connectors

BigQuery

BigQuery Agent Analytics

Analyze and debug agent behavior at scale

BigQuery

BigQuery Tools

Connect with BigQuery to retrieve data and perform analysis

Bigtable

Bigtable Tools

Interact with Bigtable to retrieve data and execute SQL

Google Kubernetes Engine

GKE Code Executor

Run AI-generated code in a secure and scalable GKE environment

Spanner

Spanner Tools

Interact with Spanner to retrieve data, search, and execute SQL

MCP Toolbox for Databases

MCP Toolbox for Databases

Connect over 30 different data sources to your agents

Vertex AI

Vertex AI RAG Engine

Perform private data retrieval using Vertex AI RAG Engine

Vertex AI

Vertex AI Search

Search across your private, configured data stores in Vertex AI Search

Proposed Change:

Google Cloud tools

Apigee

Apigee API Hub

Turn any documented API from Apigee API hub into a tool

Cloud API Registry

API Registry

Dynamically connect with Google Cloud services as MCP tools

Apigee Integration

Application Integration

Link your agents to enterprise apps using Integration Connectors

BigQuery

BigQuery Agent Analytics

Analyze and debug agent behavior at scale

BigQuery

BigQuery Tools

Connect with BigQuery to retrieve data and perform analysis

Bigtable

Bigtable Tools

Interact with Bigtable to retrieve data and execute SQL

Google Kubernetes Engine

GKE Code Executor

Run AI-generated code in a secure and scalable GKE environment

Pub/Sub

Pub/Sub

Publish, pull, and acknowledge messages from Pub/Sub topics and subscriptions

Spanner

Spanner Tools

Interact with Spanner to retrieve data, search, and execute SQL

MCP Toolbox for Databases

MCP Toolbox for Databases

Connect over 30 different data sources to your agents

Vertex AI

Vertex AI RAG Engine

Perform private data retrieval using Vertex AI RAG Engine

Vertex AI

Vertex AI Search

Search across your private, configured data stores in Vertex AI Search

Reasoning:
The new Pub/Sub tool should be easily discoverable by users browsing the available tools.

Reference: src/google/adk/tools/pubsub/pubsub_toolset.py

4. Update DatabaseSessionService documentation to include information about schema migration.

Doc file: docs/sessions/session.md

Current state:

  1. DatabaseSessionService

    Supported in ADK: Python v0.1.0, Go v0.1.0
    • How it works: Connects to a relational database (e.g., PostgreSQL,
      MySQL, SQLite) to store session data persistently in tables.
    • Persistence: Yes. Data survives application restarts.
    • Requires: A configured database.
    • Best for: Applications needing reliable, persistent storage that you
      manage yourself.
    from google.adk.sessions import DatabaseSessionService
    # Example using a local SQLite file:
    # Note: The implementation requires an async database driver.
    # For SQLite, use 'sqlite+aiosqlite' instead of 'sqlite' to ensure async compatibility.
    db_url = "sqlite+aiosqlite:///./my_agent_data.db"
    session_service = DatabaseSessionService(db_url=db_url)

    Async Driver Requirement

    DatabaseSessionService requires an async database driver. When using SQLite, you must use sqlite+aiosqlite instead of sqlite in your connection string. For other databases (PostgreSQL, MySQL), ensure you're using an async-compatible driver (e.g., asyncpg for PostgreSQL, aiomysql for MySQL).

Proposed Change:

  1. DatabaseSessionService

    Supported in ADK: Python v0.1.0, Go v0.1.0
    • How it works: Connects to a relational database (e.g., PostgreSQL,
      MySQL, SQLite) to store session data persistently in tables.
    • Persistence: Yes. Data survives application restarts.
    • Requires: A configured database.
    • Best for: Applications needing reliable, persistent storage that you
      manage yourself.
    from google.adk.sessions import DatabaseSessionService
    # Example using a local SQLite file:
    # Note: The implementation requires an async database driver.
    # For SQLite, use 'sqlite+aiosqlite' instead of 'sqlite' to ensure async compatibility.
    db_url = "sqlite+aiosqlite:///./my_agent_data.db"
    session_service = DatabaseSessionService(db_url=db_url)

    Async Driver Requirement

    DatabaseSessionService requires an async database driver. When using SQLite, you must use sqlite+aiosqlite instead of sqlite in your connection string. For other databases (PostgreSQL, MySQL), ensure you're using an async-compatible driver (e.g., asyncpg for PostgreSQL, aiomysql for MySQL).

    Database Schema Migration

    Starting with version 1.22.0, the ADK uses a new database schema for the `DatabaseSessionService`. If you have been using `DatabaseSessionService` with a previous version, you will need to migrate your database to the latest schema. For more information, see the [Database Schema Migration Guide](/adk-docs/sessions/migration/).

Reasoning:
Users of DatabaseSessionService should be aware of the new schema versioning and migration process to ensure a smooth upgrade and avoid data loss.

Reference: src/google/adk/sessions/database_session_service.py
