# BigQuery Tools Sample

## Introduction
This sample agent demonstrates the BigQuery first-party tools in ADK,
distributed via the `google.adk.tools.bigquery` module. These tools include:

- `list_dataset_ids`: Fetches BigQuery dataset ids present in a GCP project.
- `get_dataset_info`: Fetches metadata about a BigQuery dataset.
- `list_table_ids`: Fetches table ids present in a BigQuery dataset.
- `get_table_info`: Fetches metadata about a BigQuery table.
- `get_job_info`: Fetches metadata about a BigQuery job.
- `execute_sql`: Runs or dry-runs a SQL query in BigQuery.
- `ask_data_insights`: Natural-language-in, natural-language-out tool that
  answers questions about structured data in BigQuery, providing a one-stop
  solution for generating insights from data.

  Note: This tool requires additional setup in your project. Please refer to
  the official Conversational Analytics API documentation for instructions.
- `forecast`: Performs time series forecasting using BigQuery's `AI.FORECAST`
  function, leveraging the TimesFM 2.0 model.
- `analyze_contribution`: Performs contribution analysis in BigQuery by
  creating a temporary `CONTRIBUTION_ANALYSIS` model and then querying it with
  `ML.GET_INSIGHTS` to find top contributors for a given metric.
- `detect_anomalies`: Performs time series anomaly detection in BigQuery by
  creating a temporary `ARIMA_PLUS` model and then querying it with
  `ML.DETECT_ANOMALIES` to detect anomalies in time series data.
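As an illustration of what the `forecast` tool wraps, here is a minimal sketch of the kind of `AI.FORECAST` SQL it could issue. The table and column names are hypothetical, and the exact statement the tool generates may differ:

```python
# Illustrative only: build the kind of AI.FORECAST query the `forecast`
# tool runs in BigQuery. Table/column names below are placeholders.
def build_forecast_sql(table: str, timestamp_col: str, data_col: str,
                       horizon: int = 10) -> str:
    """Build an AI.FORECAST query using the TimesFM 2.0 model."""
    return (
        "SELECT *\n"
        "FROM AI.FORECAST(\n"
        f"  TABLE `{table}`,\n"
        f"  data_col => '{data_col}',\n"
        f"  timestamp_col => '{timestamp_col}',\n"
        "  model => 'TimesFM 2.0',\n"
        f"  horizon => {horizon})"
    )

print(build_forecast_sql("my_project.my_dataset.sales", "day", "revenue"))
```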
## How to use
Set up environment variables in your `.env` file for using Google AI Studio or
Google Cloud Vertex AI as the LLM service for your agent. For example, to use
Google AI Studio you would set:

- `GOOGLE_GENAI_USE_VERTEXAI=FALSE`
- `GOOGLE_API_KEY={your api key}`
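For Vertex AI, the corresponding `.env` entries would look like the following. This is a sketch based on the standard ADK environment variables rather than taken from this sample, and the project id and location values are placeholders:

```shell
# .env — Vertex AI configuration (placeholder values)
GOOGLE_GENAI_USE_VERTEXAI=TRUE
GOOGLE_CLOUD_PROJECT=your-gcp-project-id
GOOGLE_CLOUD_LOCATION=us-central1
```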
### With Application Default Credentials

This mode is useful for quick development when the agent builder is the only
user interacting with the agent. The tools are run with these credentials.

- Create application default credentials on the machine where the agent will
  be running by following
  https://cloud.google.com/docs/authentication/provide-credentials-adc.
- Set `CREDENTIALS_TYPE=None` in `agent.py`.
- Run the agent.
### With Service Account Keys

This mode is useful for quick development when the agent builder wants to run
the agent with service account credentials. The tools are run with these
credentials.

- Create a service account key by following
  https://cloud.google.com/iam/docs/service-account-creds#user-managed-keys.
- Set `CREDENTIALS_TYPE=AuthCredentialTypes.SERVICE_ACCOUNT` in `agent.py`.
- Download the key file and replace `"service_account_key.json"` with its path.
- Run the agent.
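Since the downloaded key file is a JSON document with a well-known shape, a quick stdlib-only sanity check can catch a wrong path or a truncated download before the agent starts. This helper is illustrative and not part of the sample; the field names come from the standard service account key format:

```python
import json

# Fields present in every Google service account key JSON file.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def looks_like_service_account_key(path: str) -> bool:
    """Return True if the JSON file at `path` resembles a service account key."""
    with open(path) as f:
        info = json.load(f)
    return info.get("type") == "service_account" and REQUIRED_FIELDS <= info.keys()
```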
### With Interactive OAuth

- Follow
  https://developers.google.com/identity/protocols/oauth2#1.-obtain-oauth-2.0-credentials-from-the-dynamic_data.setvar.console_name
  to get your client id and client secret. Be sure to choose "web" as your
  client type.
- Follow https://developers.google.com/workspace/guides/configure-oauth-consent
  to add the scope "https://www.googleapis.com/auth/bigquery".
- Follow
  https://developers.google.com/identity/protocols/oauth2/web-server#creatingcred
  to add http://localhost/dev-ui/ to "Authorized redirect URIs".

  Note: `localhost` here is just a hostname that you use to access the dev UI;
  replace it with the actual hostname you use to access the dev UI.
- For the first run, allow popups for `localhost` in Chrome.
- Configure your `.env` file to add two more variables before running the
  agent:

  - `OAUTH_CLIENT_ID={your client id}`
  - `OAUTH_CLIENT_SECRET={your client secret}`

  Note: don't create a separate `.env` file; instead put these in the same
  `.env` file that stores your Vertex AI or Dev ML credentials.
- Set `CREDENTIALS_TYPE=AuthCredentialTypes.OAUTH2` in `agent.py` and run the
  agent.
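As a rough sketch of what happens during the interactive flow: the dev UI redirects the user to Google's OAuth 2.0 authorization endpoint, requesting the BigQuery scope added above. A stdlib-only illustration of that URL follows; the client id and redirect URI are placeholders, and the real flow includes additional parameters (such as `state`) managed for you:

```python
from urllib.parse import urlencode

# Google's OAuth 2.0 authorization endpoint; the scope matches the one
# added to the consent screen in the steps above.
AUTH_ENDPOINT = "https://accounts.google.com/o/oauth2/v2/auth"
BIGQUERY_SCOPE = "https://www.googleapis.com/auth/bigquery"

def build_authorization_url(client_id: str, redirect_uri: str) -> str:
    """Build the consent-screen URL a user is sent to in the OAuth flow."""
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",
        "scope": BIGQUERY_SCOPE,
        "access_type": "offline",
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"

print(build_authorization_url("your-client-id", "http://localhost/dev-ui/"))
```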
## Sample prompts
- which weather datasets exist in bigquery public data?
- tell me more about noaa_lightning
- which tables exist in the ml_datasets dataset?
- show more details about the penguins table
- compute penguins population per island.
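For the last prompt, the agent would typically have `execute_sql` run an aggregation over the public penguins table referenced in the prompts above. A hypothetical example of the SQL it might generate (the actual query is produced by the LLM and may differ):

```python
# Hypothetical SQL for "compute penguins population per island".
QUERY = """
SELECT island, COUNT(*) AS population
FROM `bigquery-public-data.ml_datasets.penguins`
GROUP BY island
ORDER BY population DESC
"""
print(QUERY.strip())
```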