This allows users to load skills from a directory and pass them into the `SkillToolset` constructor.
Co-authored-by: Kathy Wu <wukathy@google.com>
PiperOrigin-RevId: 868929937
Merge https://github.com/google/adk-python/pull/4435
### Link to Issue or Description of Change
- Closes: #4302
**Problem:**
`VertexAiSessionService.list_sessions()` only returns the first ~100 sessions. The `sessions_iterator` from `api_client.agent_engines.sessions.list()` is an `AsyncPager` — it implements `__aiter__`/`__anext__` for fetching subsequent pages, but the code uses a plain `for` loop which only calls `__iter__`/`__next__`, so it never fetches beyond the first page.
**Solution:**
Changed `for api_session in sessions_iterator` to `async for api_session in sessions_iterator` so the `AsyncPager` actually paginates. Updated the test mock to return an `AsyncIterableList` (supports both sync and async iteration) instead of a bare list, so the tests properly simulate real `AsyncPager` behaviour.
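The difference between the two loops can be sketched with a toy pager. `FakeAsyncPager` here is an illustrative stand-in, not the real SDK class: sync iteration yields only the already-fetched first page, while async iteration fetches every page, mirroring the bug and the fix described above.

```python
import asyncio


class FakeAsyncPager:
  """Illustrative stand-in for the SDK's AsyncPager (not the real class)."""

  def __init__(self, pages):
    self._pages = pages

  def __iter__(self):
    # A plain `for` loop only sees the items in the first page.
    return iter(self._pages[0])

  async def _iterate_all_pages(self):
    for page in self._pages:
      for item in page:
        yield item

  def __aiter__(self):
    return self._iterate_all_pages()


async def main():
  pager = FakeAsyncPager([[1, 2], [3, 4], [5]])
  first_page_only = [s for s in pager]      # the buggy pattern
  all_sessions = [s async for s in pager]   # the fix: `async for`
  return first_page_only, all_sessions


print(asyncio.run(main()))  # ([1, 2], [1, 2, 3, 4, 5])
```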
### Testing Plan
**Unit Tests:**
```
$ pytest tests/unittests/sessions/
115 passed, 1 warning in 2.25s
```
The existing `test_list_sessions`, `test_list_sessions_with_pagination`, and `test_list_sessions_all_users` all continue to pass with the updated mock.
Co-authored-by: Liang Wu <wuliang@google.com>
COPYBARA_INTEGRATE_REVIEW=https://github.com/google/adk-python/pull/4435 from anmolg1997:fix/vertex-ai-session-service-pagination 14c71b607ecbf2215f4b9ba6eb4b0ff6b9eaf740
PiperOrigin-RevId: 868466166
Sessions were being erroneously cached and reused across different asyncio event loops, causing "Event loop is closed" errors in environments with short-lived loops. This updates the session caching to be loop-aware: before reusing a cached session, check that the stored loop matches the current loop. Also, if the session is disconnected and the loops do not match, discard the cached entry without calling `aclose()`.
Co-authored-by: Kathy Wu <wukathy@google.com>
PiperOrigin-RevId: 868380746
Adds optional token_limit and event_retention_size fields to EventsCompactionConfig. When the latest prompt token count meets/exceeds the threshold, ADK compacts older raw events after the invocation and keeps the last N events un-compacted. Updates prompt history building to apply compaction ranges correctly so retained events remain visible.
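A sketch of the two fields and the trigger condition (field names follow the description above; the dataclass shape and defaults are assumptions, not the actual ADK definition):

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class EventsCompactionConfig:
  # Compact older raw events once the latest prompt token count
  # meets or exceeds this threshold.
  token_limit: Optional[int] = None
  # Keep the last N events un-compacted.
  event_retention_size: Optional[int] = None


def should_compact(config: EventsCompactionConfig, prompt_token_count: int) -> bool:
  # Trigger only when a limit is configured and the count meets it.
  return (
      config.token_limit is not None
      and prompt_token_count >= config.token_limit
  )
```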
Closes #4146
Co-authored-by: George Weale <gweale@google.com>
PiperOrigin-RevId: 868297968
These endpoints provide basic health checks and version information for the running ADK server, including the ADK version and Python runtime details. The version information will be used to generate the ADK conformance test report.
Co-authored-by: Liang Wu <wuliang@google.com>
PiperOrigin-RevId: 868283421
Adds BaseMemoryService.add_events_to_memory(session, events=..., custom_metadata=...) and CallbackContext.add_events_to_memory(events=..., custom_metadata=...) so callers can add memories from an explicit subset of ADK events.
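A usage sketch of the new surface with toy stand-ins for the ADK types (only the `add_events_to_memory(session, events=..., custom_metadata=...)` signature comes from the description above; the classes below are illustrative fakes):

```python
import asyncio
from dataclasses import dataclass, field


@dataclass
class Event:
  author: str
  text: str


@dataclass
class Session:
  id: str
  events: list = field(default_factory=list)


class InMemoryMemoryService:
  """Toy service mirroring the add_events_to_memory signature above."""

  def __init__(self):
    self.memories = []

  async def add_events_to_memory(self, session, *, events, custom_metadata=None):
    # Store only the explicitly selected subset of events.
    for event in events:
      self.memories.append((session.id, event, custom_metadata or {}))


async def main():
  session = Session("s1", [Event("user", "hi"), Event("model", "hello")])
  service = InMemoryMemoryService()
  # Caller picks an explicit subset of events to memorize.
  user_events = [e for e in session.events if e.author == "user"]
  await service.add_events_to_memory(
      session, events=user_events, custom_metadata={"source": "manual"}
  )
  return service.memories


memories = asyncio.run(main())
```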
Co-authored-by: George Weale <gweale@google.com>
PiperOrigin-RevId: 868261578
This change introduces new FastAPI endpoints in `adk_web_server.py` and corresponding client methods in `adk_web_server_client.py` to allow fetching metadata for artifact versions without downloading the artifact content.
Closes #3710
Co-authored-by: George Weale <gweale@google.com>
PiperOrigin-RevId: 868217569
Merge https://github.com/google/adk-python/pull/4365
## Summary
- Fixes `DataError` when using PostgreSQL with `asyncpg` for session storage
- PostgreSQL's default `TIMESTAMP` type is `WITHOUT TIME ZONE`, which cannot accept timezone-aware datetime objects
- The existing code handled this for SQLite but not PostgreSQL; this fix applies the same timezone stripping to PostgreSQL
## Error
When creating a session with PostgreSQL + asyncpg, the following error occurs:
```
sqlalchemy.dialects.postgresql.asyncpg.Error: <class 'asyncpg.exceptions.DataError'>:
invalid input for query argument $5: datetime.datetime(2026, 2, 3, 21, 32, 50, 353909,
tzinfo=datetime.timezone.utc) (can't subtract offset-naive and offset-aware datetimes)
```
During the INSERT:
```sql
INSERT INTO sessions (app_name, user_id, id, state, create_time, update_time)
VALUES ($1, $2, $3, $4, $5, $6)
```
Where `$5` and `$6` are timezone-aware datetimes being inserted into `TIMESTAMP WITHOUT TIME ZONE` columns.
## Root Cause
Commit 1063fa53 changed from database-generated timestamps (`func.now()`) to explicit Python datetimes (`datetime.now(timezone.utc)`). The SQLite case was handled by stripping the timezone, but PostgreSQL was overlooked.
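The stripping approach can be sketched as follows (the helper name is hypothetical; the actual function in the codebase may differ):

```python
from datetime import datetime, timezone


def to_naive_utc(dt: datetime) -> datetime:
  """Normalize to UTC, then drop tzinfo so the value fits a
  TIMESTAMP WITHOUT TIME ZONE column."""
  if dt.tzinfo is not None:
    dt = dt.astimezone(timezone.utc).replace(tzinfo=None)
  return dt


# Aware UTC datetime becomes naive; already-naive values pass through.
now = to_naive_utc(datetime.now(timezone.utc))
assert now.tzinfo is None
```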
## Test plan
- [x] Verified fix resolves the error when creating sessions with PostgreSQL + asyncpg
- [ ] Existing unit tests pass
Fixes regression from #1733
COPYBARA_INTEGRATE_REVIEW=https://github.com/google/adk-python/pull/4365 from filipecaixeta:fix-postgresql-timestamp-timezone 9d788ba99e7167a53962d93e59a80f78af091ca9
PiperOrigin-RevId: 867800330
This change introduces an in-process `asyncio.Lock` per session to serialize `append_event` calls for the same session ID within a single process. For supported database dialects (MySQL, PostgreSQL, MariaDB), it also uses `SELECT ... FOR UPDATE` to acquire row-level locks on the session, app state, and user state records, preventing race conditions across different processes or database connections. A new test case verifies that concurrent updates to stale session objects correctly merge all state changes.
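The in-process half of the scheme can be sketched with a per-session lock registry (names and the store shape are illustrative; the row-level `SELECT ... FOR UPDATE` part is shown only as a comment since it needs a real database connection):

```python
import asyncio
from collections import defaultdict

# One in-process lock per session ID.
_session_locks: dict = defaultdict(asyncio.Lock)


async def append_event(session_id: str, event, events_store: dict):
  async with _session_locks[session_id]:
    # Critical section: read-modify-write on the session record.
    # On MySQL/PostgreSQL/MariaDB the real implementation additionally
    # acquires a row-level lock, roughly:
    #   SELECT * FROM sessions WHERE id = :id FOR UPDATE
    # so that concurrent processes serialize as well.
    events_store.setdefault(session_id, []).append(event)


async def main():
  store = {}
  # Ten concurrent appends to the same session must all survive.
  await asyncio.gather(*(append_event("s1", i, store) for i in range(10)))
  return store


store = asyncio.run(main())
```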
Closes #1049
Co-authored-by: George Weale <gweale@google.com>
PiperOrigin-RevId: 867752676
Merge https://github.com/google/adk-python/pull/3462
### Link to Issue or Description of Change
**Problem:**
When using ADK in streaming mode, `usage_metadata.prompt_token_count` may be `None`, which emits the log message:
```
Invalid type NoneType for attribute 'gen_ai.usage.input_tokens' value. Expected one of ['bool', 'str', 'bytes', 'int', 'float'] or a sequence of those types
```
**Solution:**
Skip setting the span attribute when the prompt token count is `None`.
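The guard can be sketched as below; `FakeSpan` and `Usage` are test doubles, and the helper name is hypothetical:

```python
def set_input_token_attribute(span, usage_metadata):
  # OpenTelemetry attributes reject None (per the log message above),
  # so only set the attribute when the count is a real number.
  count = getattr(usage_metadata, "prompt_token_count", None)
  if count is not None:
    span.set_attribute("gen_ai.usage.input_tokens", count)


class FakeSpan:
  def __init__(self):
    self.attributes = {}

  def set_attribute(self, key, value):
    self.attributes[key] = value


class Usage:
  def __init__(self, prompt_token_count):
    self.prompt_token_count = prompt_token_count
```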
**Unit Tests:**
- [x] All unit tests pass locally.
### Checklist
- [x] I have read the [CONTRIBUTING.md](https://github.com/google/adk-python/blob/main/CONTRIBUTING.md) document.
- [x] I have performed a self-review of my own code.
- [x] I have commented my code, particularly in hard-to-understand areas.
- [x] I have added tests that prove my fix is effective or that my feature works.
- [x] New and existing unit tests pass locally with my changes.
- [x] I have manually tested my changes end-to-end.
- [x] Any dependent changes have been merged and published in downstream modules.
COPYBARA_INTEGRATE_REVIEW=https://github.com/google/adk-python/pull/3462 from wsa-2002:prompt-token-count-may-be-none-in-streaming-mode 94666862f70ed2577d5c55485e67f6da36a57bc6
PiperOrigin-RevId: 867693355
Function call and response IDs generated by ADK are now preserved in the LLM request contents when the agent uses a Gemini model with `use_interactions_api` enabled.
Closes #4381
Co-authored-by: George Weale <gweale@google.com>
PiperOrigin-RevId: 867675945
This change updates the VertexAiMemoryBankService to utilize the asynchronous interface provided by `vertexai.Client().aio`. This involves:
- Retrieving the async client via `_get_api_client().aio`.
- Awaiting calls to `generate` and `retrieve`.
- Using `async for` to iterate over the results of the `retrieve` method, as it now returns an async iterator.
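The consumption pattern from the list above can be sketched with a fake client (the class and method below only mirror the described shape; they are not the real `vertexai` SDK):

```python
import asyncio


class FakeAioMemories:
  """Toy async client mirroring the `.aio` surface described above."""

  async def retrieve(self, query):
    async def results():
      for memory in ("fact-1", "fact-2"):
        yield memory
    return results()


async def main():
  aio_client = FakeAioMemories()
  retrieved = []
  # `retrieve` is awaited, and its result is an async iterator,
  # so it must be consumed with `async for`.
  async for memory in await aio_client.retrieve("user preferences"):
    retrieved.append(memory)
  return retrieved


print(asyncio.run(main()))  # ['fact-1', 'fact-2']
```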
Closes #4386
Co-authored-by: George Weale <gweale@google.com>
PiperOrigin-RevId: 867675311
The schema sanitization utility now recursively processes list items, ensuring that properties with list values (e.g., "required") are correctly handled and not altered.
Closes #4363
Co-authored-by: George Weale <gweale@google.com>
PiperOrigin-RevId: 867663267