A ready-to-run example is available in the Ready-to-run Example section below!
How to use Persistence
Save conversation state to disk and restore it later for long-running or multi-session workflows.

Saving State
Create a conversation with a unique ID to enable persistence:
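(A minimal sketch follows; the exact import paths and constructor arguments, such as LLM, Agent, Conversation, conversation_id, and persistence_dir, are assumptions drawn from the standalone SDK examples and may differ between versions. See the ready-to-run example below for the exact API.)

```python
import os
import uuid

# Assumed imports; see examples/01_standalone_sdk/10_persistence.py for the exact API.
from openhands.sdk import LLM, Agent, Conversation

# Model name follows the LiteLLM provider/model_name convention.
llm = LLM(
    model="anthropic/claude-sonnet-4-5-20250929",
    api_key=os.environ["LLM_API_KEY"],
)
agent = Agent(llm=llm, tools=[])

# A fixed, unique conversation ID plus a persistence_dir enables saving to disk.
conversation_id = uuid.uuid4()
conversation = Conversation(
    agent=agent,
    conversation_id=conversation_id,
    persistence_dir="workspace/conversations",
)

conversation.send_message("Take a note: the deployment target is staging.")
conversation.run()
```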
Restoring State

Restore a conversation using the same ID and persistence directory:
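(As with the sketch above, conversation_id and persistence_dir are assumed constructor arguments; the ready-to-run example below shows the exact API.)

```python
# Re-creating the Conversation with the same ID and persistence_dir reloads
# the saved state (message history, agent configuration, statistics, ...) from disk.
restored = Conversation(
    agent=agent,
    conversation_id=conversation_id,
    persistence_dir="workspace/conversations",
)

# The restored conversation continues with its full history and state.
restored.send_message("Continue where we left off.")
restored.run()
```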
What Gets Persisted

The conversation state includes information that allows seamless restoration:

- Message History: Complete event log including user messages, agent responses, and system events
- Agent Configuration: LLM settings, tools, MCP servers, and agent parameters
- Execution State: Current agent status (idle, running, paused, etc.), iteration count, and stuck detection settings
- Tool Outputs: Results from bash commands, file operations, and other tool executions
- Statistics: LLM usage metrics like token counts and API calls
- Workspace Context: Working directory and file system state
- Activated Skills: Skills that have been enabled during the conversation
- Secrets: Managed credentials and API keys
Persistence Directory Structure
When you set a persistence_dir, your conversation is persisted to a directory structure in which each conversation has its own subdirectory. By default, the persistence directory is workspace/conversations/, unless you specify a custom path.
Directory structure:
workspace/conversations/
  <conversation-id-1>/
    base_state.json
    events/
      event-00000-<event-id>.json
      event-00001-<event-id>.json
      ...
  <conversation-id-2>/
    ...
- base_state.json: The core conversation state, including agent configuration, execution status, statistics, and metadata
- events/: A subdirectory containing individual event files, each named with a sequential index and event ID (e.g., event-00000-abc123.json)
The events/ directory contains the same trajectory data you would find in the trajectory.json file from OpenHands V0, but split into individual files for better performance and more granular access.
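Since both base_state.json and the event files are plain JSON, you can inspect a persisted conversation with nothing but the standard library. This sketch assumes only the file layout described above; the conversation ID shown is a placeholder:

```python
import json
from pathlib import Path

# Placeholder ID; substitute the ID of the conversation you created.
conv_dir = Path("workspace/conversations") / "my-conversation-id"

# base_state.json holds the core conversation state.
base_state = json.loads((conv_dir / "base_state.json").read_text())
print("base_state keys:", sorted(base_state))

# events/ holds one JSON file per event, ordered by the index in the filename.
for event_file in sorted((conv_dir / "events").glob("event-*.json")):
    event = json.loads(event_file.read_text())
    print(event_file.name, sorted(event))
```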
Ready-to-run Example
This example is available on GitHub: examples/01_standalone_sdk/10_persistence.py
The model name should follow the LiteLLM convention:
provider/model_name (e.g., anthropic/claude-sonnet-4-5-20250929, openai/gpt-4o).
The LLM_API_KEY should be the API key for your chosen provider.

Reading serialized events
Convert persisted events into LLM-ready messages for reuse or analysis.

This example is available on GitHub: examples/01_standalone_sdk/36_event_json_to_openai_messages.py
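As a rough sketch of the idea (the deserialization and conversion helpers shown here, Event.model_validate and LLMConvertibleEvent.events_to_messages, are assumptions for illustration; the linked example shows the SDK's actual API):

```python
import json
from pathlib import Path

# Assumed imports and helper names; see 36_event_json_to_openai_messages.py for the real API.
from openhands.sdk.event import Event, LLMConvertibleEvent

events_dir = Path("workspace/conversations") / "my-conversation-id" / "events"

# Deserialize each persisted event file back into an Event object (assumed helper).
events = [
    Event.model_validate(json.loads(p.read_text()))
    for p in sorted(events_dir.glob("event-*.json"))
]

# Keep only events that can be rendered for an LLM, then convert them into
# chat messages (assumed helper name).
convertible = [e for e in events if isinstance(e, LLMConvertibleEvent)]
messages = LLMConvertibleEvent.events_to_messages(convertible)
print(f"Reconstructed {len(messages)} messages from {len(events)} persisted events")
```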
Next Steps
- Pause and Resume - Control execution flow
- Async Operations - Non-blocking operations

