class Event
Bases: DiscriminatedUnionMixin, ABC
Base class for all events.
Properties
model_config: ConfigDict
id: EventID – Unique event id (ULID/UUID)
timestamp: str – Event timestamp
source: SourceType – The source of this event
visualize: Text – Return Rich Text representation of this event.
class LLMConvertibleEvent
Bases: Event, ABC
Base class for events that can be converted to LLM messages.
Methods
abstractmethod to_llm_message() -> Message
events_to_messages() -> list[Message] – Convert an event stream to an LLM message stream, handling multi-action batches
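As an illustration of the conversion pattern, here is a minimal, self-contained sketch. The `ToyEvent` and `Message` classes below are stand-ins for the SDK's types, not the real implementation, and this version converts one event per message (the real `events_to_messages` additionally merges multi-action batches):

```python
from dataclasses import dataclass


@dataclass
class Message:
    """Toy stand-in for the SDK's LLM Message type."""
    role: str
    content: str


@dataclass
class ToyEvent:
    """Toy stand-in for an LLMConvertibleEvent subclass."""
    role: str
    text: str

    def to_llm_message(self) -> Message:
        # Each convertible event knows how to render itself as one message.
        return Message(role=self.role, content=self.text)


def events_to_messages(events: list[ToyEvent]) -> list[Message]:
    # Simplified: one message per event; the SDK also handles
    # multi-action batches that share an LLM response id.
    return [event.to_llm_message() for event in events]
```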
class Condensation
Bases: Event
This action indicates a condensation of the conversation history is happening.
Properties
forgotten_event_ids: list[EventID] – The IDs of the events that are being forgotten (removed from the View given to the LLM).
summary: str | None – An optional summary of the events being forgotten.
summary_offset: int | None – An optional offset into the resulting view (after forgotten events have been removed) indicating where the summary should be inserted. If not provided, the summary will not be inserted into the view.
llm_response_id: EventID – Completion or Response ID of the LLM response that generated this event
source: SourceType
visualize: Text
summary_event: CondensationSummaryEvent – Generates a CondensationSummaryEvent.
has_summary_metadata: bool – Checks if both summary and summary_offset are present.
Methods
apply() -> list[LLMConvertibleEvent] – Applies the condensation to a list of events. This method removes events that are marked to be forgotten and returns a new list of events. If the summary metadata is present (both summary and offset), the corresponding CondensationSummaryEvent is inserted at the specified offset after the forgotten events have been removed.
class CondensationRequest
Bases: Event
This action is used to request a condensation of the conversation history.
Properties
source: SourceType
visualize: Text
class CondensationSummaryEvent
Bases: LLMConvertibleEvent
This event represents a summary generated by a condenser.
Properties
summary: str – The summary text.
source: SourceType
Methods
to_llm_message() -> Message
class ConversationStateUpdateEvent
Bases: Event
Event that contains conversation state updates.
This event is sent via websocket whenever the conversation state changes,
allowing remote clients to stay in sync without making REST API calls.
All fields are serialized versions of the corresponding ConversationState fields
to ensure compatibility with websocket transmission.
Properties
source: SourceType
key: str – Unique key for this state update event
value: Any – Serialized conversation state updates
Methods
validate_key()
Parameters:
state: ConversationState – The ConversationState to serialize
conversation_id – The conversation ID for the event
Returns:
ConversationStateUpdateEvent – A ConversationStateUpdateEvent with serialized state data
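On the receiving side, a remote client can fold each update into its local copy of the conversation state. The helper below is a hedged sketch, assuming the websocket payload is a JSON object carrying the event's documented `key` and `value` fields; the function name and payload shape are illustrative assumptions, not the SDK's API:

```python
import json


def apply_state_update(local_state: dict, payload: str) -> dict:
    """Merge one ConversationStateUpdateEvent-style payload into local state.

    `payload` is assumed to be the JSON body received over the websocket,
    with `key` naming the updated ConversationState field and `value`
    holding its serialized form.
    """
    update = json.loads(payload)
    # Store the serialized value under its key; the client stays in sync
    # without making REST API calls.
    local_state[update["key"]] = update["value"]
    return local_state
```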
class LLMCompletionLogEvent
Bases: Event
Event containing LLM completion log data.
When an LLM is configured with log_completions=True in a remote conversation,
this event streams the completion log data back to the client through WebSocket
instead of writing it to a file inside the Docker container.
Properties
source: SourceType
filename: str – The intended filename for this log (relative to the log directory)
log_data: str – The JSON-encoded log data to be written to the file
model_name: str – The model name for context
usage_id: str – The LLM usage_id that produced this log
class ActionEvent
Bases: LLMConvertibleEvent
Properties
source: SourceType
thought: Sequence[TextContent] – The thought process of the agent before taking this action
reasoning_content: str | None – Intermediate reasoning/thinking content from reasoning models
thinking_blocks: list[ThinkingBlock | RedactedThinkingBlock] – Anthropic thinking blocks from the LLM response
responses_reasoning_item: ReasoningItemModel | None – OpenAI Responses reasoning item from model output
action: Action | None – Single tool call returned by the LLM (None when non-executable)
tool_name: str – The name of the tool being called
tool_call_id: ToolCallID – The unique id returned by the LLM API for this tool call
tool_call: MessageToolCall – The tool call received from the LLM response. A copy is kept so it is easier to construct the LLM message. This can differ from action: e.g., tool_call may contain a security_risk field predicted by the LLM when the LLM risk analyzer is enabled, while action does not.
llm_response_id: EventID – Completion or Response ID of the LLM response that generated this event. Can be used to group related actions from the same LLM response, which helps in tracking and managing the results of parallel function calling.
security_risk: risk.SecurityRisk – The LLM
critic_result: CriticResult | None – Optional critic evaluation of this action and preceding history.
summary: str | None – A concise summary (approximately 10 words) of what this action does, provided by the LLM for explainability and debugging.
visualize: Text – Return Rich Text representation of this action event.
Methods
to_llm_message() -> Message – Individual message; may be incomplete for multi-action batches
class AgentErrorEvent
Bases: ObservationBaseEvent
Error triggered by the agent.
Note: This event should not contain model “thought” or “reasoning_content”. It
represents an error produced by the agent/scaffold, not model output.
Properties
source: SourceType
error: str – The error message from the scaffold
visualize: Text – Return Rich Text representation of this agent error event.
Methods
to_llm_message() -> Message
class MessageEvent
Bases: LLMConvertibleEvent
Message from either agent or user.
This was originally the “MessageAction”, but it is not supposed to be a tool call.
Properties
model_config: ConfigDict
source: SourceType
llm_message: Message – The exact LLM message for this message event
llm_response_id: EventID | None – Completion or Response ID of the LLM response that generated this event. If the source !=
activated_skills: list[str] – List of activated skill names
extended_content: list[TextContent] – List of content added by agent context
sender: str | None – Optional identifier of the sender. Can be used to track message origin in multi-agent scenarios.
critic_result: CriticResult | None – Optional critic evaluation of this message and preceding history.
reasoning_content: str
thinking_blocks: Sequence[ThinkingBlock | RedactedThinkingBlock] – Return the Anthropic thinking blocks from the LLM message.
visualize: Text – Return Rich Text representation of this message event.
Methods
to_llm_message() -> Message
class ObservationBaseEvent
Bases: LLMConvertibleEvent
Base class for anything as a response to a tool call.
Examples include tool execution results, errors, and user rejections.
Properties
source: SourceType
tool_name: str – The tool name that this observation is responding to
tool_call_id: ToolCallID – The tool call id that this observation is responding to
class ObservationEvent
Bases: ObservationBaseEvent
Properties
observation: Observation – The observation (tool call) sent to the LLM
action_id: EventID – The action id that this observation is responding to
visualize: Text – Return Rich Text representation of this observation event.
Methods
to_llm_message() -> Message
class SystemPromptEvent
Bases: LLMConvertibleEvent
System prompt added by the agent.
Properties
source: SourceType
system_prompt: TextContent – The system prompt text
tools: list[ToolDefinition] – List of tools as ToolDefinition objects
visualize: Text – Return Rich Text representation of this system prompt event.
Methods
to_llm_message() -> Message
class UserRejectObservation
Bases: ObservationBaseEvent
Observation when user rejects an action in confirmation mode.
Properties
rejection_reason: str – Reason for rejecting the action
action_id: EventID – The action id that this observation is responding to
visualize: Text – Return Rich Text representation of this user rejection event.
Methods
to_llm_message() -> Message
class TokenEvent
Bases: Event
Event from vLLM representing token IDs used in an LLM interaction.
Properties
source: SourceType
prompt_token_ids: list[int] – The exact prompt token IDs for this message event
response_token_ids: list[int] – The exact response token IDs for this message event
class PauseEvent
Bases: Event
Event indicating that the agent execution was paused by user request.
Properties
source: SourceType
visualize: Text – Return Rich Text representation of this pause event.
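Since all of these classes share the Event base (a discriminated union via DiscriminatedUnionMixin), a consumer typically dispatches on the event's type. The sketch below illustrates that pattern over plain dicts; the "kind" tag name and the dict payloads are assumptions for illustration, not the SDK's actual wire format:

```python
def describe_event(event: dict) -> str:
    """Dispatch on a serialized event's type tag (assumed field: "kind")."""
    handlers = {
        "PauseEvent": lambda e: "agent paused by user",
        "AgentErrorEvent": lambda e: f"agent error: {e.get('error', '')}",
        "TokenEvent": lambda e: (
            f"{len(e.get('response_token_ids', []))} response tokens"
        ),
    }
    handler = handlers.get(event["kind"])
    # Fall back gracefully for event types without a registered handler.
    return handler(event) if handler else f"unhandled event: {event['kind']}"
```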

