
class Event

Bases: DiscriminatedUnionMixin, ABC

Base class for all events.

Properties

  • model_config: ConfigDict
  • id: EventID Unique event id (ULID/UUID)
  • timestamp: str Event timestamp
  • source: SourceType The source of this event
  • visualize: Text Return Rich Text representation of this event. This is a fallback implementation for unknown event types; subclasses should override this method to provide specific visualization.

class LLMConvertibleEvent

Bases: Event, ABC

Base class for events that can be converted to LLM messages.

Methods

abstractmethod to_llm_message() -> Message source
events_to_messages() -> list[Message] source Convert an event stream to an LLM message stream, handling multi-action batches. Parameters:
  • events list[LLMConvertibleEvent] – required
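As a rough sketch of how an event stream becomes a message stream, assuming a hypothetical `MiniEvent` stand-in and a simplified `Message` (the real implementation also merges multi-action batches that share an `llm_response_id`):

```python
from dataclasses import dataclass

@dataclass
class Message:
    # Simplified stand-in for the SDK's Message type.
    role: str
    content: str

@dataclass
class MiniEvent:
    # Hypothetical stand-in for an LLMConvertibleEvent subclass.
    role: str
    content: str

    def to_llm_message(self) -> Message:
        return Message(role=self.role, content=self.content)

def events_to_messages(events: list[MiniEvent]) -> list[Message]:
    # Simplified: maps each event to one message; the actual method
    # additionally coalesces multi-action batches into single messages.
    return [e.to_llm_message() for e in events]
```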

class Condensation

Bases: Event

This action indicates a condensation of the conversation history is happening.

Properties

  • forgotten_event_ids: list[EventID] The IDs of the events that are being forgotten (removed from the View given to the LLM).
  • summary: str | None An optional summary of the events being forgotten.
  • summary_offset: int | None An optional offset to the start of the resulting view (after forgotten events have been removed) indicating where the summary should be inserted. If not provided, the summary will not be inserted into the view.
  • llm_response_id: EventID Completion or Response ID of the LLM response that generated this event
  • source: SourceType
  • visualize: Text
  • summary_event: CondensationSummaryEvent Generates a CondensationSummaryEvent.
Since summary events are not part of the main event store and are generated dynamically, this property ensures the created event has a unique and consistent ID based on the condensation event’s ID. Raises: ValueError: If no summary is present.
  • has_summary_metadata: bool Checks if both summary and summary_offset are present.

Methods

apply() -> list[LLMConvertibleEvent] source Applies the condensation to a list of events: removes events marked as forgotten and returns a new list. If the summary metadata is present (both summary and summary_offset), the corresponding CondensationSummaryEvent is inserted at the specified offset after the forgotten events have been removed. Parameters:
  • events list[LLMConvertibleEvent] – required
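The apply() semantics can be sketched as a plain function over minimal stand-in events (`MiniEvent` and `apply_condensation` are illustrative names, not the SDK API):

```python
from dataclasses import dataclass

@dataclass
class MiniEvent:
    id: str
    text: str

def apply_condensation(events, forgotten_ids, summary=None, summary_offset=None):
    """Drop forgotten events; if both summary and summary_offset are
    present, insert a summary event at that offset in the pruned list."""
    forgotten = set(forgotten_ids)
    kept = [e for e in events if e.id not in forgotten]
    if summary is not None and summary_offset is not None:
        kept.insert(summary_offset, MiniEvent(id="summary", text=summary))
    return kept
```

Note that the offset is applied after removal, so it indexes into the already-pruned view.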

class CondensationRequest

Bases: Event

This action is used to request a condensation of the conversation history.

Properties

  • source: SourceType
  • visualize: Text

class CondensationSummaryEvent

Bases: LLMConvertibleEvent

This event represents a summary generated by a condenser.

Properties

  • summary: str The summary text.
  • source: SourceType

Methods

to_llm_message() -> Message source

class ConversationStateUpdateEvent

Bases: Event

Event that contains conversation state updates. This event is sent via websocket whenever the conversation state changes, allowing remote clients to stay in sync without making REST API calls. All fields are serialized versions of the corresponding ConversationState fields to ensure compatibility with websocket transmission.

Properties

  • source: SourceType
  • key: str Unique key for this state update event
  • value: Any Serialized conversation state updates

Methods

validate_key() source Parameters:
  • key – required
validate_value() source Parameters:
  • value – required
  • info – required
from_conversation_state() -> ConversationStateUpdateEvent source Create a state update event from a ConversationState object. This creates an event containing a snapshot of important state fields. Parameters:
  • state ConversationState – The ConversationState to serialize (required)
  • conversation_id – The conversation ID for the event
Returns:
  • ConversationStateUpdateEvent – A ConversationStateUpdateEvent with serialized state data
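A hedged sketch of what the key validation and state snapshot might look like, using plain Python in place of the actual Pydantic validators (all names and checks here are illustrative assumptions, not the SDK implementation):

```python
from typing import Any

def validate_key(key: str) -> str:
    # Illustrative check: the real validator likely verifies the key
    # against known ConversationState field names; here we only require
    # a non-empty string.
    if not isinstance(key, str) or not key:
        raise ValueError("key must be a non-empty string")
    return key

def make_state_update_event(state: dict[str, Any]) -> dict[str, Any]:
    # Illustrative from_conversation_state: snapshot serializable fields
    # into a websocket-safe payload (a plain dict stands in for the event).
    return {"key": "full_state", "value": dict(state)}
```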

class LLMCompletionLogEvent

Bases: Event

Event containing LLM completion log data. When an LLM is configured with log_completions=True in a remote conversation, this event streams the completion log data back to the client through the WebSocket instead of writing it to a file inside the Docker container.

Properties

  • source: SourceType
  • filename: str The intended filename for this log (relative to log directory)
  • log_data: str The JSON-encoded log data to be written to the file
  • model_name: str The model name for context
  • usage_id: str The LLM usage_id that produced this log
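A client receiving this event might persist the log locally. The following sketch (`write_completion_log` is a hypothetical helper, not part of the SDK) writes log_data under a local log directory while refusing filenames that escape it:

```python
from pathlib import Path

def write_completion_log(log_dir: str, filename: str, log_data: str) -> Path:
    # Resolve both paths so ".." segments in filename cannot escape
    # the log directory.
    root = Path(log_dir).resolve()
    target = (root / filename).resolve()
    if root not in target.parents and target != root:
        raise ValueError("filename escapes the log directory")
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_text(log_data)
    return target
```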

class ActionEvent

Bases: LLMConvertibleEvent

Properties

  • source: SourceType
  • thought: Sequence[TextContent] The thought process of the agent before taking this action
  • reasoning_content: str | None Intermediate reasoning/thinking content from reasoning models
  • thinking_blocks: list[ThinkingBlock | RedactedThinkingBlock] Anthropic thinking blocks from the LLM response
  • responses_reasoning_item: ReasoningItemModel | None OpenAI Responses reasoning item from model output
  • action: Action | None Single tool call returned by LLM (None when non-executable)
  • tool_name: str The name of the tool being called
  • tool_call_id: ToolCallID The unique id returned by LLM API for this tool call
  • tool_call: MessageToolCall The tool call received from the LLM response. We keep a copy of it so it is easier to construct the LLM message. This can differ from action: e.g., tool_call may contain a security_risk field predicted by the LLM when the LLM risk analyzer is enabled, while action does not.
  • llm_response_id: EventID Completion or Response ID of the LLM response that generated this event. For example, it can be used to group related actions from the same LLM response, which helps in tracking and managing the results of parallel function calls.
  • security_risk: risk.SecurityRisk The security risk of this action as predicted by the LLM
  • critic_result: CriticResult | None Optional critic evaluation of this action and preceding history.
  • summary: str | None A concise summary (approximately 10 words) of what this action does, provided by the LLM for explainability and debugging.
  • visualize: Text Return Rich Text representation of this action event.

Methods

to_llm_message() -> Message source Individual message - may be incomplete for multi-action batches
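Since parallel tool calls from one LLM response share an llm_response_id, a consumer can batch consecutive actions before building complete messages. A minimal sketch over plain dicts (`group_parallel_actions` is an illustrative helper, not the SDK API):

```python
from itertools import groupby

def group_parallel_actions(action_events):
    # Actions emitted by one LLM response (parallel tool calls) share an
    # llm_response_id; group consecutive events by that id so a batch can
    # later be rendered as a single assistant message.
    return [list(batch) for _, batch in
            groupby(action_events, key=lambda e: e["llm_response_id"])]
```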

class AgentErrorEvent

Bases: ObservationBaseEvent

Error triggered by the agent. Note: this event should not contain model "thought" or "reasoning_content"; it represents an error produced by the agent/scaffold, not model output.

Properties

  • source: SourceType
  • error: str The error message from the scaffold
  • visualize: Text Return Rich Text representation of this agent error event.

Methods

to_llm_message() -> Message source

class MessageEvent

Bases: LLMConvertibleEvent

Message from either the agent or the user. This was originally the "MessageAction"; it is not supposed to be a tool call.

Properties

  • model_config: ConfigDict
  • source: SourceType
  • llm_message: Message The exact LLM message for this message event
  • llm_response_id: EventID | None Completion or Response ID of the LLM response that generated this event.
  • activated_skills: list[str] List of activated skill names
  • extended_content: list[TextContent] List of content added by agent context
  • sender: str | None Optional identifier of the sender. Can be used to track message origin in multi-agent scenarios.
  • critic_result: CriticResult | None Optional critic evaluation of this message and preceding history.
  • reasoning_content: str
  • thinking_blocks: Sequence[ThinkingBlock | RedactedThinkingBlock] Return the Anthropic thinking blocks from the LLM message.
  • visualize: Text Return Rich Text representation of this message event.

Methods

to_llm_message() -> Message source

class ObservationBaseEvent

Bases: LLMConvertibleEvent

Base class for any event produced in response to a tool call. Examples include tool execution results, errors, and user rejections.

Properties

  • source: SourceType
  • tool_name: str The tool name that this observation is responding to
  • tool_call_id: ToolCallID The tool call id that this observation is responding to

class ObservationEvent

Bases: ObservationBaseEvent

Properties

  • observation: Observation The observation (tool call) sent to LLM
  • action_id: EventID The action id that this observation is responding to
  • visualize: Text Return Rich Text representation of this observation event.

Methods

to_llm_message() -> Message source
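Because every observation carries the tool_call_id of the call it answers, pairing actions with their observations reduces to a dictionary lookup. A sketch over plain dicts (`pair_observations` is a hypothetical helper, not the SDK API):

```python
def pair_observations(action_events, observation_events):
    # Each observation answers exactly one tool call; index observations
    # by tool_call_id, then look up the match for each action.
    obs_by_call = {o["tool_call_id"]: o for o in observation_events}
    return [(a, obs_by_call.get(a["tool_call_id"])) for a in action_events]
```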

class SystemPromptEvent

Bases: LLMConvertibleEvent

System prompt added by the agent.

Properties

  • source: SourceType
  • system_prompt: TextContent The system prompt text
  • tools: list[ToolDefinition] List of tools as ToolDefinition objects
  • visualize: Text Return Rich Text representation of this system prompt event.

Methods

to_llm_message() -> Message source

class UserRejectObservation

Bases: ObservationBaseEvent

Observation emitted when the user rejects an action in confirmation mode.

Properties

  • rejection_reason: str Reason for rejecting the action
  • action_id: EventID The action id that this observation is responding to
  • visualize: Text Return Rich Text representation of this user rejection event.

Methods

to_llm_message() -> Message source

class TokenEvent

Bases: Event

Event from vLLM representing the token IDs used in an LLM interaction.

Properties

  • source: SourceType
  • prompt_token_ids: list[int] The exact prompt token IDs for this message event
  • response_token_ids: list[int] The exact response token IDs for this message event

class PauseEvent

Bases: Event

Event indicating that the agent execution was paused by user request.

Properties

  • source: SourceType
  • visualize: Text Return Rich Text representation of this pause event.