Models

All communication between the client and the Voxta server is handled through strongly typed models. These are implemented as Python dataclasses, making them easy to inspect, validate, and manipulate.

Base Architecture

Every message inherits from a base class that handles SignalR serialization.

voxta_client.models.VoxtaModel dataclass

VoxtaModel()

Base class for Voxta data models.

voxta_client.models.ServerMessage dataclass

ServerMessage()

Bases: VoxtaModel

Base class for all messages from the server.

voxta_client.models.ClientMessage dataclass

ClientMessage()

Bases: VoxtaModel

Base class for all messages sent from the client to the server.

Functions

to_signalr_invocation

to_signalr_invocation(invocation_id: str) -> dict[str, Any]

Wraps the message in a SignalR invocation payload.
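As a sketch of what this wrapping looks like, the stand-in dataclass below mirrors a minimal client message and produces a standard SignalR JSON invocation envelope (message type 1). The real models live in `voxta_client.models`, and the `target` hub method name used here is an assumption for illustration:

```python
from dataclasses import dataclass, asdict
from typing import Any

# Hypothetical stand-in for a ClientMessage subclass; the real classes
# live in voxta_client.models.
@dataclass
class PingMessage:
    type_name: str = "ping"

    def to_signalr_invocation(self, invocation_id: str) -> dict[str, Any]:
        # Standard SignalR JSON hub protocol envelope for an invocation.
        return {
            "type": 1,                    # 1 = Invocation in the SignalR protocol
            "invocationId": invocation_id,
            "target": "SendMessage",      # assumed hub method name
            "arguments": [asdict(self)],  # the dataclass fields as a JSON object
        }

payload = PingMessage().to_signalr_invocation("42")
```

The resulting dictionary can be serialized to JSON and written to the SignalR connection as-is.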


Server Messages (Incoming)

These messages are sent from the Voxta server to your client. You will typically receive these in your event handlers.

Core Interaction

The most common messages you'll encounter during a chat session.

voxta_client.models.ServerWelcomeMessage dataclass

ServerWelcomeMessage(assistant: dict[str, Any], user: dict[str, Any], type_name: str = 'welcome')

Bases: ServerMessage

Message received from the server upon successful connection. Contains information about the assistant and the user.

voxta_client.models.ServerChatMessage dataclass

ServerChatMessage(messageId: str, senderId: str, text: str, role: str, timestamp: str, sessionId: str, type_name: str = 'message')

Bases: ServerMessage

Represents a chat message received from the server.

Attributes:

  • messageId (str) –

    Unique identifier for the message.

  • senderId (str) –

    ID of the character or user who sent the message.

  • text (str) –

    The content of the message.

  • role (str) –

    The role of the sender (e.g., 'Assistant', 'User').

  • timestamp (str) –

    ISO timestamp of when the message was sent.

  • sessionId (str) –

    The active chat session ID.
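A typical event handler simply reads these attributes. The sketch below uses a local stand-in mirroring the documented `ServerChatMessage` signature (the real class is `voxta_client.models.ServerChatMessage`):

```python
from dataclasses import dataclass

# Stand-in mirroring the documented ServerChatMessage fields.
@dataclass
class ServerChatMessage:
    messageId: str
    senderId: str
    text: str
    role: str
    timestamp: str
    sessionId: str
    type_name: str = "message"

def format_transcript_line(msg: ServerChatMessage) -> str:
    """Render an incoming chat message as a transcript line."""
    return f"[{msg.timestamp}] {msg.role}: {msg.text}"

msg = ServerChatMessage(
    messageId="m-1", senderId="char-1", text="Hello!",
    role="Assistant", timestamp="2024-01-01T12:00:00Z", sessionId="s-1",
)
print(format_transcript_line(msg))  # [2024-01-01T12:00:00Z] Assistant: Hello!
```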

voxta_client.models.ServerActionMessage dataclass

ServerActionMessage(value: str, role: str, senderId: str, sessionId: str, contextKey: Optional[str] = None, layer: Optional[str] = None, arguments: Optional[list[dict[str, Any]]] = None, type_name: str = 'action')

Bases: ServerMessage

Represents an action triggered by the AI (e.g., an emote or a command).

Attributes:

  • value (str) –

    The name of the action (e.g., 'smile', 'search_web').

  • arguments (Optional[list[dict[str, Any]]]) –

    Optional list of parameters for the action.

  • role (str) –

    The character role performing the action.

  • senderId (str) –

    ID of the character.

  • sessionId (str) –

    The active chat session ID.
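Because `value` carries the action name, a handler often reduces to a dispatch table. A minimal sketch, using a stand-in that mirrors the documented signature (the real class is `voxta_client.models.ServerActionMessage`):

```python
from dataclasses import dataclass
from typing import Any, Callable, Optional

# Stand-in mirroring the documented ServerActionMessage fields.
@dataclass
class ServerActionMessage:
    value: str
    role: str
    senderId: str
    sessionId: str
    contextKey: Optional[str] = None
    layer: Optional[str] = None
    arguments: Optional[list[dict[str, Any]]] = None
    type_name: str = "action"

# Map action names to handlers; unknown actions fall through to None.
handlers: dict[str, Callable[[ServerActionMessage], str]] = {
    "smile": lambda a: f"{a.senderId} smiles",
}

def dispatch(action: ServerActionMessage) -> Optional[str]:
    handler = handlers.get(action.value)
    return handler(action) if handler else None
```

Ignoring unknown action names keeps the handler forward-compatible with actions registered by other clients.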

Session Flow

Updates about the state of the chat (starting, loading, paused, etc.).

voxta_client.models.ServerChatFlowMessage dataclass

ServerChatFlowMessage(state: str, type_name: str = 'chatFlow')

Bases: ServerMessage

Indicates a change in the chat flow state (e.g., waiting for user, thinking).

voxta_client.models.ServerChatStartingMessage dataclass

ServerChatStartingMessage(type_name: str = 'chatStarting')

Bases: ServerMessage

Received when a chat session is starting.

voxta_client.models.ServerChatLoadingMessage dataclass

ServerChatLoadingMessage(type_name: str = 'chatLoading')

Bases: ServerMessage

Received while a chat session is being loaded.

voxta_client.models.ServerChatClosedMessage dataclass

ServerChatClosedMessage(chatId: str, type_name: str = 'chatClosed')

Bases: ServerMessage

Received when a chat session has been closed.
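Together these messages let a client track the chat lifecycle as a small state machine. The sketch below keys off the documented `type_name` defaults (`chatStarting`, `chatLoading`, `chatFlow`, `chatClosed`); the payload dicts stand in for the corresponding deserialized models:

```python
from typing import Any

def on_server_message(state: dict, type_name: str, payload: dict[str, Any]) -> None:
    """Update a simple session-state dict from incoming flow messages."""
    if type_name == "chatStarting":
        state["status"] = "starting"
    elif type_name == "chatLoading":
        state["status"] = "loading"
    elif type_name == "chatFlow":
        # ServerChatFlowMessage carries the new flow state as a string.
        state["status"] = payload.get("state")
    elif type_name == "chatClosed":
        state["status"] = "closed"
        state["chatId"] = payload.get("chatId")
```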


Client Messages (Outgoing)

These models are used when sending requests to the server. Note that the VoxtaClient provides helper methods for most of these, so you rarely need to instantiate them directly.

Chat Control

voxta_client.models.ClientStartChatMessage dataclass

ClientStartChatMessage(characterId: str, contexts: list[dict[str, Any]] = list(), type_name: str = 'startChat')

Bases: ClientMessage

Start a new chat with a character.

voxta_client.models.ClientResumeChatMessage dataclass

ClientResumeChatMessage(chatId: str, type_name: str = 'resumeChat')

Bases: ClientMessage

Resume a specific chat.

voxta_client.models.ClientSendMessage dataclass

ClientSendMessage(sessionId: str, text: str, doReply: bool = True, doUserActionInference: bool = True, doCharacterActionInference: bool = True, type_name: str = 'send')

Bases: ClientMessage

Send a message from the user to the session.
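Even though `VoxtaClient` helpers usually build these for you, it can be useful to see the shapes directly. The stand-ins below mirror the documented signatures (the real classes are in `voxta_client.models`); the IDs are placeholders:

```python
from dataclasses import dataclass, field
from typing import Any

# Stand-ins mirroring the documented client message signatures.
@dataclass
class ClientStartChatMessage:
    characterId: str
    contexts: list[dict[str, Any]] = field(default_factory=list)
    type_name: str = "startChat"

@dataclass
class ClientSendMessage:
    sessionId: str
    text: str
    doReply: bool = True
    doUserActionInference: bool = True
    doCharacterActionInference: bool = True
    type_name: str = "send"

start = ClientStartChatMessage(characterId="char-123")
# Once the server responds with a session ID, send user text. Setting
# doReply=False records the message without prompting an AI reply.
note = ClientSendMessage(sessionId="s-1", text="(door creaks)", doReply=False)
```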

State & Inference

voxta_client.models.ClientTriggerActionMessage dataclass

ClientTriggerActionMessage(sessionId: str, messageId: str, value: str, arguments: Optional[dict[str, Any]] = None, type_name: str = 'triggerAction')

Bases: ClientMessage

Trigger an AI action manually.
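This lets the client fire the same kind of action the AI would normally trigger on its own. A sketch with a stand-in mirroring the documented signature (the real class is `voxta_client.models.ClientTriggerActionMessage`; IDs and the action name are placeholders):

```python
from dataclasses import dataclass
from typing import Any, Optional

# Stand-in mirroring the documented ClientTriggerActionMessage fields.
@dataclass
class ClientTriggerActionMessage:
    sessionId: str
    messageId: str
    value: str
    arguments: Optional[dict[str, Any]] = None
    type_name: str = "triggerAction"

# Manually trigger the 'smile' action against an existing message.
trigger = ClientTriggerActionMessage(sessionId="s-1", messageId="m-7", value="smile")
```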

voxta_client.models.ClientUpdateContextMessage dataclass

ClientUpdateContextMessage(sessionId: str, contextKey: str, contexts: Optional[list[dict[str, Any]]] = None, actions: Optional[list[dict[str, Any]]] = None, events: Optional[list[dict[str, Any]]] = None, setFlags: Optional[list[str]] = None, enableRoles: Optional[dict[str, bool]] = None, type_name: str = 'updateContext')

Bases: ClientMessage

Update the session context.
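Since every optional field defaults to None, a context update only needs to name the fields it changes. In the sketch below the stand-in mirrors the documented signature, while the shape of the action dict (name plus description) is an assumption for illustration:

```python
from dataclasses import dataclass
from typing import Any, Optional

# Stand-in mirroring the documented ClientUpdateContextMessage fields.
@dataclass
class ClientUpdateContextMessage:
    sessionId: str
    contextKey: str
    contexts: Optional[list[dict[str, Any]]] = None
    actions: Optional[list[dict[str, Any]]] = None
    events: Optional[list[dict[str, Any]]] = None
    setFlags: Optional[list[str]] = None
    enableRoles: Optional[dict[str, bool]] = None
    type_name: str = "updateContext"

# Register a custom action and set a flag under one context key.
# The action dict's keys are assumed here, not taken from the docs.
update = ClientUpdateContextMessage(
    sessionId="s-1",
    contextKey="my_app",
    actions=[{"name": "wave", "description": "Wave at the user"}],
    setFlags=["user_present"],
)
```

Untouched fields (here `contexts`, `events`, `enableRoles`) stay None and are left out of the update.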

voxta_client.models.ClientInterruptMessage dataclass

ClientInterruptMessage(sessionId: str, type_name: str = 'interrupt')

Bases: ClientMessage

Interrupt the AI response.


Advanced Models

The library contains many more specialized models for audio streaming, resource deployment, and inspector debugging. For the full list of available models, refer to the source code.