The Agent System enables autonomous AI agents to perform multi-step tasks through the AIbitat framework, a conversation engine that orchestrates tool calling, manages multi-agent interactions, and supports streaming execution. The system is built around the AgentHandler class, which coordinates agent initialization, plugin loading, and execution flow.
This page documents the AIbitat framework architecture, AgentHandler orchestration, the five-type plugin system, tool execution modes (streaming vs non-streaming), and WebSocket communication. For general chat functionality without agents, see page 7.1. For workspace-level agent configuration, see page 8.1.
AIbitat is the conversation engine at the heart of the agent system. It manages multi-agent conversations, channels (group chats), functions (tools), and execution flow. The framework is instantiated by AgentHandler and provides methods for agent registration, function registration, and conversation orchestration.
Diagram: AIbitat Class Structure
Sources: server/utils/agents/aibitat/index.js1-1020
| Property | Type | Description |
|---|---|---|
| agents | Map | Stores agent definitions with their roles and available functions |
| channels | Map | Stores channel configurations for multi-agent group conversations |
| functions | Map | Registry of available tools/skills that agents can invoke |
| emitter | EventEmitter | Event system for message, error, terminate, and interrupt events |
| provider | string | The LLM provider being used (e.g., "openai", "anthropic") |
| model | string | The specific model being used |
| _chats | Array | Chat history with states (success, error, interrupt) |
| skipHandleExecution | boolean | Flag to return tool results directly without further LLM processing |
Sources: server/utils/agents/aibitat/index.js13-58
AIbitat uses builder-pattern methods to register agents and channels:
```javascript
// Register an agent
aibitat.agent("@workspace_agent", {
  role: "You are a helpful AI assistant...",
  functions: ["rag-memory", "web-scraping", "save-file-to-browser"],
});

// Register a channel (multi-agent group)
aibitat.channel("research_team", ["@researcher", "@writer", "@reviewer"], {
  maxRounds: 10,
});
```
Sources: server/utils/agents/aibitat/index.js76-102
Functions (tools) are registered with the function() method:
```javascript
aibitat.function({
  name: "rag-memory",
  description: "Search workspace documents",
  parameters: { ... },             // JSON-schema description of the tool's arguments
  handler: async (args) => { ... } // executed when the LLM invokes the tool
});
```
Sources: server/utils/agents/aibitat/index.js1013-1016
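To make the registry behavior concrete, here is a minimal stand-in sketch: a Map keyed by tool name, as the `functions` property above suggests. The `document-summarizer` tool, its handler, and `registerFunction` are hypothetical illustrations, not part of AIbitat's API.

```javascript
// Minimal stand-in for AIbitat's function registry: a Map keyed by tool name.
const functions = new Map();

function registerFunction(def) {
  // Same shape as the snippet above: name, description, JSON-schema parameters, async handler.
  functions.set(def.name, def);
}

registerFunction({
  name: "document-summarizer", // hypothetical tool
  description: "Summarize a workspace document by id",
  parameters: {
    type: "object",
    properties: { docId: { type: "string" } },
    required: ["docId"],
  },
  handler: async ({ docId }) => `Summary of ${docId}`,
});

// The conversation loop looks tools up by name when the LLM requests them:
const tool = functions.get("document-summarizer");
```

Keying by name is what lets an agent's `functions: [...]` array reference tools as plain strings.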
AgentHandler is the primary orchestrator that initializes the AIbitat instance, loads agents, validates provider configuration, and attaches plugins. It serves as the bridge between the chat system and the AIbitat framework.
Diagram: AgentHandler Initialization Flow
Sources: server/utils/agents/index.js14-626
AgentHandler validates that required environment variables are set for the configured provider. The checkSetup() method contains provider-specific validation logic for all 30+ supported providers:
| Provider | Required ENV Variables |
|---|---|
| openai | OPEN_AI_KEY |
| anthropic | ANTHROPIC_API_KEY |
| ollama | OLLAMA_BASE_PATH |
| lmstudio | LMSTUDIO_BASE_PATH |
| azure | AZURE_OPENAI_ENDPOINT, AZURE_OPENAI_KEY |
| bedrock | No validation (multiple auth methods) |
| ... | (28+ more providers) |
Sources: server/utils/agents/index.js75-241
AgentHandler implements a three-tier fallback for model selection:
1. workspace.agentModel, if explicitly set
2. workspace.chatProvider and workspace.chatModel
3. The LLM_PROVIDER ENV variable, calling providerDefault() for the base model

This logic is implemented in #getFallbackProvider() and #fetchModel().
Sources: server/utils/agents/index.js334-385
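The tier order can be sketched as a small resolver. This is a hedged illustration: `resolveAgentModel`, `providerDefault`'s contents, and the `agentProvider` field are assumptions, not AgentHandler's private implementation.

```javascript
// Illustrative base models only; the real providerDefault() lives elsewhere.
function providerDefault(provider) {
  return { openai: "gpt-4o", anthropic: "claude-3-5-sonnet-latest" }[provider] ?? null;
}

// Tiered provider/model resolution mirroring the fallback order described above.
function resolveAgentModel(workspace, env) {
  // Tier 1: an explicitly configured agent model on the workspace.
  if (workspace.agentProvider && workspace.agentModel)
    return { provider: workspace.agentProvider, model: workspace.agentModel };
  // Tier 2: the workspace's chat provider and model.
  if (workspace.chatProvider && workspace.chatModel)
    return { provider: workspace.chatProvider, model: workspace.chatModel };
  // Tier 3: the system-wide LLM_PROVIDER with that provider's default model.
  return { provider: env.LLM_PROVIDER, model: providerDefault(env.LLM_PROVIDER) };
}
```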
The AIbitat framework supports five types of plugins, each loaded differently by AgentHandler.#attachPlugins(). Plugins are identified by naming conventions in the agent's functions array.
Diagram: Plugin Type Detection and Loading
Sources: server/utils/agents/index.js426-552
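The naming conventions described above can be condensed into a small classifier. This is a sketch: AgentHandler performs these checks inline rather than through a helper like `classifyPlugin`, but the prefix logic and its ordering (specific `@@flow_`/`@@mcp_` prefixes before the generic `@@`) follow directly from the loading code shown in the sections below.

```javascript
// Classify a plugin name by its naming convention.
// Order matters: @@flow_ and @@mcp_ must be checked before the generic @@ prefix,
// since both also satisfy startsWith("@@").
function classifyPlugin(name) {
  if (name.startsWith("@@flow_")) return "flow";     // user-created workflow
  if (name.startsWith("@@mcp_")) return "mcp";       // MCP server integration
  if (name.startsWith("@@")) return "imported";      // community hub plugin
  if (name.includes("#")) return "child";            // nested parent#child plugin
  return "standard";                                 // simple AgentPlugins entry
}
```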
Standard plugins are single-stage plugins identified by simple names (no special prefixes). They are loaded from the AgentPlugins object.
Examples: "rag-memory", "web-scraping", "save-file-to-browser"
Loading:
```javascript
if (!name.includes("#") && !name.startsWith("@@")) {
  const AIbitatPlugin = AgentPlugins[name];
  const callOpts = this.parseCallOptions(args, AgentPlugins[name].startupConfig.params);
  this.aibitat.use(AIbitatPlugin.plugin(callOpts));
}
```
Sources: server/utils/agents/index.js536-551
Child plugins are nested within parent plugins, denoted by parent#childName syntax. The parent plugin exports multiple child plugins.
Example: "sql-agent#list-database-connections"
Loading:
```javascript
if (name.includes("#")) {
  const [parent, childPluginName] = name.split("#");
  const childPlugin = AgentPlugins[parent].plugin.find(
    (child) => child.name === childPluginName
  );
  this.aibitat.use(childPlugin.plugin(callOpts)); // callOpts parsed as for standard plugins
}
```
Sources: server/utils/agents/index.js428-458
Flow plugins are user-created workflows identified by @@flow_ prefix followed by a UUID. They are loaded from the flows directory.
Example: "@@flow_a1b2c3d4-e5f6-7890-abcd-ef1234567890"
Loading:
```javascript
if (name.startsWith("@@flow_")) {
  const uuid = name.replace("@@flow_", "");
  const plugin = AgentFlows.loadFlowPlugin(uuid, this.aibitat);
  this.aibitat.use(plugin.plugin());
}
```
Sources: server/utils/agents/index.js460-476
MCP (Model Context Protocol) plugins integrate external MCP servers as agent tools. They are identified by @@mcp_ prefix followed by the server name. The MCP server's tools are converted to individual child plugins.
Example: "@@mcp_filesystem"
Loading:
```javascript
if (name.startsWith("@@mcp_")) {
  const mcpPluginName = name.replace("@@mcp_", "");
  const plugins = await new MCPCompatibilityLayer()
    .convertServerToolsToPlugins(mcpPluginName, this.aibitat);

  // Replace the parent plugin entry with its individual child tool plugins
  this.aibitat.agents.get("@agent").functions =
    this.aibitat.agents.get("@agent").functions.filter((f) => f.name !== name);
  for (const plugin of plugins) {
    this.aibitat.agents.get("@agent").functions.push(plugin.name);
    this.aibitat.use(plugin.plugin());
  }
}
```
Sources: server/utils/agents/index.js478-513
Imported plugins are community-contributed plugins from the hub, identified by @@ prefix followed by a hub ID.
Example: "@@custom-weather-plugin-abc123"
Loading:
```javascript
// This branch only runs after the more specific @@flow_ and @@mcp_ checks
if (name.startsWith("@@")) {
  const hubId = name.replace("@@", "");
  const plugin = ImportedPlugin.loadPluginByHubId(hubId);
  const callOpts = plugin.parseCallOptions();
  this.aibitat.use(plugin.plugin(callOpts));
}
```
Sources: server/utils/agents/index.js515-534
Two core plugins are always attached to the AIbitat instance:
| Plugin | Purpose | Configuration |
|---|---|---|
| websocket | Frontend communication via WebSocket | socket, muteUserReply: true, introspection: true |
| chatHistory | Persist agent messages to database | No configuration |
Sources: server/utils/agents/index.js596-611
AIbitat supports two execution modes: synchronous (non-streaming) and asynchronous (streaming). The mode is determined by the provider's supportsAgentStreaming property.
Diagram: Execution Mode Selection
Sources: server/utils/agents/aibitat/index.js580-627 server/utils/agents/aibitat/index.js640-738 server/utils/agents/aibitat/index.js751-832
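The branch on `supportsAgentStreaming` can be sketched as follows. The `execute` wrapper is an illustration of the selection logic, not AIbitat's actual method; the `stream`/`complete` calls mirror the provider interface described in the next two sections.

```javascript
// Choose the execution path based on the provider's streaming capability.
async function execute(provider, messages, functions, eventHandler) {
  if (provider.supportsAgentStreaming) {
    // Streaming: events are forwarded to the frontend as they arrive.
    return provider.stream(messages, functions, eventHandler);
  }
  // Non-streaming: wait for the complete response before returning.
  return provider.complete(messages, functions);
}
```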
When provider.supportsAgentStreaming is true, AIbitat uses streaming execution with real-time event reporting:
Key Features:
- Calls provider.stream(messages, functions, eventHandler)
- Emits textResponseChunk, toolCallInvocation, and fullTextResponse events
- Supports the skipHandleExecution flag for direct output

Event Handler:
```javascript
const eventHandler = (type, data) => {
  this?.socket?.send(type, data);
};
```
Sources: server/utils/agents/aibitat/index.js640-738
When streaming is not supported, AIbitat waits for complete responses:
Key Features:
- Calls provider.complete(messages, functions)

Sources: server/utils/agents/aibitat/index.js751-832
Diagram: Tool Invocation Process
Sources: server/utils/agents/aibitat/index.js657-734 server/utils/agents/aibitat/index.js760-828
The skipHandleExecution flag enables tools to return results directly to the chat without further LLM processing:
```javascript
this.skipHandleExecution = true; // Set by the tool

// ... tool execution ...

if (this.skipHandleExecution) {
  this.skipHandleExecution = false; // Reset the flag
  return result; // Direct return, no further LLM calls
}
```
This is useful for tools whose results should reach the user verbatim, without an additional LLM pass reinterpreting them.
Sources: server/utils/agents/aibitat/index.js704-719 server/utils/agents/aibitat/index.js803-813
The WebSocket plugin (AgentPlugins.websocket) provides real-time bidirectional communication between the agent system and the frontend.
Diagram: WebSocket Plugin Architecture
Sources: server/utils/agents/index.js596-604 server/utils/agents/aibitat/index.js646-648
The WebSocket plugin is configured with:
| Option | Value | Purpose |
|---|---|---|
| socket | WebSocket instance | The socket connection to send events to |
| muteUserReply | true | Prevents echoing user messages back |
| introspection | true | Enables detailed tool execution logging |
Sources: server/utils/agents/index.js599-603
During streaming execution, events are sent to the frontend:
```javascript
eventHandler?.("reportStreamEvent", {
  type: "textResponseChunk",
  uuid: msgUUID,
  content: choice.delta.content,
});

eventHandler?.("reportStreamEvent", {
  type: "toolCallInvocation",
  uuid: `${msgUUID}:tool_call_invocation`,
  content: `Assembling Tool Call: ${functionCall.name}(...)`,
});
```
Sources: server/utils/agents/aibitat/providers/ai-provider.js434-451
AIbitat supports two types of conversation participants: agents (individual AI assistants) and channels (multi-agent groups).
Agents are registered with a name and configuration object containing:
| Field | Type | Description |
|---|---|---|
| role | string | System prompt defining the agent's behavior |
| functions | string[] | Array of tool names the agent can invoke |
| interrupt | string? | Interrupt behavior: "ALWAYS" or "NEVER" |
Default Agents: USER_AGENT ("@user") and WORKSPACE_AGENT ("@agent").
Sources: server/utils/agents/defaults.js1-100 (referenced), server/utils/agents/index.js554-573
Channels enable multi-agent conversations where multiple AI agents collaborate:
```javascript
aibitat.channel("research_team", ["@researcher", "@writer", "@reviewer"], {
  maxRounds: 10,
  role: "You are coordinating a research team...",
});
```
When a message is sent to a channel, AIbitat uses the LLM to select which agent should respond next based on conversation history and agent roles.
Sources: server/utils/agents/aibitat/index.js88-102 server/utils/agents/aibitat/index.js336-383 server/utils/agents/aibitat/index.js439-513
Diagram: Channel Agent Selection
Sources: server/utils/agents/aibitat/index.js439-514
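The LLM-driven selection described above can be sketched like this. Everything here is illustrative: `selectNextSpeaker`, the prompt wording, and the `llm.complete` interface are assumptions, not AIbitat's exact implementation.

```javascript
// Hedged sketch: ask the LLM which channel member should speak next.
async function selectNextSpeaker(llm, channel, history) {
  const prompt = [
    `You are coordinating the group "${channel.name}".`,
    `Members: ${channel.members.join(", ")}.`,
    `Given the conversation so far, reply with only the name of the member who should speak next.`,
  ].join("\n");
  const reply = await llm.complete([{ role: "system", content: prompt }, ...history]);
  // Guard against the model replying with an unknown name.
  return channel.members.find((m) => reply.includes(m)) ?? channel.members[0];
}
```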
AIbitat creates provider instances using getProviderForConfig(), which supports all 30+ LLM providers through a factory pattern:
Example Providers:
- openai → new Providers.OpenAIProvider({ model })
- anthropic → new Providers.AnthropicProvider({ model })
- ollama → new Providers.OllamaProvider({ model })
- bedrock → new Providers.AWSBedrockProvider({})

Sources: server/utils/agents/aibitat/index.js929-1006
AIbitat uses static methods to get context limits for providers:
```javascript
Provider.contextLimit(provider, modelName)
```
This calls the provider's promptWindowLimit() method to determine maximum token capacity.
Sources: server/utils/agents/aibitat/providers/ai-provider.js350-360
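The delegation pattern can be sketched as a static lookup. The provider map, the `OpenAIProvider` stub, and the specific limit values are illustrative; only the `contextLimit` → `promptWindowLimit` delegation comes from the description above.

```javascript
class OpenAIProvider {
  static promptWindowLimit(modelName) {
    // Illustrative limits only, not authoritative values.
    return { "gpt-4o": 128000 }[modelName] ?? 8192;
  }
}

class Provider {
  // Static entry point that delegates to the named provider's own limit method.
  static contextLimit(provider, modelName) {
    const providers = { openai: OpenAIProvider }; // illustrative registry
    return providers[provider].promptWindowLimit(modelName);
  }
}
```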
The workspace agent's system prompt is generated with variable expansion:
```javascript
Provider.systemPrompt({
  provider,
  workspace,
  user,
});
```
This expands variables in workspace.openAiPrompt using SystemPromptVariables.expandSystemPromptVariables(), substituting placeholders like {{username}} with actual values.
Sources: server/utils/agents/aibitat/providers/ai-provider.js371-390
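A minimal sketch of the {{variable}} expansion step — the real `expandSystemPromptVariables` likely has a different signature and supports more variable sources; this only illustrates the substitution mechanic:

```javascript
// Replace {{name}} placeholders with values from a variables map,
// leaving unknown placeholders intact.
function expandSystemPromptVariables(prompt, vars) {
  return prompt.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in vars ? String(vars[key]) : match
  );
}
```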
Some providers have custom default system prompts:
- DEFAULT_WORKSPACE_PROMPT

Sources: server/utils/agents/aibitat/providers/ai-provider.js362-369
The WorkspaceAgentInvocation model tracks agent invocations in the workspace_agent_invocations table:
Diagram: Invocation Record Lifecycle
Sources: server/models/workspaceAgentInvocation.js1-100 (referenced), server/utils/agents/index.js23-24 server/utils/agents/index.js397-404
| Field | Purpose |
|---|---|
| uuid | WebSocket connection identifier (unique) |
| prompt | Original message that triggered the agent |
| closed | Whether the invocation completed (boolean) |
| user_id | User who invoked (nullable) |
| thread_id | Thread context (nullable) |
| workspace_id | Workspace context (required) |
Sources: server/prisma/schema.prisma200-214
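An illustrative lifecycle for such a record, using the schema fields above. The in-memory Map stands in for the Prisma-backed table, and `openInvocation`/`closeInvocation` are hypothetical helper names:

```javascript
// Stand-in store for the workspace_agent_invocations table.
const invocations = new Map();

function openInvocation({ uuid, prompt, workspace_id, user_id = null, thread_id = null }) {
  // A new invocation starts open; uuid doubles as the WebSocket identifier.
  const record = { uuid, prompt, workspace_id, user_id, thread_id, closed: false };
  invocations.set(uuid, record);
  return record;
}

function closeInvocation(uuid) {
  const record = invocations.get(uuid);
  if (record) record.closed = true; // marks the agent session as completed
  return record;
}
```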
AIbitat loads recent chat history during initialization to provide context for agent responses:
```javascript
this.aibitat = new AIbitat({
  provider: this.provider,
  model: this.model,
  chats: await this.#chatHistory(20), // Last 20 messages
  handlerProps: { invocation: this.invocation },
});
```
History Format:
The #chatHistory() method converts WorkspaceChats records into AIbitat message format:
```javascript
{
  from: USER_AGENT.name,    // "@user"
  to: WORKSPACE_AGENT.name, // "@agent"
  content: chatLog.prompt,
  state: "success"
}
```
Sources: server/utils/agents/index.js35-73 server/utils/agents/index.js586-594
The frontend provides a dropdown interface for selecting agent providers and models through the AgentLLMSelection component.
Enabled Providers:
The following providers are enabled for agents:
| Category | Providers |
|---|---|
| Cloud | openai, anthropic, groq, mistral, perplexity, togetherai, openrouter, fireworksai, deepseek, apipie, xai, novita, ppio, gemini, moonshotai, cometapi, zai, giteeai, cohere, sambanova |
| Cloud (Specialized) | azure, bedrock, nvidia-nim, foundry |
| Local/Self-Hosted | lmstudio, ollama, localai, koboldcpp, textgenwebui, litellm, generic-openai, docker-model-runner, privatemode |
Sources: frontend/src/pages/WorkspaceSettings/AgentConfig/AgentLLMSelection/index.jsx9-45
The UI displays performance warnings for local/self-hosted providers that may have slower inference speeds:
Warning for: lmstudio, koboldcpp, ollama, localai, textgenwebui, docker-model-runner
Sources: frontend/src/pages/WorkspaceSettings/AgentConfig/AgentLLMSelection/index.jsx46-53
The UI includes a "System Default" option (value: "none"), which makes agents fall back to the workspace's chat provider and model, and then to the system-wide LLM_PROVIDER setting.
Sources: frontend/src/pages/WorkspaceSettings/AgentConfig/AgentLLMSelection/index.jsx55-63
The Agent System provides autonomous AI capabilities within AnythingLLM through:
- The AIbitat conversation engine for multi-agent orchestration
- AgentHandler for initialization, provider validation, and plugin loading
- A five-type plugin system (standard, child, flow, MCP, and imported plugins)
- Streaming and non-streaming tool execution modes
- WebSocket communication with the frontend
- Invocation tracking in the workspace_agent_invocations table

The system is designed to be optional and non-intrusive, allowing workspaces to use agents when configured while maintaining standard chat functionality for all other cases.
Sources: server/utils/chats/stream.js42-51 server/prisma/schema.prisma200-214 frontend/src/utils/chat/index.js140-141 server/models/workspace.js54-114