Multi-agent conversation loop with human-in-the-loop support.

## Installation

```bash
npm install @ekaone/n-agent
# or
yarn add @ekaone/n-agent
# or
pnpm add @ekaone/n-agent
```

## Quick Start

```ts
import {
  attachInteractiveConsole,
  createChatBus,
  createConversation,
} from "@ekaone/n-agent";
import { anthropicAdapter } from "@ekaone/n-agent/adapters/anthropic";

// 1. Create a chat bus to register agents
const bus = createChatBus();

// 2. Register LLM agents
bus.register({
  name: "scientist",
  type: "llm",
  system: "You are a scientist. Keep responses brief.",
  adapter: anthropicAdapter({ model: "claude-haiku-4-5-20251001", maxTokens: 150 }),
});

bus.register({
  name: "philosopher",
  type: "llm",
  system: "You are a philosopher. Keep responses brief.",
  adapter: anthropicAdapter({ model: "claude-haiku-4-5-20251001", maxTokens: 150 }),
});

// 3. Create a conversation
const convo = createConversation(bus, {
  participants: ["scientist", "philosopher"],
  topic: "What is consciousness?",
  maxTurns: 6,
});

// 4. Attach interactive console (optional, for CLI usage)
const rl = attachInteractiveConsole(convo);

// 5. Start the conversation
await convo.start();
rl.close();
```

## API

### `createChatBus()`

Creates a registry for agents.
```ts
const bus = createChatBus();
```

Methods:

- `bus.register(agent: ChatAgent): void` — Register an agent
- `bus.get(name: string): ChatAgent` — Get an agent by name
- `bus.has(name: string): boolean` — Check whether an agent exists
### `createConversation(bus, options)`

Creates a conversation loop between registered agents.
```ts
const convo = createConversation(bus, {
  participants: ["agent1", "agent2"],
  topic: "Discussion topic",
  maxTurns: 10,
  delayMs: 2000,
  pauseCondition: (ctx) => ctx.turnIndex % 3 === 2,
  onToken: (chunk, speaker) => process.stdout.write(chunk),
  onTurnComplete: (turn) => console.log(turn.content),
  onStateChange: (state) => console.log("State:", state),
});
```

#### Events

`createConversation()` returns a handle that emits typed events. This is the easiest way to attach multiple independent listeners (CLI output, logging, persistence, UI) without composing callbacks.
```ts
const convo = createConversation(bus, {
  participants: ["agent1", "agent2"],
  topic: "Discussion topic",
  maxTurns: 10,
});

// Stream tokens
convo.on("token", ({ chunk }) => process.stdout.write(chunk));

// Turn boundaries
convo.on("turnComplete", ({ turn }) => console.log(`\n---\n${turn.speaker}: ${turn.content}`));

// State changes
convo.on("state", ({ state }) => console.log("State:", state));
```

#### Options

| Option | Type | Default | Description |
|---|---|---|---|
| `participants` | `string[]` | required | Ordered list of agent names defining turn rotation |
| `topic` | `string` | required | Opening message to seed the conversation |
| `maxTurns` | `number` | `10` | Maximum number of turns before auto-stopping |
| `delayMs` | `number` | `0` | Delay between turns in milliseconds |
| `stopSequence` | `string` | — | String that triggers an immediate stop when generated |
| `pauseCondition` | `(ctx: TurnContext) => boolean` | — | Function to pause for human input |
| `onToken` | `(chunk: string, speaker: string) => void` | — | Called for each token streamed from the LLM |
| `onTurnComplete` | `(turn: ChatMessage) => void` | — | Called when a turn finishes |
| `onStateChange` | `(state: LoopState) => void` | — | Called when the conversation state changes |
> **Note:** the callback options above are still supported for backward compatibility, but events are preferred if you need more than one listener.
### `attachInteractiveConsole(convo, options?)`

Attaches a readline interface for CLI interaction, providing real-time message injection and interrupt capabilities.
```ts
const rl = attachInteractiveConsole(convo, {
  feedback: true, // Show interrupt/inject messages
  interruptMessage: "⚡ Interrupted!",
  injectMessage: "💬 Message sent.",
});

// User can:
// - Type + Enter to inject a message or interrupt the current LLM
// - Ctrl+C to stop gracefully

await convo.start();
rl.close();
```

## Usage Patterns

### Autonomous loop

Agents take turns automatically until `maxTurns` is reached.
```ts
const convo = createConversation(bus, {
  participants: ["agent1", "agent2", "agent3"],
  topic: "Let's discuss AI.",
  maxTurns: 12,
  delayMs: 1000, // 1 second pause between turns
});
```

### Human-in-the-loop pauses

Pause after each turn for human approval or input.
```ts
const convo = createConversation(bus, {
  participants: ["agent1", "agent2"],
  topic: "Step-by-step discussion.",
  pauseCondition: () => true, // Pause after every turn
  // Or: pause every 3rd turn
  // pauseCondition: (ctx) => ctx.turnIndex % 3 === 2,
});

const rl = attachInteractiveConsole(convo);
await convo.start();
rl.close();
```

The `pauseCondition` callback receives a `TurnContext`:
```ts
interface TurnContext {
  turnIndex: number;      // Current turn number
  speaker: string;        // Current agent name
  lastMessage: string;    // Full content of last message
  history: ChatMessage[]; // Complete conversation history
}
```

### Human as a participant

Register a human agent that waits for user input when its turn comes up in the rotation.
```ts
// Register human agent in the rotation
bus.register({ name: "human", type: "human" });

const convo = createConversation(bus, {
  participants: ["agent1", "human", "agent2"], // Human takes a turn
  topic: "Hello everyone!",
  maxTurns: 10,
});

const rl = attachInteractiveConsole(convo);
await convo.start(); // Pauses when it's the human's turn
rl.close();
```

### Mid-stream interrupts

Users can interrupt ongoing LLM generation mid-stream.
```ts
const convo = createConversation(bus, {
  participants: ["agent1", "agent2"],
  topic: "Rapid fire discussion.",
  onTurnComplete: (turn) => {
    if (turn.partial) console.log("(interrupted)");
  },
});

const rl = attachInteractiveConsole(convo);
await convo.start();
// While an agent is speaking, type and press Enter to interrupt
rl.close();
```

## Full Example: Three-Expert Debate

```ts
import {
  attachInteractiveConsole,
  createChatBus,
  createConversation,
} from "@ekaone/n-agent";
import { anthropicAdapter } from "@ekaone/n-agent/adapters/anthropic";

const bus = createChatBus();

// Color coding for each participant
const colors: Record<string, string> = {
  physicist: "\x1b[36m",   // Cyan
  philosopher: "\x1b[35m", // Magenta
  economist: "\x1b[33m",   // Yellow
  reset: "\x1b[0m",
};

function colorize(name: string): string {
  const c = colors[name] || "";
  return `${c}[${name}]${colors.reset}`;
}

// Register 3 experts
bus.register({
  name: "physicist",
  type: "llm",
  system: "You are a theoretical physicist...",
  adapter: anthropicAdapter({ model: "claude-haiku-4-5-20251001", maxTokens: 150 }),
});

bus.register({
  name: "philosopher",
  type: "llm",
  system: "You are a philosopher...",
  adapter: anthropicAdapter({ model: "claude-haiku-4-5-20251001", maxTokens: 150 }),
});

bus.register({
  name: "economist",
  type: "llm",
  system: "You are an economist...",
  adapter: anthropicAdapter({ model: "claude-haiku-4-5-20251001", maxTokens: 150 }),
});

// Track speaker for colored output
let currentSpeaker = "";
let firstToken = true;

const convo = createConversation(bus, {
  participants: ["physicist", "philosopher", "economist"],
  topic: "Should humanity colonize Mars?",
  maxTurns: 9,
  delayMs: 2000, // 2 second pause between turns
  onToken: (chunk, speaker) => {
    // Print speaker name once per turn with color
    if (speaker !== currentSpeaker) {
      currentSpeaker = speaker;
      firstToken = true;
    }
    if (firstToken) {
      process.stdout.write(`\n${colorize(speaker)} `);
      firstToken = false;
    }
    process.stdout.write(chunk);
  },
  onTurnComplete: (turn) => {
    console.log(`\n${"─".repeat(50)}`);
    if (turn.partial) console.log("⚠️ (interrupted)");
  },
  onStateChange: (state) => {
    if (state === "stopped") console.log("\n✅ Conversation ended.");
  },
});

// Enable interactive CLI
const rl = attachInteractiveConsole(convo);

console.log("🚀 Topic: Mars colonization debate");
console.log("💡 Type + Enter to interrupt. Ctrl+C to stop.\n");

const history = await convo.start();
console.log(`\n📜 Total messages: ${history.length}`);
rl.close();
```

## Adapters

Adapters bridge the framework to LLM providers. Currently available:
### Anthropic

```ts
import { anthropicAdapter } from "@ekaone/n-agent/adapters/anthropic";

bus.register({
  name: "claude",
  type: "llm",
  system: "You are helpful.",
  adapter: anthropicAdapter({
    model: "claude-3-7-sonnet-20250219",
    maxTokens: 500,
    apiKey: process.env.ANTHROPIC_API_KEY, // or auto from env
  }),
});
```

### OpenAI (AI SDK)

```ts
import { openaiAdapter } from "@ekaone/n-agent/adapters/ai-sdk";

bus.register({
  name: "gpt",
  type: "llm",
  adapter: openaiAdapter({ model: "gpt-4o-mini" }),
});
```

## Types

```ts
type AgentType = "llm" | "human";

interface ChatAgent {
  name: string;
  type: AgentType;
  system?: string;
  adapter?: AgentAdapter;
}

interface ConversationHandle {
  start(): Promise<ChatMessage[]>;
  send(message: string): SendResult;
  stop(): void;
  readonly state: LoopState;
  readonly history: ChatMessage[];
}

type LoopState = "idle" | "streaming" | "awaiting-human" | "stopped";

type SendResult = {
  intent: "inject" | "interrupt";
  turnIndex: number;
};
```

## License

MIT © Eka Prasetia
⭐ If this library helps you, please consider giving it a star on GitHub!