
STREAMING_&_EVENTS

CORE_CONCEPTS

Build responsive UIs by streaming tokens and listening to agent lifecycle events.

WHY_STREAM#

LLMs are slow: waiting for a complete response can take seconds or even minutes. Streaming lets you display tokens as soon as they are generated, improving perceived latency and the overall user experience.

STANDARD

Request → Wait (3s) → Response

STREAMING

Request → Token (0.1s) → Token (0.2s) → ...

STREAM_API#

Instead of agent.run(), use agent.stream(). This returns an async generator that yields events.

stream-example.ts
const stream = agent.stream("Write a short poem about Rust")

for await (const event of stream) {
  switch (event.type) {
    case 'token':
      // The raw text chunk (e.g. "The ", "iron ", "rusts")
      process.stdout.write(event.data)
      break
      
    case 'tool_start':
      console.log(`\n[Using Tool: ${event.data.tool}]`)
      break
      
    case 'tool_end':
      console.log(`[Tool Result: ${event.data.output}]`)
      break
  }
}
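
If you also need the complete response once the stream finishes (for logging or persistence), you can accumulate the token events as they arrive. A minimal sketch, assuming the same agent instance and event shape as the example above; the filename is illustrative:

collect-full-text.ts
let fullText = ''

for await (const event of agent.stream("Write a short poem about Rust")) {
  if (event.type === 'token') {
    // Render the chunk immediately, then keep it for the final transcript
    process.stdout.write(event.data)
    fullText += event.data
  }
}

console.log(`\nFinal poem (${fullText.length} chars)`)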

FRONTEND_INTEGRATION#

Using the Vercel AI SDK on the frontend with AKIOS on the backend is a powerful combination.

EDGE_RUNTIME

Streaming works best on the Edge runtime. Ensure your API route declares export const runtime = 'edge'.
app/api/chat/route.ts
import { StreamingTextResponse } from 'ai'
import { Agent } from '@AKIOS/sdk'

// Run on the Edge runtime for low-latency streaming (see the note above)
export const runtime = 'edge'

export async function POST(req: Request) {
  const { messages } = await req.json()
  const lastMessage = messages[messages.length - 1].content

  const agent = new Agent({ /* config */ })

  // Convert the AKIOS event stream into a standard text stream
  const encoder = new TextEncoder()
  const stream = new ReadableStream({
    async start(controller) {
      for await (const event of agent.stream(lastMessage)) {
        if (event.type === 'token') {
          // Only forward raw text chunks; encode them for the Response body
          controller.enqueue(encoder.encode(event.data))
        }
      }
      controller.close()
    }
  })

  return new StreamingTextResponse(stream)
}
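
On the frontend, the Vercel AI SDK's useChat hook can consume this route directly: it POSTs to /api/chat and appends streamed tokens to the message list as they arrive. A minimal client-component sketch (the file path is illustrative, styling is omitted, and depending on your ai package version you may need to configure the hook for plain-text streams):

app/chat/page.tsx
'use client'

import { useChat } from 'ai/react'

export default function Chat() {
  // useChat manages the message history, input state, and the streaming request
  const { messages, input, handleInputChange, handleSubmit } = useChat()

  return (
    <div>
      {messages.map(m => (
        <p key={m.id}>{m.role}: {m.content}</p>
      ))}
      <form onSubmit={handleSubmit}>
        <input value={input} onChange={handleInputChange} placeholder="Say something..." />
      </form>
    </div>
  )
}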

EVENT_TYPES_REFERENCE#

Event Type    Data Payload                    Description
token         string                          A text chunk from the LLM.
tool_start    { tool: string, input: any }    Agent decided to call a tool.
tool_end      { output: string }              Tool execution completed.
step          StepObject                      A full thought/action cycle finished.
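
For stricter handling in TypeScript, the table above can be expressed as a discriminated union, which makes the switch statements in the earlier examples exhaustive. This is a sketch based only on the payloads listed here: it assumes the SDK exports a StepObject type, and the type alias name is illustrative.

agent-events.ts
import type { StepObject } from '@AKIOS/sdk' // assumed export; adjust to your SDK version

export type AgentEvent =
  | { type: 'token'; data: string }                             // A text chunk from the LLM
  | { type: 'tool_start'; data: { tool: string; input: any } }  // Agent decided to call a tool
  | { type: 'tool_end'; data: { output: string } }              // Tool execution completed
  | { type: 'step'; data: StepObject }                          // A full thought/action cycle finished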