We build the infrastructure we use. Now it's yours.
An open-source TypeScript SDK for building production AI agent platforms. Runtime, knowledge management, channels, observability — everything you need to go from idea to Kubernetes-ready deployment.
MIT License · TypeScript · Production-hardened
What's inside
Everything you need to go from idea to a production agent platform. No boilerplate, no glue code.
Agent Runtime
Config loading with Zod validation, LLM client factories for OpenAI and Anthropic, JWT auth middleware with user identity mapping.
Knowledge Management
Drizzle ORM database client, vector store abstractions for Qdrant and pgvector, document extractors for PDF, Word, Excel, and more.
4 Built-in Channels
SSE for real-time streaming, Email with M365 and Gmail thread resolution, Slack with Block Kit, Teams with Adaptive Cards.
Observability
Prometheus-compatible metrics with HTTP middleware, Langfuse tracing, Sentry error tracking, structured logging with Pino.
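As a rough illustration of the metrics pattern described above, the sketch below counts HTTP requests per route and renders them in Prometheus exposition format. The SDK's own middleware is richer; the names here (`recordRequest`, `renderMetrics`) are illustrative only, not SDK exports.

```typescript
// Minimal request counter keyed by route, rendered for a /metrics endpoint.
const requestCounts = new Map<string, number>();

function recordRequest(route: string): void {
  requestCounts.set(route, (requestCounts.get(route) ?? 0) + 1);
}

// Render the counter in Prometheus text exposition format.
function renderMetrics(): string {
  const lines = ['# TYPE http_requests_total counter'];
  for (const [route, count] of requestCounts) {
    lines.push(`http_requests_total{route="${route}"} ${count}`);
  }
  return lines.join('\n');
}

recordRequest('/chat');
recordRequest('/chat');
console.log(renderMetrics());
```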
Review Workflows
Confidence gates for human-in-the-loop approval, review repository with SLA tracking, notifier integrations, decision audit trails.
CLI Scaffolder
Generate complete agent platforms with one command. Kubernetes manifests, CI/CD workflows, Docker configs, Helm charts — all wired up.
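The confidence-gate idea behind the review workflows above can be sketched in a few lines: answers above a threshold ship directly, everything else queues for a human. The names below (`gate`, `ReviewDecision`, the default threshold) are assumptions for this sketch, not the SDK's actual API.

```typescript
// Illustrative human-in-the-loop confidence gate.
type ReviewDecision = { action: 'auto_approve' | 'needs_review'; confidence: number };

function gate(confidence: number, threshold = 0.8): ReviewDecision {
  // At or above the threshold the answer ships; below it, a reviewer decides.
  return confidence >= threshold
    ? { action: 'auto_approve', confidence }
    : { action: 'needs_review', confidence };
}

console.log(gate(0.92).action); // auto_approve
console.log(gate(0.41).action); // needs_review
```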
From zero to agent in minutes
The SDK handles config, LLM wiring, auth, and channel setup. You focus on what your agent actually does.
import { createConfig, createLogger } from '@laava-ai/sdk';
import { createLLMClient } from '@laava-ai/sdk/llm';
import { createSSEChannel } from '@laava-ai/sdk/channels';
import { createVectorStore } from '@laava-ai/sdk/db';
// Load config with environment validation
const config = createConfig();
const logger = createLogger(config);
// Wire up your LLM provider
const llm = createLLMClient(config, {
provider: 'anthropic',
model: 'claude-sonnet-4-20250514',
});
// Set up knowledge retrieval
const vectorStore = createVectorStore(config, {
backend: 'qdrant',
});
// Expose via SSE channel
const channel = createSSEChannel(config, {
llm,
vectorStore,
auth: { required: true },
});
channel.start();
Security-hardened. Sovereignty-ready.
Built with the same security defaults we use for enterprise deployments. No afterthoughts.
PII Redaction Built In
More than eight types of credentials and sensitive data are automatically scrubbed from logs and Sentry events before they leave your environment.
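In the spirit of the redaction described above, here is a minimal scrubbing sketch. The patterns and the `scrub` helper are illustrative; they are not the SDK's internal redaction list.

```typescript
// Illustrative log scrubber: mask common secret shapes before a line is emitted.
const SECRET_PATTERNS: [RegExp, string][] = [
  [/(api[_-]?key\s*[=:]\s*)\S+/gi, '$1[REDACTED]'],
  [/(bearer\s+)\S+/gi, '$1[REDACTED]'],
  [/(password\s*[=:]\s*)\S+/gi, '$1[REDACTED]'],
];

function scrub(line: string): string {
  // Apply every pattern in turn; unmatched text passes through unchanged.
  return SECRET_PATTERNS.reduce((s, [re, sub]) => s.replace(re, sub), line);
}

console.log(scrub('api_key=sk-12345 password: hunter2'));
```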
JWT Authentication Required
Every /chat and /review route requires valid JWT tokens. No open endpoints by default.
Webhook Signature Verification
Slack webhooks fail-closed without valid signatures. M365 notifications are deduplicated with clientState validation.
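Fail-closed verification follows Slack's documented v0 signing scheme: HMAC-SHA256 over `v0:{timestamp}:{body}` with your signing secret, compared in constant time against the `X-Slack-Signature` header. The helper below is an illustrative sketch, not the SDK's export.

```typescript
import { createHmac, timingSafeEqual } from 'node:crypto';

// Verify a Slack webhook per the v0 signing scheme; any failure returns false.
function verifySlackSignature(
  signingSecret: string,
  timestamp: string, // X-Slack-Request-Timestamp header
  rawBody: string,
  signature: string, // X-Slack-Signature header, e.g. "v0=abc..."
): boolean {
  // Reject requests older than 5 minutes to blunt replay attacks.
  if (Math.abs(Date.now() / 1000 - Number(timestamp)) > 60 * 5) return false;
  const expected =
    'v0=' +
    createHmac('sha256', signingSecret)
      .update(`v0:${timestamp}:${rawBody}`)
      .digest('hex');
  const a = Buffer.from(expected);
  const b = Buffer.from(signature);
  // Constant-time comparison; a length mismatch also fails closed.
  return a.length === b.length && timingSafeEqual(a, b);
}
```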
Full Log Redaction
API keys, tokens, passwords, connection strings — all automatically stripped from logs and error reports.
Built by the team behind enterprise AI agent deployments
This SDK powers production systems for organizations across the Netherlands. Same code, same patterns, same security defaults. Need help building on it?
Start building
Install the SDK, scaffold a platform, and have an agent running in minutes. Or talk to our team about production deployment.
MIT License · @laava-ai/sdk · npm install laava