The Operating System for AI Agents
Create production-ready stateful agents in a single API call
Build stateful agents with Letta
Agents API
Build stateful agents with advanced memory and infinite context, using any model. Built-in persistence and memory management, with full support for custom tools and MCP.
ADE
Use the Agent Development Environment (ADE) to visualize your agent's memory, reasoning steps, and tool calls. Observe, test, and edit your agent’s state in real time.
Backed by research. Trusted by developers.
Your Agents Need an OS: Today's AI agents struggle with fragmented context and limited memory. Just as computers need operating systems to manage resources, your agents need intelligent context management to unlock their full potential.

Agents as APIs
Letta agents are exposed as REST API endpoints, ready to be integrated into your applications, with auth and identities included.
Stateful Agents
Letta persists all state automatically in a model-agnostic representation. Move agents between LLM providers without losing their memories.
Backed by Research
Letta manages context and memory with techniques designed by AI PhDs from UC Berkeley, including the creators of MemGPT.
Framework agnostic
Program your agents and connect them to your applications through Letta’s Agents API, SDKs, and framework integrations.
from letta_client import Letta

client = Letta(token="LETTA_API_KEY")

agent_state = client.agents.create(
    model="openai/gpt-4.1",
    embedding="openai/text-embedding-3-small",
    memory_blocks=[
        {
            "label": "human",
            "value": "The human's name is Chad. They like vibe coding."
        },
        {
            "label": "persona",
            "value": "My name is Sam, the all-knowing sentient AI."
        }
    ]
)
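Because the agent's state and memory persist server-side, you interact with it afterwards simply by sending messages to its ID; there is no conversation history to resend. Below is a minimal sketch of a follow-up call, assuming the message-creation method of the letta_client Python SDK (exact names may differ between versions):

# Send a message to the agent created above. Letta stores the
# resulting conversation turns and memory updates automatically.
# (Assumes the letta_client messages API; verify against your SDK version.)
response = client.agents.messages.create(
    agent_id=agent_state.id,
    messages=[
        {"role": "user", "content": "Remember that I prefer dark mode."}
    ]
)

# Inspect the agent's reasoning, tool calls, and reply.
for message in response.messages:
    print(message)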
import { LettaClient } from '@letta-ai/letta-client'

const client = new LettaClient({ token: "LETTA_API_KEY" });

const agentState = await client.agents.create({
    model: "openai/gpt-4.1",
    embedding: "openai/text-embedding-3-small",
    memoryBlocks: [
        {
            label: "human",
            value: "The human's name is Chad. They like vibe coding."
        },
        {
            label: "persona",
            value: "My name is Sam, the all-knowing sentient AI."
        }
    ]
});
import { lettaCloud } from '@letta-ai/vercel-ai-sdk-provider';
import { generateText } from 'ai';

const { text } = await generateText({
    model: lettaCloud('your-agent-id'),
    prompt: 'Write a vegetarian lasagna recipe for 4 people.',
});
Production-ready, proven at scale
Scale from prototypes to millions of agents, all on the same stack. Ensure that your data, state, and agent memories are safe from vendor lock-in.
Get started today
GitHub
Contribute to the Letta open source
Documentation
Learn how to build stateful agents
Discord
Join our developer Discord community
Stay up to date with Letta