
Announcing Letta Client SDKs for Python and TypeScript

Product
April 17, 2025

We've officially released our auto-generated client SDKs and unified documentation platform built with Fern. For developers using Letta, this means more reliable libraries, TypeScript support (in addition to Python), and a significantly improved developer experience across all our APIs.

Letta: The OS for Agents 

Letta is the operating system for AI agents: it manages state, context, and execution so developers can build intelligent applications that actually learn and improve with experience. Unlike traditional frameworks that rely on stateless APIs, Letta functions as a complete agent operating system where agents run as persistent services that maintain state across interactions, intelligently manage their context window, and access both in-context and external memory, transforming them from simple responders into systems that genuinely learn over time.

Letta’s Agents API 

Letta works as a managed API service (rather than a library), so creating, modifying, and interacting with agents all happens through a REST API that talks to a Letta server (either a self-deployed Docker image or Letta Cloud).

Since agents are lightweight services, they can easily be connected to your end application. Letta agents can even create, modify, or invoke other Letta agents by calling the Letta API from their own tools.

Building SDKs for the Agents API 

While many early users of Letta (previously MemGPT) built applications directly against our REST API, we knew that most developers expect SDKs when building AI applications – especially given the complexity of the API.

Previously, we manually implemented our Python SDK on top of our REST API, resulting in:

  • Missing features in the SDK compared to the REST API
  • Slow updates when our API changed
  • Documentation that often didn't match implementation
  • No TypeScript support (despite many requests)
  • Inconsistent experience between REST and SDK users

The result? You were likely hitting limitations, writing workarounds, or falling back to direct REST API calls rather than using our SDK.

What We Did

We took an API-first approach and now generate all client libraries and documentation from a single OpenAPI specification using Fern:

  1. Defined our complete API with OpenAPI as the single source of truth
  2. Set up Fern to generate our REST API docs, TypeScript SDK, and Python SDK
  3. Configured generators for our specific requirements, including streaming (server-sent events) support
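As a rough illustration, a Fern setup like the one described above is driven by a small generators config that points at the OpenAPI spec and lists the SDK generators to run. The sketch below uses Fern's public generator names, but the versions and layout are illustrative, not our exact configuration:

```yaml
# Illustrative generators.yml sketch for a Fern-driven SDK pipeline.
# Generator names follow Fern's published generators; versions are placeholders.
api:
  path: ./openapi.yml   # the OpenAPI spec as the single source of truth
groups:
  sdks:
    generators:
      - name: fernapi/fern-python-sdk
        version: x.y.z
        output:
          location: pypi
          package-name: letta-client
      - name: fernapi/fern-typescript-node-sdk
        version: x.y.z
        output:
          location: npm
          package-name: "@letta-ai/letta-client"
```

Running the generators from this single config is what keeps the Python SDK, TypeScript SDK, and API reference docs in lockstep with the spec.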

See our TypeScript and Python SDK repositories and API reference.

What's Changed

For you as a developer, this means:

  • Complete feature parity - anything you can do with our REST API, you can now do with our Python and TypeScript SDKs
  • Multiple interaction modes - sync, async, and streaming support in all libraries
  • Consistent experience between cloud and self-hosted deployments
  • Developer-led language expansion - want to interact with Letta using a language we don’t yet support? Let us know by filing a GitHub issue. With auto-generated SDKs, we can ship support for new languages quickly

How It Works Now

Developers can use the Python or TypeScript SDKs to interact with the Letta service, and easily toggle between a local Letta server and Letta Cloud:
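A minimal sketch with the Python SDK (`letta-client` on PyPI), assuming a local Letta server on its default port and a running model backend; the model handles and memory block contents below are illustrative:

```python
# Sketch: connecting to Letta with the auto-generated Python SDK.
# Assumes a self-hosted server at localhost:8283 (e.g. the Docker image).
from letta_client import Letta

# Local server:
client = Letta(base_url="http://localhost:8283")

# Letta Cloud instead: construct the client with an API key.
# client = Letta(token="YOUR_LETTA_API_KEY")

# Create an agent (model/embedding handles are illustrative):
agent = client.agents.create(
    model="openai/gpt-4o-mini",
    embedding="openai/text-embedding-3-small",
    memory_blocks=[{"label": "human", "value": "Name: Sarah"}],
)

# Send a message and print the agent's response:
response = client.agents.messages.create(
    agent_id=agent.id,
    messages=[{"role": "user", "content": "Hello!"}],
)
for message in response.messages:
    print(message)
```

The TypeScript SDK mirrors the same resource layout, so switching languages (or switching between self-hosted and cloud) doesn't change how your application code is structured.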

Streaming, which was particularly complex to implement manually, now looks like this:
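A sketch with the Python SDK, assuming an existing agent and a running server; `create_stream` is the streaming variant of message creation in `letta-client`, and the agent ID below is a placeholder:

```python
# Sketch: streaming a response over server-sent events with letta-client.
# Assumes a local Letta server; "YOUR_AGENT_ID" is a placeholder.
from letta_client import Letta

client = Letta(base_url="http://localhost:8283")

stream = client.agents.messages.create_stream(
    agent_id="YOUR_AGENT_ID",
    messages=[{"role": "user", "content": "Tell me a story."}],
    stream_tokens=True,  # token-level streaming; omit for step streaming
)

for chunk in stream:
    # Each chunk is a typed event (reasoning, tool calls, assistant output, ...)
    print(chunk)
```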

Both token streaming and step streaming are now supported, along with async messages for long-running agent invocations.
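For applications that prefer non-blocking calls, the generated Python package also ships an async client variant. A hedged sketch, assuming the Fern-generated `AsyncLetta` client and a placeholder agent ID:

```python
# Sketch: using the async client variant generated alongside the sync one.
# Assumes a local Letta server; "YOUR_AGENT_ID" is a placeholder.
import asyncio

from letta_client import AsyncLetta


async def main() -> None:
    client = AsyncLetta(base_url="http://localhost:8283")
    response = await client.agents.messages.create(
        agent_id="YOUR_AGENT_ID",
        messages=[{"role": "user", "content": "Summarize our last session."}],
    )
    for message in response.messages:
        print(message)


asyncio.run(main())
```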

Benefits We're Seeing

For our engineering team, auto-generated SDKs have eliminated hours of manual work and bugs:

  1. No more SDK drift - when we update the API, SDKs automatically update
  2. Faster feature development - we can focus on building the API, not maintaining client libraries
  3. Better QA - consistent behavior across all platforms means fewer edge cases
  4. Improved developer feedback - we're hearing that the new SDKs "just work"

What's Next

With our Fern integration established, we're planning to:

  • Improve our documentation with more examples
  • Continue refining our OpenAPI specification
  • Eventually add more SDK languages, like Java and Go

Check It Out

  • You can explore our new auto-generated SDKs and documentation at docs.letta.com
  • If you've built anything with our new SDKs, we'd love to hear about your experience (email contact@letta.com or join our Discord!). And if you're considering a similar approach for your own API, we're happy to share more details about our implementation.