Letta v0.5 release

October 14, 2024

🧰 Dynamic model listing and multiple providers

In Letta v0.5, model providers (e.g. OpenAI, Ollama, vLLM) are enabled via environment variables, and multiple providers can be enabled at the same time. When a provider is enabled, all of its supported LLM and embedding models are listed in a dropdown for selection in both the CLI and the ADE.

For example, for OpenAI you can simply get started with:

> export OPENAI_API_KEY=...
> letta run
  ? Select LLM model: (Use arrow keys)
  » letta-free [type=openai] [ip=https://inference.memgpt.ai]
     gpt-4o-mini-2024-07-18 [type=openai] [ip=https://api.openai.com/v1]
     gpt-4o-mini [type=openai] [ip=https://api.openai.com/v1]
     gpt-4o-2024-08-06 [type=openai] [ip=https://api.openai.com/v1]
     gpt-4o-2024-05-13 [type=openai] [ip=https://api.openai.com/v1]
     gpt-4o [type=openai] [ip=https://api.openai.com/v1]
     gpt-4-turbo-preview [type=openai] [ip=https://api.openai.com/v1]
     gpt-4-turbo-2024-04-09 [type=openai] [ip=https://api.openai.com/v1]
     gpt-4-turbo [type=openai] [ip=https://api.openai.com/v1]
     gpt-4-1106-preview [type=openai] [ip=https://api.openai.com/v1]
     gpt-4-0613 [type=openai] [ip=https://api.openai.com/v1]
    ...

Similarly, if you are using the ADE with letta server, you can select which model to use from the model dropdown. For example, to make models from multiple providers available to the server:

# include models from OpenAI
> export OPENAI_API_KEY=...

# include models from Anthropic
> export ANTHROPIC_API_KEY=...

# include models served by Ollama
> export OLLAMA_BASE_URL=...

> letta server
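
To confirm which providers' models were picked up, you can also query the running server directly. This is a minimal sketch, assuming the default server port (8283) and model-listing routes under /v1/models (check the server API reference if your deployment differs):

# illustrative sketch: list the LLM models discovered from the enabled providers
> curl http://localhost:8283/v1/models/

# embedding models are assumed to be listed under a separate route
> curl http://localhost:8283/v1/models/embedding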

We are deprecating the letta configure and letta quickstart commands, and the use of ~/.letta/config for specifying the default LLMConfig and EmbeddingConfig. A single default config prevents one letta server from running agents with different model configurations concurrently, and from changing an agent's model configuration without restarting the server. The old workflow also required users to manually specify the model name, provider, and context window size via letta configure.
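
Model configuration now lives with each agent rather than in a global config file. As a rough sketch (the request shape here is an assumption based on the LLMConfig fields, not copied from the release notes), creating an agent with its own model configuration against the server could look like:

# hypothetical sketch: give the agent its own LLM configuration at creation time,
# instead of relying on a server-wide default from ~/.letta/config
> curl -X POST http://localhost:8283/v1/agents/ \
    -H "Content-Type: application/json" \
    -d '{
          "name": "my_agent",
          "llm_config": {
            "model": "gpt-4o-mini",
            "model_endpoint_type": "openai",
            "model_endpoint": "https://api.openai.com/v1",
            "context_window": 128000
          }
        }'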

🧠 Integration testing for model providers

We added integration tests (including tests of MemGPT memory-management tool use) for our supported model providers, and fixed many bugs in the process.

📊 Database migrations

We now support automated database migrations via Alembic, implemented in #1867. You can expect future releases to support automated migrations even when there are schema changes.
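
Under the hood this uses the standard Alembic workflow. As a quick illustration of what an automated migration amounts to (assuming you are running from a checkout of the Letta repo, where the Alembic configuration would live), applying pending schema revisions with the Alembic CLI looks like:

# illustrative only: with automated migrations you normally do not run this by hand
> alembic upgrade head   # apply all pending schema revisions
> alembic history        # inspect the revision history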

Read the full v0.5 changelog on GitHub.