📢 Announcing our research paper: Zentry achieves 26% higher accuracy than OpenAI Memory, 91% lower latency, and 90% token savings! Read the paper to learn how we're revolutionizing AI agent memory.
Zentry includes built-in support for a variety of popular large language models (LLMs). Memory operations can use the LLM you provide, letting you tailor the setup to your specific needs.
To use an LLM, you must provide a configuration to customize its usage. If no configuration is supplied, a default configuration is applied and OpenAI is used as the LLM. For a comprehensive list of available parameters for LLM configuration, refer to Config. To view all supported LLMs, visit Supported LLMs.
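As a rough sketch, an LLM configuration might be structured like the following. The key names (`provider`, `model`, `temperature`) and the `Memory.from_config` usage shown in comments are assumptions based on common conventions in memory libraries, not confirmed Zentry API; consult Config for the actual parameters.

```python
# Hypothetical LLM configuration (illustrative names only;
# see the Config reference for the real parameter list).
config = {
    "llm": {
        "provider": "openai",        # assumed default provider when omitted
        "config": {
            "model": "gpt-4o-mini",  # assumed parameter name
            "temperature": 0.1,
            "max_tokens": 1000,
        },
    }
}

# Usage would then resemble (hypothetical API):
# from zentry import Memory
# memory = Memory.from_config(config)
```

If no `llm` block is present, the defaults described above apply and OpenAI is used.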
All LLMs are supported in Python. The following LLMs are also supported in TypeScript: OpenAI, Anthropic, and Groq.