📢 Announcing our research paper: Zentry achieves 26% higher accuracy than OpenAI Memory, 91% lower latency, and 90% token savings! Read the paper to learn how we're revolutionizing AI agent memory.
Welcome to the zentry quickstart guide. This guide will help you get up and running with zentry in no time.
Installation
To install zentry, you can use pip. Run the following command in your terminal:
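The package name below is an assumption; adjust it if zentry is published under a different name on PyPI.

```bash
pip install zentry
```

Basic Usage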
Initialize zentry
zentry can be initialized in several modes: Basic, Async, Advanced, and Advanced (Graph Memory). A basic sketch is shown below.
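A minimal sketch of basic initialization; the import path and `Memory` class name are assumptions for illustration.

```python
from zentry import Memory  # assumed import path

# Basic initialization with default settings
m = Memory()
```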
Store a Memory
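A hedged sketch of storing a memory, assuming an `add` method that accepts a list of chat messages plus a `user_id`:

```python
# Store a conversation turn as memory for a specific user
messages = [
    {"role": "user", "content": "I'm planning a trip to Tokyo next month."},
    {"role": "assistant", "content": "Great! I'll keep that in mind for future recommendations."},
]
result = m.add(messages, user_id="alice")
```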
Retrieve Memories
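A sketch of retrieval, assuming `get_all` and `get` methods (method names are assumptions for illustration):

```python
# Retrieve all memories stored for a user
all_memories = m.get_all(user_id="alice")

# Retrieve a single memory by its ID (the ID value is illustrative)
memory = m.get("memory-id")
```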
Search Memories
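A sketch of semantic search, assuming a `search` method that takes a query and a `user_id`:

```python
# Search a user's memories by natural-language query
related = m.search("What are my travel plans?", user_id="alice")
```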
Update a Memory
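A sketch of updating a memory in place, assuming an `update` method keyed by memory ID:

```python
# Update the text of an existing memory (memory_id is illustrative)
m.update(memory_id="memory-id", data="Trip to Tokyo moved to December.")
```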
Memory History
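A sketch of inspecting how a memory changed over time, assuming a `history` method:

```python
# View the change history (add/update events) of a memory
history = m.history(memory_id="memory-id")
```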
Delete Memory
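A sketch of deletion, assuming `delete` and `delete_all` methods (names are assumptions for illustration):

```python
# Delete a single memory, or all memories for a user
m.delete(memory_id="memory-id")
m.delete_all(user_id="alice")
```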
Reset Memory
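A sketch of resetting the store, assuming a `reset` method that removes all stored memories:

```python
# Reset the store, removing all memories
m.reset()
```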
Configuration Parameters
zentry offers extensive configuration options to customize its behavior according to your needs. These configurations span different components such as vector stores, language models, embedders, and graph stores.
Vector Store Configuration
| Parameter | Description | Default |
|---|---|---|
| provider | Vector store provider (e.g., "qdrant") | "qdrant" |
| host | Host address | "localhost" |
| port | Port number | 6333 |
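These parameters would typically be passed through a nested config dict; the structure below is an assumption for illustration.

```python
config = {
    "vector_store": {
        "provider": "qdrant",
        "config": {"host": "localhost", "port": 6333},
    }
}
```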
LLM Configuration
| Parameter | Description | Provider |
|---|---|---|
| provider | LLM provider (e.g., "openai", "anthropic") | All |
| model | Model to use | All |
| temperature | Temperature of the model | All |
| api_key | API key to use | All |
| max_tokens | Maximum number of tokens to generate | All |
| top_p | Probability threshold for nucleus sampling | All |
| top_k | Number of highest-probability tokens to keep | All |
| http_client_proxies | Allow proxy server settings | AzureOpenAI |
| models | List of models | OpenRouter |
| route | Routing strategy | OpenRouter |
| openrouter_base_url | Base URL for the OpenRouter API | OpenRouter |
| site_url | Site URL | OpenRouter |
| app_name | Application name | OpenRouter |
| ollama_base_url | Base URL for the Ollama API | Ollama |
| openai_base_url | Base URL for the OpenAI API | OpenAI |
| azure_kwargs | Azure LLM args for initialization | AzureOpenAI |
| deepseek_base_url | Base URL for the DeepSeek API | DeepSeek |
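As above, the nesting is illustrative; the model name is only an example.

```python
config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o-mini",  # illustrative model name
            "temperature": 0.1,
            "max_tokens": 2000,
        },
    }
}
```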
Embedder Configuration
| Parameter | Description | Default |
|---|---|---|
| provider | Embedding provider | "openai" |
| model | Embedding model to use | "text-embedding-3-small" |
| api_key | API key for the embedding service | None |
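A corresponding fragment, with the same illustrative nesting:

```python
config = {
    "embedder": {
        "provider": "openai",
        "config": {"model": "text-embedding-3-small"},
    }
}
```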
Graph Store Configuration
| Parameter | Description | Default |
|---|---|---|
| provider | Graph store provider (e.g., "neo4j") | "neo4j" |
| url | Connection URL | None |
| username | Authentication username | None |
| password | Authentication password | None |
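A corresponding fragment; the connection URL and credentials are placeholders.

```python
config = {
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "neo4j+s://your-instance",  # placeholder URL
            "username": "neo4j",
            "password": "your-password",
        },
    }
}
```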
General Configuration
| Parameter | Description | Default |
|---|---|---|
| history_db_path | Path to the history database | "/history.db" |
| version | API version | "v1.1" |
| custom_fact_extraction_prompt | Custom prompt for memory processing | None |
| custom_update_memory_prompt | Custom prompt for memory updates | None |
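These parameters would sit at the top level of the same config dict (illustrative):

```python
config = {
    "history_db_path": "/history.db",
    "version": "v1.1",
}
```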
Complete Configuration Example
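A hedged end-to-end configuration combining the sections above; the `Memory.from_config` constructor and the nested layout are assumptions for illustration.

```python
from zentry import Memory  # assumed import path

config = {
    "version": "v1.1",
    "vector_store": {
        "provider": "qdrant",
        "config": {"host": "localhost", "port": 6333},
    },
    "llm": {
        "provider": "openai",
        "config": {"model": "gpt-4o-mini", "temperature": 0.1, "max_tokens": 2000},
    },
    "embedder": {
        "provider": "openai",
        "config": {"model": "text-embedding-3-small"},
    },
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "neo4j+s://your-instance",
            "username": "neo4j",
            "password": "your-password",
        },
    },
    "history_db_path": "/history.db",
}

m = Memory.from_config(config)
```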
Run zentry Locally
Please refer to the zentry with Ollama example to run zentry locally.

Chat Completion
zentry can be easily integrated into chat applications to enhance conversational agents with structured memory. zentry's APIs are designed to be compatible with OpenAI's, making it easy to adopt zentry in applications you have already built. If you have a zentry API key, you can use it to initialize the client. Alternatively, you can initialize zentry without an API key if you're using it locally.
zentry supports several language models (LLMs) through integration with various providers.
You can use zentry either through the zentry Platform or with zentry OSS. The sketch below shows the local (OSS) flow.
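A hedged sketch of the pattern: search memories, inject them into the prompt, call an OpenAI-compatible chat completion, then store the new exchange. The zentry method names are assumptions; the OpenAI client calls follow the standard openai Python SDK.

```python
from openai import OpenAI
from zentry import Memory  # assumed import path

client = OpenAI()  # reads OPENAI_API_KEY from the environment
m = Memory()

def chat(user_id: str, question: str) -> str:
    # Pull relevant memories and prepend them as context
    # (assumes search returns an iterable of memory records)
    memories = m.search(question, user_id=user_id)
    context = "\n".join(str(item) for item in memories)

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[
            {"role": "system", "content": f"Known facts about the user:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    answer = response.choices[0].message.content

    # Store the new exchange so future chats can use it
    m.add(
        [{"role": "user", "content": question}, {"role": "assistant", "content": answer}],
        user_id=user_id,
    )
    return answer
```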
APIs
Get started with using zentry APIs in your applications. For more details, refer to the Platform. Here is an example of how to use zentry APIs:
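A hedged sketch of calling the hosted platform APIs, assuming a `MemoryClient` that takes an API key; the client class and method names are assumptions for illustration.

```python
from zentry import MemoryClient  # assumed import path

client = MemoryClient(api_key="your-zentry-api-key")

# Store and search memories through the platform API
client.add(
    [{"role": "user", "content": "I prefer vegetarian restaurants."}],
    user_id="alice",
)
results = client.search("food preferences", user_id="alice")
```

Contributing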
We welcome contributions to zentry! Here's how you can contribute:

1. Fork the repository and create your branch from `main`.
2. Clone the forked repository to your local machine.
3. Install the project dependencies (see the example commands after this list).
4. Install pre-commit hooks.
5. Make your changes and ensure they adhere to the project's coding standards.
6. Run the tests locally.
7. If all tests pass, commit your changes and push to your fork.
8. Open a pull request with a clear title and description.
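The exact commands depend on the repository's tooling; a typical Python setup might look like this (a sketch, so adjust to the repo's actual instructions):

```bash
# Install the project dependencies (editable install with dev extras is an assumption)
pip install -e ".[dev]"

# Install pre-commit hooks
pre-commit install

# Run the tests locally
pytest
```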