📢 Announcing our research paper: Zentry achieves 26% higher accuracy than OpenAI Memory, 91% lower latency, and 90% token savings! Read the paper to learn how we're revolutionizing AI agent memory.
New Features:
- Memory: Added Group Chat Memory support (usage sketch after this list)
- Examples: Added Healthcare assistant using Zentry and Google ADK
- SSE: Fixed SSE connection issues
- MCP: Fixed memories not appearing in MCP clients added from Dashboard
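The group chat memory entry above refers to storing memories attributed to individual participants in a shared conversation. Below is a minimal sketch of how that might look; the `zentry` package name, the `Memory` class, and the `add()`/`search()` signatures are assumptions for illustration, not the confirmed API.

```python
# Hypothetical sketch: attributing memories to individual speakers in a group chat.
# The `zentry` package, `Memory` class, and method signatures are assumptions.
from zentry import Memory

memory = Memory()

group_chat = [
    {"role": "user", "name": "alice", "content": "I'm allergic to peanuts."},
    {"role": "user", "name": "bob", "content": "I prefer vegetarian food."},
    {"role": "assistant", "content": "Noted, I'll keep both preferences in mind."},
]

# Store the conversation once; per-speaker attribution is assumed to come
# from the `name` field on each message.
memory.add(group_chat, run_id="dinner-planning")

# Later, retrieve what was learned in that conversation.
results = memory.search("food preferences", run_id="dinner-planning")
print(results)
```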
New Features:
- OpenMemory: Added OpenMemory support
- Neo4j: Added weights to Neo4j model
- AWS: Added support for OpenSearch Serverless
- Examples: Added ElizaOS Example
- Documentation: Updated Azure AI documentation
- AI SDK: Added missing parameters and updated demo application
- OSS: Fixed AOSS and AWS Bedrock LLM
New Features:
- Neo4j: Added support for Neo4j database (config sketch after this list)
- AWS: Added support for AWS Bedrock Embeddings
- Client: Updated delete_users() to use V2 API endpoints
- Documentation: Updated timestamp and dual-identity memory management docs
- Neo4j: Improved Neo4j queries and removed warnings
- AI SDK: Added support for graceful failure when services are down
- Fixed AI SDK filters
- Fixed incorrect type on new memories
- Fixed duplicated metadata when adding/updating memories
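For the Neo4j support noted above, a configuration sketch follows. The `graph_store` config shape, the provider string, and the `Memory.from_config()` constructor are assumptions for illustration.

```python
# Hypothetical config sketch for enabling a Neo4j-backed graph store.
# Config keys, provider name, and `Memory.from_config()` are assumptions.
from zentry import Memory

config = {
    "graph_store": {
        "provider": "neo4j",
        "config": {
            "url": "neo4j+s://<your-instance>.databases.neo4j.io",
            "username": "neo4j",
            "password": "<password>",
        },
    },
}

memory = Memory.from_config(config)
memory.add("Alice works with Bob on the payments team.", user_id="alice")
```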
New Features:
- HuggingFace: Added support for HF Inference
- Fixed proxy for Zentry
New Features:
- Vercel AI SDK: Added Graph Memory support
- Documentation: Fixed timestamp and README links
- Client: Updated TS client to use proper types for deleteUsers
- Dependencies: Removed unnecessary dependencies from base package
Improvements:
- Client: Fixed ping method to use the default org_id and project_id
- Documentation: Updated documentation
- Fixed Zentry-migrations issue
New Features:
- Integrations: Added Memgraph integration
- Memory: Added timestamp support (sketch below)
- Vector Stores: Added reset function for VectorDBs
- Documentation:
- Updated timestamp and expiration_date documentation
- Fixed v2 search documentation
- Added “memory” in EC “Custom config” section
- Fixed typos in the JSON config sample
Improvements:
- Vector Stores: Initialized embedding_model_dims in all vectordbs
- Documentation: Fixed agno link
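The timestamp entry above (together with the expiration_date documentation it references) suggests memories can carry an explicit creation time and an expiry. A hedged sketch follows; the `timestamp` and `expiration_date` parameter names on `add()` are assumptions.

```python
# Hypothetical sketch of timestamped, expiring memories; parameter names are assumptions.
from datetime import datetime, timedelta, timezone

from zentry import Memory

memory = Memory()

now = datetime.now(timezone.utc)
memory.add(
    "User booked a hotel in Lisbon for next month.",
    user_id="alice",
    timestamp=int(now.timestamp()),  # assumed: when the memory was created
    expiration_date=(now + timedelta(days=90)).date().isoformat(),  # assumed: when it lapses
)
```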
New Features:
- Memory: Added Memory Reset functionality
- Client: Added support for Custom Instructions (sketch after this list)
- Examples: Added Fitness Checker powered by memory
- Core: Updated capture_event
- Documentation: Fixed curl for v2 get_all
- Vector Store: Fixed user_id functionality
- Client: Various client improvements
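Two of the entries above, Memory Reset and Custom Instructions, are user-facing. A sketch of how they might be invoked; the `reset()` method, the `ZentryClient` class, and the `update_project()` method name are assumptions.

```python
# Hypothetical sketch; `reset()`, `ZentryClient`, and `update_project()` are assumed names.
from zentry import Memory, ZentryClient

# Wipe all stored memories, vectors, and history on the OSS side.
memory = Memory()
memory.reset()

# Set project-level custom extraction instructions on the managed platform side.
client = ZentryClient(api_key="<api-key>")
client.update_project(custom_instructions="Only extract food and travel preferences.")
```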
New Features:
- LLM Integrations: Added Azure OpenAI Embedding Model (config sketch after this list)
- Examples:
- Added movie recommendation using grok3
- Added Voice Assistant using Elevenlabs
- Documentation:
- Added Keywords AI documentation
- Reformatted navbar page URLs
- Updated changelog
- Updated openai.mdx
- FAISS: Silenced FAISS info logs
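For the Azure OpenAI embedding model above, a configuration sketch follows; the `embedder` config shape, the provider string, and the environment variable names are assumptions.

```python
# Hypothetical config sketch for an Azure OpenAI embedder; all keys are assumptions.
import os

from zentry import Memory

config = {
    "embedder": {
        "provider": "azure_openai",
        "config": {
            "model": "text-embedding-3-small",
            "azure_kwargs": {
                "api_key": os.environ["AZURE_OPENAI_API_KEY"],
                "azure_endpoint": os.environ["AZURE_OPENAI_ENDPOINT"],
                "api_version": "2024-02-01",
            },
        },
    },
}

memory = Memory.from_config(config)
```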
New Features:
- LLM Integrations: Added Mistral AI as an LLM provider (config sketch after this list)
- Documentation:
- Updated changelog
- Fixed memory exclusion example
- Updated xAI documentation
- Updated YouTube Chrome extension example documentation
- Core: Fixed EmbedderFactory.create() in GraphMemory
- Azure OpenAI: Added patch to fix Azure OpenAI
- Telemetry: Fixed telemetry issue
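For the Mistral AI provider above, a hedged LLM config sketch; the provider string and config keys are assumptions.

```python
# Hypothetical config sketch for using Mistral AI as the LLM provider; keys are assumptions.
import os

from zentry import Memory

config = {
    "llm": {
        "provider": "mistralai",
        "config": {
            "model": "mistral-large-latest",
            "api_key": os.environ["MISTRAL_API_KEY"],
            "temperature": 0.1,
        },
    },
}

memory = Memory.from_config(config)
```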
New Features:
- Langchain Integration: Added support for Langchain VectorStores
- Examples:
- Added personal assistant example
- Added personal study buddy example
- Added YouTube assistant Chrome extension example
- Added agno example
- Updated OpenAI Responses API examples
- Vector Store: Added capability to store user_id in vector database
- Async Memory: Added async support for OSS (usage sketch after this list)
- Documentation: Updated formatting and examples
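The async support entry above implies an awaitable counterpart to the synchronous memory class. A sketch follows, assuming a class named `AsyncMemory` with coroutine `add()`/`search()` methods; the class name and signatures are assumptions.

```python
# Hypothetical async usage sketch; `AsyncMemory` and its coroutine methods are assumptions.
import asyncio

from zentry import AsyncMemory


async def main() -> None:
    memory = AsyncMemory()
    await memory.add("User prefers morning meetings.", user_id="alice")
    hits = await memory.search("meeting preferences", user_id="alice")
    print(hits)


asyncio.run(main())
```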
New Features:
- Upstash Vector: Added support for Upstash Vector store
- Code Quality: Removed redundant code lines
- Build: Updated MAKEFILE
- Documentation: Updated memory export documentation
Improvements:
- FAISS: Added embedding_dims parameter to FAISS vector store
New Features:
- Langchain Embedder: Added Langchain embedder integration
- Langchain LLM: Updated Langchain LLM integration to pass the Langchain LLM object directly (sketch after this list)
Bug Fixes:
- Langchain LLM: Fixed issues with Langchain LLM integration
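The entry above about passing the Langchain LLM object directly suggests the config can accept an instantiated LangChain model rather than a provider string plus parameters. A sketch follows, using `langchain-openai`'s `ChatOpenAI`; the "langchain" provider key and the config shape on the Zentry side are assumptions.

```python
# Hypothetical sketch: handing an instantiated LangChain LLM straight to the config.
# The "langchain" provider key and the `config` shape are assumptions.
from langchain_openai import ChatOpenAI

from zentry import Memory

llm = ChatOpenAI(model="gpt-4o-mini", temperature=0)

config = {
    "llm": {
        "provider": "langchain",
        "config": {"model": llm},  # pass the LangChain object directly
    },
}

memory = Memory.from_config(config)
```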
New Features:
- LLM Integrations: Added support for Langchain LLMs, and added Google as a new LLM and embedder provider
- Development: Added development docker compose
- Output Format: Set output_format='v1.1' and updated documentation (usage sketch after this list)
- Integrations: Added LMStudio and Together.ai documentation
- API Reference: Updated output_format documentation
- Integrations: Added PipeCat integration documentation
- Integrations: Added Flowise integration documentation for Zentry memory setup
- Tests: Fixed failing unit tests
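The output_format entry above changes the response shape returned by memory calls. A sketch of opting in explicitly on a client call; the parameter name comes from the entry itself, while the `ZentryClient` class and `add()` signature are assumptions.

```python
# Hypothetical sketch: requesting the v1.1 output format explicitly.
# `ZentryClient` and the `add()` signature are assumptions; `output_format` is named above.
from zentry import ZentryClient

client = ZentryClient(api_key="<api-key>")
response = client.add(
    [{"role": "user", "content": "I moved to Berlin last week."}],
    user_id="alice",
    output_format="v1.1",  # assumed: returns memories under a structured "results" key
)
print(response)
```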
New Features:
- FAISS Support: Added FAISS vector store support (config sketch below)
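A config sketch for the FAISS vector store support noted above; the provider string, config keys, and the embedding_model_dims parameter (added in a later release in this changelog) are assumptions.

```python
# Hypothetical FAISS vector store config; keys and defaults are assumptions.
from zentry import Memory

config = {
    "vector_store": {
        "provider": "faiss",
        "config": {
            "path": "/tmp/zentry_faiss",   # assumed: where the index is persisted
            "embedding_model_dims": 1536,  # must match the embedder's output size
        },
    },
}

memory = Memory.from_config(config)
memory.add("User enjoys hiking on weekends.", user_id="alice")
```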
New Features:
- Livekit Integration: Added Zentry livekit example
- Evaluation: Added evaluation framework and tools
- Multimodal: Updated multimodal documentation
- Examples: Added examples for email processing
- API Reference: Updated API reference section
- Elevenlabs: Added Elevenlabs integration example
Bug Fixes:
- OpenAI Environment Variables: Fixed issues with OpenAI environment variables
- Deployment Errors: Added package.json file to fix deployment errors
- Tools: Fixed tools issues and improved formatting
- Docs: Updated API reference section for expiration_date
New Features:
- Supabase Vector Store: Added support for Supabase Vector Store
- Supabase History DB: Added Supabase History DB to run Zentry OSS on serverless deployments (config sketch after this list)
- Feedback Method: Added feedback method to client
- Azure OpenAI: Fixed issues with Azure OpenAI
- Azure AI Search: Fixed test cases for Azure AI Search
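For the Supabase entries above, a combined sketch of a Supabase vector store plus history database suitable for serverless deployments; the provider names, config keys, and connection-string usage are assumptions.

```python
# Hypothetical config sketch for running on Supabase (vector store + history DB).
# Provider names, config keys, and the connection-string usage are assumptions.
import os

from zentry import Memory

config = {
    "vector_store": {
        "provider": "supabase",
        "config": {
            "connection_string": os.environ["SUPABASE_CONNECTION_STRING"],
            "collection_name": "memories",
        },
    },
    "history_db": {
        "provider": "supabase",
        "config": {
            "connection_string": os.environ["SUPABASE_CONNECTION_STRING"],
        },
    },
}

memory = Memory.from_config(config)
```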