Vercel AI SDK
The Zentry AI SDK Provider is a library developed by Zentry to integrate with the Vercel AI SDK. This library brings enhanced AI interaction capabilities to your applications by introducing persistent memory functionality.
Exciting news! Zentry AI SDK now supports Graph Memory.
Overview
- Offers persistent memory storage for conversational AI
- Enables smooth integration with the Vercel AI SDK
- Ensures compatibility with multiple LLM providers
- Supports structured message formats for clarity
- Facilitates streaming response capabilities
Setup and Configuration
Install the SDK provider using npm:
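Assuming the provider is published as `@zentry/vercel-ai-provider` (a placeholder package name used throughout the sketches below):

```bash
npm install @zentry/vercel-ai-provider
```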
Getting Started
Setting Up Zentry
- Get your Zentry API Key from the Zentry Dashboard.

- Initialize the Zentry Client in your application (see the sketch after this list):
  Note: The `openai` provider is set as default. Consider using `Zentry_API_KEY` and `OPENAI_API_KEY` as environment variables for security.

  Note: The `ZentryConfig` is optional. It is used to set the global config for the Zentry Client (e.g. `user_id`, `agent_id`, `app_id`, `run_id`, `org_id`, `project_id`, etc.).
- Add Memories to Enhance Context:
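A minimal sketch covering initialization and adding memories. `createZentry`, `addMemories`, and the `ZentryConfig` fields come from this page; the package name and the option names outside `ZentryConfig` (`zentryApiKey`, `apiKey`, `config`) are assumptions:

```typescript
import { createZentry, addMemories } from '@zentry/vercel-ai-provider';

// Initialize the Zentry Client (option names outside ZentryConfig are assumptions).
const zentry = createZentry({
  provider: 'openai',                       // the default provider
  zentryApiKey: process.env.Zentry_API_KEY, // or set Zentry_API_KEY in the environment
  apiKey: process.env.OPENAI_API_KEY,       // key for the underlying LLM provider
  config: { user_id: 'alice' },             // optional global ZentryConfig (user_id, agent_id, ...)
});

// Add memories so later generations have user context.
await addMemories(
  [{ role: 'user', content: [{ type: 'text', text: 'I love red sports cars and dislike SUVs.' }] }],
  { user_id: 'alice' },
);
```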
Standalone Features:
For standalone features, such as `addMemories`, `retrieveMemories`, and `getMemories`, you must either set `Zentry_API_KEY` as an environment variable or pass it directly in the function call.
`getMemories` will return raw memories as an array of objects, while `retrieveMemories` will return a string containing a system prompt with the retrieved memories embedded. When `enable_graph` is enabled, `getMemories` returns an object with two keys, `results` and `relations`; otherwise, it returns an array of objects.
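A rough illustration of the standalone helpers described above; the function names and return shapes come from this page, while the argument shapes are assumptions:

```typescript
import { retrieveMemories, getMemories } from '@zentry/vercel-ai-provider';

const prompt = 'Suggest me a good car to buy.';

// Returns a string: a system prompt with the retrieved memories embedded.
const systemContext = await retrieveMemories(prompt, { user_id: 'alice' });

// Returns raw memories: an array of objects, or { results, relations } when enable_graph is on.
const rawMemories = await getMemories(prompt, { user_id: 'alice' });

console.log(systemContext, rawMemories);
```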
1. Basic Text Generation with Memory Context
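A minimal sketch, assuming the hypothetical package name above and that the `createZentry` factory takes a model id plus per-call options such as `user_id`:

```typescript
import { generateText } from 'ai';
import { createZentry } from '@zentry/vercel-ai-provider';

const zentry = createZentry({ provider: 'openai' });

// Memories stored for this user_id are pulled in as additional context automatically.
const { text } = await generateText({
  model: zentry('gpt-4o', { user_id: 'alice' }),
  prompt: 'Suggest me a good car to buy.',
});

console.log(text);
```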
2. Combining OpenAI Provider with Memory Utils
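A sketch that pairs the standard `@ai-sdk/openai` provider with `retrieveMemories`, based on the return shape described above:

```typescript
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { retrieveMemories } from '@zentry/vercel-ai-provider';

const prompt = 'Suggest me a good car to buy.';

// Fetch a memory-enriched system prompt, then use a plain OpenAI model for generation.
const memoryContext = await retrieveMemories(prompt, { user_id: 'alice' });

const { text } = await generateText({
  model: openai('gpt-4o'),
  system: memoryContext,
  prompt,
});

console.log(text);
```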
3. Structured Message Format with Memory
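A sketch using the AI SDK's multi-part message format, with the same assumed provider shape as above:

```typescript
import { generateText } from 'ai';
import { createZentry } from '@zentry/vercel-ai-provider';

const zentry = createZentry({ provider: 'openai' });

const { text } = await generateText({
  model: zentry('gpt-4o', { user_id: 'alice' }),
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'I am planning a weekend road trip.' },
        { type: 'text', text: 'Which of the cars I like would suit it best?' },
      ],
    },
  ],
});

console.log(text);
```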
4. Streaming Responses with Memory Context
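A streaming sketch with `streamText`; only the Zentry model factory shape is assumed:

```typescript
import { streamText } from 'ai';
import { createZentry } from '@zentry/vercel-ai-provider';

const zentry = createZentry({ provider: 'openai' });

// streamText returns immediately; consume the text stream chunk by chunk.
const result = streamText({
  model: zentry('gpt-4o', { user_id: 'alice' }),
  prompt: 'Based on what you know about me, plan a weekend trip.',
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```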
5. Generate Responses with Tool Calls
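A tool-calling sketch using the AI SDK's `tool` helper with `zod` (AI SDK 4-style `parameters`); the weather tool is purely illustrative:

```typescript
import { generateText, tool } from 'ai';
import { z } from 'zod';
import { createZentry } from '@zentry/vercel-ai-provider';

const zentry = createZentry({ provider: 'openai' });

const { text, toolResults } = await generateText({
  model: zentry('gpt-4o', { user_id: 'alice' }),
  prompt: 'What is the weather like where I usually go hiking?',
  tools: {
    weather: tool({
      description: 'Get the weather for a location',
      parameters: z.object({ location: z.string() }),
      execute: async ({ location }) => ({ location, temperature: 21 }),
    }),
  },
});

console.log(text, toolResults);
```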
6. Get Sources from Memory
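The exact mechanism for reading sources is not shown here; one plausible shape, assuming the provider surfaces the memories it drew on via the result's `sources` field, is:

```typescript
import { generateText } from 'ai';
import { createZentry } from '@zentry/vercel-ai-provider';

const zentry = createZentry({ provider: 'openai' });

const { text, sources } = await generateText({
  model: zentry('gpt-4o', { user_id: 'alice' }),
  prompt: 'Suggest me a good car to buy.',
});

// Assumption: `sources` lists the memories that informed this response.
console.log(sources);
```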
The same can be done for `streamText` as well.
Graph Memory
Zentry AI SDK now supports Graph Memory. You can enable it by setting `enable_graph` to `true` in the `ZentryConfig` object.

You can also pass `enable_graph` in the standalone functions. This includes `getMemories`, `retrieveMemories`, and `addMemories`.

The `getMemories` function will return an object with two keys, `results` and `relations`, if `enable_graph` is set to `true`. Otherwise, it will return an array of objects.
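A sketch of enabling Graph Memory globally and per call, with the same assumed package and option shapes as the earlier sketches:

```typescript
import { createZentry, getMemories } from '@zentry/vercel-ai-provider';

// Enable Graph Memory globally via ZentryConfig.
const zentry = createZentry({
  provider: 'openai',
  config: { user_id: 'alice', enable_graph: true },
});

// Or enable it per call in a standalone function.
const memories = await getMemories('What cars do I like?', {
  user_id: 'alice',
  enable_graph: true,
});

// With enable_graph, the result has `results` and `relations`; otherwise it is a plain array.
console.log(memories.results, memories.relations);
```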
Key Features
- `createZentry()`: Initializes a new Zentry provider instance.
- `retrieveMemories()`: Retrieves memory context for prompts.
- `getMemories()`: Gets memories from your profile in array format.
- `addMemories()`: Adds user memories to enhance contextual responses.
Best Practices
-
User Identification: Use a unique
user_id
for consistent memory retrieval. -
Memory Cleanup: Regularly clean up unused memory data.
Note: We also have support for
agent_id
,app_id
, andrun_id
. Refer Docs.
Conclusion
Zentry's Vercel AI SDK integration enables the creation of intelligent, context-aware applications with persistent memory and seamless integration.
Help
- For more details on Vercel AI SDK, visit the Vercel AI SDK documentation.
- For Zentry documentation, refer to the Zentry Platform.
- If you need further assistance, please feel free to reach out to us.