📢 Announcing our research paper: Zentry achieves 26% higher accuracy than OpenAI Memory, 91% lower latency, and 90% token savings! Read the paper to learn how we're revolutionizing AI agent memory.
The Zentry AI SDK Provider is a library developed by Zentry to integrate with the Vercel AI SDK. This library brings enhanced AI interaction capabilities to your applications by introducing persistent memory functionality.
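The provider is distributed as an npm package. Assuming the package name matches the import path used in the examples below, installation looks like:

```shell
# Package name assumed from the import path "@Zentry/vercel-ai-provider"
npm install @Zentry/vercel-ai-provider ai
```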
🎉 Exciting news! Zentry AI SDK now supports Graph Memory.
Note: The openai provider is set as the default. For security, set Zentry_API_KEY and OPENAI_API_KEY as environment variables rather than hard-coding them.
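For example, in your shell or a `.env` file (variable names as given in the note above):

```shell
# Keys are read from the environment instead of being hard-coded in source
export Zentry_API_KEY="your-zentry-api-key"
export OPENAI_API_KEY="your-openai-api-key"
```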
Note: The ZentryConfig is optional. It sets the global config for the Zentry Client (e.g., user_id, agent_id, app_id, run_id, org_id, project_id, etc.).
Add Memories to Enhance Context:
```typescript
import { LanguageModelV1Prompt } from "ai";
import { addMemories } from "@Zentry/vercel-ai-provider";

const messages: LanguageModelV1Prompt = [
  { role: "user", content: [{ type: "text", text: "I love red cars." }] },
];

await addMemories(messages, { user_id: "borat" });
```
For standalone features, such as addMemories, retrieveMemories, and getMemories, you must either set Zentry_API_KEY as an environment variable or pass it directly in the function call.
getMemories returns raw memories as an array of objects, while retrieveMemories returns a string: a system prompt with the retrieved memories embedded in it. When enable_graph is enabled, getMemories instead returns an object with two keys, results and relations.
```typescript
import { generateText } from "ai";
import { createZentry } from "@Zentry/vercel-ai-provider";

const Zentry = createZentry();

const { text } = await generateText({
  model: Zentry("gpt-4-turbo", { user_id: "borat" }),
  prompt: "Suggest me a good car to buy!",
});
```
```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { retrieveMemories } from "@Zentry/vercel-ai-provider";

const prompt = "Suggest me a good car to buy.";
const memories = await retrieveMemories(prompt, { user_id: "borat" });

const { text } = await generateText({
  model: openai("gpt-4-turbo"),
  prompt: prompt,
  system: memories,
});
```
```typescript
import { generateText } from "ai";
import { createZentry } from "@Zentry/vercel-ai-provider";

const Zentry = createZentry();

const { text } = await generateText({
  model: Zentry("gpt-4-turbo", { user_id: "borat" }),
  messages: [
    {
      role: "user",
      content: [
        { type: "text", text: "Suggest me a good car to buy." },
        { type: "text", text: "Why is it better than the other cars for me?" },
      ],
    },
  ],
});
```
```typescript
import { streamText } from "ai";
import { createZentry } from "@Zentry/vercel-ai-provider";

const Zentry = createZentry();

const { textStream } = await streamText({
  model: Zentry("gpt-4-turbo", { user_id: "borat" }),
  prompt:
    "Suggest me a good car to buy! Why is it better than the other cars for me? Give options for every price range.",
});

for await (const textPart of textStream) {
  process.stdout.write(textPart);
}
```
```typescript
import { generateText, tool } from "ai";
import { createZentry } from "@Zentry/vercel-ai-provider";
import { z } from "zod";

const Zentry = createZentry({
  provider: "anthropic",
  apiKey: "anthropic-api-key",
  ZentryConfig: {
    // Global User ID
    user_id: "borat",
  },
});

const prompt = "What is the temperature in the city that I live in?";

const result = await generateText({
  model: Zentry("claude-3-5-sonnet-20240620"),
  tools: {
    weather: tool({
      description: "Get the weather in a location",
      parameters: z.object({
        location: z.string().describe("The location to get the weather for"),
      }),
      execute: async ({ location }) => ({
        location,
        temperature: 72 + Math.floor(Math.random() * 21) - 10,
      }),
    }),
  },
  prompt: prompt,
});

console.log(result);
```