📢 Announcing our research paper: Zentry achieves 26% higher accuracy than OpenAI Memory, 91% lower latency, and 90% token savings! Read the paper to learn how we're revolutionizing AI agent memory.
To use OpenAI LLMs, you need to set the OPENAI_API_KEY environment variable. You can obtain an OpenAI API key from the OpenAI Platform.
```python
import os
from Zentry import Memory

os.environ["OPENAI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.2,
            "max_tokens": 2000,
        }
    }
}

# Use OpenRouter by passing its API key
# os.environ["OPENROUTER_API_KEY"] = "your-api-key"
# config = {
#     "llm": {
#         "provider": "openai",
#         "config": {
#             "model": "meta-llama/llama-3.1-70b-instruct",
#         }
#     }
# }

m = Memory.from_config(config)

messages = [
    {"role": "user", "content": "I'm planning to watch a movie tonight. Any recommendations?"},
    {"role": "assistant", "content": "How about thriller movies? They can be quite engaging."},
    {"role": "user", "content": "I'm not a big fan of thriller movies but I love sci-fi movies."},
    {"role": "assistant", "content": "Got it! I'll avoid thriller recommendations and suggest sci-fi movies in the future."},
]

m.add(messages, user_id="alice", metadata={"category": "movies"})
```
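Once memories are added, they can be retrieved in later conversations. The snippet below is a minimal sketch, assuming `Memory` exposes a `search` method that accepts a query string and a `user_id` (check the Zentry API reference for the exact signature and return format):

```python
# Hypothetical retrieval example: assumes Memory.search(query, user_id=...)
# is available; adjust to the actual Zentry API if it differs.
results = m.search(
    query="What kind of movies does Alice like?",
    user_id="alice",
)
print(results)
```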