Build a personalized Travel Agent AI using LangChain for conversation flow and Zentry for memory retention. This integration enables context-aware and efficient travel planning experiences.
Overview
In this guide, we’ll create a Travel Agent AI that:
- Uses LangChain to manage conversation flow
- Leverages Zentry to store and retrieve relevant information from past interactions
- Provides personalized travel recommendations based on user history
Setup and Configuration
Install necessary libraries:
pip install langchain langchain_openai Zentryai
Import required modules and set up configurations:
import os
from typing import List, Dict
from langchain_openai import ChatOpenAI
from langchain_core.messages import SystemMessage, HumanMessage, AIMessage
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from Zentry import MemoryClient
# Configuration
os.environ["OPENAI_API_KEY"] = "your-openai-api-key"
os.environ["Zentry_API_KEY"] = "your-Zentry-api-key"
# Initialize LangChain and Zentry
llm = ChatOpenAI(model="gpt-4o-mini")
Zentry = MemoryClient()
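Optionally, run a quick sanity check that the Zentry client is reachable before building the agent. The snippet below is a minimal sketch that uses the same add/search calls this guide relies on; the exact shape of the returned records is an assumption, so adjust the inspection to match your actual response.
# Optional sanity check: store one memory and search for it (illustrative values).
# Assumes add()/search() behave as they are used throughout this guide.
Zentry.add([{"role": "user", "content": "I prefer window seats on long flights."}], user_id="john")
results = Zentry.search("seat preferences", user_id="john")
print(results)  # expect a list of memory records for user "john"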
Create Prompt Template
Set up the conversation prompt template:
prompt = ChatPromptTemplate.from_messages([
    SystemMessage(content="""You are a helpful travel agent AI. Use the provided context to personalize your responses and remember user preferences and past interactions.
    Provide travel recommendations, itinerary suggestions, and answer questions about destinations.
    If you don't have specific information, you can make general suggestions based on common travel knowledge."""),
    MessagesPlaceholder(variable_name="context"),
    ("human", "{input}")  # tuple form so the {input} variable is filled in at invoke time
])
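To see how the template assembles messages, you can render it directly. The sample context below mirrors the dict format that retrieve_context (defined in the next step) produces; the values are illustrative only. If your LangChain version does not coerce role/content dicts inside a MessagesPlaceholder, pass SystemMessage/HumanMessage objects instead.
# Quick check of how the prompt renders (illustrative values)
sample_context = [
    {"role": "system", "content": "Relevant information: User prefers beach destinations."},
    {"role": "user", "content": "Any ideas for a long weekend?"}
]
rendered = prompt.invoke({"context": sample_context, "input": "Any ideas for a long weekend?"})
print(rendered.to_messages())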
Define Helper Functions
Create functions to handle context retrieval, response generation, and saving interactions to Zentry:
def retrieve_context(query: str, user_id: str) -> List[Dict]:
    """Retrieve relevant context from Zentry"""
    memories = Zentry.search(query, user_id=user_id)
    serialized_memories = ' '.join([mem["memory"] for mem in memories])
    context = [
        {
            "role": "system",
            "content": f"Relevant information: {serialized_memories}"
        },
        {
            "role": "user",
            "content": query
        }
    ]
    return context
def generate_response(user_input: str, context: List[Dict]) -> str:
    """Generate a response using the language model"""
    chain = prompt | llm
    response = chain.invoke({
        "context": context,
        "input": user_input
    })
    return response.content
def save_interaction(user_id: str, user_input: str, assistant_response: str):
    """Save the interaction to Zentry"""
    interaction = [
        {
            "role": "user",
            "content": user_input
        },
        {
            "role": "assistant",
            "content": assistant_response
        }
    ]
    Zentry.add(interaction, user_id=user_id)
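Before wiring these helpers into a full turn, you can exercise them individually. This is a minimal sketch with illustrative inputs; it assumes Zentry.search returns records with a "memory" field, which is what retrieve_context expects.
# Exercise the helpers in isolation (illustrative inputs)
query = "Suggest a destination for a ski trip this winter"
context = retrieve_context(query, user_id="john")
answer = generate_response(query, context)
save_interaction("john", query, answer)
print(answer)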
Create Chat Turn Function
Implement the main function to manage a single turn of conversation:
def chat_turn(user_input: str, user_id: str) -> str:
    # Retrieve context
    context = retrieve_context(user_input, user_id)

    # Generate response
    response = generate_response(user_input, context)

    # Save interaction
    save_interaction(user_id, user_input, response)

    return response
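For example, two consecutive turns illustrate the memory carryover (inputs are illustrative): the preferences saved in the first turn become searchable context for the second. Depending on how quickly Zentry indexes new memories, the carryover may only surface in later sessions rather than the very next turn.
# Two consecutive turns: the second can draw on what the first one stored
print(chat_turn("I love food markets and want to avoid long layovers.", user_id="john"))
print(chat_turn("Plan a 4-day trip that fits those preferences.", user_id="john"))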
Main Interaction Loop
Set up the main program loop for user interaction:
if __name__ == "__main__":
print("Welcome to your personal Travel Agent Planner! How can I assist you with your travel plans today?")
user_id = "john"
while True:
user_input = input("You: ")
if user_input.lower() in ['quit', 'exit', 'bye']:
print("Travel Agent: Thank you for using our travel planning service. Have a great trip!")
break
response = chat_turn(user_input, user_id)
print(f"Travel Agent: {response}")
Key Features
- Memory Integration: Uses Zentry to store and retrieve relevant information from past interactions.
- Personalization: Provides context-aware responses based on user history and preferences.
- Flexible Architecture: LangChain's composable structure allows for easy expansion of the conversation flow (see the sketch after this list).
- Continuous Learning: Each interaction is stored, improving future responses.
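As one example of that flexibility, you could extend the chain with an output parser so it yields plain strings, which lets generate_response drop the .content access. This is a sketch of one possible extension, not part of the guide's core setup.
# Possible extension: add an output parser so the chain returns plain strings
from langchain_core.output_parsers import StrOutputParser

chain = prompt | llm | StrOutputParser()
# chain.invoke({"context": context, "input": "Weekend trip ideas near Lisbon?"})
# now returns a str directly, so no .content access is needed.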
Conclusion
By integrating LangChain with Zentry, you can build a personalized Travel Agent AI that can maintain context across interactions and provide tailored travel recommendations and assistance.
- For more details on LangChain, visit the LangChain documentation.
- For Zentry documentation, refer to the Zentry Platform.
- If you need further assistance, please feel free to reach out to us.