Supported LLMs
Groq
Groq is the creator of the world’s first Language Processing Unit (LPU), which delivers exceptionally fast inference for AI workloads running on its LPU Inference Engine.
To use LLMs from Groq, obtain an API key from the Groq platform and set it as the GROQ_API_KEY
environment variable, as shown in the example below.
Usage
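A minimal sketch of configuring Groq as the LLM provider. The config dictionary shape (a `llm` block with `provider` and `config` keys) and the `Memory.from_config` import mirror common memory-layer SDKs and are assumptions here, as is the example model name; check the package's actual API before relying on them.

```python
import os

# Set the Groq API key before constructing the client.
os.environ["GROQ_API_KEY"] = "your-api-key"

# Assumed config shape: an "llm" block selecting the "groq" provider.
config = {
    "llm": {
        "provider": "groq",
        "config": {
            "model": "llama-3.1-70b-versatile",  # example Groq-hosted model (assumed)
            "temperature": 0.1,
            "max_tokens": 1000,
        },
    }
}

# Assumed usage, mirroring typical memory SDKs; adjust the import
# to match the actual package layout:
# from zentry import Memory
# m = Memory.from_config(config)
# m.add("I enjoy hiking on weekends.", user_id="alice")
```

The `temperature` and `max_tokens` values above are illustrative defaults; the full set of accepted parameters is covered in the Config section below.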
Config
All available parameters for the groq config are listed in the Master List of All Params in Config.