Cut your LLM bill
40–70% today
Point your base_url at cache.kaissa.ai, keep your API key,
and we cache your LLM calls semantically — zero code changes.
# one environment variable, swapped from the default endpoint
OPENAI_BASE_URL=https://api.openai.com/v1    # before
OPENAI_BASE_URL=https://cache.kaissa.ai/v1   # after