internal/support-bot/CLAUDE.md

# Support Bot (Maximilian)

Customer support automation for Studyflash using Gemini AI with KB-grounded responses.

## Quick Reference

```bash
# Local dev
uvicorn main:app --reload --port 8000

# Deploy
# Auto-deploys on push to `main` of studyflash-ai/studyflash via Dokploy's
# GitHub integration. No manual deploy step.

# Adopt / drift-check the Dokploy resources via Pulumi
pnpm preview        # pulumi preview --diff
pnpm run pulumi:up  # pulumi up

# Check logs
# Open the Customer Support project in Dokploy UI → Customer Support Bot →
# Logs tab. Same source-of-truth used by the Dokploy auto-deploy pipeline.
```

## Architecture

### Key Files

| File | Purpose |
|------|---------|
| `main.py` | FastAPI app, `/ask` endpoint |
| `services/draft_agent_service.py` | Gemini draft generation |
| `services/category_classifier_service.py` | Query classification |
| `services/kb_service.py` | KB loading from `kb/` directory |
| `models/draft.py` | Response schemas (no defaults in `DraftResponseSchema` - Gemini limitation) |
| `kb/*.md` | Knowledge base articles by category |
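The "no defaults" note on `models/draft.py` is worth illustrating. A minimal sketch, assuming Pydantic models and hypothetical field names (check the real file for the actual schema):

```python
# Hypothetical sketch of models/draft.py. Gemini's structured-output mode
# does not accept schema fields carrying default values, so every field in
# DraftResponseSchema is declared required (no `= value`, no Optional[...]
# with a default). Field names here are assumptions for illustration.
from pydantic import BaseModel


class DraftResponseSchema(BaseModel):
    # All fields required: omitting any of them raises a validation error,
    # which is the behavior Gemini's response_schema support expects.
    answer: str
    category: str
    confidence: float
```

Keeping every field required means validation fails loudly when Gemini returns an incomplete draft, rather than silently filling in a default.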

## API Endpoints

- `POST /ask` - Main support query endpoint
- `POST /analyze` - Weekly gap analyzer
- `POST /log-correction` - Log human corrections
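A stdlib-only sketch of calling the main endpoint locally. The request payload shape (`{"query": ...}`) is an assumption; check the request model in `main.py`:

```python
# Minimal client sketch for POST /ask, using only the standard library.
# The "query" payload key and the absence of an auth header are assumptions
# about the endpoint's request model, not confirmed by this doc.
import json
import urllib.request


def build_ask_request(
    query: str, base_url: str = "http://localhost:8000"
) -> urllib.request.Request:
    """Build the POST /ask request with a JSON body."""
    return urllib.request.Request(
        f"{base_url}/ask",
        data=json.dumps({"query": query}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def ask_support_bot(query: str) -> dict:
    """Send a support query to a locally running bot and return the draft."""
    with urllib.request.urlopen(build_ask_request(query)) as resp:
        return json.load(resp)
```

Run the local dev server first (`uvicorn main:app --reload --port 8000`), then call `ask_support_bot("How do I reset my password?")`.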

## Secrets

Sourced from Infisical at `/internal/support-bot/` (prod env) and written to Dokploy's native env at `pulumi up` time. Live keys (per application.one):

- `AUTOMATIONS_ADMIN_KEY`
- `AUTO_REPLY_ENABLED`
- `CHATWOOT_ACCOUNT_ID`
- `CHATWOOT_API_TOKEN`
- `CHATWOOT_API_URL`
- `GEMINI_API_KEY`
- `METABASE_EMBEDDING_SECRET_KEY`
- `PORT`
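A sketch of how the service might read these at startup, trimmed to a few keys. Variable names match the list above; the boolean parsing of `AUTO_REPLY_ENABLED` and the `8080` fallback for `PORT` are assumptions about how the service interprets them:

```python
# Sketch of reading the Infisical-sourced env vars at process startup.
# Required keys use os.environ[...] so a missing secret fails fast;
# the truthy-string handling for AUTO_REPLY_ENABLED is an assumption.
import os


def load_config() -> dict:
    return {
        "gemini_api_key": os.environ["GEMINI_API_KEY"],
        "chatwoot_api_url": os.environ["CHATWOOT_API_URL"],
        # Dokploy serves the container on 8080; assumed as the fallback.
        "port": int(os.environ.get("PORT", "8080")),
        # Env vars are strings, so map common truthy spellings to True.
        "auto_reply_enabled": os.environ.get("AUTO_REPLY_ENABLED", "false").lower()
        in {"1", "true", "yes"},
    }
```

Failing fast on required secrets surfaces a misconfigured Dokploy env at boot rather than on the first Gemini call.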

## Deployment Notes

- KB files must be in the `.dockerignore` whitelist (`!kb/*.md`).
- Public URL: https://support-bot.studyflash.dev (port 8080 in-container, Let's Encrypt via Dokploy).
- The Pulumi adoption stack lives next to the source: `index.ts`, `Pulumi.yaml`, `package.json` in this directory.
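The `.dockerignore` whitelist pattern above implies an exclude-everything-then-re-include layout. A hypothetical sketch; only the `!kb/*.md` entry is confirmed by this doc, the rest are assumed:

```
# Hypothetical .dockerignore sketch: exclude everything, then re-include
# what the image needs. Only !kb/*.md is confirmed; other entries are
# assumptions about the build context.
*
!main.py
!services/
!models/
!kb/*.md
!requirements.txt
```

With this layout, any new top-level file must be explicitly whitelisted before it reaches the Docker build context.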