Faroo is a multi-tenant helpdesk chatbot that runs where your data lives — on-premises, at the edge, fully air-gapped. Local LLMs, vector memory, and function calling. No cloud required. Ever.
A first-class orchestrator routes user intent to sandboxed tools — each with path validation and tenant-scoped config injection.
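The path-validation step can be sketched like this — a minimal example assuming each tenant's tools are confined to a per-tenant root directory (the helper name and layout are illustrative, not Faroo's actual API):

```python
from pathlib import Path

def validate_tool_path(tenant_root: str, requested: str) -> Path:
    """Reject any requested path that escapes the tenant's sandbox root."""
    root = Path(tenant_root).resolve()
    candidate = (root / requested).resolve()
    # resolve() collapses ../ segments, so a traversal attempt lands
    # outside `root` and fails the is_relative_to check below
    if not candidate.is_relative_to(root):
        raise PermissionError(f"path escapes tenant sandbox: {requested}")
    return candidate
```

Resolving before comparing is the important part: checking the raw string would let `../../other-tenant/secrets` slip through.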
Every tenant gets its own memory, prompt, model routing, and webhook rules. One binary, many customers — no data bleed between them.
Semantic retrieval without a separate database process. One file, one tenant, one index — portable and fast on a single node.
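In spirit, a single-file, single-tenant index is just an embedding list with brute-force similarity search — this sketch uses plain cosine similarity and JSON persistence purely for illustration; Faroo's on-disk format and search internals are not described here:

```python
import json
import math

class TenantIndex:
    """One file, one tenant: a brute-force cosine-similarity index (sketch)."""

    def __init__(self, path: str):
        self.path = path
        self.rows = []  # list of (text, embedding-vector) pairs

    def add(self, text: str, vector: list):
        self.rows.append((text, vector))

    def search(self, query: list, k: int = 3):
        def cos(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a))
            nb = math.sqrt(sum(y * y for y in b))
            return dot / (na * nb)
        # rank every stored row against the query vector
        return sorted(self.rows, key=lambda r: cos(query, r[1]), reverse=True)[:k]

    def save(self):
        # the whole index is one portable file per tenant
        with open(self.path, "w") as f:
            json.dump(self.rows, f)
```

On a single node with one tenant's data, a linear scan like this is often fast enough that no separate database process is needed.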
Low-confidence answers escalate to a human. When that human responds, Faroo learns. State-machine approval and dedup make it robust — not just cute.
HMAC-signed webhooks, tenant-scoped routes, constant-time comparison, replay protection. Security isn't a flag — it's the posture.
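The verification posture described above can be sketched in a few lines — signing scheme, header layout, and the in-memory dedup store are assumptions for illustration, not Faroo's documented protocol:

```python
import hashlib
import hmac
import time

SEEN_IDS = set()   # dedup store; a real deployment would use a TTL cache
MAX_SKEW = 300     # seconds a signed timestamp stays acceptable

def verify_webhook(secret: bytes, body: bytes, timestamp: str,
                   event_id: str, signature: str) -> bool:
    """HMAC check with constant-time comparison and replay protection."""
    # 1. Stale timestamps are rejected, bounding the replay window
    if abs(time.time() - int(timestamp)) > MAX_SKEW:
        return False
    # 2. Event IDs we have already accepted are rejected (replay/dedup)
    if event_id in SEEN_IDS:
        return False
    # 3. Recompute the signature over timestamp + body and compare
    #    in constant time to avoid timing side channels
    msg = timestamp.encode() + b"." + body
    expected = hmac.new(secret, msg, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, signature):
        return False
    SEEN_IDS.add(event_id)
    return True
```

Signing the timestamp together with the body matters: it binds the freshness check to the same secret, so an attacker can't replay an old body under a new timestamp.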
Immutable base memory, mutable state cleanly separated. Ship the image, seed a tenant, done. Works offline, air-gapped, on a ship.
Faroo never phones home. No telemetry, no model calls, no sneaky egress. If the machine is offline, Faroo still works. If the regulator asks, you have a clean answer.
Base memory is immutable and versioned. Mutable data is cleanly separated. Upgrades are a docker pull. Rollbacks are a docker pull. Your ops team will finally sleep.
Tools are Python classes with an llm_callable flag. Write one, register it, and the orchestrator picks it up. No plugin marketplaces, no config spaghetti. Just code.
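A tool in this style might look like the following — the registry decorator, class attributes, and `run` signature are a plausible sketch of the described pattern, not Faroo's actual interface:

```python
TOOL_REGISTRY = {}

def register(cls):
    """Pick up any tool class that opts in via its llm_callable flag."""
    if getattr(cls, "llm_callable", False):
        TOOL_REGISTRY[cls.name] = cls
    return cls

@register
class LookupOrder:
    name = "lookup_order"
    llm_callable = True    # without this flag the orchestrator ignores the class
    description = "Fetch an order's status by ID."

    def run(self, order_id: str, config: dict) -> str:
        # `config` stands in for the tenant-scoped config the
        # orchestrator injects at call time
        return f"order {order_id}: status unknown (demo stub)"
```

Because discovery is just a flag plus a registry, adding a capability is one class in one file — no manifest, no marketplace.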
Faroo escalates when it doesn't know. It tracks which human answered and why. It learns, but only from approved data. Confidence is earned, not faked.
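The escalate → answer → approve flow reads naturally as a small state machine. This sketch shows the shape of it — state names, fields, and the first-answer-wins dedup rule are assumptions for illustration:

```python
from enum import Enum

class EscState(Enum):
    OPEN = "open"
    ANSWERED = "answered"
    APPROVED = "approved"   # only approved answers feed back into memory

class Escalation:
    """Illustrative escalation record: tracks who answered and gates learning."""

    def __init__(self, question: str):
        self.question = question
        self.state = EscState.OPEN
        self.answer = None
        self.answered_by = None

    def record_answer(self, text: str, who: str):
        if self.state is not EscState.OPEN:
            raise ValueError("already answered")   # dedup: first answer wins
        self.answer, self.answered_by = text, who
        self.state = EscState.ANSWERED

    def approve(self):
        if self.state is not EscState.ANSWERED:
            raise ValueError("nothing to approve")
        self.state = EscState.APPROVED
        # only now is the Q/A pair eligible to enter the knowledge base
        return (self.question, self.answer)
```

The invariant the states enforce is the point: nothing reaches memory without passing through `ANSWERED` and an explicit approval.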
Faroo is in active development. Drop your email and we'll let you know the moment the beacon turns on.