Indovix Product
On-Premises AI for Australian Business
Full Feature Breakdown
AI & Knowledge
- Local LLM via Ollama (no cloud)
- Vector search via pgvector
- Semantic document retrieval (RAG)
- Local embedding models
- Plain-English query interface
- Context-aware answers grounded in your data
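The retrieval step behind those grounded answers can be sketched in plain Python. This is a toy illustration only: in the real stack the embeddings come from a local model served by Ollama and nearest-neighbour search runs inside pgvector, whereas here tiny hand-made vectors and an in-memory corpus stand in for both.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query_vec, corpus, top_k=2):
    """Rank (text, vector) pairs by similarity to the query vector and
    return the top_k texts — the context handed to the LLM."""
    ranked = sorted(corpus, key=lambda d: cosine_similarity(query_vec, d[1]),
                    reverse=True)
    return [text for text, _ in ranked[:top_k]]

# Pretend embeddings: three short business documents in a 3-D space.
corpus = [
    ("Leave policy: staff accrue 4 weeks annual leave.", [0.9, 0.1, 0.0]),
    ("Forklift SOP: pre-start checks are mandatory.",    [0.1, 0.9, 0.1]),
    ("Price list: widget A is $42 ex GST.",              [0.0, 0.2, 0.9]),
]
# Pretend embedding of the query "how much annual leave do I get?"
query_vec = [0.85, 0.15, 0.05]
print(retrieve(query_vec, corpus, top_k=1))
```

The leave-policy document ranks first, so it alone would be passed to the LLM as grounding context for the answer.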
Deployment & Infrastructure
- Docker-based stack
- Windmill workflow automation
- Config-driven per-client setup
- Single config file per deployment
- LAN-accessible from day one
- No internet required for core operation
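A per-client config file of the kind described above might look like this. All field names and values here are illustrative, not the actual Indovix schema:

```yaml
# client.yml — hypothetical per-deployment config (illustrative fields only)
client: acme-manufacturing
llm_model: llama3.1:8b          # model pulled into local Ollama
embedding_model: nomic-embed-text
ingest_paths:
  - /data/sops
  - /data/catalogues
lan_bind: 0.0.0.0:8080          # reachable on the LAN from day one
remote_access: none             # Premium tier: tailscale
```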
Document Ingestion
- PDF, Word, Excel, plain text
- Email archives
- Business manuals & SOPs
- Product catalogues
- Client records (anonymised)
- Ongoing ingestion pipeline
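Before documents can be indexed they are typically split into overlapping chunks so that each chunk fits an embedding model's context and sentences straddling a boundary remain retrievable. A minimal sketch of that step, with chunk sizes chosen purely for illustration:

```python
def chunk_text(text, chunk_size=200, overlap=50):
    """Split a document into overlapping word windows for embedding.

    Sizes are word counts here for simplicity; real pipelines tune
    chunking (and often use token counts) per document type.
    """
    words = text.split()
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + chunk_size]))
        if start + chunk_size >= len(words):
            break  # final window already covers the end of the document
    return chunks

# A 450-word dummy document yields three overlapping chunks.
sample = " ".join(f"word{i}" for i in range(450))
chunks = chunk_text(sample, chunk_size=200, overlap=50)
print(len(chunks))  # → 3
```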
Base Tier vs Premium Tier
| Feature | Base | Premium |
|---|---|---|
| On-premises AI assistant | ✓ | ✓ |
| Document ingestion & indexing | ✓ | ✓ |
| LAN access for team | ✓ | ✓ |
| Local embedding (no cloud API) | ✓ | ✓ |
| Day-one knowledge base | ✓ | ✓ |
| Tailscale remote access | — | ✓ |
| Claude.ai connector | — | ✓ |
| Xero integration module | — | ✓ |
| Microsoft 365 integration | — | ✓ |
| Priority support | — | ✓ |
Technical Specifications
Stack
- Docker Compose
- Ollama (local LLM)
- pgvector (vector DB)
- Windmill (automation)
- PostgreSQL
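Wired together with Docker Compose, that stack could look roughly like the sketch below. Image tags, ports, and environment variables are assumptions for illustration; the actual Indovix compose file may differ.

```yaml
# docker-compose.yml — minimal sketch of the stack above (illustrative only)
services:
  ollama:
    image: ollama/ollama
    volumes: ["ollama:/root/.ollama"]      # model weights persist locally
  db:
    image: pgvector/pgvector:pg16          # PostgreSQL with pgvector built in
    environment:
      POSTGRES_PASSWORD: change-me
    volumes: ["pgdata:/var/lib/postgresql/data"]
  windmill:
    image: ghcr.io/windmill-labs/windmill
    depends_on: [db]
    environment:
      DATABASE_URL: postgres://postgres:change-me@db/windmill
    ports: ["8080:8080"]                   # LAN-accessible UI
volumes:
  ollama: {}
  pgdata: {}
```

Everything binds to the local network only; nothing in this composition reaches out to the internet once images and models are pulled.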
Deployment
- On-premises hardware
- LAN or Tailscale access
- Config-driven setup
- No internet dependency
Minimum Hardware
- x86-64 processor
- 16 GB RAM (recommended minimum)
- SSD storage
- Ubuntu 22.04+ / Debian 12+