Integrations
Local-first by default. Cloud when you ask for it.
Everything below works in the free tier. Cloud providers fire only when you explicitly set the relevant API key environment variable. The default code path makes zero outbound calls to any third party except Ollama at localhost:11434.
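Since localhost:11434 is the only default outbound dependency, you can sanity-check it before first run. A quick sketch using curl against Ollama's standard model-list endpoint (requires a running Ollama; not a Tanvrit-specific API):

```shell
# /api/tags is Ollama's built-in endpoint that lists installed models.
# A 200 response with a JSON body means the default backend is reachable.
curl -sf http://localhost:11434/api/tags
```

If this fails, start Ollama first; nothing else in the default path makes a network call.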
Browser engine
Playwright 1.52
Stable: Drives Chromium, Firefox, and WebKit. PlaywrightManager.launch() auto-downloads matching browser binaries on first run.
DESKTOP / MOBILE / TABLET device profiles supported. Headless and headed both work; headed is preferred during development for visual debugging.
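PlaywrightManager.launch() fetches browsers on demand, but you can pre-download them to keep first runs fast (for example, when baking a container image). A setup sketch using Playwright's own CLI — this is standard Playwright tooling, not a Tanvrit command:

```shell
# Pre-fetch the three engines, pinning the Playwright version so the
# downloaded binaries match the library version the app links against.
npm install --no-save playwright@1.52
npx playwright install chromium firefox webkit
```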
Local LLM provider
Ollama
Stable: The default LLM backend. Runs entirely on your machine. No API key, no rate limit, no token charges.
qwen2.5-coder for planning (~4 GB), nomic-embed-text for RAG embeddings (~270 MB), qwen2.5-vl for vision (~5 GB). Set OLLAMA_URL to point at a remote Ollama if you want a shared model server.
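Pulling all three models up front avoids a multi-gigabyte download stall on first use. A setup sketch using the stock Ollama CLI; model names are the ones listed above, and the remote host is a placeholder:

```shell
# One-time model downloads (~9 GB total for the three models listed above)
ollama pull qwen2.5-coder      # planning
ollama pull nomic-embed-text   # RAG embeddings
ollama pull qwen2.5-vl         # vision

# Optional: point the app at a shared model server instead of localhost.
# "models.internal" is a hypothetical hostname — substitute your own.
export OLLAMA_URL=http://models.internal:11434
```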
Cloud LLM provider
Anthropic Claude
Opt-in: Cloud fallback for prompts that exceed your local model's context window. Set AUTOMATOR_LLM_PROVIDER=anthropic and ANTHROPIC_API_KEY.
Default model is claude-sonnet-4-6. Tanvrit charges nothing extra; you pay Anthropic directly. The default code path is local — cloud calls fire only when you explicitly opt in.
OpenAI
Opt-in: GPT-4o and o-series models supported via the OpenAI client. Set OPENAI_API_KEY and AUTOMATOR_LLM_PROVIDER=openai.
Google Gemini
Opt-in: Gemini 1.5 Pro and Flash supported. Set GEMINI_API_KEY and AUTOMATOR_LLM_PROVIDER=gemini.
Groq
Opt-in: Low-latency inference for Llama / Mixtral / Qwen models via the Groq Cloud API. Set GROQ_API_KEY and AUTOMATOR_LLM_PROVIDER=groq.
DeepSeek
Opt-in: DeepSeek-V3 and DeepSeek-Coder via the DeepSeek API. Set DEEPSEEK_API_KEY and AUTOMATOR_LLM_PROVIDER=deepseek.
Mistral
Opt-in: Mistral Large and Codestral via the Mistral API. Set MISTRAL_API_KEY and AUTOMATOR_LLM_PROVIDER=mistral.
Tool exposure
MCP — Model Context Protocol
Stable: Tanvrit Automator can expose its 31 ToolDispatcher actions as MCP tools, callable from any MCP-compatible client.
Wire it into Claude Desktop, Cursor, or Zed and your LLM coding assistant gains direct browser-driving capability. Both stdio and HTTP transports are supported. See /help/mcp.
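Wiring into Claude Desktop means adding a server entry to its claude_desktop_config.json. A minimal stdio-transport sketch — the `tanvrit-automator mcp` launch command is a hypothetical placeholder, so check /help/mcp for the real invocation:

```shell
# Write a scratch MCP config (merge into your real claude_desktop_config.json).
# "tanvrit-automator mcp" is a placeholder command, not a confirmed CLI.
cat > /tmp/claude_desktop_config.json <<'EOF'
{
  "mcpServers": {
    "tanvrit-automator": {
      "command": "tanvrit-automator",
      "args": ["mcp"]
    }
  }
}
EOF

# Validate the JSON before pointing Claude Desktop at it
python3 -m json.tool /tmp/claude_desktop_config.json
```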
Local storage
SQLDelight + SQLite
Stable: All trajectories, bench flows, and projects are persisted to ~/.automator/automator.db.
SQLDelight 2.1.0 generates type-safe Kotlin from SQL schemas in commonMain/sqldelight/. Tables: flows, trajectories, trajectory_steps, bench_flows, bench_runs, projects, action_steps, ui_patterns.
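Because storage is plain SQLite, the stock sqlite3 CLI works for inspection. Demonstrated below on a scratch database — the column here is purely illustrative, not the real SQLDelight schema; swap in ~/.automator/automator.db and `.tables` to inspect the real store:

```shell
# Scratch demo: the real file at ~/.automator/automator.db is queried the
# same way. "flow_name" is a made-up column for illustration only.
DB=/tmp/automator-demo.db
sqlite3 "$DB" "CREATE TABLE IF NOT EXISTS trajectories(id INTEGER PRIMARY KEY, flow_name TEXT);"
sqlite3 "$DB" "INSERT INTO trajectories(flow_name) VALUES ('checkout-flow');"
sqlite3 "$DB" "SELECT COUNT(*) FROM trajectories;"
```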
Vision fallback
Tesseract OCR
Stable: Last-resort OCR for pages where Qwen2.5-VL captioning misses on-screen text.
Install with `brew install tesseract` (macOS) or `apt install tesseract-ocr` (Ubuntu). Language packs are separate; install the ones your target apps use.
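Once installed, the fallback is the plain tesseract CLI. A verification-and-usage sketch; `screenshot.png` is a placeholder path, not a file the app produces by that name:

```shell
# Confirm the binary and which language packs are present
tesseract --version
tesseract --list-langs

# OCR an image to out.txt using the English pack
tesseract screenshot.png out -l eng
```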
CI integration
GitHub Actions
Phase 2A: A reusable Action that runs Tanvrit Automator headless against a deployed preview URL, giving your app's CI bench gates.
Currently you wire it manually with a JVM runner; the published action is on the roadmap. The bench runner CLI is functional today.
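Until the published Action ships, a CI job can call the bench runner directly. Everything in this sketch — the jar name, the flags, and the report path — is hypothetical manual wiring; substitute the actual bench runner CLI's invocation:

```shell
# Hypothetical CI step: run bench flows headless against a preview deploy.
# Jar name and flags are placeholders, not the real CLI surface.
java -jar tanvrit-automator-bench.jar \
  --target-url "$PREVIEW_URL" \
  --headless \
  --report bench-report.json

# Gate the pipeline on the report's existence (placeholder contract)
test -f bench-report.json
```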
Environment-variable cheat sheet
# Default — local Ollama, zero cloud calls
AUTOMATOR_LLM_PROVIDER=ollama
OLLAMA_URL=http://localhost:11434
OLLAMA_MODEL=qwen2.5-coder

# Anthropic Claude (opt-in)
AUTOMATOR_LLM_PROVIDER=anthropic
ANTHROPIC_API_KEY=sk-ant-...
ANTHROPIC_MODEL=claude-sonnet-4-6

# OpenAI / Gemini / Groq / DeepSeek / Mistral
AUTOMATOR_LLM_PROVIDER=openai   # or gemini, groq, deepseek, mistral
OPENAI_API_KEY=sk-...

# Smart autofill (opt-in, step-0 form fill)
AGENT_SMART_AUTOFILL=on