Smart Terminal is local-first. When you run models locally (for example, through Ollama), your data never leaves your machine.
If you choose a cloud provider (such as OpenAI, Anthropic, or Groq), your prompts and terminal context are sent directly to that provider's API and handled under its privacy policy. Your API keys and requests never pass through our servers; we do not store or intercept them.
We collect no telemetry, analytics, or usage data from your terminal.