AI Providers

Atua works with your own API keys. You choose the provider and model you want to use.

Choose A Provider

Providers fall into three broad groups:

  • Major hosted providers if you want the easiest setup
  • Local providers if you want to keep requests on your Mac
  • Custom providers if you already use another OpenAI-compatible service

Providers Supported By Atua

Major hosted providers

  • OpenAI
  • Anthropic
  • Groq
  • Mistral
  • DeepSeek
  • xAI
  • Perplexity
  • Together AI
  • Fireworks
  • Cerebras
  • DeepInfra
  • Hugging Face
  • Cohere

Routing providers

  • OpenRouter
  • LLM Gateway

These are useful if you want access to multiple models through one provider account.
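The idea behind a routing provider can be sketched in a few lines. This is an illustrative sketch, not Atua's implementation: it assumes OpenRouter's public OpenAI-compatible base URL, and the model slug shown is a hypothetical example — only the model string changes between requests.

```python
import json

# Hypothetical sketch: a routing provider exposes one OpenAI-compatible
# endpoint, and the model name selects the upstream vendor.
OPENROUTER_BASE_URL = "https://openrouter.ai/api/v1"

def chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat request (constructed only, never sent)."""
    return {
        "url": f"{OPENROUTER_BASE_URL}/chat/completions",
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }

# One account, many models: swap the model string, keep everything else.
req = chat_request("anthropic/claude-3.5-sonnet", "Hello")
print(json.dumps(req["body"], indent=2))
```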

Local providers

  • Ollama
  • LM Studio

These are useful if you want your AI requests to stay on your machine.

Custom providers

You can also add any OpenAI-compatible service by entering a custom base URL.

Add A Provider

  1. Open Settings → Models.
  2. Click +.
  3. Choose the provider you want.
  4. Enter your API key if that provider needs one.
  5. Enter the model name you want to use.
  6. Click Save.
  7. Turn Active on if you want it to be your main provider.

Switch Providers

To change your main provider:

  1. Open Settings → Models.
  2. Select another provider in the sidebar.
  3. Turn Active on.

Only one provider is active at a time for general use.

Use A Different Provider For One Command

If you want one command to use a different provider or model:

  1. Open Settings → Commands.
  2. Select the command.
  3. Open the Model section.
  4. Pick the provider you want for that command.

This is useful if you want one fast model for quick edits and another model for bigger tasks.

Run Models Locally

If you want your AI requests to stay on your Mac, use one of these local options.

Ollama

  1. Install Ollama.
  2. Download a model in Ollama.
  3. In Atua, add Ollama as a provider.
  4. Enter the model name you want to use.

LM Studio

  1. Install LM Studio.
  2. Load a model in LM Studio.
  3. Start LM Studio’s local server.
  4. In Atua, add LM Studio as a provider.
  5. Enter the model name you want to use.

Add A Custom Provider

If your provider is not listed, use Custom:

  1. Click + in Settings → Models.
  2. Choose Custom.
  3. Enter the API base URL.
  4. Enter your API key.
  5. Enter the model name.

This works with services that follow the OpenAI chat format.
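"The OpenAI chat format" means the service accepts a chat-completions request shaped like the one below. This is a sketch of that shape only — the base URL and model name are placeholders standing in for the values you enter in steps 3 and 5, and no request is actually sent:

```python
import json

# Sketch of the OpenAI chat format a custom provider must accept.
base_url = "https://api.example.com/v1"  # placeholder custom base URL (step 3)
payload = {
    "model": "my-model",                 # placeholder model name (step 5)
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize this paragraph."},
    ],
}

# An OpenAI-compatible client POSTs this JSON to {base_url}/chat/completions,
# passing the API key in the Authorization header.
print(json.dumps(payload, indent=2))
```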

A Simple Way To Decide

  • Use OpenAI or Anthropic if you want a familiar hosted setup
  • Use OpenRouter if you want flexibility across many models
  • Use Ollama or LM Studio if you want to keep requests local
  • Use Custom if your team or company already has another compatible endpoint