AI Providers
Atua works with your own API keys. You choose the provider and model you want to use.
Choose A Provider
Most people can think about providers in four simple groups:
- Major hosted providers if you want the easiest setup
- Routing providers if you want many models through one account
- Local providers if you want to keep requests on your Mac
- Custom providers if you already use another OpenAI-compatible service
Providers Supported By Atua
Major hosted providers
- OpenAI
- Anthropic
- Groq
- Mistral
- DeepSeek
- xAI
- Perplexity
- Together AI
- Fireworks
- Cerebras
- DeepInfra
- Hugging Face
- Cohere
Routing providers
- OpenRouter
- LLM Gateway
These are useful if you want access to multiple models through one provider account.
Local providers
- Ollama
- LM Studio
These are useful if you want your AI requests to stay on your machine.
Custom providers
You can also add any OpenAI-compatible service by entering a custom base URL.
Add A Provider
- Open Settings → Models.
- Click +.
- Choose the provider you want.
- Enter your API key if that provider needs one.
- Enter the model name you want to use.
- Click Save.
- Turn Active on if you want it to be your main provider.
Switch Providers
To change your main provider:
- Open Settings → Models.
- Select another provider in the sidebar.
- Turn Active on.
Only one provider is active at a time for general use.
Use A Different Provider For One Command
If you want one command to use a different provider or model:
- Open Settings → Commands.
- Select the command.
- Open the Model section.
- Pick the provider you want for that command.
This is useful if you want one fast model for quick edits and another model for bigger tasks.
Run Models Locally
If you want your AI requests to stay on your Mac, use one of these local options.
Ollama
- Install Ollama.
- Download a model in Ollama.
- In Atua, add Ollama as a provider.
- Enter the model name you want to use.
LM Studio
- Install LM Studio.
- Load a model in LM Studio.
- Start LM Studio’s local server.
- In Atua, add LM Studio as a provider.
- Enter the model name you want to use.
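Both Ollama and LM Studio expose an OpenAI-compatible server on localhost once they are running. As an illustration (not part of Atua itself), here is a minimal Python sketch that checks whether a local server is reachable and lists its models; the ports shown are the common defaults for Ollama and LM Studio and may differ on your machine.

```python
import json
import urllib.error
import urllib.request

# Common default local endpoints; adjust if you changed the port.
LOCAL_SERVERS = {
    "Ollama": "http://localhost:11434/v1",
    "LM Studio": "http://localhost:1234/v1",
}

def list_local_models(base_url, timeout=2):
    """Return model IDs from an OpenAI-compatible /models endpoint,
    or None if the server is not reachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/models", timeout=timeout) as resp:
            data = json.load(resp)
        return [m["id"] for m in data.get("data", [])]
    except (urllib.error.URLError, OSError):
        return None

if __name__ == "__main__":
    for name, url in LOCAL_SERVERS.items():
        models = list_local_models(url)
        print(f"{name}: {', '.join(models) if models else 'not running'}")
```

The model IDs this prints are the names you would enter in Atua's model-name field for that local provider.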
Add A Custom Provider
If your provider is not listed, use Custom:
- Click + in Settings → Models.
- Choose Custom.
- Enter the API base URL.
- Enter your API key.
- Enter the model name.
This works with any service that follows the OpenAI chat completions format.
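For reference, "the OpenAI chat format" means the service accepts a POST to `{base URL}/chat/completions` with a Bearer token and a JSON body containing a model name and a list of messages. The sketch below builds (but does not send) such a request; the base URL, API key, and model name are placeholders for your own service's values.

```python
import json
import urllib.request

def build_chat_request(base_url, api_key, model, prompt):
    """Build an OpenAI-style chat completions request.
    base_url, api_key, and model are placeholders for your service."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Example values only; any endpoint that accepts this shape should work with Custom.
req = build_chat_request(
    "https://example.com/v1", "YOUR_API_KEY", "your-model-name", "Hello"
)
print(req.full_url)  # https://example.com/v1/chat/completions
```

If you can reach your service with a request shaped like this, its base URL is what you enter in the Custom provider's API base URL field.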
A Simple Way To Decide
- Use OpenAI or Anthropic if you want a familiar hosted setup
- Use OpenRouter if you want flexibility across many models
- Use Ollama or LM Studio if you want to keep requests local
- Use Custom if your team or company already has another compatible endpoint