Before using AI Employees, you must first configure the available LLM services.
Supported providers include OpenAI, Gemini, Claude, DeepSeek, Qwen, Kimi, and Ollama local models.
Go to System Settings -> AI Employees -> LLM service.
Click Add New to open the creation dialog, then fill in Provider, Title, API Key, and Base URL (optional).

Under Enabled Models, add models in one of three ways:

- Recommended models: use the officially recommended models.
- Select models: select from the provider's model list.
- Manual input: manually enter a model ID and display name.

Click Submit to save.
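Conceptually, a saved service entry bundles the fields above. The sketch below is only an illustration of that shape; the field names and values are hypothetical, not the product's actual schema:

```python
# Hypothetical representation of one LLM service entry.
# All field names and values here are illustrative only.
service = {
    "title": "OpenAI (primary)",        # Title shown in the service list
    "provider": "openai",               # Provider
    "api_key": "sk-placeholder",        # API Key (secret, per service)
    "base_url": None,                   # Base URL is optional; None = provider default
    "enabled": True,                    # the Enabled switch in the service list
    "models": [
        # "Recommended models" mode: an officially recommended model
        {"id": "gpt-4o-mini", "display_name": "GPT-4o mini", "source": "recommended"},
        # "Manual input" mode: model ID and display name entered by hand
        {"id": "my-fine-tune", "display_name": "My fine-tune", "source": "manual"},
    ],
}

print(len(service["models"]))  # → 2
```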
In the LLM service list, you can turn each service on or off with its Enabled switch.
Use Test flight at the bottom of the service dialog to verify service and model availability.
It is recommended to run this test before production use.
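A service availability check of this kind typically sends a minimal request to the provider's endpoint and verifies that the chosen model responds. The sketch below shows what such a probe might assemble for an OpenAI-compatible API; the base URL, key, and model ID are placeholders, and this is an assumption about the test's mechanics, not the product's actual implementation:

```python
import json

def build_test_request(base_url: str, api_key: str, model_id: str) -> dict:
    """Assemble a minimal chat-completion request that could be used to
    probe whether a service and model are reachable (hypothetical sketch)."""
    return {
        "url": f"{base_url.rstrip('/')}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        # A one-token round trip is enough to confirm availability.
        "body": json.dumps({
            "model": model_id,
            "messages": [{"role": "user", "content": "ping"}],
            "max_tokens": 1,
        }),
    }

# Placeholder values only; no network call is made here.
req = build_test_request("https://api.example.com/v1", "sk-placeholder", "gpt-4o-mini")
print(req["url"])  # → https://api.example.com/v1/chat/completions
```

Sending this request and receiving any well-formed completion (rather than an auth or model-not-found error) is what distinguishes a passing check from a failing one.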