Configure LLM Service

AI Employee · Community Edition+

Before using AI Employees, you must first configure at least one available LLM service.

Supported providers include OpenAI, Gemini, Claude, DeepSeek, Qwen, Kimi, and Ollama local models.

Create Service

Go to System Settings -> AI Employees -> LLM service.

  1. Click Add New to open the creation dialog.
  2. Select Provider.
  3. Fill in Title, API Key, and optionally Base URL.
  4. Configure Enabled Models:
    • Select models: select from the provider model list.
    • Manual input: enter the model ID and display name manually when the model list cannot be retrieved from the provider's API.
  5. Click Submit to save.
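Conceptually, the dialog collects the fields below. The following sketch shows roughly what a saved service configuration corresponds to; all field names here are illustrative, not NocoBase's internal schema.

```python
# Illustrative sketch of the LLM service creation fields.
# Field and key names are hypothetical, for explanation only.

def build_llm_service(provider, title, api_key, base_url=None, models=None):
    """Validate the dialog fields and return a service config dict."""
    if not provider or not title or not api_key:
        raise ValueError("Provider, Title, and API Key are required")
    return {
        "provider": provider,
        "title": title,
        "apiKey": api_key,
        # Base URL is optional; leave it unset to use the provider's default endpoint.
        "baseURL": base_url,
        # Enabled models: selected from the provider list, or entered
        # manually as (model ID, display name) pairs.
        "models": [{"id": mid, "displayName": name} for mid, name in (models or [])],
    }

service = build_llm_service(
    provider="openai",
    title="My OpenAI",
    api_key="sk-...",
    models=[("gpt-4o", "GPT-4o")],
)
print(service["models"][0]["displayName"])  # GPT-4o
```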


Enable and Sort Services

In the LLM service list, you can:

  • Toggle service status with the Enabled switch.
  • Drag to reorder services (affects model display order).
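The two list controls above can be sketched as a simple filter-and-sort: only enabled services contribute models, in the order the services are dragged into. The structure below is a hypothetical illustration, not NocoBase's actual data model.

```python
# Illustrative sketch: how the Enabled switch and drag order could
# determine which models are shown and in what order.
# Key names ("enabled", "sortOrder", ...) are hypothetical.

def visible_models(services):
    """Return model display names from enabled services, in list order."""
    ordered = sorted(services, key=lambda s: s["sortOrder"])
    return [
        m["displayName"]
        for s in ordered
        if s["enabled"]
        for m in s["models"]
    ]

services = [
    {"title": "Ollama", "enabled": False, "sortOrder": 2,
     "models": [{"displayName": "Llama 3"}]},
    {"title": "OpenAI", "enabled": True, "sortOrder": 1,
     "models": [{"displayName": "GPT-4o"}]},
]
print(visible_models(services))  # ['GPT-4o']
```

Disabling a service hides all of its models at once; dragging a service up moves all of its models ahead of those from services below it.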

(Screenshot: LLM service list with the Enabled switch and drag-to-sort handles)

Availability Test

Use Test flight at the bottom of the service dialog to verify that the service and its models are reachable and respond correctly.

It is recommended to run this test before production use.
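In essence, such a test sends a minimal chat request and checks that a well-formed completion comes back. The sketch below illustrates that check against the OpenAI-compatible response format that most of the listed providers (and Ollama's OpenAI-compatible endpoint) expose; it is an assumption about what the test verifies, not NocoBase's implementation.

```python
# Rough sketch of what an availability test verifies: the endpoint
# returned a well-formed chat completion with non-empty content.

def looks_available(response: dict) -> bool:
    """Heuristic check on an OpenAI-style chat-completion response dict."""
    try:
        content = response["choices"][0]["message"]["content"]
        return isinstance(content, str) and len(content) > 0
    except (KeyError, IndexError, TypeError):
        return False

ok = {"choices": [{"message": {"role": "assistant", "content": "pong"}}]}
bad = {"error": {"message": "invalid api key"}}
print(looks_available(ok), looks_available(bad))  # True False
```

A failed test usually points to a wrong API Key, an unreachable Base URL, or a model ID the provider does not serve.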