# Configure LLM Service

Before using AI Employees, you must first configure at least one available LLM service.

Supported providers include OpenAI, Gemini, Claude, DeepSeek, Qwen, Kimi, and local models served via Ollama.

# Create Service

Go to System Settings -> AI Employees -> LLM service.

  1. Click Add New to open the creation dialog.
  2. Select a Provider.
  3. Fill in the Title and API Key, and optionally the Base URL.
  4. Configure Enabled Models in one of three ways:
    • Recommended models: use the models officially recommended for the provider.
    • Select models: pick from the provider's model list.
    • Manual input: enter a model ID and display name by hand.
  5. Click Submit to save.

llm-service-create-provider-enabled-models.png
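Conceptually, the dialog collects a small set of fields per service. A minimal sketch of such a service record is shown below; the field names are illustrative assumptions, not NocoBase's actual storage schema:

```python
# Illustrative sketch only: field names are assumptions, not NocoBase's actual schema.
llm_service = {
    "title": "OpenAI (primary)",      # display name shown in the service list
    "provider": "openai",             # one of the supported providers
    "api_key": "sk-...",              # secret issued by the provider
    "base_url": None,                 # optional; override for proxies or self-hosted gateways
    "enabled_models": [               # models this service exposes to AI Employees
        {"id": "gpt-4o", "display_name": "GPT-4o"},
    ],
    "enabled": True,                  # toggled later via the Enabled switch
}
```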

# Enable and Sort Services

In the LLM service list, you can:

  • Toggle service status with the Enabled switch.
  • Drag to reorder services; this order determines how models are displayed.

llm-service-list-enabled-and-sort.png

# Availability Test

Use the Test flight button at the bottom of the service dialog to verify that the service and its models are reachable.

Running this test before production use is recommended.
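If you prefer to verify availability outside the UI, a minimal probe can be sketched against an OpenAI-compatible chat endpoint (several of the supported providers, including DeepSeek, Qwen, and Ollama, expose one). The base URL, API key, and model name below are placeholders, and this is a generic sketch rather than what Test flight itself does:

```python
import json
import urllib.request

def build_probe(base_url: str, api_key: str, model: str) -> urllib.request.Request:
    """Build a minimal chat-completion request used only to check reachability."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 1,  # keep the probe as cheap as possible
    }
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To actually send the probe (requires a valid key and network access):
# with urllib.request.urlopen(build_probe("https://api.openai.com/v1", "YOUR_KEY", "gpt-4o")) as resp:
#     print(resp.status)  # HTTP 200 indicates the service and model are reachable
```

A non-200 response (or an error body mentioning the model ID) usually points to a wrong Base URL, an invalid API Key, or a model that is not enabled for your account.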