Managing configurations 

Configurations define use-case-specific presets that combine a model with a system prompt and generation parameters. Extension developers reference configurations by identifier in their code.
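As a minimal sketch of how that lookup might work (the registry, field names, and helper below are illustrative assumptions, not the product's actual API), extension code resolves the preset by identifier at request time:

```python
# Hypothetical sketch: an in-memory registry standing in for the stored
# configurations. Extension code looks a preset up by identifier at
# request time, so admin-side changes apply without redeploying code.
CONFIGURATIONS = {
    "blog-summarizer": {
        "name": "Blog Post Summarizer",
        "model": "qwen3",  # illustrative model ID
        "system_prompt": "Summarize the given blog post in three sentences.",
        "temperature": 0.7,
        "use_case": "chat",
    },
}

def get_configuration(identifier: str) -> dict:
    """Return the preset for the given identifier, or raise KeyError."""
    if identifier not in CONFIGURATIONS:
        raise KeyError(f"unknown configuration identifier: {identifier!r}")
    return CONFIGURATIONS[identifier]

cfg = get_configuration("blog-summarizer")
```

Because the code holds only the identifier string, the admin panel remains the single source of truth for the model and parameters behind it.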


The configuration list showing each entry's linked model, use-case type, and key parameters.

Adding a configuration manually 

  1. Navigate to Admin Tools > LLM > Configurations.
  2. Click Add Configuration.
  3. Fill in the required fields:

     Identifier: Unique slug for programmatic access (e.g., blog-summarizer).
     Name: Display name (e.g., Blog Post Summarizer).
     Model: The model to use.
     System Prompt: The system message that sets the AI's behavior and context.
  4. Optionally adjust temperature (0.0-2.0), top_p, frequency and presence penalties, max tokens, and the use-case type (chat, completion, embedding, or translation).
  5. Click Save.
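The required fields and parameter ranges from the steps above can be sketched as a validator. This is an illustrative assumption about what the form might check, not the product's actual validation code; the field names mirror the steps listed here:

```python
# Hypothetical sketch: range and presence checks matching the steps above.
VALID_USE_CASES = {"chat", "completion", "embedding", "translation"}
REQUIRED_FIELDS = ("identifier", "name", "model", "system_prompt")

def validate_configuration(cfg: dict) -> list:
    """Return a list of validation errors (empty when the preset is valid)."""
    errors = []
    for field in REQUIRED_FIELDS:
        if not cfg.get(field):
            errors.append(f"missing required field: {field}")
    # Temperature is documented as 0.0-2.0; other parameters are optional.
    temperature = cfg.get("temperature", 1.0)
    if not 0.0 <= temperature <= 2.0:
        errors.append("temperature must be between 0.0 and 2.0")
    if cfg.get("use_case", "chat") not in VALID_USE_CASES:
        errors.append("use_case must be one of: "
                      + ", ".join(sorted(VALID_USE_CASES)))
    return errors

errors = validate_configuration({
    "identifier": "blog-summarizer",
    "name": "Blog Post Summarizer",
    "model": "qwen3",
    "system_prompt": "Summarize blog posts.",
    "temperature": 0.7,
    "use_case": "chat",
})
```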

Testing a configuration 

Click Test Configuration on any row. The test sends a short prompt to the model and shows the response, model ID, and token usage.
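The test flow can be sketched as follows. Both `run_configuration_test` and the `send` callable are hypothetical stand-ins for whatever backend call the panel actually makes; only the reported fields (response, model ID, token usage) come from the description above:

```python
# Hypothetical sketch of what "Test Configuration" does: send one short
# prompt using the preset's model and system prompt, then report the
# response text, the model ID that answered, and the token usage.
def run_configuration_test(cfg: dict, send) -> dict:
    reply = send(
        model=cfg["model"],
        system=cfg["system_prompt"],
        prompt="Reply with a one-sentence greeting.",  # short probe prompt
    )
    return {
        "response": reply["text"],
        "model": reply["model"],
        "total_tokens": reply["usage"]["total_tokens"],
    }

# A fake backend so the sketch runs standalone; a real deployment would
# call the configured provider (e.g., Ollama) here instead.
def fake_send(model, system, prompt):
    return {"text": "Hello!", "model": model,
            "usage": {"total_tokens": 18}}

result = run_configuration_test(
    {"model": "qwen3", "system_prompt": "Be brief."}, fake_send)
```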

Configuration test modal showing a successful response from Qwen 3 via Ollama, with token count.

Editing configurations 

Click a configuration row to edit. Changes take effect immediately for any extension code that references this configuration's identifier — no code deployment needed.