Developer guide 

This guide covers technical details for developers integrating the LLM extension into their TYPO3 projects.

Core concepts 

Architecture overview 

The extension follows a layered architecture:

  1. Providers - Handle direct API communication.
  2. LlmServiceManager - Orchestrates providers and exposes a unified API.
  3. Feature services - High-level services for specific tasks.
  4. Domain models - Response objects and value types.
┌─────────────────────────────────────────┐
│         Your Application Code           │
└────────────────┬────────────────────────┘
                 │
┌────────────────▼────────────────────────┐
│         Feature Services                │
│  (Completion, Embedding, Vision, etc.)  │
└────────────────┬────────────────────────┘
                 │
┌────────────────▼────────────────────────┐
│         LlmServiceManager               │
│    (Provider selection & routing)       │
└────────────────┬────────────────────────┘
                 │
┌────────────────▼────────────────────────┐
│           Providers                     │
│    (OpenAI, Claude, Gemini, etc.)       │
└─────────────────────────────────────────┘

Dependency injection 

All services are available via dependency injection:

Example: Injecting LLM services
use Netresearch\NrLlm\Service\LlmServiceManager;
use Netresearch\NrLlm\Service\Feature\CompletionService;
use Netresearch\NrLlm\Service\Feature\EmbeddingService;
use Netresearch\NrLlm\Service\Feature\VisionService;
use Netresearch\NrLlm\Service\Feature\TranslationService;

class MyController
{
    public function __construct(
        private readonly LlmServiceManager $llmManager,
        private readonly CompletionService $completionService,
        private readonly EmbeddingService $embeddingService,
        private readonly VisionService $visionService,
        private readonly TranslationService $translationService,
    ) {}
}

Using LlmServiceManager 

Basic chat 

Example: Basic chat request
$messages = [
    ['role' => 'system', 'content' => 'You are a helpful assistant.'],
    ['role' => 'user', 'content' => 'What is TYPO3?'],
];

$response = $this->llmManager->chat($messages);

// Response properties
$content = $response->content;           // string
$model = $response->model;               // string
$finishReason = $response->finishReason; // string
$usage = $response->usage;               // UsageStatistics

Chat with options 

Example: Chat with configuration options
use Netresearch\NrLlm\Service\Option\ChatOptions;

// Using ChatOptions object
$options = ChatOptions::creative()
    ->withMaxTokens(2000)
    ->withSystemPrompt('You are a creative writer.');

$response = $this->llmManager->chat($messages, $options);

// Or using array
$response = $this->llmManager->chat($messages, [
    'provider' => 'claude',
    'model' => 'claude-sonnet-4-6',
    'temperature' => 1.2,
    'max_tokens' => 2000,
]);

Simple completion 

Example: Quick completion from a prompt
$response = $this->llmManager->complete('Explain recursion in programming');

Embeddings 

Example: Generating embeddings
// Single text
$response = $this->llmManager->embed('Hello, world!');
$vector = $response->getVector(); // array<float>

// Multiple texts
$response = $this->llmManager->embed(['Text 1', 'Text 2', 'Text 3']);
$vectors = $response->embeddings; // array<array<float>>

Response objects 

See the API reference for the complete response object documentation. Key classes:

  • CompletionResponse — content, model, usage, finishReason, toolCalls
  • EmbeddingResponse — embeddings, model, usage
  • UsageStatistics — promptTokens, completionTokens, totalTokens
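For instance, token usage can be read directly from the properties listed above. This is a minimal sketch building on the earlier chat example; it only uses the documented property names:

```php
$response = $this->llmManager->chat($messages);

// Inspect the UsageStatistics value object attached to the response
$usage = $response->usage;
$promptTokens = $usage->promptTokens;
$completionTokens = $usage->completionTokens;
$totalTokens = $usage->totalTokens; // promptTokens + completionTokens
```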

Error handling 

The extension throws specific exceptions:

Example: Error handling
use Netresearch\NrLlm\Provider\Exception\ProviderException;
use Netresearch\NrLlm\Provider\Exception\ProviderConfigurationException;
use Netresearch\NrLlm\Provider\Exception\ProviderConnectionException;
use Netresearch\NrLlm\Provider\Exception\ProviderResponseException;
use Netresearch\NrLlm\Provider\Exception\UnsupportedFeatureException;
use Netresearch\NrLlm\Exception\InvalidArgumentException;

try {
    $response = $this->llmManager->chat($messages);
} catch (ProviderConfigurationException $e) {
    // Invalid or missing provider configuration
} catch (ProviderConnectionException $e) {
    // Connection to provider failed
} catch (ProviderResponseException $e) {
    // Provider returned an error response
} catch (UnsupportedFeatureException $e) {
    // Requested feature not supported by provider
} catch (ProviderException $e) {
    // General provider error
} catch (InvalidArgumentException $e) {
    // Invalid parameters
}

Events 

Best practices 

  1. Use feature services for common tasks instead of raw LlmServiceManager.
  2. Enable caching for deterministic operations like embeddings.
  3. Handle errors gracefully with proper try-catch blocks.
  4. Sanitize input before sending to LLM providers.
  5. Validate output and treat LLM responses as untrusted.
  6. Use streaming for long responses to improve UX.
  7. Set reasonable timeouts based on expected response times.
  8. Monitor usage to control costs and prevent abuse.
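As an illustration of practices 3-5, the following sketch combines input sanitization, output escaping, and graceful error handling using the documented chat() API. The length limit of 4000 characters and the $userInput variable are illustrative assumptions, not part of the extension:

```php
use Netresearch\NrLlm\Provider\Exception\ProviderException;

try {
    // Practice 4: trim and cap user-supplied input before sending it
    // to the provider (4000 chars is an arbitrary example limit)
    $prompt = mb_substr(trim($userInput), 0, 4000);

    $response = $this->llmManager->chat([
        ['role' => 'user', 'content' => $prompt],
    ]);

    // Practice 5: treat model output as untrusted before rendering it
    $safeOutput = htmlspecialchars($response->content, ENT_QUOTES);
} catch (ProviderException $e) {
    // Practice 3: degrade gracefully instead of exposing provider errors
    $safeOutput = 'The assistant is currently unavailable.';
}
```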