Version 2.0
Unreal Engine 5.6+
Windows 64-bit
Fab Standard License

Plugin Overview

The LLMAI plugin provides the core AI integration capabilities for Unreal Engine 5 projects. It handles the low-level communication with both the OpenAI and LocalAI Realtime APIs for text and audio processing, and provides Blueprint-friendly components for easy integration.

What's Included

Core Components

Data Assets

Editor Integration

Quick Integration

Add to Any Actor

// Add component to any Actor
UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "AI")
class ULLMAIClientComponent* AIClient;

// In constructor
AIClient = CreateDefaultSubobject<ULLMAIClientComponent>(TEXT("AIClient"));

Basic Connection

// Connect to OpenAI (cloud)
AIClient->ConnectToAI(
    "OpenAI",
    "gpt-realtime", 
    "You are a helpful assistant", 
    {"text", "audio"}
);

// Or connect to LocalAI (local/offline)
AIClient->ConnectToAI(
    "LocalAI",
    "your-local-model", 
    "You are a helpful assistant", 
    {"text", "audio"}
);

Handle AI Responses

// Bind to events
AIClient->OnAITextResponseDelta.AddDynamic(this, &AMyActor::HandleAIResponse);
AIClient->OnAIFunctionCallRequested.AddDynamic(this, &AMyActor::HandleFunctionCall);

Plugin Structure

If you downloaded the full project, example demos and further documentation are available in the parent folders.

YourProject/
└── Plugins/
    └── LLMAI/                      # Plugin root folder
        ├── Config/
        │   └── DefaultGame.ini     # Plugin configuration
        ├── Content/                # Plugin content (currently empty)
        └── Source/
            ├── LLMAI/              # Runtime module
            │   ├── Public/         # Public headers
            │   └── Private/        # Implementation files
            └── LLMAIEditor/        # Editor module
                ├── Public/
                └── Private/

Plugin Dependencies

Required Plugin Dependencies

Supported Platforms

Runtime Module Dependencies (LLMAI)

Core Unreal Modules

Networking & Communication

Audio Processing

Platform-Specific Libraries

Windows (Win64):

Editor Module Dependencies (LLMAIEditor)

Core Editor Modules

User Interface

Settings Configuration

Access through Edit > Project Settings > Plugins > LLMAI:

Note: These are global project settings. For per-component instance settings (visible in the Details panel), see Component Instance Properties below.

General Settings

OpenAI Provider Settings (Project Settings > LLMAI > OpenAI Provider)

LocalAI Provider Settings (Project Settings > LLMAI > LocalAI Provider)

Note: LocalAI is available as a separate distribution. See LocalAI Quick Start Guide for download and setup.

Audio Settings

Advanced Options

Component Instance Properties

When you add a ULLMAIClientComponent to an Actor, these properties are configurable in the Details panel. These are per-instance settings that can vary between different components.

Voice Settings

| Property | Default | Range | Description |
| --- | --- | --- | --- |
| VoiceThreshold | 0.01 | 0.001 - 1.0 | Minimum audio level to detect speech. Lower = more sensitive. |
| MaxSilenceDuration | 1.5 | 0.5 - 5.0 sec | How long to wait after speech stops before committing audio. |
| bEnableClientsideVAD | true | — | Detect microphone input and interrupt server responses. Essential for interrupting local audio playback. |
| InterruptionThreshold | 0.01 | 0.001 - 0.1 | Audio level needed to trigger voice interruption. |
| bEnableAutoMicGating | false | — | Automatically mute the microphone when the AI is speaking to prevent feedback. |
| OutputAudioThreshold | 0.01 | 0.001 - 0.1 | Output audio level threshold for mic gating when interruption is disabled. |
| OutputAudioDecayRate | 0.95 | 0.8 - 0.999 | How quickly the output audio envelope decays. Higher = slower decay. |
| VoiceOutputHoldTimeSeconds | 2.0 | 0.0 - 3.0 sec | Hold time after AI speech ends before triggering OnVoiceOutputEnd. Covers natural speech pauses. |

Audio Settings

| Property | Default | Range | Description |
| --- | --- | --- | --- |
| AudioGainMultiplier | 1.0 | 0.1 - 10.0 | Amplify or reduce microphone input volume before sending to the AI. |
| bAutoCreateAudioStream | true | — | Automatically create an audio stream component for microphone capture. |
| bAutoCreateAudioPlayback | true | — | Automatically create an audio component for AI voice playback. |

Usage Examples

// In Blueprint: Set via Details panel or use nodes
// In C++: Modify properties before connecting

AIClient->VoiceThreshold = 0.02f;          // Less sensitive speech detection
AIClient->MaxSilenceDuration = 2.0f;       // Wait longer before committing
AIClient->bEnableAutoMicGating = true;     // Prevent feedback
AIClient->AudioGainMultiplier = 1.5f;      // Boost quiet microphones

ULLMAIAudioStreamComponent Properties

If using a custom ULLMAIAudioStreamComponent, these properties can be configured:

| Property | Default | Description |
| --- | --- | --- |
| AudioSourceMode | Microphone | Microphone - capture mic input to send to the AI. Loopback - capture AI voice output for MetaHuman lip-sync. |
| SampleRate | 24000 | Audio sample rate in Hz. 24000 is recommended for AI voice. |
| NumChannels | 1 | Number of audio channels (mono = 1, stereo = 2). |
| BitsPerSample | 16 | Audio bit depth. 16-bit PCM is standard for AI services. |
| StreamingChunkSize | 4096 | Size of audio chunks for streaming. Smaller = lower latency. |
| OutputFilePath | (empty) | Optional path to save captured audio to a WAV file for debugging. |
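For example, a loopback stream for MetaHuman lip-sync might be configured as in the sketch below. The property names come from the table above; `ELLMAIAudioSourceMode` is an assumed name for the enum backing AudioSourceMode — check the component's header for the exact type.

// In the owning Actor's constructor
ULLMAIAudioStreamComponent* LipSyncStream =
    CreateDefaultSubobject<ULLMAIAudioStreamComponent>(TEXT("LipSyncStream"));
LipSyncStream->AudioSourceMode = ELLMAIAudioSourceMode::Loopback; // capture AI voice output
LipSyncStream->SampleRate = 24000;          // recommended for AI voice
LipSyncStream->NumChannels = 1;             // mono
LipSyncStream->StreamingChunkSize = 2048;   // smaller chunks = lower latency
LipSyncStream->OutputFilePath = TEXT("");   // set a path to dump a WAV for debugging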

Function Calling System

Define AI Functions

Create function definition assets in the Content Browser:

  1. Right-click in Content Browser
  2. AI Functions > LLMAI Function Definition
  3. Configure function name, description, and parameters
  4. Register with AI client component

Define AI Function Profiles

A function profile is a set of AI Functions. This makes it easy to associate a set of functions with a particular purpose or connection.

  1. Select the AI functions that belong to the profile
  2. Register the profile with the AI client component
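In C++, registration might look like the following sketch. `FunctionDefinition`, `FunctionProfile`, `RegisterAIFunction`, and `RegisterAIFunctionProfile` are illustrative names only — check the client component's public header for the actual asset types and registration API.

// Hypothetical registration sketch — verify names against the plugin headers
UPROPERTY(EditAnywhere, Category = "AI")
ULLMAIFunctionDefinition* FunctionDefinition;   // single function asset

UPROPERTY(EditAnywhere, Category = "AI")
ULLMAIFunctionProfile* FunctionProfile;         // profile (set of functions)

// Before connecting:
AIClient->RegisterAIFunction(FunctionDefinition);
AIClient->RegisterAIFunctionProfile(FunctionProfile);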

Handle Function Calls

UFUNCTION()
void HandleAIFunctionCall(const FLLMFunctionCall& FunctionCall)
{
    if (FunctionCall.Name == "my_function")
    {
        // Extract parameters
        FString Param = ULLMAIBlueprintLibrary::GetAIFunctionStringParameter(
            FunctionCall.ArgumentsJson, "parameter_name", "default_value"
        );
        
        // Execute your logic
        FString Result = ExecuteMyFunction(Param);
        
        // Return result to AI
        AIClient->SendAIFunctionCallResult(FunctionCall.CallId, Result);
    }
}

Audio Integration

The client component automatically creates the needed audio components and starts and stops streaming as necessary. You only need to set up the audio components manually if you have a specific scenario or settings to account for, for example connecting the audio to a separate actor or dynamically choosing particular inputs and outputs.

Basic Audio Setup

// Setup voice capture (automatic microphone detection)
AIClient->SetupVoiceCapture();

// Setup audio playback (CreateDefaultSubobject must be called in the owning Actor's constructor)
UAudioComponent* AudioComp = CreateDefaultSubobject<UAudioComponent>(TEXT("AIAudio"));
AIClient->SetupAudioPlayback(AudioComp);

Advanced Audio (Custom Capture)

// Create audio stream component for custom capture
ULLMAIAudioStreamComponent* StreamComp = CreateDefaultSubobject<ULLMAIAudioStreamComponent>(TEXT("AudioStream"));
AIClient->SetAudioStreamComponent(StreamComp);

// Handle custom audio data
StreamComp->OnAudioDataReceived.AddDynamic(this, &AMyActor::HandleAudioData);

Event System

Connection Events

AIClient->OnConnected.AddDynamic(this, &AMyActor::OnAIConnected);
AIClient->OnDisconnected.AddDynamic(this, &AMyActor::OnAIDisconnected);
AIClient->OnError.AddDynamic(this, &AMyActor::OnAIError);

Communication Events

AIClient->OnAISessionReady.AddDynamic(this, &AMyActor::OnAIReady);
AIClient->OnAITextResponseDelta.AddDynamic(this, &AMyActor::OnAITextReceived);
AIClient->OnAIResponseComplete.AddDynamic(this, &AMyActor::OnAIResponseFinished);

Voice Events

AIClient->OnVoiceModeActivated.AddDynamic(this, &AMyActor::OnVoiceModeStarted);
AIClient->OnInputAudioTranscriptionDelta.AddDynamic(this, &AMyActor::OnTranscription);

Thread Safety

The plugin is designed for thread-safe operation.

Always check IsInGameThread() before UI updates in custom handlers.
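A typical pattern for marshaling work back to the game thread inside a handler is sketched below, using Unreal's standard AsyncTask helper from Async/Async.h. `UpdateSubtitleWidget` is a hypothetical UI helper standing in for your own game-thread-only code.

void AMyActor::HandleAIResponse(const FString& TextDelta)
{
    if (IsInGameThread())
    {
        UpdateSubtitleWidget(TextDelta); // safe: already on the game thread
    }
    else
    {
        // Marshal the UI update onto the game thread
        AsyncTask(ENamedThreads::GameThread, [this, TextDelta]()
        {
            UpdateSubtitleWidget(TextDelta);
        });
    }
}

In production code, prefer capturing a TWeakObjectPtr to the Actor instead of a raw `this`, so the task is safe even if the Actor is destroyed before it runs.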

Debugging and Troubleshooting

The plugin includes a comprehensive debug & logging system for troubleshooting issues. See Debug-Logging.htm for complete details.

Quick Debug Commands

# In Unreal Console (` key):
llmai.debug.EnableAll              # Turn on all debugging
llmai.debug.OpenAILevel 2          # Debug API communication
llmai.debug.AudioLevel 2           # Debug voice issues
llmai.debug.FunctionLevel 2        # Debug function calling

Common Issues

Log Categories

Monitor these log categories in Output Log:

Full Debug & Logging Guide: See Debug-Logging.htm for complete troubleshooting commands and techniques.

Installation & Setup

For plugin installation, compilation, and initial configuration, see Installation.htm.

Changelog

📦 Version 2.0 — January 2, 2026

📦 Version 1.0 — September 9, 2025

Support




Engineered by