🚀 Quick Start Guide
Get up and running with AI conversations in your Unreal Engine project in under 5 minutes!
Prerequisites
Platform
Windows 64-bit only (Windows 10/11 x64 required)
*Other platforms coming soon
Step 1: Install the Plugin
Quick Install:
- Extract the LLMAI folder to YourProject/Plugins/LLMAI/
- Enable Plugin: Edit > Plugins > Search "LLMAI" > Check Enabled
- Restart the editor
For detailed installation steps and troubleshooting, see Plugin Installation Guide
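If you prefer enabling the plugin outside the editor, standard Unreal project descriptors list plugins explicitly. A minimal sketch of the relevant .uproject entry (assuming the plugin's descriptor name is "LLMAI", matching the folder above):

```json
{
  "Plugins": [
    {
      "Name": "LLMAI",
      "Enabled": true
    }
  ]
}
```

This is equivalent to checking Enabled in the Plugins window; the editor writes the same entry when you enable the plugin there.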
Step 2: Configure AI Service
Option A: OpenAI (Cloud)
- Go to Edit > Project Settings
- Navigate to Plugins > LLMAI
- Enter your OpenAI API Key
- Optional: Set your preferred model and voice
Option B: LocalAI (Offline)
For local/offline AI processing, use the LLMAI LocalAI distribution (available separately). No API key required.
See LocalAI Quick Start Guide for download and setup.
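Project Settings are saved to your project's config files, so you can verify what was stored by opening Config/DefaultGame.ini after saving once. A purely illustrative sketch (the section and key names below are assumptions, not the plugin's documented names - trust whatever the editor actually writes):

```ini
; Illustrative only - use the real section/key names the editor
; writes to DefaultGame.ini after you save the LLMAI settings
[/Script/LLMAI.LLMAISettings]
OpenAIAPIKey=sk-...
Model=gpt-realtime
Voice=alloy
```

Avoid committing a real API key to source control; keep it in a local, ignored config file if your team shares the project.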
Step 3: Try the Demos
LLMAI comes with two demo projects. Both use the same Terminal interface but showcase different AI capabilities:
🎮 LLMAI_Arcade (Rocketeer Game)
- Open Level: Content/LLMAITerminal/Levels/LVL_LLMAITerminal
- Click Play (or press Alt+P) and connect to AI
- Enable Voice Mode and say "Launch Rocketeer"
- Control your rocket: arrow keys to move, Spacebar to fire
- Try AI commands: "Make it faster", "Give me an extra life", "Spawn some enemies"
- Advance stages: Ask AI to set difficulty and progress
Learn More: Rocketeer Documentation
🎭 LLMAI_LiveLink (MetaHuman Character)
- Open Level: Content/LLMAITerminal/Levels/LVL_LLMAITerminal
- Click Play (or press Alt+P) and connect to AI
- Enable Voice Mode to see lip-sync in action
- Try gesture commands: "Wave hello", "Give me a thumbs up", "Nod yes"
- Try movement: "Move to the left", "Go to the right", "Jump!"
- Have a conversation and watch natural lip-sync
Learn More: LiveLink Demo Details
Step 4: Add AI to Your Project
Blueprint Method (Easy)
- Open any Blueprint Actor
- Add Component: Search for "LLMAI Client Component"
- In Begin Play:
- Use Connect to AI node
- Set Instructions:
You are a helpful assistant
- Set Modalities:
text, audio
- Send Text to AI:
- Use the Send Text to AI node to trigger a conversation
- Connect a string input (e.g., "Hello, how are you?")
- Sending text automatically triggers an AI response (no additional trigger needed)
- Call it from user input, a button press, or any game event
- Handle Responses:
- Easiest Method: Bind to On AI Text Response Complete event
- This provides the full response when complete (recommended for most use cases)
- Streaming Method (Optional): For real-time streaming text:
- Bind to On AI Text Response Start event (response begins)
- Bind to On AI Text Response Delta event (each text chunk)
- Bind to On AI Text Response End event (response complete)
- Print the response or update your UI
C++ Method (Advanced)
// In your Actor's header file
// Note: handlers bound with AddDynamic (e.g. HandleAIResponse) must be declared as UFUNCTION()
UPROPERTY(VisibleAnywhere, BlueprintReadOnly, Category = "AI")
class ULLMAIClientComponent* AIClient;

// In the constructor
AIClient = CreateDefaultSubobject<ULLMAIClientComponent>(TEXT("AIClient"));

// In BeginPlay
AIClient->OnAITextResponseComplete.AddDynamic(this, &AMyActor::HandleAIResponse);

// Connect to AI (OpenAI example)
TArray<FString> Modalities = { "text", "audio" };
AIClient->ConnectToAI(
    "OpenAI",                      // Provider: "OpenAI" or "LocalAI"
    "gpt-realtime",                // Model
    "You are a helpful assistant", // Instructions
    Modalities                     // Modalities
);

// Send text to AI (trigger this from user input, a button press, etc.)
// This automatically triggers an AI response - no additional call needed
AIClient->SendTextToAI("Hello, how are you?");
Step 5: Enable Voice (Optional)
Start Voice Mode
// In Blueprint: use "Start Voice Mode" node with voice name "alloy"
// In C++:
AIClient->StartVoiceMode("alloy");
// Audio capture and streaming are handled automatically!
Note:
- StartVoiceMode() handles audio setup automatically - no manual configuration required
- Audio streaming starts automatically when voice mode activates
🔧 Step 6: Add AI Functions (Optional)
Let the AI call functions in your game!
Create Function Definition
- Right-click in Content Browser
- Miscellaneous > LLMAI Function Definition
- Name: "spawn_enemy"
- Description: "Spawn an enemy in the game"
- Add Parameters as needed
Register and Handle
// Register the function
AIClient->RegisterAIFunction(MyFunctionDefinition);
// Handle AI function calls (HandleFunctionCall must be declared as a UFUNCTION())
AIClient->OnAIFunctionCallRequested.AddDynamic(this, &AMyActor::HandleFunctionCall);

// In your handler
void AMyActor::HandleFunctionCall(const FLLMFunctionCall& FunctionCall)
{
    if (FunctionCall.Name == TEXT("spawn_enemy"))
    {
        // Your game logic here
        SpawnEnemy();

        // Tell the AI it worked
        AIClient->SendAIFunctionCallResult(FunctionCall.CallId, "Enemy spawned successfully!");
    }
}
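Under the hood, realtime AI providers typically receive function definitions as a JSON schema. A sketch of what a spawn_enemy definition might look like in OpenAI's tool format (assuming the asset above serializes to this shape; the "count" parameter is an illustrative example, not part of the demo):

```json
{
  "type": "function",
  "name": "spawn_enemy",
  "description": "Spawn an enemy in the game",
  "parameters": {
    "type": "object",
    "properties": {
      "count": {
        "type": "integer",
        "description": "How many enemies to spawn"
      }
    },
    "required": ["count"]
  }
}
```

Clear names and descriptions matter here: the model decides when to call your function based entirely on this schema.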
What to Try Next
💬 Basic Conversations
- "Hello, how are you?" - Test basic chat
- "What can you help me with?" - Explore AI capabilities
- "Tell me about this game" - Context-aware responses
🎤 Voice Features
- Enable voice mode and have natural conversations
- Try different voices: Known available voices are listed in DefaultGame.ini
- Test interruption: Talk while AI is responding
🔧 Function Calling
- Ask AI to control your game: "Spawn 3 enemies"
- Complex requests: "Make the game harder by adding obstacles"
- Context-aware functions: "If I'm doing well, increase difficulty"
Troubleshooting
"Plugin not found"
- ✅ Check that the LLMAI folder is in Plugins/LLMAI/
- ✅ Verify the LLMAI.uplugin file exists
- ✅ Restart Unreal Engine editor
"API key invalid" or connection issues
- ✅ OpenAI: Verify your API key is correct and has credits
- ✅ OpenAI: Ensure no extra spaces in the key
- ✅ LocalAI: Verify LocalAI server is running
- ✅ LocalAI: Check the endpoint URL in settings
"Connection failed"
- ✅ Check internet connection
- ✅ Verify firewall isn't blocking WebSocket connections
- ✅ Try a different network if on corporate/school WiFi
- ✅ Debug command: llmai.debug.WebSocketLevel 2 (in console)
"No voice input"
- ✅ Check microphone permissions in Windows
- ✅ Test microphone in other applications
- ✅ If using an external microphone, try unplugging and reconnecting it
- ✅ Debug command: llmai.debug.AudioLevel 2 (in console)
"Audio not playing"
- ✅ Check audio output device
- ✅ Adjust system volume
- ✅ Test with different audio device
- ✅ Debug command: llmai.debug.SaveAudioFiles 1 (to analyze audio data)
Advanced Troubleshooting
For complex issues, LLMAI includes a comprehensive debug system:
# Open console (` key) and use these commands:
llmai.debug.EnableAll # Turn on all debugging
llmai.debug.OpenAILevel 2 # See API communication details
llmai.debug.FunctionLevel 2 # Debug function calling
Full Debug Guide: Debug & Logging Documentation
Next Steps
Once you've got the basics working:
- Read the Plugin Installation Guide for advanced setup options
- Explore the Rocketeer Game Demo Documentation for detailed examples of function calling in a game
- Explore the Terminal Demo Documentation for detailed examples of text and voice communication
- Study the API Reference for complete functionality
- Review the Plugin Documentation for technical details
🎉 Congratulations!
You now have AI conversations working in your Unreal Engine project!
The AI can:
- Chat naturally about any topic
- Understand context about your game
- Control game systems through function calls
- Communicate by voice for hands-free interaction
Ready to build something amazing? The AI is waiting to help! 🤖✨