LLMAI Demo Guide
LLMAI ships with two demonstration projects. Both feature the same real-time AI chat interface with text and voice communication; each showcases a different AI function calling integration to demonstrate the versatility of the LLMAI plugin.
LLMAI_Arcade — AI controls a retro arcade game called Rocketeer
LLMAI_LiveLink — AI controls a MetaHuman character with lip-sync and gestures
💬 Real-time AI Chat | 🎤 Voice Communication | 🔧 AI Function Calling | 🎮 Two Demo Showcases
🔑 Required Setup (Both Demos)
⚠️ Important: API Configuration Required
Both demos require either an OpenAI API key or the LLMAI LocalAI distribution (available separately) to function.
Option A: OpenAI API Key
To use OpenAI's cloud service, configure your API key:
Method 1: Project Settings (Recommended)
- Open Edit > Project Settings
- Navigate to Plugins > LLMAI
- Enter your API key in the Default OpenAI API Key field
Method 2: Environment Variable
```
# Windows Command Prompt
set OPENAI_API_KEY=your-api-key-here

# Windows PowerShell
$env:OPENAI_API_KEY="your-api-key-here"
```
Getting an API Key: Visit OpenAI's API Keys page to create your API key.
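If you use the environment-variable method, set the variable before launching the editor, since the process reads its environment at startup. As a quick sanity check, the hypothetical helper below (not part of the LLMAI plugin) uses Unreal's standard FPlatformMisc API to confirm the variable is visible to the engine process.

```cpp
// Hypothetical helper, not part of the LLMAI plugin: checks whether the
// OPENAI_API_KEY environment variable is visible to the running engine process.
#include "CoreMinimal.h"
#include "HAL/PlatformMisc.h"

static bool HasOpenAIKey()
{
    // GetEnvironmentVariable returns an empty FString if the variable is not set.
    const FString ApiKey = FPlatformMisc::GetEnvironmentVariable(TEXT("OPENAI_API_KEY"));
    if (ApiKey.IsEmpty())
    {
        UE_LOG(LogTemp, Warning, TEXT("OPENAI_API_KEY is not set for this process."));
        return false;
    }
    return true;
}
```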
Option B: LocalAI (Offline/Local Processing)
For offline operation or local AI processing, use the LLMAI LocalAI distribution, available separately. LocalAI provides:
- No API key required
- Offline operation capability
- Local TTS voice synthesis
See LocalAI TTS Configuration for setup details.
📦 Demo Projects Overview
🎮 LLMAI_Arcade
Rocketeer Game Integration
A retro-style arcade shooter where AI controls the game through natural language commands. Demonstrates comprehensive function calling with 17+ AI-callable functions.
AI Can Control:
- Game start/stop, stage progression, difficulty
- Player lives and weapon upgrades
- Enemy spawning with custom flight paths
- Game speed and UI visibility
- Timed challenges with AI callbacks
Learn More: Rocketeer Game Documentation
🎭 LLMAI_LiveLink
MetaHuman Character Control
A MetaHuman character (Cooper) with real-time lip-sync driven by AI voice output. Demonstrates character animation control through AI function calling.
AI Can Control:
- Lip-sync animations driven by TTS voice
- 16 expressive character gestures
- Character movement to 3 waypoints (left, middle, right)
- Jump action with full character animation
Learn More: LiveLink Demo Details
✨ Shared Features (Both Demos)
Both demo projects use the same LLMAI Terminal interface, providing:
- Real-time Text Chat: Professional chat interface with streaming AI responses
- Voice Communication: Toggle voice mode for natural conversations with AI
- Voice Activity Detection: Automatic speech detection - just start talking
- Natural Interruption: Speak while AI is responding to interrupt
- Connection Management: Easy connect/disconnect with model and voice selection
- Debug Logging: Toggle log panel for system information
Terminal Reference: Terminal Demo Documentation
🚀 Getting Started
Quick Start Steps (Both Demos)
1. Configure AI: Set up your OpenAI API key or LocalAI (see above)
2. Open the Level: Navigate to Content/LLMAITerminal/Levels/LVL_LLMAITerminal
3. Play the Level: Click Play in Editor or press Alt+P
4. Connect to AI: Use the connection dialog to select a model and voice
5. Start Chatting: Use text or enable voice mode for natural conversation
6. Try the Features: Use natural language to control the demo's features
🎮 LLMAI_Arcade Demo Details
Rocketeer Game Integration
The Rocketeer demo showcases AI function calling in a game context. The AI can start, control, and modify the game through natural conversation.
What to Try
- "Launch Rocketeer" — Start the arcade game
- "Make the game faster" — Adjust game speed
- "Give me an extra life" — Modify player lives
- "Spawn some enemies" — Trigger enemy waves
- "Set difficulty to hard" — Change difficulty level
- "What controls do I have?" — Get game controls
- "Create a zigzag flight path" — Design enemy patterns
Game Controls
- Arrow Keys: Move rocket (left, right, up, down)
- Spacebar: Fire weapons
Complete Function Reference: Rocketeer Game Documentation
🎭 LLMAI_LiveLink Demo Details
MetaHuman Character Control
The LiveLink demo showcases AI-driven character animation with a MetaHuman (Cooper). The AI's voice output drives real-time lip-sync, and function calling enables expressive gestures and movement.
Lip-Sync Feature
When voice mode is enabled, the AI's TTS (Text-to-Speech) output automatically drives the MetaHuman's lip-sync animations through LiveLink, creating natural-looking speech animation.
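The plugin performs this audio-to-animation mapping internally through LiveLink. Purely to illustrate the general idea of deriving a mouth-open value from TTS audio, here is a hypothetical amplitude-envelope sketch; it is not the plugin's implementation, and it assumes the audio arrives as float PCM samples.

```cpp
// Hypothetical illustration only: derive a 0..1 "mouth open" value from a
// chunk of float PCM samples. The LLMAI plugin performs its own mapping
// through LiveLink; this sketch is not its implementation.
#include "CoreMinimal.h"

float ComputeMouthOpen(const TArray<float>& PcmSamples, float PreviousValue,
                       float Smoothing = 0.8f)
{
    if (PcmSamples.Num() == 0)
    {
        return PreviousValue * Smoothing; // Decay toward closed when silent.
    }

    // Root-mean-square amplitude of the current audio chunk.
    float SumSquares = 0.f;
    for (float Sample : PcmSamples)
    {
        SumSquares += Sample * Sample;
    }
    const float Rms = FMath::Sqrt(SumSquares / PcmSamples.Num());

    // Scale roughly into 0..1 and smooth against the previous frame's value
    // (higher Smoothing means a slower, steadier mouth movement).
    const float Target = FMath::Clamp(Rms * 4.f, 0.f, 1.f);
    return FMath::Lerp(Target, PreviousValue, Smoothing);
}
```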
What to Try
- "Wave hello" — Trigger wave gesture
- "Give me a thumbs up" — Positive gesture
- "Shake your head no" — NodNo gesture
- "Move to the left" — Character walks to left waypoint
- "Go to the middle" — Character returns to center
- "Jump!" — Character performs jump animation
- "Do a little dance" — Dance gesture
- "Show me you're not sure" — Uncertain shrug gesture
Available Gestures
The AI can trigger 16 different character gestures through natural language:
- Articulate — Expressive hand gestures while explaining
- BringItOn — Taunting "bring it on" motion
- Dance — Dancing movement
- Dismissive — Dismissive wave gesture
- LookAtMyOwnBody — Character looks down at body
- LookDown — Looking downward
- LookUp — Looking upward
- NodNo — Shaking head no
- NodYes — Nodding yes
- NotSure — Uncertain shrug
- PlayfulPoint — Playful pointing
- Surprised — Surprised reaction
- ThumbsDown — Thumbs down
- ThumbsUp — Thumbs up
- Wave — Waving hello/goodbye
- WaveFingerNo — Finger wagging "no-no"
Movement Functions
The character can move between three waypoints using natural navigation:
AIF_Move — Waypoint Movement
- Left: Character walks to the left position
- Middle: Character walks to the center position
- Right: Character walks to the right position
Movement uses the built-in navigation system with the Third Person character's movement animations, and the character turns to face the camera on arrival.
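The demo's movement logic lives in the AIF_Move function asset. As a rough sketch of the underlying engine calls, the hypothetical helper below uses Unreal's standard UAIBlueprintHelperLibrary::SimpleMoveToLocation, which requires the AIModule dependency and a NavMesh in the level (note the NavMesh requirement also appears in the troubleshooting section).

```cpp
// Hypothetical sketch of waypoint movement using standard Unreal navigation.
// The demo's actual logic is authored in the AIF_Move function asset.
// Requires the "AIModule" module dependency and a NavMesh in the level.
#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "Blueprint/AIBlueprintHelperLibrary.h"

void MoveCharacterToWaypoint(ACharacter* Character, const FVector& WaypointLocation)
{
    AController* Controller = Character ? Character->GetController() : nullptr;
    if (Controller == nullptr)
    {
        return;
    }
    // Walks the character along the NavMesh to the target waypoint.
    UAIBlueprintHelperLibrary::SimpleMoveToLocation(Controller, WaypointLocation);
}
```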
AIF_Jump — Jump Action
Triggers the character's jump using the Third Person character's built-in jump functionality and its Animation Blueprint state machine for proper jump physics and animation blending.
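At the engine level this maps onto the standard ACharacter jump support; a hypothetical wrapper might look like the following (the actual trigger is the AIF_Jump function asset).

```cpp
// Hypothetical sketch: the Third Person character's built-in jump handles the
// physics, and its Animation Blueprint state machine handles the blending.
#include "GameFramework/Character.h"

void TriggerJump(ACharacter* Character)
{
    if (Character && Character->CanJump())
    {
        Character->Jump();
    }
}
```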
Technical Implementation
The LiveLink demo uses these AI Function Definition assets:
- AIF_Gesture — Gesture triggering function
- AIF_Move — Waypoint movement function
- AIF_Jump — Jump action function
- AIFP_Character — Character function profile combining all functions
Location: Content/LLMAITerminal/Character/Blueprints/AIFunctions/
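These assets are authored in the editor, but at the API level each one corresponds to an OpenAI-style function schema: a name, a description, and a JSON Schema describing its parameters. Purely as an illustration, the sketch below assembles such a schema with Unreal's JSON utilities; the function name play_gesture and its fields are hypothetical, not the demo's exact definitions.

```cpp
// Hypothetical illustration of an OpenAI-style function schema, built with
// Unreal's JSON utilities. The demo's real definitions live in the AIF_* assets
// listed above. Requires the "Json" module dependency.
#include "Dom/JsonObject.h"
#include "Serialization/JsonSerializer.h"
#include "Serialization/JsonWriter.h"

FString BuildExampleGestureSchema()
{
    // Parameters object: a single string property naming the gesture to play.
    TSharedRef<FJsonObject> GestureProp = MakeShared<FJsonObject>();
    GestureProp->SetStringField(TEXT("type"), TEXT("string"));
    GestureProp->SetStringField(TEXT("description"), TEXT("Gesture to play, e.g. Wave or ThumbsUp"));

    TSharedRef<FJsonObject> Properties = MakeShared<FJsonObject>();
    Properties->SetObjectField(TEXT("gesture"), GestureProp);

    TSharedRef<FJsonObject> Parameters = MakeShared<FJsonObject>();
    Parameters->SetStringField(TEXT("type"), TEXT("object"));
    Parameters->SetObjectField(TEXT("properties"), Properties);

    // Top-level function definition the model can call.
    TSharedRef<FJsonObject> Function = MakeShared<FJsonObject>();
    Function->SetStringField(TEXT("name"), TEXT("play_gesture"));
    Function->SetStringField(TEXT("description"), TEXT("Plays a character gesture animation"));
    Function->SetObjectField(TEXT("parameters"), Parameters);

    FString Output;
    TSharedRef<TJsonWriter<>> Writer = TJsonWriterFactory<>::Create(&Output);
    FJsonSerializer::Serialize(Function, Writer);
    return Output;
}
```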
🆘 Troubleshooting
Common Issues
Connection Problems
- API/Connection Error: Ensure your OpenAI API key is correctly configured or that LocalAI is running
- Network Issues: Check your internet connection and firewall settings
- Model Availability: Try selecting a different model if connection fails
Audio Issues
- Voice Mode Not Working: Check microphone permissions in Windows
- No AI Voice Output: Verify speakers/headphones and system audio
LiveLink Demo Issues
- No Lip-Sync: Ensure voice mode is enabled for TTS output
- Character Not Moving: Verify NavMesh is properly configured in the level
Arcade Demo Issues
- Game Not Starting: Try saying "Launch Rocketeer" or "Start the game"
- Functions Not Working: Ensure the Rocketeer function profile is enabled in the connection dialog
Need More Help? See the FAQ or Debug & Logging Documentation.
📚 Additional Documentation
🎯 Experience the power of AI integration with real-time conversation and intelligent control!