❓ Frequently Asked Questions


General Questions

What is LLMAI?

LLMAI is a comprehensive AI integration plugin for Unreal Engine 5 that enables real-time text and voice conversations with AI assistants, plus the ability for AI to call functions and control game systems.

What versions of Unreal Engine are supported?

LLMAI is built for Unreal Engine 5; check the plugin's fab.com listing for the exact engine versions supported by your release.

What platforms are supported?

✅ Currently Supported
Windows 64-bit only (Windows 10/11 x64)
Requires Visual Studio 2022 and Windows SDK
🔄 Planned
Mac and Linux support planned for future releases
❌ Not Supported
Mobile platforms, 32-bit Windows, ARM processors

Why Windows only? The plugin relies on platform-specific audio libraries for real-time voice processing.

Technical Details: See Plugin Dependencies for complete dependency information.

Do I need an OpenAI account?

You have two options:

  1. OpenAI (cloud): Create an OpenAI account and API key for cloud-based AI.
  2. LocalAI (offline): No OpenAI account required; run AI locally instead.

Setup and Installation

How do I get an OpenAI API key?

Note: An OpenAI API key is only required if using OpenAI's cloud service. For local/offline operation, use LocalAI instead.

  1. Go to platform.openai.com
  2. Sign up or log in
  3. Navigate to API keys section
  4. Create a new API key
  5. Copy the key to LLMAI plugin settings

Where do I put my API key?

For OpenAI: The plugin checks for your OpenAI API key in multiple locations (in this order):

  1. Project Settings (Recommended):
    • Go to Edit > Project Settings > Plugins > LLMAI
    • Enter your key in "Default OpenAI API Key" field
  2. Environment Variables:
    • Set LLMAI_OPENAI_API_KEY=your_key_here in your system environment
    • Or use the standard OPENAI_API_KEY=your_key_here
  3. Command Line Parameter:
    • Launch with -OpenAIKey=your_key_here parameter
    • Useful for automated builds or CI/CD pipelines

Note: The plugin tries these methods in order and uses the first valid key it finds. Project Settings is recommended for most users.

For LocalAI: No API key is required. Configure the LocalAI endpoint URL in Project Settings instead. See LocalAI Quick Start Guide for download and setup.

Can I use other AI services besides OpenAI?

Yes! LLMAI currently supports two providers: OpenAI (cloud-based) and LocalAI (local/offline operation).

Support for additional providers (Anthropic, Gemini, Grok) is planned for future releases.

Technical Details: The plugin includes provider-agnostic functions (ConnectToAI, SendTextToAI) designed for multi-provider support. See API Documentation for the provider architecture.

The plugin doesn't appear in my plugins list

  1. Verify the plugin folder is in your project's Plugins directory
  2. Enable the plugin under Edit > Plugins and restart the editor
  3. For C++ projects, regenerate project files and rebuild

Usage and Features

What AI models can I use?

The plugin is optimized for OpenAI's Realtime API. For the most current list of supported models, see Plugins/LLMAI/Config/DefaultGame.ini which contains the latest KnownRealtimeModels list.

As of this version, supported models include:

What voices are available?

For the most current list of available voices, see Plugins/LLMAI/Config/DefaultGame.ini which contains the latest KnownOpenAIRealtimeVoices list.

Current voices include:

alloy - Neutral, clear voice (default)
ash - Mature, conversational
ballad - Calm, melodic
coral - Warm, friendly
echo - Warm, upbeat voice
sage - Thoughtful, wise
shimmer - Soft, gentle voice
verse - Creative, expressive

Note: OpenAI periodically updates available voices. The configuration file always contains the most current list.

Can the AI control my game?

Yes! The plugin provides function calling capabilities that let AI interact with your game systems.

See the Rocketeer Game Demo (LLMAI_Arcade) for a complete example of the AI calling registered functions to control game systems.

Technical Details: See API Documentation for function definition syntax and implementation patterns.

What is the LLMAI_LiveLink demo?

The LLMAI_LiveLink demo showcases AI-driven character animation with a MetaHuman (Cooper), including real-time lip-sync and AI-triggered gestures.

Learn More: See LiveLink Demo Details.

How does lip-sync work in the LiveLink demo?

The lip-sync feature works automatically when voice mode is enabled:

  1. AI generates voice response via TTS (Text-to-Speech)
  2. Audio is streamed through LiveLink to the MetaHuman
  3. MetaHuman's facial animation system drives lip movements in real-time

No additional configuration is needed; simply enable voice mode and lip-sync activates automatically.

What gestures are available in the LiveLink demo?

The AI can trigger 16 different character gestures through natural language:

Articulate - Expressive hand gestures
BringItOn - Taunting gesture
Dance - Dancing movement
Dismissive - Dismissive wave
LookAtMyOwnBody - Looks at body
LookDown - Looks downward
LookUp - Looks upward
NodNo - Shaking head no
NodYes - Nodding yes
NotSure - Uncertain shrug
PlayfulPoint - Playful pointing
Surprised - Surprised reaction
ThumbsDown - Thumbs down
ThumbsUp - Thumbs up
Wave - Waving
WaveFingerNo - Finger wagging

Try natural commands like "Wave hello", "Give me a thumbs up", or "Show me you're not sure".

Is voice communication real-time?

Yes, the plugin uses OpenAI's Realtime API for low-latency voice conversations. You can speak naturally and the AI responds with both voice and text.

Demo: Try the LLMAITerminal Demo to experience real-time voice conversations in action.

Technical Questions

Can I use this in C++ projects?

Yes, LLMAI works in both Blueprint and C++ projects. All functionality is available through both interfaces.

How do I handle API costs?

  1. Monitor usage and set spending limits in your OpenAI account dashboard
  2. Use LocalAI during development and testing to avoid cloud API charges

Does this work offline?

OpenAI: Requires an internet connection to communicate with OpenAI's servers.

LocalAI: Yes! The LLMAI LocalAI distribution (available separately) supports offline operation with local AI processing and TTS voice synthesis.

Performance and Optimization

How much bandwidth does voice mode use?

Voice mode uses approximately:

Troubleshooting

Voice input isn't working

  1. Check microphone permissions in Windows
  2. Test microphone in other applications
  3. Verify audio drivers are up to date
  4. Try different microphone if using external device
  5. Check Windows privacy settings for microphone access

Connection keeps dropping

  1. Check firewall settings - ensure WebSocket connections allowed
  2. Try different network if on corporate/restricted WiFi
  3. Check proxy settings if behind corporate proxy
  4. Verify stable internet connection
  5. Enable auto-reconnect in plugin settings

Function calls aren't working

  1. Check function definitions for correct syntax
  2. Verify function registration in logs
  3. Test with simple functions first
  4. Check AI instructions mention function availability
  5. Ensure proper parameter types are defined

Development Questions

Can I modify the plugin source code?

Yes, if you need to modify the plugin:

  1. Ensure your project is a C++ project
  2. Place the plugin in your project's Plugins folder
  3. Regenerate project files
  4. Build the project in Visual Studio

Technical Details: See Plugin Documentation for development setup and architecture overview.

How do I add custom audio processing?

Use the ULLMAIAudioStreamComponent for custom audio capture and processing.

Implementation Examples: See API Documentation for audio processing patterns and component usage details.

Can I create custom AI personalities?

Yes, by modifying the instructions parameter when connecting to AI. You can create different personalities for different game characters or contexts.

Examples and Syntax: See API Documentation for instruction parameter details and personality examples.

How do I debug and troubleshoot issues?

LLMAI includes a comprehensive debug system:

  1. Enable debug logging in the console:
    llmai.debug.EnableAll              # Turn on all debugging
    llmai.debug.OpenAILevel 2          # Debug API issues  
    llmai.debug.AudioLevel 2           # Debug voice problems
  2. Check specific issues:
    • Connection problems: llmai.debug.WebSocketLevel 2
    • Voice not working: llmai.debug.SaveAudioFiles 1
    • Function calls failing: llmai.debug.FunctionLevel 2
  3. Find debug output:
    • Output Log: Window > Developer Tools > Output Log
    • Log Files: YourProject/Saved/Logs/
    • Debug Files: YourProject/Saved/Logs/LLMAI/

Complete Guide: See Debug & Logging Documentation for detailed troubleshooting.

How do I handle errors gracefully?

Bind to error events:

Event Details: See API Documentation for complete event signatures and error handling patterns.

License and Distribution

What license is LLMAI under?

LLMAI is distributed under the Fab Standard License when available on fab.com. This license allows for commercial and non-commercial use in your projects. See the Fab EULA for complete terms.

Can I distribute games with LLMAI?

Yes, you can distribute games that include the LLMAI plugin. Handle API key security carefully: never ship a hard-coded personal API key in a distributed build, since embedded keys can be extracted and abused.

Do I need to credit LLMAI?

While not required by the Fab Standard License, attribution is appreciated. Consider mentioning LLMAI plugin in your game's credits or documentation.

Future Development

What features are coming next?

Planned features include support for additional AI providers (Anthropic, Gemini, Grok) and Mac/Linux platform support.