
UnrealGenAISupport

Built by prajwalshettydev · 502 stars

What is UnrealGenAISupport?

Unreal Engine plugin for LLM/GenAI model APIs and an MCP server for UE5. It includes OpenAI's GPT, DeepSeek, Claude Sonnet/Opus, Google Gemini 3, Alibaba Qwen, Kimi, and Grok 4.1 APIs, with plans to add TTS, ElevenLabs, Inworld, OpenRouter, Groq, GLM, Seedream, Hunyuan3D, fal, DashScope, Rodin, Meshy, Tripo, and UnrealClaude soon. UnrealMCP is also here!

How to use UnrealGenAISupport?

1. Install a compatible MCP client (such as Claude Desktop).
2. Open your configuration settings.
3. Add UnrealGenAISupport using the following command: npx @modelcontextprotocol/unrealgenaisupport
4. Restart the client and verify that the new tools are active.
🛡️ Scoped (Restricted)
npx @modelcontextprotocol/unrealgenaisupport --scope restricted
🔓 Unrestricted Access
npx @modelcontextprotocol/unrealgenaisupport

Key Features

Native MCP Protocol Support
Real-time Tool Activation & Execution
Verified High-performance Implementation
Secure Resource & Context Handling

Optimized Use Cases

Extending AI models with custom local capabilities
Automating system workflows via natural language
Connecting external data sources to LLM context windows

UnrealGenAISupport FAQ

Q: Is UnrealGenAISupport safe?

Yes. UnrealGenAISupport follows the standardized Model Context Protocol security patterns and only executes tools with explicit, user-granted permissions.

Q: Is UnrealGenAISupport up to date?

UnrealGenAISupport is currently active in the registry, with 502 stars on GitHub indicating its reliability and community support.

Q: Are there any limits for UnrealGenAISupport?

Usage limits depend on the specific MCP server implementation and your system resources. Refer to the official documentation below for technical details.

Official Documentation

View on GitHub


Usage Examples:

MCP Example:

Claude spawning scene objects and controlling their transforms and materials, generating blueprints, functions, and variables, adding components, running Python scripts, etc.

<img src="Docs/UnrealMcpDemo.gif" width="480"/>

API Example:

A project called "Become Human", where NPCs are OpenAI agentic instances, built using this plugin: Become Human

Unreal Engine Generative AI Support Plugin:

Every month, hundreds of new AI models are released by various organizations, making it hard to keep up with the latest advancements.

The "Unreal Engine Generative AI Support Plugin" allows you to focus on game development without worrying about the LLM/GenAI integration layer.

Current Progress:

This plugin will continue to get updates with the latest features and models. Contributions are welcome. For production-ready alternatives with 200+ AI models, guaranteed stability, automated testing, and UE 5.1-5.7+ support, check out the pro plugins below. This free plugin covers many use cases (including the examples above) and you can use it for free, forever.

Pro Plugins on Fab.com

<table> <tr> <td align="center" width="50%"> <a href="https://muddyterrain.com/t/genai-fab?utm_source=github.com&utm_medium=repo-free&utm_campaign=genai-plugin"> <img src="https://res.cloudinary.com/dqq9t4hyy/image/upload/q_60/v1772931701/MainBanners_4_d9nmqo.webp" width="100%" alt="Gen AI for Unreal - OpenAI GPT Anthropic Claude Gemini Grok ElevenLabs"/> </a> <p><b><a href="https://muddyterrain.com/t/genai-fab?utm_source=github.com&utm_medium=repo-free&utm_campaign=genai-plugin">Gen AI Pro</a></b><br/> <code>gpt-5.4</code> <code>claude-opus-4-6</code> <code>gemini-3.1-pro</code> <code>grok-4.1</code> <code>o3-pro</code> <code>eleven_v3</code><br/> Chat, Vision, Streaming, Realtime, Image Gen, TTS, Structured Output, Tool Use</p> </td> <td align="center" width="50%"> <a href="https://muddyterrain.com/t/genai-china-fab?utm_source=github.com&utm_medium=repo-free&utm_campaign=genai-china-plugin"> <img src="https://res.cloudinary.com/dqq9t4hyy/image/upload/q_60/v1773874915/MainBanners_6_tw1xai.webp" width="100%" alt="Gen AI Chinese Models - Alibaba Qwen DeepSeek Moonshot Kimi ByteDance ZhipuAI"/> </a> <p><b><a href="https://muddyterrain.com/t/genai-china-fab?utm_source=github.com&utm_medium=repo-free&utm_campaign=genai-china-plugin">Gen AI Pro China</a></b><br/> <code>qwen3.5-plus</code> <code>kimi-k2.5</code> <code>glm-5</code> <code>seed-2-0</code> <code>ernie-5.0</code> <code>seedream-4</code><br/> Worldwide support, Chat, Vision, Image Gen, TTS, Reasoning</p> </td> </tr> <tr> <td align="center" width="50%"> <a href="https://muddyterrain.com/t/genai-model-generator-fab?utm_source=github.com&utm_medium=repo-free&utm_campaign=gen3d-plugin"> <img src="https://res.cloudinary.com/dqq9t4hyy/image/upload/q_60/v1774387242/MainBanners_9_vdxmdm.webp" width="100%" alt="Gen AI Model Generator - Meshy Tripo Hunyuan3D Trellis Rodin 3D Generation"/> </a> <p><b><a 
href="https://muddyterrain.com/t/genai-model-generator-fab?utm_source=github.com&utm_medium=repo-free&utm_campaign=gen3d-plugin">GenAI Model Generator</a></b><br/> <code>meshy-6</code> <code>tripo-v2.5</code> <code>hunyuan3d-v3.1</code> <code>trellis-2</code> <code>rodin-gen-2</code><br/> Text-to-3D, Image-to-3D, Remesh, PBR Textures, Retexture, Auto-Rigging</p> </td> <td align="center" width="50%"> <a href="https://muddyterrain.com/t/genai-llama-fab?utm_source=github.com&utm_medium=repo-free&utm_campaign=genai-llama-plugin"> <img src="https://res.cloudinary.com/dqq9t4hyy/image/upload/q_60/v1774520082/MainBanners_10_gniskq.webp" width="100%" alt="Gen AI Llama - Ollama vLLM LM Studio GGUF Local LLMs Unreal Engine"/> </a> <p><b><a href="https://muddyterrain.com/t/genai-llama-fab?utm_source=github.com&utm_medium=repo-free&utm_campaign=genai-llama-plugin">Gen AI Llama</a></b><br/> <code>Ollama</code> <code>vLLM</code> <code>LM Studio</code> <code>GGUF</code> <code>Jan</code> <code>LocalAI</code> <code>GPT-oss</code> <code>LLaVA</code><br/> Local LLMs inside your Unreal Engine game - no API keys needed</p> </td> </tr> </table>

Free Plugin (This Repo) - LLM/GenAI API Support:

  • OpenAI: Chat (gpt-4.1, gpt-4.1-mini, o4-mini, o3, o3-pro), Structured Outputs
  • Anthropic Claude: Chat (claude-4-latest, claude-3-7-sonnet, claude-3-5-sonnet, claude-3-5-haiku)
  • XAI Grok: Chat (grok-3-latest, grok-3-mini-beta)
  • DeepSeek: Chat (deepseek-chat V3.1), Reasoning (deepseek-reasoning-r1)
  • Local AI: unreal-ollama (MIT) - gpt-oss, qwen3-vl and more
<details> <summary><b>Full Model List (200+ models across all plugins)</b></summary>

Gen AI Pro

  • OpenAI: gpt-5.4, gpt-5.4-pro, gpt-5.3-codex, gpt-5.2, gpt-5.1, gpt-5, gpt-5-mini, gpt-5-nano, gpt-4.1, o4-mini, o3, o3-pro | Responses API, Vision, Realtime (gpt-realtime), Image Gen (gpt-image-1.5, dall-e-3), TTS (gpt-4o-mini-tts, whisper-1), Streaming, Function Calling, Multimodal
  • Anthropic: claude-opus-4-6, claude-sonnet-4-6, claude-4.5-opus, claude-4.5-sonnet, claude-4.5-haiku | Extended Thinking, Vision, Tool Use
  • Google Gemini: gemini-3.1-pro, gemini-3.1-flash-lite, gemini-2.5-pro, gemini-2.5-flash | Imagen (imagen-4.0-generate, imagen-4.0-ultra), Realtime, TTS, Multimodal
  • XAI Grok: grok-4.1, grok-4, grok-4-eu, grok-code-fast-1, grok-3 | Vision, Streaming, Reasoning
  • ElevenLabs: TTS (eleven_v3, eleven_turbo_v2_5), Transcription (scribe_v2), Sound Effects (eleven_text_to_sound_v2)
  • Inworld AI: TTS (inworld-tts-1.5-max, inworld-tts-1.5-mini)
  • OpenAI Compatible Mode: Alibaba Qwen, Mistral, Groq, OpenRouter, Meta Llama, GLM-4, Ollama

Gen AI Pro China

  • Alibaba Qwen: qwen3.5-plus, qwen3.5-flash, qwen3-max, qwen3-coder-plus | Multimodal (qwen-omni-turbo, qwen-vl-max), Image Gen (qwen-image, wan2.2-t2i-plus), TTS (qwen3-tts-flash)
  • Moonshot/Kimi: kimi-k2.5, kimi-k2-thinking, kimi-k2-thinking-turbo | Multimodal (kimi-latest)
  • ByteDance: seed-2-0-mini, seed-1-8, skylark-pro-250415 | Vision (skylark-vision), Image Gen (seedream-4-0-250828)
  • ZhipuAI: glm-5, glm-4.7, glm-4.7-flash | Multimodal (glm-4.6v)
  • Baidu: ernie-5.0-8k, ernie-4.5-8k, ernie-x1-32k

GenAI Model Generator (3D)

  • Meshy AI: meshy-6 - Text-to-3D, Image-to-3D, Retexture, Auto-Rigging
  • Tripo AI: tripo-v2.5 - Text-to-3D, Image-to-3D
  • Hunyuan3D (Tencent): hunyuan3d-v3.1-pro Text-to-3D, hunyuan3d-v2.1 Image-to-3D
  • TripoSR: Image-to-3D (fast, <1s)
  • Rodin (Hyper3D): rodin-gen-2 - Text/Image-to-3D with PBR materials
  • Trellis 2 (Microsoft): Image-to-3D with PBR materials
  • Google Gemini: gemini-3.1-flash - PBR texture generation
</details>

Additional Features:

  • Plugin Example Project here
  • Version Control: Git Submodule Support, Perforce (in progress)
  • Lightweight: No External Dependencies, Exclude MCP from build
  • Testing across platforms and engine versions (available in pro plugins)

Unreal MCP (Model Control Protocol):

Other MCP options: Epic Games is working on an official Unreal MCP integration for UE 5.8+. There's also UnrealClaude (MIT) by Natfii - a standalone Unreal MCP implementation worth checking out. This plugin's MCP support targets UE 5.4-5.7+ and works alongside Claude Desktop, Claude Code, and Cursor. Note: MCP in this free plugin is not being actively developed - the features listed below reflect the current state.

<details> <summary><b>MCP Feature Status</b> (✅ = Done, 🛠️ = In Progress, 🚧 = Planned)</summary>
  • Clients Support ✅
    • Claude Desktop App Support ✅
    • Claude Code CLI Support ✅
    • Cursor IDE Support ✅
    • OpenAI Operator API Support 🚧
  • Blueprints Auto Generation 🛠️
    • Creating new blueprint of types ✅
    • Adding new functions, function/blueprint variables ✅
    • Adding nodes and connections 🛠️ (buggy, issues open)
    • Advanced Blueprints Generation 🛠️
  • Level/Scene Control for LLMs 🛠️
    • Spawning Objects and Shapes ✅
    • Moving, rotating and scaling objects ✅
    • Changing materials and color ✅
    • Advanced scene features 🛠️
  • Generative AI:
    • Prompt to 3D model fetch and spawn 🛠️
  • Control:
    • Ability to run Python scripts ✅
    • Ability to run Console Commands ✅
  • UI:
    • Widgets generation 🛠️
    • UI Blueprint generation 🛠️
  • Project Files:
    • Create/Edit project files/folders ✅
    • Delete existing project files ❌
  • Others:
    • Project Cleanup 🛠️
</details>

Table of Contents

Setting API Keys:

[!NOTE]
There is no need to set an API key for testing the MCP features in the Claude app. An Anthropic key is only needed for the Claude API.

For Editor:

Set the environment variable PS_<ORGNAME> to your API key.

For Windows:

setx PS_<ORGNAME> "your api key"

For Linux/MacOS:

  1. Run the following command in your terminal, replacing yourkey with your API key.

    echo "export PS_<ORGNAME>='yourkey'" >> ~/.zshrc
    
  2. Update the shell with the new variable:

    source ~/.zshrc
    

PS: Don't forget to restart the Editor and ALSO the connected IDE after setting the environment variable.

The resulting variable name can be: PS_OPENAIAPIKEY, PS_DEEPSEEKAPIKEY, PS_ANTHROPICAPIKEY, PS_METAAPIKEY, PS_GOOGLEAPIKEY, etc.
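The naming convention above can be sketched in a few lines of Python. Note that `get_api_key` is a hypothetical helper written for illustration, not part of the plugin:

```python
import os

# Hypothetical helper illustrating the PS_<ORGNAME> environment-variable
# convention described above; the plugin's actual lookup logic may differ.
def get_api_key(org_var: str):
    """Return the API key stored in the PS_-prefixed environment variable."""
    return os.environ.get(f"PS_{org_var}")

# Demo value only; in practice the variable is set via setx / ~/.zshrc
# and the Editor plus the connected IDE are restarted afterwards.
os.environ.setdefault("PS_OPENAIAPIKEY", "sk-demo-key")
print(get_api_key("OPENAIAPIKEY"))
```

Any process launched after the variable is set (including the Editor) inherits it, which is why a restart is required.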

For Packaged Builds:

Storing API keys in packaged builds is a security risk. This is what the OpenAI API documentation says about it:

"Exposing your OpenAI API key in client-side environments like browsers or mobile apps allows malicious users to take that key and make requests on your behalf – which may lead to unexpected charges or compromise of certain account data. Requests should always be routed through your own backend server where you can keep your API key secure."

Read more about it here.

For test builds, you can call GenSecureKey::SetGenAIApiKeyRuntime (from either C++ or Blueprints) with your API key in the packaged build.
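The backend-routing pattern the OpenAI quote recommends can be sketched with the Python standard library. The endpoint shape and reply fields below are invented for illustration and are not part of the plugin; a real backend would forward the body to the provider with the key attached server-side:

```python
# Minimal sketch of "route requests through your own backend": the packaged
# game posts prompts to this server, and only the server ever sees the key.
import json
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

API_KEY = os.environ.get("PS_OPENAIAPIKEY", "server-side-secret")

class ProxyHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        # A real backend would forward `body` to the provider here, adding
        # `Authorization: Bearer {API_KEY}`; the game client never sees it.
        reply = {"echo": body.get("prompt", ""), "key_exposed": False}
        data = json.dumps(reply).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def log_message(self, *args):  # silence per-request logging
        pass

def serve_once(port: int = 0) -> HTTPServer:
    """Bind the proxy on an ephemeral port; caller drives it with handle_request()."""
    return HTTPServer(("127.0.0.1", port), ProxyHandler)
```

The game then talks only to your server's URL, and revoking or rotating the key never requires shipping a new build.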

Setting up MCP:

[!NOTE]
If your project only uses the LLM APIs and not the MCP, you can skip this section.

[!CAUTION]
Disclaimer: if you are using the MCP feature of the plugin, it will directly let the Claude Desktop App control your Unreal Engine project. Make sure you are aware of the security risks and only use it in a controlled environment.

Please backup your project before using the MCP feature and use version control to track changes.

1. Install any one of the below clients:
  • Claude Desktop App from here.
  • Claude Code CLI from here.
  • Cursor IDE from here.
2. Setup the mcp config json:
For Claude Desktop App:

Edit the claude_desktop_config.json file in the Claude Desktop App's installation directory (you can ask Claude where it is located on your platform!). The file will look something like this:

{
    "mcpServers": {
      "unreal-handshake": {
        "command": "python",
        "args": ["<your_project_directory_path>/Plugins/GenerativeAISupport/Content/Python/mcp_server.py"],
        "env": {
          "UNREAL_HOST": "localhost",
          "UNREAL_PORT": "9877" 
        }
      }
    }
}
For Claude Code CLI:

Create a .mcp.json file in your project root directory. The file will look something like this:

{
    "mcpServers": {
      "unreal-handshake": {
        "type": "stdio",
        "command": "python",
        "args": ["<your_project_directory_path>/Plugins/GenerativeAISupport/Content/Python/mcp_server.py"],
        "env": {
          "UNREAL_HOST": "localhost",
          "UNREAL_PORT": "9877"
        }
      }
    }
}
For Cursor IDE:

Create a .cursor/mcp.json file in your project directory. The file will look something like this:

{
    "mcpServers": {
      "unreal-handshake": {
        "command": "python",
        "args": ["<your_project_directory_path>/Plugins/GenerativeAISupport/Content/Python/mcp_server.py"],
        "env": {
          "UNREAL_HOST": "localhost",
          "UNREAL_PORT": "9877"
        }
      }
    }
}
3. Install FastMCP.
pip install fastmcp
4. Enable python plugin in Unreal Engine. (Edit -> Plugins -> Python Editor Script Plugin)
5. [OPTIONAL] Enable AutoStart MCP server on editor open
<img src="Docs/Settings.png" width="782"/>

Adding the plugin to your project:

With Git:

  1. Add the Plugin Repository as a Submodule in your project's repository.

    git submodule add https://github.com/prajwalshettydev/UnrealGenAISupport Plugins/GenerativeAISupport
    
  2. Regenerate Project Files: Right-click your .uproject file and select Generate Visual Studio project files.

  3. Enable the Plugin in Unreal Editor: Open your project in Unreal Editor. Go to Edit > Plugins. Search for the Plugin in the list and enable it.

  4. For Unreal C++ Projects, include the Plugin's module in your project's Build.cs file:

    PrivateDependencyModuleNames.AddRange(new string[] { "GenerativeAISupport" });
    

With Perforce:

Still in development.

With Fab (Unreal Marketplace):

This free plugin is available via Git (above). For the pro plugins, check Fab.com.

Fetching the Latest Plugin Changes:

With Git:

You can pull the latest changes with:

cd Plugins/GenerativeAISupport
git pull origin main

Or update all submodules in the project:

git submodule update --recursive --remote

With Perforce:

Still in development.

Usage:

There is an example Unreal project that already implements the plugin. You can find it here.

OpenAI:

Currently, the plugin supports Chat and Structured Outputs from the OpenAI API, both for C++ and Blueprints. Tested models: gpt-4.1, gpt-4.1-mini, gpt-4.1-nano, o4-mini, o3, o3-pro, o3-mini.

1. Chat:

C++ Example:
    void SomeDebugSubsystem::CallGPT(const FString& Prompt, 
        const TFunction<void(const FString&, const FString&, bool)>& Callback)
    {
        FGenChatSettings ChatSettings;
        ChatSettings.Model = TEXT("gpt-4o-mini");
        ChatSettings.MaxTokens = 500;
        ChatSettings.Messages.Add(FGenChatMessage{ TEXT("system"), Prompt });
    
        FOnChatCompletionResponse OnComplete = FOnChatCompletionResponse::CreateLambda(
            [Callback](const FString& Response, const FString& ErrorMessage, bool bSuccess)
        {
            Callback(Response, ErrorMessage, bSuccess);
        });
    
        UGenOAIChat::SendChatRequest(ChatSettings, OnComplete);
    }
Blueprint Example:
<img src="Docs/BpExampleOAIChat.png" width="782"/>

2. Structured Outputs:

C++ Example 1:

Sending a custom schema JSON directly to the function call:

FString MySchemaJson = R"({
"type": "object",
"properties": {
    "count": {
        "type": "integer",
        "description": "The total number of users."
    },
    "users": {
        "type": "array",
        "items": {
            "type": "object",
            "properties": {
                "name": { "type": "string", "description": "The user's name." },
                "heading_to": { "type": "string", "description": "The user's destination." }
            },
            "required": ["name", "heading_to"]
        }
    }
},
"required": ["count", "users"]
})";

UGenAISchemaService::RequestStructuredOutput(
    TEXT("Generate a list of users and their details"),
    MySchemaJson,
    [](const FString& Response, const FString& Error, bool Success) {
       if (Success)
       {
           UE_LOG(LogTemp, Log, TEXT("Structured Output: %s"), *Response);
       }
       else
       {
           UE_LOG(LogTemp, Error, TEXT("Error: %s"), *Error);
       }
    }
);
C++ Example 2:

Sending a custom schema json from a file

#include "Misc/FileHelper.h"
#include "Misc/Paths.h"
FString SchemaFilePath = FPaths::Combine(
    FPaths::ProjectDir(),
    TEXT("Source/<YourProjectName>/Public/AIPrompts/SomeSchema.json")
);

FString MySchemaJson;
if (FFileHelper::LoadFileToString(MySchemaJson, *SchemaFilePath))
{
    UGenAISchemaService::RequestStructuredOutput(
        TEXT("Generate a list of users and their details"),
        MySchemaJson,
        [](const FString& Response, const FString& Error, bool Success) {
           if (Success)
           {
               UE_LOG(LogTemp, Log, TEXT("Structured Output: %s"), *Response);
           }
           else
           {
               UE_LOG(LogTemp, Error, TEXT("Error: %s"), *Error);
           }
        }
    );
}
Blueprint Example:
<img src="Docs/BpExampleOAIStructuredOp.png" width="782"/>
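The structured-output string that reaches your callback in the examples above is plain JSON, so it can be parsed and sanity-checked with any JSON library. A minimal sketch with an invented sample payload (not real API output) conforming to the schema:

```python
import json

# Illustrative response matching the user-list schema above; the payload is
# made up for this example, not returned by the API.
sample_response = '''{
  "count": 2,
  "users": [
    {"name": "Ada", "heading_to": "the forge"},
    {"name": "Brin", "heading_to": "the docks"}
  ]
}'''

data = json.loads(sample_response)
names = [user["name"] for user in data["users"]]
# Cross-check the model's own count against the array it produced.
count_matches = data["count"] == len(data["users"])
print(names, count_matches)
```

Doing this kind of validation before feeding the result into gameplay code guards against the model occasionally emitting a count that disagrees with the array.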

DeepSeek API:

Currently, the plugin supports Chat and Reasoning from the DeepSeek API, both for C++ and Blueprints. Points to note:

  • System messages are currently mandatory for the reasoning model; the API otherwise appears to return null.
  • Also, from the documentation: "Please note that if the reasoning_content field is included in the sequence of input messages, the API will return a 400 error. Read more about it here"
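Given the 400-error note above, a multi-turn conversation has to drop reasoning_content from earlier assistant messages before resending the history. A sketch of that sanitization step (the message-dict structure here is illustrative, not the plugin's internal code; only the reasoning_content field name comes from the DeepSeek docs):

```python
# Echoing `reasoning_content` back in the input messages triggers a 400 error,
# so strip it from every message before the next request.
def strip_reasoning(messages):
    """Return a copy of the history without any reasoning_content fields."""
    return [{k: v for k, v in m.items() if k != "reasoning_content"} for m in messages]

history = [
    {"role": "user", "content": "9.11 and 9.8, which is greater?"},
    {"role": "assistant", "content": "9.8 is greater.",
     "reasoning_content": "Compare 9.11 and 9.80 digit by digit..."},
]
clean = strip_reasoning(history)  # safe to send as the next request's messages
```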

[!WARNING]
While using the R1 reasoning model, make sure Unreal's HTTP timeouts are not left at the default 30 seconds, as these API calls can take longer than that to respond. Simply setting HttpRequest->SetTimeout(<N Seconds>); is not enough, so the following lines need to be added to your project's DefaultEngine.ini file:

[HTTP]
HttpConnectionTimeout=180
HttpReceiveTimeout=180

1. Chat and Reasoning:

C++ Example:
 FGenDSeekChatSettings ReasoningSettings;
 ReasoningSettings.Model = EDeepSeekModels::Reasoner; // or EDeepSeekModels::Chat for Chat API
 ReasoningSettings.MaxTokens = 100;
 ReasoningSettings.Messages.Add(FGenChatMessage{TEXT("system"), TEXT("You are a helpful assistant.")});
 ReasoningSettings.Messages.Add(FGenChatMessage{TEXT("user"), TEXT("9.11 and 9.8, which is greater?")});
 ReasoningSettings.bStreamResponse = false;
 UGenDSeekChat::SendChatRequest(
     ReasoningSettings,
     FOnDSeekChatCompletionResponse::CreateLambda(
         [this](const FString& Response, const FString& ErrorMessage, bool bSuccess)
         {
             if (!UTHelper::IsContextStillValid(this))
             {
                 return;
             }

             // Log response details regardless of success
             UE_LOG(LogTemp, Warning, TEXT("DeepSeek Reasoning Response Received - Success: %d"), bSuccess);
             UE_LOG(LogTemp, Warning, TEXT("Response: %s"), *Response);
             if (!ErrorMessage.IsEmpty())
             {
                 UE_LOG(LogTemp, Error, TEXT("Error Message: %s"), *ErrorMessage);
             }
         })
 );
Blueprint Example:
<img src="Docs/BpExampleDeepseekChat.png" width="782"/>

Anthropic API:

Currently, the plugin supports Chat from the Anthropic API, both for C++ and Blueprints. Tested models: claude-4-latest, claude-3-7-sonnet-latest, claude-3-5-sonnet, claude-3-5-haiku-latest.

1. Chat:

C++ Example:
    // ---- Claude Chat Test ----
    FGenClaudeChatSettings ChatSettings;
    ChatSettings.Model = EClaudeModels::Claude_3_7_Sonnet; // Use Claude 3.7 Sonnet model
    ChatSettings.MaxTokens = 4096;
    ChatSettings.Temperature = 0.7f;
    ChatSettings.Messages.Add(FGenChatMessage{TEXT("system"), TEXT("You are a helpful assistant.")});
    ChatSettings.Messages.Add(FGenChatMessage{TEXT("user"), TEXT("What is the capital of France?")});
    
    UGenClaudeChat::SendChatRequest(
        ChatSettings,
        FOnClaudeChatCompletionResponse::CreateLambda(
            [this](const FString& Response, const FString& ErrorMessage, bool bSuccess)
            {
                if (!UTHelper::IsContextStillValid(this))
                {
                    return;
                }
    
                if (bSuccess)
                {
                    UE_LOG(LogTemp, Warning, TEXT("Claude Chat Response: %s"), *Response);
                }
                else
                {
                    UE_LOG(LogTemp, Error, TEXT("Claude Chat Error: %s"), *ErrorMessage);
                }
            })
    );
Blueprint Example:
<img src="Docs/BpExampleClaudeChat.png" width="782"/>

XAI's Grok 3 API:

Currently, the plugin supports Chat from XAI's Grok 3 API, both for C++ and Blueprints.

1. Chat:

	FGenXAIChatSettings ChatSettings;
	ChatSettings.Model = TEXT("grok-3-latest");
	ChatSettings.Messages.Add(FGenXAIMessage{
		TEXT("system"),
		TEXT("You are a helpful AI assistant for a game. Please provide concise responses.")
	});
	ChatSettings.Messages.Add(FGenXAIMessage{TEXT("user"), TEXT("Create a brief description for a forest level in a fantasy game")});
	ChatSettings.MaxTokens = 1000;

	UGenXAIChat::SendChatRequest(
		ChatSettings,
		FOnXAIChatCompletionResponse::CreateLambda(
			[this](const FString& Response, const FString& ErrorMessage, bool bSuccess)
			{
				if (!UTHelper::IsContextStillValid(this))
				{
					return;
				}
				
				UE_LOG(LogTemp, Warning, TEXT("XAI Chat response: %s"), *Response);
				
				if (!bSuccess)
				{
					UE_LOG(LogTemp, Error, TEXT("XAI Chat error: %s"), *ErrorMessage);
				}
			})
	);

Model Control Protocol (MCP):

This is currently a work in progress. The plugin supports various clients such as the Claude Desktop App, Cursor, etc.

Usage:

If Autostart MCP server is enabled (in the plugin's settings):

1. Open the Unreal Engine Editor.
2. Open the Claude Desktop App, Claude Code CLI, or Cursor IDE.

That's it! You can now use the MCP features of the plugin.

If Autostart MCP server is disabled:

1. Run the MCP server from the plugin's python directory.
python <your_project_directory>/Plugins/GenerativeAISupport/Content/Python/mcp_server.py
2. Run the MCP client by opening or restarting the Claude desktop app or Cursor IDE.
3. Open your Unreal Engine project and run the Python script below from the plugin's Python directory.

Tools -> Run Python Script -> Select the Plugins/GenerativeAISupport/Content/Python/unreal_socket_server.py file.

4. Now you should be able to prompt the Claude Desktop App to use Unreal Engine.
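Under the hood, the MCP server relays tool calls to the in-editor socket server over local TCP (the UNREAL_HOST/UNREAL_PORT values from the config above). The sketch below shows the general shape of such a relay; the command name and payload fields are invented for illustration and are NOT the plugin's actual wire protocol:

```python
import json
import socket

# Illustrative relay: serialize a command as JSON, send it to the Unreal
# socket server, and read back a JSON reply. Field names are hypothetical.
def send_command(command: str, params: dict,
                 host: str = "localhost", port: int = 9877) -> dict:
    message = json.dumps({"command": command, "params": params})
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(message.encode())
        return json.loads(sock.recv(65536).decode())
```

This separation is also why both halves must be running: the MCP server owns the client-facing protocol, while the in-editor script owns the socket endpoint.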

Known Issues:

  • Nodes fail to connect properly with MCP
  • No undo/redo support for MCP
  • No streaming support for the DeepSeek reasoning model
  • No complex material generation support in the create-material tool
  • Issues running some valid LLM-generated Python scripts
  • No proper error handling in the response when an LLM compiles a blueprint
  • Issues spawning certain nodes, especially getters and setters
  • Doesn't open the right context window during scene and project file edits
  • Doesn't dock the blueprint window properly in the editor

Contribution Guidelines:

Setting up for Development:

  1. Install the unreal Python package and set up the IDE's Python interpreter for proper IntelliSense.
pip install unreal

More details will be added soon.

Project Structure:

More details will be added soon.

References:

Quick Links:

Support This Project

<iframe src="https://github.com/sponsors/prajwalshettydev/button" title="Sponsor prajwalshettydev" height="32" width="114" style="border: 0; border-radius: 6px;"></iframe>

If you find UnrealGenAISupport helpful, consider sponsoring me to keep the project going! Click the "Sponsor" button above to contribute.

