Great questions!

1. Yes, it works 100% locally with no cloud or subscription needed. The MCP server and Godot plugin run entirely on your machine. However, you still need an AI client to send the commands — Godot MCP Pro is the bridge between the AI and the editor, not the AI itself. For local LLMs like Gemma4, you'd use something like LM Studio or Ollama + OpenCode/Continue as the AI client. With an RTX 5090 you have plenty of power for that. Use `--minimal` mode (35 tools) to keep the context small for local models.

2. No, there's no chat panel inside Godot. You chat with the AI in your AI client (Claude Code, Cursor, VS Code Copilot, etc.) and it controls Godot through the MCP connection. Think of it as: you talk to the AI in one window, and it makes changes in the Godot editor in real time.

3. Yes, that kind of natural language prompt absolutely works! The AI translates your description into the actual Godot operations. Your laser example would work — the AI would create a scene, set up the icon.png as a sprite, write the WASD movement script, add the shooting mechanic with laser.png, and wire up spacebar input. You don't need to know GDScript — that's the whole point.

That said, the quality of results depends on the AI model. Larger cloud models (Claude, GPT) will understand your intent better than smaller local models. With Gemma4 26B/31B you might need to be a bit more specific in your descriptions, but it should still work for basic game mechanics.

Feel free to join our Discord if you have more questions or need help setting up: https://discord.gg/zJ2u5zNUBZ

Thanks for the detailed reply, I appreciate it! ❤️
Just to be clear, I don't think there's a problem with the plugin itself, but for beginners like me it's not easy to get running at all. After some time with it, here is my experience:

First, I tried the free version. It was very complicated to install with VS Code because of Roo Code, and Ollama always got stuck every 5 minutes (even if I didn't do anything). I also used lower-VRAM models just to test, so I only used about 22GB of VRAM (Gemma4:26b) with everything open (Godot, VS Code, Ollama + the model loaded into VRAM).

At first it almost worked: it created the Player but didn't add the default icon.png, so I tried to explain the issue so it could fix it, but then it kept getting stuck in VS Code. It seems like the free version only worked on the first command sent to Godot; after about 5 minutes it always got stuck, even though I restarted everything from scratch many times.
I also tried to get some help via Gemini just to make it work, but it couldn't solve the problem. Then I tried again with LM Studio but couldn't get it to connect to MCP Pro. So I ran into too many issues just trying to get a feel for how Godot + MCP Pro works.

If you'd consider making a step-by-step video tutorial in the future for non-programmers (local setup), it would be much easier to follow: from installation, to connecting everything, to actually making something simple like my example. That would be a great test before anyone purchases. 🙏

I joined the Discord in case I decide to try again, so maybe I can get some help there.

Thanks for sharing your experience — this is really helpful feedback!

You're right that the local LLM setup is much harder than it should be, especially for non-programmers. The sticking/freezing issue with Ollama + Roo Code is a known pain point: local models often struggle with the large number of tool definitions and can time out or get stuck in loops.

A step-by-step video tutorial for local setup is a great idea and I'll put it on my to-do list. In the meantime, a couple of tips if you want to try again:

- Use `--minimal` mode (35 tools instead of 169) — this drastically reduces the context size that chokes local models

- Gemma4 26B should work with `--minimal`, but give it simpler one-step instructions rather than complex multi-part requests

- If Ollama freezes, it's usually the model running out of context window, not the MCP plugin
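For reference, enabling `--minimal` usually just means passing it as an argument in your MCP client's server config. A rough sketch (the command name `godot-mcp-pro` and the path are placeholders here, since the exact launch command depends on how you installed the server; check the plugin's install docs for the real values):

```json
{
  "mcpServers": {
    "godot": {
      "command": "godot-mcp-pro",
      "args": ["--minimal"]
    }
  }
}
```

The same idea applies whether your client is Roo Code, Continue, or OpenCode: find where the MCP server is registered and add `--minimal` to its args list.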

Glad you joined the Discord — drop a message there anytime and I'll help you get it running!