Is there a way to run this locally?

(+1)

If you know how to set up a local LLM, just use your localhost URL and port number instead of the OpenRouter URL.

However, you have to use the downloadable version: you cannot enter a localhost URL into the itch.io browser version due to security policies.

(1 edit) (+5)

The easiest way is through Ollama. Download and install it. After that, open a command prompt and type in ollama run mistral:7b-instruct

After the installation is done and the model is running, open the game and, in the configuration, set Endpoint URL to http://localhost:11434/v1/chat/completions and Model name to mistral:7b-instruct
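If you want to sanity-check the setup before opening the game, you can hit the same endpoint yourself. A minimal Python sketch, assuming Ollama is running on its default port and serving its OpenAI-compatible API:

```python
# Minimal sketch to sanity-check the endpoint before opening the game.
# Assumes Ollama is running on its default port (11434) and exposing its
# OpenAI-compatible API; the URL and model name match the values above.
import json
import urllib.request

url = "http://localhost:11434/v1/chat/completions"
payload = {
    "model": "mistral:7b-instruct",
    "messages": [{"role": "user", "content": "Reply with one short sentence."}],
}
req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.load(resp)

# If this prints a sentence, the game's settings should work as well.
print(body["choices"][0]["message"]["content"])
```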

After that you should be ready. I must say though, it does not work that well, mostly because the model is not very smart. If you can run a better model, you can do it with Ollama too; you will just need to host the model with a different command and set the Model name field to the name of the model you chose.
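If you are not sure what name to enter, Ollama can list the models you have installed. A small sketch, again assuming the default port:

```python
# Small sketch to list the models installed locally, assuming Ollama's
# default port; the "name" field is what goes in the game's Model name box.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    tags = json.load(resp)

for model in tags.get("models", []):
    print(model["name"])
```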

(edit: if you encounter issues like the game running out of VRAM, try lowering Max Memory in the settings)