
Hi, I'm trying to run Mistral-Nemo-Instruct-2407-GGUF locally on my computer (through LM Studio). I have everything set up and it's working, but for some reason all of the responses are short (80-100 characters). When I run the same instances/scenarios on the online version of the game, I get much longer scenarios (300-500 characters)...but I don't want to bog down your server. Are there any tips or tricks you know of that I can use to get longer responses? Also, the responses appear to get cut off in the game window, although the "choices" are almost always properly formatted.

I have also verified that I have plenty of memory (to the best of my knowledge). I'm running 10000 max memory and 4000 max output tokens in the Endpoint menu, and I have the Context Length set to 4096 for the model in LM Studio. It appears to be running the default Jinja template for the Prompt Template...I'm not knowledgeable enough to know whether that is related to the problem. I do have a somewhat beefy PC, so I'm not too worried about bumping up settings if that's necessary.
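
In case it's relevant, here's roughly what I think the request to the local server looks like. This is just my guess: the port (1234 is the LM Studio default) and the model identifier are whatever LM Studio shows me, so treat both as placeholders.

```python
# My rough guess at a request to LM Studio's local OpenAI-compatible server.
# Port 1234 is LM Studio's default; the model id is a placeholder.
import requests

payload = {
    "model": "mistral-nemo-instruct-2407",  # placeholder; use the id from LM Studio's model list
    "messages": [
        {"role": "user", "content": "Describe the opening scene of the adventure."}
    ],
    "max_tokens": 4000,  # same cap as the 4000 max output tokens in the Endpoint menu
}

resp = requests.post("http://localhost:1234/v1/chat/completions", json=payload, timeout=120)
resp.raise_for_status()
text = resp.json()["choices"][0]["message"]["content"]
print(len(text), "characters")
print(text)
```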

From my limited understanding, the game's output shouldn't be longer than the tokens dedicated to it. The game is technically one really long conversation, meaning the longer it is played, the more tokens it uses up. Once that limit is reached, details will start disappearing as old events are overwritten by new ones. In other words, the context tokens set in LM Studio are shared between the game's memory and its outputs. Remember to uncheck the one-paragraph response restriction in the settings.
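
Just to illustrate what I mean by the memory and the outputs sharing the same budget, it works roughly like this (not the game's actual code, just a sketch; the reserve number and the word-count stand-in for a real tokenizer are made up):

```python
# Sketch of a rolling conversation being squeezed into a fixed context window.
# Not the game's real code; the reserve and the word-count "tokenizer" are made up.
CONTEXT_LENGTH = 4096   # the Context Length set in LM Studio
OUTPUT_RESERVE = 512    # assumed slice kept free for the model's next reply

def estimate_tokens(text: str) -> int:
    return len(text.split())  # real tokenizers count differently

def trim_history(history: list[str]) -> list[str]:
    """Keep the newest messages that still fit; older events drop off the front."""
    budget = CONTEXT_LENGTH - OUTPUT_RESERVE
    kept, used = [], 0
    for message in reversed(history):   # walk from newest to oldest
        cost = estimate_tokens(message)
        if used + cost > budget:
            break                       # this is where old details disappear
        kept.append(message)
        used += cost
    return list(reversed(kept))
```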

Hopefully someone more knowledgeable on the subject will answer.

It could very well be the "One Paragraph" option inside of the game! If that is enabled, then the AI will be forced to deliver much shorter answers.

However, if that option is disabled and you are still getting unusually short responses, you can try changing the instructions to make the AI add a lot of detail. Here is a prompt I built at one point:

---
You are an AI for a game narrative. Given the current game world information, within two paragraphs in plaintext and separated from each other using new lines, explain the world or any currently happening event in detail to the player. The key point here is that it should feel less like a game and more like something that is actually happening. Because of this, make no mentions of this being a game or any rules that are set in place.
---
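
If you're curious where that instruction actually ends up, it would just be the system message of the chat request the game sends to LM Studio. Something like the following sketch, where the endpoint is the usual LM Studio default and the model id and user message are made up:

```python
# Sketch of the custom instruction slotted in as the system message of an
# OpenAI-style chat request to LM Studio; model id and user message are made up.
import requests

instruction = (
    "You are an AI for a game narrative. Given the current game world information, "
    "within two paragraphs in plaintext and separated from each other using new lines, "
    "explain the world or any currently happening event in detail to the player."
)

payload = {
    "model": "mistral-nemo-instruct-2407",  # placeholder id
    "messages": [
        {"role": "system", "content": instruction},  # the prompt above goes here
        {"role": "user", "content": "The player enters the ruined watchtower at dusk."},
    ],
    "max_tokens": 4000,
}

resp = requests.post("http://localhost:1234/v1/chat/completions", json=payload, timeout=120)
print(resp.json()["choices"][0]["message"]["content"])
```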


Oh my gosh. This has been killing me for days...I think I'm illiterate. I completely missed this option in the settings and have been poring over information on AI models, LM Studio, and all of the LM Studio settings (though luckily I've still learned quite a bit). Thank you both so much!

I found the setting and unchecked it. All of a sudden I am getting much more info, and it no longer looks cut off (before unchecking it, the last sentence never ended with a "." or a complete thought). Thanks!!!