Fixed!


You were on the right track, but the magic parameter I needed to set was num_predict. Setting that to 128 forces the model to terminate if it gets stuck. Longer translations are cut off, though. I'm going to experiment with some other translation models to see if this is an issue with the model or ollama itself.
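For anyone following along: ollama exposes `num_predict` through the `options` object of its `/api/generate` endpoint. This is a minimal sketch of building such a request; the model name and prompt are placeholders, not the ones used in this thread.

```python
import json


def build_generate_request(model, prompt, num_predict=128):
    """Build an ollama /api/generate payload with a token cap.

    num_predict caps how many tokens the model may emit; a looping
    model is forced to stop at the cap, but long translations will
    also be truncated there.
    """
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "options": {"num_predict": num_predict},
    }


payload = build_generate_request(
    "llama3",  # placeholder model name
    "Translate to French: Hello, world.",
)
# POST this as JSON to http://localhost:11434/api/generate,
# e.g. requests.post(url, json=payload)
print(json.dumps(payload, indent=2))
```

The trade-off mentioned above is inherent to this option: it is a hard cutoff, not a stop condition, so picking the value is a balance between catching loops and truncating legitimate output.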

Thanks again for working through this with me.

Nice!

Thanks yourself for testing it out :)

I've had a bit of a timeout and holiday - sorry for the slow reply.
I experimented a lot with different local models as well as online models before leaving. If I remember correctly, the main reason the model got stuck was improper prompts. Setting a limit stops the model from looping forever, but it only masks the real issue.

I intend to do some more rigorous testing and work through a few examples with specific models soon, so that it is easier to understand where issues may crop up when we use LLMs for translation.