GameTranslate

In-game translator at your disposal · By Godnoken

Ollama as an alternative for a local Translation model

A topic by Lolmaster29934 created Mar 22, 2025 Views: 159 Replies: 2
Viewing posts 1 to 3

Hi Godnoken, I have a question about using Ollama as another alternative for local translation.

I was wondering if it would be possible to use one of the many LLM models in Ollama to improve the quality of the translation output?

For example:

https://ollama.com/library/mistral-nemo

https://ollama.com/lauchacarro/qwen2.5-translator


website:

https://ollama.com/

Developer

Hello!

This is definitely possible to implement, but given the size of the model files, I suspect they would unfortunately be quite slow to use. I am most definitely interested in supporting more types of translation services (both offline and online ones) though!

As it stands today, the biggest issue with translations is the text-capturing part, so that is my focus at the moment. But rest assured that other types of translations will be added in the future :)

Developer

Hi bud,

There is now a 0.4.9_beta version available that includes a Custom API option. You can use it to connect to an LLM online, or to a model you host yourself. There are plenty of tutorials online on how to accomplish this. If you need help, please let me know! :)
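For anyone wanting to try the self-hosted route, here is a minimal sketch of what querying a locally running Ollama model for a translation could look like. This is illustrative, not GameTranslate's actual implementation: the prompt wording and the `translate` helper are assumptions, while the `/api/generate` endpoint on port 11434 is Ollama's documented default.

```python
# Hypothetical sketch: ask a locally hosted Ollama model to translate a
# line of captured game text. Assumes Ollama is running on its default
# port (11434) with a model such as "mistral-nemo" already pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def build_payload(model: str, text: str, target_lang: str) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    prompt = (
        f"Translate the following game text into {target_lang}. "
        f"Reply with the translation only.\n\n{text}"
    )
    # stream=False returns a single JSON object instead of a token stream
    return {"model": model, "prompt": prompt, "stream": False}


def translate(model: str, text: str, target_lang: str) -> str:
    """Send the prompt to the local Ollama server and return its reply."""
    data = json.dumps(build_payload(model, text, target_lang)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"].strip()


# Example (requires a running Ollama server with the model pulled):
# translate("mistral-nemo", "Press any key to continue", "Japanese")
```

As the developer notes, response time depends heavily on model size and hardware, so smaller models may be the more practical choice for real-time in-game translation.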

Cheers!