Hi Godnoken, I have a question about using Ollama as another alternative for local translation.
I was wondering if it would be possible to use one of the many LLM models available in Ollama to improve the quality of the translation output?
like:
https://ollama.com/library/mistral-nemo
https://ollama.com/lauchacarro/qwen2.5-translator
website:
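For reference, here is a minimal sketch of what calling Ollama for translation could look like. It uses Ollama's standard local HTTP endpoint (`/api/generate` on port 11434); the prompt wording, the `translate` helper, and the choice of `mistral-nemo` as the default model are just assumptions for illustration, not anything from your codebase.

```python
import json
import urllib.request

# Ollama's default local API endpoint
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_translation_prompt(text, source_lang, target_lang):
    # Instruction-style prompt; the exact wording is an assumption and
    # would likely need tuning per model.
    return (
        f"Translate the following text from {source_lang} to {target_lang}. "
        f"Reply with only the translation.\n\n{text}"
    )


def translate(text, source_lang, target_lang, model="mistral-nemo"):
    # Hypothetical helper: sends one non-streaming generate request to a
    # locally running Ollama instance and returns the model's reply.
    payload = {
        "model": model,
        "prompt": build_translation_prompt(text, source_lang, target_lang),
        "stream": False,  # single JSON response instead of a token stream
    }
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"].strip()
```

Running this would require Ollama to be installed and the model pulled first (e.g. `ollama pull mistral-nemo`), so it adds a fairly heavy dependency compared with the existing translation backends, but it could be offered as an optional provider.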