The fact that this app is powered by an Ollama backend makes it a non-starter for those using an AMD GPU. I messed with it and Ollama before trying ST and LM Studio and never looked back. I have downloaded a lot of AI LLM models from Hugging Face, and they each have their own strengths and weaknesses, especially for running RP scenarios. For those looking for good AI models for RP, try Triange104, DavidAU, and obviously openerotica on Hugging Face; you will find they offer plenty of choice and options depending on what your system can handle.

Fair! But we do have a Proxy LLM mode, so you can use any LLM you want. For example, you can run llama.cpp or LM Studio as the LLM provider and use HammerAI just for the UI and character chat. (And that's all 100% free; you only pay for access to cloud models.)
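
For anyone curious what pointing a UI at a local provider looks like, here is a minimal sketch of sending a chat request to a local OpenAI-compatible server such as LM Studio or a llama.cpp server. The base URL, port, and model name below are assumptions (LM Studio's local server defaults to http://localhost:1234/v1); substitute whatever your own provider actually exposes.

```python
# Minimal sketch: chat with a local OpenAI-compatible server
# (e.g. LM Studio or llama.cpp). URL, port, and model name are
# assumptions; adjust them to match your local setup.
import requests

BASE_URL = "http://localhost:1234/v1"  # assumed LM Studio default local server address

payload = {
    "model": "local-model",  # placeholder; LM Studio serves whichever model you have loaded
    "messages": [
        {"role": "system", "content": "You are a roleplay character."},
        {"role": "user", "content": "Introduce yourself in one sentence."},
    ],
    "temperature": 0.8,
}

response = requests.post(f"{BASE_URL}/chat/completions", json=payload, timeout=120)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

A front end running in a proxy-style mode is essentially doing this under the hood, just pointed at whichever base URL you configure.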