Posted January 14, 2025 by Aleksandr Unconditional
#local model #koboldcpp
→ integrated koboldcpp with a local model*
→ desktop: added a welcome screen with a choice between a local model and the API
→ added location selection to the sandbox (upper left corner)
→ updated the Ava images that turned out poorly
→ redesigned the selection buttons
→ reworked the path to one of the sexual scenes with Dina in the story
→ now, if koboldcpp or koboldcpp-remote is selected, the kobold tab in settings is open by default
→ minimum window size in windowed mode increased from 640x360 to 900x500
→ improved how the bots detect location changes and the arrival of a second character; fixed this to work with local models via koboldcpp
→ maximum temperature value lowered to 1.4, recommended value to 0.72
→ fixed a bug with version checking
→ fixed the earthquake animation in the storyline (and possibly some others)
→ replaced the music in the narrative event with Fiona in the shower
koboldcpp:
→ now, when the game is closed, the koboldcpp and electron processes terminate correctly and no longer remain in the task manager
→ fixed premature termination of token generation (the "EOS token triggered! id:2" error)
→ parameters adjusted
→ fixed the behavior of suggestions and generated options
→ maximum response length reduced to 630 tokens, with a recommended value of up to 190
→ added a button to shut down koboldcpp
→ added information about approximate system requirements on the start screen and in settings (if there are no models in the folder)
→ added custom stop triggers (see the request sketch below)
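for reference, a minimal sketch of what a generation request to a locally running koboldcpp instance can look like, assuming the default port 5001 and koboldcpp's KoboldAI-style /api/v1/generate endpoint; the prompt and stop strings are made up for illustration and are not the game's actual ones:

```ts
// sketch of a request to a local koboldcpp server (default port 5001).
// field names follow koboldcpp's KoboldAI-style /api/v1/generate API;
// the prompt and stop strings are illustrative only.
type GenerateResponse = { results: { text: string }[] };

async function generate(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:5001/api/v1/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      prompt,
      max_length: 190,                     // recommended response length from this update
      temperature: 0.72,                   // recommended temperature from this update
      stop_sequence: ["\nPlayer:", "###"], // custom stop triggers (assumed strings)
    }),
  });
  const data = (await res.json()) as GenerateResponse;
  return data.results[0].text;
}
```

the recommended temperature (0.72) and response length (190 tokens) from this update map onto the temperature and max_length fields of that request.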
*now for windows there will be three builds: standard (with koboldcpp), -local (with a local model) and -lite (without koboldcpp). for linux you will need to download the model separately and select it in the settings (later I will figure out the configuration and make it the same as on windows, with the resources/koboldcpp folder)
windows: koboldcpp and the models are located in the resources/koboldcpp folder; the game will list every gguf model from this folder on the start screen and in the settings tab
windows/linux: a model from any folder can be loaded from the settings tab by clicking on "select model"
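as a rough sketch of how a desktop build might discover the bundled models and keep the koboldcpp process from lingering in the task manager: the folder name matches the post (resources/koboldcpp), but the executable name, the launch flags (--model, --port) and the electron cleanup hook are my assumptions, not the game's actual code:

```ts
// sketch: list bundled .gguf models and make sure koboldcpp exits with the game.
// folder name matches the post; executable name, flags and the quit hook are assumptions.
import { app } from "electron";
import { spawn, ChildProcess } from "child_process";
import { readdirSync } from "fs";
import { join } from "path";

const koboldDir = join(process.resourcesPath, "koboldcpp");

// models shown on the start screen and in the settings tab
function listModels(): string[] {
  try {
    return readdirSync(koboldDir).filter((f) => f.toLowerCase().endsWith(".gguf"));
  } catch {
    return []; // no folder / no models -> show system requirements instead
  }
}

let kobold: ChildProcess | null = null;

// launch koboldcpp with the selected model
function startKobold(modelFile: string): void {
  kobold = spawn(join(koboldDir, "koboldcpp.exe"), [
    "--model", join(koboldDir, modelFile),
    "--port", "5001",
  ]);
}

// kill the koboldcpp process when the game closes so it does not stay in the task manager
app.on("will-quit", () => {
  kobold?.kill();
});
```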
the local model I settled on for now is Gemma-2-Ataraxy-v4d-9B.i1-Q4_K_M; subjectively it is roughly at the level of chatgpt-3.5, a bit weak, but it handles triggers and emojis much better than cosmosrp, and I think it should work fine for most users
among the models I tested, the one that might be best suited for the game (though more demanding) is ChatWaifu_v1.4-Q4_K_M-GGUF. (if there are no models in the resources/koboldcpp folder, the game will show the system requirements and a link on the main screen and in the settings)