Version 0.0.63
Posted June 20, 2024 by HammerAI
Hi all, version 0.0.63 is live! This release includes:
Desktop:
- Fix a bug where chat ran very slowly after the 0.0.60 release: we were reading the chat's JSON file from disk every time you or the LLM typed anything. (A rough sketch of the idea behind the fix is below this list.)
- Fix a bug where canceling your Pro subscription immediately revoked access - you now keep access until the end of your billing period.
- Expose two new settings:
  - GPU Layers: How many layers of the model to offload to your GPU. This was previously hard-coded to 1, so the GPU was barely used; the default is now 50, and you can set it even higher if you have a good GPU. In a follow-up we plan to calculate this automatically, but for now it's manual.
  - Max Response Tokens: The maximum number of tokens generated per reply, now set to 256 - if your chats are getting cut off, increase this number. (There's a rough sketch below of how both settings might feed into the model backend.)
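
For the curious, here's the general idea behind the performance fix. This is just an illustrative sketch - the names (ChatStore, the file path) are made up and the real HammerAI code looks different - but it shows why re-reading a JSON file on every keystroke is slow, and what keeping the chat in memory with debounced saves looks like:

```typescript
import { promises as fs } from "fs";

interface Message {
  role: "user" | "assistant";
  text: string;
}

// Illustrative only: keep messages in memory and write them to disk on a
// short debounce, instead of reading/writing the JSON file on every update.
class ChatStore {
  private messages: Message[] = [];
  private saveTimer: ReturnType<typeof setTimeout> | null = null;

  constructor(private filePath: string) {}

  // Read the JSON file once at startup, not on every keystroke.
  async load(): Promise<void> {
    try {
      this.messages = JSON.parse(await fs.readFile(this.filePath, "utf8"));
    } catch {
      this.messages = []; // no saved chat yet
    }
  }

  // Append in memory, then persist after a short delay so rapid
  // token-by-token updates don't each hit the disk.
  append(message: Message): void {
    this.messages.push(message);
    if (this.saveTimer) clearTimeout(this.saveTimer);
    this.saveTimer = setTimeout(() => void this.save(), 500);
  }

  private async save(): Promise<void> {
    await fs.writeFile(this.filePath, JSON.stringify(this.messages, null, 2));
  }
}
```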
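
And here's roughly how the two new settings could map onto a local model runtime. Many local chat apps run models through llama.cpp or something built on it; assuming that kind of backend, the sketch below turns hypothetical setting names into llama.cpp-style CLI flags - treat it as an illustration, not the actual integration:

```typescript
// Hypothetical shape of the two settings exposed in this release.
interface InferenceSettings {
  gpuLayers: number;         // layers offloaded to the GPU (was 1, now 50)
  maxResponseTokens: number; // cap on tokens generated per reply (now 256)
}

const defaults: InferenceSettings = {
  gpuLayers: 50,          // try higher if your GPU has spare VRAM
  maxResponseTokens: 256, // raise this if replies get cut off
};

// Sketch: turning the settings into llama.cpp-style CLI flags
// (--n-gpu-layers and --n-predict are llama.cpp options; the actual
// HammerAI integration may pass them differently).
function toLlamaCppArgs(modelPath: string, s: InferenceSettings): string[] {
  return [
    "-m", modelPath,
    "--n-gpu-layers", String(s.gpuLayers),
    "--n-predict", String(s.maxResponseTokens),
  ];
}

console.log(toLlamaCppArgs("./model.gguf", defaults).join(" "));
```

One note on GPU Layers: offloading more layers uses more VRAM, so if you see out-of-memory errors after raising it, dial it back down.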
Thank you!