There will be better long-term memory soon, along with OpenRouter support that lets you use large models like DeepSeek-V3.
Awesome news! Will the requirements be around the same as Nemo/"New"?
...I wonder what silliness I can get up to with better long-term goodness...
Will exporting and importing memories still be a thing? It's really handy for steering the convos in a desired direction, or clearing up a random mention of being a "him" when that's clearly not the case.
Well, OpenRouter is a marketplace for API models that run on remote servers, so there are no local hardware requirements at all.
It does cost money. I estimate about $0.40 for the equivalent of the 1,800 daily requests you get using the AI server, if you're using DeepSeek-V3.
The improved long-term memory will work with any model though.
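If you're curious what talking to OpenRouter looks like under the hood, here's a minimal sketch of a chat request. It assumes the OpenAI-compatible endpoint and the "deepseek/deepseek-chat" model ID for DeepSeek-V3; double-check the exact model name on OpenRouter's site before relying on it.

```python
# Minimal sketch of an OpenRouter chat request.
# Assumes the OpenAI-compatible endpoint and the "deepseek/deepseek-chat"
# model ID; verify both against OpenRouter's documentation.
import os
import requests

response = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "deepseek/deepseek-chat",  # DeepSeek-V3 on OpenRouter (assumed ID)
        "messages": [{"role": "user", "content": "Hello!"}],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

You only pay per request/token, which is where the rough $0.40-per-day estimate above comes from.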