This is very promising, and I've been testing and messing with connecting LLMs to a game world myself. I'm wondering, though: when using a local LLM, does the game tell the LLM about the robot's stats, body, and what it can perform in terms of actions? I read that there is currently a limitation with function calling, but I'd love to hear more about how this works :)


Do you parse the actions from the generated text, or perhaps make the LLM generate speech and actions separately, or something like that?

Yeah, I inject messages into the chat for the game.


Yes, I parse actions manually sometimes, depending on the action.
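Parsing actions out of free-text replies could look something like this minimal sketch. The `[ACTION: name]` tag format and the `split_reply` helper are assumptions for illustration, not the game's actual protocol:

```python
import re

# Hypothetical tag format: the LLM is prompted to embed actions as
# [ACTION: name] inside its reply; everything else is treated as speech.
ACTION_TAG = re.compile(r"\[ACTION:\s*(\w+)\]")

def split_reply(text):
    # Collect all action names, then strip the tags from the speech text.
    actions = ACTION_TAG.findall(text)
    speech = " ".join(ACTION_TAG.sub(" ", text).split())
    return speech, actions

speech, actions = split_reply("Hello there! [ACTION: wave] Nice day, isn't it?")
# speech  -> "Hello there! Nice day, isn't it?"
# actions -> ["wave"]
```

One nice property of this approach is that it works with any local model, even ones that were never trained for structured tool calling.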

So I took a look at the code, and if I'm not mistaken, you're not sending the tool list when using a local LLM?


Are you sending the tools like this,

messages=[{"role": "user", "content": "What is the weather like in Paris today?"}],
tools=tools

or as an extra message, like this?

messages=[
    {"role": "user", "content": "What is the weather like in Paris today?"},
    {"role": "tools", "content": "{{tool_list}}"},
]

Neither; look up tool calls in the OpenAI API docs.
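For context, in the OpenAI chat completions API the tools go in a top-level `tools` field alongside `messages`, described by a JSON schema, not as a message with a special role. Here's a rough sketch of the request body (the `get_weather` function and the model name are placeholders for illustration):

```python
import json

# Tool definitions: each tool is a function described by a JSON schema.
# "get_weather" is a hypothetical example function, not a real API.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# The request body: "tools" sits next to "messages" at the top level.
request_body = {
    "model": "gpt-4o",  # placeholder model name
    "messages": [
        {"role": "user", "content": "What is the weather like in Paris today?"}
    ],
    "tools": tools,
}

print(json.dumps(request_body, indent=2))
```

The model then replies with `tool_calls` entries (function name plus JSON-encoded arguments) that your code executes, feeding the results back as `role: "tool"` messages. Many local-model servers expose an OpenAI-compatible endpoint that accepts this same shape, though how well a given model actually uses the tools varies a lot.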