(+6)

If he loses from using too much power, then he should use less power. You don't need ChatGPT for that kind of reasoning...

Also, you can't even rely on LLMs for reasoning, because they make things up all the time. People have poisoned themselves by following ChatGPT's medical advice, for example.

ChatGPT is also agreeable, which feeds into people's biases. If you ask GPT whether your family is trying to poison you, it will agree and recommend self-defense instead of psychiatric help.

Relied on that way, LLMs will undermine your critical thinking, your health, and your mind.

(-2)

ohhhh okay :)