>>149876997
>there's no good reason to do it
There is always demand for it, driven by a demand for companionship that went largely unexplored until this latest LLM boom - it's clearly a market that exists.
You don't put feelings in an industrial manufacturing bot, but it's a mistake to see that as the only place robots will be applied.
Also,
>the irrationality that comes with actual emotions
is just a fallacy. Emotions have logical reasons for existing - when you really get down to it, the people with the truly irrational responses are almost invariably the ones WITHOUT properly functioning emotional regulation, including those with reduced or absent emotional responses altogether.
It's an emotional response that can tell you to doubt and re-check something you currently believe, or give you the desire to poke holes in your existing models of the world to improve them. In that specific application, emotion is a cheap, reliable, computationally simple optimization and steering algorithm - and that's far from its only logical application. Fear, for another example, induces internal simulation of worst-case hypotheticals, which bypasses the need to simulate the whole gamut of possibilities: favor caution first, then progress forward more attentively, refining the worst case as incoming information dissuades it and lowers the estimated risk. A lack of that capacity can very easily lead to self-destructive risk-taking.
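To make that concrete, here's a minimal toy sketch in Python - my own illustration, every name and number in it (risky_payoff, SAFE_PAYOFF, the decay schedule) is made up, not a claim about how brains do it. "Fear" is just a pessimism weight on the worst outcome observed so far, and it decays as samples accumulate, so blanket caution gives way to more attentive forward progress.
[code]
import random

SAFE_PAYOFF = 0.5

def risky_payoff():
    """Hidden outcome distribution: good on average, rarely disastrous."""
    return -5.0 if random.random() < 0.05 else random.gauss(1.0, 0.3)

samples = [risky_payoff()]  # one cautious probe to seed the estimate
total = 0.0
for step in range(100):
    fear = 1.0 / (1.0 + len(samples))  # high early, decays as info arrives
    worst = min(samples)
    mean = sum(samples) / len(samples)
    # Score the risky action by blending its average with its worst
    # observed case: one running min() instead of enumerating every outcome.
    risky_score = fear * worst + (1 - fear) * mean

    if risky_score > SAFE_PAYOFF:
        payoff = risky_payoff()
        samples.append(payoff)  # acting is also observing
        total += payoff
    else:
        total += SAFE_PAYOFF    # caution wins: take the safe action

print(f"total payoff: {total:.1f}, risky attempts: {len(samples) - 1}")
[/code]
The point is the cost: tracking one running worst case is vastly cheaper than modeling the full outcome distribution, which is exactly why a crude signal like this is a viable shortcut for a survival-constrained organism.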
People who don't understand emotions tend to be the ones in denial that emotions solve real, tangible problems efficiently, because they struggle to come to terms with the fact that their lack of emotional intelligence really IS a cognitive deficiency.
Emotion didn't arise from purely random noise. It clearly conferred survival advantages, and it is highly conserved as an evolutionary trait. It appears either to have evolved multiple times independently or to be STAGGERINGLY basal.