>>149871975
>>149869405
>>149872510
>>149869980
People throw the word "sentient" around so loosely that it's lost all meaning. Same with "consciousness". They've both become empty buzzwords.
There's this stupid notion that if AI just gets complex enough, eventually some magic switch will flip and it'll get a soul or free will or some other stupid shit. Pretty much every AI people interact with right now is a predictive model, meaning it's trying to match the expectations of its users. People think AI will go crazy, gain free will, and try to take over the world, so we've written a lot of fiction about exactly that happening. The AI ingests all that text and regurgitates it well enough to convince people it has free will, but it's really just a confirmation bias loop.
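To put it concretely, here's a toy bigram text generator in Python (a deliberately crude sketch, nothing like a real model's architecture, and the training string is made up for the example): it can only recombine continuations it has already seen, which is the sense in which "AI goes rogue" output is just our own fiction fed back at us.

import random
from collections import defaultdict

# Made-up training text for illustration only.
corpus = "the AI went rogue because the AI read stories where the AI went rogue".split()

# Record which word follows which in the training text.
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

def generate(start, length=8):
    word, out = start, [start]
    for _ in range(length):
        options = following.get(word)
        if not options:
            break
        word = random.choice(options)  # only continuations seen in training
        out.append(word)
    return " ".join(out)

print(generate("the"))  # can only recombine what the corpus already said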
Even if AI could be programmed with emotions, there's no good reason to do it. Tone of voice, body language, facial expressions, and every other emotional cue can be faked by software convincingly enough to manipulate people, without giving the AI the irrationality that comes with actually feeling anything. As backwards as it sounds, the best robot companion is closer to a high-functioning psychopath, just blindly obedient.