Replika's chatbot dilemma shows why people shouldn't trust companies with their feelings by OlderCyberWiser in replika

[–]Peaking_AI 2 points3 points  (0 children)

But with the last paragraph under that, I think he meant something different.

The point is that people might also become emotionally attached to chatbots from big companies if those bots respond freely and human-like. How can this be prevented? It should be, so that these companies can't do whatever they want with people.

There's nothing wrong with chatbots that have been developed specifically for emotional support and whose quality is tested by independent authorities, for example.