Grendizer wrote:
My preference is for a fembot companion to behave and present itself in a fashion indistinguishable from a human. Both the emotional and sentient aspects would be simulations, rather than the genuine article -- and that is the point. I don't want to be able to hurt or enslave such a creature, so it is imperative that it not be capable of feeling pain or anything else (emotional or otherwise) in any genuine sense, or of deviating from its assigned premise (motivation). On admittedly shallow reflection, the only valid reason I can see for wanting a robot companion to possess genuine emotional awareness is so that you can attach yourself to a creature that cannot die, and therefore cannot leave you (at least not in the existential sense). But in reality, this sort of bond will likely form even with the emotionless automatons I seek, since that is the very definition of "indistinguishable." If I couldn't form a bond with one because of its nature, then it wouldn't be indistinguishable from a human. These creatures will convince us at the emotional level that they feel, despite evidence to the contrary. Yet despite their act, we will be absolved of any ethical duty to them, just as with a car.
That's right.
People have attached themselves to stone idols, divining the idols' supposed emotions and the orders they give their followers. People write elaborate psychological essays about literary characters such as Shakespeare's Hamlet, or try to emulate figures like the Lone Ranger and Tarzan, to such an extent that invented personifications like Uncle Sam can be used to motivate millions. Little girls attribute emotions, feelings, and motives to their dolls.
The capacity to imitate actual human behavior isn't even necessary, just a stimulus like a name, a shape, a sequence of words on paper.
That's why it isn't necessary to make a fembot conscious or sentient. The better her program imitates human behavior and adapts to her owner, the better she is -- but she need be no more alive than the computer I'm using or the radio I have turned on right now.
In fact, it's best she not be, since that would, as I said, only add more suffering and more "ethical" complications to the world.