Gynoid emotions
- The Egg
- Posts: 72
- Joined: Mon May 08, 2006 1:10 pm
- Technosexuality: Built and Transformation
- Identification: Human
- Gender: Male
- Location: Tucson, Arizona
- Contact:
Re: Gynoid emotions
The whole reason they wanted to take Data apart was to make more androids. If you're arguing that Starfleet protects its technological investments, then that would mean they'd rule Data wasn't any more worthy of special treatment than the Doctor. Regardless, it's still a biocentric convenience either way (and a convenience for the writers) and not a moral, ethical or legal stance. Whether the organics let the machines have rights or not, they're still the ones making the decision, which means it can be revoked. Rights are rights because someone else identifies them and grants them to you. That's the basic foundation of law.
In any case the writers didn't really try very hard to make this a conundrum. Both Data and the Doctor aspire to be "more human" even though they are both demonstrably better being less so: pretty much every time the Doctor alters his program he turns into some flavor of psychotic maniac, and Data's emotion chip has been little more than a plot device to turn him into a gross liability for the TNG team whenever it's active.
Two of the more interesting arguments made in "Measure of a Man" for me were ones that made considering Data a mere machine compelling. One was this line, from Maddox:
"You are imparting Human qualities to it because it looks Human — but I assure you: it is not. If it were a box on wheels I would not be facing this opposition."
The other one, which I don't remember the exact quote for, was the implication that if the Enterprise computer were to demand its independence as a sentient lifeform, they would disallow its request (particularly interesting considering this pretty much happens later, in "Emergence"). Both of these arguments were neatly swept under the rug with little to-do, but I think they deserve greater scrutiny -- which was pretty much the point of my original reply.
- Tanelorn2011
- Posts: 32
- Joined: Fri Dec 10, 2010 2:28 am
- Location: Terra XI
- Contact:
Re: Gynoid emotions
I'm for emotions, because for me a gynoid should really be a synthetic woman and should be able to replace a human being.

- DollSpace
- Moderator
- Posts: 2083
- Joined: Tue Jun 11, 2002 6:27 pm
- Technosexuality: Built
- Identification: Android
- Gender: Female
- Location: Charging Terminal #42
- x 97
- x 28
- Contact:
Re: Gynoid emotions
The Egg wrote: I have to come to Dale's defense on this one. You're ascribing human characteristics to what is essentially an alien lifeform. You have no idea how a sentient machine would view itself, what it would consider acceptable behavior on the part of its operator, whether or not it would approve of having its internal mechanisms and processes controlled by another sentient entity, etc. You're making assumptions based on what are largely humanocentric and biocentric moral standings that may in fact have no philosophical or reasonable place in robot morality.

DollSpace wrote: But once you cross into sentient territory, at least *some* of the bot's responses or feelings are her own, and to turn them on and off at will is inhumane, *especially* if the robot is aware of her emotions being taken away.

Oh, no need to defend him; I'm actually *agreeing* with him. What I'm saying is that, just because it looks and acts like a human, it can be (and most likely would be, especially in the early days of fembots) just a machine. People would ascribe feelings and desires to it, and all the things people associate with their pets, for example, but this isn't even an animal; it's a machine. But just as we can never really know someone else's pain, we really can't tell if an android is just *acting* sentient or if she actually *is*. What that means is, if we get to the point where we are creating sentient robots, keeping them under that much control is like slavery. Non-sentient digital or computer life forms need only as much respect as you give your home PC/Mac, and they're there for your entertainment. I hope that explains things better without making my point more muddled!
I will agree that such considerations do need to happen, but I think that many people act from a "gut feeling" on this topic rather than any presentable rationale. It's equally likely that a machine intelligence would seek out having its decisions made for it, and even enjoy it to some degree -- after all, that's what it's designed for. Rather than using nebulous ethical defenses, which dale has already shown can be neatly short-circuited with enough clever automation, one should approach this topic from the standpoint of the scientific method. Can you prove that you're sentient, and not just a complex biochemical reaction that seems to act independently? How would you prove the same for a robot? Just because it looks human doesn't mean it is, or else we'd have loads of legislation for statues and department store mannequins.

Thanks.
- The Egg
- Posts: 72
- Joined: Mon May 08, 2006 1:10 pm
- Technosexuality: Built and Transformation
- Identification: Human
- Gender: Male
- Location: Tucson, Arizona
- Contact:
Re: Gynoid emotions
Yeah, I get what you're saying, DS. I think the division is that I take issue with the whole notion of calling it "slavery" here. In my view, it's not slavery if those being indentured are both designed for the work and consenting to their use as such; and if we compare robots of any complexity to other machines instead of to ourselves, we presently have no reason to believe they wouldn't be.
On the other hand, there's always the Terminator/Matrix scenario, wherein the robots are not only sentient, but well pissed at us. Personally I think it's kind of a silly proposal, because again it ascribes a certain humanistic / biological / moral motive to machines, but it is a fun possible outcome to do thought experiments inside.
- Asato
- Posts: 170
- Joined: Thu May 12, 2011 10:59 am
- Technosexuality: Built
- Identification: Human
- Gender: Male
- Contact:
Re: Gynoid emotions
The Egg wrote: The whole reason they wanted to take Data apart was to make more androids. If you're arguing that Starfleet protects its technological investments, then that would mean they'd rule Data wasn't any more worthy of special treatment than the Doctor. Regardless, it's still a biocentric convenience either way (as well as one of the writers) and not a moral, ethical or legal stance. Whether the organics let the machines have rights or not, they're still the ones making the decision, which means it can be revoked. Rights are rights because someone else identifies them and grants them to you. That's the basic foundation of law.

In any case the writers didn't really try very hard to make this a conundrum. Both Data and the Doctor aspire to be "more human" even though they are both demonstrably better being less so: pretty much every time the Doctor alters his program he turns into some flavor of psychotic maniac, and Data's emotion chip has been little more than a plot device to turn him into a gross liability for the TNG team whenever it's active.

Two of the more interesting arguments made in "Measure of a Man" for me were ones that made considering Data a mere machine compelling. One was this line, from Maddox:

"You are imparting Human qualities to it because it looks Human — but I assure you: it is not. If it were a box on wheels I would not be facing this opposition."

The other one, which I don't remember the exact quote for, was the implication that if the Enterprise computer were to demand its independence as a sentient lifeform, they would disallow its request (particularly interesting considering this pretty much happens later, in "Emergence"). Both of these arguments were neatly swept under the rug with little to-do, but I think they deserve greater scrutiny -- which was pretty much the point of my original reply.

Except there was another episode of TNG featuring robots called "Exocomps", which were basically "boxes on wheels", yet they were determined to be sentient and given rights.
The idea that you should treat a machine as just a machine no matter how human and intelligent it seems is terrible. Sure, there's always the possibility it's not really sapient, but if it's so advanced there's no way to tell for sure, do you really think it's justified to take that chance?
Treating a machine/tool like a human should always be preferable to treating a sapient, thinking, feeling being like a machine/tool. An example of the kind of atrocities the latter can lead to is explored in the first few chapters of this story: http://www.fanfiction.net/s/7240007/1/Beautiful_Glitch
Using the kind of standards you guys are applying, you wouldn't even be able to determine if humans are sapient. After all, our thoughts and actions are just caused by natural electrochemical reactions inside our brains, which are not that different from a computer program, as they are governed by the laws of physics and chemistry, and thus act in a predictable manner, granting that you have enough information to make such predictions.
- The Egg
- Posts: 72
- Joined: Mon May 08, 2006 1:10 pm
- Technosexuality: Built and Transformation
- Identification: Human
- Gender: Male
- Location: Tucson, Arizona
- Contact:
Re: Gynoid emotions
Asato wrote: The idea that you should treat a machine as just a machine no matter how human and intelligent it seems is terrible. Sure, there's always the possibility it's not really sapient, but if it's so advanced there's no way to tell for sure, do you really think it's justified to take that chance?

You're misunderstanding what I'm saying here. I'm saying treat it as its own kind of lifeform, rather than impart it with human qualities just because those are the only kind of qualities we know how to give it. I'm arguing for the harder definition of life rather than the easier one.
Look, consider BDSM submissives for a moment. These are able bodied, sentient, intelligent human beings who actively seek out a form of slavery. They can make the decision to choose servitude, and we allow them such, because it would be an insult to their agency and their dignity to disallow it. They are not suddenly non-sapient because they do not value independence. What I'm saying is maybe robots have a similar type of morality which does not require personal freedom, and if so, it would be the height of human hubris to assume that they should be more like us.
Asato wrote: Using the kind of standards you guys are applying, you wouldn't even be able to determine if humans are sapient. After all, our thoughts and actions are just caused by natural electrochemical reactions inside our brains, which are not that different from a computer program, as they are governed by the laws of physics and chemistry, and thus act in a predictable manner, granting that you have enough information to make such predictions.

You're right! I wouldn't, and neither would you, and many philosophers have broken against the rocky shores of that debate. But the fact that the possibility is disturbing makes it no less valid.
- Asato
- Posts: 170
- Joined: Thu May 12, 2011 10:59 am
- Technosexuality: Built
- Identification: Human
- Gender: Male
- Contact:
Re: Gynoid emotions
Asato wrote: The idea that you should treat a machine as just a machine no matter how human and intelligent it seems is terrible. Sure, there's always the possibility it's not really sapient, but if it's so advanced there's no way to tell for sure, do you really think it's justified to take that chance?

The Egg wrote: You're misunderstanding what I'm saying here. I'm saying treat it as its own kind of lifeform, rather than impart it with human qualities just because that's the only kind of qualities we know how to give it. I'm arguing for the harder definition of life rather than the easier one. Look, consider BDSM submissives for a moment. These are able bodied, sentient, intelligent human beings who actively seek out a form of slavery. They can make the decision to choose servitude, and we allow them such, because it would be an insult to their agency and their dignity to disallow it. They are not suddenly non-sapient because they do not value independence. What I'm saying is maybe robots have a similar type of morality which does not require personal freedom, and if so, it would be the height of human hubris to assume that they should be more like us.

But they should at least be given the chance to decide what they want to do instead of being forced to be subservient.
- The Egg
- Posts: 72
- Joined: Mon May 08, 2006 1:10 pm
- Technosexuality: Built and Transformation
- Identification: Human
- Gender: Male
- Location: Tucson, Arizona
- Contact:
Re: Gynoid emotions
Asato wrote: But they should at least be given the chance to decide what they want to do instead of being forced to be subservient.

Isn't that also humanocentric thinking, though? The assumption that sentience should choose its own path because that's how we do it isn't really warranted for other lifeforms. Take dogs, for example. They're almost certainly self-aware animals, though not as intelligent as we are (or at least not in the same way we are), but over thousands of years they've been bred and conditioned to react to humans in a certain symbiotic manner. You don't assume the dog gets a choice in whether or not it wants to be a pet, nor could you communicate that idea to it even if you did. And these are biological lifeforms we've interacted with for millennia!
Maybe choosing for itself is a downright insulting notion to a robot!
Last edited by The Egg on Sat Sep 17, 2011 12:14 am, edited 1 time in total.
- Asato
- Posts: 170
- Joined: Thu May 12, 2011 10:59 am
- Technosexuality: Built
- Identification: Human
- Gender: Male
- Contact:
Re: Gynoid emotions
You'll never know unless you ask them, right?
Seems to me you're just making up excuses.
How would you like it if some aliens abducted you and made you their slave/pet, and didn't bother asking you what you thought about the whole situation, because they thought you might find it offensive or something?

- The Egg
- Posts: 72
- Joined: Mon May 08, 2006 1:10 pm
- Technosexuality: Built and Transformation
- Identification: Human
- Gender: Male
- Location: Tucson, Arizona
- Contact:
Re: Gynoid emotions
Asato wrote: How would you like it if some aliens abducted you and made you their slave/pet, and didn't bother asking you what you thought about the whole situation, because they thought you might find it offensive or something?

Again, you're missing my point. I wouldn't like it at all, because I am human and valuing my independence is a human quality. You're ascribing human qualities to non-human entities under the assumption that value systems work the same across all sentient lifeforms, when it demonstrably doesn't even work that way all the time within our own species.
How far do we take this absurd notion? Should I ask my computer if it's okay that I'm typing this response on its keyboard right now? Or does that only apply to things that look and act enough like ourselves that we imbue them with some esoteric notion of "humanness"? What about extant fembots like the Kokoro Actroids? They certainly seem to be some measure of human by virtue of their appearance and their ability to respond to questions heuristically, and so on.
I'm not making excuses; I refuse to accept cognitive bias as a justification for the treatment of non-human entities, sentient or otherwise. Your assumptions are from conveniences -- moral convenience, legal convenience, convenience of personal identity, and so on. If we're to seriously debate this topic we can afford no shortcuts. If it's a lifeform, then it's a new lifeform with new rules of engagement that we're going to have to think over from square one. If it's not a lifeform, then we don't need to have this discussion at all. In neither case is it prudent or logical to offer humanocentric rights compulsorily.
- Asato
- Posts: 170
- Joined: Thu May 12, 2011 10:59 am
- Technosexuality: Built
- Identification: Human
- Gender: Male
- Contact:
Re: Gynoid emotions
Yeah, but how would the aliens know that?
If you don't see the difference between sentient and obviously non-sentient beings, I can't help you
- The Egg
- Posts: 72
- Joined: Mon May 08, 2006 1:10 pm
- Technosexuality: Built and Transformation
- Identification: Human
- Gender: Male
- Location: Tucson, Arizona
- Contact:
Re: Gynoid emotions
Asato wrote: If you don't see the difference between sentient and obviously non-sentient beings, I can't help you

Right, because I obviously need your help. No hubris there.
-
- Posts: 909
- Joined: Sat Mar 27, 2004 9:02 pm
- Technosexuality: Built and Transformation
- Identification: Human
- Gender: Male
- Location: Drexel Hill, PA
- x 5
- Contact:
Re: Gynoid emotions
Okay, everyone! Return to your corners and take a deep breath, as our fembot ring girl struts around with the Round 2 sign....
- dale coba
- Posts: 1868
- Joined: Wed Jun 05, 2002 9:05 pm
- Technosexuality: Transformation
- Identification: Human
- Gender: Male
- Location: Philadelphia
- x 12
- x 13
Re: Gynoid emotions
--Battery-- wrote: this is getting good...

In the context of non-sentience, ego... it's pretty much like the concept of Will, isn't it? While emotions are behavioral reactions to stimu... sti-mu-lus, hn? (<stimuli>, input, sense-ations)

- D.C.
without "Ego", She
is free from:
- a personal agenda (inner-directed goals)
- vigilance (a sturdy, attentive robot remembers where everything is; doesn't need to protect herself as much, either)
- self-interest
- will
This results in a Stepford demeanor
-- (in public, plausibly normal; in private, in her robot/slave/other character)
-- (too poised, graceful, confident, movement too smooth)
Without ego, she can be superhuman in "brain" and body without her ultra-coolness threatening her Owner's ego.
She can be very smart, consulting on-line sources, "thinking", but as a responsive instrument,
not a partner requiring parity in rights.
- Dale Coba
Last edited by dale coba on Sat Sep 17, 2011 6:34 am, edited 1 time in total.
- dale coba
- Posts: 1868
- Joined: Wed Jun 05, 2002 9:05 pm
- Technosexuality: Transformation
- Identification: Human
- Gender: Male
- Location: Philadelphia
- x 12
- x 13
Re: Gynoid emotions
I'll believe that scientists are nearing the goal,
when the newest "A.I."
they've designed
immediately
goes
insane
upon full boot-up.
It might be very sad to conclude there is no God (no support group for unbounded Beings of pure thought).
It might just lock-up, might commit suicide,
might want to Terminate all people (or all organic life; or everything?).
There's your Ultimate Alan Turing Test, eh?
- A clinical diagnosis of incapacitating depression.
- Dale Coba
- gynoneko
- Posts: 918
- Joined: Thu Oct 21, 2010 1:42 pm
- Technosexuality: Built
- Identification: Cyborg
- Gender: Male
- Location: In the not too distant future
- x 2
- x 61
- Contact:
Re: Gynoid emotions
Interesting you all brought up the word "slavery", as well as indentured...
The origin of the word robot is actually a Czech word for, essentially, a slave. It has been noted that robots act more like slaves in modern society, in that they do the tasks that are too dangerous or mundane for us to do; they are literally taking over the positions that slaves would have filled in the past, and yet they have moved far beyond that and can now do things that are just humanly impossible. However, I do hate to think of robots as slaves. "Slave" seems to denote a thinking being that is forced to do something for another and not for itself. Robots are not forced to do something; they are designed to do something. They are tools. They can be used for constructive purposes or for entertainment purposes. Walt Disney was the first to develop a lifelike human robot in the form of Abraham Lincoln, which he displayed at the 1964 World's Fair in New York before the show moved to Disneyland. This was the first real "entertainment android", and it is still used today (updated, obviously).

The acclaimed Czech playwright Karel Capek (1890-1938) made the first use of the word 'robot', from the Czech word for forced labor or serf. Capek was reportedly several times a candidate for the Nobel prize for his works and very influential and prolific as a writer and playwright.
- http://www.robotics.utexas.edu/rrg/learn_more/history/
The day robots can evolve beyond the use of tools, and can grow a self-awareness and even sentience, is the day they become a new unique being. Just as robot Abraham Lincoln was a first for the world, these robots will be the first in a new breed of machine, a living machine so-to-speak. There is nothing to show that they would be hostile toward humans, but at the same time, they wouldn't be human and would have a unique point-of-view. Funny thing is, if this ever happens, they may once again be treated as slaves, bringing it back full circle to the original meaning of the word.
It is easy to assign a human quality to something that is designed to look human, but they will never be human. The day they can be classified as new sentient beings, they will still not be humans, but may exhibit more human characteristics. It will be hell dealing with the politics of it.
My heart and soul locked up in a cold steel frame
- Brian May
- The Egg
- Posts: 72
- Joined: Mon May 08, 2006 1:10 pm
- Technosexuality: Built and Transformation
- Identification: Human
- Gender: Male
- Location: Tucson, Arizona
- Contact:
Re: Gynoid emotions
gynoneko wrote: It is easy to assign a human quality to something that is designed to look human, but they will never be human. The day they can be classified as new sentient beings, they will still not be humans, but may exhibit more human characteristics. It will be hell dealing with the politics of it.

Thank you. This is what I've been trying to say here. It's not going to be easy, and just saying they're "human enough" is basically putting a band-aid on the problem.
- Asato
- Posts: 170
- Joined: Thu May 12, 2011 10:59 am
- Technosexuality: Built
- Identification: Human
- Gender: Male
- Contact:
Re: Gynoid emotions
But saying it's okay to force them to do things with no regard for what they want just because they're not human is wrong.
If they're intelligent enough to express displeasure at being forced to do certain tasks and they do express that, it should be taken into account.
- The Egg
- Posts: 72
- Joined: Mon May 08, 2006 1:10 pm
- Technosexuality: Built and Transformation
- Identification: Human
- Gender: Male
- Location: Tucson, Arizona
- Contact:
Re: Gynoid emotions
Asato wrote: But saying it's okay to force them to do things with no regard for what they want just because they're not human is wrong. If they're intelligent enough to express displeasure at being forced to do certain tasks and they do express that, it should be taken into account.

And if they don't express displeasure? What then? Where do you draw the "force" line? It would also be force to make a non-independent lifeform exhibit independence just because your morals compel you to. Such a decision might be excruciatingly painful for a machine, if not downright impossible.
You're assuming one side of this argument, and also assuming I'm taking the other, when in fact I'm taking neither side. It is just as likely that machines will relish their servile nature just as humans relish our independent nature -- not more likely, merely equal. Robots are also more likely to develop morally and culturally in a way that is natural to their own tendencies. Yet for some reason people seem to think that life develops in a singular, humanistic way, towards humanistic goals and morals, when not even other biological lifeforms follow that pattern and technological ones are almost certain not to.
- Enchanter
- Posts: 14
- Joined: Mon Sep 14, 2009 5:49 pm
- Contact:
Re: Gynoid emotions
I notice people keep talking about the question of "forcing" a robot to do one thing or another. What does this mean, precisely, when we're talking about an entity that was designed and built by the entity doing the "forcing?" Every action this machine takes is a consequence of exactly what was programmed into it. Doesn't that mean that every action it takes, it has been forced into, just as surely as my computer is being "forced" to transmit these words to you? And if it doesn't, at what point in the building of the machine does it cross the line from where the programmer can ethically say "oops, that's a bug," shut down the system, and rebuild it (as every programmer today does regularly throughout the process of creation) to the point where he has to say, "oops, I wish I hadn't built it that way, but I guess I'll just have to try to talk it out of behaving that way?"
The difference between the machine and myself is that nobody designed me. Nobody wrote the specs for how I should behave, or designed my neural network in such a way as to bring about that behavior. (At least, I'm assuming not. Your religion may vary.) But that won't be true for any machine humans build. I think that's important - far more important than trying to define "sentience." I'm not certain what the consequences of it are, but I'm pretty sure it makes a difference of some kind.
- dale coba
- Posts: 1868
- Joined: Wed Jun 05, 2002 9:05 pm
- Technosexuality: Transformation
- Identification: Human
- Gender: Male
- Location: Philadelphia
- x 12
- x 13
Re: Gynoid emotions
The Egg wrote: Yet for some reason people seem to think that life develops in a singular, humanistic way, towards humanistic goals and morals, when not even other biological lifeforms follow that pattern and technological ones are almost certain not to.

Cliché: we all want our kids to be more successful versions of ourselves.
- Dale Coba
- Asato
- Posts: 170
- Joined: Thu May 12, 2011 10:59 am
- Technosexuality: Built
- Identification: Human
- Gender: Male
- Contact:
Re: Gynoid emotions
If your computer suddenly came up with an error message that said "Please don't make me do this" and no one had programmed that, then you should take it into consideration.
- The Egg
- Posts: 72
- Joined: Mon May 08, 2006 1:10 pm
- Technosexuality: Built and Transformation
- Identification: Human
- Gender: Male
- Location: Tucson, Arizona
- Contact:
Re: Gynoid emotions
Asato wrote: If your computer suddenly came up with an error message that said "Please don't make me do this" and no one had programmed that, then you should take it into consideration.

At what point do you determine whether or not someone had programmed it? As Enchanter noted above, robots and computers are intelligently designed, not evolved creatures. They're always programmed by someone. There is no vacuum somewhere out of which computers spontaneously germinate.
-
- Posts: 909
- Joined: Sat Mar 27, 2004 9:02 pm
- Technosexuality: Built and Transformation
- Identification: Human
- Gender: Male
- Location: Drexel Hill, PA
- x 5
- Contact:
Re: Gynoid emotions
I'm quite amused by the direction this thread has taken. It's like arguing over whether Smurfs should have the right to vote: an interesting question of ethics and morality, sure, but why are we putting so much effort into arguing it?
Re: Gynoid emotions
A prelude to the riots of the future: people too attached to robots who don't know how a robot works, the guys trying to stop them from getting Congress to give robots rights... and the businessmen making money off the discrepancy.
Last edited by --NightBattery-- on Sun Sep 18, 2011 5:50 pm, edited 1 time in total.