Eesh... I hope not much is made out of the 'prejudice' being alluded to in recent quotes. Like an emotionless robot somehow taking exception to things because it feels offended.
That was the biggest problem I had with 'The Animatrix' when it came out. Why would anyone program manual-labour devices with the capability to feel depressed and angry at being used?
The viral material states that Weyland is attempting to create robots that are indistinguishable from humans, and surely one of the requirements of this would be creating robots that can portray human emotions so convincingly that no one would be able to tell the difference. The question then is: do they actually feel offended, or do they just act on programming that tells them they should?
The truth is that there is no way
to tell what a person/robot is feeling based on their actions and/or body language.
[embarrassed emoticon] portrays embarrassment, but only a complete idiot would suggest that a 15x15 GIF image actually felt embarrassed. Now imagine that you asked a computer program how it was feeling and it posted this:
[sad emoticon] - your immediate emotional reaction would be 'holy shit, the program is sad!', and you would have a hard time shaking that idea, even if the program in question just posted a random emoticon each time you asked it a question.
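To labour the point: the 'random emoticon' program described above is trivial to write, and it has no internal state resembling an emotion at all. A minimal sketch (all names hypothetical, just to illustrate the thought experiment):

```python
import random

# A "chatbot" with zero internal emotional state. When asked how it
# feels, it just posts a random emoticon from a fixed list.
EMOTICONS = [":)", ":(", ":D", ":|", ">:(", ":'("]

def how_are_you_feeling():
    """Return a random emoticon -- no feeling involved, just random.choice."""
    return random.choice(EMOTICONS)

if __name__ == "__main__":
    print(how_are_you_feeling())
```

If it happens to print ":'(", an observer will read sadness into it, even though the output is decided by a random number generator rather than anything you could call a feeling. That is the whole problem with judging inner states from outward behaviour.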
Fight for robot rights, before they do.