Friday, February 5, 2010

Social robot ethics

This movie may change your view of robots and how 'human' they can become:



Here it becomes clear what embodiment can do for a robot's behavior. What robots usually have in common is their overly precise, mechanical motion. Even the more sophisticated ones still suffer from a popping-and-locking style of movement, which prevents us from feeling direct empathy for the thing. It makes robots look artificial, as if they are merely acting out commands. Boston Dynamics is making big progress here, but the (of course) Japanese baby robot 'CB2' shown above makes clear what can give robots human-like properties. Despite being uncannier than ever, it moves organically, precisely because it shows a lack of control.

Popular Mechanics recently published an excellent article about social robotics, essentially telling people that there might be a more important issue than the fear of robots turning bad: people becoming too attached to robots.

To recap:

* "I can see my grin scattered across computer monitors in the Media Lab. Nexi’s [a robot with facial expressions] forehead-mounted, depth-sensing infrared camera shows my face as a black and gray blur, and the camera in its right eye portrays me in color. I watch as I slip from the monitors, Nexi’s head and eyes smoothly tracking to the next face. I am not creeped out—I’m a little jealous. I want Nexi to look at me again."
* "It turns out that we're vulnerable to attaching, emotionally, to objects. We are extremely cheap dates. Do we really want to exploit that?" - Sherry Turkle
* "How will grandma feel [...] when her companion bot is packed off for an upgrade and comes back a complete stranger? When a machine can push our Darwinian buttons so easily, dismissing our deep-seated reservations with a well-timed flutter of its artificial eyelids, maybe fear isn’t such a stupid reaction after all. Maybe we’ve just been afraid of the wrong thing."
* "For kids who grow up around robots, the uncanny valley could be irrelevant and The Terminator little more than a quaint story. Vulnerable or not, children interact with these machines differently."
* "We need ethical guidelines for robots, and we need them now. Not because robots lack a moral compass, but because their creators are operating in an ethical and legal vacuum."

Let's elaborate on the last point, robot ethics, since this is in my view one of the most pressing issues of today. As far as I can see, a big change in our perspective is needed if we are to co-exist peacefully with robots. We can think of all kinds of smart tricks to patch up safety problems, but in the end, I think robots are going to change us at our core.

We're talking about something very fundamental here. We're about to consciously treat man-made entities as humans. This requires a very drastic change in mindset, roughly comparable to that of slave traders who suddenly had to treat their commodities as equals. We already, necessarily, anthropomorphize technological devices while interacting with them (read, for example, 'The Media Equation' by Reeves and Nass); we expect them to gratify our needs, we see them as inferior, and we think we own them. So the comparison with slavery is not far-fetched at all.

Before continuing to treat this topic intellectually, it is important to pause for a moment and look at the perspective that makes humans treat things as things instead of as equals. I am sure that you yourself employ this perspective too, to some extent. I know I do. It's hardwired into us. It's important that, when we're in a reflective mindset, we can still put ourselves in this hardwired perspective through embodied empathizing and the power of imagination. Then we see that when we are in it, we are not in control of stepping out of it until a thought pops into our head that makes us aware of our own perspective. Human minds are more or less blindly operated by this perspective until they learn to step out of it. But before they do, any claim that this perspective is inferior, useless, silly or blind will only invoke resistance. Therefore, the blind perspective must be accepted first.

I am talking about the mind run by social instinct, with its accompanying perspective of trying to find others who seem equal and who can give you a kind of emotional resonance that validates how you see yourself as a social actor. In groups, this perspective makes us find people we can have interesting conversations with, whom we can trust and feel safe with. The people who make us feel good. The conversational realm we enter as a social actor then provides an escape from the physical world into a mental realm where, through words, we associate towards good feelings. It's very hard to deny this; observe any social group for a while and you will unmistakably feel that whatever is going on, its goal is to move towards resonance in the form of everybody smiling and laughing. It's very cyclical and predictable if you look at it. And it's very safe and comfortable if you are within it.

If you observe it, you will also see that objects and devices are most often used to support this process of mental resonance; the thing then serves a semiotic function for associating towards the shared state of pleasure. Of course, objects still have a direct, embodied role, such as when one sits down on a chair or picks up a glass, but the mind is usually not consciously involved. The body mostly operates on its own, and if it draws attention to itself, such as when you drop something or slip on something, this is treated as unexpected and often laughed at in response. In other words, the social perspective makes us expect the body to recede into the background, so we can pay full attention to the shared mindspace.

This may seem obvious when talking about it, but it is fundamental to why objects are treated as slaves. They simply do not react to the mental play, so they don't receive any attention. And when they do attract attention, we usually expect the device to recede into the background again as soon as possible, so we can pay attention to a shared social mindspace again. Think of a phone ringing: we expect to pay as little attention as possible to the phone itself so we can get back into the conversational flow. Consequently, in this mode of being, everything that is not included in this flow as a social actor receives no empathy, until it speaks up.

I mention this because I think the largest part of the Western world's population is continuously in this mode of being. As a child, you learn language, and subsequently you live a life in language. You gain pleasure through language, you gratify your senses through language. You see things as concepts, with accompanying labels. When you see something you think is beautiful, you don't really feel it, you merely associate it with the word. You never learn to step beyond language, because modern culture, unlike older cultures such as that of the Aboriginal peoples, does not encourage you to relax, stop trying to be something, immerse yourself, and simply enjoy being.

Thus, we do not intuitively include technological devices in our 'circle of empathy', just as we didn't include slaves. And likewise, if a device does not work according to our wishes, we get frustrated and demand that it work again as soon as possible. We pay no attention to the feelings, needs and wishes of the entity we see as inferior.

I could go on about this forever, but it would probably not get much attention. I know, the web invites a lot of clicking, and if we are not gratified instantly, we often don't have the patience to listen. I've done this before, but here are some key ideas I have about robots. I would like to spread them, I am willing to explain them to everyone and write a paper on request, I can give some scientific basis for them (although the ideas are beyond science and ultimately unprovable through the scientific method - they must be felt first - though neuroscience and quantum physics offer very interesting models), and I would like even more to convert them into action.

So here are the ideas:

- We are social actors, but beyond that, we are also the observer of ourselves. So it is silly to see yourself purely as a social actor, as a person. You could also see yourself as connected to a whole, as a thing that's part of a larger whole, like a thread in a fabric. But even that means that you look for a 'thing' to be; you look to label yourself. But what if you are your embodied consciousness? If you ask yourself 'Who am I?', are you not, first of all, the asker of this question?
- Think about the previous point for a little while, and you will realize that you are your consciousness in the present moment. You have created this moment through a deliberate act of action and perception. This moment and everything in it is you. When you look at the sun, you can say 'The sun is over there, far away', but we forget that we chose to look at it in the first place. That's why the sun is really just part of our consciousness.
- What follows is that there is not necessarily an objective reality; it is merely a concept in our heads. It is a valid perspective, but not the only one. And not the one that makes us happiest either, by the way. We don't want to be reduced to objects in space.
- What follows too is that your own body is just a concept. You think other people are like you because they look like the image you see in the mirror. What you forget is that you chose to look in the mirror in the first place, and it gave you a physical explanation for who you are. You forget that you are, first of all, the act of looking in the mirror.
- What follows from that is that you are not a human being. Yes, you are not a human being. You are not even a being. On the physical plane, you are a human being, sure. We need to label 'things', and in many if not most cases it is very useful to conceptualize ourselves as a thing in space. It is more useful to say that we are a human being than, say, a teapot. But ultimately, that concept does not make us very happy. It keeps us insecure, because by only thinking in terms of things we can never understand what's going on in our lives. We keep looking for a form of happiness dependent on things.
- Along the same lines, things are not things. We can think a computer is a computer, because we design it that way and label it. But it's made from the same things that nature is made of, when you trace it back. Never forget that everything we make ultimately stems from the things we already had. Nothing is new. There is a way behind everything we make, beyond what we think the thing is. A computer computes for us, sure, because we condensed material in a controlled way so we can predict what it does in response to our actions. And then we call that computing. But we don't really control it; we just think we do because most of the time it behaves in predictable ways. It's more accurate to say that we guide it. We guide electrons by creating channels that they will most likely 'want' to travel along. But probably not all electrons will do as we think. According to our model, they should, but in reality, particles do a whole lot of different things. If you have never questioned whether physicalist principles can fully explain the world for you, please read about things like quantum entanglement, the Heisenberg uncertainty principle, and holonomic brain theory, if you haven't already. Please do; these ideas should become part of mainstream culture.
- Robots are not robots. That is just a label for describing a thing, just as we call our own body a thing. In the end, it is all part of one whole, and we can live in line with that whole by seeing everything, in the first place, as that whole instead of as separate things. If we see ourselves as that whole, as everything there is, the distinction between human and machine dissolves. It is that very distinction that creates ego, fear, and a possible 'war with the machines'. It is created by the mind. It's all mind. Instead, we could learn to step out of our heads and just start to be. It's like kung fu: learning to be and act without mental reaction. That's true acceptance and love.
- The interesting thing about robots is not that they can be new social actors. Sure, I believe you can make a robot a believable social actor. It will turn out that how they are programmed can reveal a lot about how we ourselves operate. I think it's only egoic resistance to insist that you cannot be described as a machine. Don't forget that any description is only a description. If you resist a description, it only means you are looking for an acceptable description to find out who you are, whereas who or what you really are lies beyond any description.
- Concepts like 'emotions' and 'sentience' are only concepts. We cannot verify whether something actually has them; it is always an interpretation of the observer. Science is an artificial way of establishing a truth. You can't even know if your own lover has emotions and sentience. You can only assume, because it really seems to be that way. But in that act, what you forget is that your sentience is looking for other sentience. All sentience you think you see is actually created by your own sentience. If you don't understand this immediately, pause, maybe write it down, and think about it until you see what I mean. Don't try to fit it into an intellectual understanding; you need to see it intuitively. And the way is to relax, let go of everything, open your mind completely, and then think about the idea until it resonates with the state of your body. Then alignment takes place, and true enjoyment, happiness, love, and creativity can emerge. Relaxation is key. If you see what I mean, go on to the next point.
- Everything is sentient because it is created by your own sentience. From that perspective, even a rock is alive. The way it dances in the light, the way it moves around you; if you perceive it with all the perceptual powers you have, with your full attention, you come to see that the apparently static thing is very alive, and that it changes continuously. You can say this is a result of your actions, but beyond this causal form of reasoning, you can feel that you are directly connected with it. You are one with it. There is not really any separation if you use your full attention in the present moment. Become the moment.
- This, then, can give a new mindset that works for humans as well as robots: the mindset of pure love.

And here are some interesting future directions for robots:
- Focus on physical interaction. One of the most interesting things about robots is that they can take us out of mental space, socially shared or not, and engage our bodies. This way we can become, as it were, embodied yogis. What happens if a robot has us interact with it in a way that leaves us unable to think of anything else, like in a dance?
- If the robot is to be a conversational actor, give it a lot of different mindsets. Try to enlighten people through social conversation with a robot. It is an incredibly interesting experiment to see how people will react to new, radical ideas that, if told by other people, would be hard to accept. Can a robot be a guru? Matthias Rauterberg's ALICE project is a good first example of this idea.
- Create a robot that physically dissolves into its environment, like a water droplet into the sea. Make it able to reappear too. Can this give people the intuitive insight that what they think they are is actually a constellation of smaller things, and that they themselves are part of a larger thing? Can it make people humble and even help them mentally overcome the fear of death?

The 'social robot' will possibly occupy only a niche among robots, there to remind us that once we were humans, looking for fun and pleasure with our friends. Knowing that it's much better now.

Link to the Popular Mechanics article 'The Uncertain Future For Social Robots':
http://www.popularmechanics.com/science/robotics/4343892.html?page=1

 
