Thursday, April 16, 2009
mind reading robots
The simulation continues as we get closer to the moment when we have fully copied ourselves into abstract beings such as Asimo, one of the main current-day brainchildren of the Japanese. It is now possible to link the robot's movements to our thought patterns: our brainwaves are recorded and analysed so that patterns can be detected which trigger specific responses in the robot.
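The detect-then-trigger pipeline could be sketched roughly like this. Everything here is hypothetical: the feature vectors, the class centroids, and the command names stand in for whatever a real calibration session and robot interface would provide. It is only meant to show the shape of the "semantic interpretative layer" discussed below.

```python
# Minimal sketch of the stimulus-action model: classify a brainwave
# feature vector into one of a few calibrated "thought pattern" classes,
# then map that class to a robot command. All names and numbers are
# hypothetical; a real BMI uses EEG hardware and far richer features.

import math

# Hypothetical per-class centroids from a calibration session:
# each is an averaged feature vector (e.g. band-power features).
CENTROIDS = {
    "imagine_left_hand":  [0.8, 0.2, 0.1],
    "imagine_right_hand": [0.1, 0.9, 0.2],
    "imagine_feet":       [0.2, 0.1, 0.8],
}

# The semantic interpretative layer: detected pattern -> robot action.
ACTIONS = {
    "imagine_left_hand":  "raise_left_arm",
    "imagine_right_hand": "raise_right_arm",
    "imagine_feet":       "step_forward",
}

def classify(features):
    """Nearest-centroid classification of a brainwave feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda label: dist(features, CENTROIDS[label]))

def command_for(features):
    """Full pipeline: features -> detected pattern -> robot command."""
    return ACTIONS[classify(features)]

print(command_for([0.75, 0.25, 0.15]))  # near the left-hand centroid -> raise_left_arm
```

The point of the sketch is that nothing in it is embodied: the mapping from pattern to action is an arbitrary lookup table, which is exactly the interpretative layer the next paragraph takes issue with.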
Of course, this stimulus-action model of behaviour suffers from a lack of direct embodied mediation: there is a semantic interpretative layer sitting between the thought patterns and their effects on the robot. I think it would be nicer if we could have a directly mapped embodiment of humans onto robots or other technological devices, so that our brains update our internal body map through interaction, and we come to feel that the technology is a direct extension of our bodies, or even that it is our body itself. A highly adaptive and evolved body map might even lead to a direct way of feeling empathy for robots, and robots feeling empathy for us, because we could literally feel what it is like to be the other. That, to me, is the vision for future technology design, rather than becoming slaves of the simulation.