Ticklish Robots

April 2013
By Bryan Bergeron

By now, I expect that you’re familiar with the android David in the sci-fi film Prometheus. David exemplifies near perfection — the ability to speak and understand language, an apparently perfect humanoid body, the ability to effectively lie and deceive, and a sense of self-preservation. What the film failed to reveal, however, was whether David was ticklish.

I focus on this seemingly insignificant ability because it’s something that most humans demonstrate and because it seems more easily achieved than, say, the ability to deceive others. Think about it — what could be so difficult? You’d need a sensor or two, and the ability to determine if a body part was touching or being touched by someone or something else.

I started an experiment with a 5DOF robot arm, a few pressure sensors, and an Arduino, and quickly discovered that determining who or what is doing the touching is non-trivial.

For example, to determine whether a pressure sensor response is due to movement of the arm or to something external touching the arm, you have to keep a history of the arm’s movement. If the arm has been sitting idle for 20 seconds, then you can probably conclude that the pressure sensor reacted to an external force. If the arm was moving, the reading could have been caused by the arm’s own motion, by an external touch, or by both. So, you’d also need an optical or IR sensor to determine whether someone or something was nearby.
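To make that concrete, here is a minimal Arduino-style sketch of the idea: treat a pressure reading as an external touch only if the arm has been idle long enough, or if an IR proximity sensor also sees something close by. The pin assignments, thresholds, and variable names are placeholders for illustration, not values from my actual setup.

    // Illustrative only: pins, thresholds, and the 20-second idle window
    // are assumptions, not values from the actual experiment.
    const int PRESSURE_PIN = A0;                 // pressure/FSR sensor on the arm segment
    const int IR_PIN       = A1;                 // analog IR proximity sensor watching the arm
    const int PRESSURE_THRESHOLD  = 200;         // reading that counts as "pressed"
    const int IR_NEAR_THRESHOLD   = 400;         // reading that counts as "something close"
    const unsigned long IDLE_WINDOW_MS = 20000;  // the 20-second idle rule described above

    unsigned long lastArmCommandMs = 0;          // update this wherever the servos are driven

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      int pressure = analogRead(PRESSURE_PIN);
      if (pressure > PRESSURE_THRESHOLD) {
        bool armIdle       = (millis() - lastArmCommandMs) > IDLE_WINDOW_MS;
        bool somethingNear = analogRead(IR_PIN) > IR_NEAR_THRESHOLD;

        if (armIdle || somethingNear) {
          Serial.println("External touch suspected");
        } else {
          Serial.println("Probably the arm's own movement");
        }
      }
      delay(20);
    }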

There’s no $39 “Tickle Me Elmo” solution to creating a realistically ticklish android or black box robot. The skin sensors would need to distinguish a light, soft touch from, say, a punch or a scratch. Then, there’s the higher-level processing required to determine whether the soft touch should result in a tickle response, pulling away in a defensive posture, or a slap in the face.
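As a rough sketch of how that first layer of processing might look, the fragment below classifies a touch by its peak force and how quickly the force arrives, then hands the result to a response routine. The thresholds and category names are invented for illustration.

    // Rough classification by peak force and rate of rise; thresholds are made up.
    enum TouchType { NO_TOUCH, LIGHT_TOUCH, SCRATCH, PUNCH };

    TouchType classifyTouch(int peakReading, int riseRate) {
      if (peakReading < 50)                   return NO_TOUCH;
      if (peakReading < 300 && riseRate < 40) return LIGHT_TOUCH;  // gentle and slow
      if (peakReading < 300)                  return SCRATCH;      // light but fast
      return PUNCH;                                                // hard impact
    }

    // The higher-level decision -- giggle, pull away, or "slap" -- hangs off the class.
    void respond(TouchType t) {
      switch (t) {
        case LIGHT_TOUCH: /* trigger the tickle response */        break;
        case SCRATCH:     /* pull into a defensive posture */      break;
        case PUNCH:       /* strong withdrawal */                  break;
        default:                                                   break;
      }
    }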

Local sensor processing isn’t enough. People often start laughing before being tickled. That is, the anticipation alone is enough to evoke laughter. Not only would you need to detect, say, a hand sneaking up on your armpit or another sensitive spot, but you’d also have to analyze the facial expression of the suspected tickler. A negative expression might suggest a mugger, and not someone to be taken lightly.
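Detecting the approach itself is at least sketchable. Assuming an IR distance sensor aimed at one sensitive spot, the fragment below flags an object that is closing in before any contact occurs; judging the would-be tickler’s expression would take a camera and far more processing. The pin and numbers are illustrative assumptions.

    // Anticipation sketch: flag an approaching object before contact.
    // Sensor choice, pin, and thresholds are assumptions for illustration.
    const int APPROACH_IR_PIN = A2;
    int lastReading = 0;

    bool approachDetected() {
      int reading    = analogRead(APPROACH_IR_PIN);  // higher reading = closer object
      bool closingIn = (reading - lastReading) > 15 && reading > 350;
      lastReading = reading;
      return closingIn;
    }

    void setup() {
      Serial.begin(9600);
    }

    void loop() {
      if (approachDetected()) {
        Serial.println("Something is sneaking up");  // cue the anticipatory giggle
      }
      delay(100);
    }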

As impressive as Watson is at playing Jeopardy, I’d be more impressed if it were ticklish. I think that the Turing Test will be passed by computers long before the “ticklish test.” A reasonable question is “So what?” As David said in Prometheus, we humans are most comfortable around our own kind. An android that can’t display the full range of human interactions — including being ticklish (and, as an aside, being able to tickle others) — would hardly pass for human.

From a practical perspective, future servant robots that attend to the aging and sick will need a full range of human-like emotions in order to bond with their patients and owners. As far as my experiments go, I’ve given up on the Arduino as an experimental platform, simply because it doesn’t have the processing power. Instead, I’m working with the Parallax Propeller and the Raspberry Pi — both powerful, inexpensive boards that may have enough speed and memory to monitor multiple sensors and at least guess about the context. Time will tell.

If you’ve worked in this area, please consider sharing your experience with your fellow readers. SV

Posted by Michael Kaudze on 03/20 at 03:37 PM

