Teleabsence

Hello readers!

Today I’ll be updating you on my teleabsence robot.  I was inspired by Leila Takayama’s research on telepresence systems.  I was curious how a person’s interactions with others might be affected if they were not in control of the system.

The first system I wanted to build was a teleabsence robot.  I wanted the robot to turn away from anyone who approached it and tried to speak with the person dialed into the system.

My original idea was to use a microphone to detect if someone was speaking to the robot, and then have the robot roll away.  There was one big problem with this approach: hobby microphones aren’t very sensitive, so the robot wouldn’t be able to tell the difference between someone talking to it, someone talking near it, or the remote user talking over the telepresence system.

My next idea was to use a contactless IR sensor.  I wanted the robot to be able to differentiate between a wall and a person, and between someone talking elsewhere in the room and someone talking directly to the robot.

I wired up the system, had it work for a hot second, and then fried the IR sensor while trying to rewire everything to consolidate three breadboards down to two.  The sensor was pretty expensive (~$15) and it wasn’t all that great at detecting people to begin with, so I decided to take a different approach.

I thought about buying a motion detector (which I think is the easiest way to detect people), but decided instead to change the robot’s behavior so that it would act the way I wanted with the parts I already had.

The first step was to wire the system up.  Since I didn’t buy a full-on robot car kit, I needed to figure out how to set up the motor driver and distance sensor.  I followed this tutorial for the driver, and this one for the ultrasonic distance sensor.
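
Since the tutorials cover the details, I’ll just note the gist of the wiring here.  A minimal pin setup looks something like the sketch below — the specific pin numbers, the dual H-bridge driver, and the HC-SR04-style sensor are stand-ins for whatever your kit uses, not my exact wiring, and the loop that uses them comes in the next snippet.

// Pin setup sketch -- the pins, the dual H-bridge motor driver, and the
// HC-SR04-style ultrasonic sensor are assumptions; match your own wiring.
const int ENA = 5;    // PWM speed pin, left motor
const int IN1 = 7;    // left motor direction
const int IN2 = 8;
const int ENB = 6;    // PWM speed pin, right motor
const int IN3 = 9;    // right motor direction
const int IN4 = 11;
const int TRIG = 12;  // ultrasonic trigger
const int ECHO = 13;  // ultrasonic echo

void setup() {
  pinMode(ENA, OUTPUT); pinMode(IN1, OUTPUT); pinMode(IN2, OUTPUT);
  pinMode(ENB, OUTPUT); pinMode(IN3, OUTPUT); pinMode(IN4, OUTPUT);
  pinMode(TRIG, OUTPUT);
  pinMode(ECHO, INPUT);
}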

Then I loaded in code for a simple object-avoiding robot to make sure everything was working as intended.  I think this code is a little different from what’s on GitHub (the robot doesn’t back up for 2 seconds before turning), but below is an early system test.
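
The logic of that first pass was roughly the following, continuing from the pin setup above.  The 20 cm threshold and the timings are stand-ins rather than the exact numbers in my code:

// Continues the sketch above: read the ultrasonic sensor, drive forward,
// and pivot away whenever something gets within ~20 cm. Values illustrative.
long readDistanceCm() {
  digitalWrite(TRIG, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG, HIGH);                        // 10 us pulse fires a ping
  delayMicroseconds(10);
  digitalWrite(TRIG, LOW);
  unsigned long us = pulseIn(ECHO, HIGH, 30000);   // echo time, 30 ms timeout
  return (long)(us * 0.034 / 2.0);                 // microseconds -> cm (0 = no echo)
}

void driveForward() {
  digitalWrite(IN1, HIGH); digitalWrite(IN2, LOW);   // left motor forward
  digitalWrite(IN3, HIGH); digitalWrite(IN4, LOW);   // right motor forward
  analogWrite(ENA, 180); analogWrite(ENB, 180);      // moderate speed (0-255)
}

void stopMotors() {
  analogWrite(ENA, 0); analogWrite(ENB, 0);
}

void turnRight(int ms) {                             // pivot in place for ms
  digitalWrite(IN1, HIGH); digitalWrite(IN2, LOW);   // left forward
  digitalWrite(IN3, LOW);  digitalWrite(IN4, HIGH);  // right backward
  analogWrite(ENA, 180); analogWrite(ENB, 180);
  delay(ms);
  stopMotors();
}

void loop() {
  long cm = readDistanceCm();
  if (cm > 0 && cm <= 20) {   // something close ahead (0 means no echo)
    stopMotors();
    turnRight(400);           // turn away, then re-check on the next pass
  } else {
    driveForward();           // otherwise keep rolling
  }
  delay(60);                  // small pause between pings
}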

There are many problems with this simplistic system, but one of the biggest was that the field of view just wasn’t very wide.  The robot wasn’t able to see (and avoid) objects that came in at an angle (see video below) or objects that were below or above the distance sensor (e.g. wires or a chair with a low crossbar).

Another problem that’s more obvious in the first video is that the chassis is pretty janky.  One of the screws on the motor had come off, which is why the robot drifts to the left instead of going straight.

However!  The robot was *basically* working.

Shortly after the above videos were taken I fried the IR sensor and decided to go with a behavior change instead of a motion detector.

Originally I had wanted my robot to roam around aimlessly, but quickly go in the opposite direction if anyone tried to approach it.  However, it was quite tricky to tell the difference between a person and any other object obstructing its path.

For v1, I decided to instead have the robot stay still, but turn away if someone tried to approach it.  If the person continued to “bother” the robot (approached it three times), it would retreat to a safe distance.
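
Sketched out, that behavior is basically the loop below, swapped in for the roaming loop above.  The three-strikes counter and the 20 cm “personal space” threshold are illustrative rather than the exact values in my code, and the retreat() move is sketched a couple of paragraphs down.

// v1 behavior: sit still, turn away when approached, and after three
// approaches retreat to a safe distance. Swap this loop in for the one above.
int approachCount = 0;

void loop() {
  long cm = readDistanceCm();
  if (cm > 0 && cm <= 20) {        // someone (or something) got too close
    approachCount++;
    if (approachCount >= 3) {      // "bothered" three times
      retreat();                   // drive off to a safe distance
      approachCount = 0;
    } else {
      turnRight(500);              // just turn away from the approacher
    }
  }
  delay(100);                      // otherwise stay put and keep watching
}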

In addition, I wanted the robot to still be able to avoid objects (walls, etc.) if it encountered something while it was retreating.  As far as I can tell, there is no way to check the sensor input while the robot is moving forward, since in order to code the forward motion I needed to put in a delay, and delay() blocks everything else in the sketch.

What I did to deal with this issue was to add a second ping within the forward function.  If the robot encountered an object, it would simply stop.  Then, after the forward function finished (2 seconds total), the main loop would begin again and the car would turn, since the obstruction it stopped for (20 cm or closer) would still be right in front of it.
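
In code, the retreat move looks roughly like this — again a sketch of the idea rather than a copy of the real thing, which is in the v1 code linked at the end:

// retreat(): drive forward for ~2 seconds total, but ping once mid-move.
// If something shows up, just stop; the main loop will see the obstruction
// on its next pass and turn away. Timings and threshold are illustrative.
void retreat() {
  driveForward();
  delay(1000);                   // first half of the move
  long cm = readDistanceCm();    // the extra ping partway through
  if (cm > 0 && cm <= 20) {
    stopMotors();                // obstruction ahead: stop where we are
  }
  delay(1000);                   // let the full 2 seconds elapse either way
  stopMotors();
}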

Now that I had the code working the way I wanted, I turned to the design of the casing.  I wanted my robot to look approachable, and decided to create something that looked like Miyazaki’s soot sprites from Spirited Away.

Unfortunately, I didn’t take into account that a soot sprite is tiny and adorable, and my robot is (relatively) huge.

My husband thinks it looks like a runaway toupee.  My classmates thought it looked like a cat.  See for yourself below…

But wait, you’re probably thinking, where is the telepresence/absence aspect?  Good eye 😀  I haven’t implemented that yet, since getting this far was all I could manage in the time I had for the class.

If you’re interested, my v1 robot code is here.

Next time, I’ll discuss the soldering and v2 design process.

Thanks for reading!
