How to detect collisions? (When the robot gets "stuck".)


#1

I own an mBot Ranger and I’m getting a handle on programming with both Arduino and mBlock.

I can use the Ultrasonic Sensor to look for obstacles, but of course the robot can’t “see” everything. What’s the best way to determine, in code, when the robot is trying to move or turn… but can’t?


#2

This is an interesting problem which I am tackling as well.

In industrial applications (my day job), we would normally monitor the rotation of a non-driven wheel, such as the pivot wheel on the mBot. We would do this with an encoder of some sort. If the non-driven wheel is not turning…the bot isn’t going anywhere and should undertake extrication maneuvers.

Monitoring the drive wheels would tell you if they had stalled, but if they were slipping it would appear that the bot was still moving.

Since the pivot wheel is small and the space around it is tight, an alternative is to mount another non-driven wheel with its own monitoring system.

To monitor the wheel, an encoder could be used, but it would have to be very low-drag. For the drive wheels, one could position an infrared sensor to watch the holes in the wheel. Reflective spots could be used, or a small protruding flag that rotates with the wheel could be detected. The simplest method is probably to glue one or more magnets near the rim of the wheel; there are many mechanical and electronic sensors (a Hall-effect sensor, for example) that can detect the magnets passing by.
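To turn any of those pulse sources into a stuck detector, something like this Arduino-style sketch would work. It's only a sketch of the idea: pin 2, the 1-second timeout, and the falling-edge polarity are all assumptions to adapt to your wiring, and the motor calls are left as comments since they depend on your library.

```cpp
// Sketch: declare the bot "stuck" when the sensed wheel stops
// producing pulses. Assumes a digital sensor on pin 2 (Hall-effect,
// IR, or a microswitch flicked by a flag) that pulses once per rev.
const int WHEEL_SENSE_PIN = 2;               // hypothetical wiring
const unsigned long STALL_TIMEOUT_MS = 1000; // no pulse for 1 s = stuck

volatile unsigned long lastPulseMs = 0;

void onWheelPulse() {
  lastPulseMs = millis();                    // note the latest pulse
}

void setup() {
  pinMode(WHEEL_SENSE_PIN, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(WHEEL_SENSE_PIN),
                  onWheelPulse, FALLING);
}

bool wheelIsStalled() {
  noInterrupts();
  unsigned long t = lastPulseMs;             // atomic copy on 8-bit AVR
  interrupts();
  return (millis() - t) > STALL_TIMEOUT_MS;
}

void loop() {
  // ... drive forward with your motor library ...
  if (wheelIsStalled()) {
    // ... extrication: stop, back up, turn, retry ...
  }
}
```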

Now…let’s get inventive…using what you’ve got…

If the ultrasonic sensor value seems to be frozen in a very narrow range for several seconds (30?)…you might be stuck. Or you might be in the middle of a very, very large area like a gymnasium. Determining which is the trick!
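In code, that check might look something like this. The distanceCm() stub stands in for however you actually read your sensor, and the 2 cm band and 5-second window are guesses to tune:

```cpp
// Sketch: flag "possibly stuck" when the ultrasonic reading sits
// inside a narrow band for too long while the motors are running.
const float BAND_CM = 2.0;             // "frozen" band width (tune)
const unsigned long FROZEN_MS = 5000;  // how long before we worry

float minSeen, maxSeen;
unsigned long windowStartMs;

float distanceCm() {
  // Placeholder: return a reading from your ultrasonic sensor here.
  return 0.0;
}

void resetFrozenCheck() {
  minSeen = maxSeen = distanceCm();
  windowStartMs = millis();
}

bool readingLooksFrozen() {
  float d = distanceCm();
  if (d < minSeen) minSeen = d;
  if (d > maxSeen) maxSeen = d;
  if (maxSeen - minSeen > BAND_CM) {
    resetFrozenCheck();                // reading moved: start over
    return false;
  }
  return (millis() - windowStartMs) > FROZEN_MS;
}

void setup() { resetFrozenCheck(); }

void loop() {
  if (readingLooksFrozen()) {
    // Might be stuck -- or might just be in a gymnasium.
    // Cross-check with a wheel sensor or the floor trick below.
  }
}
```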

Like all organisms, the bot might be tuned for his environment. In a typical house, the bot might expect to reach a barrier every 30 seconds…a wall, furniture, etc. If that doesn’t happen, the bot should become claustrophobic and begin extrication maneuvers.
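That rule is really just a timer that resets whenever something comes into range. A minimal sketch, assuming 100 cm counts as "reaching a barrier" and 30 seconds is the comfort limit:

```cpp
// Sketch: if we haven't seen any obstacle for too long, assume
// open water (or a stuck bot) and start extrication/search moves.
const float NEAR_CM = 100.0;              // "reached a barrier" distance
const unsigned long LONELY_MS = 30000UL;  // expected time between barriers

unsigned long lastBarrierMs = 0;

void updateClaustrophobia(float currentDistanceCm) {
  if (currentDistanceCm <= NEAR_CM) {
    lastBarrierMs = millis();   // saw a wall or furniture: reassured
  } else if (millis() - lastBarrierMs > LONELY_MS) {
    // Nothing seen for 30 s: turn, wander, or back up until a
    // barrier comes into range again.
  }
}
```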

I see the bot’s plight as similar to the ancient mariners. As long as the bot hugs the coast (a wall), then he has a reference point. If the bot journeys away from the wall out of sight distance (ultrasonic range), then it can be pretty confusing. Monitoring the movement of the stars (the overhead room lights) or variations in the floor might be the only navigation clues.

It would be nice if the IR sensor returned a value instead of being go/no-go. Monitoring slight changes might give clues as to whether or not the bot is moving. Perhaps the light sensor module could be pointed at the floor and used to monitor changes in reflectivity which might indicate motion.
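That floor idea might look like this, assuming the light sensor reads as a plain analog input; the pin and the variation threshold are placeholders to tune on your own floor:

```cpp
// Sketch: point the light sensor at the floor and watch for change.
// If the reading barely varies while the motors are on, the floor
// under us probably isn't moving -- we may be stuck.
const int FLOOR_SENSOR_PIN = A0;       // hypothetical analog pin
const int SAMPLES = 20;
const int MIN_VARIATION = 8;           // ADC counts; tune for your floor

bool floorLooksStationary() {
  int lo = 1023, hi = 0;
  for (int i = 0; i < SAMPLES; i++) {
    int v = analogRead(FLOOR_SENSOR_PIN);
    if (v < lo) lo = v;
    if (v > hi) hi = v;
    delay(25);                         // ~0.5 s of samples total
  }
  return (hi - lo) < MIN_VARIATION;    // little change = little motion
}
```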


#3

Thanks for the great reply! Good to know that someone else is thinking about this problem. :wink:

Your suggestion reminds me of a thought exercise I came up with many years ago. I’ll share it in case others find it interesting. :slight_smile:

I was writing some obstacle detection code for one of my first robots which, like the mBot, only had a single ultrasonic sensor in front. After playing around with it for a while, my options started to feel REALLY limited. Is something directly in front of the robot? Yes or no. How could I make my robot seem intelligent with a basically binary input like that?

Grumbling, I thought to myself, “Trying to program a robot to move around with only an ultrasonic sensor is like me walking around in a dark room with nothing but a long stick to ‘poke’ in front of me.” So I decided to try it! (This could perhaps be an interesting lesson for someone teaching kids about robotics.)

I put on a blindfold and had my wife lead me to a random location in our house. (As I recall, she moved some things around first.) Then, armed only with a dowel rod, I tried to move around. Suddenly, my programming options for the robot didn’t seem so limited! Here are a few things I did:

  • There were lots of things I already knew about human living environments and this knowledge helped me navigate. (Doorways are “X” inches wide, and generally run perpendicular to hallways, which are “X” inches wide, etc.)

  • When I encountered an unexpected obstacle I used the stick to “measure” the size of it by turning left and right to find the edges.

  • It was pretty easy to detect a wall, and once I did I could orient myself perpendicular to it.

  • When I found an opening between two obstacles I used the stick to find the sides, then I went through the “middle” of the opening. (A robot version of this sweep-and-aim trick is sketched after this list.)

  • I slowed down when I was uncertain about my position.

  • Changes in light and sound helped me figure out my location. (My robot had light and sound sensors, as well.)

  • There was a (subtle) sound change when I moved from carpet to hardwood.

  • Doorways and floor transitions (carpet to hardwood, for example) were the best “checkpoints” to help me know my location, or at least to know that I had moved into a new “space”.
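Here's roughly what that "find the edges, aim for the middle" move could look like in code: pivot in small steps, record whether each heading looks open, then face the center of the widest open run. The distanceCm() and turnStepDegrees() helpers are placeholders for your own sensor and motor code, and the step size and "open" threshold are guesses:

```cpp
// Sketch: sweep in place, sampling distance at each heading, then
// face the middle of the widest "open" gap -- the robot version of
// poking a stick left and right to find the sides of a doorway.
const int   STEPS    = 24;      // 24 steps x 15 deg = one full circle
const int   STEP_DEG = 15;
const float OPEN_CM  = 60.0;    // readings beyond this count as open

float distanceCm() {
  // Placeholder: return a reading from your ultrasonic sensor.
  return 0.0;
}

void turnStepDegrees(int deg) {
  // Placeholder: pivot in place by 'deg' degrees with your motors.
}

int findBestHeadingStep() {
  bool open[STEPS];
  for (int i = 0; i < STEPS; i++) {
    open[i] = distanceCm() > OPEN_CM;   // sample, then rotate a step
    turnStepDegrees(STEP_DEG);
  }
  // Longest run of consecutive open headings (wrap-around at 360
  // degrees is ignored here for simplicity).
  int bestStart = -1, bestLen = 0, runStart = 0, runLen = 0;
  for (int i = 0; i < STEPS; i++) {
    if (open[i]) {
      if (runLen == 0) runStart = i;
      runLen++;
      if (runLen > bestLen) { bestLen = runLen; bestStart = runStart; }
    } else {
      runLen = 0;
    }
  }
  return (bestLen == 0) ? -1 : bestStart + bestLen / 2;
}
```

After the sweep the bot has turned a full circle, so turning findBestHeadingStep() * STEP_DEG degrees points it at the middle of the gap; a return of -1 means it's boxed in on all sides and should back up instead.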
