
Quad-Rotor Autonomous Helicopter Eschews GPS in Favor of Lasers. Laz0rz!

by


Here’s another one of them MAVs that’ll soon be flying all over the place. A group of MIT students – Abe Bachrach, Anton de Winter, Ruije He, Garrett Hemann and Sam Prentice (I think I got +10 to my IQ after spelling their names) – developed an autonomous flight system that can sweep and analyze its environment in real time. While a 10-year-old human can also “analyze its environment in real time,” a 10-year-old human can’t fly. Or use lasers to build maps. Thanks to Bachrach et al., the robot in the picture can do both. It’s a quad-rotor helicopter packed with sensors and a laser. Its specialty? The great indoors.

[Image: quad-copter MAV for indoor autonomous flight]

You see, it’s relatively easy (for nerds) to build a robot that can find its way to a target outdoors, thanks to the magic of GPS. But what if you want to search inside a building? GPS won’t help you there, believe me. I researched it extensively. If you don’t have a map of the building or whatever structure you’re infiltrating, you’re screwed. But more importantly, GPS receivers need a strong signal to work, something which you’ll have difficulty obtaining inside a building.

[Image: quadcopter MAV, laser-generated map]

So the aforementioned MIT students came up with a laser scanner that sweeps the helicopter’s immediate area; that, along with some algorithm magic (yeah, this is where I lose track of the definitions), builds a rough three-dimensional map of the autonomous vehicle’s surroundings, as shown in the image above. Now I need you to watch the video below and then answer one tiny question for me: when the narrator uses the words “we” and “us”, does he mean they can control the robot remotely – which means they can see the map that the laser sweep generates – or is the robot fully autonomous?
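For the curious, the heart of the “algorithm magic” is a mapping loop: each laser return gets converted into world coordinates and dropped into a grid of cells. Here’s a toy 2D sketch in Python – a drastic simplification (a real SLAM system also traces the free space along each beam and corrects the vehicle’s pose estimate), and every name in it is made up for illustration:

```python
import math

def update_occupancy_grid(grid, pose, ranges, angle_min, angle_step, resolution=0.1):
    """Accumulate laser hits into a 2D occupancy grid.

    grid:   dict mapping (ix, iy) cell indices to hit counts
    pose:   (x, y, heading) of the vehicle in world coordinates (radians)
    ranges: one distance per beam; None / inf means no return
    """
    x, y, heading = pose
    for i, r in enumerate(ranges):
        if r is None or math.isinf(r):
            continue  # this beam hit nothing within range
        angle = heading + angle_min + i * angle_step
        hx = x + r * math.cos(angle)  # world position of the laser hit
        hy = y + r * math.sin(angle)
        cell = (int(hx // resolution), int(hy // resolution))
        grid[cell] = grid.get(cell, 0) + 1
    return grid
```

Feed it successive sweeps as the chopper moves (with an updated pose each time) and the occupied cells pile up into a rough floor plan of the great indoors.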

Maybe the chopper is the narrator, and it’s talking using lasers. I don’t know. I’m confused.

[via MIT Tech TV]






Comments (6):

  1. Brian says:

    When they say “We” are doing something, they mean they are able to develop algorithms that the vehicle can use to do it autonomously. Yes, it is confusing.

  2. reza barmaki says:

    Hi,
    please help me build my own quad rotor. I don’t know how to connect 4 brushless motors, their 4 drivers, and a 4–6 channel receiver. In fact I want to build it myself, but I don’t know about its electronics kit. Please help me do this – I really want it.
    Thanks a lot for reading this and helping me.

    Bye

  3. Andi says:

    Hi
    As far as I’ve seen, the device is autonomous, but it may well send its data to some computer where you could use a big display etc. – and perhaps an emergency shut-off knob ;) This of course puts additional load on the unit’s controller.

    Personally, I would _want_ the data sent in parallel to some base station for logging, debugging, etc.
    And I remember that the display of the map in one video seemed to be real-time.

    There are development environments that facilitate this type of partly parallel, partly distributed programming.

    You can build a map inside the flying controller and duplicate the computation in a stationary one, to get “broadband access” for displaying and analyzing the current state of the mobile device. You just have to define some interface that gets Y-split: one copy processed locally, one transmitted and processed on a PC using the same algorithms.
    How much bandwidth you need depends on what you transmit – raw sensor data, or data processed to some degree.

    For learning purposes, I’ve seen such a setup (sufficient for that, at least) even on the Lego NXT using leJOS.

    My drive is towards the ASUS Atom with Nvidia ION platform, where one could use ~50 shaders for the computation. But that will still need too much power for a quad this size. (So I will begin with something ground-based, perhaps later a blimp ;)

    Have a look at readily-built quad- or octocopters, but insist on a remote control unit (the one in your hands _and_ the counterpart in the copter) that already has a built-in serial interface. This enables you to seamlessly mix autonomous and RC modes.
    http://pdv.cs.tu-berlin.de/MARVIN/mark_ii_frameset_introduction.html
    They have other cool tricks, like calibrating angular sensors on an old record player’s turntable.

    I don’t like the copters that much because of their noise and energy hunger, and because my kids would cut their fingers (or more sensitive parts, like their eyes) one day. Observe that even MIT built an outer double-bar rim to protect people who might stray into the corridors – not that I’ve seen anyone nearby in the videos I’ve watched.
    I would rather have a blimp like Festo’s bionic mantas, but with more load capability and enough muscle to resist normal windy conditions… Still, the payload isn’t there, so I get tempted to process data off-vehicle again.

    Greetings!

    Andi

    p.s.:
    Just because you process something in a base station doesn’t mean the vehicle isn’t acting autonomously. Or does it? When mobile computing power rises, one can later bring the outsourced computation back on-board, so I would reason.
    Something different is MIT’s “aggressive mode” quad, where IR cameras localize the bot within a specially designed room.
    Still, consider this: areas with GPS coverage are just the same kind of “specially designed volumes” on Earth – is a robot using GPS non-autonomous? ;)
    Perhaps later on some swarm of differently designed robots could together explore an unknown volume, set up some precise space-monitoring net, and distribute the needed computation amongst each other.
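The Y-split Andi suggests – process each reading on-board and ship a copy to a base station – could be sketched like this in Python. Purely illustrative, with made-up names throughout; the queue stands in for whatever radio or serial link the copter actually uses:

```python
import json
import queue

class YSplitter:
    """Fan each sensor reading out to an on-board consumer AND a downlink.

    Hypothetical design: the 'downlink' is any object with a put() method,
    so a queue works for testing and a radio wrapper would work in flight.
    """
    def __init__(self, local_consumer, downlink):
        self.local_consumer = local_consumer
        self.downlink = downlink

    def publish(self, reading):
        self.local_consumer(reading)             # processed on the vehicle
        self.downlink.put(json.dumps(reading))   # same data, serialized for the ground

# Wire it up: the on-board side just collects readings; the "radio" is a queue.
onboard = []
radio = queue.Queue()
splitter = YSplitter(onboard.append, radio)
splitter.publish({"beam": 0, "range_m": 2.0})
```

The base station deserializes the same records and runs the same map-building code, so the on-board and ground copies of the map stay in step, bandwidth permitting.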

