Bioinspired Vision-based Microflyers
Taking inspiration from biological systems to enhance the navigational autonomy of robots flying in confined or cluttered environments.
This project is funded by the Swiss National Science Foundation.
The goal of this project is to develop control strategies and neuromorphic chips for autonomous microflyers capable of navigating in confined or cluttered areas, such as houses or small built environments, using vision as the main source of information.
Artist's view of the project. © lis.epfl.ch & tangherlini.it
Flying in such environments implies a number of challenges that are not found in high-altitude, GPS-based, unmanned aerial vehicles (UAVs). These include small size and slow speed for maneuverability, light weight to stay airborne, low-consumption electronics, and smart sensing and control. We believe that neuromorphic vision chips and bio-inspired control strategies are very promising approaches to these challenges.
The project is organized along three tightly integrated research directions:
- Mechatronics of indoor microflyers (Adam Klaptocz, EPFL);
- Neuromorphic vision chips (Rico Möckel, INI);
- Insect-inspired flight control strategies (Antoine Beyeler, EPFL).
We plan to take inspiration from flying insects both for the design of the vision chips and for the choice of control architectures. For the design of the microflyers, in contrast, we intend to develop innovative solutions and improvements over existing micro-helicopters and micro-airplanes.
Our final goal is to better understand the minimal set of mechanisms and strategies required to fly in confined environments by testing theoretical and neurophysiological models on our microflyers.
MC2 - the 10g microflyer
A 10-gram microflyer that flies autonomously in a 7x6m test arena
The purpose of this ongoing experiment is to demonstrate autonomous steering of a 10-gram microflyer (the MC2) in a square room with different kinds of textures on the walls (the Holodeck). This will first be achieved with conventional linear cameras before migrating towards aVLSI sensors.
The MC2 (see picture on the left) is based on the microCeline, a 5-gram living-room flyer produced by DIDEL, equipped with a 4mm geared motor (a) and two magnet-in-a-coil actuators (b) controlling the rudder and the elevator. When fitted with the electronics required for autonomous navigation, the total weight reaches 10 grams. The custom electronics consists of a microcontroller board (c) featuring a PIC18LF4620 running at 32MHz, a Bluetooth radio module (for parameter monitoring), and two camera modules, each comprising a CMOS linear camera (TSL3301) and a MEMS rate gyro (ADXRS150). One of these camera modules (d) is oriented forward, with its rate gyro measuring yaw rotations, and will mainly be used for obstacle avoidance. The second camera module is oriented downwards, looking longitudinally at the ground, while its rate gyro measures rotation about the pitch axis. Each camera has 102 gray-level pixels spanning a total field of view of 120°. In order to measure its airspeed, the MC2 is also equipped with an anemometer (e) consisting of a free propeller and a hall-effect sensor, placed in a region that is not blown by the main propeller (a). The 65mAh lithium-polymer battery (f) provides approximately 10 minutes of flight autonomy.
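As a hint of how these components work together, the following minimal Python sketch shows one way a rate gyro can be used to derotate the optic flow measured by a camera module; the shift-search method and function names are illustrative assumptions, not the actual PIC18 firmware.

```python
import numpy as np

# Illustrative sketch (not the actual PIC18 firmware): derotating the
# optic flow measured by one of the MC2's camera modules using its
# rate gyro. The constants follow the hardware description above.

FOV_RAD = np.radians(120.0)        # field of view of the linear camera
NUM_PIXELS = 102                   # gray-level pixels per TSL3301 camera
RAD_PER_PIXEL = FOV_RAD / NUM_PIXELS

def image_shift(prev, curr, max_shift=3):
    """Estimate the global 1-D image shift (in pixels) between two
    frames by minimizing the mean absolute difference over a small
    range of candidate shifts."""
    best_shift, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        a = prev[max(0, s):NUM_PIXELS + min(0, s)]
        b = curr[max(0, -s):NUM_PIXELS + min(0, -s)]
        err = np.mean(np.abs(a.astype(float) - b.astype(float)))
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

def translational_flow(prev, curr, gyro_rate, dt):
    """Total optic flow (rad/s) minus the rotational component measured
    by the rate gyro, leaving the depth-dependent translational part."""
    total = image_shift(prev, curr) * RAD_PER_PIXEL / dt
    return total - gyro_rate       # gyro_rate in rad/s, same sign convention
```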
"Holodeck" (see pictures here) is the name we gave to our new experimentation room. In order to be able to easily modify the textures of the experimental arena, we equiped a 7x6m room with 8 projectors hanging from the ceiling and connected to a cluster of 8 Linux-based computers. In addition, an omnidirectional camera has been placed in the center of the ceiling in order to track the flying robot. A program is currently under development, which will allow for trajectory reconstruction.
Video Clips
- Fully autonomous flight with the MC2 in the Holodeck (MP4, 45MB, or youtube, 10.05.2007).
- Autonomous take-off, steering and speed control with the MC1 in our Holodeck (low-res MPG, 14MB, 19.01.2006). See this related publication for more details.
- Video clips of the remote-controlled microCeline in typical indoor environments can be found here.
- Video clips of previous experiments of optic-flow-based autonomous steering with a 30g airplane in a 16x16m arena can be found here.
Optic-flow-based control strategies
Most current UAV/MAV autopilots rely on estimates of the three positional and three angular coordinates, obtained by fusing GPS and Inertial Measurement Unit (IMU) data, to control stability and trajectory in obstacle-free space. This precludes the use of UAVs/MAVs in cluttered environments such as cities, mountainous regions, or indoors, which demand lightweight vehicles capable of continuously steering among obstacles without a GPS signal.
In this project, we are developing a novel method for near-obstacle flight control, called optiPilot (patent pending), that uses optic flow and does not require estimation of positional or angular coordinates. It directly maps optic-flow estimates taken in various viewing directions around the aircraft's translation vector into control signals for roll and pitch regulation by means of two weighted sums, similar to the neural matched filters found in flying insects. This method is capable of stabilising both altitude and attitude and of steering through cluttered environments. It is applicable to any flying vehicle whose translation vector can be assumed to be aligned with its longitudinal axis, such as fixed-wing aircraft or helicopters in translational flight.
This method can be implemented with various kinds of optic-flow detectors as the front end. Typical examples include standard CMOS cameras with off-board image processing, aVLSI motion chips (see next section), or optical mouse sensors (see top figure on the right).
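To make the two weighted sums concrete, here is a minimal Python sketch of the mapping described above; the number of viewing directions, the sinusoidal weight sets, and the gains are illustrative placeholders rather than the published optiPilot parameters.

```python
import numpy as np

# Minimal sketch of the optiPilot idea: optic-flow magnitudes sampled
# in N viewing directions arranged around the translation vector are
# combined into roll and pitch commands by two weighted sums, in the
# spirit of the matched filters of flying insects. Weights and gains
# below are illustrative placeholders.

N = 8  # number of optic-flow viewing directions around the flight axis
# azimuth of each viewing direction around the longitudinal axis
psi = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)

# Matched-filter-like weight sets: left/right flow asymmetry drives
# roll, top/bottom asymmetry (e.g., ground proximity) drives pitch.
w_roll = np.sin(psi)
w_pitch = np.cos(psi)

def optipilot_step(flow, k_roll=1.0, k_pitch=1.0):
    """Map N translational optic-flow magnitudes (rad/s) directly to
    roll and pitch commands with two weighted sums; no positional or
    angular state estimation is involved."""
    flow = np.asarray(flow)
    roll_cmd = k_roll * np.dot(w_roll, flow)
    pitch_cmd = k_pitch * np.dot(w_pitch, flow)
    return roll_cmd, pitch_cmd
```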
Video Clips
- Video of a fully autonomous swinglet using optiPilot for low-level control and GPS for global guidance (MP4, 14MB, or youtube, 08.2009, accompanying video of the ICRA'10 paper)
- Video of a swinglet taking off and landing autonomously using optiPilot (MP4, 8MB, or youtube, 04.2009, accompanying material of the EMAV'09 paper)
- Video of a physical flying wing avoiding trees, ground and water in natural outdoor environments using optiPilot (MP4, 36MB, or youtube, 09.2008)
- Video (simulation) of a small flying wing in urban and mountainous environments (MP4, 31MB, 08.2007)
- Video of a completely autonomous full 3D flight with a simulated MC2 (MPG, 11MB, 20.09.2006).
Flying wing for outdoor experiments (2008)
Simulation setup for urban flight control experiments (2007-8)
Simulation setup for indoor flight control experiments (2006)
Layout of the first test chip
aVLSI optic-flow detectors (developed at INI)
The extraction of motion from optic flow is computationally very expensive and is a great challenge in embedded systems: a changing, noisy view of the environment falls onto an array of photoreceptors, whose signals must be processed to detect motion.
To obtain a good estimate of the motion in front of a microflyer, the signals extracted from the array of photoreceptors should be processed in parallel to make sure that no necessary data is lost. A microflyer also imposes constraints such as low power consumption and low weight. That is why aVLSI (analog very-large-scale integration) motion chips are ideally suited to microflyers.
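While the analog chip itself cannot be shown in software, the parallel per-photoreceptor processing it implements is commonly modeled with Hassenstein-Reichardt elementary motion detectors (EMDs), a circuit that maps naturally onto aVLSI hardware. The discrete-time Python sketch below illustrates the principle; the time constant and first-order filter are illustrative assumptions.

```python
import numpy as np

# Discrete-time sketch of a Hassenstein-Reichardt elementary motion
# detector (EMD): each photoreceptor is correlated with a low-pass-
# filtered (delayed) copy of its neighbor, and the two mirror-image
# correlations are subtracted. The real chip computes this in
# continuous-time analog circuitry, one EMD per photoreceptor pair.

def emd_response(signal_a, signal_b, tau=5.0, dt=1.0):
    """Return the direction-selective output for two neighboring
    photoreceptor signals; the sign of the output encodes the
    direction of image motion across the pair."""
    alpha = dt / (tau + dt)             # first-order low-pass coefficient
    lp_a = lp_b = 0.0
    out = []
    for a, b in zip(signal_a, signal_b):
        lp_a += alpha * (a - lp_a)      # delayed copy of receptor A
        lp_b += alpha * (b - lp_b)      # delayed copy of receptor B
        out.append(lp_a * b - lp_b * a) # opponent correlation
    return np.array(out)
```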
Related Links
Other Bio-inspired Vision-based Flying Robots
- Obstacle avoidance
- Altitude control
- In simulation
- Stabilization and visual odometry with blimps and airships
Other Miniature and Slow Flying Robots
- Flapping-wings
- Rotorcrafts
Other Indoor Flying Robots
Biological Studies of Flying Insects
Miscellaneous
People Involved in this Project
Visiting Students
- Guillaume Masson: Vision-based altitude and pitch estimation.
Master Projects
- Adam Klaptocz: Embedded electronics and control of the MC3.
- Julien Reuse: Visual tracking of indoor flying robots.
Semester Projects
- Adam Klaptocz: Design and realization of ultra-light linear camera modules with rate gyro for the MC1.
- Tobias Greuter: Miniature aVLSI camera for indoor microflyers.
- Gavrilo Bosovic: Miniature ultrasonic sensor.
Student Project Proposals
News
[3-8 May 2010] J.-C. Zufferey presents a paper on how to couple the optiPilot control strategy with GPS guidance at ICRA'10 in Anchorage.
[11 October 2009] J.-C. Zufferey presents new results obtained in this project at the MAV workshop at IROS'09 in St. Louis.
[15 September 2009] Antoine Beyeler presents the recent results obtained in autonomous take-off and landing at EMAV'09 in Delft.
[22 September 2008] This project was presented at the workshop on Visual guidance systems for small autonomous aerial vehicles at IROS'08, in Nice.
[23-25 June 2008] This project was presented at the UAV'08 conference in Orlando.
[12-17 August 2007] This project was featured at the Flying Insects and Robots symposium.
[10-14 April 2007] A. Beyeler presents a paper at ICRA'07 in Rome, Italy.
[9-15 October 2006] J.C. Zufferey presents a paper at IROS'2006 in Beijing, China.
[9-14 July 2006] J.C. Zufferey presents a poster at the 50th Anniversary Summit of Artificial Intelligence in Monte Verità, Ascona, Switzerland.
[15-19 May 2006] A. Beyeler presents a paper at ICRA'2006 in Orlando, Florida.
[23 February 2006] Invited presentation on our autonomous indoor microflyers by J.C. Zufferey for the Akademische Fluggruppe Zürich.
[18-22 April 2005] J.C. Zufferey presents a paper at ICRA'2005 in Barcelona, Spain.
[1 March 2005] Invited talk by J.C. Zufferey on Evolutionary Robotics at JOIN05 in Braga, Portugal.
Earlier conference participations are listed on this related project web page.