Presentation Transcript

Autonomous Aerial Robots in Near-Earth Environments: 

Autonomous Aerial Robots in Near-Earth Environments
William E. Green
Drexel Autonomous Systems Lab (DASL)
Mechanical Engineering, Drexel University
Koerner Fellows Presentation – 06/09/04

Unmanned Air Vehicles Advocacy: 

“Unmanned aerospace vehicles (UAVs) are the hallmark of our future.”
– Dr. James Roche, Secretary of the Air Force, October 10, 2001

“I do NOT believe that the current DOD inventory of airborne intelligence platforms or programmed procurement of additional assets is adequate to satisfy all of the CINCs' requirements.”
– Marine Gen. Peter Pace, Vice Chairman, Joint Chiefs of Staff, then Commander of US Southern Command, September 6, 2000

“I hope I’ve demonstrated personally my commitment to UAVs, and I am committed… You can count on the fact that [my] commitment will continue.”
– Gen. John Jumper, Air Force Chief of Staff, Senate Confirmation Hearings, September 2001

“It shall be a goal of the Armed Forces to achieve the fielding of unmanned, remotely controlled technology such that by 2010, one-third of the aircraft in the operational deep strike force aircraft fleet are unmanned.”
– 2001 Defense Authorization Conference Bill, H.R. 4205

Slide3: 

Classes of UAVs
“Improving UAV reliability (autonomy) is the single most immediate and long reaching need to ensure their success.”
– Office of the Secretary of Defense, UAV Roadmap 2002–2027

Slide4: 

Near-Earth Environments
Characteristics:
- Rich with obstacles
- Poor GPS
- Degraded communications
- Rugged
- Labor intensive
Applications:
- Target localization
- Search-and-rescue
- Recon and surveillance
- Inspection, assessment

Slide5: 

What Type of Aircraft?
Requirements:
- Fly through small openings
- Carry a payload
- Fly slowly (specification: 4 mph) and safely
Green, W.E., Oh, P.Y., “An Aerial Robot Prototype for Situational Awareness in Closed Quarters,” IEEE 2003 International Conference on Intelligent Robots and Systems
[Photos: banquet-table flight test; Market East Station]

Slide6: 

What Type of Sensors?
- Collision avoidance: turn away when optic flow is high
- Speed control: hold optic flow constant
- Hover, gust stabilization: zero out the optic flow
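The three sensor roles above amount to one optic-flow regulator with a collision override. A minimal Python sketch, assuming illustrative threshold and gain values (the function name, threshold, and gain are not from the talk):

```python
def of_behavior(of_measured, of_setpoint, of_collision_threshold, gain=0.5):
    """Map a single optic-flow reading (rad/s) onto the three slide
    behaviors. Threshold and gain are illustrative assumptions.
    - collision avoidance: turn away when optic flow exceeds the threshold
    - speed control: drive optic flow toward a constant setpoint
    - hover / gust stabilization: the special case of setpoint = 0
    Returns an (action, command) pair."""
    if abs(of_measured) > of_collision_threshold:
        return ("turn_away", 0.0)
    # Proportional correction toward the setpoint (0 for hover).
    return ("adjust_speed", gain * (of_setpoint - of_measured))

print(of_behavior(3.0, 1.0, 2.5))  # high flow triggers the avoidance maneuver
print(of_behavior(1.4, 1.0, 2.5))  # otherwise regulate toward the setpoint
```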

Slide7: 

From Insects to MAVs
Implementation:
- Mount sensors on nose and belly (detect incoming collisions and altitude)
- A rapidly expanding region on the right signals an approaching obstacle
1D theory: OF = (V/D) sin θ − ω
Computationally expensive?
Green, W.E., Oh, P.Y., et al., “Flying Insect Inspired Vision for Autonomous Aerial Robot Maneuvers in Near-Earth Environments,” IEEE 2004 International Conference on Robotics and Automation
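The 1D relation can be evaluated directly; a short sketch (variable names are assumptions, signs follow the slide's formula OF = (V/D) sin θ − ω):

```python
import math

def optic_flow_1d(v, d, theta, omega):
    """1D optic flow (rad/s) for forward speed v (m/s), distance d (m)
    to the surface, viewing angle theta (rad) off the flight direction,
    and body rotation rate omega (rad/s): OF = (v/d)*sin(theta) - omega."""
    return (v / d) * math.sin(theta) - omega

# Pure translation (omega = 0): a sideways-looking sensor (theta = 90 deg)
# over a surface 2 m away at 2 m/s forward speed sees 1 rad/s of flow.
print(optic_flow_1d(2.0, 2.0, math.pi / 2, 0.0))  # → 1.0
```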

Slide8: 

Previous Research
Green, W.E., Oh, P.Y., et al., “Autonomous Landing for Indoor Flying Robots Using Optic Flow,” 2003 ASME International Mechanical Engineering Congress and Exposition

Slide9: 

A Closer Look
Focus of expansion (FOE):
- Optic flow vectors radiate from the FOE
- Pure translation yields zero OF along the optical axis
- Obstacles in line with the optical axis: small ones appear to get larger; large ones will not be detected!
Platform limitations:
- Can’t fly in wind!
- Can’t hover!
- Large inertia
- Low endurance
- Difficult to control

Autonomous Flight in Caves & Tunnels: 

Autonomous Flight in Caves & Tunnels
What configuration? What type of sensors?
- Wingspan: 44 inches
- Weight: 10 ounces
- Max speed: 15 mph
- Endurance: 20 minutes

Slide11: 

Near-Earth Platform
Designed to hover!
- High thrust-to-weight ratio (T/W = 1.5–2)
- Aircraft weight balanced by motor thrust
- Allows rapid transition through the stall regime
- Large control surface area
- Capable of flying in tight spaces
- Failsafe collision avoidance maneuver
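The T/W = 1.5–2 requirement is a one-line check from motor thrust and airframe mass. A sketch with a hypothetical thrust figure (the 5 N value is an assumption, not from the talk; the 10 oz ≈ 0.28 kg weight comes from the earlier platform slide):

```python
G = 9.81  # gravitational acceleration, m/s^2

def thrust_to_weight(thrust_n, mass_kg):
    """Thrust-to-weight ratio; T/W >= 1 means static motor thrust
    alone can balance the aircraft's weight in a prop-hang hover."""
    return thrust_n / (mass_kg * G)

# Hypothetical: 5 N of static thrust on the 10 oz (~0.28 kg) airframe
# gives T/W of about 1.8, inside the 1.5–2 design band above.
print(round(thrust_to_weight(5.0, 0.28), 2))  # → 1.82
```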

Slide12: 

Future Roadmap (June–July, August–Sept, Oct–Nov, Dec–Jan):
- Backpackable CF prototype
- Autonomous hovering
- Autonomous cruise-to-hover transition
- Aircraft model
- Develop aircraft simulator
- Autonomous hover-to-cruise transition
- Begin autonomous flight tests in caves

Slide13: 

Contributions
- Slow-flying, fixed-wing test-bed vehicle
- Non-GPS, local-processing-based sensor suite
- Demonstrated autonomous tasks in near-Earth environments: collision avoidance, landing, altitude hold

Slide14: 

Conclusions
Localization:
- IMUs were initially too heavy; units are now lighter and more compact
- Maps can then be built (SLAM)
Collision avoidance:
- Must detect obstacles; we’ve proven this is possible
- Then comes path planning

Slide15: 

Thank You! Special Thanks to Dr. Koerner and His Family

Slide17: 

Gimbal Lock

Slide18: 

Optic Flow Microsensors
Sensor architecture:
- Imaging fabric mesh includes photo sensing (photoreceptors) and analog preprocessing (contrast enhancement)
- Other circuitry performs feature detection and digitization
- Low-bandwidth signal to the MOP μcontroller: biomimetic motion detection algorithms and I/O
- Winner-Take-All (WTA) circuit to the mux and vision chip output
Vision chip “morphology”:
- Vision chip (2-D or 1-D imager with edge detectors) outputs edge locations; the microcontroller need only track edge movements

Centering Response: 

Centering Response
Hypothesis: bees fly through openings by balancing image speed to the left and right.
Experiment:
- Bees trained to fly through a tunnel with a vertical stripe pattern on the side walls
- Grating on one wall could be moved at any desired speed and direction
- Results confirm that the bees balance the speeds of the retinal images in the two eyes, not the contrast frequencies
Srinivasan et al., Journal of Experimental Biology
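The centering response reduces to a simple balance law: steer away from the side with faster image motion. A minimal sketch (the gain and sign convention are illustrative assumptions):

```python
def centering_command(of_left, of_right, gain=1.0):
    """Bee-inspired centering: steering command proportional to the
    imbalance of left/right optic-flow magnitudes. Positive output
    steers right, away from the faster-moving (closer) left wall."""
    return gain * (abs(of_left) - abs(of_right))

# Wall closer on the left → higher left-side flow → steer right.
print(centering_command(2.0, 1.0))  # → 1.0
print(centering_command(1.5, 1.5))  # → 0.0 (centered)
```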

Additional Issues: 

Additional Issues
Questions:
- Is local computing necessary?
- What is the range of wireless cameras in near-Earth environments?
- Can small obstacles (e.g., wire) be detected soon enough to be avoided?
- Is there an alternative to following GPS waypoints?

Slide21: 

2005 Inaugural Competition
Manual control:
- Target identification: identify survivors
- Deploy beacon to pinpoint location
- Determine video transmitter range
Autonomous control:
- Demonstrate collision avoidance
- Line following
- Gust stabilization
