Abstract
The Eos Unmanned Air System (UAS) is designed to complete a number of tasks in
support of an autonomous reconnaissance mission during the Association of Unmanned
Vehicle Systems International (AUVSI) Student Unmanned Air Systems (SUAS) Compe-
tition. These tasks include: manual takeoff and landing; autonomous flight and waypoint
navigation; manual sense, detect, and avoid; interoperability; target detection, localization,
and classification; autonomous search; automatic target localization; actionable intelligence;
off-axis target imagery and classification; emergent target re-tasking and identification; sim-
ulated remote intelligence center interaction; and autonomous air drop release with required
drop accuracy. The airframe has conventional topology with an electric tractor propeller,
high-aspect ratio wings, and a conventional tail. Imagery is gathered from a Point Grey
Flea3-8.8MP camera mounted on a pitch & roll gimbal, managed by an Intel NUC com-
puter, and transferred via two 5.8GHz Ubiquiti Bullet radios to a Netgear R7000 router
on the ground. A 2.4GHz Ubiquiti Bullet connects to the SRIC network. Flight control
and navigation is performed by a modified version of the ArduPlane software running on
the Pixhawk hardware platform. Between Eos and its sister plane, Icarus, over 30 flight
tests have been conducted. Unfortunately, the team experienced a crash recently, without
enough time to repair and prepare to compete, so Eos will not fly a mission at the 2015
SUAS Competition.
Contents
1 System Design Rationale
2 Payload System
2.1 Camera & Camera Gimbal
2.2 Air Drop
2.3 Payload Computer
2.4 Airborne Power System
2.5 Communication System
3 Flight System
3.1 Aircraft
3.2 Navigation System
4 Ground System
4.1 Catapult
4.2 Communication System
4.3 Targeting System
4.4 Automatic Detection & Classification
4.5 Localization
5 Mission Operation
5.1 Safety
6 Summary
2 Payload System
The payload system consists of all airborne components of the UAS which are not required for autonomous
flight. It performs the imaging, SRIC, and air drop tasks defined in the mission specification by employing a
machine vision camera, a two-axis gimbal, a mini servo, and a 64-bit Intel computer with custom software.
The Cornell team has decided not to attempt the infrared target task due to budgetary constraints and the
negative effect that the extra weight would incur on other aspects of the mission. In support of the mission tasks, the Cornell team has also devised an airborne power system to provide multiple independently-controllable rails of efficiently regulated power from a single battery, and a communications system to reliably transfer
images and commands between the payload system and the ground system.
Figure 1: Images of Targets from the Machine Vision Camera: The left image shows the full frame. The right image is a close-up of a target with high contrast (green on red).
Although machine vision cameras are more expensive and complicated than the DSLR camera the team has used in past competitions, it was decided that the reduced weight and smaller form factor of machine vision cameras outweigh the increased cost and development time. After a trade study considering cost, imaging performance, interface type, weight, and form factor, the Flea3-8.8 was chosen as the team's imaging system.
The Point Grey Flea3-8.8 with a Tamron M118FM08 lens has a horizontal field of view of 39.6 degrees, a vertical field of view of 30.02 degrees, and a resolution of 4096x2160. The maximum required imaging height is 500 feet.

D_horizontal = D(39.6°, 500 ft, 4096) = 0.0879 feet/pixel (3)
D_vertical = D(30.02°, 500 ft, 2160) = 0.1241 feet/pixel (4)
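The values in Equations (3) and (4) are consistent with the standard ground-distance-per-pixel formula for a pinhole camera; the following is a minimal sketch assuming D(θ, h, n) = 2h·tan(θ/2)/n, which is our reconstruction rather than the team's stated definition:

```python
import math

# Plausible reconstruction of D: ground distance imaged per pixel for a
# camera with field of view fov_deg, at altitude alt_ft, across n_pixels.
def D(fov_deg: float, alt_ft: float, n_pixels: int) -> float:
    ground_span_ft = 2.0 * alt_ft * math.tan(math.radians(fov_deg) / 2.0)
    return ground_span_ft / n_pixels

# Reproduces the values in Equations (3) and (4):
d_horizontal = D(39.6, 500.0, 4096)   # ~0.0879 feet/pixel
d_vertical = D(30.02, 500.0, 2160)    # ~0.1241 feet/pixel
```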
The Flea3-8.8 camera is controlled by FlyCapture's proprietary FlyCap software. This software allows image capture, image transfer to the payload computer, and in-flight modification of various camera properties, such as shutter speed and brightness. Custom payload software was developed to integrate FlyCap with the rest of the system. An additional advantage of the machine vision camera is the ability to use hardware triggering to command the camera to capture an image at a precise moment. This functionality precisely synchronizes the time of capture with the stream of telemetry data by triggering the capture the instant a new piece of data arrives in the system.
A two-axis, stabilized gimbal system enables the Eos UAS to image off-axis targets and maintain a
downward gaze during turns and altitude changes. The competition rules require that the Eos UAS capture
an image of a target that lies up to 500 feet to the left or right of the aircraft while the aircraft is at an
altitude of 300 feet. This specification requires a field of view of up to 60 degrees to either side of the aircraft
to image the target. The imaging system gives a camera field of view of 20 degrees to either side while the camera is pointed straight down.

Figure 2: Gimbal System Design and Implementation: The left image is the CAD of the gimbal control board. The middle image is the fully assembled gimbal control board. The right image is the CAD of the gimbal mechanism.

Therefore, the gimbal system must be able to roll the camera by at least
40 degrees. To capture multiple images of the off-axis target, the gimbal must also pitch the camera forward
and backward as the aircraft flies by the target. The team further requires the gimbal to keep the camera
pointed at the ground during roll and pitch changes that occur during turns, ascents, and descents. This
requirement improves the total ground coverage of captured images and reduces the distortion in images
processed by the ADLC and MDLC systems.
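The roll requirement above follows from simple trigonometry. The numbers come from the text; the quick check below is our own:

```python
import math

# Target up to 500 ft to the side while flying at 300 ft altitude:
look_angle_deg = math.degrees(math.atan2(500.0, 300.0))   # ~59.0° from nadir
half_fov_deg = 39.6 / 2.0                                  # camera covers ~19.8° to each side
required_roll_deg = look_angle_deg - half_fov_deg          # ~39.2°
# Rounding up gives the stated 40-degree minimum roll for the gimbal.
```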
The gimbal system consists of a gimbal mechanism, a control board, and interface software on the payload computer. It has three operating modes: stabilized, point-at-GPS, and retract, which can be changed by the imagery system through a serial interface. The gimbal system employs its own inertial measurement unit (IMU) with 6 degrees of freedom to measure the orientation of the aircraft with respect to a global North-East-Down (NED) frame, allowing the controller to update at 100Hz. The gimbal controller receives a 5Hz update from the autopilot with the aircraft's heading and position, which allows the controller to compensate for the yaw drift of the gyroscope and to point at a given GPS position on the ground. Feedback from magnetic encoders with 0.1 degree of accuracy allows a proportional-integral-derivative controller to track the desired motor angles as determined by the state and the mode.
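As a hedged illustration of the encoder-feedback loop described above, a textbook discrete PID update is sketched below; the gains and time step are placeholders, not the team's tuned values:

```python
# Minimal discrete PID controller; update() returns the motor command for
# one 100 Hz control step given commanded and measured gimbal angles.
class PID:
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint_deg: float, measured_deg: float) -> float:
        error = setpoint_deg - measured_deg
        self.integral += error * self.dt                      # accumulate error
        derivative = (error - self.prev_error) / self.dt      # rate of change
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```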
The gimbal mechanism itself was constructed from laser-cut pieces of aircraft-grade birch plywood to
produce a light-weight yet stiff design. Both the weight and the stiffness are critical to the performance of the larger system: the aircraft must remain light so that it can be catapult-launched safely, and any error in the camera's orientation has a large impact on the accuracy of the localization algorithm. The
gimbal mechanism is mounted rigidly to the rear wall of the fuselage using four screws. The physical gimbal
and control board are shown in Figure 2.
When armed from the ground station, the master aerial software node (MASN) requests a drop time
from the airdrop planning module (ADPM). The ADPM uses the aircraft state data to simulate a drop and
determine where the relief canister would land relative to the aircraft. The ADPM then finds the time at which the relief canister would land closest to the target (the black dot in Figure 3) and the distance that this drop position would be from the target center. The team sets a threshold radius for each pass, so that
a second attempt can be made if the first is not satisfactory.
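The prediction the ADPM performs can be sketched with a drag-free ballistic model in a flat, windless world; the real module simulates the canister from live aircraft state data, so the helpers below are an illustrative simplification, not the team's implementation:

```python
import math

G_FTPS2 = 32.174  # gravitational acceleration, ft/s^2

def forward_carry_ft(alt_ft: float, speed_ftps: float) -> float:
    """Horizontal distance the canister travels while falling alt_ft (no drag)."""
    fall_time_s = math.sqrt(2.0 * alt_ft / G_FTPS2)
    return speed_ftps * fall_time_s

def release_delay_s(alt_ft: float, speed_ftps: float,
                    dist_to_target_ft: float) -> float:
    """Seconds until the remaining ground distance equals the forward carry."""
    return (dist_to_target_ft - forward_carry_ft(alt_ft, speed_ftps)) / speed_ftps
```

An ADPM-style threshold check would then compare the predicted miss distance against the per-pass radius before arming the release.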
Data Link The primary task of the data link is to transfer images taken during the mission to the ground
where they can be processed. It must be able to provide a constant, high-bandwidth connection so the
images are conveyed with minimal delay. Two long-range Ubiquiti Bullet M5HP (Bullet M5) WiFi-to-Ethernet bridges were selected to provide this link for their size, simplicity, reliability, and previous success. The Bullet M5s are interfaced with the Intel NUC computer over Gigabit Ethernet and retransmit data using the IEEE 802.11n WiFi standard operating at 5.8 GHz with a maximum transmit power of 25 dBm, using WPA2 encryption. To ensure a constant connection with the ground, the Bullet M5s are designed to enter a continuous loop where they attempt to connect to a predefined network if the connection is dropped.
The use of two radios also provides some element of redundancy, where if one stops working the other
will carry out all data transfers, albeit in a limited capacity. Through previous system iterations the team
learned that if one 2.4 GHz and one 5.8 GHz link are used for data transfer, the lower frequency acts as a
bottleneck for all data transfers. Using two 5.8 GHz links eliminates this issue. With these changes, speeds of 50-60 Mbps are achievable at distances up to a kilometer, with speeds up to 130 Mbps achievable at close range. The minimum bandwidth required to transfer images in real time is roughly 30 Mbps, so the current communications system exceeds this requirement.
Both Bullet M5s are connected to omnidirectional 3dBi monopole antennas and thus radiate energy in a near-spherical pattern. This allows for a constant wireless connection regardless of the aircraft's orientation. The data link antennas are oriented vertically to maximize their horizontal range and placed at each wingtip to increase physical separation and reduce potential multipath interference. Given the metallic radiating elements in the aircraft, radiation simulations revealed the pattern (such as in Figure 4) in which the Eos antennas will transmit energy, and hence data. These results were taken into account when determining antenna placement.
SRIC Task The Simulated Remote Intelligence Center (SRIC) task requires the Eos UAS to connect to a
remote wireless hot-spot in the competition field. While connected it must download a message from a remote
computer attached to the network, and upload an image to the same computer. The task is automated by the
use of a UNIX script which uses the SSH protocol to modify a Ubiquiti Bullet M2's configuration on-the-fly to connect to the SRIC network. The script then loops, waiting to connect to the network. Once the connection is confirmed, the script opens an FTP connection to the SRIC computer and downloads the SRIC message. It then transfers an image to the computer in binary mode, writing the image byte by byte to the FTP server. The script then asks a user on the ground for feedback about the validity of the mission before terminating. The script is built to be as modular as possible and allows for automatic execution by the Intel NUC as well as remote execution from the ground.
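The flow above can be sketched in Python with the standard library; the addresses, SSID, file paths, and the exact radio reconfiguration command are our own placeholders, not the team's actual configuration:

```python
import subprocess
from ftplib import FTP

BULLET_IP = "192.168.1.20"   # assumed management address of the Bullet M2
SRIC_SSID = "SRIC_NETWORK"   # assumed SSID from the mission briefing

def reconfigure_command(ssid: str) -> list[str]:
    """Build an ssh command that rewrites the Bullet's wireless SSID.

    The remote command is illustrative of on-the-fly AirOS reconfiguration,
    not a verified sequence for this radio.
    """
    return ["ssh", f"ubnt@{BULLET_IP}",
            f"sed -i 's/^wireless.1.ssid=.*/wireless.1.ssid={ssid}/' "
            "/tmp/system.cfg && cfgmtd -w -p /etc/ && reboot"]

def run_sric_task(host: str, message_path: str, image_path: str) -> None:
    """Download the SRIC message, then upload an image byte-for-byte."""
    with FTP(host) as ftp:
        ftp.login()
        with open("sric_message.txt", "wb") as f:
            ftp.retrbinary(f"RETR {message_path}", f.write)
        with open(image_path, "rb") as f:
            ftp.storbinary("STOR upload.jpg", f)

if __name__ == "__main__":
    subprocess.run(reconfigure_command(SRIC_SSID), check=True)
    run_sric_task("192.168.1.1", "message.txt", "target.jpg")
```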
3 Flight System
The flight system comprises the aircraft and the navigation system, whose primary responsibilities are to carry the payload system into the air and around the field so that the payloads can perform their missions. The majority of the requirements on these systems are derived from the payloads themselves rather than directly from the mission specifications.
3.1 Aircraft
Though critical to the completion of the mission, the airframe is not directly assessed during the mission.
The primary objective of the airframe is to facilitate other systems in accomplishing mission tasks. As such,
some of the major drivers for the airframe design are accessibility, to ease integration and troubleshooting; stability and controllability, to facilitate autopilot tuning and autonomous control; efficiency, to ensure that the system can remain aloft and utilize the mission time provided; durability & repairability, to allow for uninterrupted testing; and the ability to catapult launch and belly-land, to make autonomous launch and landing significantly simpler for the autopilot.
3.1.1 Design
Eos' airframe is constructed from a carefully selected set of different composites, foams, and woods, including
fiberglass, carbon fiber, Kevlar, Garolite, Pactiv R10 Unfaced Polystyrene Foam, birch plywood, and poplar
plywood. The team utilized wet layups of fiberglass-sandwiched foam for the majority of the aircraft body,
and strategically integrated Kevlar on the bottom of the fuselage to provide abrasion resistance. Carbon
fiber spars were integrated in the wings and tail near joints for added strength. The team used laser-cut
plywood ribs to join the composite parts and serve as breakpoints in the event of a crash.
Payloads & Accessibility Eos is designed to carry and protect 6.7 pounds of electronics and batteries, as required by the payload system. The width and height of the fuselage were determined using the measurements of the largest piece of equipment, the camera gimbal, which required 6.5 in of width and 6 in of height.
The team designed the aircraft such that the entire top of the fuselage is removable. The wings are held on by a single 1/4″-20 thumb screw and a sliding support to make them easy to remove, and the remainder of the top of the fuselage is covered by two magnetic hatches. With this design, at most one screw needs to be removed to access a payload, and payloads that require high accessibility are mounted below the hatches instead of under the wings.
Stability & Controllability As the team determined desired flight characteristics, stability emerged as the most important. Foremost, the team uses a high-wing configuration, which also increases the usable space within the fuselage and decreases the likelihood of a wing strike on belly-landing. The wings also feature a two-degree dihedral, as this value has worked for the team in the past and is recommended by various sources. Further, Eos' design incorporates two degrees of washout (the tip has a shallower angle of attack than the root) so the root section of the wing will stall before the tip, ensuring that the pilot retains roll control of the aircraft.
Efficiency The entire geometry of Eos was designed to be sleek and aerodynamic. Though the payloads require a relatively wide and tall fuselage, the team created a long and slender nosecone and empennage to increase the aircraft's efficiency. For the wings, the team used XFoil to determine that the MH114 airfoil was best suited for the mission: the MH114 has a very high lift-to-drag ratio while still having very gentle stall characteristics. This airfoil has also proven to work well, as the team has used it for all of its aircraft thus far. To increase the efficiency of Eos' wings specifically, the team increased the aspect ratio, resulting in a much longer, more slender wing. The aspect ratio the team chose for Icarus was 12, allowing for not only a longer flight time but also a more stable flight. Further, Eos' wings have a linear taper with the tip chord half the root chord to mimic an elliptical wing, reducing drag, while remaining easy to manufacture.
Figure 7: Airframe Features: Catapult Mounting Holes (left); Modular Design (center); Wing Mount and
Hatches (right)
Durability & Repairability To enhance the durability and repairability of the airframe, during the design
process the team considered a number of scenarios that could potentially damage the UAS. These include
rough belly-landing, wing-tip strikes, camera damage, and full crashes.
One case is when a wingtip contacts the ground on landing. This scenario would produce a large rearward force on one wing, which could easily cause some component to break. Rather than design a system to withstand this load, the team created a strategic break-point to absorb the impact: the wing is held in place by a 1/4″-20 thumbscrew in the front and by a piece of plywood in the rear. If a wing strike occurs, the rear piece of wood will snap and allow the striking wing to rotate backward around the thumbscrew. The plywood can be easily replaced, so testing and operation of the UAS are barely affected.
Another scenario that Eos was designed for was an especially fast vertical descent, which produces large
impact forces on the nosecone and fuselage. Again, rather than designing the system to survive, the team
planned for ease of repair. Both the nosecone and the fuselage are created from female molds, which allow
backup parts to be made in a span of days rather than weeks. In the event that the aircraft impacts the ground in a crash, even more violent than a rough belly landing, the aircraft is designed such that the plywood connection braces will take the brunt of the impact and leave the payloads and composite components in relatively good shape.
Belly-landing is especially difficult for this mission because it requires that the camera be exposed to
produce good imagery. To mitigate the risk of damaging the camera, the empennage is designed with a lip
on the back that will impact the ground before the camera. Furthermore, the gimbal rotates the camera
upwards during landing, to prevent debris from hitting the lens.
Apart from designing for repairability after a mishap, precautions were taken so that the aircraft would
not get damaged in the first place. Eos was designed for its wings, vertical stabilizer, horizontal stabilizer,
empennage, and fuselage to all separate so that it could be transported in foam-filled ruggedized cases.
Additionally the wings are designed as two halves that come together and connect onto the fuselage using a
bolt and slide-in wooden piece. Figure 7 shows Eos separated into its components.
Catapult Launch & Belly-Landing As discussed earlier, the team decided to leverage its mechanical skill to produce a system that simplified the navigation system's requirements for autonomous launch and landing. Hand-launch was deemed too unreliable and too restrictive on aircraft size, so catapult launch emerged as a simple way of getting the aircraft airborne so the navigation system could simply fly it. Eos' design incorporates four holes for the arms of the catapult to hook into. The connection points are reinforced internally using Garolite, and are carefully placed so that the aircraft is mostly pulled from in front of its center of gravity, rather than pushed, which could cause it to jump off the catapult.
For landing, the team chose to forgo landing gear and allow the aircraft to skid on its belly. This means that the autopilot does not need to perfectly orient wheels with the aircraft's direction of travel; failure to do so would result in the aircraft flipping over. The arched Kevlar bottom creates an extremely durable surface that disperses the landing load to the fuselage walls, resulting in a fuselage strong enough to endure many belly landings. After the motor is turned off, the propeller folds against the edge of the nose cone to avoid splintering upon impact.
Flight Characteristics After takeoff, Eos can easily climb at 25 KIAS and can fly level as slowly as 27 m/s without flaps. In flight, Eos can achieve a turn radius of 100 feet. At cruise speed, the throttle is set to 45 percent, which results in a 25-minute flight time before the 9-cell, 5000mAh battery reaches 25 percent charge. Upon landing, the flaps slow Eos down to 21 m/s, which allows Eos to reach a full stop in less than 30 feet after initial touchdown.
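The quoted endurance implies an average current draw that can be checked with back-of-the-envelope arithmetic; the calculation below is our own, using only the figures stated in the text:

```python
# A 5000 mAh pack drained from full to 25% remaining over a 25-minute flight.
capacity_ah = 5.0
usable_fraction = 0.75            # stop at 25 percent charge remaining
flight_time_h = 25.0 / 60.0
avg_current_a = capacity_ah * usable_fraction / flight_time_h
# avg_current_a works out to 9.0 A, a plausible cruise draw at 45% throttle.
```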
When flying Eos, there was no sideslip, indicating a properly sized tail and a stable aircraft. When Eos flew horizontally with the throttle cut, it did not initially pitch up or down, further evidencing its stability. The two-degree dihedral also demonstrated its self-stabilizing effect when Eos flew in very low wind conditions. The large flaperons provided very responsive control, enabling Eos to turn very quickly when piloted by the autopilot and to hit waypoints that required rapid changes in direction. This ability helped Eos avoid moving obstacles with ease.
Repairability & Durability Throughout the aircraft's life, various repairs were needed after test flights. In one instance, radio interference resulted in the team's aircraft losing control and hitting the ground. The left wing snapped in half, all ribs broke, the nosecone buckled, and the empennage sheared off from the fuselage. Even though the left wing snapped, the team believed that the wings would have been in much worse shape if not for the breakaway point between the wing and the fuselage. This point broke as projected, allowing the wings to snap from the fuselage and dissipating some of their energy. Despite this seemingly devastating crash, the airframe team was able to rebuild Icarus that night and get it back in the sky the following morning. This accomplishment was due both to the team's strong work ethic and to the modular design and repairability of the aircraft. The ribs were easily re-cut and glued together, while the wings were repairable with a single lamination. The nosecone was patched, unbuckled, and laminated within two hours, and the empennage did not need any repairs. This one-day rebuild demonstrates how repairable the team's design is. Eos also proved to be extremely durable, surviving many belly landings without a need for repair. When landing over rocks, no abrasions were inflicted on the fuselage, and even when dropped five feet onto the ground, Eos sustained no damage.
Catapult Launch & Belly-Landing The Garolite-reinforced catapult connection holes, located at the bottom of the fuselage, interacted with the catapult perfectly, distributing its load to the rest of the fuselage.
Through all flights, Eos did not experience any damage from the catapult. As described in the durability
discussion above, Eos survived all belly-landings with minimal abrasions.
Both research and past experience were utilized to choose a navigation system. In previous years the
team has used both the Kestrel and Piccolo II navigation systems. The Kestrel performed well enough but
was difficult to use, did not support rolling takeoff and landing, and was not highly robust or reliable. The
Piccolo II system performed far better. The autopilot was able to achieve stable autonomous flight within a
month of using the system. Further, it supports autonomous rolling takeoff and landing although the Cornell
team had difficulty taking advantage of this feature in past years. However, the Piccolo system's code is not editable, and its application program interface (API) is limited and often difficult to use. Thus, the team would be unable to easily interface with the navigation system, which is essentially required by the mission.
There are a few open-source autopilot options, most notably ArduPilot. ArduPilot is supported by hundreds of developers and an even larger community, in addition to official support from 3D Robotics, the primary producer of this open-source autopilot hardware. Furthermore, ArduPilot supports autonomous takeoff and landing and many other features that will be valuable in future years. There are two main platforms capable of running ArduPlane, ArduPilot's plane control software: the ArduPilot Mega (APM) and the PixHawk flight system. The team chose the PixHawk system since it has a more powerful processor that will enable the team to expand the autopilot system in the future. The entire autopilot system is significantly less expensive than the Piccolo system, so unlike in years past, the team was able to replicate the system and create a test flight platform in addition to the competition flight platform. This extra test plane allowed for testing of risky maneuvers, such as landing, that could damage the competition platform. The test flight platform also allowed for immediate integration: the team could test autopilot systems while the competition platform was being built and integrated.
The entire autopilot system needed to be integrated seamlessly with other software on the plane and on the ground. The team evaluated the available ground station options that supported MAVLink, the communication protocol used by ArduPilot, including QGroundControl, APM Planner, and Mission Planner. These ground stations did not have a robust API and were often unreliable, crashing multiple times during testing. Therefore, the team decided to revamp an existing open-source ground station called MAVProxy. MAVProxy is a command-line and console-based ground station built in Python that can handle communication with ArduPilot. MAVProxy is open-source, and thus the team could manipulate it and build custom modules. The team first created a robust web API that enables other systems, including the imagery system, to easily access autopilot data. In addition, the team designed two web-based interfaces that connect to the API. The first is a graphical user interface to view autopilot status and easily control the system. The second interface lets judges view the status of the autopilot. The team also developed a custom module to connect to the interoperability system. The most important aspect of MAVProxy is its flexibility, which will enable the team to expand in the future.
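The web API layered on MAVProxy is not reproduced in the text. As a hedged illustration of the concept only, the standard-library sketch below serves telemetry as JSON; the endpoint path, field names, and values are placeholders of our own, not the team's actual API:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative in-memory telemetry store; in the real system this would be
# populated from MAVLink packets arriving in a MAVProxy module.
telemetry = {"lat": 38.1, "lon": -76.4, "alt_ft": 412.0, "mode": "AUTO"}

class TelemetryHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/telemetry":
            body = json.dumps(telemetry).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # silence per-request logging

def serve(port: int = 0) -> HTTPServer:
    """Start the API on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), TelemetryHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Clients such as the imagery system or a browser GUI would simply poll the JSON endpoint.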
use PlaneMaker software. The model was then tuned in the simulation by manually editing PID parameters. Once the tuning was perfected in the simulation, the autopilot was added to the aircraft.
After a manual takeoff, the autopilot would take over flight and go into AutoTune mode. Using the PID values from the simulation as a base, the autopilot self-tuned while the pilot flew aggressive maneuvers. AutoTune takes roughly ten to twenty minutes of flight time. Once AutoTune was completed, the plane was landed manually. In subsequent flights, the navigation system was tuned for auto takeoff and auto landing.
Flight Pattern All sections of the flight plan are well-defined by the AUVSI SUAS Competition except
that of the search grid. For this portion of the mission, a rectangular sweeping pattern will be used that
gradually moves down the length of the search grid. The desired spacing between passes on the search grid
is defined by the total distance imaged along the roll axis, found to be 180 ft. This distance was computed
as part of the distance-imaged-per-pixel calculation shown in Equations (3) and (4). Thus, a rectangular
sweeping pattern with passes separated by at most 180 ft will be used.
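The sweeping pattern described above can be sketched as a simple waypoint generator; the local east/north coordinate frame and the function shape are our own assumptions for illustration:

```python
# Generate a rectangular "lawnmower" sweep over a width x length search
# area (feet, local frame), with passes separated by spacing_ft as derived
# from the distance-imaged-per-pixel calculation.
def sweep_waypoints(width_ft: float, length_ft: float, spacing_ft: float):
    waypoints, y, direction = [], 0.0, 1
    while y <= length_ft:
        # Alternate pass direction so the aircraft sweeps back and forth.
        xs = (0.0, width_ft) if direction > 0 else (width_ft, 0.0)
        waypoints += [(xs[0], y), (xs[1], y)]
        y += spacing_ft
        direction = -direction
    return waypoints
```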
Communication All communication with the autopilot will go over an encrypted, paired 900MHz XBee radio link. The communication protocol, MAVLink, is highly robust to noise and is designed to work over a slow connection to ensure a long-distance link. The XBee link can reach up to 3 miles, although this range was not tested.
Failsafes Both failsafes are implemented in the autopilot. After 30 seconds of loss of the radio control signal, the plane changes to Return To Launch mode. The plane stays in this mode for 3 minutes. If radio control is not regained after those 3 minutes, the plane performs aerodynamic termination. If the radio control link is regained at any point during the 3.5 minutes of failsafe, the plane returns to the autonomous mission. Both failsafes have been tested on the ground; neither has been tested in the air.
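The failsafe timing can be summarized as a small state function; the thresholds come from the text (30 seconds to Return To Launch, 3 further minutes to termination), while the function itself is an illustrative sketch rather than the ArduPilot implementation:

```python
# Map time since radio-control loss to the failsafe behavior described above.
def failsafe_mode(seconds_since_rc_loss: float) -> str:
    if seconds_since_rc_loss < 30.0:
        return "AUTO"        # brief dropouts: continue the autonomous mission
    if seconds_since_rc_loss < 30.0 + 180.0:
        return "RTL"         # Return To Launch for up to 3 minutes
    return "TERMINATE"       # aerodynamic termination after 3.5 minutes total
```

Regaining the link at any point resets the timer, returning the plane to the mission.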
3.2.3 Testing
The open-source nature of the ArduPilot system allows for changes to its software, and in some ways
actually requires them. Since the ArduPilot software is not a commercial product, the codebase contained
small bugs and was poorly documented in many places. The system, therefore, required extensive testing
and documentation in order to perform reliably. These issues were first addressed in a simulated environment
before being flight-tested on a real aircraft.
The beginning of the year was spent assembling a simulation suite that connected a simulated autopilot to FlightGear, a flight simulator. The simulation runs the same code as the physical autopilot and is able to simulate all processor and sensor aspects, such as registers, communication delays, and sensor noise. The team wrote a connection layer that passed servo values from the simulated autopilot to FlightGear, which simulated the flight dynamics. The connection layer then took the attitude and GPS values from FlightGear and passed them back to the autopilot as simulated sensor values. The plane tested in the simulator was provided by the ArduPilot community and very closely resembled the Autopilot Test Aircraft (Bixler V3). This simulation system allowed the team to test and tune the aircraft before the initial flight.
Second, software was tested on a test platform that was light and inexpensive, allowing for relatively safe crashes in the event of software issues. The test aircraft also allowed the autopilot team to work with a real aircraft with flight characteristics similar to the competition plane while the mechanical team built and assembled the competition aircraft, Eos.
4 Ground System
The ground system for the Eos UAS comprises the equipment and software necessary for wireless communication, image processing, in-flight re-tasking, and manual override of autonomous airborne systems. Eos' ground system also includes a pneumatic catapult to provide consistent, reliable launches.
4.1 Catapult
The Eos UAS employs a catapult for takeoff and belly-lands in order to facilitate fully autonomous takeoff and landing. Previous iterations of the Cornell team's UAS have used fixed landing gear to perform takeoffs and landings from a runway. While the autopilot system is capable of autonomous rolling takeoff and landing, it was unreliable at best. Therefore, the Cornell team decided to explore other takeoff and landing approaches, such as hand launching. A hand launching approach suffers from the unreliability and inconsistency of the human operator, and it also limits the size of the UAV. A catapult launch was selected for its reliability, consistency, and proven effectiveness with commercial and military UAV systems.
Additionally, catapult launch and belly-landing allows the system to be tested even when ground conditions
are unfavorable. The Cornell team designed, manufactured, and tested a portable, reliable, and safe linear
catapult system driven by a compressed air piston-pulley propulsion system.
Eos' catapult is composed of the cart, the rail, the pneumatic system, and the control system. The rail is made of two lengths of galvanized steel square tubing connected by an inner coupling piece. Within the interior of the rail rests the cylinder for the piston, which converts the pressure from the compressed air into a force on the cart. On the rear end is the bracket for a safety pin that attaches to the back of the cart. On the front end are the support legs, shock-absorbing springs for the cart, and the pulley mechanism. Beside the catapult sit the reservoir and the catapult pressure control box (CPCB). The CPCB controls the pressurization of the reservoir and the firing of the catapult. The catapult remote control unit (CRCU) is connected by a CAT5 cable to the CPCB and displays information about the state of the catapult to the user. The CRCU also allows the user to control the pressurization and launch of the catapult from a distance.
During launch, the pneumatic system releases compressed air, which forces a piston rearward down the
center of the rail, pulling a cable that loops around a pulley at the front and accelerating the cart and aircraft.
Once the cart reaches the end of the rail, the cart is stopped by two large springs, and the arms that hold
the aircraft are free to rotate forward, allowing the aircraft to continue moving straight.
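As a rough sanity check on the launch mechanics described above, the work done by the compressed air over the piston stroke can be equated with the kinetic energy of the cart and aircraft. The sketch below illustrates this energy balance; every number in it (pressure, bore, stroke, mass, efficiency) is an illustrative assumption, not a measured Eos parameter.

```python
import math

def launch_speed(gauge_pressure_pa, piston_area_m2, stroke_m,
                 moving_mass_kg, efficiency=0.7):
    """Ideal piston work P*A*L, derated by an assumed efficiency factor for
    friction and expansion losses, converted to cart speed via KE = mv^2/2."""
    work_j = gauge_pressure_pa * piston_area_m2 * stroke_m * efficiency
    return math.sqrt(2.0 * work_j / moving_mass_kg)

# Illustrative numbers only: ~60 psi gauge (414 kPa), 2-inch (50.8 mm) bore,
# 2.5 m stroke, and 7 kg of combined cart and aircraft mass.
bore_m = 0.0508
area = math.pi * (bore_m / 2) ** 2
v = launch_speed(414_000, area, 2.5, 7.0)  # roughly 20 m/s under these assumptions
```

A calculation of this form lets the pressure setting be tuned so the cart leaves the rail comfortably above the aircraft's stall speed.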
Should the ground server crash and be restarted, it can resume exactly where it left off. To accomplish
this, the server is stateless: all internal state is offloaded to PostgreSQL, which handles the complex
persistence and recovery logic automatically. The ground server can interface with any number of clients.
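The crash-recovery scheme can be sketched as a handler that keeps no state in process memory, so a fresh instance picks up exactly where the last one stopped. The sketch below uses Python's built-in sqlite3 as a stand-in for PostgreSQL; the class name, schema, and methods are illustrative, not the team's actual server code.

```python
import sqlite3

class GroundServer:
    """Stateless request handler: every piece of state lives in the database,
    so a restarted server resumes where the previous instance left off.
    (sqlite3 stands in for PostgreSQL; schema and names are illustrative.)"""

    def __init__(self, db_path):
        self.db = sqlite3.connect(db_path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS targets ("
            " id INTEGER PRIMARY KEY, shape TEXT, lat REAL, lon REAL)"
        )
        self.db.commit()

    def add_target(self, shape, lat, lon):
        # Each request is a single transaction; nothing is cached in memory.
        cur = self.db.execute(
            "INSERT INTO targets (shape, lat, lon) VALUES (?, ?, ?)",
            (shape, lat, lon),
        )
        self.db.commit()
        return cur.lastrowid

    def list_targets(self):
        return self.db.execute(
            "SELECT id, shape, lat, lon FROM targets").fetchall()
```

Because no request handler touches in-process state, "recovery" is simply reconnecting to the same database, and any number of clients can be served by any number of server instances.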
The client application was designed to run in a web browser using standard JavaScript, HTML, and
CSS. This takes advantage of the ubiquity of the browser as an application platform: deploying the application
is nearly instantaneous on any computer with a browser. Each client enables its operator to perform the targeting
task and, by interfacing with the server, to combine work done on multiple clients. The client is also capable
of issuing commands to the payload application.
4.5 Localization
In the Eos ground system, after the location of a target is identified in an image, whether through manual or
automatic means, its GPS location is determined through a single localization algorithm. Though thresholds
for scoring are specified in the mission outline, the team did not design to them, but simply attempted to devise
a correct algorithm that would produce good results when error is factored in. This consolidated localization
algorithm estimates the GPS location of each target in three distinct steps. First, for each sighting of a
target in an image, the location of the target relative to the aircraft is estimated by calculating a pointing
vector from the aircraft to the target.
Each pointing vector is initially constructed pointing straight down, but then is rotated to account for
the camera's orientation and the target's location in the image. The rotation of the camera in world space is
measured by an IMU mounted on the camera, and the state is captured by the gimbal controller at the same
instant the camera is triggered. This ensures that the position and orientation of the camera are as accurate
as possible; any timing error would make this data incorrect. The implicit rotation from the camera's
line of sight to the intended pointing vector is provided by the target's location in the image. In the second
step, for each of the calculated pointing vectors, the GPS location of the target is estimated from the
aircraft's GPS location at the moment of the sighting: the target location is derived via vector addition of
the pointing vector to the aircraft location. As a final step, the GPS location estimates from each sighting
of a target are combined into a single final estimate. The data is cleaned of outliers and impossible points,
and then the average of the remaining location estimates is calculated.
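The three steps above can be sketched in simplified form. The version below assumes a flat-ground, spherical-earth model and collapses the pointing-vector rotation into total camera tilt angles (attitude plus in-image offset); all function names, the 2-sigma outlier threshold, and the geometry simplifications are illustrative assumptions, not the team's exact implementation.

```python
import math
from statistics import mean, pstdev

EARTH_R = 6_371_000.0  # meters, spherical-earth assumption

def estimate_target(aircraft_lat, aircraft_lon, alt_m, tilt_north, tilt_east):
    """Steps 1-2: project the pointing vector onto flat ground and add the
    resulting offset to the aircraft's GPS position. tilt_* are the camera's
    total tilt angles in radians (attitude plus target offset in the image)."""
    north_m = alt_m * math.tan(tilt_north)
    east_m = alt_m * math.tan(tilt_east)
    dlat = math.degrees(north_m / EARTH_R)
    dlon = math.degrees(east_m / (EARTH_R * math.cos(math.radians(aircraft_lat))))
    return aircraft_lat + dlat, aircraft_lon + dlon

def combine_sightings(estimates, max_sigma=2.0):
    """Step 3: drop estimates farther than max_sigma standard deviations from
    the centroid, then average the survivors into one final GPS fix."""
    lats = [e[0] for e in estimates]
    lons = [e[1] for e in estimates]
    clat, clon = mean(lats), mean(lons)
    dists = [math.hypot(la - clat, lo - clon) for la, lo in estimates]
    sigma = pstdev(dists) or float("inf")  # keep everything if all agree
    kept = [e for e, d in zip(estimates, dists) if d <= max_sigma * sigma]
    return mean(p[0] for p in kept), mean(p[1] for p in kept)
```

Averaging many per-sighting estimates is what makes the scheme robust: individual sightings carry attitude and timing error, but that error is largely uncorrelated across passes.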
5 Mission Operation
Ten people operate the Eos UAS. First, there is a mission lead, who does not have a predefined task other
than to coordinate the other operators and serve as a unified interface to the team for the evaluators. Next,
there is a safety pilot who performs manual takeoff, monitors the flight while under autopilot control, and
performs manual landing. He has the ability to take manual control at any time during the flight if necessary.
By his side is the spotter, whose only role is to assist the safety pilot, adjusting trims and ensuring that
the runway is clear, among other things. The remainder of the operators are seated at the ground station,
each interfacing with part of the ground software. The autopilot operator is responsible for the autonomous
navigation and flight portion of the mission, and will assign flight plans and re-task the aircraft according to
the direct mission requirements and requests from other operators. He is assisted by the autopilot assistant,
who is tasked with monitoring the status of the autonomy software during re-tasking. For completing the
imagery task, the Cornell team has an imagery lead who is responsible for changing camera settings, merging
target sightings into logical targets, providing actionable intelligence, and providing the target information to
the evaluators at the end of the mission. The remaining four operators are responsible for using the MDLC
application to identify and tag targets in the imagery; they may also assist the imagery lead, from their
own workstations, in completing any of his tasks. These operators are responsible for monitoring automated
tasks as well, to ensure that the mission can still be completed in the event of a failure. One monitors the
ADLC program, another monitors the antenna tracking machine, yet another monitors the completion of
the SRIC task, and the last is responsible for arming the air drop system as well as executing the manual
override in the case that the automated drop fails.
The mission timeline consists of three phases: setup, execution, and teardown. Setup of the ground
station involves placing the tables and chairs, setting up the individual computer nodes, connecting all
network cables to switches and computers, connecting all power cables to power strips or uninterruptible
power supplies, initializing the antenna tracker, and placing the sunshades on the table to decrease sun glare
on the computer monitors. The execution phase starts with the mission clock. The safety pilot completes a
short preflight test to verify the operation of the payload components, manual aircraft control, and navigation
system sensor readings. This preflight test takes around 5 minutes to complete and helps to ensure mission
success upon takeoff. The next step is a manual takeoff followed by 20 minutes of flight time.
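A preflight test of this kind can be expressed as a simple automated checklist. The sketch below is purely illustrative: the check names and pass thresholds (battery voltage, GPS satellite count, radio signal, control-surface response) are assumptions standing in for the team's actual procedure.

```python
def run_preflight(readings):
    """Return (ok, failures) for a dict of preflight sensor/actuator readings.
    Every threshold below is an illustrative assumption."""
    checks = {
        "battery_v": lambda v: v >= 12.4,        # assumed minimum pack voltage
        "gps_sats": lambda n: n >= 6,            # assumed satellite-lock count
        "link_rssi_dbm": lambda r: r >= -80,     # assumed minimum radio signal
        "control_surfaces_ok": lambda b: b is True,
    }
    failures = [name for name, passes in checks.items()
                if name not in readings or not passes(readings[name])]
    return (not failures, failures)
```

Scripting the checklist makes the five-minute preflight repeatable and ensures no item is skipped under time pressure on the mission clock.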
5.1 Safety
Adequate safety precautions must be observed while testing and operating an aircraft, especially systems
with some degree of autonomy. The Cornell team strongly emphasized the safe operation of the flight
vehicle through all phases of design, development, and flight operations to mitigate the risk of personal
harm or property damage. The team has designed a system with redundancies, failsafes (as defined by
mission requirements), and appropriate factors of safety that significantly reduce potential risks during
system operation. During flight operations, the safety pilot retains the ability to instantly regain full manual
control of the aircraft at any time by toggling a switch on the pilot transmitter. The safety pilot is an AMA-
licensed pilot with years of experience flying model aircraft. A spotter stands by the safety pilot at all times
to observe the surrounding areas for other aircraft or hazards, maintain flight line safety, and communicate
to the other ground system personnel over a handheld radio.
6 Summary
The Cornell University Team has designed the Eos Unmanned Aerial System to optimize mission perfor-
mance by creating subsystems that complement one another. Each system, as well as the whole UAS, was
implemented and thoroughly tested to ensure safe and reliable mission operation. Through thoughtful design
and testing, the team is confident that the Eos UAS would have been able to complete the following mission
tasks: manual takeoff and landing; autonomous flight and waypoint navigation; manual sense, detect, and
avoid; interoperability; target detection, localization, and classification; autonomous search; automatic tar-
get localization; actionable intelligence; off-axis target imagery and classification; emergent target re-tasking
and identification; simulated remote intelligence center interaction; and autonomous air drop release with
required drop accuracy.
Unfortunately, while perfecting tuning parameters shortly before team members left campus, the Eos
UAS entered an unrecoverable flat spin and crashed. As a result, the team will not be flying at the 2015
SUAS Competition.