
Virtual Cockpit

Under the guidance of Dr. J. Chandrasekhar


Niranjan R
CB.EN.U4AEE12029
Vishal Hayagrivan
CB.EN.U4AEE12055
Navaneethkrishanan B
CB.EN.U4AEE12061
Abstract
Unmanned Aerial Vehicles (UAVs) are used for numerous applications such as military operations, recreation, and surveillance. These machines have a great advantage over conventional methods: they pose no risk to the operator's life, they are not prone to fatigue, and in many situations they provide a cheaper alternative to their manned counterparts. Even with all these benefits, however, they lack one advantage that manned vehicles retain: the first-person view that lets the pilot experience the flight first hand. Our Virtual Cockpit provides a VR (Virtual Reality) cockpit environment to the pilot on the ground. The Virtual Cockpit prototype was developed for a Virtual Flight Testing Laboratory, and the open-source nature of the platform enables it to be modified to suit any kind of UAV application.

Contents
1. Introduction:
2. Stages of development
2.1. Web-based instrument Panel (Basic T)
2.2. Video + Heads-up display prototype
2.3. Stereoscopic 3D view + Head orientation tracking using Gyroscope
3. Proof of Concept
3.1. FlightGear Parameter Logging Suite
3.2. Flight testing comparison between Virtual Cockpit and Actual Flight testing
3.2.1. Glide
3.2.2. Phugoid Mode
4. Conclusion
References


1. Introduction:
The Virtual Cockpit is virtual-reality software that interfaces the pilot with an unmanned aerial vehicle or a simulator. The main aim is to give pilots the feeling that they are inside the aircraft they are piloting, even when that is physically impossible. This enables even a commercial pilot to fly a UAV with no prior training or experience. It can be useful in areas such as pilot training, virtual aerial city tours, and flight testing. The prototype software was developed to suit the needs of a virtual flight testing laboratory. Flight testing is a branch of aeronautical engineering that develops and gathers data during flight of an aircraft, and then analyzes the data to evaluate the aerodynamic flight characteristics of the vehicle in order to validate the design, including safety aspects. [9]
The advantages of having a Virtual Cockpit environment in the flight testing of UAVs:

1. Data can be streamed and analyzed in real time
2. The pilot has a first-person view
3. Real-time data analysis can detect errors and faults arising in the aircraft
4. Greater computational power is available, since the aircraft is controlled from the ground; this makes data analysis far faster and could allow machine learning to predict performance or suggest an optimized flight path

The developed Virtual Cockpit prototype was tested with two simulators, FlightGear and X-Plane 10, and the proof of concept was demonstrated by conducting a few flight test procedures using the Virtual Cockpit on X-Plane 10 and comparing them with actual flight test results.

2. Stages of development
The Virtual Cockpit interface was split into three modules: the flight instrument panel, the video heads-up display, and the virtual-reality environment.

2.1. Web-based instrument Panel (Basic T)


UAVs must have a proper, easily readable instrument panel to help the pilot at the ground station fly the aircraft effectively. A web-based instrument panel was chosen over a native, platform-specific one because a web-based panel can run on any computer, tablet, or mobile phone with a reasonably capable web browser. It reduces the resource requirements for the end user and is widely compatible.
The web-based instrument panel communicates with a ground station in real time via websockets. The five basic instruments have been included in the prototype:
1. Artificial Horizon
2. Heading Indicator
3. Vertical Speed Indicator
4. Airspeed Indicator
5. Altitude Indicator
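Each sample reaches these instruments as a JSON message pushed over the websocket. The following is a minimal sketch of the ground-station side serializing one attitude sample; the field names here are illustrative assumptions, not the prototype's actual schema:

```python
import json

def make_panel_message(roll, pitch, heading, vspeed, airspeed, altitude):
    """Serialize one attitude sample for the web instrument panel.

    Field names are hypothetical; the real panel defines its own schema.
    """
    return json.dumps({
        "roll": roll,          # degrees, positive right wing down
        "pitch": pitch,        # degrees, positive nose up
        "heading": heading,    # degrees, 0-360
        "vspeed": vspeed,      # feet per second
        "airspeed": airspeed,  # knots
        "altitude": altitude,  # feet
    })

msg = make_panel_message(5.0, -2.5, 270.0, 1.2, 65.0, 1500.0)
```

On the browser side, `JSON.parse(msg)` yields an object whose fields drive the CSS transforms of the individual instrument components.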

Figure 1. The instrument Panel


The data is received in the form of JSON (JavaScript Object Notation) [8]. The received data is parsed using JavaScript's built-in JSON parser. Each instrument in the panel consists of smaller components that can be individually moved. Based on the data received, the components are transformed (rotated, translated) using JavaScript and CSS (Cascading Style Sheets), thereby making the instrument panel look and feel like a mechanical panel.
A total of 520 lines of code was written in Java, JavaScript, HTML, and CSS to build this instrument panel.
The following snippet shows how the data is displayed on the instruments.
The following is a snippet showing how the data is displayed on the instruments.
// ROLL
$(this).find('div.instrument.attitude div.roll')
    .css('transform', 'rotate(' + roll + 'deg)');
// PITCH
$(this).find('div.instrument.attitude div.roll div.pitch')
    .css('top', pitch * 0.7 + '%');
// HEADING
$(this).find('div.instrument.heading div.heading')
    .css('transform', 'rotate(' + -heading + 'deg)');
// TURN
$(this).find('div.instrument.turn_coordinator div.turn')
    .css('transform', 'rotate(' + turn + 'deg)');
// VERTICAL SPEED (clamped to the vario bounds, then scaled to degrees)
if (vspeed > constants.vario_bound) {
    vspeed = constants.vario_bound;
} else if (vspeed < -constants.vario_bound) {
    vspeed = -constants.vario_bound;
}
vspeed = vspeed * 90;
$(this).find('div.instrument.vario div.vspeed')
    .css('transform', 'rotate(' + vspeed + 'deg)');
// AIRSPEED
speed = 90 + speed * 2;
$(this).find('div.instrument.airspeed div.speed')
    .css('transform', 'rotate(' + speed + 'deg)');
// ALTITUDE (long needle: 360 deg per 1000 ft; short needle: 360 deg per 10000 ft)
var needle = 90 + altitude % 1000 * 360 / 1000;
var needleSmall = altitude / 10000 * 360;
$(this).find('div.instrument.altimeter div.needle')
    .css('transform', 'rotate(' + needle + 'deg)');
$(this).find('div.instrument.altimeter div.needleSmall')
    .css('transform', 'rotate(' + needleSmall + 'deg)');
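The altimeter arithmetic in the snippet above can be sanity-checked on its own: the long needle sweeps 360 degrees per 1000 ft (offset by 90 degrees), the short needle 360 degrees per 10000 ft. A quick check of the same mapping:

```python
def altimeter_needles(altitude):
    """Reproduce the altimeter needle angles from the panel code (degrees)."""
    needle = 90 + (altitude % 1000) * 360 / 1000.0   # long needle: one turn per 1000 ft
    needle_small = altitude / 10000.0 * 360          # short needle: one turn per 10000 ft
    return needle, needle_small

# At 1500 ft the long needle sits at 90 + 180 = 270 deg
# and the short needle at 54 deg.
```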

2.2. Video + Heads-up display prototype


A prototype for the proposed video + heads-up display was developed using an Android mobile device. The video feed was acquired from the device's camera, and the device's internal gyroscopes and accelerometers were used for orientation. The prototype lets the user see the camera feed overlaid with a HUD that displays the following, based on the orientation of the device:
1. Artificial Horizon
2. Heading
3. Roll
4. Pitch
The main sensor used was the gyroscope. It provides the rate of rotation in rad/s around the device's x, y, and z axes.

Figure 2. Heads-up display


Rotation is positive in the counter-clockwise direction; that is, an observer looking from some positive location on the
x, y or z axis at a device positioned on the origin would report positive rotation if the device appeared to be rotating counter
clockwise. This is the standard mathematical definition of positive rotation and is not the same as the definition for roll
that is used by the orientation sensor.
The output of the gyroscope is integrated over time to calculate a rotation describing the change of angles over the time step.
A total of 410 lines of code was written in Java to implement this prototype.
The following snippet shows how the raw gyroscope rates are integrated to obtain the roll, pitch, and heading angles.
// Create a constant to convert nanoseconds to seconds.
private static final float NS2S = 1.0f / 1000000000.0f;
private final float[] deltaRotationVector = new float[4];
private float timestamp;

public void onSensorChanged(SensorEvent event) {
    // This timestep's delta rotation is multiplied by the current rotation
    // after computing it from the gyro sample data.
    if (timestamp != 0) {
        final float dT = (event.timestamp - timestamp) * NS2S;
        // Axis of the rotation sample, not normalized yet.
        float axisX = event.values[0];
        float axisY = event.values[1];
        float axisZ = event.values[2];
        // Calculate the angular speed of the sample.
        float omegaMagnitude = (float) Math.sqrt(axisX*axisX + axisY*axisY + axisZ*axisZ);
        // Normalize the rotation vector if it's big enough to get the axis
        // (EPSILON is the maximum allowable margin of error).
        if (omegaMagnitude > EPSILON) {
            axisX /= omegaMagnitude;
            axisY /= omegaMagnitude;
            axisZ /= omegaMagnitude;
        }
        // Integrate around this axis with the angular speed over the timestep
        // to get a delta rotation from this sample. The axis-angle
        // representation of the delta rotation is converted into a quaternion
        // before turning it into the rotation matrix.
        float thetaOverTwo = omegaMagnitude * dT / 2.0f;
        float sinThetaOverTwo = (float) Math.sin(thetaOverTwo);
        float cosThetaOverTwo = (float) Math.cos(thetaOverTwo);
        deltaRotationVector[0] = sinThetaOverTwo * axisX;
        deltaRotationVector[1] = sinThetaOverTwo * axisY;
        deltaRotationVector[2] = sinThetaOverTwo * axisZ;
        deltaRotationVector[3] = cosThetaOverTwo;
    }
    timestamp = event.timestamp;
    float[] deltaRotationMatrix = new float[9];
    SensorManager
        .getRotationMatrixFromVector(deltaRotationMatrix, deltaRotationVector);
    // Concatenate the delta rotation with the current rotation
    // to get the updated rotation:
    // rotationCurrent = rotationCurrent * deltaRotationMatrix;
}
The gyroscope readings obtained are used to update the pitch, roll, and horizon displayed on the HUD. The mobile device's internal electronic compass is queried to update the heading on the HUD.
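The axis-angle to quaternion step in the Java snippet above can be verified numerically. The sketch below integrates one gyro sample the same way; for a pure z-axis rotation the expected quaternion is known in closed form:

```python
import math

def delta_rotation_quaternion(wx, wy, wz, dt):
    """One integration step of gyro rates (rad/s) over dt seconds.

    Returns the delta rotation as a quaternion (x, y, z, w), mirroring
    the deltaRotationVector computed in the Android snippet.
    """
    omega = math.sqrt(wx * wx + wy * wy + wz * wz)
    if omega > 1e-9:                          # EPSILON guard, as in the Java code
        ax, ay, az = wx / omega, wy / omega, wz / omega
    else:
        ax, ay, az = 0.0, 0.0, 0.0
    theta_over_two = omega * dt / 2.0
    s, c = math.sin(theta_over_two), math.cos(theta_over_two)
    return (s * ax, s * ay, s * az, c)

# Rotating about z at pi/2 rad/s for 1 s is a 90 degree turn,
# i.e. the quaternion (0, 0, sin(pi/4), cos(pi/4)).
```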

2.3. Stereoscopic 3D view + Head orientation tracking using Gyroscope


For the final stage of the project, a virtual-reality headset setup was developed that allows the user to experience the simulation in 360 degrees. Various virtual-reality headsets were reviewed:

1. Oculus Rift
2. Samsung Gear VR
3. HTC Vive
4. Google Cardboard

Google Cardboard was the best candidate due to its low cost, easy availability, DIY nature, and eco-friendly components [5]. Google Cardboard is a DIY (do-it-yourself) VR headset proposed by Google, Inc. that uses pieces of cardboard and lenses to form a headset.
It consists of two simple convex lenses, which focus the image from the screen onto the retina, and a neodymium magnet, which acts as a mode of input. A mobile device is placed inside the cardboard structure, and the lenses ensure that the image on the screen is properly and comfortably interpreted by the eyes.
Stereopsis (from the Greek stereo-, meaning solid, and opsis, appearance or sight) is a term most often used to refer to the perception of depth and 3-dimensional structure obtained on the basis of visual information deriving from two eyes by individuals with normally developed binocular vision [10]. This phenomenon of stereopsis is used to create 3D virtual reality.
FlightGear can provide a stereoscopic (side-by-side) output of the rendered simulation. To transmit this stream live to a mobile device, Moonlight (an open-source implementation of NVIDIA's GameStream protocol) is used. The video feed is encoded and transmitted via a USB Ethernet interface.
The client application on the Android device is written in Java. The application receives the video feed and displays it side by side on the screen. Head movements are recognized using the gyroscope inside the Android device. These movements are sent back to the laptop, and the mouse pointer is moved proportionally, causing the FlightGear view to change with the head position.
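The head-tracking loop described above reduces to a proportional mapping from integrated gyro angles to a mouse-pointer displacement. A minimal sketch of that mapping, with the pixels-per-degree gains as assumed tuning constants rather than values from the project:

```python
# Illustrative tuning constants; real gains would be calibrated against
# FlightGear's mouse-look sensitivity.
PIXELS_PER_DEG_YAW = 10.0
PIXELS_PER_DEG_PITCH = 10.0

def head_delta_to_mouse(d_yaw_deg, d_pitch_deg):
    """Map a change in head yaw/pitch (degrees) to a mouse move (dx, dy) in pixels.

    Positive yaw (head turning right) pans the view right; positive pitch
    (looking up) pans it up, hence the negated dy for screen coordinates.
    """
    dx = d_yaw_deg * PIXELS_PER_DEG_YAW
    dy = -d_pitch_deg * PIXELS_PER_DEG_PITCH
    return dx, dy
```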

Figure 3. Stereoscopic display

3. Proof of Concept
The Virtual Cockpit was used to perform flight test procedures in two different simulation environments, FlightGear and X-Plane 10.
The difference between the two simulators is the methodology each uses to predict how the aircraft flies. Laminar Research's X-Plane is considered to be the most realistic simulator available to the public. Unlike most simulation engines, X-Plane does not rely on stability derivatives to define how an aircraft should fly; instead, it performs actual flow calculations many times per second to determine how the given aircraft flies in the simulated environment. The engineering method used to calculate the simulated flow field is blade element theory.
FlightGear uses stability derivatives and look-up tables to predict how the aircraft in question might fly. Aircraft stability coefficients are based upon the aircraft's geometry and each component's interaction with the others during flight. These coefficients can be estimated using equations or found using empirical sources such as the United States Air Force's (USAF) Digital DATCOM. [1]
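The derivative-based approach amounts to evaluating closed-form coefficient buildups instead of computing a flow field. A toy example of such a buildup for the lift coefficient, with made-up derivative values of the kind DATCOM would supply for a real airframe:

```python
import math

# Illustrative stability derivatives (per radian); real values would come
# from DATCOM or wind-tunnel data for the specific aircraft.
CL0 = 0.25        # lift coefficient at zero angle of attack
CL_ALPHA = 5.0    # lift-curve slope
CL_DE = 0.4       # elevator effectiveness

def lift_coefficient(alpha_deg, delta_e_deg):
    """Linear lift buildup: CL = CL0 + CL_alpha * alpha + CL_de * delta_e."""
    alpha = math.radians(alpha_deg)
    delta_e = math.radians(delta_e_deg)
    return CL0 + CL_ALPHA * alpha + CL_DE * delta_e
```

A table-driven simulator evaluates expressions like this (or interpolates them from look-up tables) every frame, which is far cheaper than X-Plane's per-panel flow computation.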

3.1. FlightGear Parameter Logging Suite


A parameter logging suite was built from scratch to log flight, aerodynamic, and environment parameters live during a simulation session on FlightGear. The same suite can be extended to X-Plane with minimal modifications to the ports and DataRef paths.
The suite has a user-friendly web interface which allows us to easily select the parameters to log and the interval between consecutive polls.
The interface is coded in Python 2.7.x. The software communicates with a FlightGear instance via a TCP (telnet-style) connection that FlightGear opens on port 5555. The software traverses the FlightGear property tree [4] to pick the values of the desired parameters.
A total of 1100 lines of code was written in Python, HTML, JavaScript, and CSS to implement this parameter logging suite.
The following parameters can be logged by the suite.
The following parameters can be logged by the suite.
1. Altitude (in feet) - /position[0]/altitude-ft
2. Odometer Reading (in feet) - /instrumentation[0]/gps[0]/odometer

3. Pitch (in degrees) - /orientation[0]/pitch-deg
4. Heading (in degrees) - /orientation[0]/heading-deg
5. Roll (in degrees) - /orientation[0]/roll-deg
6. Alpha (in degrees) - /orientation[0]/alpha-deg
7. Beta (in degrees) - /orientation[0]/beta-deg
8. Yaw (in degrees) - /orientation[0]/yaw-deg
9. Path (in degrees) - /orientation[0]/path-deg
10. Roll rate (in degrees/second) - /orientation[0]/roll-rate-degps
11. Pitch rate (in degrees/second) - /orientation[0]/pitch-rate-degps
12. Yaw rate (in degrees/second) - /orientation[0]/yaw-rate-degps
13. Side slip (in degrees) - /orientation[0]/side-slip-deg
14. Track (in degrees) - /orientation[0]/track-deg
15. Aileron deflection (in degrees) - /controls[0]/flight[0]/aileron
16. Aileron trim (in degrees) - /controls[0]/flight[0]/aileron-trim
17. Elevator deflection (in degrees) - /controls[0]/flight[0]/elevator
18. Elevator trim (in degrees) - /controls[0]/flight[0]/elevator-trim
19. Rudder deflection (in degrees) - /controls[0]/flight[0]/rudder
20. Rudder trim (in degrees) - /controls[0]/flight[0]/rudder-trim
21. Flaps deflection (in degrees) - /controls[0]/flight[0]/flaps
22. Wing sweep (in degrees) - /controls[0]/flight[0]/wing-sweep
23. Vertical speed (in feet/second) - /velocities[0]/vertical-speed-fps
24. Airspeed (in knots) - /velocities[0]/airspeed-kt
25. Groundspeed (in knots) - /velocities[0]/groundspeed-kt
26. Glideslope - /velocities[0]/glideslope
27. Airspeed (in Mach) - /velocities[0]/mach
28. u (in feet/second) - /fdm[0]/jsbsim[0]/velocities[0]/u-fps
29. v (in feet/second) - /fdm[0]/jsbsim[0]/velocities[0]/v-fps
30. w (in feet/second) - /fdm[0]/jsbsim[0]/velocities[0]/w-fps
31. p (in rad/second) - /fdm[0]/jsbsim[0]/velocities[0]/p-rad_sec
32. q (in rad/second) - /fdm[0]/jsbsim[0]/velocities[0]/q-rad_sec
33. r (in rad/second) - /fdm[0]/jsbsim[0]/velocities[0]/r-rad_sec
34. u dot (in feet/second²) - /fdm[0]/jsbsim[0]/accelerations[0]/udot-ft_sec2
35. v dot (in feet/second²) - /fdm[0]/jsbsim[0]/accelerations[0]/vdot-ft_sec2
36. w dot (in feet/second²) - /fdm[0]/jsbsim[0]/accelerations[0]/wdot-ft_sec2
37. p dot (in rad/second²) - /fdm[0]/jsbsim[0]/accelerations[0]/pdot-rad_sec2
38. q dot (in rad/second²) - /fdm[0]/jsbsim[0]/accelerations[0]/qdot-rad_sec2
39. r dot (in rad/second²) - /fdm[0]/jsbsim[0]/accelerations[0]/rdot-rad_sec2
40. Engine RPM - /engines[0]/engine[0]/rpm
41. Engine thrust (in lbs) - /engines[0]/engine[0]/thrust-lbs
42. Torque (in ft.lb) - /engines[0]/engine[0]/torque-ftlb
43. Fuel consumed (in lbs) - /engines[0]/engine[0]/fuel-consumed-lbs
44. Weight (in lbs) - /fdm[0]/jsbsim[0]/inertia[0]/weight-lbs
45. CDo - /fdm[0]/jsbsim[0]/aero[0]/coefficient[0]/CDo
46. CDDf - /fdm[0]/jsbsim[0]/aero[0]/coefficient[0]/CDDf
47. CDwbh - /fdm[0]/jsbsim[0]/aero[0]/coefficient[0]/CDwbh
48. CDDe - /fdm[0]/jsbsim[0]/aero[0]/coefficient[0]/CDDe
49. CDbeta - /fdm[0]/jsbsim[0]/aero[0]/coefficient[0]/CDbeta
50. CLwbh - /fdm[0]/jsbsim[0]/aero[0]/coefficient[0]/CLwbh
51. CLDf - /fdm[0]/jsbsim[0]/aero[0]/coefficient[0]/CLDf
52. CLDe - /fdm[0]/jsbsim[0]/aero[0]/coefficient[0]/CLDe
53. CLadot - /fdm[0]/jsbsim[0]/aero[0]/coefficient[0]/CLadot
54. CLq - /fdm[0]/jsbsim[0]/aero[0]/coefficient[0]/CLq
55. CL squared - /fdm[0]/jsbsim[0]/aero[0]/cl-squared
56. kCDge - /fdm[0]/jsbsim[0]/aero[0]/function[0]/kCDge
57. kCLge - /fdm[0]/jsbsim[0]/aero[0]/function[0]/kCLge
58. Fuel tank 1 (in lbs) - /fdm[0]/jsbsim[0]/propulsion[0]/tank[0]/contents-lbs
59. Fuel tank 2 (in lbs) - /fdm[0]/jsbsim[0]/propulsion[0]/tank[1]/contents-lbs
The Python script opens a socket connection to the FlightGear server listening on port 5555. The session is started by sending a NOP (code 241) command to the server, prepended by IAC (code 255), which causes the following byte to be interpreted as a command. [6]
The rest of the commands are sent directly as strings, each terminated by a CRLF sequence (Carriage Return Line Feed, \r\n).
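This framing can be sketched as helper functions that build the raw bytes sent to FlightGear's telnet-style property server; the property path shown is one of those in the list above:

```python
IAC = 255  # "Interpret As Command" escape byte (RFC 854)
NOP = 241  # no-operation command used to start the session

def nop_frame():
    """The initial IAC NOP handshake bytes."""
    return bytes([IAC, NOP])

def get_command(property_path):
    """Frame a property read as an ASCII command terminated by CRLF."""
    return ("get " + property_path + "\r\n").encode("ascii")

# e.g. get_command("/position[0]/altitude-ft")
#      -> b"get /position[0]/altitude-ft\r\n"
```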
The parameters are logged to a CSV (comma-separated values) file [7]. The file can then be opened by any text editor or spreadsheet viewer.
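Writing the samples in that format needs nothing beyond the standard library. A minimal sketch, with column names taken from the parameter list above:

```python
import csv
import io

def rows_to_csv(fieldnames, rows):
    """Serialize logged samples (a list of dicts) to RFC 4180-style CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames, lineterminator="\r\n")
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

samples = [
    {"altitude-ft": 1500.2, "airspeed-kt": 64.8},
    {"altitude-ft": 1498.7, "airspeed-kt": 65.1},
]
csv_text = rows_to_csv(["altitude-ft", "airspeed-kt"], samples)
```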
The communication between the Python interface and the web-based control panel is done via a websocket connection on port 8778. The standard websocket protocol is employed [3]. The web-based control panel uses JavaScript and acts as a websocket client, while the Python interface acts as a websocket server.
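The handshake side of the websocket protocol [3] is compact enough to show: the server proves it speaks WebSocket by hashing the client's Sec-WebSocket-Key together with a fixed GUID. The sketch below uses the worked example from RFC 6455 itself:

```python
import base64
import hashlib

WS_GUID = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"  # fixed GUID from RFC 6455

def websocket_accept(sec_websocket_key):
    """Compute the Sec-WebSocket-Accept header value for a client key."""
    digest = hashlib.sha1((sec_websocket_key + WS_GUID).encode("ascii")).digest()
    return base64.b64encode(digest).decode("ascii")

# The RFC 6455 example key "dGhlIHNhbXBsZSBub25jZQ=="
# yields "s3pPLMBiTxaQ9kYGzzhZRbK+xOo=".
```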

3.2. Flight testing comparison between Virtual Cockpit and Actual Flight testing
3.2.1. Glide
The aim of this procedure is to find the glide velocity and the glide slope of the UAV. The test procedure for glide is: [2]

1. The aircraft is taken to a particular altitude
2. The engine (or motor) is switched off
3. The velocity and the altitude are measured every 100 milliseconds
4. The above steps may be repeated to verify the values
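From the logged (velocity, altitude) samples the glide slope follows directly: over a steady segment, the glide angle is the arctangent of altitude lost over ground distance covered. A sketch with synthetic numbers, not values from the actual test:

```python
import math

def glide_angle_deg(altitude_lost_ft, ground_distance_ft):
    """Glide angle (degrees) from altitude lost and ground distance covered."""
    return math.degrees(math.atan2(altitude_lost_ft, ground_distance_ft))

def glide_ratio(altitude_lost_ft, ground_distance_ft):
    """Glide ratio: ground distance covered per unit of height lost."""
    return ground_distance_ft / altitude_lost_ft

# Synthetic example: 500 ft lost over 5000 ft of ground track
# gives a 10:1 glide ratio and a glide angle of about 5.7 degrees.
```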

Figure 4. Glide velocity comparison

Figure 5. Glide altitude variation

3.2.2. Phugoid Mode


This mode is essentially an airspeed and altitude oscillation at a near-constant angle of attack. The phugoid mode is characterized by alternating climbing and diving of the aircraft. The test procedure for the phugoid mode is: [2]

1. The aircraft is trimmed to the required speed and attitude
2. A step input is given to the elevator for about 10 seconds
3. The elevator is brought back to the trim position after 10 seconds
4. This creates a lightly damped oscillation of velocity and attitude
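A quick cross-check on the measured oscillation is Lanchester's classical approximation, which predicts the phugoid period from the trim speed alone (T ≈ π·√2·V/g), independent of the aircraft's size. The trim speed below is an assumed example, not the tested aircraft's value:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def phugoid_period(trim_speed_ms):
    """Lanchester approximation of the phugoid period (seconds).

    omega ~= sqrt(2) * g / V, so T = 2*pi/omega = pi * sqrt(2) * V / g.
    """
    return math.pi * math.sqrt(2.0) * trim_speed_ms / G

# At a 20 m/s trim speed (typical for a small R/C aircraft),
# the predicted period is about 9 seconds.
```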

Figure 6. Velocity comparison

The frequency is comparable in both cases.

Figure 7. Pitch comparison

4. Conclusion
All the primary objectives of this project have been achieved. The results from the simulator are comparable with those of actual flight tests conducted using an R/C aircraft. This work can be extended by transforming the FlightGear parameter logging suite into a Virtual Flight Testing Lab command center, allowing users to input aircraft dimensions, run simulations, and obtain results through a user-friendly, easy-to-use interface.

References
[1] David W. Babkal. Flight Testing in a Simulation Based Environment. California Polytechnic University. 2011.
[2] M. V. Cook. Flight Dynamics Principles. Arnold, New York. 1997.
[3] I. Fette and A. Melnikov. RFC 6455 - The WebSocket Protocol. 2011. https://tools.ietf.org/html/rfc6455. (Online; accessed 28-April-2016).
[4] FlightGear. Property Tree - FlightGear Wiki. 2015. http://wiki.flightgear.org/Property_tree. (Online; accessed 28-April-2016).
[5] Google Inc. Specifications For Viewer Design - Cardboard Manufacturer Help. 2016. https://support.google.com/cardboard/manufacturers/answer/6323398. (Online; accessed 28-April-2016).
[6] J. Postel and J. Reynolds. RFC 854 - Telnet Protocol Specification. 1983. https://tools.ietf.org/html/rfc854. (Online; accessed 28-April-2016).
[7] Y. Shafranovich. RFC 4180 - Common Format and MIME Type for CSV Files. 2005. https://tools.ietf.org/html/rfc4180. (Online; accessed 28-April-2016).
[8] T. Bray, Ed. RFC 7159 - The JavaScript Object Notation (JSON) Data Interchange Format. 2014. https://tools.ietf.org/html/rfc7159. (Online; accessed 28-April-2016).
[9] Wikipedia. Flight Test - Wikipedia, The Free Encyclopedia. 2016. https://en.wikipedia.org/wiki/Flight_test. (Online; accessed 28-April-2016).
[10] Wikipedia. Stereopsis - Wikipedia, The Free Encyclopedia. 2016. https://en.wikipedia.org/wiki/Stereopsis. (Online; accessed 28-April-2016).

