
OPTIMIZED SATELLITE DATA LINK FOR METEOROLOGICAL GROUND-AIR REPORTING

Piedade, Duarte S.
GMV-SKYSOFT - Av. D. João II, Lote 1.17.02 - Ed. Fernão de Magalhães, 7 - 1998-025 Lisboa

Rosado, André R. S.
GMV-SKYSOFT - Av. D. João II, Lote 1.17.02 - Ed. Fernão de Magalhães, 7 - 1998-025 Lisboa

Freitas, José L. S.
GMV-SKYSOFT - Av. D. João II, Lote 1.17.02 - Ed. Fernão de Magalhães, 7 - 1998-025 Lisboa

ABSTRACT Continuous and dynamic meteorological changes play an important role in aircraft safety during all phases of flight. The development of systems that will enable all aircraft to obtain timely, dedicated, improved weather information in the near future is therefore a highly relevant topic for increasing flight safety. A fully integrated system demonstrating the dynamic data link between a ground-based weather server and an airborne avionics weather information processing client was developed in the European Commission FP6 project FLYSAFE. This integrated system encompasses an architecture spanning the ground-based platforms, the communication solutions (e.g. the satellite and wireless data link approaches) and the airborne applications. This paper presents the innovative means that allow the aircraft to manage the requests for data previously gathered and formatted on the ground, so that it can be received by the onboard systems, together with the optimized satellite data link system that provides the link to the ground. The fully integrated system was tested in several successive flight tests carried out over a one-month period. Finally, the achievements are illustrated, as well as the next steps to be performed in order to continuously improve the presented concept. KEYWORDS Real-Time, Ground Server, Weather Products, Data-link, SATCOM, Airborne, Flight trials, FLYSAFE

1. INTRODUCTION
Nowadays, weather reports and associated data are delivered to the aviation actors via several sources, e.g. by voice communication, or on paper on the ground. The first source of data is the World Area Forecast Centre (WAFC), a meteorological centre that provides real-time meteorological information broadcasts for aviation purposes, applicable to global operations according to a production schedule. This data is valid for six hours from the time of production. Another source of weather information is available from the National Weather Centres of each National Air Traffic Management Area. Further sources of weather information are the Airport Authorities, in the form of TAFs (specially encoded weather forecasts for a terminal aerodrome) and METARs (Aviation Routine Weather Reports); Eurocontrol also provides a service that collates (non-weather) information that may affect aircraft operations, e.g. Notices to Airmen (NOTAM), accessible from the European Aeronautical Database (EAD). In addition, the Tropical Cyclone and Volcanic Ash Advisory Centres provide notifications of severe weather and hazardous volumes of airspace. The FLYSAFE project, partly funded by the European Commission under the EU 6th Research Framework Programme (http://www.eu-flysafe.org/), is a consortium of 36 partners from 14 European and associated countries, gathering the highest skills from companies, universities, research centres and SMEs; it started on 1 February 2005, finishing in mid-2009.

The aim of the project is to provide aviation users with greater situational awareness, such that aviation safety is maintained at current levels or improved further. This is achieved by providing enhanced weather, terrain and air-traffic information. Concerning the weather aspects of the project, the intention is to provide the link between the aviation user and the producer of the enhanced weather information. The Dutch institute NLR, part of the FLYSAFE consortium, provided a Swearingen Metro II test aircraft, which was used during several flight tests to experiment with the concept of real-time onboard weather data fusion. The weather data fusion concept is much broader than the subject of this paper, because it also encompasses the post-reception tasks, e.g. the onboard treatment of the received data as well as its display. The latter issue is very sensitive, due to the possible introduction of nuisance information into the cockpit. On the ground side, a complex system is in place, fed with weather data, the so-called weather products, from several meteorological stations. These weather products are made available for online request through a ground-based weather server and are accessible to aviation users on demand, resulting in a complex architecture. This paper, however, is focused on the enhancement of the communication architecture, in particular the onboard mobile terminal unit, and on the process of managing the meteorological data requests between the aircraft and the ground weather stations according to a variable production schedule, making the data available to the aviation users. The second chapter describes the integrated system for the data link between a ground-based weather server and an airborne avionics weather information client. It focuses mainly on the two components that support the weather information cycle: the first is the SaNTA architecture and the interconnections within that system; the second is the onboard software solution crafted to manage the data transfers between the aircraft and the ground server. The third chapter presents the goals of the flight trials, the description of these tests and the results achieved. The last chapter draws the general conclusions and points out the potential room for improvement of this concept, despite the innovation step achieved.

2. THE INTEGRATED SYSTEM


This integrated system encompasses an architecture that spans the ground-based platforms and the airborne applications, passing through the communication solution, i.e. the satellite data link approach and the network architecture used to optimize transmission in that environment.

2.1 Satellite Data Link System


2.1.1 The Aeronautical Satellite Communication Service
The data link connection between the ground and airborne components was established using the INMARSAT Swift64 service, consisting of a 64 kbps communication channel supplied by a satellite communication provider (SATCOM1 [1]). The MPDS (Mobile Packet Data Service) mode was chosen among the existing Swift64 options; it provides a more cost-effective solution for web browsing and other applications that do not require constant transmission of data in both directions. An important consideration in MPDS mode is that the bandwidth is shared with all users (for example, other aircraft), so the amount of bandwidth allocated upon request depends on the number of connected users at that instant.

2.1.2 The SaNTA Solution


In order to enable the transmission of real-time weather information to the airborne side, while optimizing the data transfer rate over the satellite link, GMV-Skysoft's SaNTA (Satellite Networks Transport Architecture) solution was used.

SaNTA was developed under a contract awarded by the European Space Agency (ESA), and consists of an architecture designed to provide an efficient exploitation of TCP traffic over long-latency satellite links, which are typically limited by the TCP congestion control mechanism. The congestion control mechanism of TCP assumes that each delay is an indication of packet losses caused by congestion in the link. This proves problematic in the presence of network connections that include several physical links with different bandwidth and latency characteristics. In particular, satellite links introduce significant latency and are consequently expected to reduce TCP throughput by triggering its congestion control mechanism. A further problem is that the TCP congestion control policy should be enforced along the path of the TCP packets, but is solely enforced by the end systems. The result in the satellite link environment is a degradation in performance, which is not acceptable for transmitting data in a real-time application.

SaNTA has a lot in common with the split transport architecture that has become popular as the basis of a variety of PEPs (Performance Enhancing Proxies), in that it contains a concatenation of separate transport protocol instances, which together form a transport chain between the end systems. However, SaNTA introduces a new concept: an extra end-to-end transport layer inserted between the per-link split transport layer and the application. It is called the ITL (Inter Transport Layer), and it re-establishes the end-to-end context that is lost with the split transport. A set of ITL relays is used to accelerate the acknowledgement of TCP segments and to avoid the traditional management of TCP windows, which assumes that data losses are due to congestion. In order to optimize the transmissions over the satellite link, the proprietary transport protocol SaSTP (SaNTA Satellite Transport Protocol) was designed, based on the X.25 LAPB protocol and well suited for a point-to-point link. Among its features is the option of selective acknowledgements, to speed up recovery from a single packet or a few packets lost due to link errors. SaSTP and the protocol stack below it are flow controlled in order not to lose packets between layers.

The SaNTA architecture is composed of two components that are ITL-aware (see Figure 1): the Satellite Enhancers (SEs) and the InterWorking Units (IWUs). The SE is the end-user component of SaNTA and acts as a gateway between normal TCP traffic, used by user applications and servers, and the ITL traffic managed by SaNTA. The IWU is the core component of SaNTA, located closest to the satellite link; it acts as a gateway between SaNTA ITL traffic and the satellite-specific transport protocol, namely SaSTP. An additional network element was also developed, the Mobile Terminal (MT), which brings together the capabilities of the SE and the IWU: it implements the protocol stack of the IWU while also acting as the gateway to the end-user (or running the application itself). The MT provides mobility to one end-user of the connection, in the sense that it only needs a satellite terminal to send data through the SaNTA network. This is highly suitable for an aeronautical connection, as well as for similar scenarios such as maritime and land-mobile use of satellite links. In operational terms, SaNTA components relay TCP connections that are routed between SEs, or between one SE and the MT.
The relaying is managed by the ITL layer, which sits above TCP and uses it to forward ITL-encapsulated end-user TCP segments, plus a simple UDP tunnel for relaying ITL-encapsulated end-user UDP datagrams. Thus, a normal TCP connection between end nodes N1 and N2 using SaNTA results in a series of TCP connections and one SaSTP connection along the communication path (see Figure 1). The component SE1 interacts directly with N1 on behalf of N2, thus impersonating N2, and similarly, SE2 interacts directly with N2 on behalf of N1. The IWUs interact with each other, or with an MT unit, through SaSTP, which runs over IP, itself supported on the satellite's native layers 1 and 2.

Figure 1: SaNTA example for TCP acceleration architecture and protocol stack

The figure shows a SaNTA example of the TCP acceleration architecture and protocol stack: the intended TCP virtual circuit from N1 to N2 is broken down into four TCP virtual circuits and one SaSTP virtual circuit.
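To make the split-transport principle concrete, the following minimal sketch (in Python, not part of the FLYSAFE or SaNTA software) shows a generic relay of the kind used by PEP architectures: it terminates the end system's TCP connection locally and re-sends the byte stream on a separate leg towards the next element of the chain, so that the end system's congestion control never reacts to the satellite latency. The addresses and names are illustrative assumptions; SaNTA's actual ITL and SaSTP layers are proprietary and are not reproduced here.

```python
# Minimal sketch of a split-transport relay, illustrating the principle behind
# PEP-style architectures such as SaNTA's Satellite Enhancer: the end system's
# TCP connection is terminated locally, and the payload is re-sent on a
# separate leg towards the next element of the chain. In SaNTA that next leg
# would use ITL and, over the satellite hop, SaSTP; here it is simply another
# TCP connection. Addresses and names below are illustrative assumptions.
import socket
import threading

LISTEN_ADDR = ("0.0.0.0", 8080)         # where the end system (e.g. N1) connects
UPSTREAM_ADDR = ("iwu.example", 9090)   # next element in the transport chain

def pump(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes from src to dst until src closes, then signal end-of-stream."""
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    finally:
        try:
            dst.shutdown(socket.SHUT_WR)
        except OSError:
            pass

def handle(client: socket.socket) -> None:
    # Terminate the client's TCP locally and open a separate upstream leg:
    # this is the "split" that isolates the client from the satellite latency.
    upstream = socket.create_connection(UPSTREAM_ADDR)
    threading.Thread(target=pump, args=(client, upstream), daemon=True).start()
    threading.Thread(target=pump, args=(upstream, client), daemon=True).start()

def main() -> None:
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(LISTEN_ADDR)
    server.listen()
    while True:
        conn, _ = server.accept()
        handle(conn)

if __name__ == "__main__":
    main()
```

In SaNTA the upstream leg would carry ITL-encapsulated traffic and, over the satellite hop, SaSTP with selective acknowledgements and flow control as described above.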

2.1.3 Flight Tests Scenario Configuration


The implementation of the SaNTA architecture in this environment followed the approach of using the MT on the airborne side and the SE and IWU units on the ground side, connected to the Ground Weather Processor (located at Météo France in Toulouse). The next figure displays the scenario for the flight tests:

Figure 2: the scenario for the flight tests.

The following aspects are worthy of consideration in this context:
- The application-layer protocol used for the air-ground exchange of weather data was HTTPS, adding security to the communications.
- The SPU (Surveillance Processor Unit) PC contained the Request/Reply Manager software.
- The SATCOM unit used on the aircraft was the Rockwell Collins HST 900-2100 model, and the direct link to the MT was established through a PPPoE (Point-to-Point Protocol over Ethernet) connection.
- The Satellite Service Provider assigned one fixed public IP address to the MT, whereas the IWU and SE were placed in one LAN within the Météo France network infrastructure.
- A dedicated IP tunnel was set up between the MT and the IWU, in order to route the packets directly between those two SaNTA elements.

2.2 Request and Reply Management Software


This software module is responsible for the autonomous management of the requests for weather data, for the reception of this data and for parsing it, in order to make it available to the other applications that are part of the weather data fusion system. The requests are generated and sent to the Ground Weather Processor (GWP) through the SATCOM link on a cyclic basis. The GWP stores the information, tailors the data to the particular aircraft requirements and communicates it to the aircraft. Up-linked data is fused onboard with data from onboard sensors and displayed to the crew. The reply weather data is designated as weather products, since it is processed data resulting from the gathering of the different weather centre sources. The products can have a different nature and different areas of application: there are products with local scope (within a Terminal Control Area), regional scope (European context) and global scope.

2.2.1 The Weather Corridor


The weather corridor is the fundamental input for the request preparation, because it is the information that allows the ground server to produce a reply with the necessary geographical and temporal context for the aircraft. The application reads the avionics database, which contains the decoded ARINC 429 labels (position, heading, track angle, etc.) as received from the avionics (Inertial Reference System and Air Data Computer) over the ARINC 429 data bus. The ARINC 429 Data Manager function updates the avionics database with the decoded ARINC 429 information [3]. These are the necessary inputs for the construction of the weather corridor, which consists of 4D bubbles with:
- a temporal window,
- a coverage area (3-D volume) describing the area of interest for the WIMS products to be uplinked.

Temporal window of the weather corridor. The temporal dimension is sized as a temporal window with a Start Time parameter (UTC time) and an End Time parameter (UTC time). WIMS products whose validity period intersects the temporal window are returned by the GWP on request.

Coverage area of the weather corridor. The spatial dimension of the weather corridor is based on a horizontal dimension (a polygon defined as a rectangle) plus a vertical dimension (bottom and top altitudes). The bottom and top altitudes of the weather corridor are respectively 0 metres AMSL and 13000 metres AMSL, based on the altitude probability of occurrence of weather phenomena. The WIMS products whose polygon and altitude intersect this 3-D weather corridor are returned by the Ground Weather Processor. The size of the weather corridor to be used depends on the request frequency, as follows.

Size of the coverage area for requests with 5-minute frequency:

The rectangle is defined by 4 points M1, M2, M3 and M4, as illustrated in Figure 3, so that its size is given by a forward distance a = 120 nm (ahead of the aircraft, along the current aircraft heading) and a backward distance b = 37 nm, also along the heading. The lateral extension of the weather corridor is c = 2*a = 240 nm. M1, M2, M3 and M4 have the following relative coordinates (with respect to the aircraft position):
M1: relative distance d1 = 169.705627 nm, relative bearing Az1 = -45°
M2: relative distance d2 = 169.705627 nm, relative bearing Az2 = +45°
M3: relative distance d3 = 125.574678 nm, relative bearing Az3 = +107.136274°
M4: relative distance d4 = 125.574678 nm, relative bearing Az4 = -107.136274°
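As an illustration of this geometry, the following sketch (with hypothetical function and variable names, not the FLYSAFE code) derives the corner coordinates of the corridor rectangle from the forward and backward distances; running it with a = 120 nm and b = 37 nm reproduces the distances and bearings listed above, and the 15-minute and 30-minute corridors described further below follow from the pairs (240, 40) and (360, 40).

```python
# Sketch of the corridor-corner geometry described above. The corridor is a
# heading-aligned rectangle with a forward distance `forward_nm` ahead of the
# aircraft, a backward distance `backward_nm` behind it, and a lateral
# half-width equal to the forward distance (lateral extension = 2 * forward_nm).
# The function returns (relative distance, relative bearing) pairs for the
# four corners, relative to the aircraft position and heading.
import math

def corridor_corners(forward_nm: float, backward_nm: float):
    half_width = forward_nm                        # c = 2*a, so half-width = a
    d_front = math.hypot(forward_nm, half_width)   # front corners (M1, M2)
    d_back = math.hypot(backward_nm, half_width)   # rear corners (M3, M4)
    az_back = 90.0 + math.degrees(math.atan2(backward_nm, half_width))
    return [
        (d_front, -45.0),       # M1
        (d_front, +45.0),       # M2
        (d_back, +az_back),     # M3
        (d_back, -az_back),     # M4
    ]

# 5-minute corridor: a = 120 nm, b = 37 nm -> distances of 169.7056 nm and
# 125.5747 nm, rear bearings of about +/-107.136 deg, matching the values
# quoted above. The 15-minute and 30-minute corridors follow from
# corridor_corners(240.0, 40.0) and corridor_corners(360.0, 40.0).
print(corridor_corners(120.0, 37.0))
```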

The next figure clearly shows the 2D aspect of this area for the 5-minute frequency.

Figure 3: 2D coverage area of the weather corridor for the 5-minute request frequency

The size of the rectangle is linked to the aircraft cruise speed (225 knots) and to the weather radar range that provides sufficiently accurate weather data for the Weather Data Fusion (80 nm).

Size of the coverage area for requests with 15-minute frequency: the rectangle is defined by 4 points M5, M6, M7 and M8, so that its size is given by a forward distance k = 240 nm (ahead of the aircraft, along the current aircraft heading) and a backward distance l = 40 nm. The lateral extension of the weather corridor is m = 2*k = 480 nm. M5, M6, M7 and M8 have the following relative coordinates (with respect to the aircraft position):

M5: relative distance d5 = 339.411255 nm, relative bearing Az5 = -45°
M6: relative distance d6 = 339.411255 nm, relative bearing Az6 = +45°
M7: relative distance d7 = 243.310501 nm, relative bearing Az7 = +99.4623222°
M8: relative distance d8 = 243.310501 nm, relative bearing Az8 = -99.4623222°

The weather corridor for the 15-minute frequency is analogous to the one represented in Figure 3 for the 5-minute frequency; the differences are essentially the values of the k and l distances.

Size of the coverage area for requests with 30-minute frequency: the rectangle is defined by 4 points M9, M10, M11 and M12, so that its size is given by a forward distance u = 360 nm (ahead of the aircraft, along the current aircraft heading) and a backward distance v = 40 nm. The lateral extension of the weather corridor is w = 2*u = 720 nm. M9, M10, M11 and M12 have the following relative coordinates (with respect to the aircraft position):
M9: relative distance d9 = 509.116882 nm, relative bearing Az9 = -45°
M10: relative distance d10 = 509.116882 nm, relative bearing Az10 = +45°
M11: relative distance d11 = 362.215406 nm, relative bearing Az11 = +96.340192°
M12: relative distance d12 = 362.215406 nm, relative bearing Az12 = -96.340192°

WIMS products whose polygon and altitude intersect this 3-D weather corridor are returned by the Ground Weather Processor.
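A minimal sketch of this selection rule is given below, assuming that both the corridor and the product extent can be approximated by axis-aligned boxes in time, latitude, longitude and altitude; the class and field names are illustrative assumptions, not the GWP's actual data model.

```python
# Sketch of the GWP selection rule: a WIMS product is returned when its
# validity period overlaps the corridor's temporal window and its volume
# intersects the corridor's 3-D coverage area. Both extents are simplified
# here to axis-aligned 4D boxes; names are illustrative only.
from dataclasses import dataclass

@dataclass
class Extent4D:
    t_start: float; t_end: float        # validity period / temporal window (UTC)
    lat_min: float; lat_max: float      # degrees
    lon_min: float; lon_max: float      # degrees
    alt_min: float; alt_max: float      # metres AMSL

def overlaps(a0: float, a1: float, b0: float, b1: float) -> bool:
    return a0 <= b1 and b0 <= a1

def product_matches(corridor: Extent4D, product: Extent4D) -> bool:
    """True when the product's validity period and 3-D volume both intersect
    the weather corridor, i.e. when the GWP would return it."""
    return (overlaps(corridor.t_start, corridor.t_end, product.t_start, product.t_end)
            and overlaps(corridor.lat_min, corridor.lat_max, product.lat_min, product.lat_max)
            and overlaps(corridor.lon_min, corridor.lon_max, product.lon_min, product.lon_max)
            and overlaps(corridor.alt_min, corridor.alt_max, product.alt_min, product.alt_max))
```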

2.2.2 Weather Request Sequence


The weather products are grouped into three different types: CAT (Clear Air Turbulence), CB (thunderstorm or Cumulonimbus) and ICE (in-flight icing). The CAT WIMS consists of two so-called scale products, for the continental (here: European) and local (TMA) scales. The CB WIMS consists of two scale products, for the continental and local (TMA) scales. The ICE WIMS consists of three scale products, for the global, continental and local (TMA) scales. These scale products are based on existing meteorological expert systems. The following table gives the product details, considering type, scale, refresh rate and forecast frequency (N/A means there is no forecast: the product is a diagnostic).
Product Type   Scale      Refresh Rate   Forecast Frequency
CAT            Regional   6 hours        3 hours
CAT            Global     6 hours        6 hours
CB             Local      5 minutes      5 minutes
CB             Regional   15 minutes     5 minutes
ICE            Local      15 minutes     N/A
ICE            Regional   1 hour         1 hour
ICE            Global     6 hours        6 hours

Table 1: WIMS products, with type, scale, refresh rate and forecast frequency

Since the requests need to be cyclic and precise in terms of sequence, the aircraft embeds a Network Time Protocol (NTP) server. The Surveillance Processor Unit (SPU) is synchronized as an NTP client, as are all the other embedded PCs.

The Weather Information Management sequence is shown in the following table. It is composed of a repeating sequence of 30 minutes duration, with the different request types spread across this sequence, one per minute, although in some minutes no request is performed, as can be observed in the following table.
Time slots T0 to T0+14 carry, in order, requests for: CB Local, CB Regional, ICE Local, ICE Regional, CB Local, CAT Regional, ICE Local, CAT Global, ICE Global, CB Local, ICE Local.
Time slots T0+15 to T0+29 carry, in order, requests for: CB Local, CB Regional, ICE Local, CB Local, CB Local, ICE Local.
The remaining minutes of the cycle carry no request.

Table 2: Request sequence, with the time frame and the corresponding products requested
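The following sketch illustrates the kind of cyclic scheduling logic described above, with the NTP-disciplined clock assumed to be available on the SPU. The minute-to-product mapping shown is a truncated illustration rather than the full Table 2 sequence, and all names (send_request, SEQUENCE, etc.) are hypothetical.

```python
# Sketch of a cyclic request scheduler of the kind described above: a repeating
# 30-minute sequence in which each minute slot may carry one product request.
# The slot assignments are a truncated illustration, not the full Table 2
# sequence; send_request() stands in for the HTTPS request to the GWP.
import time

CYCLE_MINUTES = 30
SEQUENCE = {          # minute offset within the cycle -> product to request
    0: "CB Local",
    1: "CB Regional",
    2: "ICE Local",
    3: "ICE Regional",
    # ... remaining slots of the 30-minute cycle; some minutes carry no request
}

def send_request(product: str) -> None:
    print(f"requesting {product} from the GWP")    # placeholder for HTTPS call

def run_cycle() -> None:
    t0 = time.time()                 # assumed aligned with the NTP-synced clock
    last_slot = None
    while True:
        slot = int((time.time() - t0) // 60) % CYCLE_MINUTES
        if slot != last_slot:        # act once at the start of each minute slot
            last_slot = slot
            product = SEQUENCE.get(slot)
            if product is not None:
                send_request(product)
        time.sleep(1.0)              # coarse polling keeps the sketch simple

if __name__ == "__main__":
    run_cycle()
```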

The replies are compressed on the GWP server using the gzip compression algorithm and then decompressed by the Request/Reply Manager during reception. This is a very important step to achieve the desired estimated sizes of 10 to 30 KB, thanks to gzip's compression rate of about 80% for this type of data. The application was built with a failure recovery process for responses lost in the sequence, in particular for the more critical ones, such as CB Local and CB Regional. A missing reply does not trigger a new request transmission; the requests continue to be transmitted according to the 30-minute cycle table defined above. Although some faults are tolerable, consecutive absences of responses and late replies are counted up to a certain limit, after which the request process takes specific actions. This process is called the degraded mode, and it allows, for instance, the restart of the cycle, resetting all the timers and counters of the application in order to return to the nominal mode.
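A possible shape for this fault-tolerance logic is sketched below; the threshold value, the class names and the restart_cycle() hook are assumptions for illustration, since the paper does not give the exact limits or interfaces used.

```python
# Sketch of the fault-tolerance behaviour described above: missing or late
# replies are counted, and once a limit is exceeded the application enters the
# degraded mode, restarting the cycle and resetting its timers and counters to
# return to nominal operation. The threshold and names are assumptions.
MAX_CONSECUTIVE_FAULTS = 3        # assumed limit, not stated in the paper

class ReplyMonitor:
    def __init__(self, scheduler) -> None:
        self.scheduler = scheduler        # object owning the 30-minute cycle (assumed)
        self.consecutive_faults = 0

    def on_reply_received(self) -> None:
        # A valid reply clears the fault counter; nominal mode continues.
        self.consecutive_faults = 0

    def on_reply_missing_or_late(self) -> None:
        # A missing reply does not trigger a retransmission; it is only counted.
        self.consecutive_faults += 1
        if self.consecutive_faults >= MAX_CONSECUTIVE_FAULTS:
            self.enter_degraded_mode()

    def enter_degraded_mode(self) -> None:
        # Degraded mode: restart the request cycle from T0, resetting all
        # timers and counters, so that the application returns to nominal mode.
        self.scheduler.restart_cycle()    # hypothetical hook on the scheduler
        self.consecutive_faults = 0
```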

3. EXPERIMENT: THE FLIGHT TRIALS


In order to evaluate the uploading and onboard use of both the local and regional WIMS products, FLYSAFE Flight Tests Experiment 5 took place in the Paris TMA (where coverage with ground weather radars is available) as well as over other more or less arbitrary locations over Europe. The 16 flight experiments, with an average duration of 90 minutes, took place over an entire month. The bottom line for both was that the weather in the area should meet pre-specified requirements, to allow the effective test of all weather products. The atmospheric phenomena and weather hazards that could impact the aircraft were monitored by observation systems, by the crew and by various onboard weather sensors (onboard enhanced weather radar and storm scope). After reception, the data is processed by each WIMS, using dedicated and up-to-date tools. NLR's Swearingen Metro II test aircraft was used in these FLYSAFE flight tests. A High Speed SATCOM system was used for the flight tests, coupled with SaNTA as the data link TCP protocol enhancer solution. The Paris TMA region offered very few moments during the flight test campaign when the required weather conditions for the experiment were met. As a consequence, the number of collected WIMS products with Regional scale is significantly larger than the number of collected WIMS products with Local scale. The following table presents the measured size of the products, the transfer time to uplink the products on board and the associated throughput value:
Product  Scale     Compressed replies (KB)       Transfer time (s)           Throughput (kbit/s)
CAT      Regional  mean=14;  min=6;  max=30      mean=22;  min=6;  max=66    mean=6.9;  min=2.2; max=14.5
CAT      Global    mean=34;  min=4;  max=101     mean=27;  min=9;  max=78    mean=9.4;  min=5.6; max=16.8
CB       Local     mean=5;   min=2;  max=10      mean=18;  min=5;  max=57    mean=3.8;  min=0.2; max=8.2
CB       Regional  mean=39;  min=3;  max=150     mean=36;  min=9;  max=180   mean=8.9;  min=5.0; max=16.9
ICE      Local     mean=7;   min=6;  max=9       mean=40;  min=6;  max=40    mean=5.4;  min=1.4; max=12.3
ICE      Regional  mean=184; min=55; max=323     mean=124; min=25; max=279   mean=16.5; min=5.3; max=29.5
ICE      Global    mean=13;  min=5;  max=24      mean=21;  min=5;  max=67    mean=6.6;  min=3.4; max=13.5

Table 3: Size of the products, transfer time and throughput, per type of WIMS product

It can be concluded that some products are very large and take too long to download, which is related to the fact that the satellite data link is shared with other connected users of the Swift64 service, so the full available bandwidth of 64 kbps cannot be exploited. In addition, the sizes of the products can differ considerably, producing, in the worst cases, transfer times longer than 60 seconds, which is the minimum interval between products in the request sequence. This leads to delays in the sequence, which were very common and which, in some extreme situations, led to degraded modes. The observed amounts of uplinked data are larger than expected (when compared with the initial rough estimation), mainly due to the large size of the Regional and Global products. With the tested periodic uploading sequence and the recorded amount of uplinked data in isolated CB conditions, the following maximum figures of compressed data were obtained: 500 KB in the first 5 minutes (T0 to T0+5 min), 174 KB from 5 to 10 minutes, 30 KB from 10 to 15 minutes, 170 KB from 15 to 20 minutes and 20 KB from 20 to 25 minutes.

4. CONCLUSION
The proof of concept was clearly a meaningful step towards a more reliable monitoring of weather hazards, in terms of providing the pilot with a real-time, contextualized and continuously updated airborne picture, although some points still need improvement. The numerous risks inherent to this kind of experiment were successfully mitigated. The 16 flights performed were productive in terms of the meteorological conditions, which did not compromise the success of the tests. The Request/Reply Manager achieved the necessary goals and was sufficiently robust during all the flights, which was a fundamental requirement, since this application was running in stand-alone mode, without human interaction after the startup configuration.

The communication cost constituted a constraint on this concept. The size of the data represents incremental costs, because the satellite service used charges per transferred kilobyte and not for the amount of time spent online. The shared 64 kbps data link does not allow greater expectations in terms of scalability, indicating that other solutions must be explored in terms of satellite service providers. However, the introduction of the SaNTA solution into the data link architecture allowed an optimized transmission of the data and contributed to a more effective use of the available bandwidth.

The use of the XML format to transfer the information between the entities brings not only the well-known benefits of XML, but also emphasizes other problems, mainly the very large size that the files can reach. For instance, a simple query to the ground server implies an amount of text that has no direct correspondence with the amount of useful data available, both in the request and in the response, due to the XML tag syntax. Further work should be done on weather data size optimization (e.g. format, compression) to reduce the weather data flow and to allow a well-balanced solution with a more effective data transfer rate in the context of the existing satellite communication costs.
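The following small example illustrates the point about XML overhead and compression: a reply made of repetitive weather elements carries far more tag text than payload, and gzip removes most of that redundancy. The element names are invented for the example and do not reflect the actual FLYSAFE message schema; the ~80% figure quoted earlier is the project's measurement, not an output guaranteed by this snippet.

```python
# Illustration of the XML overhead and gzip point made above: repetitive tag
# syntax dominates the raw size of the reply, and compresses very well.
# Element names are invented for the example.
import gzip

payload = "".join(
    f"<gridPoint><lat>{48.0 + 0.01 * i:.2f}</lat>"
    f"<lon>{2.0 + 0.01 * i:.2f}</lon><icingIndex>3</icingIndex></gridPoint>"
    for i in range(1000)
)
document = f"<wimsReply><product>ICE Local</product>{payload}</wimsReply>"

raw = document.encode("utf-8")
compressed = gzip.compress(raw)
print(f"raw: {len(raw)} bytes, gzip: {len(compressed)} bytes, "
      f"saving: {1 - len(compressed) / len(raw):.0%}")
```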

ACRONYMS
CAT - Clear Air Turbulence
CB - Cumulonimbus (thunderstorm cloud)
DSP - Datalink Service Provider
FAR - Federal Aviation Regulations
FMS - Flight Management System
GLB - Global
GPS - Global Positioning System
GWP - Ground Weather Processor
HMI - Human Machine Interface
IRS - Inertial Reference System
LOC - Local
METAR - Aviation Routine Weather Report
MPDS - Mobile Packet Data Service
REG - Regional
SaNTA - Satellite Network Transport Architecture
SATCOM - Satellite Communication
SPU - Surveillance Processor Unit
TAF - Terminal Aerodrome Forecast
TMA - Terminal Manoeuvring Area
WIMS - Weather Information Management Systems

ACKNOWLEDGEMENT
This development work has partly been funded by the European Commission, EU Research 6 Framework, EC contract AIP4-CT-2005-516167. National Aerospace Laboratory NLR: M.J. Verbeek, Wilfred Rouwhorst. Rockwell Collins France: F. Azum, Olivier Perrier. Météo France: A. Drouin, Patrick Josse (GWP, Local CB/ICE WIMS). UK Met Office: Andrew Mirza (Global CAT/ICE WIMS). Deutsches Zentrum für Luft- und Raumfahrt DLR: Thomas Gerz (Regional CB WIMS). University of Hannover: Thomas Hauf (Regional ICE WIMS). GTD Sistemas de Información: Florent Birling, Joan Roig, Isidro Bas.

REFERENCES
[1] Erling Kristiansen, Afonso Nunes, José Brázio, and André Zúquete. Satellite Network Transport Architecture (SaNTA). In Proc. of the AIAA Int. Communications Satellite Systems Conf. (ICSSC 2005), Rome, Italy, September 2005.
[2] INMARSAT, Swift 64 brochure.
[3] ARINC Specification 429P2-15, March 6, 1996.
[4] M.J. Verbeek (NLR), A. Drouin (Météo France, Toulouse), F. Azum (Rockwell Collins France, Toulouse). Flight Testing of Real-time On-board Weather Data Fusion. In European Space and Air Conference (CEAS), Manchester, UK, 26-29 October 2009.
[5] 'Skysoft apresenta Flysafe', published in Sirius Magazine, January/February 2009.
