
Transactions of the Institute of Measurement and Control 34, 2/3 (2012) pp. 291-317

Bio-inspired robotics for air traffic weather information management


Vinh Bui, Viet V. Pham, Antony W. Iorio, Jiangjun Tang, Sameer Alam and Hussein A. Abbass
Defence and Security Applications Research Center, University of New South Wales at the Australian Defence Force Academy, Northcott Drive, Canberra, Australia 2600

With the increase in automation to serve the growing needs and challenges of aviation, air traffic controllers (ATCs) are now faced with an information overload from a myriad of sources, both in graphical and textual format. One such source is weather information, which typically comprises wind speed, wind direction, thunderstorms, cloud cover, icing, temperature and pressure at various altitudes. This information requires domain expertise to interpret and communicate to ATCs, who then employ this information to manage air traffic efficiently and safely. Unfortunately, ATCs are not trained meteorologists, so there are significant challenges associated with the correct interpretation and utilization of this information by ATCs. In this paper, we propose a bio-inspired weather robot, which interacts with the air traffic environment and provides targeted weather-related information to ATCs by identifying which airspace sectors they are working on. It uses bio-inspired techniques for processing weather information and path planning in the air traffic environment, and is fully autonomous in the sense that it interacts with the air traffic environment only passively and has an onboard weather information processing system. The weather robot system was evaluated in an experimental environment with live Australian air traffic, where it successfully navigated the environment, processed weather information, identified airspace sectors and delivered weather-related information for the relevant sector using a synthetic voice.

Key words: air traffic management; bio-inspired robotics; image processing; path planning; weather information management.

Address for correspondence: Hussein A. Abbass, Defence and Security Applications Research Center, University of New South Wales at the Australian Defence Force Academy, Northcott Drive, Canberra, Australia 2600. E-mail: h.abbass@adfa.edu.au Figures 1, 6, 8, 10-14 and 16 appear in colour online: http://tim.sagepub.com

© 2010 The Institute of Measurement and Control

DOI: 10.1177/0142331210366688


1. Automation in air traffic management


Present-day air traffic is reaching its operational limits and accommodating air traffic growth is becoming a challenging task for air traffic controllers (ATCs) and air navigation service providers. There are clear needs to increase substantially the existing capacity of airspaces and to minimize disruptions that can be caused by poor weather, the cognitive load on ATCs and inefficient air routes (ICAO, 2002). As a result of these needs, new paradigms are being introduced that increase anticipatory capabilities (Alam, 2008; Erzberger, 2004). These approaches increase automation with the intention to support the human operators in the exploitation of timely and dynamic information on atmospheric hazards, traffic fluctuations and airspace utilization. In current air traffic management (ATM) systems, an ATC is primarily responsible for ensuring safe separation among the aircraft in his/her sector as efficiently as possible. To achieve this task, the controller uses weather reports, voice communication with pilots and other sector controllers, flight strips (for historic and future flight path), and a radar display that provides data on the current position, altitude, speed and track of all aircraft in a sector (Nolan, 2004). In next-generation ATM systems, there will be a significant increase in automation. Several displays will be utilized, where much of the data presented on the displays will be presented in a graphical format (Azuma et al., 2000). Two next-generation ATM systems that integrate such new capabilities include the Single European Sky Air traffic Research system (SESAR) in Europe (2007) and the Next Generation Air Transport System (NextGen) in the USA (Joint Planning and Development Office, 2007). SESAR and NextGen combine increased automation with new procedures to achieve safety, capacity, environmental, economic and security benefits. 
These prospective aids incorporated into next-generation systems can make it possible for a controller in one sector to anticipate thunderstorms or conflicts down the line, and make appropriate adjustments in a flight route, thereby solving potential problems long before they occur. It is also expected that with increased automation there will be less voice communication, fewer tactical problems needing the controller's attention, a shift from textual to graphical information and an extended time frame for making decisions (Wickens, 1999; Wickens et al., 1998). However, in severe weather conditions, emergency situations or in the case of automation failure, the controller will be able to take over and manage traffic. Furthermore, increased automation has the capability both to compensate for a controller's cognitive vulnerabilities, such as the ability to construct complex mental maps of the monitored environment, and to better support and exploit a controller's cognitive strengths. When controllers are provided with accurate and timely information, they can be very effective at solving problems, but if such problem solving requires knowledge from other domains (eg, meteorology, scheduling), problem solving becomes challenging (Broach, 2005). Of course, there are still challenges associated with


the approaches being investigated for next-generation ATM systems. For example, when controllers were asked to review operational requirements for future automation to assess the cognitive skills and abilities needed, they pointed out that translating and interpreting data from a myriad of sources would be extremely challenging (Broach, 2005). In particular, meteorological reports on weather phenomena that can impact flight operations are clearly an important component of such cognitive challenges.

2. Automated integrated weather system


A regular supply of accurate, timely and reliable meteorological information is necessary for safe day-to-day aircraft operations (Sherry et al., 2001). Currently, in order to interpret meteorological information correctly, very specialized domain knowledge is required, which is burdensome to ATCs because they are already overloaded with responsibilities with respect to management, tracking and scheduling of flights (Evans, 2001). In addition, although multiple and overlapping weather data sources provide redundancy and completeness of information, they also present a number of challenges for ATCs: weather data typically introduces additional screens for ATCs to monitor and increases the cognitive burden on the controller; weather data can divert the focus of controllers from time-critical air traffic control responsibilities; and weather data is highly domain specific, requiring domain expertise to interpret. In addition to these challenges, controllers have to digest and fuse this information in the context of existing air traffic flow in order to form mental maps over time and space. In turn, this can eventually lead to cognitive overload, which can interfere with the controller's primary job of maintaining safe separation between aircraft. One of the questions associated with the ATM system weather integration problem is how information can be provided to controllers in a timely and targeted manner without overloading them. Such an integrated system should be capable of fusing multiple data sources and other weather reports in time and space, and of generating useful meta-data, such as the likely presence of turbulence or the possibility of storm cells forming from known atmospheric conditions. Such a system should also deliver the information to the relevant controller in a manner that does not overly burden the ATC cognitively, or distract the ATC from primary air traffic control duties.
The information should be targeted appropriately, and communicated in a non-invasive fashion. Although overlaying graphical weather information on a flight control screen may be possible, presenting weather information with flight information on one screen may not be feasible because of an increase in clutter on an already information-overloaded air traffic radar display, which can further affect the cognitive capacity of human controllers. Moreover, all the ATC systems are closed systems, in that other software is not permitted to interact with core air traffic software systems because of safety concerns and requirements. Of course, an ideal but impractical solution would be to


have an aviation meteorologist seated next to each and every controller in ATC centres, who can then continuously monitor the weather information and provide expert domain advice to controllers accordingly. Between these infeasible solutions exists an alternative, namely a passive automation system that does not interfere with the core ATC operations, and that can also address the stated objectives and concerns.

3. Development of a bio-inspired weather robot


In light of the cognitive burden imposed by air traffic control screens and the infeasibility of integrating weather data systems with existing systems, we propose an alternative passive approach involving a collaboration between ATCs and a situated, embodied, autonomous robot that manages weather data and informs controllers of relevant meteorological hazards. Broadly speaking, an agent system is embodied if it is structurally coupled to its environment (Ziemke, 2003). In other words, embodiment is a result of an interaction of an agent with its environment through co-adaptation of the agent and environment (Teo and Abbass, 2005). Embodiment provides benefits in terms of the dynamics of the system as a whole through this coupling. This subtle coupling of the robot with its environment allows it to communicate weather severity through its physical presence at ATC screens associated with a particular sector where there are high levels of weather disturbances. Furthermore, an independent platform that is not coupled with the closed ATC system avoids the safety and security issues mentioned in Section 2. This collaborative approach is analogous to the collaboration between a meteorologist and ATCs mentioned earlier. The motivation for embodying an agent in this environment, as opposed to using a software-based solution, is to minimize cognitive distractions and invasive visual messages on critical screens associated with ATC operations. In the weather robot system, we have developed bio-inspired techniques for path planning and real-time weather data processing. The robot interacts with the air traffic control environment and is capable of navigating in this environment using evolutionary computation applied to path planning. It also uses neural networks for weather data processing.
The weather robot also has the capability to process images from its environment, namely the air traffic sectors on ATC screens, and can verbalize weather conditions associated with an ATC sector. The robot assists and affects ATCs with respect to weather phenomena related to the ATC sectors by moving around the control room and inspecting sector screens, identifying sectors and informing controllers verbally of the relevant weather alerts for a particular sector. In future development, the impact of the robot's physical presence will become more important, as the robot will adaptively change its path navigation strategy according to weather severity in different sectors. The weather robot has a number of components, which are highlighted in Figure 1: a user interface for manual control intervention; a decision making module, which consists of


Figure 1 Modular design of the weather robot. The design comprises a user interface; a decision making module; navigation, image recognition and weather information modules; voice command, text-to-speech and communication modules; the operating system (Windows XP); and the hardware platform, with CPU, mobile platform, sensors (positioning, obstacle detection, imaging), communication devices (WiFi, Bluetooth, Ethernet) and peripherals (audio, interfacing devices)

a navigation module for planning paths and obstacle avoidance; an image recognition module for identifying ATC sector screens; a weather information module for gathering and interpreting weather data; a voice command module for issuing verbal instructions to the robot; a text-to-speech module for alerting controllers to relevant weather conditions and the robot's current activity; a communication module; and finally the hardware platform, which has the sensors, communication hardware and CPU. In the following section, we will provide further information about these components.

4. Weather robot design


We chose to build our robot on a Targo platform (http://alife-robotics.co.jp). The particular computing platform used in this robot is an Intel-based industrial PC running Windows XP. It provides the computing resources necessary to achieve the stated goals of this project, and to implement a variety of state-of-the-art technologies from signal processing, speech recognition and image processing.

4.1 Hardware platform

An Intel-based industrial PC with a Pentium M 1.6 GHz CPU, 1 GB RAM and 10 GB HDD provides the robot with its computing capabilities. The computer interfaces with external peripherals through a PCI-PAZ 3305 Mini PCI Interface and USB ports. The robot has two independent step-motor-driven wheels at the front and a balancing wheel (or castor) in the centre. As a result, it can turn and move freely in any direction with the


resolution of the step motor. To determine locations and to detect obstacles while moving, the robot relies on an optical sensor. It also has a wireless network link via both WIFI and public GSM networks. The robot hardware is illustrated in Figure 2.

4.2 Navigation module

The navigation module is responsible for tracking the robot's location and navigating it towards a desired destination. Since the operating environment of the robot is dynamic, this module has the capability to correctly determine the robot's location while carrying out adaptive route planning with obstacle avoidance tasks.

4.3 Image processing module

The weather robot has to be able to recognize air traffic sectors automatically from video images in an ATC centre. In order to achieve this task, an image from the robot camera is processed using the Sobel edge detection technique (Sobel and Feldman, 1968) before being fed into a sector detection algorithm that finds the corresponding sector,

Figure 2 The weather robot hardware, showing the main CPU, optical mouse-based location device and the wireless communication device


based on the principle of template matching. The details of the sector detection algorithm are discussed in Section 5.4.
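As an illustration of the edge-detection step, a minimal Sobel implementation might look as follows. This is a NumPy-based sketch of the standard operator, not the robot's actual implementation; the threshold value and the greyscale input format are our assumptions:

```python
import numpy as np

# Sobel kernels for horizontal (KX) and vertical (KY) intensity gradients
KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
KY = KX.T

def convolve2d(img, kernel):
    """Naive valid-mode 2D correlation, sufficient for a 3x3 kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def sobel_edges(img, threshold=100.0):
    """Return a binary edge map from the gradient magnitude.

    `threshold` is an illustrative value, not a parameter from the paper."""
    gx = convolve2d(img, KX)
    gy = convolve2d(img, KY)
    mag = np.hypot(gx, gy)
    return (mag > threshold).astype(np.uint8)
```

In a real pipeline the resulting edge map would feed the template-matching sector detector described in Section 5.4.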

4.4 Voice command module

The weather robot can communicate with ATC operators using simple voice commands. The voice command module carries out the speech recognition functionality, which translates natural language commands into machine-readable commands. To process voice commands, the Microsoft Speech Recognition Engine (http://www.microsoft.com/speech/speech2007/default.mspx) is used.

4.5 Text-to-speech module

In this module, weather messages and meta-data are interpreted and announced by the robot to ATCs in natural language. Currently, we rely on the Microsoft Text to Speech Engine (http://www.microsoft.com/speech/speech2007/default.mspx) to carry out this task.

4.6 Communication module

This module provides communication functionalities for the robot. It allows the robot to communicate with human operators by means of voice, video and short text messages (SMS). For this work, only voice communication is implemented.

4.7 Decision making module

This module is responsible for deciding whether the robot should move, listen or speak to human ATCs by co-ordinating information from other modules, such as the robot's location, the active air traffic sector and relevant weather information.

4.8 Weather information system

A weather monitoring system typically consists of sensors (both airborne and ground-based) located in or near the terminal area, as well as local and regional forecast information. The main types of information generated by aviation weather monitoring systems are the SIGMET (Significant Meteorological Hazard Warning), AIRMET (Airmen's Meteorological Warning), TTF (Trend Type Forecast), TAF (Terminal Aerodrome Forecast), ARFOR (Low-level Area Forecast), Area QNH (air pressure), SIGWX (Significant Weather Charts) and grid-point wind, temperature and pressure forecasts (Lester, 2000).


The weather information system can retrieve and process SIGMET, ARFOR, and numerical grid-point wind, temperature and pressure data. This data consists of semi-structured and structured data. The system accesses the BoM (Bureau of Meteorology, Australia) website, downloads the HTML messages containing SIGMETs and parses them. The SIGMET parser extracts structured data from the SIGMET, identifying the region, weather phenomena, flight level and time of the event. This data is then stored in a database for retrieval and future updates. Region information in a SIGMET is typically reported as the vertices of a polygon in latitude and longitude co-ordinates. Sometimes this data is presented as a mix of waypoint codes and latitude and longitude co-ordinates. ATCs may work more than one sector in a day, depending on traffic load, and this can be burdensome, as they need to be familiar with the waypoints in each region. The robot's onboard weather information system automates the extraction of the waypoint codes, converts this data to latitude and longitude co-ordinates, and can provide the most current SIGMET within a specified time window. Furthermore, the system extracts numerical grid-point wind, temperature and pressure data from different weather sources. In addition to the SIGMET and high-level numerical grid-point wind, temperature and pressure data, the weather and wind processing system also downloads and parses the low-level ARFOR (Area Forecast) data and extracts the wind direction and intensity at various flight levels for each area. Figure 3 illustrates the process flow for wind, atmospheric and weather data processing.
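To illustrate the kind of extraction the SIGMET parser performs, the sketch below pulls a phenomenon, polygon vertices and a flight-level band out of a simplified SIGMET-like string. The sample message and the regular expressions are illustrative assumptions; real BoM SIGMETs carry more fields and variant layouts:

```python
import re

# Simplified SIGMET-like message (illustrative only, not a verbatim BoM product)
SAMPLE = ("YMMM SIGMET 03 VALID 120600/121200 YMRF- "
          "SEV TURB FCST WI S3530 E14500 - S3630 E14720 - S3700 E14500 "
          "FL180/300")

def dm_to_deg(dm):
    """Convert a DDMM-style string (degrees + minutes) to decimal degrees."""
    return int(dm[:-2]) + int(dm[-2:]) / 60.0

def parse_sigmet(text):
    """Extract phenomenon, polygon vertices and flight-level band."""
    # Southern/Eastern hemisphere only, matching the Australian region sample
    vertices = [(-dm_to_deg(la), dm_to_deg(lo))
                for la, lo in re.findall(r"S(\d{4}) E(\d{5})", text)]
    levels = re.search(r"FL(\d{3})/(\d{3})", text)
    phenom = re.search(r"(SEV TURB|SEV ICE|EMBD TS|TS)", text)
    return {
        "phenomenon": phenom.group(1) if phenom else None,
        "vertices": vertices,
        "fl_lower": int(levels.group(1)) if levels else None,
        "fl_upper": int(levels.group(2)) if levels else None,
    }
```

The resulting dictionary is the kind of structured record that could be stored in the robot's database and matched against a sector.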

4.9 Weather robot navigation

There are a number of techniques that can be used for mobile robot location and navigation, such as odometry and GPS. Since the weather robot is intended to operate only in an ATC centre, we chose to build our navigation module based on an optical mouse, as shown in Figure 4. The use of an optical mouse for mobile robot navigation has a number of advantages. Firstly, an optical mouse is a very low cost sensor. Secondly, it can accurately measure displacements independently of the kinematics of the robot by using external natural microscopic ground landmarks to obtain effective relative displacement data. Most importantly, this approach is capable of providing accurate location measurements for the problem at hand (Palacin et al., 2006). The optical mouse also facilitates obstacle detection through sensing a change in location while the drive system is active. If a change in location does not occur when the drive system is active, then an obstacle is detected. Although this mechanism is simple, it does not provide the robot with the capability of avoiding obstacles without making physical contact. Therefore, combining it with a more sophisticated obstacle detection and avoidance mechanism, which relies on ultrasonic sensors, is an approach to solving this problem.
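The displacement-based localization and stall-based obstacle inference described above can be sketched as follows; the stall count, minimum-motion threshold and frame conventions are illustrative assumptions, not the robot's actual tuning:

```python
import math

class OpticalMouseOdometry:
    """Track (x, y, heading) from optical-mouse displacement samples and
    flag an obstacle when the drive is active but no motion is sensed."""

    def __init__(self, stall_ticks=3, min_motion=0.5):
        self.x = self.y = 0.0
        self.heading = 0.0            # radians
        self._still = 0               # consecutive no-motion samples
        self.stall_ticks = stall_ticks
        self.min_motion = min_motion  # displacement below this = "no motion"

    def update(self, dx, dy, dtheta, drive_active):
        """Integrate one sensor sample; return True if an obstacle is inferred."""
        self.heading += dtheta
        # Rotate the mouse-frame displacement into the world frame
        c, s = math.cos(self.heading), math.sin(self.heading)
        self.x += c * dx - s * dy
        self.y += s * dx + c * dy
        moved = math.hypot(dx, dy) >= self.min_motion
        self._still = 0 if (moved or not drive_active) else self._still + 1
        return self._still >= self.stall_ticks
```

With these defaults, three consecutive samples with the drive active but no measured displacement would be reported as a collision with an obstacle.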


Figure 3 Process flow diagram for weather and wind data processing. Structured grid-point wind speed, temperature and direction charts (PDF) are converted to images, the background is removed from the grid-point chart, and temperature, wind speed and direction are extracted from the image. Structured SIGMET and semi-structured ARFOR messages (HTML) undergo information extraction based on the semantics of the domain (location and related temperature, pressure and wind speed data). The results (temperature, pressure, wind speed, fronts, storms, cloud, hail/ice and other atmospheric phenomena) are then fused with latitude/longitude co-ordinates and flight level

5. Bio-inspired techniques in the weather robot


The weather robot uses bio-inspired techniques for wind and atmospheric data processing as well as for navigation in a constrained environment. In the following subsections, the techniques used are described in more detail. To begin, we describe an approach in Section 5.1 using neural networks, which is used by the robot to extract data from weather images. The second component described in Section 5.2 is responsible for navigating the robot using a bio-inspired dynamic path-planning technique utilizing a genetic algorithm.

5.1 Neural networks for weather image processing

As shown in Figure 5, wind and atmospheric data in Australia is generated every 6 h by the Bureau of Meteorology in an image format. This data is unavailable in a textual


Figure 4 Optical mouse-based weather robot navigation hardware

format, so the image needs to be converted into a text format for further machine processing. Each image (a map of Australian airspace) is divided into grid cells based on latitude and longitude. The size of each cell is 5° × 5°. Each cell has wind and atmospheric information for different altitudes. This information is in the format ddfffTT, where dd are the two digits for wind direction, fff are the three digits for wind speed and TT are the two digits for temperature. In general, the problem is seen as a pattern recognition problem where the patterns the robot needs to identify are digits from 0 to 9. The problem of converting this image into a system-readable text file is dealt with in two stages: in the first stage we filter out the background clutter (the map of Australia), and localize and segment the image based on the location of individual digits. In the second stage, an offline-trained neural network is used to identify the digits in wind and atmospheric image files.
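A decoder for the ddfffTT cell format might look as follows. Two hedges: we read dd as tens of degrees, the common aviation convention (the text says only "two digits for wind direction"), and we assume TT encodes a negative upper-air temperature in degrees Celsius, which is usual at the flight levels concerned:

```python
def decode_ddfffTT(cell):
    """Decode a ddfffTT grid-point string into wind and temperature fields.

    Assumptions (not stated in the source): dd is wind direction in tens
    of degrees, and TT is a temperature magnitude taken as negative Celsius.
    """
    dd, fff, tt = cell[:2], cell[2:5], cell[5:7]
    return {
        "direction_deg": int(dd) * 10,  # eg '27' -> 270 degrees
        "speed_kt": int(fff),           # eg '045' -> 45 knots
        "temp_c": -int(tt),             # eg '12' -> -12 C (sign assumed)
    }
```

For example, the cell string "2704512" would decode to wind from 270° at 45 kt with a temperature of −12 °C under these assumptions.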

5.1.1 Pre-processing of weather images: Before we can train the neural networks to identify digits in wind and atmospheric images, several pre-processing steps need to be performed in order to generate the cases used for training. The map of Australia overlaps a number of digits and the size (width and height) of a digit sometimes varies from one digit to another. Utilizing several wind and atmospheric image files, the


Figure 5 Wind and atmospheric image file retrieved by weather robot

unchanging background pixels associated with the grid and map of Australia are identified. Following this, we acquire the digits from each image by subtracting the background image and drawing another automatically constructed grid on the image in order to segment the digits. The segmentation of the grid is generated by iterating over every pixel row and column of the image; the pixels that become associated with the segmentation grid are the rows and columns with white pixels only. If there are several consecutive white rows and columns, we consider them as one. The grid then allows us to segment the individual digits with high precision. This segmented set of digits is then used in the second stage where a neural network is used to recognize digits in new wind and atmospheric images. The size of the training set used by the neural network is 317, where the number of templates for each digit from 0 to 9 is 29, 16, 25, 52, 19, 31, 50, 9, 30 and 56 respectively. These numbers were used to reflect the variations associated with the font used in the image for the digits.
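The white-row/white-column segmentation described above can be sketched as follows. This is a NumPy-based illustration; the binary image convention (True = white background pixel) is our assumption:

```python
import numpy as np

def white_runs(indices):
    """Collapse consecutive indices of all-white lines into single cuts,
    mirroring the paper's rule that several consecutive white rows or
    columns are treated as one."""
    cuts, prev = [], None
    for idx in indices:
        if prev is None or idx != prev + 1:
            cuts.append(int(idx))
        prev = idx
    return cuts

def grid_cuts(binary):
    """Return (row_cuts, col_cuts) for a 2D boolean image where
    True = white pixel and False = ink."""
    rows = np.where(binary.all(axis=1))[0]  # rows containing only white pixels
    cols = np.where(binary.all(axis=0))[0]  # columns containing only white pixels
    return white_runs(rows), white_runs(cols)
```

The returned cut positions define the automatically constructed grid that isolates individual digits before they are passed to the recognizers.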

5.1.2 Training the neural network: We used a set of neural networks that are trained separately to recognize each digit. Each trained network has one hidden layer with 10 units and one output. The output reports true if an image of a digit is matched, otherwise it reports false. The architecture of the 10 neural networks


is presented in Figure 6. The neural network parameters are reported in Table 1. As shown in the table:
• The network type we use is feed-forward back-propagation (Heaton, 2005). In a feed-forward neural network, neurons are only connected forward. Each layer of the neural network contains connections to the next layer (eg, from the input to the hidden layer), but there are no connections back. Back-propagation is a form of supervised training. When using a supervised training method, the network must be provided with both sample inputs and anticipated outputs. The anticipated outputs are compared against the actual outputs for a given input. Using the anticipated outputs, the back-propagation training algorithm then takes a calculated error and adjusts the weights of the various layers backwards from the output layer to the input layer.
• The transfer function used in the network is the tansig function. This function transfers the weighted sum of the input nodes to the output, as shown in Equation (1).
y = f(a) = f\left( \sum_{i=1}^{n} x_i w_i \right) \quad (1)

Figure 6 The architecture of the 10 neural networks with one output: 84 inputs feed a 10-unit hidden layer through input weights IW[1,1] and bias b[1]; the hidden layer feeds the single output through layer weights LW[2,1] and bias b[2]

Table 1 Parameters for the two neural network models

Parameters           Values
Network type         Feed-forward back-propagation
Hidden layer         Transfer function is tansig
Output layer         Transfer function is tansig
Goal                 0
Epochs               100
Momentum constant    0.95
Learning rate        0.01


where x_i (i = 1, \ldots, n) are the elements of an input vector x, x_0 is a threshold input, w_i is the weight from input i to the neuron, and f(a) is the tansig function given in Equation (2):

f(a) = \frac{2}{1 + e^{-2a}} - 1 \quad (2)
• Goal, epochs and momentum constant (MC) are training parameters. Training stops when the maximum number of epochs is reached or the performance goal is met. The MC defines the amount of momentum and is set between 0 (no momentum) and values close to 1 (lots of momentum). An MC of 1 results in a network that is completely insensitive to the local gradient and, therefore, does not learn properly.
• The learning rate for both input weights and biases is 0.01. This constant is used in error back-propagation learning and other artificial neural network learning algorithms to control the speed of learning.

The training set for each neural network consists of the set of template images for all 10 digits, even though a neural network is trained to recognize a specific digit only. This is necessary so that each network can see negative as well as positive examples in order to distinguish correctly between digits. The 84-bit strings used in training originate from the binary matrix representing the digit image, where the width of the matrix is 7 and the height is 12. In Figure 7, examples of this binary matrix are presented for digit 0.
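A compact sketch of one such one-digit network is given below, with the tansig transfer function and back-propagation with momentum from Table 1. This is a from-scratch illustration rather than the toolbox implementation the authors used; the weight-initialization scale is an assumption:

```python
import numpy as np

def tansig(a):
    """Hyperbolic-tangent sigmoid: 2 / (1 + exp(-2a)) - 1, i.e. tanh(a)."""
    return 2.0 / (1.0 + np.exp(-2.0 * a)) - 1.0

class DigitNet:
    """84-input -> 10-hidden -> 1-output feed-forward net, tansig on both
    layers, trained by back-propagation with momentum (cf. Table 1)."""

    def __init__(self, n_in=84, n_hidden=10, lr=0.01, mc=0.95, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0, 0.1, (n_hidden, n_in))  # init scale assumed
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0, 0.1, n_hidden)
        self.b2 = 0.0
        self.lr, self.mc = lr, mc
        self._v = [np.zeros_like(self.W1), np.zeros_like(self.b1),
                   np.zeros_like(self.W2), 0.0]         # momentum terms

    def forward(self, x):
        self.h = tansig(self.W1 @ x + self.b1)
        return tansig(self.W2 @ self.h + self.b2)

    def train_step(self, x, target):
        """One stochastic back-propagation update; returns the squared error."""
        y = self.forward(x)
        err = y - target
        dy = err * (1.0 - y * y)                        # tansig'(a) = 1 - f(a)^2
        dh = (self.W2 * dy) * (1.0 - self.h * self.h)
        grads = [np.outer(dh, x), dh, dy * self.h, dy]
        for i, g in enumerate(grads):                   # momentum update
            self._v[i] = self.mc * self._v[i] - self.lr * g
        self.W1 += self._v[0]
        self.b1 += self._v[1]
        self.W2 += self._v[2]
        self.b2 += self._v[3]
        return 0.5 * err * err
```

Ten such networks, one per digit, would each be trained on all 317 templates, with the target set positive only for the network's own digit.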

5.1.3 Using neural networks for classifying weather images: Once the neural networks are trained, they can be used to process new grid-point wind and atmospheric data images. When a new image arrives for processing, we perform the same preliminary pre-processing steps that were used before training. After this, the segmented digits can be recognized using the trained neural networks. In order to recognize one new image of a digit, all 10 trained neural networks are used. If only one of the neural networks has the

0011110 0100001 1000001 1000001 1000001 1000001 1000001 1000001 1000001 0100010 0011110 0000000

0111100 1000010 1000010 1000001 1000001 1000001 1000001 1000001 1000001 1000001 0100010 0111100

0011100 0100010 1000001 1000001 1000001 1000001 1000001 1000001 0100010 0011100 0000000 0000000

0111000 1000100 1000010 1000010 1000010 1000010 1000010 1000010 0100100 0111000 0000000 0000000

0011100 0100010 0100001 0100001 0100001 0100001 0100001 0100001 0100001 0010010 0011100 0000000

Figure 7 Five samples of digit 0


output of true during classification, the digit that this neural network is responsible for identifying is deemed the digit corresponding to the new image, otherwise the new image is deemed to be unrecognized. The neural networks are applied to each digit in the image until there are no more digits to classify. The wind, atmosphere and weather information extracted from the image is then integrated to give a complete picture. The graphical representation of this meta-data generated by processing weather and wind information is then visualized as given in Figure 8.
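The "exactly one network fires" decision rule above can be captured in a few lines; the numeric threshold standing in for the paper's boolean output is an assumption:

```python
def classify_digit(nets, image_vector, threshold=0.0):
    """Apply all 10 one-vs-rest networks to an input; accept only if
    exactly one fires, otherwise report the image as unrecognized.

    `nets` maps digit -> a callable returning that network's output;
    the firing threshold is an illustrative assumption."""
    fired = [d for d, net in nets.items() if net(image_vector) > threshold]
    return fired[0] if len(fired) == 1 else None  # None = unrecognized
```

Applying this rule to every segmented digit in turn yields the text values that are then fused into the weather picture of Figure 8.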

5.2 Evolutionary computation for path planning

The second bio-inspired component of the weather robot is the path-planning module. Before we introduce our approach to path planning, we will discuss some of the limitations of traditional path-planning approaches. In traditional path planning used in robotics, terrain is often represented as a grid of cells (Barraquand et al., 1992). These cells include forbidden cells that the robot cannot pass through and allowed cells that the robot can pass through. As a result, the path-planning problem can be reformulated as a problem of finding paths in a graph, where the graph vertices are the cells, and the edges are the connections between neighbouring cells.

Figure 8 Graphical output of meta-data generated by processing weather and wind information


The main issue with this type of approach is that the path found is usually not smooth, as it is a list of neighbouring cells from the origin to the destination. Hence, the robot usually has to change its direction when it goes from one cell to the next. Moreover, when the grid has a high resolution, it is very time consuming to find a path using traditional algorithms. For example, the computational complexity of Dijkstra's algorithm is O((|E| + |V|) log |V|), where |E| is the number of edges of the graph, and |V| is the number of vertices. We avoid the above issue by using a continuous terrain representation. In particular, terrain is represented as a continuous space in which obstacles are bounded by polygons represented as lists of points with real co-ordinates. With this representation, we can increase the smoothness of the found paths. Unfortunately, traditional path-finding algorithms will not work with this continuous representation, as the number of cells is, for all intents and purposes, almost infinite. Therefore, we employ a genetic algorithm (GA), which is a bio-inspired technique based on Darwin's principle of natural selection, often used for optimization and search problems. GAs work by firstly initializing a random population of candidate solutions, which can be represented by a binary string of 0s and 1s, or a vector of real variables. This population of solutions is then evaluated according to some measure of fitness or utility. Following this, a new offspring population is produced by using genetic operators such as crossover and mutation. The offspring are then evaluated and selected for the next generation based on some fitness criteria. The algorithm continues over a number of generations until a preset maximum number of generations is reached, or a solution with acceptable fitness values is found. Specifically, the NSGA-II (Deb et al., 2002) algorithm is used in this work to find obstacle-free shortest paths for the robot.
Although NSGA-II is designed for multi-objective optimization, in this work we have only utilized a single objective: minimizing the length of the robot's path from its origin to its destination. We decided to use this algorithm for this single-objective problem because it was already integrated into an existing path-planning framework, which incorporates objectives such as minimizing the risk of the robot colliding with obstacles. In addition, NSGA-II is a very robust algorithm and its diversity-preserving mechanism offers advantages even for single-objective optimization. In our work, the NSGA-II procedure is applied repeatedly until the termination criterion is met; usually, it is run for a predefined number of iterations (Tmax).
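To make the GA loop described above concrete, the following Python sketch shows a minimal single-objective version with truncation selection and user-supplied operators. It is a simplified stand-in for the NSGA-II machinery actually used in this work, not the authors' implementation; the function names and parameter defaults are illustrative only.

```python
import random

def evolve(fitness, init, crossover, mutate, pop_size=1000, generations=100):
    """Minimal single-objective GA loop: initialize, evaluate, select,
    recombine and mutate, repeating for a fixed number of generations."""
    pop = [init() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness)      # lower fitness = shorter path
        parents = scored[:pop_size // 2]       # truncation selection (a simplification)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            children.append(mutate(crossover(a, b)))
        pop = parents + children
    return min(pop, key=fitness)
```

A toy usage, minimizing a sum of squares with averaging crossover and Gaussian mutation, converges close to the origin within a few dozen generations.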

5.2.1 Chromosome representation and evaluation: In our approach, one individual in the GA population is a path consisting of a list of waypoints that the robot visits on the way from the origin to the destination. In this paper, we use NSGA-II with some modifications to the representation of a candidate path, together with the necessary constraint checking for path feasibility.


Each 2D point in the space is represented using two values, one along each of the X and Y axes. Therefore, if there are n points, one solution comprises 2n real-parameter variables, stored as a two-tuple for each point in succession, as shown in Equation (3).

\underbrace{X_1, Y_1}_{P_1}, \underbrace{X_2, Y_2}_{P_2}, \ldots, \underbrace{X_n, Y_n}_{P_n} \qquad (3)
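Decoding a chromosome of this form into the full waypoint list, with the fixed start and end points attached, might look like the sketch below; the function name is ours.

```python
def decode(chromosome, start, end):
    """Turn a flat [X1, Y1, ..., Xn, Yn] real-parameter vector (Equation 3)
    into the full waypoint list P0..P(n+1), adding the fixed start and end
    points at the beginning and the end respectively."""
    assert len(chromosome) % 2 == 0, "chromosome must hold (X, Y) pairs"
    waypoints = [(chromosome[i], chromosome[i + 1])
                 for i in range(0, len(chromosome), 2)]
    return [start] + waypoints + [end]
```

For example, `decode([1, 2, 3, 4], (0, 0), (5, 5))` yields the four-point path `[(0, 0), (1, 2), (3, 4), (5, 5)]`.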

While computing the entire path from start to end, the start and end points are added to the above list at the beginning and at the end, respectively. The robot must not collide with obstacles; therefore every candidate path in a population is checked for feasibility (a feasible path does not collide with obstacles). We model our weather robot as a bounding box with a width of 50 cm. To check this condition, a bounding box representing the width of the robot is created between every two successive points. If none of the bounding boxes corresponding to a path intersects with any polygon representing an obstacle, the path satisfies the feasibility constraint. In Figure 9, although the path line connecting points P1, P2, P3, P4 does not intersect with any polygon, the path does not satisfy the feasibility constraint, because the bounding boxes of the path intersect with a polygon corresponding to an obstacle. The bounding boxes of a path do not intersect with an obstacle polygon if and only if no edge of the obstacle polygon intersects with any edge of the path bounding boxes, every point of the obstacle polygon is outside the path bounding boxes, and every point of the path bounding boxes is outside the polygon.

Figure 9 Bounding boxes for the robot path

In our optimization procedure, if an individual does not satisfy the feasibility constraint, a very large penalty is added to its fitness value in order to remove the possibility of its selection in the following generation; otherwise, its fitness is the length of the overall path. The path length should be as small as possible, because we are interested in minimizing it. The length of the path is calculated by summing the distances between consecutive points of the path. If we assume the path consists of a list of points P_0, P_1, \ldots, P_{n+1}, where P_0 is the origin and P_{n+1} is the destination, the length of the path is given by Equation (4).

L = \sum_{i=0}^{n} \left\| P_i P_{i+1} \right\| \qquad (4)
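The penalized fitness evaluation described above can be sketched as follows. For brevity, this sketch models obstacles as (centre, radius) circles and tests the robot's 50 cm swept width with a segment-to-centre distance check, rather than the polygon bounding-box intersection test actually used in the paper; the function names and the penalty constant are illustrative.

```python
import math

HALF_WIDTH = 0.25  # half of the robot's 50 cm bounding-box width, in metres

def seg_point_dist(p, q, c):
    """Shortest distance from point c to the segment pq."""
    px, py = p; qx, qy = q; cx, cy = c
    dx, dy = qx - px, qy - py
    L2 = dx * dx + dy * dy
    t = 0.0 if L2 == 0 else max(0.0, min(1.0, ((cx - px) * dx + (cy - py) * dy) / L2))
    return math.hypot(cx - (px + t * dx), cy - (py + t * dy))

def path_fitness(points, obstacles, penalty=1e6):
    """Equation (4) path length, plus a very large penalty when any segment's
    swept box comes too close to an obstacle, so infeasible individuals are
    effectively never selected for the next generation."""
    length = sum(math.dist(points[i], points[i + 1])
                 for i in range(len(points) - 1))
    for i in range(len(points) - 1):
        for centre, radius in obstacles:
            if seg_point_dist(points[i], points[i + 1], centre) < radius + HALF_WIDTH:
                return length + penalty  # infeasible path
    return length
```

A path that clears all obstacles scores its plain length; one that grazes an obstacle within the robot's half-width scores length plus the penalty.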

5.2.2 Genetic operators: For recombination, we use the standard SBX operator (Deb et al., 2007) with crossover probability p_c = 0.9 and a distribution index of \eta_c = 15. SBX is the simulated binary crossover, whose search power is similar to that of the single-point crossover used in binary-coded GAs. Simulation results on a number of real-valued test problems of varying difficulty and dimensionality suggest that real-coded GAs with the SBX operator are able to perform as well as, or better than, binary-coded GAs with single-point crossover, and SBX has been found to be particularly useful in problems like the one at hand. Furthermore, a simulation on a two-variable blocked function shows that the real-coded GA with SBX works as suggested by Goldberg, and in most cases its performance is similar to that of binary GAs with single-point crossover (Deb and Agrawal, 1995). Equation (5) details the polynomial mutation operator employed by our genetic algorithm (Deb and Goyal, 1996):

y_i = x_i + (x_i^U - x_i^L)\,\delta_i, \qquad
\delta_i =
\begin{cases}
(2 r_i)^{1/(\eta_m + 1)} - 1 & \text{if } r_i < 0.5 \\
1 - \left[ 2(1 - r_i) \right]^{1/(\eta_m + 1)} & \text{otherwise}
\end{cases} \qquad (5)

where x_i is the value of the ith parameter selected for mutation; y_i is the result of the mutation; x_i^L and x_i^U are the lower and upper bounds of x_i, respectively; r_i is a random number in [0, 1]; and \eta_m is a control parameter (\eta_m = 20 in our study).
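Equation (5) can be implemented directly, as sketched below for a single parameter with \eta_m = 20. Clipping the result back to the variable's bounds is our addition (the equation as stated can step outside them); the function name is ours.

```python
import random

def polynomial_mutation(x, lower, upper, eta_m=20.0):
    """Polynomial mutation of one real parameter (Equation 5, after
    Deb and Goyal, 1996), with distribution index eta_m."""
    r = random.random()
    if r < 0.5:
        delta = (2.0 * r) ** (1.0 / (eta_m + 1.0)) - 1.0
    else:
        delta = 1.0 - (2.0 * (1.0 - r)) ** (1.0 / (eta_m + 1.0))
    y = x + (upper - lower) * delta
    return min(max(y, lower), upper)  # clip to bounds (our addition)
```

With \eta_m = 20 the perturbations are small with high probability, so mutated values stay near the parent most of the time.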

5.2.3 Objective function: The objective is to find the shortest path for the robot from a given start point to an end point in the environment. Thus the objective function minimizes Equation (4), where P_0 is the origin, P_{n+1} is the destination, and P_1 to P_n are intermediate points extracted from a chromosome.

5.3 Air traffic flow management

The weather robot monitors a realistic air traffic control environment: the Trajectory Optimization and Prediction for Live Air Traffic (TOP-LAT) system (Figure 10). This system was co-developed by the authors and provides real-time, full situational awareness of the airspace (control zones, sectors, terminal airspace, special-use airspace), including air traffic flow, aviation emissions, airspace complexity and safety indicators. TOP-LAT synthesizes and integrates all of this information into an interactive graphical user interface that assists users (airline operation centres, air traffic flow management centres) in making air traffic flow management decisions. TOP-LAT provides users with a clear view of the sectors in each category, and the interface provides options for choosing individual sectors to be monitored by ATCs.

Figure 10 Air traffic flow management environment (TOP-LAT) showing all the sectors in the Australian airspace

5.4 Sector image processing

The weather robot takes a picture of the controller's screen to identify which sector the controller is handling. In order to detect a sector, the colour picture is first transformed into a grey-scale image and the histogram of the grey-scale image is built. The image is then adjusted to display only the curves and points of interest, e.g. the boundary of the sector in white and the image background in black, by limiting the lower and upper grey levels (between 0.92 and 0.98 in our study) of the histogram. Equations (6)-(8) illustrate the transformation process, in which C_i is the grey level of pixel i before the transformation, \hat{C}_i is the grey level after the transformation, R_i, G_i and B_i are the red, green and blue components of the pixel, and W_R, W_G and W_B are the corresponding weights:

C_i = R_i \times W_R + G_i \times W_G + B_i \times W_B \qquad (6)

C_{min} = \min_i C_i, \qquad C_{max} = \max_i C_i \qquad (7)

\hat{C}_i = \frac{C_i - C_{min}}{C_{max} - C_{min}} \times 255 \qquad (8)
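The grey-scale conversion and contrast stretch of Equations (6)-(8) can be sketched as follows. The RGB weights shown are the common luminance coefficients, which is an assumption on our part: the paper does not state the exact W_R, W_G, W_B values used.

```python
def to_grey(pixels, w_r=0.299, w_g=0.587, w_b=0.114):
    """Weighted grey conversion (Equation 6) followed by a contrast
    stretch to the full 0-255 range (Equations 7 and 8)."""
    grey = [r * w_r + g * w_g + b * w_b for r, g, b in pixels]   # Equation (6)
    c_min, c_max = min(grey), max(grey)                          # Equation (7)
    if c_max == c_min:
        return [0 for _ in grey]  # flat image: nothing to stretch
    return [round((c - c_min) / (c_max - c_min) * 255)           # Equation (8)
            for c in grey]
```

For instance, a black pixel, a white pixel and a pure-red pixel map to stretched grey levels 0, 255 and 76 respectively.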

Blob detection and extraction (Lindeberg, 1994) is then applied to remove the noise points in the adjusted image and to extract the blob of a sector. At this stage, a label is applied to the sector. The highlighted sector is then filled with white for pixel-distance comparison. Based on the boundary of the sector, the image is cropped to the required size (M × N) fitting the sector, and then translated into a 2D matrix (M × N) of Boolean values, where true or false identifies a pixel as white or black, respectively. The resulting Boolean matrix is then used to search for the matching sector in a previously stored database of sector boundaries and locations. The various stages of sector image processing are illustrated in Figure 11.
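The blob-extraction step can be illustrated with a minimal flood-fill connected-component pass over the Boolean matrix, keeping only the largest white blob (the sector) and discarding noise specks. This is a simplified stand-in for the scale-space blob detection of Lindeberg (1994) referenced above; the function name is ours.

```python
from collections import deque

def largest_blob(mask):
    """Label 4-connected components in a Boolean matrix by flood fill
    and return a matrix containing only the largest blob."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    best = []
    for r0 in range(rows):
        for c0 in range(cols):
            if mask[r0][c0] and not seen[r0][c0]:
                blob, queue = [], deque([(r0, c0)])
                seen[r0][c0] = True
                while queue:                      # breadth-first flood fill
                    r, c = queue.popleft()
                    blob.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = r + dr, c + dc
                        if (0 <= nr < rows and 0 <= nc < cols
                                and mask[nr][nc] and not seen[nr][nc]):
                            seen[nr][nc] = True
                            queue.append((nr, nc))
                if len(blob) > len(best):
                    best = blob
    out = [[False] * cols for _ in range(rows)]
    for r, c in best:
        out[r][c] = True
    return out
```

Isolated single-pixel noise is dropped because it forms a smaller component than the sector blob.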

6. Validation
To evaluate the functionalities of the robot in the air traffic control environment, a number of experiments were conducted. In particular, two main functionalities of the weather robot were tested: (1) navigation in a constrained dynamic environment; and (2) recognizing ATC sectors, interpreting weather messages for that sector and communicating (through a synthetic voice) relevant weather information to the controller managing the sector.


Figure 11 (a) Sector image captured by robot in the ATC environment; (b) grey image generated by the image processing algorithm; (c) black and white processed image of sector boundaries generated by the image processing algorithm; (d) image after blob extraction; (e) sector filled by white colour by the image processing algorithm; (f) identified sector image by the image processing algorithm


6.1 Navigation in a constrained environment

We used a live air traffic environment for the Australian airspace from the TOP-LAT system. Four ATC positions were set up in our Air Traffic Research Laboratory. The task of the weather robot is to navigate successfully in this constrained environment, identify each controller's position, and aim its camera at the sector image on the controller's screen. The robot moves continuously in this environment (Figure 12), going from one controller's position to another. Timings were recorded and the distance covered from one position to another was measured. We note that this is a small set-up compared with a full air traffic operation centre, where the computational timings and distances covered by the weather robot would be of much greater importance. In this path-finding experiment, we chose to use 10 intermediate points in a path, i.e. n = 10 (see Equation 4). The number of generations was 100, and the population of each generation consisted of 1000 candidate paths. It took approximately 35 s to find a path for the weather robot.

Figure 12 Weather robot navigating in the ATC environment


Figure 13 An example robot navigation path in the given ATC environment

Figure 13 shows the path found by the algorithm (path P) and the shortest path (path S). It can be seen that the path found by the algorithm is close to the shortest path. The turning points between the origin and the destination in path P are the points extracted from the best chromosome in the final population. The two objects (ATC environment hardware) in the centre (with red boundaries) are obstacles that the robot needs to navigate around. As shown in Figure 14, the algorithm converges after about 80 generations: the standard deviation of the fitness values of the individuals in a population decreases quickly, falling to around 10 cm after 80 generations.

6.2 Sector recognition and interpreting weather data

The other task of the robot is to capture the sector image and correctly identify, in a reasonable time, which sector a controller is managing. This is followed by interpreting the weather-related messages for the identified sector and delivering them to the relevant controller.


Figure 14 Mean and standard deviation of fitness values through 100 generations (Seed 1)

As shown in Figure 12, there are four ATC positions working on different sectors. The weather robot is given a start point and an end point. The robot then moves continuously from start to end, stopping at each controller's position to identify the sector and make weather announcements before moving on to the next controller's position. The sector information was combined with the weather information processed by the neural network to formulate the weather message. In our experiment, the accuracy of the image recognition task achieved with the neural network was 100%. A sample final text output of the image recognition module is illustrated in Figure 15, which fully matched the sample input data given in Figure 5.

6.3 Metrics and measures

The following measures and metrics were recorded in the experiment.


- Distance (m) from start to end, and distance travelled by the weather robot.
- Time (s) to navigate from start to end.
- Time (s) at a controller's position.
- Time (s) to identify a sector.
- Time (s) to make weather announcements.
- Correct identification of the sector.
- Correct interpretation of weather data pertaining to the recognized sector.
- Delivery of weather announcements, in synthetic voice, to the relevant ATC.


Figure 15 Processed wind and atmospheric data text generated by neural networks

Three experiments with four different sectors were conducted, giving 12 sectors to be recognized and 12 different sets of weather data to be processed by the weather robot and announced across the 12 controller positions.

7. Results and discussions


The weather robot successfully navigated in the air traffic environment from its given start position to its end position. The distance from the start to the end position was 2.52 m; the robot travelled 4.25 m, which included movements at each controller's position. The average total time taken by the robot from start to end was 54 s, with a standard deviation of 1 s. The robot has five speed settings and could run five times faster, but its speed was limited because of safety concerns. The weather robot correctly identified the controller's position in each instance and performed the correct positioning of the camera with respect to the sector.

Figure 16 Weather robot processing sector image

The robot spent 59 ± 20 s on average at each controller's position. The time at a controller's position included the time to take a picture of the sector, identify the sector, process the weather message pertaining to that sector and vocalize the relevant weather conditions. On average, the weather robot took 25 ± 10 s to identify the sector image and 30 ± 10 s to make the announcements. The robot successfully recognized all 12 sector images displayed on the ATC screens and correctly identified the weather information pertaining to each sector. A video of the robot in motion during the experiments can be seen at http://www.itee.adfa.edu.au/~alar/tmag/. Figure 16 shows the weather robot at an ATC position processing a sector image. As an example, when the weather robot successfully recognized the YBBB/YMMM/HUON sector over Tasmanian airspace, it made the following weather announcement (speech converted to text): Weather Information for YBBB/YMMM/HUON Sector. Location: 45S 140E. Wind Direction: 20 Degrees. At Flight Level: 300. Wind Speed: 130 knots. Temperature: -48 degrees centigrade. Barometric Pressure: 300 hectopascal. SIGMET Validity Period from 8/19/2009 3:00:00 PM to 8/19/2009 8:30:00 PM. Severe turbulence.


Location: Latitude 47 degrees to 32 degrees and Longitude 128 degrees to 136 degrees. Movement: East. Movement Speed: 25 Knots.

An audio clip of the weather message announced by the weather robot to the controllers for the given sector can be listened to at http://www.itee.adfa.edu.au/~alar/tmag/.

8. Conclusions
In this paper, we proposed a weather robot that uses bio-inspired techniques for weather image processing and navigation in a constrained ATM environment. The robot is capable of navigating around ATC positions and finding shortest paths from one location to another while avoiding obstacles. It uses neural networks for weather image data processing, and edge-detection algorithms to identify airspace sectors on an ATC screen. The robot can assist human ATCs with respect to weather phenomena associated with a controller's working sector by announcing the relevant weather information to the controller in natural language. In our experiments, the weather robot successfully navigated in a simulated ATM environment, correctly identified the sectors and made appropriate weather announcements based on the relevant sector weather information.

Although the main intention of our robot system is to demonstrate the practical synthesis and engineering of a robotic system in ATM environments, we are interested in improving the bio-inspired techniques employed in our robot so that they can function with greater efficiency in the ATC domain. To this end, an analysis of the topology of the search spaces involved in our domain, and of the relationship between these search spaces and our chosen algorithms, would be desirable. We leave this as a matter for future work as our project continues. In future work we will also further explore the effect of co-adaptation and learning within the environment. In addition, high-risk sectors with regular weather phenomena, such as thunderstorms and hailstorms, will be attended to regularly by the weather robot. The navigation strategy of the weather robot within an air traffic control centre will also be improved, such that the robot's path planning is based on the prioritization of weather reports, and any critical weather phenomenon can be delivered to the required controller in the most efficient and timely manner.
We will also conduct human-factor experiments investigating the effect of the weather robot on the cognitive workload of the controller, as well as the role that implicit embodiment plays in the communication of weather alerts.


References
Alam, S. 2008: Evolving complexity towards risk: a massive scenario generation approach for evaluating advanced air traffic management concepts. PhD thesis, School of Information Technology & Electrical Engineering, University of New South Wales.
Azuma, R., Neely, H., Daily, M. and Geiss, R. 2000: Visualization tools for free flight air-traffic management. IEEE Computer Graphics and Applications, 32-6.
Barraquand, J., Langlois, B. and Latombe, J.-C. 1992: Numerical potential field techniques for robot path planning. IEEE Transactions on Systems, Man, and Cybernetics 22, 224-41.
Broach, D. 2005: A singular success: air traffic control specialist selection 1981-1992. In Kirwan, B., Rodgers, M. and Schafer, D., editors, Human factors impacts in air traffic management, 177-205.
Deb, K. and Agrawal, R.B. 1995: Simulated binary crossover for continuous search space. Complex Systems 9, 115-48.
Deb, K. and Goyal, M. 1996: A combined genetic adaptive search (GeneAS) for engineering design. Computer Science and Informatics 26, 30-45.
Deb, K., Pratap, A., Agarwal, S. and Meyarivan, T. 2002: A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation 6, 182-97.
Deb, K., Sindhya, K. and Okabe, T. 2007: Self-adaptive simulated binary crossover for real-parameter optimization. GECCO '07: Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation. ACM, 1187-94.
Erzberger, H. 2004: Transforming the NAS: the next generation air traffic control system. NASA Ames Research Center, Moffett Field, CA, Technical Report NASA/TP-2004-212828.
Evans, J. 2001: Tactical weather decision support to complement strategic traffic flow management for convective weather. The Fourth International Air Traffic Management R&D Seminar ATM-2001, Santa Fe, NM. Paper available at http://atm2001.eurocontrol.fr.
Heaton, J.T. 2005: Introduction to neural networks with Java, second edition. Heaton Research.
ICAO. 2002: The ICAO Global Air Navigation Plan for CNS/ATM systems. ICAO, Volume 1, no. 9750.
Joint Planning and Development Office. 2007: Concept of operations for the next generation air transportation system, version 2.0. Washington, DC.
Lester, P. 2000: Aviation weather. Jeppesen Sanderson.
Lindeberg, T. 1994: Scale-space theory in computer vision. Springer.
Nolan, M. 2004: Fundamentals of air traffic control, fourth edition. Brooks/Cole-Thomson Learning.
Palacin, J., Valganon, I. and Pernia, R. 2006: The optical mouse for indoor mobile robot odometry measurement. Sensors & Actuators A: Physical 126, 141-7.
SESAR Consortium. 2007: The ATM target concept: SESAR definition phase, deliverable 3. Eurocontrol, Brussels, Technical Report DLM-0612-001-02-00.
Sherry, J., Ball, C. and Zobell, S. 2001: Traffic flow management (TFM) weather rerouting. 4th USA/Europe Air Traffic Management R&D Seminar.
Sobel, I. and Feldman, G. 1968: A 3×3 isotropic gradient operator for image processing. Presentation for the Stanford Artificial Intelligence Project.
Teo, J. and Abbass, H. 2005: Multiobjectivity and complexity in embodied cognition. IEEE Transactions on Evolutionary Computation 9, 337-60.
Wickens, C. 1999: Automation in air traffic control: the human performance issues. In Scerbo, M.W. and Mouloua, M., editors, Automation technology and human performance: current research and trends. Lawrence Erlbaum, NJ, 2-10.
Ziemke, T. 2003: What's that thing called embodiment? Proceedings of the 25th Annual Meeting of the Cognitive Science Society, 1305-10.

