
Autonomous Flight Control for RC Helicopter Using a Wireless Camera


Yue Bao1, Syuhei Saito2 and Yutaro Koya1
1Tokyo City University, 2Canon Inc., Japan

1. Introduction
In recent years, there has been a great deal of research on autonomous flight control of micro radio control (RC) helicopters. Some of it concerns flight control of unmanned helicopters (Sugeno et al., 1996) (Nakamura et al., 2001). The approach using a fuzzy control system consisting of If-Then control rules satisfies the flight control performance requirements of an unmanned helicopter, such as hovering, takeoff, rotating, and landing. For autonomous flight control, it is necessary to estimate the three-dimensional position and posture of the micro RC helicopter. Position and posture estimation methods for autonomous flight control of an RC helicopter using GPS (Global Positioning System), IMU (Inertial Measurement Unit), Laser Range Finder (Amida et al., 1998), image processing, etc. have been proposed. However, the method using GPS cannot be used in places that cannot receive radio waves from the satellites, so it cannot be used for flight control indoors. Although methods using various sensors, such as an IMU and a Laser Range Finder, can be used indoors, many expensive sensors or receivers have to be arranged in the room beforehand, so these methods are not efficient. On the other hand, a method using images input from a camera can be used both outdoors and indoors and is inexpensive. However, this method needs many artificial markers installed in the surroundings, and the speed of image input and image processing cannot keep up with the speed of movement or vibration of an RC helicopter. A method estimating the three-dimensional position and posture of an RC helicopter by stereo measurement with two or more cameras installed on the ground has also been proposed (Ohtake et al., 2009). In this case, the moving range of the RC helicopter is limited to the place where the cameras are installed, and a high-resolution camera must be used to cover the whole moving range. The authors are studying autonomous flight of an RC helicopter with a small wireless camera and a simple artificial marker set on the ground. This method does not need expensive sensors, receivers, or cameras installed in the flight environment. Moreover, a wider-ranging flight becomes possible if natural feature points are detected from the image obtained by the camera on the RC helicopter. This chapter contains the following contents:


a. Input method of the image from a small wireless camera set on the RC helicopter.
b. Extraction method of feature points from an image of the flight environment taken with the camera on the RC helicopter.
c. Calculation method of the three-dimensional position and posture of the RC helicopter by image processing.
d. Experiment of autonomous flight of the RC helicopter using fuzzy logic control.

2. Composition of system
The overview of the micro RC helicopter with coaxial counter-rotating blades used in our experiment is shown in Fig.1. Since this RC helicopter can cancel the reaction torque on the body by the opposing torques of the upper and lower rotors, it can fly with less shaking than a usual RC helicopter. The composition of our experimental system for automatic guidance of the RC helicopter is shown in Fig.2. A small wireless camera is attached to the RC helicopter as shown in Fig.3; the image of the ground is acquired with this camera, sent to the receiver on the ground, and then sent to the computer through a video capture device.

Fig. 1. Micro RC helicopter with the coaxial contra-rotating rotors

Fig. 2. Composition of automatic guidance system


Fig. 3. RC helicopter equipped with a micro wireless camera

Fig. 4. An artificial marker

Fig. 5. Coordinate axes and attitude angles


The position and posture of the RC helicopter are computed by image processing on the computer set on the ground; this image processing uses the position and shape of the artificial marker in the camera image, shown in Fig.4. The three-dimensional position of the RC helicopter (x(t), y(t), z(t)), the changing speed of the position (ẋ(t), ẏ(t), ż(t)), the attitude angle, and the changing speed of the attitude angle can be obtained by this calculation. Fig.5 shows the relation between these coordinate axes and attitude angles. The micro RC helicopter is controlled by four control signals, Aileron, Elevator, Rudder, and Throttle, and the control rules of the fuzzy logic control are decided using the measurement data mentioned above. The control signals are sent to the micro RC helicopter through a digital-analog converter.
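Taken together, Fig.2 describes a per-frame ground-station loop: capture an image from the wireless receiver, estimate the pose of the helicopter from the marker, and convert the pose into the four control signals. The following Python skeleton is only an illustrative sketch of that loop under our own naming; the class, the method names, and the use of OpenCV's VideoCapture for the video capture device are assumptions, and each placeholder method corresponds to a step detailed in Sections 3 to 5.

```python
import cv2


class GroundStation:
    """Sketch of the ground-side processing loop of Fig. 2 (not the authors' code)."""

    def __init__(self, video_index=0):
        # The receiver output is assumed to appear as an ordinary capture device.
        self.capture = cv2.VideoCapture(video_index)

    def step(self):
        """Process one frame and return (aileron, elevator, rudder, throttle) or None."""
        ok, frame = self.capture.read()
        if not ok:
            return None
        frame = self.deinterlace(frame)        # Section 3.1: remove interlace blur
        marker = self.detect_marker(frame)     # Section 3.3: binarize, outline, axes
        if marker is None:
            return None                        # would fall back to natural feature points
        pose = self.estimate_pose(marker)      # Section 4: (x, y, z) and attitude angles
        return self.fuzzy_control(pose)        # Section 5: fuzzy rules -> control signals

    # Placeholders for the steps described later in the chapter.
    def deinterlace(self, frame):
        return frame

    def detect_marker(self, frame):
        return None

    def estimate_pose(self, marker):
        return None

    def fuzzy_control(self, pose):
        return None
```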

3. Image processing
3.1 Image input

The micro wireless camera attached to the RC helicopter takes images by interlaced scanning. Since the vibration of the RC helicopter is quicker than the frame rate of the camera, an image taken during flight becomes blurred as a result of the interlacing, as in Fig.6. We devised a method that skips the odd-numbered lines (or the even-numbered lines) of the input image to acquire a clear input image while the RC helicopter is flying.
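As a sketch of this line-skipping idea, the following Python function keeps only one field of an interlaced frame; duplicating the kept rows to restore the original height is our own choice, and which field (odd or even) is sharper depends on the camera.

```python
import numpy as np


def deinterlace(frame: np.ndarray, keep_even: bool = True) -> np.ndarray:
    """Keep only the even (or odd) scan lines to suppress interlace blur."""
    field = frame[0::2] if keep_even else frame[1::2]    # every second row
    # Repeat each kept row once so the image keeps its original height.
    return np.repeat(field, 2, axis=0)[: frame.shape[0]]
```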

Fig. 6. A blurred image acquired by the wireless camera

3.2 Feature point extraction

Feature point detection is defined in terms of local neighborhood operations applied to an image, such as edge and corner detection. The Harris operator (Harris and Stephens, 1988) (Schmid et al., 1998) and the SUSAN operator (Smith and Brady, 1997) are well known as common feature detectors.


Methods to estimate position and attitude by using natural feature points or a marker in the input image have also been proposed (Neumann and You, 1999) (Bao and Komiya, 2008). Our prototype experiment system used the Harris operator, which can extract the same feature points at a higher rate than other feature point extraction methods (Schmid et al., 1998). First, a grayscale image is obtained from the camera. Consider taking an image patch over an area (u, v) of an image I and shifting it by (x, y). The Harris matrix M is found from the second derivatives of the sum of squared differences between these two patches around (x, y) = (0, 0). Writing Ix = ∂I/∂x and Iy = ∂I/∂y, M is given by:

M = G(σ) ∗ [ Ix²    Ix·Iy ]
           [ Ix·Iy  Iy²   ]                                    (1)

where G(σ) denotes smoothing with a Gaussian filter of standard deviation σ. The strength of a corner is decided by the second derivative. Let the eigenvalues of M be (λ1, λ2); the values of the eigenvalues lead to the following inference. If λ1 ≈ 0 and λ2 ≈ 0, there is no feature at the pixel (x, y). If λ1 and λ2 are both large positive values, a corner is found. If either λ1 or λ2 is a large positive value, an edge is found.

Because the exact calculation of the eigenvalues in Harris' method increases the computational cost, the following functions R were proposed instead:

R = det(M) − k·(tr(M))²                                        (2)

R = λ1·λ2 − k·(λ1 + λ2)²                                       (3)

Here det expresses the determinant, tr expresses the trace (the sum of the diagonal elements of a matrix), and k is a value decided experientially. The Kanade-Tomasi corner detector (Shi and Tomasi, 1994) uses min(λ1, λ2) as the measure of a feature point. As an example, Fig.7 shows feature point detection with the Harris operator on a photographed image; the Harris operator mainly detects corner points in the image as feature points. The position of a feature point can be estimated from the position of the artificial marker relative to the feature point in the camera image after coordinate transformation. The flight control area of the RC helicopter can be expanded (see Fig.8) by using the information of natural feature points around the artificial marker, and the Harris operator is suitable for detecting such natural feature points. Our system saves the areas including the natural feature points as templates when the artificial marker is detected. In the range where the image of the artificial marker can be taken, the system uses the position information of the artificial marker. If the system cannot take the image of the artificial marker, the position of the helicopter is estimated by template matching between the areas of natural feature points and the saved templates.
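As an illustration of equations (1)-(2), the following Python sketch computes the Harris response map with OpenCV derivatives and Gaussian smoothing; the value k = 0.04 and the threshold in the usage comment are common choices, not values given in the chapter.

```python
import cv2
import numpy as np


def harris_response(gray: np.ndarray, sigma: float = 1.5, k: float = 0.04) -> np.ndarray:
    """Corner strength R = det(M) - k * tr(M)^2, equations (1)-(2)."""
    gray = np.float32(gray)
    ix = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)      # dI/dx
    iy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)      # dI/dy
    # Gaussian-smoothed products are the entries of the Harris matrix M.
    ixx = cv2.GaussianBlur(ix * ix, (0, 0), sigma)
    iyy = cv2.GaussianBlur(iy * iy, (0, 0), sigma)
    ixy = cv2.GaussianBlur(ix * iy, (0, 0), sigma)
    det_m = ixx * iyy - ixy * ixy
    tr_m = ixx + iyy
    return det_m - k * tr_m * tr_m


# Usage: keep the strongest responses as feature points (threshold is ad hoc).
# gray = cv2.cvtColor(cv2.imread("ground.png"), cv2.COLOR_BGR2GRAY)
# r = harris_response(gray)
# feature_points = np.argwhere(r > 0.01 * r.max())      # (row, col) pairs
```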


Fig. 7. Feature point extraction by a Harris operator

Fig. 8. The expansion of flight area


3.3 Detection of a marker

Fig. 9. The flow chart of marker detection

The image of the marker photographed with the wireless camera on the micro RC helicopter is shown in Fig.4. The yaw angle can be calculated from the center of the marker and the position of the triangular cut of the marker, and the pitch angle and roll angle can be acquired by coordinate transformation between the camera coordinate system and the world coordinate system. The flow chart of the image processing for marker extraction and for the calculation of position and posture is shown in Fig.9. First, binarization is performed on the input image from the camera installed on the micro RC helicopter. Then the marker is searched for in the image, and the outline of the marker is extracted. If the outline cannot be extracted, an image is acquired from the camera again and the marker is searched for again. The marker center is searched for after the outline of the marker is extracted. The method of this search is shown in Fig.10: the maximum and minimum x-coordinates and y-coordinates on the extracted outline are found, and their middle values are taken as the coordinates of the marker center. The lengths of the major axis and the minor axis of the marker are calculated from the distances between the center coordinates of the marker and the pixels on the outline of the marker. The calculation method of the major axis and the minor axis is shown in Fig.11. When the center coordinate is defined as P(xc, yc), and the coordinate of a pixel on the outline is defined as I(x, y), the distance PI from the center to the pixel on the outline is calculated by equation (4).

PI = √((x − xc)² + (y − yc)²)                                  (4)

PI is calculated for all pixels on the outline to obtain its maximum value G1 and minimum value G2. The segment of length G1 is defined as the major axis PO, and the segment of length G2 is defined as the minor axis PQ. The position and posture of the micro RC helicopter are then calculated by the method shown in Section 4.
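A minimal sketch of this marker-detection step in Python/OpenCV is given below, assuming a dark marker on a bright floor and the OpenCV 4 findContours signature; the fixed threshold of 128 is an assumption, since the chapter does not state the binarization details.

```python
import cv2
import numpy as np


def marker_axes(gray: np.ndarray):
    """Marker centre and major/minor half-axis lengths, following Figs. 9-11."""
    # Binarize (marker assumed darker than the background) and take the largest outline.
    _, binary = cv2.threshold(gray, 128, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    outline = max(contours, key=cv2.contourArea).reshape(-1, 2)   # (x, y) outline pixels
    # Marker centre = middle of the outline's bounding box (Fig. 10).
    xc = (outline[:, 0].max() + outline[:, 0].min()) / 2.0
    yc = (outline[:, 1].max() + outline[:, 1].min()) / 2.0
    # Centre-to-outline distances, equation (4); their extrema give G1 and G2 (Fig. 11).
    dist = np.hypot(outline[:, 0] - xc, outline[:, 1] - yc)
    g1, g2 = float(dist.max()), float(dist.min())
    return (xc, yc), g1, g2
```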


Fig. 10. The marker center

Fig. 11. The calculation method of the major axis and the minor axis of the marker


Since the vibration of the RC helicopter is quicker than the frame rate of the camera, an image taken by the camera attached to the micro RC helicopter during flight becomes blurred as a result of the interlacing, as in Fig.6, and an exact result cannot be obtained by the usual image processing. The image processing we devised skips the odd-numbered (or even-numbered) lines in the y direction during pixel scanning. The method of scanning is shown in Fig.12. If the pixels being processed are on odd-numbered lines, the even-numbered lines are skipped in the y direction; in the x direction, the system scans one pixel at a time as in the usual pixel scanning. By this method, stable outline tracking can be performed without being affected by the interlace blur.

Fig. 12. The method of outline tracking

4. Calculation of position and posture


4.1 Calculation of the position coordinates of RC helicopter

In order to measure the position of the RC helicopter, it is first necessary to calculate the focal distance of the wireless camera on the RC helicopter. If the size of the marker and the z coordinate of the RC helicopter are known, the focal distance f of the wireless camera can be calculated easily. The three-dimensional coordinates of the moving RC helicopter are then calculated by using the focal distance, the center of gravity, and the radius of the marker. The image formation surface of the camera and the circular marker surface are parallel if the RC helicopter is hovering above the circular marker, as in Fig.13. Consider the case where the marker is in the center of the photographed image. The radius of the circular marker in the camera image is defined as D1, and the center coordinates of the circular marker in the image are defined as (xC1, yC1). The radius of the actual marker is defined as d1, and the center coordinates of the actual marker in the world coordinate system are defined as (x1, y1, z1). Then, the focal distance f of the camera can be calculated from the following two equations, which follow from the parallel relation.
z1 : d1 = f : D1                                               (5)

f = z1·D1 / d1                                                 (6)


When the RC helicopter has moved, the radius of the marker in the image after moving is defined as D2, the center coordinates of the marker in the image after moving are defined as (xC2, yC2), and the center coordinates of the actual marker are defined as (x2, y2, z2). Then the following equation is acquired.

z2 : d1 = f : D2                                               (7)

Here, since the focal distance f and the radius d1 of the actual marker do not change, the following relation is obtained from equation (5) and equation (7).

D1 : D2 = z2 : z1                                              (8)

z2 can be acquired by equation (9). Moreover, x2 and y2 can be calculated by equations (10) and (11), which follow from the parallel relation, where X2 and Y2 are the image coordinates of the marker center after moving. Therefore, the coordinates of the helicopter after moving are computable by using equations (9), (10), and (11) together with the focal distance of the camera.

z2 = D1·z1 / D2                                                (9)

x2 = X2·z2 / f                                                 (10)

y2 = Y2·z2 / f                                                 (11)

Fig. 13. The location calculation method of RC helicopter
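Equations (5)-(11) translate directly into a small helper. The sketch below assumes the image coordinates X2, Y2 are measured from the image center and that the calibration values (the hover height z1 and the measured radii) are available; the example numbers in the usage comment are made up, except the 19 cm marker radius mentioned in Section 6.

```python
def focal_length(z1: float, d1: float, D1: float) -> float:
    """Equation (6): f = z1 * D1 / d1, from a hover at known height z1."""
    return z1 * D1 / d1


def position_after_moving(f: float, z1: float, D1: float,
                          D2: float, X2: float, Y2: float):
    """Equations (9)-(11): helicopter position after moving.

    D1, D2: marker radii in the image before/after moving (pixels).
    X2, Y2: marker-centre image coordinates after moving (pixels from the image centre).
    """
    z2 = D1 * z1 / D2      # equation (9)
    x2 = X2 * z2 / f       # equation (10)
    y2 = Y2 * z2 / f       # equation (11)
    return x2, y2, z2


# Usage with the 190 mm marker radius and an assumed 1.5 m calibration height:
# f = focal_length(z1=1500.0, d1=190.0, D1=80.0)
# print(position_after_moving(f, z1=1500.0, D1=80.0, D2=60.0, X2=25.0, Y2=-12.0))
```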


4.2 Calculation of the attitude angle of RC helicopter

The relation between the attitude angle of the RC helicopter and the image in the camera coordinate system is shown in Fig.14. When the RC helicopter is hovering directly above the circular marker, the circular marker image in the camera coordinate system is a right circle, like the actual marker. If the RC helicopter leans, the marker in the camera coordinate system becomes an ellipse. To calculate the attitude angle, first the triangular cut part of the circular marker is extracted as a direction feature point. Then the deformation of the marker image is corrected and the yaw angle is calculated from the relation between the center of the circular marker and the location of its direction feature point. The pitch angle and the roll angle are calculated by performing a coordinate transformation from the camera coordinate system to the world coordinate system using the deformation rate of the marker in the image from the wireless camera.


Fig. 14. Relation between attitude angle of RC helicopter and image in wireless camera.
Calculation of a yaw angle

The value of the yaw angle can be calculated from the relation between the position of the center of the circular marker image and the position of its direction feature point. However, when the marker image is deformed into an ellipse, an exact value of the yaw angle cannot be obtained directly; the yaw angle has to be calculated after correcting the deformation of the circular marker. Since the length of the major axis of the ellipse does not change before and after the deformation of the marker, the angle θ between the x axis and the major axis can be calculated correctly even if the shape of the marker is not corrected. As shown in Fig.15, the center of the marker is defined as point P, the major axis of the marker is defined as PO, and the foot of the perpendicular dropped from point O to the x axis is defined as C. If the angle OPC is defined as θ', the following equation is obtained.

θ' = arctan(OC / PC)                                           (12)

Here, when the major axis lies in the 1st quadrant as in Fig.15(a), θ is equal to the value of θ'.


When the major axis lies in the 2nd quadrant, θ is calculated by subtracting θ' from 180 degrees, as in Fig.15(b). If the x-coordinate of point O is defined as xO, the value of θ is calculated by the following equation.

θ = θ'          (xO ≥ 0)
θ = 180° − θ'   (xO < 0)                                       (13)

Next, the angle between the major axis and the direction of the direction feature point is calculated. When the photograph is taken from a slant, the circular marker is transformed into an ellipse-like image, so the location of the cut part is shifted compared with its original location in the circular image. The marker is therefore corrected from an ellipse back to a right circle, and the angle is calculated after acquiring the original location of the direction feature point. First, the value for deforming the ellipse into a right circle on the basis of its major axis is calculated. The major axis of the ellipse is defined as PO as in Fig.16, and the minor axis is defined as PQ. The ratio R of the major axis to the minor axis is calculated by the following equation.

R = PO / PQ = G1 / G2                                          (14)

Fig. 15. An angle between the major axis and coordinate axes

If the minor-axis direction is stretched by this ratio, the ellipse can be transformed into a circle. The direction feature point of the marker in the ellipse is defined as a, and the foot of the perpendicular dropped from point a to the major axis PO is defined as S. If the location of the feature point on the corrected circle is defined as A, point A is the intersection between the extension of the segment aS and the right circle. Because aS is a line segment parallel to the minor axis, the length of the line segment AS is calculated by the following equation.

AS = aS · R                                                    (15)


When the line segment between point A and the center of the marker is defined as PA, the angle φ between the line segment PA and the major axis PO is calculated by the following equation.

φ = arctan(AS / PS)                                            (16)

Finally, the yaw angle is calculated by adding φ to θ.


Fig. 16. An angle between the direction feature point and the major axis
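The yaw computation of equations (12)-(16) can be sketched as follows; the symbol names match the reconstruction above, atan2 is used in place of the quadrant cases of equation (13), and all inputs are image pixel coordinates.

```python
import math


def yaw_angle(P, O, a, g2):
    """Yaw angle from equations (12)-(16).

    P: marker centre, O: far end of the major axis, a: direction feature point,
    all as (x, y) image coordinates; g2: minor half-axis length G2.
    """
    ox, oy = O[0] - P[0], O[1] - P[1]
    ax, ay = a[0] - P[0], a[1] - P[1]
    g1 = math.hypot(ox, oy)                       # major half-axis G1
    theta = math.degrees(math.atan2(oy, ox))      # angle of PO to the x axis, eqs (12)-(13)
    # Components of Pa along the major axis (PS) and along the minor axis (aS).
    ps = (ax * ox + ay * oy) / g1
    a_s = (-ax * oy + ay * ox) / g1
    # Undo the ellipse by stretching the minor-axis component by R = G1/G2 (eqs 14-15),
    # then phi is the angle between PA and the major axis, eq (16).
    phi = math.degrees(math.atan2(a_s * (g1 / g2), ps))
    return theta + phi                            # yaw = theta + phi
```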
Calculation of pitch angle and roll angle

By using the deformation rate of the marker in the image, the pitch angle and the roll angle can be calculated by performing a coordinate transformation from the camera coordinate system to the world coordinate system. To obtain the pitch angle and the roll angle, we used a weak perspective projection for the coordinate transformation (Bao et al., 2003). Fig.17 shows the principle of the weak perspective projection. The image of a plane figure in three-dimensional space photographed by a camera is defined as I, and the original configuration of the plane figure is defined as T. The relation between I and T is obtained by the weak perspective projection transformation, which consists of the following two projection steps.
a. T' is acquired by a parallel projection of T onto a plane P parallel to the camera image surface C.
b. I is acquired by a central projection of T' onto C.
The attitude angle α' is acquired using the relation between I and T. The angle α' shown in Fig.18 expresses the angle between the original marker and the marker in the camera coordinate system. In that case, the major axis G1 and the minor axis G2 of the marker image appear as in Fig.19.

Fig. 17. The conceptual diagram of weak central projection
Fig. 18. The schematic diagram of the attitude angle α'


Fig. 19 shows the calculation method of α. If an inverse parallel projection along the optical axis of the camera is applied to the minor axis PQ, PQ is transformed into LP. Since the original configuration of the marker is a right circle, LP becomes equal to the length G1 of the major axis in the camera coordinate system. α' is calculated by the following equation.

α' = arcsin(G2 / G1)                                           (17)

Fig. 19. Calculation of an attitude angle

To obtain the segment TU, SU is projected orthogonally onto the flat surface parallel to PQ. PQ and TU are parallel, and LP and SU are also parallel. Therefore, the relation between α' and α is given by equation (18), and the inclination α of the camera can be calculated by equation (19).

α' = α                                                         (18)

α = arcsin(G2 / G1)                                            (19)
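In code, the inclination of equation (19) is a one-line computation from the measured axis lengths; the guard against ratios slightly above 1 (caused by pixel noise) is our addition.

```python
import math


def inclination_angle(g1: float, g2: float) -> float:
    """Camera inclination alpha = arcsin(G2 / G1), equation (19), in degrees."""
    ratio = min(1.0, g2 / g1)          # clamp small numerical overshoots
    return math.degrees(math.asin(ratio))
```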

5. Control of RC helicopter
Control of the RC helicopter is performed based on the position and posture of the marker acquired as described in Section 4.


When the RC helicopter is in autonomous hovering flight, the position data of the RC helicopter are obtained by tracking the marker from a fixed height. The fuzzy rules of the Throttle control input signal during autonomous flight are defined as follows.

If z(t) is PB and ż(t) is PB, Then Throttle is NB
If z(t) is PB and ż(t) is ZO, Then Throttle is NS
If z(t) is PB and ż(t) is NB, Then Throttle is ZO
If z(t) is ZO and ż(t) is PB, Then Throttle is NS
If z(t) is ZO and ż(t) is ZO, Then Throttle is ZO
If z(t) is ZO and ż(t) is NB, Then Throttle is PS
If z(t) is NB and ż(t) is PB, Then Throttle is ZO
If z(t) is NB and ż(t) is ZO, Then Throttle is PS
If z(t) is NB and ż(t) is NB, Then Throttle is PB

The fuzzy rules for Aileron, Elevator, and Rudder are designed by the same method as for Throttle. Each control input u(t) is acquired from the membership functions and the fuzzy rules. The adaptation value ωi and the control input u(t) of a fuzzy rule are calculated from the following equations.

ωi = Π(k=1..n) Aki(xk)                                         (20)

u(t) = Σ(i=1..r) ωi·ci / Σ(i=1..r) ωi                          (21)

Here, i is the index of a fuzzy rule, n is the number of input variables, r is the number of fuzzy rules, Aki is a membership function, xk is the input variable of the membership function, and ci is the output value of rule i (Tanaka, 1994) (Wang et al., 1997).
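A minimal sketch of this simplified fuzzy inference for the throttle channel is given below. The triangular membership shape, the label centres, and the singleton output values are our assumptions for illustration; only the nine rule pairings come from the list above.

```python
def triangular(x: float, centre: float, width: float = 1.0) -> float:
    """Triangular membership function (shape assumed; not specified in the chapter)."""
    return max(0.0, 1.0 - abs(x - centre) / width)


# Assumed centres of the input labels and singleton outputs of the throttle labels.
CENTRES = {"NB": -1.0, "ZO": 0.0, "PB": 1.0}
OUTPUTS = {"NB": -1.0, "NS": -0.5, "ZO": 0.0, "PS": 0.5, "PB": 1.0}

# The nine throttle rules: (z label, z-dot label) -> throttle label.
RULES = [
    ("PB", "PB", "NB"), ("PB", "ZO", "NS"), ("PB", "NB", "ZO"),
    ("ZO", "PB", "NS"), ("ZO", "ZO", "ZO"), ("ZO", "NB", "PS"),
    ("NB", "PB", "ZO"), ("NB", "ZO", "PS"), ("NB", "NB", "PB"),
]


def throttle(z: float, z_dot: float) -> float:
    """Equations (20)-(21): product rule weights, weighted mean of rule outputs."""
    num = den = 0.0
    for z_label, dz_label, out_label in RULES:
        w = triangular(z, CENTRES[z_label]) * triangular(z_dot, CENTRES[dz_label])  # eq (20)
        num += w * OUTPUTS[out_label]
        den += w
    return num / den if den > 0.0 else 0.0                                          # eq (21)


# Example: hovering slightly too high and still rising -> throttle command is reduced.
# print(throttle(z=0.4, z_dot=0.3))
```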

6. Experiments
In order to check whether the position and posture parameters can be calculated correctly, we compared actual measurement results with the calculated results in several experiments. The experiments were performed indoors. In the first experiment, the wireless camera shown in Fig.20 was set at a known three-dimensional position, and the marker was put on the ground as in Fig.21. The marker was photographed by this wireless camera, and a personal computer calculated the position and posture of the camera; the calculated parameters were then compared with the actual parameters. Table 1 shows the specification of the wireless camera and Table 2 shows the specification of the personal computer. A marker of 19 cm radius is used in the experiments because a marker of this size can easily be captured when this type of wireless camera, with a resolution of 640x480 pixels, photographs it from a height between 1 m and 2 m. Table 3 shows the experimental results for the z axis coordinate. Table 4 shows the experimental results for the moving distance. Table 5 shows the experimental results for the yaw angle (θ + φ). Table 6 shows the experimental results for the inclination angle α. Although there are some errors in the computed results, the values are close to the actual measurements.


Fig. 20. The wireless camera

Fig. 21. The first experiment

Maker: RF SYSTEM lab.
Part number: Micro Scope RC-12
Image sensor: 270,000 pixels, 1/4 inch, color CMOS
Lens: 0.8 mm pin lens
Scan mode: Interlace
Effective distance: 30 m
Time of charging battery: About 45 minutes
Size: 15 × 18 × 35 (mm)
Weight: 14.7 g

Table 1. The specification of the wireless camera

Maker: Hewlett Packard
Model name: Compaq nx 9030
OS: Windows XP
CPU: Intel Pentium M 1.60 GHz
Memory: 768 MB

Table 2. The specification of PC


Actual distance (mm):    800    1000    1200    1400
Calculated value:        785     980    1225    1372

Table 3. The experimental results of z axis coordinates

Actual moving distance (mm):               50    -50    100    -100
Computed value of x axis coordinates:      31    -33     78     -75
Computed value of y axis coordinates:      29    -33    101     -89

Table 4. The experimental results of moving distance

Actual degree (degree):    45    135    225    315
Calculated value:          64    115    254    350

Table 5. The experimental results of yaw angle (θ + φ)

Actual degree (degree):     0     10     20     40
Calculated value:          12     28     36     44

Table 6. The experimental results of the α angle

In the next experiment, we attached the wireless camera to the RC helicopter and checked whether the position and posture parameters could be calculated during flight. Table 7 shows the specification of the RC helicopter used for the experiment. A ground image like Fig.22 was photographed with the wireless camera attached to the RC helicopter during flight. The marker was detected by the procedure of Fig.9 using the image processing program: binarization was performed on the image input from the wireless camera and the outline of the marker was extracted as in Fig.23. The direction feature point was detected from the image of the ground photographed by the wireless camera as in Fig.24. Fig.25 shows the measurement results on the display of the personal computer used for the calculation. The measurement values in Fig.25 were x-coordinate = 319, y-coordinate = 189, z-coordinate = 837, and measured angles of 10.350105, -2.065881, and 37.685916 degrees. Since our proposed image input method, which reduces the blurring, was used, the position and posture could be acquired during flight. However, since the absolute position and posture of the RC helicopter could not be measured by another instrument during the flight, we confirmed by visual observation that the position and posture were acquired almost correctly.

Length: 360 mm (Body), 62 mm (Frame)
Width: 90 mm
Height: 160 mm
Gross load: 195 g
Diameter of a main rotor: 350 mm
Gear ratio: 9.857:1
Motor: XRB Coreless Motor

Table 7. The specification of RC helicopter


Fig. 22. An image photographed by the wireless camera

Fig. 23. The result of marker detection

Fig. 24. The result of feature point extraction


Fig. 25. The measurement results during flight

Finally, the autonomous flight control experiment of the RC helicopter was performed by detecting the marker, calculating the position and posture, and applying fuzzy control. Fig.26 shows a series of scenes from a hovering flight of the RC helicopter. The results of the image processing can be checked on the display of the personal computer. From the experimental results, the marker was detected and the direction feature point was extracted correctly during the autonomous flight. However, when the spatial relation between the marker and the RC helicopter was unsuitable, the detection of position and posture became unstable and the autonomous flight failed. We will improve the performance of the autonomous flight control of the RC helicopter by using more stable feature point detection and position estimation.

7. Conclusion
This chapter described autonomous flight control for a micro RC helicopter flying indoors. It is based on three-dimensional measurement using a micro wireless camera attached to the micro RC helicopter and a circular marker put on the ground. First, a simple method of measuring the position and posture of the micro RC helicopter was proposed. With this method, if the wireless camera attached to the RC helicopter takes an image of the circular marker, the major axis and the minor axis of the circular marker image can be acquired.


Fig. 26. The experiment of autonomous flight (Time 1 to Time 4)

Because the circular marker has a cut part, the direction of the circular marker image can be acquired by extracting the cut part as a direction feature point of the circular marker. Therefore, the relation between the circular marker image and the actual circular marker can be acquired by a coordinate transformation using the above data. In this way, the three-dimensional position and posture of the micro RC helicopter can be acquired with image processing and a weak perspective projection. We then designed a flight control system which performs fuzzy control based on the three-dimensional position and posture of the micro RC helicopter; the micro RC helicopter is controlled by tracking the circular marker with its direction feature point during flight. In order to confirm the effectiveness of the proposed method, the position and posture were calculated in an experiment using an image photographed with a wireless camera fixed at a known three-dimensional position. The experimental results confirmed that the calculated values were close to the actually measured values. An autonomous flight control experiment was also performed to confirm whether the proposed image input method is effective when using a micro wireless camera attached to the micro RC helicopter. The results of the autonomous flight control experiment showed that the marker was detected in real time during the flight, and it was confirmed that autonomous flight of the micro RC helicopter is possible. However, when the spatial relation between the marker and the RC helicopter was unsuitable, the detection of position and posture became unstable and the autonomous flight failed. We will improve the system so that the autonomous flight control of the RC helicopter becomes more stable.

8. References
Amida, O.; Kanade, T. & Miller, J.R. (1998). Vision-Based Autonomous Helicopter Research at Carnegie Mellon Robotics Institute 1991-1997, American Helicopter Society International Conf., Heli, Japan.
Bao, Y.; Takayuki, N. & Akasaka, H. (2003). Weak Perspective Projection Invariant Pattern Recognition without Gravity Center Calculation, Journal of IIEEJ, Vol.32, No.5, pp.659-666.
Bao, Y. & Komiya, M. (2008). An improvement Moravec Operator for rotated image, Proc. of the ADVANTY 2008 Symposium, pp.133-138.
Harris, C. & Stephens, M. (1988). A Combined Corner and Edge Detector, Proc. 4th Alvey Vision Conf., pp.147-151.
Nakamura, S.; Kataoka, K. & Sugeno, M. (2001). A Study on Autonomous Landing of an Unmanned Helicopter Using Active Vision and GPS, J.RSJ, Vol.18, No.2, pp.252-260.
Neumann, U. & You, S. (1999). Natural Feature Tracking for Augmented-reality, IEEE Transactions on Multimedia, Vol.1, No.1, pp.53-64.
Ohtake, H.; Iimura, K. & Tanaka, K. (2009). Fuzzy Control of Micro RC Helicopter with Coaxial Counter-rotating Blades, Journal of Japan Society for Fuzzy Theory and Intelligent Informatics, Vol.21, No.1, pp.100-106.
Schmid, C.; Mohr, R. & Bauckhage, C. (1998). Comparing and Evaluating Interest Points, Proc. 6th Int. Conf. on Computer Vision, pp.230-235.
Shi, J. & Tomasi, C. (1994). Good Features to Track, Proc. IEEE Conf. Comput. Vision Patt. Recogn., pp.593-600.
Smith, S. M. & Brady, J. M. (1997). SUSAN - A New Approach to Low Level Image Processing, Int. J. Comput. Vis., Vol.23, No.1, pp.45-78.
Sugeno, M. et al. (1996). Intelligent Control of an Unmanned Helicopter Based on Fuzzy Logic, Proc. of American Helicopter Society 51st Annual Forum, Texas.
Tanaka, K. (1994). Advanced Fuzzy Control, Kyoritsu Shuppan Co., Ltd., Japan.
Wang, G.; Fujiwara, N. & Bao, Y. (1997). Automatic Guidance of Vehicle Using Fuzzy Control (1st Report): Identification of General Fuzzy Steering Model and Automatic Guidance of Cars, Systems, Control and Information, Vol.10, No.9, pp.470-479.

