
California University of Pennsylvania
Department of Applied Engineering and Technology


CET 492: Senior Project II


X-Fly
User Manual


Design by: Ethan Plummer, Nicholas Reich, Robert Alex Kibler
Professor: Dr. Weifeng Chen

Instructor Comments and Evaluation
Table of Contents

1 Introduction...................................................................................................................................1
  1.1 Community / Social Implications..........................................................................................2
  1.2 Motivation..............................................................................................................................2
2 Project Overview..........................................................................................................................3
  2.1 Component Overview............................................................................................................3
    2.1.1 Flight Control Board (FCB)............................................................................................3
    2.1.2 Transmitter......................................................................................................................4
    2.1.3 Receiver..........................................................................................................................5
    2.1.4 Frame..............................................................................................................................6
    2.1.5 Battery.............................................................................................................................7
    2.1.6 Motors.............................................................................................................................8
    2.1.7 Electronic Speed Controllers (ESCs)..............................................................................9
    2.1.8 Propellers......................................................................................................................10
    2.1.9 Raspberry Pi..................................................................................................................11
    2.1.10 Raspberry Pi Camera Board Module..........................................................................12
    2.1.11 Motor Mounts.............................................................................................................13
    2.1.12 Camera Gimbal Socket...............................................................................................14
    2.1.13 Camera Gimbal...........................................................................................................15
    2.1.14 Multiplexer..................................................................................................................16
    2.1.15 Microcontroller...........................................................................................................17
    2.1.16 Auxiliary Battery Pack................................................................................................18
  2.2 Software Components..........................................................................................................19
    2.2.1 Vision Tracking.............................................................................................................19
    2.2.2 Software/Hardware Control..........................................................................................20
3 Project Implementation...............................................................................................................21
  3.1 Implementation Details........................................................................................................21
    3.1.1 Building the Quadcopter...............................................................................................21
    3.1.2 Software Implementation..............................................................................................23
    3.1.3 Challenges during Implementation...............................................................................25
    3.1.4 Difference from Design Document...............................................................................27
  3.2 System Diagrams.................................................................................................................28
    3.2.1 Software Block Diagrams.............................................................................................28
    3.2.2 Hardware Block Diagrams............................................................................................29
    3.2.3 Schematics....................................................................................................................30
4 Use of Software Engineering Principles.....................................................................................31
5 User Manual................................................................................................................................32
  5.1 Setting up X-Fly...................................................................................................................32
    5.1.1 Charging the batteries...................................................................................................32
    5.1.2 Powering on..................................................................................................................32
    5.1.3 Ready to fly...................................................................................................................32
  5.2 Getting in Position...............................................................................................................32
    5.2.1 Basic flight controls......................................................................................................32
    5.2.2 Switching to semi-autonomous mode...........................................................................33
  5.3 Landing (safely)...................................................................................................................33
  5.4 How to Obtain Help.............................................................................................................33
Appendix A: Team Details.............................................................................................................34
  Ethan Plummer...........................................................................................................................34
  Nicholas Reich...........................................................................................................................36
  R. Alex Kibler............................................................................................................................37
Appendix B: Work Flow Authentication.......................................................................................38
Appendix C: References................................................................................................................39
Appendix D: Code Listings...........................................................................................................40

Table of Figures

Figure 1: KK2.0 Multi-Rotor Control Board...................................................................................3
Figure 2: Futaba 10CG 10-Channel 2.4GHz Computer System......................................................4
Figure 3: R6014HS Receiver...........................................................................................................5
Figure 4: Hobbyking H4 Copter.......................................................................................................6
Figure 5: Turnigy nano-tech 3300mAh 3S 25~50C Lipo Pack........................................................7
Figure 6: Turnigy D2836/8 1100KV Brushless Outrunner Motor...................................................8
Figure 7: HobbyKing 20A BlueSeries Brushless Speed Controller.................................................9
Figure 8: 10x4.5 Carbon Fiber Propellers 1pc Standard/1pc RH Rotation....................................10
Figure 9: Raspberry Pi Model B.....................................................................................................11
Figure 10: Raspberry Pi Camera Board Module............................................................................12
Figure 11: 3D Printed Motor Mount/Leg.......................................................................................13
Figure 12: 3D Printed Camera Gimbal Socket...............................................................................14
Figure 13: 3D Printed Camera Gimbal...........................................................................................15
Figure 14: 74HC157N Quad 2-input Multiplexer..........................................................................16
Figure 15: MC9S08QD4CPC-ND Embedded Microcontroller.....................................................17
Figure 16: Auxiliary Battery Pack..................................................................................................18
Figure 17: Scope shot of the transmitter........................................................................................25
Figure 18: Scope shot of the Raspberry Pi.....................................................................................25
Figure 19: Software Block Diagram..............................................................................................28
Figure 20: Hardware Block Diagram.............................................................................................29
Figure 21: Schematics....................................................................................................................30
1 Introduction
Extreme sports athletes (snowboarders, for example) face a problem when it comes to recording footage of their runs: in order to have their activities filmed, they need an equally skilled camera operator to follow them and record. This comes with the risk of damaging expensive film equipment and injuring the camera operator. Yet to gain exposure, these activities need to be captured on film and up close, and athletes at the top of their skill can be difficult to keep up with and record.

The X-Fly system is one solution to this problem. X-Fly is a quad propeller-driven multicopter, also known as a quadcopter. It carries an intelligent camera system that can detect a target from large distances away. In its current state, the target wears a specific color (which can be modified by the operator) and proceeds as normal. The operator of X-Fly raises the quadcopter to the optimal height (roughly three meters above the ground) and aligns it with the target. At this point, the operator flips the switch on the transmitter, switching the X-Fly system into semi-autonomous mode. When this happens, X-Fly becomes autonomous, tilting and rolling to stay locked onto the target. The operator can attach their own camera system to X-Fly to record the target.

One camera system that works excellently with X-Fly is a GoPro HERO camera module with the Wi-Fi BacPac. When attached using the provided mount, the GoPro can be set up as a wireless access point to which the operator can connect with a smartphone or computer. At this point, the operator can see in real time what X-Fly is seeing. This can also be used in conjunction with a media server to live-stream the action as it happens, something that would be very difficult to do with a handheld camera system.

When the target has completed their run, the operator of X-Fly can release the semi-autonomous switch, returning the quadcopter to manual mode. At this point, he or she can either land X-Fly where it currently is or return it to the starting position and land it there. Overall, the X-Fly system is designed for ease of use and affordability. By allowing the user to provide their own camera, X-Fly can potentially save thousands of dollars over commercial film setups, while also being the only autonomous filming quadcopter on the market.

If X-Fly were to go to mass market, it could see massive popularity among skiers and snowboarders, as well as skateboarders, surfers, bicyclists, and even runners. This would dramatically lower the cost of professional filming equipment, as well as the cost of professional videographers, as demand would decrease.

1.1 Community / Social Implications
X-Fly has the potential to vastly increase the quality and number of professional-quality recordings of extreme sports events, as well as to reduce the number of accidents by reducing the number of people needed to film an event. Fewer accidents would in turn reduce healthcare costs for extreme sports athletes and, eventually, for the rest of the country.

1.2 Motivation
Our motivation for this project was our interest in robotics and control systems. We discussed multiple similar projects, such as an electric, computer-controlled skateboard system somewhat like a Segway, but decided on X-Fly because it was a much more interesting and challenging idea to us. The problem was also very similar to the CET 360 smart car project, and we wanted to build on that.

2 Project Overview

2.1 Component Overview
X-Fly consists of three major components: the quadcopter itself, the transmitter, and the onboard computer. The quadcopter itself is made up of many parts, among them the frame, the motor mounts, the motors, the propellers, the flight control board, the electronic speed controllers, the receiver, and the battery.

2.1.1 Flight Control Board (FCB)
The flight control board is the KK2.0 Multi-Rotor Control Board from HobbyKing. It acts as the main flight computer for the quadcopter. As shown in Figure 1, it has an accelerometer to detect acceleration, two gyros to measure orientation, an Atmel ATmega324PA microprocessor, an LCD display, four buttons, a piezo output (speaker), five inputs, and support for up to eight outputs (motors). For the purposes of the X-Fly project, we were most concerned with the first four inputs (AIL for aileron, ELE for elevator, THR for throttle, and RUD for rudder) and M1 through M4 for our four motors.

Figure 1: KK2.0 Multi-Rotor Control Board
2.1.2 Transmitter
For the transmitter, we are using the Futaba 10CG 10-Channel 2.4GHz Computer System Transmitter. It has a seven-point throttle and pitch curve, real-time response, smart switch technology, a 160x72 backlit LCD, a servo monitor display, hover throttle, a gyro menu, and many other features, all described on the website linked in the references. One of the important features we are using is the dead-man's switch on the upper left corner for autonomous mode. When the operator pulls and holds the switch, X-Fly becomes semi-autonomous, but as soon as the operator lets go of the switch, it goes back to manual mode. Another feature that is useful to us is that the throttle stick (left) holds its position rather than springing back to the bottom when the user lets go.

Figure 2: Futaba 10CG 10-Channel 2.4GHz Computer System
2.1.3 Receiver
The receiver we are using to capture the transmissions from the transmitter is the R6014HS Receiver, also from Futaba. As seen in Figure 3 below, one of the most useful features of this receiver is the Link button. With a single press of the button, it links with the transmitter's unique signal, eliminating the worry of interference from other controllers that might be in use. It also includes a battery failsafe with an auto cut-off to protect the quadcopter. It weighs 21.5g. Both the receiver and the transmitter are owned by the university and were lent to us by Professor Sumey.

Figure 3: R6014HS Receiver
2.1.4 Frame
For the quadcopter frame, we chose the Hobbyking H4 Copter frame. As seen below in Figure 4, and as its name suggests, the H4 Copter is an H-shaped quadcopter frame, which makes it better suited for video capture. In a standard X-shaped quadcopter, the arms can get in the way of cameras. The H4 solves this by keeping the arms in an H-shaped configuration, perpendicular to the frame, keeping them out of the view of the camera. The H4 is an extremely lightweight (453g) frame made of aluminum and glass fiber. We decided from the very beginning to get the lightest frame we could because we knew that X-Fly would be on the heavy side after adding all of our extra hardware.

Figure 4: Hobbyking H4 Copter
2.1.5 Battery
For the main battery (which powers the FCB, the ESCs, and the motors), we decided to go with the Turnigy nano-tech 3300mAh 3S 25~50C Lipo Pack from HobbyKing. It is a three-cell 11.1V lithium polymer battery with a 25C constant / 50C burst discharge rating. It weighs 199g and has a capacity of 3300mAh. Originally we had picked a battery with a much higher capacity, but it added so much weight that we decided it wasn't worth it. The current battery delivers between 10 and 20 minutes of flight time, which is more than enough to record a snowboard run.

Figure 5: Turnigy nano-tech 3300mAh 3S 25~50C Lipo Pack
2.1.6 Motors
For our motors, we chose the Turnigy D2836/8 1100KV Brushless Outrunner Motor. After completing many different thrust calculations, we found that these motors would provide the optimum amount of thrust for the weight of the X-Fly system. The Turnigy D2836/8 1100KV Brushless Outrunner Motor has a maximum power of 336W and a maximum thrust of 1130g. Throughout the process we had multiple motors fail, but they ended up being exactly what we needed.
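As a rough illustration of the sizing (approximate figures only, not the full calculation set): four motors at a maximum thrust of 1130g each give about 4520g of total available thrust. A common rule of thumb for multicopters is to keep maximum thrust at roughly twice the all-up weight so the craft can hover near half throttle, which puts the practical weight budget for X-Fly, including frame, battery, electronics, and camera, in the neighborhood of 2.2kg.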

Figure 6: Turnigy D2836/8 1100KV Brushless Outrunner Motor
2.1.7 Electronic Speed Controllers (ESCs)
For the speed controllers, we decided to use the HobbyKing 20A BlueSeries Brushless Speed Controller. We picked these because they have an automatic timing setup, weigh only 18g each, and can deliver enough power from our battery to drive the motors. They were also very cheap, at about $10 each.

Figure 7: HobbyKing 20A BlueSeries Brushless Speed Controller
2.1.8 Propellers
Finally, for the propellers, we chose the 10x4.5 Carbon Fiber Propellers (1pc Standard / 1pc RH Rotation) from HobbyKing. We chose these propellers because they are made of high-quality, durable carbon fiber, making them less likely than plastic to break on impact. They weigh 9g each and were fairly cheap.

Figure 8: 10x4.5 Carbon Fiber Propellers 1pc Standard/1pc RH Rotation
2.1.9 Raspberry Pi
For our onboard computer, we decided to go with the highly popular Raspberry Pi. We considered many different options, such as an Arduino, a Gumstix computer, and a ColdFire MCU, but we decided on the Raspberry Pi because of its affordability and its feature set. For $35, the Raspberry Pi comes with 512MB of RAM, an ARM11 processor, two USB ports, Ethernet and HDMI ports, and mounting holes. It can be powered over USB or by batteries. Another reason we chose the Raspberry Pi is that it has its own supported camera (discussed next), which meant we did not have to figure out drivers for third-party cameras. We were also very impressed by the community and the pre-built Linux distributions, which saved countless hours because we were able to boot a pre-existing Linux operating system and do anything we could do on a Linux desktop.

Figure 9: Raspberry Pi Model B
2.1.10 Raspberry Pi Camera Board Module
Next was the camera module. While researching which onboard computer to use, we learned that the Raspberry Pi has its own Camera Board Module for $30, which can record 1080p video at 30fps, more than we needed. This sealed the decision to use the Raspberry Pi and its camera module, which plugs in right between the HDMI and Ethernet ports shown in Figure 9 above.

Figure 10: Raspberry Pi Camera Board Module
2.1.11 Motor Mounts
These 3D printed motor mounts were designed by Ethan to replace the stock legs for the frame. They are extended legs, leaving room to attach the camera on the bottom of X-Fly, and they are wider at the bottom to absorb more of the landing force.

Figure 11: 3D Printed Motor Mount/Leg
2.1.12 Camera Gimbal Socket
This 3D printed socket, in conjunction with the camera gimbal of Figure 13 below, allows us to mount the camera to the bottom of X-Fly and move it around as needed to get the proper angle.

Figure 12: 3D Printed Camera Gimbal Socket
2.1.13 Camera Gimbal
This gimbal was 3D printed and allows the camera cable to feed down through it so the camera can attach to the bottom of X-Fly. This lets us change the camera angle while protecting the cable.

Figure 13: 3D Printed Camera Gimbal
2.1.14 Multiplexer
When we started considering how to switch between autonomous and manual mode, we decided to use the 74HC157N quad 2-input multiplexer. It accepts the outputs from the transmitter and the Raspberry Pi and, depending on whether the switch is flipped, selects which set of outputs to feed to the flight control board.

Figure 14: 74HC157N Quad 2-input Multiplexer
2.1.15 Microcontroller
The MC9S08QD4CPC-ND embedded microcontroller was chosen to capture the pulse width sent from the transmitter so we can determine whether the user wants autonomous or manual mode.

Figure 15: MC9S08QD4CPC-ND Embedded Microcontroller
2.1.16 Auxiliary Battery Pack
This battery pack, loaded with four NiMH AA batteries, powers the Raspberry Pi and the receiver. This separates the computer system from the main power, preventing a total system failure if the main battery dies.

Figure 16: Auxiliary Battery Pack
2.2 Software Components

2.2.1 Vision Tracking
The computer vision portion of this platform is written in a combination of C and C++. This combination was used because some crucial modules were previously written by other developers in C, while the more recent versions of OpenCV are written in C++. Mixing the two languages turned out not to be a problem, because C++ is the successor to C, and the main program was written in C++. The modules written in C are the one that allows the Raspberry Pi camera module to be used as an input capture device and the one that allows any C program to use the input/output pins. The call that OpenCV uses to get an input image would not normally find the Raspberry Pi camera module as an input source, so this library, called RaspiCamCV and written by Emil Valkov, allows the flight program to use the small on-board camera. The other module of code is called RaspberryPi-GPIO and provides a code interface to the GPIO pins.

Now that there is an image to work with on the Raspberry Pi, from the Raspberry Pi camera, image manipulation algorithms can be applied to extract the desired data from it. The simplest approach to any kind of object tracking is to focus on the object's color. This can be done in OpenCV using the cvInRangeS function. After the general target has been identified, it needs to be cleaned up, because at this point it is a rough binary image. To clean up the target, a series of erosions and dilations is applied to the whole image to get a more normalized representation of the target. Next, some sense needs to be made of this blob-like image. A simple way to do this is to reduce it to a simpler, rectangular representation by bounding the contours of the binary image with a rectangle. This provides an adequate amount of data to allow for spatial analysis in three planes of motion.
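The sketch below condenses this pipeline into a few calls using the same legacy OpenCV C interface as the full program in Appendix D. The HSV bounds and the dilate/erode iteration counts are the red-target values from that listing; capture is assumed to be an already-opened RaspiCamCV camera capture.

    // Condensed sketch of the color-tracking pipeline (values from Appendix D).
    IplImage* frame  = raspiCamCvQueryFrame(capture);                       // grab a frame from the Pi camera
    IplImage* hsv    = cvCreateImage(cvGetSize(frame), IPL_DEPTH_8U, 3);
    IplImage* binary = cvCreateImage(cvGetSize(frame), IPL_DEPTH_8U, 1);

    cvCvtColor(frame, hsv, CV_BGR2HSV);                                     // work in HSV color space
    cvInRangeS(hsv, cvScalar(169, 97, 154), cvScalar(185, 255, 255), binary);  // keep only the target color (red)

    cvDilate(binary, binary, NULL, 17);                                     // grow the target blob
    cvErode(binary, binary, NULL, 20);                                      // then shrink it to remove noise

    CvMemStorage* storage  = cvCreateMemStorage(0);
    CvSeq*        contours = NULL;
    cvFindContours(binary, storage, &contours);                             // outline whatever is left
    CvRect target = cvRect(0, 0, 0, 0);
    for (CvSeq* c = contours; c != NULL; c = c->h_next)
        target = cvBoundingRect(c);                                         // box the target for spatial analysis

The position and area of this bounding rectangle are what the control logic later compares against the image center and the desired target size.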

Once the corrective measures have been calculated, a signal needs to be sent to the flight control board (FCB) to adjust the position of the platform in the necessary directions. This is where the ServoBlaster software is used. ServoBlaster allows any GPIO pin on the Raspberry Pi to be used as a PWM pin. Once the pulse width is generated, it is sent to the FCB so that corrective movements can be made. See the data flow diagram for a basic graphical representation of signals and data.
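Concretely, the flight program drives ServoBlaster by writing lines of the form <output number>=<pulse width> to the /dev/servoblaster device file created by the ServoBlaster daemon. The small helper below mirrors the updatePWM() routine in the Appendix D listing; pulse widths are in ServoBlaster's default 10 microsecond steps, so a value of 150 corresponds to a neutral 1.5 ms pulse, and output 7 is the aileron channel in our wiring.

    #include <stdio.h>

    // Send one control channel's pulse width to the ServoBlaster daemon.
    static void set_pulse(FILE *servoblaster, int servo, int pulse)
    {
        fprintf(servoblaster, "%i=%i\n", servo, pulse);
        fflush(servoblaster);                  // push the command out immediately
    }

    int main(void)
    {
        FILE *servoblaster = fopen("/dev/servoblaster", "w");
        if (servoblaster == NULL)
            return 1;                          // daemon not running
        set_pulse(servoblaster, 7, 150);       // center the aileron channel
        fclose(servoblaster);
        return 0;
    }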

2.2.2 Software/Hardware Control
Upon power-up of the system, a number of background scripts are run for the sole use of the operating system on the Raspberry Pi. These do various things, such as monitoring central processing unit and memory usage, and they are crucial to the stable operation of the on-board computer. Alongside the system scripts, additional scripts were needed to enable other software and devices, such as the general purpose input/output (GPIO) ports.

A script is a file used to automatically execute commands that a user would otherwise type into a terminal. For the Raspberry Pi on board the X-Fly to make use of its GPIO pins, which send the maneuvering commands to the flight control board, a script was written to execute the ServoBlaster program as a background daemon during start-up.

ServoBlaster is a program that allows the Raspberry Pi to send out pulse width modulation (PWM) signals via its GPIO pins. Without a script, this program would have to be started by entering its command manually in a terminal. Putting the command into a start-up script makes it easy to have everything needed for flight running as soon as the system boots.

Another program that was put into a script is the one that enables writing to the GPIO pins. It resides in the same directory as ServoBlaster, so the script first changes to that directory and then runs the command that enables other programs to write to the file holding the pin values. These modules are explained in further detail later.

The last script that had to be written was the one that executes the main vision program. This script runs third of the three additional scripts, because the vision program depends on the other two to operate properly. It is equivalent to simply doing a change-directory command followed by a program execution command.

Together, these scripts allow the X-Fly to fly autonomously as soon as power is applied to the system. An added bonus of putting these scripts in the start-up sequence is that logging into the Raspberry Pi is unnecessary.

3 Project Implementation

3.1 Implementation Details

3.1.1 Building the Quadcopter
In September, we started researching what type of vessel would be the most efficient as a person-tracking drone. We considered a model airplane because of price, but decided that a multicopter would be the best choice due to its ability to hover. Next, we had to research which onboard computer would be best for it. As mentioned in Section 2.1.9, we decided that a Raspberry Pi with its camera would work best for our needs. After this, we had to figure out what parts we actually needed to build the quadcopter, because this was something that none of us had ever done before. After researching different frame and motor configurations, we decided on an H frame from HobbyKing (Figure 4 above). According to HobbyKing's listing for the frame, an H-shaped frame keeps the arms away from the camera, allowing the camera a much wider field of view without anything getting in the way, which was obviously optimal for us. Next, we had to decide on the rest of the hardware. We picked our flight control board because it was cheap and compatible with the transmitter/receiver combo that Professor Sumey let us borrow. We ran calculations based on the mass of all of the components and decided on motors and a battery that would work optimally. All of this was done in early September, and all of the hardware was ordered by the end of the month.

The Raspberry Pi was among the first parts to arrive, in the first week of October. After it arrived, we researched compatible Linux distributions to decide which one to install. We chose Raspbian, a Raspberry Pi-specific version of Debian, for two main reasons: the camera was compatible with it out of the box, and OpenCV had an installation guide for Debian-based systems. We got it installed and tested the camera. Next was to install OpenCV on one of our computers. This took about two weeks and was finally finished on Ethan's computer on October 17th. We encountered many errors during the OpenCV installs, and sometimes they didn't occur until hours into the installation, so we wouldn't find out until the following morning. The next batch of hardware arrived on the 17th as well. This shipment came with the wires, connectors, battery monitor, and the receiver. Most of the quadcopter was wired on the 18th, including adding a bullet connector to the power distribution block. We secured the ESCs to the frame using zip ties. We attempted cloning Ethan's computer so we could each have a completed install of OpenCV, but we ended up abandoning this idea and working directly on the Raspberry Pis. On the 26th we got the rest of the hardware and got OpenCV installed on the Raspberry Pi.

On the 18th of November, we attempted the first test flight, but there was a problem. We had all four ESCs wired to the power distribution block, where only one needed to be. This caused a short, and all of the ESCs were destroyed. We ordered more and waited about a week for them to arrive. Things got busy with the semester coming to a close, but we completed our first test flight on December 23rd, 2013. It was completely unflyable. During the week of February 10th, we did a factory reset on the flight control board (Professor Sumey suggested that we had tweaked the PID values so much that we were beyond the point of no return), and this seemed to fix it. We also removed the stock legs from the H frame and 3D printed new legs to sit on the corners, bringing the center of balance out towards the edges and making it much more stable. During the week of March 10th, we got the power system completely finished. We isolated the Raspberry Pi and receiver from the main power supply so that if the main battery died, the Pi and the receiver would still be running. At this point, we printed a new set of legs for the quadcopter because a rough landing had broken two of them. The new revision has a wider base and is able to take more of an impact.

3.1.2 Software Implementation
The software implementation started back in September, when we first discussed what software we would use for our vision tracking. Ethan met with a former coworker who works for a robotics company, and he suggested that we look into OpenCV. We researched OpenCV extensively and decided that it would work excellently for us. OpenCV is a BSD-licensed library of functions mainly used for live video. It supports functions such as drawing on videos as they stream, as well as doing math on things that are happening in the videos, which is the part that interested us. By mid-December, a program was written that could read in video from a USB webcam and filter out all but a specified color. This was step one of our vision tracking program. By the end of February, we had a version of the program where we could specify a color and the program would draw boxes around any objects of that color. By mid-March, we had a version that could calculate the centroid of the object and determine how far away the quadcopter was from where it wanted to be. It could use this information to determine what pulse widths to send to ServoBlaster. ServoBlaster is an open-source library written by Richard Ghirst and published on GitHub. It provides an interface to output PWM signals on multiple GPIO pins of the Raspberry Pi, which is not a feature native to the Pi. We also used a library called RaspberryPi-GPIO, written by Alan Barr, to interface with the GPIO pins.

By the end of February, there were two separate programs: the vision tracker and a hardware controller. One could identify the target, and the other could control the quadcopter, but the two needed to be merged. This was completed by mid-March. The vision tracker identifies an object based on a pre-defined color, determines what it needs to do to get that object to the center of its field of vision, and then sends the command to the hardware-controlling part of the software, which calculates what PWM values to send to the GPIO pins. Figure 17 below shows what an oscilloscope displays when a signal is sent via the transmitter. Figure 18 shows what the scope sees when a signal is sent out from the Raspberry Pi. In both shots, channel 1 is aileron, channel 2 is elevator, channel 3 is throttle, and channel 4 is rudder.

Figure 17: Scope shot of the transmitter

Figure 18: Scope shot of the Raspberry Pi
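As a simplified illustration of how the merged program turns a tracking result into control pulses, the roll-channel logic alone looks roughly like the sketch below. The complete version is the DirectionalCalc() routine in Appendix D; SteerRoll() is just an illustrative wrapper name, the 80-pixel dead band and the 140 to 160 pulse limits are the values from that listing, and the latching flag used in the full version is omitted here for brevity.

    // ail_pwm, ail_pin, CrossX, and updatePWM() are the globals and helper
    // defined in the Appendix D listing. CenterMass is the target centroid;
    // CrossX is the image center column; pulse widths are in 10 us steps.
    void SteerRoll(CvPoint CenterMass)
    {
        if (CenterMass.x > CrossX + 80)            // centroid outside the dead band on one side
        {
            ail_pwm--;
            if (ail_pwm < 140) ail_pwm = 150;      // clamp, then re-center the command
            updatePWM(ail_pin, ail_pwm);
        }
        else if (CenterMass.x < CrossX - 80)       // centroid outside the dead band on the other side
        {
            ail_pwm++;
            if (ail_pwm > 160) ail_pwm = 150;
            updatePWM(ail_pin, ail_pwm);
        }
        else                                       // target near center: hold neutral roll
        {
            ail_pwm = 150;
            updatePWM(ail_pin, ail_pwm);
        }
    }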

3.1.3 Challenges during Implementation
We faced many challenges during the implementation of the X-Fly system. Our first major challenge, as described above, was determining which onboard computer to use. We had many different options, but we chose the Raspberry Pi, which ended up working out well for us.

Our next challenge was determining how to filter out our target. We were deciding between RoboRealm and OpenCV, and settled on OpenCV because of its open-source license and ease of use. Once we decided to use OpenCV, we had to get it installed. This took almost a month just to accomplish on a single computer: each installation attempt took about two hours to run, and errors usually didn't occur until the end of the process. Once we got one installation to complete, it was easy to get OpenCV installed on the Raspberry Pi.

Our next challenge was getting the quadcopter off the ground. After we wired everything up and turned on the quadcopter, the ESCs all failed because of a miswiring. We ordered new ones and rewired it. Everything was assembled and we attempted our first flight, but it was extremely unstable. After many hours of tinkering with the flight control board, we were able to achieve a somewhat stable flight, which was tweaked over the following weeks.

Once the quadcopter was able to fly manually, we started experimenting with software-controlled flight using ServoBlaster. This worked well for a short time, but then we suffered a crash that broke two propellers.

Along the way, we suffered multiple other failures as well. At one point, one of the motors failed, so we had to order more to replace it. We waited over a month for the replacements, so the quadcopter was grounded the entire time.

The control program and the vision software were written separately, which caused multiple problems. The control program was written in C and the vision software in C++, so the two would not compile and link together cleanly. This was fixed by forcing the C libraries to be compiled as C code rather than allowing the compiler to treat them as C++.
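In practice the fix is the extern "C" block at the top of the main program (see Appendix D), which tells the C++ compiler not to apply C++ name mangling to the C headers it wraps:

    // C headers included from a C++ program without C++ name mangling,
    // so the linker can find the C library functions.
    extern "C"
    {
        #include <rpiGpio.h>    // RaspberryPi-GPIO (written in C)
        #include <gpio.h>
    }

    #include <opencv2/opencv.hpp>   // OpenCV itself is used through its C++ headers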

3.1.4 Difference from Design Document
There are multiple differences from the design document. The biggest one is that in the design document, we said that at run time the user would decide whether X-Fly would launch in autonomous mode or manual mode. It now always starts out in manual mode and can then be switched to autonomous mode. This is preferable because it allows the user to take control if anything goes wrong. It also means that it wasn't necessary to write take-off and landing software.

Another large difference is that there is no longer an onboard altimeter as specified in the design document. We originally said that when X-Fly reached a certain altitude (provided by the altimeter), it would automatically start recording video. Since video recording is now done by a standalone camera provided by the user, this was no longer feasible and was removed.
3.2 System Diagrams

3.2.1 Software Block Diagrams

Figure 19: Software Block Diagram

3.2.2 Hardware Block Diagrams

Figure 20: Hardware Block Diagram

3.2.3 Schematics

Figure 21: Schematics
4 Use of Software Engineering Principles
X-Fly was a strong example of properly applied software engineering principles. During the development and deployment of X-Fly, we followed the software development process. The first phase was the requirements phase, during which we determined what was required of the project and organized the ideas more tightly. The next phase was specification. During the specification phase, we determined what components were needed to build the quadcopter and complete the system, and we created a specification document to show what the system was supposed to do when completed. We also created block diagrams and flow charts during this period. During the next phase, we broke the project down into sections (hardware, software, and documentation) to ensure that work was split up efficiently and fairly. During the implementation phase, we referred back to the previous documents as we built the quadcopter and started programming it.

During the entire process, all of us worked together to design a system that would meet all of the requirements of the CET program, and it was designed with those goals in mind. We wrote the software with maintainability in mind: it is extremely modular and could be used on a variety of different systems.
5 User Manual

5.1 Setting up X-Fly

5.1.1 Charging the batteries
Before X-Fly can take off, all three sets of batteries must be charged. The largest battery (Figure 5 above) needs to be charged with the supplied charger; see the LiPo Charging Guide linked in the reference section. The second battery is the Auxiliary Battery Pack (Figure 16), a set of four AA batteries that can be rechargeable or not. Finally, the transmitter (Figure 2) needs to be charged on its supplied charger.

5.1.2 Powering on
To power on X-Fly, insert the AA batteries into the auxiliary pack and attach the cover. The Raspberry Pi should power on, indicated by its PWR light turning on. Next, insert the LiPo battery into its harness and plug it in. The flight control board should come on and beep. Finally, turn on the transmitter by flipping the power switch to the on position.

5.1.3 Ready to fly
Arm the flight control board by moving the right stick to the right with zero throttle. It will beep and the LED will turn on. For safety reasons, do not do this until you are at least 5 meters away. X-Fly is now ready to fly.

5.2 Getting in Position

5.2.1 Basic flight controls
The left stick controls the throttle. Throttle is increased by raising the stick. There is no spring in the throttle stick, so it will maintain throttle unless you change it. Moving the throttle stick left and right rotates (yaws) the quadcopter. The right stick controls pitch and roll. Moving the stick forward pitches X-Fly forward, meaning it will start to fly forward; moving the stick back makes it fly backward. Moving the stick left and right makes it fly left and right.

5.2.2 Switching to semi-autonomous mode
Once X-Fly is behind the target and about ten feet above the ground, it can be switched to semi-autonomous mode. Leave the throttle stick in its position and pull and flip the "E" switch. At this point, X-Fly takes over in autonomous mode and starts following the target.

5.3 Landing (safely)
After you are done flying, flip the "E" switch back to off. This switches X-Fly back to manual mode. Since the throttle stick is still in its position, it will maintain height. At this point, slowly lower the throttle stick to land the X-Fly.

5.4 How to Obtain Help
Help can be obtained by emailing Ethan Plummer (plu5937@calu.edu), Nick Reich (rei8749@calu.edu), or Alex Kibler (kib9362@calu.edu).
Appendix A: Team Details

Ethan Plummer
Ethan Plummer is currently a senior Computer Engineering Technology major at California University of Pennsylvania. He is also pursuing an associate's degree in Robotics Engineering Technology. His study interests include automation, embedded systems, and artificial intelligence. During his time off from school he interns at the National Robotics Engineering Center in Pittsburgh. His plans after graduation include applying to graduate school to further pursue his study interests.

For the X-Fly production team, he served as the Project Manager and Lead Design Engineer. He was chosen to be Project Manager because of his experience leading various projects, most notably leading Team Cache Money to a second-place finish in the CET 360 (Microprocessor Engineering) smart car class race.

As Project Manager, he made sure that work was completed on schedule. This included making sure that parts were ordered, paid for, and shipped in a timely manner. He also delegated project responsibilities based on team members' skills.

As Lead Design Engineer, he took responsibility for the overall design of the project. This included deciding the necessary features and functions of the project; for example, the frame of the quad was chosen because of the need for ample space to place components. He was also responsible for the design and production of the custom 3D printed parts. On the programming side, he was responsible for researching, designing, and writing the vision-based algorithms. After the vision program was written and working, he was responsible for compiling all of the libraries related to the project into an executable. He also wrote the start-up scripts that enable the complementary processes to execute.

Nicholas Reich
Nick Reich is currently a senior Computer Engineering Technology student at California University of PA. He will be graduating in May 2014. He was originally a Computer Science major, but decided to take his love for programming to the next level by switching to CET. During the X-Fly project, Nick served as the Control Systems Engineer. He was chosen for this role because of his experience with embedded programming, including I/O peripherals and motor control.

As Control Systems Engineer, his role was split into two parts: hardware and software. The hardware connections needed to be completed before any attempt could be made to control the X-Fly through software. Hardware design and testing was completed using oscilloscopes, digital multimeters, and component soldering. Proper voltage levels at certain points of the circuit, along with accurate control pulses, needed to be precise and reliable for this project, or the consequences could be severe.

On the software side of the project, Nick used his embedded programming experience to obtain proper digital pulse width outputs from the Raspberry Pi. He was able to make the Raspberry Pi appear to the flight control board as a common RC transmitter by imitating the PWM outputs of our transmitter. Testing of the software design was once again accomplished using an oscilloscope to ensure proper pulses were being sent to the motors. After this control software was merged with Ethan's vision software, they tested and debugged together to obtain proper object tracking.

R. Alex Kibler
Alex Kibler is a senior Computer Engineering Technology major at Cal U. He is currently interning at US Steel as a Process Control Engineer, and he has accepted a full-time position pending graduation.

Alex contributed to the project in multiple ways. His largest contribution was as the documentation specialist. He hosts the web server for the project website, designed the entire website, and wrote the software to manage the media on the server. He also wrote all of the weekly reports and PowerPoint presentations.

Alex suggested the idea of using the Raspberry Pi as an onboard computer because he had experience working with embedded Unix systems and knew that it would be powerful enough for the project while also being very affordable. He set up the VNC server on the Raspberry Pi and worked with other low-level Unix systems on it. He also worked as a software support engineer, providing support to Nick and Ethan as they encountered issues.

Alex was also the photographer for the group.
Appendix B: Work Flow Authentication

I, Ethan R Plummer, hereby attest that I have performed the work as documented herein.
Signature: ______________________________    Date: ______________

I, Nicholas Reich, hereby attest that I have performed the work as documented herein.
Signature: ______________________________    Date: ______________

I, Robert Alex Kibler, hereby attest that I have performed the work as documented herein.
Signature: ______________________________    Date: ______________
Appendix C: References

ServoBlaster repository: https://github.com/richardghirst/PiBits/tree/master/ServoBlaster
RaspberryPi-GPIO repository: https://github.com/alanbarr/RaspberryPi-GPIO
OpenCV: http://opencv.org/
Raspberry Pi StackExchange: http://raspberrypi.stackexchange.com/
Using the Raspberry Pi Camera with OpenCV: https://robidouille.wordpress.com/2013/10/19/raspberry-pi-camera-with-opencv/
RaspiCamCV repository: https://github.com/robidouille/robidouille/blob/master/raspicam_cv/README
KK2.0 Manual: http://www.hobbyking.com/hobbyking/store/uploads/181270330X7478X47.pdf
LiPo Charging Guide: http://www.rcgroups.com/forums/showthread.php?t=209187
Appendix D: Code Listings

// Ethan Plummer, Nick Reich, Alex Kibler
// April 2014
// Target Recognition Software/Motor PWM Control
// X-fly Project

//----- Special Thanks ----------------
// Emil Valkov for assistance with raspicam image input
// GoodFeaturesToTrack reference from Kristi Tsukida
// BoundedRectangle help from Sergi Pons Freixes and Jean-Pierre Landary
// opencv-users @ nabble.com forums
// OpenCV Documentation
// Stack Overflow forums

#include "highgui.h"
#include "cv.h"
#include "RaspiCamCV.h"
#include <opencv2/core/core.hpp>
#include <opencv2/opencv.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <iostream>
#include <opencv2/imgproc/imgproc.hpp>
#include "stdio.h"
#include "math.h"

extern "C"
{
    #include <string.h>
    #include <sys/time.h>
    #include <termios.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <rpiGpio.h>
    #include <gpio.h>
}

// servoblaster -> GPIO pin #. Change if you are not connecting ail to GPIO 25,
// ele to GPIO 24, thr to GPIO 23, rud to GPIO 18, and aux to GPIO 22.
#define ail_pin 7
#define ele_pin 6
#define thr_pin 5
#define rud_pin 2
#define aux_pin 22

// Change scalar format for simplicity
#define CV_HSV(h,s,v) cvScalar((h),(s),(v))

// For opening /dev/servoblaster
FILE *fp;

using namespace cv;
using namespace std;

// Initial pin PWMs
int ail_pwm = 150;
int ele_pwm = 150;
int thr_pwm = 151;
int rud_pwm = 150;
int ail_flag = 0;

// Targeting objects
CvMemStorage* contour_storage = NULL;
CvSeq* contours = 0;
CvRect edges;
CvFont font;
char Target[10];
int PtX;
int PtY;
int CrossX = 160;
int CrossY = 120;
CvPoint CenterMass = cvPoint(PtX, PtY);
CvPoint Center = cvPoint(CrossX, CrossY);

// Colors
CvScalar White = CV_RGB(255,255,255);
CvScalar Black = CV_RGB(0,0,0);
CvScalar Blue  = CV_RGB(0,0,255);
CvScalar Red   = CV_RGB(255,0,0);
CvScalar Teal  = CV_RGB(0,255,255);

// Filter variables
// Red
int Lhue = 169;
int Lsat = 97;
int Lval = 154;
int Hhue = 185;
int Hsat = 255;
int Hval = 255;

/* // Blue
int Lhue = 105;
int Lsat = 97;
int Lval = 74;
int Hhue = 168;
int Hsat = 255;
int Hval = 255;
*/
CvScalar LowerBound = CV_HSV(Lhue,Lsat,Lval);
CvScalar UpperBound = CV_HSV(Hhue,Hsat,Hval);

// Shi-Tomasi variables
const int NumOfCorners = 17;
CvPoint2D32f IntrestCorners[NumOfCorners] = {0};
int CornerCount = NumOfCorners;
double CornerQuality = .05;
double MinDistance = 5;
int EigBlockSize = 3;
int UseHarris = false;
int Radius = 2;

// Control window variables
const int slider_max = 256;
int xDilate = 17, xErode = 20;
char TrackbarName[10];

//_________________________________ Functions _________________________________
//_____________________________ Quad Control Functions ________________________

/* initPWM(int,int,int,int)
   in:  none
   out: none
   This function will initialize all 4 controls to their
   neutral positions.
*/
void initPWM(int ail, int ele, int thr, int rud) {
    fprintf(fp, "%i=%i\n", ail_pin, ail);
    fflush(fp);
    fprintf(fp, "%i=%i\n", ele_pin, ele);
    fflush(fp);
    fprintf(fp, "%i=%i\n", thr_pin, thr);
    fflush(fp);
    fprintf(fp, "%i=%i\n", rud_pin, rud);
    fflush(fp);
} // initPWM()

/* updatePWM(int,int)
   in:  control to be updated & its pwm
   out: none
   This function will accept which motor control needs
   updated (ail, ele, thr, rud) and send it to servoblaster.
*/
void updatePWM(int control, int pulse) {
    fprintf(fp, "%i=%i\n", control, pulse);
    fflush(fp);
} // updatePWM()

//____________________________ OpenCV Vision Functions _________________________

// Refresh Trackbar
void on_trackbar(int, void*)
{
}

/*
// Show Control Window
void trackbar(int, void*)
{
    namedWindow("Thresholds", 23);              // Make a set of trackbars to tune image
    sprintf(TrackbarName, "Lower Hue ");
    createTrackbar(TrackbarName, "Thresholds", &Lhue, slider_max, on_trackbar);
    sprintf(TrackbarName, "Upper Hue ");
    createTrackbar(TrackbarName, "Thresholds", &Hhue, slider_max, on_trackbar);
    sprintf(TrackbarName, "Lower Saturation");
    createTrackbar(TrackbarName, "Thresholds", &Lsat, slider_max, on_trackbar);
    sprintf(TrackbarName, "Upper Saturation");
    createTrackbar(TrackbarName, "Thresholds", &Hsat, slider_max, on_trackbar);
    sprintf(TrackbarName, "Lower Value ");
    createTrackbar(TrackbarName, "Thresholds", &Lval, slider_max, on_trackbar);
    sprintf(TrackbarName, "Upper Value ");
    createTrackbar(TrackbarName, "Thresholds", &Hval, slider_max, on_trackbar);
    imshow;

    namedWindow("Thresholds", 23);              // Make trackbars to tune image
//  sprintf(TrackbarName, "Erode ", slider_max);
    createTrackbar(TrackbarName, "Thresholds", &xErode, slider_max, on_trackbar);
//  sprintf(TrackbarName, "Dilate ", slider_max);
    createTrackbar(TrackbarName, "Thresholds", &xDilate, slider_max, on_trackbar);
    imshow;
}
*/

// Threshold Image
IplImage* ThresholdImage(IplImage* HSVImg)
{
    IplImage* ThreshImg = cvCreateImage(cvGetSize(HSVImg), IPL_DEPTH_8U, 1);
    // Turn HSV image into a binary image by filtering out all but red
    cvInRangeS(HSVImg, /*LowerBound, UpperBound,*/ cvScalar(Lhue,Lsat,Lval),
               cvScalar(Hhue,Hsat,Hval), ThreshImg);
    // Don't need this anymore
    cvReleaseImage(&HSVImg);
    return ThreshImg;
    // Clean up left-overs
    cvReleaseImage(&ThreshImg);
}

// Filter Noise
IplImage* FilterNoise(IplImage* ThreshImg)
{
    IplImage* FiltImg = cvCreateImage(cvGetSize(ThreshImg), IPL_DEPTH_8U, 1);
    // Make desired color more prominent
    cvDilate(ThreshImg, ThreshImg, 0, xDilate);
    cvErode(ThreshImg, FiltImg, 0, xErode);
    cvReleaseImage(&ThreshImg);
    return FiltImg;
    cvReleaseImage(&FiltImg);
}

// Find edges of the binary image
CvRect FindContours(IplImage* FiltImg)
{
    // Set up place for contours of binary image
    if (contour_storage == NULL)
    {
        contour_storage = cvCreateMemStorage(0);
    }
    else
    {
        cvClearMemStorage(contour_storage);
    }
    CvSeq* contours = 0;

    // Locate contours from the filtered image
    cvFindContours(FiltImg, contour_storage, &contours);
    cvZero(FiltImg);
    if (contours)
    {
        // Highlight contours
        cvDrawContours(FiltImg,
                       contours,
                       cvScalarAll(255),
                       cvScalarAll(255),
                       100);
    }
    for (CvSeq* i = contours; i != NULL; i = i->h_next)
    {
        // Set up rect. corners for later
        edges = cvBoundingRect(i);
    }
    // Clean up left-overs
    cvReleaseImage(&FiltImg);
    return edges;
}

// Box and Name Target(s)
IplImage* MarkTarget(IplImage* image, CvRect edges)
{
    int n = 1;
    // Set up text for later
    cvInitFont(&font,
               CV_FONT_HERSHEY_COMPLEX_SMALL,
               .6, .6, 0, 1, 6);

    // Draw rectangle around the target
    cvRectangle(image,
                cvPoint(edges.x, edges.y),
                cvPoint(edges.x+edges.width, edges.y+edges.height),
                Blue);

    sprintf(Target, "Target %d", n);
    n++;
    // Display text on upper left corner of rectangle
    cvPutText(image,
              Target,
              cvPoint(edges.x+edges.width, edges.y),
              &font,
              Black);
    return image;
    // Clean up left-overs
    cvReleaseImage(&image);
}

// Find Centroid of Rect
CvPoint FindCenter(CvRect edges)
{
    int PtX = ((((edges.x+edges.width)-edges.x)/2)+edges.x);
    int PtY = ((((edges.y+edges.height)-edges.y)/2)+edges.y);
    CvPoint CenterMass = cvPoint(PtX, PtY);

//  cout << PtX << "," << PtY << endl;
    return CenterMass;
}

void DirectionalCalc(CvPoint CenterMass, CvPoint Center, CvRect edges)
{
//  if (gpioSetup() != OK)
//  {
//      dbgPrint(DBG_INFO, "gpioSetup failed");
//      return 1;
//      cout << "IO setup failure:" << endl;
//  }

//  eState state;
    // make GPIO22 an input
//  gpioSetFunction(aux_pin, input);
    // read the current state each time around
//  gpioReadPin(aux_pin, &state);

    int Direct = 0;
//  cout << CenterMass.x << "," << CenterMass.y << endl;
//  cout << edges.height * edges.width << endl;

    //if (!state)
//  {
    if (CenterMass.x > CrossX+80)
    {
        // cout << "Go Left" << '\r';
        // Direct = abs(CenterMass.x - CrossX);
        // cout << "----------" << Direct << " Pixels" << endl;
        // printf("Rolling left\n");
        if (!ail_flag)
        {
            ail_pwm--;
            if (ail_pwm < 140)
            {
                ail_flag = 1;
                ail_pwm = 150;
            }
            updatePWM(ail_pin, ail_pwm);
        }
    }
    else if (CenterMass.x < CrossX-80)
    {
        // cout << "Go Right" << '\r';
        // Direct = abs(CenterMass.x - CrossX);
        // cout << "----------" << Direct << " Pixels" << endl;
        if (!ail_flag)
        {
            ail_pwm++;
            if (ail_pwm > 160)
            {
                ail_flag = 1;
                ail_pwm = 150;
            }
            updatePWM(ail_pin, ail_pwm);
        }
    }
    else
    {
        ail_pwm = 150;
        updatePWM(ail_pin, ail_pwm);
        ail_flag = 0;
    }

    if (CenterMass.y > CrossY+50)
    {
        // cout << "Go Up" << '\r';
        // Direct = abs(CenterMass.y - CrossY);
        // cout << "----------" << Direct << " Pixels" << endl;
        thr_pwm++;
        if (thr_pwm > 153)
            thr_pwm = 153;
        //updatePWM(thr_pin, thr_pwm);
    }
    else if (CenterMass.y < CrossY-50)
    {
        // cout << "Go Down" << '\r';
        // Direct = abs(CenterMass.y - CrossY);
        // cout << "----------" << Direct << " Pixels" << endl;
        thr_pwm--;
        if (thr_pwm < 149)
            thr_pwm = 149;
        //updatePWM(thr_pin, thr_pwm);
    }
    else
    {
        thr_pwm = 151;
        //updatePWM(thr_pin, thr_pwm);
    }

    if ((edges.height * edges.width) < 7000)
    {
        // cout << "Go Forward" << '\r';
        ele_pwm--;
        if (ele_pwm < 130)
            ele_pwm = 130;
//      updatePWM(ele_pin, ele_pwm);
    }
    else if ((edges.height * edges.width) > 10000)
    {
        // cout << "Go Backward" << '\r';
        ele_pwm++;
        if (ele_pwm > 170)
            ele_pwm = 170;
//      updatePWM(ele_pin, ele_pwm);
    }
    else
    {
        ele_pwm = 150;
//      updatePWM(ele_pin, ele_pwm);
    }
//  }
}

// Mark Centroid
IplImage* MarkCenterMass(IplImage* image, CvPoint CenterMass)
{
    cvCircle(image, CenterMass, 3, White, -1);
    return image;
    cvReleaseImage(&image);
}

IplImage* MarkCenter(IplImage* image, CvPoint Center)
{
    cvCircle(image, Center, 9, Red, 1);
    cvLine(image, cvPoint(CrossX + 20, CrossY), cvPoint(CrossX + 4, CrossY), Red, 2);
    cvLine(image, cvPoint(CrossX - 20, CrossY), cvPoint(CrossX - 4, CrossY), Red, 2);
    cvLine(image, cvPoint(CrossX, CrossY + 20), cvPoint(CrossX, CrossY + 4), Red, 2);
    cvLine(image, cvPoint(CrossX, CrossY - 20), cvPoint(CrossX, CrossY - 4), Red, 2);
    return image;
    cvReleaseImage(&image);
}

// Use Shi-Tomasi corner detection
IplImage* GetRotationalFeatures(IplImage* GrayImg, IplImage* TempImg,
                                IplImage* image, CvRect edges)
{
    IplImage* OutImg;
    cvGoodFeaturesToTrack(GrayImg,          // Input image
                          OutImg,           // Output image
                          TempImg,          // Temp algorithm workspace
                          IntrestCorners,   // Coordinates of points of interest
                          &CornerCount,     // Number of corners
                          CornerQuality,    // How clean the corners are
                          MinDistance,      // Distance between points
                          0,
                          EigBlockSize,     // Area to apply the eigen values at one time
                          UseHarris);       // Do not use Harris method

    for (int i = 0; i < CornerCount; i++)
    {
        cvCircle(image, cvPoint((int)(IntrestCorners[i].x + edges.x),
                 (int)(IntrestCorners[i].y + edges.y)), Radius, Black);
    }
    cvReleaseImage(&GrayImg);
    return image;
    cvReleaseImage(&image);
}

// Clean up input
void CleanUp(RaspiCamCvCapture* capture)
{
    raspiCamCvReleaseCapture(&capture);
//  cvReleaseCapture(&capture);
    cvDestroyWindow("Live Feed");
}

//____________________________________ Main ____________________________________

int main(int argc, char** argv)
{
    fp = fopen("/dev/servoblaster", "w");                       // open servoblaster
//  trackbar(Lhue, 0);                                          // Call a track bar
//  CvCapture* capture = cvCreateCameraCapture(-1);             // Get image from camera, -1 = default camera
//  CvCapture* capture = cvCreateFileCapture(argv[1]);          // Use video file for debug
    RaspiCamCvCapture* capture = raspiCamCvCreateCameraCapture(0);
    IplImage* image;
    IplImage* HSVImg;
    IplImage* ThreshImg;
    IplImage* FiltImg;
    IplImage* GrayImg;
    IplImage* TempImg;
    CvRect edges;
    initPWM(ail_pwm, ele_pwm, thr_pwm, rud_pwm);

//  for (int i = 0; i < 100;)                                   // Process images forever
    for (;;)
    {
//      double t = getTickCount();
//      if (!image) break;
//      image = cvQueryFrame(capture);                          // Put capture in an object
        image = raspiCamCvQueryFrame(capture);
        cvFlip(image, image, 0);
        HSVImg = cvCreateImage(cvGetSize(image), IPL_DEPTH_8U, 3);  // Make HSVImg same size as image
        cvCvtColor(image, HSVImg, CV_BGR2HSV);                  // Change to HSV color space
        ThreshImg = ThresholdImage(HSVImg);                     // Get filtered image
        FiltImg = FilterNoise(ThreshImg);                       // Focus on target
//      cvShowImage("Targets", ThreshImg);
        edges = FindContours(FiltImg);                          // Find and mark edges of target
        CenterMass = FindCenter(edges);                         // Calculate centroid
        DirectionalCalc(CenterMass, Center, edges);
//      GrayImg = cvCreateImage(cvGetSize(image), IPL_DEPTH_8U, 1);
//      cvCvtColor(image, GrayImg, CV_BGR2GRAY);                // Change to gray color space
//      cvSetImageROI(GrayImg, edges);                          // Set region of interest
//      GetRotationalFeatures(GrayImg, TempImg, image, edges);  // Get interest points
        MarkTarget(image, edges);                               // Draw box around target
        MarkCenterMass(image, CenterMass);                      // Mark centroid
        MarkCenter(image, Center);
//      cvShowImage("Targets", image);                          // Show final image
//      cvReleaseImage(&image);

        char c = cvWaitKey(55);
        if (c == 35) break;
//      t = ((double)getTickCount()-t)/getTickFrequency();
//      cout << t << '\r';
    }                                                           // Clean up leftover images
    CleanUp(capture);
}
