Department of Applied Engineering and Technology

CET 492: Senior Project II

X-Fly
User Manual

Designed by: Ethan Plummer, Nicholas Reich, Robert Alex Kibler
Professor: Dr. Weifeng Chen
Instructor Comments and Evaluation
Table of Contents

1 Introduction
1.2 Motivation
2.1.2 Transmitter
2.1.3 Receiver
2.1.4 Frame
2.1.5 Battery
2.1.6 Motors
2.1.8 Propellers
2.1.14 Multiplexer
2.1.15 Microcontroller
3.2.3 Schematics
5.1.2 Powering on
Ethan Plummer
Nicholas Reich
R. Alex Kibler
Appendix C: References
Table of Figures
1 Introduction

Extreme sports athletes (snowboarders, for example) face a problem when it comes to recording footage of their runs: to have their activities filmed, they need an equally skilled camera operator to follow them and record. This comes with the risk of damaging expensive film equipment and of injuring the camera operator. To gain exposure, these activities need to be captured on film and up close, but athletes at the top of their skill can be difficult to keep up with and record.
The X-Fly system is one solution to this problem. X-Fly is a four-propeller multicopter, also known as a quadcopter. It contains an intelligent camera system that can detect a target from a large distance away. In its current state, the target wears a specific color (which can be modified by the operator) and proceeds as normal. The operator of X-Fly raises the quadcopter to the optimal height (roughly three meters above the ground) and aligns it with the target. At this point, the operator flips the switch on the transmitter, switching the X-Fly system into semi-autonomous mode. When this happens, X-Fly becomes autonomous, tilting and rolling to stay locked onto the target. The operator can attach their own camera, such as a GoPro with the Wi-Fi BacPac. When attached using the provided mount, the GoPro can be set up as a wireless access point to which the operator can connect with a smartphone or computer. At this point, the operator can see in real time what X-Fly is seeing. This can also be used in conjunction with a media server to live-stream the action as it happens. When the run is over, the operator releases the autonomous switch, returning the quadcopter to manual mode. At this point, he or she can either land X-Fly where it currently is or return it to the starting position and land it there. Overall, the X-Fly system is designed for ease of use and affordability. By allowing the user to provide their own camera, X-Fly can potentially save thousands of dollars over commercial film setups, and it is useful to more than just skiers and snowboarders: it could also serve skateboarders, surfers, bicyclists, and even runners. This would dramatically lower the cost of professional filming.

X-Fly has the potential to vastly increase the quality and number of professional-quality recordings of extreme sports events, while reducing the number of accidents by reducing the number of people needed to film an event. Fewer accidents would in turn reduce the cost of healthcare for extreme sports athletes.
1.2 Motivation

Our motivation for this project was our interest in robotics and control systems. We originally considered a self-balancing system, somewhat like a Segway, but decided on X-Fly because it was a much more interesting and challenging idea to us. The problem was also very similar to the CET 360 smart car project.
2 Project Overview

2.1 Component Overview

X-Fly consists of three major components: the quadcopter itself, the transmitter, and the onboard computer. The quadcopter itself consists of many parts, among them the frame, the motor mounts, the motors, the propellers, the flight control board, the electronic speed controllers, and the battery.

2.1.1 Flight Control Board
The flight control board is the KK2.0 Multi-Rotor Control Board from HobbyKing. It acts as the main flight computer for the quadcopter. As shown in Figure 1, it has an accelerometer and gyroscope to detect its orientation, and it features an LCD display, four buttons, a piezo output (speaker), and five inputs, and it supports up to eight outputs (motors). For the purposes of the X-Fly project, we were most concerned with the first four inputs (AIL for aileron, ELE for elevator, THR for throttle, and RUD for rudder) and outputs M1 through M4.
2.1.2 Transmitter
For the transmitter, we are using the Futaba 10CG 10-Channel 2.4GHz Computer System Transmitter. It has a seven-point throttle and pitch curve, real-time response, smart switch technology, a 160x72 backlit LCD, a servo monitor display, hover throttle, a gyro menu, and many other features, all found on the website linked in the references. One of the important features we are using is the dead-man's switch on the upper left corner for autonomous mode. When the operator pulls and holds the switch, X-Fly becomes semi-autonomous, but as soon as the operator lets go of the switch, it goes back to manual mode. Another useful feature is that the throttle stick (left) holds its position rather than springing back to the bottom when the operator releases it.
2.1.3 Receiver
The receiver we are using to capture transmissions from the transmitter is the R6014HS Receiver, also from Futaba. As seen in Figure 3 below, one of its most useful features is the Link button. With a single press of the button, it links with the transmitter's unique signal, eliminating worries about interference from other controllers that might be in use. It also includes a battery failsafe with an auto cut-off to protect the quadcopter. It weighs 21.5g. Both the receiver and the transmitter are owned by the university and were lent to us by Professor Sumey.
2.1.4 Frame
For the quadcopter frame, we chose the HobbyKing H4 Copter frame. As seen below in Figure 4, and as its name suggests, the H4 Copter is an H-shaped quadcopter frame. Because of this, it is better suited for video capture: on a standard X-shaped quadcopter, the arms can get in the way of cameras. The H4 solves this by keeping the arms in an H-shaped configuration, perpendicular to the frame and out of the view of the camera. The H4 is an extremely lightweight (453g) frame made of aluminum and glass fiber. We decided from the very beginning to get the lightest frame we could, because we knew that X-Fly would be on the heavy side after adding all of our extra hardware.

Figure 4: HobbyKing H4 Copter
2.1.5 Battery
For the main battery (which powers the FCB, the ESCs, and the motors), we decided to go with the Turnigy nano-tech 3300mAh 3S 25~50C LiPo Pack from HobbyKing. It is a three-cell 11.1V lithium polymer battery with a 25C constant / 50C burst discharge rating. It weighs 199g and has a capacity of 3300mAh. Originally we had picked a battery with a much higher capacity, but it added so much weight that we decided it wasn't worth it. The current battery delivers between 10 and 20 minutes of flight time, which is more than enough to record a snowboard run.

Figure 5: Turnigy nano-tech 3300mAh 3S 25~50C LiPo Pack
2.1.6 Motors
For our motors, we chose the Turnigy D2836/8 1100KV Brushless Outrunner Motor. After completing many different thrust calculations, we found that these motors would provide the optimal amount of thrust for the weight of the X-Fly system. These calculations will be provided later in the report. The Turnigy D2836/8 1100KV Brushless Outrunner Motor has a max power of 336W and a max thrust of 1130g.

Figure 6: Turnigy D2836/8 1100KV Brushless Outrunner Motor
2.1.7 Electronic Speed Controllers (ESCs)
For the speed controllers, we decided to use the HobbyKing 20A BlueSeries Brushless Speed Controller. We picked these because they have an automatic timing setup, weigh only 18g each, and can output enough power to drive the motors from our battery. They were also very affordable.

Figure 7: HobbyKing 20A BlueSeries Brushless Speed Controller
2.1.8 Propellers
Finally, for the propellers, we chose the 10x4.5 Carbon Fiber Propellers (1pc Standard / 1pc RH Rotation) from HobbyKing. We chose these propellers because they are made of high-quality, durable carbon fiber, making them less likely to break on impact than plastic ones.
2.1.9 Raspberry Pi
For our onboard computer, we decided to go with the highly popular Raspberry Pi. We considered other options, including a Coldfire MCU, but we decided on the Raspberry Pi because of its affordability and its feature set. For $35, the Raspberry Pi comes with 512MB of RAM, an ARM11 processor, two USB ports, Ethernet and HDMI ports, and mounting holes. It can be powered over USB or by batteries. Another reason we chose the Raspberry Pi is that it has its own supported camera (discussed next), which meant we did not have to figure out drivers for third-party cameras. We were also very impressed by the community and by the pre-built Linux distributions, which saved countless hours because we were able to boot a pre-existing Linux operating system and begin development immediately.
2.1.10 Raspberry Pi Camera Board Module
Next was the camera module. When we were researching which onboard computer to use, we learned that the Raspberry Pi has its own Camera Board Module for $30 with the ability to record 1080p video at 30fps, which is more than we needed. We therefore decided to use the Raspberry Pi and its camera module, which plugs in right between the HDMI and Ethernet ports.
2.1.11 Motor Mounts
These 3D printed motor mounts were designed by Ethan to replace the stock legs of the frame. They are extended legs, so there is room to attach the camera to the bottom of X-Fly. They are also wider at the bottom to absorb more of the landing force.
2.1.12 Camera Gimbal Socket
This 3D printed socket, in conjunction with the camera gimbal of Figure 13 below, allows us to mount the camera to the bottom of X-Fly and move it around as needed to get the proper angle.
2.1.13 Camera Gimbal
This gimbal was 3D printed; it allows the camera to feed down through it and attach to the bottom of X-Fly. This lets us change the camera angle and protects the cable.

Figure 13: 3D Printed Camera Gimbal
2.1.14 Multiplexer
When we started considering how to switch between autonomous and manual mode, we decided to use the 74HC157N quad 2-input multiplexer. It accepts the outputs from the transmitter and the Raspberry Pi and then, depending on whether or not the switch is flipped, passes one set of signals or the other through to the flight control board.

Figure 14: 74HC157N Quad 2-input multiplexer
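Per channel, a quad 2-input multiplexer like the 74HC157N simply forwards input A when the select line is low and input B when it is high, with an active-low enable that forces the outputs low. A minimal sketch of that truth table; which signal set is wired to A and which to B is our assumption about this build, not stated in the report:

```cpp
// One channel of a 74HC157N-style 2-input multiplexer: the output
// follows input A when select is low and input B when select is high;
// a de-asserted active-low enable (/E high) forces the output low.
bool mux2(bool a, bool b, bool select, bool enableN = false) {
    if (enableN) return false;  // /E high: all outputs forced low
    return select ? b : a;
}
// In X-Fly's wiring (as we read it), the A inputs would carry the
// receiver's channels and the B inputs the Raspberry Pi's, with the
// select line driven by the autonomous-mode switch.
```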
2.1.15 Microcontroller
A microcontroller monitors the pulse width sent from the transmitter. We capture this pulse width so we can determine whether or not the operator has engaged the autonomous switch.
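Hobby RC channels encode a stick or switch position as a pulse width, typically near 1000 µs at one extreme and 2000 µs at the other, so a two-position switch can be decoded with a simple midpoint threshold. A sketch using those typical values (assumed, not measured from the X-Fly hardware):

```cpp
// Decode a two-position RC switch from its channel's pulse width.
// The 1000-2000 us endpoints and the 1500 us threshold are typical
// hobby-RC values, assumed here rather than measured.
enum class FlightMode { Manual, Autonomous };

FlightMode decodeSwitch(unsigned pulseWidth_us) {
    return pulseWidth_us > 1500 ? FlightMode::Autonomous
                                : FlightMode::Manual;
}
// e.g. a captured width near 1100 us reads as Manual and one near
// 1900 us as Autonomous, which then drives the multiplexer's select line.
```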
2.1.16 Auxiliary Battery Pack
This battery pack, holding four NiMH AA batteries, powers the Raspberry Pi and the receiver. This separates the computer system from the main power, preventing a total system failure if the main battery dies.
2.2 Software Components

The computer vision portion of this platform is written in a combination of C and C++. This combination was used because some crucial modules were previously written in C by other developers, while the more recent versions of OpenCV are written in C++. This particular combination turns out not to be a problem, because C++ retains compatibility with C, and the main program was written in C++. The modules written in C are the one that allows the Raspberry Pi camera module to be used as an input capture device and the one that allows use of the input/output pins in any C program. The call that OpenCV uses to get an input image wouldn't normally find the Raspberry Pi camera module as an input source, so a library called RaspiCamCV, written by Emil Valkov, allows the flight program to use the small on-board camera. The other module of code is called RaspberryPi-GPIO and allows C code to read and write the GPIO pins.
Now that there is an image to work with on the Raspberry Pi, from the Raspberry Pi camera, image manipulation algorithms can be performed to extract the desired data from the image. The simplest approach to any kind of object tracking is to focus on that object's color. This can be done in OpenCV with the cvInRangeS function. After the general target has been identified, it needs to be cleaned up, because at this point it is a rough binary image. To clean up the target, a series of erosions and dilations is applied to the whole image to get a more normalized representation of the target. Next, some sense needs to be made of this blob-like image. A simple way to do this is to reduce it to a simpler orthogonal shape, achieved by bounding the contours of the binary image with a rectangle. This provides an approximate position and size for the target within the frame.
Once the corrective measures have been calculated, a signal needs to be sent to the flight control board (FCB) to adjust the position of the platform in the necessary directions. This is where the ServoBlaster software is used. ServoBlaster allows any GPIO pin on the Raspberry Pi to be used as a PWM pin. Once the pulse width is generated, it is sent to the FCB so that corrective movements can be made. See the data flow diagram for a basic graphical representation.
Upon power-up of the system, a number of background scripts are run for the sole use of the operating system on the Raspberry Pi. These do various things, like monitoring central processing unit and memory usage. These scripts are crucial to the stable operation of the on-board computer. Alongside the system scripts, additional scripts were needed to allow usage of other software at start-up.

A script is a file used to automatically execute commands that a user would typically enter in a terminal. For the on-board Raspberry Pi to make use of its GPIO pins, which send the maneuvering commands to the flight control board, a script was written to start the necessary programs at boot.

ServoBlaster is a program that allows the Raspberry Pi to send out pulse width modulation (PWM) signals via its GPIO pins. Without a script, this program would normally be started by entering the command shown in the figure above. Putting this command into a script makes it easy to have everything needed for flight executed at start-up.
Another program that was put into a script was the one that enables writing to the GPIO pins. It resides in the same directory as ServoBlaster, but not in the same place as the script, so a change-directory command was put into the script, which ultimately calls the make command shown in Figure 1.2. This enables other programs to write to the file in which the pin values are stored.

The last script that had to be written was the one to execute the main vision program. This script is executed third of the three additional scripts, because the vision program depends on the other two to operate properly. It is the equivalent of simply doing a change-directory command followed by a program execution command.

The combination of these scripts allows the X-Fly to fly autonomously as soon as power is applied to the system. An added bonus of putting these scripts in the start-up sequence is that no user interaction is required.
In September, we started researching what type of vessel would be most efficient as a person-tracking drone. We considered a model airplane because of price, but decided that a multicopter would be the best choice due to its ability to hover. Next, we had to research which onboard computer would be best for it. As mentioned in section 2.1.9, we decided that a Raspberry Pi with its camera would work best for our needs. After this, we had to figure out what parts we actually needed to build the quadcopter, because this was something that none of us had ever done before. After researching different frame and motor configurations, we decided on an H frame from HobbyKing (Figure 4 above). According to HobbyKing's listing for the frame, an H-shaped frame keeps the arms away from the camera, allowing the camera to have a much wider field of view without things getting in the way. Obviously this was optimal for us. Next, we had to decide on the rest of the hardware. We picked our flight control board because it was cheap and compatible with the transmitter/receiver combo that Mr. Sumey let us borrow. We ran calculations based on the mass of all of the components and decided on motors and a battery that would work optimally. All of this was done in early September, and all of the hardware was ordered by the end of the month.

The Raspberry Pi was among the first parts to arrive, in the first week of October. After it arrived, we researched the compatible Linux distributions to decide which one to install. We chose Raspbian, a Raspberry Pi-specific version of Debian, for two main reasons: the camera was compatible with it out of the box, and OpenCV had an installation guide for Debian-based systems. We got it installed and tested the camera. Next was to install OpenCV on one of our computers. This took about two weeks and was finally finished on Ethan's computer on October 17th. We encountered many errors during the OpenCV installs, and sometimes they didn't occur until hours into the installation, so we wouldn't find out until the following morning. The next batch of hardware arrived on the 17th as well. This shipment came with the wires, connectors, battery monitor, and the receiver. Most of the quadcopter was wired on the 18th, including adding a bullet connector to the power distribution block. We secured the ESCs to the frame using zip ties. We attempted cloning Ethan's computer so we could each have a completed install of OpenCV, but we ended up abandoning this idea and working directly on the Raspberry Pis. On the 26th we got the rest of the hardware and got OpenCV installed on the Raspberry Pi.

On the 18th of November, we attempted the first test flight, but there was a problem: we had all four ESCs wired to the power distribution block, where only one needed to be. This caused a short, and all of the ESCs were destroyed. We ordered more and waited about a week for them to arrive. Things got busy with the semester coming to a close, but we completed our first test flight on December 23rd, 2013. It was completely unflyable. During the week of February 10th, we did a factory reset on the flight control board (Mr. Sumey suggested that we had tweaked the PID values so much that we were beyond the point of no return), and this seemed to fix it. We also removed the stock legs from the H frame and 3D printed new legs to sit on the corners, bringing the center of balance out towards the edges and making it much more stable. During the week of March 10th, we got the power system completely finished. We isolated the Raspberry Pi and receiver from the main power supply so that if the main battery died, the Pi and the receiver would still be running. At this point, we printed a new set of legs for the quadcopter because a rough landing had broken two of them. The new revision had a wider base and was able to take more of an impact.
The software implementation started back in September, when we first began discussing what software we would use for our vision tracking. Ethan met with a former coworker who works for a robotics company, and he suggested that we look into OpenCV. We researched OpenCV extensively and decided that it would work excellently for us. OpenCV is a BSD-licensed library of functions mainly used for live video. It supports functions such as drawing on videos as they stream, as well as doing math on things that are happening in the videos. This is the part that interested us. By mid-December, a program was written that could read in video from a USB webcam and filter out all but a specified color. This was step one of our vision tracking program. By the end of February, we had a version of the program where we could specify a color and the program would draw boxes around any objects of that color. By mid-March, we had a version that could calculate the centroid of the object and determine how far away the quadcopter was from where it wanted to be. It could use this information to generate corrective commands, which are sent out using ServoBlaster, an open-source library written by Richard Hirst and published on GitHub. It provides an interface to output PWM signals on multiple GPIO pins of the Raspberry Pi, which is not a feature native to the Pi. We also used a library called RaspberryPi-GPIO, written by Alan Barr, to interface with the GPIO pins. By the end of February, there were two separate programs: the vision tracker and a hardware controller. One could identify the target, and the other could control the quadcopter, but the two needed to be merged. This was completed by mid-March. The vision tracker identifies an object based on a pre-defined color, determines what it needs to do to get that object to the center of its field of vision, and then sends the command to the hardware-controlling part of the software, which calculates what PWM signals to send to the GPIO pins. Figure 11 below shows what an oscilloscope displays when a signal is sent via the transmitter. Figure 12 shows what the scope sees when a signal is sent out from the Raspberry Pi.
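The centroid-to-correction step described above can be sketched as a simple proportional controller: the pixel offset of the target's centroid from the frame centre is scaled and added to the neutral pulse width. The gain and the 1000-2000 µs channel range are illustrative assumptions; the report does not give the constants actually used:

```cpp
// Map the horizontal offset of the target's centroid from the frame
// centre to a roll-channel pulse width around the 1500 us neutral point,
// clamped to the usual 1000-2000 us RC range. The gain is an assumed
// illustrative value, not a constant from the flight software.
int rollPulse_us(int centroidX, int frameWidth, double gain = 1.0) {
    int error = centroidX - frameWidth / 2;        // pixels off-centre
    int pulse = 1500 + static_cast<int>(gain * error);
    if (pulse < 1000) pulse = 1000;                // clamp to valid range
    if (pulse > 2000) pulse = 2000;
    return pulse;
}
// A centred target yields the neutral 1500 us; a target right of centre
// yields a longer pulse, rolling the quadcopter toward it.
```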
We faced many challenges during the implementation of the X-Fly system. Our first major challenge, as described above, was determining which onboard computer to go with. We had many different options, but we ended up choosing the Raspberry Pi, which worked out well for us.

Our next challenge was determining how to filter out our target. We were deciding between RoboRealm and OpenCV, and settled on OpenCV because of its open-source license and ease of use. Once we decided on OpenCV, we had to get it installed; this took almost a month. Each installation attempt took about two hours, and errors usually didn't occur until the end of the process.

Our next challenge was getting the quadcopter off the ground. After we wired everything up and turned on the quadcopter, the ESCs all failed because of a miswiring. We ordered new ones and rewired it. Everything was assembled and we attempted our first flight, but it was extremely unstable. After many hours of tinkering with the flight control board, we were able to achieve a somewhat stable flight, which was tweaked over the following weeks.

Once the quadcopter was able to fly manually, we started experimenting with software-controlled flight using ServoBlaster. This worked well for a short time, but then we ran into problems.

Along the way, we suffered multiple other failures as well. At one point, one of the motors failed, so we had to order more to replace it. We waited over a month for the replacements to arrive.

The control program and vision software were written separately, which caused multiple problems. The control program was written in C and the vision software in C++, so the compiler could not build and link them together. This was fixed by forcing the C libraries to compile as C libraries rather than allowing the compiler to attempt to compile them as C++.
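The usual way to get this kind of mixed C/C++ build to link is an `extern "C"` block around the C declarations, which stops the C++ compiler from name-mangling them so the linker can match the symbols the C compiler produced. A sketch of the technique; the function name is a hypothetical stand-in, not an actual symbol from RaspiCamCV or RaspberryPi-GPIO:

```cpp
// Declarations from a C header, wrapped so a C++ translation unit sees
// them with unmangled C linkage. The function below is a made-up
// stand-in for the project's C modules.
#ifdef __cplusplus
extern "C" {
#endif
int gpio_write_stub(int pin, int value);
#ifdef __cplusplus
}
#endif

// In a real project this definition would live in a .c file compiled by
// the C compiler; defining it here keeps the sketch self-contained.
extern "C" int gpio_write_stub(int pin, int value) {
    return pin >= 0 ? value : -1;   // trivial stand-in behaviour
}
```

In practice, the same effect comes from compiling the C files with a C compiler and including their headers inside `extern "C"` from the C++ side, which is what the fix described above amounts to.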
There are multiple differences from the design document. The biggest one is that in the design document we said that, at runtime, the user would decide whether X-Fly would launch in autonomous mode or manual mode. Instead, it now always starts out in manual mode and can then be switched to autonomous. This is preferable because it allows the user to take control if anything goes wrong. It also means that it wasn't necessary to write take-off and landing software.

Another large difference is that there is no longer an onboard altimeter, as specified in the design document. We originally said that when X-Fly reached a certain altitude (provided by the altimeter), it would automatically start recording video. As the video recording is done by a standalone camera provided by the user, this was no longer feasible and was removed.
3.2 System Diagrams

Figure 19: Software Block Diagram
3.2.3 Schematics
4 Use of Software Engineering Principles
X-Fly was a strong example of the proper application of software engineering principles. During the development and deployment of X-Fly, we followed the software development process. The first phase was the requirements phase, during which we determined what was required of the project and organized our ideas more tightly. The next phase was specification. During the specification phase, we determined what components were needed to build the quadcopter and complete the system. We created a specification document to show what the system was supposed to do when completed, along with block diagrams and flow charts. During the next phase, we broke the project down into sections: hardware, software, and documentation. This was to ensure that work was being split up efficiently and fairly. During the implementation phase, we built the system according to the documents produced in the previous phases.

During the entire process, all of us worked together to design a system that would meet all of the requirements of the CET program, and it was designed with those goals in mind. We wrote the software with maintainability in mind: it is extremely modular and could be reused in future projects.
5 User Manual

5.1.1 Charging
Before X-Fly can take off, all three sets of batteries must be charged. The largest battery (Figure 5 above) needs to be charged with the supplied charger. The second battery is the Auxiliary Battery Pack (Figure 16), a set of four AA batteries that may or may not be rechargeable. Finally, the transmitter (Figure 2) needs to be charged on its supplied charger.
5.1.2 Powering on
To power on X-Fly, insert the AA batteries into the auxiliary pack and attach the cover. The Raspberry Pi should power on, indicated by the PWR light turning on. Next, insert the LiPo battery into its harness and plug it in. The flight control board should come on and beep. Finally, turn on the transmitter by flipping its power switch to the on position.

Arm the flight control board by moving the right stick to the right with zero throttle. It will beep and the LED will turn on. For safety reasons, do not do this until you are at least 5 meters away from the quadcopter.
The left stick controls the throttle; throttle is increased by raising the stick. There is no spring in the throttle stick, so it will maintain throttle unless you change it. Moving the throttle stick left and right rotates (yaws) the quadcopter. The right stick controls pitch and roll. Moving the stick forward pitches X-Fly forward, meaning it will start to fly forward; moving the stick back makes it fly backward; and moving it left and right makes it fly left and right.
Once X-Fly is behind the target and about ten feet above the ground, it can be switched to semi-autonomous mode. Leave the throttle stick in its position and pull and flip the E switch. At this point, X-Fly will take over in autonomous mode and start following the target.
After you are done flying, flip the E switch back to off. This switches back to manual mode. Since the throttle stick is still in its position, X-Fly will maintain its height. At this point, slowly lower the throttle to bring it down and land.
Help can be obtained by emailing Ethan Plummer (plu5937@calu.edu) or Nick Reich.
Appendix A: Team Details
Ethan Plummer
Technology. He is serving as the Build Manager for the X-Fly project. His study interests include automation, embedded systems, and artificial intelligence. During his time off from school, he interns at the National Robotics Engineering Center in Pittsburgh. His plans after graduation include applying to graduate school to further pursue his study interests.
	For the X-Fly production team, he served as the Project Manager and Lead Design Engineer. He was chosen to be Project Manager because of his experience leading various projects. The most notable was leading Team Cache Money to a second-place victory in the CET
	As Project Manager, he took care of making sure that work was completed on schedule. This included making sure that parts were ordered, paid for, and shipped in a timely manner. He also delegated responsibilities related to the project based on team members' skills.
	As Lead Design Engineer, he took responsibility for the overall design of the project. This included deciding the necessary features and functions of the project. For example, the frame of the quad was chosen because of the need for ample space to place components. He was also responsible for the design and production of the custom 3D-printed parts. On the programming side, he was responsible for researching, designing, and writing the vision-based algorithms. After the vision program was written and working, he was responsible for
compiling all of the libraries related to the project into an executable. He also wrote the start-up
Nicholas Reich
University of PA. He will be graduating in May 2014. He was originally a Computer Science major, but decided to take his love for programming to the next level by switching to CET. During the X-Fly project, Nick served as the Control Systems Engineer. He was chosen for this role because of his experience with embedded programming, including I/O peripherals and motor control.
	As Control Systems Engineer, his roles were split into two parts: hardware and software. The
made to control the X-Fly. Hardware design and testing was completed using oscilloscopes, digital multimeters, and component soldering. Proper voltage levels at certain points of the circuit, along with accurate control pulses, needed to be precise and reliable for this project, or the
	On the software side of the project, Nick used his embedded programming experience to produce the proper digital pulse-width outputs from the Raspberry Pi. He was able to make the Raspberry Pi imitate a common RC transmitter by reproducing the PWM outputs of ours. Testing of the software design was once again done with an oscilloscope to ensure that proper pulses were being sent to the motors. After this control software was merged with Ethan's vision software, they tested and debugged together to obtain proper object tracking.
R. Alex Kibler
interning at US Steel as a Process Control Engineer, and he has accepted a full-time position pending graduation.
	Alex contributed to the project in multiple ways. His largest contribution was as the documentation specialist. He hosts the web server for the website. He also designed the entire website and wrote the software to manage the media on the server. He wrote all of the weekly
	Alex suggested the idea of using the Raspberry Pi as an onboard computer because he had experience working with embedded Unix systems and knew that it would be powerful enough for the project, while also being very affordable. He set up the VNC server on the Raspberry Pi and worked with other low-level Unix systems on it. He also worked as a software support engineer, providing support to Nick and Ethan as they encountered issues.

Appendix B: Work Flow Authentication
I, Ethan R Plummer, hereby attest that I have performed the work as documented herein.
Signature:______________________________ Date:______________

I, Nicholas Reich, hereby attest that I have performed the work as documented herein.
Signature:______________________________ Date:______________

I, Robert Alex Kibler, hereby attest that I have performed the work as documented herein.
Signature:______________________________ Date:______________

Appendix C: References
OpenCV: http://opencv.org/
Raspberry Pi Camera with OpenCV:
https://robidouille.wordpress.com/2013/10/19/raspberry-pi-camera-with-opencv/
Raspicam_CV Repository:
https://github.com/robidouille/robidouille/blob/master/raspicam_cv/README

Appendix D: Code Listings

//-----Special Thanks----------------
// BoundedRectangle help from Sergi Pons Freixes and Jean-Pierre Landary


#include "highgui.h"
#include "cv.h"
#include "RaspiCamCV.h"
#include <opencv2/core/core.hpp>
#include <opencv2/opencv.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <iostream>
#include <opencv2/imgproc/imgproc.hpp>
#include "stdio.h"
#include "math.h"

extern "C"
	{
	}

//servoblaster->GPIO pin #. Change if you are not connecting ail to GPIO 25,
//ele to GPIO 24, thr to GPIO 23, rud to GPIO 18, and aux to GPIO 22
#define ail_pin 7
#define ele_pin 6
#define thr_pin 5
#define rud_pin 2
#define aux_pin 22

// Change scalar format for simplicity

//for opening /dev/servoblaster
FILE *fp;

using namespace cv;

// Initial Pin PWMs
int ail_pwm=150;
int ele_pwm=150;
int thr_pwm=151;
int rud_pwm=150;
int ail_flag=0;



// Targeting objects
CvSeq* contours = 0;
CvRect edges;
CvFont font;
char Target[10];
int PtX;
int PtY;

// Colors


// Filter variables
	//Red

int Lhue = 169;

/*	//Blue

int Lhue = 105;
*/

// Shi-Tomasi Variables
double CornerQuality = .05;
double MinDistance = 5;
int EigBlockSize = 3;
int Radius = 2;


//Control Window Variables
char TrackbarName[10];

//_________________________________________________Functions____________________________________
//___________________________________________Quad Control Functions_________________________________

/* initPWM(int,int,int,int)
neutral positions.
*/

void initPWM(int ail,int ele,int thr,int rud) {
	fprintf(fp,"%i=%i\n",ail_pin,ail);
	fflush(fp);
	fprintf(fp,"%i=%i\n",ele_pin,ele);
	fflush(fp);
	fprintf(fp,"%i=%i\n",thr_pin,thr);
	fflush(fp);
	fprintf(fp,"%i=%i\n",rud_pin,rud);
	fflush(fp);
}//initPWM()

/* updatePWM(int,int)
*/

void updatePWM(int control, int pulse) {
	fprintf(fp,"%i=%i\n",control,pulse);
	fflush(fp);
}//updatePWM()

//__________________________________________ OpenCV Vision Functions_________________________________

//Refresh Trackbar
void on_trackbar( int, void* )
{
}

/*
void trackbar( int, void*)
{
	imshow;



	namedWindow ("Thresholds", 23);	//Make trackbars to tune image
	createTrackbar (TrackbarName, "Thresholds", &xDilate, slider_max, on_trackbar );
	imshow;

}
*/

// Threshold Image
	{
	// Turn HSV image into a binary image by filtering out all but red
	}


// Filter Noise
	{
	//
	cvErode(ThreshImg, FiltImg, 0, xErode);
	}

// Find edges of the binary image
	{
		{
		}
	else
		{
		}

		{
			contours,
			cvScalarAll(255),
			cvScalarAll(255),
			100 );
		}
		{
		}
	}


// Box and Name Target(s)
	{
	int n = 1;
		CV_FONT_HERSHEY_COMPLEX_SMALL,

	// Draw rectangle around the target
		Blue);

	sprintf(Target, "Target %d", n);
	n++;
		Target ,
		&font,
		Black);
	}


// Find Centroid of Rect
	{

//	cout << PtX << "," << PtY << endl;
	}

void DirectionalCalc(CvPoint CenterMass, CvPoint Center, CvRect edges)
	{

//	{
//	return 1;
//	}

//	eState state;
//	gpioReadPin(aux_pin,&state);

	int Direct = 0;
//	cout << CenterMass.x << "," << CenterMass.y << endl;

	//if(!state)
//	{
		{
		//	cout << "----------" << Direct << " Pixels" << endl;
		if(!ail_flag)
			{
			ail_pwm--;
				{
				ail_flag=1;
				ail_pwm=150;
				}
			updatePWM(ail_pin,ail_pwm);
			}
		}
		{
		//	cout << "----------" << Direct << " Pixels" << endl;
		if(!ail_flag)
			{
			ail_pwm++;
				{
				ail_flag=1;
				ail_pwm=150;
				}
			updatePWM(ail_pin,ail_pwm);
			}
		}
	else
		{
		ail_pwm=150;
		updatePWM(ail_pin,ail_pwm);
		ail_flag=0;
		}


	if (CenterMass.y > CrossY+50)
		{
		//	cout << "----------" << Direct << " Pixels" << endl;
		thr_pwm++;
			thr_pwm=153;
		//updatePWM(thr_pin,thr_pwm);
		}
		{
		//	cout << "----------" << Direct << " Pixels" << endl;
		thr_pwm--;
			thr_pwm=149;
		//updatePWM(thr_pin,thr_pwm);
		}
	else
		{
		thr_pwm=151;
		//updatePWM(thr_pin,thr_pwm);
		}

	if ((edges.height * edges.width) < 7000)
		{
		ele_pwm--;
			ele_pwm=130;
//		updatePWM(ele_pin,ele_pwm);
		}
		{
		ele_pwm++;
			ele_pwm=170;
//		updatePWM(ele_pin,ele_pwm);
		}
	else
		{
		ele_pwm=150;
//		updatePWM(ele_pin,ele_pwm);
		}
//	}
	}

// Mark Centroid
	{
	}

IplImage* MarkCenter( IplImage* image, CvPoint Center)
	{
	}

// Use Shi Tomasi corner detection
	{
		TempImg,	// Temp algorithm workspace
		0,

	for (int i=0; i<CornerCount; i++)
		{
		}
	}




// Clean up input
	{
	raspiCamCvReleaseCapture(&capture);
	}



//_________________________________________Main____________________________________

int main( int argc, char** argv )
{
	initPWM(ail_pwm,ele_pwm,thr_pwm,rud_pwm);

	{
//		double t = getTickCount();
	cvFlip(image,image,0);
//	cvReleaseImage( &image );


//	t = ((double)getTickCount()-t)/getTickFrequency();
	CleanUp(capture);
}