Robotic Sensors & Control - Final Project Report
University of Surrey
School of Electronics & Physical Sciences
Abstract
The Special Engineering Project is based around the development of a robotic platform for use in the University of Surrey's Centre for Vision, Speech and Signal Processing (CVSSP). This report deals with the development of the Sensing and Control system, which allows the robot to move around its environment using Ultrasound Sensors to detect obstacles. The Sensing and Control system receives instructions from the robot's Decision System, and passes sensor data back over the same channel. The project was a success in that the Sensing and Control System met the specifications given; however, there is significant room for improvement of the system.
Table of Contents
1 INTRODUCTION TO THE PROJECT
  1.1 OVERALL AIMS OF THE PROJECT (ENUMERATED)
  1.2 SPECIFICATION FOR SENSING AND CONTROL SYSTEM
  1.3 CONCLUSION
2 RESEARCH
  2.1 INTRODUCTION
  2.2 SENSOR TECHNOLOGIES
    2.2.1 Infra-Red
    2.2.2 RADAR
    2.2.3 Inductive, Magnetic, Capacitive
      Figure 2.1: Operation of a Hall-effect sensor in conjunction with a permanent magnet [Fu, 1987] pp279
    2.2.4 Sonar
      Figure 2.2: Differentiating between Walls and Corners using RCDs [Nehmzow, 2000] pp28
      Figure 2.3: 360 degree sonar scan, with two RCDs [Nehmzow, 2000] pp27
    2.2.5 Laser Range Finders
    2.2.6 Shaft Encoders
  2.3 CONTROL OF MOTORS
  2.4 CONTROL INTERFACE
  2.5 TECHNOLOGIES SELECTED
    Figure 2.4: Position of Sensors on Chassis
3 DESIGN OVERVIEW
  3.1 INTRODUCTION
  3.2 HARDWARE DESIGN HIERARCHY
  3.3 SOFTWARE DESIGN HIERARCHY
  3.4 CONCLUSION
4 HARDWARE DESIGN
  4.1 INTRODUCTION
  4.2 PIC SENSOR CONTROLLER
    4.2.1 Important features of the circuit
      Figure 4.1: Circuit Diagram of Sensor Controller
    4.2.2 Brief explanation of I2C bus protocol
  4.3 MOTOR CONTROLLER
  4.4 LAPTOP CRADLE AND SENSOR PLATFORM
    Figure 4.2: Laptop Cradle Design (dimensions in mm)
    Figure 4.3: Sensor Bracket Design
    Test Circuit
      Figure 4.6: Sensor Test Circuit
    Methodology
    Results
    Analysis of Results
      Table 4.5.2: Sensor Test Results
  4.6 CONCLUSION
5 SOFTWARE DESIGN
  5.1 INTRODUCTION
  5.2 SENSOR CONTROLLER PIC CODE
    Figure 5.1: PIC code Flowchart
  5.5 REMOTE CONTROL SOFTWARE ('ROBOTERM')
  5.6 SOFTWARE TESTING
    5.6.1 Sensor Tests on Robot Chassis
      Aim
      Methodology
      Results
        Table 5.6.1: Apparent Distances (in cm) when mounted on Chassis
      Analysis of Results
  5.7 CONCLUSION
6 OTHER TESTING
  6.1 INTRODUCTION
  6.2 INTEGRATION WORK - FLOOR RUNNING TESTS WITH CAMERA
    6.2.1 Aim
    6.2.2 Methodology
    6.2.3 Results
  6.3 INTEGRATION WORK - COLLECTION OF DATA FOR DECISION SYSTEM
    6.3.1 Aim
    6.3.2 Methodology
    6.3.3 Results
7 PROBLEMS AND ISSUES ENCOUNTERED
  7.1 INTRODUCTION
  7.2 PROJECT TASK OVERRUNS
    Figure 7.1: Gantt Chart created at beginning of project
    Figure 7.2: Gantt Chart created half-way through the Project
    Figure 7.3: Retrospective Gantt Chart based on schedule of whole project
  7.3 COMPLETION OF PIC SENSOR INTERFACE
  7.4 HARDWARE CONSTRUCTION
  7.5 MANOEUVRABILITY ISSUES
  7.6 HARDWARE INTERFACE CLASS TIME-OUT
8 CONCLUSION
9 REFERENCES
APPENDIX A - CODE ANALYSIS
  PIC SENSOR CONTROLLER CODE
    Header Files
    Global Variables
    Variables in main
    Function Prototypes
    Preprocessor Directives
    Program Flow
    I2C interrupt
APPENDIX B - USER GUIDE: SENSORS AND CONTROL SYSTEM
  INTRODUCTION
  SET-UP
    Sensor Controller Circuit Board
    Sensors
    USB-I2C interface
    MD22 Motor Controller
    Hardware Interface Software
    Remote Control Software
  CONTROLLING THE ROBOT
    Using Roboterm.exe
      Table 1: Command Format for Interface with Hardware
      Table 2: Acknowledgement Format for Interface with Hardware
    Using other software
    Hardware Proximity Override
  USING THE HARDWARE INTERFACE CLASS
  TROUBLESHOOTING
APPENDIX C - C++ SOURCE CODE
  HARDWARE INTERFACE SOFTWARE - MIDDLEMAN.EXE
  HARDWARE INTERFACE CLASS - BOT.H
  REMOTE CONTROL SOFTWARE - ROBOTERM.EXE
  KEYS CLASS - KEYS.H AND KEYS.CPP
    keys.h
    keys.cpp
Martin Nicholson - Project Management and Artificial Intelligence
Peter Helland - Video Processing and Acquisition
Ahmed Aichi - Networking and Inter-System Communications
Edward Cornish (author) - Robotic Locomotion Control and Collision Avoidance
The Project is academically supervised by Dr Richard Bowden, a specialist in Computer Vision Systems. Dr Bowden's role is to oversee the project and to provide guidance where necessary; however, all management and design decisions are made by the Undergraduate Team.

The robotic platform (hereafter referred to as 'the robot') is controlled by one of two laptop computers supplied by the CVSSP. It is hoped that, once one platform design has been finalised and tested, a duplicate system can be constructed, allowing the robots to be used for Artificial Intelligence experiments in team working and emergent behaviours. The laptop computers are the only technology supplied for use on the robot; all other equipment must be purchased from a Project Budget of £1000. The University does provide support facilities in the form of the Engineering Workshop (where bespoke mechanical items can be constructed, usually free of charge), the CVSSP computing system ('Brother', which allows Video Processing work to be carried out), and the laboratory and library facilities available to all Electronics Undergraduates. The robot(s), once completed, will be maintained at the CVSSP for use in future projects.

At the earliest stage of the project, a pre-built robotic chassis was chosen as the basis for the project. This chassis is a Lynxmotion product, model 4WD3 [Lynx, 2006]. This model was chosen for its large size and flexibility, allowing many devices to be mounted and supported. The 4WD3 and its attached non-electronic components (mounting frames etc.) are hereafter referred to as 'the chassis'. The chassis includes electric motors for locomotion.

The four project areas are interlinked, yet each is a distinct system in its own right. The Artificial Intelligence of the robot will make decisions related to navigation based on the information it receives through the interfaces provided to it (largely software based) [Nicholson, 2007]. The various software programs composing the robot control system will communicate using a custom Network API, which can be used to communicate between programs running on the same computer as well as between computers [Aichi, 2007]. This allows decentralised processing to take place; video data captured on the robot can be passed over a wireless link to the CVSSP network, where the powerful vision processing systems in place can relieve the processor load placed on the laptop computer. The vision processing algorithms developed as part of the project will be used to determine the velocity of the robot, its rough position, and the presence of objects of interest (fixed obstacles, people, goals, etc.) [Helland, 2007]. The Sensing and Control system will allow the robot to move around its environment in order to complete the goals assigned to it, based on instructions received over the Networking API from the Artificial Intelligence system, which in turn issues instructions based on the information it receives from the Vision System and the Sensing and Control system.
1.3 Conclusion
The Sensors and Control System is a critical part of the proposed robot platform. It is responsible for providing a flexible, modular system for controlling the movement of the robot, retrieving sensor data about the surroundings, and preventing collisions. The Sensors and Control System is intended to work in parallel with the Vision System to provide information to the Decision System. The Network API is to provide communication links between the various processes in the other three Systems.
2 Research
2.1 Introduction
In this section, potential technologies are examined and discussed. The technology areas to be discussed are:

Sensor Technologies
Control of Motors
Control of Sensors

The Sensor Technologies are discussed in order to determine the most appropriate means of detecting obstacles, considering the intended operating environment of the robot and the space, weight, and power constraints in effect.

The chassis used in the robot's design incorporates four DC electric motors. A means to control these motors was considered an integral part of the project, especially considering the fixed nature of the motors; the wheels cannot be rotated to steer the robot, so a 'skid-steer' system must be used (where the speed of the motors on one side of the robot is increased to cause the robot to turn towards the opposing side).

The information collected by the sensors on the robot must be captured and passed back to the higher-level processes running on the laptop computer. Some processing ability (either in hardware or software) was needed to perform this. The interfaces available on the laptop computer also needed to be considered, both the hardware (the physical ports) and the software capabilities (the communications protocols available). The laptop computer runs Windows XP™ Professional.

Research was performed using Robotics textbooks in the University of Surrey Library and a selection of Internet resources. These sources are referenced in Section 9.
2.2.1 Infra-Red
IR sensors use reflected IR light to detect surfaces. Low-frequency modulation of the emitted beam is usually used to eliminate interference from unvarying sources, such as electric lights or the sun. IR proximity ranging has the disadvantage of only realistically providing detect/non-detect information, since the reflectivity of objects to IR varies widely in an indoor environment [Schur, 2006]. The components, however, are widely available and compact. Distance measurements are only possible if the environment has uniform colour and surface structure, and the sensors must be calibrated to suit it; this is rarely practical. Black or dark surfaces, for instance, are practically invisible to IR sensors, so even simple proximity detection is not completely reliable. It is because of this that IR sensors are generally only effective for object detection, and not distance measuring. Furthermore, since the intensity of IR light decreases quadratically with distance (proportional to d⁻²), typical maximum ranges are 50 to 100 cm, which may prove too small for the purposes of the project. [Nehmzow, 2000]
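The inverse-square falloff described above can be made concrete with a small numerical sketch. The emitter constant and detection threshold below are invented for illustration only, not taken from any real sensor datasheet (C++ is used simply because the project's host-side software is written in C++):

```cpp
#include <cmath>

// Toy model of IR proximity sensing: received intensity falls off as
// 1/d^2. The constant k and the detection threshold are illustrative
// values, not measurements.
double ir_intensity(double k, double d_cm) {
    return k / (d_cm * d_cm);
}

// The maximum usable range is where the reflected intensity drops to
// the detector threshold: solving k/d^2 = threshold gives d = sqrt(k/t).
double ir_max_range_cm(double k, double threshold) {
    return std::sqrt(k / threshold);
}
```

With k = 10000 and a threshold of 1, the usable range works out to 100 cm, consistent with the 50 to 100 cm maximum ranges quoted above; halving the threshold only extends the range by a factor of √2, which is why IR ranges stay short.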
2.2.2 RADAR
RADAR provides an accurate picture of the surroundings, and is a well-understood technology. The majority of objects in the indoor environment have high radar reflectivity; however, there may be significant potential for interference from other radio sources in the CVSSP, due to the Wireless Networking systems in place. It is also uncertain how much power would be needed to operate a RADAR antenna effectively over the distances involved.
Figure 2.1: Operation of a Hall-effect sensor in conjunction with a permanent magnet [Fu, 1987] pp279
This method has similar disadvantages to the Inductive sensor method described above: only ferromagnetic materials can be detected, and the range of detection is reduced. [Fu, 1987]

If a sensor is required to detect proximity to a non-ferromagnetic surface, a capacitive sensor may be used. These sensors are capable (with varying degrees of sensitivity) of reacting to all non-gaseous materials. As the name implies, the sensor works by detecting a change in capacitance between two electrodes, effectively using the sensed object (and the air around it) as part of the capacitor's dielectric. [Fu, 1987] Capacitance-based sensors are once again subject to a limited range. Also, whilst non-ferrous materials will give rise to a response, the level will be markedly less than that of a ferrous material; for example, iron can cause a response 2.5 times greater than that caused by PVC at the same distance (see [Fu, 1987] pp.281).
2.2.4 Sonar
A great deal of work has been done on (ultrasound) sonar sensing in the field of Robotics. In a typical ultrasound sensor system, a 'chirp' of ultrasound is emitted periodically from a reasonably narrow-beam acoustic transducer. This burst of ultrasound is reflected from nearby surfaces and can be detected at the sensor after a time T, the 'out-and-back' time. Since the speed of sound in air is known, the distance to the reflecting surface is simply (speed of sound × T) / 2.

A major advantage of ultrasound sensing is that the sensor response depends far less on the material being sensed than it does with methods such as opto-sensing and RADAR. This is clearly of benefit in an indoor environment, where a variety of obstacles with different surface compositions may be encountered. A contrasting disadvantage is that the sensor field is in the shape of a cone; the detected object could be anywhere within the sensor cone at the measured distance, so the accuracy of the position measurement depends on the width of the sensor beam. Also, a phenomenon called Specular Reflection can cause inaccuracies in the measurements: if an ultrasound beam strikes a smooth surface at a sufficiently shallow angle, the beam is reflected away from the receiver instead of back towards it, which may cause the sensor to read a larger range than actually exists.
Figure 2.2: Differentiating between Walls and Corners using RCDs [Nehmzow, 2000] pp28
Figure 2.3: 360 degree sonar scan, with two RCDs [Nehmzow, 2000] pp27
There are methods that have been developed to combat Specular Reflections. One method uses so-called Regions of Constant Depth (RCDs): if a 360° sonar scan is performed, a significant section of arc over which the measured ranges are constant is termed an RCD (see Figure 2.2). These regions can be interpreted by taking two (or more) sensor scans from differing locations and comparing the arcs of the RCDs. If the arcs intersect, a corner is indicated at the point of intersection; if the arcs are caused by a flat wall, they will be at a tangent to the reflecting plane (see Figure 2.3). [Nehmzow, 2000]

A third issue to be overcome relates to arrays of ultrasound sensors. If one sensor detects the reflected pulse from another, so-called crosstalk arises. Solutions to this include coding the sensor signals in some way, or controlling the timing of the sensors to prevent erroneous detections. [Nehmzow, 2000]

Ultrasound sensors are effective at much greater distances than the proximity sensing methods mentioned above, even taking into account the increased atmospheric attenuation of sound waves at high frequencies. This means that the robot would have more freedom of movement, and would be able to sense obstacles at greater range, allowing more time for path-planning computations to be performed.

An experiment performed by Mitsubishi Electric Corporation showed that a mechanically scanned ultrasound sensor was able to detect the locations of standing persons within a room ([Pugh, 1986] pp.271). Investigations were also made into the practicality of an electronic scanning system, which steers the beam by aligning the phases of the emitter elements in the desired direction. Its advantage over the mechanical system is that it dispenses with the servos used to pan and tilt the sensor beam, which contribute vibrational noise and make the assembly by necessity quite large. The study performed by Mitsubishi highlighted the problems with resolution, reliability, and processing time that must be overcome in the implementation of this form of sensor.

A fixed sensor will not have the flexibility of a scanning sensor, but is simpler to mount and utilise. Multiple sensors are needed to provide all-round coverage.
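One timing-based defence against the crosstalk problem mentioned above is simply never to have two pulses in flight at once: fire the sensors in a fixed round-robin, spaced by the maximum echo time. The sketch below is illustrative, not taken from the project code, and assumes a 30 ms maximum echo time:

```cpp
#include <vector>

// Round-robin firing schedule: sensor i fires at i * max_echo_ms, so
// each sensor's echo window closes before the next sensor is triggered
// and no sensor can mistake another's pulse for its own.
std::vector<int> firing_times_ms(int n_sensors, int max_echo_ms) {
    std::vector<int> times;
    for (int i = 0; i < n_sensors; ++i)
        times.push_back(i * max_echo_ms);
    return times;
}
```

The cost of this scheme is update rate: for eight sensors a full sweep takes 8 × 30 ms = 240 ms, i.e. only about four complete sweeps per second, which is the trade-off against coded-signal schemes.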
inputs, and can be controlled using a variety of methods, including analogue voltage inputs and Radio Control Model systems. Of particular note is the I2C capability built into many of these products, given the availability of USB-I2C interface devices from the same manufacturer, although analogue signals can also be used to control the speed/direction.

It would have been possible to design a custom circuit incorporating the desired power regulation and communications capabilities. This, however, would have been a significant design undertaking, requiring considerable time and effort to develop into a working device, and was considered infeasible within the time constraints of this project.
very popular and widely understood product range. Many of Devantech Ltd's products are based around PIC microcontrollers, which suggests that the PIC product family is trusted and well supported by the robotics community. In addition, facilities for programming PICs are available in the Undergraduate Labs. These include the MPLAB™ software [Micro, 2000], which allows programs to be composed in assembly language and, with the installed C18 compiler, in C. Since the author is familiar with C from a Level 1 programming course, this did not require learning a new language. The MPLAB software includes sophisticated debugging tools, allowing code execution to be 'stepped through' whilst displaying the values of any program variables. Debugging can be done in hardware if an In-Circuit Debugger tool is connected to the correct pins on the PIC. The Undergraduate Lab has a number of these tools, as well as PICStart Plus programmers, which are simply used to program PICs before they are installed into a circuit. Also useful are development boards, which provide a variety of tools for testing programs and concepts (switches, keypads, displays, etc.).
3 Design Overview
3.1 Introduction
This section examines the overall structure of the Sensors and Control System. The Hardware and Software aspects of the system are discussed in turn, with brief justifications for the design decisions made.
The Sensor Controller and MD22 draw their power from the USB-I2C interface device, which negates the need for a DC-DC converter or other 5 V supply. The control outputs of the MD22 are connected to the chassis motors in a 'skid-steer' configuration, meaning that the controller drives a pair of motors on each side of the chassis. If a right-hand turn is desired, for example, the power to the left-hand motors is increased, and the robot turns to the right. This method allows for differential control of direction, which may facilitate simpler computations for the high-level AI.

The SRF05 Ultrasound rangers are mounted at eight points on the frame of the robot; each sensor requires power, ground, Trigger, and Echo connections.
The Echo outputs of all the sensors are connected to a single pin on the control PIC, as each sensor is polled separately. Each Trigger input, however, must be driven from a separate PIC output pin.

It was found during the design process that when two or more sensors were connected to the PIC, the echo pulse would not be received. An oscilloscope was used to confirm that the sensors were functioning correctly and were being triggered. It was reasoned that when two or more sensors are connected to the same node in the circuit, the echo outputs of the other sensors load that node with their impedance, causing the voltage at the PIC to fall below the logic 1 threshold. To remedy this, diodes were connected between each echo pin and the circuit node where they are joined; this was observed to solve the problem.
3.4 Conclusion
The Sensors and Control System can be broadly divided into the Hardware aspect, consisting of the Sensor Controller (with Sensors) and MD22 Motor Controller, and the Software aspect, consisting of the Hardware Interface Software, which is built around the Hardware Interface Class and is the point of control for the user and/or decision system.
4 Hardware Design
4.1 Introduction
This section discusses in greater depth the various Hardware features of the Sensors and Control System. Each component is described in terms of its function and capabilities, and the concepts necessary for a practical understanding of the Hardware are explained. The tests performed on the Hardware aspects of the Sensors and Control System are described, along with their results and the conclusions drawn (and actions taken, if any).
sufficiently high (better than 1 cm accuracy) whilst allowing the maximum pulse width of the sensor (~30 ms) to be measured. The timer uses a 16-bit register to store its counter value; this register is read in two single-byte read operations.
mode).
7. The PIC responds with an ACK. It will then hold the SCL line low until it is ready to transmit data. This is referred to as Clock Stretching, and allows slower processors to communicate with faster ones by letting the Slave decide when transmission starts.
8. The Master releases the SDA line, and generates 8 clock pulses on the SCL line (once it has been released by the Slave). The Slave changes the SDA line according to each bit of the byte to be transmitted.
9. Upon successful reception of the byte, the Master generates an ACK condition by pulling the SDA line low (after it is released by the Slave) and triggering a clock pulse.
10. The Slave is now free to send another byte, unless the Master has read the required number of bytes. In this case, the Master instead withholds the acknowledgement (a NACK) and then sends a STOP condition (first release SCL, then SDA). This tells the Slave to go back to waiting for a START condition.
It was necessary to design and construct a hardware fixture to support the laptop when the robot was being operated. Mounting points were also needed for the sensors and camera, which could not easily be attached to the purchased chassis whilst still providing wide coverage. The decision was taken to combine a laptop cradle and sensor platform into a one-piece construction. Discussions were held with the University Engineering Workshop to establish the construction methods available, and a design was produced (see Figure 4.2). The frame was made from aluminium, making it light and strong, and allowing mounting points for the sensors to be drilled anywhere on the frame. Right-angle mounting brackets were produced in order to attach the Ultrasound sensors to the frame (see Figure 4.3). The cradle also incorporated a support for the laptop screen, allowing the display to be read while the robot was operating, without the screen being swung backwards by the robot's inertia. NOTE: The designs in Figure 4.2 and Figure 4.3 were developed in conjunction with the rest of the project team, and with the assistance and guidance of staff in the University of Surrey Mechanical Workshop. The concepts involved did not originate only with the author.
Equipment Used
- Laptop Computer
- Serial link test Software
- USB/I2C interface
- Laboratory Power Supply
- MD22 motor controller
- Robot Chassis inc. motors
[Figure: Test setup block diagram - Laptop Computer connected via USB/I2C to the MD22 Motor Controller, which drives the Left and Right Motors; a 7.2 V PSU supplies the motors]
Test Procedure
The establishment of serial communications with the MD22 was done using a free Serial Port Test Program [Ser, 2004]. The parameters for communication with the USB/I2C device are (from the manufacturer's documentation) [Devan, 2006]:
- Baud Rate of 19200
- 8 data bits
- No Parity bits
- Two Stop bits
The commands are sent as Hex characters. The modes of operation were tested in turn, with key speeds being applied (i.e. Full forward, full reverse, half forward, half reverse etcetera). Some initial testing was done to ensure that the motors were connected correctly. The reason for this was that the polarities of the motor power outlets on the controller are not labelled.
Equipment Used
- Digital Stopwatch
- Robot (complete chassis with sensors and sensor controller)
Methodology
The robot was set up with the items mentioned above installed. A second laptop computer, owned by the author, was used to issue commands via the Hardware Interface and Remote Control programs. The Remote Control program was operated in terminal mode (see 5.5), as it was not possible to steer the robot remotely and maintain a consistent speed whilst measuring it. The robot was placed on the floor of the laboratory, and markers were placed at a 6 metre interval. A stopwatch was used to time the robot's passage between the markers at various speed settings. The information obtained was used to calculate the speed of the robot in metres per second (ms-1), with three runs at each speed setting being made and then averaged. The speed settings were incremented in steps of 3, from 6 to 18. The lower speeds were not tested, as it was expected that the speed would increase linearly, allowing the lower equivalent speeds to be extrapolated.
Table 4.5.1: Average speeds for speed settings 6 to 18

It was observed that the robot did not travel in a straight line, even when the same speed levels were input to each pair of motors. The robot would always pull to the right, so it was necessary to place the robot at the right side of the test course, angled to the left, so that it would not collide with furniture before reaching the end marker. This naturally made the distance travelled between the markers difficult to determine, and was a source of uncertainty.

Figure 4.5: Graph showing approximate speed values (averaged over three readings and taking into account uncertainties)

Another source of uncertainty was the nature of the timing method. The author used a stopwatch to time the interval between the start and finish markers. This relied on the author's judgement of when the robot had crossed each marker, so an uncertainty of 0.1 seconds was assumed. In order to account for the sources of error present, upper and lower bounds for the speed were calculated: the recorded time + 0.1 seconds and a distance of 6 metres for the lower limit (the 'worst case'), and the recorded time - 0.1 seconds and a distance of 6.4 metres for the upper limit, since the curved path of the robot would have increased the distance travelled. An average of these two limits was calculated. It can be seen from Figure 4.5 that the trend is quite linear until the higher speed values are reached. It is likely that the readings for speed level 16 are anomalous in some way, perhaps due to a skewed average of times. If the linear progression of the lower speed values is continued, ignoring the level 16 data, it should intersect the level 18 data. It is considered that the speed of the robot increases linearly with the speed levels, with a top speed of approximately 0.884 metres per second.
Equipment Used
- TDS3032 Digital Oscilloscope
- TTi EL302T Power Supply Unit (PSU)
- PIC evaluation board
Methodology
In order to determine the source of the problem with I2C communications on the PIC, a digital oscilloscope was used to examine the signals on the bus. One channel each was used to monitor the SCL (clock) and SDA (data/address) lines. The oscilloscope was set to trigger on a falling edge on the channel attached to SCL (I2C is an active-low clocked bus). The bus was first tested using a Serial Port Test Program to send commands to the MD22 motor controller. This was done to establish that the methodology was sound, since the address and data patterns were explicitly known. A variety of messages were sent to the MD22, setting the mode, left and right speed registers to various values. The PIC (programmed with simple I2C code, allowing a constant value of 0x33 to be read) was then connected to the same Test Program using the Interface device. Attempts to read back the constant character were made whilst monitoring the SCL and SDA lines.
Modifications
When writing the I2C slave code for the PIC, an Application Note for PIC devices published by Microchip was consulted. This document is referred to as AN734 [Micro, 2000]. It details the different states that an I2C slave can be in at any stage during a bus transaction. An error has been pointed out in this application note, whereby the CKP bit is not set in the event of a Master NACK condition [I2C Slave, 2002]; this would cause the PIC to stop responding after the first read. The PIC code was re-written, based closely on AN734 but taking into account the identified error. The five possible states were incorporated into a C switch statement, based on the flag bits relating to the PIC serial communications module. This was separated into read and write versions of the code, discarding those states that would not be used. The same test character (0x33) was used for the read tests. The success of the write tests was based on whether or not an acknowledgement of 0x01 was sent back to the test program (this was generated by the I2C interface, not the PIC chip). This test code was found to work successfully, and the oscilloscope traces showed the correct characters being sent across the bus, with appropriate responses being received.
Equipment Used
- TDS3032 Digital Oscilloscope
- TTi EL302T Power Supply Unit (PSU)
- Push-to-make switch
- Breadboard
- Tape Measure
Test Circuit
[Test circuit: +5 V supply; push-to-make switch driving the Trigger input; 1 uF capacitor; Echo output monitored on the oscilloscope]
Methodology
A flat upright surface was placed at various distances from the sensor aperture. The surface was the largest face of a plastic component storage box found in the laboratory. This object was chosen as it had large, flat surfaces, was of a rigid material, and was expected to have a high ultrasound reflectivity. The distance between the sensor and the surface was measured using a tape measure. The box was placed between 1 m and 10 cm away from the sensor aperture, in 10 cm increments, and finally at 5 cm. At each discrete distance, the push-to-make switch was pressed, and the resulting echo pulse captured on the oscilloscope. The width of the pulse in microseconds (us) was measured using the scale on the oscilloscope display and recorded. Each measurement was repeated three times at each distance.
Results
The results obtained show that the length of the echo pulse varies linearly with the distance of the reflecting surface (see Table 4.5.2). This corresponds to the expected performance of the Ultrasound Ranger, based on the manufacturer's documentation. The method of manually reading the pulse width from the oscilloscope screen is inaccurate, however, and a different method should be used to determine the uncertainty inherent in the Ultrasound measurements.

Table 4.5.2: Echo pulse widths at each measured range

Range (cm) | Reading 1 (us) | Reading 2 (us) | Reading 3 (us) | Average (us) | Measured range (cm) | Deviation (cm)
100        | 5800           | 5800           | 5800           | 5800.00      | 100.00              | 0.00
90         | 5100           | 5150           | 5120           | 5123.33      | 88.33               | -1.67
80         | 4550           | 4550           | 4560           | 4553.33      | 78.51               | -1.49
70         | 4080           | 4090           | 4090           | 4086.67      | 70.46               | 0.46
60         | 3450           | 3440           | 3440           | 3443.33      | 59.37               | -0.63
50         | 2900           | 2990           | 2995           | 2961.67      | 51.06               | 1.06
40         | 2360           | 2380           | 2380           | 2373.33      | 40.92               | 0.92
30         | 1760           | 1730           | 1750           | 1746.67      | 30.11               | 0.11
20         | 1200           | 1210           | 1215           | 1208.33      | 20.83               | 0.83
10         | 620            | 620            | 620            | 620.00       | 10.69               | 0.69
5          | 320            | 322            | 320            | 320.67       | 5.53                | 0.53
Analysis of Results
The results obtained show a strong correlation between the range measured by the sensor and the actual range (see Figure 4.7). This is as expected. The method used for measuring the pulse width, however, was not highly accurate, and its inaccuracy is very hard to quantify. It was decided that further tests should be done, using a more accurate method of measuring the pulse width, with an accuracy that can be quantified. This will allow a value to be quoted for the accuracy of the sensors when connecting to other systems.
[Figure 4.7: Graph of measured range against actual range]
The further tests were designed to determine:
- The maximum angle at which the sensor will detect an object
- The response of differently shaped objects at different distances
- The accuracy of the range measurements at ranges up to 3 m, down to 1 cm, or the minimum distance at which the sensor is effective, whichever is the greater
Equipment Used
- TDS3032 Digital Oscilloscope
- EL302T Triple Power Supply
- Standard Ranger test circuit, consisting of: capacitor, wires, push-to-make switch, SRF05
- Mounting Bracket (see Figure 4.3)
- A range of test objects, intended to simulate potential obstacles to the robot:
  - A plastic equipment case, with roughly rectangular sides
  - A plastic water bottle, empty (22.4 cm +/- 0.1 cm high, radius of 3.1 cm +/- 0.1 cm)
  - A small plastic pencil lead case with rhombic cross-section
  - An extendible tape measure, with metal casing (width 1.8 cm +/- 0.1 cm, height 5.3 cm +/- 0.1 cm)
Assumptions:
It is assumed that the beam pattern of the SRF05 is symmetrical, and that the beam extends perpendicular to the plane of the circuit board.
Test 2
The procedure of locating the detectability threshold was performed in the same manner as in Test 1, substituting the metal tape measure for the plastic bottle. This time, the threshold of detectability was measured to be slightly wider, at 27.5 degrees off centre.
Test 3
The plastic pencil lead case was used for the third set of measurements, proceeding as outlined in the two sections above. A problem was encountered, in that the lead case was undetectable beyond a range of 4.8 cm +/- 0.1 cm when facing its widest aspect towards the sensor, and beyond a range of 6.3 cm +/- 0.1 cm when facing its narrower, more sharply angled aspect toward the sensor.
Test 4
A different procedure was adopted for this test: the plastic equipment case was placed at a distance of 25 cm +/- 0.1 cm from the sensor, centrally within the sensor beam. With its widest flat face toward the sensor, the out and back period was recorded four times. The equipment case was then rotated about its vertical axis in increments of 10 degrees. At a rotation of 30 degrees, the out and back period was at the sensor's maximum, indicating that no return pulse was received. The out and back period was then measured at 25, 26, 26.5, 27 and 27.5 degrees, in order to determine the threshold beyond which the box would not be detected.
Test 5
The same procedure as in the above Test was performed, substituting the metal tape measure for the plastic equipment case. It was found that the tape measure could not be detected if it was rotated more than a few degrees (< 5).
Test 6
The plastic equipment case was placed 1 m +/- 0.1 cm away from the sensor, in line with the beam centre. The distance was measured from the edge of the case to the centre of the SRF05 circuit board. The largest flat side of the case was oriented towards the sensor, and it was kept as perpendicular as possible to the line of the beam centre. The out and back period was measured four times, and the results were recorded. This was repeated at the following distances measured from the case to the sensor (error of +/- 0.1 cm): 80 cm, 60 cm, 40 cm, 20 cm, 10 cm, 5 cm, 4 cm, 3 cm, 2 cm. In order to test the consistency of the sensor beyond 1 m, the case was placed at a range of 2 m +/- 1 cm, and the out and back period was measured and recorded four times.
Table 4.5.3: Out and back times (OBT) measured in Test 6

Distance (cm) | OBT readings (us)                  | Average (us) | Measured (cm) | Error (cm) | Measured/actual
100.00        | 5784.00, 5752.00, 5780.00, 5752.00 | 5767.00      | 99.43         | 0.57       | 0.99
80.00         | 4628.00, 4640.00, 4636.00, 4636.00 | 4635.00      | 79.91         | 0.09       | 1.00
60.00         | 3480.00, 3480.00, 3476.00, 3480.00 | 3479.00      | 59.98         | 0.02       | 1.00
40.00         | 2321.00, 2321.00, 2321.00, 2321.00 | 2321.00      | 40.02         | -0.02      | 1.00
20.00         | 1152.00, 1152.00, 1152.00, 1152.00 | 1152.00      | 19.86         | 0.14       | 0.99
10.00         | 555.60, 555.60, 555.20, 555.60     | 555.50       | 9.58          | 0.42       | 0.96
5.00          | 276.80, 276.80, 276.80, 276.80     | 276.80       | 4.77          | 0.23       | 0.95
4.00          | 219.60, 219.60, 219.60, 219.20     | 219.50       | 3.78          | 0.22       | 0.95
3.00          | 162.40, 162.40, 162.40, 162.40     | 162.40       | 2.80          | 0.20       | 0.93
2.00          | 212.40, 212.70, 212.40, 212.40     | 212.48       | 3.66          | -1.66      | 1.83
Analysis of Results
It can be observed that increasing the angle of deflection from the centre line increased the out and back time, even though the straight-line distance to the object was the same. This may be caused by a smaller amount of energy being reflected back to the transducer of the sensor. If a smaller amount of energy is reflected back, and assuming that the ultrasonic intensity takes a finite time to rise to its highest level, the time at which the received energy crosses the detection threshold will be delayed; the out and back time will therefore be seen to increase, until eventually the received energy does not cross the detection threshold at all. Assuming that the ultrasonic beam intensity falls off in a non-linear fashion from the centre of the beam (as is indicated by the manufacturer's specifications), noticeable increases in the out and back time would be apparent for objects deflected from that centre. The results gained using the metal tape measure indicate a much better ultrasound return from a flat surface than from one that presents a convex curve to the sensor. The higher rigidity of the metal may also contribute to a better return. The tests involving the pencil lead case indicated that a small, sharply angled object will not give a good sensor return, even if it has rigid surfaces. Expanding on this result, the tests involving the angling of the equipment case in the centre of the beam showed that the maximum angle of the box was 26.5 degrees (to the nearest half degree) before the return became erratic. An explanation for this is that an angle of greater than 26.5 degrees results in all of the ultrasound energy being reflected away from the sensor, instead of back toward it. This may result in a false return at a greater perceived distance, if the ultrasound energy reflects from another object, then back towards the angled surface and then towards the sensor. This may cause problems with detecting some obstacles; however, it will also prevent the robot from perceiving a gentle ramp or slope as an obstacle. With a smaller object, the tolerance to rotation was much lower, although such objects are unlikely to be serious obstacles. The tests of the accuracy of the sensor up to a distance of 1 metre showed an average error of 2%, some of which is due to the oscilloscope accuracy (see Table 4.5.3 & Figure 4.8). When using the received sensor readings, the accuracy should be taken as 2%, or 1 cm, whichever is greater (the sensor readings are only calculated to the nearest cm, to keep the data compact). This maintains accuracy at acceptable levels.

Figure 4.8: Graph showing distances measured by the sensor in Test 6, with Y-axis error shown
Methodology
The power consumption of the various devices to be powered was measured using a laboratory multimeter in series with the power connections of each device. The total was then compared to the maximum supply current of the USB-I2C device. The devices were then connected to this power supply, and functional tests were performed. These tests involved polling for sensor data at the same time as issuing drive commands.
Results & Analysis
By measuring the currents drawn by each of the devices, it was found that the sensors drew approximately 2 mA each, and the MD22 drew approximately 50 mA. This gave a total of 66 mA, 4 mA less than the maximum current capacity of the USB-I2C device. The devices were connected in an operational configuration, and all performed as expected, even when repeated commands were being sent to the sensor controller and MD22. It was concluded that there was no issue powering the devices from the USB-I2C interface.
Methodology
The Sensor Controller was installed on the robot, with all eight sensors connected, and the battery power to the motors running through the override circuit. The robot was supported with the wheels off the floor, and the motors were set turning at 1/3 speed. The author initially moved his hand close to each sensor in turn, and observed whether or not the wheels stopped turning. The Udar sensor polling program was used to monitor the sensor readings. The readings were observed as the author moved his hand closer to the sensor, in order to determine if the relay was switching at the correct distance.
Modifications
The PIC code was modified so that the cut-off distance was twice that used previously: around 12 cm. The above tests were repeated, and similar performance was observed.
relay, so that the relay is switched immediately upon a sufficiently low sensor reading being recorded. Another solution, which could be implemented in parallel with that mentioned above, would be to alter the sequence in which the sensors are triggered. At present, each sensor is triggered sequentially, in order. It may be possible, due to the high directionality of the sensors, to trigger sensors on opposite sides of the robot at the same time without them interfering. This would require significant testing of the reflective properties of surfaces in the CVSSP, as well as some means by which the sensor controller can 'know' which sensor is connected in which position; at present, any sensor in any position on the robot can be connected to any set of control pins on the Controller. It would also be possible to exploit prior knowledge of the directions in which collisions are most likely to occur. If the robot is moving forward the majority of the time, it is sensible to poll the forward-facing sensors more regularly than the side and rear sensors. This reduces the time between those sensors triggering, making it more likely that collisions will be avoided in time. The PIC code can be edited easily in-circuit, using a Microchip ICD 2 in conjunction with the MPLAB software [Micro, 2000]. This leaves the possibility of future improvements open.
4.6 Conclusion
The Hardware aspect of the Sensors and Control System is capable of moving the robot at a variety of speeds, and can accurately detect obstacles. A feature is present to cut power to the motors if an obstacle is detected closer than 12 cm; however, this is only effective at low speeds. Use of the interrupt capability of the PIC18F2220 could help overcome this, as could triggering more than one sensor at once, or polling the forward-facing sensors more regularly. The Hardware can be powered from the USB port on the laptop computer, excluding the 7.2 V supply for the motors, which is drawn from a battery within the chassis. The sensors are able to detect a wide variety of objects, including those with soft or curved surfaces, although detection of angled surfaces is problematic in the current configuration. Work could be done using Regions of Constant Depth (see 2.2.4) to mitigate this problem.
5 Software Design
5.1 Introduction
This section describes the Software aspects of the Sensors and Control System, how they interact with each other, and how they interact with the Hardware being controlled. The source code for the various pieces of Software is not featured here: it is included on a CD-ROM (along with the latest executables) available with this report, and in Appendix C, C++ Source Code. Appendix A, Code Analysis, contains analysis of the source code.
[Flowchart: Interrupt Service Routine - start; instantiate variables; set up I2C; clear interrupt; on timer overflow, toggle the heartbeat LED and reset the register; if the Master is reading and the last byte was the address, flush the buffer and transmit the previously selected register, then release the clock; otherwise flush the buffer, increment the register, transmit the current register and release the clock; start the timer; if the time is less than 0x400, set the proximity cell to 1, otherwise set it to 0; release the clock; loop; end of Interrupt Service Routine]
Table 5.3.1: Command Format for Interface with Hardware

Table 5.3.2: Acknowledgement Format for Interface with Hardware (Char 1 to Char 10). Acknowledgements from the HIS to the AI begin with 'A' for a valid command (D, L or Q), 'O' for an Override Notification (S or D), or 'E' for an invalid command, followed by numeric fields and terminated by '/0'.

The Networking API allows character strings and integers to be sent between processes. The above command format was developed with this capability in mind.
connector position. The correlation between cells in the sensor array variable and the connector positions on the sensor controller is as follows: Figure 5.2 shows the Invocation Hierarchy for the Hardware Interface class, indicating the relationships between the public and private member functions. For example, the public member function check_command will invoke the private member function check, which, if it returns a boolean value of 'true', will invoke the private member function convert. The private member function moderate is used to adjust the acceleration of the robot. If the difference between the robot's current speed and a new speed instruction is greater than 6, the acceleration of the robot is reduced in proportion to the difference in speed. This feature is intended to prevent overloading the motors, for example if the robot were travelling at full reverse speed and received an instruction to travel at full forward speed. As mentioned in 4.5.1, the speed levels are split into 18 forward speed increments and 18 reverse speed increments. The Hardware Interface Class uses the check and convert private member functions to convert the received speed value (a signed integer between -18 and +18) into a signed char value between -128 and +127. If the speed value is not valid or is outside of this range, an error acknowledgement is returned. The sense and rangetocm private member functions are used to poll for sensor readings and to convert the received two-byte value to an integer, respectively; the conversion is done simply by multiplying the 'high' byte by 256 and adding the two bytes as integers. The instruct private member function is used to send motor speeds to the MD22 motor controller, dependent on the speeds being valid. The led private member function is used for testing purposes, to turn off the LED on the USB-I2C interface.
The flush_tx and flush_rx private member functions are used to clear old data from the serial port buffers, after transmission and reception respectively.
The keyboard control is performed using a class called 'keys'. This class is based on an example shown to the author by Ahmed Aichi, in which pressing a key printed the corresponding keyboard code to the console [Aichi, 2007]. This was adapted so that the arrow keys cause the left and right speeds to change. These speed values are then sent to the Hardware Interface Software after each keystroke. Pressing the space bar stops the robot. See Keys Class keys.h and keys.cpp on page 69 for source code.
Methodology
Six sensors were installed on the robot, and the Sensor Controller was connected in its operational configuration. A piece of software termed Udar.exe was written (see CD-ROM), incorporating the Bot class. This program was used to repeatedly poll the Sensor Controller from a laptop computer. A plastic tool kit with a woollen hat covering it was placed 10 cm +/- 1 mm away from each sensor. The woollen cover was used to reduce the high reflectivity of the plastic surface, in order to better simulate the objects that would prove more difficult to detect in the CVSSP (i.e. soft furnishings). The apparent distance was observed both with and without the woollen cover, averaged over at least three readings.
Results
Table 5.6.1: Apparent distances (cm) reported by each sensor

Sensor          | Woollen surface | Plastic surface
Forward         | 9               | 9
Forward-Left    | 10              | 10
Forward-Right   | 9               | 10
Rear-Left       | 9               | 10
Rear-Right      | 10              | 10
Rear            | 10              | 11
Analysis of Results
It was surprising to note that the presence of the woollen surface did not increase the apparent distance in any of the results; rather, the opposite was observed in some cases (see Table 5.6.1). The measured readings are all within 1 cm of the actual distance, which is in keeping with the expected accuracy of the Sensor Controller, bearing in mind that all floating point values are rounded down to the nearest integer. At this distance, an accuracy of +/- 1 cm can be expected, which is acceptable given the size of the areas the robot will have to negotiate.
Test Procedure
The test program works by creating a file handle that accesses the virtual COM port used by the interface device. The command to deactivate the LED is written as a sequence of BYTE variables. For details of the USB/I2C interface device, see the relevant section of the Devantech Ltd website [Devan, 2006]. The parameters for serial communication used by the program are identical to those mentioned in the Motor Testing Procedure in 4.5.1. The code takes these parameters as arguments when creating the COM port link. The parameters are as follows:
- Baud Rate of 19200
- 8 data bits
- No Parity bits
- Two Stop bits

In order to perform the test, the USB/I2C interface was connected to a Laptop Computer with the Test Program executable on the hard drive and the USB/I2C interface drivers installed. After checking that the interface device was enumerated as expected by the operating system (on port COM3), the test program was run, and the LED on the interface device was observed.
5.7 Conclusion
The Hardware Interface software can be used to control the robot over any wireless network, using the proprietary Network API created by Ahmed Aichi [Aichi, 2007]. The processing relating to control commands is done by the Hardware Interface Class, which also has a feature to stop the robot after a certain amount of time has elapsed without an instruction being received. Sensor readings are given as integer values to the nearest centimetre, which is deemed accurate enough for the environment the robot is expected to operate in. The Object-Orientated nature of the Hardware Interface Class grants the Sensors and Control System a high degree of flexibility and modifiability. For instance, should a higher level of accuracy be required from the sensors (millimetres rather than centimetres, for example), the class could be modified without changing the way the Hardware Interface Software uses it. The Remote Control Software is a useful tool for testing and development of new and existing features of the robot, and is intended to emulate the outputs of any decision system. The modular nature of the Sensors and Control System allows the decision system to be substituted for the Remote Control Software with no changes being made to the Hardware Interface Software.
6 Other Testing
6.1 Introduction
This section describes testing that does not definitively fall under the domain of Hardware or Software, but incorporates elements of both. These tests involved other members of the project team, and further descriptions are likely to be found in the Project Reports for their respective areas.
6.2 Integration Work: Floor Running Tests with Camera
6.2.1 Aim
To obtain on-board camera footage to aid development of the Vision System; specifically, footage from the robot moving forward at varying speeds, so that motion detecting video algorithms could be tested.
6.2.2 Methodology
The laptop was secured to the chassis cradle, and a web-cam attached to the front of the chassis, in roughly the position proposed for the final design. This, along with the MD22, was connected via USB to the laptop, and the Serial Test Program was used to input simple movement instructions (move forward at constant speed). As no battery had yet been sourced for the robot, a long power lead was connected to a bench-top power supply. This gave the robot a limited movement range; however, it did allow emergency shut-down of the motors by deactivating the power supply. The robot was tested moving forwards at 1/3, 2/3 and full forward speed, and the video data from the web-cam was recorded for each run. Tests were performed where furniture was placed in front of the robot, to examine the effect this had on the motion data obtained. One of the team members also walked across the camera's field of view to examine the effect of moving persons.
6.2.3 Results
The video feeds were obtained with no major incident. The movement of the robot was consistent and there were many visual references for use by the Vision System. This test involved integrating the Sensors and Control System with the Vision Processing system, in terms of the speed setting applied to the motors, and the apparent velocity observed from the on-board camera. For full discussion of the results, see the report on the Vision System [Helland, 2007] .
6.3 Integration Work: Collection of Data for the Decision System
6.3.1 Aim
To obtain sensor data that can be correlated in time with a position given by the External Vision System. This data will be examined to determine how well it matches the values expected (based on the known layout of the corridor).
6.3.2 Methodology
The Remote Control Software was used to drive the robot over a wireless link, with sensor readings being polled constantly and saved in a text file, with a time stamp appended to each set of readings. The robot was driven down one corridor of the CVSSP, in clear sight of one of the fixed cameras installed in the centre. Peter Helland's Vision Processing program was used to give the robot's position in Cartesian coordinates at regular intervals [Helland, 2007]. The position results were also saved in a separate text file, with a time stamp automatically appended by the program. The system times of the laptop running the Remote Control Software and the PC running the Vision Processing software were synchronised, to ensure that the readings could be correlated accurately. The robot was driven at speed setting 6, as close to the centre of the corridor as possible. People were asked to keep out of the corridor for the duration of the test, so that the position readings obtained by the Vision Processing Software would not be affected.
6.3.3 Results
The data obtained was used by Martin Nicholson to assist in developing his Decision System. This test involved integration of all aspects of the project: the Vision System, the Networking API, the Sensors and Control System, and the Decision System. For the numerical analysis of the data, see Martin Nicholson's project report [Nicholson, 2007].
[Project schedule Gantt chart: tasks including chassis and motor tests, sensor trade-off analysis and testing, software interface design and coding, PIC encoding, the interim report, circuit development, system duplication and the final report, plotted against Semester 1 weeks 1-12, the exam period, Christmas, and Semester 2 weeks 1-10.]
Insufficient allowance was made for delivery of components, completion of PCBs, and other time factors not under the author's control. The working practice that should have been adopted is for the Project Schedule to be reviewed in a regular session every week, possibly as an adjunct to the project meetings. There was also no contingency plan formulated in case areas of the project took longer than expected. The end result of these delays was that the robot lacks a magnetic orientation sensor, which was seen as a potential addition to the project that would have increased the capabilities of the robot. Also, no work was done on duplicating the System until the very end of the project, and at the time of writing it is still incomplete, although a prototype Sensor Controller is available that lacks the hardware override mechanism. This will be used on the second robot. It would also be possible to modify the prototype to have the same functionality as the latest circuit.
Two faults were found in the original circuit: the ICD header pins were connected to the wrong pins on the PIC sensor controller, and no bypass switch was connected across the relay to prevent the PIC sensor controller shutting off the motor power. In addition, the relay used in the original design was more complicated than necessary (of the Double Pole Double Throw (DPDT) type, as opposed to Double Pole Single Throw (DPST)). A new PCB layout was designed, and the existing one modified by hand using additional wires. The new PCB used a different relay footprint (for a SPST relay), and included a bypass switch and a series resistor that could be used to reduce the current flowing through the relay, in case this proved to be an issue. A 100 ohm resistor was initially connected in series with the relay; this did not allow sufficient voltage drop across the relay coil. The resistor was replaced with a length of wire, which allowed the relay to function as desired.
needed in a skid steering system. Since the tyres of the robot are wide, a high level of power is needed to turn on the spot, as the friction of the tyres on the ground must be overcome. It was also noted that the wheels persistently developed instabilities, in that a tendency to wobble on the axle was observed. This resulted in the wheels having to be periodically tightened and adjusted, otherwise the handling of the robot was affected. Even stringent readjustment could not completely correct the problem (the robot would consistently pull to one side during tests).
8 Conclusion
The system produced as a result of this project provides a self-contained means of controlling a robot, which can be interfaced with using the networking API. Since the Hardware Interface Software (Middleman) is object-oriented, the Hardware Interface class can be modified as a 'black-box' entity; this makes the software very flexible, allowing for future development of the robot. The simple interface also means that the Decision System can be modified as an individual module.

The system could be improved by adding more sensors; these would be best connected to the PIC sensor controller and accessed through extra registers. Since the PIC can be reprogrammed in-circuit, the code can be updated easily. The unused PIC outputs are also brought out to connection points on the circuit board, meaning that the PCB will not require modification.

It would also be beneficial to incorporate a level of decision-making ability into the motor-override feature. At present, the motor commands will be overridden regardless of which sensor is reading below the threshold. More useful would be a system that could detect the direction that would potentially result in a collision, and override only movement commands that take the robot in that direction. This was considered beyond the ability of the author to implement in the available time; however, a person with experience in machine intelligence should be able to devise a workable solution. This task is made more complex by the interchangeability of the sensors, as any sensor can be connected to any position on the control port. The Sensors and Control documentation contains guidance to avoid this (see Appendix B - USER GUIDE SENSORS AND CONTROL SYSTEM). Sensors that could usefully be added to the system include magnetic sensors for determining the robot's orientation, and odometry sensors for determining the distance travelled (and possibly position relative to the starting point).
The system meets the specifications outlined in 1.2. The sensors will detect objects up to 5 metres away, accurate to the nearest centimetre, and the readings are updated constantly. The latest sensor readings are available at any time, and can be sent to any program using the Network API [Aichi, 2007]. The speed and direction of the robot can be controlled very accurately, setting the speeds of the left and right motors independently. The Hardware Interface Class automatically moderates the acceleration of the robot, to prevent overloading the motors.

There exist both software and hardware options for preventing collisions: the software will stop the robot if a sensor reading is less than 6 cm, although the sensor readings are not checked automatically, only when a sensor data request is received. The PIC sensor controller will cut power to the motors if a sensor reading is less than 11.76 cm (2 d.p.); this cut-out can be bypassed by a switch, or by not connecting the motor power through the sensor controller at all. It should be noted that the Artificial Intelligence is expected to avoid obstacles in the first instance; the cut-out is an 'emergency' measure.

The override feature mentioned above is an area of the project where there is significant scope for further work. As mentioned in 4.5.7, the delay between an obstacle crossing the override threshold and the motors stopping can be up to half a second. The hardware cut-off threshold has been extended to compensate for this, but the issue is still pertinent. 4.5.7 suggests possible improvements to the hardware override mechanism.
It is also possible that the software override could be used to halt movement more effectively when necessary, since all sensor readings are received simultaneously, and the laptop processor can execute instructions much more quickly than the PIC (the PIC clock is 24 MHz, the laptop PC is 933 MHz). It is possible, however, that high-frequency polling of the Sensor Controller will slow down the rate of sensor updates.

Another area with great potential for further work is the acknowledgements system. At present, a command will be acknowledged based solely on whether it meets the Hardware Interface Class' definition of a valid command. This could be expanded to include whether or not a response was received from the I2C device being accessed (a hexadecimal value of 1 will be returned if the write was successful, a value of zero otherwise). This would allow the user to gain a higher level of information about where in the System any problems were occurring, particularly if different acknowledgement characters were used for invalid commands and failed transactions. It would also be possible to re-write the software associated with this project in a more organised fashion, packaging code into functions, and perhaps making the whole system (including the Remote Control Software) object-oriented. There is also a minor bug in the Hardware Interface Class which should be remedied (see 7.6).

When undertaking any future projects of this nature, it will be necessary to prepare contingency plans in case of delays. Regular time-management sessions will also be a part of the project schedule, where Gantt charts will be updated and priorities re-assessed. This measure should help to highlight problems earlier.

In summary, this project was successful in producing a system which meets the specifications given, despite poor time management and development difficulties.
There is much room for improvement of the System; however, such modification should not be difficult to accomplish, thanks to the modular design.
9 References
[Aichi, 2007] Special Engineering Project: Networking and Inter-System Communications, Final Project Report, by Ahmed Aichi, 2007.
[Devan, 2006] Devantech Ltd website, containing product details and technical documentation. Can
be accessed at the following URL: http://www.robot-electronics.co.uk/. The documentation for the products used in the project can be accessed as follows:
[Fu, 1987] Robotics: Control, Sensing, Vision, and Intelligence, by K. S. Fu, R. C. Gonzalez, and C. S. G. Lee, published by McGraw-Hill, 1987.
[Gorman, 2002] Serial Port Enumeration Class by Zach Gorman, Archetype Auction
[Helland, 2007] Special Engineering Project: Vision Processing, Final Project Dissertation, Section 6:
[I2C Slave, 2002] Hobby website containing the only PIC I2C Slave examples found on the web (note
that the examples are not for the C18 compiler, rather they are in PICBasic.). http://www.astrosurf.com/soubie/pic_as_an_i2c_slave.htm.
[Micro, 2000] - Using the PICmicro SSP for Slave I2C Communication, Application Note
734 (AN734), by Stephen Bowling. Available from the Microchip website: http://ww1.microchip.com/downloads/en/AppNotes/00734a.pdf.
[Nicholson, 2007] Special Engineering Project: Decision System, Final Project Report, by Martin
Nicholson, 2007
[Pugh, 1986] Robot Sensors Volume 2: Tactile and Non-Vision, edited by Alan Pugh, published by
[R.D.Klein, 2003] Serial Communication Library written by Ramon de Klein, available as free software. The author can be contacted at the following e-mail address: Ramon.de.Klein@ict.nl. The code can be obtained from the author's homepage at: http://home.ict.nl/~ramklein/Projects/Serial.html.
www.sourceforge.net.
Global Variables
unsigned char last_byte : this byte is used to flush the Serial input buffer during the I2C interrupt service routine.

ram volatile unsigned char results[18] : 18 bytes that act as user-accessible registers; the first 16 contain ultrasound sensor readings, and the remaining two give the compass reading.

ram volatile unsigned char reading[2] : 2 bytes to hold the Timer values obtained from measuring an echo pulse; the values are then transferred into the appropriate results register.

ram volatile int j = 0 : integer used to index the results registers. Any value written over the I2C bus overwrites this variable, allowing the user to select which registers they wish to read from. This value will loop back to 0 if it overflows past 18.

ram volatile long c : an integer value used in conjunction with the local variable i (found in main). The value of i is incremented for every iteration of the main loop; when i exceeds the value of c, the status of the heartbeat bit changes. This is visible as a flashing LED on the PCB. Accessing the I2C interrupt service routine (ISR) causes the value of c to change momentarily, and the LED to flash faster.
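The wraparound behaviour of the register index j can be sketched in isolation. The following is an illustrative C++ model, not the PIC C18 source; the constant name NUM_REGISTERS and the function name wrap_index are assumptions introduced for clarity.

```cpp
#include <cassert>

// Hypothetical model of the sensor controller's register indexing:
// 18 user-accessible registers (16 sensor bytes + 2 compass bytes).
const int NUM_REGISTERS = 18;

// Returns the index actually used after the wraparound check,
// mirroring the "loop back to 0 if it overflows past 18" behaviour.
int wrap_index(int j) {
    if (j >= NUM_REGISTERS) {
        return 0;
    }
    return j;
}
```

For example, wrap_index(17) still addresses the second compass byte, while wrap_index(18) wraps back to the first sensor register.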
Variables in main
unsigned char trig : a byte to store the trigger output patterns for PORTB.

long i : heartbeat counter; this is incremented with each iteration of the main loop. When i overflows the value of c, the status of an external LED is toggled.

int beat : the toggle variable for the heartbeat LED.

int s : variable to hold which sensor is being polled, used in the switch statement. Set to 0 when it overflows the maximum number of sensors (8).

int prox[8] : array of integers used to determine if one or more of the sensors is reading an object closer than 5.88 cm.

int k : variable used to select cells in the prox array.

int relay : variable used to sum the prox array. If this value is greater than 0, then the power to the motors is cut off.
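The relationship between prox and relay can be sketched as follows. This is an illustrative C++ restatement of the behaviour described above, not the PIC C18 source; the function names are assumptions.

```cpp
#include <cassert>

const int NUM_SENSORS = 8;

// Sums the prox flags, as the relay variable does in the firmware:
// a non-zero total means at least one sensor has detected an object
// inside the 5.88 cm override threshold.
int sum_prox(const int prox[NUM_SENSORS]) {
    int relay = 0;
    for (int k = 0; k < NUM_SENSORS; ++k) {
        relay += prox[k];
    }
    return relay;
}

// True when power to the motors should be interrupted.
bool power_cut(const int prox[NUM_SENSORS]) {
    return sum_prox(prox) > 0;
}
```

A single set flag anywhere in the array is enough to open the relay, which matches the report's note that the override acts regardless of which sensor is reading below the threshold.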
Function Prototypes
void i2c_isr(void) : the prototype for the I2C interrupt service routine. This takes no arguments and returns no values.

void setup(void) : the prototype for the function which sets the options for I2C communications, interrupts, and input/output. As above, this function takes no arguments and returns no values.
Preprocessor Directives
#pragma code low_vector=0x18 : this directive defines the following code section to be located at the PIC low-priority interrupt vector. This is a section of the PIC program memory reserved for code that executes when a low-priority interrupt occurs. The code section contains inline assembler code, indicating a jump to the i2c_isr function.

#pragma code : this signifies the end of the code section to be located at the low-priority interrupt vector.

#pragma interruptlow i2c_isr : this marks the beginning of the interrupt service routine. This #pragma is unique to the Microchip C18 compiler, and is not part of the C standard.

#define address 0xE0 : defines the constant address to be 0xE0. This is the address of the sensor controller on the I2C bus.
Program Flow
At the start of the main function, variables are declared and initialised, and global variable c is set to 10 (gives a slow heartbeat). The setup function is called. This sets up the data direction on the input/output ports (TRIS registers), the interrupt configuration (priority off, Serial Port interrupts only), and the I2C settings (slew rate control off, 7-bit address, start/stop interrupts, clock stretching on). A do{...}while(1) loop is started here.
i is incremented, and the code for the heartbeat counter is executed. This is an if statement that checks the logical condition (i > c). If true, the value of the beat variable is toggled using another if statement (if the value is 1, it is set to 0, otherwise it is set to 1). The pin acting as a pull-down for the external LED is set to the value of beat. If c is less than 10, it is incremented until it is equal to this value; this causes the heartbeat LED flashing to slow down after speeding up for an interrupt. The variable s is incremented, and an if statement then checks whether s == 8; if so, s is reset to 0.
The relay variable is used to sum the contents of the prox array. If the relay variable is 0, the relay is held closed, and power can flow to the motors. If the relay variable is not 0, the relay is opened, interrupting the power. The 17th cell in the results array is set to 1 if the power is interrupted, and 0 otherwise (allowing the user to read this back as an override notification flag).

A switch statement, taking s as an argument, is used to select the trigger output pattern for the sensor being polled. Following the switch statement, any interrupts on Timer1 are cleared. The Timer values are set to overflow after 10 microseconds; the Timer is set to run at 0.75 MHz, and is started. Immediately afterwards, the value of trig is assigned to PORTB. This raises the trigger pin of the appropriate sensor to logic 1. When the Timer overflows (after 10 microseconds) PORTB is set to 0, and the Timer is stopped. The interrupt, which was monitored with a while() statement, is cleared.

The Timer is then set for a delay of 700 microseconds (the time quoted by the sensor manufacturer between the end of the trigger pulse and the echo line being raised), and started with no prescale (at 6 MHz). A while() statement is used to monitor for the echo line going high or the timer overflowing. Once the sensor has raised the echo line high, the Timer is set to 0x0000, its lowest value, and the prescale is set to give an effective clock value of 1.5 MHz. This will give a maximum timer value of around 43 milliseconds, 13 milliseconds greater than the maximum pulse width of the sensor. The interrupt from the previous Timer operation is cleared, and the Timer is set running.
A while() statement is used to halt until either the echo pin goes low, or the Timer overflows. Once either of these occurs, the Timer is stopped, and the interrupt cleared. The 2 bytes of the Timer value are assigned to the reading variable. The Timer value is then checked in an if statement; if the value is greater than 512, the appropriate cell in the prox array is cleared; otherwise it is set, flagging an object inside the override threshold.
The contents of reading are assigned to the appropriate pair of cells in the results array.
The Timer is set up for 50 ms before overflow, running at 0.75 MHz, and set running. This is the amount of time recommended by the sensor manufacturer to allow each sensor pulse to fade and eliminate crosstalk. When the Timer has overflowed, the interrupt is cleared, and program execution jumps to the top of the do{...}while(1) loop.
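The correspondence between Timer counts and range can be sketched in C++. This is an illustrative calculation, not code from the project: the speed-of-sound figure of 344.5 m/s is an assumption, chosen because it makes the thresholds quoted elsewhere in the report (512 counts and 1024 counts) come out at 5.88 cm and 11.76 cm respectively.

```cpp
#include <cassert>
#include <cmath>

const double TIMER_HZ = 1.5e6;       // effective Timer clock while timing the echo
const double SOUND_M_PER_S = 344.5;  // assumed speed of sound (see lead-in)

// Converts an echo pulse width, measured in 1.5 MHz Timer counts,
// into a one-way distance in centimetres. The echo pulse covers the
// round trip to the object and back, so the time is halved.
double counts_to_cm(int counts) {
    double echo_seconds = counts / TIMER_HZ;
    return (echo_seconds * SOUND_M_PER_S / 2.0) * 100.0;
}
```

Under this assumption, 512 counts corresponds to roughly 5.88 cm (the prox threshold) and 1024 counts to roughly 11.76 cm (the hardware cut-out figure quoted in the conclusion).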
I2C interrupt
If, at any point during program execution, the Synchronous Serial Port Interrupt Flag (SSPIF) is raised (activity takes place on the I2C bus), the PIC will save all variables currently in use to a special section of memory, and jump to the i2c_isr function at the interrupt vector. The global variable c is set to 2; this increases the frequency of the heartbeat counter by a factor of 5, giving a visual indication of when an interrupt has been triggered. The SSPIF is cleared before any other action takes place. If a series of transactions on the bus occur in quick succession, the interrupt service routine will continue to be called; however, clearing the interrupt as a matter of course ensures that program execution can never 'lock up' due to an interrupt never being cleared. The global variable j, which is used as an index for the results array, is looped around to the start of the array if it has gone past the 18th value.

The program now enters a 4-state machine composed of if/else statements. The four states that can be accessed are listed below; the state accessed depends on the Serial Port Control Flags.

1. If the bus Master is reading, and the last byte received was an address byte, the port buffer is read to flush the data it contains. The buffer is then loaded (for transmission to the Master) with the currently indexed cell of the results array. The Clock Release bit is set, releasing the bus clock and allowing the transaction to proceed.

2. If the Master is reading, and the last byte received was a data byte, the port buffer is read to flush it, after the value of j has been incremented. The newly indexed results cell is loaded into the buffer for transmission. The Clock Release bit is set.

3. If the Master is writing, and the last byte received was an address byte, the value of j is set to the value held in the buffer, setting the index of the results array to the user's desired value. The clock is released.

4. If the Master has sent a NACK condition (end of transaction), the buffer is flushed and the clock released, with no other actions taking place.
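The four-way state selection can be modelled as follows. This is an illustrative C++ sketch of the decision logic only; the boolean flags are simplifications of the PIC's SSP status bits, and the enumerator names are assumptions, not identifiers from the firmware.

```cpp
#include <cassert>

// Hypothetical labels for the four actions the ISR can take.
enum I2CAction {
    LOAD_CURRENT_CELL, // master reading, address byte just received
    LOAD_NEXT_CELL,    // master reading, data byte just received
    SET_INDEX,         // master writing, address byte just received
    FLUSH_ONLY         // NACK condition: end of transaction
};

// Chooses the action from the (simplified) status flags, mirroring
// the if/else chain described in the text.
I2CAction select_action(bool master_reading, bool last_was_address, bool nack) {
    if (nack) return FLUSH_ONLY;
    if (master_reading && last_was_address) return LOAD_CURRENT_CELL;
    if (master_reading) return LOAD_NEXT_CELL;
    return SET_INDEX; // master writing, address byte received
}
```

Note that the NACK condition is checked first, since it ends the transaction regardless of the read/write direction.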
bot.h : the header file for the bot class created to perform the conversions for the hardware interface. Only one instance of this class occurs in Middleman.

stdio.h : contains the standard input/output function definitions for C/C++.

<iostream> : allows the use of streams for input and output.

<stdlib.h> : the C/C++ standard library.

"TCPLink.h" : the header file for Ahmed Aichi's Network API.
Definitions
#define WIN32_LEAN_AND_MEAN : this option excludes rarely used references from the windows.h header file (included in bot.h), reducing build time.

using namespace std : this causes all the C++ standard library functions to be brought into the same namespace.

bot pro : this creates an instance of class bot, referred to as pro. See 3.2.2 for a description of this class.

int i : an integer variable used to count iterations in a for loop.

long left and long right : long integers used to hold speed values received via the TCP link.

char command : a single character which is received via the TCP link and denotes an instruction to the robot.

char ack : a single character returned from pro, and sent via the TCP link to denote either a correct or incorrect instruction.

int local_port : integer holding the local TCP port number for Middleman.

int remote_port : integer holding the remote TCP port number that Middleman will accept connections from.
Program flow
Objects and variables are instantiated/declared. Information is printed to the console, informing the user what the program is and what TCP port numbers are being used. Two functions from the Network API are called to set up the TCP link. The first (TCPLink::Load) launches a program called Bcastmanager. The second declares a TCP link called motorlink, accessed as a stream, which tries to connect to a program called roboterm. When the connection process has begun, the user is informed that connection is in progress, and a full stop is printed to the console every 200 milliseconds, after the fashion of a progress bar. This continues until the TCP link is connected (there is no time-out). When the link is connected, the user is notified and execution proceeds.

The program enters a do{...}while() loop, which continues until the command Q is received. Every 20 milliseconds, the program checks for data received over the TCP link. When data is received, any previously read data still in the buffer is cleared. The command character and the left and right speed values are extracted from the motorlink buffer in the same way that characters are extracted from a cin stream. The command character is converted to uppercase, since a user can enter both lowercase and uppercase characters into the roboterm program, and the pro object deals only with uppercase. The command is printed to the console (for debugging purposes). The command character and left and right speed values are passed to pro.check_command, which returns an appropriate acknowledgement code (either 'A' for a correct command, or 'E' if the command or values were unrecognised or not of the correct type), and acts upon the command (sets the motor speed, for example). The acknowledgement is printed to the console (for debugging purposes).
A nested if(...)else(...) statement checks whether the acknowledgement was 'A'; if so, the appropriate response to the received command is loaded into the motorlink buffer (for example, if the command S was received, the user is expecting eight integer sensor readings after the acknowledgement, whereas if the command was L, only the ack character is expected). If the acknowledgement was 'E', the left and right values are returned as they were received. The information in the motorlink buffer is then sent.
If the command character was anything other than Q, execution returns to the start of the do{...}while(...) loop, waiting for the next instruction. If the command is 'Q', the loop ends, and execution continues to the exit(0) statement.
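The acknowledgement step above can be sketched in C++. This is a hedged illustration, not Middleman's source: the command set shown ('S', 'D', 'L', 'Q') is partly an assumption (the report names S, L and Q explicitly, and refers to a Drive command), and the real validation lives inside bot::check_command.

```cpp
#include <cassert>
#include <cctype>

// Speed range accepted by check_command, as stated in the report.
const long SPEED_MIN = -18;
const long SPEED_MAX = 18;

// Returns 'A' for a recognised command with in-range speed values,
// 'E' otherwise, mirroring the acknowledgement codes described above.
// Lowercase input is folded to uppercase, as Middleman does.
char acknowledge(char command, long left, long right) {
    char c = static_cast<char>(std::toupper(static_cast<unsigned char>(command)));
    bool known = (c == 'S' || c == 'D' || c == 'L' || c == 'Q'); // assumed set
    bool speeds_ok = (left >= SPEED_MIN && left <= SPEED_MAX &&
                      right >= SPEED_MIN && right <= SPEED_MAX);
    return (known && speeds_ok) ? 'A' : 'E';
}
```

As the report suggests, richer diagnostics could be added by returning a third character for commands that validate but whose I2C write subsequently fails.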
#pragma once : this directive instructs the compiler to include bot.h only once in any single compilation. This should increase compile speed in any implementation using more than one object of class bot. This is not a standard ANSI C directive.
Header Files
EnumSerial.h: header file for a library that facilitates Enumeration of serial ports on a Windows PC [Gorman, 2002] . <iostream> is needed for the cout and cin streams. Serial.h is included to allow the use of the Serial Communications Class [R.D.Klein, 2003] .
Member Functions
Public Member Functions
bot (constructor) : since this function is the constructor, all the class variables are initialised here (if necessary). This includes the values for the I2C addresses of the various devices under the program's control. The function has several local variables, including an array vi, which holds the information about the COM ports on the local machine. This array is populated using the function Enumerate, which is defined in the EnumSerial header. Once the array is populated, the names of the ports are listed, and the user selects which port is connected to the USB-I2C interface. The serial interface is then initialised, using the CSerial class defined in Serial.h, with the appropriate parameters (baud rate, etcetera). The user is asked to enter a time-out period in milliseconds; this is the amount of time that will elapse after a Drive command is received before the robot stops moving. The final thing to occur in the constructor is the initialisation of the MD22: the left and right speed values are set to 0, and the acceleration and mode registers are set to the appropriate values.

check_command : this function is the user's method for sending commands to the robot. The arguments consist of a command character, and a left and right speed value (long integers). If a command is being sent that is not a speed command, both speed values should be 0. The function returns an acknowledgement character. The function first checks that the speed values are appropriate (between -18 and 18), and converts them to a value between -127 and 127. The acceleration values are moderated based on the previous speed values; if the difference is above a certain threshold, the value is increased proportionally. The command character is then run through a nested if statement to determine the appropriate action to take. The functions to execute the received commands are called here, and a switch statement is used to choose the response.
timeout_check : this is called in the host program to check the time-out status. The file time recorded upon reception of the previous Drive command (measured in units of 100 nanoseconds since 1st January 1601) is subtracted from the current file time. If the difference is greater than the time-out the user entered, the time-out flag is set and the motors are stopped.

~bot (destructor) : the class destructor stops the motors on the robot, and closes the serial port handle, freeing up the resources associated with it.
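The timeout_check logic can be expressed portably as follows. This is a sketch under stated assumptions: the real class uses Windows FILETIME values (100 ns units since 1601), for which std::chrono::steady_clock stands in here, and the type and member names are invented for illustration.

```cpp
#include <cassert>
#include <chrono>

using Clock = std::chrono::steady_clock;

// Hypothetical model of the Drive-command watchdog in the bot class.
struct DriveTimeout {
    Clock::time_point last_drive;    // recorded when a Drive command arrives
    std::chrono::milliseconds limit; // user-entered time-out period

    // True when the motors should be stopped because no Drive command
    // has been received within the time-out period.
    bool expired(Clock::time_point now) const {
        return (now - last_drive) > limit;
    }
};
```

In use, the host program would call expired() on each pass of its polling loop and issue a stop command the first time it returns true, just as timeout_check stops the motors.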
rangetocm : this function takes two bytes as arguments, and returns an integer value. The first byte is converted to an integer and multiplied by 256; the second byte is also converted and added. This concatenates the two bytes into one integer value, which is then returned.
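The byte concatenation described above amounts to the following. This is a direct restatement of the report's description in self-contained C++; the exact signature in bot.h may differ.

```cpp
#include <cassert>

// Concatenates a high byte and a low byte into a single integer
// reading: high * 256 + low.
int rangetocm(unsigned char high, unsigned char low) {
    return static_cast<int>(high) * 256 + static_cast<int>(low);
}
```

For example, a high byte of 1 and a low byte of 44 yield the reading 300.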
convert : this function converts its argument (a long integer between -18 and 18) to an integer value between -128 and 128. This value is returned by the function.

check : this function performs a simple logical check on its argument to determine if it is outside the range -18 to 18. A logical 1 is returned if this is true, a 0 otherwise.

moderate : this function checks whether the difference between the previous speeds sent to the robot and the new speeds received is greater than 80. If so, the acceleration is set to equal the difference between the new and old values (a higher acceleration value results in slower power stepping of the motors).

instruct : this function is used to send speed values to the MD22 motor controller. The COM port transmission buffer is flushed, then the values are assigned to the buffer and sent. The speed values sent are stored for the purposes of acceleration moderation.

sense : returns a pointer to the receive buffer of the serial port, so that the sensor readings can be extracted immediately after this function is called; if this is not done, data may be lost. The function takes two arguments: the first is the PIC register that the read operation should start at, the second is the number of bytes to read back. The read operation is preceded by a write, setting the PIC to read back from the register corresponding to the first argument. A pause of 10 milliseconds is inserted, to allow the PIC to respond. The COM port is then read into the receive buffer, a pointer to which is returned.

led : this function sends a hard-coded set of instructions to the serial port to turn off the LED on the USB-I2C interface.

flush_tx : this function flushes the serial transmit buffer by overwriting it with 0's.

flush_rx : this function flushes the serial receive buffer by overwriting it with 0's.
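The check and convert helpers can be sketched as below. This is illustrative only: the report quotes the output range as -127..127 in one place and -128..128 in another, and the linear factor of 7 used here (mapping 18 to 126) is an assumption chosen to fit both descriptions approximately, not the actual coefficient in bot.h.

```cpp
#include <cassert>

// Returns 1 if the speed value is outside the accepted -18..18 range,
// 0 otherwise, mirroring the check helper described in the text.
int check(long value) {
    return (value < -18 || value > 18) ? 1 : 0;
}

// Scales a validated -18..18 speed setting into the MD22's signed
// byte range. The factor of 7 is an assumption (see lead-in).
int convert(long value) {
    return static_cast<int>(value) * 7;
}
```

A caller would apply check first and only pass in-range values to convert, which is the order check_command follows.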
- A Sensor Controller Circuit Board
- Eight sensors per robot, with four-way connectors
- A USB-I2C interface with USB cable
- MD22 motor controller
- Hardware Interface Software
- Remote Control Software (optional)
The following User Guide outlines the procedure for correct use of the Sensors and Control System. The setup procedure for each component will be given, followed by instructions for sending commands to the robot using either the Remote Control Software or another program, and information on writing software to incorporate the Hardware Interface Class.
Set-Up
It is recommended that the user check the tyres of the robot, to ensure that they are firmly affixed. Unstable wheel alignment can cause steering problems.
Sensors
[Illustration 1: Sensor connector layout - header strip positions 1-16 on the rear side, and sensor array cells 0-7 on the front side]
The user must note which sensors are connected to which parts of the header strip. Each sensor slot (a group of four colours in Illustration 1) is referenced by a cell in an integer array in the Hardware Interface Class (see ). The blue numbers in Illustration 1 denote the cell in which the sensor reading will be found. It is recommended that a logical connection scheme is applied, i.e. the front sensor is connected in position 0, with connections proceeding clockwise around the chassis.
USB-I2C interface
A female-to-female adapter is included with the Sensor Controller board. This consists of a 5-track piece of veroboard with a 5-way connector on each end. One end of this should be connected to the I2C interface header on the rear of the Sensor Controller board, with the copper side uppermost. The USB-I2C pins should be inserted into the remaining socket, with the component side of the interface circuit uppermost. The USB cable will be connected to the controlling laptop computer.
[Illustration: MD22 power and motor connections - Battery +, Right Motors +, Right Motors -, Left Motors -, Left Motors +, Battery -]
Bcastmanager.exe is located in the root of the laptop's C drive; a copy of this program can be obtained from Ahmed Aichi. Connect the USB cable from the USB-I2C interface to the laptop.
When the software is launched, the user will be presented with a list of all the COM ports available on the laptop. Select the entry in the list that is described as a USB serial port by entering its list index number, then press Enter. The user will then be asked to enter a time-out value in milliseconds: this is the amount of time that will elapse, after the Hardware Interface Software receives a motor command, before the robot is stopped. Choose a suitable value based on the means of controlling the robot and the operating environment. The user will then be asked whether she/he wishes to enable software proximity overrides; enter 'Y' or 'N' as preferred. Once these options have been selected, the Hardware Interface Software will broadcast using Bcastmanager.exe until a connection with a controller is made.
Once a connection is established, the user will be asked whether she/he wishes to remote-drive the robot. Choosing 'No' at this point will take the user to a terminal-style interface where commands are entered one at a time. Choosing 'Yes' will cause driving instructions to be printed to the console, and the user will be able to drive the robot using the arrow keys on the keyboard. Sensor readings will be continuously polled and saved in the same directory as Roboterm.exe, with the filename Sensor_Log.txt.
Table 1: Command Format for Interface with Hardware

Commands from the AI to the Hardware Interface Software (HIS) are sent as null-terminated text of up to ten characters: Motor Speeds ('D', followed by the left and right speed values), Sensor Data Request ('S'), LED off ('L') and Quit ('Q'). The acknowledgements returned by the HIS to the AI are 'A' for a valid command ('D' commands are acknowledged together with the accepted speed values, and 'S' commands are followed by the eight sensor readings), 'O' for an override notification (sent in response to 'S' or 'D' when a proximity override is active), and 'E' for an invalid command.
Troubleshooting
The robot will not move, even though movement instructions are acknowledged
- Ensure that the on/off switch on the rear of the robot is in the on position
- Ensure a 10 amp fuse is fitted in the fuse holder
- Check the connections to the MD22
- Ensure that no objects are within 12 cm of the robot, and check the state of the override bypass switch
- Make sure that the motor battery has sufficient charge
- Ensure that the USB-I2C device is connected properly
The robot accelerated rapidly backwards or forwards when it was turned on!
This is due to the MD22 receiving spurious instructions when the Hardware Interface Software is launched, and is a known issue. Following the set-up procedure exactly will prevent unwanted movement. An inexperienced user should raise the robot off the floor when powering it up, to prevent damage.
/*Instantiate variables*/
int i = 0;    //Incrementer variable
long left;    //Received left speed
long right;   //Received right speed
char command; //Received command character
char ack;     //Acknowledgment character for transmission
//Startup
cout << "~~~~Middleman V1.0 by Edward Cornish~~~~\n\n"; bot pro; //create instance of class bot int local_port = 5555;//Initialise ports int remote_port = 5556;
cout << "Uses port 5555. Looks for port 5556.\n"; //Information for user (delete in final //version) cout << "Local broadcast name = 'middleman'. Looks for name 'roboterm'\n"; TCPLink::Load(local_port, //(Comments by Ahmed Aichi) "middleman", //local broadcast name _on, //enable broadcasts "c:\\", //Path to file BcastManager.exe, make sure the executable is in there true); //Show bcastmanager console. Set to false when you release your program cout << "TCPLink loaded\n"; TCPLink motorlink ("ML" , remote_port , "roboterm"); //Instantiate link called 'motorlink', looks for Roboterm program cout << "Awaiting connection to controller...\n"; do{ //idle while awaiting connection Sleep(200); cout << "."; }while(motorlink.notConnected()); TCPLink::AllConnected(); //Called when all TCPLink objects are connected, //This closes the broadcast manager when all //TCPLink objects on the machine are connected cout << "Connected!\n"; do{ do{ pro.timeout_check(); //check for speed timeout Sleep(20); if(motorlink.notConnected()) { ack = pro.check_command('D', 0, 0); //Stop motors if connection is //lost. cout << "Connection Lost!\n"; do{ //idle while awaiting connection Sleep(200); cout << "."; }while(motorlink.notConnected()); } }while (!motorlink.Receipt()); //Loop while awaiting instruction //notify and proceed
/*Commented by Ahmed Aichi*/
motorlink.FreeInputBuf(); //You should free the input buffer on a regular basis, see documentation
                          //for details. This frees the data that you have extracted and leaves only
                          //the part that you have not read; if you keep old data in there,
                          //performance may be affected (Ahmed)
motorlink >> command >> left >> right; //Receive instruction from buffer
command = toupper(command); //Convert command to uppercase (user can enter upper or lowercase using Roboterm)
cout << "Command: " << command << "\n"; //Display
ack = pro.check_command(command, left, right); //Get appropriate acknowledgement code
cout << "The acknowledgment code is:" << ack << "\n"; //Print to console
if (ack == 'A') //Relay over TCPLink as appropriate { if(command == 'D') { motorlink << ack << " " << left << " " << right << "\n"; }else if (command == 'S') { motorlink << ack << "\n"; for (i = 0; i < 8; i++) { motorlink << pro.sensors[i] << "\n"; //Send each sensor reading //in turn } }else if (command == 'L' || command == 'Q') { motorlink << ack << "\n"; } }else { motorlink << ack << " " << left << " " << right << "\n"; } motorlink.Send(); //Send whatever has been put into buffer //loop while quit command has not been received
motor_address; //Set in constructor sensor_address; // '' motor_mode; // '' motor_acceleration;// '' left; //Received left speed right; //Received right speed old_left; //Previously received left speed old_right; //Previously received right speed
//flag for acceleration moderation //Received command character //Acknowledgment character for transmission //Pointer to byte, used to convert received out and back time (in two bytes) to a //single integer
signed char send_left;//Characters to be sent over I2C to MD22 signed char send_right; int timeout; //Timeout value in ms
bool override_flag;    //Overridden?
bool override_control; //Override on?
FILETIME ft;           //Holds filetime
LARGE_INTEGER stamp;   //Time that last movement instruction was received
LARGE_INTEGER current; //Current file time
bool timed_out;        //Has the drive command timed out?

int rangetocm (BYTE high, BYTE low) //Convert two bytes to an int
{
    int cm;
    cm = ((int)high * 256); //Shift high byte up by one byte
    cm += (int)low;         //Add the low byte
    cm /= 87;               //Scale out-and-back time to centimetres
    return (cm);
}

int convert(long s) //Simple number conversion, for speeds
{
    int spd;              //Eight-bit value
    spd = (s * 127) / 18; //Convert plus or minus 0-18 to plus or minus 0-127
                          //(multiply first: 127/18 would truncate to 7)
    return spd;           //Returns signed int
}

bool check(long v) //Returns 1 if v is within the acceptable range -18 to 18
{
    if ((v > 18) || (v < -18))
        return 0;
    else
        return 1;
}

void moderate (void) //Moderate acceleration in case of high speed differential
{
    motor_acceleration = 0x50;
    if ((old_right - right) > 80)
    {
        motor_acceleration = (old_right - right);
    }else if ((right - old_right) > 80)
    {
        motor_acceleration = (right - old_right);
    }
    if ((old_left - left) > 80)
    {
        motor_acceleration = (old_left - left);
    }else if ((left - old_left) > 80)
    {
        motor_acceleration = (left - old_left);
    }
}

void instruct(signed char left, signed char right) //Send speeds to the MD22
{
    flush_tx();
    txbuff[0] = 0x55;
    txbuff[1] = motor_address;
    txbuff[2] = 0x01;
    txbuff[3] = 0x03;
    txbuff[4] = right;
    txbuff[5] = left;
    txbuff[6] = motor_acceleration;
    i2cbus.Write(&txbuff, 7); //Send the instruction
    old_left = left;          //Store speeds for acceleration moderation
    old_right = right;
}

BYTE *sense(int start, int len) //Read back sensor registers from the PIC
{
    BYTE r = (BYTE)start; //Register to start from
    BYTE l = (BYTE)len;   //Number of registers to read
    txbuff[0] = 0x55;
    txbuff[1] = (sensor_address + 1);   //Read mode
    txbuff[2] = r;                      //Register to start from
    txbuff[3] = l;                      //Number of registers to read
    i2cbus.Write(&txbuff, 5);           //Send read request
    Sleep(10);                          //Give the PIC a chance to respond
    i2cbus.Read(rxbuff,sizeof(rxbuff)); //Get data - may require timeouts etc
    Sleep(10);                          //Give chance to respond
    return (&rxbuff[0]);                //Return address of array
}

void led(void) //Test LED on I2C interface
{
    flush_tx();
    txbuff[0] = 0x5A;
    txbuff[1] = 0x10;
    txbuff[2] = 0x00;
    txbuff[3] = 0x00; //LED should turn off!
    i2cbus.Write(&txbuff, 4);
}

void flush_tx(void) //Flush txbuff
{
    for (int i = 0; i < sizeof(txbuff); i++)
    {
        txbuff[i] = 0; //Zero buffer
    }
}

void flush_rx(void) //Flush rxbuff
{
    for (int i = 0; i < sizeof(rxbuff); i++)
    {
        rxbuff[i] = 0; //Zero buffer
    }
}

public:
int sensors[8];
void timeout_check (void) { long time_ft; if(!timed_out)//Flag already set? { GetSystemTimeAsFileTime(&ft);//Acquire current.LowPart = ft.dwLowDateTime;//Assign to Large_int current.HighPart = ft.dwHighDateTime;//for arithmetic operations time_ft = timeout * 10000;//Convert timeout to units of 100ns if((current.QuadPart - stamp.QuadPart) > time_ft) {//Is difference between current ft and prev ft greater than the timeout value? instruct(0,0);//Stop the robot timed_out = 1;//Raise flag cout << "Time-out! Speed set to " << old_right << " , " << old_left << ", time_ft is " << time_ft << "timeout is " << timeout << "\n"; }//Print info to console } }
char check_command (char command, long new_left, long new_right)
{//Use this method to pass instructions to the robot, and request data.
    char ack;  //Acknowledgment character for transmission
    int i = 0; //Incrementer variable
if (check(new_left) && check(new_right)) //Check that left and { //right speeds are valid left = convert(new_left);//converts the proprietary speed right = convert(new_right);//values to the actual values send_left = (char)left; // send_right = (char)right; cout << "\nThe command is:" << command << "\n"; switch (command) { case 'D' : if(override_flag && override_control) { instruct(0,0); //Need to send S to clear override ack = 'O'; }else {//send speeds instruct(send_left, send_right); cout << "The corresponding left speed value is:" << left << "\n"; cout << "The corresponding right speed value is:" << right << "\n"; ack = 'A'; } //Move or stop the robot cout << send_left << "~" << send_right << "\n"; GetSystemTimeAsFileTime(&ft); stamp.LowPart = ft.dwLowDateTime; stamp.HighPart = ft.dwHighDateTime; //^^Convert to LONG_INTEGER for arithmetic^^ timed_out = 0; //^^ Timestamp code ^^ break; case 'L' : led();//Turn LED off cout << "LED should be off!\n"; ack = 'A'; break; case 'S' : pt = sense(0,16); //Read PIC registers 0-16 cout << "Sensors responding!\n"; for (i = 0; i < 8 ; i++) {//populate array sensors[i] = rangetocm(pt[i*2], pt[i*2+1]); //convert to ints cout << "-" << sensors[i]; if(sensors[i] < 6 && override_control) { override_flag = 1; //set override if ANY sensor is less than 6cm }else if(sensors[i] > 6 && override_control) { //lower flag if sensor is greater than 6cm override_flag = 0; } if(override_flag && override_control) { //only if override occurs and they are turned on instruct(0,0); //Stop if robot gets too close ack = 'O'; }else { ack = 'A'; } } break; case 'Q' : ack = 'A'; cout << "Exiting...."; //QUIT break;
} }else { cout << "\nInvalid input!\n"; ack = 'E'; } return ack; } bot(void) { //CONSTRUCTOR
CArray<SSerInfo,SSerInfo&> vi; //Array of type SSerInfo, contains details of the ports on the machine
EnumSerialPorts(vi,FALSE); //Enumerate ports
int j;                     //To hold the number of ports available
int i = 0;                 //Loop var
char prt;                  //Input var from console
char millisec_value[256];  //Input var for timeout
char override_choice;      //Y or N
LPCTSTR port_name;         //To hold name of selected port, pass to create serial obj
j = vi.GetSize();          //How many ports?
do{ //A loop to print array of type SSerInfo - list of ports on machine
    cout << (i + 1) << " - " << vi[i].strFriendlyName << "\n"; //Print the names of the ports
    i++; //Increment
}while(i != j); //Loop while there are still ports in the array that have not been printed
do{ //Loop while the user selects a port
    cout << "Please Select the COM port attached to the robot interface,\n";
    cout << "by entering the number preceding the dash:";
    cin >> prt;
    if(!isdigit(prt) || atoi(&prt) < 1 || atoi(&prt) > j)
    {//Reject selections outside the printed list
        cout << "\nError! Invalid selection!\n";
    }else break; //**out of loop**
}while(1); //Allow user to specify a COM port to use (since the USB-I2C dongle is enumerated
           //differently on different systems)
i = atoi(&prt); //Convert user-friendly list number to actual number (array cell)
i--;            //Decrement i
cout << i << "\n";                     //Print for debug
cout << vi[i].strFriendlyName << "\n"; //Print for debug
cout << "Opening " << vi[i].strPortName << "...\n"; //Print the port being opened
port_name = vi[i].strPortName; //Pass over the name to create the port object - EnumSerial no longer needed
i2cbus.Open(_T(port_name)); //Open the specified port...
i2cbus.Setup(CSerial::EBaud19200, CSerial::EData8, CSerial::EParNone, CSerial::EStop2);
i2cbus.SetupHandshaking(CSerial::EHandshakeOff); //...with the appropriate parameters (hardcoded, will not change)
/*Set up the addresses - hardcoded*/
motor_address = 0xB0;
sensor_address = 0xE0;
motor_mode = 0x01;         //Signed int, skid-steer mode
motor_acceleration = 0x50; //Limit acceleration
txbuff[0] = 0x55;
txbuff[1] = motor_address;
txbuff[2] = 0x00;
txbuff[3] = 0x04;
txbuff[4] = motor_mode;
txbuff[5] = 0x00;
txbuff[6] = 0x00;
txbuff[7] = motor_acceleration; //Mode, stopped, acceleration
i2cbus.Write(&txbuff, 8); //Initialise the MD22
timed_out = 1; //Pretend time-out has elapsed before start up
do{
    cout << "Please enter movement time-out value in milliseconds:";
    cin >> millisec_value;
    if(!isdigit(millisec_value[0]) || !isdigit(millisec_value[1]) || !isdigit(millisec_value[2]))
    {//Check that first three chars of string are digits
        cout << "Invalid input! Must be numeric.\n";
    }else break;
}while(1); //Loop while the user enters an appropriate timeout value
timeout = atoi(millisec_value); //Assign the entered value to an integer var
do{
    cout << "Do you wish to activate emergency overrides? (Y or N):"; //Ask user to enter yes or no
    cin >> override_choice;
    override_choice = toupper(override_choice); //Convert input to uppercase
    if( override_choice != 'Y' && override_choice != 'N')
    {
        cout << "Invalid input! Must be Y or N!\n";
    }else break;
}while(1);
switch(override_choice) {//set control flag accordingly case 'Y' : override_control = 1; break; case 'N' : override_control = 0; break; }
~bot(void)
{ //DESTRUCTOR
    txbuff[0] = 0x55;
    txbuff[1] = motor_address;
    txbuff[2] = 0x00;
    txbuff[3] = 0x04;
    txbuff[4] = motor_mode;
    txbuff[5] = 0x00;
    txbuff[6] = 0x00;
    txbuff[7] = motor_acceleration; //Mode, stopped, acceleration
using namespace std; void main (void) { //Instantiate variables char cmd; //Holds command character received from AI char ack; //Holds ack character received from HIS long left; long right; //speeds to send
long rxleft;//Speeds received with acknowledgement long rxright; int *spd_pt = 0;//pointer to keyboard speeds keys drive; //instantiate keys object int dr_left = 0;//Speeds used in keyboard control mode int dr_right = 0; string left_buf;//FOr holding terminal speed input string right_buf; int sensors[8];//Sensor readings fstream sensor_log;//File stream object to hold sensor readings SYSTEMTIME time;//Used for sensor log timestamp cout << "~~~~~Roboterm V1.0, written by Edward Cornish~~~~~\n"; int local_port = 5556;//Ports are hardcoded int remote_port = 5555; TCPLink::Load(local_port, //Comments on this by Ahmed Aichi "roboterm", //local broadcast name _on, //enable broadcasts "c:\\", //Path to file BcastManager.exe, make sure the executable is in there true); TCPLink motorlink ("ML", remote_port, "middleman"); //means connect to the program on local 5555 do{ Sleep(200);//Await connection cout << "."; }while(motorlink.notConnected()); TCPLink::AllConnected(); //Called when all TCPLink objects are connected, //This closes the broadcast manager when all TCPLink //objects on the machine are connected(Ahmed) cout << "Connected!\n"; do{ cout << "Do you wish to remote-drive the robot? Please enter Y or N:"; cin >> cmd;//Use cmd to receive user's choice cmd = toupper(cmd);//To uppercase if(cmd == 'N')//User wants terminal mode { do{//Terminal mode starts here cout << "\nPlease enter a command: "; cin >> cmd;//Take command input cmd = toupper(cmd);//Uppercase if (cmd != 'Q' && cmd == 'D') {//Drive command cout << "\nPlease enter left speed value: "; cin >> left_buf; cout << "\nPlease enter right speed value: "; cin >> right_buf;//Get speeds } else { right = 0; left = 0; //Right and left speeds are 0 when command is not D. This DOES NOT stop the //robot! 
} stringstream sl(left_buf);//Stringstream for string to int conversion stringstream sr(right_buf); sl >> left;//Convert sr >> right; motorlink << cmd << " " << left << " " << right << "\n"; motorlink.Send();//Send command and speeds while (!motorlink.Receipt())//Await acknowledgement { Sleep(10);//Idle cout << "~"; }
if( cmd != 'Q' && cmd == 'D') {//Receive two speeds with acknowledgement motorlink >> ack >> rxleft >> rxright; cout << ack << "-" << rxleft << "-" << rxright << "\n"; left = 0;//reset speeds right = 0; }else if (cmd != 'Q' && cmd == 'S') {//Receive eight sensor readings with acknowledgement motorlink >> ack; for(int x = 0; x < 8; x++) { motorlink >> sensors[x]; //fill sensor array with readings } cout << ack;//print ack for(int x = 0; x < 8; x++) { cout << "-" << (int)sensors[x]; //print all received sensor readings to screen } cout << "\n"; }else{ } motorlink >> ack; cout << ack; //just receive acknowledgement
}else if (cmd == 'Y') //User wishes remote control
{
    cout << "Entering remote-drive mode.\n";
    //Print driving instructions to console
    cout << "~~~~Driving Instructions~~~~\n\n\n";
    cout << "> Accelerate = Up Arrow key\n";
    cout << "> Turn Left = Left Arrow key\n";
    cout << "> Turn Right = Right Arrow key\n";
    cout << "> Decelerate = Down Arrow key\n";
    cout << "> Full Stop = Space Bar\n";
    cout << "> Centre Steering = Enter\n";
    cout << "> Exit Remote-drive = Escape\n\n";
    cout << "~~~~Drive Safely!~~~~\n";
sensor_log.open("Sensor_Log.txt", ios::out | ios::app); //Open sensor log file for output
do{
    if(_kbhit()) //Detect keystroke
    {
        spd_pt = drive.capture(); //Get pointer to speed from keyboard function
        dr_left = *spd_pt;        //Assign speeds
        spd_pt = spd_pt + 1;
        dr_right = *spd_pt;
        cmd = 'D'; //Set command to drive
    }else
    { //Poll sensors
        cmd = 'S';
        left = 0;  //Speed values are null
        right = 0;
        motorlink << cmd << " " << left << " " << right << "\n";
        motorlink.Send(); //Send sensor command
        while (!motorlink.Receipt())
        {
            Sleep(10); //Idle while awaiting data
        }
        motorlink >> ack; //Get acknowledgement
        //Retrieve date and time, append to sensor log file
        GetSystemTime(&time);
        sensor_log << time.wDay << "/" << time.wMonth << "/" << time.wYear << "\n"
                   << time.wHour << ":" << time.wMinute << ":" << time.wSecond
                   << ":" << time.wMilliseconds << "\n";
        //Write latest sensor readings to log file
        for(int x = 0; x < 8; x++)
        {
            motorlink >> sensors[x];  //Receive sensor readings
            sensor_log << sensors[x]; //Write to file
            if (x < 7)
            {//Insert separation characters between readings
                sensor_log << "|";
            }
        }
        sensor_log << "\n";
        Sleep(20); //Pause between sensor readings
    }
    if(dr_left == 666 && dr_right == 666) //666 is exit code (no such speed)
    {//Exit remote drive
        cout << "You have pressed ESC. Exiting remote-drive...\n";
        dr_left = 0; //Zero speed values
        dr_right = 0;
        sensor_log.close(); //Close file
        break;
    }else if (cmd == 'D')
    {//Send speed values from keyboard capture
        motorlink << cmd << " " << dr_left << " " << dr_right << "\n";
        motorlink.Send();
        while (!motorlink.Receipt())
        {
            Sleep(10); //Await acknowledgement
            cout << cmd;
        }
        motorlink >> ack >> rxleft >> rxright; //Retrieve acknowledgement
        cout << ack << "-" << rxleft << "-" << rxright << "\n";
    }
}while(1);
}
else
{
cout << "\nUnrecognised Command!"; }//User has entered an unknown command value motorlink.FreeInputBuf(); //Flush Broadcast buffer }while(1);
keys.cpp
#include "keys.h"

keys::keys(void)
{//Constructor - initialise vars
    k1 = 0;
    k2 = 0;
    spd_left_right[0] = 0; //Returned speeds
    spd_left_right[1] = 0;
    left_spd = 0;
    right_spd = 0;
}

keys::~keys(void)
{//Nothing to do in destructor
}

int *keys::capture(void) //Check kbhit
{
    k1 = _getch(); //Acquire char
    if (_kbhit())  //If second byte of code to be read
    {
        k2 = _getch();
    }
    //Speeds are changed in increments of two, to reduce the number
    //of keystrokes needed to reach full speed, and to eliminate an
    //annoying beep sound when a speed value of 1 was printed to the console
    if (k1 == 224)
    {//Arrow keys
        switch (k2)
        {//Determine which key based on second part of code
            case 72 : //Up arrow
                left_spd += 2;
                right_spd += 2; //Speed up
                break;
            case 80 : //Down arrow
                left_spd -= 2;
                right_spd -= 2; //Slow down
                break;
            case 75 : //Left arrow
                left_spd -= 2;
                right_spd += 2; //Turn left
                break;
            case 77 : //Right arrow
                left_spd += 2;
                right_spd -= 2; //Turn right
                break;
        }
    }else if (k1 == 32)
    {//Spacebar - stops robot
        left_spd = 0;
        right_spd = 0;
    }
    else if (k1 == 27) //ESC - quits remote drive
    {
        spd_left_right[0] = 666;
        spd_left_right[1] = 666; //666 should never be reached as a result of keypresses (see below)
        return spd_left_right;
    }
    /*Stop speeds from exceeding limits*/
    if( left_spd > 18) left_spd = 18;
    if( left_spd < -18) left_spd = -18;
    if( right_spd > 18) right_spd = 18;
if( right_spd < -18) right_spd = -18;
spd_left_right[0] = left_spd; spd_left_right[1] = right_spd; //Assign speeds return &spd_left_right[0]; //And return them }