
  • 7/28/2019 Sensors and control report

    1/77

    Special Engineering Project

    Robotic Sensors & Control

    Final Report

    Edward Cornish

    [email protected]


    University of Surrey

    School of Electronics & Physical Sciences

Department of Electronic Engineering

Final Year Project Dissertation

I confirm that the project dissertation I am submitting is entirely my own work and that any material used from other sources has been clearly identified and properly acknowledged and referenced. In submitting this final version of my report to the JISC anti-plagiarism software resource, I confirm that my work does not contravene the university regulations on plagiarism as described in the Student Handbook. In so doing I also acknowledge that I may be held to account for any particular instances of uncited work detected by the JISC anti-plagiarism software, or as may be found by the project examiner or project organiser. I also understand that if an allegation of plagiarism is upheld via an Academic Misconduct Hearing, then I may forfeit any credit for this module or a more severe penalty may be agreed.

    Project Title: Robotic Sensors and Control System

Student Name: Edward Cornish

    Supervisor: Dr Richard Bowden

Date: 6-5-2007


    Abstract

The Special Engineering Project is based around the development of a robotic platform for use in the University of Surrey CVSSP. This report deals with the development of the Sensing and Control system, which allows the robot to move around its environment using ultrasound sensors to detect obstacles. The Sensing and Control system receives instructions from the robot's Decision System and passes sensor data back over the same channel. The project was a success in that the Sensing and Control system met the specifications given; however, there is significant room for improvement of the system.


    Table of Contents

1 INTRODUCTION TO THE PROJECT
  1.1 OVERALL AIMS OF THE PROJECT (ENUMERATED)
  1.2 SPECIFICATION FOR SENSING AND CONTROL SYSTEM
  1.3 CONCLUSION

2 RESEARCH
  2.1 INTRODUCTION
  2.2 SENSOR TECHNOLOGIES
    2.2.1 Infra-Red
    2.2.2 RADAR
    2.2.3 Inductive, Magnetic, Capacitive
      Figure 2.1: Operation of a Hall-effect sensor in conjunction with a permanent magnet [Fu, 1987] pp279
    2.2.4 Sonar
      Figure 2.2: Differentiating between Walls and Corners using RCDs [Nehmzow, 2000] pp28
      Figure 2.3: 360 degree sonar scan, with two RCDs [Nehmzow, 2000] pp27
    2.2.5 Laser Range Finders
    2.2.6 Shaft Encoders
  2.3 CONTROL OF MOTORS
  2.4 CONTROL INTERFACE
  2.5 TECHNOLOGIES SELECTED
    Figure 2.4: Position of Sensors on Chassis

3 DESIGN OVERVIEW
  3.1 INTRODUCTION
  3.2 HARDWARE DESIGN HIERARCHY
  3.3 SOFTWARE DESIGN HIERARCHY
  3.4 CONCLUSION

4 HARDWARE DESIGN
  4.1 INTRODUCTION
  4.2 PIC SENSOR CONTROLLER
    4.2.1 Important features of the circuit
      Figure 4.1: Circuit Diagram of Sensor Controller
    4.2.2 Brief explanation of I2C bus protocol
  4.3 MOTOR CONTROLLER
  4.4 LAPTOP CRADLE AND SENSOR PLATFORM
    Figure 4.2: Laptop Cradle Design (dimensions in mm)
    Figure 4.3: Sensor Bracket Design
  4.5 TESTING (HARDWARE)
    4.5.1 Motor Control Tests (Equipment Used; Test Circuit; Test Procedure; Results & Analysis)
      Figure 4.4: Motor Controller Test Circuit
    4.5.2 Speed Tests (Aim; Equipment Used; Methodology; Results & Analysis)
      Table 4.5.1: Average speeds for speed settings 6 to 18
      Figure 4.5: Graph showing approximate speed values (averaged over three readings and taking into account uncertainties)
    4.5.3 Oscilloscope testing of I2C bus (Aim; Equipment Used; Methodology; Results & Analysis; Modifications)
    4.5.4 Sensor Tests (Equipment Used; Test Circuit; Methodology; Results; Analysis of Results)
      Figure 4.6: Sensor Test Circuit
      Table 4.5.2: Sensor Test Results
    4.5.5 Sensor Testing Conclusive (Points to be investigated; Equipment Used; Assumptions; Methodology & Results; Tests 1-6; Analysis of Results)
      Figure 4.7: Graph showing results of initial sensor tests
      Table 4.5.3: Results for Test 6
      Figure 4.8: Graph showing results for Test 6, with Y axis error shown
    4.5.6 Power Consumption Tests (Aim; Methodology; Results & Analysis)
    4.5.7 Override Relay Tests (Aim; Methodology; Results & Analysis; Modifications; Results & Analysis of Modified Circuit)
  4.6 CONCLUSION

5 SOFTWARE DESIGN
  5.1 INTRODUCTION
  5.2 SENSOR CONTROLLER PIC CODE
    Figure 5.1: PIC code Flowchart
  5.3 HARDWARE INTERFACE SOFTWARE ('MIDDLEMAN')
    Table 5.3.1: Command Format for Interface with Hardware
    Table 5.3.2: Acknowledgement Format for Interface with Hardware
  5.4 HARDWARE INTERFACE CLASS ('BOT')
    Figure 5.2: Hardware Interface class - Invocation hierarchy diagram
  5.5 REMOTE CONTROL SOFTWARE ('ROBOTERM')
  5.6 SOFTWARE TESTING
    5.6.1 Sensor Tests on Robot Chassis (Aim; Methodology; Results; Analysis of Results)
      Table 5.6.1: Apparent Distances (in cm) when mounted on Chassis
    5.6.2 Serial Communications Test Program (Test Procedure; Results & Analysis)
  5.7 CONCLUSION

6 OTHER TESTING
  6.1 INTRODUCTION
  6.2 INTEGRATION WORK - FLOOR RUNNING TESTS WITH CAMERA
    6.2.1 Aim
    6.2.2 Methodology
    6.2.3 Results
  6.3 INTEGRATION WORK - COLLECTION OF DATA FOR DECISION SYSTEM
    6.3.1 Aim
    6.3.2 Methodology
    6.3.3 Results

7 PROBLEMS AND ISSUES ENCOUNTERED
  7.1 INTRODUCTION
  7.2 PROJECT TASK OVERRUNS
    Figure 7.1: Gantt Chart created at beginning of project
    Figure 7.2: Gantt Chart created half-way through the Project
    Figure 7.3: Retrospective Gantt Chart based on schedule of whole project
  7.3 COMPLETION OF PIC SENSOR INTERFACE
  7.4 HARDWARE CONSTRUCTION
  7.5 MANOEUVRABILITY ISSUES
  7.6 HARDWARE INTERFACE CLASS TIME-OUT

8 CONCLUSION

9 REFERENCES

APPENDIX A - CODE ANALYSIS
  PIC SENSOR CONTROLLER CODE (Header Files; Global Variables; Variables in main; Function Prototypes; Preprocessor Directives; Program Flow; I2C interrupt)
  HARDWARE INTERFACE SOFTWARE (Header Files; Definitions; Objects and Variables; Program flow)
  HARDWARE INTERFACE CLASS (Preprocessor directives; Header Files; Member Functions)

APPENDIX B - USER GUIDE: SENSORS AND CONTROL SYSTEM
  INTRODUCTION
  SET-UP (Sensor Controller Circuit Board; Sensors; USB-I2C interface; MD22 Motor Controller; Hardware Interface Software; Remote Control Software)
  CONTROLLING THE ROBOT
    Using Roboterm.exe
      Table 1: Command Format for Interface with Hardware
      Table 2: Acknowledgement Format for Interface with Hardware
    Using other software
    Hardware Proximity Override
  USING THE HARDWARE INTERFACE CLASS
  TROUBLESHOOTING

APPENDIX C - C++ SOURCE CODE
  HARDWARE INTERFACE SOFTWARE - MIDDLEMAN.EXE
  HARDWARE INTERFACE CLASS - BOT.H
  REMOTE CONTROL SOFTWARE - ROBOTERM.EXE
  KEYS CLASS - KEYS.H AND KEYS.CPP (keys.h; keys.cpp)


    Robotic Sensors & Control - Final Project Report

1 Introduction To The Project

This report outlines the research and initial work performed on a sensing and control system for a robot. Research into existing technologies is outlined, with appropriate references. The testing performed to validate the technologies used is described. The overall design, at the time of writing, is described, along with a functional description of each module.

    The sensing and control system discussed in this report is intended to be used on a robotic platform for use in

    the University of Surrey Centre for Vision Systems and Signal Processing (CVSSP). The robot is being

    developed by a team of four Electronics Undergraduate Students, of which the author is one. Each member of

    the team has been assigned a specific role within the project. The team members are responsible for individually

    documenting their progress, and therefore this report shall not discuss the design particulars of the other parts of

    the robotic platform, except where such details affect the sensing and control system. The roles assigned to each

member of the team are as follows:

Martin Nicholson - Project Management and Artificial Intelligence.

Peter Helland - Video Processing and Acquisition.

Ahmed Aichi - Networking and Inter-System Communications.

Edward Cornish (author) - Robotic Locomotion Control and Collision Avoidance.

    The Project is being academically supervised by Dr Richard Bowden, a specialist in Computer Vision

Systems. Dr Bowden's role is to oversee the project and to provide guidance where necessary; however, all

    management and design decisions are made by the Undergraduate Team.

    The robotic platform (hereafter referred to as 'the robot') is controlled by one of two laptop computers

    supplied by the CVSSP. It is hoped that, once one platform design has been finalised and tested, a duplicate

    system can be constructed that will allow the robots to be used for Artificial Intelligence experiments in team

working and emergent behaviours. The laptop computers are the only technology supplied for use on the robot;

all other equipment must be purchased from a Project Budget of £1000. The University does provide support

facilities in the form of the Engineering Workshop (where bespoke mechanical items can be constructed,

usually free of charge), the CVSSP computing system ('Brother', to allow Video Processing work to be carried

    out), and the laboratory and library facilities available to all Electronics Undergraduates. The robot(s), once

completed, will be maintained at the CVSSP for use in future projects.

    At the earliest stage of the project, a pre-built robotic chassis was chosen to be the basis for the project. This

chassis is a Lynxmotion product, model 4WD3 [Lynx, 2006]. This model was chosen for

its large size and flexibility, allowing many devices to be mounted and supported. The 4WD3 and attached non-

electronic components (mounting frames etc.) are hereafter referred to as 'the chassis'. The chassis includes

    electric motors for locomotion.

The four project areas are interlinked, yet each is a distinct system in its own right. The Artificial

    Intelligence of the robot will make decisions related to navigation based on the information it receives through


    the interfaces provided to it (largely software based) [Nicholson, 2007] . The interfaces between the various

    software programs composing the robot control system will communicate using a custom Network API, which

    can be used to communicate between programs running on the same computer, as well as between

    computers [Aichi, 2007]. This allows decentralised processing to take place; video data captured on the robot

    can be passed over a wireless link to the CVSSP network, where the powerful vision processing systems in place

    can relieve the processor load placed on the laptop computer.

    The vision processing algorithms developed as part of the project will be used to determine the velocity of

the robot, the rough position, and the presence of objects of interest (fixed obstacles, people, goals, etc.) [Helland,

2007]. The Sensing and Control system will allow the robot to move around its environment in order to

    complete the goals assigned to it, based on instructions received over the Networking API from the Artificial

    Intelligence system, which will issue instructions based on the information it receives from the Vision System

    and the Sensing and Control system.

    1.1 Overall Aims of The Project (Enumerated)

    1 To produce a robot platform for indoor use.

    2 To design and build a sensor system allowing the robot to navigate around its environment.

    3 To develop a Vision system allowing moving objects to be identified.

    4 To design and implement a Networking API in the control system of the robot, allowing interprocess and peer-to-peer communications.

    5 To develop Artificial Intelligence (AI) routines that allow the robot to be used for future projects in the CVSSP. Also referred to as the Decision System.

    6 To duplicate the robotic platform if possible, and investigate the potential for cooperative behaviours.

    1.2 Specification for Sensing and Control System

    The Sensor System must detect objects within 1 m of the robot (the approximate width of a CVSSP

    corridor), and allow the measured distance to the objects to be passed to other systems in the robot.

    The System must allow a high level of control over the speed and direction of the robot.

    There must be a collision avoidance mechanism that will prevent the robot from driving into obstacles.

    1.3 Conclusion

    The Sensors and Control System is a critical part of the proposed robot platform. It is responsible for

    providing a flexible, modular system for controlling the movement of the robot, retrieving sensor data about the

    surroundings, and preventing collisions.

    The Sensors and Control System is intended to work in parallel with the Vision System to provide

    information to the Decision System. The Network API is to provide communication links between the various

    processes in the other three Systems.


    2 Research

    2.1 Introduction

    In this section, potential technologies are examined and discussed. The technology areas covered are:

    Sensor Technologies

    Control of Motors

    Control of Sensors

    The Sensor Technologies are discussed in order to determine the most appropriate means of detecting

    obstacles, considering the intended operation environment of the robot, and the space, weight, and power

    constraints in effect.

    The chassis used in the robot's design incorporates four DC electric motors. A means to control these motors

    was considered an integral part of the project, especially considering the fixed nature of the motors; the wheels

    cannot be rotated to steer the robot, and so a 'skid-steer' system must be used (where the speed of the motors on

    one side of the robot is increased, to cause the robot to turn towards the opposing side).

    The information collected by the sensors on the robot must be captured and passed back to the higher level

    processes running on the laptop computer. Some processing ability (in either hardware or software) was needed to perform this.

    The interfaces available on the laptop computer needed to be considered, both hardware (the physical ports), and

    the software capabilities (the communications protocols available). The laptop computer is running Windows

XP™ Professional.

    Research was performed using Robotics Textbooks in the University of Surrey Library, and a selection of

    Internet Resources. These sources are referenced in Section 9.

    2.2 Sensor Technologies

    When considering sensor technologies for the robot, the operation environment must be considered. The

    robot is intended to operate indoors, on conventional carpeted floors. The obstacles encountered will primarily

    be furniture, architectural features, and people. The first two categories can be assumed to be stationary,

    whereas people are likely to move around, and may provide fluctuating sensor returns. It is expected that the

    Vision System will be able to differentiate between people and non-living objects.

    2.2.1 Infra-Red

    IR proximity ranging has the disadvantage of only realistically providing detect/non-detect information, since

    the reflectivity of objects to IR is highly variable in an indoor environment [Schur, 2006]. The

    components, however, are widely available and compact.

    IR sensors use reflected IR light to detect surfaces. Low frequency modulation of the emitted beam is usually

    used to eliminate interference from unvarying sources, such as electric lights or the sun. Distance measurements

    are only possible if the environment has uniform colour and surface structure, and the sensors must be calibrated


to do this, which is rarely practical. Black or dark surfaces, for instance, are practically

invisible to IR sensors, so even simple proximity detection is not infallible. It is because

    of this that IR sensors are generally only effective for object detection, and not distance measuring. Furthermore,

    since the intensity of IR light decreases quadratically with distance (proportional to d⁻²), typical maximum ranges

    are 50 to 100 cm, which may prove too small for the purposes of the project. [Nehmzow, 2000]

    2.2.2 RADAR

    RADAR provides an accurate picture of the surroundings, and is a well understood technology. The majority

    of objects in the indoor environment have high radar reflectivity, however there may be significant potential for

    interference from other radio sources in the CVSSP, due to the Wireless Networking systems in place. It is also

    uncertain how much power would be needed to operate a RADAR antenna with sufficient power to work

    effectively over the distances involved.

    2.2.3 Inductive, Magnetic, Capacitive

    In the field of proximity sensing, Inductive sensors may be used to detect the proximity of ferromagnetic

    materials. However, this method is unsuitable for use in the specified environment, as ferromagnetic materials

    are unlikely to be encountered in great quantities, making this technology more suitable for industrial and

    manufacturing robots. In addition to this, the sensor requires motion of a ferromagnetic object to generate an

    output voltage; stationary objects relative to the sensor have no effect. The inductive proximity sensor also has

    an extremely short range, typically fractions of a millimetre. This range limitation is another reason why this

    technology is mainly confined to assembly-line robots.[Fu, 1987]

A technology with a potentially greater detection range is the Hall-effect sensor. This device consists of a

Hall-effect element located between the poles of a permanent magnet. When a ferromagnetic

surface/object is brought close to the magnetic poles, the magnetic field across the sensor is reduced (see

    Figure 2.1).


    This method has similar disadvantages to the Inductive sensor method described above; only ferromagnetic

    materials can be detected, and the range of detection is reduced. [Fu, 1987]

    If a sensor is required to detect proximity to a non-ferromagnetic surface, a capacitive sensor may be used.

    These sensors are capable (with varying degrees of sensitivity) of reacting to all non-gaseous materials. As the

    name implies, the sensor works by detecting a change in capacitance between two electrodes, effectively using

    the sensed object (and the air around it) as part of the capacitor's dielectric. [Fu, 1987]

    Capacitance based sensors are once again subject to a limited range. Also, whilst non-ferrous materials will

give rise to a response, the level will be markedly less than that of a ferrous material; for example, iron can cause

a response 2.5 times greater than that caused by PVC at the same distance ([Fu, 1987] p. 281).

    2.2.4 Sonar

    A great deal of work has been done on (ultrasound) sonar sensing in the field of Robotics.

    In a typical ultrasound sensor system, a 'chirp' of ultrasound is emitted periodically from a reasonably

    narrow-beam acoustic transducer. This burst of ultrasound will be reflected from nearby surfaces and can be

    detected at the sensor after a time T. This time interval is the out-and-back time. Since the speed of sound in air is

    known, it is a simple matter to calculate the distance to the reflecting surface using the relationship between

    velocity and time.

    A major advantage of ultrasound sensing methods is that the dependency of the sensor response upon the

    material being sensed is reduced, when compared to methods such as Opto-sensing and RADAR. This is clearly

    of benefit in an indoor environment, where a variety of obstacles will be found having different surface


    Figure 2.1: Operation of a Hall-effect sensor in conjunction with a permanent magnet [Fu, 1987] p. 279


    compositions may be encountered. A contrasting disadvantage is that the sensor field is in the shape of a cone;

    the detected object could be anywhere within the sensor cone at the measured distance. The accuracy of the

    position measurement is dependent on the width of the sensor beam. Also, a phenomenon called Specular

    Reflections can cause inaccuracies in the measurements. If an ultrasound beam strikes a smooth surface at a

    sufficiently shallow angle, the beam will be reflected away from the receiver instead of back towards it. This

    may cause a larger range than actually exists to be read by the sensor.

There are methods that have been developed to combat Specular Reflections. One method uses so-called

Regions of constant depth. If a 360° sonar scan is performed (for example), a significant section of arc where

    the ranges measured are constant is termed a Region of constant depth (RCD, see Figure 2.2). These regions can

    be interpreted by taking two (or more) sensor scans from two differing locations and comparing the arcs of the

    RCDs. If the arcs intersect, a corner is indicated at the point of intersection. If the arcs are caused by a flat wall,

    they will be at a tangent to the reflecting plane (see Figure 2.3). [Nehmzow, 2000]

    A third issue to be overcome relates to arrays of ultrasound sensors. If one sensor detects the reflected pulse

    from another, so-called crosstalk arises. Solutions to this include coding the sensor signals (so each detector can recognise its own pulse), or

    controlling the timing of the sensors to prevent erroneous detections. [Nehmzow, 2000]

    Ultrasound sensors are effective at much greater distances than the proximity sensing methods mentioned

    above, even taking into account the increased atmospheric attenuation of sound waves at high frequencies. This

    means that the robot would have more freedom of movement, and would be able to sense obstacles at a greater

    range, allowing more time for path-planning computations to be performed.

    An experiment performed by Mitsubishi Electric Corporation showed that a mechanically scanned ultrasound

    sensor was able to detect the locations of standing persons within a room ([Pugh, 1986]pp.271). Investigations

    were also made into the practicality of an electronic scanning system.

The advantage of the electronic scanning system over the mechanical system is that it avoids the servos used to

pan and tilt the sensor beam, which contribute vibrational noise and make the assembly by necessity quite large. An electronic

    scanning system can be used to deflect the beam by unifying the phases of the emitter elements in the desired

    direction. The study performed by Mitsubishi highlighted the problems with resolution, reliability, and

    processing time that must be overcome in the implementation of this form of sensor.


    Figure 2.2: Differentiating between Walls and Corners using RCDs [Nehmzow, 2000] p. 28

    Figure 2.3: 360° sonar scan, with two RCDs [Nehmzow, 2000] p. 27


    A fixed sensor will not have the flexibility of the scanning sensor, but will be simpler to mount and utilise.

    Multiple sensors are needed to provide all around coverage.

    2.2.5 Laser Range Finders

    These sensors are also referred to as Laser Radar or 'Lidar'. They are common in robotics, and function in the

    same manner as the sonar sensors detailed above; instead of emitting a pulse of ultrasound, a pulse of near-

    infrared light is emitted. The out-and-back time is again used to determine the range to the detected object.

    However, since the speed of light is much faster than the speed of sound through air at room temperature (order

    of 10⁶ times higher), the means of measuring the out-and-back time must be proportionately more accurate.

    Since the wavelength is also much shorter, the probability of total reflection off a smooth surface is reduced,

    so specular reflections are less of an issue. Accuracy of commercial Laser sensors is typically in the millimetre

    range. [Nehmzow, 2000]

    2.2.6 Shaft Encoders

    In order to determine the robot's position, some form of odometry is useful. Sensors known as shaft encoders

    are used to measure the rotations of the robot's wheels. If the circumference of the wheels is known, the distance

    travelled (and possibly the direction) can be determined.

    For measuring distance travelled, Incremental encoders are most suitable. The alternative, Absolute encoders,

    are more suitable for measuring the current position of a rotating shaft. Incremental encoders are suited for

    summing movement over time. In a typical set up, two photoreceptors are used to read a disc affixed to the shaft.

    The disc is encoded with two tracks of information, one for each receptor, in such a way that one will always lag

    in the case of clockwise rotation, and always lead in the case of anti-clockwise rotation (for example). The

    number of times each receptor is triggered gives the number of revolutions completed.

    Using shaft encoders to provide odometry, and in turn an estimate of position, is known as dead reckoning. It

    has been observed in practice that dead reckoning is very unreliable over any significant distance. This is due to

    motions of the shaft that are not due to locomotive rotation, such as skidding or slipping on a surface. Such

    issues would be of particular concern in a skid steer system. [Nehmzow, 2000]

    When conducting preliminary research for this project, it was noted that Optical (IR and Laser) and Acoustic

    sensors (ultrasound) are common products available for amateur Roboticists. This may be taken as a reasonable

    indication of their ease of use and manufacture, and of their suitability for indoor robotic sensing applications.

    2.3 Control of Motors

    The robot uses DC electric motors, and in order to control and drive them, a system incorporating a power

    converter/regulator is needed. The power from the chassis battery must be converted to the 7.2 V needed by the

    DC motors, and regulated in such a way as to provide speed, acceleration, and directional control to the robot.

    A range of off-the-shelf controllers is available from Devantech Ltd [Devan, 2006], which perform exactly

    the task outlined in the above paragraph. These controllers are highly modular, requiring only power and control


    inputs, and can be controlled using a variety of methods, including analogue voltage inputs, and Radio Control

    Model systems. Of particular note is the I2C capability built into many of these products, considering the

    availability of USB-I2C interface devices from the same manufacturer, although analogue signals can also be

    used to control the speed/direction.

    It was possible that a custom circuit could have been designed, incorporating the power regulation and

communications capabilities desired. This, however, would have been a major design undertaking, requiring

significant time and effort to develop a working device. This option was considered infeasible

    within the time constraints of this project.

    2.4 Control Interface

    The motor controllers used in the example robot described above act as slaves on an I2C bus. This

    communications standard was developed by Philips as a means of communicating between Integrated Circuits,

    using a minimum number of data lines. A range of sensors are available from the company supplying the motor

    controller and the chassis that also act as I2C slaves. Elsewhere in the product range, there are sensors based on

    the same principle but triggered by logic levels, with no bus communication functionality. It was felt that the

    logic-triggered sensors should be combined with a processing interface, as this would allow for more flexibility,

    and would foster greater understanding of the technology.

    The laptop computer has several USB ports available (due to an installed USB expansion card), and a USB to

    I2C translation device is available from Devantech Ltd [Devan, 2006]. This makes the I2C bus a viable choice

    for a sensor/motor controller interface, as the aforementioned translation device is treated as a serial (COM) port

    by Windows (through the use of the freely available drivers), and writing Windows programs to access serial ports is a simple task.

    Another option for the connection of sensors to the laptop computer is an RS232 interface to one of the serial

    communications ports. This would have required more time to implement, however, and in order to achieve a

    working solution quickly, the USB-I2C interface was deemed to be the best choice. In addition to this, RS232 is

an older technology that may not be supported on future laptops. Therefore, to add an element of

'future-proofing' to the system, USB is the better choice.

    If the bus-enabled sensors were chosen, they would have been connected to the laptop and controlled directly

    via the Hardware Interface Software. If the logic-triggered sensors were chosen, then some intermediate device

    needed to be in place to govern communications between the sensors and the laptop PC. There also needed to be

    some provision for the possibility of using other types of sensors, and sensors from other manufacturers. The I2C

    enabled sensors appear to be unique to Devantech Ltd [Devan, 2006], and this should be considered with regard

    to the long term maintainability of the system.

    Such an intermediate device needs to either have inbuilt I2C functionality, or be sufficiently customisable

    that an I2C interface can be implemented. Devantech Ltd offer a range of I2C to IO devices that can do this job

    very simply.

    A more sophisticated solution is to use a PIC (Peripheral Interface Controller), manufactured by

    Microchip Inc. These chips come with a wide range of features (including in-built I2C functionality) and are a


    very popular and widely understood product range. Many of Devantech Ltd's products are based around PIC

    micro controllers, which suggests that the PIC product family is trusted and well-supported by the robotics

    community. In addition to this, facilities for programming PICs are available in the Undergraduate Labs. Such

    facilities include MPLAB™ [Micro, 2000] software, which allows programs to be composed in assembly

    language, and, with the installed C18 compiler, in C. Since the author is familiar with C from a level 1

    programming course, this does not require learning a new language. The MPLAB software includes

    sophisticated debugging tools, allowing code execution to be 'stepped through', whilst displaying the values of

    any program variables. Debugging can be done in hardware, if an In-Circuit Debugger tool is connected to the

    correct pins on the PIC. The undergraduate lab has a number of these tools, as well as PICStart Plus

    programmers, which are simply used to program PICs before they are installed into a circuit. Also useful are

    development boards which provide a variety of tools for testing programs and concepts (switches, keypads,

    displays, etcetera).

    2.5 Technologies Selected

    Based on the available products, and the literature researched (see References), Ultrasound Sensors were

    selected for collision avoidance and obstacle detection. The model chosen was the Devantech Ltd SRF05. This

    Sensor is simple to use, and has a range of 1 cm to 4 m, suitable for the distances encountered in the

    CVSSP.

The motors are controlled by an MD22 Motor Controller. As mentioned above, this was shown to work well

with the chassis and motors in a demonstration video. In addition, buying as many components as possible from

the same manufacturer was intended to keep postage costs down.

    A PIC18F2220 microcontroller is used as a sensor controller, and communicates with the Laptop using the

    built-in I2C module, through a USB to I2C interface (USB-I2C). It should be noted that the I2C module is part

    of a configurable serial port system on the PIC, allowing the use of other serial protocols in the future. The

    configurable logic outputs are used to trigger the SRF05 sensors (see below), and the internal timers are used to

    measure the length of the return pulse.

    The sensors used in the design are triggered by logic signals applied to their control pins. A range of sensors

    are also available with I2C bus functionality, allowing them to act as slave devices and respond to commands in

    the same fashion as the MD22 motor controller. The simpler sensors were selected to keep the Hardware

    Interface Software as simple as possible, and in order to provide a wider range of learning opportunities, such as

    PIC programming.

    It was reasoned that should the PIC based solution prove unworkable, the I2C-ready sensors could be used

    with a minimal number of changes to the Hardware Interface Software. In addition, a possible future project

    could be based on constructing Ultrasound rangers from scratch, based on the commercially available model. If

    this were the case, the rangers would already have an interface in place, although this was not a requirement.


The sensors are mounted on the chassis of the robot at equal angular spacing (see Figure 2.4).

This allows a model of the robot's surroundings to be produced by frequently polling each sensor. In this

    way, a constantly updating navigational map can be constructed. The eight sensors, mounted as shown in Figure

    2.4, should allow for maximum information about the environment to be collected, as well as the possibility of

    using Regions of Constant Depth (see 2.2.4).

    It was hoped to add a magnetic sensor to the robot to allow orientation to be determined. However due to

    time constraints this was not implemented, and there was no significant research done on this type of sensor.


    Figure 2.4: Position of Sensors on Chassis


    3 Design Overview

    3.1 Introduction

    This section examines the overall structure of the Sensors and Control System. The Hardware and Software

    aspects of the system are discussed in turn, with brief justifications for different aspects of the design.

    3.2 Hardware Design Hierarchy

    The laptop computer is the control centre for the robot, and is the platform for the software needed to control

    the robot. The laptop has Wireless LAN capability, allowing the robot to be controlled from another computer on

    the same network. This prevents the decision system from being constrained by the specifications of the laptop

    computer.

    The robot sensors and motors are interfaced using a USB/I2C conversion device. This allows commands to

    be sent to the sensor controller/motor controller through a standard serial port software interface.

The motor controller and sensor controller reside on the I2C bus, as mentioned above. The MD22 is a

self-contained, prefabricated unit, requiring only control and power inputs, whereas the sensor controller is a bespoke

    circuit consisting of:

    A PIC chip

    A 24MHz oscillator

    Sensor connections

    Emergency Override Circuit

    The power source for the Sensor Controller and MD22 is drawn from the USB/I2C interface device. This

    negates the need for a DC-DC converter or other 5V supply.

    The control outputs of the MD22 are connected to the chassis motors in a 'skid-steer' configuration. This

    means that the controller drives a pair of motors on each side of the chassis. If a right hand turn is desired, for

    example, the power to the left hand side motors is increased, and the robot will turn to the right. This method

    allows for differential control of direction, which may facilitate simpler computations for the high level AI.

    The SRF05 Ultrasound rangers are mounted at eight points on the frame of the robot; each sensor requires the following connections:

    +5V

    Ground

    Trigger logic input

    Echo logic output

    The Echo outputs of each sensor are connected to a single pin on the control PIC, as each sensor is

    polled separately. Each Trigger input must be driven from a separate PIC output pin, however.

    It was found during the design process that, when two or more sensors were connected to the PIC, the


    echo pulse would not be received by the PIC. An oscilloscope was used to determine that the sensors were

    functioning correctly and were being triggered. It was reasoned that when two or more sensors are connected to

    the same node in the circuit, the echo outputs of the other sensors load that node with their impedance, causing

    the voltage at the PIC to be less than the logic 1 voltage. In order to remedy this, diodes were connected between

each echo pin connection and the circuit node where they were joined. This was observed to resolve the

problem.

    3.3 Software Design Hierarchy

    Data is exchanged between the Robot Hardware and the controlling decision system through an intermediate

    software program (Hardware Interface Software). This program must be running for the robot to respond to

    commands. This program exchanges commands and data over a TCP/IP connection, using a network API

    developed for this project [Aichi, 2007]. The TCP/IP link transmits and receives over the CVSSP Wireless

    LAN.

    The command set used is flexible enough that the robot can be controlled either by a software decision

    system, or by a human operator using a Remote Control program. A piece of software performing this function

    was written to aid in development and testing. This software is referred to as the Remote Control Software.

    3.4 Conclusion

    The Sensors and Control System can be broadly divided into the Hardware aspect, consisting of the Sensor

    Controller (with Sensors) and MD22 Motor Controller, and the Software aspect, consisting of the Hardware

    Interface Software, which is built around the Hardware Interface Class and is the point of control for the user and/or decision system.


    4 Hardware Design

    4.1 Introduction

    This section aims to discuss in greater depth the various Hardware features of the Sensors and Control

    System. Each component is described in terms of its function and capabilities. Concepts necessary for a practical

    understanding of the Hardware are explained.

    The tests that were performed on the Hardware aspects of the Sensors and Control System are described,

    along with the results of the tests, and the conclusions drawn (and actions taken, if any).

    4.2 PIC Sensor Controller

    The SRF05 sensors are controlled by a PIC18F2220 device, configured as an I2C bus slave. The PIC is

    programmed using MPLAB software released by Microchip Technology Incorporated, who also manufacture the

    PIC range of devices. The features built into the PIC18F2220 include:

    On-board and external Oscillator modes

    Pulse Width Modulation (PWM) dedicated inputs and outputs

    10-bit Analogue-to-Digital Converter

    On-board EEPROM data memory and Flash program memory

    Serial Communications module, with USART and I2C capabilities

    The features listed above provide a good deal of flexibility to the user, allowing the PIC to control more

    features of the robot, should they prove necessary. This flexibility extends in scope to future projects. See Figure

    4.1 for the circuit diagram.

    The PIC triggers each sensor in turn, measures the out and back time, and stores the result in a register entry

    reserved for that sensor. The sensor data can be requested at any time by the I2C bus master (the laptop

computer), and the PIC will respond with the data. The PIC is also intended to perform ranging continuously

from the moment power is applied.

    The PIC is programmed to interrupt power to the motors should an obstacle be detected within a small radius

    of the robot. This is achieved by using a relay through which the motor power is carried. If the relay control line

    (connected to the PIC) is at logic 0, the relay shall prevent power from reaching the motors, whereas a logic 1

    will allow power to flow. This has the added benefit of preventing the robot from moving when the sensor

    module is un-powered. A simple switch is fitted in order that this mechanism can be bypassed if necessary.

    The PIC clock is driven from an external crystal, across pins RA6 and RA7, running at 24 MHz. This high

    clock speed minimises delays in processing, since responsiveness is important when making decisions based on

    changing sensor readings.

The timer module used for timing the sensor pulses is clocked from the instruction clock (the oscillator

frequency divided by four), with an additional prescale selected in the Timer options, giving an effective frequency of 1.5 MHz. This allows

    13

  • 7/28/2019 Sensors and control report

    20/77

    Robotic Sensors & Control - Final Project Report

    sufficiently high (> 1cm accuracy) whilst allowing the maximum pulse width of the sensor (~30 ms) to be

    measured. The oscillator uses a 16 bit register to store its counter value; this register is read in two byte read

    operations.
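As a worked illustration, the two single-byte reads can be recombined and scaled to a range. The factor of 87 ticks per centimetre follows from the 1.5 MHz timer rate together with the SRF05's nominal 58 us-per-centimetre echo scaling (seen later in the sensor tests of section 4.5.4); the function names are illustrative, not the actual firmware symbols.

```c
#include <stdint.h>

/* Reassemble the 16-bit timer value from the two single-byte reads. */
uint16_t timer_ticks(uint8_t high, uint8_t low)
{
    return (uint16_t)((uint16_t)high << 8 | low);
}

/* 1.5 MHz means 1.5 ticks per microsecond; the SRF05 echo scales at
   ~58 us per centimetre, so 58 * 1.5 = 87 ticks per centimetre. */
uint16_t ticks_to_cm(uint16_t ticks)
{
    return ticks / 87;
}
```

For example, a 100 cm obstacle produces a ~5800 us echo, i.e. 8700 ticks (0x21FC), which converts back to 100 cm.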

    4.2.1 Important features of the circuit

    1uF decoupling capacitors are connected between the 5 V (Vdd) and 0 V (GND) supplies, to help reduce

    voltage fluctuations and spikes. Capacitors are also connected between GND and circuit nodes that undergo high

    speed voltage transitions to reduce ringing.

Connection points are included for a reset switch for the PIC. The PIC's MCLR pin must be pulled up to Vdd for the PIC to operate correctly; a switch closed between GND and MCLR will pull this pin low, resetting the PIC. This feature is intended for future use as the PIC code is expanded, since it may be possible for program execution to become 'stuck' if future code is not fully implemented.

The heartbeat LED (used to give an indication of program status) is connected between Vdd and the controlling output on the PIC (with a series resistor to limit the current). When the output is driven high, the LED turns off (there is no potential difference across it); when the output is low, the LED is on.

    An RJ12 socket is included for In-Circuit Debugging/Programming. The socket connects to the Vdd and

    GND lines, the MCLR reset pin, and the RB7 and RB6 pins. It is important that the sensors are disconnected

    when using the RJ12 socket, as the sensors will disrupt the signals on the RB7 and RB6 pins.


    Figure 4.1: Circuit Diagram of Sensor Controller


    4.2.2 Brief explanation of I2C bus protocol

In the I2C bus protocol, a device is either a Slave or a Master. Each Slave device has a 7-bit address that must be unique on the bus. Slave devices only respond to signals from the Master; they never initiate a transaction. Generally there is only one Master on the bus; although it is possible to work with multiple Masters, this is beyond the scope of this report.

    The I2C bus consists of two physical signal lines, SCL and SDA. SCL is an active-high clock line, and SDA

    is a Data/Address line. When a device puts a signal onto the bus, it does so by pulling one or both of these lines

    low. This has the effect of automatically reserving the bus for that device until it allows the line to be pulled

    high. Both the SCL and SDA lines must have pull-up resistors (in this case these are included on the USB-I2C

    device).

    In the system described in this report, there is one bus Master (the laptop PC) and two slaves (the MD22 and

    the PIC sensor controller). The slaves have addresses 0xB0 and 0xE0 respectively.

    When the Master wishes to start a transaction, it puts a START condition onto the bus. This consists of the

    Master pulling the SDA line low, then pulling SCL low after a short delay. Upon receiving this START

    condition, all slave devices will immediately listen for their address. The Master will send the address of the

    device it wishes to communicate with; the 7-bit address will be sent as part of a byte, with the last bit indicating

    a read or write operation (read = 1 / write = 0). For example, if the Master wished to write to the Sensor

    Controller, it would send the address 0xE0, and for a read it would send 0xE1.
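This addressing convention can be expressed directly: the read and write variants of a slave address differ only in the least significant bit. The helper names below are illustrative.

```c
#include <stdint.h>

/* The 7-bit slave address occupies the upper bits of the address byte;
   the least significant bit selects the operation (0 = write, 1 = read). */
uint8_t addr_write(uint8_t base) { return base & 0xFE; }
uint8_t addr_read(uint8_t base)  { return base | 0x01; }
```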

The Master controls the SCL line for this phase of the transaction. Data on the SDA line is valid when the SCL line is high. If a Slave detects its address, it will respond with an ACK condition: after the Master pulls SCL low following the eighth bit of the address, the Slave pulls the SDA line low. The Master then releases SCL high for a ninth clock pulse, after which the Slave releases the SDA line to complete the ACK sequence. The Master can then proceed with the next byte of the transaction (if it is writing data) or wait for a response (if it is reading).

    For the purposes of the PIC sensor controller, the following sequence will occur for a read (the only data to

    be written to the PIC is the register address to read from):

    1. The Master will send a START condition.

    2. The Master will send the ADDRESS 0xE0 (write mode).

    3. The PIC will respond with an ACK.

    4. The Master will send a byte with the value of the register it wishes to start reading from.

    5. The PIC responds with an ACK.

    6. The Master sends another START condition (repeated start), followed by the ADDRESS 0xE1 (read

    mode).

    7. The PIC responds with an ACK. It will then hold the SCL line low until it is ready to transmit data.

    This is referred to as Clock Stretching, and allows slower processors to communicate with faster ones

    by allowing the Slave to decide when transmission starts.


    8. The Master releases the SDA line, and generates 8 clock pulses on the SCL line (once it has been

    released by the Slave). The Slave will change the SDA line according to each bit of the byte to be

    transmitted.

9. Upon successful reception of the byte, the Master will generate an ACK condition by pulling the SDA line low (after it is released by the Slave) and triggering a clock pulse.

    10. The Slave is now free to send another byte, unless the Master has read the required number of bytes.

    In this case, immediately after the ACK, the Master will send a STOP condition (first release SCL, then

    SDA). This tells the Slave to go back to waiting for a START condition.
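The ten steps above can be modelled schematically. The sketch below only records the bytes the Master would place on the bus, with START/STOP conditions logged as out-of-band markers and slave-driven data bytes as placeholders; it illustrates the shape of the transaction, not the actual interface code.

```c
#include <stdint.h>
#include <stddef.h>

#define EVT_START 0x100  /* marker: START (or repeated START) condition */
#define EVT_STOP  0x101  /* marker: STOP condition                      */

/* Record the bus events for a read of 'count' bytes starting at register
   'reg' from the sensor controller (write address 0xE0, read 0xE1).
   Returns the number of events written into 'out'. */
size_t log_read_transaction(uint8_t reg, uint8_t count, uint16_t *out)
{
    size_t n = 0;
    out[n++] = EVT_START;   /* step 1: START                         */
    out[n++] = 0xE0;        /* step 2: address, write mode           */
    out[n++] = reg;         /* step 4: register to start reading at  */
    out[n++] = EVT_START;   /* step 6: repeated START                */
    out[n++] = 0xE1;        /* step 6: address, read mode            */
    for (uint8_t i = 0; i < count; i++)
        out[n++] = 0x000;   /* steps 8-10: slave-driven data bytes   */
    out[n++] = EVT_STOP;    /* step 10: STOP after the final byte    */
    return n;
}
```

The ACK/NACK handshakes and clock stretching of steps 3, 5, 7 and 9 happen at the signal level and so do not appear as separate bytes in this byte-level view.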

    4.3 Motor Controller

    As the MD22 motor controller is a self-contained device, it is simply connected to the I2C bus, as well as the

    requisite power and control points (for details of the MD22 operation, see the relevant section of the Devantech

    Ltd website [Devan, 2006] ).

The MD22 is operated in Mode 1; this mode has a separate speed register for each motor (left and right), and interprets the contents of these registers as signed values (127 (0x7F) is full forward, -128 (0x80) is full reverse).

The MD22 also has an acceleration register, which allows the rate of power stepping of the motors to be controlled. The acceleration value is changed to prevent over-driving the motors. For example, if the robot is travelling at maximum forward speed, and a command is received to travel at maximum reverse speed, there is the possibility of over-driving the motors if the power steps are of minimum size. A section of code in the Hardware Interface Software is used to prevent this (see 5.3).
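A minimal sketch of the Mode 1 speed interpretation, assuming nothing beyond the signed-byte convention described above; the helper names are illustrative, not the project's actual interface code.

```c
#include <stdint.h>

/* Mode 1 interprets each speed register as a signed byte:
   127 (0x7F) full forward, 0 stop, -128 (0x80) full reverse. */
int8_t clamp_speed(int requested)
{
    if (requested > 127)  return 127;
    if (requested < -128) return -128;
    return (int8_t)requested;
}

/* The raw byte actually written over I2C is the two's-complement form. */
uint8_t speed_byte(int8_t speed)
{
    return (uint8_t)speed;
}
```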


    4.4 Laptop Cradle and Sensor Platform

    It was necessary to design and construct a hardware fixture to support the laptop when the robot was being

    operated. Also, mounting points were needed for the sensors and camera, which could not easily be attached to

    the purchased chassis whilst at the same time having a wide coverage. The decision was taken to combine a

    laptop cradle and sensor platform into a one-piece construction. Discussions were conducted with the University

    Engineering Workshop to establish the construction methods available, and a design was produced (see Figure

    4.2). This design was made from aluminium, making it light and strong, and allowing mounting points to be

    drilled for the sensors anywhere on the frame. Right-angle mounting brackets were produced in order to attach


    Figure 4.2: Laptop Cradle Design (dimensions in mm)

    Figure 4.3: Sensor Bracket Design


    the Ultrasound sensors to the frame (see Figure 4.3). The cradle also incorporated a support for the laptop screen,

    allowing the display to be read when the robot was operating without the screen being swung backwards by the

    robot's inertia.

NOTE: The designs in Figure 4.2 and Figure 4.3 were developed in conjunction with the rest of the project team, and with the assistance and guidance of staff in the University of Surrey Mechanical Workshop. The concepts involved did not originate only with the author.

    4.5 Testing (Hardware)

    4.5.1 Motor Control Tests

    The aim of this set of tests was to characterise the performance of the MD22 in conjunction with the robot

motors, and to assemble a list of commands that could be used to control the robot's direction and speed. In order

    to do this, the MD22 documentation was used as a reference in testing the different modes of operation.

    Equipment Used

    Laptop Computer

    Serial link test Software

    USB/I2C interface

    Laboratory Power Supply

    MD22 motor controller

    Robot Chassis inc. motors.

    Test Circuit

    In order to test the functionality of the Motor Control System, the chassis and motors were assembled, and

the MD22 connected to the motors and a laboratory power supply, set to provide 7.2V DC (see Figure 4.4). The

MD22 was accessed using a serial port test program downloaded from the Internet [Ser, 2004], and connected using the

    USB/I2C link. The 5V DC power for the MD22 control circuitry was drawn from the supply pins on the

    USB/I2C link, for the sake of simplicity. The chassis was placed with the wheels off the ground, and various

    instructions were input to the MD22, and the results observed.

Figure 4.4: Motor Controller Test Circuit (laptop computer connected via the USB/I2C interface to the MD22 motor controller, which drives the left and right motor pairs from the 7.2 V PSU)


    Test Procedure

The establishment of serial communications with the MD22 was done using a free Serial Port Test Program [Ser, 2004]. The parameters for communication with the USB/I2C device are (from the manufacturer's documentation [Devan, 2006]):

    Baud Rate of 19200

    8 data bits

    No Parity bits

    Two Stop bits.

    The commands are sent as Hex characters.

    The modes of operation were tested in turn, with key speeds being applied (i.e. Full forward, full reverse, half

    forward, half reverse etcetera). Some initial testing was done to ensure that the motors were connected correctly.

    The reason for this was that the polarities of the motor power outlets on the controller are not labelled.

    Results & Analysis

    It was found that for correct operation, the positive leads should be on the two outermost outlets. This will

    result in the motors turning in the same direction when a Full-speed forward command is received.

    The modes of operation all performed as expected, and the response of the controller was immediate. The

motor speeds were separated into 18 discrete levels each for forward and backward travel, to simplify the instruction set. It is deemed unlikely that 127 levels of forward and backward speed will be required.
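A plausible sketch of such a quantisation is shown below. The scale factor of 7 (mapping level 18 to 126, just under the hardware maximum of 127) is an assumption for illustration; the report does not give the exact mapping used in the project software.

```c
#include <stdint.h>

/* Map a discrete speed level in -18..18 onto the MD22's signed range.
   The factor of 7 is assumed for illustration: level 18 -> 126. */
int8_t level_to_register(int level)
{
    if (level > 18)  level = 18;
    if (level < -18) level = -18;
    return (int8_t)(level * 7);
}
```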

    4.5.2 Speed Tests

    Aim

    To characterise the speed of the robot fully loaded with laptop, camera, chassis, and eight sensors.

    Equipment Used

    Digital Stopwatch

    Robot (complete chassis with sensors and sensor controller)

    Methodology

    The robot was set up with the items mentioned above installed. A second laptop computer, owned by the

    author was used to issue commands, using the Hardware Interface and Remote Control programs. The Remote

    Control program was operated in terminal mode (see 5.5), as it was not possible to steer the robot remotely and

    maintain consistent speed whilst measuring the speed.

    The robot was placed on the floor of the laboratory, and markers were placed at a 6 metre interval. A


    stopwatch was used to time the robot's passage between the markers, running at various speeds. The information

obtained was used to calculate the speed of the robot in metres per second (ms-1), with three runs at each speed

    setting being made, then averaged.

The speed settings were incremented in steps of 3, from 6 to 18. The lower speeds were not tested, as it was expected that the speed would increase linearly, allowing the lower equivalent speeds to be extrapolated.

    Results & Analysis

    It was observed that the robot did not travel in a straight line, even when the same speed levels were input to

    each pair of motors. The robot would always pull to the right, and so it was necessary to place the robot to the

    right side of the test course, angled to the left, so that it would not collide with furniture before reaching the end

marker. This naturally made the distance travelled between the markers difficult to determine, and was a source of uncertainty.

Table 4.5.1: Average speeds for speed settings 6 to 18

Speed Setting   Lower Limit (ms-1)   Higher Limit (ms-1)   Average Value (ms-1)
6               0.2324               0.2498                0.24106
9               0.3990               0.4314                0.41519
12              0.5576               0.6061                0.58186
16              0.7622               0.8342                0.79822
18              0.8432               0.9254                0.88430

Figure 4.5: Graph showing approximate speed values (averaged over three readings and taking into account uncertainties); speed in metres per second against speed setting (out of 18), showing the lower limit, higher limit and average value for each setting

Another source of uncertainty was the nature of the timing method. The author used a stopwatch to time the interval between the start and finish markers. This relied on the author's judgement of when the robot had crossed each marker, and so an uncertainty of 0.1 seconds was assumed.

In order to account for the sources of error present, upper and lower bounds for the speed were calculated, using the recorded time + 0.1 seconds and a distance of 6 metres for the lower limit (the 'worst case'), and the recorded time - 0.1 seconds and a distance of 6.4 metres for the higher limit, since the curved path of the robot would have increased the distance travelled. An average of these two limits was calculated.
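The bound calculation can be reproduced directly. The example time of roughly 7.0 s used below is an assumed figure, chosen only because it is consistent with the level 18 bounds in Table 4.5.1; the raw recorded times are not given in the report.

```c
/* Worst-case (lower) bound: shortest distance over the longest time.
   Best-case (higher) bound: curved-path distance over the shortest time. */
double lower_speed(double t_s) { return 6.0 / (t_s + 0.1); }
double upper_speed(double t_s) { return 6.4 / (t_s - 0.1); }

double average_speed(double t_s)
{
    return (lower_speed(t_s) + upper_speed(t_s)) / 2.0;
}
```

With an assumed recorded time of 7.016 s, these functions reproduce the level 18 row of Table 4.5.1 (0.8432, 0.9254 and 0.8843 ms-1) to four decimal places.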

    It can be seen from Figure 4.5 that the trend is quite linear until the higher speed values are reached. It is

    likely that the readings for speed level 16 are anomalous in some way, perhaps due to a skewed average of times.

    If the linear progression of the lower speed values is continued, ignoring the level 16 data, it should intersect the

    level 18 data.

    It is considered that the speed of the robot increases linearly with the speed levels, with a top speed of

    approximately 0.884 metres per second.

    4.5.3 Oscilloscope testing of I2C bus

    Aim

    To examine the signal lines of the I2C bus during communications with functional devices, and to use that

    information to inform the writing of I2C slave routines for the PIC sensor controller.

    Equipment Used

    TDS3032 Digital Oscilloscope

    TTi EL30T Power Supply Unit (PSU)

    PIC evaluation board

    Methodology

    In order to determine the source of the problem with I2C communications on the PIC, a digital oscilloscope

    was used to examine the signals on the bus. One channel each was used to monitor the SCL (clock) and SDA

(data/address) lines. The oscilloscope was set to trigger on a falling edge, on the channel attached to SCL (the I2C lines idle high and are pulled low, so a falling edge marks the start of bus activity).

    The bus was first tested using a Serial Port Test Program to send commands to the MD22 motor controller.

    This was done to establish that the methodology was sound, since the address and data patterns were explicitly

    known. A variety of messages were sent to the MD22, setting the mode, left and right speed registers to various

    values.

    The PIC (programmed with simple I2C code, allowing a constant value of 0x33 to be read) was connected to

    the same Test Program using the Interface device. Attempts to read back the constant character were made,


    whilst monitoring the SCL and SDA lines.

    Results & Analysis

    When using the MD22 to test the bus, the bit-patterns were observed to be as expected.

    Attempts to read back the constant value from the PIC were made, and were unsuccessful. The oscilloscope

trace showed that the SCL and SDA lines were not being pulled down to ground successfully. It was realised

    that the 5V line for the I2C interface was connected to the 5V line powering the PIC, which was fed from a

    separate source. The 5V lines were separated, whilst keeping the ground lines connected. The read tests were

    repeated, and the oscilloscope trace clearly showed that the SCL and SDA lines were being pulled down

    correctly. The test character, however, was not being received by the test program correctly. This indicated that

    the code being used was not correct.

    Modifications

When writing the I2C slave code for the PIC, an Application Note for PIC devices published by Microchip was consulted. This document is referred to as AN734 [Micro, 2000]. It details the different states that

    an I2C slave can be in at any stage during a bus transaction. An error has been pointed out in this application

    note, where the CKP bit is not set in the event of a Master NACK condition [I2C Slave, 2002] . This would

    cause the PIC to stop responding after the first read.

The PIC code was re-written, based directly on AN734, taking into account the error that was identified. The five possible states were incorporated into a C switch statement, based on the flag bits relating to the PIC serial communications module. This was separated into read and write versions of the code, discarding those states that would not be used. The same test character (0x33) was used for the read tests. The success of the write tests was judged on whether or not an acknowledgement of 0x01 was sent back to the test program (this was generated by the I2C interface, not the PIC chip).
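The state dispatch described above can be sketched as follows. The five states follow AN734's numbering, but the flag decoding is reduced here to booleans rather than the PIC's actual SSPSTAT bit tests, and the read/write split is collapsed into a single classifier for illustration.

```c
/* AN734 distinguishes five I2C slave states from the serial-module flags.
   The real firmware tests SSPSTAT bits; here the flags are reduced to
   booleans so the dispatch logic can be shown in isolation. */
enum i2c_state {
    STATE_1 = 1,  /* master write: address byte just received            */
    STATE_2,      /* master write: data byte received                    */
    STATE_3,      /* master read: address received, send first byte      */
    STATE_4,      /* master read: previous byte ACKed, send next byte    */
    STATE_5       /* master NACK: end of read - CKP must still be re-set */
};

enum i2c_state classify(int data_not_addr, int read_mode, int master_acked)
{
    if (!read_mode)
        return data_not_addr ? STATE_2 : STATE_1;
    if (!data_not_addr)
        return STATE_3;
    return master_acked ? STATE_4 : STATE_5;
}
```

State 5 is where the AN734 error discussed above matters: the published note fails to re-set the CKP bit on a Master NACK, which leaves the clock held and stops the slave responding after the first read.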

    This test code was found to work successfully, and the oscilloscope traces showed the correct characters

    being sent across the bus, with appropriate responses being received.

    4.5.4 Sensor Tests

    The accuracy of the SRF05 rangers was assessed in the laboratory. A single ranger was used, and was

    suspended from a clamp stand approximately 1m from the surface of the bench, so that vibrational interference

would be minimised, and false returns from the bench top would be negated. The measurement was taken three times at each distance, in order to determine the variability of the sensor readings at a constant distance, and to

    account for minor changes in the sensor return between measurements (objects moving in the background). A

    range of distances between 5cm and 1m was measured, as this was considered by the author to be the most

    critical range for the sensors to operate at (the manufacturers state the maximum range as 3-4m, depending on

    application).


    Equipment Used

    TDS3032 Digital Oscilloscope

    TTi EL30T Power Supply Unit (PSU)

    Push-to-make switch

    Breadboard

    Tape Measure

    Test Circuit

    The circuit used to test the sensor was a simple one (see Figure 4.6). A laboratory PSU was used to provide

    +5V and Ground levels, which were also used to supply the sensor with power (not shown in circuit). A 1uF

    capacitor was connected between the trigger input of the sensor and ground. This was included to counteract the

    effect of 'bounce' on the push-to-make switch. If any spurious voltage spiking occurred due to switch bounce, the

    capacitor would short-circuit the high-frequency spikes to ground, and provide a good DC level for the logic

    trigger.

    The trigger input must be held at +5V for 10 microseconds for the sensor to initiate ranging. The ranging will

    not start until the trigger input goes low again, allowing the sensor to be triggered by hand using the circuit in

    Figure 4.6.

    Methodology

    A flat upright surface was placed at various distances from the sensor aperture. The surface was the largest

    face of a plastic component storage box found in the laboratory. This object was chosen as it had large flat

surfaces, was made of a rigid material, and was expected to have a high Ultrasound reflectivity. The distance

    between the sensor and the surface was measured using a tape measure.

Figure 4.6: Sensor Test Circuit (a push-to-make switch and 1 uF capacitor on the +5 V trigger line, with the oscilloscope monitoring the Echo output)

The box was placed between 1m and 10cm away from the sensor aperture, in 10cm increments, and finally at 5cm. At each discrete distance, the push-to-make switch was pressed, and the resulting echo pulse captured on

    the oscilloscope. The width of the pulse in microseconds (us) was measured using the scale on the oscilloscope

    display. This value was recorded. Each measurement was repeated three times at each distance.

    Results

    The results obtained show that the length of the echo pulse varies linearly with the distance of the reflecting

    surface (see Table 4.5.2). This corresponds to the expected performance of the Ultrasound Ranger, based on the

    manufacturers documentation. The method of manually reading the pulse width from the Oscilloscope screen is

    inaccurate, and a different method should be used to determine the uncertainty inherent in the Ultrasound

    measurements.
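The linear relation observed matches the SRF05's nominal scaling of roughly 58 us of echo pulse per centimetre of range (5800 us for the 100 cm reading), which can be applied directly:

```c
/* Convert an SRF05 echo pulse width to range: an out-and-back time of
   ~58 us corresponds to one centimetre at the speed of sound. */
double pulse_to_cm(double pulse_us)
{
    return pulse_us / 58.0;
}
```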

    Analysis of Results

    The results obtained show a strong correlation between the range measured from the sensor and the actual

    range, see Figure 4.7. This is as expected.

The method used for measuring the pulse width was not highly accurate, and the inaccuracy is very hard to quantify. It was decided that further tests should be done, using a more accurate method of measuring the

    pulse width, with an accuracy that can be quantified. This will allow a value to be quoted for the accuracy of the

    sensors when connecting to other systems.

Table 4.5.2: Sensor Test Results

Range (cm)   Reading 1 (us)   Reading 2 (us)   Reading 3 (us)   Average (us)   Measured range (cm)   Deviation (cm)
100          5800             5800             5800             5800.00        100.00                0.00
90           5100             5150             5120             5123.33        88.33                 -1.67
80           4550             4550             4560             4553.33        78.51                 -1.49
70           4080             4090             4090             4086.67        70.46                 0.46
60           3450             3440             3440             3443.33        59.37                 -0.63
50           2900             2990             2995             2961.67        51.06                 1.06
40           2360             2380             2380             2373.33        40.92                 0.92
30           1760             1730             1750             1746.67        30.11                 0.11
20           1200             1210             1215             1208.33        20.83                 0.83
10           620              620              620              620.00         10.69                 0.69
5            320              322              320              320.67         5.53                  0.53


4.5.5 Sensor Testing Conclusions

    Points to be investigated:

    The maximum angle at