
Page 1: Raspberry Pi Open Project - Draughts Robot

Registration number 100178897

2019

Raspberry Pi Open Project - Draughts Robot

Supervised by Dr Edwin Ren

University of East Anglia

Faculty of Science

School of Computing Sciences


Abstract

Consumer-grade robotics has seen a large increase in popularity in recent years, allowing people to develop relatively cheap robotic projects for needs that suit them. This could lead to fast advancement in affordable robot technology. This report documents the design and implementation of a robotic arm that is capable of playing checkers against a human player. The project incorporates fundamental robotic technologies including computer vision, kinematics and the control of actuators. A fully functioning robotic arm is achieved that can play a human in a game of checkers.


CMP-6013Y

Contents

1 Introduction 6

1.1 Aims . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7

1.2 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8

1.3 Areas of Understanding . . . . . . . . . . . . . . . . . . . . . . . . . . 9

1.4 Road Map . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9

2 Research 10

2.1 Related Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10

2.2 Arm Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11

2.2.1 SCARA . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11

2.2.2 Cartesian . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12

2.2.3 Arm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12

2.3 Hardware . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

2.3.1 Servo Control and Programming . . . . . . . . . . . . . . . . . 14

2.4 Computer Vision . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

2.4.1 Object Tracking/Board recognition . . . . . . . . . . . . . . . 15

2.4.2 Camera Calibration . . . . . . . . . . . . . . . . . . . . . . . . 15

2.4.3 Programming language . . . . . . . . . . . . . . . . . . . . . . 15

3 Identification of Issues 15

3.1 Issues Identified In Planning . . . . . . . . . . . . . . . . . . . . . . . 16

4 Design & Implementation 17

4.1 System Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

4.2 Software Components . . . . . . . . . . . . . . . . . . . . . . . . . . . 18

4.2.1 Software Architecture . . . . . . . . . . . . . . . . . . . . . . 18

4.2.2 Language and Libraries . . . . . . . . . . . . . . . . . . . . . . 18

4.2.3 Game Engine Design . . . . . . . . . . . . . . . . . . . . . . . 19

4.2.4 Game Engine Implementation . . . . . . . . . . . . . . . . . . 20

4.2.5 Servo and Arm control Design . . . . . . . . . . . . . . . . . . 22

4.2.6 Servo and Arm control Implementation . . . . . . . . . . . . . 23

Reg: 100178897 iii


4.2.7 Computer Vision Design . . . . . . . . . . . . . . . . . . . . . 27

4.2.8 Computer Vision Implementation & Camera Calibration . . . . 29

4.2.9 Kinematics Design and Implementation . . . . . . . . . . . . . 34

4.3 Hardware Components Design and Implementation . . . . . . . . . . . 36

4.3.1 Parts List . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36

4.3.2 Raspberry Pi + Raspbian . . . . . . . . . . . . . . . . . . . . . 37

4.3.3 Power Supply & Servo interface . . . . . . . . . . . . . . . . . 37

4.3.4 Arm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38

4.3.5 Board . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40

5 Testing and Full system 42

5.1 Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42

5.1.1 Game Engine Testing . . . . . . . . . . . . . . . . . . . . . . . 42

5.1.2 Computer vision testing . . . . . . . . . . . . . . . . . . . . . 43

5.1.3 Arm Testing/Servo . . . . . . . . . . . . . . . . . . . . . . . . 44

5.2 Integrated system . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45

6 Evaluation 46

6.1 Limitations of the design & Addressed issues . . . . . . . . . . . . . . 47

6.2 Future Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48

6.3 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49

7 Gantt Chart 49

References 51


List of Figures

1 SCARA Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . 12

2 Cartesian/Gantry Configuration . . . . . . . . . . . . . . . . . . . . . . 13

3 Arm Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

4 System Architecture . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

5 File Structure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18

6 UML for important Classes . . . . . . . . . . . . . . . . . . . . . . . . 19

7 Flow of control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21

8 Sending coordinates to arm . . . . . . . . . . . . . . . . . . . . . . . . 21

9 MG996r PWM . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23

10 Initialising arm object and binding servos . . . . . . . . . . . . . . . . 25

11 Example of perspective transform applied warped image . . . . . . . . 29

12 Checking for movement on board . . . . . . . . . . . . . . . . . . . . 33

13 Left: Arm with rotational constraints. Right: Planar manipulator . . . . 34

14 Birds eye view of arm over board and ZMap . . . . . . . . . . . . . . . 35

15 Ground and power leads to servo . . . . . . . . . . . . . . . . . . . . . 38

16 Measurements of the arm . . . . . . . . . . . . . . . . . . . . . . . . . 39

17 Remounted grabber . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40

18 Boards tested . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40

19 Grayscale red and green board . . . . . . . . . . . . . . . . . . . . . . 41

20 Board with Black/Red pieces . . . . . . . . . . . . . . . . . . . . . . . 42

21 Game engine passing and receiving values . . . . . . . . . . . . . . . . 43

22 Testing Digital board . . . . . . . . . . . . . . . . . . . . . . . . . . . 44

23 Testing Pi camera image of board . . . . . . . . . . . . . . . . . . . . 44

24 Testing arm and grabber . . . . . . . . . . . . . . . . . . . . . . . . . 45

25 Testing servo accuracy . . . . . . . . . . . . . . . . . . . . . . . . . . 45

26 Record of time spent on tasks . . . . . . . . . . . . . . . . . . . . . . . 47

27 Final Project Gantt chart . . . . . . . . . . . . . . . . . . . . . . . . . 50


1 Introduction

Robotics is used in a vast number of industries to perform roles that humans once occupied, with applications ranging from the mass-scale production of cars to extremely precise surgery. A robotic arm even sits on the surface of another planet, on the Mars Curiosity rover (JPL 2019).

For these robots to perform tasks autonomously it is usually necessary for them to understand their environment and then interact with it. This requires mapping: using sensors to gain a representation of the surroundings and determine an object's position and orientation (pose). To interact with an object the robot must have a reliable coordinate system, accompanied by an equally reliable way to translate the movements of the servos/actuators that move the arm into real-world coordinates; this translation of movement is known as kinematics. A robotic arm consists of a chain of links moved by motors and actuators, with an end-effector to interact with the environment. Higher degrees of freedom are achieved by adding more joints or adding planes of movement to existing joints.

Many of the computer vision algorithms still used today date back to the 1960s to 1980s, including the Canny edge detector (Canny 1986) and the Harris corner detector (Lowe 1999), used in object detection. Although robotics was introduced to manufacturing in the 1980s, the technology has only recently (since the 2000s) become accurate and reliable enough to be used for safety-critical systems such as surgery, with advancements happening constantly: see Asimo, Honda's humanoid robot (Honda 2019), and MIT's Cheetah (MIT 2019).

Historically, robotics has been expensive, with hobbyists dissuaded from joining the field unless backed by a sizeable wallet. However, with the prevalence of low-cost computing hardware, as well as the declining cost of consumer robotics equipment, this field is becoming more accessible than ever. Developing an arm accurate enough to play checkers while constrained by a budget offers a view into the progress of the field, demonstrating its trajectory and the capability of low-cost hardware and software.


1.1 Aims

The aim of this project is to have a game of checkers/draughts played between a human and a robot arm. A program running on a Raspberry Pi will be used to control the arm and gather movement data on the human player. This could be further extended to allow the robot to play against itself, as well as adapting the arm movement for other applications such as integration with different games.

By taking the final aim of this project and decomposing it, several milestones can be generated, each of which must be met to complete the project:

• Select a robot arm with a sufficient range of movement and accuracy to repeatedly lift and place a draughts piece.

• Use inverse kinematics to derive the rotations for the servo motors so that the

end-effector will be placed over the desired tile on the game board.

• Use computer vision to recognise the board, differentiate team pieces and track

where pieces are, as well as determine if the human has taken their turn.

• Alter an open source checkers game to send movements to the arm and control

the state of play.

• Link the working arm, camera, and an open source Draughts game (complete with

AI), together to play a game.


1.2 Motivation

Robotics is a rapidly expanding market, with adoption present in almost all industries in some shape or form. Kinematics, known as "the geometry of motion", is one discipline that underpins robotic advancement, as it allows us to control complex robots capable of many degrees of movement.

Computer vision (CV) is another vital aspect of robotics, as it allows a system to react in real time based on what it sees. CV algorithms have many applications, including self-driving cars, facial recognition and object tracking in sports, and are even being applied in medicine to identify tumours (TheDuckCow 2019).

Because of the emerging prevalence of these two technologies I am interested in learning more; this is where my personal motivation is derived. I want to learn about robotics and computer vision, as I believe these fields would be a great direction for my career, and am therefore interested in involving them in my degree and gaining practical experience. This kind of technology has the capability of improving the quality of life of many people, for example those with disabilities, and can also remove the need for humans to work jobs that can be carried out by robots.

The advancement of CV and robotics has the potential to change many lives, possibly even preventing people from having to work poorly paid, monotonous or dangerous labour jobs. If robotics can improve the world in which I live, I would love to help this field advance; I believe that within my lifetime these areas will change at a rate we may not currently comprehend.

I have no previous experience in either field, so my approach will be to utilise fundamental techniques in an attempt to gain an understanding of two critical aspects of robotics. This will allow me to grow my knowledge base and understand which aspects of robotics are important to me.


1.3 Areas of Understanding

To complete this project a good level of understanding of Python programming is required, as well as a firm grasp of the available computer vision libraries (in this instance OpenCV), coupled with knowledge of feature detection to recognise and extract key objects from an image. Knowledge of data structures is needed to store camera calibration settings and board configurations. An understanding of servo control and programming is vital to allow manipulation of the robotic arm. A brief understanding of electronics is required to understand how servos receive control signals, and to ensure they are correctly powered and driven by the Raspberry Pi's hardware interface. A firm understanding of trigonometry is key, as the kinematics algorithms are based on trigonometric formulae.

1.4 Road Map

This section introduced the project and explained its motivation and application. Section 2 discusses relevant findings related to this project (servo control, kinematics algorithms, computer vision, hardware interfaces and programming), then briefly describes two similar projects. Section 3 discusses issues identified during planning, which later ties in with section 5, showing which issues occurred and their solutions. Section 4 documents the design decisions and the planning process, gives pseudo-code representations of the algorithms used, describes the software, and covers the implementation of each component of the system. Section 5 describes the testing carried out on each component, then the overall state of the complete system. The outcome is examined in section 6, which evaluates the success of the project and describes limitations of the design, outstanding issues and future development. The final section, 7, shows a Gantt chart representing the order in which tasks were completed, and their duration.


2 Research

This section explores research into similar topics and details any relevant findings. The areas researched include: servo control, arm design, hardware/software interfaces (GPIO), error calculation and limitation, computer vision and object tracking, camera calibration, and programming language/environment.

Firstly, related work is discussed, focusing on other projects that have completed similar tasks. From this, research is conducted into the relevant fields, and any important information needed for design considerations is reported. The research discussed is as follows: arm design and servo control, identifying different forms of suitable arm and various servo hardware interfaces; then computer vision and object tracking methods, identifying the necessary algorithms and providing support for camera calibration; and finally, which programming languages may be suitable for this project.

2.1 Related Work

Many projects exist that utilise a robotic arm along with a computer vision algorithm to complete some task. More precisely, there are several examples of projects that have used an arm to play checkers or chess. Some of the ways in which these projects were carried out are identified and discussed in this section, giving some indication of what can be learned and identifying areas of research. Two projects, Raspberry Turk () and the Buttercup chess robot (), utilise arms to play a game of chess, using computer vision for piece detection.

Raspberry Turk uses a SCARA arm to play chess; the arm is controlled by a Python program linked to an Arduino controlling three servos. The pieces are picked up by an electromagnet on the end of a rod that is lowered down to them. The project uses OpenCV for the vision module to detect the board, and a neural network is trained to recognise the pieces on the board.

Buttercup also uses a SCARA-type arm to play chess. OpenCV allows the detection of the board squares as well as the pieces on the board. The pieces are picked up by a claw-like grabber controlled by a servo motor. The game engine tracks the movement of the players' pieces, meaning no neural network is needed. The implementation details are described fully in the paper.

Other examples of a robot playing a human exist, but the two cases discussed are relevant to this project because they were completed on a low budget and use hardware similar to that used here. Most examples found online involve arms that cost upwards of £1,000, which is clearly not feasible. Some other examples exist on YouTube, however the documentation for these is harder to locate; they therefore prove less helpful, as the work cannot be studied and used to support this project.

2.2 Arm Design

A robotic arm's main objective is to translate linear or rotational servo movement into a real-world position. By using a coordinate system for the real world, servo rotations can be calculated that map movement between two desired points. In robotics there are several design methodologies that can be utilised for positioning an end effector (the point at the end of an arm that interacts with the environment) at a desired coordinate in 3D space. The most common are outlined in this section.

The position of the end effector can be determined through either inverse or forward kinematics; this is used in almost all robotic systems and uses trigonometry to calculate the correct angles of movement at each joint (Richard 1981).
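As a concrete illustration of inverse kinematics, the sketch below solves the joint angles for a simple two-link planar arm using the standard law-of-cosines approach; the link lengths and target point are illustrative values, not the geometry of the arm used in this project.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Inverse kinematics for a planar arm with link lengths l1 and l2.

    Returns the shoulder and elbow angles (radians) that place the
    end effector at (x, y), using the elbow-down solution.
    """
    r_sq = x * x + y * y
    cos_elbow = (r_sq - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def two_link_fk(shoulder, elbow, l1, l2):
    """Forward kinematics: joint angles back to the (x, y) position."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y
```

Round-tripping a target through `two_link_ik` and `two_link_fk` recovers the original point, which is a quick sanity check for any kinematics implementation.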

Due to the project budget some design methodologies may be inadequate. One consideration regarding arm choice is that this is not an engineering project, so building an arm from scratch is out of scope. This biases the choice of arm and the research towards purchasable build kits and affordable hardware.

2.2.1 SCARA

Selective Compliance Assembly Robotic Arm (SCARA) is a common methodology used in mass production. This type of arm is used to interact with items lying flat beneath it. As demonstrated in Figure 1, the arm allows slight compliance (flexibility) in the X and Y planes and rigid movement in Z. The end effector is placed above the target and lowered into position. This type of arm is commonly used, for example, to lower pins into tight-tolerance holes, and this methodology is therefore a strong contender for use in this project. The reach of the arm is limited by the lengths of the inner and outer arm segments.

Figure 1: SCARA Configuration

2.2.2 Cartesian

Cartesian robots have many uses, including 3D printers. The end effector has rigid movement in the X, Y and Z planes. As demonstrated in Figure 2, the arm sits above the desired workspace as a gantry. This design requires a large frame but is extremely accurate. Research shows that build kits exist for this type of robot, however they can be costly, with simple models costing £300 (RobotShop 2019).

This arm is mechanically the best candidate for the project; however, due to price constraints it is not the right choice.

Figure 2: Cartesian/Gantry Configuration

2.2.3 Arm

This design is likely what most people think of when hearing the term robotic arm. The robotic arm comes in many shapes and forms. These types of arms are preferred by hobbyists, have a large amount of forum support, and many build kits are available. As seen in Figure 3, this arm has 4 servos, allowing 4 points of articulation. One servo provides control in the Z direction, while two to three servos control the X and Y directions, depending on the needs of the system. The robot is modular and easily changeable, meaning any required modifications can be made; it is also more versatile than the other methodologies, as the end effector is not always adjacent to the floor. This approach does not confine the robot to just checkers, and allows the manipulator to be used for other applications if desired.

Figure 3: Arm Configuration


2.3 Hardware

2.3.1 Servo Control and Programming

To control a robotic arm, servos are required to manipulate the position of each joint. Many servos capable of relatively high precision exist that would be suitable for arm control.

Servos are generally controlled by Pulse Width Modulation (PWM). The signal pin on the servo receives a pulse every 20 ms. The position of the motor is changed by varying the width of the pulse, typically between about 1 ms and 2 ms: shortening the pulse moves the servo clockwise and lengthening it moves it counter-clockwise, depending on the model (MG996r manual).

Research through various hobbyist websites and arm build kits pointed towards the MG996r servo being a suitable candidate (Arduino.com 2016). The motors are reasonably priced and capable of a high enough resolution for this application.

There are libraries available for servo control in various languages, including Python, C, C++ and Java. These allow signal control from either an Arduino or a Raspberry Pi. Since the Raspberry Pi contains a full processor it is advantageous over the Arduino. The Pi's GPIO header supports hardware PWM on only a small number of pins, however the pigpio library provides software-timed PWM, making control of multiple motors from any of the GPIO pins possible (abyz.me.uk 2019).
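A minimal sketch of this approach, assuming a 500–2500 µs pulse range mapped linearly onto 0–180 degrees (the real limits should come from the servo's datasheet) and an arbitrary GPIO pin; the pigpio daemon must be running for `move_servo` to work on real hardware.

```python
# Hedged sketch: drive a hobby servo via pigpio's software-timed PWM.
# The 500-2500 us range and the 0-180 degree mapping are assumptions
# for illustration, not values taken from the MG996r datasheet.

MIN_PULSE_US = 500    # assumed pulse width at 0 degrees
MAX_PULSE_US = 2500   # assumed pulse width at 180 degrees

def angle_to_pulsewidth(angle_deg: float) -> int:
    """Map a servo angle in [0, 180] to a pulse width in microseconds."""
    if not 0.0 <= angle_deg <= 180.0:
        raise ValueError("angle out of range")
    span = MAX_PULSE_US - MIN_PULSE_US
    return int(MIN_PULSE_US + span * angle_deg / 180.0)

def move_servo(gpio_pin: int, angle_deg: float) -> None:
    """Send the pulse train to the servo (requires the pigpio daemon)."""
    import pigpio                   # imported here so the mapping is testable offline
    pi = pigpio.pi()                # connect to the local pigpiod daemon
    pi.set_servo_pulsewidth(gpio_pin, angle_to_pulsewidth(angle_deg))
    pi.stop()
```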

2.4 Computer Vision

For a robot to be able to interact with the real world, it must first be able to see the world in which it is placed; it is therefore vital to use some form of video analysis. This analysis allows detection of objects in the environment, allowing code to react to events in view of the camera. OpenCV is a library that provides functions aimed at real-time image processing (full documentation available at www.opencv.org). Algorithms exist that can detect various features in an image, for example contour detection and edge detection.


2.4.1 Object Tracking/Board recognition

Canny edge detection (Canny 1986) is an algorithm applied to an image to detect the dominant edges. The algorithm generates a thresholded image containing only the edges, as well as a list of all edge corners.

Chessboard detection is used in camera calibration; this could prove useful for the project, as it is a function that can recognise whether a board is in view. It is a possible route to follow when designing the computer vision module.

The Hough circle transform provides a consistent and reliable way to detect circular shapes of varying size in an image; this could be useful when detecting board pieces and their positions within the grid.

2.4.2 Camera Calibration

A chessboard detection algorithm is a standard way to calibrate a camera and check for distortion. Camera calibration allows images to be resized, or a change of perspective to be applied, so that a consistent image is captured for the program to use. A standard method for camera calibration is described by OpenCV.org (OpenCV 2019).

2.4.3 Programming language

Since the systems examined use Python, it is a good language to aim for. Many other languages can be used for servo control, such as C++ or Java. However, Python is easy to use, lightweight, and works with both OpenCV and pigpio. It is the obvious candidate for this project, especially given the many open source games that already exist in it.

3 Identification of Issues

This section discusses the problems identified whilst planning. It gives an insight into the issues that may affect the success of the project, as identified during the planning/research stage. Section 5 assesses the issues that occurred between planning and completing the project.

Reg: 100178897 15

Page 16: Raspberry Pi Open Project - Draughts Robot

CMP-6013Y

3.1 Issues Identified In Planning

• Accuracy of the arm: Since a build kit will be used, an arm with sufficient articulation, reach and accuracy must be chosen. There are many build kits available for purchase, some of which are documented in section 2.3 of this report.

• Look-up table vs on-the-fly calculation: The angle each motor should rotate to in order to map positions onto the board could be hard-coded or worked out on the fly. Each approach would work, but a look-up table would likely introduce less error. The angles could be calculated using inverse kinematics, however the real-world values may not align with the calculated values due to servo error.

• Calibration: Will the error introduced into the system need to be mitigated using calibration points, or will it not amount to enough to make a noticeable difference? If the error is unworkable, a calibration function may be needed to track the location of the arm using CV.

• Grabber: There are two ways in which pieces can be lifted: electromagnet or pincer. The lifting technique will likely be a magnet, since this is simpler to control using GPIO and has no moving parts to calibrate. However, a claw aligns more directly with a robotic arm and allows a greater degree of freedom for future applications and adaptations.

• Computer vision accuracy: Determining when the human player has taken their turn, and when the human takes a piece from the AI, will require the use of a camera. There must be a sufficient method of determining whose turn is next, whether a piece has been taken or moved, and when a game ends.

• Expense: Since the project is based largely on hardware it can become expensive. This is discussed further in the planning section. The budget for the project cannot exceed £150, therefore lower-budget hardware will likely be used; this may affect the accuracy of parts of the system.


4 Design & Implementation

This section first gives a system overview, then details the design and implementation of the required components. Software components are described with pseudo-code and hardware components are shown with diagrams. Once tested and functioning, the separate systems can be joined together to create a single working system.

4.1 System Overview

Figure 4: System Architecture

• A. The Raspberry Pi is the main control system; it handles the game engine (the checkers game), as well as servo control and computer vision. The main system is a checkers game running in Python.

• B. The camera is used to detect the current state of the board. If a piece moves, the game state is updated and the computer's move is retrieved.


• C. The arm receives control signals from the Raspberry Pi over the GPIO interface.

• D. The human and arm both play using this chess board.

• E. The human player moves their piece, playing against a min-max algorithm.
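The min-max search mentioned above can be sketched in a few lines; the `Node` class here is a hypothetical stand-in for the engine's real game-state objects.

```python
# Minimal min-max sketch (no alpha-beta pruning). Node is a toy
# stand-in for the checkers engine's game-state class.

class Node:
    def __init__(self, value=None, children=()):
        self.value = value          # static evaluation at leaf nodes
        self.children = list(children)

def minimax(node, depth, maximising):
    """Return the best score reachable from this node, assuming the
    opponent plays optimally on alternating levels."""
    if depth == 0 or not node.children:
        return node.value
    scores = [minimax(child, depth - 1, not maximising)
              for child in node.children]
    return max(scores) if maximising else min(scores)
```

For a toy two-ply tree, the maximiser picks the branch whose worst case is best, which is exactly the guarantee min-max provides.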

4.2 Software Components

4.2.1 Software Architecture

The project uses a modular and simple file structure, keeping functionality separated into folders containing individual modules. Each folder contains several Python files used by the system at some point. This structure did not necessarily involve up-front design; rather, it was an emergent outcome of creating each module separately. Figure 5 shows the hierarchy of files used by the software element of this project. ChessRobot is the root directory. A virtual environment with the required dependencies is also included in this structure, however it is not shown here as it is contained within the code. To start the whole system, the file checkers.py must be run.

Figure 5: File Structure

4.2.2 Language and Libraries

Python is the most suited language for this project to succeed. I have used python

previously and am familiar with how the language works. As shown in the UML this

program will be written using OOP design, thus creating objects for the Arm, Camera,

players and game.


The libraries/dependencies used in this project are as follows:

• OpenCV: used to process the images taken by the camera.

• pigpio: used for pin control on the Raspberry Pi, allowing Python to communicate with the servo motors over the hardware interface.

• NumPy: used for storing and manipulating complex data types, and saving data to files.

• Python 3: the language used; must be installed to run the project.

Figure 6 shows the basic UML for the game engine and its interaction with the Arm class and CV module. The checkers game creates two abstract classes, player and robot. The robot sends moves to the arm, and the player pulls its moves from the CV module. This is modified from the original code, where the player inputs their moves from the keyboard and the robot gets its moves from the AI. The robot still gets its move from the AI, however the move is returned to the engine to be sent to the Arm class.

Figure 6: UML for important Classes

4.2.3 Game Engine Design

To make each of the elements in the system work together they need to be controlled by a central system; this is the purpose of the game engine. It determines who takes the next turn, handles the scoring and keeps track of the positions of the pieces. An open source application needs to be used so that the code can be freely modified and used in the project without breaching licences. The following engines may be suitable:

• Carson checkers game and AI: A simple checkers game with a min-max AI. It is written in Python 2.7 and thus not suitable for this project. Source at https://github.com/codeofcarson/Checkers

• Raven Checkers: A more complex checkers game; ports exist in Python 3. However, it uses a GUI, which may not be suitable as a GUI is not required. Source at https://github.com/bcorfman/raven-checkers

• Checkers game with agents: This game uses a learning agent to play against a human, is available in Python 3.6 and has a command line interface. Source at https://github.com/VarunRaval48/checkers-AI

The game chosen for this project is the checkers game with agents, as both the player and agent are derived from the same superclass, meaning their methods and attributes are the same; this makes integration easier, since next moves and best moves can be called with the same code. It is also lightweight, available in Python 3 without modification, and requires no external libraries.

4.2.4 Game Engine Implementation

The engine chosen is modified so that it can send commands to the arm and calculate the human player's move using the vision class; the flow of control during a game is shown in Figure 7. A copy of the source code can be obtained from https://github.com/VarunRaval48/checkers-AI, working as of April 2019. The robot's agent code is modified to send coordinates to the Arm class before sending them to the checkers class to update the board. This means the arm will move the piece, but if it fails the game will not be updated until the piece has been moved by the arm. The agent class contains the getMove method for both the human and robot players. The input for the human player's getMove is redirected to the computer vision class to detect pieces; once the moves have been detected, the game engine is updated with the correct moves.


Figure 7: Flow of control

Only a small modification is required to move the arm from the game engine. When the AI calculates its best move, a list containing this move is created; before the list is returned to the game engine it is passed to the arm, and only if the arm move is successful is the move then passed on to the game engine. The code snippet below shows the code required for this. The check in the first two lines determines whether the move is a double take; if it is, the end position is set to the last position. This prevents the arm from missing double moves that the game engine registers, which would put the physical game out of sync.

Figure 8: Sending coordinates to arm
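The double-take check described above can be sketched as follows; the function name and the shape of `move` are stand-ins for the project's actual code, not the original implementation:

```python
def send_move_to_arm(move, send_to_arm):
    """Collapse a multi-jump move to its final landing square before
    passing it to the arm.

    `move` is a list of board coordinates, e.g. [[0, 2], [2, 4], [4, 6]]
    for a double take; `send_to_arm` stands in for the Arm class call."""
    if len(move) > 2:               # double (or longer) take detected
        move = [move[0], move[-1]]  # keep only start and final end point
    return send_to_arm(move[0], move[1])
```

Passing only the start and final end point keeps the physical arm in step with the game engine, which registers each jump of a multi-take internally.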


The game engine handles the input of moves from the CV algorithm in a similar way; normally the human's input is taken from the keyboard. This function is changed to take its input from the CV algorithm instead: the CV module reports back the changed board, and the method for determining where the player has moved is explained in the CV section.

A large change from the initial design, shown in the project notebook (term 2, week 7), concerns the role of the game engine. Originally the computer vision algorithm would track each piece, determining its colour and position, and update the engine with the position of every counter. However, since the game engine already tracks all possible moves and where each piece is on the board, the computer vision only has to track which pieces move so that the game engine knows which move the player made. This simplifies the role of the CV algorithm and removes redundant functionality.

4.2.5 Servo and Arm control Design

The arm uses four servos to manipulate the position of the end-effector and one servo to open and close the claw. As shown in Section 4.2.3, the game engine sends coordinates to the arm class. For the arm to move there needs to be some way for the coordinates to be translated into a set of instructions sent to the servos. Since the arm and the board are of a fixed size, these values only need to be calculated once; a dictionary is the likely solution for storing them. The positions can be calculated with a kinematics algorithm and then stored. Because the servos may not be perfectly accurate, storing the values allows them to be adjusted, which is much easier than accounting for error when working out the rotations on the fly. If error needs to be accounted for, this can be done per board position.

Servos are controlled using PWM (pulse width modulation). By sending a signal of varying length at regular intervals to the signal wire on the servo, the position of the motor can be changed. The documentation for the MG996R servos used states a 20 ms pulse cycle with a 5–15 ms duty range and a minimum duty change of 0.1 ms; this is demonstrated in Figure 9.


Figure 9: MG996r PWM

The RPi.GPIO library allows control of the duty cycle and the generation of specified pulse widths, thus allowing motor control by varying the duty cycle. The method by which the servo rotations are calculated is described in Section 4.2.9.

4.2.6 Servo and Arm control Implementation

As discussed in the previous section, a look-up table is used for servo values so that they only need to be calculated once. A CSV file is used to store the positions, which are then read into the Arm class.

The Arm class is used to control the servos. The methods init() and startPins() handle the setup and initialisation of the servo pins, and stopPins() de-allocates and cleans up the pins for the next use. The following is a high level explanation of how the servos are controlled with RPi.GPIO:

1. A pin object is assigned to a physical GPIO board pin according to the chosen numbering scheme (physical BOARD or Broadcom BCM numbering).

2. The pin's mode is set to output a signal.

3. The pin is assigned a pulse frequency in Hz. According to the MG996R specification: 1000 ms / 20 ms cycle = 50 Hz.


4. Start the pulse on the pin.

5. The duty cycle is changed by passing a new duty value to the pin object.

6. Stop the pin and clean up the GPIO registers for a clean exit.
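These steps can be sketched with RPi.GPIO as follows. The pin number is an assumption, and the hardware calls will only run on a Raspberry Pi, so the import is kept inside the function:

```python
SERVO_PIN = 11            # hypothetical physical (BOARD) pin number
FREQUENCY_HZ = 1000 / 20  # 20 ms pulse cycle -> 50 Hz

def drive_servo(duty_percent):
    """Run the six control steps above for a single servo movement."""
    import RPi.GPIO as GPIO  # only importable on a Raspberry Pi

    GPIO.setmode(GPIO.BOARD)                 # 1. physical board numbering
    GPIO.setup(SERVO_PIN, GPIO.OUT)          # 2. set the pin mode to output
    pwm = GPIO.PWM(SERVO_PIN, FREQUENCY_HZ)  # 3. assign the pulse frequency
    pwm.start(0)                             # 4. start the pulse on the pin
    pwm.ChangeDutyCycle(duty_percent)        # 5. move by changing the duty
    pwm.stop()                               # 6. stop the pin...
    GPIO.cleanup()                           #    ...and clean up registers
```

This is an illustrative sequence rather than the project's Arm class; in practice the pins stay allocated for the whole game and only step 5 is repeated per move.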

The arm receives coordinates from the game engine; these must be converted into a value within the range of the pulse cycle. Firstly, the duty cycles setting the servos to the minimum (0 degrees), middle (90 degrees) and maximum (180 degrees) positions must be calculated. According to the specification (ServoDatabase 2019) these values are 5 ms, 10 ms and 15 ms respectively; however, testing showed these values to be incorrect, probably due to the use of cheap motors (see project notebook, term 2, week 3).

By attaching a servo horn and using a protractor, the duty value ranges were measured manually as: min = 3.1 ms, mid = 7.15 ms, max = 11.2 ms.

Using the max and min values, the range and resolution of movement can be calculated. Since the minimum change in duty cycle is 0.1 ms, the motor's range is given by Formula (1). The range is the number of unique positions the motor can achieve between 0 and 180 degrees; the resolution is the number of degrees each unique position represents and is calculated in Formula (2). Ideally the motors would map one degree of rotation per unique position between 0 and 180 degrees, making the accuracy within 1 degree. To achieve a 1 degree resolution the motors would need to accept a minimum duty cycle change of 0.045 ms, which they do not.

Range = (11.2 − 3.1) / 0.1 = 81 (1)

Formula (2) calculates the resolution over 180 degrees using output from (1).

Resolution = 180 / 81 ≈ 2.2 (2)

Formula (3) calculates the minimum duty cycle change needed for one unique position per degree over the full 180 degrees of rotation.

Minimum duty change = 8.1 / 180 = 0.045 (3)


Using the value in Formula (3), a formula can be derived to convert a given angle in degrees to a duty cycle, Formula (4). However, this formula assumes there are 180 unique positions, when we know there are 81; since the minimum increment of the duty cycle is 0.1 ms, the values are rounded to the nearest 0.1 ms. The offset is the duty cycle value needed to set the servo to 0 degrees, which adapts the formula to the MG996R servos.

Duty cycle from degree = degree × 0.045 + offset (4)
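A minimal sketch of Formula (4) in Python, using the measured 3.1 ms offset for 0 degrees and rounding to the servo's 0.1 ms step:

```python
OFFSET_MS = 3.1  # measured duty cycle for 0 degrees
STEP_MS = 0.1    # minimum duty cycle increment the servo accepts

def duty_from_degree(degree):
    """Convert an angle in degrees to a duty cycle in ms (Formula 4)."""
    duty = degree * 0.045 + OFFSET_MS
    # Round to the nearest achievable 0.1 ms step.
    return round(duty / STEP_MS) * STEP_MS
```

With these constants, 0 degrees maps to 3.1 ms and 180 degrees to 11.2 ms, matching the manually measured range.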

Each motor was tested to check its min/max values. Three of the six complied; the remainder were returned to the retailer. A batch of five more was ordered, and the compliant ones were kept.

Formula (4) allows angles calculated by a kinematics algorithm to be converted and passed directly to the arm.

The arm class has to complete several tasks: moving a piece from a start to an end position, and removing a piece from the board. The algorithms to complete these tasks are detailed below. Figure 10 shows each servo being initialised; to change the duty cycle of a pin, the function ChangeDutyCycle(ms) is used.

Figure 10: Initialising arm object and binding servos

Algorithm 1 gets the servo rotations for each motor and passes them to the moveArm function. If both moves return true, then the arm has successfully moved a piece.


Algorithm 1 moveAtoB(Arm, startCoord, endCoord)
1: Start ← servoMoveDictionary(startCoord)
2: End ← servoMoveDictionary(endCoord)
3: movedA ← moveArm(Start[0], Start[1], Start[2], Start[3], 1)
4: movedB ← moveArm(End[0], End[1], End[2], End[3], 0)
5: if movedA and movedB equal True then
6:   return True
7: else
8:   return False
9: end if
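Algorithm 1 translates directly into Python; `servo_move_dictionary` and `move_arm` here are stand-ins for the Arm class lookup and movement calls, not the project's actual signatures:

```python
def move_a_to_b(servo_move_dictionary, move_arm, start_coord, end_coord):
    """Move a piece by performing a pick-up at start and a drop at end.

    `servo_move_dictionary` returns the four stored servo rotations for
    a square; `move_arm` drives the servos (grabberState 1 = pick up,
    0 = put down) and reports success."""
    start = servo_move_dictionary(start_coord)
    end = servo_move_dictionary(end_coord)
    moved_a = move_arm(start[0], start[1], start[2], start[3], 1)
    moved_b = move_arm(end[0], end[1], end[2], end[3], 0)
    # The move only counts as successful if both halves succeeded.
    return moved_a and moved_b
```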

Algorithm 2 is responsible for picking up or putting down a piece depending on the grabberState variable: 1 picks a piece up, 0 places it down. The code for this algorithm is found in the Arm class. For the arm to pick up a piece correctly, the base must be angled to the correct heading, then joint 2 must be set first, stopping the arm from hitting the board when joints 1 and 3 move into position. If joint 2 were set afterwards, joints 1 and 3 could place the end effector below the level of the board, which is physically impossible.


Algorithm 2 moveArm(Arm, base, j1, j2, j3, grabberState)
1: base, j1, j2, j3 ← degreesToDuty(base, j1, j2, j3)
2: if grabberState equal to 0 then
3:   Arm.j1.ChangeDutyCycle(90 degrees)
4:   Arm.j2.ChangeDutyCycle(90 degrees)
5: end if
6: Arm.base.ChangeDutyCycle(base)
7: Arm.j1.ChangeDutyCycle(j2)
8: if grabberState equal to 1 then
9:   Arm.grabber.Open()
10: else
11:   Arm.grabber.Close()
12: end if
13: Arm.j2.ChangeDutyCycle(j1)
14: Arm.j3.ChangeDutyCycle(j3)
15: if grabberState equal to 1 then
16:   Arm.grabber.Close()
17: else
18:   Arm.grabber.Open()
19:   Arm.j1.ChangeDutyCycle(90 degrees)
20:   Arm.j2.ChangeDutyCycle(90 degrees)
21:   Arm.ResetArm()
22: end if

4.2.7 Computer Vision Design

This element of the project uses OpenCV and several widely used computer vision algorithms to determine what moves are made by the human player. For this section to be successful there are several objectives:

• Recognise the checkers board and calibrate the camera so that it sees only the board; formally, obtain the board's perspective transform matrix.


• Extrapolate the outer corners of the board from the inner corners.

• Detect which squares on the board contain a piece.

• Detect when a piece is moved, start and end point.

A camera mounted above the board is used to take the images. Two methods of capturing images were considered: 1) the Pi camera, which captures a low resolution image and can be accessed directly by OpenCV as a web-cam; 2) an iPhone camera streamed over RTSP (real time streaming protocol) using an application (Japan 2019). The latter approach is less reliable, as it requires a constant connection, but the images are of much higher resolution.

After comparing images (see project notebook, term 2, week 8), the Raspberry Pi camera was found to be more suitable. This is mainly due to image file size: images taken on the iPhone are roughly 4 MB compared to around 100 KB from the Pi camera. The larger file leads to a delay in image processing, as the image contains a larger search space. The higher resolution image also picks up more noise, making the corners look more distorted. This is recorded in the project notebook.

The purpose of calibrating the camera is to ensure the piece detection algorithm works on images taken at different angles or heights, so that if the project is taken apart and rebuilt, the computer vision algorithm will succeed as long as the calibration function does. It also produces a consistent image for the game engine, with each corner of the chess board in the same place every time. To calibrate the camera, a picture is taken and a perspective warp is calculated that makes all the squares on the board the same dimensions. This warp can later be applied to all images, making the board look symmetrical and giving a bird's eye view.

Using corner detection and camera calibration, the board is selected from an image and checked for which tiles contain a piece.


4.2.8 Computer Vision Implementation & Camera Calibration

The implementation of camera calibration is shown in Algorithm 3 and returns an

array of inner corners found using openCV’s findChessboardCorners, and a

transformation matrix using getPerspectiveTransform. The input and output is shown

in Figure 11, notice the output is a birds eye view with indicated inner corners whereas

the input is of a board from an off centre view. The transformation matrix returned by

Algorithm 3 can be applied to all images with the same perspective as Input and yield

the same perspective Output, without having to recalculate the transform again thus

improving performance. The Transformation matrix is saved to a file, to be used by the

camera class. The camera class will apply this warp to every image taken, meaning the

vision engine does not have to handle this transform each time, the input it receives is a

birds eye view of a board.

Figure 11: Example of the perspective transform applied to a warped image

The algorithm is easiest to understand with an informal description alongside the pseudo-code. First the image is converted to grayscale, then the Canny algorithm finds all edges within the image. The edge image is scanned for contours, which are sets of closed geometrical shapes, and only the largest contour by area is kept. If this contour has 4 edges then it is taken to be the board. A new image size is calculated from the corners of this largest quadrilateral, giving its width and height. Line 15 uses an OpenCV function to return a transformation matrix given the corners of the quadrilateral and the desired new image size. Line 16 takes the transformation matrix and returns a new image whose corners are the chessboard corners. Finally, the inner corners of the board are found on line 17.


Algorithm 3 getWarpedCorners(inputImg)
1: grayImg ← inputImg.convertToGrayscale
2: cannyImg ← grayImg.findAllEdges
3: contours ← cannyImg.findAllContours
4: largestContour ← maxContour in contours by area
5: if largestContour has length equal to 4 then
6:   boardEdges ← largestContour
7: end if
8: topLeftCorner ← boardEdge with smallest sum
9: bottomRightCorner ← boardEdge with largest sum
10: topRightCorner ← boardEdge with smallest difference
11: bottomLeftCorner ← boardEdge with largest difference
12: square ← topLeftCorner, topRightCorner, bottomRightCorner, bottomLeftCorner
13: Width ← maxDistance between topLeftCorner topRightCorner and bottomLeftCorner bottomRightCorner x coords
14: Height ← maxDistance between topLeftCorner bottomLeftCorner and topRightCorner bottomRightCorner y coords
15: transformationMatrix ← openCV.getPerspectiveTransform(square, newImgCorners)
16: warpedImg ← openCV.warpPerspective(inputImg, transformationMatrix, (Width, Height))
17: listInnerCorners ← openCV.findChessboardCorners(warpedImg, (7, 7))
18: return listInnerCorners, warpedImg

This function only returns the inner corner coordinates; for an 8x8 board there are 7x7 inner corners, so the outer corner coordinates must be calculated next to give a full 8x8 board, i.e. a 9x9 list of corners. This is fairly straightforward and relies on the board having regularly sized squares, which is why the perspective warp is needed. The average x and y distances between adjacent inner corners are calculated, and with simple arithmetic the new corners can be found. The 7x7 list is extended to a 9x7 list by duplicating the first and last elements of each row in place. The y coordinates of these duplicated elements are already correct, as they lie on the same row; the first elements are then decremented by the average x distance, and the last elements incremented by it, generating correct x,y coordinates. The same is done in the y dimension: the first and last rows are duplicated in place to create a 9x9 list, decrementing the first row's y coordinates by the average y distance and incrementing the last row's by the same amount. This method generates all the corners of an 8x8 board.
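A pure-Python sketch of this extrapolation, assuming the 7x7 inner corners arrive as a list of rows of (x, y) tuples (the project stores them in OpenCV's own array format):

```python
def extrapolate_corners(inner, avg_dx, avg_dy):
    """Extend a 7x7 grid of inner corners to the full 9x9 corner grid.

    `inner` is a list of 7 rows of 7 (x, y) tuples; `avg_dx` / `avg_dy`
    are the average horizontal / vertical corner spacings."""
    rows = []
    for row in inner:
        first, last = row[0], row[-1]
        # Duplicate the first/last corner of each row and shift it
        # outwards by the average x spacing (giving a 9x7 grid).
        rows.append([(first[0] - avg_dx, first[1])] + list(row)
                    + [(last[0] + avg_dx, last[1])])
    # Duplicate the first/last row and shift it by the average y spacing
    # (giving the final 9x9 grid).
    top = [(x, y - avg_dy) for (x, y) in rows[0]]
    bottom = [(x, y + avg_dy) for (x, y) in rows[-1]]
    return [top] + rows + [bottom]
```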

Once a complete set of board corners has been extrapolated, an algorithm to detect which squares contain checkers pieces can be implemented. It uses a widely used method, the Hough circle transform (OpenCV 2013); Algorithm 4 documents the implementation used in this project. The Hough transform identifies circles in the image with a maximum radius 10% more than half the size of a square; this aims to eliminate false positives by ensuring only circles no larger than the squares are detected. The algorithm returns a list of circles, sorted from top left to bottom right, row by row. Each circle holds 3 values: x coordinate, y coordinate and radius.

Algorithm 4 findCircles(cameraImg)
1: height, width, channels ← cameraImg.shape
2: maxCircleRadius ← half average(height + width) + 10%
3: gray ← cameraImg.ConvertToGray
4: blur ← 5pxBlur(gray)
5: circles ← HoughCircles(blur, minimum radius = 1, max radius = maxCircleRadius)
6: orderedCircles ← circles ordered by x then y
7: return orderedCircles

The location of the pieces on the board can now be determined. Using the calculated 8x8 corners and the detected circles, an algorithm, detectPieceLocation, is implemented to return a list showing which squares have a piece inside them. Squares not containing pieces are ignored, as they are not useful.


Algorithm 5 detectPieceLocation(orderedCircles, cornerList)
1: Dictionary populatedSquares
2: for each corner in cornerList do
3:   tlx ← top left x coordinate of corner
4:   brx ← bottom right x coordinate of corner
5:   tly ← top left y coordinate of corner
6:   bry ← bottom right y coordinate of corner
7:   for each circle in orderedCircles do
8:     if tlx < circle x < brx and tly < circle y < bry then
9:       corner ← this corner contains a circle
10:      populatedSquares.add(corner)
11:    else
12:      continue
13:    end if
14:  end for
15: end for
16: return populatedSquares
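A runnable sketch of Algorithm 5, assuming each square is described by its top-left and bottom-right corners and each circle by an (x, y, radius) tuple (the square-naming scheme is illustrative only):

```python
def detect_piece_locations(circles, squares):
    """Return the squares whose bounds contain a detected circle centre.

    `squares` maps a square name to ((tlx, tly), (brx, bry)) corner
    pairs; `circles` is a list of (x, y, radius) tuples."""
    populated = {}
    for name, ((tlx, tly), (brx, bry)) in squares.items():
        for (cx, cy, radius) in circles:
            if tlx < cx < brx and tly < cy < bry:
                populated[name] = (cx, cy, radius)
                break  # a square holds at most one piece
    return populated
```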

Now that the computer vision can detect the board, calculate the corners and determine which squares contain pieces, this information can be used by the game engine to determine where a user has moved. Checking whether a user has made a move is as simple as running the detectPieceLocation algorithm every second (or every n seconds); when a user moves a piece, the detected locations change. In line with the official competition rules, players should press a timer indicating the end of their move, so this could be another approach: it would ensure the game engine knows exactly when to look for a human's move, since the move is explicitly signalled. However, with the polling approach this is not necessary, as only valid moves will be considered as a player's move; if an illegal move is made, the game engine is not updated and continues to wait for a valid move.

The way in which a player's move is detected is one of the benefits of the chosen game engine. The engine provides a list of all possible start and end locations for the human player in a 2xn list; each row in this list is called a move, formatted as [[startx, starty], [endx, endy]]. To find the move a player made, the x,y coordinate of every player piece on the board is calculated using Algorithm 5 (after the player has moved). The piece that moved can only have moved to a location appearing as an [endx, endy] value in the list of moves, so these values are compared to the populatedSquares list returned by detectPieceLocation. Using the known end position of the moved piece, the list of possible moves can be narrowed down to those with the correct end location. One or more moves may remain; to check which is correct, the board squares corresponding to each [startx, starty] are checked for piece occupancy. If a square does not contain a piece, it is the correct start point; otherwise the move cannot have originated there, and the next start value in the list is checked. Figure 12 demonstrates this in operation. The value returned is the player's start and end move.

Figure 12: Checking for movement on board
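This narrowing-down can be sketched as follows; the move and square representations are assumptions matching the description above, not the engine's internal format:

```python
def find_player_move(possible_moves, occupied_squares):
    """Identify which move the player made.

    `possible_moves` is a list of [[startx, starty], [endx, endy]]
    pairs; `occupied_squares` is the set of (x, y) squares containing a
    piece after the player has moved."""
    # Keep only moves whose end square now contains a piece.
    candidates = [m for m in possible_moves
                  if tuple(m[1]) in occupied_squares]
    for move in candidates:
        # The true start square must now be empty.
        if tuple(move[0]) not in occupied_squares:
            return move
    return None  # no valid move detected yet; keep polling
```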


4.2.9 Kinematics Design and Implementation

Kinematics is defined as "the geometry of motion" and is the system used to calculate the angles for each link in the arm chain. Kinematics solves the problem of mapping the end-effector (EE), the claw, to a desired point on the board, using the known arm link lengths and the known distance to the pieces. The kinematics algorithm used to calculate the servo angles is adapted from an open source 3DOF planar arm project (Qrobotics 2018); a planar arm works in one plane, in this case the Cartesian plane. Figure 13 shows the arm used in this project: the angles in the right image are those that must be calculated to position the arm correctly over a given coordinate in the Cartesian plane, and the left image shows the degrees of freedom of each link. The arm uses two algorithms to calculate the servo rotations needed to pick up a piece. The first algorithm is my own design; the second takes the lengths of the links, along with a desired x,y coordinate and EE angle, and returns the angle in degrees to which each motor must be set in order to reach the x,y coordinate corresponding to a square on the board.

Figure 13: Left: Arm with rotational constraints. Right: Planar manipulator

The first algorithm is used to point the arm in the correct direction for a given square, eliminating the third dimension and creating a problem that can be solved in a single plane. Figure 14 shows D as the desired end direction; angle A is the angle required to move the arm from 90 degrees to A. The function returns a map (see Figure 14) of all the A angles required to face the centre of each square on the board. Angle A can be calculated with trigonometry, and finding the hypotenuse is easy since the opposite and adjacent sides are known because each square is the same size. The x and y lists used in Algorithm 6 are likewise known: to calculate the adjacent side, the column number is incremented by 3 cm (the distance between two square centres) per column, and the same is done with rows to find the opposite side length.

Once the correct angle A has been calculated, the problem can be solved using the 3DOF planar kinematic function. The function needs an x and y coordinate with the origin at the base of the first joint, as shown in Figure 13; the length of the hypotenuse is used as the x value, since the target sits at this distance from the base. The y value is constant and is the height of the first joint above the board, so it is always negative. The algorithm returns a list of three servo values, for joints 1, 2 and 3, which are stored to a CSV; these values contain the rotations needed to reach each square on the board. Due to inaccuracy in the servos, these values do not map perfectly onto the real world, so many values have been manually tested and adjusted so that the correct angles can be passed to the arm's movement function. This is a major benefit of working the values out and storing them, as opposed to calculating them on the fly. The inverseKinematics function in the movement class shows a formal implementation along with the output values.

Figure 14: Birds eye view of arm over board and ZMap


These two functions provide the values required for the four servos to move the end effector to any square on the board. The kinematics function itself is not described, as it is adapted from an open source project; due to the complexity of the problem this code was not written by me.

Algorithm 6 getZmap()
1: x ← list of x distances
2: y ← list of y distances
3: ZCoordinateMap
4: for each square in board do
5:   opposite ← x[square]
6:   adjacent ← y[square]
7:   hypotenuse ← squareRoot(opposite² + adjacent²)
8:   A ← inverseCos(adjacent / hypotenuse)
9:   ZCoordinateMap.add(A) at current square
10: end for
11: return ZCoordinateMap
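Algorithm 6 can be sketched in Python. The 3 cm square spacing follows the board description; the exact offsets from the arm base to the first square are assumptions for illustration:

```python
import math

SQUARE_CM = 3.0  # distance between adjacent square centres

def get_z_map(rows=8, cols=8):
    """Return the base angle A (in degrees) needed to face each square.

    Distances are measured from the arm base, assumed here to sit level
    with the corner of the board."""
    z_map = []
    for r in range(rows):
        row_angles = []
        for c in range(cols):
            opposite = (c + 1) * SQUARE_CM  # sideways offset to column
            adjacent = (r + 1) * SQUARE_CM  # forward offset to row
            hypotenuse = math.hypot(opposite, adjacent)
            a = math.degrees(math.acos(adjacent / hypotenuse))
            row_angles.append(a)
        z_map.append(row_angles)
    return z_map
```

For the square one column across and one row forward, the offsets are equal, so A is 45 degrees, as expected from the geometry.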

4.3 Hardware Components Design and Implementation

4.3.1 Parts List

This project set an initial budget of £150 maximum; this list details all parts purchased and their prices. Links to the parts purchased are contained in the notebook resources section.

• 6DOF Arm = £50

• Raspberry Pi = £34

• 5 MG996R servo motors = £6 × 5

• Pi Camera + Cable = £10

• Wires + Breadboard + Power Supply = £30

• Player pieces = £6

Therefore the total cost of all parts is £160, placing the project slightly over budget.

4.3.2 Raspberry Pi + Raspbian

The system needs a way for its elements to be connected together and a central controller. Although the game engine handles the software side, there needs to be an operating system running the Python application and interfacing with the servos and camera. The two main options considered were the Raspberry Pi and the Arduino.

An Arduino can be connected to any computer via USB and runs on any operating system, acting as a hardware controller for servos; however, it must be programmed in C++. This is inconvenient, as the game engine and computer vision modules are written in Python, so the Arduino is not helpful here. For this reason a Raspberry Pi is the better choice of controller.

A Raspberry Pi is a small modular computer with a GPIO header and other ports. It was chosen as it has ample processing power, can be moved around easily, and is capable of running a fully fledged operating system. Raspbian, the operating system used for this project, is a lightweight Linux distribution with a terminal based interface. Python packages such as OpenCV can be installed using the pip package manager. A virtual environment is created for the project, which allows dependencies to be installed separately from the OS site packages.

4.3.3 Power Supply & Servo interface

Each servo requires 4.8 V to operate. A Raspberry Pi has a power pin, but it cannot supply enough current to power every servo; powered from the Pi, the servos jitter because they are under-powered, which is not acceptable as it creates unreliable PWM pulses.

Therefore, to supply 4.8 V to each servo, an Elego MB102 power supply is used (details are in the notebook resources tab). It can output 3.3 V or 5 V and has three separate outputs, each acting as a separate source; this is helpful if the servos draw too much current and this affects operation.

The servos are wired as shown in Figure 15. The left image shows the power supply attached in series; the yellow cable is attached to a ground pin on the Raspberry Pi for use with PWM. The middle image shows the wiring for the servos: red wires are power, green are ground, and the remaining colour is the signal wire, connected to the Raspberry Pi GPIO header. The right image shows the five servo signal pins connected to the Raspberry Pi. This setup provides a stable current to each motor, allowing consistent motor control.

Figure 15: Ground and power leads to servo

4.3.4 Arm

As discussed in the research section, several robot designs were considered; the most cost effective solution is the robotic arm. It is also expandable for use in other projects after completion, which fits my motivation for this project. The arm's dimensions are shown in Figure 16.


Figure 16: Measurements of the arm

The arm used for this project was purchased from an online retailer, AliExpress. It comes disassembled and without servos, which were purchased separately. Once assembled, the arm measures 36 cm and, accounting for the base, has a maximum reach of 31.5 cm; this information informs the board design discussed in the following subsection. The original arm was modified by removing a servo, as too much strain was placed on joints one and two, resulting in unreliable movement. Figure 17 shows the original arm, then the arm with the extra link removed. The removed link allowed rotation of the claw, which was not needed for this project and was therefore judged to be wasted weight. The removal was straightforward since all the parts contain pre-drilled holes: the claw was simply turned around and remounted in line with the centre of the arm. The only difference is the orientation of the servo; in the altered design it sits atop the grabber, but this does not impede function, and it also allows the arm to reach closer to itself, meaning it can sit closer to the board.


Figure 17: Remounted grabber

4.3.5 Board

The board is designed to fit the arm's specification. The maximum reach of the arm is 31.5 cm, so the board must not exceed this in any dimension. The board size decided upon is 24 cm × 24 cm, leaving a maximum of 7.5 cm spare reach for the arm. With a 24 cm board, each tile is 3 cm wide, which is conveniently slightly larger than the counters purchased at the start of the project.

One important consideration is the colour of the board, since the computer vision aspect of the game relies on detecting the chessboard pattern, both to calibrate the camera and to find the player counters during play. Three boards were considered; these are shown in Figure 18.

Figure 18: Boards tested

All three boards function correctly when tested with digital copies; however, when tested with printed boards, only the right two work. This is because the detection algorithm first converts the image to grayscale, and the grays on the left board are too similar, causing the algorithm to fail. Figure 19 demonstrates this on a printed board photographed by the Raspberry Pi camera. The border around the boards is used in Algorithm 3, where it is detected as the largest contour; this helps with camera calibration. The black and white board with a red border therefore seems the best fit. When placing the arm, this border also has to be taken into account: the arm must not cover the border, which is why there is a space between the base and the board.

Figure 19: Grayscale red and green board

The board design must also take into consideration the colour of the players' pieces. The pieces purchased at the start of this project are black and red; this sometimes causes detection errors, as the black pieces do not contrast well enough with the board, depending on the light. For this reason, white and red pieces may be chosen for the final demonstration, as they allow sufficient contrast between the board tiles and the pieces. Other colour pairs, for example brown/white, could also be used.

Figure 20 shows the completed board design. The image was taken by the Pi camera, and it is clear there is some difficulty recognising the pieces on the right of the board; this demonstrates why more contrasting colours are needed.


Figure 20: Board with Black/Red pieces

5 Testing and Full system

5.1 Testing

Due to the modular nature of this system testing was carried out whilst developing

each module. This allowed for any problems to be detected early, and before the whole

system was integrated together. Each section of the system had one point of interaction

with each other part, thus allowing errors to be detected and resolved more easily.

5.1.1 Game Engine Testing

The game engine itself was a working project when downloaded, so the functionality of the game was not thoroughly tested; several games were played to make sure the game was working correctly. The testing for the game engine was mainly to ensure the correct values were being passed to the arm when the player and the computer took their turns.

Figure 21 shows the values passed from the game engine to the arm, and from the CV module to the game engine. Both work as intended.


Figure 21: Game engine passing and receiving values
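The value passed from the CV module to the game engine is essentially the move the human just made. A minimal sketch of recovering such a move from two board snapshots is shown below; `detect_move` is my own illustrative helper, not the project's actual code, and it handles only a simple one-piece move (a capture, where the jumped piece also disappears, would need extra cases):

```python
import numpy as np

def detect_move(before, after):
    """Recover a simple move from two 8x8 occupancy grids
    (1 = piece present, 0 = empty square)."""
    diff = after.astype(int) - before.astype(int)
    src = tuple(int(v) for v in np.argwhere(diff == -1)[0])  # square a piece left
    dst = tuple(int(v) for v in np.argwhere(diff == 1)[0])   # square it arrived at
    return src, dst

before = np.zeros((8, 8), dtype=int)
before[5, 2] = 1
after = np.zeros((8, 8), dtype=int)
after[4, 3] = 1
print(detect_move(before, after))  # ((5, 2), (4, 3))
```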

5.1.2 Computer vision testing

Some brief testing was described in the board section of this report; however, further testing is required to ensure the camera can recognise the board repeatedly and detect moved pieces. This testing was carried out by running the computer vision algorithm separately, then testing for movement detection on the board. There were few problems with this since the environment tested in is well lit; however, errors occur when the light level is low, or when too much glare from the board confuses the contour detection algorithm. To test the accuracy of the vision module, it was first tested on digital images, shown in Figure 22, then on images taken by the Pi camera, shown in Figure 23. The sets of images below show the testing of several images; the annotations describe what has been done to each image. The corners found by the vision algorithm are plotted to give a visual representation. Further testing is shown in the project notebook (term 2, week 8), including testing of the piece recognition algorithm. This testing was limited: the algorithm works consistently as long as the calibration succeeds, and the pieces are recognised as long as the lighting conditions are good.


Figure 22: Testing Digital board

Figure 23: Testing Pi camera image of board
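Once the corners are found, they have to be turned into per-square positions before movement can be tracked. The sketch below is my own illustration, assuming a roughly fronto-parallel board so that spacing can be extrapolated linearly; it maps the 7 × 7 inner-corner grid that chessboard detection returns to the 8 × 8 tile centres:

```python
import numpy as np

def tile_centres(corners):
    """corners: (7, 7, 2) array of inner chessboard corners, row-major.
    Returns an (8, 8, 2) array of tile-centre pixel coordinates."""
    step_x = corners[0, 1] - corners[0, 0]    # one tile along a row
    step_y = corners[1, 0] - corners[0, 0]    # one tile down a column
    origin = corners[0, 0] - step_x - step_y  # outer corner of the board
    centres = np.empty((8, 8, 2))
    for r in range(8):
        for c in range(8):
            centres[r, c] = origin + (c + 0.5) * step_x + (r + 0.5) * step_y
    return centres

# Synthetic 30-px grid: inner corners at (30, 30) ... (210, 210).
xs, ys = np.meshgrid(np.arange(1, 8) * 30.0, np.arange(1, 8) * 30.0)
corners = np.dstack([xs, ys])
print(tile_centres(corners)[0, 0])  # first tile centre at (15, 15)
```

A perspective-distorted board would need a homography rather than this linear extrapolation, which is one reason calibration quality matters so much.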

5.1.3 Arm/Servo Testing

Each servo was initially tested by placing it in the centre of a protractor with a pencil attached, first to find the maximum and minimum of its range, then to determine its accuracy. To test the arm, every position on the board was moved to using the move-arm function, iterating through all pieces on the board. Some positions further from the base are less reliable; this makes sense, as error accumulates the further from the base the arm moves. This testing led to a slight change in the position of the arm: originally the arm would sit between the human and robot players, but now it sits behind the robot. This also improves accuracy, as the claw collides less with adjacent pieces. More testing images of the arm are shown in the project notebook. The arm was also tested by picking a piece up from each location, shown in Figure 24.

Figure 24: Testing arm and grabber

To test the servo accuracy a protractor was used; this method is shown in Figure 25. It was used to ensure all servos had the same tolerances and minimum/maximum values. This meant each servo moves to the same angle for a given duty cycle, so less error is introduced and the kinematics algorithm is more likely to work first time, without too much adjustment.

Figure 25: Testing servo accuracy
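The calibrated duty-cycle-to-angle mapping can be captured in a small helper. The pulse-width limits below are hypothetical calibration values standing in for the measured ones; pigpio's real `set_servo_pulsewidth()` call takes the pulse width in microseconds:

```python
# Hypothetical calibration values found with the protractor test;
# the real min/max depend on the individual servo.
MIN_PULSE, MAX_PULSE = 500, 2500   # microseconds
MIN_ANGLE, MAX_ANGLE = 0.0, 180.0  # degrees

def angle_to_pulse(angle):
    """Map a target angle to a servo pulse width, clamping to the safe range."""
    angle = max(MIN_ANGLE, min(MAX_ANGLE, angle))
    frac = (angle - MIN_ANGLE) / (MAX_ANGLE - MIN_ANGLE)
    return int(MIN_PULSE + frac * (MAX_PULSE - MIN_PULSE))

print(angle_to_pulse(90))  # 1500, the conventional centre position
# Driving a servo with pigpio would then look like:
#   pi = pigpio.pi()
#   pi.set_servo_pulsewidth(servo_gpio, angle_to_pulse(90))
```

Keeping the clamp inside the helper is what stops an out-of-range kinematics result from driving a servo past its tested limits.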

5.2 Integrated system

All elements of the system are connected to the Raspberry Pi and tested to make sure they work together. The arm and computer vision function correctly; however, a change in light can affect the accuracy of the piece detection, although this is not common. To test the system working together, the game was played a number of times, making sure to take different starting moves. Testing of the complete system shows the software works reliably; however, the hardware lets the project down, as the motors are low accuracy. Overall the integration of the system went smoothly, as each element was engineered to fit together with as little information passed between modules as possible. By doing this, points of failure can be identified and controlled, making bug fixing easy and passing control around the system a straightforward process.

6 Evaluation

To gauge the success of this project, the initial aims can be compared to the achieved outcome.

This project met all aims defined in Section 1.1:

• A robot arm can pick up and move checkers pieces.

• Inverse kinematics was used to build a dictionary of rotations for each servo.

• A computer vision algorithm recognises the board and tracks the players' moves.

• A checkers engine sends movements to the arm to control it, as well as handling the running of the game.

• All components are linked together, allowing a human to play a robot at a game of checkers.

Each of the criteria stated at the start of the project was met; the project can therefore be considered successful.
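The dictionary of rotations mentioned in the aims can be sketched for a two-link planar arm. The link lengths and board offsets below are hypothetical, and the law-of-cosines derivation is the standard one rather than the project's exact code:

```python
import math

L1, L2 = 15.0, 16.0  # hypothetical link lengths in cm

def inverse_kinematics(x, y):
    """Shoulder and elbow angles (radians) that place the claw at (x, y)."""
    # Law of cosines gives the elbow angle from the target distance.
    cos_elbow = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder = direction to target minus the offset the bent elbow introduces.
    shoulder = math.atan2(y, x) - math.atan2(L2 * math.sin(elbow),
                                             L1 + L2 * math.cos(elbow))
    return shoulder, elbow

def forward_kinematics(shoulder, elbow):
    """Sanity check: where does a given pair of angles put the claw?"""
    return (L1 * math.cos(shoulder) + L2 * math.cos(shoulder + elbow),
            L1 * math.sin(shoulder) + L2 * math.sin(shoulder + elbow))

# Precompute one entry per board square (3 cm tiles, with the board assumed
# to start 2 cm in front of the arm's base -- both offsets hypothetical).
rotations = {(col, row): inverse_kinematics(-10.5 + col * 3.0, 3.5 + row * 3.0)
             for col in range(8) for row in range(8)}
```

Precomputing the table once avoids repeating the trigonometry for every move and makes each servo command a simple dictionary lookup.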

The project took roughly 170 hours to complete; I believe this could have been cut down. The project initially set out to play Scrabble, then chess, before settling on checkers, so time was spent researching these before the build of the checkers robot started. The chart in Figure 26 shows a rough record of time spent each week and a rough guide to the type of work completed that week.


Figure 26: Record of time spent on tasks

Throughout the project, time was allocated to three main tasks: research, report writing, and building or coding. The first part of the project focused more heavily on researching current systems and ways in which to implement each section, so that when the design and implementation phase started there would be fewer errors. Progress was delayed because the arm took a long time to arrive; however, this left time to start the computer vision side of the project. This led up to Christmas. After Christmas the arm arrived, so progress started on it while some time was spent implementing the vision. The arm and the vision module were completed at almost the same time, and because each element had to be continuously tested as it was being designed, the completed modules had very few issues. The arm did have accuracy issues, but these could not be solved in software. The game engine took the shortest amount of time to modify, but finding a suitable one to use was a lengthy task, since one with specific elements was needed. After everything had been completed the individual modules were brought together and combined; they worked fairly reliably, with a few hiccups. I believe progress in the first term may have been slower due to the lack of communication with my advisor; in the second term I had regular meetings, which helped solve issues and allowed me to ask questions about progress.

6.1 Limitations of the design & Addressed issues

The completed system contains some issues and limitations. Appropriate action to resolve these could be taken with further development; however, due to time and budget constraints they cannot be resolved in the final version of this project.


• The low accuracy of the arm sometimes results in a piece not being correctly lifted. This could be solved using more accurate servos; however, the budget for this project would not cover the cost.

• Grabber consistency: because the grabber is made from metal, it sometimes knocks adjacent pieces. This is rare, but still a limitation of the design. The best strategy to mitigate it is to use smaller pieces, or a more precise claw. Future development would include a claw with smaller tolerances and a smaller footprint, making it less clumsy.

• In low light, or under high glare, the computer vision accuracy is diminished. As long as the camera is pre-calibrated this is usually not an issue, but it is detrimental when the lighting conditions are not ideal for calibration; one recommendation is a manual calibration function.

• An issue with the pieces is their size relative to the board squares: when the grabber reaches down to lift a piece, it sometimes collides with adjacent pieces. To solve this, smaller pieces could be used. The pieces used in the demonstration may differ from the ones pictured in this report.

• Servo jitter is caused by high strain on the arm, overloading the potentiometers and causing the rotational position to slip from the desired location.

6.2 Future Work

With more time or budget this project could be adapted in many ways. Some possible future developments are stated in this section:

• Adapt to play other games. Using the same piece detection method, this project could be adapted to play chess fairly simply, as long as a chess engine works in a similar way (by listing best moves). This may require an improvement in accuracy, though.

• Improve motor accuracy. With more accurate servos this project could be adapted to carry out higher-precision tasks.

• Change the computer vision method. The current method works with optimal lighting; however, by using calibration points on the board this accuracy could be improved. Techniques similar to QR codes could be used to orient and calibrate the board, removing the reliance on the chessboard detection algorithm.

• Robot vs robot. By using another arm, or the same one, the robot could play against itself; this could be an entertaining item to have on a coffee table and would be an interesting development.

• Change the arm position. By moving the arm above the board, the strain on the motors may be reduced; this would remove much of the servo jitter caused by strain on the arm.

6.3 Conclusion

In conclusion, the project achieved its end goals. Some issues were identified and there were a few setbacks, but these did not drastically change the project outcome. The project allowed me to gain experience in the areas of robotics, kinematics and computer vision. The overall project is something that can easily be adapted to play other games or provide other functionality in the future.

The motivation for this project stemmed from personal interest, and also from wanting to see what current consumer-grade robotics is capable of. This project proved that for as little as £150 a robot can be designed and constructed to play a game of checkers. This makes me wonder what could be possible in the near future and shows the capability of low-cost electronics. Could the average person have some sort of robot in their home in the coming years, to complete trivial tasks? If so, this could greatly benefit certain people, such as those with disabilities. This field is yet to be explored in much depth and I'm excited to see what comes of it in my lifetime.

7 Gantt Chart


[Gantt chart image: the project schedule shown for revision week numbers and semester week numbers, covering the tasks Project proposal, Literature review, Planning/Research, Code CV Module, Code Arm Class, Code Game engine, Testing, Project delivery, and Demo preparation.]

Figure 27: Final Project Gantt chart

