Georgia Tech Savannah Robotics' ASV Victoria
2012 AUVSI and ONR's RoboBoat Competition
Valerie Bazie, Steven Bradshaw, Michael Bunch, Phillip Cheng, Lisa Hicks, Brian Redden, Carol Young
Abstract—Victoria is an autonomous surface vehicle (ASV) used as both a research and a competition platform. Currently, the vehicle serves as Georgia Tech Savannah Robotics' (GTSR) third entry into AUVSI and ONR's RoboBoat competition. The vehicle is designed to detect and navigate through obstacles using an intertwined LiDAR-camera system and to perform mission-specific tasks.
Fig. 1: Victoria
I. INTRODUCTION
VICTORIA is Georgia Tech Savannah Robotics' entry into the AUVSI Foundation and ONR's Fifth International RoboBoat Competition. While the overall appearance of Victoria has remained the same, her internal hardware differs tremendously from our previous entry. In this paper we outline the mechanical, electrical, and navigation specifics.
II. MECHANICAL SYSTEMS
A. Hulls
Significant improvements have been made to the hulls recycled from last year. Many leaks had formed from the previous year's use; after extensive research, the main cause was found to be improper finishing of the hulls. To prevent such damage from recurring, the hulls were stripped, then coated in marine paint and a gel coat to create a watertight seal. The interiors of the hulls were also coated with a spray-on rubberized utility coating for additional protection.
B. Interior
The major interior change was the switch from a wooden frame to an aluminum frame. The new design is shown in Figure 2. The electronics are attached to
Fig. 2: Frame
racks constructed of acrylic and aluminum angle pieces. Nylon bolts were chosen to bolt down the racks in order to prevent stripping of the aluminum threads on the frame. The hulls are held together using pieces of angle aluminum bolted to the frames. Similarly, the thruster mount, shown in Figure 3, is constructed from aluminum and is bolted to the frame. The thrusters are located at the center of the vehicle so that their rear ends align with the back of the hull, and they sit between the hulls to prevent damage during handling.
Fig. 3: Thruster Mount
C. LiDAR Mount
Previously, the LiDAR was attached directly to the hulls, causing it to move up and down as Victoria rocked. As a result of the vehicle's constantly changing pitch, the LiDAR was unable to detect the buoys. The solution was to make a stabilizing mount for the LiDAR, as shown in Figure 4. Acetal sleeves were used as bearings for
Fig. 4: LiDAR Mount
the shafts that connect the servo to the LiDAR and for the potentiometer's shaft. The potentiometer is driven by a shaft attached to a gear, which is in turn driven by a gear on the servo shaft. The entire mount was constructed out of aluminum for its good strength-to-weight ratio and corrosion resistance. It was designed with slots to allow forward, backward, upward, and downward adjustment of the LiDAR. To compensate for changes in pitch, as measured by the IMU, a PID controller corrects the angle of the LiDAR.
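The pitch-compensation loop described above can be sketched as a PID controller whose error input is the IMU pitch reading and whose output drives the servo opposite the hull motion. The gains, loop rate, and function names below are illustrative assumptions, not the team's actual tuning.

```python
# Minimal sketch of the LiDAR pitch-compensation loop (illustrative gains,
# not the team's actual tuning). The IMU pitch is the error input; the PID
# output is the servo angle command that levels the LiDAR.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def lidar_servo_command(imu_pitch_deg, pid):
    """Drive the servo opposite the measured hull pitch so the LiDAR stays level."""
    return -pid.update(imu_pitch_deg)
```

With a pure proportional gain of 1, the servo simply mirrors the measured pitch; integral and derivative terms would be tuned against the hull's rocking period.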
III. ELECTRICAL SYSTEMS
Victoria is powered using two 25.9 V lithium polymer batteries for control systems, two 12 V lead-acid batteries for propulsion, and six D-cell alkaline batteries for the backup systems. For safety purposes, all power is routed through fuse blocks and a master cut-off switch. For emergency telemetry and control, a pair of XBee 900 MHz radios and Arduino embedded controllers have command over power, thrusters, and basic sensors. The complete electrical schematic of Victoria is shown in Figure 5.
A. Computers
Victoria's central computing occurs on a National Instruments CompactRIO (cRIO) embedded computer. The cRIO uses a state-machine architecture to perform each of the competition tasks autonomously. The cRIO communicates with both the shore laptop and the on-board computer dedicated to vision processing through both TCP and UDP network protocols. These protocols were chosen to ensure both high-speed and high-reliability data streams. Victoria's internal Ethernet network is linked to the shore station using the Ubiquiti airMAX long-range wireless system. The shore laptop is a Lenovo ThinkPad T520 with a 2.5 GHz Core i5 processor.
As previously stated, vision processing is performed on a dedicated PC running LabVIEW in a Windows 7 environment. The hardware platform is the Fit-PC3, a consumer-grade embedded computer outfitted with a ribbed aluminum case for fanless heat dissipation.
B. Sensors
Victoria's navigation system includes a Garmin 16X GPS receiver, a Microstrain 3DM-GX3-45 GPS-enabled Inertial Navigation System (GPS/INS), a Sick LMS-291 Light Detection and Ranging (LiDAR) sensor, and two Microsoft LifeCam Studio 1080p HD webcams. The GPS, INS, and LiDAR are connected to the cRIO via serial connections. The GPS is an integrated waterproof antenna and receiver providing WAAS-enabled GPS data at 1 Hz with an accuracy of 3 meters. The INS contains three accelerometers, three gyros, and three magnetometers, providing data at rates up to 100 Hz as well as secondary GPS data. The LiDAR is a 75 Hz, 180-degree scanning range finder used for object detection and avoidance. The data from the LiDAR is used in conjunction with camera images for littoral navigation, shoreline detection, and detecting mission station props. The cameras are connected via USB to the vision computer, which is connected to the cRIO via a Linksys router. These cameras were chosen for their high definition, low distortion, and autofocus feature. For the Hot Suit mission, an Omega OS136 IR temperature
[Figure 5 schematic contents: NI cRIO-9024 4-slot chassis with NI-9403 DIO, NI-9870 serial, and NI-9201 analog modules; Fit-PC3 embedded computer; Arduino Mega 2560 with Digi XBee Pro XSC 900 MHz 60 mW radios linking to the remote telemetry and control station; shore systems (laptop with USB XBOX controller interface, Linksys E1000 router, Ubiquiti Bullet M5 long-range wireless pair); port and starboard thrusters, each driven by a Hydra ESC redundancy matrix (L ESC1/2, R ESC1/2, 30 A fuses); left and right LiDAR servos; power from two 25.9 V Li-Po and two 12 V SLA batteries through fuse blocks, a stop switch, 24 V/12 V/5 V/adjustable regulator modules, and a 9 V backup; sensors: SICK LiDAR, Garmin 16X HVS GPS, and Microstrain IMU on RS-232 to the cRIO; Microsoft 1080p webcam on USB to the Fit-PC3; Garmin 18X GPS, temperature, water, and voltage sensors on the Arduino Mega; infrared and PING sensors on the analog module; water pump.]
Fig. 5: Electrical Schematic
sensor is mounted to the vehicle to determine the suit with the highest temperature.
IV. MISSION CONTROL
A. Speed Gates and Channel Navigation
An Extended Kalman Filter (EKF) based Simultaneous Localization and Mapping (SLAM) algorithm was implemented in order to navigate through both the speed gates and the channel. This algorithm involves several operations. First, we detect potential obstacles; this step consists of identifying objects using range data from the LiDAR and color data from the cameras. Then, we accurately determine the location of each object by filtering the different sensors' data through the Extended Kalman Filter. Finally, Victoria proceeds through the speed gates and the channel using a custom curve-tracking controller. The only difference between navigating the speed gates and the channel is avoiding the yellow buoys in the middle of the channel.

Object Detection - Camera
Image processing is performed using the National Instruments Vision Assistant software. Once an initial image is obtained, several separate buffered copies are saved, one for each color of buoy, and the following process is repeated on each copy. First, blocks of color are detected by setting acceptance thresholds in the Hue Saturation Intensity (HSI) color space. HSI is used instead of a Red Green Blue (RGB) space because HSI has a lower dependence on background lighting. This filter produces a binary image of passed and not-passed pixels. If there is a large amount of noise close in color to the buoy, the erode function is used. Erode removes textured surfaces by shrinking them significantly faster than smooth surfaces; this is useful since natural surfaces tend to be more textured than buoys. Next, a filter on blob area is applied to all images, and any blob smaller than a predetermined threshold is rejected. Each remaining blob is bounded by a rectangle.
The location of the center of the bottom edge of each rectangle is exported along with the color code from the filter. This data is then combined with the camera properties (angle, height, field of view, and resolution) to approximate the direction of each blob. This direction is used to match each object in the camera's field of view with the objects detected by the LiDAR.
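The pixel-to-direction step can be sketched as follows; this is a simple linear approximation over the horizontal field of view, and the function name and parameter values are illustrative assumptions rather than the calibrated LifeCam model.

```python
# Sketch of mapping a blob's pixel column to a bearing relative to the camera
# axis, using a linear approximation over the horizontal field of view
# (illustrative, not the team's calibrated camera model).
def pixel_to_bearing(px_x, image_width, hfov_deg):
    """Return the horizontal angle (deg) of a pixel column:
    0 = image center, negative = left of center, positive = right."""
    # Fraction of the half-image the pixel sits from center, scaled by half-FOV.
    return (px_x - image_width / 2) / (image_width / 2) * (hfov_deg / 2)
```

For a 1920-pixel-wide image and a 60-degree field of view, the center column maps to 0 degrees and the right edge to +30 degrees; matching against LiDAR bearings then becomes a nearest-angle comparison.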
The system is calibrated using videos recorded during test runs. Individual frames are analyzed and the different parameters are adjusted to capture the buoys. These adjustments are then tested on the entire video clip to confirm their accuracy.
Object Detection - LiDAR
To identify whether an object is detected, we analyze the difference between previous and current data points. For objects in the water, two parameters must be considered: (1) the distance threshold, ∆r, and (2) the angle span, ∆φ. The distance threshold is defined as the difference of the distances between two consecutive data points, and the angle span is defined as the difference of angles between the beginning and end of an object. These parameters vary depending on the current environment and must be set experimentally. An illustration of LiDAR data can be seen in Figure 6. A large difference in distance represents a gap in
Fig. 6: Diagram of Sample LiDAR Data and Table of Each Scenario
the data, denoting the beginning or end of a potential object. Since noise is to be expected in LiDAR data, we must further inspect the data to confirm that the potential object fulfills both parameters. A feature-label pattern recognition method is used to detect the objects from the LiDAR data. Each point of a LiDAR scan is given a numerical feature value, 1-6. The labeling can be seen in Figure 6, where each data point is labeled in red with its corresponding feature number.
To successfully label each data point, we traverse multiple nested if-else statements. Figure 7 shows the hierarchy of the if-else statements, and a flow chart of the data labeling process is illustrated in Figure 8. Once the data points of an entire LiDAR scan are labeled, a pattern recognition algorithm is used to identify where actual objects are located in the scan.
The template used to identify an object consists of a sequence of features represented by {6, 1, 2, . . . , 2, 3, . . . , 3, 4, 6}. In Figure 6, the data represented by {6, 1, 2, 2, 3, 3, 3, 4, 6} is an object, while the data {6, 1, 2, 5, 6} is considered noise.
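Once a scan is reduced to a per-point label sequence, the template match above amounts to finding runs of the form 6, 1, one or more 2s, one or more 3s, 4, 6. A minimal sketch (the regex encoding is our assumption, not the team's implementation):

```python
import re

# Sketch of the feature-label template match: a scan is reduced to a string of
# per-point labels 1-6, and an object is any run matching the template
# {6, 1, 2, ..., 2, 3, ..., 3, 4, 6} (one or more 2s followed by one or more 3s).
OBJECT_TEMPLATE = re.compile(r"612+3+46")

def find_objects(labels):
    """Return (start, end) index pairs of label runs that match the template."""
    s = "".join(str(label) for label in labels)
    return [m.span() for m in OBJECT_TEMPLATE.finditer(s)]
```

On the examples from Figure 6, the sequence {6, 1, 2, 2, 3, 3, 3, 4, 6} matches the template while {6, 1, 2, 5, 6} does not, reproducing the object/noise distinction described above.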
SLAM
Simultaneous Localization and Mapping is a popular,
[Figure 7 tree contents: Case 1: |r_{i−1} − r_i| ≥ ∆r_min; Case 2: (r_{i−1} − r_i) ≥ 0; Case 3: ∆φ ≥ ∆φ_min; Case 4: ∆φ ≥ 1°; Case 5: r_{i−1} ≥ r_i; each case branches on true/false to the next.]
Fig. 7: Tree Diagram of Nested Case Structures
Fig. 8: Feature Labels Diagram
yet difficult to implement, technique used to map an unknown environment while tracking a robot's location within that environment. Although noisy sensor data is useful for creating a rough estimate of an environment, passing all data through a Kalman filter advantageously reduces that noise. Since Victoria's dynamics are based on a unicycle model, which is nonlinear, we use an Extended Kalman Filter (EKF) rather than a standard Kalman Filter. Victoria successfully uses an EKF-SLAM algorithm to aid in all navigation. In order to use EKF-SLAM, we must create a state-space model for the system. The state-space model consists of a state equation and an observation equation; the input parameters are the state and input vectors. We use the position and heading of the robot, as well as the coordinates of each detected buoy, to construct the state vector. The input vector is composed of the linear and angular velocity of the robot. The state vector x and the input vector u are defined in equations (1) and (2).
x = [xr yr θr x1 y1 . . . xN yN ]T (1)
u = [v ω]T (2)
We use the state and output equations, (3) and (4), from [1] in our implementation. At each time step, sensor data is used to update (1) and (2). Zero-mean Gaussian noise vectors, ε and δ, are added to the nonlinear functions f and h.
xk = f(xk−1, uk) + εk (3)
zk = h(xk) + δk (4)
The GPS provides the latitude and longitude of the robot, which are then converted into Cartesian coordinates. The IMU yaw reading represents a true-north magnetic bearing, which is used directly as the vehicle heading. To compute the coordinates of the buoys we use angle and distance data from the LiDAR, as well as the Cartesian coordinates and heading of the robot. By sending the GPS, IMU, and LiDAR data through an EKF, we reduce the amount of error associated with each sensor. This allows us to accurately map each detected buoy.
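The predict/update cycle behind equations (3) and (4) can be sketched for the unicycle portion of the state (robot pose only; the landmark-augmentation part of EKF-SLAM is omitted for brevity). The noise covariances Q and R below are illustrative placeholders, not the team's tuned values.

```python
import numpy as np

# Minimal EKF predict/update sketch for the unicycle pose state
# x = [xr, yr, theta] with input u = [v, omega] (landmarks omitted).
# Q and R are illustrative noise covariances, not tuned values.
def predict(x, P, u, dt, Q):
    """Propagate the state through the unicycle motion model f (eq. 3)."""
    xr, yr, th = x
    v, w = u
    x_pred = np.array([xr + v * np.cos(th) * dt,
                       yr + v * np.sin(th) * dt,
                       th + w * dt])
    # Jacobian of f with respect to the state, evaluated at the current estimate.
    F = np.array([[1.0, 0.0, -v * np.sin(th) * dt],
                  [0.0, 1.0,  v * np.cos(th) * dt],
                  [0.0, 0.0,  1.0]])
    return x_pred, F @ P @ F.T + Q

def update(x, P, z, R):
    """Fuse a direct pose observation h(x) = x, e.g. GPS position plus
    IMU heading (eq. 4)."""
    H = np.eye(3)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new
```

A noisy GPS fix pulls the predicted pose part-way toward the measurement, weighted by the relative covariances, which is exactly the error-reduction effect described above.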
Controller
The channel navigation and speed gate missions are based on the curve tracking algorithm from [2]. Curve tracking is a fairly common controller for autonomous vehicles that navigates the robot along a predefined curve or line. The first step in this algorithm is to select the objects located in front of the robot. Then, the objects are sorted by distance and color, which are determined by comparing the camera and LiDAR data. This condition is given in equation (5):
(θi − δth) ≤ αi ≤ (θi + δth) (5)
where θi represents the angle between the robot heading and the buoy, αi is the angle to the buoy from the camera data, and δth is the angle threshold. For the curve tracking controller, we assume the vehicle and objects satisfy the Frenet-Serret equations shown in equations (6) and (7).
ṙ1 = v x1
ẋ1 = v u y1
ẏ1 = −v u x1 (6)

ṙ2 = ṡ x2
ẋ2 = ṡ κ y2
ẏ2 = −ṡ κ x2 (7)
GTSR 5
![Page 6: Georgia Tech Savannah Robotics’ ASVmalisoff/papers/GTSR.pdfAbstract—Victoria, is an autonomous surface vehicle (ASV) dually used as a research and competition platform. Currently,](https://reader036.vdocument.in/reader036/viewer/2022071003/5fc007d3e899023b31535d28/html5/thumbnails/6.jpg)
where v is the linear speed control and u is the angular speed control of the robot; κ is the curvature constant and s is the arc-length parameter of the curve [2]. The robot and object position vectors are r1 and r2, respectively. The curve tracking algorithm generates an imaginary curve through the two closest pairs of red and green buoys, and a circle around each yellow buoy. The red and green curves serve as the outer boundaries of the channel, while the circles allow Victoria to avoid the yellow buoys. Using equations (8), (9), and (10), an angular speed is calculated for each curve.
ur = −κr cos(φr)/(1 + κrρr) − λ(ρr − ρo) cos(φr) − µ sin(φr) (8)

ug = κg cos(φg)/(1 + κgρg) + λ(ρg − ρo) cos(φg) + µ sin(φg) (9)

uy = κy cos(φy)/(1 + κyρy) + λy(ρy − ρo) cos(φy) + µy sin(φy) (10)
where φr, φg, and φy are the angles from the robot to the respective curves; ρr, ρg, and ρy represent the distances between the robot and the respective curves; ρo is the desired separation between the robot and the curves; λ and λy are the proportional gains; and µ and µy are the differential gains. The controller then calculates the total angular speed needed to stay within the imaginary curves using equation (11). The linear speed, v, is set to a constant value.
u = ur + ug + uy (11)
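Equations (8)-(11) share one structure: a curvature feed-forward term plus proportional and differential feedback, with the sign flipped for the red curve. A sketch of the combined command (gains and geometry values are illustrative placeholders):

```python
import math

# Sketch of the per-curve angular-speed law of equations (8)-(10) and the
# combined command of equation (11). Gains and curve parameters passed in
# are illustrative placeholders, not the team's tuned values.
def curve_term(kappa, rho, phi, rho_o, lam, mu, sign):
    """One controller term: sign=-1 for the red curve (eq. 8),
    sign=+1 for the green and yellow curves (eqs. 9, 10)."""
    return (sign * kappa / (1 + kappa * rho) * math.cos(phi)
            + sign * lam * (rho - rho_o) * math.cos(phi)
            + sign * mu * math.sin(phi))

def total_angular_speed(red, green, yellow):
    """u = ur + ug + uy (eq. 11); each argument is a dict of curve parameters."""
    ur = curve_term(sign=-1, **red)
    ug = curve_term(sign=+1, **green)
    uy = curve_term(sign=+1, **yellow)
    return ur + ug + uy
```

When the robot sits at the desired separation (ρ = ρo) and heads parallel to a curve (φ = 0), the feedback terms vanish and only the curvature feed-forward remains, so the vehicle simply follows the curve.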
An illustration of the imaginary curves and the desired path is shown in Figure 9.

Simulation
To begin software testing as early as possible, the EKF-SLAM navigation controller was tested using the National Instruments Robotics Environment Simulator. The simulator is a new feature of the LabVIEW 2011 Robotics module and includes simulation of a few vehicles and sensors. Since Victoria has a differential drive system, we opted to use the NI Starter Kit 2.0, a three-wheeled differential drive robot. The Starter Kit powers two TETRIX wheels with DC motors and uses an omni wheel for steering. The drive system similarities made it straightforward to convert our thruster duty-cycle controller outputs to angular velocity inputs for the DC motors. In addition, the simulator provides drivers for a few select sensors. The available simulated GPS is the u-blox 5 series, which is comparable to the Garmin 16X GPS but also provides bearing data. The available LiDAR is the Hokuyo URG
Fig. 9: Channel Navigation Path
series, which only has a field of view of 150◦. Although the SICK LiDAR has a significantly larger field of view, we felt the Hokuyo LiDAR would suffice. Additionally, the environment simulator provides an AXIS M1011 camera, which has a field of view and resolution approximately equivalent to the LifeCam Studio cameras used on Victoria. The simulated GPS and LiDAR data were run through the Extended Kalman Filter to obtain more accurate sensor data for the state and input vectors. The AXIS camera was used to obtain the color of the different obstacles on the vehicle's route. The simulator environment used for channel navigation is shown in Figure 10.
Fig. 10: Channel Navigation Simulation Environment
Additionally, sample LiDAR data used for object detection is illustrated in Figure 11. The set of three consecutive points located at an angle of approximately −50◦ represents a detected object, whereas the scattered dots are considered noise.
Fig. 11: Sample LiDAR Data Collected in Simulation Environment
Furthermore, a sample of image processing data used in the object detection algorithm is shown in Figure 12.
Fig. 12: Sample Camera Data From Simulation Environment
B. Poker Chip
This mission was the most difficult to prepare for, as launching an autonomous vehicle from an autonomous vehicle is not a trivial task. We designed an amphibious vehicle, shown in Figure 13 and affectionately named the 'Mother Pucker' (MP) by its designers, to deploy from a custom deployment ramp. We plan to have each vehicle act independently, while Victoria monitors the sensors of the MP.
Fig. 13: Autonomous Amphibious Vehicle
The first step of the mission requires image processing to look for the specified orange duct tape lining the ramp to the dock. Once this color is within the specified pixel locations, slight forward thrust will be used to 'dock' Victoria. A linear actuator will deploy the telescoping ramp system shown in Figures 14 and 15.
Fig. 14: Amphibious Vehicle Deployment Ramp Pre-deployment
The stored potential energy will be used to launch the secondary amphibious vehicle, as shown in Figure 16. Once on the dock, the MP uses IR range sensors to determine when it has been deployed. It then begins its navigation algorithm, a simplified occupancy grid mapping using IR sensors in much the same way a submerged vehicle would use side-scan SONAR. Occupancy Grid Mapping (OGM) assigns a 2-D grid of cells, each with an associated discrete random variable that can take on two values: occupied and empty. A probability can then be assigned to each cell to determine the likelihood that the cell is occupied.
Fig. 15: Amphibious Vehicle Deployment Ramp Model Post-deployment
Fig. 16: Ramp Deployment Test
Occupancy grid mapping also requires a known path. Since the dimensions of the dock are given, a simple 'lawnmower' path can be generated to cover the entire area. The vehicle uses front- and rear-facing IR sensors to prevent it from falling off the dock, while using side-scanning sensors to map the environment. When the edge of the dock is detected, the grid cell probability is updated from 0.5 to 1. For areas within sensor range but free of any obstacles, the grid cell probabilities are updated to zero. Indexing the occupancy grid to filter out occupied cells that have adjacent probabilities of 1 yields the exact puck location. The vehicle can then plan a path to retrieve the puck. Once the MP has executed the planned path, it uses a dedicated IR sensor placed at the front of the vehicle to verify the mapped puck location and position itself directly over the puck. Once in place, a linear actuator attached to a plate of Velcro deploys, retrieving the puck. A mechanical limit switch on the vehicle activates this capture mechanism for redundancy. When the chip is captured, the vehicle uses the generated map to navigate back to Victoria.
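The cell-update and puck-isolation steps above can be sketched as follows; the grid dimensions, hard 0/0.5/1 probabilities, and 4-neighbor test are our simplifying assumptions for illustration.

```python
# Sketch of the simplified occupancy-grid update described above: cells start
# at 0.5 (unknown), go to 1.0 when the IR sensor reports an obstacle or edge,
# and to 0.0 when the cell is in range but clear. The puck is the occupied
# cell with no occupied neighbors (dock edges form runs of adjacent 1.0 cells).
def update_cell(grid, row, col, occupied):
    grid[row][col] = 1.0 if occupied else 0.0

def find_puck(grid):
    """Return (row, col) of an occupied cell whose 4-neighbors are all
    unoccupied, filtering out the dock-edge cells."""
    rows, cols = len(grid), len(grid[0])
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1.0:
                neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
                if all(not (0 <= nr < rows and 0 <= nc < cols)
                       or grid[nr][nc] != 1.0 for nr, nc in neighbors):
                    return (r, c)
    return None
```

Marking a full row of edge cells and one isolated cell shows the filter at work: edge cells have occupied neighbors and are rejected, while the lone occupied cell is returned as the puck.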
There are several fail-safes to ensure the deployed vehicle is safely retrieved. The first method utilizes the map generated during the run to drive the vehicle back to the ramp. A mechanical limit switch as well as an IR sensor will trigger the linear actuator to retract the ramp once the MP is in place. In the event that the vehicle cannot return by its own navigation capabilities, a winch with a safety line attached to the MP will pull the vehicle back to Victoria. The final safeguard is a timed, forced retrieval. After the allotted time for the amphibious vehicle to find and retrieve the chip has expired, the winch will activate, reeling in released line, and then the deployment system will be retracted.
Simulation
The mission was modeled in the National Instruments Robotics Environment Simulator. The challenge environment is shown in Figure 17. The map and occupancy grid created from sensor data are shown in Figure 18. The white trajectory is the robot's path, and the green represents an object detected by the IR sensors. The initial occupancy grid is on the left, with the updated grid on the right. Figure 18 shows the occupancy grid when the robot detects the chip; therefore, the cell with the chip holds a value of 1.
Fig. 17: Poker Chip Challenge Simulation Environment
C. The Jackpot
In addition to Victoria's primary camera, a Microsoft LifeCam Studio will be deployed inside an acrylic casing under the surface of the water. The primary camera will detect the two red buttons by color matching and a height/width ratio filter. The underwater camera will focus on finding an object with high intensity, rather than looking for the color white, so that detection of the buoy is more robust to color changes caused by light filtering through the water.
Because the acrylic casing for the underwater camerawill create distortions in the image, a software solution
Fig. 18: Occupancy Grid Graph
utilizing a non-linear mapping between pixel location and the angle of the object with respect to the camera will be used.
Both the underwater camera and the primary camera will be placed on the same vertical axis, which allows them to work in conjunction for more precise matching. Once both a button and the buoy are detected at the same angle with respect to Victoria, Victoria will drive to that location to hit the button.
D. Cheaters Hand
This challenge relies mainly on image processing. The algorithm includes four primary steps. First, image processing identifies the location of the mission station: the current camera feed is compared to a predefined template consisting of an image of the mission station, and a template matching score determines when Victoria has reached the station. Once the matching score reaches the threshold, Victoria searches for the fake card, i.e. the blue square. This step requires a color matching function; the vehicle drives along the set of cards until it detects the color blue. Once blue is detected, a message is sent from the vision computer to the cRIO via TCP, and the cRIO signals the water pump to start dispersing water. The pump keeps firing until the red flag is detected. Similarly, a color matching function is used to determine when the red flag has risen; the region of interest is specified above the top corner of the blue square. Once the red flag is detected, a message is sent from the vision computer to the cRIO to stop the water dispersion.
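The four-step sequence above can be sketched as a simple state machine; the state names and boolean vision predicates (standing in for the template- and color-matching scores) are illustrative assumptions, not the team's LabVIEW implementation.

```python
# Sketch of the Cheaters Hand sequence as a state machine. The boolean
# predicates stand in for the template-matching and color-matching results
# described above; names are illustrative.
def cheaters_hand_step(state, station_matched, blue_detected, red_flag_detected):
    """Advance one control step; return (next_state, pump_on)."""
    if state == "find_station":
        # Drive until the template-matching score passes the threshold.
        return ("find_card", False) if station_matched else (state, False)
    if state == "find_card":
        # Sweep along the cards until the blue square is seen; start the pump.
        return ("spray", True) if blue_detected else (state, False)
    if state == "spray":
        # Keep the pump firing until the red flag rises.
        return ("done", False) if red_flag_detected else (state, True)
    return ("done", False)
```

Each call maps one vision update to a pump command, mirroring the TCP message exchange between the vision computer and the cRIO.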
E. The Hot Suit
This challenge involves minor image processing and sensor measurement. As Victoria navigates along the mission station, an IR sensor measures the temperature of each suit while recording their GPS locations. After Victoria has reached the last suit, the maximum temperature is determined. The suit symbol, its temperature, and its location are then transmitted back to the shore station using the provided communication protocol.
V. CONCLUSION
Victoria was subject to considerable redesign, not only in the mechanical and electrical systems but also in software. The new mechanical and electrical design provides a more stable and reliable platform that can be used for a variety of goals. Additionally, there was a complete redesign of the software architecture, adding more complex and sophisticated algorithms. Not only do the EKF-SLAM and curve tracking algorithms provide seamless navigation, but the new vision capabilities also increase our chances of completing each mission in this year's competition.
REFERENCES
[1] T. Bailey, "Mobile Robot Localisation and Mapping in Extensive Outdoor Environments," Ph.D. thesis, Department of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, Sydney, Australia, 2002.
[2] F. Zhang and M. Egerstedt, "Curve Tracking Control for Autonomous Vehicles with Rigidly Mounted Range Sensors," Journal of Intelligent and Robotic Systems, 56(1-2):177-197, 2009.