Hindawi, Advances in Civil Engineering, Volume 2020, Article ID 8883639, 8 pages. https://doi.org/10.1155/2020/8883639

Research Article

Advanced Driver-Assistance System (ADAS) for Intelligent Transportation Based on the Recognition of Traffic Cones

Liyong Wang,1 Peng Sun,1 Min Xie,2 Shaobo Ma,1 Boxiong Li,1 Yuchen Shi,1 and Qinghua Su1

1Key Laboratory of Modern Measurement and Control Technology, Ministry of Education, Beijing Information Science & Technology University, Haidian District, Beijing 100192, China
2Computer School, Beijing Information Science and Technology University, Haidian District, Beijing 100101, China

Correspondence should be addressed to Qinghua Su; [email protected]

Received 13 April 2020; Revised 13 May 2020; Accepted 20 May 2020; Published 2 June 2020

Academic Editor: Qiang Tang

Copyright © 2020 Liyong Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Great changes have taken place in automation and machine vision technology in recent years. Meanwhile, the demands for driving safety, efficiency, and intelligence have also increased significantly. More and more attention has been paid to research on advanced driver-assistance systems (ADAS) as one of the most important functions in intelligent transportation. Compared with traditional transportation, ADAS is superior in ensuring passenger safety, optimizing path planning, and improving driving control, especially in an autopilot mode. However, level 3 and above of the autopilot are still unavailable due to the complexity of traffic situations, for example, detection of a temporary road created by traffic cones. In this paper, an analysis of traffic-cone detection is conducted to assist with path planning under special traffic conditions. A special machine vision system with two monochrome cameras and two color cameras was used to recognize the color and position of the traffic cones. The result indicates that this novel method could recognize the red, blue, and yellow traffic cones with 85%, 100%, and 100% success rates, respectively, while maintaining 90% accuracy in traffic-cone distance sensing. Additionally, a successful autopilot road experiment was conducted, proving that combining color and depth information for recognition of temporary road conditions is a promising development for the intelligent transportation of the future.

1. Introduction

With rapid economic development, new opportunities have emerged for the automobile industry. In recent years, both car ownership and driver numbers have increased sharply in China. According to data from the Ministry of Communications, before 2018 China already had over 300 million vehicles and 400 million drivers [1], and with a fast increase in the number of vehicles, some serious traffic issues have become noticeable. First, traffic safety continues to be very challenging. Globally, more than 1.25 million people die due to traffic accidents annually, with the total number having reached over 38 million since the start of the automobile industry [2–4]. The situation in China is not optimistic, because over 100 thousand people get injured or die in traffic accidents every year, costing the economy more than 10 billion Renminbi (RMB). Second, traffic jams have become more and more serious. This has become a global problem in both developed and developing countries due to traffic approaching or exceeding road capacity. According to the 2019 report from AutoNavi, rush-hour traffic jams occurred in over 57% of the cities in China, while 4% of the cities suffered heavy ones [5]. Traffic jams increase travel time, gasoline consumption, and exhaust emissions while at the same time decreasing driving safety tremendously.

Advanced driver-assistance system (ADAS), an important part of intelligent transportation, was developed to overcome the above problems [6]. With developments in telecommunication services, sensing technologies, automation, and computer vision, ADAS development has achieved positive results in traffic resource integration, real-time vehicle status, and driving environment monitoring [7–10]. Generally, ADAS consists of active safety and passive safety. Passive safety relies on certain devices, such as safety belts, airbags, and bumpers, to protect passengers and reduce damage [11]. However, passive safety cannot improve driving safety by itself, because 93% of traffic accidents are caused by the drivers' lack of awareness of the danger [12]. Also, it has been reported that 90% of dangerous accidents could have been avoided if the drivers had been warned just 1.5 seconds earlier [13]. Consequently, active safety, developed to sense and predict dangerous situations, has been considered an important part of modern vehicles. By exchanging data with other devices on the Internet of things (IoT), active safety modules can assist drivers in making decisions based on the overall traffic status and replace traffic lights for adaptive scheduling of vehicles at intersections [14]. Active safety modules can also estimate the risk of current driving behaviors by analyzing dynamic information from nearby vehicles via telecommunication services and cloud computing. If the risk is high and might cause a collision, the vehicle can warn the driver to correct the driving behavior, and in urgent cases, the active safety modules can take over control of the vehicle to avoid a traffic accident [15]. The latest active safety modules have achieved identification of traffic signs by applying deep machine learning technology. As a result, a vehicle can recognize a traffic warning or limitation and remind the driver not to violate the traffic rules [16].

In response to the need for intelligent transportation, ADAS research has focused on autopilots, with many countries (especially the US, Japan, and some European countries) investing a lot of money and effort into their development and making outstanding achievements [17]. Vehicular ad hoc network (VANET) technology, which provides channels for collecting real-time traffic information and scheduling vehicle crossings in intersection zones, offers a new approach to releasing traffic pressure when traditional governance cannot solve the congestion issue effectively. It reduces the average vehicle waiting time and improves traveling efficiency and safety by gathering proper traffic-related data and optimizing scheduling algorithms [18–20]. Many accidents caused by the driver's inattention to traffic signs can be avoided if the warnings are noticed in advance. A traffic-sign recognition function, which includes traffic-sign detection and traffic-sign classification, has been developed to solve this issue via machine vision technology. Since the camera-captured images include a lot of useless information, sliding-window technology has been used to locate the traffic-sign region in the image. Then, certain algorithms, such as the histogram of oriented gradients (HOG), support vector machine (SVM), random forest, and convolutional neural network (CNN), are used for feature detection and classification [21–23]. With sliding-window technology being rather time-consuming, some researchers have proposed other solutions for locating traffic regions (i.e., regions of interest (ROI)), which decreased the average image processing time to 67 ms [24]. One of the most important functions of ADAS is collision avoidance, where warning technology senses potential accident risks based on certain factors, such as vehicle speed, space between vehicles, and so on [22]. By installing proper sensors, like radar, ultrasonic sensors, or infrared sensors, multiple target vehicles and objects within 150 m can be measured with precision and assessed rapidly for a safe distance [21, 24]. One obvious challenge, however, is that space information may be missing in certain blind spots that sensors cannot detect [23]. To solve this problem, vehicle-to-vehicle (V2V) communication and the Global Positioning System (GPS) have recently been introduced. Since then, collision avoidance warning has begun to rely not only on passive measurements but also on status data about nearby vehicles collected by active communication [25].

Even though many different measures have been used in danger detection, one issue remains challenging. Colorful traffic cones that temporarily mark roads for road maintenance control or accident field protection are often hard for space sensors to detect and process due to their small size. If neither the driver nor the ADAS notices the traffic cones on the road, serious human injuries and property damage may occur. Some fruitful research on detecting traffic cones has been conducted using cameras and LiDAR sensors, applying such technologies as machine vision, image processing, and machine learning [26–28]. However, some problems have become noticeable. First, high-quality sensors like LiDAR are expensive, and manufacturers are not willing to install them without a sharp cost decrease. Second, machine learning technology requires a lot of system resources, and on-board computers are not sufficient. Thus, the overall objective of this study was to develop a cost-effective machine vision system that can automatically detect road traffic cones and, based on the cone distribution, avoid any potential accidents. This method was able not only to recognize traffic cones on the roads but also to sense their distance and assist the automatic vehicle control in navigating them smoothly. This required the development of algorithms for quick recognition of traffic cones by color and for sensing the corresponding distance data.

2. Materials and Methods

2.1. Experiment Car and Traffic Cones. An experimental car was designed with a 2600 mm length, a 1500 mm width, and a 1650 mm height, and its powertrain was composed of a 4 Ah battery and an 80 kW DC motor, as shown in Figure 1.

The controlling system of the car contained an embedded computer (Intel i7 CPU, 8 GB RAM), a vehicle controlling unit (VCU), a battery management system (BMS), a brake controller, a DC motor controller, and a machine vision system, as shown in Figure 2. The embedded computer, which worked as the brain of the car, not only controlled the machine vision system to capture the road images but also sent appropriate commands to the VCU after processing the road images and analyzing the car status. The VCU performed as a bridge between the embedded computer and the hardware onboard. The VCU collected real-time status data of the car, sending it to the embedded computer. At the same time, it controlled the BMS, the DC motor controller, and the brake controller as they received valid commands from the embedded computer. For safety reasons, the VCU rejected any invalid commands or any commands received in the presence of a component error. Each part of the controlling system communicated through the CAN bus with a 250 kbps baud rate, except for the machine vision system, which exchanged data with the embedded computer through Ethernet.

The red, blue, and yellow traffic cones that are widely used on the roads in China were 200 mm × 200 mm × 300 mm (length, width, and height, respectively), with a reflective stripe attached in the middle, as shown in Figure 1. The red and blue traffic cones were used for indicating the left and right edges of a temporary road, while the yellow ones specified the start and end of a road in this experiment.

2.2. Machine Vision System. Figure 3 shows the Smart Eye B1 camera system (consisting of four cameras) chosen for this research. Two monochrome cameras, which composed a stereo vision system, were used for sensing real-time 3-dimensional environment data, whereas the color cameras detected color information. According to the specifications of the Smart B1 camera system, its error of space prediction is <6% within a detectable range of 0.5–60 m. Additionally, this camera system can automatically adjust white balance. The resolution of all cameras was set to 1280 × 720, and the frequency of all cameras was set to 12 fps. Two independent Ethernet links with a 100 megabit bandwidth controlled the data exchange for the monochrome and color cameras. The camera was placed 1500 mm above the ground to simulate the field of view in a sedan. The example images are shown in Figure 4(a).

2.3. Range Detection via Stereo Vision. In this experiment, two monochrome cameras were used to build a stereo vision system. A point P(x, y, z) in the world coordinate system projected into the two cameras with the coordinates Pleft(xl, yl, zl) and Pright(xr, yr, zr). Since the height of the two cameras was the same, the values of yl and yr were equal, and the 3-dimensional coordinate could be reduced to a 2-dimensional coordinate for analysis, as shown in Figure 5; f was the camera's focal length, while b was the baseline of the left and right cameras.

According to the triangle similarity law, the following relation exists:

z/f = x/xl = (x − b)/xr = y/yl = y/yr. (1)

From equation (1), the x, y, and z values can be calculated with the following equations:

x = xl · b/(xl − xr),
y = yl · b/(xl − xr),
z = f · b/(xl − xr). (2)
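As an illustration, the triangulation in equations (1) and (2) reduces to a few lines of code. The sketch below is in Python rather than the C++/OpenCV used in this work, and the focal length, baseline, and pixel coordinates in the example are made-up values:

```python
def triangulate(xl, yl, xr, f, b):
    """Recover the world coordinates (x, y, z) of a point from its
    projections in a rectified stereo pair, per equation (2).

    xl, yl : pixel coordinates in the left image
    xr     : x pixel coordinate in the right image (yr equals yl)
    f      : focal length in pixels
    b      : baseline between the two cameras (metres)
    """
    d = xl - xr  # disparity; a larger disparity means a closer point
    if d == 0:
        raise ValueError("zero disparity: point at infinity")
    x = xl * b / d
    y = yl * b / d
    z = f * b / d
    return x, y, z

# Illustrative values: f = 1000 px, b = 0.1 m, 10 px disparity.
print(triangulate(100, 50, 90, 1000, 0.1))  # (1.0, 0.5, 10.0)
```

With f = 1000 px and b = 0.1 m, a 10-pixel disparity places the point 10 m away, matching z = f·b/(xl − xr).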

A depth image D(x, y), which included the object's distance information in each pixel, was generated from the z values as a 32-bit floating-point matrix that could be visualized via the handleDisparityPointByPoint() API from the camera system's Standard Development Kit (SDK). A processed depth image is presented in Figure 4(b), with the warmer color indicating a longer distance. The original depth image was converted from the 32-bit floating-point matrix to a color image because the float data and pixel values exceeded 255 and were unavailable for display on the current operating system.
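The float-to-displayable conversion described above amounts to a linear rescale into the 8-bit range. The Python sketch below is illustrative only; the actual conversion in this work went through the camera SDK:

```python
def depth_to_gray(depth):
    """Linearly rescale a 32-bit float depth map (a list of rows)
    into 0-255 integers so it fits an 8-bit display format."""
    flat = [v for row in depth for v in row]
    lo, hi = min(flat), max(flat)
    span = (hi - lo) or 1.0  # avoid dividing by zero on a flat map
    return [[round(255 * (v - lo) / span) for v in row] for row in depth]

# Depths in metres; the nearest pixel maps to 0, the farthest to 255.
print(depth_to_gray([[0.5, 30.25], [60.0, 15.0]]))  # [[0, 128], [255, 62]]
```

A real pipeline would then map these gray levels through a color map so that, as in Figure 4(b), warmer colors mark longer distances.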

2.4. Traffic Cone Detection. Traffic cone detection, which was developed in the C++ language with the OpenCV library, consisted of four functions: color recognition, size and distance calculation, noise filtering, and traffic cone marking.

2.4.1. Color Recognition. All traffic cones had the same shape, size, and reflective stripes and differed only in color. Since the differences between the yellow, red, and blue colors were obvious, the cones could be distinguished by processing the color images during the daytime. The color detection algorithm is shown in equation (3). The red, green, and blue values in each pixel of the color image H(x, y) were used for ratio calculations that determined the pixel's color feature. The thresholds T1 to T7 were set based on the experimental results.

Figure 1: Experiment car.
Figure 2: Vehicle control system.
Figure 3: Smart B1 camera system.
Figure 4: Images output from the Smart B1 camera system: (a) the color image; (b) the processed depth image.
Figure 5: Range detection.

yellow, if H(x, y)·[red]/H(x, y)·[blue] > T1 and H(x, y)·[green]/H(x, y)·[blue] > T2 and H(x, y)·[red]/H(x, y)·[green] > T3;
blue, if H(x, y)·[red]/H(x, y)·[blue] < T4 and H(x, y)·[green]/H(x, y)·[blue] < T5;
red, if H(x, y)·[red]/H(x, y)·[blue] > T6 and H(x, y)·[green]/H(x, y)·[blue] > T7. (3)
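Equation (3) can be read as three ratio tests per pixel. The Python sketch below mirrors that logic; the paper's code was C++/OpenCV, and the default thresholds T1–T7 here are invented for illustration (the real ones were tuned experimentally):

```python
def classify_pixel(r, g, b, t=(1.5, 1.2, 1.05, 0.8, 0.9, 1.5, 0.2)):
    """Label an RGB pixel 'yellow', 'blue', or 'red' by the channel
    ratios of equation (3); return None for background pixels.
    The threshold tuple t = (T1..T7) holds illustrative guesses."""
    T1, T2, T3, T4, T5, T6, T7 = t
    blue = b or 1   # guard the denominators against pure-black pixels
    green = g or 1
    if r / blue > T1 and g / blue > T2 and r / green > T3:
        return "yellow"
    if r / blue < T4 and g / blue < T5:
        return "blue"
    if r / blue > T6 and g / blue > T7:
        return "red"
    return None

print(classify_pixel(200, 180, 60))  # yellow
print(classify_pixel(40, 60, 200))   # blue
print(classify_pixel(200, 60, 50))   # red
```

Yellow is tested first because a strongly yellow pixel also satisfies the red conditions, so the order of the branches matters.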

2.4.2. Size and Distance Calculation. When all traffic-cone pixels in image H(x, y) were marked, the traffic cone's size and distance were calculated as shown in equation (4). Size S was the number of pixels in one isolated traffic-cone area in H(x, y), while D was the average gray value of the same area in the depth image D(x, y).

S = 0 initially; S = S + 1 if H(x, y) is a traffic-cone pixel;
D = (Σi Σj D(xi, yj))/S, with D(xi, yj) inside the traffic-cone area. (4)
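Equation (4) is a single pass over the marked region. As a hedged Python sketch, where the mask and depth arrays are hypothetical stand-ins for H(x, y) and D(x, y):

```python
def cone_size_and_distance(mask, depth):
    """Per equation (4): S counts the pixels of one isolated cone
    area in the colour image; D averages the depth values over the
    same area. mask and depth are equally sized lists of rows."""
    S, total = 0, 0.0
    for mask_row, depth_row in zip(mask, depth):
        for is_cone, d in zip(mask_row, depth_row):
            if is_cone:
                S += 1
                total += d
    return S, (total / S if S else 0.0)

mask = [[1, 0], [1, 1]]            # 3 pixels flagged as cone
depth = [[2.0, 9.0], [4.0, 6.0]]   # metres
print(cone_size_and_distance(mask, depth))  # (3, 4.0)
```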

2.4.3. Noise Filtering and Target Marking. Since various objects with colors similar to those of the traffic cones showed up in the color images, it was necessary to eliminate them as noise. Because a traffic cone's apparent size in the images was inversely proportional to its distance, fake traffic-cone pixels were filtered based on the size S and the average distance data D, as shown in equation (5). A candidate was ignored if S was less than the threshold at distance D and confirmed as a traffic cone if S was equal to or larger than that threshold. Finally, minimal external rectangles were calculated to mark all of the confirmed traffic cones in the area as the detected traffic cones.

is a traffic cone, if S ≥ threshold at D;
not a traffic cone, if S < threshold at D. (5)
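Equation (5) then keeps a candidate only when it is large enough for how far away it appears. In the sketch below the size threshold is modelled as inversely proportional to distance; both that shape and the constant k are assumptions for illustration, not values from this work:

```python
def is_real_cone(S, D, k=30000.0):
    """Per equation (5): accept a candidate whose pixel count S meets
    the size expected at its average distance D (in mm here). Apparent
    size shrinks with distance, so the threshold k / D falls as D
    grows; k is an illustrative constant, not from the paper."""
    threshold = k / max(D, 1e-6)  # guard against a zero distance
    return S >= threshold

print(is_real_cone(S=20, D=10000.0))  # True: small is plausible far away
print(is_real_cone(S=20, D=1000.0))   # False: too small for a near cone
```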

3. Results and Discussion

The experiment was separated into a color marking test and a distance matching test. The color marking test mainly focused on traffic cone recognition, whereas the distance matching test validated the space measuring function. In addition, a road test was conducted to validate the algorithm's stability and efficiency.

3.1. Traffic Cone Recognition Test. Twenty red traffic cones, fourteen blue cones, and sixteen yellow cones were manually placed in front of the experiment car. As shown in Figure 6, recognized traffic cones were marked by rectangles with the same colors as the bodies of the cones, whereas the unrecognized ones were marked with white rectangles. The blue and yellow traffic cones reached a 100% detection success rate, while the red ones were accurately detected 85% of the time. The three undetected red traffic cones were located close to the left and right edges of the image and placed on a section of the playground that was reddish in color. Also, one of them was 10 meters away from the camera, and two were over twenty meters away. The ground color might have influenced red color recognition.

3.2. Distance Matching Test. After the traffic cone marking process, the distance data matching test was conducted, and the experiment results are shown in Figure 7. Fourteen blue and sixteen yellow traffic cones were matched with the corresponding distance data from the depth image with a 100% accuracy rate. However, only 15 out of 20 red traffic cones had corresponding distance data in the pixel area of the depth image. Besides the three red traffic cones undetected in the recognition test, another two red ones on the left side, which were close to a blue pole, were mismatched in color and depth. The overlay might be the reason for this error. Consequently, 45 out of 50 traffic cones were successfully paired with their distance information, and the overall success rate was 90%. For the paired traffic cones, a prediction error from 2 cm to 1.1 m existed between the predicted and manually measured distances, and this error went up as the distance between the camera and the cone increased. The error was within 6%, which was acceptable while the experiment car ran at a speed of 10 km/h.

3.3. Road Test. To simulate a temporary road, the red traffic cones were designated as the left road boundary, and the blue ones were designated as the right road boundary. The yellow traffic cones were used to indicate the start and end of the temporary road. The distance between any two traffic cones of the same color was 5 m, and the width of the temporary road, as marked by the red and blue cones, was 3 m. The temporary road included a curve-line section and a straight-line section, and the road test images are shown in Figure 8.

The experiment demonstrated that the machine vision system could detect red, blue, and yellow traffic cones, and the experiment car in an autopilot mode could successfully navigate the temporary road at a speed of 10 km/h. Without the influence of similar colors, the success rate of recognition increased. At times, one or two traffic cones were missing from a frame of the color and depth images, and this might be explained by the following. First, some cones that were near the left and right edges of the images could not be paired in color and depth, as also happened in the initial static test. Since the distance between the car and the traffic cones near the edge of the image was quite long, the error would not impact driving safety. Besides, 12 frames of color and depth images were captured per second, so the missing cones could be detected in the following frames as they moved away from the image boundary area. Second, traffic cones that were entering or leaving the images while the experiment car was moving might not have been detected if they showed up only partially. Once these traffic cones fully entered the images, this problem was solved automatically.

Figure 6: Traffic cone recognition static test.

Figure 7: Traffic cone distance matching static test (red, blue, and yellow cones; axes in ×10^4 mm).

Figure 8: Road test: (a) the curve-line section; (b) the straight-line section.


4. Conclusion

An image processing algorithm based on color and depth images was successfully applied to traffic cone detection. Each image frame was analyzed within 80 ms, which included the capture and processing of one color and one depth image. The traffic cones were recognized by color very accurately, with success rates of 85%, 100%, and 100% for red, blue, and yellow cones, respectively. Additionally, the distance was successfully sensed for 90% of the traffic cones by pairing the color and depth images. Some of the cones were missing in some of the image frames when they were located around the image edge area, but they could be found in the following frames of the dynamic test. With 12 frames per second in the machine vision system, cones at the edges of the area naturally came in and out of the field of vision of the moving camera. This method was very effective on a temporary road marked by traffic cones of different colors. The advantages of using paired color and depth images for traffic cone detection can be summarized as follows: (1) this method is sensitive to small, safety-related traffic cones; (2) it uses a highly efficient and stable algorithm for recognition processing; (3) it is a cost-effective solution for maintaining safe driving on temporary roads.

Data Availability

All data presented and analyzed in the study were obtained from laboratory tests at Beijing Information Science & Technology University in Beijing, China. All laboratory testing data are presented in the figures and tables in the article. We will be very pleased to share all our raw data; if needed, please contact us via e-mail: [email protected].

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The authors wish to thank the National Defense Science and Technology Project (JCCPCX201705). The authors also appreciate the great support from Beijing Information Science & Technology University through the Qin Xin Talents Cultivation Program (QXTCPA201903 and QXTCPB201901), the Scientific Research Level Promotion Project (2020KYNH112), and the School Research Fund (2025041).

References

[1] China Association for Road Traffic Safety (CARTS), "China's motor vehicles and drivers maintain rapid growth in 2016," Road Traffic Management, vol. 2, p. 9, 2017, in Chinese.

[2] M. Gao, Analysis of Highway Traffic Accidents in Hebei Province and Preventive Measures, People's Public Security University of China, Beijing, China, 2019, in Chinese.

[3] P. J. Ossenbruggen, J. Pendharkar, and J. Ivan, "Roadway safety in rural and small urbanized areas," Accident Analysis & Prevention, vol. 33, no. 4, pp. 485–498, 2001.

[4] T. Toroyan, "Global status report on road safety 2013: supporting a decade of action," Injury Prevention, vol. 15, no. 4, p. 286, 2013.

[5] Xinhuanet.com, "Analysis report on traffic of major cities in China 2018 Q3 released by AutoNavi," Urban Traffic, vol. 6, pp. 106-107, 2018, in Chinese.

[6] R. Wang, L. Guo, L. Jin et al., "Recent research on safety assisted driving technology of intelligent vehicle," Highway Transportation Technology, vol. 24, no. 7, pp. 107–111, 2007, in Chinese.

[7] I. F. Akyildiz, W. Su, Y. Sankarasubramaniam, and E. Cayirci, "A survey on sensor networks," IEEE Communications Magazine, vol. 40, no. 8, pp. 102–114, 2002.

[8] S. Ma and Z. Zhang, Computer Vision: Fundamentals of Computational Theory and Algorithms, Beijing Science Press, Beijing, China, 1998, in Chinese.

[9] Y. Sun, Short Range Wireless Communication and Networking Technology, Xi'an University of Electronic Science and Technology Press, Xi'an, China, 2008, in Chinese.

[10] Z. Zhe, H. Jia, W. Jiang et al., Research on Intelligent Control Technology, Hebei University of Technology and Industry Press, Hebei, China, 2010, in Chinese.

[11] Y. Nie, "Analysis of vehicle safety assisted driving technology," Traffic and Transportation, vol. 2, pp. 151–153, 2008, in Chinese.

[12] L. Jiaxing, Driver Fatigue Monitoring and Warning System Based on Multi-Parameter Fusion, pp. 226–232, Lanzhou University, Lanzhou, China, 2013.

[13] M. Tetsuya and N. Hidetoshi, "Analysis of relationship between characteristics of driver's eye movements and visual scene in driving events," in Proceedings of the 2011 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2011), IEEE, Taipei, Taiwan, June 2011.

[14] K. Zhang, D. Zhang, A. de La Fortelle, X. Wu, and J. Gregoire, "State-driven priority scheduling mechanisms for driverless vehicles approaching intersections," IEEE Transactions on Intelligent Transportation Systems, vol. 16, no. 5, pp. 2487–2500, 2015.

[15] Y. Wang, E. Wenjuan, D. Tian, G. Lu, and Y. Wang, "Vehicle collision warning system and collision detection algorithm based on vehicle infrastructure integration," in Proceedings of the 7th Advanced Forum on Transportation of China (AFTC 2011), IET, Beijing, China, October 2011.

[16] T. Chen and S. Lu, "Accurate and efficient traffic sign detection using discriminative adaboost and support vector regression," IEEE Transactions on Vehicular Technology, vol. 65, no. 6, pp. 4006–4015, 2016.

[17] F. Yang, "Development status and prospect of driverless vehicles," Shanghai Automotive, vol. 3, pp. 35–40, 2014, in Chinese.

[18] A. Bazzi, A. Zanella, B. M. Masini, and G. Pasolini, "A distributed algorithm for virtual traffic lights with IEEE 802.11p," in Proceedings of the European Conference on Networks & Communications, IEEE, Valencia, Spain, October 2014.

[19] G. Wang, Y. Hou, Y. Zhang, Y. Zhou, N. Lu, and N. Cheng, "TLB-VTL: 3-level buffer based virtual traffic light scheme for intelligent collaborative intersections," in Proceedings of the IEEE 86th Vehicular Technology Conference (VTC-Fall), pp. 1–5, Toronto, Canada, September 2017.

[20] M. B. Younes and A. Boukerche, "Intelligent traffic light controlling algorithms using vehicular networks," IEEE Transactions on Vehicular Technology, vol. 65, no. 8, pp. 5887–5899, 2016.

[21] M. Betke, E. Haritaoglu, and L. S. Davis, "Multiple vehicle detection and tracking in hard real-time," in Proceedings of the IEEE Intelligent Vehicles Symposium, IEEE, Las Vegas, NV, USA, July 1996.

[22] Y. Chen, X. Huang, and S. Yang, "Research and development of automobile anti-collision early warning system," Computer Simulation, vol. 12, pp. 247–251, 2006, in Chinese.

[23] S. Tak, S. Woo, and H. Yeo, "Sampling-based collision warning system with smartphone in cloud computing environment," in Proceedings of the 2015 IEEE Intelligent Vehicles Symposium (IV), IEEE, Dearborn, MI, USA, June 2015.

[24] C. Wang, Design and Implementation of Active Safety Early Warning System for Automobiles, Dalian University of Technology, Dalian, China, 2013.

[25] J. Yang, J. Wang, and B. Liu, "An intersection collision warning system using wi-fi smartphones in VANET," in Proceedings of the Global Communications Conference, DBLP, Atlanta, GA, USA, December 2011.

[26] A. Dhall, D. Dai, and L. Van Gool, Real-Time 3D Traffic Cone Detection for Autonomous Driving, 2019.

[27] Y. Huang and J. Xue, "Real-time traffic cone detection for autonomous vehicle," in Proceedings of the Control Conference, IEEE, Piscataway, NJ, USA, 2015.

[28] L. Zhou, H. Wang, D. Wang, L. Xie, and K. P. Tee, "Traffic cone detection and localization in TechX Challenge 2013," Physical Review Letters, vol. 234, no. 2, pp. 287–297, 2015.

8 Advances in Civil Engineering


environment monitoring [7–10]. Generally, ADAS consists of active safety and passive safety. Passive safety relies on certain devices, such as safety belts, airbags, and bumpers, to protect passengers and reduce damage [11]. However, passive safety cannot improve driving safety by itself, because 93% of traffic accidents are caused by the drivers' lack of awareness of the danger [12]. It has also been reported that 90% of dangerous accidents could have been avoided if the drivers had been warned just 1.5 seconds earlier [13]. Consequently, active safety, developed to sense and predict dangerous situations, has been considered an important part of modern vehicles. By exchanging data with other devices on the Internet of Things (IoT), active safety modules can assist drivers in making decisions based on the overall traffic status and replace traffic lights for adaptive scheduling of vehicles at intersections [14]. Active safety modules can also estimate the risk of current driving behaviors by analyzing dynamic information from nearby vehicles via telecommunication services and cloud computing. If the risk is high and might cause a collision, the vehicle can warn the driver to correct the driving behavior; in urgent cases, the active safety modules can take over control of the vehicle to avoid a traffic accident [15]. The latest active safety modules have achieved the identification of traffic signs by applying deep machine learning technology, so a vehicle can recognize a traffic warning or restriction and remind the driver not to violate the traffic rules [16].

In response to the need for intelligent transportation, ADAS research has focused on autopilots, with many countries (especially the US, Japan, and some European countries) investing considerable money and effort into their development and making outstanding achievements [17]. Vehicular ad hoc network (VANET) technology, which provides channels for collecting real-time traffic information and scheduling vehicle crossings in intersection zones, offers a new approach to relieving traffic pressure when traditional governance cannot solve the congestion issue effectively. It reduces the average vehicle waiting time and improves traveling efficiency and safety by gathering the relevant traffic data and optimizing scheduling algorithms [18–20]. Many accidents caused by a driver's inattention to traffic signs can be avoided if the warnings are noticed in advance. The traffic-sign recognition function, which includes traffic-sign detection and traffic-sign classification, has been developed to address this issue via machine vision technology. Since camera-captured images include a lot of useless information, sliding-window technology has been used to locate the traffic-sign region in the image. Then, algorithms such as the histogram of oriented gradients (HOG), support vector machine (SVM), random forest, and convolutional neural network (CNN) are used for feature detection and classification [21–23]. Because sliding-window search is rather time-consuming, some researchers have proposed other solutions for locating traffic regions (i.e., the region of interest (ROI)), which decreased the average image processing time to 67 ms [24]. One of the most important functions of ADAS is collision avoidance, where warning technology senses potential accident risks based on factors such as vehicle speed, the space between vehicles,

and so on [22]. By installing proper sensors, such as radar, ultrasonic, or infrared sensors, multiple target vehicles and objects within 150 m can be measured with precision and rapidly assessed for a safe distance [21, 24]. One obvious challenge, however, is that space information may be missing in certain blind spots that the sensors cannot cover [23]. To solve this problem, vehicle-to-vehicle (V2V) communication and the Global Positioning System (GPS) have recently been introduced. Since then, collision avoidance warning has begun to rely not only on passive measurements but also on status data of nearby vehicles collected through active communication [25].

Even though many different measures have been used in danger detection, one issue remains challenging. Colorful traffic cones, which temporarily mark roads for road maintenance control or accident field protection, are often hard for space sensors to detect and process because of their small size. If neither the driver nor the ADAS notices the traffic cones on the road, serious human injuries and property damage may occur. Some fruitful research in detecting traffic cones has been conducted using cameras and LiDAR sensors with technologies such as machine vision, image processing, and machine learning [26–28]. However, some problems have become noticeable. First, high-quality sensors like LiDAR are expensive, and manufacturers are not willing to install them without a sharp cost decrease. Second, machine learning technology requires substantial system resources, for which on-board computers are not sufficient. Thus, the overall objective of this study was to develop a cost-effective machine vision system that automatically detects road traffic cones and, based on the cone distribution, helps avoid potential accidents. The method was able not only to recognize traffic cones on the roads but also to sense their distance and assist the automatic vehicle control in navigating them smoothly. This required the development of algorithms for quick recognition of traffic cones by color and for sensing the corresponding distance data.

2. Materials and Methods

2.1. Experiment Car and Traffic Cones. An experimental car was designed with a length of 2600 mm, a width of 1500 mm, and a height of 1650 mm; its powertrain was composed of a 4 Ah battery and an 80 kW DC motor, as shown in Figure 1.

The controlling system of the car contained an embedded computer (Intel i7 CPU, 8 GB RAM), a vehicle controlling unit (VCU), a battery management system (BMS), a brake controller, a DC motor controller, and a machine vision system, as shown in Figure 2. The embedded computer, which worked as the brain of the car, not only controlled the machine vision system to capture the road images but also sent appropriate commands to the VCU after processing the road images and analyzing the car status. The VCU performed as a bridge between the embedded computer and the hardware onboard: it collected real-time status data of the car and sent it to the embedded computer, while controlling the BMS, the DC motor controller, and the brake controller as they executed valid commands from the


embedded computer. For safety reasons, the VCU rejected any invalid commands and any commands received in the presence of a component error. Each part of the controlling system communicated through the CAN bus at a 250 kbps baud rate, except for the machine vision system, which exchanged data with the embedded computer through Ethernet.
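The VCU's safety gate can be sketched as follows. This is an illustrative assumption rather than the authors' implementation: the `Command` layout, the valid target range, and the single error flag are hypothetical stand-ins for the real CAN message handling.

```cpp
#include <cassert>

// Hypothetical command frame; the actual CAN payload layout is not given in the paper.
struct Command {
    int target;   // 0 = BMS, 1 = DC motor controller, 2 = brake controller (assumed)
    int value;    // requested setpoint (assumed non-negative)
};

// Forward a command to the actuators only when it parses as valid and no
// component error is currently reported, mirroring the VCU's rejection rule.
bool acceptCommand(const Command& c, bool componentError) {
    bool valid = (c.target >= 0 && c.target <= 2) && (c.value >= 0);
    return valid && !componentError;
}
```

Under this sketch, a well-formed motor command passes, while an unknown target or an active component error causes rejection, matching the behavior described above.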

The red, blue, and yellow traffic cones that are widely used on the roads in China measured 200 mm × 200 mm × 300 mm (length, width, and height, respectively), with a reflective stripe attached in the middle, as shown in Figure 1. In this experiment, the red and blue traffic cones indicated the left and right edges of a temporary road, while the yellow ones marked the start and end of the road.

2.2. Machine Vision System. Figure 3 shows the Smart Eye B1 camera system (consisting of four cameras) chosen for this research. Two monochrome cameras, which composed a stereo vision system, were used for sensing real-time 3-dimensional environment data, whereas the color cameras detected color information. According to the specifications of the Smart B1 camera system, its space prediction error is <6% within a detectable range of 0.5–60 m. Additionally, this camera system can automatically adjust white balance. The resolution of all cameras was set to 1280 × 720, and the frame rate of all cameras was set to 12 fps. Two independent 100-megabit Ethernet links controlled the data exchange for the monochrome and color cameras. The camera was placed 1500 mm above the ground to simulate the field of view in a sedan. Example images are shown in Figure 4(a).

2.3. Range Detection via Stereo Vision. In this experiment, the two monochrome cameras were used to build a stereo vision system. A point P (x, y, z) in the world coordinate system projected into the two cameras with the coordinates Pleft (xl, yl, zl) and Pright (xr, yr, zr). Since the two cameras were mounted at the same height, the values of yl and yr were equal, and the 3-dimensional coordinate could be reduced to a 2-dimensional coordinate for analysis, as shown in Figure 5, where f is the camera's focal length and b is the baseline between the left and right cameras.

According to the triangle similarity law, the following relation exists:

z/f = y/yl = y/yr = x/xl = (x − b)/xr. (1)

From equation (1), the x, y, and z values can be calculated with the following equations:

x = (xl · b)/(xl − xr),
y = (yl · b)/(xl − xr),
z = (f · b)/(xl − xr). (2)

A depth image D (x, y), which included the object's distance information in each pixel, was generated from the z values as a 32-bit floating-point matrix that could be visualized via the handleDisparityPointByPoint() API from the camera system's Software Development Kit (SDK). A processed depth image is presented in Figure 4(b), with the warmer colors indicating longer distances. The original depth image format was converted from the 32-bit floating-point matrix to a color image because the float data and pixel values exceeded 255 and could not be displayed directly by the current operating system.
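Equation (2) reduces to a division by the disparity xl − xr. A minimal sketch follows; the focal length and baseline below are made-up example numbers, not the Smart B1's actual calibration.

```cpp
#include <cassert>
#include <cmath>

struct Point3 { double x, y, z; };

// Triangulate a point from a rectified stereo pair using equation (2):
//   x = xl*b/(xl - xr), y = yl*b/(xl - xr), z = f*b/(xl - xr),
// where xl, yl, xr are pixel coordinates, f is the focal length in pixels,
// and b is the baseline in meters.
Point3 triangulate(double xl, double yl, double xr, double f, double b) {
    double disparity = xl - xr;  // must be > 0 for a valid stereo match
    return { xl * b / disparity, yl * b / disparity, f * b / disparity };
}
```

For example, with an assumed f = 700 px and b = 0.12 m, a 14-px disparity gives z = 700 × 0.12 / 14 = 6 m; halving the disparity doubles the depth, which is one reason the range error grows with distance, as later observed in the distance matching test.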

2.4. Traffic Cone Detection. Traffic cone detection, which was developed in C++ with the OpenCV library, consisted of four functions: color recognition, size and distance calculation, noise filtering, and traffic cone marking.

2.4.1. Color Recognition. All traffic cones had the same shape, size, and reflective stripes and differed only in color. Since the differences among the yellow, red, and blue colors were obvious, the cones could be distinguished by processing the color images during the daytime. The color detection algorithm is shown in equation (3). The red, green, and blue values in each pixel of the color image H (x, y) were used for ratio calculations that determined the pixel's

Figure 1: Experiment car, showing the machine vision system, DC motor, battery, and traffic cones.


Figure 4: Images output from the Smart B1 camera system. (a) The color image. (b) The processed depth image.

Figure 5: Range detection geometry, showing cameras L and R, baseline b, focal length f, and the projections xl and xr of point P (x, z).

Figure 2: Vehicle control system. The embedded computer connects to the machine vision system via Ethernet and to the VCU, BMS, DC motor controller, and brake controller via the CAN bus.

Figure 3: Smart B1 camera system.


color feature. The thresholds T1 to T7 were set based on the experimental results:

yellow, if H(x, y)·[red]/H(x, y)·[blue] > T1 & H(x, y)·[green]/H(x, y)·[blue] > T2 & H(x, y)·[red]/H(x, y)·[green] > T3,
blue, if H(x, y)·[red]/H(x, y)·[blue] < T4 & H(x, y)·[green]/H(x, y)·[blue] < T5,
red, if H(x, y)·[red]/H(x, y)·[blue] > T6 & H(x, y)·[green]/H(x, y)·[blue] > T7. (3)
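The ratio tests of equation (3) can be sketched in C++ as below. The threshold values T1–T7 here are invented placeholders; the paper set them experimentally and does not report the numbers.

```cpp
#include <algorithm>
#include <cassert>

enum class ConeColor { None, Yellow, Blue, Red };

// Classify one pixel by its channel ratios, following the structure of
// equation (3). The thresholds are illustrative assumptions only.
ConeColor classifyPixel(double r, double g, double b) {
    const double T1 = 1.4, T2 = 1.2, T3 = 1.1;  // yellow conditions (assumed)
    const double T4 = 0.8, T5 = 0.9;            // blue conditions (assumed)
    const double T6 = 1.5, T7 = 0.6;            // red conditions (assumed)
    b = std::max(b, 1.0);  // guard the denominators against division by zero
    g = std::max(g, 1.0);
    if (r / b > T1 && g / b > T2 && r / g > T3) return ConeColor::Yellow;
    if (r / b < T4 && g / b < T5) return ConeColor::Blue;
    if (r / b > T6 && g / b > T7) return ConeColor::Red;
    return ConeColor::None;
}
```

A gray pixel fails all three ratio tests and is left unclassified, which is what pushes background pixels toward the noise-filtering stage rather than into a cone region.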

2.4.2. Size and Distance Calculation. When all traffic cone pixels in image H (x, y) were marked, each traffic cone's size and distance were calculated as shown in equation (4). The size S was the number of pixels in one isolated traffic cone area in H (x, y), while D was the average gray value of the same area in the depth image D (x, y):

S = 0 initially,
S = S + 1, if H(x, y) is a traffic cone pixel,
D = (Σi Σj D(xi, yj))/S, D(xi, yj) inside the traffic cone area. (4)
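Assuming the cone pixels have already been marked into a binary mask, equation (4) amounts to counting the mask pixels and averaging the depth image over them:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct ConeMeasure { int S; double meanDepth; };

// Compute the size S (pixel count) and average depth D over one marked cone
// region, as in equation (4). `mask` flags cone pixels found in H(x, y);
// `depth` is the matching depth image D(x, y).
ConeMeasure measureCone(const std::vector<std::vector<bool>>& mask,
                        const std::vector<std::vector<double>>& depth) {
    int S = 0;
    double sum = 0.0;
    for (std::size_t i = 0; i < mask.size(); ++i)
        for (std::size_t j = 0; j < mask[i].size(); ++j)
            if (mask[i][j]) { ++S; sum += depth[i][j]; }
    return { S, S > 0 ? sum / S : 0.0 };
}
```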

2.4.3. Noise Filtering and Target Marking. Since various objects appeared in the color images with colors similar to those of the traffic cones, it was necessary to eliminate them as noise. Because a traffic cone's size in the image is inversely proportional to its distance, fake traffic cone pixels were filtered based on the size S and the average distance D, as shown in equation (5). A candidate was rejected if S was less than the threshold at distance D and confirmed as a traffic cone if S was equal to or larger than that threshold. Finally, minimal external rectangles were calculated to mark all confirmed traffic cones in the area as the detected traffic cones:

is a traffic cone, if S ≥ threshold at D,
not a traffic cone, if S < threshold at D. (5)
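The size-versus-distance filter of equation (5) can be sketched as follows. The 1/D² threshold curve and the constant k are assumptions standing in for the paper's experimentally set thresholds; they capture only the stated fact that apparent cone size shrinks as distance grows.

```cpp
#include <cassert>

// Accept a candidate region as a traffic cone only if its pixel count S
// reaches a distance-dependent minimum, per equation (5). The threshold
// model k / D^2 is an illustrative assumption.
bool isTrafficCone(int S, double distanceMeters, double k = 2000.0) {
    double minSize = k / (distanceMeters * distanceMeters);
    return static_cast<double>(S) >= minSize;
}
```

With these assumed numbers, a region at 2 m must cover at least 500 pixels, while one at 20 m needs only 5, so a small cone-colored patch of ground close to the camera is rejected as noise.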

3. Results and Discussion

The experiment was separated into a color marking test and a distance matching test. The color marking test mainly focused on traffic cone recognition, whereas the distance matching test validated the space measuring function. In addition, a road test was conducted to validate the algorithm's stability and efficiency.

3.1. Traffic Cone Recognition Test. Twenty red traffic cones, fourteen blue cones, and sixteen yellow cones were manually

placed in front of the experiment car. As shown in Figure 6, recognized traffic cones were marked by rectangles of the same colors as the bodies of the cones, whereas the unrecognized ones were marked with white rectangles. The blue and yellow traffic cones reached a 100% detection success rate, while the red ones were accurately detected 85% of the time. The three undetected red traffic cones were located close to the left and right edges of the image and stood on a section of the playground that was reddish in color. Moreover, one of them was 10 meters away from the camera, and two were over 20 meters away. The ground color might have influenced the red color recognition.

3.2. Distance Matching Test. After the traffic cone marking process, the distance data matching test was conducted, and the experiment results are shown in Figure 7. The fourteen blue and sixteen yellow traffic cones were matched with the corresponding distance data from the depth image with a 100% accuracy rate. However, only 15 out of the 20 red traffic cones had corresponding distance data in their pixel areas of the depth image. Besides the three red traffic cones undetected in the recognition test, another two red ones on the left side, which were close to a blue pole, were mismatched in color and depth; the overlap might be the reason for this error. Consequently, 45 out of 50 traffic cones were successfully paired with their distance information, for an overall success rate of 90%. For the paired traffic cones, the error between the predicted and manually measured distances ranged from 2 cm to 1.1 m and grew as the distance between the camera and the cone increased. This error remained within 6%, which was acceptable while the experiment car ran at a speed of 10 km/h.

3.3. Road Test. To simulate a temporary road, the red traffic cones were designated as the left road boundary, and the blue ones were designated as the right road boundary. The yellow traffic cones indicated the start and end of the temporary road. The distance between any two traffic cones of the same color was 5 m, and the width of the temporary road, as marked by the red and blue cones, was 3 m. The temporary road included a curve-line section and a straight-line section, and the road test images are shown in Figure 8.

The experiment demonstrated that the machine vision system could detect red, blue, and yellow traffic cones and that the experiment car, in autopilot mode, could successfully navigate the temporary road at a speed of 10 km/h. Without the influence of similar ground colors, the recognition success rate increased. At times, one or two traffic cones were missing from a frame of the color and depth images, which might be explained as follows. First, some cones near the left and right edges of the images could not be paired in color and depth, as also happened in the initial static test. Since the distance between the car and the traffic cones near the edge of the image was quite long, this error would not impact driving safety. Besides, 12 frames of color and depth images were captured per second, so the missing cones could be detected in the following frames as they moved away from the image boundary area. Second, traffic cones that were entering or leaving the images while the experiment car was moving might not have been detected if they appeared only partially. Once these traffic cones fully entered the images, this problem was solved automatically.

Figure 6: Traffic cone recognition static test.

Figure 7: Traffic cone distance matching static test (red, blue, and yellow cones; both axes in units of 10^4 mm).

Figure 8: Road test. (a) The curve-line section. (b) The straight-line section.


4. Conclusion

An image processing algorithm based on color and depth images was successfully applied to traffic cone detection. Each image frame, comprising one color and one depth image, was captured and processed within 80 ms. The traffic cones were recognized by color with high accuracy, with success rates of 85%, 100%, and 100% for the red, blue, and yellow cones, respectively. Additionally, the distance was successfully sensed for 90% of the traffic cones by pairing the color and depth images. Some cones were missing from individual image frames when they were located near the image edges, but they were found in the following frames of the dynamic test: at 12 frames per second, cones at the edges naturally came in and out of the field of view of the moving camera. The method proved very effective on a temporary road marked by traffic cones of different colors. The advantages of using paired color and depth images for traffic cone detection can be summarized as follows: (1) the method is sensitive to small, safety-related traffic cones; (2) it uses a highly efficient and stable recognition algorithm; and (3) it is a cost-effective solution for maintaining safe driving on temporary roads.

Data Availability

All data presented and analyzed in this study were obtained from laboratory tests at Beijing Information Science & Technology University in Beijing, China. All laboratory testing data are presented in the figures and tables in the article. We will be very pleased to share all our raw data; if needed, please contact us via e-mail: suqinghua1985@qq.com.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The authors wish to thank the National Defense Science and Technology Project (JCCPCX201705). The authors also appreciate the great support from Beijing Information Science & Technology University through the Qin Xin Talents Cultivation Program (QXTCPA201903 and QXTCPB201901), the Scientific Research Level Promotion Project (2020KYNH112), and the School Research Fund (2025041).

References

[1] China Association for Road Traffic Safety (CARTS), "China's motor vehicles and drivers maintain rapid growth in 2016," Road Traffic Management, vol. 2, p. 9, 2017 (in Chinese).

[2] M. Gao, Analysis of Highway Traffic Accidents in Hebei Province and Preventive Measures, People's Public Security University of China, Beijing, China, 2019 (in Chinese).

[3] P. J. Ossenbruggen, J. Pendharkar, and J. Ivan, "Roadway safety in rural and small urbanized areas," Accident Analysis & Prevention, vol. 33, no. 4, pp. 485–498, 2001.

[4] T. Toroyan, "Global status report on road safety 2013: supporting a decade of action," Injury Prevention, vol. 15, no. 4, p. 286, 2013.

[5] Xinhuanet.com, "Analysis report on traffic of major cities in China 2018 Q3 released by AutoNavi," Urban Traffic, vol. 6, pp. 106-107, 2018 (in Chinese).

[6] R. Wang, L. Guo, L. Jin et al., "Recent research on safety assisted driving technology of intelligent vehicle," Highway Transportation Technology, vol. 24, no. 7, pp. 107–111, 2007 (in Chinese).

[7] I. F. Akyildiz, W. Su, Y. Sankarasubramaniam, and E. Cayirci, "A survey on sensor networks," IEEE Communications Magazine, vol. 40, no. 8, pp. 102–114, 2002.

[8] S. Ma and Z. Zhang, Computer Vision: Fundamentals of Computational Theory and Algorithms, Science Press, Beijing, China, 1998 (in Chinese).

[9] Y. Sun, Short Range Wireless Communication and Networking Technology, Xi'an University of Electronic Science and Technology Press, Xi'an, China, 2008 (in Chinese).

[10] Z. Zhe, H. Jia, W. Jiang et al., Research on Intelligent Control Technology, Hebei University of Technology and Industry Press, Hebei, China, 2010 (in Chinese).

[11] Y. Nie, "Analysis of vehicle safety assisted driving technology," Traffic and Transportation, vol. 2, pp. 151–153, 2008 (in Chinese).

[12] L. Jiaxing, Driver Fatigue Monitoring and Warning System Based on Multi-Parameter Fusion, pp. 226–232, Lanzhou University, Lanzhou, China, 2013.

[13] M. Tetsuya and N. Hidetoshi, "Analysis of relationship between characteristics of driver's eye movements and visual scene in driving events," in Proceedings of the 2011 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2011), IEEE, Taipei, Taiwan, June 2011.

[14] K. Zhang, D. Zhang, A. de La Fortelle, X. Wu, and J. Gregoire, "State-driven priority scheduling mechanisms for driverless vehicles approaching intersections," IEEE Transactions on Intelligent Transportation Systems, vol. 16, no. 5, pp. 2487–2500, 2015.

[15] Y. Wang, E. Wenjuan, D. Tian, G. Lu, and Y. Wang, "Vehicle collision warning system and collision detection algorithm based on vehicle infrastructure integration," in Proceedings of the 7th Advanced Forum on Transportation of China (AFTC 2011), IET, Beijing, China, October 2011.

[16] T. Chen and S. Lu, "Accurate and efficient traffic sign detection using discriminative AdaBoost and support vector regression," IEEE Transactions on Vehicular Technology, vol. 65, no. 6, pp. 4006–4015, 2016.

[17] F. Yang, "Development status and prospect of driverless vehicles," Shanghai Automotive, vol. 3, pp. 35–40, 2014 (in Chinese).

[18] A. Bazzi, A. Zanella, B. M. Masini, and G. Pasolini, "A distributed algorithm for virtual traffic lights with IEEE 802.11p," in Proceedings of the European Conference on Networks & Communications, IEEE, Valencia, Spain, October 2014.

[19] G. Wang, Y. Hou, Y. Zhang, Y. Zhou, N. Lu, and N. Cheng, "TLB-VTL: 3-level buffer based virtual traffic light scheme for intelligent collaborative intersections," in Proceedings of the IEEE 86th Vehicular Technology Conference (VTC-Fall), pp. 1–5, Toronto, Canada, September 2017.

[20] M. B. Younes and A. Boukerche, "Intelligent traffic light controlling algorithms using vehicular networks," IEEE Transactions on Vehicular Technology, vol. 65, no. 8, pp. 5887–5899, 2016.

[21] M. Betke, E. Haritaoglu, and L. S. Davis, "Multiple vehicle detection and tracking in hard real-time," in Proceedings of the IEEE Intelligent Vehicles Symposium, IEEE, Las Vegas, NV, USA, July 1996.

[22] Y. Chen, X. Huang, and S. Yang, "Research and development of automobile anti-collision early warning system," Computer Simulation, vol. 12, pp. 247–251, 2006 (in Chinese).

[23] S. Tak, S. Woo, and H. Yeo, "Sampling-based collision warning system with smartphone in cloud computing environment," in Proceedings of the 2015 IEEE Intelligent Vehicles Symposium (IV), IEEE, Dearborn, MI, USA, June 2015.

[24] C. Wang, Design and Implementation of Active Safety Early Warning System for Automobiles, Dalian University of Technology, Dalian, China, 2013.

[25] J. Yang, J. Wang, and B. Liu, "An intersection collision warning system using Wi-Fi smartphones in VANET," in Proceedings of the Global Communications Conference, Atlanta, GA, USA, December 2011.

[26] A. Dhall, D. Dai, and L. Van Gool, Real-Time 3D Traffic Cone Detection for Autonomous Driving, 2019.

[27] Y. Huang and J. Xue, "Real-time traffic cone detection for autonomous vehicle," in Proceedings of the Control Conference, IEEE, Piscataway, NJ, USA, 2015.

[28] L. Zhou, H. Wang, D. Wang, L. Xie, and K. P. Tee, "Traffic cone detection and localization in TechX Challenge 2013," vol. 234, no. 2, pp. 287–297, 2015.

8 Advances in Civil Engineering

Page 3: Advanced Driver-Assistance System (ADAS) for Intelligent …downloads.hindawi.com/journals/ace/2020/8883639.pdf · environmentmonitoring[7–10].Generally,ADASconsists ofactivesafetyandpassivesafety.Passivesafetyrelieson

embedded computer For safety reasons the VCU rejectedany invalid commands or any commands received in thepresence of a component error Each part of the controllingsystem communicated through the CAN bus with a250Kbps baud rate except for the machine vision systemwhich exchanged data with the embedded computerthrough Ethernet

+e red blue and yellow traffic cones that are widelyused on the roads in China were200mmtimes 200mmtimes 300mm (length width and heightrespectively) with a reflective stripe attached in the middleas shown in Figure 1 +e red and blue traffic cones wereused for indicating the left and right edges of a temporaryroad while the yellow ones specified the start and end of aroad in this experiment

22 Machine Vision System Figure 3 shows the Smart EyeB1 camera system (consisting of four cameras) chosen forthis research Twomonochrome cameras which composed astereo vision system were used for sensing real-time 3-di-mensional environment data whereas the color cameraswere detecting color information According to the speci-fications of the Smart B1 camera system its error of spaceprediction is lt6 within a detectable range of 05ndash60mAdditionally this camera system can automatically adjustwhite balance +e resolution for all cameras was set to1280lowast720 and the frequency of all cameras was set to 12 fpsTwo independent Ethernets with a 100 megabit bandwidthcontrolled the data exchange for the monochrome and colorcameras +e camera was placed 1500mm above the groundto simulate the field of view in a sedan +e example imagesare shown in Figure 4(a)

23 Range Detection via Stereo Vision In this experimenttwomonochrome cameras were used to build a stereo visionA point P (x y z) in a world coordinate system projectedinto the two cameras with the coordinates Pleft (xl yl zl) andPright (xr yr zr) Since the height of the two cameras was thesame the values of yl and yr were the same and the 3-di-mensional coordinate could be changed into a 2-dimen-sional coordinate for analysis as shown in Figure 5 f was thecamerarsquos focal length while b was the baseline of the left andright cameras

According to the triangle similarity law the followingrelation exists

z

f

y

yl

y

yr

x

xl

x minus b

xr (1)

From equation (1) the x y and z values can be calculatedwith the following equations

x xllowast b

xl minus xr

y ylowast b

xl minus xr

z flowast b

xl minus xr

⎧⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎨

⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎩

(2)

A depth image D (x y) which included the objectrsquosdistance information in each pixel was generated by the zvalues as a 32 bit floating matrix that could be visualized viathe handleDisparityPointByPoint () API from the camerasystemrsquos Standard Development Kit (SDK) A processeddepth image is presented in Figure 4(b) with the warmercolor indicating the longer distance +e original depthimage format was converted from the 32 bit floating matrixto a color image because the float data and pixel valuesexceeded 255 and were unavailable for display on the currentoperating system

24 TrafficConeDetection Traffic cone detection which wasdeveloped using C++ language with an OpenCV libraryconsisted of four functions color recognition size anddistance calculation noise filtering and the traffic conemarking

241 Color Recognition All traffic cones had the sameshape size and reflective stripes except for their color Sincethe differences between the yellow red and blue colors wereobvious they were able to distinguish from the color imagesby processing these images during the day time +e colordetection algorithm is shown in equation (3)+e red greenand blue values in each pixel of the color imageH (x y) wereused for ratio calculations that would determine this pixel

Machine visionsystem

DC motor

Battery

Traffic cones

Figure 1 Experiment car

Advances in Civil Engineering 3

(a) (b)

Figure 4 Images output from Smart B1 camera system (a) +e color image (b) +e processed depth image

Camera L

Camera R

Baseline b

f

f

Z

XP (x z)

xl

xr

xndashb

z

Figure 5 Range detection

Ethernet Embedded computer

VCU

BMS

Battery

DC motor controller

DC motor

Brake controller

Machine visionsystem

CAN bus

Brake

Figure 2 Vehicle control system

43

1 2

Figure 3 Smart B1 camera system

4 Advances in Civil Engineering

color feature+e thresholds fromT1 to T7 were set based onthe experimental results

yellow ifH(x y) middot [red]

H(x y) middot [blue]gtT1amp

H(x y) middot [green]

H(x y) middot [blue]gtT2amp

H(x y) middot [red]

H(x y) middot [green]gtT3

blue ifH(x y) middot [red]

H(x y) middot [blue]ltT4amp

H(x y) middot [green]

H(x y) middot [blue]ltT5

red ifH(x y) middot [red]

H(x y) middot [blue]gtT6amp

H(x y) middot [green]

H(x y) middot [blue]gtT7

⎧⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎨

⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎩

(3)

242 Size and Distance Calculation When all traffic conepixels in image H (x y) were marked traffic conersquos size anddistance were calculated as shown in equation (4) Size Swasthe number of pixels in one isolated traffic cone area inH (xy) while D was the average gray value in the same arearsquosdepth image D (x y)

S 0 as initial

S S + 1 ifH(x y) is traffic cone pixel

D 1113936i1113936jD(xi yj)

S D(xi yj)inside traffic cone area

⎧⎪⎪⎪⎪⎪⎪⎪⎪⎨

⎪⎪⎪⎪⎪⎪⎪⎪⎩

(4)

243 Noise Filtering and Target Marking Since variousobjects showed up in the color images with colors similar tothose of the traffic cones it was necessary to eliminate those asnoise Because the traffic cone size was in reverse proportion tothe distance in the images filtering of the fake traffic cone pixelswas conducted based on the size S and average distance dataDas shown in equation (5) A traffic cone was ignored unless Swas less than the threshold at distanceD and it was confirmedif S was equal to or larger than the threshold at D Finallyminimal external rectangles were calculated to mark all of theexisting traffic cones in the area as the detected traffic cones

is traffic cone ifSge threshold atD

not traffic cone ifSge threshold atD1113896 (5)

3 Results and Discussion

+e experiment was separated into a color marking test anda distance matching test +e color marking test was mainlyfocused on the traffic cone recognition whereas the distancematching test validated the space measuring function Inaddition a road test was conducted to validate the algo-rithmrsquos stability and efficiency

31 Traffic Cone Recognition Test Twenty red traffic conesfourteen blue cones and sixteen yellow cones were manually

placed in front of the experiment car As shown in Figure 6recognized traffic cones were marked by rectangles with thesame colors as the bodies of the cones whereas the un-recognized ones were marked with white rectangles +eblue and yellow traffic cones reached a 100 detectionsuccess rate while the red ones were accurately detected 85of the time +e three undetected red traffic cones werelocated close to the left and right edges of the image andplaced on a section of the playground that was reddish incolor Also one of them was 10 meters away from thecamera and two were over twenty meters away from thecamera +e ground color might have influenced red colorrecognition

3.2. Distance Matching Test. After the traffic cone marking process, the distance data matching test was conducted, and the experiment results are shown in Figure 7. Fourteen blue and sixteen yellow traffic cones were matched with the corresponding distance data from the depth image with a 100% accuracy rate. However, only 15 out of 20 red traffic cones had corresponding distance data in the pixel area of the depth image. Besides the three red traffic cones undetected in the recognition test, another two red ones on the left side, which were close to a blue pole, were mismatched in color and depth; the overlay might be the reason for this error. Consequently, 45 out of 50 traffic cones were successfully paired with their distance information, for an overall success rate of 90%. For the paired traffic cones, the error between the predicted and manually measured distance ranged from 2 cm to 1.1 m and grew as the distance between the camera and the cone increased. This error was within 6%, which was acceptable while the experiment car ran at a speed of 10 km/h.

3.3. Road Test. To simulate a temporary road, the red traffic cones were designated as the left road boundary and the blue ones as the right road boundary. The yellow traffic cones were used to indicate the start and end of the temporary road. The distance between any two traffic cones of the same color was 5 m, and the width of the temporary road, as marked by the red and blue cones, was 3 m. The temporary road included a curve-line section and a straight-line section, and the road test images are shown in Figure 8.

Advances in Civil Engineering

The experiment demonstrated that a machine vision system could detect red, blue, and yellow traffic cones, and that the experiment car, in an autopilot mode, could successfully navigate a temporary road at a speed of 10 km/h. Without the influence of similarly colored surroundings, the recognition success rate increased. At times, one or two traffic cones were missing from a frame of the color and depth images, which might be explained by the following. First, some cones near the left and right edges of the images could not be paired in color and depth, as also happened in the initial static test. Since the distance between the car and the traffic cones near the edge of the image was quite long, this error would not impact driving safety. Besides, 12 frames of color and depth images were captured per second, so the missing cones could be detected in the following frames as they moved away from the image boundary area. Second, traffic cones entering or leaving the images while the experiment car was moving might not have been detected if they showed up only partially. Once these traffic cones fully entered the images, this problem was solved automatically.

Figure 6: Traffic cone recognition static test.

Figure 7: Traffic cone distance matching static test (cone positions plotted on axes in ×10⁴ mm; red, blue, and yellow cones marked separately).

Figure 8: Road test. (a) The curve-line section. (b) The straight-line section.


4. Conclusion

An image processing algorithm based on color and depth images was successfully applied to traffic cone detection. Each image frame was analyzed within 80 ms, including the capture and processing of one color and one depth image. The traffic cones were recognized by color very accurately, with color recognition success rates of 85%, 100%, and 100% for red, blue, and yellow cones, respectively. Additionally, the distance was successfully sensed for 90% of the traffic cones by pairing the color and depth images. Some cones were missing from some image frames when they were located around the image edge area, but they could be found in the following frames of the dynamic test; with 12 frames per second in the machine vision system, cones at the edges naturally came in and out of the field of vision of the moving camera. This method was very effective on a temporary road marked by traffic cones of different colors. The advantages of using paired color and depth images for traffic cone detection can be summarized as follows: (1) the method is sensitive to small, safety-related traffic cones; (2) it uses a highly efficient and stable algorithm for recognition processing; (3) it is a cost-effective solution for maintaining safe driving on temporary roads.

Data Availability

All data presented and analyzed in the study were obtained from laboratory tests at Beijing Information Science & Technology University in Beijing, China. All laboratory testing data are presented in the figures and tables in the article. We will be very pleased to share all our raw data; if needed, please contact us via e-mail: suqinghua1985@qq.com.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The authors wish to thank the National Defense Science and Technology Project (JCCPCX201705). The authors also appreciate the great support from Beijing Information Science & Technology University through the Qin Xin Talents Cultivation Program (QXTCPA201903 and QXTCPB201901), the Scientific Research Level Promotion Project (2020KYNH112), and the School Research Fund (2025041).

References

[1] China Association for Road Traffic Safety (CARTS), "China's motor vehicles and drivers maintain rapid growth in 2016," Road Traffic Management, vol. 2, p. 9, 2017 (in Chinese).

[2] M. Gao, Analysis of Highway Traffic Accidents in Hebei Province and Preventive Measures, People's Public Security University of China, Beijing, China, 2019 (in Chinese).

[3] P. J. Ossenbruggen, J. Pendharkar, and J. Ivan, "Roadway safety in rural and small urbanized areas," Accident Analysis & Prevention, vol. 33, no. 4, pp. 485–498, 2001.

[4] T. Toroyan, "Global status report on road safety 2013: supporting a decade of action," Injury Prevention, vol. 15, no. 4, p. 286, 2013.

[5] Xinhuanet.com, "Analysis report on traffic of major cities in China 2018 Q3 released by AutoNavi," Urban Traffic, vol. 6, pp. 106–107, 2018 (in Chinese).

[6] R. Wang, L. Guo, L. Jin et al., "Recent research on safety assisted driving technology of intelligent vehicle," Highway Transportation Technology, vol. 24, no. 7, pp. 107–111, 2007 (in Chinese).

[7] I. F. Akyildiz, W. Su, Y. Sankarasubramaniam, and E. Cayirci, "A survey on sensor networks," IEEE Communications Magazine, vol. 40, no. 8, pp. 102–114, 2002.

[8] S. Ma and Z. Zhang, Computer Vision: Fundamentals of Computational Theory and Algorithm, Beijing Science Press, Beijing, China, 1998 (in Chinese).

[9] Y. Sun, Short Range Wireless Communication and Networking Technology, Xi'an University of Electronic Science and Technology Press, Xi'an, China, 2008 (in Chinese).

[10] Z. Zhe, H. Jia, W. Jiang et al., Research on Intelligent Control Technology, Hebei University of Technology and Industry Press, Hebei, China, 2010 (in Chinese).

[11] Y. Nie, "Analysis of vehicle safety assisted driving technology," Traffic and Transportation, vol. 2, pp. 151–153, 2008 (in Chinese).

[12] L. Jiaxing, Driver Fatigue Monitoring and Warning System Based on Multi-Parameter Fusion, pp. 226–232, Lanzhou University, Lanzhou, China, 2013.

[13] M. Tetsuya and N. Hidetoshi, "Analysis of relationship between characteristics of driver's eye movements and visual scene in driving events," in Proceedings of the 2011 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2011), IEEE, Taipei, Taiwan, June 2011.

[14] K. Zhang, D. Zhang, A. de La Fortelle, X. Wu, and J. Gregoire, "State-driven priority scheduling mechanisms for driverless vehicles approaching intersections," IEEE Transactions on Intelligent Transportation Systems, vol. 16, no. 5, pp. 2487–2500, 2015.

[15] Y. Wang, E. Wenjuan, D. Tian, G. Lu, and Y. Wang, "Vehicle collision warning system and collision detection algorithm based on vehicle infrastructure integration," in Proceedings of the 7th Advanced Forum on Transportation of China (AFTC 2011), IET, Beijing, China, October 2011.

[16] T. Chen and S. Lu, "Accurate and efficient traffic sign detection using discriminative AdaBoost and support vector regression," IEEE Transactions on Vehicular Technology, vol. 65, no. 6, pp. 4006–4015, 2016.

[17] F. Yang, "Development status and prospect of driverless vehicles," Shanghai Automotive, vol. 3, pp. 35–40, 2014 (in Chinese).

[18] A. Bazzi, A. Zanella, B. M. Masini, and G. Pasolini, "A distributed algorithm for virtual traffic lights with IEEE 802.11p," in Proceedings of the European Conference on Networks & Communications, IEEE, Valencia, Spain, October 2014.

[19] G. Wang, Y. Hou, Y. Zhang, Y. Zhou, N. Lu, and N. Cheng, "TLB-VTL: 3-level buffer based virtual traffic light scheme for intelligent collaborative intersections," in Proceedings of the IEEE 86th Vehicular Technology Conference (VTC-Fall), pp. 1–5, Toronto, Canada, September 2017.

[20] M. B. Younes and A. Boukerche, "Intelligent traffic light controlling algorithms using vehicular networks," IEEE Transactions on Vehicular Technology, vol. 65, no. 8, pp. 5887–5899, 2016.

[21] M. Betke, E. Haritaoglu, and L. S. Davis, "Multiple vehicle detection and tracking in hard real-time," in Proceedings of the IEEE Intelligent Vehicles Symposium, IEEE, Las Vegas, NV, USA, July 1996.

[22] Y. Chen, X. Huang, and S. Yang, "Research and development of automobile anti-collision early warning system," Computer Simulation, vol. 12, pp. 247–251, 2006 (in Chinese).

[23] S. Tak, S. Woo, and H. Yeo, "Sampling-based collision warning system with smartphone in cloud computing environment," in Proceedings of the 2015 IEEE Intelligent Vehicles Symposium (IV), IEEE, Dearborn, MI, USA, June 2015.

[24] C. Wang, Design and Implementation of Active Safety Early Warning System for Automobiles, Dalian University of Technology, Dalian, China, 2013.

[25] J. Yang, J. Wang, and B. Liu, "An intersection collision warning system using Wi-Fi smartphones in VANET," in Proceedings of the Global Communications Conference, DBLP, Atlanta, GA, USA, December 2011.

[26] A. Dhall, D. Dai, and L. Van Gool, Real-Time 3D Traffic Cone Detection for Autonomous Driving, 2019.

[27] Y. Huang and J. Xue, "Real-time traffic cone detection for autonomous vehicle," in Proceedings of the Control Conference, IEEE, Piscataway, NJ, USA, 2015.

[28] L. Zhou, H. Wang, D. Wang, L. Xie, and K. P. Tee, "Traffic cone detection and localization in TechX Challenge 2013," Physical Review Letters Na, vol. 234, no. 2, pp. 287–297, 2015.


Figure 4: Images output from the Smart B1 camera system. (a) The color image. (b) The processed depth image.

Figure 5: Range detection (stereo geometry: left and right cameras with focal length f separated by baseline b; a point P(x, z) projects to image coordinates xl and xr, from which the depth z is triangulated).
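The geometry labeled in Figure 5 is standard stereo triangulation: a point P imaged at horizontal coordinates xl and xr in the left and right cameras has depth z = f·b / (xl − xr). A minimal sketch, where the focal length and baseline numbers are illustrative rather than the Smart B1's actual calibration:

```python
def stereo_depth(x_l, x_r, f, b):
    """Depth from stereo disparity: z = f * b / (x_l - x_r).

    x_l, x_r: horizontal image coordinates of the same point in the
    left/right cameras (same units as f); b: camera baseline.
    """
    disparity = x_l - x_r
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front of the rig")
    return f * b / disparity

# Illustrative numbers: f = 700 px, baseline 120 mm, disparity 14 px.
z = stereo_depth(x_l=400, x_r=386, f=700, b=120)  # -> 6000.0 (mm)
```

Note the reciprocal relationship: depth resolution degrades as disparity shrinks, which is consistent with the paper's observation that distance error grew with range.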

Figure 2: Vehicle control system (the machine vision system feeds an embedded computer over Ethernet; the VCU, BMS and battery, DC motor controller and DC motor, and brake controller and brake are connected over the CAN bus).
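The paper only states that commands travel from the embedded computer to the motor and brake controllers over the CAN bus; the 8-byte payload below (message ID, field order, and scaling) is a purely hypothetical illustration of how such a command frame might be packed:

```python
import struct

# Hypothetical CAN arbitration ID for the drive command (assumption).
MOTOR_CMD_ID = 0x101

def encode_drive_cmd(speed_kmh: float, steering_deg: float, brake: bool) -> bytes:
    """Pack speed (0.01 km/h units), steering angle (0.1 deg units),
    and a brake flag into a little-endian 8-byte CAN payload;
    the trailing 3 bytes are padding."""
    return struct.pack("<hhB3x",
                       int(speed_kmh * 100),
                       int(steering_deg * 10),
                       1 if brake else 0)

payload = encode_drive_cmd(10.0, -2.5, False)
assert len(payload) == 8  # classic CAN frames carry at most 8 data bytes
```

The 8-byte limit is the one real constraint here: classic CAN frames carry at most 8 data bytes, so command fields must be quantized to fit.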

Figure 3: Smart B1 camera system (cameras labeled 1–4).

color feature. The thresholds T1 to T7 were set based on the experimental results:

$$
\begin{cases}
\text{yellow}, & \text{if } \dfrac{H(x,y)\cdot[\text{red}]}{H(x,y)\cdot[\text{blue}]} > T_1 \;\&\; \dfrac{H(x,y)\cdot[\text{green}]}{H(x,y)\cdot[\text{blue}]} > T_2 \;\&\; \dfrac{H(x,y)\cdot[\text{red}]}{H(x,y)\cdot[\text{green}]} > T_3,\\[6pt]
\text{blue}, & \text{if } \dfrac{H(x,y)\cdot[\text{red}]}{H(x,y)\cdot[\text{blue}]} < T_4 \;\&\; \dfrac{H(x,y)\cdot[\text{green}]}{H(x,y)\cdot[\text{blue}]} < T_5,\\[6pt]
\text{red}, & \text{if } \dfrac{H(x,y)\cdot[\text{red}]}{H(x,y)\cdot[\text{blue}]} > T_6 \;\&\; \dfrac{H(x,y)\cdot[\text{green}]}{H(x,y)\cdot[\text{blue}]} > T_7.
\end{cases} \tag{3}
$$
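Equation (3) amounts to per-pixel channel-ratio thresholding. A sketch in code, where the threshold values T1–T7 are illustrative placeholders rather than the experimentally tuned ones:

```python
def classify_pixel(r, g, b, T=(1.5, 1.5, 1.1, 0.6, 0.8, 1.8, 0.9)):
    """Channel-ratio classification in the spirit of equation (3).
    The default thresholds T1..T7 are illustrative assumptions."""
    T1, T2, T3, T4, T5, T6, T7 = T
    blue = max(b, 1)    # guard against division by zero
    green = max(g, 1)
    if r / blue > T1 and g / blue > T2 and r / green > T3:
        return "yellow"
    if r / blue < T4 and g / blue < T5:
        return "blue"
    if r / blue > T6 and g / blue > T7:
        return "red"
    return "background"

print(classify_pixel(230, 200, 60))  # -> yellow
print(classify_pixel(40, 60, 200))   # -> blue
print(classify_pixel(200, 60, 40))   # -> red
```

Using ratios rather than raw channel values gives some robustness to overall brightness changes, which is presumably why the red detection failures occurred on reddish ground rather than in shadow.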

2.4.2. Size and Distance Calculation. When all traffic cone pixels in image H(x, y) were marked, each traffic cone's size and distance were calculated as shown in equation (4). Size S was the number of pixels in one isolated traffic cone area in H(x, y), while D was the average gray value of the same area in the depth image D(x, y).

$$
\begin{cases}
S = 0 \text{ (initially)}; \quad S \leftarrow S + 1, & \text{if } H(x, y) \text{ is a traffic-cone pixel},\\[4pt]
D = \dfrac{\sum_i \sum_j D(x_i, y_j)}{S}, & D(x_i, y_j) \text{ inside the traffic-cone area}.
\end{cases} \tag{4}
$$
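Equation (4) can be read directly as code: S counts the pixels of one isolated cone region in H(x, y), and D averages the depth image's gray values over those same pixels. A minimal sketch using plain 2-D lists in place of the real image buffers:

```python
def size_and_distance(cone_mask, depth_image):
    """Compute S (pixel count) and D (mean depth gray value) for one
    cone region, per equation (4). cone_mask and depth_image are
    equally sized 2-D arrays; nonzero mask entries are cone pixels."""
    S, total = 0, 0
    for mask_row, depth_row in zip(cone_mask, depth_image):
        for is_cone, d in zip(mask_row, depth_row):
            if is_cone:
                S += 1
                total += d
    D = total / S if S else 0
    return S, D

mask  = [[0, 1, 1],
         [0, 1, 0]]
depth = [[9, 50, 52],
         [9, 54, 9]]
print(size_and_distance(mask, depth))  # -> (3, 52.0)
```

The (S, D) pair per region is exactly what the size-versus-distance noise filter of equation (5) consumes.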

243 Noise Filtering and Target Marking Since variousobjects showed up in the color images with colors similar tothose of the traffic cones it was necessary to eliminate those asnoise Because the traffic cone size was in reverse proportion tothe distance in the images filtering of the fake traffic cone pixelswas conducted based on the size S and average distance dataDas shown in equation (5) A traffic cone was ignored unless Swas less than the threshold at distanceD and it was confirmedif S was equal to or larger than the threshold at D Finallyminimal external rectangles were calculated to mark all of theexisting traffic cones in the area as the detected traffic cones

is traffic cone ifSge threshold atD

not traffic cone ifSge threshold atD1113896 (5)

3 Results and Discussion

+e experiment was separated into a color marking test anda distance matching test +e color marking test was mainlyfocused on the traffic cone recognition whereas the distancematching test validated the space measuring function Inaddition a road test was conducted to validate the algo-rithmrsquos stability and efficiency

31 Traffic Cone Recognition Test Twenty red traffic conesfourteen blue cones and sixteen yellow cones were manually

placed in front of the experiment car As shown in Figure 6recognized traffic cones were marked by rectangles with thesame colors as the bodies of the cones whereas the un-recognized ones were marked with white rectangles +eblue and yellow traffic cones reached a 100 detectionsuccess rate while the red ones were accurately detected 85of the time +e three undetected red traffic cones werelocated close to the left and right edges of the image andplaced on a section of the playground that was reddish incolor Also one of them was 10 meters away from thecamera and two were over twenty meters away from thecamera +e ground color might have influenced red colorrecognition

32 Distance Matching Test After the traffic cone markingprocess the distance data matching test was conducted andthe experiment results are shown in Figure 7 Fourteen blueand sixteen yellow traffic cones were matched with thecorresponding distance data from the depth image with a100 accuracy rate However only 15 out of 20 red trafficcones had the corresponding distance data in the pixel areaof the depth image Besides the three undetected red trafficcones in recognition test another two red ones on the leftside which were close to a blue pole were mismatched incolor and depth +e overlay might be the reason for thiserror Consequently 45 out of 50 traffic cones were suc-cessfully paired with their distance information and theoverall success rate was 90 A prediction error existed forthe paired traffic cones from 2 cm to 11m between predicteddistance and manual measured distance and this error wentup when the distance between the camera and the coneincreased +is error was within 6 and it was acceptablewhile the experiment car ran at a speed of 10 kmh

33 Road Test To simulate a temporary road the trafficcones with red color were designated as the left roadboundary and the blue ones were designated as the rightroad boundary+e yellow traffic cones were used to indicatethe start and end of the temporary road +e distance be-tween any two traffic cones of the same color was 5m andthe width of the temporary road as marked by the red andblue cones was 3m +e temporary road included a curve-

Advances in Civil Engineering 5

line section and a straight-line section and the road testimages are shown in Figure 8

+e experiment demonstrated that a machine visionsystem could detect red blue and yellow traffic cones andthe experiment car in an autopilot mode could successfullynavigate a temporary road at a speed of 10 kmh Withoutthe similar color influence the success rate of recognitionincreased At times one or two traffic cones were missingfrom a frame of color and depth image and this might beexplained by the following First some cones that were nearthe left and right edges of the images could not be paired in

color and depth and the same happened in the initial statictest Since the distance between the car and the traffic conesnear the edge of the image was quite long the error wouldnot impact driving safety Besides 12 frames of color anddepth images were captured in one second so the missingcones could be detected in the following frames while theymoved away from the image boundary area Second trafficcones that were entering or leaving the images while theexperiment car was moving might not have been detected ifthey showed up only partially Once these traffic cones fullyentered the images this problem was solved automatically

Figure 6 Traffic cone recognition static test

2

times104 mm

times104 mm

15

1

05ndash2 ndash15 ndash1 ndash05 0 05 1 15 2

Red coneBlue coneYellow cone

Figure 7 Traffic cone distance matching static test

(a) (b)

Figure 8 Road test (a) +e curve-line section (b) +e straight-line section

6 Advances in Civil Engineering

4 Conclusion

An image processing algorithm based on color and depthimages was successfully applied to traffic cone detectionEach image frame was analyzed within 80ms which in-cluded one color and one depth image capture and pro-cessing +e traffic cones were very accurately recognized bycolor with the success rates of color recognition being 85100 and 100 for red blue and yellow cones respectivelyAdditionally the distance was successfully sensed for 90 ofthe traffic cones by pairing color and depth images Some ofthe cones were missing in some of the image frames whenthey were located around the image edge area but they couldbe found in the following frames of the dynamic test With12 frames per second in the machine vision system cones atthe edges of the area naturally came in and out of the field ofvision of the moving camera +is method was very effectiveon a temporary road marked by traffic cones of differentcolors +e advantages of using paired color and depthimages for traffic cone detection can be summarized asfollows (1) +is method is sensitive to small safety-relatedtraffic cones (2) It uses a highly efficient and stable algo-rithm for recognition processing (3) It is a cost-effectivesolution for maintaining safe driving on temporary roads

Data Availability

All data presented and analyzed in the study were obtainedfrom laboratory tests at Beijing Information Science ampTechnology University in Beijing China All laboratorytesting data are presented in the figures and tables in thearticle We will be very pleased to share all our raw data Ifneeded please contact us via e-mail suqinghua1985qqcom

Conflicts of Interest

+e authors declare that they have no conflicts of interest

Acknowledgments

+e authors wish to thank the National Defense Science andTechnology Project (JCCPCX201705)+e authors also ap-preciate the great support from Beijing Information Scienceamp Technology University with Qin Xin Talents CultivationProgram (QXTCPA201903 and QXTCPB201901) ScientificResearch Level Promotion Project (2020KYNH112) andSchool Research Fund (2025041)

References

[1] China Association for Road Traffic Safety (CARTS) ldquoChinarsquosmotor vehicles and drivers maintain rapid growth in 2016rdquoRoad Traffic Management vol 2 p 9 2017 in Chinese

[2] M Gao Analysis of Highway Traffic Accidents in HebeiProvince and Preventive Measures [D] Peoplersquos Public Se-curity University of China Beijing China 2019 in Chinese

[3] P J Ossenbruggen J Pendharkar and J Ivan ldquoRoadwaysafety in rural and small urbanized areasrdquoAccident Analysis ampPrevention vol 33 no 4 pp 485ndash498 2001

[4] T Toroyan ldquoGlobal status report on road safety 2013 sup-porting a decade of actionrdquo Injury Prevention vol 15 no 4p 286 2013

[5] Xinhuanetcom ldquoAnalysis report on traffic of major cities inChina 2018 Q3 released by Autonavirdquo Urban Traffic vol 6pp 106-107 2018 in Chinese

[6] R Wang L Guo L Jin et al ldquoRecent research on safetyassisted driving technology of intelligent vehiclerdquo HighwayTransportation Technology vol 24 no 7 pp 107ndash111 2007 inChinese

[7] I F Akyildiz W Weilian Su Y Sankarasubramaniam andE Cayirci ldquoA survey on sensor networksrdquo IEEE Communi-cations Magazine vol 40 no 8 pp 102ndash114 2002

[8] S Ma and Z Zhang Computer Vision Fundamentals ofComputational eory and Algorithm Beijing Science PressBeijing China (in Chinese) 1998

[9] Yi Sun Short Range Wireless Communication and NetworkingTechnology Xirsquoan University of Electronic Science andTechnology Press Xirsquoan China (in Chinese) 2008

[10] Z Zhe H Jia W Jiang et al Research on Intelligent ControlTechnology Hebei University of technology and IndustryPress Hebei China (in Chinese) 2010

[11] Y Nie ldquoAnalysis of vehicle safety assisted driving technol-ogyrdquo Traffic and Transportation vol 2 pp 151ndash153 2008 (inChinese)

[12] L Jiaxing Driver Fatigue Monitoring and Warning SystemBased-On Multi-Parameret Fusion pp 226ndash232 LanzhouUniversity Lanzhou China 2013

[13] M Tetsuya and N Hidetoshi ldquoAnalysis of relationship be-tween characteristics of driverrsquos eye movements and visualscene in driving eventsrdquo in Proceedings of the 2011 IEEEInternational Conference on Fuzzy Systems (FUZZ-IEEE2011) IEEE Taipei Taiwan June 2011

[14] K Zhang D Zhang A de La Fortelle X Wu and J GregoireldquoState-driven priority scheduling mechanisms for driverlessvehicles approaching intersectionsrdquo IEEE Transactions onIntelligent Transportation Systems vol 16 no 5 pp 2487ndash2500 2015

[15] Y Wang E Wenjuan D Tian G Lu and Y Wang ldquoVehiclecollision warning system and collision detection algorithmbased on vehicle infrastructure integrationrdquo in Proceedings ofthe Advanced Forum on Transportation of China (AFTC 2011)7th IET Beijing China October 2011

[16] T Chen and S Lu ldquoAccurate and efficient traffic sign de-tection using discriminative adaboost and support vectorregressionrdquo IEEE Transactions on Vehicular Technologyvol 65 no 6 pp 4006ndash4015 2016

[17] F Yang ldquoDevelopment status and prospect of driverlessvehiclesrdquo Shanghai Automotive vol 3 pp 35ndash40 2014 (inChinese)

[18] A Bazzi A Zanella B M Masini and G Pasolini A Dis-tributed Algorithm for Virtual Traffic Lights with IEEE 80211pin Proceedings of the European Conference on Networks ampCommunications IEEE Valencia Spain October 2014

[19] G Wang Y Hou Y Zhang Y Zhou N Lu and N ChengldquoTlb-Vtl 3-Level buffer based virtual traffic light scheme forintelligent collaborative intersectionsrdquo in Proceedings IEEE86th Vehicular Technology Conference (Vtc-Fall) pp 1ndash5Toronto canada September 2017

[20] M B Younes and A Boukerche ldquoIntelligent traffic lightcontrolling algorithms using vehicular networksrdquo IEEETransactions on Vehicular Technology vol 65 no 8pp 5887ndash5899 2016

Advances in Civil Engineering 7

[21] M Betke E Haritaoglu and L S Davis ldquoMultiple vehicledetection and tracking in hard real-timerdquo in Proceedings of theIEEE Intelligent Vehicles Symposium IEEE Las Vegas NVUSA July 1996

[22] Y Chen X Huang and S Yang ldquoResearch and developmentof automobile anti-collision early warning systemrdquo ComputerSimulation vol 12 pp 247ndash251 2006 (in Chinese)

[23] S Tak S Woo and H Yeo ldquoSampling-based collisionwarning system with smartphone in cloud computing envi-ronmentrdquo in Proceedings of the 2015 IEEE Intelligent VehiclesSymposium (IV) IEEE Dearborn MI USA June 2015

[24] C Wang Design and Implementation of Active Safety EarlyWarning System for Automobiles Dalian University oftechnology Dalian China 2013

[25] J Yang J Wang and B Liu ldquoAn intersection collisionwarning system using wi-fi smartphones in VANETrdquo inProceedings of the Global Communications Conference DBLPAtlanta GA USA December 2011

[26] A Dhall D Dai and L Van Gool Real-time 3d Traffic ConeDetection for Autonomous Driving 2019

[27] Y Huang and J Xue ldquoReal-time traffic cone detection forautonomous vehiclerdquo in Proceedings of the Control ConferenceIEEE Piscataway NJ USA 2015

[28] L Zhou HWang DWang L Xie and K P Tee ldquoTraffic conedetection and localization in techx challenge 2013rdquo PhysicalReview Letters Na vol 234 no 2 pp 287ndash297 2015

8 Advances in Civil Engineering

Page 5: Advanced Driver-Assistance System (ADAS) for Intelligent …downloads.hindawi.com/journals/ace/2020/8883639.pdf · environmentmonitoring[7–10].Generally,ADASconsists ofactivesafetyandpassivesafety.Passivesafetyrelieson

color feature+e thresholds fromT1 to T7 were set based onthe experimental results

yellow ifH(x y) middot [red]

H(x y) middot [blue]gtT1amp

H(x y) middot [green]

H(x y) middot [blue]gtT2amp

H(x y) middot [red]

H(x y) middot [green]gtT3

blue ifH(x y) middot [red]

H(x y) middot [blue]ltT4amp

H(x y) middot [green]

H(x y) middot [blue]ltT5

red ifH(x y) middot [red]

H(x y) middot [blue]gtT6amp

H(x y) middot [green]

H(x y) middot [blue]gtT7

⎧⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎨

⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎪⎩

(3)

242 Size and Distance Calculation When all traffic conepixels in image H (x y) were marked traffic conersquos size anddistance were calculated as shown in equation (4) Size Swasthe number of pixels in one isolated traffic cone area inH (xy) while D was the average gray value in the same arearsquosdepth image D (x y)

S 0 as initial

S S + 1 ifH(x y) is traffic cone pixel

D 1113936i1113936jD(xi yj)

S D(xi yj)inside traffic cone area

⎧⎪⎪⎪⎪⎪⎪⎪⎪⎨

⎪⎪⎪⎪⎪⎪⎪⎪⎩

(4)

243 Noise Filtering and Target Marking Since variousobjects showed up in the color images with colors similar tothose of the traffic cones it was necessary to eliminate those asnoise Because the traffic cone size was in reverse proportion tothe distance in the images filtering of the fake traffic cone pixelswas conducted based on the size S and average distance dataDas shown in equation (5) A traffic cone was ignored unless Swas less than the threshold at distanceD and it was confirmedif S was equal to or larger than the threshold at D Finallyminimal external rectangles were calculated to mark all of theexisting traffic cones in the area as the detected traffic cones

is traffic cone ifSge threshold atD

not traffic cone ifSge threshold atD1113896 (5)

3 Results and Discussion

+e experiment was separated into a color marking test anda distance matching test +e color marking test was mainlyfocused on the traffic cone recognition whereas the distancematching test validated the space measuring function Inaddition a road test was conducted to validate the algo-rithmrsquos stability and efficiency

31 Traffic Cone Recognition Test Twenty red traffic conesfourteen blue cones and sixteen yellow cones were manually

placed in front of the experiment car As shown in Figure 6recognized traffic cones were marked by rectangles with thesame colors as the bodies of the cones whereas the un-recognized ones were marked with white rectangles +eblue and yellow traffic cones reached a 100 detectionsuccess rate while the red ones were accurately detected 85of the time +e three undetected red traffic cones werelocated close to the left and right edges of the image andplaced on a section of the playground that was reddish incolor Also one of them was 10 meters away from thecamera and two were over twenty meters away from thecamera +e ground color might have influenced red colorrecognition

32 Distance Matching Test After the traffic cone markingprocess the distance data matching test was conducted andthe experiment results are shown in Figure 7 Fourteen blueand sixteen yellow traffic cones were matched with thecorresponding distance data from the depth image with a100 accuracy rate However only 15 out of 20 red trafficcones had the corresponding distance data in the pixel areaof the depth image Besides the three undetected red trafficcones in recognition test another two red ones on the leftside which were close to a blue pole were mismatched incolor and depth +e overlay might be the reason for thiserror Consequently 45 out of 50 traffic cones were suc-cessfully paired with their distance information and theoverall success rate was 90 A prediction error existed forthe paired traffic cones from 2 cm to 11m between predicteddistance and manual measured distance and this error wentup when the distance between the camera and the coneincreased +is error was within 6 and it was acceptablewhile the experiment car ran at a speed of 10 kmh

33 Road Test To simulate a temporary road the trafficcones with red color were designated as the left roadboundary and the blue ones were designated as the rightroad boundary+e yellow traffic cones were used to indicatethe start and end of the temporary road +e distance be-tween any two traffic cones of the same color was 5m andthe width of the temporary road as marked by the red andblue cones was 3m +e temporary road included a curve-

Advances in Civil Engineering 5

line section and a straight-line section and the road testimages are shown in Figure 8

+e experiment demonstrated that a machine visionsystem could detect red blue and yellow traffic cones andthe experiment car in an autopilot mode could successfullynavigate a temporary road at a speed of 10 kmh Withoutthe similar color influence the success rate of recognitionincreased At times one or two traffic cones were missingfrom a frame of color and depth image and this might beexplained by the following First some cones that were nearthe left and right edges of the images could not be paired in

color and depth and the same happened in the initial statictest Since the distance between the car and the traffic conesnear the edge of the image was quite long the error wouldnot impact driving safety Besides 12 frames of color anddepth images were captured in one second so the missingcones could be detected in the following frames while theymoved away from the image boundary area Second trafficcones that were entering or leaving the images while theexperiment car was moving might not have been detected ifthey showed up only partially Once these traffic cones fullyentered the images this problem was solved automatically

Figure 6 Traffic cone recognition static test

2

times104 mm

times104 mm

15

1

05ndash2 ndash15 ndash1 ndash05 0 05 1 15 2

Red coneBlue coneYellow cone

Figure 7 Traffic cone distance matching static test

(a) (b)

Figure 8 Road test (a) +e curve-line section (b) +e straight-line section

6 Advances in Civil Engineering

4 Conclusion

An image processing algorithm based on color and depthimages was successfully applied to traffic cone detectionEach image frame was analyzed within 80ms which in-cluded one color and one depth image capture and pro-cessing +e traffic cones were very accurately recognized bycolor with the success rates of color recognition being 85100 and 100 for red blue and yellow cones respectivelyAdditionally the distance was successfully sensed for 90 ofthe traffic cones by pairing color and depth images Some ofthe cones were missing in some of the image frames whenthey were located around the image edge area but they couldbe found in the following frames of the dynamic test With12 frames per second in the machine vision system cones atthe edges of the area naturally came in and out of the field ofvision of the moving camera +is method was very effectiveon a temporary road marked by traffic cones of differentcolors +e advantages of using paired color and depthimages for traffic cone detection can be summarized asfollows (1) +is method is sensitive to small safety-relatedtraffic cones (2) It uses a highly efficient and stable algo-rithm for recognition processing (3) It is a cost-effectivesolution for maintaining safe driving on temporary roads

Data Availability

All data presented and analyzed in the study were obtainedfrom laboratory tests at Beijing Information Science ampTechnology University in Beijing China All laboratorytesting data are presented in the figures and tables in thearticle We will be very pleased to share all our raw data Ifneeded please contact us via e-mail suqinghua1985qqcom

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

The authors wish to thank the National Defense Science and Technology Project (JCCPCX201705). The authors also appreciate the great support from Beijing Information Science & Technology University through the Qin Xin Talents Cultivation Program (QXTCPA201903 and QXTCPB201901), the Scientific Research Level Promotion Project (2020KYNH112), and the School Research Fund (2025041).

References

[1] China Association for Road Traffic Safety (CARTS), "China's motor vehicles and drivers maintain rapid growth in 2016," Road Traffic Management, vol. 2, p. 9, 2017 (in Chinese).

[2] M. Gao, Analysis of Highway Traffic Accidents in Hebei Province and Preventive Measures, People's Public Security University of China, Beijing, China, 2019 (in Chinese).

[3] P. J. Ossenbruggen, J. Pendharkar, and J. Ivan, "Roadway safety in rural and small urbanized areas," Accident Analysis & Prevention, vol. 33, no. 4, pp. 485–498, 2001.

[4] T. Toroyan, "Global status report on road safety 2013: supporting a decade of action," Injury Prevention, vol. 15, no. 4, p. 286, 2013.

[5] Xinhuanet.com, "Analysis report on traffic of major cities in China 2018 Q3 released by AutoNavi," Urban Traffic, vol. 6, pp. 106–107, 2018 (in Chinese).

[6] R. Wang, L. Guo, L. Jin et al., "Recent research on safety assisted driving technology of intelligent vehicle," Highway Transportation Technology, vol. 24, no. 7, pp. 107–111, 2007 (in Chinese).

[7] I. F. Akyildiz, W. Su, Y. Sankarasubramaniam, and E. Cayirci, "A survey on sensor networks," IEEE Communications Magazine, vol. 40, no. 8, pp. 102–114, 2002.

[8] S. Ma and Z. Zhang, Computer Vision: Fundamentals of Computational Theory and Algorithm, Science Press, Beijing, China, 1998 (in Chinese).

[9] Y. Sun, Short Range Wireless Communication and Networking Technology, Xi'an University of Electronic Science and Technology Press, Xi'an, China, 2008 (in Chinese).

[10] Z. Zhe, H. Jia, W. Jiang et al., Research on Intelligent Control Technology, Hebei University of Technology and Industry Press, Hebei, China, 2010 (in Chinese).

[11] Y. Nie, "Analysis of vehicle safety assisted driving technology," Traffic and Transportation, vol. 2, pp. 151–153, 2008 (in Chinese).

[12] L. Jiaxing, Driver Fatigue Monitoring and Warning System Based on Multi-Parameter Fusion, pp. 226–232, Lanzhou University, Lanzhou, China, 2013.

[13] M. Tetsuya and N. Hidetoshi, "Analysis of relationship between characteristics of driver's eye movements and visual scene in driving events," in Proceedings of the 2011 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE 2011), IEEE, Taipei, Taiwan, June 2011.

[14] K. Zhang, D. Zhang, A. de La Fortelle, X. Wu, and J. Gregoire, "State-driven priority scheduling mechanisms for driverless vehicles approaching intersections," IEEE Transactions on Intelligent Transportation Systems, vol. 16, no. 5, pp. 2487–2500, 2015.

[15] Y. Wang, E. Wenjuan, D. Tian, G. Lu, and Y. Wang, "Vehicle collision warning system and collision detection algorithm based on vehicle infrastructure integration," in Proceedings of the 7th Advanced Forum on Transportation of China (AFTC 2011), IET, Beijing, China, October 2011.

[16] T. Chen and S. Lu, "Accurate and efficient traffic sign detection using discriminative AdaBoost and support vector regression," IEEE Transactions on Vehicular Technology, vol. 65, no. 6, pp. 4006–4015, 2016.

[17] F. Yang, "Development status and prospect of driverless vehicles," Shanghai Automotive, vol. 3, pp. 35–40, 2014 (in Chinese).

[18] A. Bazzi, A. Zanella, B. M. Masini, and G. Pasolini, "A distributed algorithm for virtual traffic lights with IEEE 802.11p," in Proceedings of the European Conference on Networks & Communications, IEEE, Valencia, Spain, October 2014.

[19] G. Wang, Y. Hou, Y. Zhang, Y. Zhou, N. Lu, and N. Cheng, "TLB-VTL: 3-level buffer based virtual traffic light scheme for intelligent collaborative intersections," in Proceedings of the IEEE 86th Vehicular Technology Conference (VTC-Fall), pp. 1–5, Toronto, Canada, September 2017.

[20] M. B. Younes and A. Boukerche, "Intelligent traffic light controlling algorithms using vehicular networks," IEEE Transactions on Vehicular Technology, vol. 65, no. 8, pp. 5887–5899, 2016.


[21] M. Betke, E. Haritaoglu, and L. S. Davis, "Multiple vehicle detection and tracking in hard real-time," in Proceedings of the IEEE Intelligent Vehicles Symposium, IEEE, Las Vegas, NV, USA, July 1996.

[22] Y. Chen, X. Huang, and S. Yang, "Research and development of automobile anti-collision early warning system," Computer Simulation, vol. 12, pp. 247–251, 2006 (in Chinese).

[23] S. Tak, S. Woo, and H. Yeo, "Sampling-based collision warning system with smartphone in cloud computing environment," in Proceedings of the 2015 IEEE Intelligent Vehicles Symposium (IV), IEEE, Dearborn, MI, USA, June 2015.

[24] C. Wang, Design and Implementation of Active Safety Early Warning System for Automobiles, Dalian University of Technology, Dalian, China, 2013.

[25] J. Yang, J. Wang, and B. Liu, "An intersection collision warning system using Wi-Fi smartphones in VANET," in Proceedings of the Global Communications Conference, Atlanta, GA, USA, December 2011.

[26] A. Dhall, D. Dai, and L. Van Gool, "Real-time 3D traffic cone detection for autonomous driving," 2019.

[27] Y. Huang and J. Xue, "Real-time traffic cone detection for autonomous vehicle," in Proceedings of the Control Conference, IEEE, Piscataway, NJ, USA, 2015.

[28] L. Zhou, H. Wang, D. Wang, L. Xie, and K. P. Tee, "Traffic cone detection and localization in TechX Challenge 2013," vol. 234, no. 2, pp. 287–297, 2015.


