Autonomous Underwater Vehicle for Vision Based Tracking

Available online at www.sciencedirect.com

ScienceDirect

Procedia Computer Science 133 (2018) 169–180

www.elsevier.com/locate/procedia

International Conference on Robotics and Smart Manufacturing (RoSMa2018)

G. Santhan Kumar*, Unnikrishnan V. Painumgal*, M.N.V. Chaitanya Kumar, K.H.V. Rajesh

Vignan's Institute Of Information Technology, Besides VSEZ, Duvvada, Vadlapudi Post, Gajuwaka, Visakhapatnam, Andhra Pradesh, 530049, India

Abstract

This paper describes the design, construction and control of an "Autonomous Underwater Vehicle for Vision Based Tracking", built in Vignan's Institute of Information Technology, Centre for Innovation Lab. The underwater vision robot is essentially an Autonomous Underwater Vehicle (AUV) for the study of marine animals by automatically tracking and following them using computer vision. The AUV can also be used as a platform for photo-mapping of the seafloor using the onboard camera and light arrangement. The robot is proposed to have a 5-thruster configuration to achieve 4 degrees of freedom, controlled by an Inertial Measurement Unit (IMU) interfaced with a control unit and powered by commercial LiPo battery packs. The robot is equipped with roll, pitch, heading, and depth sensors which provide sufficient feedback signals to automatically control the vehicle to track a pre-planned trajectory. The centre of gravity and centre of buoyancy of the vehicle are positioned in such a way that it is self-stabilized. Along with this, the combination of sensors and speed control drivers provides more stability to the system using a closed-loop control system, without operator involvement. The AUV also captures videos during its mission using the camera. It is planned to have a multi-core umbilical cable for the video signal, water leakage alarm, feedback signals and battery charging lines. This will be used only for development and test purposes and will be removed during autonomous missions. Various control schemes can be applied for the vehicle to track different paths. The AUV is designed to the dimensions of 575×210×175 mm. The AUV uses O-rings on the hulls for a good water sealing effect as well as for faster assembly and disassembly. We expect this AUV development to mature into an advanced system that can be used as a platform for the study of the ocean by scientists, for environmental studies and for defense applications.

© 2018 The Authors. Published by Elsevier Ltd.
This is an open access article under the CC BY-NC-ND license (https://creativecommons.org/licenses/by-nc-nd/4.0/).
Peer-review under responsibility of the scientific committee of the International Conference on Robotics and Smart Manufacturing.

Keywords: Autonomous Underwater Vehicle; Vision Based Control System; Image Tracking and Processing.

1. Introduction

We know that more than 70% of the earth's surface is covered by the oceans and the oceans contain 97% of the earth's water. However, the knowledge we have about the oceans is limited, and less than 5% of the oceans have been explored so far. Till now, ocean exploration and oceanographic surveys have been done predominantly by the use of ships and by collecting samples from the sea by various methods [1]. However, using ships is expensive and hence surveys cannot be done on a larger scale. Therefore there is an increasing trend towards the development and utilization of Autonomous Underwater Vehicles, to lower the cost of ocean surveys and also to reduce the risks of sending humans into the hostile environments of the deep sea, with unpredictable weather and the high pressure at greater depths. Therefore, in this research we propose the development of a low-cost and reliable AUV which can make ocean exploration affordable and enable greater data collection from the sea. The AUV can also be used as a platform to add various sensors for its missions, including environmental studies and defense applications. The Australian sea floor survey department [2] has developed an AUV which can do sea floor mapping of large areas.

* Corresponding authors. Tel.: +91-9493201623; +91-9940422310.
E-mail address: [email protected]; [email protected]

1877-0509 © 2018 The Authors. Published by Elsevier Ltd.
This is an open access article under the CC BY-NC-ND license (https://creativecommons.org/licenses/by-nc-nd/4.0/).
Peer-review under responsibility of the scientific committee of the International Conference on Robotics and Smart Manufacturing.

10.1016/j.procs.2018.07.021


However, that AUV does not use its cameras for real-time control of the vehicle; the camera images are captured and stored in memory for offline processing. At the University of Tokyo, a miniature AUV named Yebisura [3] was developed by students to take part in student competitions. This AUV does line tracking based on real-time images from a forward-looking and a bottom-looking camera. Hence the vision algorithm of that AUV is not capable of tracking moving marine animals, whose shape and size can vary largely based on the distance to the animal (fish) or the pose of the animal. In the research proposed in this paper we take up the challenging task of tracking a live fish and following it, to study its behaviour and videograph it in a fully autonomous manner. This will enable long-term unattended autonomous study of marine animals, and the same algorithm can be used for remote underwater surveillance for defense applications.

An Autonomous Underwater Vehicle (AUV), also known as an unmanned underwater vehicle, is an underwater vehicle capable of self-propulsion. It is a robotic device that is driven through the water by a propulsion system, controlled by an on-board computer, and maneuverable in three dimensions. Such vehicles come under the category of mobile robots that have actuators, sensors and on-board intelligence to successfully complete their task with little or no human effort. The applications of AUVs have increased in recent years, such as cable or pipeline tracking and deep ocean exploration. These AUVs are designed to be small and flexible so that they can go into small and constrained regions easily. In order to reduce drag, and in turn to reduce energy consumption and improve dynamics, the AUVs are designed to have a streamlined body [4].

The paper is organized as follows. At first we start with describing the specifications and features of the proposed AUV. Then we move on to the overall functional concept of the AUV. In section 1.3 the vehicle architecture is explained in detail along with a block diagram of the AUV. Then the vehicle control system is briefly presented. Section 1.5 explains the vision based tracking system proposed in the AUV. Then the vehicle external design and features are presented in section 2.
Section 2.1 describes construction of the vehicle including the various hardware sections that form the essentials parts of the AUV. This Then we move on to the overall functional concept of the AUV. In section 1.3 the vehicle architecture is explained in detail along Then move on the overall functional of the AUV. In 1.3 the vehicle architecture explained in along with awe block diagram ofproposed the AUV. Then theconcept vehicle control system is briefly The section 1.5is the Vision based tracking system in the including AUV. Then vehicle external design and features are presented inonsection 2. detail Section 2.1 Then we move on to toCamera the overall functional concept of the AUV. In section section 1.3presented. thethat vehicle architecture isexplains explained in detail along describes construction ofthe the vehicle thethe various hardware sections form the essentials parts ofabout the AUV. This includes thrusters, and Controller Board. The section 2.2 explains the various tests conducted the AUV and their with a block diagram of AUV. Then the vehicle control system is briefly presented. The section 1.5 explains about the Vision with aa tracking block diagram of the AUV. Then the vehicle control system is briefly presented. The section 1.5 explains about the Vision based system proposed in the AUV. Then the vehicle external design and features are presented in section 2. Section 2.1 describes construction of the vehicle including the various hardware sections that form the essentials parts of the AUV. This with block diagram of the AUV. Then the vehicle control system is briefly presented. The section 1.5 explains about the Vision includes Camera and Controller Board. Thevehicle section 2.2 explains various tests conductedinonsection the AUV and their results arethrusters, presented. Finally the in conclusion isThen discussed in detail in section 4.the based tracking system proposed the AUV. the external design and features are presented 2. 
Section 2.1 based tracking system proposed in the AUV. Then vehicle external design and features are presented in section 2. Section 2.1 describes construction of the vehicle including thethe various hardware sections that formtests the essentials parts of AUV the AUV. This includes thrusters, Camera and Controller Board. The section 2.2 explains the various conducted on the and their based tracking system proposed in the AUV. Then the vehicle external design and features are presented in section 2. Section 2.1 results are presented. Finally the conclusion is discussed in detail in section 4. describes construction of the vehicle including the various hardware sections sections that that form form the the essentials essentials parts parts of of the the AUV. AUV. This This describes construction of the vehicle including the various hardware includes thrusters, Camera and Controller Board. The section 2.2 explains the various tests conducted on the AUV and their results are presented. Finally the conclusion is discussed in detail in section 4. describes construction of the vehicle including theThe various hardware sections that formtests the essentials parts of AUV the AUV. This 1.1. Specifications includes thrusters, Camera and Controller Board. section 2.2 explains the various conducted on the and their includes Camera The explains results arethrusters, presented. Finallyand the Controller conclusion Board. is discussed in detail2.2 in section 4.the includes thrusters, Camera and Controller Board. The section section 2.2 explains4. the various various tests tests conducted conducted on on the the AUV AUV and and their their 1.1. Specifications results are presented. Finally the conclusion is discussed in detail in section results are presented. Finally the conclusion is discussed in detail in section 4. 1.1. Specifications results are presented. Finally the conclusion is discussed in detail in section 4. 
For the successful design of any complex system, it is most important to start by clearly identifying the most critical requirements. In the case of underwater robotic vehicles, these are usually stated by the envisaged missions, such as depth rating, battery endurance, payload sensors, maneuverability, communications and user interface. The main design decisions are then determined by a combination of these specifications, together with possible constraints in fabrication, assembly and operational logistics. Hence we started the design of the AUV by defining the specifications given in Table 1.
Table 1: Specifications of AUV

S.No  Features                                          Values
1.    Body dimensions (L x W x H)                       575x210x175 mm
2.    Depth rating                                      10 meters
3.    Degrees of Freedom (surge, sway, heave & yaw)     4
4.    Hovering capability                               Yes
5.    Image capturing feature                           30 fps (max)
6.    Depth sensor, Leak detector                       Yes
7.    Forward velocity                                  1 m/s (max)
8.    Endurance                                         2 hours
9.    Way point tracking                                Yes
10.   Image tracking and vision based control           Yes

1.2. Concept

The main inbuilt feature of the proposed AUV is the vision based tracking and video recording of marine animal behaviour and photo mapping of the seafloor. The AUV is capable of diving to a pre-programmed depth and following waypoints to conduct the desired survey. During the survey the vehicle captures images using two cameras: one is the forward looking camera and the other is the bottom looking camera. While following the waypoints, if the vehicle identifies a marine animal such as a fish, it can enter the visual tracking mode and follow the animal; once the vehicle loses track of the animal, it proceeds to the next waypoint and continues its mission until reaching the last waypoint, after which it surfaces. During the entire mission the forward looking camera and the bottom looking camera capture images and store them in the internal memory. The bottom camera images can be mosaicked using a feature matching technique to create a large 2D photomap of the surveyed area through offline data processing.
1.3. Vehicle Architecture
The architecture of the proposed AUV is presented in the block diagram in Figure 1. The AUV is controlled by an onboard Raspberry Pi CPU board, which controls the mission and runs the computer vision algorithms that enable the vision based tracking. The control system in the CPU controls the 5 thrusters based on the waypoints in the mission plan as well as on the feedback from the vision based tracking algorithm. The thruster driver acts as an interface that takes the control signals from the CPU and delivers a proportional high-power current directly from the battery to drive the thruster motors. The depth sensor, IMU and leak detector are the sensor inputs to the CPU. The forward and bottom cameras and the respective LED lights are also controlled by the CPU. The LED lights are used for lighting the camera scene when sufficient natural ambient light is not present.


G.Santhan Kumar et al. / Procedia Computer Science 133 (2018) 169–180

The images captured by the cameras are stored in SD card memory by the CPU system. A 12 Volt Lithium Polymer battery pack powers the entire system. The figure given below shows the block diagram of the proposed AUV.

1.4. 1.4. Vehicle Vehicle Control Control 1.4. 1.4. Vehicle Vehicle Control Control 1.4. Vehicle Control

Figure 1: Block diagram of the proposed AUV

Figure 2: Simplified control scheme

Figure 2 shows a simplified vehicle control scheme. The mission plan will contain the details of the depth of operation, the cruise altitude from the sea floor and the waypoints of the mission, including the surfacing location. In the absence of the vision based tracker, the waypoint tracker takes care of the control system and generates control signals to control the thrusters based on feedback from sensors such as the depth sensor and IMU. The vision based tracker system continuously looks for the marine animals of interest mentioned in the mission plan for the particular mission. When a marine animal, such as a fish matching the shape/colour features given in the mission plan and the internal feature database, is found, the control system keeps the last passed waypoint in its memory and starts tracking the fish and capturing videos.
Once the vision tracker loses track of the object shape/colour features mentioned in the mission plan and internal feature data base is found then the control system keep the last passed way point in its memory and starts tracking the fish and capturing videos. Once the vision tracker loses track of the object passed way point in its memory and starts tracking the fish and capturing videos. Once the vision tracker loses track of the object shape/colour features mentioned in the mission plan and internal feature data base is found then the control system keep the last of interest, the vehicle continues to its next waypoint as per mission plan and completes the mission. The low-level control is passed way the point in its memory tracking the fish andmission capturing videos. Once the vision trackerThe loseslow-level track of the object of interest, vehicle continuesand to starts its next waypoint as per plan and completes the mission. control is of interest, the vehicle continues to its next waypoint as per mission plan and completes the mission. The low-level control is passed way point in its memory and starts tracking the fish and capturing videos. Once the vision tracker loses track of the object done using a simple PID controller to send control signals to thrusters in real-time. of interest, vehicle nextcontrol waypoint as per missioninplan and completes the mission. The low-level control is done using athe simple PIDcontinues controllertotoitssend signals to thrusters real-time. done using aathe simple PID controller signals to real-time. of interest, vehicle continues nextcontrol waypoint as per missionin and completes the mission. The low-level control is done using simple PID controllertoto toitssend send control signals to thrusters thrusters inplan real-time. 1.5. Vision based tracking done1.5. using a simple controller to send control signals to thrusters in real-time. Vision basedPID tracking 1.5. 1.5. Vision Vision based based tracking tracking 1.5. 
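The low-level loop can be sketched as a textbook PID controller; the gains, the first-order depth dynamics and the 10 Hz update rate below are illustrative assumptions, not the vehicle's tuned values:

```python
# Minimal sketch of a PID depth loop of the kind described above.
# Gains and the toy first-order vehicle dynamics are illustrative only.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy plant: heave thrust drives the depth rate with a first-order lag.
pid = PID(kp=2.0, ki=0.1, kd=0.5, dt=0.1)
depth, rate = 0.0, 0.0
target_depth = 5.0                      # metres, from the mission plan
for _ in range(600):                    # 60 s of simulated control at 10 Hz
    thrust = pid.update(target_depth, depth)
    rate += (thrust - rate) * 0.1       # thrust drives depth rate with lag
    depth += rate * 0.1

print(round(depth, 2))                  # settles near the 5 m setpoint
```

In the real vehicle the measurement would come from the pressure (depth) sensor and the IMU, and the controller output would be mapped to thruster commands.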
1.5. Vision based tracking

Figure 3: Schematic of vision based tracking system

G.Santhan Kumar et al. / Procedia Computer Science 133 (2018) 169–180


A schematic of the vision based tracking system is shown in Figure 3. The vision based tracking system proposed in the AUV will use an image feature matching technique to identify the species of the marine animal of interest specified in the mission plan. Instantaneous images captured by the vehicle's front camera will be scanned for matching features by the feature matching computer vision algorithm. The features are compared with those in a feature database pre-indexed with the features of marine animals commonly found in the geographic area of the survey. Until a match is found, each acquired image frame is scanned for matching features. Once a marine animal with matching features is found as per the mission plan input, the identified animal is tracked by the standard vision tracker algorithm, and the vehicle thrusters are also controlled to follow the animal and keep it in the scene of the camera as closely as possible, to capture its behaviour in the images. Once tracking is lost, i.e. the marine animal is no longer visible in the camera image, feature matching starts again and the process continues.

2. Vehicle design

The design of the AUV is shown in Figure 4. The vehicle will have one waterproof central main cylinder with a transparent dome-faced front section, as seen in the figure. The dome portion in the front will be made from acrylic material to give the front camera a transparent view of the area in front of the vehicle. The remaining section of the cylinder will be made from aluminum or other opaque materials. There will be two small auxiliary cylinders, one on either side, which can be used to carry additional batteries to extend the endurance of the vehicle in a single diving mission. There will be 5 thrusters to provide 4 degrees of freedom, namely surge, sway, heave and yaw for change of heading. The central cylinder will contain all the electronic systems and the main battery required for operation of the vehicle.
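The mapping from the four commanded degrees of freedom to the five thrusters can be written as a thrust allocation matrix. The paper does not give the thruster geometry, so the layout below (two forward thrusters providing surge and differential yaw, one lateral thruster for sway, two vertical thrusters for heave) and the lever arm `L` are illustrative assumptions:

```python
import numpy as np

# Hedged sketch of thrust allocation for a 5-thruster, 4-DOF vehicle.
# The real geometry is not specified in the paper; this layout is assumed.

L = 0.2  # assumed lever arm (m) of each forward thruster about the yaw axis

# Rows: surge, sway, heave, yaw. Columns: port-fwd, stbd-fwd, lateral, vert1, vert2.
A = np.array([
    [1.0, 1.0, 0.0, 0.0, 0.0],   # surge: both forward thrusters
    [0.0, 0.0, 1.0, 0.0, 0.0],   # sway: lateral thruster
    [0.0, 0.0, 0.0, 1.0, 1.0],   # heave: both vertical thrusters
    [-L,  L,   0.0, 0.0, 0.0],   # yaw moment: differential forward thrust
])

def allocate(tau):
    """Map a body-frame command [surge, sway, heave, yaw] to 5 thruster forces."""
    return np.linalg.pinv(A) @ tau

u = allocate(np.array([10.0, 0.0, 4.0, 0.5]))  # e.g. forward + dive + slight turn
print(np.round(u, 2))                          # port/stbd differ to produce yaw
```

The pseudo-inverse gives the minimum-effort thruster set that exactly realises the commanded force/moment vector for this (assumed) geometry.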
There will be a skid below the vehicle for proper placement and transportation of the vehicle on land, and also to prevent hitting or damaging the bottom camera when moving close to the seafloor.

Figure 4: Different views of the AUV, panels (a)-(d)

2.1. Construction

2.1.1. Building materials
The AUV's dome portion in the front will be made from acrylic material to give the front camera a transparent view of the area in front of the vehicle. The remaining section of the cylinder will be made from aluminum. The bottom camera casing will also be made from acrylic.

2.1.2. Propulsion
The AUV uses brushless DC motors along with customized, 3D-printed propellers to act as its thrusters. Thrusters take electrical energy from the battery and transform it into mechanical energy, or motion.


Thrusters have a cowling on them, with specially shaped blades that conform to the inside of the cowling, called nozzles or "Kort nozzles". The Kort nozzle helps to generate a higher net thrust compared to a thruster without a nozzle [5]. We used standard quadcopter BLDC motors available in our lab to reduce the development cost and also to speed up the overall project development time.

• Motors
Here we are using brushless DC (BLDC) motors with a 1000 KV, 12 V rating.

Figure 5: Brushless DC motor

• Propellers
We used propellers made on a 3D printing machine using 1.75 mm PLA material. Figure 6 shows the different models of propellers that were made and tested.

Figure 6: 3-D printed propeller models

2.1.3. Electronic speed controllers
The motors can spin in both directions, i.e. clockwise and counter-clockwise, by using bi-directional ESCs connected to an Arduino board.

Figure 7: Controlling a bidirectional ESC using an Arduino
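Bidirectional ESCs of this kind are typically driven with hobby-servo style pulses centred on a neutral width. A minimal sketch of the command mapping, assuming the common 1500 µs neutral with roughly ±400 µs of travel (the actual endpoints depend on the ESC's calibration):

```python
# Sketch of mapping a signed thrust command to a bidirectional-ESC pulse.
# 1500 us neutral with +/-400 us travel is a common hobby-ESC convention;
# the exact endpoints are an assumption and depend on ESC calibration.

NEUTRAL_US = 1500
RANGE_US = 400

def thrust_to_pulse(percent):
    """Convert -100..+100 % thrust (sign = direction) to a pulse width in us."""
    percent = max(-100.0, min(100.0, percent))
    return NEUTRAL_US + int(percent / 100.0 * RANGE_US)

print(thrust_to_pulse(0))     # 1500 -> stopped
print(thrust_to_pulse(100))   # 1900 -> full forward
print(thrust_to_pulse(-50))   # 1300 -> half reverse
```

On the Arduino side such a pulse would typically be generated with the Servo library's writeMicroseconds() call.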


2.1.4. Battery (LiPo)
The AUV uses electrical power from an on-board battery as the source of energy to drive its various components. The power source we used is a 12 V DC lithium polymer (LiPo) battery. The actual voltage produced by a fully charged LiPo battery is about 13 V DC, which is fed directly to the thruster motors through the ESCs. The low-voltage supply required for the controller board is obtained by using a step-down DC-DC converter. The vehicle is designed to operate for about 2 hours when operating autonomously in untethered mode [5].
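Endurance follows directly from pack capacity and average current draw. The capacity and current figures below are hypothetical, since the paper does not state them; they are chosen only to illustrate the calculation behind the ~2 hour figure:

```python
# Back-of-the-envelope endurance check. Capacity and average draw are
# hypothetical numbers, used only to illustrate the calculation.

capacity_ah = 10.0      # hypothetical pack capacity (Ah)
avg_current_a = 5.0     # hypothetical average draw of thrusters + electronics (A)

endurance_h = capacity_ah / avg_current_a
print(endurance_h)      # 2.0 hours with these assumed numbers
```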

2.1.5. Forward Camera

Figure 8: LiPo Battery used for the AUV

A camera with a frame rate of 30 fps is used for image capturing and also for image tracking and processing. Two servo motors form a gimbal mechanism to provide pan and tilt motion for the camera. The entire pan/tilt mechanism and camera arrangement is placed inside the hemispherical acrylic dome of the vehicle's main cylinder. This fully enclosed design allows 180 degrees of camera view while maintaining a waterproof seal. The pan and tilt feature provides a wider field of view for the camera. Light emitting diodes (LEDs) on the board are used to illuminate the underwater environment. The end cap has two grooves for the O-rings that keep the enclosure watertight [6].

2.1.6. Bottom Camera
The bottom-looking camera is used to capture images of the sea bottom directly below the AUV. These images can be stitched together during offline data processing to form a photo-mosaic image map of the seafloor. The bottom camera is also useful for taking part in AUV competitions where tracking of a line marked on a tank or pool floor is necessary. The secondary camera is placed in a waterproof housing similar to the one shown in Figure 9.

2.1.7. Control Board (Raspberry Pi)

Figure 9: Secondary camera

A Raspberry Pi controller was chosen because of its full processor and its established software drivers and libraries. The Ethernet port on the R-Pi board enables remote communication with the AUV through an Ethernet cable during the debugging process. In addition, the R-Pi board has 4 USB ports which can be used to interface standard low-cost USB cameras to the board for the vision based tracking. USB cameras are also less expensive and smaller than Ethernet-based cameras [7]. The Raspberry Pi software management is easier, and the software can easily be kept up-to-date. The board can run a standard Linux OS, which gives access to several additional features and standard programming libraries. The programming of the board is done in Python, which allows for quicker unit testing and easier management of a large code base than lower-level C programs. Using the Ethernet port on the Raspberry Pi board, the AUV is able to have its code updated while in the water using remote login capability. This allows for rapid fixes and iterative improvements while testing the AUV in the water [8].


2.2. Tests and Results

2.2.1. Thrust measurement
The thrust measurement was done in a water tank (size 650 x 470 x 360 mm) to calculate the thrust values in grams that can be produced by each BLDC motor according to the respective thrust input given in percentage. To carry out the thrust measurement in the lab, we made a custom thrust measuring setup as shown in Figure 10. The thruster is attached to one end of a wooden pole, as seen in the figure. The wooden pole is fixed onto a wooden sheet using a circular metal rod which acts as a pivot in a double lever arrangement. At the other end of the pole, at the same distance from the pivot point as the motor, a digital balance gauge is attached, as seen in the figure. The thruster power was increased from 10% to 100% to measure the corresponding force using the digital gauge. Our test showed that the thrust generated is roughly proportional to the percentage thrust input. The measured thrust values are shown in Table 2, and Figure 11 shows a graph of the thrust generated by the thruster for the various thrust inputs in percentage.


Figure 10: (a) Thrust measurement setup; (b) Measuring thrust in water; (c) Rope connected to weighing machine hook over a pulley and (d) Digital weighing machine.

2.2.2. Results

Table 2: Thrust measurement values

  S.No    Thruster Input (%)    Thrust Value (grams)
  1       0                     0
  2       10                    220
  3       20                    350
  4       30                    540
  5       40                    760
  6       50                    820
  7       60                    910
  8       70                    1050
  9       80                    1120
  10      90                    1200
  11      100                   1280
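A least-squares line through the Table 2 data quantifies how close the measured response is to proportional; this is a quick analysis sketch, not part of the original experiment:

```python
# Least-squares line through the Table 2 measurements, to quantify the
# roughly proportional thrust response observed in the lab test.

inputs = [0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100]                # thrust input (%)
grams  = [0, 220, 350, 540, 760, 820, 910, 1050, 1120, 1200, 1280]   # measured thrust (g)

n = len(inputs)
mx = sum(inputs) / n
my = sum(grams) / n
slope = sum((x - mx) * (y - my) for x, y in zip(inputs, grams)) / \
        sum((x - mx) ** 2 for x in inputs)
intercept = my - slope * mx
print(round(slope, 2), round(intercept, 1))   # -> 12.55 122.7
```

So the thruster delivers roughly 12.5 g of thrust per percent of input over this range, with the non-zero intercept reflecting the flattening of the curve at higher inputs.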


Figure 11: Thrust generated by the thruster during the lab test

The following are some of the important tests conducted on the AUV:
• Non-diving test to check functionality.
• Testing of vision based tracking.
• Diving test to test heave, surge, sway, yaw, and automatic depth control using feedback from the pressure sensor.
• Testing of automated waypoint tracking.

3.1. Development and test of the vision based tracking algorithm

The past few decades have witnessed a rapid increase in automated robots taking over tedious or dangerous tasks such as surveillance, rescue, and geological exploration. Among these tasks, target tracking using unmanned aerial vehicles (UAVs) and autonomous underwater vehicles (AUVs) has found many applications in engineering practice, with both research and commercial background [9]. The traditional approach to target tracking by an AUV involves image detection of the shape, size and velocity of the moving target. On the other hand, processing the uploaded images of the moving object provides the global coordinates of the tracker to facilitate navigation. In particular, with the help of image processing technology, the image trajectory of a moving target can be obtained by the use of a surveillance camera. Moreover, real-time tracker positions are estimated by an algorithm. These two sets of information can be considered substitutes for the detection data of the target and the global coordinates of the tracker. They jointly provide sufficient feedback for the tracker to achieve a tracking task. Vision-based tracking by an AUV basically consists of three consecutive steps: (i) image processing, (ii) coordinate transformation/estimation and (iii) trajectory following. By applying an image processing algorithm to a sequence of image frames taken by the surveillance camera on the tracker, visual tracking of the target on these image frames can be achieved.
Coordinate transformation between the image frame and the real-world frame then generates the desired trajectory for the tracker in the real-world frame, such that the projection of the desired trajectory on the ground plane imitates the original target movement. Based on the real-time estimate of the current tracker position obtained from the algorithm, a trajectory following law propels the AUV to follow the desired trajectory so that target tracking can be accomplished. Tracking is achieved by collecting the approximate horizontal distance between the target and the tracker from an image frame as feedback to a navigation law that keeps the distance within a reasonable range in the real-world frame. The centroid of the moving target in an image frame is then guaranteed to be inside a given area, and the navigation law guides the AUV to move in a path that follows the target.

3.1.1. Components of the testing platform for the vision based tracking test
The testing setup comprises an aquarium with a live goldfish. The fish is the target to be followed, so we conduct a feature matching test on every single image frame of the red goldfish captured for the visual tracking. A Logitech webcam is utilized as the surveillance or observation camera to track the fish. Image frames generated by the camera are 640 pixels wide and 480 pixels high.
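The navigation feedback described above, keeping the target centroid inside a given image area, can be sketched as a simple proportional law from pixel offset to yaw command. The gain and dead-band values are illustrative assumptions, not the vehicle's tuned parameters:

```python
# Sketch of a proportional navigation law: turn the target centroid's
# horizontal pixel offset into a yaw command. Gain and dead-band are
# illustrative assumptions.

IMG_W = 640                 # frame width used in the tests (pixels)
K_YAW = 0.005               # assumed proportional gain, command per pixel
DEADBAND_PX = 20            # ignore small offsets to avoid thruster jitter

def yaw_command(centroid_x):
    """Proportional yaw command (-1..1) from the target centroid column."""
    offset = centroid_x - IMG_W / 2
    if abs(offset) < DEADBAND_PX:
        return 0.0
    return max(-1.0, min(1.0, K_YAW * offset))

print(yaw_command(320))   # centred target -> 0.0
print(yaw_command(480))   # target right of centre -> 0.8 (turn right)
```

A similar law on the bounding-box size (apparent target scale) could regulate the standoff distance.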


Figure 12: Live gold fish for image tracking

3.2. Image Processing Algorithm
The image processing algorithm captures the real-time image frames from the camera and compares them with the template image already fed into the feature matching algorithm. Once a match is found, the image tracker tracks the object of interest. We have used ORB based feature matching and tracking. The underwater environment is fairly clutter-free, so features of the target (fish) such as colour and shape can be identified and tracked.

Figure 13: System overview

3.2.1. Feature based object tracking
Accurate tracking of feature points over image sequences is a critical and essential process for vision system algorithms based on target recognition to achieve visual tracking. We load the template (query or training) image at a rate of 4 frames per second. Here, tracking is done with ORB (Oriented FAST and Rotated BRIEF) using the OpenCV library. ORB is an efficient alternative to SIFT or SURF in terms of computational cost and feature matching speed. Moreover, ORB is open source and can be used freely without any patent restrictions [10].

ORB is basically a fusion of the FAST key point detector and the BRIEF descriptor with several modifications to enhance the performance. First it uses FAST to find key points, then applies the Harris corner measure to find the top N points among them. It also uses a pyramid to produce multi-scale features. One problem, however, is that FAST doesn't compute the orientation. ORB is much faster than SURF and SIFT, and the ORB descriptor works better than SURF. ORB is a good choice for low-power devices like the Raspberry Pi board.
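ORB descriptors are binary strings compared by Hamming distance (in OpenCV this matching step is performed by cv2.BFMatcher with cv2.NORM_HAMMING). A minimal pure-Python brute-force matcher illustrates that step; the toy 8-bit descriptors below stand in for ORB's 256-bit ones:

```python
# Brute-force Hamming matcher over binary descriptors, the matching step
# OpenCV performs for ORB. The 8-bit toy descriptors stand in for ORB's
# 256-bit binary strings.

def hamming(a, b):
    """Number of differing bits between two binary descriptors."""
    return bin(a ^ b).count("1")

def match(query, train):
    """For each query descriptor: (index, distance) of its nearest train one."""
    out = []
    for q in query:
        dists = [hamming(q, t) for t in train]
        best = min(range(len(train)), key=dists.__getitem__)
        out.append((best, dists[best]))
    return out

query = [0b10110010, 0b01111000]
train = [0b10110011, 0b00000000, 0b01111001]
print(match(query, train))   # [(0, 1), (2, 1)]
```

Because the distance is a bit-count on XORed integers, this comparison is far cheaper than the floating-point Euclidean distances used for SIFT/SURF descriptors, which is part of why ORB suits the Raspberry Pi.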

Figure 14: ORB feature matching

• SIFT • SIFT • SIFT • SIFT Scale Invariant Feature Transform (SIFT) is a feature detector developed by Lowe in 2004. Although SIFT has proven • SIFT Scale Invariant Feature Transform (SIFT) is a feature detector developed by Lowe in 2004. Although SIFT has proven Scale Invariant Feature Transform (SIFT) is a itfeature detector by Lowe in 2004.which Although proven to be• very efficient in object recognition applications, requires a largedeveloped computational complexity is a SIFT majorhas drawback SIFT Scale Invariant Feature Transform (SIFT) is a itfeature detector by Lowe in 2004.which Although proven to be very efficient in object recognition applications, requires a largedeveloped computational complexity is a SIFT majorhas drawback to be very efficient in object recognition applications, it requires a large computational complexity which is a major drawback especially for real-time applications. Scale Invariant Feature Transform (SIFT) is a itfeature detector by Lowe in 2004.which Although proven to be very efficient in object recognition applications, requires a largedeveloped computational complexity is a SIFT majorhas drawback especially for real-time applications. especially for real-time applications. Scale Invariant Feature Transform (SIFT) is a itfeature detector by Lowe in 2004.which Although proven to be very efficient in object recognition applications, requires a largedeveloped computational complexity is a SIFT majorhas drawback especially for real-time applications. ORB to be•• very efficient in object recognition applications, it requires a large computational complexity which is a major drawback especially for real-time applications. ORB • ORB especially for real-time applications. • ORB ORB is a fusion of the FAST key point detector and BRIEF descriptor with some modifications. Initially to determine • ORB ORB is a fusion of the FAST key point detector and BRIEF descriptor with some modifications. 
Initially to determine the key Harris measure is applied to find topsome N points. FAST does not compute the ORB is ita uses fusionFAST. of the Then FASTakey pointcorner detector and BRIEF descriptor with modifications. Initially to determine • points, ORB the key points, Harris measure is applied to find topsome N points. FAST does not compute the ORB is ita uses fusionFAST. of the Then FASTakey pointcorner detector and BRIEF descriptor with modifications. Initially to determine orientation andisisita rotation variant. It computes the intensity weighted centroid of the patch with FAST locateddoes corner center. The the key points, uses FAST. Then akey Harris corner measure is applied to find topsome N points. notatcompute the ORB fusion of the FAST point detector and BRIEF descriptor with modifications. Initially to determine orientation and isit rotation variant. It computes the intensity weighted centroid of the with FAST locateddoes corner center. The the key points, uses FAST. Then a Harris corner measure is applied to find top patch N points. notatcompute the direction ofand theisis vector from thisThen corner point tothe centroid gives the orientation. Moments computed todoes improve rotation orientation It computes intensity centroid of the patch with located corner center. The ORB fusion ofvariant. the FAST point detector andweighted BRIEF descriptor with modifications. Initially tothe determine the key points, ita rotation uses FAST. akey Harris measure is applied to find topsome N are points. FAST notat the direction ofand the is vector from this corner point tocorner centroid gives the orientation. Moments are computed to corner improve the rotation orientation rotation variant. It computes the intensity weighted centroid of the patch with located atcompute center. The invariance. direction of the vector from this corner point to centroid gives the orientation. Moments are computed to improve the rotation the key points, it uses FAST. 
Then a Harris corner measure is applied to find top N points. FAST does not compute the orientation rotation It computes intensity weighted centroid Moments of the patch located at the center. The invariance. direction ofand the is vector fromvariant. this corner point tothe centroid gives the orientation. are with computed to corner improve rotation invariance. orientation and is rotation variant. It computes the intensity weighted centroid of the patch with located corner at center. The direction of the vectoroffrom thisobject corner point invariance. Table 3. Results matching in real timeto centroid gives the orientation. Moments are computed to improve the rotation direction of the vectoroffrom thisobject corner point Table 3. Results matching in real timeto centroid gives the orientation. Moments are computed to improve the rotation invariance. Table 3. Results of matching object in real time invariance. Table 3. Results matching object in real time Objectofdetection method Time taken for each frame Accuracy in matching % Objectofdetection Time taken each frame Table 3. Results matching object in real time SIFT method 4.2for sec Objectofdetection method Time taken each frame SIFT 4.2for sec Table 3. Results matching object in real time Object detection Time taken each frame ORB 0.22 sec SIFT method 4.2for sec ORB 0.22 sec SIFT 4.2 sec Object detection Time taken forsec each frame ORB method 0.22 Object detection method computational power Time taken each framea ORB greater 0.22 sec SIFT 4.2for The SIFT requires and itsec is also SIFT 4.2 it sec ORB greater computational power and 0.22 secis also a The SIFT requires itThe is open and it requires less computational power which isa SIFTsource requires greater computational power and it is also ORB 0.22itsecis itThe is open andgreater it requires less computational power which SIFTsource requires computational power and also isa

Accuracy in76matching % Accuracy in76matching % Accuracy in63 76matching % 63matching % Accuracy in76 63 Accuracy in 63matching patented algorithm. We76 adopted%for 63adopted for patented algorithm. We76 best suitable for Raspberry PI. Duefor to patented algorithm. We63adopted best suitable for Raspberry PI. Duefor to patented algorithm. We adopted

ORB method ORB method because its fastness in ORB method because its fastness in ORB method matchingititiscan work in real tracking of computational object while SIFT cannot for realfor time tracking.PI.The Table shows the because open source andtime it requires less power whichbeisapplied best suitable Raspberry Due to its3 fastness in SIFT requires computational power and it iswhich also patented algorithm. We adopted for method matching itiscan work intracking real time tracking of computational object while SIFT be for real time tracking. Table shows the because itThe open source andgreater it requires less power isaapplied best suitable for Raspberry PI. Due to ORB its3 fastness in comparisons of object performance between SIFT anditcannot ORB algorithms. The speed of SIFT isThe 4.2 seconds for each matching it can work in real time tracking of object while SIFT cannot be applied for real time tracking. The Table 3 shows the The SIFT requires greater computational power and is also a patented algorithm. We adopted for ORB method because ititiscan open source andtime it requires less computational power whichbe isapplied best suitable for Raspberry PI. Due to its3 fastness in comparisons of object tracking performance between SIFT and ORB algorithms. The speed of SIFT is 4.2 seconds for each matching work in real tracking of object while SIFT cannot for real time tracking. The Table shows the frame whereas istracking much can process 4.5 frames per and takes only seconds frame. However comparisons of ORB object performance between SIFTpower andsecond ORB algorithms. The 0.22 speed of SIFTfor is each 4.2 seconds for each because ititiscan open source andtime itand requires less whichbe isapplied best for Raspberry PI. Due to its3 fastness in matching work intracking real tracking of computational object SIFT cannot for real time tracking. 
Table 3 shows the comparison of object tracking performance between the SIFT and ORB algorithms. SIFT takes 4.2 seconds for each frame, whereas ORB is much faster and can process 4.5 frames per second, taking only 0.22 seconds per frame. Hence ORB matching can work for real-time tracking of an object, while SIFT cannot be applied for real-time tracking. However, looking at the object detection accuracy, SIFT performs better than the ORB algorithm. We tested 100 frames with a printed image of a fish (shown in figure 3) moved manually in front of the camera while the camera was kept static. Although the accuracy of the ORB algorithm is lower than that of the SIFT algorithm, we chose ORB due to its enhanced computational speed. A more detailed and consistent accuracy test could be conducted; however, since the focus of our work was to use a fast algorithm for our target application on the Raspberry Pi, a detailed comparative study was not essential for our study.

3.3. Testing of Image based Tracking

We tested the image based tracking algorithm on live fish in an aquarium tank, as shown in figure 12. As explained in the previous section, the ORB tracking algorithm was used for the tests. The Python code was written such that if the target is found, the program draws a box around the target and prints the text "fish found" on the image; if no target is found, the text "fish not found" is printed on the image.

G.Santhan Kumar et al. / Procedia Computer Science 133 (2018) 169–180

As seen in Figure 15 (b), the images with other fishes and with no fish are automatically marked as "fish not found" by the program, whereas the images in which the goldfish is present (Figure 15 (a)) are marked as "fish found", and the bounding box and the centroid of the bounding box (in green) are drawn by the program. In figure 15 (a) we can also notice the red cross line marking the center of the image frame, which the algorithm treats as (0, 0). The distance of the center of the object of interest from the center of the image frame is also calculated and marked in green as (48, 49); this represents the pixel offset in terms of rows and columns. Similarly, the position of the tracked object, (85, 45), is marked in figure 15 (c). The distance of the centroid of the tracked object from the center of the image frame will be used as the control value to drive the thrusters, with the control goal of bringing the target object exactly to the middle of the image frame. This ensures that the vehicle follows the fish and captures images of it during the entire tracking period, until the object is lost completely from the field of view of the camera or moves so far away that the tracking algorithm can no longer detect the fish. Presently the USB camera was placed outside the aquarium for these tests; however, in the actual application the Raspberry Pi board and USB camera will be placed in a waterproof enclosure inside the body of the AUV. The fabrication of the AUV structure is in progress, and soon we will be able to assemble the entire AUV and test the tracking algorithm for automatic real-time control of the AUV to track the fish and capture images.

a) Object/fish found

b) Object/fish not found

Figure 15: Results of fish tracking test
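The control scheme described above can be sketched as follows; this is a minimal illustration under assumptions (the frame size, gain values, and function names are hypothetical, not from the paper): the pixel offset of the target centroid from the frame center is computed and fed to a simple proportional controller for the thrusters.

```python
def centroid_offset(bbox, frame_w, frame_h):
    """Pixel offset (dx, dy) of the bounding-box centroid from the
    image-frame center, which the tracker treats as (0, 0)."""
    x, y, w, h = bbox
    cx, cy = x + w / 2.0, y + h / 2.0
    return cx - frame_w / 2.0, cy - frame_h / 2.0

def thruster_command(offset, kp=0.005, limit=1.0):
    """Proportional control: turn the pixel offset into normalized
    yaw/depth thruster commands clamped to [-limit, limit]."""
    dx, dy = offset
    clamp = lambda v: max(-limit, min(limit, v))
    return clamp(kp * dx), clamp(kp * dy)
```

With a 640x480 frame and a box centered 48 pixels right and 49 pixels below the frame center, the offset reproduces the (48, 49) reading from figure 15 (a), and the controller drives both commands toward zero as the fish approaches the middle of the frame.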

4. Conclusion

This research presents the design and development of a compact and low-cost AUV capable of performing autonomous image based tracking and photo mosaicking. The detailed design and features of the vehicle have been presented. Custom-designed thrusters using brushless DC motors and 3D-printed propeller blades are used to propel the AUV. The AUV is also capable of tracking marine animals using the onboard camera and a vision based tracking algorithm. The AUV is developed for a depth rating of 10 meters, which is suitable only for test-tank operations. However, once we demonstrate this vehicle's capability to probable end users such as CIFT (Central Institute of Fisheries Technology) and DRDO, we plan to apply for further research funding to develop the vehicle into a full ocean-going vehicle and utilize it for real field applications. In the future, the AUV can also be used as a platform for testing and learning conventional and advanced control algorithms and techniques. Using neural networks and artificial intelligence, the AUV's autonomy can be further enhanced.

Acknowledgments

We wish to express our sincere gratitude to our university, Vignan's Institute of Information Technology, Visakhapatnam, A.P., India. Specifically, we would like to thank the "Centre for Innovation Lab" staff for providing us all the necessary tools and continuous support for the project.