
  • The University of Manchester Research

    A Navigational System for Quadcopter Remote Inspection of Offshore Substations

    Document Version: Accepted author manuscript

    Link to publication record in Manchester Research Explorer

    Citation for published version (APA): Welburn, E., Hakim Khalili, H., Gupta, A., Watson, S., & Carrasco, J. (2019). A Navigational System for Quadcopter Remote Inspection of Offshore Substations. In The Fifteenth International Conference on Autonomic and Autonomous Systems 2019.

    Published in: The Fifteenth International Conference on Autonomic and Autonomous Systems 2019

    Citing this paper: Please note that where the full-text provided on Manchester Research Explorer is the Author Accepted Manuscript or Proof version, this may differ from the final Published version. If citing, it is advised that you check and use the publisher's definitive version.

    General rights: Copyright and moral rights for the publications made accessible in the Research Explorer are retained by the authors and/or other copyright owners, and it is a condition of accessing publications that users recognise and abide by the legal requirements associated with these rights.

    Takedown policy: If you believe that this document breaches copyright, please refer to the University of Manchester's Takedown Procedures [http://man.ac.uk/04Y6Bo] or contact [email protected] providing relevant details, so we can investigate your claim.

    Download date: 29 May 2020

    https://www.research.manchester.ac.uk/portal/en/publications/a-navigational-system-for-quadcopter-remote-inspection-of-offshore-substations(6931ea92-70fa-4ea4-973d-d1e83c8081c0).html

  • A Navigational System for Quadcopter Remote Inspection of Offshore Substations

    Elisabeth Welburn, Hassan Hakim Khalili, Ananya Gupta, Joaquin Carrasco and Simon Watson
    School of Electrical and Electronic Engineering
    The University of Manchester, Manchester, UK, M14 9PL
    Email: [email protected]

    Abstract: Effective and safe maintenance of offshore infrastructure is hampered by its remote location. Robotic inspection can provide a retrofit solution, improving safety for human personnel by removing them from a potentially hazardous environment, and also reducing operational costs. There are three primary challenges for navigation around an offshore substation: low visibility, high electromagnetic fields and the absence of Global Positioning System (GPS) signals. This paper details a navigational system that enables Unmanned Aerial Vehicles (UAVs) to operate within a dark and GPS-denied environment.

    Keywords: Robotics In Hazardous Fields; Aerial Robotics; SLAM; Sensor-based Control

    I. INTRODUCTION

    The remote inspection and asset management of offshore wind farms and the connection to shore will be worth up to 2 billion pounds annually by 2025. However, current methods of inspection are dangerous for human personnel and introduce high costs for the industry as a whole [1].

    Currently, Supervisory Control and Data Acquisition (SCADA) systems and thermal imaging inspections are used in data management to inform substation operations and maintenance. However, the limited number of qualified inspectors, coupled with high demand, leads to unexpected failures being common. Automation could potentially alleviate this by increasing inspection frequency and standardizing procedures [2].

    Robotic inspection platforms have the potential to ensure the maintenance of vital infrastructure, reducing the associated expenditure and hazards [1]. However, this endeavour presents unique challenges that must be overcome for it to become a viable commercial method. Considering the offshore substation environment in the context of a navigational system, its inherently symmetrical nature, coupled with the occlusion of the GPS signal, makes it difficult to obtain an accurate estimate of the robot's global 6 Degree of Freedom (DoF) pose. The high electromagnetic fields necessitate the use of shielding, limiting the external sensor hardware [3], and could also interfere with the nominal operation of the propulsion motors [4]. For this reason, a magnetometer is excluded from the proposed navigational system. Moreover, the sensor payload must be minimal to extend battery life, facilitating the implementation of autonomous capabilities in this remote location. Finally, the absence of visible light limits the use of vision-based odometry.

    Three levels of autonomy can be defined: pure tele-operation, safe-guarded tele-operation and autonomous navigation [5]. The navigational system presented within this paper provides a method of remote tele-operation. However, the aim is to extend this system to full autonomy in the future, with more sophisticated obstacle avoidance capabilities that account for the electromagnetic fields.

    This paper presents a navigational system for UAVs to operate inside a High Voltage Direct Current (HVDC) valve hall. The quadcopter is equipped with two 2D LiDARs mounted perpendicularly to each other, the combination of which provides a 3D estimation of the robot pose. However, this estimation is, in part, based upon the relative movement of the surrounding landmarks between frames, and so is subject to a certain amount of drift. This is further exacerbated by the repetitive and symmetrical nature of these landmarks. To remedy this, the use of Quick Response (QR) codes as global reference points to correct for this accumulated error was investigated. Sensor fusion was accomplished with an Extended Kalman Filter (EKF).

    The remainder of this paper is structured as follows: Section II considers related work, Section III details the system architecture, and Section IV analyses the collated results, after which conclusions are drawn concerning further extensions of this work and the viability of this system within industry.

    II. RELATED WORK

    This section reviews current navigational techniques for UAVs as well as ground vehicles, with the view of adapting these methods for UAV navigation of a GPS-denied and dark environment.

    A. Current Navigational Techniques

    In [6], an autonomous navigational system was developed for a ground vehicle deployed within a GPS-denied greenhouse. This system used the Hector SLAM ROS package, which is also used within the present system, and combined it with a potential fields path planning algorithm. Structural changes due to the growth of crops were accounted for in the path planning algorithm, while remaining safe to operate in the presence of humans. This is reminiscent of faults occurring and causing a fluctuation of the electromagnetic fields inside the HVDC valve hall environment, changing the clearances required to maintain nominal operation of the UAV.

    In [7], a UAV was deployed into a GPS-denied, dark tunnel, where a perception system comprising a near-infrared stereo camera, flashing LEDs, inertial sensors and a 3D depth sensor was used to derive the geometry of the environment. A horizon-based planner accounted for the system's uncertainty during mission execution and generated collision-free exploration paths. However, the environment there was unknown and static, whereas the valve hall geometry is known and faults within the racks can cause a fluctuation of the electromagnetic fields.

    In [2], a robot was developed that traverses substations with the use of a rail system, collects IR and visible images, positioning, time and component descriptions, and transmits these, as well as energy, to a control centre via the rails. This mitigates faults caused by intermittent electromagnetic interference between the robot and the control centre. Magnetic references on the rails also negate the need for markings implemented onto the substation infrastructure, and the use of the commercial voltage for Brazil facilitates installation in other locations. However, the rail-based robot requires the installation of an extensive rail system in existing substations to operate [2]. The system proposed within this paper provides a retrofit solution that could potentially accomplish the same task.

    B. Vision-Based Odometry Techniques

    The dark nature of the valve hall restricts perception within the visible spectrum. However, a Near-Infrared (NIR) camera will be fitted to the drone for fault detection purposes, and an LED spotlight can provide limited ambient lighting in the immediate vicinity.

    In general, though visual odometry is useful for local position control and stability, these methods often suffer from long-term drift and are not suitable for building large-scale maps [8]. RGB-D cameras provide both a colour image and per-pixel depth estimates, and are prominent within mobile robotic platforms due to the richness of the data collected coupled with their reducing cost. In [8], a system was developed for the navigation of a micro-air vehicle within a cluttered, GPS-denied indoor environment with the use of an on-board RGB-D camera and an Inertial Measurement Unit (IMU). This system periodically corrected for the drift present within the local state estimation, which was based upon visual odometry, using results from the RGB-D mapping algorithm [9]. However, this system was unsuitable for real-time situations, as the loop closing and SLAM algorithms were not sufficiently fast to be run on an on-board processor.

    The use of Quick Response codes within absolute localisation methods for indoor mobile robots is widespread due to their large data storage capabilities, small size, low cost and simple implementation. A possible issue with their use is that the recognition rate is reduced if the QR code is small within the camera's field of view or the robot moves too fast. Considering this application in real time, the processing requirements are sufficiently low to enable the use of QR codes: it was found in [10] that the time taken to calculate the relative position of the robot was between 20 and 30 ms.

    In [10], an industrial camera was mounted onto a mobile ground robot pointing upwards in order to identify QR codes mounted on the ceiling. Meanwhile, a laser range-finder was used for object detection as well as the construction of a 2D map with the use of a Rao-Blackwellized particle filter. The Dijkstra algorithm, as well as the Dynamic Window Approach, were used to implement both local and global path planning capabilities [10]. However, this system is not usable in situations where the QR codes are occluded from the camera's field of view due to sheltering obstacles or ambient light. Odometry data was used to compensate for the drift occurring within the short time interval travelled between QR codes, whereby the error accumulation was mitigated with the use of additional sensor inputs [10].

    In [11], a tailored extended H∞ filter (EHF) fused odometry and gyroscope data with pose estimates based upon QR code landmarks. However, this method is more computationally expensive compared to an EKF, taking longer to converge on an accurate estimate [11]; fast convergence is paramount when performing real-time control as in this scenario.

    III. SYSTEM ARCHITECTURE

    Within this section, the architecture of the navigational system, as depicted in Figure 1, is discussed.

    Figure 1. Software Architecture for the Proposed Navigational System.

    A. Mobile Robotic Platform

    The quadcopter utilised for the proof-of-concept system is the Hector quadrotor Robot Operating System (ROS) package [12], due to its pre-existing and well-documented integration with the Gazebo simulator. The visual geometry was written in the COLLADA format and the collision geometry was modelled as an STL mesh. A low polygon count reduces the demand from rendering the model, allowing simulations to be run at a higher percentage of real time. The propellers are represented by actuator discs to facilitate the maintenance of boundary conditions [12]. The Hector quadrotor is depicted below in Figure 2.

    Figure 2. The Hector quadrotor rendered within the Gazebo simulator. Image taken from [12].

    A CAD model of an offshore substation, as shown in Figure 3, with the same clearances as the real environment, was constructed and used to collate results.

    B. 2D SLAM Algorithm

    Low-cost laser range-finders are prevalent in autonomous robot applications due to their low price and ability to trace terrain and structures in the contiguous area, while consuming relatively little power [13].

    In the proposed system, two Hokuyo UTM-30LX LiDARs were mounted perpendicularly to each other. It was assumed that the z-axis was out-of-plane relative to the ground plane and the x-axis was pointing in the forward direction of the quadcopter. The horizontal, planar LiDAR was used within the 2D SLAM algorithm to construct a map of the surroundings.

    Figure 3. CAD model of the HVDC valve hall.

    A ROS node, Hector Mapping [14], was selected as the 2D SLAM algorithm, the only requirement of which is a high-frequency laser scanner, such as the Hokuyo UTM-30LX LiDAR in this scenario.

    C. Floor Extraction

    For height estimation and greater spatial understanding, a secondary LiDAR was mounted, perpendicular to the primary LiDAR, onto the underside of the Hector quadrotor. The vertical LiDAR produced a 2D vertical laser scan of the environment. A split-and-merge algorithm [15] was then implemented to differentiate the walls and floor using the relative angle of the incident laser endpoints. The roll and pitch recorded by the EKF were processed, and the calculated relative angles of the identified line segments were rotated to avoid falsely recognising the walls as the floor during operation.

    A laser pointing vertically downwards was also considered as the method of height estimation; however, this is a less robust method than the aforementioned secondary LiDAR. If the quadrotor turned near the boundaries of the space, the singular laser point could rotate to become incident on walls or substation racks. This could be mitigated by fusion with the orientation from the IMU device to account for the laser rotation. However, IMU data suffers from drift, and so a secondary LiDAR was used in the implementation to provide more information on the transformation of the laser scan points relative to the quadrotor.
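The floor/wall separation step above can be sketched in a few lines. The following is a minimal illustration (not the paper's implementation): it takes the angles of line segments already fitted by a split-and-merge pass, rotates them by the EKF pitch estimate, and labels near-horizontal segments as floor. The function name and the 15-degree tolerance are assumptions for the sketch.

```python
import math

FLOOR_ANGLE_TOL = math.radians(15)  # assumed tolerance: near-horizontal segments count as floor

def classify_segments(segment_angles, pitch):
    """Label each fitted line segment from the vertical scan as 'floor' or 'wall'.

    segment_angles: angles (rad) of split-and-merge line segments in the sensor frame.
    pitch: current pitch estimate (rad) from the EKF, used to rotate the
    segments back towards the world frame so a tilted quadrotor does not
    mistake a wall for the floor.
    """
    labels = []
    for a in segment_angles:
        corrected = a - pitch  # undo vehicle tilt
        # Wrap so a line and its reverse direction compare equal
        corrected = math.atan2(math.sin(corrected), math.cos(corrected))
        if abs(corrected) > math.pi / 2:
            corrected -= math.copysign(math.pi, corrected)
        labels.append("floor" if abs(corrected) < FLOOR_ANGLE_TOL else "wall")
    return labels
```

With zero pitch, a horizontal segment is labelled floor and a vertical one wall; with a 0.1 rad pitch, the same physical floor appears at 0.1 rad in the sensor frame and is still classified correctly.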

    D. QR Codes as Global Landmarks

    Vision-based odometry is generally computationally intensive and also suffers from a lack of robustness under varying lighting conditions [11]. However, in this scenario there is an absence of visible light, and so ambient light levels are constant. Furthermore, vision-based odometry was implemented with the view to periodically correct for drift in the 2D SLAM algorithm pose estimations.

    A FLIR One Near-Infrared (NIR) camera with a spectral range between 8 and 14 µm will be utilised to enable simultaneous QR code detection and fault detection within the infrared spectrum. An infrared LED emitting light between 750 and 950 nm will be mounted on top of the camera, illuminating the proximal field of view. However, for the purposes of this simulation, a generic camera was created within a virtual world lit by ambient lighting to ensure an accurate estimation of the nominal accuracy of the vision-based odometry.

    The QR codes were generated with the use of the open-source ArUco marker library. These were then placed on the racks within the virtual substation environment at regular intervals. The ROS package fiducial_slam [16] was used both to identify the unique identifier of each QR code and to produce a 6 DoF pose estimation of the drone using the known global poses of the QR codes.

    The QR codes could potentially be used in two capacities during drone operation. Correct identification of a unique QR code indicates the drone is within the correct general vicinity of the rack. These could form the basis of a command interface to set the goal destination that determines the generated path. The unique identifiers of visible QR codes in the camera field of view are shown in Figure 4.

    Figure 4. The virtual UAV inspecting a substation rack. Inset is the camera field of view.

    Alternatively, the pose of a visible QR marker in a known location can be processed to output a 6 DoF pose estimation of the drone, which could then be fused with the other sensor measurements within the EKF. However, this was found to produce erroneous estimations of the 6 DoF pose, as discussed later.
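The geometry behind this step is a composition of homogeneous transforms: given the marker's known world pose and the marker pose observed by the camera, the camera's world pose follows by inverting the observation. A minimal NumPy sketch (illustrative only; the function names are assumptions, and in practice fiducial_slam performs this internally):

```python
import numpy as np

def pose_to_T(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation matrix R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def camera_pose_from_marker(T_world_marker, T_cam_marker):
    """Recover the camera's world pose from one detected marker:
        T_world_cam = T_world_marker @ inv(T_cam_marker)
    T_world_marker: known global pose of the marker (survey/placement data).
    T_cam_marker:   marker pose as estimated by the detector in the camera frame.
    """
    return T_world_marker @ np.linalg.inv(T_cam_marker)
```

For example, a marker fixed at (5, 0, 2) in the world, seen 1 m straight ahead of the camera along its z-axis (identity rotations), places the camera at (5, 0, 1). Any noise in the detector's depth estimate propagates directly into this result, which is consistent with the erratic estimates reported later.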

    IV. SENSOR FUSION

    Sensor fusion is necessary within this system to identify the optimal estimate of the UAV's pose. Considering the sensor measurements modelled in this section, (xs, ys, zs, φs, θs, ψs) was produced from the 2D SLAM performed using the planar LiDAR, z from the height measurement using the perpendicular LiDAR and, finally, (x, y, z, φ, θ, ψ) was produced from the vision-based odometry system based upon QR codes visible to the on-board camera.

    First, the orientation measurements must be converted from a quaternion in the local frame to Euler angles relative to the global frame. A function available within the ROS Python library tf was used for this conversion. For the purposes of this system, the starting pose of the spawned robot was assumed to define the global frame in terms of the way-point commands that were converted into command velocities. This coordinate frame was then mapped onto the 2D occupancy grid constructed to utilise the A* path planning algorithm.
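For readers without a ROS installation, the tf conversion used here can be reproduced with the standard closed-form formulas. A self-contained sketch (assuming the common ROS x, y, z, w quaternion ordering and the roll-pitch-yaw convention that matches tf's default):

```python
import math

def quaternion_to_euler(x, y, z, w):
    """Convert a unit quaternion (x, y, z, w order, as ROS uses) to
    (roll, pitch, yaw) in radians, using the standard aerospace
    Z-Y-X (roll-pitch-yaw) convention."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    sinp = 2.0 * (w * y - z * x)
    # Clamp at the gimbal-lock singularity (pitch = +/-90 degrees)
    pitch = math.copysign(math.pi / 2, sinp) if abs(sinp) >= 1.0 else math.asin(sinp)
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw
```

The identity quaternion (0, 0, 0, 1) maps to zero roll, pitch and yaw, and a pure rotation of 90 degrees about z yields a yaw of π/2.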

    The measurements were taken at different, unsynchronised rates. To accommodate this, each sensor measurement is sampled when a new IMU measurement is received, at a rate of 100 Hz. In this way, a sufficient sample rate is ensured.
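This latest-value resampling scheme can be illustrated with a small buffer that caches the most recent message from each sensor and snapshots them all whenever an IMU message arrives. This is a sketch under the assumptions of the paragraph above, not the paper's code; the class and sensor names are hypothetical.

```python
import threading

class LatestSampleBuffer:
    """Cache the most recent measurement from each unsynchronised sensor
    and emit one combined sample set per IMU message, so every sensor is
    effectively resampled at the IMU rate (100 Hz here)."""

    def __init__(self, sensor_names):
        self._lock = threading.Lock()
        self._latest = {name: None for name in sensor_names}

    def update(self, name, measurement):
        # Called from each sensor's callback at its own native rate.
        with self._lock:
            self._latest[name] = measurement

    def on_imu(self, imu_measurement):
        # Called from the IMU callback: snapshot the latest value of every
        # sensor together with the triggering IMU measurement.
        with self._lock:
            sample = dict(self._latest)
        sample["imu"] = imu_measurement
        return sample
```

Each EKF cycle then consumes one such snapshot; sensors that have not yet published simply contribute `None` and can be skipped in the update step.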

    Kalman filters are algorithms for the estimation of dynamic state variables by combining state predictions with measurements. For discrete systems, the future values of the state variables can be predicted using Kalman filters.

    The EKF can overcome the assumption of the Kalman Filter that both the motion model and the sensor model are linear Gaussian [13]. Within this system, an extended Kalman filter was implemented, where the non-linearity is introduced by the continuously-varying rotation relative to the global frame.

    For time-invariant systems, the function f computes the predicted state from the previous estimate, and the function h computes the output. The variables w_k and v_k represent the process and observation white noises, respectively, i.e.

    x_{k+1} = f(x_k, u_k) + w_k    (1)
    z_k = h(x_k) + v_k    (2)

    The white noises w_k and v_k are assumed to be zero-mean with covariances Q_k and R_k, respectively.

    For the purposes of this Kalman filter, all variables were within the global frame. The values used for initialisation of the Kalman filter were the coordinates at which the robot model was spawned.

    In the case of the Hector quadcopter, the state variables were updated with the use of inertial measurements from the IMU. The state vector, X, comprised the following state variables:

    X = [x y z ẋ ẏ ż φ θ ψ]^T    (3)

    where x, y, z are the positions on the X, Y and Z axes, and φ is the roll, θ is the pitch and ψ is the yaw of the quadrotor.

    The global displacement, s, in each of the X, Y and Z axes can be modelled using dead reckoning with the initial displacement, s_0, the rotation matrix that transforms from the body frame to the inertial frame, R, the initial velocity at the start of the time interval, v_0, the IMU acceleration within the IMU frame of reference, a, the gravitational acceleration, g, and the length of the time interval, ΔT.

    With some abuse of notation, this relationship is encapsulated in the dynamic matrix, f, that describes how the state evolves to the next time step, as below:

    f = [ s_0 + ṡΔT + (1/2)R(a - g)ΔT²
          v_0 + ΔT·R(a - g)
          Θ_0 + ΔT·Φω ],    (4)

    where Φ is the mapping of the angular velocities in the body frame, ω = (p, q, r), to the changes in the Euler angles within the inertial frame [17], i.e.

    Φ = [ 1    sinφ tanθ    cosφ tanθ
          0    cosφ        -sinφ
          0    sinφ/cosθ    cosφ/cosθ ].    (5)
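The Euler-rate mapping in (5) is straightforward to evaluate numerically. A short NumPy sketch (illustrative, not the paper's code; note the matrix is singular at θ = ±90 degrees):

```python
import math
import numpy as np

def euler_rate_matrix(phi, theta):
    """The standard mapping from body angular rates (p, q, r) to Euler
    angle rates (roll, pitch, yaw rates). Singular at theta = +/-pi/2,
    where tan(theta) and 1/cos(theta) blow up."""
    s, c = math.sin(phi), math.cos(phi)
    t, ct = math.tan(theta), math.cos(theta)
    return np.array([
        [1.0, s * t,  c * t],
        [0.0, c,     -s],
        [0.0, s / ct, c / ct],
    ])
```

A useful sanity check: at hover (zero roll and pitch) the matrix reduces to the identity, so the Euler angle rates equal the body rates, matching the last row of (4) for small angles.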

    The measurement function, h, is given by

    h = [x y z φ θ ψ]^T.    (6)

    The linearisation of (1) provides the state transition matrix, F_k, computed as the Jacobian of the dynamic matrix f with respect to the state vector. Similarly, the observation matrix, H_k, can be defined as the Jacobian of the measurement function h with respect to the state vector.

    The control inputs into the system can be assumed to be the linear accelerations and the angular velocities as measured by the IMU, i.e.

    u = [ẍ ÿ z̈ φ̇ θ̇ ψ̇]^T.    (7)

    The time update of the EKF algorithm is given by

    x̂_{k|k-1} = f(x_{k-1}, u_{k-1}),    (8)
    P_{k|k-1} = F_k P_{k-1|k-1} F_k^T + Q_k.    (9)

    The measurement update steps are then computed to adjust the Kalman gain, K_k, to update the estimate with the actual measurement, z_k, and to update the error covariance, P_{k|k}. The measurement residual, ỹ_k, as well as the residual covariance, S_k, are also calculated.

    ỹ_k = z_k - h(x̂_{k|k-1})    (10)
    S_k = H_k P_{k|k-1} H_k^T + R_k    (11)
    K_k = P_{k|k-1} H_k^T S_k^{-1}    (12)
    x̂_{k|k} = x̂_{k|k-1} + K_k ỹ_k    (13)
    P_{k|k} = (I - K_k H_k) P_{k|k-1}    (14)

    This method is less computationally expensive in comparison to other methods, such as the extended H∞ filter (EHF) [11]. One of the drawbacks of an extended Kalman filter is that it is not an optimal estimator. Moreover, if the initial state vector is wrong, the filter will quickly diverge due to its linearisation. As a result, the EKF requires extensive tuning of its initialisation and noise covariance parameters.
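Equations (8) to (14) translate directly into one predict-correct cycle. A generic NumPy sketch (illustrative only, with the models f, h and their Jacobians F, H passed in as callables; not the paper's implementation):

```python
import numpy as np

def ekf_step(x, P, u, z, f, h, F, H, Q, R):
    """One EKF cycle implementing (8)-(14).

    x, P: previous state estimate and covariance.
    u, z: control input and measurement for this step.
    f, h: dynamic and measurement models; F, H: their Jacobians.
    Q, R: process and measurement noise covariances.
    """
    # Time update (8)-(9)
    x_pred = f(x, u)
    Fk = F(x, u)
    P_pred = Fk @ P @ Fk.T + Q
    # Measurement update (10)-(14)
    y = z - h(x_pred)                           # residual (10)
    Hk = H(x_pred)
    S = Hk @ P_pred @ Hk.T + R                  # residual covariance (11)
    K = P_pred @ Hk.T @ np.linalg.inv(S)        # Kalman gain (12)
    x_new = x_pred + K @ y                      # state update (13)
    P_new = (np.eye(len(x)) - K @ Hk) @ P_pred  # covariance update (14)
    return x_new, P_new
```

In the scalar linear case with unit prior and measurement variances, a single measurement of 1.0 pulls the estimate to 0.5 and halves the covariance, which is a quick way to check the gain computation.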

    V. PATH PLANNING ALGORITHM

    The final pose estimations from the EKF were fed into an A* path planning module [18]. The robot radius is considered greater than the nominal dimensions, ensuring that clearances from the high electromagnetic fields present within the substation racks were maintained.

    Prior to this, a 2D occupancy grid of the substation plan is constructed using the known dimensions of the CAD model. In terms of command way-points, height correction is performed first to adjust the drone to the specified goal height, because of the largely constant geometry of the environment within the vertical plane. Then, the A* path planning algorithm is used to generate the path shown in Figure 5 within the substation.

    Figure 5. A 2D occupancy grid and path generated by the A* algorithm.

    This produces the optimal trajectory, consisting of 0.5 m increments between the start position and goal position. After this, alterations to the orientation of the drone are made to enable 360-degree inspection.
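The planning step above can be sketched as a textbook A* over an occupancy grid. This is an illustrative implementation, not the module from [18]; it assumes a 4-connected grid with unit step cost and a Manhattan-distance heuristic (admissible for 4-connected motion):

```python
import heapq

def astar(grid, start, goal):
    """A* over a 2D occupancy grid (0 = free, 1 = occupied), 4-connected.
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    open_set = [(0, start)]          # (f-score, cell) priority queue
    g = {start: 0}                   # cost-so-far
    came = {}                        # parent pointers for path recovery
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in came:
                cur = came[cur]
                path.append(cur)
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g[cur] + 1
                if ng < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = ng
                    came[(nr, nc)] = cur
                    h = abs(nr - goal[0]) + abs(nc - goal[1])  # Manhattan heuristic
                    heapq.heappush(open_set, (ng + h, (nr, nc)))
    return None
```

Inflating obstacles by the (deliberately oversized) robot radius before planning, as described above, amounts to marking extra cells around each rack as occupied in `grid` before calling the planner.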

    VI. RESULTS

    To ascertain a baseline and compute the errors of the constituent algorithms, the standard deviations of each of the measurements were calculated. These standard deviations were then used as a baseline for tuning the Q and R matrices within the EKF.

    The pose estimation generated by the EKF during the mapped trajectory in Figure 5 was used to gauge the viability of the proposed navigational system. The IMU measurements, as well as the Hector Mapping 2D pose and the extracted floor height, were fused by the EKF. The 3D position of the UAV is compared to the ground truth in Figure 6. The error present within the EKF output is tabulated in Table I. An error of 1.67% in the x-axis throughout the course of a twenty-minute mission is tolerable. However, an error of 5.45% in the y-axis is unsatisfactory, and further tuning of the EKF is required to alleviate this. The height estimation algorithm was found to produce the least error, at 0.3% throughout the length of the mission. The average battery life of a UAV is between ten and fifteen minutes, depending on payload, and so these figures represent a probable overestimation of the drift present within the EKF.

    TABLE I. ERROR IN EKF OUTPUT

    Global Axis | Distance Travelled (m) | EKF Estimate (m) | Error (%)
    x           | 39                     | 38.35            | 1.67
    y           | 5.5                    | 5.2              | 5.45
    z           | 5                      | 4.985            | 0.3
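The percentage errors in Table I are self-consistent if the middle numeric column is read as the EKF-estimated distance along each axis (an assumption about the flattened original layout): the relative error is then |travelled - estimated| / travelled. A quick check:

```python
# Assumed reading of Table I: (distance travelled, EKF-estimated distance) per axis.
rows = {"x": (39.0, 38.35), "y": (5.5, 5.2), "z": (5.0, 4.985)}

def percent_error(travelled, estimated):
    """Relative error of the estimate against the true travelled distance, in percent."""
    return abs(travelled - estimated) / travelled * 100.0

errors = {axis: round(percent_error(d, e), 2) for axis, (d, e) in rows.items()}
```

This reproduces the 1.67%, 5.45% and 0.3% figures quoted in the text, supporting that reading of the table.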

    Figure 6. EKF pose estimation

    The camera stream was also recorded, whereby the unique identifiers of visible QR codes were overlaid, as seen in Figure 4. The 6 DoF pose estimations from the visible QR codes were also collated to evaluate whether this data should be incorporated into the EKF. However, as can be seen in Figure 7, these pose estimations are extremely erratic and would not contribute to the overall stability of the EKF upon fusion.

    The inaccurate 6 DoF pose estimation produced by the fiducial SLAM could potentially be due to the 2D nature of the QR codes hindering precise depth perception. It may also be due to the monocular nature of the camera.

    Considering Figure 7, direct pose estimation from the fiducial SLAM was ultimately not implemented within the EKF. However, the unique identifiers displayed within the camera stream could potentially facilitate inspection of substation racks by providing visual verification of the current vicinity of the UAV.

    Figure 7. EKF pose estimation with the fiducial SLAM results.

    In summary, this set of results suggests that the system could potentially be adapted for implementation in a real-world system.

    VII. CONCLUSION

    In conclusion, the proposed system is a proof of concept concerning the fusion of a 2D SLAM algorithm, an IMU and a floor extraction technique for navigation of a dark, GPS-denied environment. Possible mechanisms of correction via vision-based odometry upon the identification of QR codes within the environment were explored, and it was concluded that, though the QR codes provide visual cues of the drone's current position, they fail as reference points to generate an accurate 6 DoF pose. This system provides a retrofit solution for the remote inspection of substations, merely requiring the careful placement of QR codes within the environment.

    Future work includes the implementation of this navigational system onto a drone within an indoor, confined and dark environment. The computation and sensing required for local position control will be performed on-board the vehicle, reducing the dependence on unreliable wireless links [8]. The path planning capabilities will also be expanded to account for the presence of electromagnetic fields with the implementation of a modified potential fields algorithm. Moreover, this system paves the way for the use of thermal imaging, involving non-contact precision temperature measurement and non-destructive testing, to identify faults within the substation infrastructure [5].

    ACKNOWLEDGMENTS

    This work was supported by the Holistic Operation and Maintenance for Energy from Offshore Wind Farms (HOME Offshore) project (EPSRC Grant Number: EP/P009743/1) and the Robotics and Artificial Intelligence for Nuclear (RAIN) project (EPSRC Grant Number: EP/R026084/1). The authors would like to thank both Dr. Andrew West and Dr. Thomas Wright of the University of Manchester for their continued support.

    REFERENCES

    [1] E. M. Barnes, S. Watson, D. Flynn, S. Djurović, A. M. Barnes, K. Brown, J. Carmona, M. Collu, C. Crabtree, W. Crowther, S. Dju, P. R. Green, M. Heggo, K. Kababbe, B. Kazemtabrizi, J. Keane, D. Lane, Z. Lin, P. Mawby, A. Mohammed, L. Ran, A. Stetco, W. Tang, and S. Watson, "Technology Drivers in Windfarm Asset Management Position Paper," 2018, pp. 1-46.

    [2] B. P. Silva, R. A. Ferreira, S. C. Gomes, F. A. Calado, R. M. Andrade, and M. P. Porto, "On-rail solution for autonomous inspections in electrical substations," Infrared Physics & Technology, vol. 90, May 2018, pp. 53-58. [Online]. Available: https://www.sciencedirect.com/science/article/pii/S1350449517307247

    [3] M. Heggo, K. Kabbabe, V. Peesapati, R. Gardner, S. Watson, and B. Crowther, "Evaluation and mitigation of high electrostatic fields on operation of aerial inspection vehicles in HVDC environments," in EERA DeepWind'19, Jan 2019.

    [4] M. Heggo, A. Mohammed, J. Melecio Ramirez, K. Kabbabe, P. Tuohy, S. Watson, and S. Durovic, "Evaluation and mitigation of offshore HVDC valve hall magnetic field impact on inspection quadcopter propulsion motors," in EERA DeepWind'19, Jan 2019.

    [5] P. Rea and E. Ottaviano, "Design and development of an Inspection Robotic System for indoor applications," Robotics and Computer-Integrated Manufacturing, vol. 49, Feb 2018, pp. 143-151. [Online]. Available: https://linkinghub.elsevier.com/retrieve/pii/S0736584517300613

    [6] E. H. Harik, A. Korsaeth, E. H. C. Harik, and A. Korsaeth, "Combining Hector SLAM and Artificial Potential Field for Autonomous Navigation Inside a Greenhouse," Robotics, vol. 7, no. 2, May 2018, p. 22. [Online]. Available: http://www.mdpi.com/2218-6581/7/2/22

    [7] C. Papachristos, S. Khattak, and K. Alexis, "Autonomous exploration of visually-degraded environments using aerial robots," in 2017 International Conference on Unmanned Aircraft Systems (ICUAS). IEEE, Jun 2017, pp. 775-780. [Online]. Available: http://ieeexplore.ieee.org/document/7991510/

    [8] A. S. Huang, A. Bachrach, P. Henry, M. Krainin, D. Maturana, D. Fox, and N. Roy, "Visual Odometry and Mapping for Autonomous Flight Using an RGB-D Camera." Springer, Cham, 2017, pp. 235-252. [Online]. Available: http://link.springer.com/10.1007/978-3-319-29363-9_14

    [9] P. Henry, M. Krainin, E. Herbst, X. Ren, and D. Fox, "RGB-D Mapping: Using Depth Cameras for Dense 3D Modeling of Indoor Environments." Springer, Berlin, Heidelberg, 2014, pp. 477-491. [Online]. Available: http://link.springer.com/10.1007/978-3-642-28572-1_33

    [10] H. Zhang, C. Zhang, W. Yang, and C.-Y. Chen, "Localization and navigation using QR code for mobile robot in indoor environment," in 2015 IEEE International Conference on Robotics and Biomimetics (ROBIO). IEEE, Dec 2015, pp. 2501-2506. [Online]. Available: http://ieeexplore.ieee.org/document/7419715/

    [11] P. Nazemzadeh, D. Fontanelli, D. Macii, and L. Palopoli, "Indoor Localization of Mobile Robots Through QR Code Detection and Dead Reckoning Data Fusion," IEEE/ASME Transactions on Mechatronics, vol. 22, no. 6, Dec 2017, pp. 2588-2599. [Online]. Available: http://ieeexplore.ieee.org/document/8066377/

    [12] J. Meyer, A. Sendobry, S. Kohlbrecher, U. Klingauf, and O. von Stryk, "Comprehensive simulation of quadrotor UAVs using ROS and Gazebo," in Simulation, Modeling, and Programming for Autonomous Robots. Springer Berlin Heidelberg, 2012, pp. 400-411. [Online]. Available: https://doi.org/10.1007/978-3-642-34327-8_36

    [13] W.-C. Jiang, M.-Y. Ju, Y.-J. Chen, and W.-C. Jiang, "Implementation of Odometry with EKF in Hector SLAM Methods," International Journal of Automation and Smart Technology, vol. 8, no. 1, Mar 2018, pp. 9-18. [Online]. Available: http://www.ausmt.org/index.php/AUSMT/article/view/1558

    [14] "Hector mapping," accessed: 2019-05-09. [Online]. Available: http://wiki.ros.org/hector_mapping

    [15] "Laser line extraction," accessed: 2019-05-09. [Online]. Available: https://github.com/kam3k/laser_line_extraction

    [16] "Fiducial slam," accessed: 2019-05-09. [Online]. Available: https://github.com/UbiquityRobotics/fiducials

    [17] T. I. Fossen, Handbook of Marine Craft Hydrodynamics and Motion Control. Chichester, UK: John Wiley & Sons, Ltd, Apr 2011. [Online]. Available: http://doi.wiley.com/10.1002/9781119994138

    [18] "PythonRobotics," accessed: 2019-05-09. [Online]. Available: https://github.com/AtsushiSakai/PythonRobotics