
The BugWright2 project

Autonomous Robotic Inspection and Maintenance on Ship Hulls

Last Integration Week in Athens

[Integration Week]

👷 The last BUGWRIGHT2 integration week took place in Perama, Athens, Greece, in March 2024, on the ferry Kefalonia, operated by Levante Ferries.

The objective of this final integration week was to bring together most of the technological developments conducted in the project. We started by validating the survey flight capabilities of UIB’s drone. The combined software from UIB and KLU created a 3D mesh of the ferry, together with the positions of ArUco markers and UWB tags. After this information was shared with the other partners, the other platforms could start to estimate their location in a common frame tied to the inspected infrastructure. These included the magnetic-wheeled crawlers developed by RBP and CNRS for ultrasonic thickness measurements, the small drones from LSL and KLU performing multi-robot inspection flights, and the underwater drones developed jointly by BEYE, NTNU and LSTS. In addition to the platforms, we also integrated into the framework the Augmented Reality user interface developed by RWTH and UT. In the final demonstration, a user wearing the AR glasses could move inside the ferry hold while observing the various robots and the data they reported, superimposed on reality. Beyond data reporting, this last integration week also let us test the motion planning and trajectory-tracking capabilities of all the robotic platforms.

Greetings from the UPORTO LSTS lab team in Greece, where we’ve had a productive Integration week for the BUGWRIGHT2 EU project!
During this period, our team successfully conducted testing and inspection manoeuvres on the vessel’s hull using the shipyard facilities. Among other sensors, we integrated the Oculus sonar technology for precise hull tracking, despite challenges posed by its low data rate. To optimize performance, significant modifications were made to the control software, ensuring stability during inspection manoeuvres.
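One common way to keep such a controller stable when the range sensor is slow is to dead-reckon the range between sonar fixes using the vehicle's velocity. The sketch below is purely illustrative; the class name, gains and update scheme are our assumptions, not the LSTS control software:

```python
# Illustrative standoff-distance controller for a low-rate sonar:
# between sonar fixes, the last filtered range is propagated using the
# DVL surge velocity so the control loop can run at a higher rate.
# All names and gain values are assumptions, not project code.

class HullStandoffController:
    def __init__(self, target_range_m=2.0, kp=0.8, kd=0.3, alpha=0.6):
        self.target = target_range_m
        self.kp, self.kd = kp, kd   # PD gains (illustrative values)
        self.alpha = alpha          # low-pass factor for noisy sonar returns
        self.range_est = None       # filtered range to the hull
        self.range_rate = 0.0

    def on_sonar(self, measured_range_m):
        """Called at the (low) sonar rate: blend the new measurement in."""
        if self.range_est is None:
            self.range_est = measured_range_m
        else:
            self.range_est = (self.alpha * measured_range_m
                              + (1 - self.alpha) * self.range_est)

    def on_dvl(self, surge_velocity_mps, dt):
        """Called at the (high) DVL rate: dead-reckon the range, assuming
        positive surge moves the vehicle toward the hull."""
        if self.range_est is not None:
            self.range_est -= surge_velocity_mps * dt
            self.range_rate = -surge_velocity_mps

    def surge_command(self):
        """PD law on the range error; positive output moves toward the hull."""
        if self.range_est is None:
            return 0.0
        error = self.range_est - self.target
        return self.kp * error + self.kd * self.range_rate
```

The key design choice is that the fast inner loop never waits for the sonar: it always acts on the propagated estimate, so a slow or dropped sonar fix degrades accuracy gracefully rather than destabilizing the controller.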
Additionally, integration tests were conducted to transmit the vehicle’s position to a sophisticated 3D interface from our project partners, enhancing our capabilities for real-time data visualization.

Throughout the Integration Week, UT was actively involved in several tasks arising from Work Packages 7, 9, 10 and 11. We collected photo and video material, ensuring comprehensive visual assets. We also monitored the final integration of the VR interface design onto the large-scale pilot, ensuring a smooth transition. Communication was vital: we discussed the User Interface functionality and human resource instruments among project partners. In doing so, we supported the integration of our results into the overall system and considered their connectivity for a potential follow-up project.

Figure 1: Group picture of the Consortium

LSL is responsible for applying drone swarms to achieve highly efficient visual inspections. During the last integration week, significant efforts were directed towards enhancing swarm localization relative to a unified frame of reference, as well as seamless integration with our partners UIB for corrosion detection and RWTH for the user interface. Several missions showcased both the lawnmower approach to path planning, commonly used in robotics, and our novel algorithm, the Partitioned Traveling Salesperson Problem. The latter is able to leverage prior information on corrosion locations to reduce mission duration. By the end of the integration week, we achieved real-time transmission of hull images to UIB for the corrosion detection procedure. Additionally, robust connectivity with our partners enabled them to visualize transmitted images alongside the drone’s real-time position relative to their 3D ship model.
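For readers unfamiliar with the lawnmower pattern mentioned above: it simply sweeps parallel lanes across the surface, alternating direction so consecutive lanes connect at their near ends. A minimal sketch, with function name, parameters and spacing chosen for illustration rather than taken from the project code:

```python
# Minimal lawnmower (boustrophedon) sweep over a planar hull patch.
# Waypoints are (y, z) coordinates on the hull plane; lane spacing would
# in practice be set from the camera footprint and desired overlap.

def lawnmower_waypoints(width, height, lane_spacing):
    """Generate vertical sweep lanes across a width x height patch,
    reversing every other lane so the path is continuous."""
    waypoints = []
    y = 0.0
    going_up = True
    while y <= width + 1e-9:
        lane = [(y, 0.0), (y, height)]
        if not going_up:
            lane.reverse()          # alternate sweep direction
        waypoints.extend(lane)
        going_up = not going_up
        y += lane_spacing
    return waypoints
```

The Partitioned TSP approach differs precisely in that it does not visit the patch uniformly like this, but orders visits to prioritized regions (e.g., suspected corrosion) to shorten the mission.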

RWTH collaborated with the partners providing robotic platforms to visualise their data in the user interface developed for WP7. Additionally, further work was conducted for WP8 to improve the AR functionality and test its capabilities together with the other partners.

During the final integration week in Piraeus, UNI-KLU demonstrated together with other project partners (i.e., UIB for the aerial drone, CNRS for the above-water crawler, NTNU for the underwater drone, and RWTH for the visualization and human interface) the localization of multiple heterogeneous mobile robots for ship hull inspection. The demonstration included in particular a streamlined initialization step that automatically initializes navigation aids, such as visual markers and UWB anchors, for subsequent inspection tasks and for AR/VR-based human interaction and visualization.

Figure 2: Initialization path that provides the most information to initialize navigation aids (visual markers and UWB anchors) for the heterogeneous teams of inspection robots.

The specific integration work during the week focused on the design of information-rich initialization paths for the UIB UAV, in order to gather as much information as possible on the targets (markers and anchors) to be initialized within the shortest amount of time. Automated processing was then tested to produce the generalized localization information in a general format (a .yaml file and a TF tree) for the teams of inspection robots. The information-rich path was designed such that non-experts could easily execute it using UIB’s user interaction concept based on task primitives. A sample path is sketched in Figure 2.
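The article does not show the actual file schema, so the following is only a hypothetical illustration of what a .yaml localization file in the spirit described above might contain; all keys, identifiers and values are invented:

```yaml
# Hypothetical localization file (invented schema, for illustration only):
# poses of visual markers and positions of UWB anchors, expressed in the
# unified inspection reference frame shared by all robots.
frame_id: inspection_frame
aruco_markers:
  - id: 17
    position: [12.4, 3.1, 1.8]             # metres
    orientation: [0.0, 0.0, 0.707, 0.707]  # quaternion (x, y, z, w)
uwb_anchors:
  - id: 3
    position: [10.0, 0.5, 2.2]             # metres
```

A file of this kind, together with the corresponding TF tree, is what would let each inspection robot resolve its own pose into the common frame.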

Several improvements were made to the initialization algorithm and to the data quality assessment, so that the poses of the visual markers and the positions of the UWB anchors could be estimated with sufficient precision for subsequent inspection robot navigation, 3D lidar mesh transformation, and visualization in AR/VR.

The next step is to analyze in detail the initialization errors of the navigation aids. For this, CNRS has already captured the ground truth of the markers and the environment (mesh) with a total station.

In addition to the above work on the initialization process, UNI-KLU supported the partners in streamlining the unified control framework. This included the unification of the interfaces and of the parameter/information file format and read-out. The figure below shows an overview of the unified control framework with all information and control flows, as well as the information exchange formats.

UNI-KLU specifically supported NTNU in including the new MaRS version that can initialize using a visual marker. Previously, MaRS on the AUV initialized using GNSS and magnetometer readings; however, both signals were deemed too unreliable in harbor environments. The visual marker under water was rigidly attached to a marker above water that was included in the above-mentioned initialization process; thus, the underwater marker was referenced in the global, unified inspection reference frame (with a corresponding .yaml parameter file loaded onto the AUV). With the new initialization capability and the .yaml file on board, the AUV could directly localize itself in the unified inspection reference frame and provide data to the visualization framework of RWTH in that frame. Unfortunately, a software update on the AUV invalidated the previous low-level adaptations made by NTNU to use the MaRS estimate for control, so the tests could not be performed in their planned final form.
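The underlying idea of marker-based localization can be illustrated with a transform composition: the parameter file provides the marker pose in the unified frame, the camera observes the marker, and composing the two yields the vehicle pose. The dependency-free sketch below uses plain 4x4 homogeneous matrices; it is our illustration, not MaRS code:

```python
# Illustrative transform composition for marker-based localization.
# Transforms are 4x4 homogeneous matrices as nested lists (no external
# dependencies). All function names are our own, not from MaRS.

def mat_mul(a, b):
    """Compose two 4x4 homogeneous transforms."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def mat_inv_rigid(t):
    """Invert a rigid transform: transpose the rotation block and
    rotate-negate the translation column."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]   # R^T
    p = [-sum(r[i][j] * t[j][3] for j in range(3)) for i in range(3)]
    return [r[0] + [p[0]], r[1] + [p[1]], r[2] + [p[2]], [0, 0, 0, 1]]

def camera_pose_in_world(t_world_marker, t_camera_marker):
    """T_world_camera = T_world_marker * inv(T_camera_marker):
    the known marker pose plus the camera's observation of the marker
    gives the camera (and hence vehicle) pose in the world frame."""
    return mat_mul(t_world_marker, mat_inv_rigid(t_camera_marker))
```

In a ROS setting this is exactly the composition a TF tree performs when the marker frame is published as a child of the unified inspection frame.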

UNI-KLU also prepared a new visual-inertial odometry framework based on an equivariant state estimator, MSCEqF [A]. The framework includes a multi-state constraint equivariant estimator that fully leverages the underlying geometry of the system kinematics. It was tested with real data from the UAV of UIB navigating in the ship during the integration week. The figures below show the side and top views of an inspection trajectory estimated purely with the MSCEqF approach.

As a next step, the performance of this framework will be evaluated using the ground-truth data and compared to the lidar-based and previous visual-inertial odometry frameworks.

References:

[A] Alessandro Fornasier, Pieter van Goor, Eren Allak, Robert Mahony, Stephan Weiss: MSCEqF: A Multi State Constraint Equivariant Filter for Vision-aided Inertial Navigation. IEEE Robotics and Automation Letters (RA-L), 2023.

Regarding UIB team activities during the integration week:

  • Continued development and integration of the different processes implementing the first stage of a BW2 inspection mission.
    • This first stage aims at building the localization framework for the rest of the robots and at providing a first set of inspection data (comprising the flight log, images, point clouds and mesh, including the positioning information for the collected data and the corresponding processing results; all these data are made available to the rest of the robots).
    • During this first stage, the inspection-oriented drone developed by UIB operates as a standalone platform and flies in front of the structure to inspect, collecting the inspection data in a fully autonomous, systematic way.
    • A number of processes then run in the drone ground station in order to: (1) build the localization framework [methodology developed by the Univ. of Klagenfurt team], (2) build a mesh (to be used by the rest of the robots along the BW2 inspection pipeline) and (3) detect defects in the collected images.
Figure 3: Pictures of various moments of the test activities carried out during the integration week
  • As part of these development and integration efforts, a number of tests were performed in front of the inner hull of the lower-floor garage of a Ro-Ro / passenger ship to check the suitability of the processes that lead to the generation of the localization framework [developed by the Univ. of Klagenfurt team].
    • The localization framework is based on visual markers (ArUco tags) and an ultra-wideband (UWB) network.
    • The rest of the robots involved in a BW2 inspection mission make use of the resulting calibration to plan their motion and conduct the inspection at a more detailed level.
  • Continuing with the development and integration efforts, a number of tests were performed to check the suitability of the processes leading to the mesh of the vessel hull. The mesh, apart from being an output of an inspection mission, is intended to be useful to the rest of the robots of the inspection pipeline as input for their motion planning modules (see Figure 4).
  • Several experiments on defect detection were also conducted for the different flights performed, by processing the collected images. The output of this step comprises several files, including the images resulting from the processing, among them a file that indicates where defects are suspected to be. This information can be used to plan the motion of the different robotic platforms in the second stage of a BW2 inspection mission.
Figure 4: Mesh obtained from one of the flights performed in the area of inspection
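As an aside, the input/output idea of the defect detection step (images in, suspected defect locations out) can be conveyed with a deliberately naive colour-threshold sketch. The project's actual detectors are far more sophisticated; everything below, from the function name to the threshold, is an illustrative assumption:

```python
# Naive corrosion cue for illustration only: flag pixels whose colour is
# strongly red/brown-dominant, then report the bounding box of flagged
# pixels. Real project detectors are learning-based; this only shows the
# shape of the interface (image in, suspected-defect region out).

def suspect_corrosion(image, red_margin=40):
    """image: list of rows of (r, g, b) tuples. Returns a bounding box
    (x_min, y_min, x_max, y_max) of suspect pixels, or None if clean."""
    hits = [(x, y)
            for y, row in enumerate(image)
            for x, (r, g, b) in enumerate(row)
            if r - max(g, b) > red_margin]   # red strongly dominant
    if not hits:
        return None
    xs, ys = zip(*hits)
    return (min(xs), min(ys), max(xs), max(ys))
```

The bounding boxes produced this way correspond to the "suspected defect" entries that downstream robots could use for closer inspection.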
The AUR Lab from the Norwegian University of Science and Technology (NTNU) participated in the BUGWRIGHT2 integration week in Athens in March 2024. The task of this participant is to capture video of the part of the ship’s hull below the water surface. The equipment used is a Blueye X3 ROV equipped with a front-facing Oculus sonar and a bottom-mounted Doppler velocity logger (DVL). The sonar has two purposes. During manual navigation, it is used to position the ROV and avoid collisions. The ROV is also able to navigate automatically along the ship’s hull at a fixed distance from the hull; during automatic navigation, the sonar keeps the ROV orthogonal to the hull in order to record video of good quality. The DVL measures the velocity of the ROV relative to the seabed. Using the DVL together with an inertial measurement unit (IMU), the ROV can also estimate its own position, with some error; this position error accumulates over time.
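The dead-reckoning principle described above, and its error accumulation, can be sketched in a few lines. The function below is an illustration rather than the vehicle software: heading comes from the IMU, body-frame velocity from the DVL, and any velocity bias accumulates linearly in the integrated position:

```python
# Illustrative DVL + IMU dead reckoning: integrate body-frame velocities
# (surge u, sway v) rotated by the IMU yaw into a horizontal position.
# A constant velocity bias of b m/s produces a position error of b*t
# after t seconds, which is why the error grows without bound over time.
import math

def dead_reckon(samples, dt):
    """samples: list of (yaw_rad, surge_mps, sway_mps) at interval dt.
    Returns the integrated (x, y) position in the start frame."""
    x = y = 0.0
    for yaw, u, v in samples:
        x += (u * math.cos(yaw) - v * math.sin(yaw)) * dt
        y += (u * math.sin(yaw) + v * math.cos(yaw)) * dt
    return x, y
```

This unbounded drift is exactly why the project anchors the AUV to external references (markers, UWB) rather than relying on dead reckoning alone.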
 
The recorded video, along with the position recorded along the hull, will be used as input to the BUGWRIGHT2 system, where corrosion and other issues on the hull can be identified.
 
The software used is the SDK of the Blueye ROV. On top of it, ROS with Python is used to control the ROV with input from the sonar and the DVL. The software continuously measures the distance and angle between the ROV and the ship’s hull.
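One simple way such a distance and angle could be obtained, assuming two sonar range returns at known beam bearings, is to fit a line through the two hull hits. This is a hypothetical sketch of the geometry, not the actual algorithm used on the ROV:

```python
# Hypothetical hull geometry from two sonar returns (illustration only).
# Two range/bearing hits define a line (the local hull); from it we get
# the perpendicular standoff distance and the hull's orientation in the
# vehicle frame, which a controller could regulate to stay orthogonal.
import math

def hull_distance_and_angle(r1, b1, r2, b2):
    """r1, r2: ranges (m); b1, b2: beam bearings (rad) in the vehicle
    frame. Returns (perpendicular distance to the hull line, angle of
    the hull line relative to the vehicle x-axis)."""
    x1, y1 = r1 * math.cos(b1), r1 * math.sin(b1)
    x2, y2 = r2 * math.cos(b2), r2 * math.sin(b2)
    dx, dy = x2 - x1, y2 - y1                    # direction of hull line
    dist = abs(dx * y1 - dy * x1) / math.hypot(dx, dy)
    angle = math.atan2(dy, dx)                   # hull line orientation
    return dist, angle
```

Keeping the ROV orthogonal to the hull then amounts to steering until the hull line is perpendicular to the vehicle's forward axis while holding the distance constant.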

During the last integration week, RINA, as leader of T9.4 (Evaluation and validation), aimed to conclude the final actions of the task and complete the deliverable due at the end of March. The final part of the evaluation report was to capture the appreciation of the end users (classification society, shipowners, shipyards) regarding the platforms developed in the project. RINA had compiled a survey with different questions addressed to each end user and had circulated the document before the integration week. During the meeting, the completed end-user evaluation surveys were collected from the relevant partners, and follow-up discussions were held on the responses to identify the advantages and limitations of the BUGWRIGHT2 platforms. In addition, to capture the perspective of the classification societies, RINA invited surveyors from the RINA Hellas office, who assisted in the evaluation. The surveyors received a two-day briefing and had the chance to interact with the platforms and question the developers to understand the platforms better. After this briefing, they completed the survey and provided their feedback on the platforms, similar to the other end users. RINA also collected and integrated into the deliverable the contribution of AASA regarding the crawlers’ evaluation. Finally, after collecting the required material, the findings from the surveys were analyzed, and the final part of the deliverable regarding the end users’ appreciation, including potential limitations and future research required, was drafted.

Arsenal do Alfeite, SA (AASA) took part in the integration week in Athens, in March 2024, at the Perama shipyard. As an end user, AASA was always ready to collaborate. Specifically, AASA exchanged opinions with RINA on how to assess certain platforms as an end user. Additionally, partners from other platforms, such as the underwater visual system, sought the end user’s opinion on the presentation of results, their layout, and whether those results were clear. AASA also participated in an augmented reality experience with the aerial platform, where the entire process to achieve the result was explained, and we shared our feedback on it.

During the BUGWRIGHT2 final integration week, WMU conducted on-site observations of the RITs in operation. The work also included gathering the technical partners’ on-site perspectives on localization, as well as queries on hazards, deployment, and concerns related to regulatory drawbacks. All responses were provided by the technical team.

The Consortium would like to thank DANAOS, GLAFCOS and Dr. Kyriakos P. Mahos, from Levante Ferries, for their welcome, support and help during the week.

#h2020 #euproject #integrationweek #fieldtests #research #dronetechnology #innovation #testing #data #robots #augmentedreality #BUGWRIGHT2 #LSTS #UnderwaterExploration

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 871260.