Article

Port Structure Inspection Based on 6-DOF Displacement Estimation Combined with Homography Formulation and Genetic Algorithm

1 Department of Structural Engineering Research, Korea Institute of Civil Engineering and Building Technology, Goyang-si 10233, Korea
2 SRH Solution, Hwaseong-si 18496, Korea
3 Department of Civil and Environmental Engineering, Hanbat National University, Daejeon 34158, Korea
* Author to whom correspondence should be addressed.
Submission received: 18 June 2021 / Revised: 9 July 2021 / Accepted: 12 July 2021 / Published: 13 July 2021
(This article belongs to the Special Issue Artificial Intelligence Technologies for Structural Health Monitoring)

Abstract: A vision sensor-based 6-DOF displacement evaluation method incorporating a genetic algorithm is proposed to monitor the critical defects of port infrastructure, such as deflection, slope, and slip. The 6-DOF behavior of the port structure, including subsidence, is estimated based on the specifications of the target structure and of fixed structures nearby. The method calculates the relative position of the target port structure and measures the movement of the structure over time. To improve the measurement accuracy, a genetic algorithm is used to adjust the intrinsic parameters that were initially estimated using checkerboards. The results of measuring 6-DOF displacements with the tuned intrinsic parameters confirmed that the method has the potential to accurately measure the 6-DOF behavior of port facilities. The possibility of field application was examined by inducing an artificial movement in an image of a port facility to create an arbitrary displacement between two points.

1. Introduction

The aging and deterioration of port facilities in the Republic of Korea have become an issue that must be addressed. As of 2020, 49.4% (538 locations) of port facilities were more than 20 years old, while 13.1% (143 locations) were more than 40 years old. According to the safety inspection and precision safety diagnosis reports of port facilities, the A-grade ratio decreased sharply for facilities more than 20 years old, while for those of 40 years or more, both the A- and B-grade ratios tended to decrease. Moreover, the increase in the intensity and frequency of natural disasters related to climate change increases the variability of the design external force and raises the possibility of large-scale damage to aging port facilities [1]. Figure 1 shows critical damage cases that have occurred in port facilities. In response, the Ministry of Oceans and Fisheries of the Republic of Korea established a national roadmap in 2020 for the smart sensing, monitoring, analysis, evaluation, and repair of port facilities for proactive and timely maintenance. In the detailed guidelines for infrastructure safety inspection and precision safety diagnosis, the critical major defects in port facilities are defined as: foundation scour; damage and corrosion of piles; loss of internal force due to carbonation and chloride attack in concrete; corrosion of lock gate facilities; and the normal displacement and settlement of berthing structures [2,3].
The settlement and normal displacement of berthing structures are generally evaluated by surface level surveying; foundation scour must be evaluated by divers, and seaward members must be inspected by inspectors moving in a boat. These evaluations and inspections are carried out only every few years, so continuous monitoring is difficult. Attaching various electric sensors is one way to monitor behavior such as displacement, settlement, slip, and slope, but organizing such a sensing system is complicated by berthing operations, salt attack, high-risk work on seaward members, and facility users' routes [4,5,6,7]. Thus, in this paper, we present a technique for measuring the precise behavior of a berthing structure that could be caused by scouring, settlement, slip, damage, material deterioration, and so forth.
Vision-based non-contact displacement measurement of structures has developed rapidly over the past decade [8]. Kohut et al. (2013) presented a vision-based deflection measurement method using the digital image correlation coefficient [9]. Jeon et al. (2014) proposed a 6-DOF translational and rotational displacement measurement system with a vision sensor and a uniquely designed marker [10]. Ye et al. (2015) proposed a multi-point displacement measurement method using a pattern-matching algorithm [11]. Feng et al. (2015) proposed a structural displacement measurement method with subpixel resolution using the upsampled cross-correlation algorithm [12]. Zhou et al. (2020) proposed a videogrammetric technique for displacement monitoring that eliminates the measurement error due to image drift induced by temperature variation [13]. Most of the aforementioned non-contact vision-based displacement measurement systems, however, have one of the following drawbacks: they estimate only deflection, which is a 1-DOF displacement; they require markers attached to the structure for feature-point detection; or their accuracy depends heavily on the camera calibration results used to calculate the intrinsic parameters.
The 6-DOF displacement can also be measured using structured light composed of lasers and vision sensors [14,15,16]. This translational and rotational displacement measurement system, called a paired structured light system, is composed of two sides facing each other, each with one or two lasers, a screen, and a camera. The lasers on each side project their beams onto the screen on the opposite side, and a camera near the screen captures an image of the screen. By calculating the positions of the laser beams, the relative displacement between the two sides can be estimated. In a follow-up study by the same research group, a 2-DOF manipulator was introduced to increase the range of the displacement measurement. A visually servoed paired structured light system can estimate the displacement with an error within 0.2 mm and 0.2 deg, but it requires the installation of a relatively heavy sensing system on port structures and mobile platforms. Therefore, in this paper, a displacement measurement method using the fixed intrinsic parameters of the camera is applied to measure the 6-DOF displacement between the camera and the fixed/port structures; with it, the movement of the port structure relative to a fixed structure can be measured. The floating port structure was assumed to be a rigid body with no deformation in shape. Since the displacement estimation of the structure is highly dependent on the camera-intrinsic parameters, the intrinsic parameters are tuned to given measured translational and rotational displacements using a genetic algorithm. An indoor model experiment and an outdoor field image-based experiment were performed; the results confirmed that translational and rotational displacements are estimated more precisely after calibrating the intrinsic parameters of the vision sensor, and that the proposed technique is applicable in the field.
The remainder of the paper is organized as follows. In Section 2, the translational and rotational displacement estimation method using a vision sensor is described. The application of the genetic algorithm for tuning the camera-intrinsic parameters is introduced in Section 3. To validate the performance and applicability of the proposed method, experimental tests using model structures and images captured with a drone are conducted, and the results are discussed in Section 4. Conclusions and further research directions are given in Section 5.

2. 6-DOF Displacement Estimation Using Vision Sensor

The 6-DOF relative displacements, which include translational and rotational displacements about three axes, can be estimated using the positions of feature points in world coordinates and the intrinsic parameters of a vision sensor. The intrinsic parameters determine the optical properties of the camera lens, including the focal lengths, principal point, and distortion coefficients. Figure 2 shows the geometry of the feature points in both the world and image planes. In the figure, Qi and qi (i = 1, …, N) denote the corresponding points of the world and image planes, respectively, where N is the number of feature points. The points in the world plane, Qi = [X Y Z 1]T, are represented in the three-dimensional coordinate system, and the corresponding points, qi = [u v 1]T, are represented in two-dimensional space. The relationship between the two planes can be expressed in terms of matrix multiplication, as follows:
$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f_u & 0 & c_u \\ 0 & f_v & c_v \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x_d \\ y_d \\ 1 \end{bmatrix} \quad (1)$$

$$\begin{bmatrix} x_d \\ y_d \end{bmatrix} = \begin{bmatrix} x\,(1 + K_1 r^2 + K_2 r^4) + 2K_3 xy + K_4(r^2 + 2x^2) \\ y\,(1 + K_1 r^2 + K_2 r^4) + K_3(r^2 + 2y^2) + 2K_4 xy \end{bmatrix} \quad (2)$$

$$\begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} = \begin{bmatrix} r_{11} & r_{12} & r_{13} & t_x \\ r_{21} & r_{22} & r_{23} & t_y \\ r_{31} & r_{32} & r_{33} & t_z \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \quad (3)$$
where fu and fv are the focal lengths; cu and cv represent the principal point, where the optical axis of the camera intersects the image plane; K1 and K2 are the radial distortion coefficients; K3 and K4 are the tangential distortion coefficients; and rij and t are the elements of the rotation matrix and translation vector. In Equation (2), x = Xc/Zc, y = Yc/Zc, and r2 = x2 + y2, where Xc, Yc, and Zc are defined in Equation (3). The homography matrix, composed of the intrinsic and extrinsic camera parameters, maps pixels in a 2D image to the corresponding real-world coordinates in the 3D scene, as shown in Equations (1)–(3) [17,18]. When the feature points lie on the same plane, a 3 × 3 homography matrix can be used with the given 2D-to-2D point correspondences. Since the homography matrix has eight degrees of freedom, at least four point-to-point correspondences are required. In other words, the rotation matrix and the translation vector can be obtained with the known positions of four or more feature points (N ≥ 4) [18]. By calculating the rotational and translational displacements from the vision sensor to the target and fixed structures, the relative 6-DOF displacement of the target structure can be estimated.
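As an illustration of Equations (1)–(3), the following minimal NumPy function (a sketch, not the authors' implementation; the function and argument names are assumptions) projects a world point to pixel coordinates through the extrinsic transform, the lens distortion model, and the intrinsic matrix:

```python
import numpy as np

def project_point(Q, ext, K_mat, dist):
    """Project a 3D world point to pixel coordinates via Eqs. (1)-(3).

    Q     : (3,) world point [X, Y, Z]
    ext   : (3, 4) extrinsic matrix [R | t]
    K_mat : (3, 3) intrinsic matrix [[fu, 0, cu], [0, fv, cv], [0, 0, 1]]
    dist  : (K1, K2, K3, K4) radial and tangential distortion coefficients
    """
    # Eq. (3): world -> camera coordinates
    Xc, Yc, Zc = ext @ np.append(Q, 1.0)
    x, y = Xc / Zc, Yc / Zc               # normalized image coordinates
    # Eq. (2): apply radial and tangential lens distortion
    K1, K2, K3, K4 = dist
    r2 = x * x + y * y
    radial = 1 + K1 * r2 + K2 * r2 ** 2
    xd = x * radial + 2 * K3 * x * y + K4 * (r2 + 2 * x * x)
    yd = y * radial + K3 * (r2 + 2 * y * y) + 2 * K4 * x * y
    # Eq. (1): normalized distorted coordinates -> pixels
    u, v, _ = K_mat @ np.array([xd, yd, 1.0])
    return u, v
```

With zero distortion and identity extrinsics, a point on the optical axis maps to the principal point, which is a quick sanity check of the chain.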
Figure 3 shows the entire process of relative displacement estimation between the two structures using image processing techniques. The camera captures an image of the structures, and the lens distortion is corrected using the previously calculated intrinsic parameters. From the undistorted image, the feature points of the structures, including corners, are detected using image processing techniques such as binarization and sub-pixel corner detection. From the positions of at least four feature points, the 6-DOF displacement between the camera and each structure can be estimated. The relative displacement between the two structures, the fixed and target structures, can then be estimated using the following equations:
$${}^{F}D_{T}(x,y,z,\theta,\phi,\psi) = T(x,y,z)\,R_x(\theta)\,R_y(\phi)\,R_z(\psi) = \begin{bmatrix} c\phi c\psi & -c\phi s\psi & s\phi & x \\ s\theta s\phi c\psi + c\theta s\psi & -s\theta s\phi s\psi + c\theta c\psi & -s\theta c\phi & y \\ -c\theta s\phi c\psi + s\theta s\psi & c\theta s\phi s\psi + s\theta c\psi & c\theta c\phi & z \\ 0 & 0 & 0 & 1 \end{bmatrix} \quad (4)$$

$${}^{F}D_{T} = {}^{F}D_{C}\,{}^{C}D_{T} \quad (5)$$
In Equation (4), FDT is the transformation matrix composed of the 6-DOF relative displacement of the target coordinate frame with respect to the fixed coordinate frame, where F and T indicate the fixed and target structures, respectively. The matrix is the product of the translation matrix T(x,y,z) along the X, Y, and Z axes with the rotation matrices Rx(θ), Ry(φ), and Rz(ψ) about the X, Y, and Z axes, respectively. In the equation, sθ and cθ denote sinθ and cosθ, respectively. The details of each matrix can be found in [19]. In Equation (5), FDC and TDC are the displacements of the fixed and target coordinate frames relative to the camera coordinate frame, indicated as C; CDT can be obtained by inverting TDC. The relative displacements estimated from images taken at different times t − 1 and t are used to estimate the structural behavior, as follows:
$${}^{T,t}D_{T,t-1} = {}^{T}D_{F,t}\;{}^{F}D_{T,t-1} \quad (6)$$
where TDF,t denotes the translational and rotational displacement between the fixed and target structures at time t.
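The transform composition of Equations (4)–(6) can be sketched with two short helpers; this is an illustrative sketch (the function names are mine, not the authors'), assuming the rotation order T(x,y,z)·Rx(θ)·Ry(φ)·Rz(ψ) stated in Equation (4):

```python
import numpy as np

def transform(x, y, z, theta, phi, psi):
    """Homogeneous transform T(x,y,z)·Rx(θ)·Ry(φ)·Rz(ψ), as in Eq. (4)."""
    ct, st = np.cos(theta), np.sin(theta)
    cp, sp = np.cos(phi), np.sin(phi)
    cs, ss = np.cos(psi), np.sin(psi)
    Rx = np.array([[1, 0, 0], [0, ct, -st], [0, st, ct]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cs, -ss, 0], [ss, cs, 0], [0, 0, 1]])
    D = np.eye(4)
    D[:3, :3] = Rx @ Ry @ Rz          # rotation block
    D[:3, 3] = [x, y, z]              # translation column
    return D

def relative_displacement(D_fc, D_tc):
    """Eq. (5): FDT = FDC · CDT, where CDT = (TDC)^-1."""
    return D_fc @ np.linalg.inv(D_tc)
```

The same composition rule applied to transforms at times t − 1 and t gives the structural movement of Equation (6).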

3. Application of Genetic Algorithm for Optimization of the Camera-Intrinsic Parameters

Metaheuristic algorithms have developed rapidly in recent years to solve complex real-life problems in various fields [20,21]. Most metaheuristic algorithms are inspired by biological evolution, swarm behavior, or the laws of physics, and can be classified into two categories: single-solution and population-based metaheuristics [22]. In comparison with single-solution approaches, which improve the solution using local search, population-based metaheuristics maintain diversity in the population and avoid getting stuck in local optima [23]. Among the population-based metaheuristic algorithms, the genetic algorithm (GA), one of the best-known, is used here to find the parameter sets in the homography equation. The GA, introduced by A. S. Fraser in 1957, converges toward an optimal solution of a multivariable function by repeating population generation, fitness/penalty evaluation, selection, reproduction, crossover, and mutation [24,25]. Compared to other optimization methods, it can address a wide range of optimization problems through its chromosome-based approach and can handle a multiple-solution search space with less complexity and in a more straightforward manner [26]. The GA is widely used in various research fields because it builds models in a probabilistic manner and incorporates new information in a non-arbitrary way, despite being time-consuming and computationally intensive.
Algorithm 1 shows the entire procedure for optimizing the intrinsic parameters of the homography equation using the GA. First, the initial population of chromosomes is generated, each composed of the parameters of the homography equation, Pset = [fu, fv, cu, cv, K], where K includes the radial distortion coefficients (K1 and K2) and the tangential distortion coefficients (K3 and K4). After generation, the penalty of each chromosome is evaluated, and the best chromosome is the one that minimizes the difference between the estimated and previously given translational and rotational displacements, which are extrinsic parameters of the vision sensor. The objective function, which optimizes translational and rotational displacements of different units, is set as a normalized vector objective function, as follows [27]:
$$F_{penalty} = \underset{\hat{P}_{set}}{\mathrm{argmin}} \sum_{i=1}^{N_{max\_gen}} \frac{\left|\hat{D}_i - D_i\right|}{\max(D_i) - \min(D_i)} \quad (7)$$
where D̂i and Di are the true and estimated displacements, respectively. A chromosome with a lower penalty value has a higher probability of being selected for the next generation. The selected best chromosome is reproduced to form a new population, and crossover and mutation are performed to prevent the GA from converging on local minima. Based on the updated population, Steps 2–4 are repeated until the stopping criteria are satisfied or the number of generations reaches the maximum. The parameter set with the minimum penalty value is selected, and the constituted equation is automatically tuned. In this study, single-point crossover, proportional roulette-wheel selection, and single-point mutation are used [28,29]. A population size of 150, a crossover probability of 0.6, a mutation probability of 0.05, and a maximum of 200 generations are used.
Algorithm 1. Procedure of optimizing intrinsic parameters of the vision sensor with genetic algorithm.
Input:
   Population size, n
   Maximum number of generations, Nmax_gen
   Initial values and the search range of the chromosomes, P
Output:
   Global best solution, Pbt
begin
   Step 1: Generate the initial population of chromosomes
    Pset = [fu, fv, cu, cv, K]
   while the stopping criteria are not satisfied AND the number of generations is less than Nmax_gen
       Step 2: Evaluate the penalty of each chromosome Pi (i = 1, 2, …, n) using Equation (7)
       Step 3: Select the best chromosome and perform reproduction
       Step 4: Perform crossover and mutation
   end
   Step 5: Return the best individual over all generations, Pbt
end
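Algorithm 1 can be sketched as a minimal GA as follows. This is a hedged illustration, not the authors' code: the ground-truth values and search range are hypothetical, a simplified normalized-error penalty stands in for the displacement mismatch of Equation (7), only two chromosome genes (fu, fv) are tuned, and elitist truncation selection stands in for the paper's roulette-wheel scheme; the crossover/mutation probabilities mirror the settings described above:

```python
import random

# Hypothetical "true" focal lengths the GA should recover.
TRUE = {"fu": 812.0, "fv": 809.0}

def penalty(chromosome):
    # Normalized absolute error against the given values, standing in
    # for the displacement mismatch of Eq. (7).
    return sum(abs(chromosome[k] - TRUE[k]) / 100.0 for k in TRUE)

def genetic_search(pop_size=150, generations=200, p_cross=0.6, p_mut=0.05, seed=0):
    rng = random.Random(seed)
    # Step 1: initial population drawn from an assumed search range
    pop = [{k: rng.uniform(700, 900) for k in TRUE} for _ in range(pop_size)]
    for _ in range(generations):
        # Step 2: evaluate penalties; keep the best chromosome (elitism)
        pop.sort(key=penalty)
        best = pop[0]
        # Step 3: select parents from the fitter half and reproduce
        parents = pop[: pop_size // 2]
        children = [dict(best)]
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = dict(a)
            # Step 4: single-point crossover and single-point mutation
            if rng.random() < p_cross:
                child["fv"] = b["fv"]
            if rng.random() < p_mut:
                k = rng.choice(list(TRUE))
                child[k] += rng.gauss(0, 5.0)
            children.append(child)
        pop = children
    # Step 5: best individual of the final generation
    return min(pop, key=penalty)
```

In the real problem, the penalty would be evaluated by re-estimating the 6-DOF displacement with each candidate intrinsic parameter set and comparing it against the known imposed motion.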
To set the search range of the parameters to be tuned, the intrinsic parameters calculated using the checkerboards are analyzed, and the coefficient of variation, also called the relative standard deviation, is calculated [30]. Figure 4 shows the checkerboards of different sizes. Table 1 shows the intrinsic parameters estimated for each case with combinations of one or two different-sized checkerboards. Figure 5 shows the box plots and the calculated coefficients of variation. In this paper, the search range of the parameters in the genetic algorithm is set from the interquartile range in the box plots. Since the relative standard deviations of the radial distortion parameter on the Y axis and the tangential distortion on the X and Y axes are relatively large, the search range for these three distortion parameters is additionally enlarged by weighting factors.
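The search-range construction described above can be sketched as follows; the sample focal-length estimates are hypothetical, and applying the weight to the interquartile range is one plausible reading of the enlargement step:

```python
import numpy as np

# Hypothetical focal-length estimates (pixels) from repeated checkerboard
# calibrations; their spread defines the GA search range, as in Figure 5.
fu_samples = np.array([808.2, 811.5, 809.9, 815.3, 806.7, 812.4, 810.1])

# Coefficient of variation (relative standard deviation), in percent
cv_percent = 100 * fu_samples.std(ddof=1) / fu_samples.mean()

# Interquartile range from the box plot, optionally widened by a weight
q1, q3 = np.percentile(fu_samples, [25, 75])
iqr = q3 - q1
weight = 2.5                      # enlargement factor (used for distortion terms)
search_range = (q1 - weight * iqr, q3 + weight * iqr)
```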

4. Experimental Tests

4.1. Verification of Displacement Estimation Using Model Structures

To verify the performance of the genetic algorithm application, experimental tests with artificial structures and a motion stage were performed. The structures were produced by simulating the shapes of actual port structures, and the relative displacement between the target structure placed on the motion stage and the fixed structure was estimated (see Figure 6). Figure 7 shows the graphical user interface, based on Visual C++, developed to find the relative displacement in a captured image; it employs image binarization using an adaptive threshold, edge detection at the sub-pixel level, and camera extrinsic parameter estimation. From the estimated relative displacement between the two structures in the before and after images, the movement of the target structure over time is calculated.
Using different patterns and sizes of checkerboards and experimental data sets with X-axis translational displacement and Y-axis rotational displacement, the intrinsic parameters were calculated (see Table 1). The median, minimum, and maximum values were used to generate the populations of chromosomes in the GA. Since the relative standard deviations of the radial distortion parameter on the Y axis and the tangential distortion on the X and Y axes are relatively large, as shown in Figure 5, the weights on these three distortion parameters were set to 2.5 to enlarge the search range. Table 2 shows the translational and rotational displacement results using the camera-intrinsic parameters adjusted by applying the GA in the calculation of the 6-DOF displacement. The experimental test without the GA was performed with intrinsic parameters calculated from 40 captured images using the checkerboard shown in Figure 4f. The table includes the errors of the 6-DOF displacements calculated from ten different GA parameter sets and the actual movement. The results show that the displacements estimated with the compensated camera-intrinsic parameters are more accurate for both translational and rotational displacements. In the design standards for port and harbor structures [31,32,33], the maximum allowable horizontal displacement at the functional performance level is 100 mm. Considering the acceptable measurement tolerance, the proposed method, with an RMSE of less than 3 mm and 1° for translational and rotational displacements, respectively, can be applied to port structures to monitor their structural condition.
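The RMSE check against the allowable displacement can be illustrated as follows; the displacement values are hypothetical, and the 3% tolerance is an assumed example rather than the design standard's figure:

```python
import numpy as np

# Hypothetical imposed vs. estimated X-axis translations (mm) from
# repeated motion-stage runs.
true_mm = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
est_mm = np.array([11.2, 18.9, 31.5, 39.2, 51.8])

rmse = np.sqrt(np.mean((est_mm - true_mm) ** 2))
allowable_mm = 100.0                      # functional-performance limit [31,32,33]
acceptable = rmse < allowable_mm * 0.03   # assumed 3% measurement tolerance
```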

4.2. Verification of Field Applicability Using Port Structure Images

To verify the applicability of the proposed method, an experimental test with an image of one of the major port structures in Incheon, Republic of Korea, was performed. An inspection drone specialized for port facilities was developed, containing the following: a module for precise three-dimensional position control using multiple GNSS and corrected signals, a module for mounting a multi-angle camera and a front gimbal, and a folding frame that can be carried by a person for photography and videography (see Figure 8a). Figure 8b shows the 3D flight trajectory when capturing the images at high altitude. Through the development of real-time image streaming control technology that integrates the ground control module and an LTE module, it is possible to control the drone beyond the visible range, more than 3 km away from Incheon port.
An artificial movement of the structure was generated by shifting the target structure in an integrated orthophoto, and the relative displacement of the structure between the two images was calculated, as shown in Figure 9. The figure shows that the main displacement is the X-axis displacement, which lies along the longitudinal direction of the target structure. The estimated relative displacement in the test is D = [−43,011.9, −825.5, 439.6, −1.2, 0, 2.6], with units of mm and degrees. The intrinsic parameters were tuned using the GA with the specifications of the structures, i.e., the coordinates of the feature points on the fixed and target structures. Using the proposed method, it will be possible to determine whether port structures can remain in service by estimating the displacement before and after a disaster.

5. Conclusions

The translational and rotational displacements of port structures can be estimated by capturing images that include both a fixed and a target structure. The movement of the target structure relative to the fixed structure can be calculated by estimating the displacements from the camera to the fixed and target structures, respectively. The movement of the structure can be measured by a vision sensor mounted on a mobile platform, such as a drone, without attaching a special sensing system to the structure. A genetic algorithm was introduced to improve the accuracy of the displacements, and the results confirmed that the root mean square errors of the translational and rotational displacements were greatly reduced. The applicability of the proposed method to port infrastructure was verified using high-altitude orthophoto images and the specifications of the structures with a mobile platform. In the future, deep learning techniques will be applied to enable robust detection of the structures against changes in external environmental conditions and to ensure the usability and safety of continuously monitored major port facilities.

Author Contributions

H.J. conceived the presented idea and supervised the project. J.M., H.B. and Y.B. developed the detection and quantification method and performed the experimental tests. All authors have read and agreed to the published version of the manuscript.

Funding

This research was a part of the project titled ‘Development of smart maintenance monitoring techniques to prepare for disaster and deterioration of port infra structures (No. 20210659)’ funded by the Ministry of Oceans and Fisheries, Korea.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are available on request due to restrictions, e.g., privacy or ethical considerations.

Acknowledgments

This research was a part of the project titled ‘Development of smart maintenance monitoring techniques to prepare for disaster and deterioration of port infra structures (No. 20210659)’ funded by the Ministry of Oceans and Fisheries, Korea. The port images obtained by the drone in Figure 8 were provided by SISTECH, Korea.

Conflicts of Interest

The authors declare that they have no conflict of interest.

References

  1. Ministry of Oceans and Fisheries. The 1st Port Facility Management Plan (2020~2025). 2021. Available online: https://www.mof.go.kr/article/view.do?menuKey=375&boardKey=9&articleKey=37413 (accessed on 11 May 2021).
  2. Lim, J.; Cho, I.; Lee, J.; Lee, A.; Park, M. A study on performance model for establishing strategies of port facilities maintenance. J. Korean Soc. Hazard Mitig. 2018, 18, 359–367. [Google Scholar]
  3. Ministry of Oceans and Fisheries. Detailed Guidelines for Safety Inspection of Port Facilities. 2015. Available online: https://www.mof.go.kr/article/view.do?menuKey=390&boardKey=26&articleKey=9766 (accessed on 11 May 2021).
  4. Cha, K.; Kim, S.-W.; Kim, J.H.; Park, M.-Y.; Kong, J.S. Development of the deterioration models for the port structures by the multiple regression analysis and markov chain. J. Comput. Struct. Eng. Inst. Korea 2015, 28, 229–239. [Google Scholar]
  5. Jo, B.W.; Jo, J.H.; Khan, R.M.A.; Kim, J.H.; Lee, Y.S. Development of a cloud computing-based pier type port structure stability evaluation platform using fiber Bragg grating sensors. Sensors 2018, 18, 1681. [Google Scholar] [CrossRef] [Green Version]
  6. Del Grosso, A.; Inaudi, D.; Lanata, F. Strain and displacement monitoring of a quay wall in the Port of Genoa by means of fibre optic sensors. In Proceedings of the 2nd European Conference on Structural Control, ENPC, Paris, France, 3–6 July 2000. [Google Scholar]
  7. Lee, S.-Y.; Nguyen, K.-D.; Huynh, T.-C.; Kim, J.-T.; Yi, J.-H.; Han, S.-H. Vibration-based damage monitoring of harbor caisson structure with damaged foundation-structure interface. Smart Struct. Syst. 2012, 10, 517–546. [Google Scholar] [CrossRef]
  8. Koch, C.; Paal, S.G.; Rashidi, A.; Zhu, Z.; Konig, M.; Brilakis, I. Achievements and challenges in machine vision-based inspection of large concrete structures. Adv. Struct. Eng. 2014, 17, 303–318. [Google Scholar]
  9. Kohut, P.; Holak, K.; Uhl, T.; Ortyl, Ł.; Owerko, T.; Kuras, P.; Kocierz, R. Monitoring of a civil structure’s state based on noncontact measurements. Struct. Health Monit. 2013, 12, 411–429. [Google Scholar] [CrossRef]
  10. Jeon, H.; Kim, Y.; Lee, D.; Myung, H. Vision-based remote 6-DOF structural displacement monitoring system using a unique marker. Smart Struct. Syst. 2014, 13, 927–942. [Google Scholar] [CrossRef]
  11. Ye, X.W.; Yi, T.-H.; Dong, C.Z.; Liu, T.; Bai, H. Multi-point displacement monitoring of bridges using a vision-based approach. Int. J. Wind. Struct. 2015, 20, 315–326. [Google Scholar] [CrossRef]
  12. Feng, D.; Feng, M.Q.; Ozer, E.; Fukuda, Y. A Vision-Based Sensor for Noncontact Structural Displacement Measurement. Sensors 2015, 15, 16557–16575. [Google Scholar] [CrossRef]
  13. Zhou, H.-F.; Lu, L.-J.; Li, Z.-Y.; Ni, Y.-Q. Exploration of temperature effect on videogrammetric technique for displacement monitoring. Smart Struct. Syst. 2020, 25, 135–153. [Google Scholar]
  14. Jeon, H.; Bang, Y.; Myung, H. A paired visual servoing system for 6-DOF displacement measurement of structures. Smart Mater. Struct. 2011, 20, 45019. [Google Scholar] [CrossRef]
  15. Jeon, H.; Myeong, W.; Shin, J.-U.; Park, J.-W.; Jung, H.-J.; Myung, H. Experimental validation of visually servoed paired structured light system (ViSP) for structural displacement monitoring. IEEE/ASME Trans. Mechatron. 2013, 19, 1603–1611. [Google Scholar]
  16. Jeon, H.; Choi, S.; Shin, J.; Kim, Y.; Myung, H. High-speed 6-DOF structural displacement monitoring by fusing ViSP (Visually Servoed Paired structured light system) and IMU with extended Kalman filter. Struct. Control Health Monit. 2017, 24, e1926. [Google Scholar] [CrossRef]
  17. Deligiannidis, L.; Arabnia, H.R. (Eds.) Emerging Trends in Image Processing, Computer Vision and Pattern Recognition; Morgan Kaufmann: Burlington, MA, USA, 2015; ISBN 9780128020456. [Google Scholar]
  18. Hartley, R.; Zisserman, A. Multiple View Geometry in Computer Vision; Cambridge University Press: Cambridge, UK, 2003; ISBN 9780511811685. [Google Scholar]
  19. Craig, J.J. Introduction to Robotics: Mechanics and Control, 3rd ed.; Pearson Prentice Hall: Hoboken, NJ, USA, 2005; ISBN 9780201543612. [Google Scholar]
  20. Katoch, S.; Chauhan, S.S.; Kumar, V. A review on genetic algorithm: Past, present, and future. Multimed. Tools Appl. 2021, 80, 8091–8126. [Google Scholar] [CrossRef] [PubMed]
  21. Panagant, N.; Bureerat, S. Truss topology, shape and sizing optimization by fully stressed design based on hybrid grey wolf optimization and adaptive differential evolution. Eng. Optim. 2018, 50, 1645–1661. [Google Scholar] [CrossRef]
  22. Bonabeau, E.; Dorigo, M.; Theraulaz, G. Swarm Intelligence: From Natural to Artificial Systems; Oxford University Press, Inc.: New York, NY, USA, 1999; ISBN 9780195131581. [Google Scholar]
  23. Cao, Y.; Zhang, H.; Li, W.; Zhou, M.; Zhang, Y.; Chaovalitwongse, W.A. Comprehensive Learning Particle Swarm Optimization Algorithm with Local Search for Multimodal Functions. IEEE Trans. Evol. Comput. 2019, 23, 718–731. [Google Scholar] [CrossRef]
  24. Fraser, A.S. Simulation of genetic systems by automatic digital computers I. Introduction. Aust. J. Biol. Sci. 1957, 10, 484–491. [Google Scholar] [CrossRef]
  25. Eberhart, R.C.; Shi, Y. Computational Intelligence: Concepts to Implementations; Morgan Kaufmann: Burlington, MA, USA, 2011; ISBN 0080553834. [Google Scholar]
  26. Tabassum, M.; Mathew, K. A genetic algorithm analysis towards optimization solutions. Int. J. Digit. Inf. Wirel. Commun. 2014, 4, 124–142. [Google Scholar] [CrossRef]
  27. Chiandussi, G.; Codegone, M.; Ferrero, S.; Varesio, F.E. Comparison of multi-objective optimization methodologies for engineering applications. Comput. Math. Appl. 2012, 63, 912–942. [Google Scholar] [CrossRef] [Green Version]
  28. Hopgood, A.A.; Mierzejewska, A. Transform ranking: A new method of fitness scaling in genetic algorithms. In Proceedings of the International Conference on Innovative Techniques and Applications of Artificial Intelligence, Cambridge, UK, 9–11 December 2008; Springer: London, UK, 2008; pp. 349–354. [Google Scholar]
  29. Ponsich, A.; Azzaro-Pantel, C.; Domenech, S.; Pibouleau, L. Constraint handling strategies in genetic algorithms application to optimal batch plant design. Chem. Eng. Process. Process Intensif. 2008, 47, 420–434. [Google Scholar] [CrossRef] [Green Version]
  30. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334. [Google Scholar] [CrossRef] [Green Version]
  31. Ministry of Oceans and Fisheries (Republic of Korea). Korea Design Standard for Port and Harbor Facilities: Earthquake (KDS 64 17 00: 2019); Ministry of Oceans and Fisheries (Republic of Korea): Sejong, Korea, 2019; pp. 20–30.
  32. Ferritto, J.M. Design Criteria for Earthquake Hazard Mitigation of Navy Piers and Wharves (NFESC-TR-2069-SHR); Naval Facilities Engineering Service Center: Port Hueneme, CA, USA, 1997. [Google Scholar]
  33. British Standards Document, Maritime Structures. Code of Practice for the Design of Quay Walls, Jetties and Dolphins (BS 6349-2); British Standards Institution: London, UK, 2010. [Google Scholar]
Figure 1. Critical damage to the facilities.
Figure 2. Homogeneous transformation between world and image planes.
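Figure 2 depicts the mapping from a world plane to the image plane that underlies the homography formulation. As a minimal sketch (the focal length, principal point, and camera pose below are illustrative assumptions, not the calibrated values reported later in Table 1), a point on the world plane Z = 0 maps to pixel coordinates through H = K[r1 r2 t]:

```python
import numpy as np

# Illustrative pinhole intrinsics (assumed values, not the paper's calibration):
# focal length f (px) and principal point (cx, cy).
f, cx, cy = 2600.0, 960.0, 540.0
K = np.array([[f, 0.0, cx],
              [0.0, f, cy],
              [0.0, 0.0, 1.0]])

# Assumed extrinsics: identity rotation, camera 5 m in front of the plane Z = 0.
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])

# For points on Z = 0 the projection collapses to a 3x3 homography
# H = K [r1 r2 t] acting on homogeneous plane coordinates (X, Y, 1).
H = K @ np.column_stack((R[:, 0], R[:, 1], t))

X = np.array([0.1, 0.2, 1.0])  # world-plane point (m), homogeneous
u = H @ X
u = u / u[2]                   # normalize to pixel coordinates
print(u[:2])                   # projected pixel location
```

Inverting this relation from tracked image points is what allows the 6-DOF pose (three translations, three rotations) of the target to be recovered.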
Figure 3. A block diagram of the displacement estimation process.
Figure 4. Checkerboards with (a) 3 × 4, (b) 3 × 6, (c) 4 × 5, (d) 4 × 7, (e) 5 × 6, and (f) 5 × 8 squares for the estimation of the intrinsic parameters of the vision sensor.
Figure 5. Box plot ((a) parameters of the focal length and principal points, (b) parameters of the distortion) and coefficient of variation of the estimated intrinsic parameters of the vision sensor.
Figure 6. Experimental setup: (a) port facilities for displacement measurement; (b) model structures and the motion stage.
Figure 7. Graphical user interface for estimating the 6-DOF displacement.
Figure 8. Experimental setup with (a) a drone with a front gimbal and (b) its flight trajectories at Incheon port.
Figure 9. Estimation of the relative displacement using port structure images.
Table 1. Estimated intrinsic parameters using different sets of checkerboards. The checkerboard ‘(a)’ denotes the size and configuration of the checkerboard presented in Figure 4a. Fx, Fy: focal length; Cx, Cy: principal point; K1, K2: radial distortion coefficients; K3, K4: tangential distortion coefficients.

| Checkerboards | Fx | Fy | Cx | Cy | K1 | K2 | K3 | K4 |
|---|---|---|---|---|---|---|---|---|
| (a) | 2654.95 | 2667.25 | 692.02 | 421.61 | −0.3975 | 0.1285 | −0.0012 | 0.0130 |
| (a), (b) | 2634.64 | 2624.02 | 1023.62 | 380.04 | −0.4629 | 0.3087 | 0.0059 | 0.0008 |
| (a), (c) | 2578.67 | 2572.71 | 948.75 | 459.45 | −0.4461 | 0.1865 | 0.0017 | 0.0037 |
| (a), (d) | 2586.03 | 2584.54 | 1001.43 | 424.18 | −0.4436 | 0.2398 | 0.0002 | 0.0006 |
| (a), (e) | 2598.73 | 2582.47 | 990.96 | 431.47 | −0.4178 | 0.0585 | 0.0050 | 0.0029 |
| (a), (f) | 2363.73 | 2370.19 | 1044.13 | 347.82 | −0.4688 | 0.3937 | −0.0005 | −0.0020 |
| (b) | 2705.33 | 2673.96 | 1029.39 | 219.31 | −0.4917 | 0.5865 | 0.0155 | −0.0029 |
| (b), (c) | 2615.95 | 2614.07 | 997.06 | 411.56 | −0.4325 | 0.2560 | 0.0014 | 0.0037 |
| (b), (d) | 2734.70 | 2649.68 | 930.35 | 253.04 | −0.3618 | −1.0397 | 0.0278 | 0.0034 |
| (b), (e) | 2621.91 | 2601.25 | 981.15 | 414.57 | −0.4233 | 0.1851 | 0.0070 | 0.0043 |
| (b), (f) | 2645.72 | 2647.74 | 998.71 | 327.04 | −0.4519 | 0.3165 | 0.0025 | 0.0003 |
| (c) | 2665.99 | 2665.67 | 887.69 | 365.05 | −0.5030 | 0.9319 | 0.0060 | 0.0119 |
| (c), (d) | 2590.24 | 2598.32 | 921.28 | 451.59 | −0.4460 | 0.2329 | −0.0035 | 0.0065 |
| (c), (e) | 2611.21 | 2601.35 | 984.80 | 439.40 | −0.4289 | 0.1636 | 0.0026 | 0.0034 |
| (c), (f) | 2584.71 | 2574.96 | 1007.15 | 351.71 | −0.4498 | 0.0726 | 0.0016 | −0.0016 |
| (d) | 2716.70 | 2700.58 | 938.09 | 197.31 | −0.4994 | 0.4672 | 0.0163 | 0.0039 |
| (d), (e) | 2585.51 | 2583.79 | 1009.34 | 408.67 | −0.4050 | 0.1307 | 0.0002 | 0.0029 |
| (d), (f) | 2658.12 | 2629.06 | 1053.09 | 263.17 | −0.4885 | 0.2479 | 0.0145 | −0.0042 |
| (e) | 2552.71 | 2543.47 | 979.48 | 434.32 | −0.4422 | 0.5115 | −0.0010 | 0.0042 |
| (e), (f) | 2621.56 | 2604.26 | 994.28 | 321.01 | −0.4891 | 0.2585 | 0.0120 | −0.0048 |
| (f) | 2770.83 | 2755.42 | 1027.54 | 124.33 | −0.4865 | 0.5711 | 0.0137 | 0.0018 |
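Figure 5 summarizes the spread of the estimates in Table 1 with box plots and the coefficient of variation (CoV). As a minimal sketch of that statistic, the CoV of the horizontal focal length Fx can be computed from the 21 estimates transcribed from Table 1 (the exact CoV values plotted in Figure 5 are not restated here):

```python
import statistics

# Fx estimates (px) for the 21 checkerboard sets in Table 1.
fx = [
    2654.95, 2634.64, 2578.67, 2586.03, 2598.73, 2363.73, 2705.33,
    2615.95, 2734.70, 2621.91, 2645.72, 2665.99, 2590.24, 2611.21,
    2584.71, 2716.70, 2585.51, 2658.12, 2552.71, 2621.56, 2770.83,
]

mean_fx = statistics.mean(fx)
# Coefficient of variation: sample standard deviation over the mean.
cov_fx = statistics.stdev(fx) / mean_fx
print(f"mean Fx = {mean_fx:.2f} px, CoV = {cov_fx:.3f}")
```

A CoV of a few percent across checkerboard sets indicates how sensitive the calibrated focal length is to the choice of calibration target, which motivates the GA-based tuning of the intrinsics.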
Table 2. Root mean square error (RMSE) of 6-DOF displacement estimation. In each error table, columns give the translation errors along X, Y, Z and the rotation errors about X, Y, Z; the ‘w/o GA’ row is followed by the errors of experimental results with ten different GA parameters.

Case 1 (−10 mm translational movement along the X-axis)

| RMSE | Translation | Rotation |
|---|---|---|
| w/o GA | 0.8485 | 0.7703 |
| w/ GA | 0.6122 (−28%) | 0.3020 (−61%) |

| Test | X | Y | Z | About X | About Y | About Z |
|---|---|---|---|---|---|---|
| w/o GA | −1.0 | 0.4 | 1.0 | 0.0 | −1.3 | 0.3 |
| 1-1 | 0.4 | −0.1 | 0.5 | 0.2 | −0.6 | 0.0 |
| 1-2 | 0.7 | 0.0 | 0.1 | 0.2 | −0.4 | 0.0 |
| 1-3 | 1.0 | −0.4 | 0.7 | 0.3 | −0.2 | 0.0 |
| 1-4 | 0.7 | −0.4 | 0.4 | 0.3 | −0.3 | 0.0 |
| 1-5 | −0.2 | −0.2 | 1.0 | 0.3 | −1.0 | 0.0 |
| 1-6 | 0.9 | −0.4 | 0.8 | 0.3 | −0.4 | 0.0 |
| 1-7 | 1.1 | −0.4 | 0.4 | 0.3 | −0.1 | 0.0 |
| 1-8 | 1.3 | −0.5 | 0.4 | 0.4 | 0.0 | −0.1 |
| 1-9 | 0.4 | −0.3 | 0.6 | 0.3 | −0.5 | 0.0 |
| 1-10 | 1.1 | −0.6 | 0.3 | 0.5 | −0.1 | 0.0 |

Case 2 (5° rotational movement about the Y-axis)

| RMSE | Translation | Rotation |
|---|---|---|
| w/o GA | 2.3188 | 0.8794 |
| w/ GA | 2.0538 (−11%) | 0.5255 (−40%) |

| Test | X | Y | Z | About X | About Y | About Z |
|---|---|---|---|---|---|---|
| w/o GA | 2.7 | −1.0 | −2.8 | 0.0 | −1.4 | 0.6 |
| 2-1 | 2.8 | 0.0 | −2.5 | −0.8 | −0.7 | 0.0 |
| 2-2 | 3.1 | 0.0 | −0.1 | −0.6 | −0.7 | −0.1 |
| 2-3 | 3.1 | −0.2 | −1.2 | −0.7 | −0.6 | 0.0 |
| 2-4 | 2.3 | 0.1 | −2.8 | −0.9 | −1.0 | 0.0 |
| 2-5 | 3.0 | 0.4 | 0.5 | −0.8 | −0.8 | −0.2 |
| 2-6 | 3.4 | −0.3 | −1.2 | −0.6 | −0.4 | 0.0 |
| 2-7 | 3.1 | −0.2 | −1.5 | −0.6 | −0.5 | 0.0 |
| 2-8 | 3.9 | −0.8 | −1.7 | −0.4 | −0.1 | 0.0 |
| 2-9 | 3.2 | −0.4 | −1.0 | −0.4 | −0.5 | 0.0 |
| 2-10 | 2.7 | −0.5 | −2.8 | −0.7 | −0.9 | 0.0 |

Case 3 (−30 mm translational movement along the Z-axis)

| RMSE | Translation | Rotation |
|---|---|---|
| w/o GA | 3.4113 | 0.6683 |
| w/ GA | 2.3104 (−32%) | 0.4607 (−31%) |

| Test | X | Y | Z | About X | About Y | About Z |
|---|---|---|---|---|---|---|
| w/o GA | 1.9 | 4.9 | 2.7 | 1.1 | 0.2 | −0.3 |
| 3-1 | 1.7 | 1.8 | −4.1 | 0.7 | −0.1 | −0.2 |
| 3-2 | 2.5 | 1.3 | −3.7 | 0.6 | −0.1 | −0.2 |
| 3-3 | 1.9 | 1.9 | −2.2 | 0.8 | −0.4 | −0.2 |
| 3-4 | 1.3 | 1.9 | −3.4 | 1.0 | −0.1 | −0.2 |
| 3-5 | 2.7 | 1.6 | −2.8 | 0.4 | −0.4 | −0.2 |
| 3-6 | 2.1 | 2.1 | −2.8 | 0.8 | 0.0 | −0.2 |
| 3-7 | 1.9 | 2.0 | −3.2 | 0.8 | 0.0 | −0.2 |
| 3-8 | 1.5 | 2.1 | −2.4 | 0.7 | −0.3 | −0.2 |
| 3-9 | 1.7 | 2.1 | −2.2 | 0.7 | −0.2 | −0.2 |
| 3-10 | 1.0 | 2.2 | −2.5 | 0.6 | −0.6 | −0.2 |
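The RMSE figures in Table 2 can be reproduced from the per-axis errors: the ‘w/o GA’ RMSE is the root mean square of the three baseline translation errors, and the ‘w/ GA’ value matches the average of the ten per-test RMSEs. A minimal Python sketch for the Case 1 translation column (values transcribed from the table):

```python
import math

def rmse(errors):
    """Root mean square of a list of per-axis errors."""
    return math.sqrt(sum(e * e for e in errors) / len(errors))

# Case 1 (−10 mm along X): baseline translation errors [X, Y, Z] without GA.
without_ga = [-1.0, 0.4, 1.0]

# Translation errors for the ten GA parameter sets (tests 1-1 ... 1-10).
with_ga = [
    [0.4, -0.1, 0.5], [0.7, 0.0, 0.1], [1.0, -0.4, 0.7],
    [0.7, -0.4, 0.4], [-0.2, -0.2, 1.0], [0.9, -0.4, 0.8],
    [1.1, -0.4, 0.4], [1.3, -0.5, 0.4], [0.4, -0.3, 0.6],
    [1.1, -0.6, 0.3],
]

print(round(rmse(without_ga), 4))                      # → 0.8485
mean_rmse = sum(rmse(t) for t in with_ga) / len(with_ga)
print(round(mean_rmse, 4))                             # → 0.6122
```

The same computation applied to the rotation columns and to Cases 2 and 3 reproduces the remaining RMSE entries of Table 2.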
Share and Cite

Min, J.; Bang, Y.; Bang, H.; Jeon, H. Port Structure Inspection Based on 6-DOF Displacement Estimation Combined with Homography Formulation and Genetic Algorithm. Appl. Sci. 2021, 11, 6470. https://0-doi-org.brum.beds.ac.uk/10.3390/app11146470