Euclidean Graphs as Crack Pattern Descriptors for Automated Crack Analysis in Digital Images

Sensors (Basel). 2022 Aug 9;22(16):5942. doi: 10.3390/s22165942.

Abstract

Typical crack detection processes in digital images produce a binary-segmented image that constitutes the basis for all subsequent analyses. Binary images are, however, an unsatisfactory data format for advanced crack analysis algorithms due to their sparse nature and lack of meaningful data structuring. This work therefore proposes a new approach based on Euclidean graphs as functional crack pattern descriptors for all post-detection analyses. Conveying both geometrical and topological information in an integrated representation, Euclidean graphs are an ideal structure for efficient crack path description, as they precisely locate the cracks on the original image and capture the salient features of the crack skeleton. Several Euclidean graph-based algorithms for autonomous crack refining, correlation and analysis are described, offering significant advantages in both capability and implementation convenience over the traditional binary image-based approach. Moreover, Euclidean graphs allow the autonomous selection of specific cracks or crack parts based on objective criteria. Well-known performance metrics, namely precision, recall, intersection over union and F1-score, have been adapted for use with Euclidean graphs. The automated generation of Euclidean graphs from binary-segmented images is also reported, enabling the application of this technique to most existing detection methods (e.g., threshold-based or neural network-based) for cracks and other curvilinear features in digital images.
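
The abstract does not reproduce the authors' implementation, but the idea of deriving a Euclidean graph from a binary-segmented crack image can be sketched roughly as follows. The snippet below is an illustrative outline only, not the paper's code; the function name mask_to_euclidean_graph and the use of scikit-image and NetworkX are assumptions. The binary mask is skeletonized, skeleton endpoints and junctions become nodes carrying image coordinates, and the pixel runs between them become edges that store their full polyline and Euclidean length, so the resulting graph keeps both the topology and the geometry of the crack pattern.

    import numpy as np
    import networkx as nx
    from skimage.morphology import skeletonize

    def mask_to_euclidean_graph(mask):
        """Build a Euclidean graph from a binary crack mask (rough sketch)."""
        skel = skeletonize(mask.astype(bool))
        pixels = {tuple(p) for p in np.argwhere(skel)}

        def neighbours(p):
            r, c = p
            return [(r + dr, c + dc)
                    for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                    if (dr, dc) != (0, 0) and (r + dr, c + dc) in pixels]

        # Skeleton pixels that are endpoints (1 neighbour) or junctions (3+)
        # become graph nodes; their (row, col) coordinates locate them on the
        # original image.
        nodes = {p for p in pixels if len(neighbours(p)) != 2}
        graph = nx.Graph()   # use nx.MultiGraph to keep parallel branches
        graph.add_nodes_from(nodes)

        done = set()
        for start in nodes:
            for step in neighbours(start):
                if (start, step) in done:
                    continue
                # Walk along degree-2 skeleton pixels until another node is
                # reached, recording the full pixel polyline of the branch.
                path, prev, cur = [start], start, step
                while cur not in nodes:
                    path.append(cur)
                    prev, cur = cur, next(q for q in neighbours(cur) if q != prev)
                path.append(cur)
                done.add((cur, prev))   # avoid retracing from the far end
                # Euclidean length of the traced polyline as edge attribute.
                length = sum(np.hypot(a[0] - b[0], a[1] - b[1])
                             for a, b in zip(path, path[1:]))
                graph.add_edge(start, cur, pixels=path, length=length)
        # Note: isolated skeleton loops with no endpoint or junction are
        # ignored in this simplified sketch.
        return graph

On such a graph, post-detection tasks of the kind the abstract describes, such as pruning short spurious branches or selecting a specific crack path, reduce to standard graph operations, for example shortest-path queries with Dijkstra's algorithm, which is listed among the keywords.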

Keywords: Dijkstra’s algorithm; autonomous structure inspection; computer vision; crack path descriptor; cracked cement surface; graph algorithms; image segmentation; line-shaped feature description; performance evaluation indicators; structural damage assessment.

MeSH terms

  • Algorithms*
  • Neural Networks, Computer*

Grants and funding

This research received no external funding.