The Subsurface Objects Classification Using a Convolutional Neural Network

Mostafa El Saadouny, Jan Barowski, Ilona Rolfes

10th IEEE Annual Information Technology, Electronics and Mobile Communication Conference (IEMCON 2019), pp. 874-877, doi: 10.1109/IEMCON.2019.8936250, Vancouver, Canada, Oct 17-19, 2019


Abstract

Artificial intelligence has witnessed monumental growth in bridging the gap between the capabilities of humans and machines. Advances in computer vision with deep learning have been built primarily on one well-known algorithm, the convolutional neural network (CNN), which is widely regarded as the best-performing approach for image classification problems. In this paper, we present a CNN for classifying shallowly buried objects detected by a ground penetrating radar (GPR) system. The GPR is one of the most promising tools for investigating shallowly buried objects. One of the main problems that hinders GPR is the strong reflections from the surface and from other unwanted buried objects. The GPR therefore requires suitable image processing and clutter reduction algorithms to eliminate the clutter and enhance the object responses. The processed GPR images contain responses from different objects, and a CNN has been implemented to differentiate between them. The presented CNN consists of three convolutional layers, each of which applies filters that scan the whole input image. The output of each layer is passed through the rectified linear unit (ReLU) activation function followed by a max-pooling layer. The output of the last layer feeds the final output layer, which produces a probability for each class using the softmax function. The CNN has been trained on the processed GPR images, with optimization performed using the Adam optimizer. The obtained performance curves show a high degree of accuracy in classifying the GPR images.
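The architecture described above (three convolutional layers, each followed by ReLU activation and max-pooling, then a softmax output layer) can be sketched as a NumPy forward pass. This is a minimal illustration, not the authors' implementation: the 64×64 input size, the filter counts (8, 16, 32), and the three output classes are assumptions chosen for the example, and the weights are random rather than trained with Adam as in the paper.

```python
import numpy as np

def conv2d(x, kernels):
    """Valid 2-D convolution. x: (H, W, Cin), kernels: (k, k, Cin, Cout)."""
    k = kernels.shape[0]
    H, W, _ = x.shape
    out = np.zeros((H - k + 1, W - k + 1, kernels.shape[3]))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # correlate each k x k x Cin patch with all Cout filters at once
            out[i, j, :] = np.tensordot(x[i:i+k, j:j+k, :], kernels,
                                        axes=([0, 1, 2], [0, 1, 2]))
    return out

def relu(x):
    return np.maximum(x, 0.0)

def maxpool(x, p=2):
    """Non-overlapping p x p max-pooling; trailing rows/cols are dropped."""
    H, W, C = x.shape
    H2, W2 = H // p, W // p
    return x[:H2 * p, :W2 * p, :].reshape(H2, p, W2, p, C).max(axis=(1, 3))

def softmax(z):
    e = np.exp(z - z.max())  # shift for numerical stability
    return e / e.sum()

rng = np.random.default_rng(0)

# Hypothetical processed GPR B-scan: 64x64 grayscale (size is an assumption).
image = rng.standard_normal((64, 64, 1))

# Three conv stages with 3x3 filters; channel counts 8/16/32 are illustrative.
k1 = rng.standard_normal((3, 3, 1, 8)) * 0.1
k2 = rng.standard_normal((3, 3, 8, 16)) * 0.1
k3 = rng.standard_normal((3, 3, 16, 32)) * 0.1

# Dense output layer: 6x6x32 = 1152 features -> 3 hypothetical object classes.
W = rng.standard_normal((1152, 3)) * 0.01
b = np.zeros(3)

def forward(x):
    for K in (k1, k2, k3):                  # conv -> ReLU -> max-pool, three times
        x = maxpool(relu(conv2d(x, K)))
    return softmax(x.reshape(-1) @ W + b)   # class probabilities

probs = forward(image)
print(probs)  # one probability per class, summing to 1
```

With random weights the probabilities are near-uniform; training (e.g. with Adam, as in the paper) would shape the filters to respond to the hyperbolic object signatures in GPR images.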

[IEEE Library]
