Sensors and Advanced Sensing Techniques for Signal, Image and Computer Vision Applications

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Physical Sensors".

Deadline for manuscript submissions: closed (20 November 2021) | Viewed by 9456

Special Issue Editors


Dr. Stavros Karkanis
Guest Editor
University of Thessaly, Greece
Interests: image processing; computer vision; artificial intelligence; deep learning; biomedical applications

Dr. Efthyvoulos Kyriacou
Guest Editor
Department of Computer Science and Computer Engineering and Informatics, Frederick University, Limassol, Cyprus
Interests: medical image processing; eHealth systems; biomedical systems; health telematics; artificial neural networks in medicine

Special Issue Information

Dear Colleagues,

Recent years have witnessed key technological advances driving multidisciplinary areas of IT systems and services built on sensors. The vision of future systems is underpinned by improving quality of life through ubiquitous access to services and by facilitating continuous monitoring and timely interventions, while reducing the associated costs and removing socioeconomic barriers to specialized applications. Empowering applications on the move and building on the concept of smart homes and cities are some examples of sensor applications. Capitalizing on the dynamics of emerging 5G wireless networks, miniaturized biosensors and nanotechnologies, mobile and pervasive computing including the Internet of Things (IoT), and cloud computing technologies are expected to reshape these standards over the next decades.

This Special Issue of Sensors, entitled “Sensors and Advanced Sensing Techniques for Signal, Image and Computer Vision Applications”, welcomes submissions on topics including, but not limited to:

  • Sensors for computer vision and video processing;
  • Specialized image sensing for remote sensing and weather-condition monitoring;
  • Sensors for biomedical signal retrieval and monitoring;
  • Optical image sensors;
  • Robotic vision;
  • Smart sensor technologies;
  • VLSI architectures for high-speed video processing;
  • Processes and materials for smart image sensors;
  • Emerging imaging sensor applications;
  • Sensors for resource monitoring (e.g., water, electricity);
  • Sensor networks (including IoT for video processing and related areas);
  • Sensor systems: signals, processing, and interfaces;
  • Sensing technologies for real-time image and video compression, clustering, and classification;
  • Sensors for image reconstruction, retrieval, modelling, and data sampling;
  • Applications using new imaging sensor technologies (e.g., biomedical image and video processing, logistics, security, wearables, etc.).

Dr. Stavros Karkanis
Dr. Efthyvoulos Kyriacou
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Image sensors
  • Computer vision
  • Advanced sensor architectures
  • Smart image sensors
  • Real-time imaging applications
  • Image sensor applications

Published Papers (4 papers)


Research

21 pages, 7114 KiB  
Article
Automated Aircraft Dent Inspection via a Modified Fourier Transform Profilometry Algorithm
by Pasquale Lafiosca, Ip-Shing Fan and Nicolas P. Avdelidis
Sensors 2022, 22(2), 433; https://doi.org/10.3390/s22020433 - 07 Jan 2022
Cited by 3 | Viewed by 3282
Abstract
The search for dents is a consistent part of the aircraft inspection workload. The engineer is required to find, measure, and report each dent over the aircraft skin. This process is not only hazardous, but also highly subject to human factors and environmental conditions. This study discusses the feasibility of automated dent scanning via a single-shot triangular stereo Fourier transform algorithm, designed to be compatible with the use of an unmanned aerial vehicle. The original algorithm is modified by introducing two main contributions. First, the automatic estimation of the pass-band filter removes user interaction from the phase filtering process. Second, the employment of a virtual reference plane reduces unwrapping errors, leading to improved accuracy independently of the chosen unwrapping algorithm. Static experiments reached a mean absolute error of 0.1 mm at a distance of 60 cm, while dynamic experiments showed 0.3 mm at a distance of 120 cm. On average, the mean absolute error decreased by 34%, proving the validity of the proposed single-shot 3D reconstruction algorithm and suggesting its applicability for future automated dent inspections.
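
The modified triangular stereo formulation is given in the paper itself; purely as a rough illustration of the underlying idea, the following minimal sketch shows classic single-shot Fourier transform profilometry in plain NumPy, with the band-pass filter centred automatically on the strongest non-DC spectral peak. The function name, the DC-suppression window, and the filter half-width are illustrative assumptions, not the authors' implementation.

import numpy as np

def ftp_wrapped_phase(fringe_image, band_halfwidth=None):
    """Recover the wrapped phase map of a sinusoidal fringe pattern."""
    rows, cols = fringe_image.shape
    spectrum = np.fft.fftshift(np.fft.fft2(fringe_image))

    # Automatic carrier estimation: strongest peak outside a small
    # window around the DC component.
    magnitude = np.abs(spectrum).copy()
    cy, cx = rows // 2, cols // 2
    magnitude[cy - 5:cy + 6, cx - 5:cx + 6] = 0.0      # suppress DC
    py, px = np.unravel_index(np.argmax(magnitude), magnitude.shape)

    # Band-pass filter centred on the carrier peak; the half-width is a
    # crude automatic choice when the caller does not provide one.
    if band_halfwidth is None:
        band_halfwidth = max(abs(px - cx), abs(py - cy)) // 2
    mask = np.zeros_like(magnitude)
    mask[py - band_halfwidth:py + band_halfwidth + 1,
         px - band_halfwidth:px + band_halfwidth + 1] = 1.0

    # Shift the selected side lobe to the origin and invert the FFT;
    # the angle of the resulting analytic signal is the wrapped phase.
    shifted = np.roll(spectrum * mask, (cy - py, cx - px), axis=(0, 1))
    analytic = np.fft.ifft2(np.fft.ifftshift(shifted))
    return np.angle(analytic)                          # in (-pi, pi]

The wrapped phase still has to be unwrapped and converted to height through the system calibration; those steps, and the virtual reference plane, are specific to the paper.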

9 pages, 3414 KiB  
Article
Estimation of Fluor Emission Spectrum through Digital Photo Image Analysis with a Water-Based Liquid Scintillator
by Ji-Won Choi, Ji-Young Choi and Kyung-Kwang Joo
Sensors 2021, 21(24), 8483; https://doi.org/10.3390/s21248483 - 20 Dec 2021
Cited by 1 | Viewed by 1580
Abstract
In this paper, we performed a feasibility study of using a water-based liquid scintillator (WbLS) for conducting imaging analysis with a digital camera. A liquid scintillator (LS) is made by dissolving a scintillating fluor in an organic base solvent so that it emits light. We synthesized a liquid scintillator using water as the solvent. In a WbLS, a suitable surfactant is needed to mix the water and oil together. As an application of the WbLS, we introduced a digital photo image analysis in color space. A demosaicing process to reconstruct and decode color is briefly described. We were able to estimate the emission spectrum of the fluor dissolved in the WbLS by analyzing the pixel information stored in the digital image. This technique offers the potential to estimate fluor components in the visible region without using an expensive spectrophotometer. In addition, sinogram analysis was performed with the Radon transform to reconstruct transverse images from longitudinal photo images of the WbLS sample.
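
The paper's full reconstruction pipeline is not reproduced here; as a small illustration of the sinogram step alone, the sketch below applies filtered back-projection with scikit-image's iradon to intensity profiles extracted from photos taken at different viewing angles. The array shapes, angle spacing, and function name are illustrative assumptions (a recent scikit-image release is assumed for the filter_name argument).

import numpy as np
from skimage.transform import iradon

def reconstruct_transverse_slice(sinogram, angles_deg):
    """Filtered back-projection of one transverse slice.

    sinogram   : 2D array, shape (n_detector_pixels, n_angles), one
                 intensity profile per acquisition angle, extracted
                 from the longitudinal photo images.
    angles_deg : acquisition angles in degrees, one per column.
    """
    sinogram = np.asarray(sinogram, dtype=float)
    return iradon(sinogram, theta=angles_deg, filter_name="ramp", circle=True)

# Hypothetical usage: 60 photos taken every 3 degrees, with one row of
# pixel intensities extracted from each image.
# slice_image = reconstruct_transverse_slice(profiles, np.arange(0, 180, 3))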

14 pages, 271126 KiB  
Article
A DNN-Based UVI Calculation Method Using Representative Color Information of Sun Object Images
by Deog-Hyeon Ga, Seung-Taek Oh and Jae-Hyun Lim
Sensors 2021, 21(22), 7766; https://doi.org/10.3390/s21227766 - 22 Nov 2021
Viewed by 1606
Abstract
As outdoor activities are necessary for maintaining our health, research interest in environmental conditions such as the weather, atmosphere, and ultraviolet (UV) radiation is increasing. In particular, UV radiation, which can benefit or harm the human body depending on the degree of exposure, is recognized as an essential environmental factor that needs to be identified. However, unlike the weather and atmospheric conditions, which can be identified to some extent by the naked eye, UV radiation corresponds to wavelength bands that humans cannot perceive; hence, its intensity cannot be judged directly. Although devices and sensors that can measure UV radiation have recently been launched, it is very difficult for ordinary users to acquire ambient UV radiation information directly because of the cost and inconvenience of operating separate devices. Herein, a deep neural network (DNN)-based ultraviolet index (UVI) calculation method is proposed using representative color information of sun object images. First, a Mask region-based convolutional neural network (Mask R-CNN) is applied to sky images to extract sun object regions and detect the representative color of those regions. Then, a deep learning model is constructed to calculate the UVI from the RGB values of the detected representative color, together with the altitude angle and azimuth of the sun at that time. After selecting one day each from spring and autumn, the performance of the proposed method was tested, and it was confirmed that the UVI could be calculated accurately, with a mean absolute error within 0.3.
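
The segmentation stage uses a standard Mask R-CNN, and the regression stage can be pictured as a small multilayer perceptron over five inputs. The PyTorch sketch below is a minimal illustration with made-up layer sizes, feature scaling, and placeholder data; it is not the network architecture or training setup reported by the authors.

import torch
from torch import nn

class UVIRegressor(nn.Module):
    """Map (R, G, B, solar altitude, solar azimuth) to a UV index value."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(5, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

# Hypothetical training step on features already scaled to [0, 1]:
model = UVIRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()               # mean absolute error, the metric quoted above
features = torch.rand(32, 5)        # [R, G, B, altitude, azimuth] per sample
uvi_labels = torch.rand(32) * 11.0  # placeholder UVI targets
loss = loss_fn(model(features), uvi_labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()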

18 pages, 9120 KiB  
Article
Long-Exposure RGB Photography with a Fixed Stand for the Measurement of a Trajectory of a Dynamic Impact Device in Real Scale
by Ľudovít Kovanič, Ľubomír Ambriško, Daniela Marasová, Peter Blišťan, Tomáš Kasanický and Michal Cehlár
Sensors 2021, 21(20), 6818; https://doi.org/10.3390/s21206818 - 14 Oct 2021
Cited by 7 | Viewed by 1950
Abstract
The present manuscript proposes a novel method for measuring the trajectory of a falling impact hammer during the dynamic loading of conveyor belts and determining their impact resistance. The proposed method has been experimentally tested, and the results of the measurements are presented in this manuscript. The method is based on long-exposure photography, keeping the shutter of a Nikon D5000 DSLR camera open for a long duration. The results of the experimental research were compared with direct reference measurements performed using the L-GAGE LT3 laser distance sensor. Differences between the values obtained by the new method and the reference measurements were up to ±3 mm. The standard deviation identified in all the experiments was 1 mm.
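
The streak recorded on the long-exposure frame has to be turned into a set of positions before it can be compared with the laser reference. As a loose illustration only, the sketch below takes the brightness-weighted centroid of the streak in every image row and converts pixels to millimetres with a known scale; the threshold and scale factor are made-up values, not calibration data from the experiments.

import numpy as np

def streak_trajectory(gray_image, threshold=0.8, mm_per_pixel=1.0):
    """Extract a fall trajectory from a single long-exposure frame.

    For every row, take the intensity-weighted centroid of the pixels
    brighter than `threshold` (as a fraction of the image maximum).
    Returns (vertical_mm, lateral_mm) pairs along the streak.
    """
    img = np.asarray(gray_image, dtype=float)
    bright = np.where(img >= threshold * img.max(), img, 0.0)
    cols = np.arange(img.shape[1])
    points = []
    for row, weights in enumerate(bright):
        total = weights.sum()
        if total > 0.0:                     # the streak crosses this row
            x_px = (weights * cols).sum() / total
            points.append((row * mm_per_pixel, x_px * mm_per_pixel))
    return np.array(points)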
