Topic Editors

Department of Electronic Engineering, National Formosa University, Yunlin City 632, Taiwan
The Graduate Institute of Science Education and the Department of Earth Sciences, National Taiwan Normal University (NTNU), Taipei, Taiwan
Director of the Cognitions Humaine et Artificielle Laboratory, University Paris 8, 93526 Saint-Denis, France
Department of Electrical Engineering, National Central University, Taoyuan 32001, Taiwan
Department of Chemical and Materials Engineering, National University of Kaohsiung, Kaohsiung, Taiwan

Application of IoT on Manufacturing, Communication and Engineering

Abstract submission deadline
closed (31 March 2024)
Manuscript submission deadline
closed (31 May 2024)
Viewed by
25153

Topic Information

Dear Colleagues,

The 2023 IEEE 5th Eurasia Conference on IoT, Communication and Engineering (IEEE ECICE 2023) will be held in Yunlin, Taiwan, on 27–29 October 2023, providing a unified communication platform for researchers in the fields of IoT and advanced manufacturing. Booming economic development in Asia, particularly in its leading manufacturing industries (automobile, machinery, computer, communication, consumer products, and flat-panel displays, through to semiconductors and micro/nano technologies), has attracted increasing attention from universities, research institutions, and industrial corporations. This conference aims to provide a broad international forum for researchers, engineers, and professionals from around the world working in the areas of IoT and manufacturing to discuss and exchange scientific, technical, and management perspectives across a wide spectrum of society. The theme of the conference is smart manufacturing, with a focus on new and emerging technologies. This Topic, "Application of IoT on Manufacturing, Communication and Engineering", includes five journals (Symmetry, Applied Sciences, Sensors, Coatings, and Energies), which will publish excellent papers in related fields. It enables interdisciplinary collaboration between science and engineering technologists in academia and industry, as well as international networking. Researchers with innovative ideas or research results in any aspect of advanced manufacturing are encouraged to submit their contributions.

Topics of interest include, but are not limited to, the following:

  • Internet and IoT technology;
  • Communication science and engineering;
  • Computer science and information technology;
  • Computational science and engineering;
  • Electrical and electronic engineering;
  • Mechanical and automation engineering;
  • Advanced machining and forming processes;
  • Micro- and nano-fabrication;
  • Surface manufacturing processes;
  • Gears manufacturing;
  • Bio-medical manufacturing;
  • Precision engineering measurement;
  • Robotics and automation;
  • Additive manufacturing technology;
  • Smart manufacturing technology for Industry 4.0;
  • Environmental sustainability.

Prof. Dr. Teen-Hang Meen
Prof. Dr. Chun-Yen Chang
Prof. Dr. Charles Tijus
Prof. Dr. Po-Lei Lee
Prof. Dr. Cheng-Fu Yang
Topic Editors

Keywords

  • Internet of Things
  • smart manufacturing
  • communication
  • micro- and nano-fabrication
  • engineering

Participating Journals

Journal Name (abbreviation) | Impact Factor | CiteScore | Launched Year | First Decision (median) | APC
Applied Sciences (applsci)  | 2.7 | 5.3 | 2011 | 16.9 days | CHF 2400
Coatings (coatings)         | 3.4 | 5.0 | 2011 | 13.8 days | CHF 2600
Energies (energies)         | 3.2 | 6.2 | 2008 | 16.1 days | CHF 2600
Sensors (sensors)           | 3.9 | 7.3 | 2001 | 17.0 days | CHF 2600
Symmetry (symmetry)         | 2.7 | 5.4 | 2009 | 16.2 days | CHF 2400

Preprints.org is a multidisciplinary platform providing a preprint service dedicated to sharing your research from the start and empowering your research journey.

MDPI Topics is cooperating with Preprints.org and has built a direct connection between MDPI journals and Preprints.org. Authors are encouraged to take advantage of these benefits by posting a preprint at Preprints.org prior to publication:

  1. Immediately share your ideas ahead of publication and establish your research priority;
  2. Protect your idea with a time-stamped preprint record;
  3. Enhance the exposure and impact of your research;
  4. Receive feedback from your peers in advance;
  5. Have it indexed in Web of Science (Preprint Citation Index), Google Scholar, Crossref, SHARE, PrePubMed, Scilit and Europe PMC.

Published Papers (19 papers)

23 pages, 4249 KiB  
Article
Rolling Bearing Remaining Useful Life Prediction Based on CNN-VAE-MBiLSTM
by Lei Yang, Yibo Jiang, Kang Zeng and Tao Peng
Sensors 2024, 24(10), 2992; https://0-doi-org.brum.beds.ac.uk/10.3390/s24102992 - 8 May 2024
Viewed by 473
Abstract
Ensuring precise prediction of the remaining useful life (RUL) of bearings in rolling machinery is crucial for preventing sudden machine failures and optimizing equipment maintenance strategies. Given the significant interference encountered in real industrial environments and the high complexity of the machining process, accurate and robust RUL prediction of rolling bearings is of great research importance. Hence, a novel RUL prediction model called CNN-VAE-MBiLSTM is proposed in this paper by integrating the advantages of a convolutional neural network (CNN), a variational autoencoder (VAE), and multiple bi-directional long short-term memory (MBiLSTM) networks. The proposed approach comprises a CNN-VAE model and an MBiLSTM model. The CNN-VAE model performs well at automatically extracting low-dimensional features from the time–frequency spectra of multi-axis signals, which simplifies feature construction and minimizes designers' subjective bias. Based on these features, the MBiLSTM model achieves commendable performance in bearing RUL prediction: it independently captures the sequential characteristics of the features in each axis and further captures differences among multi-axis features. The performance of the proposed approach is validated on an industrial case, and the results indicate that it exhibits higher accuracy and better anti-noise capacity in RUL prediction than comparable methods.
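The abstract above describes a CNN-VAE stage that compresses multi-axis time–frequency spectra into low-dimensional latent features. As a minimal illustration of the core VAE sampling step (the reparameterization trick that produces such latent features), with toy inputs and dimensions that are assumptions, not the authors' actual architecture:

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    """Sample latent features z = mu + sigma * eps (the VAE reparameterization trick).

    mu, log_var: arrays of shape (batch, latent_dim) that an encoder would
    produce; here they are toy inputs, not outputs of the paper's CNN encoder.
    """
    sigma = np.exp(0.5 * log_var)          # standard deviation from log-variance
    eps = rng.standard_normal(mu.shape)    # external noise keeps sampling differentiable w.r.t. mu/sigma
    return mu + sigma * eps

rng = np.random.default_rng(0)
mu = np.zeros((4, 8))       # toy batch of 4 samples, 8 latent features
log_var = np.zeros((4, 8))  # unit variance
z = reparameterize(mu, log_var, rng)
print(z.shape)  # (4, 8)
```

In a full pipeline, `z` would feed the downstream BiLSTM layers as the per-axis feature sequence.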

11 pages, 5282 KiB  
Article
Effect of Photoanode Process Sequence on Efficiency of Dye-Sensitized Solar Cells
by Tian-Chiuan Wu, Wei-Ming Huang, Jenn-Kai Tsai, Cheng-En Chang and Teen-Hang Meen
Coatings 2024, 14(3), 304; https://0-doi-org.brum.beds.ac.uk/10.3390/coatings14030304 - 29 Feb 2024
Cited by 1 | Viewed by 982
Abstract
Owing to its contribution to carbon emission reduction, green energy has received widespread attention. Among green energy sources, solar energy is regarded as the most important. In solar energy production, dye-sensitized solar cells (DSSCs), a third-generation solar cell technology, have been favored owing to their simple manufacturing and high efficiency. DSSCs are prospective candidates for powering indoor Internet of Things (IoT) devices. In this study, to find a method of enhancing DSSC efficiency, the advantages and disadvantages of the screen printing method and the mechanical pressing and annealing method were analyzed. Using an improved method, a TiO2 photoanode was processed and annealed, and DSSCs with this photoanode showed an efficiency increase from 1.10% to 4.78%.

36 pages, 2666 KiB  
Review
Real-Time Remote Patient Monitoring: A Review of Biosensors Integrated with Multi-Hop IoT Systems via Cloud Connectivity
by Raihan Uddin and Insoo Koo
Appl. Sci. 2024, 14(5), 1876; https://0-doi-org.brum.beds.ac.uk/10.3390/app14051876 - 25 Feb 2024
Cited by 2 | Viewed by 3241
Abstract
This comprehensive review paper explores the intricate integration of biosensors with multi-hop Internet of Things (IoT) systems, representing a paradigm shift in healthcare through real-time remote patient monitoring. The strategic deployment of biosensors at different locations in medical facilities, intricately connected to multiple microcontrollers, serves as a cornerstone in the establishment of robust multi-hop IoT networks. This paper highlights the role of such a multi-hop IoT network, which efficiently facilitates the seamless transmission of vital health data to a centralized server. Crucially, cloud connectivity emerges as a linchpin in this integration, providing a secure and scalable platform for remote patient monitoring. This cloud-based approach not only improves the accessibility of critical health information but also transcends physical limitations, allowing healthcare providers to monitor patients in real time from any location. Overall, the review demonstrates the transformative potential of this integration in overcoming traditional healthcare limitations through real-time remote patient monitoring.

21 pages, 9944 KiB  
Article
Communication Delay Outlier Detection and Compensation for Teleoperation Using Stochastic State Estimation
by Eugene Kim, Myeonghwan Hwang, Taeyoon Lim, Chanyeong Jeong, Seungha Yoon and Hyunrok Cha
Sensors 2024, 24(4), 1241; https://0-doi-org.brum.beds.ac.uk/10.3390/s24041241 - 15 Feb 2024
Viewed by 706
Abstract
There have been numerous studies attempting to overcome the limitations of current autonomous driving technologies. However, there is no doubt that it is challenging to guarantee safety in urban driving scenarios and dynamic driving environments. Among the reported countermeasures for supplementing the uncertain behavior of autonomous vehicles, teleoperation has been introduced to deal with disengagements of autonomous driving. However, teleoperation can lead the vehicle into unforeseen and hazardous situations from the viewpoint of wireless communication stability. In particular, communication delay outliers that severely deviate from the passive communication delay should be highlighted, because they can impair the teleoperator's awareness of the monitored circumstances, or the control signal can be contaminated regardless of the teleoperator's intention. In this study, communication delay outliers were detected and classified based on a stochastic approach (passive delays and outliers were estimated at 98.67% and 1.33%, respectively). The results indicate that communication delay outliers can be detected automatically, independently of the real-time quality of the wireless link. Moreover, the proposed framework demonstrates resilience against outliers, thereby mitigating potential performance degradation.
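The abstract does not specify the paper's stochastic estimator, but the idea of splitting delay samples into a "passive" majority and rare outliers can be sketched with a generic z-score rule. Everything below (the 3-sigma threshold, the delay distributions) is an illustrative assumption, not the authors' method:

```python
import numpy as np

def classify_delays(delays_ms, k=3.0):
    """Split communication delay samples into passive delays and outliers.

    Flags samples more than k standard deviations from the mean. This is a
    generic z-score sketch, not the paper's actual stochastic estimator.
    """
    delays = np.asarray(delays_ms, dtype=float)
    mu, sigma = delays.mean(), delays.std()
    outlier_mask = np.abs(delays - mu) > k * sigma
    return delays[~outlier_mask], delays[outlier_mask]

rng = np.random.default_rng(42)
normal = rng.normal(50, 5, size=990)   # typical network delays (ms), assumed
spikes = rng.normal(400, 50, size=10)  # rare severe delay spikes, assumed
passive, outliers = classify_delays(np.concatenate([normal, spikes]))
print(len(outliers))  # roughly the 10 injected spikes are flagged
```

The ratio `len(outliers) / 1000` plays the role of the 1.33% outlier share reported in the abstract.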

13 pages, 3978 KiB  
Article
An IoT Real-Time Potable Water Quality Monitoring and Prediction Model Based on Cloud Computing Architecture
by Rita Wiryasaputra, Chin-Yin Huang, Yu-Ju Lin and Chao-Tung Yang
Sensors 2024, 24(4), 1180; https://0-doi-org.brum.beds.ac.uk/10.3390/s24041180 - 11 Feb 2024
Viewed by 1936
Abstract
In order to achieve the Sustainable Development Goals (SDGs), it is imperative to ensure the safety of drinking water. The characteristics of drinkable water, encompassing taste, aroma, and appearance, are unique. Inadequate water infrastructure and treatment can affect these features and may also threaten public health. This study utilizes the Internet of Things (IoT) to develop a monitoring system, particularly for water quality, to reduce the risk of contracting diseases. Water quality data, such as water temperature, alkalinity or acidity, and contaminants, were obtained through a series of linked sensors. An Arduino microcontroller board acquired all the data, and Narrow Band-IoT (NB-IoT) transmitted them to a web server. Because human resources to observe water quality physically are limited, the monitoring was complemented by real-time alert notifications via a text messaging application. The water quality data were monitored using Grafana in web mode, and binary machine learning classifiers were applied to predict whether the water was drinkable based on the collected data, which were stored in a database. Both decision-tree and non-decision-tree models were evaluated within the artificial intelligence framework. With a split of 60% of the data for training, 20% for validation, and 10% for testing, the decision tree (DT) model outperformed the Gradient Boosting (GB), Random Forest (RF), Neural Network (NN), and Support Vector Machine (SVM) approaches. Based on the monitoring and prediction results, the authorities can sample the water sources every two weeks.
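The potability prediction described above maps sensor readings (temperature, pH, contaminants) to a binary drinkable/not-drinkable label. A decision-tree classifier learned from data reduces to nested threshold tests; the sketch below hard-codes illustrative thresholds as a stand-in, since the paper's trained DT model and its actual split points are not given in the abstract:

```python
def is_drinkable(temperature_c, ph, turbidity_ntu):
    """Toy decision-tree-style potability check.

    The thresholds are illustrative guesses, not the trained DT model from
    the paper (which was learned from the collected sensor database).
    """
    if not (6.5 <= ph <= 8.5):   # outside a typical drinking-water pH range
        return False
    if turbidity_ntu > 5.0:      # visibly contaminated
        return False
    if temperature_c > 35.0:     # unusually warm supply water
        return False
    return True

samples = [(22.0, 7.2, 1.0), (22.0, 4.0, 1.0), (22.0, 7.0, 9.0)]
print([is_drinkable(*s) for s in samples])  # [True, False, False]
```

In the deployed system, such a predicate would run server-side on each NB-IoT sensor report and trigger the text-message alert on a `False` result.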

18 pages, 10065 KiB  
Article
Chatter Stability Prediction for Deep-Cavity Turning of a Bent-Blade Cutter
by Xiaojuan Wang, Qinghua Song and Zhanqiang Liu
Sensors 2024, 24(2), 606; https://0-doi-org.brum.beds.ac.uk/10.3390/s24020606 - 18 Jan 2024
Viewed by 653
Abstract
The bent-blade cutter is widely used in machining typical deep-cavity parts such as turbine discs and disc shafts, but few scholars have studied the dynamics of its turning process. The existing mechanism of regenerative chatter in metal cutting does not consider the influence of bending and torsional vibration, the change of tool profile, or the complex machining geometry, so it cannot be used directly to reveal the underlying cause of chatter in the turning of deep-inner-cavity parts. This paper investigates the dynamics of the bent-blade cutter turning process. A dynamic model of the bent-blade cutter is proposed that accounts for the regenerative chatter effect. Based on extended Timoshenko beam element (E-TBM) theory and the finite element method (FEM), the coupling between bending and torsional vibrations, as well as the dynamic cutting forces, is modeled along the turning path. The bending-torsion vibration characteristics of the cutter board and cutter bar, together with the governing dynamical equation, are analyzed theoretically, and the chatter stability of a bent-blade cutter under the combined bending-torsion effect is predicted for the turning process. A series of turning experiments verifies the accuracy and efficiency of the presented model. Furthermore, the influence of cutting parameters on the cutting process is analyzed, and the results can be used to optimize cutting parameters to suppress machining vibration and improve process stability.

20 pages, 10946 KiB  
Article
The Implementation of a Gesture Recognition System with a Millimeter Wave and Thermal Imager
by Yi-Lin Cheng, Wen-Hsiang Yeh and Yu-Ping Liao
Sensors 2024, 24(2), 581; https://0-doi-org.brum.beds.ac.uk/10.3390/s24020581 - 17 Jan 2024
Viewed by 956
Abstract
During the COVID-19 pandemic, the number of cases continued to rise, creating a growing demand for alternatives to traditional buttons and touch screens. Most current gesture recognition technologies rely on machine vision, but that approach can yield suboptimal recognition results, especially when the camera operates in low-light conditions or encounters complex backgrounds. This study introduces an innovative gesture recognition system for large movements that combines millimeter-wave radar with a thermal imager, in which a multi-color conversion algorithm improves palm recognition on the thermal imager and deep learning approaches improve accuracy. While the user performs gestures, the mmWave radar captures point cloud information, which is then analyzed through neural network inference. The system also integrates thermal imaging and palm recognition to effectively track and monitor hand movements on the screen. The results suggest that this combined method significantly improves accuracy, reaching a rate of over 80%.

12 pages, 4793 KiB  
Article
Research on Low-Density Parity-Check Decoding Algorithm for Reliable Transmission in Satellite Communications
by Yuanzhi He and Yaoling Wang
Appl. Sci. 2024, 14(2), 746; https://0-doi-org.brum.beds.ac.uk/10.3390/app14020746 - 15 Jan 2024
Viewed by 791
Abstract
Satellite communications face difficulties such as intensified environmental attenuation, dynamic time-varying links, and diverse business scenarios, which usually require channel coding schemes with high coding gain and high throughput. Low-density parity-check (LDPC) codes are dominant in satellite communication coding schemes due to their excellent performance in approaching the Shannon limit and their suitability for parallel computing. The traditional weighted Algorithm B decoding algorithm ignores the channel received information and involves frequent multiplication operations and iterations; the proposed algorithm instead introduces the channel received information through hard-decision and constellation-mapping processing. Meanwhile, we design a correlated reliability between the extrinsic information and the mapping-processing information to improve decoding correctness. The multiplication operations in the iterative process can be replaced by a simple sum of the Hamming distance coefficient, the correlated reliability, and the extrinsic information frequency, thereby reducing the complexity and storage load of the system. The simulation results show that the presented MRAI-LDPC algorithm obtains about 0.4 dB of performance gain, and the average number of iterations is reduced by 68% at low SNR. The algorithm achieves better error-correcting performance and higher throughput, providing strong support for reliable transmission in satellite communications.
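Algorithm B belongs to the family of hard-decision bit-flipping LDPC decoders: each iteration counts, per bit, how many unsatisfied parity checks the bit participates in, and flips the most-suspected bit. The sketch below shows that generic family on a tiny (7,4) Hamming parity-check matrix standing in for an LDPC matrix; it is not the paper's MRAI-LDPC algorithm, which additionally weights flips by channel reliability:

```python
import numpy as np

def bit_flip_decode(H, r, max_iters=10):
    """Hard-decision bit-flipping decoding (the family Algorithm B belongs to).

    H: binary parity-check matrix; r: received hard-decision bits. Each
    iteration flips the bit participating in the most unsatisfied checks.
    """
    c = r.copy()
    for _ in range(max_iters):
        syndrome = H.dot(c) % 2      # 1 marks an unsatisfied parity check
        if not syndrome.any():
            break                    # valid codeword reached
        votes = syndrome.dot(H)      # unsatisfied-check count per bit
        c[np.argmax(votes)] ^= 1     # flip the most-suspected bit
    return c

# (7,4) Hamming parity checks as a tiny stand-in for an LDPC matrix
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
received = np.zeros(7, dtype=int)
received[2] = 1                      # single bit error on the all-zero codeword
print(bit_flip_decode(H, received))  # [0 0 0 0 0 0 0]
```

Note that the inner loop uses only additions and comparisons, which is the kind of multiplication-free iteration the abstract's complexity argument relies on.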

13 pages, 5625 KiB  
Article
Investigation of Laser Ablation Quality Based on Data Science and Machine Learning XGBoost Classifier
by Chien-Chung Tsai and Tung-Hon Yiu
Appl. Sci. 2024, 14(1), 326; https://0-doi-org.brum.beds.ac.uk/10.3390/app14010326 - 29 Dec 2023
Viewed by 982
Abstract
This work proposes a matching data science approach for studying the laser ablation quality, reb, of Si3N4 film based on supervised machine learning classifiers in the CMOS-MEMS process. The study demonstrates that there exists an energy threshold, Eth, for laser ablation. If the laser energy surpasses this threshold, increasing the interval time will not contribute significantly to the recovery of pulse laser energy, and thus reb enhancement is limited. When the energy is greater than 0.258 mJ, there exists for each energy level a critical interval time at which the reb value is relatively low. In addition, the variation of reb, Δreb, is independent of the interval time at the invariant point of energy between 0.32 mJ and 0.36 mJ. Energy and interval time exhibit Pearson correlations of 0.82 and 0.53 with reb, respectively. To keep Δreb below 0.15, green laser ablation of Si3N4 at operating energies of 0.258–0.378 mJ can adopt an interval time of the initial baseline multiplied by 1/∜2; for operating energies of 0.288–0.378 mJ, Δreb can be kept below 0.1. With the forced partition methods, namely the k-means method and the percentile method, the XGBoost (v2.0.3) classifier maintains competitive accuracy across test sizes of 0.20–0.40, outperforming the Random Forest and Logistic Regression machine learning algorithms, with the highest accuracy of 0.78 at a test size of 0.20.
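The abstract reports Pearson correlations of 0.82 and 0.53 between the process parameters and reb. The coefficient itself is just covariance normalized by both standard deviations; the sketch below computes it with NumPy on fabricated toy values (not the paper's measurements) standing in for (energy, reb) pairs:

```python
import numpy as np

# Toy process data standing in for (energy, ablation quality reb);
# the values are fabricated for illustration, not the paper's measurements.
energy = np.array([0.26, 0.29, 0.32, 0.36, 0.38])  # mJ
reb    = np.array([0.40, 0.55, 0.70, 0.85, 0.95])

# Pearson correlation: covariance normalized by both standard deviations
r = np.corrcoef(energy, reb)[0, 1]
print(round(r, 3))  # close to 1 for this nearly linear toy relationship
```

Applied to the real dataset, the same call would reproduce the 0.82 (energy) and 0.53 (interval time) figures quoted above.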

14 pages, 6939 KiB  
Article
Ionic Storage Materials for Anodic Discoloration in Electrochromic Devices
by Po-Wen Chen, Chen-Te Chang and Po-Hsiu Kuo
Energies 2023, 16(24), 8119; https://0-doi-org.brum.beds.ac.uk/10.3390/en16248119 - 17 Dec 2023
Viewed by 1093
Abstract
The ion storage layer in electrochromic devices (ECDs) stores protons or lithium ions to provide electrochemical stability and extend cycle durability. This paper reports on the performance and stability of ECDs paired with various ion storage layers (NiO, V2O5, and IrO2 films). The complementary ECD using a V2O5 ion storage layer presented the fastest response time but the lowest optical contrast. The IrO2 layer proved the most effective ion storage layer, owing to its high optical modulation capability and long-term stability. Chronoamperometry analysis revealed that the IrO2-based ECD (glass/IZTO/WO3/liquid electrolyte/IrO2/IZTO/glass) can be highly effective in modulating optical transmittance, as indicated by ΔT = 61.5% (from Tbleaching (69.6%) to Tcoloring (8.1%)) and switching times of 5.3 s for coloring and 7.3 s for bleaching at 633 nm.

23 pages, 7499 KiB  
Article
Research on Optimum Charging Current Profile with Multi-Stage Constant Current Based on Bio-Inspired Optimization Algorithms for Lithium-Ion Batteries
by Shun-Chung Wang and Zhi-Yao Zhang
Energies 2023, 16(22), 7641; https://0-doi-org.brum.beds.ac.uk/10.3390/en16227641 - 17 Nov 2023
Cited by 1 | Viewed by 894
Abstract
Compared with the conventional constant-current constant-voltage (CC-CV) charging method, the multi-stage constant-current (MSCC) charging method offers advantages such as rapid charging speed and high charging efficiency. However, MSCC must find the optimal charging current profile (OCCP) in order to achieve these benefits. Hence, in this paper, five bio-inspired optimization algorithms (BIOAs), including particle swarm optimization (PSO), modified PSO (MPSO), grey wolf optimization (GWO), modified GWO (MGWO), and the jellyfish search algorithm (JSA), are applied to the problem of searching for the OCCP of the MSCC. The solution-finding procedure runs on the MATLAB platform and minimizes an objective function that combines charging time (CT) and energy loss (EL) with a proportional weight. Without requiring numerous time-consuming charge-and-discharge experiments, a wide-ranging search can be performed quickly using only the established battery equivalent circuit model (ECM). The theoretical derivation and its correctness are confirmed via simulation and experimental results, which demonstrate that the OCCPs obtained with the devised charging strategies achieve the shortest CT and the best charging efficiency (CE); among them, MPSO has the best fitness value (FV). Compared with the traditional CC-CV method, the experimental results show that the maximum improvement rates (IRs) of the studied approaches in terms of six charging performance evaluation indicators (CPEIs), namely CT, charging capacity (CHC), CE, charging energy (CWh), average temperature rise (ATR), and FV, are 21.10%, 0.40%, 0.24%, 2.85%, 18.86%, and 68.99%, respectively. Furthermore, according to the comprehensive evaluation with the CPEIs, the top three methods with the best overall performance are JSA, MPSO, and GWO.
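The search described above minimizes a weighted combination of charging time and energy loss over candidate current profiles evaluated on a battery model. The sketch below uses plain random search over a 3-stage profile and a deliberately crude per-stage model; the stage count, current bounds, resistance, and weight are all illustrative assumptions standing in for the paper's PSO/GWO/JSA variants and calibrated ECM:

```python
import random

def charge_metrics(currents, capacity_ah=2.0, r_int=0.05):
    """Crude per-stage model: each stage delivers an equal share of capacity.

    CT (h) = sum(q_stage / I); EL (Wh) = sum(I^2 * R * t). A toy stand-in
    for the paper's equivalent circuit model, not a calibrated battery model.
    """
    q_stage = capacity_ah / len(currents)
    ct = sum(q_stage / i for i in currents)
    el = sum(i * i * r_int * (q_stage / i) for i in currents)
    return ct, el

def objective(currents, w=0.5):
    """Proportionally weighted CT/EL objective, as in the abstract."""
    ct, el = charge_metrics(currents)
    return w * ct + (1 - w) * el

random.seed(1)
# MSCC convention: stage currents in decreasing order, bounds assumed
candidates = [sorted((random.uniform(0.5, 4.0) for _ in range(3)), reverse=True)
              for _ in range(200)]
best = min(candidates, key=objective)
print(best, round(objective(best), 4))
```

A BIOA such as PSO replaces the blind sampling of `candidates` with velocity-guided updates, but the objective evaluation per profile is the same.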

16 pages, 9569 KiB  
Article
Enhancing UAV Visual Landing Recognition with YOLO’s Object Detection by Onboard Edge Computing
by Ming-You Ma, Shang-En Shen and Yi-Cheng Huang
Sensors 2023, 23(21), 8999; https://0-doi-org.brum.beds.ac.uk/10.3390/s23218999 - 6 Nov 2023
Cited by 5 | Viewed by 1569
Abstract
A visual camera system combined with a UAV onboard edge computer should provide efficient object detection, a high frame rate for the object of interest, and a wide search capability for the gimbal camera to find an emergency landing platform and support future reconnaissance missions. This paper proposes an approach to enhance the visual capabilities of such a system using You Only Look Once (YOLO)-based object detection (OD) accelerated with TensorRT™, an automated visual tracking gimbal camera control system, and multithreaded programming for image transmission to the ground station. With lightweight edge computing (EC), a satisfactory mean average precision (mAP) was achieved along with a higher frames-per-second (FPS) rate via YOLO accelerated with TensorRT onboard the UAV. Four YOLO models were first compared for recognizing objects of interest as landing spots at the home university. The dataset trained with YOLOv4-tiny was then successfully applied to another field more than 100 km away, demonstrating the system's capability to accurately recognize a different landing point in new and unknown environments. The proposed approach substantially reduces data transmission and processing time to ground stations with automated visual tracking gimbal control, and enables rapid OD at more than 35 FPS on an NVIDIA Jetson™ Xavier NX onboard the UAV. The enhanced visual landing and future reconnaissance mission capabilities of real-time UAVs were demonstrated.
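Comparing YOLO models by mAP, as described above, rests on scoring each predicted box against ground truth with intersection-over-union (IoU). The sketch below is that standard generic measure, not the authors' code:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2).

    The standard overlap score used to match YOLO-style detections to
    ground truth when computing mAP; generic utility, not the paper's code.
    """
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

print(iou((0, 0, 10, 10), (5, 0, 15, 10)))  # 1/3 for these half-overlapping boxes
```

A detection typically counts as a true positive when its IoU with a ground-truth box exceeds a threshold such as 0.5.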

14 pages, 4034 KiB  
Article
Seamless Industry 4.0 Integration: A Multilayered Cyber-Security Framework for Resilient SCADA Deployments in CPPS
by Eric Wai and C. K. M. Lee
Appl. Sci. 2023, 13(21), 12008; https://0-doi-org.brum.beds.ac.uk/10.3390/app132112008 - 3 Nov 2023
Cited by 2 | Viewed by 1709
Abstract
The increased connectivity and automation capabilities of Industry 4.0 cyber-physical production systems (CPPS) create significant cyber-security vulnerabilities in supervisory control and data acquisition (SCADA) environments if robust protections are not properly implemented. Legacy industrial control systems and new IP-enabled sensors, instruments, controllers, and appliances often lack basic safeguards like encryption, rigorous access controls, and endpoint security. This exposes manufacturers to substantial risks of cyberattacks that could manipulate, disrupt, or disable critical physical assets and processes related to their production lines and facilities. This study presents a multilayered cybersecurity framework to address these challenges and harden SCADA environments by implementing granular access controls, network micro-segmentation, anomaly detection, encrypted communications, and legacy system upgrades. The multilayered defense-in-depth (DID) approach combines policies, processes, and technologies to counter emerging vulnerabilities. The methodology was implemented in an electronics manufacturing facility across access control, zoning, monitoring, and encryption scenarios. Results show security improvements, including 57.4% fewer unauthorized access events, 41.2% faster threat containment, and 79.2% fewer hacking attempts. The quantified metrics highlight the CPPS resilience and threat mitigation capabilities enabled by the securely designed SCADA architecture, which allows manufacturers to confidently pursue Industry 4.0 integration and digital transformation with minimized disruption.

29 pages, 2761 KiB  
Review
A Systematic Mapping: Exploring Internet of Everything Technologies and Innovations
by Fazlina Mohd Ali, Nur Arzilawati Md Yunus, Nur Nabila Mohamed, Marizuana Mat Daud and Elankovan A. Sundararajan
Symmetry 2023, 15(11), 1964; https://0-doi-org.brum.beds.ac.uk/10.3390/sym15111964 - 24 Oct 2023
Cited by 1 | Viewed by 1700
Abstract
The Internet of Everything (IoE) represents a paradigm shift in the world of connectivity. While the Internet of Things (IoT) initiated the era of interconnected devices, the IoE takes this concept to new heights by interlinking objects, individuals, data, and processes. Symmetry in IoE innovation and technology is essential for creating a harmonious and efficient ecosystem to ensure that the benefits are accessible to a broad spectrum of society while minimizing potential drawbacks. This comprehensive review paper explores the multifaceted landscape of the IoE, delving into its core concepts, enabling technologies, real-world applications, and the intricate web of challenges it presents. A focal point of this review is the diverse array of real-world applications spanning healthcare, smart cities, industry 4.0, agriculture, and sustainability. Previous works and examples illustrate how the IoE reshapes these domains, leading to greater efficiency, sustainability, and improved decision making. However, the transformative power of the IoE is accompanied by a host of challenges, including security and privacy concerns, interoperability issues, and the ethical implications of ubiquitous connectivity. These challenges are dissected in order to comprehensively understand the obstacles and potential solutions in the IoE landscape. As we stand on the cusp of an IoE-driven future, this review paper serves as a valuable resource for researchers, policy makers, and industry professionals seeking to navigate the complexities of this emerging paradigm. By illuminating the intricacies of the IoE, this review fosters a deeper appreciation for the transformative potential and the multifaceted challenges that lie ahead in the Internet of Everything era. Full article

13 pages, 833 KiB  
Article
Development of an Industrial Safety System Based on Voice Assistant
by Jaime Paúl Ayala Taco, Oswaldo Alexander Ibarra Jácome, Jaime Luciano Ayala Pico and Brian Andrés López Castro
Appl. Sci. 2023, 13(21), 11624; https://0-doi-org.brum.beds.ac.uk/10.3390/app132111624 - 24 Oct 2023
Viewed by 881
Abstract
Currently, there are limitations in the human–machine interfaces (HMIs) used in industry, due either to the characteristics of users’ cognitive abilities or to the interfaces themselves, which hinder communication and interaction between humans and equipment. For this reason, this work presents an alternative interaction model based on a voice assistant, Alexa, which promotes more natural, intuitive, direct, and understandable communication. The purpose of this work is the development of an industrial safety system for a controlled electric motor based on the Alexa voice assistant, which allows the monitoring of operating parameters such as phase current, housing temperature, and rotor vibration, and makes it possible to start and shut down the motor and change its direction of rotation, with a prior password required as a safety measure. Commercial smart devices and Arduino-compatible modules were used to achieve this, providing them with Internet of Things (IoT) connectivity. In addition, several software platforms, such as Blynk, Tuya Smart, Node-RED, and Voiceflow, are used to perform data transmission, device management, and programming of the Alexa skill that executes the security and run system. This demonstrates the potential of voice assistants in industry to deliver information to humans more naturally and to provide useful notifications. However, problems were evidenced, such as the influence of environmental noise when communicating with the assistant, the vocalization of words, low voice tones, and accents typical of the language; addressing these issues would increase the security level of the system and prevent potential identity theft.
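The password-gated start/stop/reverse logic described in this abstract can be sketched as follows. This is a hypothetical Python model of the control step only, not the authors' Voiceflow/Alexa skill; the password, command names, and responses are assumptions for illustration.

```python
# Hypothetical sketch of a password-gated motor command handler, in the spirit
# of the safety measure described in the abstract. Not the authors' skill code.
import hashlib

# Assumed shared secret; stored as a hash so the plaintext never lives in code.
MOTOR_PASSWORD_HASH = hashlib.sha256(b"demo-password").hexdigest()

class MotorController:
    def __init__(self) -> None:
        self.running = False
        self.direction = "forward"

    def handle_command(self, command: str, password: str) -> str:
        # Every state-changing command requires the prior password.
        if hashlib.sha256(password.encode()).hexdigest() != MOTOR_PASSWORD_HASH:
            return "access denied"
        if command == "start":
            self.running = True
            return "motor started"
        if command == "stop":
            self.running = False
            return "motor stopped"
        if command == "reverse":
            self.direction = "reverse" if self.direction == "forward" else "forward"
            return f"rotation set to {self.direction}"
        return "unknown command"
```

In a real deployment the voice platform would resolve the spoken utterance to `command` and `password` before this logic runs, which is exactly where the noise and vocalization issues noted above come into play.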

15 pages, 7893 KiB  
Article
Technical Solution for Monitoring Climatically Active Gases Using the Turbulent Pulsation Method
by Ekaterina Kulakova and Elena Muravyova
Sensors 2023, 23(20), 8645; https://0-doi-org.brum.beds.ac.uk/10.3390/s23208645 - 23 Oct 2023
Viewed by 1096
Abstract
This article introduces a technical solution for investigating the movement of gases in the atmosphere through the turbulent pulsation method. A comprehensive control system was developed to measure and record the concentrations of carbon dioxide and methane, temperature, humidity, atmospheric air pressure, and wind direction and speed in the vertical plane. The selection and validation of sensor types and brands for each parameter, along with the system for data collection, registration, and device monitoring, were meticulously executed. The AHT21 + ENS160 sensor was chosen for temperature measurement, the BME680 was identified as the optimal sensor for humidity and atmospheric pressure control, the Eu-M-CH4-OD was designated for methane gas analysis, and the CM1107N for carbon dioxide measurements. Wind direction and speed are best measured with the SM5386V anemometer. The control system utilizes an Arduino controller, and software was developed for the multicomponent gas analyzer.
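The turbulent pulsation (eddy covariance) method that this logged data feeds computes the vertical gas flux as the covariance of vertical wind speed and gas concentration fluctuations. A minimal sketch of that calculation follows; the sample values are illustrative, not measurements from the article.

```python
# Sketch of the turbulent-pulsation (eddy-covariance) flux calculation:
# flux = mean(w' * c'), where w' and c' are deviations of vertical wind speed
# and gas concentration from their series means. Sample data are illustrative.

def turbulent_flux(w: list[float], c: list[float]) -> float:
    """Covariance of vertical wind speed w and concentration c samples."""
    n = len(w)
    w_mean = sum(w) / n
    c_mean = sum(c) / n
    return sum((wi - w_mean) * (ci - c_mean) for wi, ci in zip(w, c)) / n

w = [0.1, -0.2, 0.3, -0.1, 0.2]          # vertical wind speed samples, m/s
c = [410.0, 409.5, 410.8, 409.7, 410.4]  # CO2 concentration samples, ppm
print(turbulent_flux(w, c))  # positive value => net upward transport
```

In practice the averaging window spans many minutes of high-rate samples, which is why the system's synchronized logging of wind and concentration channels matters.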

22 pages, 2186 KiB  
Article
Autonomous Scheduling for Reliable Transmissions in Industrial Wireless Sensor Networks
by Armaghan Darbandi and Myung-Kyun Kim
Energies 2023, 16(20), 7039; https://0-doi-org.brum.beds.ac.uk/10.3390/en16207039 - 11 Oct 2023
Cited by 1 | Viewed by 776
Abstract
Deploying the Internet of Things (IoT) on low-power lossy wireless sensor/actuator networks (LLNs) in harsh industrial environments presents challenges such as dynamic link qualities due to noise, signal attenuation, and spurious interference. At the same time, the critical demand of industrial applications is reliable data delivery on low-cost, low-power sensor/actuator devices. To address these issues, this paper proposes a fully autonomous scheduling approach, called Auto-Sched, which ensures reliable data delivery for both downlink and uplink traffic scheduling and enhances network robustness against node/link failures. To ensure reliability, Auto-Sched assigns retransmission time slots based on the reliability constraints of the communication link. To avoid collisions, Auto-Sched creates an upward pipeline-like communication schedule for uplink end-to-end data delivery and a downward pipeline-like schedule for downlink traffic. To enhance network robustness, we propose a simple algorithm for real-time autonomous schedule reconstruction when node or link failures occur, with minimal communication overhead. Performance evaluations quantified the behavior of the proposed approaches under a variety of scenarios, comparing them with existing approaches.
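Reliability-constrained retransmission-slot assignment can be illustrated with the standard calculation below: the smallest number of transmission attempts n on a link with packet reception ratio p such that 1 − (1 − p)^n meets a target reliability R. This is a generic sketch of the underlying formula, not Auto-Sched's exact algorithm.

```python
# Sketch of reliability-constrained slot allocation on a lossy link: find the
# smallest n with 1 - (1 - p)^n >= R. Generic formula, not the paper's scheduler.
import math

def slots_needed(link_prr: float, target_reliability: float) -> int:
    """Transmission attempts needed so delivery probability reaches the target.

    link_prr:           per-attempt packet reception ratio p, 0 < p <= 1
    target_reliability: required end probability R, 0 <= R < 1
    """
    if link_prr >= 1.0:
        return 1  # a perfect link needs no retransmission slots
    return math.ceil(math.log(1 - target_reliability) / math.log(1 - link_prr))

# A 70% link needs 4 attempts to reach 99% delivery: 1 - 0.3**4 = 0.9919.
print(slots_needed(0.7, 0.99))  # 4
```

Allocating slots this way per link is what lets a schedule trade time-slot budget for end-to-end delivery guarantees on links of differing quality.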

19 pages, 4108 KiB  
Article
Analytical Technique Leveraging Processing Gain for Evaluating the Anti-Jamming Potential of Underwater Acoustic Direct Sequence Spread Spectrum Communication Systems
by Xiaowei Wang and Qidou Zhou
Symmetry 2023, 15(9), 1710; https://0-doi-org.brum.beds.ac.uk/10.3390/sym15091710 - 6 Sep 2023
Cited by 1 | Viewed by 974
Abstract
This study proposes an analytical technique underpinned by processing gain to evaluate the anti-jamming potential of an underwater acoustic direct-sequence spread-spectrum (DSSS) communication system that employs a short-period pseudo-noise (PN) sequence. The processing gain comes from the symmetry of the coding, which provides a mechanism for separating desired signals from unwanted ones, and from the apparent randomness of the coding, which suppresses interference and noise in the system. The robustness of such a system against wideband interference, partial-band jamming, and single-frequency interference is emulated. Outcomes suggest that, in comparison to a standard binary phase shift keying (BPSK) system, the DSSS system’s ability to resist wideband interference is limited, with only a marginal increase in immunity performance of approximately 0.5 dB. By contrast, it suppresses partial-band jamming effectively, with the suppression level dependent on the interference bandwidth and its position relative to the signal carrier frequency. The influence of single-frequency interference on system performance likewise depends on its location relative to the signal carrier frequency. Among situations where the interference frequency offset is an integer multiple of the bit bandwidth, the system performs worst when the offset equals the bit bandwidth. Upon comparing resistance levels to identical-power interferences targeted at the signal carrier frequency, the system demonstrates optimal resilience to single-frequency interference. In concordance with the empirical findings, the simulated results substantiate both the effectiveness and practicability of the proposed analytical method based on processing gain. This study thus contributes a novel perspective for evaluating the anti-jamming potential of DSSS systems.
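The processing gain the analysis builds on is, for N chips per data bit, G_p = 10·log10(N) dB, i.e. the interference suppression available after despreading. A minimal sketch follows; the 127-chip sequence length is illustrative, not taken from the paper.

```python
# Sketch of the DSSS processing-gain quantity: G_p = 10*log10(N) dB, where N is
# the number of PN chips per data bit. The 127-chip value below is illustrative.
import math

def processing_gain_db(chips_per_bit: int) -> float:
    """Processing gain in dB for a spreading factor of chips_per_bit."""
    return 10 * math.log10(chips_per_bit)

# A 127-chip PN sequence per data bit (a common m-sequence length):
print(round(processing_gain_db(127), 1))  # 21.0
```

Against wideband interference this gain is largely offset by the wider noise bandwidth, which is consistent with the marginal ~0.5 dB advantage reported above, whereas narrowband jammers are spread by the despreader and attenuated by roughly G_p.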

14 pages, 3883 KiB  
Article
Unveiling LoRa’s Oceanic Reach: Assessing the Coverage of the Azores LoRaWAN Network from an Island
by João Pinelo, André Dionísio Rocha, Miguel Arvana, João Gonçalves, Nuno Cota and Pedro Silva
Sensors 2023, 23(17), 7394; https://0-doi-org.brum.beds.ac.uk/10.3390/s23177394 - 24 Aug 2023
Viewed by 2076
Abstract
In maritime settings, effective communication between vessels and land infrastructure is crucial, but existing technologies often prove impractical for energy-sensitive IoT applications, like deploying sensors at sea. In this study, we explore the viability of a low-power, cost-effective wireless communication solution for maritime sensing data. Specifically, we conduct an experimental assessment of the Azorean Long Range Wide Area Network (LoRaWAN) coverage. Our tests involve positioning the gateway at the island’s highest point and installing end nodes on medium-sized fishing vessels. Through measurements of received signal strength indicator (RSSI), signal-to-noise ratio (SNR), and lines of sight (LOS), we showcase the potential of LoRaWAN transmissions to achieve communication distances exceeding 130 km in a LOS-free scenario over the ocean. These findings highlight the promising capabilities of LoRaWAN for reliable and long-range maritime communication of sensing data.
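A standard 4/3-earth radio-horizon estimate shows why the gateway's summit placement matters: d[km] ≈ 4.12·(√h1 + √h2) for antenna heights in metres. The sketch below uses illustrative heights, not the paper's measured values.

```python
# Sketch of the classic 4/3-earth radio-horizon estimate over the sea:
# d[km] ~= 4.12 * (sqrt(h1) + sqrt(h2)), antenna heights h1, h2 in metres.
# The heights below are illustrative assumptions, not the paper's values.
import math

def radio_horizon_km(h_gateway_m: float, h_node_m: float) -> float:
    """Combined radio horizon of two antennas, standard-refraction model."""
    return 4.12 * (math.sqrt(h_gateway_m) + math.sqrt(h_node_m))

# A gateway on a ~1000 m summit and a vessel antenna a few metres above deck:
print(round(radio_horizon_km(1000.0, 5.0), 1))  # 139.5 (km)
```

A high-altitude gateway alone pushes the horizon well past 130 km, which is consistent with the link distances reported in the study.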
