Machine Learning in IoT Networking and Communications

A special issue of Journal of Sensor and Actuator Networks (ISSN 2224-2708). This special issue belongs to the section "Big Data, Computing and Artificial Intelligence".

Deadline for manuscript submissions: closed (31 March 2022) | Viewed by 42973

Special Issue Editor

Dr. Mona Jaber
School of Electronic Engineering and Computer Science, Queen Mary University of London, Mile End Road, London E1 4NS, UK
Interests: IoT networks; IoT-enabled digital twins; machine learning for smart urban mobility; IoT security and communication

Special Issue Information

Dear Colleagues,

The fast and wide spread of Internet of Things (IoT) applications offers new opportunities in multiple domains but also presents new challenges. A skyrocketing number of IoT devices (sensors, actuators, etc.) are deployed to collect critical data and to control environments such as manufacturing, healthcare, urban/built areas, and public safety. At the same time, machine learning (ML) has shown significant success in transforming heterogeneous and complex datasets into coherent output and actionable insights. Thus, the marriage of ML and IoT plays a pivotal role in enabling smart environments with precise decision-making and adaptive automation. However, leveraging ML and IoT still faces significant challenges that obstruct the full realisation of the foreseen opportunities. Direct challenges relate to scalability, security, accessibility, resilience, and latency, all of which have resulted in a growing corpus of research addressing one or more of these issues. Nevertheless, the overarching challenge concerns the exportability of advancements in this area across multiple applications. For instance, an acoustic scene classification method that successfully detects violence in a small town would completely fail in a busy city, and an autonomous pod trained to deliver groceries in a controlled environment would not succeed elsewhere. Thus, the biggest challenge in pushing forward the seamless integration of ML and IoT systems is the exportability of technologies, which creates opportunities for novel research and interdisciplinary efforts.

The papers in this Special Issue will focus on state-of-the-art research and challenges in leveraging ML and IoT. We solicit papers covering topics of interest that include, but are not limited to, the following:

  • ML and IoT for system deployment and operation;
  • ML and IoT for assisted automation;
  • ML-enabled real-time IoT data analytics;
  • ML- and IoT-enabled digital twin;
  • Cloud/edge computing systems for IoT employing ML;
  • ML-enabled spatio-temporal IoT data fusion for intelligent decision-making;
  • Data-centric simulations for IoT systems;
  • ML for IoT application orchestration;
  • ML for managing security in IoT data processing;
  • ML for IoT attack detection and prevention;
  • Testbed and empirical studies.

Dr. Mona Jaber
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Journal of Sensor and Actuator Networks is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2000 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • machine learning
  • artificial intelligence
  • Internet of Things
  • digital twin
  • exportable AI
  • IoT security
  • IoT data fusion

Published Papers (10 papers)


Editorial


3 pages, 171 KiB  
Editorial
Machine Learning in IoT Networking and Communications
by Mona Jaber
J. Sens. Actuator Netw. 2022, 11(3), 37; https://doi.org/10.3390/jsan11030037 - 29 Jul 2022
Viewed by 1977
Abstract
The fast and wide spread of Internet of Things (IoT) applications offers new opportunities in multiple domains but also presents new challenges [...]

Research


12 pages, 3899 KiB  
Article
A Machine Learning Approach to Solve the Network Overload Problem Caused by IoT Devices Spatially Tracked Indoors
by Daniel Carvalho, Daniel Sullivan, Rafael Almeida and Carlos Caminha
J. Sens. Actuator Netw. 2022, 11(2), 29; https://doi.org/10.3390/jsan11020029 - 16 Jun 2022
Cited by 1 | Viewed by 2151
Abstract
Currently, there are billions of connected devices, and the Internet of Things (IoT) has boosted these numbers. In the case of private networks, a few hundred connected devices can cause instability and even data loss in communication. In this article, we propose a machine-learning-based model to solve the network overload caused by continuous monitoring of the trajectories of several devices tracked indoors. The proposed model was evaluated with over a hundred thousand coordinate locations of objects tracked in three synthetic environments and one real environment. It has been shown that it is possible to solve the network overload problem by increasing the latency in sending data and predicting intermediate coordinates of the trajectories on the server side with ensemble models, such as Random Forest, and with Artificial Neural Networks, without relevant data loss. It has also been shown that it is possible to predict at least thirty intermediate coordinates of the tracked objects' trajectories with R² greater than 0.8.
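
As an illustration of the server-side prediction step, here is a minimal sketch (on synthetic data, not the authors' pipeline) of a multi-output Random Forest that reconstructs the intermediate coordinates of a trajectory from the two endpoints a device actually transmits:

```python
# Hedged sketch: a multi-output Random Forest predicts the intermediate
# (x, y) coordinates of a tracked object from the sparse endpoints the
# device transmits. All data below is synthetic and illustrative.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_traj, n_pts = 500, 32
t = np.linspace(0.0, 1.0, n_pts)

# Noisy, roughly straight indoor trajectories of 32 points each.
starts = rng.uniform(0, 100, (n_traj, 2))
ends = rng.uniform(0, 100, (n_traj, 2))
paths = (starts[:, None, :] * (1 - t)[None, :, None]
         + ends[:, None, :] * t[None, :, None]
         + rng.normal(0, 0.5, (n_traj, n_pts, 2)))

X = np.hstack([starts, ends])              # transmitted endpoints only
y = paths[:, 1:-1, :].reshape(n_traj, -1)  # 30 intermediate points to predict

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X[:400], y[:400])
print("held-out R^2:", r2_score(y[400:], model.predict(X[400:])))
```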

22 pages, 6034 KiB  
Article
A Novel Road Maintenance Prioritisation System Based on Computer Vision and Crowdsourced Reporting
by Edwin Salcedo, Mona Jaber and Jesús Requena Carrión
J. Sens. Actuator Netw. 2022, 11(1), 15; https://doi.org/10.3390/jsan11010015 - 14 Feb 2022
Cited by 10 | Viewed by 4772
Abstract
The maintenance of critical infrastructure is a costly necessity for which developing countries often struggle to deliver timely repairs. The transport system acts as the arteries of any developing economy, and the formation of potholes on roads can lead to injuries and the loss of lives. Recently, several countries have enabled pothole-reporting platforms for their citizens, so that repair work data can be centralised and visible to everyone. Nevertheless, many of these platforms have been interrupted by the rapid growth of requests made by users. Not only have these platforms failed to filter duplicate or fake reports, but they have also failed to classify their severity, although this information would be key to prioritising repair work and improving road safety. In this work, we aimed to develop a prioritisation system that combines deep learning models and traditional computer vision techniques to automate the analysis of road irregularities reported by citizens. The system consists of three main components. First, we propose a processing pipeline that segments road sections of repair requests with a UNet-based model that integrates a pretrained ResNet34 as the encoder. Second, we assessed the performance of two object detection architectures, EfficientDet and YOLOv5, in the task of road damage localisation and classification. Two public datasets, the Indian Driving Dataset (IDD) and the Road Damage Detection Dataset (RDD2020), were preprocessed and augmented to train and evaluate our segmentation and damage detection models. Third, we applied feature extraction and feature matching to find possibly duplicated reports. The combination of these three approaches allowed us to cluster reports according to their location and severity. The results show that this approach is a promising direction for authorities seeking to leverage limited road maintenance resources in an impactful and effective way.
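
A minimal sketch of the first component is given below, assuming the segmentation-models-pytorch library for the UNet-with-ResNet34-encoder architecture named in the abstract; the paper's datasets, training loop, and hyperparameters are not reproduced:

```python
# Hedged sketch of the road-segmentation stage, assuming the
# segmentation-models-pytorch package; the paper's training pipeline,
# datasets (IDD, RDD2020) and hyperparameters are not reproduced here.
import torch
import segmentation_models_pytorch as smp

# UNet with a pretrained ResNet34 encoder and a single road-mask channel.
model = smp.Unet(
    encoder_name="resnet34",
    encoder_weights="imagenet",
    in_channels=3,
    classes=1,
)
model.eval()

image = torch.randn(1, 3, 256, 256)  # placeholder for a citizen's report photo
with torch.no_grad():
    logits = model(image)            # shape: (1, 1, 256, 256)
road_mask = torch.sigmoid(logits) > 0.5
print("road pixels:", int(road_mask.sum()))
```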

10 pages, 2749 KiB  
Article
Machine Learning Enabled Food Contamination Detection Using RFID and Internet of Things System
by Abubakar Sharif, Qammer H. Abbasi, Kamran Arshad, Shuja Ansari, Muhammad Zulfiqar Ali, Jaspreet Kaur, Hasan T. Abbas and Muhammad Ali Imran
J. Sens. Actuator Netw. 2021, 10(4), 63; https://doi.org/10.3390/jsan10040063 - 2 Nov 2021
Cited by 11 | Viewed by 4781
Abstract
This paper presents an approach based on radio frequency identification (RFID) and machine learning for contamination sensing of food items and drinks, such as soft drinks, alcohol, and baby formula milk. We employ sticker-type, inkjet-printed ultra-high-frequency (UHF) RFID tags for the contamination sensing experiments. The RFID tag antenna was mounted on pure as well as contaminated food products with known contaminant quantities. The received signal strength indicator (RSSI) and the phase of the backscattered signal from the RFID tag mounted on the food item were measured using the Tagformance Pro setup. We used the machine-learning algorithm XGBoost to train the model and improve the sensing accuracy, which is about 90%. This research study therefore paves the way for ubiquitous contamination/content sensing using RFID and machine learning technologies, which can inform users about health concerns and the safety of their food.
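
As a hedged illustration of the classification step, the sketch below trains an XGBoost model on synthetic stand-ins for the RSSI, phase, and frequency measurements; the feature construction and labels are assumptions, not the authors' data:

```python
# Hedged sketch of the sensing classifier: XGBoost trained on synthetic
# stand-ins for the RSSI, phase and sweep-frequency measurements; the
# feature construction and labels are assumptions, not the paper's data.
import numpy as np
from xgboost import XGBClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 1000
rssi = rng.normal(-60, 5, n)              # backscatter RSSI (dBm)
phase = rng.uniform(0, 2 * np.pi, n)      # backscatter phase (rad)
freq = rng.uniform(860e6, 960e6, n)       # UHF sweep frequency (Hz)
X = np.column_stack([rssi, phase, freq])
y = (rssi + rng.normal(0, 2, n) < -60).astype(int)  # toy contamination label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = XGBClassifier(n_estimators=200, max_depth=4, eval_metric="logloss")
clf.fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```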

17 pages, 422 KiB  
Article
Network Attack Classification in IoT Using Support Vector Machines
by Christiana Ioannou and Vasos Vassiliou
J. Sens. Actuator Netw. 2021, 10(3), 58; https://doi.org/10.3390/jsan10030058 - 31 Aug 2021
Cited by 26 | Viewed by 3959
Abstract
Machine learning (ML) techniques learn a system by observing it. Events and occurrences in the network define what is expected of the network's operation. It is for this reason that ML techniques are used in the computer network security field to detect unauthorised intervention. In the event of suspicious activity, the result of the ML analysis deviates from the definition of expected normal network activity, and the suspicious activity becomes apparent. Support vector machines (SVMs) are ML techniques that have been used to profile normal network activity and classify it as normal or abnormal. They are trained to configure an optimal hyperplane that classifies unknown input vectors based on their positioning relative to the plane. We propose to use SVM models to detect malicious behaviour within low-power, low-rate, and short-range networks, such as those used in the Internet of Things (IoT). We evaluated two SVM approaches, the C-SVM and the OC-SVM, where the former requires two classes of vector values (one for normal and one for abnormal activity) and the latter observes only normal activity. Both approaches were used as part of an intrusion detection system (IDS) that monitors and detects abnormal activity within the smart node device. Real network traffic, with specific network-layer attacks that we implemented, was used to create and evaluate the SVM detection models. It is shown that the C-SVM achieves up to 100% classification accuracy when evaluated with unknown data taken from the same network topology it was trained on, and 81% accuracy when operating in an unknown topology. The OC-SVM, which is created using benign activity only, achieves at most 58% accuracy.
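
The contrast between the two approaches can be sketched with scikit-learn, where SVC plays the role of the C-SVM and OneClassSVM the role of the OC-SVM; the synthetic features below merely stand in for the per-node traffic statistics used in the paper:

```python
# Hedged sketch contrasting the two evaluated approaches with scikit-learn:
# SVC as the two-class C-SVM, OneClassSVM as the OC-SVM trained on benign
# traffic only. Features are synthetic stand-ins for traffic statistics.
import numpy as np
from sklearn.svm import SVC, OneClassSVM

rng = np.random.default_rng(2)
benign = rng.normal(0.0, 1.0, (500, 4))
attack = rng.normal(3.0, 1.0, (100, 4))

# C-SVM: supervised, needs labelled benign AND attack examples.
X = np.vstack([benign, attack])
y = np.array([0] * len(benign) + [1] * len(attack))
c_svm = SVC(kernel="rbf", C=1.0).fit(X, y)

# OC-SVM: sees benign activity only and flags outliers at test time.
oc_svm = OneClassSVM(kernel="rbf", nu=0.05).fit(benign)

test = np.vstack([rng.normal(0, 1, (5, 4)), rng.normal(3, 1, (5, 4))])
print("C-SVM  (1 = attack):   ", c_svm.predict(test))
print("OC-SVM (-1 = anomaly): ", oc_svm.predict(test))
```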

15 pages, 347 KiB  
Article
OPriv: Optimizing Privacy Protection for Network Traffic
by Louma Chaddad, Ali Chehab and Ayman Kayssi
J. Sens. Actuator Netw. 2021, 10(3), 38; https://doi.org/10.3390/jsan10030038 - 24 Jun 2021
Cited by 1 | Viewed by 2609
Abstract
Statistical traffic analysis can expose the privacy of supposedly secure network traffic, showing that encryption alone is no longer effective. In this work, we present an optimal countermeasure to prevent an adversary from inferring users' online activities through traffic analysis. First, we analytically formulate a constrained optimisation problem to maximise network traffic obfuscation while minimising overhead costs. Then, we provide OPriv, a practical and efficient algorithm that dynamically solves the non-linear programming (NLP) problem using the CPLEX optimiser. Our heuristic algorithm selects the target applications to mutate to and the corresponding packet lengths, and subsequently decreases the security risks of statistical traffic analysis attacks. Furthermore, we develop an analytical model to measure the obfuscation system's resilience to traffic analysis attacks. We suggest information-theoretic metrics for quantitative privacy measurement based on entropy. The privacy protection of OPriv is assessed through our new metrics and then through extensive simulations on real-world data traces. We show that our algorithm achieves strong privacy protection in terms of traffic flow information without impacting network performance: we are able to reduce the accuracy of a classifier from 91.1% to 1.42% with only 0.17% padding overhead.
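
As an illustration of an entropy-based privacy metric in this spirit (the paper's exact metric may differ), the sketch below computes the normalised Shannon entropy of an adversary's posterior over candidate applications, where 1.0 means the traffic reveals nothing:

```python
# Hedged sketch of an entropy-based privacy metric in this spirit (the
# paper's exact metric may differ): the normalised Shannon entropy of an
# adversary's posterior over candidate applications.
import numpy as np

def normalised_entropy(posterior):
    """Adversary uncertainty in [0, 1]; 1.0 means traffic reveals nothing."""
    p = np.asarray(posterior, dtype=float)
    nonzero = p[p > 0]
    h = -np.sum(nonzero * np.log2(nonzero))
    return h / np.log2(len(p))

print(normalised_entropy([0.97, 0.01, 0.01, 0.01]))  # confident adversary ~0.12
print(normalised_entropy([0.25, 0.25, 0.25, 0.25]))  # fully obfuscated = 1.0
```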

33 pages, 2504 KiB  
Article
Digital Twin-Driven Decision Making and Planning for Energy Consumption
by Yasmin Fathy, Mona Jaber and Zunaira Nadeem
J. Sens. Actuator Netw. 2021, 10(2), 37; https://doi.org/10.3390/jsan10020037 - 20 Jun 2021
Cited by 34 | Viewed by 5047
Abstract
The Internet of Things (IoT) is revolutionising how energy is delivered from energy producers and used throughout residential households. Optimising residential energy consumption is a crucial step toward greener and more sustainable energy production. Such optimisation requires a household-centric energy management system, as opposed to a one-rule-fits-all approach. In this paper, we propose a data-driven, multi-layer digital twin of the energy system that aims to mirror households' actual energy consumption in the form of a household digital twin (HDT). When linked to the energy production digital twin (EDT), the HDT empowers a household-centric energy optimisation model to achieve the desired efficiency in energy use. The model intends to improve the efficiency of energy production by flattening daily energy demand levels. This is done by collaboratively reorganising the energy consumption patterns of residential homes to avoid peak demand whilst accommodating residents' needs and reducing their energy costs. Indeed, our system incorporates the first HDT model to gauge the impact of various modifications on the household energy bill and, subsequently, on energy production. The proposed energy system is applied to a real-world IoT dataset that spans over two years and covers seventeen households. Our experiments show that the model effectively flattened the collective energy demand by 20.9% on synthetic data and 20.4% on a real dataset, while the average energy cost per household was reduced by 10.7% for the synthetic data and 17.7% for the real dataset.
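
The flattening idea can be illustrated with a toy load-shifting sketch; the demand values and the greedy peak-to-valley rule are illustrative assumptions, not the paper's HDT/EDT optimisation model:

```python
# Hedged toy sketch of demand flattening (not the HDT/EDT model from the
# paper): shiftable household loads are moved out of the peak hours and
# rescheduled into the lowest-demand hours. All values are illustrative.
import numpy as np

# Aggregate hourly demand for a small neighbourhood (kWh, 24 hours).
hourly_demand = np.array([3.0, 2.5, 2.0, 2.2, 3.5, 6.0, 8.5, 9.0,
                          7.0, 5.0, 4.5, 4.0, 4.2, 4.5, 5.0, 6.5,
                          8.0, 9.5, 9.0, 7.5, 6.0, 5.0, 4.0, 3.2])

# Shiftable loads currently running at peak (e.g., washing, EV top-up).
shiftable = [1.5, 1.0, 0.8]  # kWh each

flattened = hourly_demand.copy()
for load in shiftable:
    flattened[np.argmax(flattened)] -= load  # take the load out of the peak hour
    flattened[np.argmin(flattened)] += load  # reschedule it into the valley

print(f"peak before: {hourly_demand.max():.1f} kWh")
print(f"peak after:  {flattened.max():.1f} kWh")
```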

16 pages, 1105 KiB  
Article
Optimising Performance for NB-IoT UE Devices through Data Driven Models
by Omar Nassef, Toktam Mahmoodi, Foivos Michelinakis, Kashif Mahmood and Ahmed Elmokashfi
J. Sens. Actuator Netw. 2021, 10(1), 21; https://doi.org/10.3390/jsan10010021 - 5 Mar 2021
Cited by 1 | Viewed by 3066
Abstract
This paper presents a data-driven framework for performance optimisation of Narrow-Band IoT (NB-IoT) user equipment. The proposed framework is an edge micro-service that suggests one-time configurations to user equipment communicating with a base station. Suggested configurations are delivered from a Configuration Advocate to improve energy consumption, delay, throughput, or a combination of those metrics, depending on the user-end device and the application. Reinforcement learning utilising gradient descent and a genetic algorithm is adopted synchronously with machine and deep learning algorithms to predict the environmental states and suggest an optimal configuration. The results highlight the adaptability of the deep neural network in predicting intermediary environmental states; additionally, the genetic reinforcement learning algorithm shows superior performance in the optimisation task.
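
A toy sketch of the genetic search is shown below; the NB-IoT parameter names and the fitness function are illustrative assumptions, not the paper's Configuration Advocate:

```python
# Hedged sketch of a genetic search over device configurations, evolved by
# selection and mutation against a stand-in fitness. Parameter names and
# the fitness function are illustrative assumptions only.
import random

random.seed(0)
CONFIGS = {
    "tx_power": [0, 10, 20, 23],      # dBm (assumed option set)
    "psm_timer": [1, 10, 60, 300],    # s   (assumed option set)
    "repetitions": [1, 2, 8, 32],     # repetitions (assumed option set)
}

def fitness(cfg):
    # Toy objective: trade throughput proxies against energy proxies.
    return cfg["repetitions"] * 2 - cfg["tx_power"] * 0.5 + cfg["psm_timer"] * 0.01

def random_cfg():
    return {k: random.choice(v) for k, v in CONFIGS.items()}

def mutate(cfg):
    child = dict(cfg)
    key = random.choice(list(CONFIGS))
    child[key] = random.choice(CONFIGS[key])
    return child

population = [random_cfg() for _ in range(20)]
for _ in range(30):                       # generations
    population.sort(key=fitness, reverse=True)
    parents = population[:10]             # selection: keep the fittest half
    population = parents + [mutate(random.choice(parents)) for _ in range(10)]

print("best configuration:", max(population, key=fitness))
```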

Review

Jump to: Editorial, Research

20 pages, 1443 KiB  
Review
Challenges of Malware Detection in the IoT and a Review of Artificial Immune System Approaches
by Hadeel Alrubayyi, Gokop Goteng, Mona Jaber and James Kelly
J. Sens. Actuator Netw. 2021, 10(4), 61; https://doi.org/10.3390/jsan10040061 - 26 Oct 2021
Cited by 16 | Viewed by 4845
Abstract
The fast growth of the Internet of Things (IoT) and its diverse applications increase the risk of cyberattacks, one type of which is malware attacks. Due to IoT devices' varied capabilities and the dynamic, ever-evolving environment, applying complex security measures is challenging, while applying only basic security standards is risky. Artificial Immune Systems (AIS) are intrusion-detection algorithms inspired by the human body's adaptive immune system. Most of these algorithms imitate the body's B-cell and T-cell defensive mechanisms. They are lightweight, adaptive, and able to detect malware attacks without prior knowledge. In this work, we review recent advances in employing AIS for the improved detection of malware in IoT networks. We present a critical analysis that highlights the limitations of the state of the art in AIS research and offer insights into promising new research directions.
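
One classic AIS technique that imitates T-cell maturation is negative selection; the minimal sketch below generates detectors that avoid "self" (benign) samples and flags anything a surviving detector matches, with all values illustrative:

```python
# Hedged sketch of negative selection, one classic AIS technique: random
# detectors are kept only if they do NOT match "self" (benign) samples;
# anything a surviving detector matches is flagged. Values are illustrative.
import numpy as np

rng = np.random.default_rng(3)
self_samples = rng.normal(0.0, 1.0, (200, 3))  # benign behaviour profile
RADIUS = 1.5                                   # detector match radius

detectors = []
while len(detectors) < 300:
    candidate = rng.uniform(-5, 5, 3)
    # Negative selection: discard candidates that match benign behaviour.
    if np.linalg.norm(self_samples - candidate, axis=1).min() > RADIUS:
        detectors.append(candidate)
detectors = np.array(detectors)

def is_malicious(sample):
    return bool((np.linalg.norm(detectors - sample, axis=1) < RADIUS).any())

print(is_malicious(np.zeros(3)))      # benign-like sample -> expected False
print(is_malicious(np.full(3, 4.0)))  # far from self -> expected True
```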

34 pages, 2211 KiB  
Review
Trends in Intelligent Communication Systems: Review of Standards, Major Research Projects, and Identification of Research Gaps
by Konstantinos Koufos, Karim El Haloui, Mehrdad Dianati, Matthew Higgins, Jaafar Elmirghani, Muhammad Ali Imran and Rahim Tafazolli
J. Sens. Actuator Netw. 2021, 10(4), 60; https://doi.org/10.3390/jsan10040060 - 12 Oct 2021
Cited by 13 | Viewed by 5595
Abstract
The increasing complexity of communication systems, following the advent of heterogeneous technologies, services, and use cases with diverse technical requirements, provides a strong case for the use of artificial intelligence (AI) and data-driven machine learning (ML) techniques in studying, designing, and operating emerging communication networks. At the same time, access to and the ability to process large volumes of network data can unleash the full potential of a network orchestrated by AI/ML to optimise the usage of available resources while keeping both CapEx and OpEx low. Driven by these new opportunities, ongoing standardisation activities indicate strong interest in reaping the benefits of incorporating AI and ML techniques in communication networks. For instance, 3GPP has introduced the network data analytics function (NWDAF) at the 5G core network for the control and management of network slices, and for providing predictive analytics, or statistics about past events, to other network functions, leveraging AI/ML and big data analytics. Likewise, at the radio access network (RAN), the O-RAN Alliance has already defined an architecture to infuse intelligence into the RAN, where closed-loop control models are classified based on their operational timescale, i.e., real-time, near-real-time, and non-real-time RAN intelligent control (RIC). Differently from existing related surveys, in this review article we group the major research studies in the design of model-aided ML-based transceivers following the breakdown suggested by the O-RAN Alliance. At the core and edge networks, we review the ongoing standardisation activities in intelligent networking and the existing works cognisant of the architecture recommended by 3GPP and ETSI. We also review existing trends in ML algorithms running on low-power micro-controller units, known as TinyML. We conclude with a summary of recent and currently funded projects on intelligent communications and networking. This review reveals that the telecommunication industry and standardisation bodies have mostly focused on non-real-time RIC, data analytics at the core and the edge, AI-based network slicing, and vendor interoperability issues, whereas most recent academic research has focused on real-time RIC. In addition, intelligent radio resource management and aspects of intelligent control of the propagation channel using reflecting intelligent surfaces have captured the attention of ongoing research projects.
