IoT, Volume 1, Issue 2 (December 2020) – 22 articles

Cover Story: Time (or clock) synchronization is a large and vital research field, as synchronization is a precondition for many applications. Examples are distributed data acquisition, distributed databases, and real-time communication. First, this survey paper introduces the research area of time synchronization and emphasizes its relation to other research areas. Second, we give an overview of the state of the art of time synchronization, analyzing both established protocols and research approaches according to three criteria: the estimation algorithm used, the achievable synchronization accuracy, and the experimental conditions. We consider the following research question: which estimation method achieves which accuracy under which conditions? In contrast to other survey papers, we consider both wireless and wired synchronization and focus on estimation algorithms and their achievable accuracy.
18 pages, 935 KiB  
Article
Analysis of P4 and XDP for IoT Programmability in 6G and Beyond
by David Carrascal, Elisa Rojas, Joaquin Alvarez-Horcajo, Diego Lopez-Pajares and Isaías Martínez-Yelmo
IoT 2020, 1(2), 605-622; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020031 - 15 Dec 2020
Cited by 7 | Viewed by 4274
Abstract
Recently, two technologies have emerged to provide advanced programmability in Software-Defined Networking (SDN) environments, namely P4 and XDP. At the same time, the Internet of Things (IoT) represents a pillar of future 6G networks, which will also be sustained by SDN. In this regard, there is a need to analyze the suitability of P4 and XDP for IoT. In this article, we compare both technologies to guide future research efforts in the field. For this purpose, we evaluate both technologies by implementing diverse use cases, assessing their performance, and providing a brief qualitative overview. All tests and design scenarios are publicly available on GitHub to guarantee replication and to serve as a starting point for researchers entering the field. Results illustrate that XDP is currently the best option for constrained IoT devices, showing lower latency, half the CPU usage, and reduced memory consumption in comparison with P4. However, developing P4 programs is more straightforward, and the number of lines of code is more consistent across scenarios. Additionally, P4 has considerable potential in IoT if a special effort is made to improve the most common software target, BMv2.
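
To make the performance comparison concrete, here is a minimal latency microbenchmark sketch of the kind such an evaluation relies on, assuming a UDP echo responder runs on the device under test; the host, port, and sample count are hypothetical, and the authors' actual testbed and scenarios live in their GitHub repository.

# Minimal RTT microbenchmark sketch, assuming a UDP echo responder runs on the
# device under test (e.g., an XDP- or BMv2-based forwarder). Host/port are
# hypothetical placeholders.
import socket, statistics, time

def measure_rtt(host="192.0.2.1", port=9999, samples=1000):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)
    rtts = []
    for _ in range(samples):
        t0 = time.perf_counter()
        sock.sendto(b"probe", (host, port))
        sock.recvfrom(64)                              # wait for the echo
        rtts.append((time.perf_counter() - t0) * 1e6)  # microseconds
    p99 = statistics.quantiles(rtts, n=100)[98]        # ~99th percentile
    return statistics.median(rtts), p99

median_us, p99_us = measure_rtt()
print(f"median RTT: {median_us:.1f} us, p99: {p99_us:.1f} us")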

54 pages, 787 KiB  
Review
A Study on the Evolution of Ransomware Detection Using Machine Learning and Deep Learning Techniques
by Damien Warren Fernando, Nikos Komninos and Thomas Chen
IoT 2020, 1(2), 551-604; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020030 - 15 Dec 2020
Cited by 32 | Viewed by 9304
Abstract
This survey investigates the contributions of research into the detection of ransomware using machine learning and deep learning algorithms. The main motivations for this study are the destructive nature of ransomware, the difficulty of reversing a ransomware infection, and the importance of detecting it before it infects a system. Machine learning is coming to the forefront of combating ransomware, so we attempted to identify weaknesses in machine learning approaches and how they can be strengthened. The threat posed by ransomware is exceptionally high, with new variants and families continually being found on the internet and dark web. Recovering from ransomware infections is difficult, given the nature of the encryption schemes ransomware uses. The rise of artificial intelligence also coincides with this boom in ransomware. Machine learning and deep learning approaches to detecting ransomware are of high interest because they can detect zero-day threats: these techniques can build predictive models that learn the behaviour of ransomware and use this knowledge to detect variants and families which have not yet been seen. In this survey, we review prominent research studies which all apply a machine learning or deep learning approach to detecting ransomware. These studies were chosen based on how often they are cited by other research. We carried out experiments to investigate how the discussed studies are affected by malware evolution. We also explored new directions for ransomware and how we expect it to evolve in the coming years, such as expansion into the IoT (Internet of Things), as IoT is integrated further into infrastructures and homes.
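
As an illustration of the supervised pipelines the surveyed studies build on, the following sketch trains a random forest on synthetic behavioural features; the feature names and data are hypothetical placeholders, not any study's actual dataset.

# Illustrative supervised-detection sketch: a classifier trained on behavioural
# features (e.g., file-entropy change, write rate). Data is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
X = rng.random((1000, 4))                   # entropy delta, writes/s, API calls, net conns
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # 1 = ransomware-like behaviour (synthetic rule)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))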

22 pages, 2298 KiB  
Review
A Study on Industrial IoT for the Mining Industry: Synthesized Architecture and Open Research Directions
by Abdullah Aziz, Olov Schelén and Ulf Bodin
IoT 2020, 1(2), 529-550; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020029 - 10 Dec 2020
Cited by 37 | Viewed by 9959
Abstract
The Industrial Internet of Things (IIoT) has the potential to improve production and business processes by enabling the extraction of valuable information from industrial processes. The mining industry, however, is rather traditional and somewhat slow to change due to infrastructural limitations in communication, data management, storage, and exchange of information. Most research efforts so far on applying IIoT in the mining industry focus on specific concerns such as ventilation monitoring, accident analysis, fleet and personnel management, tailing dam monitoring, and pre-alarm systems, while an overall IIoT architecture suitable for the general conditions in the mining industry is still missing. This article analyzes the current state of Information Technology in the mining sector and identifies a major challenge of vertical fragmentation due to the technological variety of systems and devices offered by different vendors, which prevents interoperability, data distribution, and the secure exchange of information between devices and systems. Based on guidelines and practices from the major IIoT standards, a high-level IIoT architecture suitable for the mining industry is then synthesized and presented, addressing the identified challenges and enabling smart mines through automation, interoperable systems, data distribution, and real-time visibility of the mining status. The remote control, data processing, and interoperability techniques of the architecture advance all stages of mining, from prospecting to reclamation. The adoption of such an IIoT architecture in the mining industry offers a safer mine site for workers, predictable mining operations, an interoperable environment for both traditional and modern systems and devices, automation to reduce human intervention, and underground surveillance by converging operational technology (OT) and information technology (IT). Significant open research challenges and directions are also identified, such as mobility management, scalability, virtualization at the IIoT edge, and digital twins.
(This article belongs to the Special Issue Industrial IoT as IT and OT Convergence: Challenges and Opportunities)

23 pages, 2800 KiB  
Review
Building Resilience against COVID-19 Pandemic Using Artificial Intelligence, Machine Learning, and IoT: A Survey of Recent Progress
by S. M. Abu Adnan Abir, Shama Naz Islam, Adnan Anwar, Abdun Naser Mahmood and Aman Maung Than Oo
IoT 2020, 1(2), 506-528; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020028 - 06 Dec 2020
Cited by 28 | Viewed by 11739
Abstract
Coronavirus disease 2019 (COVID-19) has significantly impacted the entire world and stalled regular human activities in such an unprecedented way that it will leave an unforgettable footprint on the history of mankind. Different countries have adopted numerous measures to build resilience against this life-threatening disease. However, the highly contagious nature of this pandemic has challenged traditional healthcare and treatment practices. Thus, artificial intelligence (AI) and machine learning (ML) open up new mechanisms for effective healthcare during this pandemic. AI and ML can be useful for medicine development, designing efficient diagnosis strategies, and producing predictions of the disease's spread. These applications are highly dependent on real-time monitoring of patients and effective coordination of the information, where the Internet of Things (IoT) plays a key role. IoT can also help with applications such as automated drug delivery, responding to patient queries, and tracking the causes of disease spread. This paper presents a comprehensive analysis of the potential of AI, ML, and IoT technologies for defending against the COVID-19 pandemic. The existing and potential applications of AI, ML, and IoT are outlined, along with a detailed analysis of the enabling tools and techniques. A critical discussion on the risks and limitations of the aforementioned technologies is also included.

12 pages, 1163 KiB  
Letter
Natural Language Understanding for Multi-Level Distributed Intelligent Virtual Sensors
by Radu-Casian Mihailescu, Georgios Kyriakou and Angelos Papangelis
IoT 2020, 1(2), 494-505; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020027 - 06 Dec 2020
Cited by 1 | Viewed by 2499
Abstract
In this paper, we address the problem of automatic sensor composition for servicing human-interpretable high-level tasks. To this end, we introduce multi-level distributed intelligent virtual sensors (multi-level DIVS) as an overlay framework for a given mesh of physical and/or virtual sensors already deployed in the environment. The goal for multi-level DIVS is two-fold: (i) to provide a convenient way for the user to specify high-level sensing tasks; (ii) to construct the computational graph that provides the correct output for a specific sensing task. For (i) we resort to a conversational user interface, which offers an intuitive and user-friendly way for the user to express the sensing problem as a natural language query, while for (ii) we propose a deep learning approach that establishes the correspondence between the natural language queries and their virtual sensor representation. Finally, we evaluate and demonstrate the feasibility of our approach in the context of a smart city setup.
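
As a simplified stand-in for step (ii), the sketch below maps natural-language queries to virtual-sensor labels with TF-IDF features and a linear classifier; the paper uses a deep learning model, and the queries and labels here are hypothetical.

# Simplified query-to-virtual-sensor mapping sketch (the authors use a deeper
# model); queries and labels are hypothetical examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

queries = ["what is the air quality downtown",
           "how noisy is the main square tonight",
           "average temperature near the station",
           "is traffic heavy on the bridge"]
labels = ["air_quality", "noise_level", "temperature", "traffic_flow"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(queries, labels)
print(model.predict(["how noisy is it around the square"]))  # expected: ['noise_level']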

20 pages, 5756 KiB  
Article
Precise Water Leak Detection Using Machine Learning and Real-Time Sensor Data
by João Alves Coelho, André Glória and Pedro Sebastião
IoT 2020, 1(2), 474-493; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020026 - 03 Dec 2020
Cited by 24 | Viewed by 13034
Abstract
Water is a crucial natural resource, and it is widely mishandled: an estimated one third of the world's water utilities lose around 40% of their water to leakage. This paper presents a proposal for a system based on a wireless sensor network designed to monitor water distribution systems, such as irrigation systems, which, with the help of an autonomous learning algorithm, allows for the precise location of water leaks. The complete system architecture is detailed, including hardware, communication, and data analysis. A study to discover which machine learning algorithm among random forest, decision trees, neural networks, and Support Vector Machine (SVM) best fits leak detection is presented, including the methodology, training, and validation as well as the obtained results. Finally, the developed system is validated in a real-case implementation, which shows that it is able to detect leaks with 75% accuracy.
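
A minimal sketch of such a model comparison, using the four algorithm families named in the abstract on synthetic stand-ins for the paper's real flow and pressure features:

# Model-comparison sketch with synthetic data; the real features come from the
# paper's wireless sensor network.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.random((600, 3))                     # e.g., flow rate, pressure, node id (synthetic)
y = (X[:, 0] - X[:, 1] > 0.2).astype(int)    # 1 = leak between nodes (synthetic rule)

models = {"random forest": RandomForestClassifier(),
          "decision tree": DecisionTreeClassifier(),
          "neural net": MLPClassifier(max_iter=2000),
          "SVM": SVC()}
for name, model in models.items():
    print(name, round(cross_val_score(model, X, y, cv=5).mean(), 3))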

23 pages, 368 KiB  
Article
A Review on Scaling Mobile Sensing Platforms for Human Activity Recognition: Challenges and Recommendations for Future Research
by Liliana I. Carvalho and Rute C. Sofia
IoT 2020, 1(2), 451-473; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020025 - 29 Nov 2020
Cited by 8 | Viewed by 4021
Abstract
Mobile sensing has been gaining ground due to the increasing capabilities of the mobile and personal devices that citizens carry around, giving access to a large variety of data and services based on the way humans interact. Mobile sensing brings several advantages in terms of the richness of available data, particularly for human activity recognition. Nevertheless, the infrastructure required to support large-scale mobile sensing requires an interoperable design, which is still hard to achieve today. This review paper contributes to raising awareness of the challenges faced by mobile sensing platforms that perform learning and behavior inference with respect to human routines: how current solutions perform activity recognition, which classification models they consider, and which types of behavior inferences can be seamlessly provided. The paper provides a set of guidelines that contribute to a better functional design of mobile sensing infrastructures, keeping scalability as well as interoperability in mind.

15 pages, 2519 KiB  
Review
Bibliometric Analysis of Scientific Productivity around Edge Computing and the Internet of Things
by Antonio-José Moreno-Guerrero, Francisco-Javier Hinojo-Lucena, Magdalena Ramos Navas-Parejo and Gerardo Gómez-García
IoT 2020, 1(2), 436-450; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020024 - 17 Nov 2020
Cited by 4 | Viewed by 3183
Abstract
Technological progress has recently led to the emergence of various technological resources and means that are improving specific aspects of society. Examples can be found in the "internet of things" and "edge computing". The present study aims to survey and analyze the scientific literature on the set of terms formed by "edge computing" and "internet of things", hereafter referred to as ECIT. To carry out the research, a bibliometric study was developed by means of scientific mapping. Different production indicators have been taken into account, as well as the structural and dynamic development of the terms and authors extracted from the publications, through the programs Analyze Results, Creation Citation Report and SciMAT. The results indicate that the study theme formed by "edge computing" and "internet of things" is of recent creation, given that its beginnings date back to 2014. Since then, production has grown at a dizzying pace, increasing considerably in the past two years. It can be concluded that the field of study of ECIT is of recent creation, with a solid research base grounded in the "internet of things". Furthermore, the themes "big data", "energy" and "framework" can be considered future lines of research on ECIT.

38 pages, 905 KiB  
Review
Estimators for Time Synchronization—Survey, Analysis, and Outlook
by Henning Puttnies, Peter Danielis, Ali Rehan Sharif and Dirk Timmermann
IoT 2020, 1(2), 398-435; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020023 - 17 Nov 2020
Cited by 15 | Viewed by 5719
Abstract
Time (or clock) synchronization is a large and vital field of research, as synchronization is a precondition for many applications; a few examples are distributed data acquisition, distributed databases, and real-time communication. First, this survey paper introduces the research area of time synchronization and emphasizes its relation to other research areas. Second, we give an overview of the state of the art of time synchronization. Herein, we discuss both established protocols and research approaches. We analyze all techniques according to three criteria: the estimation algorithm used, the achievable synchronization accuracy, and the experimental conditions. In our opinion, this analysis highlights potential improvements. The most important question in this survey is the following: which estimation method can be used to achieve which accuracy under which conditions? The intention is to identify estimation methods that are particularly worth considering because they already achieve good results in the wireless area but have not yet been examined in the wired area (and vice versa). This survey differs from other surveys in particular through its consideration of both wireless and wired synchronization and its focus on estimation algorithms and their achievable accuracy.
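
As a worked example of one estimator family such surveys analyze, the sketch below recovers clock skew and offset from timestamped exchanges by ordinary least squares; the timestamps are synthetic, and real protocols additionally handle path delay.

# Least-squares clock skew/offset estimation sketch on synthetic timestamps.
import numpy as np

true_skew, true_offset = 1.00005, 2.5e-3      # slave drifts 50 ppm, is 2.5 ms ahead
t_master = np.linspace(0.0, 10.0, 50)          # reference timestamps (s)
noise = np.random.default_rng(2).normal(0, 20e-6, t_master.size)  # 20 us jitter
t_slave = true_skew * t_master + true_offset + noise

# Fit t_slave ~ skew * t_master + offset with ordinary least squares.
skew_hat, offset_hat = np.polyfit(t_master, t_slave, 1)
print(f"skew ~ {skew_hat:.6f}, offset ~ {offset_hat * 1e3:.3f} ms")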

16 pages, 4199 KiB  
Concept Paper
AI and Blockchain Integrated Billing Architecture for Charging the Roaming Electric Vehicles
by Raziq Yaqub, Sadiq Ahmad, Hassan Ali and Azzam ul Asar
IoT 2020, 1(2), 382-397; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020022 - 15 Nov 2020
Cited by 11 | Viewed by 5202
Abstract
With the proliferation of extended-travel-range electric vehicles (EVs), EVs will travel through different networks that might be served by different utility companies. We therefore propose an architecture capable of offering a charging service to roaming vehicles. Furthermore, although the energy internet supports both the flow of energy and information, it does not support seamless EV roaming service, because it is based on a centralized architecture. Blockchain technology, which is based on a decentralized system, has the potential to support a secure billing platform for charging EVs roaming through different electrical jurisdictions. Furthermore, the integration of artificial intelligence (AI) ensures that the participating players get a fair portion of the revenue. Thus, the objective of this paper is to develop an AI and blockchain integrated billing architecture that offers a charging service to "roaming" EVs and presents a fair and unified billing solution.
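
A minimal sketch of the hash-chained ledger idea behind such a billing platform, assuming each roaming charge event is a record linked to its predecessor; the field names are hypothetical, and production blockchains add consensus, signatures, and smart contracts.

# Hash-chained billing ledger sketch; fields are hypothetical illustrations.
import hashlib, json, time

def add_block(chain, record):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev": prev_hash, "ts": time.time()}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)

ledger = []
add_block(ledger, {"ev": "EV-42", "kWh": 18.4, "utility": "GridCo", "tariff": 0.21})
add_block(ledger, {"ev": "EV-42", "kWh": 7.9, "utility": "MetroPower", "tariff": 0.25})
print(ledger[1]["prev"] == ledger[0]["hash"])  # True: records are tamper-evident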

22 pages, 2389 KiB  
Article
An Evaluation of Wearable Inertial Sensor Configuration and Supervised Machine Learning Models for Automatic Punch Classification in Boxing
by Matthew T. O. Worsey, Hugo G. Espinosa, Jonathan B. Shepherd and David V. Thiel
IoT 2020, 1(2), 360-381; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020021 - 13 Nov 2020
Cited by 21 | Viewed by 6194
Abstract
Machine learning is a powerful tool for data classification and has been used to classify movement data recorded by wearable inertial sensors in general living and sports. Inertial sensors can provide valuable biofeedback in combat sports such as boxing; however, the use of such technology has not had a global uptake. If simple inertial sensor configurations can be used to automatically classify strike type, then cumbersome tasks such as video labelling can be bypassed, and the foundation for automated workload monitoring of combat sport athletes is set. This investigation evaluates the classification performance of six different supervised machine learning models (tuned and untuned) when using two simple inertial sensor configurations (configuration 1: an inertial sensor worn on each wrist; configuration 2: an inertial sensor worn on each wrist and on the third thoracic vertebra [T3]). When trained on one athlete, strike prediction accuracy was good using both configurations (configuration 1 mean overall accuracy: 0.90 ± 0.12; configuration 2 mean overall accuracy: 0.87 ± 0.09). There was no statistically significant difference in prediction accuracy between the configurations or between tuned and untuned models (p > 0.05), nor in computational training time between tuned and untuned models (p > 0.05). For sensor configuration 1, a support vector machine (SVM) model with a Gaussian RBF kernel performed best (accuracy = 0.96); for sensor configuration 2, a multi-layered perceptron neural network (MLP-NN) model performed best (accuracy = 0.98). Wearable inertial sensors can thus be used to accurately classify strike type in boxing pad work, meaning that cumbersome tasks such as video and notational analysis can be bypassed, and automated workload and performance monitoring of athletes throughout a training camp is possible. Future investigations will evaluate the performance of this algorithm on a larger sample and test the influence of impact window size on prediction accuracy. Additionally, supervised machine learning models should be trained on data collected during sparring to see whether high accuracy holds in a competition setting, which can help move closer towards automatic scoring in boxing.
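
For illustration, a sketch of the best configuration-1 model named above (an SVM with a Gaussian RBF kernel) on synthetic stand-ins for windowed wrist-IMU features; the real study extracts features from impact windows of accelerometer and gyroscope data.

# SVM (RBF kernel) punch-classification sketch on synthetic IMU-style features.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 300
labels = rng.integers(0, 3, n)                    # 0=jab, 1=cross, 2=hook (synthetic)
X = rng.normal(0, 1, (n, 6)) + labels[:, None]    # e.g., peak accel/gyro stats per wrist

clf = SVC(kernel="rbf", gamma="scale")
print("accuracy:", round(cross_val_score(clf, X, labels, cv=5).mean(), 3))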

23 pages, 7966 KiB  
Article
A Proposed Low-Cost Viticulture Stress Framework for Table Grape Varieties
by Sotirios Kontogiannis and Christodoulos Asiminidis
IoT 2020, 1(2), 337-359; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020020 - 04 Nov 2020
Cited by 2 | Viewed by 3636
Abstract
Climate change significantly affects viticulture by reducing the production yield and the quality characteristics of its final products. In some observed cases, the consequences of climate events such as droughts, hail, and floods are absolutely devastating for farmers and the local economies they sustain. Hence, it is essential to develop new, easy-to-implement monitoring solutions that offer remote real-time surveillance, alert triggering, minimal maintenance, and automated generation of incident alerts with precision responses. This paper presents a new framework and a system for vine stress monitoring called Vity-stress. The Vity-stress framework combines field measurements with precise viticulture suggestions and stress avoidance planning. The key points of the proposed framework's system are that it is easy to develop, easy to maintain, and cheap to implement. Focusing on the Mediterranean cultivated table grape varieties that are strongly affected by climate change, we propose a new stress-condition monitoring system to support our framework. The proposal includes distributed field-located sensors and a novel camera module implementing deep neural network algorithms to detect stress indicators. Additionally, a new wireless sensor network supported by the iBeacon protocol has been developed. The evaluation of the sensory data logging and image detection process shows that the proposed system can successfully detect different stress levels in vineyards, which in turn allows producers to identify specific areas for irrigation, thereby saving water, energy, and time.

17 pages, 3690 KiB  
Article
Monitoring Activities of Daily Living Using UWB Radar Technology: A Contactless Approach
by Sindre Klavestad, Gebremariam Assres, Siri Fagernes and Tor-Morten Grønli
IoT 2020, 1(2), 320-336; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020019 - 30 Oct 2020
Cited by 11 | Viewed by 6669
Abstract
In recent years, ultra-wideband (UWB) radar technology has shown great potential for monitoring activities of daily living (ADLs) in smart homes. In this paper, we investigate the significance of using non-wearable UWB sensors for developing non-intrusive, unobtrusive, and privacy-preserving monitoring of elderly ADLs. A controlled experiment was set up, implementing multiple non-wearable sensors in a smart home lab setting. A total of nine (n = 9) participants were involved in conducting predefined scenarios of ADLs: cooking, eating, resting, sleeping, and mobility. We employed the UWB sensing prototype alongside conventional implementation technologies, and the sensed data of both systems were stored and analysed and their performances compared. The results show that the performance of the non-wearable UWB technology is as good as that of the conventional ones. Furthermore, we provide a proof-of-concept solution for the real-time detection of abnormal behaviour based on excessive activity levels, and a model for automatic alerts to caregivers for timely medical assistance on demand.
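
A proof-of-concept sketch of an abnormal-behaviour rule of the kind described above, flagging samples whose activity level exceeds a rolling baseline; the values and threshold are hypothetical, and the paper derives activity levels from UWB radar data.

# Rolling-baseline anomaly rule sketch; activity values are hypothetical.
from statistics import mean, stdev

def abnormal(levels, window=10, k=3.0):
    """Return indices where activity exceeds mean + k*std of the prior window."""
    alerts = []
    for i in range(window, len(levels)):
        base = levels[i - window:i]
        if levels[i] > mean(base) + k * stdev(base):
            alerts.append(i)
    return alerts

activity = [5, 6, 5, 7, 6, 5, 6, 7, 6, 5, 6, 30, 6, 5]  # spike at index 11
print(abnormal(activity))  # -> [11], which would trigger a caregiver alert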

11 pages, 3972 KiB  
Technical Note
Miniaturized On-Chip NFC Antenna versus Screen-Printed Antenna for the Flexible Disposable Sensor Strips
by Atefeh Kordzadeh, Dominik Holzmann, Alfred Binder, Thomas Moldaschl, Johannes Sturm and Ali Roshanghias
IoT 2020, 1(2), 309-319; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020018 - 28 Oct 2020
Cited by 4 | Viewed by 3645
Abstract
With the ongoing trend toward miniaturization via system-on-chip (SoC), both radio-frequency (RF) SoCs and on-chip multi-sensory systems are gaining significance. This paper compares the inductance of a miniaturized on-chip near-field communication (NFC) antenna versus the conventional screen-printed on-substrate antennas that have been used to transfer sensory data from a chip to a cell phone reader. Furthermore, the transferred power efficiency in a coupled NFC system is calculated for various chip coil geometries and the results are compared. The proposed NFC antenna was fabricated via a lithography process for an application-specific integrated circuit (ASIC) chip. The chip had a small area of 2.4 × 2.4 mm2; therefore, a miniaturized NFC antenna was designed, whereas the screen-printed on-substrate antennas had an area of 35 × 51 mm2. This paper investigates the effects of different parameters, such as conductor thickness and materials, double layering, and employing ferrite layers of different thicknesses, on the performance of the on-chip antennas using full-wave simulations. Adding a ferrite layer to increase the inductance of the antenna and mitigate interactions with backplates proved useful. The best performance was obtained via double-layering of the coils, which gave results similar to the on-substrate antennas while achieving a size reduction of 99.68%. Consequently, the coupling factors and maximum achievable power transmission efficiency of the on-chip antenna and on-substrate antenna were studied and compared.
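
As a worked example of the figure of merit behind such comparisons, the sketch below computes the maximum power-transfer efficiency of an inductive link from the coupling factor k and the coil quality factors, using the standard relation eta_max = x / (1 + sqrt(1 + x))^2 with x = k^2 * Q1 * Q2; the k and Q values below are hypothetical, not the paper's measurements.

# Maximum inductive-link efficiency from coupling factor and quality factors.
from math import sqrt

def eta_max(k, q1, q2):
    x = k * k * q1 * q2          # link figure of merit
    return x / (1 + sqrt(1 + x)) ** 2

print(f"weakly coupled on-chip coil:  {eta_max(0.01, 30, 30):.4f}")  # ~0.02
print(f"printed on-substrate coil:    {eta_max(0.20, 60, 60):.4f}")  # ~0.85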

23 pages, 7237 KiB  
Article
Evaluation of Low-Cost Sensors for Weather and Carbon Dioxide Monitoring in Internet of Things Context
by Tiago Araújo, Lígia Silva and Adriano Moreira
IoT 2020, 1(2), 286-308; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020017 - 23 Oct 2020
Cited by 10 | Viewed by 3905
Abstract
In a context of increased environmental awareness, the Internet of Things has allowed individuals and entities to build their own connected devices to share data about the environment. These data are often obtained from widely available low-cost sensors, and some companies also sell low-cost sensing kits for indoor or outdoor use. The work described in this paper evaluated, in the short term, the performance of a set of low-cost sensors for temperature, relative humidity, atmospheric pressure, and carbon dioxide commonly used in these platforms. The research challenge addressed in this work was assessing how trustworthy the raw data obtained from these sensors are. The experiments made use of 18 climatic sensors from six different models, evaluated in a controlled climatic chamber that reproduced controlled situations for temperature and humidity. Four CO2 sensors from two different models were analysed through exposure to different gas concentrations in an indoor environment. Our results revealed temperature sensors with a very high positive coefficient of determination (r2 ≥ 0.99), as well as the presence of bias and almost zero random error; the humidity sensors demonstrated a very high positive correlation (r2 ≥ 0.98), significant bias, and a small yet relevant random error; the atmospheric pressure sensors presented good reproducibility, but further studies are required to evaluate their accuracy and precision. For carbon dioxide, the non-dispersive infrared sensors demonstrated very satisfactory results (r2 ≥ 0.97, with a minimum root mean squared error (RMSE) of 26 ppm); the metal oxide sensors, despite their moderate results (minimum RMSE of 40 ppm and r2 of 0.8–0.96), presented hysteresis, environmental dependence, and even positioning interference. The results suggest that most of the evaluated low-cost sensors can provide a good sense of reality at a very good cost–benefit ratio in certain situations.
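
For reference, a short sketch of the two headline metrics used above, the coefficient of determination r2 and the RMSE in ppm, computed against a reference instrument; the readings are synthetic, not the paper's data.

# r^2 and RMSE of a low-cost sensor against a reference instrument (synthetic data).
import numpy as np

reference = np.array([400, 450, 500, 600, 800, 1000, 1200])  # ppm CO2 (synthetic)
low_cost  = np.array([410, 445, 520, 590, 830, 990, 1230])   # ppm CO2 (synthetic)

rmse = np.sqrt(np.mean((low_cost - reference) ** 2))
ss_res = np.sum((reference - low_cost) ** 2)
ss_tot = np.sum((reference - reference.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"RMSE = {rmse:.1f} ppm, r^2 = {r2:.3f}")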

27 pages, 2781 KiB  
Article
IoT Network Security: Threats, Risks, and a Data-Driven Defense Framework
by Charles Wheelus and Xingquan Zhu
IoT 2020, 1(2), 259-285; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020016 - 19 Oct 2020
Cited by 51 | Viewed by 12332
Abstract
The recent surge in Internet of Things (IoT) deployment has increased the pace of integration and extended the reach of the Internet from computers, tablets and phones to a myriad of devices in our physical world. Driven by the IoT, with each passing day, the Internet becomes more integrated with everyday life. While IoT devices provide endless new capabilities and make life more convenient, they also vastly increase the opportunity for nefarious individuals, criminal organizations and even state actors to spy on, and interfere with, unsuspecting users of IoT systems. As this looming crisis continues to grow, calls for data science approaches to address these problems have increased, and current research shows that predictive models trained with machine learning algorithms hold great potential to mitigate some of these issues. In this paper, we first carry out an analytical review of the security risks associated with IoT systems, and then propose a machine learning-based solution to characterize and detect IoT attacks. We use a real-world IoT system with secured gate access as a platform and introduce the IoT system in detail, including the features that capture security threats/attacks on the system. Using data collected over a nine-month period as our testbed, we evaluate the efficacy of predictive models trained by means of machine learning, and propose design principles and a loose framework for implementing secure IoT systems.
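
As an illustration of the data-driven step, the sketch below turns raw event logs into per-session feature vectors ready for a predictive model; the event fields and the feature set are hypothetical, not the paper's actual schema.

# Feature-extraction sketch: aggregate raw IoT events into per-session features.
from collections import defaultdict

events = [
    {"session": "s1", "type": "gate_open", "auth_fail": 0},
    {"session": "s1", "type": "gate_open", "auth_fail": 1},
    {"session": "s2", "type": "probe", "auth_fail": 1},
    {"session": "s2", "type": "probe", "auth_fail": 1},
]

features = defaultdict(lambda: {"n_events": 0, "n_auth_fail": 0, "n_probe": 0})
for e in events:
    f = features[e["session"]]
    f["n_events"] += 1
    f["n_auth_fail"] += e["auth_fail"]
    f["n_probe"] += e["type"] == "probe"

print(dict(features))  # feature vectors ready for a classifier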
(This article belongs to the Special Issue Cyber Security and Privacy in IoT)

19 pages, 602 KiB  
Article
A Deep Learning Model for Demand-Driven, Proactive Tasks Management in Pervasive Computing
by Kostas Kolomvatsos and Christos Anagnostopoulos
IoT 2020, 1(2), 240-258; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020015 - 14 Oct 2020
Cited by 6 | Viewed by 2680
Abstract
Pervasive computing applications deal with the intelligence surrounding users, which can facilitate their activities. This intelligence is provided in the form of software components incorporated in embedded systems or devices in close proximity to end users. One example of an infrastructure that can host intelligent pervasive services is the Edge Computing (EC) ecosystem. EC nodes can execute a number of tasks for data collected by devices present in the Internet of Things (IoT). In this paper, we propose an intelligent, proactive task management model based on demand. Demand depicts the number of users or applications interested in using the available tasks in EC nodes, thus characterizing their popularity. We rely on a Deep Machine Learning (DML) model, more specifically a Long Short Term Memory (LSTM) network, to learn the distribution of demand indicators for each task and estimate future interest in them. This information is combined with historical observations to support a decision-making scheme that concludes which tasks should be offloaded due to limited interest in them. Our decision making also takes into consideration the load that each task may add to the processing node where it will be allocated. The description of our model is accompanied by a large set of experimental simulations evaluating the proposed mechanism. We provide numerical results which reveal that the proposed scheme is capable of deciding on the fly while reaching the most efficient decisions.
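
A minimal sketch of an LSTM demand estimator of the kind described above, learning from a sliding window of past demand indicators; the data and shapes are synthetic, and the paper adds the load-aware offloading decision on top.

# Sliding-window LSTM forecaster sketch on a synthetic demand signal.
import numpy as np
from tensorflow.keras import Sequential
from tensorflow.keras.layers import LSTM, Dense

rng = np.random.default_rng(4)
series = np.sin(np.linspace(0, 20, 500)) + rng.normal(0, 0.1, 500)  # demand indicator

win = 10
X = np.stack([series[i:i + win] for i in range(len(series) - win)])[..., None]
y = series[win:]

model = Sequential([LSTM(32, input_shape=(win, 1)), Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, verbose=0)
print("next-step demand estimate:", float(model.predict(X[-1:], verbose=0)[0, 0]))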
(This article belongs to the Special Issue Efficiency of Modern Data Centers (EMDC))

22 pages, 2578 KiB  
Article
Sentiment Analysis on Twitter Data of World Cup Soccer Tournament Using Machine Learning
by Ravikumar Patel and Kalpdrum Passi
IoT 2020, 1(2), 218-239; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020014 - 10 Oct 2020
Cited by 44 | Viewed by 8454
Abstract
In the derived approach, an analysis is performed on Twitter data for the 2014 World Cup soccer tournament held in Brazil to detect the sentiment of people throughout the world using machine learning techniques. By filtering and analyzing the data using natural language processing techniques, sentiment polarity was calculated based on the emotion words detected in the user tweets. The dataset was normalized for use by machine learning algorithms and prepared using natural language processing techniques like word tokenization, stemming and lemmatization, part-of-speech (POS) tagging, named entity recognition (NER), and parsing to extract emotions from the textual data of each tweet. This approach was implemented using the Python programming language and the Natural Language Toolkit (NLTK). A derived algorithm extracts emotional words using WordNet with the POS (part of speech) of each word in a sentence that has a meaning in the current context, and assigns sentiment polarity using the SentiWordNet dictionary or a lexicon-based method. The resultant polarity is further analyzed using naïve Bayes, support vector machine (SVM), K-nearest neighbor (KNN), and random forest machine learning algorithms and visualized on the Weka platform. Naïve Bayes gives the best accuracy of 88.17%, whereas random forest gives the best area under the receiver operating characteristic curve (AUC) of 0.97.
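
Since the pipeline is built on Python and NLTK, the SentiWordNet lookup step can be sketched as follows; it assumes the relevant NLTK corpora (sentiwordnet, wordnet) have been downloaded, and the averaging rule is a simplified stand-in for the paper's polarity assignment.

# SentiWordNet polarity lookup sketch; requires nltk.download("sentiwordnet")
# and nltk.download("wordnet") to have been run.
from nltk.corpus import sentiwordnet as swn

def word_polarity(word, pos):
    """Average positive-minus-negative score over SentiWordNet synsets."""
    synsets = list(swn.senti_synsets(word, pos))
    if not synsets:
        return 0.0
    return sum(s.pos_score() - s.neg_score() for s in synsets) / len(synsets)

print(word_polarity("brilliant", "a"))  # adjective -> positive score
print(word_polarity("terrible", "a"))   # adjective -> negative score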
(This article belongs to the Special Issue The Leverage of Social Media and IoT)

20 pages, 3062 KiB  
Article
A User Study of a Wearable System to Enhance Bystanders’ Facial Privacy
by Alfredo J. Perez, Sherali Zeadally, Scott Griffith, Luis Y. Matos Garcia and Jaouad A. Mouloud
IoT 2020, 1(2), 198-217; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020013 - 10 Oct 2020
Cited by 8 | Viewed by 2885
Abstract
The privacy of users and information is becoming increasingly important with the growth and pervasive use of mobile devices such as wearables, mobile phones, drones, and Internet of Things (IoT) devices. Many of these mobile devices are equipped with cameras that enable users to take pictures and record videos whenever they need to. In many such cases, bystanders' privacy is not a concern, and as a result, audio and video of bystanders are often captured without their consent. We present results from a user study in which 21 participants were asked to use a wearable system called FacePET, developed to enhance bystanders' facial privacy by providing a way for bystanders to protect their own privacy rather than relying on external systems for protection. While past work in the literature has focused on the privacy perceptions of bystanders photographed in public/shared spaces, there has been no research focused on user perceptions of bystander-based wearable devices to enhance privacy. Thus, in this work, we focus on user perceptions of the FacePET device and similar wearables to enhance bystanders' facial privacy. In our study, we found that 16 participants would use FacePET or similar devices to enhance their facial privacy, and 17 participants agreed that if smart glasses had features to conceal users' identities, it would allow them to become more popular.
(This article belongs to the Special Issue Cyber Security and Privacy in IoT)

18 pages, 9433 KiB  
Article
Predictive Maintenance of Bus Fleet by Intelligent Smart Electronic Board Implementing Artificial Intelligence
by Alessandro Massaro, Sergio Selicato and Angelo Galiano
IoT 2020, 1(2), 180-197; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020012 - 01 Oct 2020
Cited by 15 | Viewed by 5888
Abstract
This paper is focused on the design and development of a smart and compact electronic control unit (ECU) for the monitoring of a bus fleet. The ECU system is able to extract all vehicle data via the on-board diagnostics (OBD)-II and SAE J1939 standards. The integrated Internet of Things (IoT) system is interconnected in the cloud by an artificial intelligence engine implementing a multilayer perceptron artificial neural network (MLP-ANN) and is able to predict the maintenance of each vehicle by classifying driver behavior. The key performance indicator (KPI) of driver behavior has been estimated by the k-means data mining algorithm. The MLP-ANN model has been tested on a dataset from the literature, allowing the correct choice of the calculation parameters. A low mean square error (MSE) of the order of 10−3 was verified, thus proving the correct use of the MLP-ANN. Based on the analysis of the results, KPI methodologies are defined that correlate driver behavior with engine stress, defining the criteria of the bus maintenance plan. All the results are joined into a cloud platform showing fleet efficiency dashboards. The proposed topic has been developed within the framework of an industry research project in collaboration with a company managing a bus fleet.
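
A sketch of the k-means step described above, clustering driving samples into behaviour classes usable as a KPI; the two features are hypothetical stand-ins for the OBD-II/J1939 signals the system collects.

# k-means driver-behaviour clustering sketch on synthetic telemetry features.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
calm = rng.normal([1800, 0.5], [150, 0.2], (100, 2))        # (mean RPM, harsh brakes/min)
aggressive = rng.normal([2600, 2.5], [200, 0.5], (100, 2))
X = np.vstack([calm, aggressive])

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("cluster centres (RPM, harsh brakes/min):\n", km.cluster_centers_.round(1))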
(This article belongs to the Special Issue Edge Computing Optimization Using Artificial Intelligence Methods)

19 pages, 5178 KiB  
Article
Visual and Artistic Effects of an IoT System in Smart Cities: Research Flow
by Mariana-Daniela González-Zamar and Emilio Abad-Segura
IoT 2020, 1(2), 161-179; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020011 - 27 Sep 2020
Cited by 4 | Viewed by 3278
Abstract
In smart cities, the progress of technology has allowed the implementation of sensors, giving rise to the Internet of Things (IoT) and making cities safer and more sustainable. Hence, the presence of elements that generate visual and artistic effects of IoT technology can make a great contribution to providing the information that the urbanite needs. The aim of this study is to analyze worldwide research on the visual and artistic effects of IoT in smart cities. Bibliometric techniques were applied to 1278 articles on this subject matter for the period 2010–2019 to obtain results on research activity. Production has increased yearly, with the last triennium accumulating 85.21% of the documents. Computer science and engineering were the most prominent subject areas in which the articles were classified. The lines of research in the development of this topic have been detected, and the main directions for future research have also been identified. This study aims to highlight the drivers of this field of research, in addition to providing the available information and future directions to improve academic and scientific discussion.
(This article belongs to the Special Issue Internet of Things Technologies for Smart Cities)

14 pages, 4339 KiB  
Article
Temperature-Compensated Spread Spectrum Sound-Based Local Positioning System for Greenhouse Operations
by Lok Wai Jacky Tsay, Tomoo Shiigi, Zichen Huang, Xunyue Zhao, Tetsuhito Suzuki, Yuichi Ogawa and Naoshi Kondo
IoT 2020, 1(2), 147-160; https://0-doi-org.brum.beds.ac.uk/10.3390/iot1020010 - 27 Sep 2020
Cited by 5 | Viewed by 2772
Abstract
A spread spectrum sound-based local positioning system (SSSLPS) has been developed for indoor agricultural robots by our research group. Such an SSSLPS has several advantages, including effective propagation, low cost, and ease of use. When using sound velocity for field position measurements in a greenhouse, spatial and temporal variations in temperature during the day can have a major effect on sound velocity and, consequently, on positioning accuracy. In this research, a temperature-compensated sound velocity positioning method was proposed and evaluated against a conventional temperature sensor method. Results indicate that the proposed method achieves a positioning accuracy to within 20 mm in a 3 m × 9 m ridged greenhouse. It has the potential to replace the current approach of using temperature sensors in a greenhouse.
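
A worked example of why temperature compensation matters: in air, the speed of sound is approximately c(T) = 331.3 + 0.606·T m/s (T in degrees Celsius), so an error in the assumed temperature biases any time-of-flight range; the numbers below are illustrative, not the paper's measurements.

# Effect of temperature on sound-speed ranging; time of flight is hypothetical.
def sound_speed(temp_c):
    return 331.3 + 0.606 * temp_c  # m/s, approximate linear model for air

tof = 0.01  # measured time of flight, s (hypothetical)
for t in (15.0, 30.0):
    print(f"{t:.0f} C: c = {sound_speed(t):.1f} m/s, range = {sound_speed(t) * tof:.3f} m")
# A 15 C temperature error shifts a ~3.4 m range by ~9 cm -- far beyond the
# 20 mm accuracy target above, hence the need for compensation.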
