
Computers, Volume 13, Issue 1 (January 2024) – 31 articles

Cover Story: Automatic data fusion is an important field of machine learning that has been increasingly studied. The objective is to improve the classification performance obtained from several individual classifiers in terms of accuracy and stability of the results. The fusion step can be applied at early and/or late stages of the classification procedure. Early fusion combines features from different sources or domains before training the individual classifiers; late fusion, in turn, combines the results of the individual classifiers after testing. Late fusion has two setups: combination of posterior probabilities (soft fusion) and combination of decisions (hard fusion). A theoretical analysis of the conditions for applying early and late fusion and a study of recent data fusion methods are provided.
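The two late-fusion setups mentioned in the cover story can be conveyed with a minimal sketch; the function names and the example posterior probabilities below are invented for illustration, not taken from the paper:

```python
# A minimal, hypothetical sketch of the two late-fusion setups.

def soft_fusion(posteriors):
    """Soft fusion: average the classifiers' per-class posterior probabilities."""
    n = len(posteriors)
    classes = posteriors[0].keys()
    avg = {c: sum(p[c] for p in posteriors) / n for c in classes}
    return max(avg, key=avg.get)

def hard_fusion(decisions):
    """Hard fusion: majority vote over the classifiers' crisp decisions."""
    return max(set(decisions), key=decisions.count)

# Three hypothetical classifiers scoring classes "a" and "b":
posteriors = [{"a": 0.6, "b": 0.4}, {"a": 0.4, "b": 0.6}, {"a": 0.7, "b": 0.3}]
print(soft_fusion(posteriors))       # "a": highest mean posterior (0.57 vs 0.43)
print(hard_fusion(["a", "b", "a"]))  # "a": two of the three classifiers vote for it
```

Soft fusion can change the outcome relative to hard fusion when a minority classifier is very confident, which is one reason the choice between the two setups matters.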
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for published papers, which appear in both HTML and PDF forms. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
18 pages, 4771 KiB  
Article
Multiclass AI-Generated Deepfake Face Detection Using Patch-Wise Deep Learning Model
by Muhammad Asad Arshed, Shahzad Mumtaz, Muhammad Ibrahim, Christine Dewi, Muhammad Tanveer and Saeed Ahmed
Computers 2024, 13(1), 31; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010031 - 21 Jan 2024
Cited by 1 | Viewed by 3533
Abstract
In response to the rapid advancements in facial manipulation technologies, particularly facilitated by Generative Adversarial Networks (GANs) and Stable Diffusion-based methods, this paper explores the critical issue of deepfake content creation. The increasing accessibility of these tools necessitates robust detection methods to curb potential misuse. In this context, this paper investigates the potential of Vision Transformers (ViTs) for effective deepfake image detection, leveraging their capacity to extract global features. Objective: The primary goal of this study is to assess the viability of ViTs in detecting multiclass deepfake images compared to traditional Convolutional Neural Network (CNN)-based models. By framing the deepfake problem as a multiclass task, this research introduces a novel approach, considering the challenges posed by Stable Diffusion and StyleGAN2. The objective is to enhance understanding and efficacy in detecting manipulated content within a multiclass context. Novelty: This research distinguishes itself by approaching the deepfake detection problem as a multiclass task, introducing new challenges associated with Stable Diffusion and StyleGAN2. The study pioneers the exploration of ViTs in this domain, emphasizing their potential to extract global features for enhanced detection accuracy. The novelty lies in addressing the evolving landscape of deepfake creation and manipulation. Results and Conclusion: Through extensive experiments, the proposed method exhibits high effectiveness, achieving impressive detection accuracy, precision, and recall, and an F1 rate of 99.90% on a multiclass-prepared dataset. The results underscore the significant potential of ViTs in contributing to a more secure digital landscape by robustly addressing the challenges posed by deepfake content, particularly in the presence of Stable Diffusion and StyleGAN2. 
The proposed model outperformed state-of-the-art CNN-based models, i.e., ResNet-50 and VGG-16. Full article
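As a rough illustration of the patch-wise idea (not the authors' code), a Vision Transformer first splits an image into fixed-size, non-overlapping patches that are then flattened into tokens; the helper below is a hypothetical minimal version using nested lists in place of pixel arrays:

```python
# A hypothetical minimal version of ViT-style patch extraction: split an
# H x W image (nested lists stand in for pixel arrays) into non-overlapping
# P x P patches and flatten each patch into a token.

def to_patches(image, p):
    h, w = len(image), len(image[0])
    patches = []
    for r in range(0, h, p):          # top-left corner of each patch
        for c in range(0, w, p):
            patches.append([image[r + i][c + j]
                            for i in range(p) for j in range(p)])
    return patches

img = [[r * 4 + c for c in range(4)] for r in range(4)]  # 4x4 toy "image"
print(to_patches(img, 2))  # 4 flattened 2x2 patches; the first is [0, 1, 4, 5]
```

In a real ViT each flattened patch would then be linearly projected to an embedding and processed by the transformer's self-attention layers, which is what gives the model its global receptive field.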

18 pages, 5486 KiB  
Article
Improved Deep Learning Model for Workpieces of Rectangular Pipeline Surface Defect Detection
by Changxing Chen and Afizan Azman
Computers 2024, 13(1), 30; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010030 - 21 Jan 2024
Viewed by 1292
Abstract
This study introduces a novel approach to address challenges in workpiece surface defect identification. It presents an enhanced Single Shot MultiBox Detector model, incorporating attention mechanisms and multi-feature fusion. The research methodology involves carefully curating a dataset from authentic on-site factory production, enabling the training of a model with robust real-world generalization. Building on the Single Shot MultiBox Detector model, channel and spatial attention mechanisms were integrated into the feature extraction network. Diverse feature extraction methods enhance the network’s focus on crucial information, improving its defect detection efficacy. The proposed model achieves a significant Mean Average Precision (mAP) improvement, reaching 99.98% precision, a substantial 3% advancement over existing methodologies. Notably, the values of the P-R curves in object detection approach 1 for each category, which allows a better balance between the requirements of real-time detection and precision. Within the threshold range of 0.2 to 1, the model maintains a stable level of precision, consistently remaining between 0.99 and 1. In addition, the average running speed is only 2 fps lower than that of other models, and the reduction in detection speed after the model improvement is kept within 1%. The experimental results indicate that the model excels in pixel-level defect identification, which is crucial for precise defect localization. Empirical experiments validate the algorithm’s superior performance. This research represents a pivotal advancement in workpiece surface defect identification, combining technological innovation with practical efficacy. Full article

15 pages, 1827 KiB  
Article
An Interactive Training Model for Myoelectric Regression Control Based on Human–Machine Cooperative Performance
by Carles Igual, Alberto Castillo and Jorge Igual
Computers 2024, 13(1), 29; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010029 - 21 Jan 2024
Viewed by 1315
Abstract
Electromyography-based wearable biosensors are used for prosthetic control. Machine learning prosthetic controllers are based on classification and regression models. The advantage of the regression approach is that it permits a smoother and more natural controller. However, the existing training methods for regression-based solutions are the same as the training protocol used in the classification approach, where only a finite set of movements is trained. In this paper, we present a novel training protocol for myoelectric regression-based solutions that includes a feedback term, allowing us to explore more than a finite set of movements, and that is automatically adjusted according to the real-time performance of the subject during the training session. Consequently, the algorithm distributes the training time efficiently, focusing on the movements where performance is worse and optimizing the training for each user. We tested and compared the existing and new training strategies in 20 able-bodied participants and 4 amputees. The results show that the novel training procedure autonomously produces a better training session. As a result, the new controller outperforms the one trained with the existing method: for the able-bodied participants, the average number of targets hit increased from 86% to 95% and the path efficiency from 40% to 84%, while for the subjects with limb deficiencies, the completion rate increased from 58% to 69% and the path efficiency from 24% to 56%. Full article
(This article belongs to the Special Issue Uncertainty-Aware Artificial Intelligence)
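The performance-driven distribution of training time described in the abstract can be sketched with a simple proportional-to-deficit rule; the rule and all names below are assumptions made for illustration, not the authors' actual protocol:

```python
# An illustrative proportional-to-deficit rule: movements with worse
# performance scores receive a larger share of the training session.

def allocate_time(performance, total_minutes):
    """performance: movement -> score in (0, 1]; returns minutes per movement."""
    deficit = {m: 1.0 - s for m, s in performance.items()}
    total = sum(deficit.values()) or 1.0  # avoid division by zero
    return {m: total_minutes * d / total for m, d in deficit.items()}

perf = {"open": 0.9, "close": 0.5, "rotate": 0.6}
print(allocate_time(perf, 30))  # "close", the worst movement, gets 15 of 30 min
```

In the paper's interactive setting this kind of allocation would be recomputed continuously from real-time performance rather than once per session.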

29 pages, 11833 KiB  
Article
Enhancing Collaborative Learning and E-Mentoring in a Smart Education System in Higher Education
by Loan Nguyen, Sarath Tomy and Eric Pardede
Computers 2024, 13(1), 28; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010028 - 18 Jan 2024
Viewed by 1876
Abstract
The requirement to develop a smart education system is critical in the era of ubiquitous technology. In the smart education environment, intelligent pedagogies are constructed to take advantage of technological devices and foster learners’ competencies, which undoubtedly assist learners in dealing with knowledge and handling issues in a dynamic society more effectively and productively. This research suggests two effective learning strategies: (1) collaborative learning, which helps learners improve their knowledge and skills by exchanging resources and experiences, and (2) e-mentoring, which connects learners to a wide range of professional communities. This research first proposes a model to show how these two learning methods help learners achieve their goals, along with a set of hypotheses that are explained in detail. Then, a smart education system is proposed which comprises the two learning strategies with the necessary features. Lastly, two questionnaires, one for facilitators and the other for learners, are used to evaluate the usefulness and the feasibility of the proposed model in a real-world educational environment. The great majority of respondents agreed with all the statements, demonstrating the usefulness of the research for educators and learners. Full article
(This article belongs to the Special Issue Recent Advances in Computer-Assisted Learning)

42 pages, 969 KiB  
Review
A Review of Blockchain’s Role in E-Commerce Transactions: Open Challenges, and Future Research Directions
by Latifa Albshaier, Seetah Almarri and M. M. Hafizur Rahman
Computers 2024, 13(1), 27; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010027 - 17 Jan 2024
Cited by 1 | Viewed by 9116
Abstract
The Internet’s expansion has changed how services are accessed and how businesses operate. Blockchain is an innovative technology that emerged after the rise of the Internet. It maintains transactions on encrypted databases distributed among many computer networks, much like digital ledgers for online transactions. This technology has the potential to establish a decentralized marketplace for Internet retailers. Sensitive information, like customer data and financial statements, is routinely transferred in e-commerce. As a result, the system becomes a prime target for cybercriminals seeking illegal access to data. As e-commerce grows, so does the frequency of hacker attacks, raising concerns about the safety of e-commerce platforms’ databases. Owing to the sensitivity of customer data, employee records, and customer records, organizations must ensure their protection. A data breach not only affects an enterprise’s financial performance but also erodes clients’ confidence in the platform. Currently, e-commerce businesses face numerous challenges, including the security of the e-commerce system and transparency of, and trust in, its effectiveness. A solution to these issues is the application of blockchain technology in the e-commerce industry. Blockchain simplifies fraud detection and investigation by recording transactions and the accompanying data: it creates a detailed record of all related data, which can assist in identifying and preventing fraud in the future. A blockchain cryptocurrency transaction records the sender’s address, the recipient’s address, the amount transferred, and a timestamp, creating an immutable and transparent ledger of all transaction data. Full article

14 pages, 993 KiB  
Article
Development of a New Post-Quantum Digital Signature Algorithm: Syrga-1
by Kunbolat Algazy, Kairat Sakan, Ardabek Khompysh and Dilmukhanbet Dyusenbayev
Computers 2024, 13(1), 26; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010026 - 16 Jan 2024
Viewed by 1447
Abstract
The distinguishing feature of hash-based algorithms is the high confidence in their security. When designing electronic signature schemes, proofs of security reduction to certain properties of cryptographic hash functions are used. This means that if the scheme is compromised, then one of these properties will be violated. It is important to note that the properties of cryptographic hash functions have been studied for many years, and if a specific hash function used in a protocol turns out to be insecure, it can simply be replaced with another one while keeping the overall construction unchanged. This article describes a new post-quantum signature algorithm, Syrga-1, based on a hash function. This algorithm is designed to sign r messages with a single secret key. One of the key primitives of the signature algorithm is a cryptographic hash function. The proposed algorithm uses the HAS01 hashing algorithm developed by researchers from the Information Security Laboratory of the Institute of Information and Computational Technologies. The security and efficiency of the specified hash algorithm have been demonstrated in other articles by its authors. Hash-based signature schemes are attractive as post-quantum signature schemes because their security can be quantified and has been proven. Full article
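For readers unfamiliar with hash-based signatures, the classic Lamport one-time scheme conveys the core idea of reducing signature security to hash-function properties. This is a generic textbook sketch, not Syrga-1, and SHA-256 merely stands in for the HAS01 hash:

```python
import hashlib
import secrets

# A textbook sketch of the Lamport one-time signature, the simplest
# hash-based scheme; SHA-256 stands in for the paper's HAS01 hash.

def H(data):
    return hashlib.sha256(data).digest()

def keygen(bits=256):
    # Secret key: one random preimage pair per message-digest bit.
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(bits)]
    # Public key: the hashes of those preimages.
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bit(digest, i):
    return (digest[i // 8] >> (i % 8)) & 1

def sign(msg, sk):
    d = H(msg)
    # Reveal, for each digest bit, the preimage matching that bit's value.
    return [sk[i][bit(d, i)] for i in range(len(sk))]

def verify(msg, sig, pk):
    d = H(msg)
    return all(H(sig[i]) == pk[i][bit(d, i)] for i in range(len(pk)))

sk, pk = keygen()
sig = sign(b"post-quantum", sk)
print(verify(b"post-quantum", sig, pk))  # True
print(verify(b"tampered", sig, pk))      # False: the revealed preimages mismatch
```

Forging a signature for a new message requires inverting the hash at some bit position, which is exactly the preimage-resistance property the security reduction relies on; a key pair must be used only once, which is why multi-message schemes like the one described build further structure on top of this primitive.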

19 pages, 802 KiB  
Review
Securing Mobile Edge Computing Using Hybrid Deep Learning Method
by Olusola Adeniyi, Ali Safaa Sadiq, Prashant Pillai, Mohammad Aljaidi and Omprakash Kaiwartya
Computers 2024, 13(1), 25; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010025 - 16 Jan 2024
Cited by 1 | Viewed by 1847
Abstract
In recent years, Mobile Edge Computing (MEC) has revolutionized the landscape of the telecommunication industry by offering low-latency, high-bandwidth, and real-time processing. With this advancement comes a broad range of security challenges, the most prominent of which is Distributed Denial of Service (DDoS) attacks, which threaten the availability and performance of MEC’s services. In most cases, Intrusion Detection Systems (IDSs), security tools that monitor networks and systems for suspicious activity and notify administrators in real time of potential cyber threats, have relied on shallow Machine Learning (ML) models that are limited in their ability to identify and mitigate DDoS attacks. This article highlights the drawbacks of current IDS solutions, primarily their reliance on shallow ML techniques, and proposes a novel hybrid Autoencoder–Multi-Layer Perceptron (AE–MLP) model for intrusion detection as a solution against DDoS attacks in the MEC environment. The proposed hybrid AE–MLP model leverages autoencoders’ feature extraction capabilities to capture intricate patterns and anomalies within network traffic data. This extracted knowledge is then fed into a Multi-Layer Perceptron (MLP) network, enabling deep learning techniques to further analyze and classify potential threats. By integrating both AE and MLP, the hybrid model achieves higher accuracy and robustness in identifying DDoS attacks while minimizing false positives. In extensive experiments using the recently released NF-UQ-NIDS-V2 dataset, which contains a wide range of DDoS attacks, the proposed hybrid AE–MLP model achieves a high accuracy of 99.98%. Based on the results, the hybrid approach performs better than several similar techniques. Full article

23 pages, 6251 KiB  
Article
Augmented Reality Escape Classroom Game for Deep and Meaningful English Language Learning
by Angeliki Voreopoulou, Stylianos Mystakidis and Avgoustos Tsinakos
Computers 2024, 13(1), 24; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010024 - 16 Jan 2024
Cited by 1 | Viewed by 1900
Abstract
A significant volume of literature has extensively reported on and presented the benefits of employing escape classroom games (ECGs), on one hand, and on augmented reality (AR) in English language learning, on the other. However, there is little evidence on how AR-powered ECGs can enhance deep and meaningful foreign language learning. Hence, this study presents the design, development and user evaluation of an innovative augmented reality escape classroom game created for teaching English as a foreign language (EFL). The game comprises an imaginative guided group tour around the Globe Theatre in London that is being disrupted by Shakespeare’s ghost. The game was evaluated by following a qualitative research method that depicts the in-depth perspectives of ten in-service English language teachers. The data collection instruments included a 33-item questionnaire and semi-structured interviews. The findings suggest that this escape game is a suitable pedagogical tool for deep and meaningful language learning and that it can raise cultural awareness, while enhancing vocabulary retention and the development of receptive and productive skills in English. Students’ motivation and satisfaction levels toward language learning are estimated to remain high due to the game’s playful nature, its interactive elements, as well as the joyful atmosphere created through active communication, collaboration, creativity, critical thinking and peer work. This study provides guidelines and support for the design and development of similar augmented reality escape classroom games (ARECGs) to improve teaching practices and foreign language education. Full article
(This article belongs to the Special Issue Extended Reality (XR) Applications in Education 2023)

12 pages, 1412 KiB  
Article
Cloud-Based Infrastructure and DevOps for Energy Fault Detection in Smart Buildings
by Kaleb Horvath, Mohamed Riduan Abid, Thomas Merino, Ryan Zimmerman, Yesem Peker and Shamim Khan
Computers 2024, 13(1), 23; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010023 - 16 Jan 2024
Viewed by 1451
Abstract
We have designed a real-world smart building energy fault detection (SBFD) system on a cloud-based Databricks workspace, a high-performance computing (HPC) environment for big-data-intensive applications powered by Apache Spark. By avoiding a Smart Building Diagnostics as a Service approach and keeping a tightly centralized design, the rapid development and deployment of the cloud-based SBFD system was achieved within one calendar year. Thanks to Databricks’ built-in scheduling interface, a continuous pipeline of real-time ingestion, integration, cleaning, and analytics workflows capable of energy consumption prediction and anomaly detection was implemented and deployed in the cloud. The system currently provides fault detection in the form of predictions and anomaly detection for 96 buildings on an active military installation. The system’s various jobs all converge within 14 min on average. It facilitates seamless interaction between our workspace and cloud data lake storage, which provides secure, automated initial ingestion of raw data supplied by a third party via the Secure File Transfer Protocol (SFTP) and BLOB (Binary Large Object) file system secure protocol drivers. With PySpark, a powerful Python binding to the Apache Spark distributed computing framework, these actions were coded into collaborative notebooks and chained into the aforementioned pipeline. The pipeline was successfully managed and configured throughout the lifetime of the project and continues to meet our needs in deployment. In this paper, we outline the general architecture and how it differs from previous smart building diagnostics initiatives, present details surrounding the underlying technology stack of our data pipeline, and enumerate some of the configuration steps required to maintain and develop this big data analytics application in the cloud. Full article
(This article belongs to the Special Issue Sensors and Smart Cities 2023)

14 pages, 283 KiB  
Article
A Comparative Study of Commit Representations for JIT Vulnerability Prediction
by Tamás Aladics, Péter Hegedűs and Rudolf Ferenc
Computers 2024, 13(1), 22; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010022 - 11 Jan 2024
Viewed by 1297
Abstract
With the evolution of software systems, their size and complexity are rising rapidly. Identifying vulnerabilities as early as possible is crucial for ensuring high software quality and security. Just-in-time (JIT) vulnerability prediction, which aims to find vulnerabilities at the time of commit, has increasingly become a focus of attention. In our work, we present a comparative study to provide insights into the current state of JIT vulnerability prediction by examining three candidate models: CC2Vec, DeepJIT, and Code Change Tree. These unique approaches aptly represent the various techniques used in the field, allowing us to offer a thorough description of the current limitations and strengths of JIT vulnerability prediction. Our focus was on the predictive power of the models, their usability in terms of false positive (FP) rates, and the granularity of the source code analysis they are capable of handling. For training and evaluation, we used two recently published datasets containing vulnerability-inducing commits: ProjectKB and Defectors. Our results highlight the trade-offs between predictive accuracy and operational flexibility and also provide guidance on the use of ML-based automation for developers, especially considering false positive rates in commit-based vulnerability prediction. These findings can serve as crucial insights for future research and practical applications in software security. Full article

23 pages, 9993 KiB  
Article
Advanced Road Safety: Collective Perception for Probability of Collision Estimation of Connected Vehicles
by Sabrine Belmekki and Dominique Gruyer
Computers 2024, 13(1), 21; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010021 - 9 Jan 2024
Viewed by 1272
Abstract
In the dynamic landscape of vehicular communication systems, connected vehicles (CVs) present unprecedented capabilities in perception, cooperation, and, notably, collision probability management. This paper’s main concern is the estimation of the probability of collision. Achieving effective collision estimation heavily relies on the sensor perception of obstacles and a critical collision probability prediction system. This paper is dedicated to refining the estimation of collision probability through the intentional integration of CV communications, with a specific focus on the collective perception of connected vehicles. The primary objective is to enhance the understanding of the probability of collisions in the surrounding environment by harnessing the collective insights gathered through inter-vehicular communication and collaboration. This improvement enables a superior anticipation capacity for both the driving system and the human driver, thereby enhancing road safety. Furthermore, the incorporation of extended perception strategies holds the potential for more accurate collision probability estimation, giving the driving system or human driver more time to react and make informed decisions, further fortifying road safety measures. The results underscore a significant enhancement in collision probability awareness, as connected vehicles collectively contribute to a more comprehensive picture of collision probability. Consequently, this heightened collective perception improves the anticipation capacity of both the driving system and the human driver, contributing to an elevated level of road safety. For future work, the exploration of our extended perception techniques to achieve real-time estimation of the probability of collision is proposed. Such endeavors aim to drive the development of robust and anticipatory autonomous driving systems that truly harness the benefits of connected vehicle technologies.
Full article
(This article belongs to the Special Issue Cooperative Vehicular Networking 2023)

37 pages, 8647 KiB  
Article
Forecasting of Bitcoin Illiquidity Using High-Dimensional and Textual Features
by Faraz Sasani, Mohammad Moghareh Dehkordi, Zahra Ebrahimi, Hakimeh Dustmohammadloo, Parisa Bouzari, Pejman Ebrahimi, Enikő Lencsés and Mária Fekete-Farkas
Computers 2024, 13(1), 20; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010020 - 9 Jan 2024
Viewed by 1498
Abstract
Liquidity is the ease of converting an asset (physical or digital) into cash or another asset without loss, and it is shown by the relationship between the time scale and the price scale of an investment. This article examines the illiquidity of Bitcoin (BTC). Bitcoin hash rate information was collected at three different time intervals; in parallel with these data, textual information related to these intervals was collected from Twitter for each day. Due to the regression nature of illiquidity prediction, approaches based on recurrent networks were suggested. Seven approaches (ANN, SVM, SANN, LSTM, Simple RNN, GRU, and IndRNN) were tested on these data. To evaluate these approaches, three evaluation methods were used: random split (paper), random split (run), and linear split (run). The research results indicate that the IndRNN approach provided the best results. Full article
(This article belongs to the Special Issue Uncertainty-Aware Artificial Intelligence)

35 pages, 14642 KiB  
Article
Faraway, so Close: Perceptions of the Metaverse on the Edge of Madness
by Mónica Cruz, Abílio Oliveira and Alessandro Pinheiro
Computers 2024, 13(1), 19; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010019 - 8 Jan 2024
Viewed by 1445
Abstract
With the evolution of technologies, virtual reality allows us to dive into cyberspace through different devices and have immersive experiences in different contexts, in what we simply call virtual worlds or the multiverse (integrating Metaverse versions). Through virtual reality, it is possible to create infinite simulated environments in which to immerse ourselves. The future internet may be slightly different from what we use today. Virtual immersion situations are common (particularly in gaming), and the Metaverse has become a lived and almost real experience claiming its presence in our daily lives. To investigate possible perspectives or concepts regarding the Metaverse, virtual reality, and immersion, we considered a main research question: To what extent can a film centered on the multiverse be associated with adults’ perceptions of the Metaverse? Considering that all participants are adults, the objectives of this study are to: (1) verify the representations of the Metaverse; (2) verify the representations of immersion; (3) verify the representations of the multiverse; and (4) verify the importance of a film (related to the Metaverse and the multiverse) on the representations found. This study, framed in a Ph.D. research project, analyzed the participants’ answers to an online survey that used two films to gather thoughts, ideas, emotions, sentiments, and reactions according to our research objectives. Some limitations were considered, such as the number of participants, the number of questionnaire questions, and the participants’ knowledge (or lack thereof) of the main concepts. Our results showed that a virtual world created by a movie may stimulate the perception of almost living in that supposed reality, accepting the multiverse and the Metaverse not as distant concepts but as close experiences, even in an unconscious form. This finding is also a positive contribution to an ongoing discussion aiming for an essential understanding of the Metaverse as a complex concept. Full article

18 pages, 889 KiB  
Article
Mining Negative Associations from Medical Databases Considering Frequent, Regular, Closed and Maximal Patterns
by Raja Rao Budaraju and Sastry Kodanda Rama Jammalamadaka
Computers 2024, 13(1), 18; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010018 - 8 Jan 2024
Viewed by 1368
Abstract
Many data mining studies have focused on mining positive associations among frequent and regular item sets. However, none have considered time and regularity when mining such associations. The frequent and regular item sets will be huge, even when regularity and frequency are considered, unless time is also taken into account. Negative associations are equally important in medical databases, reflecting considerable discrepancies in the medications used to treat various disorders. It is important to find the most effective negative associations, and the set of mined associations should be as small as possible so that the most important disconnections can be found. This paper proposes a mining method that mines medical databases to find regular, frequent, closed, and maximal item sets that reflect minimal negative associations. The proposed algorithm reduces the negative associations by 70% when the maximal and closed properties are used, for any sample size, regularity threshold, or frequency threshold. Full article

17 pages, 622 KiB  
Article
An Event-Centric Knowledge Graph Approach for Public Administration as an Enabler for Data Analytics
by Dimitris Zeginis and Konstantinos Tarabanis
Computers 2024, 13(1), 17; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010017 - 5 Jan 2024
Viewed by 1668
Abstract
In a continuously evolving environment, organizations, including public administrations, need to quickly adapt to change and make decisions in real-time. This requires a real-time understanding of their context, which can be achieved by adopting an event-native mindset in data management that focuses on the dynamics of change rather than traditional state-based approaches. In this context, this paper proposes the adoption of an event-centric knowledge graph approach for the holistic data management of all data repositories in public administration. To this end, the paper proposes an event-centric knowledge graph model for the domain of public administration that captures these dynamics, treating events as first-class entities for knowledge representation. The development of the model is based on a state-of-the-art analysis of existing event-centric knowledge graph models that led to the identification of core concepts related to event representation, on a state-of-the-art analysis of existing public administration models that identified the core entities of the domain, and on a theoretical analysis of concepts related to events, public services, and effective public administration in order to outline the context and identify the domain-specific needs for event modeling. Further, the paper applies the model in the context of Greek public administration in order to validate it and showcase the possibilities that arise. The results show that the adoption of event-centric knowledge graph approaches for data management in public administration can facilitate data analytics, continuous integration, and the provision of a 360-degree view of end-users. We anticipate that the proposed approach will also facilitate real-time decision-making, continuous intelligence, and ubiquitous AI. Full article

20 pages, 1362 KiB  
Article
A Multi-Channel Packet Scheduling Approach to Improving Video Delivery Performance in Vehicular Networks
by Pedro Pablo Garrido Abenza, Manuel P. Malumbres, Pablo Piñol and Otoniel López-Granado
Computers 2024, 13(1), 16; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010016 - 4 Jan 2024
Viewed by 1312
Abstract
When working with the Wireless Access in Vehicular Environment (WAVE) protocol stack, the multi-channel operation mechanism of the IEEE 1609.4 protocol may impact the overall network performance, especially when using video streaming applications. In general, packets delivered from the application layer during a Control Channel (CCH) time slot have to wait for transmission until the next Service Channel (SCH) time slot arrives. The accumulation of packets at the beginning of the latter time slot may introduce additional delays and higher contention when all the network nodes try, at the same time, to obtain access to the shared channel in order to send the delayed packets as soon as possible. In this work, we have analyzed these performance issues and proposed a new method, which we call SkipCCH, that helps the MAC layer to overcome the high contention produced by the packet transmission bursts at the beginning of every SCH slot. This high contention implies an increase in the number of packet losses, which directly impacts the overall network performance. With our proposal, streaming video in vehicular networks will provide a better quality of reconstructed video at the receiver side under the same network conditions. Furthermore, this method has particularly proven its benefits when working with Quality of Service (QoS) techniques, not only by increasing the received video quality but also because it avoids starvation of the lower-priority traffic. Full article
(This article belongs to the Special Issue Vehicular Networking and Intelligent Transportation Systems 2023)

2 pages, 328 KiB  
Correction
Correction: Nayak et al. Brain Tumour Classification Using Noble Deep Learning Approach with Parametric Optimization through Metaheuristics Approaches. Computers 2022, 11, 10
by Dillip Ranjan Nayak, Neelamadhab Padhy, Pradeep Kumar Mallick, Dilip Kumar Bagal and Sachin Kumar
Computers 2024, 13(1), 15; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010015 - 3 Jan 2024
Viewed by 1175
Abstract
Figure 1 was reproduced without the correct copyright permissions from the copyright holder (Medical Sciences) [...] Full article

25 pages, 9733 KiB  
Article
Blockchain-Powered Gaming: Bridging Entertainment with Serious Game Objectives
by Dimitrios Stamatakis, Dimitrios G. Kogias, Pericles Papadopoulos, Panagiotis A. Karkazis and Helen C. Leligou
Computers 2024, 13(1), 14; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010014 - 3 Jan 2024
Viewed by 2073
Abstract
The advancement and acceptance of new technologies often hinge on the level of understanding and trust among potential users. Blockchain technology, despite its broad applications across diverse sectors, is often met with skepticism due to a general lack of understanding and incidents of illicit activities in the cryptocurrency domain. This study aims to demystify blockchain technology by providing an in-depth examination of its application in a novel blockchain-based card game, centered around renewable energy and sustainable resource management. This paper introduces a serious game that uses blockchain to enhance user interaction, ownership, and gameplay, demonstrating the technology’s potential to revolutionize the gaming industry. Notable aspects of the game, such as ownership of virtual assets, transparent transaction histories, trustless game mechanics, user-driven content creation, gasless transactions, and mechanisms for in-game asset trading and cross-platform asset reuse, are analyzed. The paper discusses how these features not only provide a richer gaming experience but also serve as effective tools for raising awareness about sustainable energy and resource management, thereby bridging the gap between entertainment and education. The case study offers valuable insights into how blockchain can create dynamic, secure, and participatory virtual environments, shifting the paradigm of traditional online gaming. Full article
(This article belongs to the Special Issue When Blockchain Meets IoT: Challenges and Potentials)

21 pages, 2681 KiB  
Article
A Comparative Study on Recent Automatic Data Fusion Methods
by Luis Manuel Pereira, Addisson Salazar and Luis Vergara
Computers 2024, 13(1), 13; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010013 - 30 Dec 2023
Cited by 1 | Viewed by 1732
Abstract
Automatic data fusion is an important field of machine learning that has been increasingly studied. The objective is to improve the classification performance from several individual classifiers in terms of accuracy and stability of the results. This paper presents a comparative study on recent data fusion methods. The fusion step can be applied at early and/or late stages of the classification procedure. Early fusion consists of combining features from different sources or domains to form the observation vector before the training of the individual classifiers. In contrast, late fusion consists of combining the results from the individual classifiers after the testing stage. Late fusion has two setups: combination of the posterior probabilities (scores), which is called soft fusion, and combination of the decisions, which is called hard fusion. A theoretical analysis of the conditions for applying the three kinds of fusion (early, late soft, and late hard) is introduced. We then propose a comparative analysis of different fusion schemes, including the strengths and weaknesses of the state-of-the-art methods studied, from the following perspectives: sensors, features, scores, and decisions. Full article
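The two late-fusion setups described in the abstract can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: soft fusion averages the classifiers' posterior probabilities before deciding, while hard fusion takes a majority vote over the individual decisions.

```python
# Toy late-fusion sketch (illustrative only, not the paper's method).

def soft_fusion(posteriors):
    """Average per-class posterior probabilities from several classifiers,
    then pick the class with the highest fused score."""
    n = len(posteriors)
    n_classes = len(posteriors[0])
    fused = [sum(p[c] for p in posteriors) / n for c in range(n_classes)]
    return max(range(n_classes), key=fused.__getitem__)

def hard_fusion(decisions):
    """Majority vote over the individual classifiers' class decisions."""
    return max(set(decisions), key=decisions.count)

# Three classifiers, two classes: soft fusion weighs confidence,
# hard fusion only counts votes, so they can disagree.
posteriors = [[0.9, 0.1], [0.4, 0.6], [0.45, 0.55]]
decisions = [0, 1, 1]
print(soft_fusion(posteriors))  # class 0 wins on averaged confidence
print(hard_fusion(decisions))   # class 1 wins the majority vote
```

The example shows why the two setups can differ: one very confident classifier can dominate soft fusion while being outvoted under hard fusion.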

19 pages, 4843 KiB  
Article
A Robust Timing Synchronization Algorithm Based on PSSS for LTE-V2X
by Ju Zhang, Bin Chen, Jiahui Qiu, Lingfan Zhuang, Zhiyuan Wang and Liu Liu
Computers 2024, 13(1), 12; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010012 - 30 Dec 2023
Viewed by 1339
Abstract
In recent years, Long-Term Evolution Vehicle-to-Everything (LTE-V2X) communication technology has received extensive attention. Timing synchronization is a crucial step in the receiving process, addressing Timing Offsets (TOs) resulting from random propagation delays, sampling frequency mismatches between the transmitter and receiver or a combination of both. However, the presence of high-speed relative movement between nodes and a low antenna height leads to a significant Doppler frequency offset, resulting in a low Signal-to-Noise Ratio (SNR) for received signals in LTE-V2X communication scenarios. This paper aims to investigate LTE-V2X technology with a specific focus on time synchronization. The research centers on the time synchronization method utilizing the Primary Sidelink Synchronization Signal (PSSS) and conducts a comprehensive analysis of existing algorithms, highlighting their respective advantages and disadvantages. On this basis, a robust timing synchronization algorithm for LTE-V2X communication scenarios is proposed. The algorithm comprises three key steps: coarse synchronization, frequency offset estimation and fine synchronization. Enhanced robustness is achieved through algorithm fusion, optimal decision threshold design and predefined frequency offset values. Furthermore, a hardware-in-the-loop simulation platform is established. The simulation results demonstrate a substantial performance improvement for the proposed algorithm compared to existing methods under adverse channel conditions characterized by high frequency offsets and low SNR. Full article
(This article belongs to the Special Issue Vehicular Networking and Intelligent Transportation Systems 2023)

25 pages, 2717 KiB  
Article
Towards Blockchain-Integrated Enterprise Resource Planning: A Pre-Implementation Guide
by Lahlou Imane, Motaki Noureddine, Sarsri Driss and L’yarfi Hanane
Computers 2024, 13(1), 11; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010011 - 26 Dec 2023
Viewed by 2003
Abstract
In the face of numerous challenges in supply chain management, new technologies are being implemented to overcome obstacles and improve overall performance. Among these technologies, blockchain, a part of the distributed ledger family, offers several advantages when integrated with ERP systems, such as transparency, traceability, and data security. However, blockchain remains a novel, complex, and costly technology. The purpose of this paper is to guide decision-makers in determining whether integrating blockchain technology with ERP systems is appropriate during the pre-implementation phase. The paper draws on literature reviews, theories, and expert opinions to achieve its objectives. It first provides an overview of blockchain technology, then discusses its potential benefits to the supply chain, and finally proposes a framework to assist decision-makers in determining whether blockchain meets the needs of their consortium and whether this integration aligns with available resources. The results highlight the complexity of blockchain, the importance of detailed and in-depth research in deciding whether to integrate blockchain technology into ERP systems, and future research prospects. The findings of this article also present the critical decisions to be made prior to the implementation of blockchain, in the event that decision-makers choose to proceed with blockchain integration. These findings augment the existing literature and can be applied in real-world contexts by stakeholders involved in blockchain integration projects with ERP systems. Full article

17 pages, 950 KiB  
Article
Facilitating Communication in Neuromuscular Diseases: An Adaptive Approach with Fuzzy Logic and Machine Learning in Augmentative and Alternative Communication Systems
by Jhon Fernando Sánchez-Álvarez, Gloria Patricia Jaramillo-Álvarez and Jovani Alberto Jiménez-Builes
Computers 2024, 13(1), 10; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010010 - 26 Dec 2023
Viewed by 1484
Abstract
Augmentative and alternative communication (AAC) techniques are essential to assist individuals facing communication difficulties. (1) Background: It is acknowledged that dynamic solutions that adjust to the changing needs of patients are necessary in the context of neuromuscular diseases. (2) Methods: In order to address this concern, a differential approach was suggested that entailed the prior identification of the disease state. This approach employs fuzzy logic to ascertain the disease stage by analyzing intuitive patterns; it is contrasted with two intelligent systems. (3) Results: The results indicate that the AAC system’s adaptability improves with the progression of the disease’s phases, thereby ensuring its utility throughout the lifespan of the individual. Although the adaptive AAC system exhibits signs of improvement, an expanded assessment involving a greater number of patients is required. (4) Conclusions: Qualitative assessments of comparative studies shed light on the difficulties associated with enhancing accuracy and adaptability. This research highlights the significance of investigating the use of fuzzy logic or artificial intelligence methods in order to solve the issue of symptom variability in disease staging. Full article
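As a rough illustration of how fuzzy logic can ascertain a disease stage from a continuous symptom score: triangular membership functions map the score to degrees of membership in each stage. The stage names and breakpoints below are hypothetical, not the paper's actual rule base.

```python
# Hedged fuzzy-staging sketch (hypothetical stages and breakpoints).

def tri(x, a, b, c):
    """Triangular membership function: 0 at a and c, peaking at 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def stage_memberships(score):
    """Map a normalized symptom score in [0, 1] to fuzzy stage degrees."""
    return {
        "early":        tri(score, -0.01, 0.0, 0.5),
        "intermediate": tri(score, 0.2, 0.5, 0.8),
        "advanced":     tri(score, 0.5, 1.0, 1.01),
    }

m = stage_memberships(0.6)
print(max(m, key=m.get))  # "intermediate" has the highest degree at 0.6
```

Unlike a crisp threshold, a score of 0.6 here belongs partially to both "intermediate" and "advanced", which is the property that lets an AAC system adapt gradually rather than switching modes abruptly.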

16 pages, 1372 KiB  
Article
Custom ASIC Design for SHA-256 Using Open-Source Tools
by Lucas Daudt Franck, Gabriel Augusto Ginja, João Paulo Carmo, José A. Afonso and Maximiliam Luppe
Computers 2024, 13(1), 9; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010009 - 25 Dec 2023
Cited by 1 | Viewed by 2697
Abstract
The growth of digital communications has driven the development of numerous cryptographic methods for secure data transfer and storage. The SHA-256 algorithm is a cryptographic hash function widely used for validating data authenticity, identity, and integrity. The inherent SHA-256 computational overhead has motivated the search for more efficient hardware solutions, such as application-specific integrated circuits (ASICs). This work presents a custom ASIC hardware accelerator for the SHA-256 algorithm entirely created using open-source electronic design automation tools. The integrated circuit was synthesized using SkyWater SKY130 130 nm process technology through the OpenLANE automated workflow. The proposed final design is compatible with 32-bit microcontrollers, has a total area of 104,585 µm2, and operates at a maximum clock frequency of 97.9 MHz. Several optimization configurations were tested and analyzed during the synthesis phase to enhance the performance of the final design. Full article
(This article belongs to the Special Issue Feature Papers in Computers 2023)
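When validating a hardware SHA-256 implementation such as this one, a common practice (not specific to the paper) is to check its digests against a software reference and known-answer test vectors, for example via Python's standard hashlib:

```python
# Software SHA-256 reference for generating known-answer test vectors,
# e.g. to check the digests produced by a hardware accelerator.
import hashlib

def sha256_hex(data: bytes) -> str:
    """Return the SHA-256 digest of `data` as a hex string."""
    return hashlib.sha256(data).hexdigest()

# NIST FIPS 180-4 test vector for the message "abc".
print(sha256_hex(b"abc"))
# ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad
```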

33 pages, 8073 KiB  
Article
The Doubly Linked Tree of Singly Linked Rings: Providing Hard Real-Time Database Operations on an FPGA
by Simon Lohmann and Dietmar Tutsch
Computers 2024, 13(1), 8; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010008 - 24 Dec 2023
Viewed by 1463
Abstract
We present a hardware data structure specifically designed for FPGAs that enables the execution of hard real-time database CRUD operations using a hybrid data structure that combines trees and rings. While the number of rows and columns has to be limited for hard real-time execution, the actual content can be of any size. Our structure restricts full navigational freedom to every layer but the leaf layer, thus keeping the memory overhead for the data stored in the leaves low. Although its nodes differ in function, all have exactly the same size and structure, reducing the number of cascaded decisions required in the database operations. This enables fast and efficient hardware implementation on FPGAs. In addition to the usual comparison with known data structures, we also analyze the tradeoff between the memory consumption of our approach and a simplified version that is doubly linked in all layers. Full article
(This article belongs to the Special Issue Advances in Database Engineered Applications 2023)

20 pages, 11378 KiB  
Article
Robust Face Mask Detection by a Socially Assistive Robot Using Deep Learning
by Yuan Zhang, Meysam Effati, Aaron Hao Tan and Goldie Nejat
Computers 2024, 13(1), 7; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010007 - 23 Dec 2023
Viewed by 1539
Abstract
Wearing masks in indoor and outdoor public places has been mandatory in a number of countries during the COVID-19 pandemic. Correctly wearing a face mask can reduce the transmission of the virus through respiratory droplets. In this paper, a novel two-step deep learning (DL) method based on our extended ResNet-50 is presented. It can detect and classify whether face masks are missing, are worn correctly or incorrectly, or the face is covered by other means (e.g., a hand or hair). Our DL method utilizes transfer learning with pretrained ResNet-50 weights to reduce training time and increase detection accuracy. Training and validation are achieved using the MaskedFace-Net, MAsked FAces (MAFA), and CelebA datasets. The trained model has been incorporated onto a socially assistive robot for robust and autonomous detection by a robot using lower-resolution images from the onboard camera. The results show a classification accuracy of 84.13% for the classification of no mask, correctly masked, and incorrectly masked faces in various real-world poses and occlusion scenarios using the robot. Full article

34 pages, 2372 KiB  
Article
Multi-Network Latency Prediction for IoT and WSNs
by Josiah E. Balota, Ah-Lian Kor and Olatunji A. Shobande
Computers 2024, 13(1), 6; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010006 - 23 Dec 2023
Viewed by 1369
Abstract
The domain of Multi-Network Latency Prediction for IoT and Wireless Sensor Networks (WSNs) confronts significant challenges. However, continuous research efforts and progress in areas such as machine learning, edge computing, security technologies, and hybrid modelling are steadily closing the identified gaps. Effectively addressing the inherent complexities in this field will play a crucial role in unlocking the full potential of latency prediction systems within the dynamic and diverse landscape of the Internet of Things (IoT). Using linear interpolation and extrapolation algorithms, the study explores the use of multi-network real-time end-to-end latency data for precise prediction. This approach has significantly improved network performance through throughput and response-time optimization. The findings indicate high prediction accuracy: the majority of experimental connection pairs achieved over 95% accuracy, with the remainder falling within a 70% to 95% range. This research provides tangible evidence that data packet and end-to-end latency predictions for heterogeneous low-rate and low-power WSNs, facilitated by a localized database, can substantially enhance network performance and minimize latency. Our proposed JosNet model simplifies and streamlines WSN prediction by employing linear interpolation and extrapolation techniques. The research findings also underscore the potential of this approach to revolutionize the management and control of data packets in WSNs, paving the way for more efficient and responsive wireless sensor networks. Full article
(This article belongs to the Special Issue Applied ML for Industrial IoT)
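The linear interpolation/extrapolation idea underlying the prediction can be sketched as follows. This is a minimal illustration with made-up latency samples; it does not reproduce the JosNet model itself.

```python
# Minimal latency prediction by linear interpolation/extrapolation
# over timestamped end-to-end latency samples (illustrative only).

def predict_latency(samples, t):
    """Linearly interpolate (or extrapolate) latency at time t.

    samples: list of (timestamp, latency) pairs sorted by timestamp.
    """
    if len(samples) < 2:
        raise ValueError("need at least two samples")
    # Pick the bracketing pair; if t lies outside the sample range,
    # the nearest edge segment is used for extrapolation.
    for (t0, l0), (t1, l1) in zip(samples, samples[1:]):
        if t <= t1:
            break
    slope = (l1 - l0) / (t1 - t0)
    return l0 + slope * (t - t0)

samples = [(0, 10.0), (10, 14.0), (20, 12.0)]  # (ms timestamps, ms latency)
print(predict_latency(samples, 5))   # interpolation within [0, 10]
print(predict_latency(samples, 25))  # extrapolation from the last segment
```

In practice, the samples would come from a localized database of measured end-to-end latencies per connection pair, refreshed as new measurements arrive.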

18 pages, 1023 KiB  
Article
NLP Sentiment Analysis and Accounting Transparency: A New Era of Financial Record Keeping
by Alessio Faccia, Julie McDonald and Babu George
Computers 2024, 13(1), 5; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010005 - 23 Dec 2023
Viewed by 2437
Abstract
Transparency in financial reporting is crucial for maintaining trust in financial markets, yet fraudulent financial statements remain challenging to detect and prevent. This study introduces a novel approach to detecting financial statement fraud by applying sentiment analysis to analyse the textual data within financial reports. This research aims to identify patterns and anomalies that might indicate fraudulent activities by examining the language and sentiment expressed across multiple fiscal years. The study focuses on three companies known for financial statement fraud: Wirecard, Tesco, and Under Armour. Utilising Natural Language Processing (NLP) techniques, the research analyses polarity (positive or negative sentiment) and subjectivity (degree of personal opinion) within the financial statements, revealing intriguing patterns. Wirecard showed a consistent tone with a slight decrease in 2018, Tesco exhibited marked changes in the fraud year, and Under Armour presented subtler shifts during the fraud years. While the findings present promising trends, the study emphasises that sentiment analysis alone cannot definitively detect financial statement fraud. It provides insights into the tone and mood of the text but cannot reveal intentional deception or financial discrepancies. The results serve as supplementary information, enriching traditional financial analysis methods. This research contributes to the field by exploring the potential of sentiment analysis in financial fraud detection, offering a unique perspective that complements quantitative methods. It opens new avenues for investigation and underscores the need for an integrated, multidimensional approach to fraud detection. Full article
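A toy lexicon-based scorer illustrates the kind of polarity score the study analyzes. Real NLP toolkits such as TextBlob compute polarity and subjectivity from far larger lexicons; the words and weights below are invented purely for illustration.

```python
# Toy polarity scorer (hypothetical lexicon, for illustration only).

POLARITY = {"growth": 0.5, "strong": 0.6, "record": 0.4,
            "loss": -0.6, "impairment": -0.7, "decline": -0.5}

def polarity(text: str) -> float:
    """Mean lexicon score of matched words, in [-1, 1]; 0.0 if none match."""
    hits = [POLARITY[w] for w in text.lower().split() if w in POLARITY]
    return sum(hits) / len(hits) if hits else 0.0

print(polarity("Record growth and strong margins"))       # positive tone
print(polarity("Impairment charges drove a steep loss"))  # negative tone
```

Tracking such scores across fiscal years is what lets the study compare tone shifts, although, as the abstract notes, tone alone cannot establish deception.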

0 pages, 906 KiB  
Article
Ionospheric Error Models for Satellite-Based Navigation—Paving the Road towards LEO-PNT Solutions
by Majed Imad, Antoine Grenier, Xiaolong Zhang, Jari Nurmi and Elena Simona Lohan
Computers 2024, 13(1), 4; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010004 - 22 Dec 2023
Viewed by 1771
Abstract
Low Earth Orbit (LEO) constellations have recently gained tremendous attention in the navigational field due to their larger constellation size, faster geometry variations, and higher signal power levels than Global Navigation Satellite Systems (GNSS), making them favourable for Position, Navigation, and Timing (PNT) purposes. Satellite signals are heavily attenuated by the atmospheric layers, especially the ionosphere. Ionospheric delays are, however, expected to be smaller in signals from LEO satellites than GNSS due to their lower orbital altitudes and higher carrier frequency. Nevertheless, unlike for GNSS, there are currently no standardized models for correcting the ionospheric errors in LEO signals. In this paper, we derive a new model called Interpolated and Averaged Memory Model (IAMM) starting from existing International GNSS Service (IGS) data and based on the observation that ionospheric effects repeat every 11 years. Our IAMM model can be used for ionospheric corrections for signals from any satellite constellation, including LEO. This model is constructed based on averaging multiple ionospheric data and reflecting the electron content inside the ionosphere. The IAMM model’s primary advantage is its ability to be used both online and offline without needing real-time input parameters, thus making it easy to store in a device’s memory. We compare this model with two benchmark models, the Klobuchar and International Reference Ionosphere (IRI) models, by utilizing GNSS measurement data from 24 scenarios acquired in several European countries using both professional GNSS receivers and Android smartphones. The model’s behaviour is also evaluated on LEO signals using simulated data (as measurement data based on LEO signals are still not available in the open-access community); we show a significant reduction in ionospheric delays in LEO signals compared to GNSS.
Finally, we highlight the remaining open challenges toward viable ionospheric-delay models in an LEO-PNT context. Full article

24 pages, 485 KiB  
Article
Modeling Seasonality of Emotional Tension in Social Media
by Alexey Nosov, Yulia Kuznetsova, Maksim Stankevich, Ivan Smirnov and Oleg Grigoriev
Computers 2024, 13(1), 3; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010003 - 22 Dec 2023
Viewed by 1277
Abstract
Social media has become an almost unlimited resource for studying social processes. Seasonality is a phenomenon that significantly affects many physical and mental states. Modeling collective emotional seasonal changes is a challenging task for the technical, social, and humanities sciences. This is due to the laboriousness and complexity of obtaining a sufficient amount of data, processing and evaluating them, and presenting the results. At the same time, understanding the annual dynamics of collective sentiment provides us with important insights into collective behavior, especially in various crises or disasters. In our study, we propose a scheme for identifying and evaluating signs of the seasonal rise and fall of emotional tension based on social media texts. The analysis is based on Russian-language comments in VKontakte social network communities devoted to city news and the events of a small town in the Nizhny Novgorod region, Russia. Workflow steps include a statistical method for categorizing data, exploratory analysis to identify common patterns, data aggregation for modeling seasonal changes, the identification of typical data properties through clustering, and the formulation and validation of seasonality criteria. As a result of seasonality modeling, it is shown that the calendar seasonal model corresponds to the data, and the dynamics of emotional tension correlate with the seasons. The proposed methodology is useful for a wide range of social practice issues, such as monitoring public opinion or assessing irregular shifts in mass emotions. Full article
(This article belongs to the Special Issue Computational Modeling of Social Processes and Social Networks)
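The aggregation-and-criterion part of the workflow described in the abstract can be sketched in a few lines. This is a minimal illustrative sketch only: the comment scores, the month grouping, and the winter-versus-summer contrast are hypothetical stand-ins, not the paper's actual data or seasonality criteria.

```python
from statistics import mean
from collections import defaultdict

# Hypothetical input: (month, tension_score) pairs for individual comments,
# where tension_score is any scalar measure of emotional tension.
comments = [
    (1, 0.72), (1, 0.68), (2, 0.70), (4, 0.41), (5, 0.38),
    (7, 0.30), (7, 0.33), (8, 0.35), (10, 0.55), (12, 0.66),
]

# Aggregate comment-level scores into a monthly series.
by_month = defaultdict(list)
for month, score in comments:
    by_month[month].append(score)
monthly = {m: mean(scores) for m, scores in by_month.items()}

# A toy seasonality criterion: the winter mean should exceed the summer
# mean by a clear margin (months missing from the data are skipped).
winter = mean(monthly[m] for m in (12, 1, 2) if m in monthly)
summer = mean(monthly[m] for m in (6, 7, 8) if m in monthly)
seasonal_gap = winter - summer
```

With the toy scores above, `seasonal_gap` is positive, i.e., tension peaks in winter; a real analysis would validate such a criterion across multiple years, as the paper's workflow does.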
18 pages, 3160 KiB  
Article
Twofold Machine-Learning and Molecular Dynamics: A Computational Framework
by Christos Stavrogiannis, Filippos Sofos, Maria Sagri, Denis Vavougios and Theodoros E. Karakasidis
Computers 2024, 13(1), 2; https://0-doi-org.brum.beds.ac.uk/10.3390/computers13010002 - 22 Dec 2023
Cited by 1 | Viewed by 1470
Abstract
Data science and machine learning (ML) techniques are employed to shed light on the molecular mechanisms that affect fluid-transport properties at the nanoscale. Viscosity and thermal conductivity values of four basic elements, namely, argon, krypton, nitrogen, and oxygen, are gathered from experimental and simulation data in the literature and constitute a primary database for further investigation. The data refer to a wide pressure–temperature (P-T) phase space, covering fluid states from gas to liquid and supercritical. The database is enriched with new simulation data extracted from our equilibrium molecular dynamics (MD) simulations. A machine learning (ML) framework with ensemble, classical, kernel-based, and stacked algorithmic techniques is also constructed to function in parallel with the MD model, trained by existing data and predicting the values of new phase space points. In terms of algorithmic performance, it is shown that the stacked and tree-based ML models have given the most accurate results for all elements and can be excellent choices for small to medium-sized datasets. In this way, a twofold computational scheme is constructed, functioning as a computationally inexpensive route that achieves high accuracy, aiming to replace costly experiments and simulations, when feasible. Full article
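The stacking idea mentioned in the abstract — base learners whose predictions are combined by a meta-learner — can be illustrated in pure Python. This is a minimal sketch under stated assumptions: the training data, the choice of base learners (least-squares linear and 1-nearest-neighbour), and the single averaged meta-feature are all illustrative and not taken from the paper.

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a*x + b for one feature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return lambda x: a * x + b

def fit_nearest(xs, ys):
    """1-nearest-neighbour predictor over the training points."""
    pairs = list(zip(xs, ys))
    return lambda x: min(pairs, key=lambda p: abs(p[0] - x))[1]

# Toy training data: a transport property (e.g., a viscosity-like value)
# sampled at a few temperatures; values are illustrative only.
temps = [100.0, 150.0, 200.0, 250.0, 300.0]
prop = [1.10, 0.85, 0.70, 0.60, 0.55]

base = [fit_linear(temps, prop), fit_nearest(temps, prop)]

# Meta level: average the base predictions into a single feature and fit a
# second linear model on it (a deliberately simplified stacking step).
meta_feature = [sum(m(x) for m in base) / len(base) for x in temps]
meta = fit_linear(meta_feature, prop)

def stacked_predict(x):
    return meta(sum(m(x) for m in base) / len(base))
```

A real stacking setup would use out-of-fold base predictions to avoid leaking training targets into the meta-learner; the sketch skips this for brevity.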