
Information, Volume 12, Issue 10 (October 2021) – 39 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open it.
Article
Improving Undergraduate Novice Programmer Comprehension through Case-Based Teaching with Roles of Variables to Provide Scaffolding
Information 2021, 12(10), 424; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100424 - 16 Oct 2021
Abstract
A role-based teaching approach was proposed to reduce the cognitive load that the case-based teaching method places on undergraduate novice programmers' comprehension. The results were evaluated using the SOLO (Structure of Observed Learning Outcomes) taxonomy. Data analysis suggested that novice programmers taught with the role-based approach tended to perform better than those taught with the classical case-based method in the SOLO level of program comprehension, program debugging scores, and program explaining scores, though not in programming language knowledge scores. Considering the SOLO category of program comprehension and these performances, we discuss evidence that the roles of variables can provide scaffolding for understanding case programs by linking their program structure with the related problem domain, and we propose SOLO categories for relational reasoning. Meanwhile, the roles of variables can assist novices in learning programming language knowledge. These results indicate that combining case-based teaching with the roles of variables is an effective way to improve novice program comprehension.
(This article belongs to the Special Issue Future Trends in Computer Programming Education)
Review
Method to Address Complexity in Organizations Based on a Comprehensive Overview
Information 2021, 12(10), 423; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100423 - 16 Oct 2021
Abstract
Digitalization increasingly forces organizations to accommodate change and build resilience. Emerging technologies, changing organizational structures, and dynamic work environments bring opportunities and pose new challenges to organizations. Such developments, together with the growing volume and variety of exchanged data, mainly yield complexity. This complexity often represents a solid barrier to efficiency and impedes understanding, controlling, and improving processes in organizations. Hence, organizations are predominantly seeking to identify and avoid unnecessary complexity, which is an odd mixture of different factors. Similarly, in research, much effort has been put into measuring, reviewing, and studying complexity. However, these efforts are highly fragmented and lack a joint perspective, which negatively affects the acceptance of complexity research by practitioners. In this study, we extend the body of knowledge on complexity research and practice by addressing this high fragmentation. In particular, a comprehensive literature analysis of complexity research is conducted to capture the different types of complexity in organizations. The results are comparatively analyzed, and a morphological box containing three aspects and ten features is developed. In addition, an established multi-dimensional complexity framework is employed to synthesize the results. Using the findings from these analyses and adopting the Goal Question Metric, we propose a method for complexity management. This method provides key insights and decision support in the form of extensive guidelines for addressing complexity. Thus, our findings can assist organizations in their complexity management initiatives.
Article
Relativistic Effects on Satellite–Ground Two–Way Precise Time Synchronization
Information 2021, 12(10), 422; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100422 - 15 Oct 2021
Abstract
An ultra-high-precision clock (space optical clock) will be installed onboard a low-orbit spacecraft (a usual expression for a low-orbit satellite operating at an altitude of less than 1000 km) in the future; it is expected to achieve better time-frequency performance in a microgravity environment and to make ultra-precise long-range time synchronization possible. The advancement of the microwave two-way time synchronization method can offer an effective solution for developing time-frequency transfer technology. In this study, we focus on a method of precise satellite-ground two-way time synchronization and present its key aspects. To reduce the relativistic effects on two-way precise time synchronization, we propose a high-precision correction method. We show the results of tests using simulated data with fully realistic effects such as atmospheric delays, orbit errors, and Earth's gravity, and demonstrate the satisfactory performance of the methods. The accuracy of the relativistic error correction method is investigated in terms of the spacecraft attitude error, the phase center calibration error (the residual error after calibrating the phase center offset), and the precise orbit determination (POD) error. The results show that the phase center calibration error and the POD error contribute most to the residual of the relativistic correction, at approximately 0.1~0.3 ps, and that a time synchronization accuracy better than 0.6 ps can be achieved with the proposed methods. In conclusion, the relativistic error correction method is effective, and the satellite-ground two-way precise time synchronization method yields more accurate results: the BeiDou two-way time synchronization system achieves only sub-ns accuracy, whereas the methods in this paper improve the final accuracy to the ps level.
(This article belongs to the Section Information Processes)
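As background to the two-way method discussed above, the minimal sketch below (Python, with hypothetical timestamps) illustrates the basic two-way time-transfer relation: if the uplink and downlink delays are symmetric, differencing the two one-way measurements cancels the path delay and isolates the clock offset. The relativistic corrections developed in the paper account for departures from this symmetry and are not reproduced here.

```python
def two_way_offset(t1, t2, t3, t4):
    """Estimated satellite-minus-ground clock offset (s).

    t1: ground transmit time (ground clock), t2: satellite receive time
    (satellite clock), t3: satellite transmit time, t4: ground receive time.
    Assumes symmetric up/down path delays; relativistic effects break this
    symmetry and must be corrected separately, as the paper describes.
    """
    return ((t2 - t1) - (t4 - t3)) / 2.0

def path_delay(t1, t2, t3, t4):
    """Estimated one-way signal delay (s) under the same symmetry assumption."""
    return ((t2 - t1) + (t4 - t3)) / 2.0

# Hypothetical timestamps: true offset 2 ns, true one-way delay 3 ms
t1, t2, t3, t4 = 0.0, 0.003 + 2e-9, 0.010, 0.013 - 2e-9
print(two_way_offset(t1, t2, t3, t4))   # ~2e-9 s = 2 ns
print(path_delay(t1, t2, t3, t4))       # ~3e-3 s
```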
Article
The Digital Dimension of Mobilities: Mapping Spatial Relationships between Corporeal and Digital Displacements in Barcelona
Information 2021, 12(10), 421; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100421 - 15 Oct 2021
Abstract
This paper explores the ways in which technologies reshape everyday activities, adopting a mobility perspective on the digital environment, which is reframed as a constitutive/substitutive element of corporeal mobility. We propose the construction of a Digital Mobility Index, quantified by measuring the usage typology in which technology is employed to enable mobility. Through a digital perspective on mobilities, it is possible to investigate how embodied practices and experiences of different modes of physical or virtual displacement are facilitated by and emerge through technologies. The role of technologies in anchoring mobilities, transporting tangible and intangible flows of goods, and mediating social relations through space and time is emphasized through an analysis of how digital usage can reproduce models typical of the neoliberal city, whose effects in terms of spatial (in)justice have been widely discussed in the literature. The polarization inherent to the digital divide has been characterized by a separation between what has been called the “space of flows” (well connected, mobile, and offering more opportunities) and the “space of places” (poorly connected, fixed, and isolated). This digital divide takes many forms, including divisions between classes, urban locations, and national spaces. By mapping “hyper- and hypo-mobilized” territories in Barcelona, this paper examines two main dimensions of digital inequality: on the one hand, identifying the usage of technological and digital means in terms of the capacity to reach services and places; and on the other, measuring the territorial demographic and economic propensity to access ICT as a predictive insight into the geographies of the social gap that emerge at the municipal level. This approach complements conventional data sources, such as municipal statistics and the digital divide enquiry conducted in Barcelona, on the underlying digital capacities of the city and the digital skills of the population.
(This article belongs to the Special Issue Beyond Digital Transformation: Digital Divides and Digital Dividends)
Article
Industrial Networks Driven by SDN Technology for Dynamic Fast Resilience
Information 2021, 12(10), 420; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100420 - 15 Oct 2021
Abstract
Software-defined networking (SDN) provides the prospect of logically centralized management in industrial networks and simplified programming among devices. It also facilitates the reconfiguration of connectivity when a network element fails. This paper presents a new Industrial SDN (ISDN) resilience approach that addresses the gap between two types of resilience: restoration and protection. A restoration approach increases the recovery time in proportion to the number of affected flows, in contrast to the protection approach, which attains fast recovery. Nevertheless, the protection approach installs more flow rules (flow entries) in the switch, which in turn increases the lookup time taken to find an appropriate flow entry in the flow table. This can negatively affect the end-to-end delay before a failure occurs (in the normal situation). To balance the two approaches, we propose a Mixed Fast Resilience (MFR) approach that ensures fast recovery of the primary path without any impact on the end-to-end delay in the normal situation. In MFR, the SDN controller establishes a new path after failure detection based on flow rules stored in its memory in a dynamic hash table that serves as the internal flow table. It then transmits the flow rules to all switches along the appropriate secondary path simultaneously, from the failure point to the destination switch. Moreover, the flow rules corresponding to secondary paths are cached in the hash table according to the current minimum path weight. This strategy reduces both the load on the SDN controller and the calculation time of a new working path. The MFR approach applies the dual primary by considering several metrics, such as packet-loss probability, delay, and bandwidth, which are the Quality of Service (QoS) requirements of many industrial applications. We have built a simulation network and conducted an experimental testbed. The results showed that our resilience approach reduces the failure recovery time compared with restoration approaches and is more scalable than a protection approach. In the normal situation, the MFR approach achieves lower lookup time and end-to-end delay than a protection approach. Furthermore, the proposed approach improves performance by minimizing packet loss even under failing links.
(This article belongs to the Section Artificial Intelligence)
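To make the MFR idea above concrete, here is a toy controller-side sketch (Python; the switch names and the rule-installation callback are hypothetical): secondary paths are precomputed and kept in an in-memory hash table keyed by the failed link, and on failure the corresponding flow rules are pushed to every switch on the secondary path at once, instead of being pre-installed in the switches.

```python
# Toy sketch of the MFR idea, not the paper's implementation: the controller
# keeps precomputed secondary paths in a dict (hash table) keyed by the failed
# link, so switches carry no extra protection rules in the normal situation.
backup_paths = {
    ("s1", "s2"): ["s1", "s3", "s2"],   # secondary path if link s1-s2 fails
    ("s2", "s4"): ["s2", "s3", "s4"],
}

def on_link_failure(failed_link, install_rule):
    """Push flow rules for the stored secondary path to every switch at once."""
    path = backup_paths.get(failed_link)
    if path is None:
        return  # fall back to on-demand path computation (restoration)
    for switch, next_hop in zip(path, path[1:]):
        install_rule(switch, next_hop)  # sent to all affected switches simultaneously

on_link_failure(("s1", "s2"), lambda sw, nh: print(f"{sw}: forward via {nh}"))
```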
Article
Financial Volatility Forecasting: A Sparse Multi-Head Attention Neural Network
Information 2021, 12(10), 419; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100419 - 14 Oct 2021
Abstract
Accurately predicting the volatility of financial asset prices and exploring its laws of movement have profound theoretical and practical significance for financial market risk early warning, asset pricing, and investment portfolio design. Traditional methods are plagued by substandard prediction performance or gradient-optimization problems. This paper proposes a novel volatility prediction method based on sparse multi-head attention (SP-M-Attention). This model discards the two-dimensional (time and space) modeling strategy of classic deep learning models and instead embeds a sparse multi-head attention calculation module in the network. The main advantages are that (i) it uses the inherent advantages of the multi-head attention mechanism to achieve parallel computing, (ii) it reduces computational complexity through sparse measurements and feature compression of volatility, and (iii) it avoids the gradient problems caused by long-range propagation and is therefore better suited than traditional methods to the analysis of long time series. Finally, the article conducts an empirical study of the effectiveness of the proposed method on real datasets of major financial markets. Experimental results show that the prediction performance of the proposed model surpasses all benchmark models on all real datasets. This finding will aid financial risk management and the optimization of investment strategies.
(This article belongs to the Special Issue Applications of Artificial Intelligence Using Real Data)
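As a rough illustration of the sparse-attention ingredient, the toy single-head sketch below (NumPy; the shapes and the top-k sparsification rule are assumptions, not the paper's SP-M-Attention module) keeps only the k largest attention scores per query before the softmax.

```python
import numpy as np

def sparse_attention(Q, K, V, k):
    """Scaled dot-product attention keeping only the top-k scores per query.

    A toy, single-head sketch of the sparse-attention idea; the paper's
    multi-head module with feature compression is more elaborate.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                 # (n_query, n_key)
    # mask everything outside each row's top-k entries
    kth = np.sort(scores, axis=-1)[:, -k][:, None]
    scores = np.where(scores >= kth, scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
n, d = 16, 8                                      # hypothetical length and dimension
Q, K, V = (rng.normal(size=(n, d)) for _ in range(3))
out = sparse_attention(Q, K, V, k=4)              # each output attends to 4 keys
```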
Article
Could a Conversational AI Identify Offensive Language?
Information 2021, 12(10), 418; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100418 - 12 Oct 2021
Abstract
In recent years, we have seen wide use of Artificial Intelligence (AI) applications on the Internet and everywhere else. Natural Language Processing and Machine Learning are important sub-fields of AI that have made Chatbots and Conversational AI applications possible. Those algorithms are built from historical data in order to create language models; however, historical data can be intrinsically discriminatory. This article investigates whether a Conversational AI could identify offensive language and shows how large language models often produce quite a bit of unethical behavior because of bias in the historical data. Our low-level proof-of-concept presents the challenges of detecting offensive language in social media and discusses some steps towards strong results in the detection of offensive language and unethical behavior using a Conversational AI.
(This article belongs to the Special Issue Information Technology: New Generations (ITNG 2020 & 2021))
Article
Cybersecurity Awareness Framework for Academia
Information 2021, 12(10), 417; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100417 - 12 Oct 2021
Abstract
Cybersecurity is a multifaceted global phenomenon representing complex socio-technical challenges for governments and private sectors. With technology constantly evolving, the types and numbers of cyberattacks affect different users in different ways. The majority of recorded cyberattacks can be traced to human error. Although cybersecurity awareness is both knowledge- and environment-dependent, studies show that increasing it is one of the most effective protective approaches. However, its intangible nature, socio-technical dependencies, constant technological evolution, and ambiguous impact make it challenging to offer comprehensive strategies for better communicating and combatting cyberattacks. Research in the industrial sector has focused on creating institutional, proprietary risk-aware cultures. In contrast, in academia, where cybersecurity awareness should be at the core of an academic institution's mission to ensure all graduates are equipped with the skills to combat cyberattacks, most research has focused on understanding students' attitudes and behaviors after infusing cybersecurity awareness topics into some courses in a program. This work proposes a conceptual Cybersecurity Awareness Framework to guide the implementation of systems that improve the cybersecurity awareness of graduates in any academic institution. The framework comprises constituents designed to continuously improve the development, integration, delivery, and assessment of cybersecurity knowledge in a university's curriculum across different disciplines and majors, and would thus lead to better awareness among all university graduates, the future workforce. It may also serve as a blueprint that, once adjusted by academic institutions to accommodate their missions, guides them in developing or amending their policies and procedures for the design and assessment of cybersecurity awareness.
(This article belongs to the Section Information and Communications Technology)
Article
An Approach to Ranking the Sources of Information Dissemination in Social Networks
Information 2021, 12(10), 416; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100416 - 11 Oct 2021
Abstract
The problem of countering the spread of destructive content in social networks is currently relevant for most countries of the world. Typically, automatic monitoring systems are used to detect the sources of the spread of malicious information, while automated systems, operators, and counteraction scenarios are used to counteract it. The paper suggests an approach to ranking the sources of messages with destructive content. When ranking objects by priority, the number of messages created by a source and an integral indicator of the involvement of its audience are considered. The approach identifies the most popular and active sources of dissemination of destructive content. It does not require the analysis of relationship graphs and increases the efficiency of the operator. The proposed solution is applicable both to brand-reputation monitoring systems and to countering cyberbullying and the dissemination of destructive information in social networks.
(This article belongs to the Special Issue Information Spreading on Networks)
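A minimal sketch of the ranking idea, assuming a simple weighted combination of a source's normalized message count and its audience-engagement indicator (the field names and equal weights are hypothetical; the paper's integral indicator is more elaborate):

```python
# Rank sources by destructive-message volume and audience engagement,
# without any relationship-graph analysis. All values are illustrative.
sources = [
    {"id": "src_a", "messages": 120, "engagement": 0.35},
    {"id": "src_b", "messages": 310, "engagement": 0.10},
    {"id": "src_c", "messages": 45,  "engagement": 0.80},
]
max_msg = max(s["messages"] for s in sources)

def priority(src, w_msg=0.5, w_eng=0.5):
    # normalized message count combined with the engagement indicator;
    # the weights here are assumptions, not taken from the paper
    return w_msg * src["messages"] / max_msg + w_eng * src["engagement"]

for src in sorted(sources, key=priority, reverse=True):
    print(src["id"], round(priority(src), 3))   # src_b, src_c, src_a
```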
Article
Short Word-Length Entering Compressive Sensing Domain: Improved Energy Efficiency in Wireless Sensor Networks
Information 2021, 12(10), 415; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100415 - 11 Oct 2021
Abstract
This work combines compressive sensing and short word-length techniques to achieve localization and target tracking in wireless sensor networks with energy-efficient communication between the network anchors and the fusion center. Gradient descent localization is performed using time-of-arrival (TOA) data, which are indicative of the distance between anchors and the target, thereby achieving range-based localization. The short word-length techniques considered are delta modulation and sigma-delta modulation. The energy efficiency is due to the reduction of the data volume transmitted from anchors to the fusion center by employing either of the two delta modulation variants together with compressive sensing techniques. Delta modulation allows the transmission of one bit per TOA sample. The communication energy efficiency is increased by a factor of RM, R ≥ 1, where R is the sample reduction ratio of compressive sensing and M is the number of bits originally present in a TOA-sample word. It is found that the localization system involving sigma-delta modulation outperforms those using delta modulation or pure compressive sampling alone, in terms of both energy efficiency and localization error in the presence of TOA measurement noise and transmission noise, owing to the noise-shaping property of sigma-delta modulation.
(This article belongs to the Special Issue Short Word-Length Systems for Smart Information Processing)
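The stated efficiency factor RM can be checked with a small worked example (all values hypothetical): reducing N TOA samples by a compressive-sensing ratio R and sending one delta-modulated bit instead of an M-bit word divides the transmitted bit volume by RM.

```python
# Worked example of the energy-efficiency factor R*M; values are illustrative.
M = 12          # bits per original TOA-sample word
R = 4           # compressive-sensing sample reduction ratio (R >= 1)
N = 1000        # TOA samples per reporting period

bits_raw = N * M                 # 12000 bits without either technique
bits_reduced = (N // R) * 1      # 1 bit per remaining sample after delta modulation
print(bits_raw / bits_reduced)   # 48.0 = R * M
```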
Article
Text Mining and Sentiment Analysis of Newspaper Headlines
Information 2021, 12(10), 414; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100414 - 09 Oct 2021
Abstract
Text analytics is well known in the modern era for extracting information and patterns from text. However, no study has attempted to illustrate the pattern and priorities of newspaper headlines in Bangladesh using a combination of text analytics techniques. The purpose of this paper is to examine the pattern of words that appeared on the front page of a well-known daily English newspaper in Bangladesh, The Daily Star, in 2018 and 2019. Word patterns are also used to elucidate the possible social and political context of that era. The study employs three widely used and contemporary text mining techniques: word clouds, sentiment analysis, and cluster analysis. The word cloud reveals that election-, kill-, cricket-, and Rohingya-related terms appeared more than 60 times in 2018, whereas BNP, poll, kill, AL, and Khaleda appeared more than 80 times in 2019. These reflect the country's passion for cricket, its political turmoil, and Rohingya-related issues. Furthermore, sentiment analysis reveals that words of fear and negative emotions appeared more than 600 times, whereas anger, anticipation, sadness, trust, and positive-type emotions came up more than 400 times in both years. Finally, the clustering method demonstrates that election-, politics-, death-, digital security act-, Rohingya-, and cricket-related words exhibit similarity and belong to a similar group in 2019, whereas rape-, death-, road-, and fire-related words clustered together in 2018. In general, this analysis demonstrates how vividly the text mining approach depicts Bangladesh's social, political, and law-and-order situation, particularly during election season and the country's cricket craze, and it validates the significance of the text mining approach for efficiently understanding the overall situation of a country during a particular period.
(This article belongs to the Special Issue Text Mining: Classification, Clustering and Extraction Techniques)
Article
New Approach of Measuring Human Personality Traits Using Ontology-Based Model from Social Media Data
Information 2021, 12(10), 413; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100413 - 08 Oct 2021
Abstract
Human online activities leave digital traces that provide a perfect opportunity to better understand people's behavior. Social media is an excellent place to spark conversations or state opinions, and it thus generates large-scale textual data. In this paper, we harness those data to support the effort of personality measurement. Our first contribution is to develop a Big Five personality trait-based model to detect human personality from textual data in the Indonesian language. The model uses an ontology approach instead of the more popular machine learning models, as the former better captures the meaning and intention of phrases and words in the domain of human personality. The legacy and more thorough ways to assess personality are interviews or questionnaires, yet there are many real-life applications that need an alternative method, cheaper and faster than the legacy methodology, to select individuals based on their personality. The second contribution is to support the model implementation by building a personality measurement platform. We use two distinct features for the model: an n-gram sorting algorithm to parse the textual data, and a crowdsourcing mechanism that facilitates public involvement in contributing to the ontology corpus addition and filtering.
Article
GPR Investigation at the Archaeological Site of Le Cesine, Lecce, Italy
Information 2021, 12(10), 412; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100412 - 08 Oct 2021
Abstract
In this contribution, we present some results achieved in the archaeological site of Le Cesine, close to Lecce, in southern Italy. The investigations were performed at a site close to the Adriatic Sea, only slightly explored up to now, where the presence of an ancient Roman harbour is alleged on the basis of remains visible mostly under the current sea level. This measurement campaign was carried out as a short-term scientific mission (STSM) within the European COST Action 17131 (acronym SAGA) and aimed to identify possible points where localized excavations might, and hopefully will, be performed in the next few years. Both a traditional elaboration and an innovative data processing based on a linear inverse scattering model have been applied to the data.
(This article belongs to the Special Issue Techniques and Data Analysis in Cultural Heritage)
Article
Big-Data Management: A Driver for Digital Transformation?
Information 2021, 12(10), 411; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100411 - 07 Oct 2021
Abstract
The rapid evolution of technology has led to a global increase in data. Due to the large volume of data, a new characterization was coined to better describe the new situation, namely, big data. Living in the Era of Information, businesses are flooded with information through data processing. The digital age has pushed businesses to find a strategy to transform themselves in order to keep pace with market changes, compete successfully, and gain a competitive advantage. The aim of the current paper is to extensively analyze the existing online literature to find the main (most valuable) components of big-data management according to researchers and the business community. Moreover, an analysis was conducted to help readers understand how these components can be used by existing businesses during the process of digital transformation.
Article
How Many Participants Are Required for Validation of Automated Vehicle Interfaces in User Studies?
Information 2021, 12(10), 410; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100410 - 06 Oct 2021
Abstract
Empirical validation and verification procedures require the sophisticated development of research methodology. Therefore, researchers and practitioners in human-machine interaction and the automotive domain have developed standardized test protocols for user studies. These protocols are used to evaluate human-machine interfaces (HMI) for driver distraction or automated driving. A system or HMI is validated with regard to certain criteria that it can either pass or fail. One important aspect is the number of participants to include in the study and the corresponding number of potential failures concerning the pass/fail criteria of the test protocol. By applying binomial tests, the present work provides recommendations on how many participants should be included in a user study and sheds light on the degree to which inferences from a sample with specific pass/fail ratios to a population are permitted. The calculations take into account different sample sizes and different numbers of observations within a sample that fail the criterion of interest. The analyses show that the required sample sizes increase to high numbers as the degree of controllability assumed for the population rises. The required sample sizes for verifying a specific controllability (e.g., 85%) also increase if there are observed failures with regard to the safety criteria. In conclusion, the present work outlines potential sample sizes, the valid inferences about populations they permit, and the role of the number of observed failures in a user study.
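A minimal sketch of the binomial reasoning (a standard demonstration test, not necessarily the authors' exact procedure): find the smallest sample size n such that observing at most f failures would still demonstrate, at confidence level 1 − α, a controllability of at least π.

```python
from math import comb

def binom_cdf(f, n, p):
    """P(X <= f) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(f + 1))

def required_n(pi=0.85, f=0, alpha=0.05):
    """Smallest n with P(at most f failures | failure prob 1 - pi) <= alpha."""
    n = f + 1
    while binom_cdf(f, n, 1 - pi) > alpha:
        n += 1
    return n

print(required_n(0.85, f=0))   # 19 participants with zero observed failures
print(required_n(0.85, f=1))   # 30 participants once one failure is observed
```

With π = 0.85 and no observed failures this gives n = 19, while a single observed failure already pushes the requirement to n = 30, illustrating the abstract's point that observed failures inflate the required sample size.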
Article
Combating Fake News with Transformers: A Comparative Analysis of Stance Detection and Subjectivity Analysis
Information 2021, 12(10), 409; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100409 - 03 Oct 2021
Abstract
The widespread use of social networks has brought to the foreground a very important issue: the veracity of the information circulating within them. Many natural language processing methods have been proposed in the past to assess a post's content with respect to its reliability; however, end-to-end approaches are not comparable in ability to human beings. To overcome this, we propose in this paper a more modular approach that produces indicators of a post's subjectivity and of the stance expressed in the replies it has received to date, letting users decide whether they trust the provided information. To this end, we fine-tuned state-of-the-art transformer-based language models and compared their performance with previous related work on stance detection and subjectivity analysis. Finally, we discuss the obtained results.
(This article belongs to the Special Issue Information Spreading on Networks)
Article
VERCASM-CPS: Vulnerability Analysis and Cyber Risk Assessment for Cyber-Physical Systems
Information 2021, 12(10), 408; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100408 - 30 Sep 2021
Abstract
Since Cyber-Physical Systems (CPS) are widely used in critical infrastructures, it is essential to protect their assets from cyber attacks, to increase the level of security, safety, and trustworthiness, to prevent failure developments, and to minimize losses. It is necessary to analyze the CPS configuration in an automatic mode to detect the most vulnerable CPS components and to reconfigure or replace them promptly. In this paper, we present a methodology that determines the most secure CPS configuration by using a public database of cyber vulnerabilities to identify the most secure CPS components. We also integrate the CPS cyber risk analysis with a Controlled Moving Target Defense, which either replaces the vulnerable CPS components or re-configures the CPS to harden it while the vulnerable components are being replaced. Our solution helps to design a more secure CPS by updating the configuration of an existing CPS to make it more resilient against cyber attacks. We compare cyber risk scores for different CPS configurations and show that the Windows® 10 build 20H2 operating system is more secure than Linux Ubuntu® 20.04, while Red Hat® Enterprise® Linux is the most secure in some system configurations.
(This article belongs to the Special Issue Secure and Trustworthy Cyber–Physical Systems)
Article
Application of Multi-Criteria Decision-Making Models for the Evaluation of Cultural Websites: A Framework for Comparative Analysis
Information 2021, 12(10), 407; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100407 - 30 Sep 2021
Abstract
Websites in the post-COVID-19 era play a very important role as the Internet gains more visitors. A website may significantly contribute to the electronic presence of a cultural organization, such as a museum, but its success should be confirmed by an evaluation experiment. Taking into account the importance of such an experiment, we present in this paper DEWESA, a generalized framework that uses and compares multi-criteria decision-making models for the evaluation of cultural websites. DEWESA presents in detail the steps that have to be followed for applying and comparing multi-criteria decision-making models in cultural website evaluation. In the current paper, the framework is applied to the evaluation of museum websites. In this case study, five different models are implemented (SAW, WPM, TOPSIS, VIKOR, and PROMETHEE II) and compared. The comparative analysis is completed by a sensitivity analysis, in which the five multi-criteria decision-making models are compared with respect to their robustness.
(This article belongs to the Special Issue Evaluating Methods and Decision Making)
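For a flavor of the simplest of the five compared models, Simple Additive Weighting (SAW) scores each alternative as a weighted sum of its normalized criterion scores. The sketch below uses made-up criteria, weights, and museum scores, not DEWESA's actual data.

```python
# SAW: rank alternatives by the weighted sum of normalized criterion scores.
criteria_weights = {"usability": 0.4, "content": 0.35, "aesthetics": 0.25}

# normalized scores in [0, 1] for three hypothetical museum websites
sites = {
    "museum_a": {"usability": 0.9, "content": 0.6, "aesthetics": 0.6},
    "museum_b": {"usability": 0.6, "content": 0.9, "aesthetics": 0.8},
    "museum_c": {"usability": 0.8, "content": 0.8, "aesthetics": 0.7},
}

def saw_score(scores, weights):
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(sites, key=lambda s: saw_score(sites[s], criteria_weights),
                 reverse=True)
for s in ranking:
    print(s, round(saw_score(sites[s], criteria_weights), 3))  # c, b, a
```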
Article
PFMNet: Few-Shot Segmentation with Query Feature Enhancement and Multi-Scale Feature Matching
Information 2021, 12(10), 406; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100406 - 30 Sep 2021
Abstract
The datasets used by the latest semantic segmentation models often need to be manually labeled for each pixel, which is time-consuming and requires much effort. General models are unable to make good predictions for new categories that have never been seen before, which is why few-shot segmentation has emerged. However, few-shot segmentation still faces two challenges. One is the inadequate exploration of the semantic information conveyed in high-level features, and the other is the inconsistency of segmenting objects at different scales. To solve these two problems, we propose a prior feature matching network (PFMNet). It includes two novel modules: (1) the Query Feature Enhancement Module (QFEM), which makes full use of the high-level semantic information in the support set to enhance the query feature, and (2) the Multi-Scale Feature Matching Module (MSFMM), which increases the matching probability for objects at multiple scales. Our method achieves an intersection-over-union average score of 61.3% for one-shot segmentation and 63.4% for five-shot segmentation, surpassing the state-of-the-art results by 0.5% and 1.5%, respectively.
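As a rough illustration of feature matching in few-shot segmentation, the sketch below implements the common masked-average-pooling plus cosine-similarity baseline (NumPy; shapes are arbitrary). PFMNet's QFEM and MSFMM modules are considerably richer than this.

```python
import numpy as np

def prototype(support_feat, support_mask):
    """Average the support features inside the object mask -> class prototype."""
    w = support_mask[None]                          # (1, H, W)
    return (support_feat * w).sum(axis=(1, 2)) / (w.sum() + 1e-8)

def similarity_map(query_feat, proto):
    """Cosine similarity of every query location to the prototype."""
    q = query_feat / (np.linalg.norm(query_feat, axis=0, keepdims=True) + 1e-8)
    p = proto / (np.linalg.norm(proto) + 1e-8)
    return np.einsum("chw,c->hw", q, p)             # (H, W): higher = more object-like

rng = np.random.default_rng(0)
C, H, W = 32, 8, 8                                  # hypothetical feature-map size
support = rng.normal(size=(C, H, W))
mask = (rng.random((H, W)) > 0.5).astype(float)     # binary support mask
query = rng.normal(size=(C, H, W))
seg_scores = similarity_map(query, prototype(support, mask))
pred_mask = seg_scores > 0                          # threshold into a coarse mask
```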
Article
UGRansome1819: A Novel Dataset for Anomaly Detection and Zero-Day Threats
Information 2021, 12(10), 405; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100405 - 30 Sep 2021
Abstract
This research introduces a production methodology for an anomaly detection dataset based on ten desirable requirements. The article then presents the produced dataset, named UGRansome, created with up-to-date and modern network traffic (netflow) that represents cyclostationary patterns of normal and abnormal classes of threatening behaviours. It was discovered that the timestamp of various network attacks is under one minute, and this feature pattern was used to record the time taken by a threat to infiltrate a network node. The main asset of the proposed dataset is its applicability to the detection of zero-day attacks and anomalies that have not been explored before and cannot be recognised by known threat signatures. For instance, the UDP Scan attack has been found to utilise the lowest netflow in the corpus, while Razy utilises the highest one. In turn, the EDA2 and Globe malware are the most abnormal zero-day threats in the proposed dataset. These feature patterns are included in the corpus but derived from two well-known datasets that include real-life instances, namely UGR'16 and a ransomware dataset: the former incorporates cyclostationary patterns, while the latter includes ransomware features. The UGRansome dataset was tested with cross-validation and compared to the KDD99 and NSL-KDD datasets to assess the performance of Ensemble Learning algorithms. False alarms were minimized with a null empirical error during the experiment, which demonstrates that applying the Random Forest algorithm to UGRansome can facilitate accurate results and enhance zero-day threat detection. Additionally, most zero-day threats, such as Razy, Globe, EDA2, and TowerWeb, are recognised as advanced persistent threats that are cyclostationary in nature, and it is predicted that they will use spamming and phishing for intrusion. Lastly, balancing UGRansome was found to be NP-hard, because the real life-threatening classes do not have a uniform distribution in terms of the number of instances.
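A minimal sketch of the evaluation protocol described above, using scikit-learn's Random Forest with cross-validation; the features and labels are random placeholders, since UGRansome itself must be obtained separately.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))        # stand-in for netflow-derived features
y = rng.integers(0, 3, size=500)      # stand-in labels: normal / anomaly / zero-day

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=10)   # 10-fold cross-validation
print(scores.mean(), scores.std())
```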
Article
Biological Tissue Damage Monitoring Method Based on IMWPE and PNN during HIFU Treatment
Information 2021, 12(10), 404; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100404 - 30 Sep 2021
Abstract
Biological tissue damage monitoring is an indispensable part of high-intensity focused ultrasound (HIFU) treatment. As a nonlinear method, multi-scale permutation entropy (MPE) is widely used in the monitoring of biological tissue. However, the traditional MPE method neglects amplitude information when calculating time-series complexity, and its stability is poor due to defects in the coarse-graining process. To solve these problems, an improved coarse-grained multi-scale weighted permutation entropy (IMWPE) method is proposed in this paper. Compared with MPE, the IMWPE method not only includes the signal amplitude when calculating signal complexity but also improves the stability of the entropy value. The IMWPE method is applied to HIFU echo signals during HIFU treatment, and a probabilistic neural network (PNN) is used to monitor biological tissue damage. The results show that, compared with multi-scale sample entropy (MSE)-PNN and MPE-PNN methods, the proposed IMWPE-PNN method correctly identifies all normal tissues and more effectively identifies damaged and denatured tissues. The recognition rate for the three kinds of biological tissue is higher, up to 96.7%. This means that the IMWPE-PNN method can better monitor the status of biological tissue damage during HIFU treatment.
(This article belongs to the Special Issue Biosignal and Medical Image Processing)
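For reference, the sketch below computes plain single-scale weighted permutation entropy, the building block in which each ordinal pattern is weighted by the local variance so amplitude information is not discarded; the paper's IMWPE additionally applies an improved coarse-graining over multiple scales, which is not reproduced here.

```python
import numpy as np
from math import factorial, log

def weighted_permutation_entropy(x, m=3, tau=1):
    """Single-scale weighted permutation entropy of a 1-D signal (a plain sketch)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau
    pattern_weight = {}
    for i in range(n):
        v = x[i : i + m * tau : tau]
        pattern = tuple(np.argsort(v))       # ordinal pattern of the window
        w = v.var()                          # weight = local variance (amplitude info)
        pattern_weight[pattern] = pattern_weight.get(pattern, 0.0) + w
    total = sum(pattern_weight.values())
    probs = [w / total for w in pattern_weight.values() if w > 0]
    h = -sum(p * log(p) for p in probs)
    return h / log(factorial(m))             # normalized to [0, 1]

rng = np.random.default_rng(0)
print(weighted_permutation_entropy(rng.normal(size=2000)))        # high: irregular
print(weighted_permutation_entropy(np.sin(np.arange(2000) / 10)))  # lower: regular
```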
Communication
Fall Detection with CNN-Casual LSTM Network
Information 2021, 12(10), 403; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100403 - 29 Sep 2021
Abstract
Falls are one of the main causes of injury among the elderly. If a faller can be found in time, further injury can be effectively avoided. In order to protect personal privacy and improve the accuracy of fall detection, this paper proposes a fall detection algorithm using the CNN-Casual LSTM network based on three-axis acceleration and three-axis rotation angular velocity sensors. The neural network in this system includes an encoding layer, a decoding layer, and a ResNet18 classifier. The encoding layer comprises three layers of CNN and three layers of Casual LSTM, and the decoding layer comprises three layers of deconvolution and three layers of Casual LSTM. The decoding layer maps spatio-temporal information to a hidden-variable output that is better suited to the classification network, ResNet18. Moreover, we used the public dataset SisFall to evaluate the performance of the algorithm. The experimental results show that the algorithm achieves a high accuracy of up to 99.79%.
(This article belongs to the Topic Advances in Online and Distance Learning)
Article
Simple but Effective Knowledge-Based Query Reformulations for Precision Medicine Retrieval
Information 2021, 12(10), 402; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100402 - 29 Sep 2021
Abstract
In Information Retrieval (IR), the semantic gap represents the mismatch between users' queries and how retrieval models answer these queries. In this paper, we explore how to use external knowledge resources to enhance bag-of-words representations and reduce the effect of the semantic gap between queries and documents. In this regard, we propose several simple but effective knowledge-based query expansion and reduction techniques, and we evaluate them for the medical domain. The proposed query reformulations are used to increase the probability of retrieving relevant documents by adding highly specific terms to, or removing them from, the original query. The experimental analyses on different test collections for Precision Medicine IR show the effectiveness of the developed techniques. In particular, a specific subset of query reformulations allows retrieval models to achieve top-performing results in all the considered test collections.
(This article belongs to the Section Information Systems)
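A minimal sketch of knowledge-based query expansion in the spirit described above; the knowledge resource and expansion terms here are invented, whereas the paper draws highly specific terms from medical knowledge resources.

```python
# Toy knowledge resource mapping concepts to highly specific related terms.
knowledge = {
    "melanoma": ["skin cancer", "BRAF"],
    "lung cancer": ["NSCLC", "EGFR"],
}

def expand_query(query, kb):
    """Append related terms for every concept found in the query."""
    terms = [query]
    for concept, related in kb.items():
        if concept in query.lower():
            terms.extend(related)
    return " ".join(terms)

print(expand_query("melanoma treatment", knowledge))
# -> "melanoma treatment skin cancer BRAF"
```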
Article
Topic Modeling for Amharic User Generated Texts
Information 2021, 12(10), 401; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100401 - 29 Sep 2021
Abstract
Topic modeling is a statistical process that derives the latent themes from extensive collections of text. Three approaches to topic modeling exist, namely, unsupervised, semi-supervised, and supervised. In this work, we develop a supervised topic model for an Amharic corpus. We also investigate the effect of stemming on topic detection using Term Frequency Inverse Document Frequency (TF-IDF) features, Latent Dirichlet Allocation (LDA) features, and a combination of these two feature sets with four supervised machine learning tools, that is, Support Vector Machine (SVM), Naive Bayes (NB), Logistic Regression (LR), and Neural Nets (NN). We evaluate our approach using an Amharic corpus of 14,751 documents in ten topic categories. Both qualitative and quantitative analyses of the results show that our proposed supervised topic detection performs best, with an accuracy of 88%, using SVM with state-of-the-art TF-IDF word features, the Synthetic Minority Over-sampling Technique (SMOTE), and no stemming operation. The results show that text features with stemming slightly improve the performance of the topic classifier over features without stemming.
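A minimal sketch of the best-performing configuration reported above (TF-IDF features, SMOTE oversampling, SVM), using scikit-learn and imbalanced-learn; the documents and labels are English placeholders, since the Amharic corpus is not included here.

```python
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

# Placeholder documents; the paper uses 14,751 Amharic documents in 10 topics.
docs = ["sport news text", "politics news text", "sport update",
        "economy report", "politics update", "economy news"]
labels = ["sport", "politics", "sport", "economy", "politics", "economy"]

pipe = Pipeline([
    ("tfidf", TfidfVectorizer()),               # no stemming, as in the best run
    ("smote", SMOTE(k_neighbors=1, random_state=0)),  # oversample minority topics
    ("svm", LinearSVC()),
])
pipe.fit(docs, labels)
print(pipe.predict(["latest sport text"]))      # -> ['sport']
```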
Article
Mixed Scheduling Model for Limited-Stop and Normal Bus Service with Fleet Size Constraint
Information 2021, 12(10), 400; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100400 - 28 Sep 2021
Abstract
Limited-stop service is useful for increasing operational efficiency where demand is unbalanced across stops and unidirectional. A mixed scheduling model for limited-stop buses and normal buses is proposed that considers the fleet size constraint. This model optimizes the total cost in terms of waiting time, in-vehicle time, and operation cost by simultaneously adjusting the frequencies of limited-stop buses and normal buses. The feasibility and validity of the proposed model are shown by applying it to a bus route in the city of Zhenjiang, China. The results indicate that the mixed scheduling service can reduce the total cost and travel time compared with a single scheduling service in the case of unbalanced passenger flow distribution and fleet constraints, and that with a larger fleet the mixed scheduling service is superior. There is an optimal fleet allocation that minimizes the cost for the system, and a significant saving can be attained by the mixed scheduling service. This study contributes an in-depth analysis of the relationship among the factors influencing mixed scheduling, such as fleet size constraint, departure interval, and cost.
Article
Online Interactions and Problematic Internet Use of Croatian Students during the COVID-19 Pandemic
Information 2021, 12(10), 399; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100399 - 28 Sep 2021
Abstract
The COVID-19 pandemic caused a transition to online services in almost all aspects of life. Today more than ever, online access is an important aspect of child well-being. The aim of the study was to investigate children's online activities and gender differences, with a special focus on harmful online content, cyberbullying, and Internet addiction. Our research was conducted among students from one Croatian county (average age = 14.97, N = 494). The Internet Addiction Test, the European Cyberbullying Intervention Project Questionnaire, and questions constructed for the purposes of this research (e.g., on online content) were used. Between 20% and 30% of students spend four or more hours a day online. Furthermore, 14.57% of students showed moderate signs of addiction, and 1.42% already showed severe signs of addiction, with girls scoring significantly higher. The results indicated that 12.75% of students were victims, 5.87% were perpetrators, and 8.3% were both committing and experiencing cyberbullying. Children who commit and/or experience cyberbullying score higher on the Internet addiction scale than children who do not participate in cyberbullying. These findings contribute to our understanding of Internet usage, and especially its problematic aspects, in such a complex time as the COVID-19 pandemic, and they can be useful for planning future interventions with children.
Article
The Impact of Organizational Practices on the Information Security Management Performance
Information 2021, 12(10), 398; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100398 - 28 Sep 2021
Abstract
Information explosion and pressures are leading organizations to invest heavily in information security to ensure that information technology decisions align with business goals and manage risks. Few studies have examined small- and medium-sized enterprises (SMEs) in the manufacturing sector, and previous studies have used only a small number of parameters. This research aims to examine and analyze the effect of organizational security practices on information security management performance using many parameters. A model has been developed, together with hypotheses, to evaluate the impact of organizational practices on information security management performance. The data were collected from 171 UK employees at manufacturing SMEs that had already implemented security policies. A structural equation model was employed via the SPSS Amos 22 tool to evaluate the results. Our results show that security training, knowledge sharing, security education, and security visibility significantly impact information security performance. In addition, this study highlights a significant impact of both security training and knowledge sharing on trust in the organization. Business leaders and decision-makers can reference the proposed model and the corresponding study results to develop favourable tactics for achieving their goals regarding information security management.
(This article belongs to the Special Issue Recent Advances in IoT and Cyber/Physical Security)
Article
Pear Defect Detection Method Based on ResNet and DCGAN
Information 2021, 12(10), 397; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100397 - 28 Sep 2021
Abstract
To address the current situation, in which pear defect detection still relies on manual labor with low efficiency, we propose the use of a CNN model to detect pear defects. Since it is challenging to obtain defect images, a deep convolutional generative adversarial network was used to augment the defect images. As the experimental results indicate, the detection accuracy of the proposed method on the 3000-image validation set was as high as 97.35%. Various mainstream CNNs were compared to thoroughly evaluate the model's performance, and the top performer was selected for further comparative experiments with traditional machine learning methods, such as the support vector machine algorithm, the random forest algorithm, and the k-nearest neighbor algorithm. Moreover, two other varieties of pear that had not been trained on were chosen to validate the robustness and generalization capability of the model. The validation results illustrate that the proposed method is more accurate than the commonly used algorithms for pear defect detection and robust enough to generalize well to other datasets. To allow the method proposed in this paper to be applied in agriculture, an intelligent pear defect detection system was built based on an iOS device.
Article
Tool for Measuring Productivity in Software Development Teams
Information 2021, 12(10), 396; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100396 - 27 Sep 2021
Abstract
Despite efforts to define productivity, there is no consensus in the software industry regarding what the term means; instead of a single metric or factor, productivity is described by a set of aspects. Our objective is to develop a tool that supports the productivity measurement of software development teams according to the factors found in the literature. We divided these factors into four groups: People, Product, Organization, and Open Source Software Projects. We developed a web system, called Productive, containing the factors that influence productivity identified in this work, to support software development teams in measuring their productivity. After developing the tool, we monitored its use over eight weeks with two small software development teams. From the results, we found that software development companies can use the system to support monitoring team productivity. The results also point to an improvement in productivity while using the system, and a survey of users demonstrates their positive perception of the results obtained. In future work, we will continue monitoring the use of the tool and investigate users' perceptions in other project contexts.
Article
Redefining the MDA Framework—The Pursuit of a Game Design Ontology
Information 2021, 12(10), 395; https://0-doi-org.brum.beds.ac.uk/10.3390/info12100395 - 26 Sep 2021
Abstract
In computer science, an ontology is a way of showing the properties of a subject area and how they are related, by defining a set of concepts and categories that represent the subject. There have been many attempts to create a widely accepted ontology for the universe of games. Most of these attempts are defined from an analytical perspective, and few have found frequent use outside universities, as they are not easily translated to the development of games, which is a design perspective. Some core aspects of the domain make this a difficult goal to achieve. In addition, game designers tend to resist a methodology or a structured way of developing a game; the main concern is that it can impair creativity, in a field that could not survive without it. A well-defined ontology would improve and mature the growing digital games industry, both by enhancing the understanding of the domain and by supporting a structured methodology for designing games. This paper describes the properties of digital games and shows how they make it difficult to create an ontology for this field of study, especially from a design perspective. It examines the closest existing approach to a unified ontology for the game domain: the mechanics, dynamics and aesthetics (MDA) framework. We propose a redefinition of MDA's taxonomy, calling it Redefining the MDA (RMDA), providing better use of the approach from a designer's perspective, embracing the design properties of the domain, and overcoming issues found in the literature of the game domain. The main purpose of this paper is to clarify the MDA framework by redefining its main components, mechanics, dynamics and aesthetics, so as to make the tool more understandable and useful for game designers. Understanding aesthetics, and how developers can invoke them by correctly defining mechanics and creating dynamics, is the main focus of the paper. Some examples are provided in order to explain the applicability of the RMDA as a methodology to produce games.
(This article belongs to the Special Issue The Systems and Methods of Game Design)