Information, Volume 12, Issue 9 (September 2021) – 46 articles

Cover Story: A common problem in underwater side-scan sonar images is the acoustic shadow generated by the beam, and a number of other factors further impair image quality. In this work, an innovative algorithm based on histogram processing is presented. The algorithm automatically calculates the optimal threshold for converting the original grayscale images into binary images. Experimental results indicate that the proposed algorithm produces superior results compared to popular thresholding methods, as well as common edge detection filters, even after morphological erosion and dilation. The proposed algorithm is simple, robust, and adaptive, and facilitates the recognition of man-made objects on the seafloor; hence, it can be used in maritime archaeology, shipwreck detection, underwater development, and military applications.
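The cover paper's algorithm itself is not reproduced in this listing; for orientation, here is a minimal NumPy sketch of Otsu's method, a classic histogram-based automatic threshold of the kind such algorithms are typically compared against (the `img` array is a placeholder for a grayscale sonar image):

```python
import numpy as np

def otsu_threshold(img: np.ndarray) -> int:
    """Classic histogram-based automatic threshold (Otsu).

    img: 2-D uint8 grayscale image. Returns the threshold that maximises
    the between-class variance of the background/foreground split.
    """
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()                      # normalised histogram
    omega = np.cumsum(prob)                       # class probability of background
    mu = np.cumsum(prob * np.arange(256))         # cumulative mean
    mu_t = mu[-1]                                 # global mean
    # Between-class variance for every candidate threshold t.
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b2 = np.nan_to_num(sigma_b2)            # ignore degenerate splits
    return int(np.argmax(sigma_b2))

# Usage: binary = (img > otsu_threshold(img)).astype(np.uint8)
```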
13 pages, 505 KiB  
Article
Integrating Comprehensive Human Oversight in Drone Deployment: A Conceptual Framework Applied to the Case of Military Surveillance Drones
by Ilse Verdiesen, Andrea Aler Tubella and Virginia Dignum
Information 2021, 12(9), 385; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090385 - 21 Sep 2021
Cited by 9 | Viewed by 3552
Abstract
Accountability is a value often mentioned in the debate on intelligent systems and their increased pervasiveness in our society. When focusing specifically on autonomous systems, a critical gap emerges: although there is much work on governance and the attribution of accountability, there is a significant lack of methods for the operationalisation of accountability within the socio-technical layer of autonomous systems. In the case of autonomous unmanned aerial vehicles, or drones, the critical question of how to maintain accountability as they undertake fully autonomous flights becomes increasingly important as their uses multiply in both the commercial and military fields. In this paper, we aim to fill the operationalisation gap by proposing a socio-technical framework to guarantee human oversight and accountability in drone deployments, showing its enforceability in the real case of military surveillance drones. By keeping a focus on accountability and human oversight as values, we align with the emphasis placed on human responsibility, while requiring a concretisation of what these principles mean for each specific application, connecting them with concrete socio-technical requirements. In addition, by constraining the framework to observable elements of pre- and post-deployment, we rely neither on assumptions about the internal workings of the drone nor on the technical fluency of the operator. Full article
12 pages, 5171 KiB  
Article
Simulation and Experiment Analysis of 10 kV Flexible Grounding Device
by Mengxuan Liu, Chi Zhang and Kangli Liu
Information 2021, 12(9), 384; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090384 - 19 Sep 2021
Cited by 2 | Viewed by 1878
Abstract
The traditional 10 kV distribution network grounding system has some disadvantages, such as a small grounding current and a poor arc extinguishing effect, which hinder the detection of high-resistance grounding faults. Therefore, this paper studies a flexible grounding system consisting of a small resistance and an active inverter connected in parallel. The control system comprises a compensation current calculation module, a fault detection module, and a line protection strategy. During a single-phase grounding fault, the device is designed to inject a current of a given amplitude and phase into the neutral point to effectively suppress the fault-point voltage and current, while quickly identifying the faulted line or a busbar fault and then systematically protecting the distribution line. In addition, a large number of simulations were performed for three grounding fault types (metal, low-resistance, and high-resistance) and two neutral modes (ungrounded and small-resistance grounding); the device remained functional in all cases. Finally, a 400 V experimental prototype was built, and the experimental results are consistent with the simulation results, verifying the effectiveness and feasibility of the flexible grounding device. Full article
15 pages, 672 KiB  
Article
Dual-Channel Heterogeneous Graph Network for Author Name Disambiguation
by Xin Zheng, Pengyu Zhang, Yanjie Cui, Rong Du and Yong Zhang
Information 2021, 12(9), 383; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090383 - 18 Sep 2021
Cited by 3 | Viewed by 2426
Abstract
Name disambiguation has long been a significant issue in many fields, such as literature management and social analysis. In recent years, methods based on graph networks have performed well in name disambiguation, but these works have rarely used heterogeneous graphs to capture relationships between nodes. Heterogeneous graphs can extract more comprehensive relationship information so that more accurate node embedding can be learned. Therefore, a Dual-Channel Heterogeneous Graph Network is proposed to solve the name disambiguation problem. We use the heterogeneous graph network to capture various node information to ensure that our method can learn more accurate data structure information. In addition, we use fastText to extract the semantic information of the data. Then, a clustering method based on DBSCAN is used to classify academic papers by different authors into different clusters. In many experiments based on real datasets, our method achieved high accuracy, which proves its effectiveness. Full article
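The paper's graph components are not reproduced here; its final clustering step can be illustrated with a minimal scikit-learn sketch, where a synthetic matrix stands in for whatever fastText/graph embeddings are produced upstream:

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.datasets import make_blobs
from sklearn.preprocessing import normalize

# Placeholder embeddings: one vector per candidate paper of an ambiguous
# author name (standing in for the fastText / graph embeddings built upstream).
paper_vectors, _ = make_blobs(n_samples=90, centers=3, n_features=128,
                              cluster_std=0.5, random_state=0)

# Cosine-style clustering: L2-normalise, then cluster within a Euclidean radius.
X = normalize(paper_vectors)
labels = DBSCAN(eps=0.3, min_samples=3).fit_predict(X)

# Each non-negative label is one disambiguated author; -1 would mark outliers.
print({int(c): int((labels == c).sum()) for c in set(labels)})
```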
2 pages, 146 KiB  
Editorial
Editorial for Special Issue Detecting Attack and Incident Zone System
by Christoforos Ntantogian
Information 2021, 12(9), 382; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090382 - 18 Sep 2021
Viewed by 1243
Abstract
Attackers who have a strong motivation to succeed in their nefarious goals are often able to breach the security of their targets and cause havoc [...] Full article
(This article belongs to the Special Issue Detecting Attack and Incident Zone System)
13 pages, 1389 KiB  
Article
Spatial Pattern and Influencing Factors of Outward Foreign Direct Investment Enterprises in the Yangtze River Economic Belt of China
by Fei Shi, Haiying Xu, Wei-Ling Hsu, Yee-Chaur Lee and Juhua Zhu
Information 2021, 12(9), 381; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090381 - 18 Sep 2021
Cited by 3 | Viewed by 2334
Abstract
This paper studies outward foreign direct investment (OFDI) enterprises in the Yangtze River Economic Belt. Using geographical information system (GIS) spatial analysis and SPSS correlation analysis, it analyzes the change in the spatial distribution of OFDI enterprises in 2010, 2014, and 2018 and explores the factors influencing this change. The results show the following: (1) The geographical distribution of OFDI enterprises in the Yangtze River Economic Belt is uneven. In the downstream region, OFDI enterprises have significant advantages in both quantity and quality over those in the mid- and up-stream regions. In recent years, a multi-core spatial pattern has gradually emerged. (2) The factors influencing the spatial distribution of OFDI enterprises have gradually changed from one dominant factor, i.e., technological innovation capability, to four core factors, namely, urbanization level, economic development level, technological innovation capability, and degree of economic openness. The research results serve as an important reference for future policy adjustment in the Yangtze River Economic Belt. First, the Yangtze River Economic Belt should adjust industrial policies; comprehensively increase the level of OFDI; accelerate the upgrading and transformation of regional industries; and, at the same time, inject vitality into the development of the world economy. Moreover, the downstream region should fully play a leading role in the Yangtze River Economic Belt, especially in encouraging OFDI enterprises to establish global production networks. Meanwhile, enterprises in the upstream region are encouraged to establish regional production networks to accelerate the development of inland open highlands. Full article
12 pages, 948 KiB  
Article
Investigating Complimentary E-Marketing Strategy for Small- and Medium-Sized Enterprises at Growth Stage in Taiwan
by Chiu-Ching Lin
Information 2021, 12(9), 380; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090380 - 17 Sep 2021
Cited by 7 | Viewed by 3448
Abstract
Globally, 95% of enterprises are small- and medium-sized enterprises (SMEs). Social media has recently become a powerful marketing tool due to characteristics such as the ability to share digital information and interact with consumers instantly. In SMEs, limited budgets restrict the use of social media as marketing tools. Thus, complimentary use of social media may be an alternative way for SMEs to maximize their marketing strategies. This study identifies which marketing goals are important for SME growth and how complimentary social media can be used to attain them. An analytic hierarchy process (AHP) analysis was conducted to confirm the order of local weights and global weights for marketing goals and complimentary social media. We found that the order of local weights for marketing goals for SMEs in the growth phase is brand awareness > online purchase > sales potential. The order of global weights for complimentary social media to meet the above marketing goals is Facebook > PIXNET > Twitter > Instagram > YouTube > LINE. Finally, we used an SME from Taiwan as a case study to confirm that applying the above complimentary social media can meet these marketing goals and potentially increase the survival of SMEs at the growth phase in Taiwan. Full article
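The AHP computation behind such weight orderings is standard; a minimal sketch of deriving local priority weights from a pairwise comparison matrix via the common geometric-mean approximation (the 3x3 matrix below is illustrative, not the paper's survey data):

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
    """Local priority weights from a reciprocal pairwise comparison matrix,
    using the geometric-mean (row) approximation of the principal eigenvector."""
    gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[0])
    return gm / gm.sum()

# Illustrative comparison of three marketing goals (not the paper's data):
# brand awareness vs. online purchase vs. sales potential.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
local = ahp_weights(A)
print(local)   # roughly [0.65, 0.23, 0.12]; global weights are obtained by
               # multiplying local weights down the criteria hierarchy.
```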
15 pages, 314 KiB  
Article
Multinomial Logistic Regression to Estimate the Financial Education and Financial Knowledge of University Students in Chile
by Hanns de la Fuente-Mella, Benito Umaña-Hermosilla, Marisela Fonseca-Fuentes and Claudio Elórtegui-Gómez
Information 2021, 12(9), 379; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090379 - 17 Sep 2021
Cited by 6 | Viewed by 3273
Abstract
All individuals face decisions during their lifetime that directly influence the economic well-being of their families. Therefore, financial education can be a fundamental tool to maximize our economic resources and use them wisely. A virtual survey was administered to 410 volunteer students belonging to a public university in southern Chile. The objective was to determine the level of financial knowledge and the appreciation of financial education of future professionals. The most important results reveal a reality in which young people said they had the habit of saving and budgeting at home and were responsible for paying their bills on time. However, only a very small number of participants claimed to have a superior level of financial literacy. The main challenge for universities is to include this topic in the elective curriculum of all degree programs to promote the development of financial judgment that contributes to the comprehensive training and professional competencies of future graduates. Full article
(This article belongs to the Special Issue Data Analytics in Social Science and Information Theory II)
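As an illustration of the modeling approach named in the title above, here is a minimal scikit-learn sketch of a multinomial logistic regression on placeholder survey variables (all data, shapes, and column meanings below are assumptions, not the paper's):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Placeholder data: survey covariates (e.g. saving habit, budgeting, bill
# payment) and an ordinal-coded financial-knowledge level (0 / 1 / 2).
rng = np.random.default_rng(0)
X = rng.integers(0, 5, size=(410, 6)).astype(float)
y = rng.integers(0, 3, size=410)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression(multi_class="multinomial", max_iter=1000)
model.fit(X_tr, y_tr)

print(model.predict_proba(X_te[:3]))   # class probabilities per respondent
print(model.coef_.shape)               # one coefficient vector per knowledge level
```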
15 pages, 2915 KiB  
Article
A Guided Scratch Visual Execution Environment to Introduce Programming Concepts to CS1 Students
by Raquel Hijón-Neira, Cornelia Connolly, Daniel Palacios-Alonso and Oriol Borrás-Gené
Information 2021, 12(9), 378; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090378 - 17 Sep 2021
Cited by 3 | Viewed by 2254
Abstract
First-year computer science (CS1) university students traditionally have difficulties understanding how to program. This paper describes research introducing CS1 students to programming concepts using a Scratch programming language guided visual execution environment (VEE). The concepts addressed are those from an introductory programming course (sequences, variables, operators, conditionals, loops, and events and parallelism). The VEE guides novice students through programming concepts, explaining and guiding interactive exercises executed in Scratch by using metaphors and serious games. The objective of this study is, firstly, to investigate if a cohort of 124 CS1 students, from three distinct groups, studying at the same university, are able to improve their programming skills guided by the VEE. Secondly, is the improvement different for various programming concepts? All the CS1 students were taught the module by the same tutor in four 2-h sessions (8 h), and a qualitative research approach was adopted. The results show students significantly improved their programming knowledge, and this improvement is significant for all the programming concepts, although greater for certain concepts such as operators, conditionals, and loops than others. It also shows that students lacked initial knowledge of events and parallelism, though most had used Scratch during their high school years. The sequence concept was the most popular concept known to them. A collateral finding in this study is how the students’ previous knowledge and learning gaps affected grades they required to access and begin study at the university level. Full article
10 pages, 680 KiB  
Article
Working in the 21st Century. The Coronavirus Crisis: A Driver of Digitalisation, Teleworking, and Innovation, with Unintended Social Consequences
by Antonio López Peláez, Amaya Erro-Garcés, Francisco Javier Pinilla García and Dimitrios Kiriakou
Information 2021, 12(9), 377; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090377 - 17 Sep 2021
Cited by 22 | Viewed by 5364
Abstract
(1) Background: This article seeks to shed light on the innovation, digitalisation, and teleworking processes that have occurred because of the coronavirus crisis. (2) Methods: To this end, we analyse data from Eurostat (2020), the European Companies Survey (2013; 2019) and the Living, Working and COVID-19 Dataset (2020), the latter two gathered by Eurofound. (3) Results: Our main findings reveal that COVID-19 has accelerated a process of digitalisation that has produced relevant changes in labour relations and, consequently, in companies’ organisation. (4) Conclusions: In short, home confinement has had a profound impact on work and occupational risks. Full article
(This article belongs to the Special Issue Digital Work—Information Technology and Commute Choice)
21 pages, 7358 KiB  
Article
CNMF: A Community-Based Fake News Mitigation Framework
by Shaimaa Galal, Noha Nagy and Mohamed. E. El-Sharkawi
Information 2021, 12(9), 376; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090376 - 16 Sep 2021
Cited by 9 | Viewed by 2378
Abstract
Fake news propagation in online social networks (OSN) is one of the critical societal threats of our time, directing attention to fake news mitigation and intervention techniques. One typical mitigation technique focuses on initiating news mitigation campaigns that target a specific set of users when the infected set of users is known, or the entire network when it is unknown. Contemporary mitigation techniques assume that campaign users will accept sharing a mitigation news (MN) article; in reality, however, user behavior is different. This paper focuses on devising a generic mitigation framework in which the social crowd can be employed to combat the influence of fake news in OSNs when the infected set of users is undefined. The framework is composed of three major phases: facts discovery, facts searching, and community recommendation. Mitigation news circulation is accomplished by recruiting a set of social crowd users (news propagators) who are likely to accept posting the mitigation news article. We propose a set of features that identify prospective OSN audiences and news propagators. Moreover, we inspect various properties of the news circulation process, such as incentivizing news propagators, determining the required number of news propagators, and the adaptivity of the MN circulation process. The experimental results pinpoint the significance of facts searching and of the introduced news propagator behavior features. Full article
(This article belongs to the Special Issue Decentralization and New Technologies for Social Media)
14 pages, 1906 KiB  
Article
A Review of Tabular Data Synthesis Using GANs on an IDS Dataset
by Stavroula Bourou, Andreas El Saer, Terpsichori-Helen Velivassaki, Artemis Voulkidis and Theodore Zahariadis
Information 2021, 12(9), 375; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090375 - 14 Sep 2021
Cited by 49 | Viewed by 9126
Abstract
Recent technological innovations, along with the vast amount of data available worldwide, have led to a rise in cyberattacks against network systems. Intrusion Detection Systems (IDS) play a crucial role as a defense mechanism in networks against adversarial attackers. Machine Learning methods provide various cybersecurity tools. However, these methods require plenty of data to be trained efficiently, which may be hard to collect or to use due to privacy concerns. One of the most notable Machine Learning tools is the Generative Adversarial Network (GAN), which has great potential for tabular data synthesis. In this work, we start by briefly presenting the most popular GAN architectures: VanillaGAN, WGAN, and WGAN-GP. Focusing on tabular data generation, the CTGAN, CopulaGAN, and TableGAN models are used to create synthetic IDS data. Specifically, the models are trained and evaluated on the NSL-KDD dataset, considering the limitations and requirements of this procedure. Finally, based on certain quantitative and qualitative methods, we evaluate and identify the most prominent GANs for tabular network data synthesis. Full article
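The GAN models named in the abstract exist as open-source implementations; below is a minimal sketch assuming the `ctgan` package (class names and call signatures can differ between versions) and a hypothetical NSL-KDD CSV with example categorical column names:

```python
import pandas as pd
from ctgan import CTGAN  # assumption: the open-source CTGAN package

# Placeholder: a tabular IDS dataset such as NSL-KDD loaded into a DataFrame.
df = pd.read_csv("nsl_kdd_train.csv")                              # hypothetical path
discrete_columns = ["protocol_type", "service", "flag", "label"]   # example names

model = CTGAN(epochs=100)            # conditional GAN for mixed-type tabular data
model.fit(df, discrete_columns)      # listed columns are treated as categorical
synthetic = model.sample(10_000)     # draw synthetic IDS records

# Synthetic data can then be compared with the real table column by column
# (marginal distributions, pairwise correlations, ML efficacy, etc.).
print(synthetic.head())
```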
20 pages, 486 KiB  
Article
A Tweet Sentiment Classification Approach Using a Hybrid Stacked Ensemble Technique
by Babacar Gaye, Dezheng Zhang and Aziguli Wulamu
Information 2021, 12(9), 374; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090374 - 14 Sep 2021
Cited by 30 | Viewed by 5374
Abstract
With the extensive availability of social media platforms, Twitter has become a significant tool for the acquisition of people’s views, opinions, attitudes, and emotions towards certain entities. Within this frame of reference, sentiment analysis of tweets has become one of the most fascinating research areas in the field of natural language processing. A variety of techniques have been devised for sentiment analysis, but there is still room for improvement where the accuracy and efficacy of the system are concerned. This study proposes a novel approach that exploits the advantages of a lexical dictionary, machine learning, and deep learning classifiers. We classified the tweets based on the sentiments extracted by TextBlob, using a stacked ensemble of three long short-term memory (LSTM) networks as base classifiers and logistic regression (LR) as a meta classifier. The proposed model proved to be effective and time-saving since it does not require feature extraction, as LSTM extracts features without any human intervention. We compared our proposed approach with conventional machine learning models such as logistic regression, AdaBoost, and random forest, and also included state-of-the-art deep learning models in the comparison. Experiments were conducted on the sentiment140 dataset and evaluated in terms of accuracy, precision, recall, and F1 score. Empirical results showed that our proposed approach achieved state-of-the-art results with an accuracy score of 99%. Full article
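The LSTM stack is not reproduced here; the TextBlob labelling step the abstract mentions can be sketched in a few lines (the polarity thresholds are illustrative choices, not the paper's):

```python
from textblob import TextBlob

def weak_label(tweet: str) -> str:
    """Label a tweet from its TextBlob polarity in [-1, 1] (thresholds are illustrative)."""
    polarity = TextBlob(tweet).sentiment.polarity
    if polarity > 0.05:
        return "positive"
    if polarity < -0.05:
        return "negative"
    return "neutral"

print(weak_label("I love how fast this phone is"))   # positive
print(weak_label("worst customer service ever"))     # negative
```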
17 pages, 932 KiB  
Article
The Mitigators of Ad Irritation and Avoidance of YouTube Skippable In-Stream Ads: An Empirical Study in Taiwan
by Hota Chia-Sheng Lin, Neil Chueh-An Lee and Yi-Chieh Lu
Information 2021, 12(9), 373; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090373 - 14 Sep 2021
Cited by 7 | Viewed by 4540
Abstract
On YouTube, skippable in-stream advertisements (ads) are critical income for both YouTube and content creators. However, ads inevitably irritate viewers, and as a result, they tend to avoid ads. Thus, this study attempts to identify potential mitigators—source attractiveness and reciprocal altruism—of ad irritation and avoidance in the context of YouTube skippable in-stream ads. Using an online survey (n = 512) in Taiwan, the proposed model is examined by a partial least squares structural equation modeling analysis. The findings show that while ad irritation has a positive effect on ad avoidance, reciprocal altruism can significantly reduce both ad irritation and avoidance. However, source attractiveness fails to mitigate ad irritation and avoidance. Theoretical and managerial implications of these findings are discussed, and several solutions for reducing ad irritation and avoidance are provided. Full article
29 pages, 2759 KiB  
Article
Multi-Attribute Group Decision-Making Based on Interval-Valued q-Rung Orthopair Fuzzy Power Generalized Maclaurin Symmetric Mean Operator and Its Application in Online Education Platform Performance Evaluation
by Jun Wang and Yang Zhou
Information 2021, 12(9), 372; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090372 - 13 Sep 2021
Cited by 6 | Viewed by 1781
Abstract
This paper aims to propose a novel multi-attribute group decision-making (MAGDM) method based on interval-valued q-rung orthopair fuzzy sets (IVq-ROFSs). The IVq-ROFSs have been proved to be effective in handling MAGDM problems, and several novel decision-making methods have been proposed. Nevertheless, it is worth pointing out that these approaches still have some limitations, and there still exist some realistic situations that cannot be solved by existing MAGDM methods. Hence, the objective of this paper is to introduce a novel MAGDM method, which can overcome some of the drawbacks of existing approaches. To effectively and appropriately aggregate interval-valued q-rung orthopair fuzzy numbers (IVq-ROFNs), we combine the power average with generalized Maclaurin symmetric mean (GMSM), propose the power GMSM operator and extend it into IVq-ROFSs. Afterwards, a collection of new aggregation operators for IVq-ROFNs are developed. In this paper, we study definitions of these operators and investigate their characteristics as well as special cases. Then, based on the new aggregation operators, we present a new MAGDM method. Finally, we apply the proposed MAGDM method in online education platform performance evaluation to illustrate its effectiveness and validity. In addition, we also provide comparative analysis to explain why decision-makers should use our method instead of the others. Full article
(This article belongs to the Special Issue New Applications in Multiple Criteria Decision Analysis)
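For context, the classical Maclaurin symmetric mean, of which the paper's operator is a power-weighted, interval-valued generalisation, has the standard textbook form (generic notation, not the paper's):

```latex
\mathrm{MSM}^{(k)}(a_1,\dots,a_n)
  = \left( \frac{1}{\binom{n}{k}}
      \sum_{1 \le i_1 < \cdots < i_k \le n} \, \prod_{j=1}^{k} a_{i_j}
    \right)^{1/k},
  \qquad 1 \le k \le n .
```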
11 pages, 1839 KiB  
Article
A Novel Joint TDOA/FDOA Passive Localization Scheme Using Interval Intersection Algorithm
by Lingyu Ai, Min Pang, Changxu Shan, Chao Sun, Youngok Kim and Biao Zhou
Information 2021, 12(9), 371; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090371 - 13 Sep 2021
Cited by 5 | Viewed by 1965
Abstract
Due to large measurement errors in practical non-cooperative scenarios, passive localization algorithms based on traditional numerical calculation using time difference of arrival (TDOA) and frequency difference of arrival (FDOA) often have no solution, i.e., the estimated result cannot meet the localization background knowledge. In this context, this paper introduces interval analysis theory into a joint TDOA/FDOA-based localization algorithm. The proposed algorithm uses a dichotomy (bisection) procedure to fuse the interval measurements of TDOA and FDOA for estimating the velocity and position of a moving target. The estimation results are given in the form of an interval. The estimated interval must contain the true values of the position and velocity of the radiating target, and the size of the interval reflects the confidence of the estimation. A point estimate of the position and velocity of the target is given by the midpoint of the estimation interval. Simulation analysis shows the efficacy of the algorithm. Full article
(This article belongs to the Section Information Processes)
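The paper's joint TDOA/FDOA dichotomy algorithm is not reproduced here; the flavour of interval-based localization can be conveyed by a simplified, 2-D, TDOA-only set-inversion sketch in NumPy (sensor layout, measurement bounds, and the SIVIA-style bisection below are illustrative assumptions, not the paper's method):

```python
import numpy as np

SENSORS = np.array([[0.0, 0.0], [1000.0, 0.0], [0.0, 1000.0], [1000.0, 1000.0]])

def dist_interval(box, s):
    """[min, max] distance from sensor s to the box [[x_lo, x_hi], [y_lo, y_hi]]."""
    lo, hi = box[:, 0], box[:, 1]
    nearest = np.clip(s, lo, hi)
    farthest = np.where(np.abs(lo - s) > np.abs(hi - s), lo, hi)
    return np.linalg.norm(nearest - s), np.linalg.norm(farthest - s)

def consistent(box, measurements):
    """Box may contain the target iff every range-difference interval intersects."""
    d = [dist_interval(box, s) for s in SENSORS]
    for i, lo_m, hi_m in measurements:      # measured interval for pair (i, 0)
        lo_b, hi_b = d[i][0] - d[0][1], d[i][1] - d[0][0]
        if hi_b < lo_m or lo_b > hi_m:      # empty intersection: discard box
            return False
    return True

def sivia(box, measurements, eps=5.0):
    """Keep consistent boxes, bisecting along the widest axis until width < eps."""
    if not consistent(box, measurements):
        return []
    widths = box[:, 1] - box[:, 0]
    if widths.max() < eps:
        return [box]
    k = int(np.argmax(widths))
    mid = box[k].mean()
    left, right = box.copy(), box.copy()
    left[k, 1], right[k, 0] = mid, mid
    return sivia(left, measurements, eps) + sivia(right, measurements, eps)

# Hypothetical target and a +/-20 m bound on each measured range difference.
true_xy = np.array([620.0, 380.0])
d_true = np.linalg.norm(SENSORS - true_xy, axis=1)
measurements = [(i, d_true[i] - d_true[0] - 20.0, d_true[i] - d_true[0] + 20.0)
                for i in range(1, len(SENSORS))]

boxes = sivia(np.array([[0.0, 1000.0], [0.0, 1000.0]]), measurements)
point_estimate = np.mean([b.mean(axis=1) for b in boxes], axis=0)
print(len(boxes), point_estimate)   # the union of boxes encloses the true position
```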
21 pages, 2896 KiB  
Article
Hypergraph Application on Business Process Performance
by Khawla Bouafia and Bálint Molnár
Information 2021, 12(9), 370; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090370 - 13 Sep 2021
Cited by 2 | Viewed by 2247
Abstract
Graphical models of business processes (BP) or workflows in enterprise information systems (IS) are commonly used to represent the various activities, entities, relations, and functions of an enterprise, and the communication between them, with the major goal of supporting its operations. In this work, we use graph representation approaches, especially hypergraphs, to depict the complex relationships that exist among the artifacts and constituents of BP for more efficient and accurate manipulation. We used bipartite graph and hypergraph formats for storing and curating data. We investigated the various descriptive languages and representation models of BP, such as process modeling, workflow and process integration, and object-oriented (OO) languages. We carried out experiments using different combinations of approaches, but for observing the quilted representation, we focused on the main consistencies of “DBP”. As the final approach, we used the “DBP” stream and the data schemes we defined, employing pure Python to manually generate data and external Python libraries to store, curate, and visualize “DBP”. Full article
(This article belongs to the Special Issue Business Process Management)
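The paper's “DBP” data schemes are its own; the underlying idea of storing process constituents as a hypergraph can be sketched in pure Python, which the abstract notes was used for data handling (all names below are illustrative):

```python
from collections import defaultdict

# A hypergraph as a mapping: hyperedge name -> set of incident nodes.
# Each hyperedge can group any number of BP constituents (activities, roles,
# documents), unlike an ordinary graph edge. Names are illustrative only.
hypergraph = {
    "approve_order": {"clerk", "order_form", "check_credit", "notify_customer"},
    "ship_goods":    {"warehouse", "order_form", "pack_items", "dispatch"},
    "invoice":       {"clerk", "order_form", "send_invoice"},
}

# Derived bipartite (incidence) view: node -> hyperedges it participates in.
incidence = defaultdict(set)
for edge, nodes in hypergraph.items():
    for node in nodes:
        incidence[node].add(edge)

# Example query: which process fragments share the "order_form" artifact?
print(sorted(incidence["order_form"]))   # ['approve_order', 'invoice', 'ship_goods']
```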
19 pages, 626 KiB  
Article
A Buffer Management Algorithm Based on Dynamic Marking Threshold to Restrain MicroBurst in Data Center Network
by Yan Yu, Xianliang Jiang, Guang Jin, Zihang Gao and Penghui Li
Information 2021, 12(9), 369; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090369 - 12 Sep 2021
Cited by 1 | Viewed by 2051
Abstract
The data center has become the infrastructure of most Internet services, and its network carries different types of business flows, such as queries, data backups, and control information. At the same time, throughput-sensitive large flows occupy a lot of bandwidth, resulting in longer completion times for small flows and ultimately degrading application performance. Recent proposals consider only dynamically adjusting the ECN threshold or reversing the ECN packet priority. This paper combines these two improvements and presents the HDCQ method for coordinating data center queuing, which separates large and small flows and schedules them to ensure flow completion times. It uses the ECN mechanism to design a load-adaptive marking threshold update algorithm for small flows to prevent micro-bursts from occurring. At the same time, packets marked with ECN or ACK are raised in priority, prompting these packets to be fed back to the sender as soon as possible and effectively reducing the TCP control loop delay. Extensive experimental analysis on the network simulator (NS-2) shows that the HDCQ algorithm has better performance in the face of micro-burst traffic, reducing the average flow completion time by up to 24% compared with PIAS. Full article
20 pages, 1130 KiB  
Article
Advanced Fusion and Empirical Mode Decomposition-Based Filtering Methods for Breathing Rate Estimation from Seismocardiogram Signals
by Christina Kozia and Randa Herzallah
Information 2021, 12(9), 368; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090368 - 11 Sep 2021
Cited by 5 | Viewed by 2324
Abstract
Breathing Rate (BR), an important indicator of deterioration, has been widely neglected in hospitals because measuring it requires invasive procedures and skilled nursing staff. On the other hand, biomedical signals such as Seismocardiography (SCG), which measures heart vibrations transmitted to the chest wall, can be used as a non-invasive technique to estimate the BR. This makes SCG signals a highly appealing means of estimating the BR. As such, this work proposes three novel methods for extracting the BR from SCG signals. The first method is based on extracting respiration-dependent features such as the fundamental heart sound components, S1 and S2, from the SCG signal. The second method investigates for the first time the use of data-driven methods such as Empirical Mode Decomposition (EMD) to identify the respiratory component of an SCG signal. Finally, the third method is based on fusing frequency information from the respiration signals that result from the aforementioned proposed methods and other standard methods. The developed methods are then evaluated on adult recordings from the Combined measurement of ECG, Breathing and Seismocardiograms database. Both the fusion and EMD filter-based methods outperformed the individual methods, giving a mean absolute error of 1.5 breaths per minute using a one-minute window of data. Full article
(This article belongs to the Special Issue Biomedical Signal Processing and Data Analytics in Healthcare Systems)
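A rough, simplified illustration of the EMD-based idea (not the paper's full filtering and fusion pipeline), assuming the open-source PyEMD package (distributed as `EMD-signal`) and a placeholder SCG segment:

```python
import numpy as np
from PyEMD import EMD   # assumption: the "EMD-signal" package (PyEMD)

def breathing_rate_emd(scg: np.ndarray, fs: float):
    """Rough BR estimate: decompose an SCG segment with EMD and read off the
    dominant frequency of the IMF whose spectral peak lies in the respiratory
    band. Returns breaths per minute, or None if no IMF qualifies."""
    imfs = EMD().emd(scg)                               # array of IMFs, one per row
    freqs = np.fft.rfftfreq(scg.size, d=1.0 / fs)
    band = (freqs >= 0.1) & (freqs <= 0.5)              # ~6-30 breaths per minute
    best_bpm, best_power = None, -1.0
    for imf in imfs:
        spectrum = np.abs(np.fft.rfft(imf)) ** 2
        peak = int(np.argmax(spectrum[1:])) + 1         # skip the DC bin
        if band[peak] and spectrum[peak] > best_power:  # respiratory-band IMF
            best_power = spectrum[peak]
            best_bpm = freqs[peak] * 60.0
    return best_bpm

# Usage (placeholder signal): one minute of SCG sampled at 100 Hz.
# rate = breathing_rate_emd(scg_segment, fs=100.0)
```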
21 pages, 876 KiB  
Article
Game Design as an Autonomous Research Subject
by Pedro Pinto Neves and Nelson Zagalo
Information 2021, 12(9), 367; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090367 - 10 Sep 2021
Cited by 2 | Viewed by 2908
Abstract
This paper examines the methods and systems of game design from the standpoint that existing method proposals have failed to establish a common basis for systematizing design knowledge, which this paper aims to help resolve. Game design has often been subsumed by game development and associated disciplines, and game design methodology has often been subsumed by game analysis. This paper reviews related work in defining game design as an autonomous research subject and then divides the methods and systems of game design into complementary methods and core methods, with only the latter, consisting chiefly of design patterns, attempting to systematize how game design knowledge is generated. Seminal game patterns have been descriptive rather than prescriptive and so have failed to find the requisite practitioner adoption to fulfill their role as a living method. One recent pattern approach has sought to resolve this issue by promoting pattern usage generally over the adoption of a particular language. This paper outlines an alternative, and possibly complementary, approach: a novel, practical basis for game design literacy that helps core methods serve as a basis for systematizing game design knowledge. The proposed basis sacrifices descriptiveness for prescriptiveness in order to shape methods in that direction. Full article
(This article belongs to the Special Issue The Systems and Methods of Game Design)
21 pages, 1217 KiB  
Article
The Geranium Platform: A KG-Based System for Academic Publications
by Giovanni Garifo, Giuseppe Futia, Antonio Vetrò and Juan Carlos De Martin
Information 2021, 12(9), 366; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090366 - 08 Sep 2021
Viewed by 2314
Abstract
Knowledge Graphs (KGs) have emerged as a core technology for incorporating human knowledge because of their capability to capture the relational dimension of information and its semantic properties. The nature of KGs meets one of the vocational pursuits of academic institutions, which is sharing their intellectual output, especially publications. In this paper, we describe and make available the Polito Knowledge Graph (PKG), which semantically connects information on more than 23,000 publications and 34,000 authors, and Geranium, a semantic platform that leverages the properties of the PKG to offer advanced services for search and exploration. In particular, we describe the Geranium recommendation system, which exploits Graph Neural Networks (GNNs) to suggest collaboration opportunities between researchers of different disciplines. This work extends the state of the art because we use data from a real application in the scholarly domain, while the current literature still explores the combination of KGs and GNNs in prototypal contexts using synthetic data. The results show that the fusion of these technologies represents a promising approach for recommendation and metadata inference in the scholarly domain. Full article
(This article belongs to the Collection Knowledge Graphs for Search and Recommendation)
24 pages, 3292 KiB  
Article
Relationship between Perceived UX Design Attributes and Persuasive Features: A Case Study of Fitness App
by Kiemute Oyibo and Julita Vassileva
Information 2021, 12(9), 365; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090365 - 07 Sep 2021
Cited by 7 | Viewed by 4548
Abstract
Research shows that a well-designed user interface is more likely to be persuasive than a poorly designed one. However, there is a limited understanding of the relationship between user-experience (UX) design attributes and users’ receptiveness to the persuasive features of a persuasive technology aimed at motivating behavior change. To bridge this gap, we carried out an online case study among 228 participants from Canada and the United States to investigate the relationship between perceived UX design attributes and users’ receptiveness to persuasive features. The study serves as exploratory work by focusing on a single prototype (homepage of a fitness app); four commonly researched UX design attributes (perceived aesthetics, perceived usability, perceived credibility, and perceived usefulness); and six commonly employed persuasive features (Goal-Setting/Self-Monitoring, Reward, Cooperation, Competition, Social Comparison, and Social Learning) illustrated on storyboards. The results of the Partial Least Square Path Modeling show that perceived usefulness, followed by perceived aesthetics, has the strongest relationship with users’ receptiveness to the persuasive features of a fitness app. Specifically, perceived usefulness and perceived aesthetics have a significant relationship with users’ receptiveness to all but two of the six persuasive features, respectively, as well as with the overall perceived persuasiveness of the fitness app. These findings are supported by participants’ comments on the perceived UX design attributes of the fitness app and the persuasive features illustrated on the storyboards. However, perceived usability and perceived credibility have weak or non-significant relationships with users’ receptiveness to the six persuasive features. The findings suggest that designers should prioritize utilitarian benefits (perceived usefulness) and hedonic benefits (perceived aesthetics) over perceived usability and perceived credibility when designing fitness apps to support behavior change. Full article
(This article belongs to the Special Issue Designing Digital Health Technologies as Persuasive Technologies)
17 pages, 6172 KiB  
Article
CANet: A Combined Attention Network for Remote Sensing Image Change Detection
by Di Lu, Liejun Wang, Shuli Cheng, Yongming Li and Anyu Du
Information 2021, 12(9), 364; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090364 - 07 Sep 2021
Cited by 8 | Viewed by 2178
Abstract
Change detection (CD) is one of the essential tasks in remote sensing image processing and analysis. Remote sensing CD is a process of determining and evaluating changes in various surface objects over time. The impressive achievements of deep learning in image processing and computer vision provide an innovative concept for the task of CD. However, existing methods based on deep learning still have problems detecting small changed regions correctly and distinguishing the boundaries of the changed regions. To solve the above shortcomings and improve the efficiency of CD networks, inspired by the fact that an attention mechanism can refine features effectively, we propose an attention-based network for remote sensing CD, which has two important components: an asymmetric convolution block (ACB) and a combined attention mechanism. First, the proposed method extracts the features of bi-temporal images, which contain two parallel encoders with shared weights and structures. Then, the feature maps are fed into the combined attention module to reconstruct the change maps and obtain refined feature maps. The proposed CANet is evaluated on the two publicly available datasets for challenging remote sensing image CD. Extensive empirical results with four popular metrics show that the designed framework yields a robust CD detector with good generalization performance. In the CDD and LEVIR-CD datasets, the F1 values of the CANet are 3.3% and 1.3% higher than those of advanced CD methods, respectively. A quantitative analysis and qualitative comparison indicate that our method outperforms competitive baselines in terms of both effectiveness and robustness. Full article
21 pages, 1464 KiB  
Article
Discerning Meaning and Producing Information: Semiosis in Knowing the Past
by Kenneth Thibodeau
Information 2021, 12(9), 363; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090363 - 06 Sep 2021
Cited by 3 | Viewed by 2661
Abstract
This article explores how the meaning of information related to things, people, events, and processes in the past is discerned and interpreted to satisfy some current purpose. Starting from the premise that Information about the Past results from a cognitive construction, it considers factors that affect the probability of success in producing Information about the Past. The article analyzes the process, components, and products of learning about the past, building on Constructed Past Theory and applying concepts from semiotics. It identifies characteristic ways in which things in the past are misinterpreted. Full article
(This article belongs to the Special Issue Selected Papers for IT 2021: Information Theory)
19 pages, 4399 KiB  
Article
Dynamic Adaptation Method of Business Process Based on Hierarchical Feature Model
by Le Zhang, Qi Gao and Tingyu Li
Information 2021, 12(9), 362; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090362 - 05 Sep 2021
Cited by 1 | Viewed by 2671
Abstract
As business application scenarios become increasingly complex and change frequently, companies urgently need to establish a flexible business process management mechanism that includes dynamic rules, in which dynamic adaptation methods for business processes play a vital role. Current methods rely only on preset process templates and a decision-making database and therefore cannot respond quickly to business changes or reconfigure the business process. To address this problem, this research proposes a dynamic adaptation method for business processes based on a hierarchical feature model: it builds a hierarchical feature model of complex processes and then establishes a hierarchical business policy set to achieve an agile response to business emergencies. By constructing a mapping model, the feature model is associated with the BPMN model to realize rapid execution of the reconfigured process model. The feasibility and effectiveness of the proposed method are verified through process examples and the developed dynamic business process adaptation tool. Full article
10 pages, 1748 KiB  
Article
Cow Rump Identification Based on Lightweight Convolutional Neural Networks
by Handan Hou, Wei Shi, Jinyan Guo, Zhe Zhang, Weizheng Shen and Shengli Kou
Information 2021, 12(9), 361; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090361 - 02 Sep 2021
Cited by 3 | Viewed by 2511
Abstract
Individual identification of dairy cows based on computer vision technology shows strong performance and practicality. Accurate identification of each dairy cow is a prerequisite for applying artificial intelligence technology in smart animal husbandry. Like the back and head, the rump of each dairy cow has many distinctive features that are useful for individual recognition. In this paper, we propose a non-contact cow rump identification method based on convolutional neural networks. First, rump image sequences of the cows were collected while they were feeding. Then, an object detection model was applied to detect the cow rump in each frame. Finally, a fine-tuned convolutional neural network model was trained to identify cow rumps. An image dataset containing 195 different cows was created to validate the proposed method. The method achieved an identification accuracy of 99.76%, outperforming related methods and showing good potential for the actual production environment of cow husbandry; the model is also light enough to be deployed on an edge-computing device. Full article
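The paper's exact backbone is not specified in this listing; the general recipe of fine-tuning a lightweight pretrained CNN for 195-way identification can be sketched with PyTorch/torchvision (the weight enum and layer names follow torchvision's MobileNetV2 and should be treated as assumptions, not the paper's model):

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_COWS = 195  # number of individual cows in the dataset described above

# Lightweight ImageNet-pretrained backbone suitable for edge deployment.
model = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.IMAGENET1K_V1)

# Replace the classifier head for 195-way cow identification.
model.classifier[1] = nn.Linear(model.last_channel, NUM_COWS)

# Optionally freeze the feature extractor and fine-tune only the new head first.
for p in model.features.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(model.classifier.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch of rump crops (224x224 RGB).
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_COWS, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(float(loss))
```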
17 pages, 427 KiB  
Article
Topic Models Ensembles for AD-HOC Information Retrieval
by Pablo Ormeño, Marcelo Mendoza and Carlos Valle
Information 2021, 12(9), 360; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090360 - 01 Sep 2021
Cited by 1 | Viewed by 2480
Abstract
Ad hoc information retrieval (ad hoc IR) is a challenging task consisting of ranking text documents for bag-of-words (BOW) queries. Classic approaches based on query and document text vectors use term-weighting functions to rank the documents. Among their limitations is their inability to handle polysemous concepts; in addition, these methods introduce spurious orthogonalities between semantically related words. To address these limitations, model-based IR approaches based on topics have been explored. Specifically, topic models based on Latent Dirichlet Allocation (LDA) allow text documents to be represented in a latent topic space, better modeling polysemy and avoiding orthogonal representations of related terms. We extend LDA-based IR strategies using different ensemble strategies. Model selection follows the ensemble learning paradigm, for which we test two successful approaches widely used in supervised learning. We study Boosting and Bagging techniques for topic models, using each model as a weak IR expert. Then, we merge the ranking lists obtained from each model using a simple but effective top-k list fusion approach. We show that our proposal strengthens the results in precision and recall, outperforming classic IR models and strong baselines based on topic models. Full article
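The abstract does not spell out its top-k list fusion step; reciprocal rank fusion is one standard technique in the same spirit, sketched below in plain Python as an illustration rather than the paper's exact method:

```python
from collections import defaultdict

def reciprocal_rank_fusion(ranked_lists, k=60, top_k=10):
    """Fuse several ranked document lists: each document scores sum(1 / (k + rank)).
    A standard fusion heuristic, shown as a stand-in for the paper's fusion step."""
    scores = defaultdict(float)
    for ranking in ranked_lists:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] += 1.0 / (k + rank)
    fused = sorted(scores, key=scores.get, reverse=True)
    return fused[:top_k]

# Rankings produced by three topic-model "experts" (document ids are illustrative).
experts = [["d3", "d1", "d7", "d2"],
           ["d1", "d3", "d2", "d9"],
           ["d7", "d1", "d3", "d5"]]
print(reciprocal_rank_fusion(experts, top_k=5))   # starts with ['d1', 'd3', 'd7', 'd2', ...]
```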
11 pages, 846 KiB  
Article
Topic Modeling for Analyzing Topic Manipulation Skills
by Seok-Ju Hwang, Yoon-Kyoung Lee, Jong-Dae Kim, Chan-Young Park and Yu-Seop Kim
Information 2021, 12(9), 359; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090359 - 31 Aug 2021
Cited by 1 | Viewed by 1819
Abstract
There are many ways to communicate with people, the most representative of which is conversation. A smooth conversation should not only be grammatically appropriate but should also deal with the subject of the conversation; this is known as language ability. In the past, this ability has been evaluated by language analysis/therapy experts. However, this process is time-consuming and costly. In this study, the researchers developed the Hallym Systematic Analyzer of the Korean language to automate the conversation analysis process traditionally conducted by language analysis/therapy experts. However, current morpheme analyzers or parsers can only evaluate certain elements of a conversation. Therefore, in this paper, we added the ability to analyze topic manipulation skills (the number of topics and the rate of topic maintenance) to the existing Hallym Systematic Analyzer of the Korean language. The purpose of this study was to utilize topic modeling techniques to automatically evaluate topic manipulation skills. By quantitatively evaluating the topic management capabilities that were previously assessed manually, it was possible to automatically analyze language ability across a wider range of aspects. The experimental results show that the automatic analysis methodology presented in this study achieved a very high level of correlation with language analysis/therapy professionals. Full article
(This article belongs to the Special Issue Novel Methods and Applications in Natural Language Processing)
17 pages, 5627 KiB  
Article
Betraying Blockchain: Accountability, Transparency and Document Standards for Non-Fungible Tokens (NFTs)
by Kristin Cornelius
Information 2021, 12(9), 358; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090358 - 31 Aug 2021
Cited by 27 | Viewed by 15393
Abstract
Transparency and accountability are important aspects of any technological endeavor and are popular topics of research as many everyday items have become ‘smart’ and interact with user data on a regular basis. Recent technologies such as blockchain tout these traits through the design of their infrastructure and their ability to act as recordkeeping mechanisms. This project analyzes records produced by non-fungible tokens (NFTs), an increasingly popular blockchain application for recording and trading digital assets, and compares them to ‘document standards,’ an interdisciplinary framework drawing on contract law, diplomatics, document/interface theory, and evidentiary proof, to see whether they live up to the bar set by a body of literature concerned with authentic documents. Through a close reading of current policies on transparency (i.e., CCPA, GDPR) and on compliance and recordkeeping (i.e., FCPA, SOX, UETA), and by considering blockchain records as user-facing interfaces, this study concludes that unless these records are designed with these various concerns in mind and from the perspectives of all three stakeholders (Users, Firms, and Regulators), any transparency will only be illusory and could serve the opposite purpose for bad actors. Full article
(This article belongs to the Special Issue Pervasive Computing in IoT)
19 pages, 1625 KiB  
Article
P2ISE: Preserving Project Integrity in CI/CD Based on Secure Elements
by Antonio Muñoz, Aristeidis Farao, Jordy Ryan Casas Correia and Christos Xenakis
Information 2021, 12(9), 357; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090357 - 31 Aug 2021
Cited by 13 | Viewed by 3672
Abstract
During the past decade, software development has evolved from a rigid, linear process to a highly automated and flexible one, thanks to the emergence of continuous integration and delivery environments. Nowadays, more and more development teams rely on such environments to build their complex projects, as the advantages they offer are numerous. On the security side, however, most environments seem to focus on the authentication part, neglecting other critical aspects such as the integrity of the source code and the compiled binaries. To ensure the soundness of a software project, its source code must be secured from malicious modifications. Yet, no method can accurately verify that the integrity of the project’s source code has not been breached. This paper presents P2ISE, a novel integrity-preserving tool that provides strong security assertions for developers against attackers. At the heart of P2ISE lies the TPM trusted computing technology, which is leveraged to ensure integrity preservation. We have implemented P2ISE and quantitatively assessed its performance and efficiency. Full article
(This article belongs to the Special Issue Detecting Attack and Incident Zone System)
10 pages, 300 KiB  
Article
Multinomial Logistic Regression to Estimate and Predict the Job Opportunities for People with Disabilities in Chile
by Nelson Lay-Raby, Hanns de la Fuente-Mella and Omar Lameles-Corvalán
Information 2021, 12(9), 356; https://0-doi-org.brum.beds.ac.uk/10.3390/info12090356 - 31 Aug 2021
Cited by 2 | Viewed by 2861
Abstract
In Chile, there is growing interest from society in improving the access of people with disabilities to the labor market. However, applied research on this topic is not abundant. The purpose of this research is to estimate the job opportunities of people with disabilities in Chile. To this end, data from the second Chilean national disability study were used to build a multinomial logistic regression model to estimate how certain variables influence job opportunities. The model identified variables related to additional income (subsidies or extra income), educational level attained, pursuit of studies, and the degree of disability itself. In particular, variables related to the continuity of and access to studies were found to affect employment opportunities. Full article
(This article belongs to the Special Issue Data Analytics in Social Science and Information Theory II)