Computational Intelligence, Soft Computing and Communication Networks for Applied Science

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Computing and Artificial Intelligence".

Deadline for manuscript submissions: closed (20 April 2020) | Viewed by 57916

Special Issue Editor


Prof. Dr. Jason K. Levy
Guest Editor
Disaster Preparedness and Emergency Management, University of Hawaii, 2540 Dole Street, Honolulu, HI 96822, USA
Interests: epidemiology and prevention of congenital anomalies; psychosis and affective psychosis; cancer epidemiology and prevention; molecular and human genome epidemiology; evidence synthesis related to public health and health services research

Special Issue Information

Dear Colleagues,

Thanks to their ability to capture the uncertainty, complexity, and stochastic nature of the underlying physical and sociopolitical processes, recent advances in artificial and computational intelligence have transformed the modeling and management of healthcare, environmental systems, and many other fields in the applied sciences. Computational intelligence and soft computing approaches not only process large amounts of information (historical data and/or data acquired through interaction with the environment) but also continually learn from the consequences of action–result combinations. All aspects of communication systems and networks and computational intelligence will be considered in this Special Issue. Artificial intelligence and soft computing paradigms often leverage nature-inspired computational methodologies, including artificial neural networks (ANNs), fuzzy sets, and evolutionary algorithms (EAs) such as genetic algorithms (GAs), as well as their hybridizations, such as neuro-fuzzy computing and neo-fuzzy systems. These systems have produced valuable, timely, robust, high-quality, and human-competitive results that have contributed to artificial intelligence research breakthroughs ranging from deep learning to genetic programming. Powerful computational intelligence and soft computing paradigms have recently been developed in numerous branches of soft systems science, including neural networks, swarm intelligence, expert systems, evolutionary computing, fuzzy systems, and artificial immune systems.

Prof. Dr. Jason K. Levy
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Soft, mobile cloud-based computing for social networks
  • Data mining and big data analytics for applied science and engineering
  • Fuzzy system theory in health and environmental applications
  • Socioenvironmental data analytical approaches using computational methods
  • Deep learning and machine learning algorithms for industrial applications
  • Intelligent techniques for smart surveillance and security in public health systems
  • Crowd computing-assisted access control and digital rights management
  • Evolutionary algorithms for data analysis and recommendations
  • Crowd intelligence and computing paradigms
  • Computer vision, image processing, and pattern recognition technologies for healthcare
  • Parallel and distributed computing for smart healthcare services
  • Autonomous systems and industrial process optimization
  • Extreme and intelligent manufacturing
  • Wireless and optical communications and networking
  • Parallel and distributed computing
  • Cloud computing and networks
  • Networked control systems and information security
  • Speech/image/video processing and communications 
  • Green computing and Internet of Things

Published Papers (14 papers)

Research

33 pages, 4021 KiB  
Article
Process Automation and Blockchain in Intelligence and Investigation Units: An Approach
by Gleidson Sobreira Leite, Adriano Bessa Albuquerque and Plácido Rogerio Pinheiro
Appl. Sci. 2020, 10(11), 3677; https://0-doi-org.brum.beds.ac.uk/10.3390/app10113677 - 26 May 2020
Cited by 11 | Viewed by 3046
Abstract
In the context of combating crime, government institutions in several countries have instituted units specialized in investigation and intelligence activities across different areas of expertise. However, due to the considerable complexity and specificity of these activities, as well as significant concerns about security and other related aspects, there are challenges in finding and adopting approaches for applying process automation in the context of these units. Motivated by this scenario, this work presents an approach that adopts process automation concepts in order to assist researchers and practitioners interested in the simplification and/or automation of processes in intelligence and investigation units. Exploring the main characteristics of blockchain technology, this paper also presents an overview of different application trends of the technology and proposes its use as a support mechanism in the management, storage, and sharing of generated digital assets. To analyze the feasibility of the approach, a survey was carried out with specialists from specialized units and a real-case usage scenario was evaluated. Results show evidence of the feasibility and suitability of the approach for the given context, and that it helps interested parties apply process automation in the scenario of intelligence and investigation units. Full article
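
The paper's proposal of blockchain as a support mechanism for managing and sharing digital assets can be illustrated with a minimal hash-chain sketch. The field names and structure below are purely illustrative assumptions, not the authors' implementation:

```python
# Minimal hash-chain sketch for registering digital assets produced by an
# investigation workflow. Field names are hypothetical, for illustration only.
import hashlib
import json
import time


def make_block(asset_metadata: dict, prev_hash: str) -> dict:
    """Create a block that anchors an asset's metadata to the previous block."""
    block = {
        "timestamp": time.time(),
        "asset": asset_metadata,        # e.g., case id, document digest, owner unit
        "prev_hash": prev_hash,
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block


# Chain two asset records together; tampering with the first record would
# break the prev_hash link verified by the second block.
genesis = make_block({"case_id": "A-001", "doc_sha256": "..."}, prev_hash="0" * 64)
follow_up = make_block({"case_id": "A-001", "doc_sha256": "..."}, prev_hash=genesis["hash"])
```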

14 pages, 5352 KiB  
Article
Online Mining Intrusion Patterns from IDS Alerts
by Kai Zhang, Shoushan Luo, Yang Xin, Hongliang Zhu and Yuling Chen
Appl. Sci. 2020, 10(8), 2983; https://0-doi-org.brum.beds.ac.uk/10.3390/app10082983 - 24 Apr 2020
Cited by 7 | Viewed by 2240
Abstract
The intrusion detection system (IDS), which is widely used in enterprises, produces a large number of logs, called alerts, from which intrusion patterns can be mined. These patterns can be used to reconstruct intrusion scenarios, discover the final objectives of malicious actors, and even assist in the forensic investigation of network crimes. In this paper, a novel algorithm for intrusion pattern mining is proposed, which aims to solve the difficult problems of intrusion action sequences, such as the loss of important intrusion actions, the disordering of the action sequence, and random noise actions. These problems commonly occur in real production environments and cause serious performance degradation in the analysis system. The proposed algorithm is based on the online analysis of intrusion action sequences extracted from IDS alerts: by calculating the influence of a particular action on subsequent actions, the real intrusion patterns are discovered. The experimental results show that the method is effective in discovering patterns from complex intrusion action sequences. Full article
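
The core idea, measuring how strongly one alert type tends to be followed by another, can be sketched as follows; the sliding-window size and the scoring rule are assumptions for illustration, not the authors' exact algorithm:

```python
# Illustrative follow-up influence scores between alert types in an action
# sequence; actions with uniformly weak influence can be treated as noise.
from collections import Counter, defaultdict


def influence_scores(actions, window=5):
    """Return the fraction of occurrences of `a` followed by `b` within `window` steps."""
    follows = defaultdict(Counter)
    occurrences = Counter(actions)
    for i, a in enumerate(actions):
        for b in set(actions[i + 1 : i + 1 + window]):
            follows[a][b] += 1
    return {
        (a, b): count / occurrences[a]
        for a, later in follows.items()
        for b, count in later.items()
    }


alerts = ["scan", "scan", "bruteforce", "login", "exfil", "scan", "bruteforce", "login"]
scores = influence_scores(alerts, window=3)
print(sorted(scores.items(), key=lambda kv: -kv[1])[:5])
```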

21 pages, 445 KiB  
Article
PRIPRO: A Comparison of Classification Algorithms for Managing Receiving Notifications in Smart Environments
by João Antônio Martins, Iago Sestrem Ochôa, Luis Augusto Silva, André Sales Mendes, Gabriel Villarrubia González, Juan De Paz Santana and Valderi Reis Quietinho Leithardt
Appl. Sci. 2020, 10(2), 502; https://0-doi-org.brum.beds.ac.uk/10.3390/app10020502 - 10 Jan 2020
Cited by 9 | Viewed by 2962
Abstract
With the evolution of technology over the years, it has become possible to develop intelligent environments based on the concepts of the Internet of Things, distributed systems, and machine learning. Such environments are infused with various solutions to satisfy user demands for services. One of these solutions is the Ubiquitous Privacy (UBIPRI) middleware, whose central concept is to maintain privacy in smart environments and which receives notifications as one of its services. However, this service is performed freely, disregarding the privacy policies that the environment employs. Moreover, in the related work surveyed, the authors do not use statistical hypothesis tests in the solutions developed in this context. This work proposes an architecture for notification management in smart environments, composed of a notification manager named Privacy Notification Manager (PRINM), which is assigned to UBIPRI, and carries out experiments comparing classification algorithms to determine which one is most suitable for the PRINM decision-making mechanism. The experiments showed that the J48 algorithm obtained the best results compared to the other algorithms tested. Full article

18 pages, 1538 KiB  
Article
Currency Crises Prediction Using Deep Neural Decision Trees
by David Alaminos, Rafael Becerra-Vicario, Manuel Á. Fernández-Gámez and Ana J. Cisneros Ruiz
Appl. Sci. 2019, 9(23), 5227; https://0-doi-org.brum.beds.ac.uk/10.3390/app9235227 - 01 Dec 2019
Cited by 12 | Viewed by 3484
Abstract
Currency crises are major events in the international monetary system. They affect the monetary policy of countries and are associated with risks of vulnerability for open economies. Much research has been carried out on the behavior of these events, and models have been developed to predict falls in the value of currencies. However, the limitations of existing models mean further research is required in this area, since the models are still of limited accuracy and have only been developed for emerging countries. This article presents an innovative global model for predicting currency crises. The analysis is geographically differentiated by region, considering both emerging and developed countries, and can accurately estimate future scenarios for currency crises at the global level. It uses a sample of 162 countries, making it possible to account for the regional heterogeneity of the warning indicators. The method used was deep neural decision trees (DNDTs), a technique based on decision trees implemented by deep learning neural networks, which was compared with other methodologies widely applied in prediction. Our model has significant potential for the adaptation of macroeconomic policy to the risks derived from falls in the value of currencies, providing tools that help ensure financial stability at the global level. Full article
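
Deep neural decision trees, as described in the literature, make the binning of each input feature differentiable through a softmax over learned cut points. A minimal one-feature sketch of that soft-binning step is shown below; the cut-point values are illustrative assumptions and this is not the authors' full model:

```python
# Soft binning of a single feature with n learned cut points: a softmax over
# (w * x + b) / tau approaches hard binning as the temperature tau -> 0.
import numpy as np


def soft_bin(x, cut_points, tau=0.1):
    n = len(cut_points)
    w = np.arange(1, n + 2, dtype=float)                 # [1, 2, ..., n+1]
    b = np.concatenate(([0.0], -np.cumsum(np.sort(cut_points))))
    logits = (w * x + b) / tau
    e = np.exp(logits - logits.max())                    # numerically stable softmax
    return e / e.sum()                                   # soft membership over n+1 bins


# A hypothetical crisis-indicator value of 0.45 against cut points at 0.3 and
# 0.6 falls mostly into the middle bin.
print(soft_bin(0.45, cut_points=[0.3, 0.6]))
```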

17 pages, 3242 KiB  
Article
A Dynamic Programmable Network for Large-Scale Scientific Data Transfer Using AmoebaNet
by Syed Asif Raza Shah and Seo-Young Noh
Appl. Sci. 2019, 9(21), 4541; https://0-doi-org.brum.beds.ac.uk/10.3390/app9214541 - 25 Oct 2019
Cited by 1 | Viewed by 2361
Abstract
Large scientific experimental facilities currently generate tremendous amounts of data, and in recent years significant growth in scientific data analysis has been observed across research centers. These facilities are producing an unprecedented amount of data and face new challenges in transferring large data sets across continents. In particular, data transfer now plays an important role in new scientific discoveries. The performance of a distributed scientific environment is highly dependent on high-performance, adaptive, and robust network service infrastructures. To support large-scale data transfer for extreme-scale distributed science, there is a need for high-performance, scalable, end-to-end, programmable networks that enable scientific applications to use the networks efficiently. We worked on the AmoebaNet solution to address the problem of a dynamic programmable network for bulk data transfer in extreme-scale distributed science environments. A major goal of the AmoebaNet project is to apply software-defined networking (SDN) technology to provide an “application-aware” network that facilitates bulk data transfer. We have prototyped AmoebaNet’s SDN-enabled network service, which allows applications to dynamically program the networks at run time for bulk data transfers. In this paper, we evaluated the AmoebaNet solution with real-world test cases and showed how it can efficiently and dynamically use networks for bulk data transfer in large-scale scientific environments. Full article

18 pages, 1701 KiB  
Article
An Effective Surrogate Ensemble Modeling Method for Satellite Coverage Traffic Volume Prediction
by Siyu Ye, Yi Zhang, Wen Yao, Quan Chen and Xiaoqian Chen
Appl. Sci. 2019, 9(18), 3689; https://0-doi-org.brum.beds.ac.uk/10.3390/app9183689 - 05 Sep 2019
Cited by 4 | Viewed by 1963
Abstract
The satellite constellation network is a powerful tool for providing ground traffic business services with continuous global coverage. For a resource-limited satellite network, it is necessary to predict the satellite coverage traffic volume (SCTV) in advance in order to properly allocate onboard resources for better task fulfillment. Traditionally, a global SCTV distribution data table is first statistically constructed on the ground from historical data and uploaded to the satellite; SCTV is then predicted onboard by a table lookup. However, the cost of the large data transmission and storage is prohibitive for satellites. To solve these problems, this paper proposes to distill the data into a surrogate model to be uploaded to the satellite, which can both save the valuable communication link resource and improve SCTV prediction accuracy compared to the table lookup. An effective surrogate ensemble modeling method is proposed for better prediction. First, according to prior geographical knowledge of the SCTV distribution, the global earth surface domain is split into multiple sub-domains. Second, on each sub-domain, multiple candidate surrogates are built. To fully exploit these surrogates and combine them into a more accurate ensemble, a partial weighted aggregation method (PWTA) is developed. For each sub-domain, PWTA adaptively selects the candidate surrogates with higher accuracy as the contributing models, from which the ultimate ensemble is constructed for sub-domain SCTV prediction. The proposed method is demonstrated and validated on an air traffic SCTV engineering problem. The results demonstrate the effectiveness of PWTA in terms of local and global prediction accuracy and modeling robustness. Full article
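
The partial weighted aggregation idea, keeping only the more accurate candidate surrogates for a sub-domain and weighting them by inverse validation error, can be sketched as follows. The selection rule, weighting scheme, candidate models, and synthetic data are assumptions, not the paper's exact PWTA formulation:

```python
# Build a per-sub-domain ensemble from candidate surrogates: drop weak models,
# weight the rest by inverse cross-validation error.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsRegressor
from sklearn.linear_model import Ridge
from sklearn.ensemble import RandomForestRegressor


def build_ensemble(X, y, candidates, keep=2):
    errors = []
    for model in candidates:
        mse = -cross_val_score(model, X, y, cv=5,
                               scoring="neg_mean_squared_error").mean()
        errors.append(mse)
    order = np.argsort(errors)[:keep]                 # contributing models only
    weights = np.array([1.0 / errors[i] for i in order])
    weights /= weights.sum()
    fitted = [candidates[i].fit(X, y) for i in order]
    return fitted, weights


def predict_ensemble(fitted, weights, X):
    preds = np.column_stack([m.predict(X) for m in fitted])
    return preds @ weights


rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 2))                        # stand-in for sub-domain coordinates
y = np.sin(4 * X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.normal(size=200)
models, w = build_ensemble(X, y, [Ridge(), KNeighborsRegressor(),
                                  RandomForestRegressor(n_estimators=50)])
print(predict_ensemble(models, w, X[:3]))
```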

12 pages, 3429 KiB  
Article
Automatic Defect Detection for Web Offset Printing Based on Machine Vision
by Erhu Zhang, Yajun Chen, Min Gao, Jinghong Duan and Cuining Jing
Appl. Sci. 2019, 9(17), 3598; https://0-doi-org.brum.beds.ac.uk/10.3390/app9173598 - 02 Sep 2019
Cited by 15 | Viewed by 3589
Abstract
In the printing industry, defect detection is of crucial importance for ensuring the quality of printed matter. However, little research has been conducted for web offset printing. In this paper, we propose an automatic defect detection method for web offset printing, which consists of determining the first row of captured images, image registration, and defect detection. Determining the first row of captured images is a problem particular to web offset printing, which has not been studied before. To solve this problem, a fast computational algorithm based on image projection is given, which converts 2D image searching into 1D feature matching. For image registration, a shape context descriptor is constructed by considering the concave–convex features of shapes, which effectively reduces the dimension of the features compared with traditional image registration methods. To tolerate the position difference and brightness deviation between the detected image and the reference image, a modified image subtraction is proposed for defect detection. The experimental results demonstrate the effectiveness of the proposed method. Full article
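
The final subtraction step can be illustrated with a small NumPy sketch that tolerates modest brightness deviation via a difference threshold; the tolerance and minimum-area parameters are illustrative assumptions, not the paper's modified subtraction:

```python
# Compare a captured print image against an already-registered reference:
# pixels whose absolute difference exceeds a brightness tolerance are flagged.
import numpy as np


def detect_defects(captured, reference, tolerance=25, min_area=20):
    diff = np.abs(captured.astype(np.int16) - reference.astype(np.int16))
    mask = diff > tolerance                          # defect candidate pixels
    return mask if mask.sum() >= min_area else np.zeros_like(mask)


reference = np.full((64, 64), 200, dtype=np.uint8)   # clean reference patch
captured = reference.copy()
captured[20:28, 30:38] = 90                          # simulated ink defect
print(detect_defects(captured, reference).sum())     # number of flagged pixels
```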

16 pages, 1043 KiB  
Article
An Integrated Cognitive Radio Network for Coastal Smart Cities
by Huma Ghafoor and Insoo Koo
Appl. Sci. 2019, 9(17), 3557; https://0-doi-org.brum.beds.ac.uk/10.3390/app9173557 - 30 Aug 2019
Cited by 2 | Viewed by 2531
Abstract
The integration of different networks has attracted significant attention in academia. Both terrestrial and maritime communications systems have been attracting keen interest as ways to deal with various applications. Because the environment of cognitive vehicular and maritime networks is extremely dynamic, these networks suffer from long delays caused by intermittent links while providing services for different applications. To this end, we introduce the integration of cognitive vehicular and maritime networks to design a coastal smart city by utilizing software-defined networking, network function virtualization, and fog computing under the same infrastructure. This novel integrated cognitive coastal city fulfills the demands of each application user in a hybrid environment with a quicker response time. The idea is to combine vehicular and maritime communications to meet different user demands. Different virtual networks are launched by network function virtualization and are managed and controlled by a software-defined networking controller. Through the integration of software-defined networking, network function virtualization, and fog computing, both vehicular and marine users are provided with stable paths that meet each application’s demands. Full article

22 pages, 1046 KiB  
Article
Dependency Analysis based Approach for Virtual Machine Placement in Software-Defined Data Center
by Jargalsaikhan Narantuya, Taejin Ha, Jaewon Bae and Hyuk Lim
Appl. Sci. 2019, 9(16), 3223; https://0-doi-org.brum.beds.ac.uk/10.3390/app9163223 - 07 Aug 2019
Cited by 4 | Viewed by 2854
Abstract
In data centers, cloud-based services are usually deployed across multiple virtual machines (VMs), and these VMs have data traffic dependencies on each other. However, traffic dependency between VMs has not been fully considered when the services running in the data center are expanded by creating additional VMs. If highly dependent VMs are placed on different physical machines (PMs), data traffic increases in the underlying physical network of the data center. To reduce the amount of data traffic in the underlying network and improve service performance, we propose a traffic-dependency-based strategy for VM placement in a software-defined data center (SDDC). The traffic dependencies between the VMs are analyzed by principal component analysis, and highly dependent VMs are grouped by gravity-based clustering. Each group of highly dependent VMs is placed within an appropriate PM based on the Hungarian matching method. This strategy of dependency-based VM placement reduces the data traffic volume of the data center, since the highly dependent VMs are placed within the same PM. The results of a performance evaluation in an SDDC testbed indicate that the proposed VM placement method efficiently reduces the amount of data traffic in the underlying network and improves data center performance. Full article
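
A minimal sketch of the pipeline follows: principal component analysis on the VM traffic matrix, clustering of dependent VMs, and Hungarian assignment of clusters to physical machines. The traffic data, the use of KMeans in place of gravity-based clustering, and the capacity-mismatch cost are all illustrative assumptions:

```python
# Group VMs by traffic dependency and assign each group to a physical machine.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(1)
traffic = rng.poisson(5, size=(12, 12)).astype(float)    # VM-to-VM traffic matrix
np.fill_diagonal(traffic, 0)

# 1) Project each VM's traffic profile onto its principal components.
profiles = PCA(n_components=3).fit_transform(traffic)

# 2) Cluster VMs with similar dependency profiles (KMeans stands in for the
#    paper's gravity-based clustering).
groups = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(profiles)

# 3) Assign groups to PMs via the Hungarian method, minimising an illustrative
#    cost: mismatch between group size and assumed PM residual capacity.
group_sizes = np.bincount(groups, minlength=4)
pm_capacity = np.array([4, 3, 3, 2])                      # assumed free VM slots per PM
cost = np.abs(group_sizes[:, None] - pm_capacity[None, :]).astype(float)
group_idx, pm_idx = linear_sum_assignment(cost)
print(dict(zip(group_idx.tolist(), pm_idx.tolist())))     # group -> PM placement
```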

16 pages, 2487 KiB  
Article
Toward Automatic Cardiomyocyte Clustering and Counting through Hesitant Fuzzy Sets
by Jiayao Wang, Olamide Timothy Tawose, Linhua Jiang and Dongfang Zhao
Appl. Sci. 2019, 9(14), 2875; https://0-doi-org.brum.beds.ac.uk/10.3390/app9142875 - 18 Jul 2019
Viewed by 2993
Abstract
The isolation and observation of cardiomyocytes serve as the fundamental approach to cardiovascular research. The state of the practice for isolation and observation relies on manual operation of the entire culture process. Such a manual approach not only incurs high rates of human error but also takes a long time. This paper proposes a new computer-aided paradigm to automatically, accurately, and efficiently perform the clustering and counting of cardiomyocytes, one of the key procedures for evaluating the success rate of cardiomyocyte isolation and the quality of the culture medium. The key challenge addressed by the proposed method lies in the unique, rod-like shape of cardiomyocytes, which has hardly been addressed in the literature. Our proposed method employs a novel algorithm inspired by hesitant fuzzy sets and integrates an efficient implementation into the whole process of analyzing cardiomyocytes. The system, along with data extracted from adult rats’ cardiomyocytes, has been experimentally evaluated with Matlab, showing promising results. The false accept rate (FAR) and the false reject rate (FRR) are as low as 1.46% and 1.97%, respectively. The accuracy rate is up to 98.7%, 20% higher than the manual approach, and the processing time is reduced from tens of seconds to 3–5 s, an order of magnitude performance improvement. Full article

20 pages, 2880 KiB  
Article
Performance Analysis of Feature Selection Methods in Software Defect Prediction: A Search Method Approach
by Abdullateef Oluwagbemiga Balogun, Shuib Basri, Said Jadid Abdulkadir and Ahmad Sobri Hashim
Appl. Sci. 2019, 9(13), 2764; https://0-doi-org.brum.beds.ac.uk/10.3390/app9132764 - 09 Jul 2019
Cited by 72 | Viewed by 4569
Abstract
Software Defect Prediction (SDP) models are built using software metrics derived from software systems. The quality of SDP models depends largely on the quality of the software metrics (dataset) used to build them. High dimensionality is one of the data quality problems that affect the performance of SDP models. Feature selection (FS) is a proven method for addressing the dimensionality problem. However, the choice of FS method for SDP is still an open problem, as most empirical studies on FS methods for SDP produce contradictory and inconsistent outcomes. FS methods behave differently due to their different underlying computational characteristics, and the impact of FS depends on the choice of search method. It is hence imperative to comparatively analyze the performance of FS methods based on different search methods in SDP. In this paper, four filter feature ranking (FFR) and fourteen filter feature subset selection (FSS) methods were evaluated using four different classifiers over five software defect datasets obtained from the National Aeronautics and Space Administration (NASA) repository. The experimental analysis showed that the application of FS improves the predictive performance of classifiers and that the performance of FS methods can vary across datasets and classifiers. Among the FFR methods, Information Gain demonstrated the greatest improvements in the performance of the prediction models. Among the FSS methods, Consistency Feature Subset Selection based on Best First Search had the best influence on the prediction models. However, prediction models based on FFR proved to be more stable than those based on FSS methods. Hence, we conclude that FS methods improve the performance of SDP models and that there is no single best FS method, as their performance varied according to the datasets and the choice of prediction model. However, we recommend the use of FFR methods, as prediction models based on FFR are more stable in terms of predictive performance. Full article
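
The filter-feature-ranking side of such a comparison, ranking features by information gain (mutual information) and keeping the top k before training a classifier, is easy to reproduce in outline. The synthetic dataset and the choice of k below are illustrative assumptions, not the NASA defect data:

```python
# Rank features by information gain, keep the top k, and compare a classifier's
# cross-validated accuracy with and without feature selection.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=500, n_features=40, n_informative=8,
                           random_state=0)              # stand-in for a defect dataset

baseline = cross_val_score(GaussianNB(), X, y, cv=5).mean()
with_ffr = cross_val_score(
    make_pipeline(SelectKBest(mutual_info_classif, k=8), GaussianNB()),
    X, y, cv=5).mean()

print(f"baseline accuracy: {baseline:.3f}, with information-gain FFR: {with_ffr:.3f}")
```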

15 pages, 1823 KiB  
Article
Fuzzy Logic Controller Parameter Optimization Using Metaheuristic Cuckoo Search Algorithm for a Magnetic Levitation System
by Gabriel García-Gutiérrez, Diego Arcos-Aviles, Enrique V. Carrera, Francesc Guinjoan, Emilia Motoasca, Paúl Ayala and Alexander Ibarra
Appl. Sci. 2019, 9(12), 2458; https://0-doi-org.brum.beds.ac.uk/10.3390/app9122458 - 16 Jun 2019
Cited by 30 | Viewed by 6178
Abstract
The main benefit of fuzzy logic control (FLC) is that qualitative knowledge of the desired system’s behavior can be included as IF–THEN linguistic rules for the control of dynamical systems where an analytic model is either not available or too complex due, for instance, to the presence of nonlinear terms. The computational structure requires the definition of the FLC parameters, namely the membership functions (MFs) and a rule base (RB) defining the desired control policy. However, the optimization of the FLC parameters is generally carried out by means of a trial-and-error procedure or, more recently, by using metaheuristic nature-inspired algorithms, for instance, particle swarm optimization, genetic algorithms, ant colony optimization, cuckoo search, etc. In this regard, the cuckoo search (CS) algorithm, one of the most promising and relatively recently developed nature-inspired algorithms, has so far been used to optimize FLC parameters in only a limited variety of applications, and only for the MFs, not the RB, as an extensive search of the literature has shown. In this paper, an optimization procedure based on the CS algorithm is presented to optimize all the parameters of the FLC, including the RB, and it is applied to a nonlinear magnetic levitation system. Comparative simulation results are provided to validate the improvements of this approach, which can be extended to other FLC-based control systems. Full article
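
Cuckoo search explores candidate parameter vectors via Lévy flights and abandons a fraction of poor nests each generation. A compact, generic sketch follows; the step size, bounds, and the placeholder objective are assumptions and stand in for the paper's magnetic-levitation FLC tuning cost:

```python
# Generic cuckoo search: Levy-flight moves plus abandonment of poor nests.
import math
import numpy as np

rng = np.random.default_rng(0)


def levy_step(dim, beta=1.5):
    """Mantegna's algorithm for Levy-distributed step lengths."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, dim)
    v = rng.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)


def cuckoo_search(cost, dim, n_nests=15, pa=0.25, iters=200, bounds=(-5, 5)):
    lo, hi = bounds
    nests = rng.uniform(lo, hi, (n_nests, dim))
    fitness = np.apply_along_axis(cost, 1, nests)
    for _ in range(iters):
        best = nests[fitness.argmin()]
        for i in range(n_nests):
            candidate = np.clip(nests[i] + 0.01 * levy_step(dim) * (nests[i] - best),
                                lo, hi)
            f = cost(candidate)
            if f < fitness[i]:
                nests[i], fitness[i] = candidate, f
        # Abandon a fraction pa of the worst nests and rebuild them randomly.
        worst = fitness.argsort()[-int(pa * n_nests):]
        nests[worst] = rng.uniform(lo, hi, (len(worst), dim))
        fitness[worst] = np.apply_along_axis(cost, 1, nests[worst])
    return nests[fitness.argmin()], fitness.min()


# Placeholder objective standing in for an FLC tuning cost (e.g., tracking error).
best_x, best_f = cuckoo_search(lambda x: np.sum(x ** 2), dim=4)
print(best_x, best_f)
```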

Review

27 pages, 3448 KiB  
Review
Emotion AI-Driven Sentiment Analysis: A Survey, Future Research Directions, and Open Issues
by Priya Chakriswaran, Durai Raj Vincent, Kathiravan Srinivasan, Vishal Sharma, Chuan-Yu Chang and Daniel Gutiérrez Reina
Appl. Sci. 2019, 9(24), 5462; https://0-doi-org.brum.beds.ac.uk/10.3390/app9245462 - 12 Dec 2019
Cited by 45 | Viewed by 12649
Abstract
The essential use of natural language processing is to analyze the sentiment of the author via the context. This sentiment analysis (SA) is said to determine the exactness of the underlying emotion in the context. It has been used in several subject areas such as stock market prediction, social media data on product reviews, psychology, judiciary, forecasting, disease prediction, agriculture, etc. Many researchers have worked on these areas and have produced significant results. These outcomes are beneficial in their respective fields, as they help to understand the overall summary in a short time. Furthermore, SA helps in understanding actual feedback shared across different platforms such as Amazon, TripAdvisor, etc. The main objective of this thorough survey was to analyze some of the essential studies done so far and to provide an overview of SA models in the area of emotion AI-driven SA. In addition, this paper offers a review of ontology-based SA and lexicon-based SA along with machine learning models that are used to analyze the sentiment of the given context. Furthermore, this work also discusses different neural network-based approaches for analyzing sentiment. Finally, these different approaches were also analyzed with sample data collected from Twitter. Among the four approaches considered in each domain, the aspect-based ontology method produced 83% accuracy among the ontology-based SAs, the term frequency approach produced 85% accuracy in the lexicon-based analysis, and the support vector machine-based approach achieved 90% accuracy among the other machine learning-based approaches. Full article
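
The machine-learning branch of such a comparison, a support vector machine over TF-IDF text features, is straightforward to reproduce in outline. The tiny corpus below is illustrative only and is not the survey's Twitter sample:

```python
# Minimal TF-IDF + linear SVM sentiment classifier.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

texts = ["great product, works perfectly", "terrible support, very disappointed",
         "absolutely love it", "waste of money"]
labels = ["pos", "neg", "pos", "neg"]

model = make_pipeline(TfidfVectorizer(), LinearSVC()).fit(texts, labels)
print(model.predict(["really love the support"]))
```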

29 pages, 1125 KiB  
Review
Application of Technological Solutions in the Fight Against Money Laundering—A Systematic Literature Review
by Gleidson Sobreira Leite, Adriano Bessa Albuquerque and Plácido Rogerio Pinheiro
Appl. Sci. 2019, 9(22), 4800; https://0-doi-org.brum.beds.ac.uk/10.3390/app9224800 - 10 Nov 2019
Cited by 13 | Viewed by 5689
Abstract
With the growing interest in technological solutions aimed at combating money laundering, several studies involving the application of technology have been carried out. However, there were no records of studies aimed at identifying, selecting, rigorously analyzing and synthesizing the literature on solutions that adopt technology to combat money laundering. This paper presents a systematic review of the literature on the application of technological solutions in the fight against money laundering. Seventy-one papers were selected from the 795 studies initially retrieved for data extraction, analysis and synthesis based on predefined inclusion and exclusion criteria. The results obtained with the data analysis made it possible to identify a general categorization of the domains of application of the approaches, as well as a mapping and classification of the support mechanisms adopted. The findings of this review showed that, among the application domain categories identified, the detection of suspicious transactions attracted greater attention from researchers. Regarding the support mechanisms adopted, the application of data mining techniques was used more extensively to detect money laundering. Topics for further research and refinement were also identified, such as the need for a better description of data analysis to provide more convincing evidence to support the benefits presented. Full article