Special Issue "Computational Complexity"

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Complexity".

Deadline for manuscript submissions: closed (30 April 2016).

Special Issue Editors

Prof. Dr. José A. Tenreiro Machado
Guest Editor
Department of Electrical Engineering, Institute of Engineering, Polytechnic Institute of Porto, 4249-015 Porto, Portugal
Interests: nonlinear dynamics; fractional calculus; modeling; control; evolutionary computing; genomics; robotics; complex systems

Special Issue Information

Dear Colleagues,

Complex systems (CS) are composed of many parts that interact dynamically with each other at different scales in time and space. They possess the capability to evolve, to adapt, and to self-organize, revealing a collective behavior that is much richer than the one exhibited by their individual components. The high sensitivity of CS to initial conditions and their characteristic emergent behavior give rise to large-scale dynamics that cannot be described by a single rule, nor reduced to a single level of description. The challenges of modeling CS have led to the development of novel computational and modeling tools with applications to a wide range of scientific areas. The computational problems posed by CS exhibit intrinsic difficulties and are a major concern in Computational Complexity Theory.

This Special Issue focuses on new theoretical and practical findings, and on computational methods, applicable to difficult problems whose solution demands extensive and powerful resources, approaching the limits of available computer systems. Submissions addressing novel issues, as well as those on more specific topics illustrating the broad impact of computational complexity theory in computer science, mathematics, physics, engineering, geophysics, social science, and other fields, are welcome.

All papers should fit the scope of the journal Entropy. The topics of interest include (but are not limited to):

  • Agent based modeling and simulation
  • Advanced computational applications in geosciences, physics, mathematics, engineering, finance, economy, medicine, biology, and social science
  • Complex networks
  • Data mining and knowledge discovery
  • Intelligent systems
  • Probability and statistics in complex systems
  • Quantum information science
  • Novel computational schemes

Prof. Dr. J. A. Tenreiro Machado
Prof. Dr. António M. Lopes
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Complexity
  • Complex systems
  • Computation theory
  • Simulation and modeling
  • Entropy
  • Information
  • Algorithms

Published Papers (13 papers)


Editorial


Editorial
Computational Complexity
Entropy 2017, 19(2), 61; https://0-doi-org.brum.beds.ac.uk/10.3390/e19020061 - 07 Feb 2017
Cited by 1 | Viewed by 1861
Abstract
Complex systems (CS) involve many elements that interact at different scales in time and space. The challenges in modeling CS led to the development of novel computational tools with applications in a wide range of scientific areas. The computational problems posed by CS exhibit intrinsic difficulties that are a major concern in Computational Complexity Theory. [...]
(This article belongs to the Special Issue Computational Complexity)

Research


Article
Entropy Complexity and Stability of a Nonlinear Dynamic Game Model with Two Delays
Entropy 2016, 18(9), 317; https://0-doi-org.brum.beds.ac.uk/10.3390/e18090317 - 30 Aug 2016
Cited by 11 | Viewed by 2635
Abstract
In this paper, a duopoly game model with two delays in the hydropower market is established, and the research focuses on the influence of the time-delay parameter on the complexity of the system. Firstly, we establish a game model for enterprises that consider both current and historical output when making decisions. Secondly, the existence and stability of Hopf bifurcation are analyzed, and the conditions and main conclusions of Hopf bifurcation are given. Thirdly, numerical simulation and analysis are carried out to verify the conclusions of the theoretical analysis. The effect of the delay parameter on the stability of the system is simulated by a bifurcation diagram, the Lyapunov exponent, and an entropic diagram; in addition, the stability region of the system is given by 2D and 3D parameter bifurcation diagrams. Finally, the method of delayed feedback control is used to control the chaotic system. The research results can provide a guideline for enterprise decision-making.

Article
Normalized Minimum Error Entropy Algorithm with Recursive Power Estimation
Entropy 2016, 18(7), 239; https://0-doi-org.brum.beds.ac.uk/10.3390/e18070239 - 24 Jun 2016
Cited by 2 | Viewed by 2418
Abstract
The minimum error entropy (MEE) algorithm is known to be superior in signal processing applications under impulsive noise. In this paper, based on an analysis of the behavior of the optimum weight and its robustness against impulsive noise, a normalized version of the MEE algorithm is proposed. The step size of the MEE algorithm is normalized by the power of the input entropy, which is estimated recursively to reduce computational complexity. The proposed algorithm simultaneously yields a lower minimum MSE (mean squared error) and faster convergence than the original MEE algorithm in equalization simulations. At the same convergence speed, its steady-state MSE improves by more than 3 dB.
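The paper's key idea, normalizing the adaptive step size by a recursively estimated input power, can be sketched as follows. This is a minimal illustration on a generic LMS-style adaptive filter, not the actual MEE update; the channel coefficients and forgetting factor are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n, taps = 5000, 4
w_true = np.array([0.8, -0.4, 0.2, -0.1])    # unknown channel (hypothetical)
x = rng.standard_normal(n)
d = np.convolve(x, w_true)[:n] + 0.01 * rng.standard_normal(n)

w = np.zeros(taps)
mu, lam, eps = 0.5, 0.99, 1e-8
power = 1.0                                   # recursive input-power estimate
for k in range(taps - 1, n):
    u = x[k - taps + 1:k + 1][::-1]           # most recent inputs first
    power = lam * power + (1 - lam) * np.dot(u, u)
    e = d[k] - np.dot(w, u)                   # a-priori error
    w += (mu / (power + eps)) * e * u         # power-normalized step size
# w now closely approximates w_true
```

Replacing the instantaneous norm with the recursive estimate avoids recomputing the power from scratch at every step, which is the complexity reduction the abstract alludes to.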

Article
Empirical Laws and Foreseeing the Future of Technological Progress
Entropy 2016, 18(6), 217; https://0-doi-org.brum.beds.ac.uk/10.3390/e18060217 - 02 Jun 2016
Cited by 7 | Viewed by 2936
Abstract
Moore’s law (ML) is one of many empirical expressions used to characterize natural and artificial phenomena. ML addresses technological progress and is expected to predict future trends. Yet, the “art” of predicting is often confused with the accurate fitting of trendlines to past events. Presently, data series from multiple sources are available for scientific and computational processing. The data can be described by means of mathematical expressions that, in some cases, follow simple expressions and empirical laws. However, extrapolation toward the future is regarded with skepticism by the scientific community, particularly for phenomena involving complex behavior. This paper addresses these issues in the light of entropy and pseudo-state space. The statistical and dynamical techniques lead to a more assertive perspective on the adoption of a given candidate law.
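The trendline fitting the abstract contrasts with genuine prediction can be illustrated in a few lines: a Moore's-law-style series is log-linear, so ordinary least squares on the logarithm recovers its doubling time. The data here are synthetic, not the paper's dataset.

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical technology metric doubling every 2 years, with multiplicative noise.
years = np.arange(1990, 2016)
counts = 1e6 * 2.0 ** ((years - 1990) / 2.0) * np.exp(rng.normal(0, 0.1, years.size))

# Fitting the trendline: least squares on log2(counts) versus year.
slope, intercept = np.polyfit(years, np.log2(counts), 1)
doubling_time = 1.0 / slope   # years per doubling, close to the true 2
```

Fitting past data this well says nothing by itself about whether extrapolation is justified, which is exactly the point the paper examines with entropy and pseudo-state-space tools.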

Article
Multi-Level Formation of Complex Software Systems
Entropy 2016, 18(5), 178; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050178 - 12 May 2016
Cited by 9 | Viewed by 2688
Abstract
We present a multi-level formation model for complex software systems. Previous works abstract software systems into software networks for further study, but usually investigate these networks at the class level. In contrast, our treatment of software systems as multi-level networks is more realistic. In particular, the software networks are organized at three levels of granularity, representing the modularity and hierarchy in the formation process of real-world software systems. More importantly, simulations based on this model generate more realistic structural properties of software networks, such as power laws, clustering and modularization. On the basis of this model, we then explore how the structure of software systems affects software design principles, which could be helpful for understanding software evolution and software engineering practices.

Article
Forecasting Energy Value at Risk Using Multiscale Dependence Based Methodology
Entropy 2016, 18(5), 170; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050170 - 04 May 2016
Cited by 6 | Viewed by 2223
Abstract
In this paper, we propose a multiscale dependence-based methodology to analyze the dependence structure and to estimate the downside portfolio risk measures in the energy markets. More specifically, under this methodology, we formulate a new bivariate Empirical Mode Decomposition (EMD) copula-based approach to analyze and model the multiscale dependence structure in the energy markets. The proposed model constructs the copula-based dependence structure formulation in the Bivariate Empirical Mode Decomposition (BEMD)-based multiscale domain. Results from empirical studies using typical Australian electricity daily prices show that there exists a multiscale dependence structure between different regional markets across different scales. The proposed model, which takes the multiscale dependence structure into account, demonstrates statistically significant improvements in accuracy and reliability measures.

Article
Operational Complexity of Supplier-Customer Systems Measured by Entropy—Case Studies
Entropy 2016, 18(4), 137; https://0-doi-org.brum.beds.ac.uk/10.3390/e18040137 - 14 Apr 2016
Cited by 4 | Viewed by 2133
Abstract
This paper discusses a unified entropy-based approach for the quantitative measurement of the operational complexity of company supplier-customer relations. Classical Shannon entropy is utilized. Besides this quantification tool, we also explore the relations between Shannon entropy and (c,d)-entropy in more detail. An analytic description of the so-called iso-quant curves is given, too. We present five case studies, albeit in an anonymous setting, describing various details of general procedures for measuring the operational complexity of supplier-customer systems. In general, we assume a problem-oriented database exists, which contains detailed records of all product forecasts, orders and deliveries, both in quantity and time, scheduled and realized. Data processing detects important flow variations both in volumes and times, e.g., order versus forecast, delivery versus order, and actual versus scheduled production. The unifying quantity used for entropy computation is the time gap between actual delivery time and order issue time, which is nothing else but the lead time in inventory control models. After data consistency checks, histograms and empirical distribution functions are constructed. Finally, the entropy, an information-theoretic measure of supplier-customer operational complexity, is calculated. Basic steps of the algorithm are mentioned briefly, too. Results of supplier-customer system analysis from selected Czech small and medium-sized enterprises (SMEs) are presented in various computational and managerial decision-making details. An enterprise is ranked as an SME if it has at most 250 employees and its annual turnover does not exceed 50 million USD, or, alternatively, its balance sheet total does not exceed 43 million USD.
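The core computation described in this abstract, Shannon entropy of an empirical lead-time distribution, is compact enough to sketch. Bin count and data are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def lead_time_entropy(lead_times, bins=10):
    """Shannon entropy (bits) of the empirical lead-time distribution."""
    counts, _ = np.histogram(lead_times, bins=bins)
    p = counts[counts > 0] / counts.sum()   # normalize, drop empty bins
    return float(-(p * np.log2(p)).sum())

# A perfectly regular supplier (identical lead times) has zero entropy,
# i.e., zero operational complexity; lead times split evenly over two
# values give one bit.
h_regular = lead_time_entropy([5.0] * 100)
h_mixed = lead_time_entropy([1.0] * 50 + [2.0] * 50, bins=2)
```

Higher entropy of the delivery-minus-order time gaps signals a less predictable, operationally more complex supplier-customer relation.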

Article
A Comparison of Classification Methods for Telediagnosis of Parkinson’s Disease
Entropy 2016, 18(4), 115; https://0-doi-org.brum.beds.ac.uk/10.3390/e18040115 - 30 Mar 2016
Cited by 26 | Viewed by 4051
Abstract
Parkinson’s disease (PD) is a progressive and chronic nervous system disease that impairs speech, gait, and complex muscle-and-nerve actions. Early diagnosis of PD is quite important for alleviating the symptoms. Cost-effective and convenient telemedicine technology helps to distinguish patients with PD from healthy people using variations in dysphonia, gait or motor skills. In this study, a novel telemedicine technology was developed to detect PD remotely using dysphonia features. Feature transformation and several machine learning (ML) methods with 2-, 5- and 10-fold cross-validation were applied to the vocal features. The combination of principal component analysis (PCA) as a feature transformation (FT) and k-nearest neighbor (k-NN) as a classifier with 10-fold cross-validation achieved the best accuracy, 99.1%. All ML processes were applied to the prerecorded PD dataset using a newly created program named ParkDet 2.0. Additionally, a blind-test interface was created in ParkDet so that users can detect new patients with PD in the future. Clinicians or medical technicians, without any knowledge of ML, will be able to use the blind-test interface to detect PD at a clinic or remote location, utilizing the Internet as a telemedicine application.
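The winning pipeline named in the abstract, PCA as feature transformation followed by a k-NN classifier under 10-fold cross-validation, can be sketched with scikit-learn. The data here are synthetic stand-ins, not the actual vocal-feature dataset, and the dimensions and k are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the dysphonia feature matrix (hypothetical data).
X, y = make_classification(n_samples=200, n_features=22, n_informative=8,
                           random_state=0)

# Scale, PCA feature transformation, then k-NN, mirroring the FT + classifier scheme.
clf = make_pipeline(StandardScaler(), PCA(n_components=8),
                    KNeighborsClassifier(n_neighbors=5))
scores = cross_val_score(clf, X, y, cv=10)   # 10-fold cross-validation
mean_accuracy = scores.mean()
```

Wrapping the transformation and classifier in one pipeline ensures the PCA is refit inside each cross-validation fold, avoiding leakage from test folds into the feature transformation.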

Article
A Complexity-Based Approach for the Detection of Weak Signals in Ocean Ambient Noise
Entropy 2016, 18(3), 101; https://0-doi-org.brum.beds.ac.uk/10.3390/e18030101 - 18 Mar 2016
Cited by 31 | Viewed by 3282
Abstract
Numerous studies show a constant increase in ocean ambient noise levels and an ever-growing demand for algorithms that detect weak signals in ambient noise. In this study, we utilize dynamical and statistical complexity to detect the presence of weak ship noise embedded in ambient noise. The ambient noise and ship noise were recorded in the South China Sea. The multiscale entropy (MSE) method and the complexity-entropy causality plane (C-H plane) were used to quantify the dynamical and statistical complexity of the measured time series, respectively. We generated signals with varying signal-to-noise ratio (SNR) by varying the amplification of a ship signal. The simulation results indicate that complexity is sensitive to changes in the information content of the ambient noise and to changes in SNR, a finding that enables the detection of weak ship signals in strong background ambient noise. The results also show that complexity outperforms the traditional spectrogram method, and is particularly effective for detecting low-SNR signals in ambient noise. In addition, the complexity-based MSE and C-H plane methods are simple, robust and do not assume any underlying dynamics in the time series. Hence, complexity should be used in practical situations.
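The MSE method mentioned above has two steps: coarse-grain the series at each scale, then compute its sample entropy. The sketch below is a common textbook formulation of MSE run on synthetic white noise, not necessarily the exact implementation or parameters used in the paper.

```python
import numpy as np

def coarse_grain(x, scale):
    """Step 1 of MSE: average consecutive non-overlapping windows of length `scale`."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r_frac=0.2):
    """Step 2 of MSE: -ln of the ratio of (m+1)- to m-length template matches."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()                      # tolerance, a fraction of the SD
    def matches(length):
        t = np.array([x[i:i + length] for i in range(len(x) - length)])
        d = np.abs(t[:, None] - t[None, :]).max(axis=2)  # Chebyshev distances
        return np.sum(d <= r) - len(t)        # exclude self-matches
    a, b = matches(m + 1), matches(m)
    return -np.log(a / b) if a > 0 and b > 0 else float("inf")

rng = np.random.default_rng(0)
noise = rng.standard_normal(1000)
# The multiscale entropy curve: sample entropy at scales 1, 2 and 3.
mse_curve = [sample_entropy(coarse_grain(noise, s)) for s in (1, 2, 3)]
```

A weak deterministic ship signal changes the regularity of the measured series, which shifts this curve relative to pure ambient noise, the effect the detector exploits.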

Article
Selected Remarks about Computer Processing in Terms of Flow Control and Statistical Mechanics
Entropy 2016, 18(3), 93; https://0-doi-org.brum.beds.ac.uk/10.3390/e18030093 - 12 Mar 2016
Cited by 3 | Viewed by 2607
Abstract
Despite the fact that much has been said about processing in computer science, it seems that there is still much to do. A classical approach assumes that the computations done by computers are a kind of mathematical operation (calculation of function values) with no special relation to energy transformation and flow. However, it is possible to take a new view of selected topics; as a special case, the sorting problem is presented. We know many different sorting algorithms, including those with complexity O(n lg(n)), which means that the problem is algorithmically closed, but it is also possible to consider sorting in terms of flow control, entropy and statistical mechanics. This is done in relation to the existing definitions of sorting, the connections between sorting and ordering, and some important aspects of computer processing, understood as a flow, that are not taken into account in many theoretical considerations in computer science. The proposed view is an attempt to change the paradigm in the description of algorithms’ performance via computational complexity and processing, taking into account the existing links between the idea of Turing machines and their physical implementations. This proposal can be expressed as a physics of computer processing: a reference point for further analysis of algorithmic and interactive processing in computer systems.
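The entropy view of sorting mentioned in the abstract has a concrete counterpart: any comparison sort must, in the worst case, perform at least lg(n!) comparisons, since each comparison yields at most one bit about which of the n! permutations it faces. A merge sort instrumented to count comparisons makes the O(n lg(n)) bound tangible; this is a generic illustration, not code from the paper.

```python
import math
import random

def merge_sort(a, counter):
    """Merge sort; counter[0] tallies element comparisons."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid], counter)
    right = merge_sort(a[mid:], counter)
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        counter[0] += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

random.seed(1)
data = [random.random() for _ in range(1024)]
cnt = [0]
result = merge_sort(data, cnt)

# For context: the information-theoretic lower bound on worst-case
# comparisons for sorting n items is lg(n!).
bound = math.log2(math.factorial(1024))
```

For n = 1024, merge sort stays below n lg(n) = 10240 comparisons, close to the lg(1024!) ≈ 8770 bound that the entropy argument supplies.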

Article
Tea Category Identification Using a Novel Fractional Fourier Entropy and Jaya Algorithm
Entropy 2016, 18(3), 77; https://0-doi-org.brum.beds.ac.uk/10.3390/e18030077 - 27 Feb 2016
Cited by 84 | Viewed by 4315
Abstract
This work proposes a tea-category identification (TCI) system that automatically determines the tea category from images captured by a three charge-coupled device (CCD) digital camera. Three hundred tea images were acquired as the dataset. Apart from the 64 traditional color histogram features, we also introduced a relatively new feature, the fractional Fourier entropy (FRFE), and extracted 25 FRFE features from each tea image. Furthermore, kernel principal component analysis (KPCA) was harnessed to reduce the 64 + 25 = 89 features. The four reduced features were fed into a feedforward neural network (FNN), whose optimal weights were obtained by the Jaya algorithm. A 10 × 10-fold stratified cross-validation (SCV) showed that our TCI system obtains an overall average sensitivity of 97.9%, higher than seven existing approaches, while using only four features, no more than the state-of-the-art approaches. The proposed system is thus efficient for tea-category identification.

Article
Perturbation of Fractional Multi-Agent Systems in Cloud Entropy Computing
Entropy 2016, 18(1), 31; https://0-doi-org.brum.beds.ac.uk/10.3390/e18010031 - 19 Jan 2016
Cited by 8 | Viewed by 2286
Abstract
A perturbed multi-agent system is a scheme composed of multiple networking agents within a location. This scheme can be used to address problems that are impossible or difficult for a single agent to solve. Intelligence cloud entropy management systems involve functions, methods, procedural approaches, and algorithms. In this study, we introduce a new perturbed algorithm based on the fractional Poisson process. The discrete dynamics are suggested by using fractional entropy and fractional-type Tsallis entropy. Moreover, we study the stability of the algorithm.

Article
Evolution Characteristics of Complex Fund Network and Fund Strategy Identification
Entropy 2015, 17(12), 8073-8088; https://0-doi-org.brum.beds.ac.uk/10.3390/e17127861 - 08 Dec 2015
Cited by 5 | Viewed by 2592
Abstract
Earlier investment practices show that there is a discrepancy between the actual fund strategy and the stated fund strategy. Using a minimum spanning tree (MST) and a planar maximally-filtered graph (PMFG), we build a network of open-ended funds in China’s market and investigate the evolution characteristics of the networks over multiple time periods and timescales. The evolution characteristics, especially the locations of the clustering central nodes, show that the actual strategy of open-ended funds in China’s market differs significantly from the originally stated strategy. As the investment horizon and timescale extend, the funds approach an identical actual strategy. This work introduces a novel network-based quantitative method to help investors identify the actual strategy of open-ended funds.
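The MST construction underlying this kind of fund network can be sketched in a few lines: turn pairwise return correlations into distances, then extract the minimum spanning tree. The returns below are random stand-ins, not actual fund data, and the distance formula is the standard correlation distance rather than necessarily the paper's exact choice.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(0)
# Hypothetical daily returns for 10 funds over 250 trading days.
returns = rng.standard_normal((250, 10))

rho = np.corrcoef(returns, rowvar=False)      # 10 x 10 correlation matrix
dist = np.sqrt(2.0 * (1.0 - rho))             # correlation distance
mst = minimum_spanning_tree(dist)             # sparse matrix of tree edges
n_edges = mst.nnz                             # a tree on 10 nodes has 9 edges
```

Funds that cluster around the same central nodes of the tree move together, which is what lets the network reveal their actual, rather than stated, strategy.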
