
Entropy, Volume 22, Issue 4 (April 2020) – 120 articles

Cover Story: Here, we motivate a geometric perspective on the concept of information flow between components of a complex dynamical system. The most popular methods in this area are probabilistic in nature, including the Nobel-prize-winning work on Granger causality and the recently popular transfer entropy. Beyond conceptual advancement, a geometric description of causality further allows for new and efficient computational methods of causality inference. In this direction, we introduce a new measure of causal inference based on contrasting fractal correlation dimensions, conditionally applied to competing explanations of future forecasts. In this setting, we believe our geometric interpretation of information flow has both computational efficiency and theoretical interpretability to contribute positively to many fields of science. View this paper.
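The fractal correlation dimension contrasted in the cover story can be estimated from data with the classic Grassberger–Procaccia correlation sum. The sketch below is a minimal two-radius slope estimate on synthetic one-dimensional data — an illustrative assumption, not the paper's actual estimator or data:

```python
import math

def correlation_sum(points, r, tol=1e-12):
    """C(r): fraction of distinct pairs closer than r (Grassberger-Procaccia)."""
    n = len(points)
    close = sum(1 for i in range(n) for j in range(i + 1, n)
                if abs(points[i] - points[j]) <= r + tol)
    return close / (n * (n - 1) / 2)

def correlation_dimension(points, r1, r2):
    """Two-radius slope estimate: d ~ log(C(r2)/C(r1)) / log(r2/r1)."""
    return (math.log(correlation_sum(points, r2) / correlation_sum(points, r1))
            / math.log(r2 / r1))

# Points filling a line segment should have correlation dimension close to 1.
line = [i / 200 for i in range(200)]
```

In practice one fits the slope of log C(r) against log r over a range of radii; the two-radius version above is the simplest instance of that idea.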
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
Article
Robust Change Point Test for General Integer-Valued Time Series Models Based on Density Power Divergence
Entropy 2020, 22(4), 493; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040493 - 24 Apr 2020
Cited by 7 | Viewed by 1322
Abstract
In this study, we consider the problem of testing for a parameter change in general integer-valued time series models whose conditional distribution belongs to the one-parameter exponential family when the data are contaminated by outliers. In particular, we use a robust change point test based on density power divergence (DPD) as the objective function of the minimum density power divergence estimator (MDPDE). The results show that under regularity conditions, the limiting null distribution of the DPD-based test is a function of a Brownian bridge. Monte Carlo simulations are conducted to evaluate the performance of the proposed test and show that the test inherits the robust properties of the MDPDE and DPD. Lastly, we demonstrate the proposed test using a real data analysis of the return times of extreme events related to Goldman Sachs Group stock. Full article
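As a hedged illustration of the minimum density power divergence idea only (not the paper's change-point statistic or its integer-valued time series models), the sketch below fits a Poisson mean by grid-minimizing the DPD objective; the sample data, α = 1, and grid are all illustrative assumptions:

```python
import math

def pois_pmf(x, lam):
    """Poisson pmf evaluated in log space to avoid overflow."""
    return math.exp(-lam + x * math.log(lam) - math.lgamma(x + 1))

def dpd_objective(lam, data, alpha=1.0, support=200):
    """DPD objective (up to a theta-free constant):
    sum_x f(x)^(1+alpha) - (1 + 1/alpha) * mean_i f(X_i)^alpha."""
    term1 = sum(pois_pmf(x, lam) ** (1 + alpha) for x in range(support))
    term2 = sum(pois_pmf(x, lam) ** alpha for x in data) / len(data)
    return term1 - (1 + 1 / alpha) * term2

def mdpde(data, alpha=1.0):
    """Grid-search minimum density power divergence estimate of the Poisson mean."""
    grid = [0.5 + 0.05 * k for k in range(190)]
    return min(grid, key=lambda lam: dpd_objective(lam, data, alpha))

# Poisson-like counts (mean near 3) contaminated by one large outlier.
data = [2, 3, 4, 3, 2, 5, 3, 4, 2, 3, 50]
```

Because the outlier's density f(50; λ)^α is essentially zero near the bulk of the data, it barely moves the DPD estimate, whereas the sample mean is pulled far upward — the robustness property the abstract attributes to the MDPDE.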

Article
An Efficient, Parallelized Algorithm for Optimal Conditional Entropy-Based Feature Selection
Entropy 2020, 22(4), 492; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040492 - 24 Apr 2020
Viewed by 1263
Abstract
In Machine Learning, feature selection is an important step in classifier design. It consists of finding a subset of features that is optimal for a given cost function. One way to solve feature selection is to organize all possible feature subsets into a Boolean lattice and exploit the fact that the costs of chains in that lattice describe U-shaped curves. Minimization of such a cost function is known as the U-curve problem. Recently, a study proposed U-Curve Search (UCS), an optimal algorithm for that problem, which was successfully used for feature selection. However, despite the algorithm's optimality, the time required by UCS in computational assays was exponential in the number of features. Here, we report that this scalability issue arises because the U-curve problem is NP-hard. We then introduce the Parallel U-Curve Search (PUCS), a new algorithm for the U-curve problem. In PUCS, we present a novel way to partition the search space into smaller Boolean lattices, thus rendering the algorithm highly parallelizable. We also provide computational assays with both synthetic data and Machine Learning datasets, in which the performance of PUCS was assessed against UCS and other gold-standard algorithms for feature selection. Full article
(This article belongs to the Special Issue Information-Theoretical Methods in Data Mining)
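The cost function at stake can be made concrete with a minimal sketch of conditional-entropy feature selection by exhaustive search — the brute-force baseline that UCS/PUCS aim to beat, not those algorithms themselves; the toy XOR dataset is an illustrative assumption:

```python
import math
from itertools import combinations
from collections import Counter

def conditional_entropy(X_cols, y):
    """H(Y | X_S) in bits, estimated from the empirical joint distribution."""
    n = len(y)
    groups, joint = Counter(), Counter()
    for i in range(n):
        key = tuple(col[i] for col in X_cols)
        groups[key] += 1
        joint[(key, y[i])] += 1
    return -sum((c / n) * math.log2(c / groups[key])
                for (key, _), c in joint.items())

def best_subset(X, y):
    """Exhaustive search: smallest feature subset minimizing H(Y | X_S)."""
    d = len(X)
    best = (float("inf"), None)
    for k in range(d + 1):                      # ascending size => smallest first
        for S in combinations(range(d), k):
            h = conditional_entropy([X[j] for j in S], y)
            if h < best[0] - 1e-12:             # strict improvement keeps smaller S
                best = (h, S)
    return best

# y = XOR of features 0 and 1; feature 2 is an irrelevant distractor.
X = [[0, 0, 1, 1, 0, 1, 0, 1],
     [0, 1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 0, 0, 0, 1, 1]]
y = [a ^ b for a, b in zip(X[0], X[1])]
```

Neither feature 0 nor 1 alone reduces H(Y | X_S) to zero, but together they do — the kind of non-monotone, U-shaped chain cost that motivates the lattice formulation.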

Article
Useful Dual Functional of Entropic Information Measures
Entropy 2020, 22(4), 491; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040491 - 24 Apr 2020
Viewed by 994
Abstract
There are entropic functionals galore, but no simple objective measures to distinguish between them. We remedy this situation here by appeal to Born’s proposal, of almost a hundred years ago, that the square modulus of any wave function, |ψ|², be regarded as a probability distribution P. The usefulness of information measures like Shannon’s in this pure-state context was highlighted in [Phys. Lett. A 1993, 181, 446]. Here we apply the notion with the purpose of generating a dual functional F_α : {S_Q} → ℝ⁺, which maps entropic functionals onto positive real numbers. In this endeavor, we use as standard ingredients the coherent states of the harmonic oscillator (CHO), which are unique in the sense of possessing minimum uncertainty. This use is greatly facilitated by the fact that the CHO can be given an analytic, compact closed form, as shown in [Rev. Mex. Fis. E 2019, 65, 191]. Rewarding insights are obtained regarding the comparison between several standard entropic measures. Full article
(This article belongs to the Special Issue Entropic Forces in Complex Systems)

Article
Limitations to Estimating Mutual Information in Large Neural Populations
Entropy 2020, 22(4), 490; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040490 - 24 Apr 2020
Cited by 1 | Viewed by 1520
Abstract
Information theory provides a powerful framework to analyse the representation of sensory stimuli in neural population activity. However, estimating the quantities involved, such as entropy and mutual information, from finite samples is notoriously hard, and any direct estimate is known to be heavily biased. This is especially true when considering large neural populations. We study a simple model of sensory processing and show through a combinatorial argument that, with high probability, for large neural populations any finite number of samples of neural activity in response to a set of stimuli is mutually distinct. As a consequence, the mutual information, when estimated directly from empirical histograms, will be equal to the stimulus entropy. Importantly, this is the case irrespective of the precise relation between stimulus and neural activity and corresponds to a maximal bias. This argument is general and applies to any application of information theory where the state space is large and one relies on empirical histograms. Overall, this work highlights the need for alternative approaches for an information theoretic analysis when dealing with large neural populations. Full article
(This article belongs to the Special Issue Information Theory in Computational Neuroscience)
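The counting argument above can be seen in a toy plug-in estimate: when every recorded response pattern is distinct, the empirical-histogram mutual information collapses onto the stimulus entropy no matter what the true stimulus–response relation is. A minimal sketch, with illustrative synthetic data rather than the paper's model:

```python
import math
from collections import Counter

def plugin_mutual_information(pairs):
    """Plug-in (empirical histogram) estimate of I(S; R) in bits."""
    n = len(pairs)
    ps = Counter(s for s, _ in pairs)
    pr = Counter(r for _, r in pairs)
    psr = Counter(pairs)
    # I = sum p(s,r) * log2( p(s,r) / (p(s) p(r)) ), all from raw counts.
    return sum((c / n) * math.log2(c * n / (ps[s] * pr[r]))
               for (s, r), c in psr.items())

def stimulus_entropy(pairs):
    """H(S) in bits from the empirical stimulus marginal."""
    n = len(pairs)
    ps = Counter(s for s, _ in pairs)
    return -sum((c / n) * math.log2(c / n) for c in ps.values())

# Four stimuli, 25 trials each; every response is distinct, as would
# happen for high-dimensional population activity with finite samples.
pairs = [(s, (s, t)) for s in range(4) for t in range(25)]
```

Here the responses happen to carry the stimulus, but the identity I_hat = H(S) would hold just as well for pure-noise responses, as long as every response is unique — which is exactly the maximal bias the abstract describes.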

Article
Energy Dissipation and Decoherence in Solid-State Quantum Devices: Markovian versus non-Markovian Treatments
Entropy 2020, 22(4), 489; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040489 - 24 Apr 2020
Cited by 2 | Viewed by 1020
Abstract
The design and optimization of new-generation solid-state quantum hardware absolutely requires reliable dissipation versus decoherence models. Depending on the device operational condition, the latter may range from Markov-type schemes (both phenomenological- and microscopic-like) to quantum-kinetic approaches. The primary goal of this paper is to review in a cohesive way the virtues versus limitations of the most popular approaches, focussing on a few critical issues recently pointed out (see, e.g., Phys. Rev. B 90, 125140 (2014); Eur. Phys. J. B 90, 250 (2017)) and linking them within a common framework. By means of properly designed simulated experiments on a prototypical quantum-dot nanostructure (described via a two-level electronic system coupled to a phonon bath), we show that both conventional (i.e., non-Lindblad) Markov models and density-matrix-based non-Markov approaches (i.e., quantum-kinetic treatments) may lead to significant positivity violations. While in the former case the problem is easily avoidable by choosing genuine Lindblad-type dissipation models, for the latter a general strategy is still missing. Full article
(This article belongs to the Special Issue Open Quantum Systems (OQS) for Quantum Technologies)

Article
Comparison of Outlier-Tolerant Models for Measuring Visual Complexity
Entropy 2020, 22(4), 488; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040488 - 24 Apr 2020
Cited by 3 | Viewed by 1244
Abstract
Providing the visual complexity of an image in terms of impact or aesthetic preference can be of great applicability in areas such as psychology or marketing. To this end, certain areas such as Computer Vision have focused on identifying features and computational models that allow for satisfactory results. This paper studies the application of recent ML models using input images evaluated by humans and characterized by features related to visual complexity. According to the experiments carried out, it was confirmed that one of these methods, Correlation by Genetic Search (CGS), based on the search for minimum sets of features that maximize the correlation of the model with respect to the input data, predicted human ratings of image visual complexity better than any other model referenced to date in terms of correlation, RMSE, or the minimum number of features required by the model. In addition, the variability of these terms was studied after eliminating images considered outliers in previous studies, demonstrating the robustness of the method in selecting the most important variables for the prediction. Full article

Article
An Improved Total Uncertainty Measure in the Evidence Theory and Its Application in Decision Making
Entropy 2020, 22(4), 487; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040487 - 24 Apr 2020
Cited by 7 | Viewed by 1248
Abstract
Dempster–Shafer evidence theory (DS theory) has some superiorities in uncertain information processing for a large variety of applications. However, the problem of how to quantify the uncertainty of a basic probability assignment (BPA) in the DS theory framework remains unresolved. The goal of this paper is to define a new belief entropy for measuring the uncertainty of a BPA with desirable properties. The new entropy can be helpful for uncertainty management in practical applications such as decision making. The proposed uncertainty measure has two components. The first component is an improved version of the Dubois–Prade entropy, which aims to capture the non-specificity portion of uncertainty with a consideration of the number of elements in the frame of discernment (FOD). The second component is adopted from the Nguyen entropy, which captures conflict in the BPA. We prove that the proposed entropy satisfies some desired properties proposed in the literature. In addition, the proposed entropy reduces to the Shannon entropy if the BPA is a probability distribution. Numerical examples are presented to show the efficiency and superiority of the proposed measure, as well as an application in decision making. Full article
(This article belongs to the Section Signal and Data Analysis)
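The two-component structure can be sketched with the plain building blocks the abstract names — Dubois–Prade nonspecificity plus a Nguyen-style conflict term — rather than the paper's improved measure; the sketch shows the reduction to Shannon entropy when the BPA is Bayesian (all mass on singletons):

```python
import math

def nonspecificity(bpa):
    """Dubois-Prade component: sum over focal sets A of m(A) * log2|A|."""
    return sum(m * math.log2(len(A)) for A, m in bpa.items() if m > 0)

def conflict(bpa):
    """Nguyen-style component: -sum over focal sets A of m(A) * log2 m(A)."""
    return -sum(m * math.log2(m) for m in bpa.values() if m > 0)

def total_uncertainty(bpa):
    """Illustrative two-part belief entropy (not the paper's improved measure)."""
    return nonspecificity(bpa) + conflict(bpa)

# Bayesian BPA: every focal set is a singleton, so nonspecificity vanishes
# and the total reduces to the Shannon entropy of (0.5, 0.25, 0.25) = 1.5 bits.
bayesian = {frozenset({'a'}): 0.5, frozenset({'b'}): 0.25, frozenset({'c'}): 0.25}
```

Conversely, the vacuous BPA (all mass on the whole FOD) has zero conflict but maximal nonspecificity log2|FOD| — the two extremes any combined measure must separate.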

Article
A New Limit Theorem for Quantum Walk in Terms of Quantum Bernoulli Noises
Entropy 2020, 22(4), 486; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040486 - 24 Apr 2020
Viewed by 1104
Abstract
In this paper, we consider limit probability distributions of the quantum walk recently introduced by Wang and Ye (C.S. Wang and X.J. Ye, Quantum walk in terms of quantum Bernoulli noises, Quantum Inf. Process. 15 (2016), no. 5, 1897–1908). We first establish several technical theorems, which themselves are also interesting. Then, by using these theorems, we prove that, for a wide range of choices of the initial state, the above-mentioned quantum walk has a limit probability distribution of standard Gauss type, which actually gives a new limit theorem for the walk. Full article
(This article belongs to the Special Issue Quantum Information Processing)
Article
Cooperation on Interdependent Networks by Means of Migration and Stochastic Imitation
Entropy 2020, 22(4), 485; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040485 - 23 Apr 2020
Cited by 16 | Viewed by 1392
Abstract
Evolutionary game theory in the realm of network science appeals to a lot of research communities, as it constitutes a popular theoretical framework for studying the evolution of cooperation in social dilemmas. Recent research has shown that cooperation is markedly more resistant in interdependent networks, where traditional network reciprocity can be further enhanced due to various forms of interdependence between different network layers. However, the role of mobility in interdependent networks is yet to gain its well-deserved attention. Here we consider an interdependent network model, where individuals in each layer follow different evolutionary games, and where each player is considered as a mobile agent that can move locally inside its own layer to improve its fitness. Probabilistically, we also consider an imitation possibility from a neighbor on the other layer. We show that, by considering migration and stochastic imitation, further fascinating gateways to cooperation on interdependent networks can be observed. Notably, cooperation can be promoted on both layers, even if cooperation without interdependence would be improbable on one of the layers due to adverse conditions. Our results provide a rationale for engineering better social systems at the interface of networks and human decision making under testing dilemmas. Full article
(This article belongs to the Special Issue Dynamic Processes on Complex Networks)

Article
Melanoma and Nevus Skin Lesion Classification Using Handcraft and Deep Learning Feature Fusion via Mutual Information Measures
Entropy 2020, 22(4), 484; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040484 - 23 Apr 2020
Cited by 22 | Viewed by 2098
Abstract
In this paper, a new Computer-Aided Detection (CAD) system for the detection and classification of dangerous skin lesions (melanoma type) is presented, through a fusion of handcraft features related to the medical algorithm ABCD rule (Asymmetry, Borders, Colors, Dermatoscopic Structures) and deep learning features, employing Mutual Information (MI) measurements. The steps of a CAD system can be summarized as preprocessing, feature extraction, feature fusion, and classification. During the preprocessing step, a lesion image is enhanced, filtered, and segmented, with the aim of obtaining the Region of Interest (ROI); in the next step, feature extraction is performed. Handcraft features such as shape, color, and texture are used as the representation of the ABCD rule, and deep learning features are extracted using a Convolutional Neural Network (CNN) architecture pre-trained on ImageNet (an ILSVRC ImageNet task). MI measurement is used as a fusion rule, gathering the most important information from both types of features. Finally, at the classification step, several methods are employed, such as Linear Regression (LR), Support Vector Machines (SVMs), and Relevant Vector Machines (RVMs). The designed framework was tested using the ISIC 2018 public dataset. The proposed framework appears to demonstrate improved performance in comparison with other state-of-the-art methods in terms of the accuracy, specificity, and sensitivity obtained in the training and test stages. Additionally, we propose and justify a novel procedure that should be used in adjusting the evaluation metrics for imbalanced datasets, which are common for different kinds of skin lesions. Full article

Correction
Correction: Li, Q.; Liang, S.Y. Incipient Fault Diagnosis of Rolling Bearings Based on Impulse-Step Impact Dictionary and Re-Weighted Minimizing Nonconvex Penalty Lq Regular Technique. Entropy 2017, 19, 421
Entropy 2020, 22(4), 483; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040483 - 23 Apr 2020
Viewed by 857
Abstract
The authors were not aware of some errors and imprecise descriptions introduced in the proofreading phase; they therefore wish to make the following corrections to this paper [...] Full article
Article
On the Structure of the World Economy: An Absorbing Markov Chain Approach
Entropy 2020, 22(4), 482; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040482 - 23 Apr 2020
Cited by 3 | Viewed by 1473
Abstract
The expansion of global production networks has raised many important questions about the interdependence among countries and how future changes in the world economy are likely to affect the countries’ positioning in global value chains. We approach the structure and lengths of value chains from a completely different perspective than has been available so far. By assigning a random endogenous variable to a network linkage representing the number of intermediate sales/purchases before absorption (final use or value added), the discrete-time absorbing Markov chains proposed here shed new light on the world input/output networks. The variance of this variable can help assess the risk when shaping the chain length and optimize the level of production. Contrary to what might be expected simply on the basis of comparative advantage, the results reveal that both the input and output chains exhibit the same quasi-stationary product distribution. Put differently, the expected proportion of time spent in a state before absorption is invariant to changes of the network type. Finally, the several global metrics proposed here, including the probability distribution of global value added/final output, provide guidance for policy makers when estimating the resilience of the world trading system and forecasting macroeconomic developments. Full article
(This article belongs to the Special Issue Dynamic Processes on Complex Networks)
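The absorbing-chain machinery behind such analyses rests on the fundamental matrix N = (I − Q)⁻¹, whose row sums give the expected number of transitions (here, intermediate transactions) before absorption. A self-contained sketch with exact rational arithmetic; the 2×2 toy transition matrix is an illustrative assumption, not world input/output data:

```python
from fractions import Fraction

def fundamental_matrix(Q):
    """N = (I - Q)^(-1) via Gauss-Jordan elimination over exact fractions."""
    n = len(Q)
    A = [[Fraction(int(i == j)) - Fraction(Q[i][j]) for j in range(n)]
         for i in range(n)]
    N = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for col in range(n):
        piv = next(r for r in range(col, n) if A[r][col] != 0)
        A[col], A[piv] = A[piv], A[col]
        N[col], N[piv] = N[piv], N[col]
        p = A[col][col]
        A[col] = [x / p for x in A[col]]
        N[col] = [x / p for x in N[col]]
        for r in range(n):
            if r != col and A[r][col] != 0:
                f = A[r][col]
                A[r] = [a - f * b for a, b in zip(A[r], A[col])]
                N[r] = [a - f * b for a, b in zip(N[r], N[col])]
    return N

def expected_steps(Q):
    """Expected transitions before absorption, starting from each transient state."""
    return [sum(row) for row in fundamental_matrix(Q)]

# Toy 2-sector economy: each transient state passes output onward with the
# given probabilities; the remaining mass is absorbed (final use / value added).
Q = [[Fraction(1, 2), Fraction(1, 4)],
     [Fraction(1, 4), Fraction(1, 2)]]
```

For this symmetric Q, (I − Q)⁻¹ = [[8/3, 4/3], [4/3, 8/3]], so the expected chain length is 4 from either sector.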

Article
A Study on Non-Linear DPL Model for Describing Heat Transfer in Skin Tissue during Hyperthermia Treatment
Entropy 2020, 22(4), 481; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040481 - 22 Apr 2020
Cited by 5 | Viewed by 1314
Abstract
The article studies the simulation-based mathematical modeling of bioheat transfer under the Dirichlet boundary condition. We used a complex non-linear dual-phase-lag bioheat transfer (DPLBHT) model for analyzing the temperature distribution in skin tissues during hyperthermia treatment of infected cells. The perfusion term, metabolic heat source, and external heat source were the three parts of the volumetric heat source used in the model. The non-linear DPLBHT model predicted a more accurate temperature within skin tissues. The finite element Runge–Kutta (4,5) (FERK (4,5)) method, which is based on two techniques, finite difference and Runge–Kutta (4,5), was applied for calculating the result in the case of our typical non-linear problem. The study is presented in non-dimensional units. Thermal damage of normal tissue was observed to be near zero during hyperthermia treatment. The effects of the non-dimensional time, non-dimensional space coordinate, location parameter, regional parameter, relaxation and thermalization time, metabolic heat source, associated metabolic heat source parameter, perfusion rate, associated perfusion heat source parameter, and external heat source coefficient on the dimensionless temperature profile were studied in detail during the hyperthermia treatment process. Full article
(This article belongs to the Special Issue Biological Statistical Mechanics)

Article
Hall Effect on Radiative Casson Fluid Flow with Chemical Reaction on a Rotating Cone through Entropy Optimization
Entropy 2020, 22(4), 480; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040480 - 22 Apr 2020
Cited by 9 | Viewed by 1112
Abstract
Magnetohydrodynamic (MHD) flow with Hall current has numerous applications in industrial areas such as Hall current accelerators, MHD power generators, planetary dynamics, Hall current sensors, etc. In this paper, the analysis of an unsteady MHD Casson fluid with chemical reaction over a rotating cone is presented. The impacts of Hall current, Joule heating, thermal radiation, and viscous dissipation are analyzed. Entropy optimization is also considered in the present analysis. The system of coupled equations is tackled with the homotopy analysis method (HAM). The convergence of HAM is also shown through figures. Deviations in the flow due to dimensionless parameters are shown graphically. Similarly, the variations in skin friction, Nusselt number, and Sherwood number are presented in tables. A justification of the current results is also given. Full article
(This article belongs to the Special Issue Entropy Generation and Heat Transfer II)

Article
Binary Expression Enhances Reliability of Messaging in Gene Networks
Entropy 2020, 22(4), 479; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040479 - 22 Apr 2020
Cited by 1 | Viewed by 1341
Abstract
The promoter state of a gene and its expression levels are modulated by the amounts of transcription factors interacting with its regulatory regions. Hence, one may interpret a gene network as a communicating system in which the state of the promoter of a gene (the source) is communicated by the amounts of transcription factors that it expresses (the message) to modulate the state of the promoter and expression levels of another gene (the receptor). The reliability of the gene network dynamics can be quantified by Shannon’s entropy of the message and the mutual information between the message and the promoter state. Here we consider a stochastic model for a binary gene and use its exact steady state solutions to calculate the entropy and mutual information. We show that a slow switching promoter with long and equally standing ON and OFF states maximizes the mutual information and reduces entropy. That is a binary gene expression regime generating a high variance message governed by a bimodal probability distribution with peaks of the same height. Our results indicate that Shannon’s theory can be a powerful framework for understanding how bursty gene expression can be reconciled with the striking spatio-temporal precision exhibited in pattern formation of developing organisms. Full article
(This article belongs to the Special Issue Information Flow and Entropy Production in Biomolecular Networks)

Article
Mechanical Fault Diagnosis of a High Voltage Circuit Breaker Based on High-Efficiency Time-Domain Feature Extraction with Entropy Features
Entropy 2020, 22(4), 478; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040478 - 22 Apr 2020
Cited by 4 | Viewed by 1091
Abstract
High voltage circuit breakers yield few fault samples and complex vibration signals; existing research methods cannot extract the effective information in the features and are prone to problems such as overfitting and slow training. To improve the efficiency of feature extraction from a circuit breaker vibration signal and the accuracy of circuit breaker state recognition, a Light Gradient Boosting Machine (LightGBM) method based on time-domain feature extraction with multi-type entropy features for mechanical fault diagnosis of the high voltage circuit breaker is proposed. First, the original vibration signal of the high voltage circuit breaker is segmented in the time domain; then, 16 features, including 5 kinds of entropy features, are extracted directly from each part of the original signal after time-domain segmentation, and the original feature set is constructed. Second, the Split importance value of each feature is calculated, and the optimal feature subset is determined by forward feature selection, taking the classification accuracy of LightGBM as the decision variable. After that, the LightGBM classifier is constructed based on the feature vector of the optimal feature subset, which can accurately distinguish the mechanical fault state of the high voltage circuit breaker. The experimental results show that the new method has the advantages of high efficiency of feature extraction and high accuracy of fault identification. Full article
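The first step of the pipeline — time-domain segmentation with per-segment features — can be sketched as follows. LightGBM itself is omitted, and the three features here (RMS, peak, amplitude-histogram Shannon entropy) are illustrative assumptions, not the paper's 16-feature set:

```python
import math

def shannon_entropy(xs, bins=8):
    """Shannon entropy (bits) of the amplitude histogram of a segment."""
    lo, hi = min(xs), max(xs)
    width = (hi - lo) / bins or 1.0      # constant segment -> single bin
    counts = [0] * bins
    for x in xs:
        counts[min(int((x - lo) / width), bins - 1)] += 1
    n = len(xs)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def segment_features(signal, n_segments):
    """Split a vibration record into equal time-domain segments and extract
    simple per-segment features: (RMS, peak amplitude, histogram entropy)."""
    L = len(signal) // n_segments
    feats = []
    for k in range(n_segments):
        seg = signal[k * L:(k + 1) * L]
        rms = math.sqrt(sum(x * x for x in seg) / len(seg))
        peak = max(abs(x) for x in seg)
        feats.append((rms, peak, shannon_entropy(seg)))
    return feats

# Synthetic record: a quiet segment followed by an oscillatory burst.
sig = [0.0] * 64 + [math.sin(2 * math.pi * 7 * t / 64) for t in range(64)]
```

The resulting feature vectors (one per segment, concatenated) would then feed the feature-selection and classification stages described in the abstract.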

Article
Higher-Order Cumulants Drive Neuronal Activity Patterns, Inducing UP-DOWN States in Neural Populations
Entropy 2020, 22(4), 477; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040477 - 22 Apr 2020
Cited by 2 | Viewed by 1091
Abstract
A major challenge in neuroscience is to understand the role of the higher-order correlation structure of neuronal populations. The dichotomized Gaussian model (DG) generates spike trains by means of thresholding a multivariate Gaussian random variable. The DG inputs are Gaussian distributed, and thus have no interactions beyond the second order in their inputs; however, they can induce higher-order correlations in the outputs. We propose a combination of analytical and numerical techniques to estimate the higher-order (above the second) cumulants of the firing probability distributions. Our findings show that a large amount of pairwise interactions in the inputs can drive the system into one of two possible regimes, one with low activity (“DOWN state”) and another with high activity (“UP state”), and the appearance of these states is due to a combination of the third- and fourth-order cumulants. This could be part of a mechanism that helps the neural code to convey specific information about the stimuli, motivating us to examine the behavior of the critical fluctuations through the Binder cumulant close to the critical point. We show, using the Binder cumulant, that higher-order correlations in the outputs generate a critical neural system that portrays a second-order phase transition. Full article
(This article belongs to the Special Issue Information Theoretic Measures and Their Applications)
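The Binder cumulant used above has a compact definition, U₄ = 1 − ⟨m⁴⟩/(3⟨m²⟩²). A minimal sketch on synthetic samples of an order parameter m; the degenerate ±1 "ordered" limit and the Gaussian "disordered" limit are illustrative, not the paper's DG population model:

```python
import random

def binder_cumulant(samples):
    """U4 = 1 - <m^4> / (3 <m^2>^2) for samples of an order parameter m."""
    n = len(samples)
    m2 = sum(m * m for m in samples) / n
    m4 = sum(m ** 4 for m in samples) / n
    return 1.0 - m4 / (3.0 * m2 * m2)

# Ordered limit: m = +/-1 exactly, so <m^2> = <m^4> = 1 and U4 = 2/3.
ordered = [1.0, -1.0] * 500

# Disordered limit: Gaussian fluctuations, where <m^4> = 3 <m^2>^2 and U4 -> 0.
random.seed(0)
disordered = [random.gauss(0.0, 1.0) for _ in range(100000)]
```

Tracking how U₄ crosses between these two plateaus as a control parameter varies is the standard way the cumulant locates a second-order phase transition.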

Article
Analysis of an Integrated Solar Combined Cycle with Recuperative Gas Turbine and Double Recuperative and Double Expansion Propane Cycle
Entropy 2020, 22(4), 476; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040476 - 21 Apr 2020
Cited by 2 | Viewed by 1099
Abstract
The main objective of this paper is to present and analyze an innovative configuration of integrated solar combined cycle (ISCC). As novelties, the plant includes a recuperative gas turbine and the conventional bottoming Rankine cycle is replaced by a recently developed double recuperative double expansion (DRDE) cycle. The configuration results in a fuel saving in the combustion chamber at the expense of a decreased exhaust gas temperature, which is just adequate to feed the DRDE cycle that uses propane as the working fluid. The solar contribution comes from a solar field of parabolic trough collectors, with oil as the heat transfer fluid. The optimum integration point for the solar contribution is addressed. The performance of the proposed ISCC-R-DRDE design conditions and off-design operation was assessed (daily and yearly) at two different locations. All results were compared to those obtained under the same conditions by a conventional ISCC, as well as similar configurations without solar integration. The proposed configuration obtains a lower heat rate on a yearly basis in the studied locations and lower levelized cost of energy (LCOE) than that of the ISCC, which indicates that such a configuration could become a promising technology. Full article
(This article belongs to the Special Issue Thermodynamic Optimization of Complex Energy Systems)
Article
Constructal Design of an Arrow-Shaped High Thermal Conductivity Channel in a Square Heat Generation Body
Entropy 2020, 22(4), 475; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040475 - 20 Apr 2020
Cited by 5 | Viewed by 1091
Abstract
A heat conduction model with an arrow-shaped high thermal conductivity channel (ASHTCC) in a square heat generation body (SHGB) is established in this paper. By taking the minimum maximum temperature difference (MMTD) as the optimization goal, constructal designs of the ASHTCC are conducted based on single, two, and three degrees of freedom optimizations under the condition of fixed ASHTCC material. The outcomes illustrate that the heat conduction performance (HCP) of the SHGB is better when the structure of the ASHTCC tends to be flat. Increasing the thermal conductivity ratio and the area fraction of the ASHTCC material can improve the HCP of the SHGB. In the discussed numerical examples, the MMTD obtained by three degrees of freedom optimization is reduced by 8.42% and 4.40%, respectively, compared with those obtained by single and two degrees of freedom optimizations. Therefore, three degrees of freedom optimization can further improve the HCP of the SHGB. Comparing the HCPs of SHGBs with the ASHTCC and with a T-shaped channel, the MMTD of the former is reduced by 13.0%. Thus, the ASHTCC structure is proven to be superior to the T-shaped one. The optimization results gained in this paper provide reference value for the optimal structural design of heat dissipation in various electronic devices. Full article
Article
Modification of the Logistic Map Using Fuzzy Numbers with Application to Pseudorandom Number Generation and Image Encryption
Entropy 2020, 22(4), 474; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040474 - 20 Apr 2020
Cited by 12 | Viewed by 1292
Abstract
A modification of the classic logistic map is proposed, using fuzzy triangular numbers. The resulting map is analysed through its Lyapunov exponent (LE) and bifurcation diagrams. It shows higher complexity compared to the classic logistic map and showcases phenomena such as antimonotonicity and crisis. The map is then applied to the problem of pseudorandom bit generation, using a simple rule to generate the bit sequence. The resulting random bit generator (RBG) successfully passes the National Institute of Standards and Technology (NIST) statistical tests and is then successfully applied to the problem of image encryption. Full article
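As a rough illustration of the pipeline, a chaotic logistic-map orbit can be thresholded into a bit stream. The sketch uses the classic (non-fuzzy) map; the parameters, threshold rule, and burn-in are illustrative assumptions, not the paper's fuzzy-triangular-number modification or its exact bit-extraction rule.

```python
# Minimal sketch: iterate the logistic map x -> r*x*(1-x) and emit one
# bit per iterate by comparing the state with 0.5.
def logistic_bits(x0: float, r: float, n_bits: int, burn_in: int = 1000) -> list[int]:
    """Generate n_bits bits from a logistic-map orbit via thresholding."""
    x = x0
    for _ in range(burn_in):          # discard the transient
        x = r * x * (1.0 - x)
    bits = []
    for _ in range(n_bits):
        x = r * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

bits = logistic_bits(x0=0.123, r=3.99, n_bits=10_000)
```

For r close to 4 the invariant density is nearly symmetric about 0.5, so the raw stream is roughly balanced; a real RBG, as in the paper, must additionally pass the NIST SP 800-22 test suite.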
Article
Cross-Domain Recommendation Based on Sentiment Analysis and Latent Feature Mapping
Entropy 2020, 22(4), 473; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040473 - 20 Apr 2020
Cited by 1 | Viewed by 1418
Abstract
Cross-domain recommendation is a promising approach in recommender systems: it uses the relatively rich information of a source domain to improve recommendation accuracy in a target domain. Most existing methods consider users' rating information across domains, label information of users and items, and users' reviews of items. However, they do not effectively use the latent sentiment information to find an accurate mapping of latent review features between domains. User reviews usually include subjective views that reflect the user's preferences and sentiment tendencies toward various attributes of the items. Therefore, to address the cold-start problem in the recommendation process, this paper proposes a cross-domain recommendation algorithm based on sentiment analysis and latent feature mapping (CDR-SAFM), which combines the sentiment information implicit in user reviews across domains. Unlike previous sentiment research, this paper divides sentiment into three categories, namely positive, negative, and neutral, based on three-way decision ideas, by conducting sentiment analysis on user review information. Furthermore, Latent Dirichlet Allocation (LDA) is used to model the user's semantic orientation and generate latent sentiment review features, and a Multilayer Perceptron (MLP) is used to obtain the cross-domain non-linear mapping function that transfers the user's sentiment review features. Finally, this paper demonstrates the effectiveness of the proposed CDR-SAFM framework by comparing it with existing recommendation algorithms in a cross-domain scenario on the Amazon dataset. Full article
(This article belongs to the Special Issue Computation in Complex Networks)
Article
Residue Cluster Classes: A Unified Protein Representation for Efficient Structural and Functional Classification
Entropy 2020, 22(4), 472; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040472 - 20 Apr 2020
Cited by 4 | Viewed by 1391
Abstract
Proteins are characterized by their structures and functions, and these two fundamental aspects of proteins are assumed to be related. To model such a relationship, a single representation to model both protein structure and function would be convenient, yet so far, the most effective models for protein structure or function classification do not rely on the same protein representation. Here we provide a computationally efficient implementation for large datasets to calculate residue cluster classes (RCCs) from protein three-dimensional structures and show that such representations enable a random forest algorithm to effectively learn the structural and functional classifications of proteins, according to the CATH and Gene Ontology criteria, respectively. RCCs are derived from residue contact maps built from different distance criteria, and we show that 7 or 8 Å with or without amino acid side-chain atoms rendered the best classification models. The potential use of a unified representation of proteins is discussed and possible future areas for improvement and exploration are presented. Full article
(This article belongs to the Special Issue Statistical Inference from High Dimensional Data)
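The first step behind RCCs, a residue contact map built from a distance criterion such as the 7 or 8 Å cutoffs the abstract mentions, can be sketched as follows. The coordinates here are synthetic placeholders; a real pipeline would read residue (e.g., C-alpha or all-atom) coordinates from a PDB structure, and the derivation of RCC vectors from clusters in the contact graph is not shown.

```python
import numpy as np

def contact_map(coords: np.ndarray, cutoff: float = 8.0) -> np.ndarray:
    """Boolean NxN matrix: True where two residues lie within `cutoff`."""
    diff = coords[:, None, :] - coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))   # all pairwise distances
    contacts = dist <= cutoff
    np.fill_diagonal(contacts, False)          # a residue is not its own contact
    return contacts

rng = np.random.default_rng(1)
coords = rng.uniform(0.0, 30.0, size=(50, 3))  # 50 fake residue positions (angstroms)
cmap = contact_map(coords, cutoff=8.0)
```

The resulting symmetric Boolean matrix is the adjacency matrix of the residue contact graph from which RCC counts are computed.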
Article
Time-Dependent Pseudo-Hermitian Hamiltonians and a Hidden Geometric Aspect of Quantum Mechanics
Entropy 2020, 22(4), 471; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040471 - 20 Apr 2020
Cited by 6 | Viewed by 1269
Abstract
A non-Hermitian operator H defined in a Hilbert space with inner product ⟨·|·⟩ may serve as the Hamiltonian for a unitary quantum system if it is η-pseudo-Hermitian for a metric operator (positive-definite automorphism) η. The latter defines the inner product ⟨·|η·⟩ of the physical Hilbert space H_η of the system. For situations where some of the eigenstates of H depend on time, η becomes time-dependent. Therefore, the system has a non-stationary Hilbert space. Such quantum systems, which are also encountered in the study of quantum mechanics in cosmological backgrounds, suffer from a conflict between the unitarity of time evolution and the unobservability of the Hamiltonian. Their proper treatment requires a geometric framework which clarifies the notion of the energy observable and leads to a geometric extension of quantum mechanics (GEQM). We provide a general introduction to the subject, review some of the recent developments, offer a straightforward description of the Heisenberg-picture formulation of the dynamics for quantum systems having a time-dependent Hilbert space, and outline the Heisenberg-picture formulation of dynamics in GEQM. Full article
(This article belongs to the Special Issue Quantum Dynamics with Non-Hermitian Hamiltonians)
Article
Entropy-Based Measure of Statistical Complexity of a Game Strategy
Entropy 2020, 22(4), 470; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040470 - 20 Apr 2020
Viewed by 943
Abstract
In this note, we introduce excess strategic entropy, an entropy-based measure of the complexity and predictability of a player's (mixed) strategy. We show and discuss properties of this measure and its possible applications. Full article
Article
Equation of State of Four- and Five-Dimensional Hard-Hypersphere Mixtures
Entropy 2020, 22(4), 469; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040469 - 20 Apr 2020
Cited by 2 | Viewed by 1069
Abstract
New proposals for the equation of state of four- and five-dimensional hard-hypersphere mixtures, expressed in terms of the equation of state of the corresponding monocomponent hard-hypersphere fluid, are introduced. These proposals (constructed in such a way as to yield the exact third virial coefficient) extend, on the one hand, recent similar formulations for hard-disk and (three-dimensional) hard-sphere mixtures and, on the other hand, two of our previous proposals that also link the mixture equation of state to that of the monocomponent fluid but are unable to reproduce the exact third virial coefficient. The old and new proposals are tested by comparison with published molecular dynamics and Monte Carlo simulation results, and their relative merits are evaluated. Full article
(This article belongs to the Special Issue Statistical Mechanics and Thermodynamics of Liquids and Crystals)
Article
Feature Extraction of Ship-Radiated Noise Based on Enhanced Variational Mode Decomposition, Normalized Correlation Coefficient and Permutation Entropy
Entropy 2020, 22(4), 468; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040468 - 20 Apr 2020
Cited by 6 | Viewed by 1144
Abstract
Due to the complexity and variability of underwater acoustic channels, ship-radiated noise (SRN) detected by passive sonar is prone to distortion. Entropy-based feature extraction can improve this situation to some extent; however, it is impractical to directly extract entropy features from the detected SRN signals, and existing conventional methods lack suitable de-noising in the presence of marine environmental noise. To this end, this paper proposes a novel feature extraction method based on enhanced variational mode decomposition (EVMD), the normalized correlation coefficient (norCC), permutation entropy (PE), and a particle swarm optimization-based support vector machine (PSO-SVM). Firstly, EVMD is utilized to obtain a group of intrinsic mode functions (IMFs) from the SRN signals. The noise-dominant IMFs are then eliminated by a de-noising step prior to the PE calculation. Next, the correlation coefficient between each signal-dominant IMF and the raw signal, and the PE of each signal-dominant IMF, are calculated. The norCC is then used to weigh the corresponding PE, and the sum of these weighted PE values is taken as the final feature parameter. Finally, the feature vectors are fed into the PSO-SVM multi-class classifier to classify the SRN samples. The experimental results demonstrate that the recognition rate of the proposed methodology reaches 100%, much higher than that of currently existing methods. Hence, the proposed method is well suited to the feature extraction of SRN signals. Full article
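The permutation entropy used as the per-IMF feature above is the standard Bandt–Pompe construction, which can be sketched as follows (the EVMD decomposition and norCC weighting are not reproduced; the embedding order and delay are common default choices, not the paper's settings).

```python
import math
from itertools import permutations

def permutation_entropy(x, order: int = 3, delay: int = 1) -> float:
    """Normalized Bandt-Pompe permutation entropy in [0, 1]."""
    counts = {p: 0 for p in permutations(range(order))}
    n = len(x) - (order - 1) * delay
    for i in range(n):
        window = [x[i + j * delay] for j in range(order)]
        # ordinal pattern: argsort of the values inside the window
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] += 1
    probs = [c / n for c in counts.values() if c > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(math.factorial(order))  # normalize by log(order!)

rising = list(range(100))                 # fully ordered: one pattern only
pe_rising = permutation_entropy(rising)   # -> 0.0

import random
random.seed(0)
noise = [random.random() for _ in range(5000)]
pe_noise = permutation_entropy(noise)     # close to 1 for white noise
```

Low values flag regular, predictable signal components; values near 1 flag noise-like components, which is what makes PE a useful discriminating feature for SRN.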
Article
Weyl Prior and Bayesian Statistics
Entropy 2020, 22(4), 467; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040467 - 20 Apr 2020
Viewed by 1003
Abstract
When using Bayesian inference, one needs to choose a prior distribution for the parameters. The well-known Jeffreys prior is based on the Riemannian metric tensor on a statistical manifold. Takeuchi and Amari defined the α-parallel prior, which generalizes the Jeffreys prior by exploiting a higher-order geometric object known as a Chentsov–Amari tensor. In this paper, we propose a new prior based on the Weyl structure on a statistical manifold. It turns out that our prior is a special case of the α-parallel prior with the parameter α equal to −n, where n is the dimension of the underlying statistical manifold and the minus sign is a result of conventions used in the definition of α-connections. This makes the choice of the parameter α more canonical. We calculate the Weyl prior for the univariate and multivariate Gaussian distributions; the Weyl prior of the univariate Gaussian turns out to be the uniform prior. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
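For reference, the Jeffreys prior that the abstract generalizes is the volume form of the Fisher–Rao metric; in standard notation (not taken from the paper):

```latex
% Jeffreys prior from the Fisher--Rao metric g_{ij}(\theta):
\pi_J(\theta) \propto \sqrt{\det g(\theta)},
\qquad
g_{ij}(\theta) = \mathbb{E}_\theta\!\left[
  \partial_i \log p(x \mid \theta)\,\partial_j \log p(x \mid \theta)\right].
% For the univariate Gaussian N(\mu, \sigma^2) in coordinates (\mu, \sigma):
% g = \mathrm{diag}(1/\sigma^2,\; 2/\sigma^2)
% \;\Rightarrow\; \pi_J(\mu, \sigma) \propto 1/\sigma^2 .
```

The Weyl and α-parallel priors modify this construction through higher-order geometric data on the same manifold.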
Article
Symbolic Analysis Applied to the Specification of Spatial Trends and Spatial Dependence
Entropy 2020, 22(4), 466; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040466 - 20 Apr 2020
Cited by 2 | Viewed by 837
Abstract
This article provides symbolic analysis tools for specifying spatial econometric models. It first considers testing for spatial dependence in the presence of potential leading deterministic spatial components (similar to time-series tests for unit roots in the presence of temporal drift and/or a time trend), and second considers how to econometrically model spatial economic relations that might contain unobserved spatial structure of unknown form. Hypothesis testing is conducted with a symbolic-entropy-based non-parametric statistical procedure, recently proposed by Garcia-Cordoba, Matilla-Garcia, and Ruiz (2019), which does not rely on prior weight-matrix assumptions. It is shown that the use of geographically restricted semiparametric spatial models is a promising modeling strategy for cross-sectional datasets that are compatible with some types of spatial dependence. The results show that models that merely incorporate space coordinates might be sufficient to capture spatial dependence. Hedonic models for the Baltimore, Boston, and Toledo housing price datasets are revisited, studied with the newly proposed procedures, and compared with standard spatial econometric methodologies. Full article
(This article belongs to the Special Issue Information theory and Symbolic Analysis: Theory and Applications)
Article
Early Detection of Alzheimer’s Disease: Detecting Asymmetries with a Return Random Walk Link Predictor
Entropy 2020, 22(4), 465; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040465 - 19 Apr 2020
Cited by 4 | Viewed by 1520
Abstract
Alzheimer’s disease has been extensively studied using undirected graphs to represent the correlations of BOLD signals in different anatomical regions through functional magnetic resonance imaging (fMRI). However, there has been relatively little analysis of this kind of data using directed graphs, which offer the potential to capture asymmetries in the interactions between different anatomical brain regions. Detecting these asymmetries is relevant for identifying the disease at an early stage. For this reason, in this paper we analyze data extracted from fMRI images using the net4Lap algorithm to infer a directed graph from the available BOLD signals, and then seek to determine asymmetries between the left and right hemispheres of the brain using a directed version of the Return Random Walk (RRW). Experimental evaluation of this method reveals that it leads to the identification of anatomical brain regions known from clinical studies to be implicated in the early development of Alzheimer’s disease. Full article
(This article belongs to the Special Issue Information Theory in Computational Neuroscience)
Article
Tsallis Entropy, Likelihood, and the Robust Seismic Inversion
Entropy 2020, 22(4), 464; https://0-doi-org.brum.beds.ac.uk/10.3390/e22040464 - 19 Apr 2020
Cited by 7 | Viewed by 1797
Abstract
The nonextensive statistical mechanics proposed by Tsallis has been successfully used to model and analyze many complex phenomena. Here, we study the role of generalized Tsallis statistics in inverse problem theory. Most inverse problems are formulated as an optimisation problem that aims to estimate the physical parameters of a system from indirect and partial observations. In the conventional approach, the misfit function to be minimized is based on the least-squares distance between the observed data and the modelled data, in which the residuals (errors) are assumed to follow a Gaussian distribution. However, in many real situations the errors are typically non-Gaussian, and this technique therefore tends to fail. This problem motivated us to study misfit functions based on non-Gaussian statistics. In this work, we derive a misfit function based on the q-Gaussian distribution associated with the maximum entropy principle in the Tsallis formalism. We tested our method on a typical geophysical inverse problem, called post-stack inversion (PSI), in which the physical parameter to be estimated is the Earth's reflectivity. Our results show that PSI based on Tsallis statistics outperforms conventional PSI, especially in the non-Gaussian noisy-data case. Full article
(This article belongs to the Section Multidisciplinary Applications)
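A hedged sketch of the kind of misfit the abstract derives: the negative log-likelihood of q-Gaussian residuals, written here up to constants and an error-scale factor that are omitted (the paper's exact normalization for PSI is not reproduced). It reduces to the least-squares misfit as q → 1 and grows only logarithmically in large residuals for q > 1, which is the source of the robustness to outliers.

```python
import numpy as np

# q-Gaussian density (up to normalization): p_q(r) ∝ [1 - (1-q) r^2]_+^{1/(1-q)}.
# Its negative log-likelihood gives the misfit below.
def misfit_q(residuals: np.ndarray, q: float) -> float:
    """q-misfit; recovers the least-squares misfit in the limit q -> 1."""
    r2 = residuals ** 2
    if abs(q - 1.0) < 1e-12:
        return float(r2.sum())                 # classic L2 misfit
    arg = 1.0 - (1.0 - q) * r2
    arg = np.maximum(arg, 1e-300)              # domain guard (relevant for q < 1)
    return float((-1.0 / (1.0 - q)) * np.log(arg).sum())

residuals = np.array([0.1, -0.2, 3.0])         # one outlier at 3.0
ls = misfit_q(residuals, q=1.0)                # sum of squares: outlier dominates
robust = misfit_q(residuals, q=1.8)            # outlier contributes only log-fast
```

For q = 1.8 the outlier's contribution is ~ln(1 + 0.8 r²) rather than r², so a single bad residual no longer dominates the objective.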