From the start of 2016, the journal uses article numbers instead of page numbers to identify articles. If you are required to add page numbers to a citation, you can do so using a colon in the format [article number]:1–[last page], e.g., 10:1–20.

Entropy, Volume 18, Issue 5 (May 2016) – 42 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
Article
An Intelligent and Fast Chaotic Encryption Using Digital Logic Circuits for Ad-Hoc and Ubiquitous Computing
Entropy 2016, 18(5), 201; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050201 - 23 May 2016
Cited by 5 | Viewed by 2649
Abstract
Delays added by the encryption process represent an overhead for smart computing devices in ad-hoc and ubiquitous computing intelligent systems. Digital logic circuits are faster than other computing techniques, so they can be used for fast encryption to minimize processing delays. Chaotic encryption is more attack-resilient than other encryption techniques. One of the most attractive properties of cryptography is the avalanche effect, in which two different keys produce distinct cipher text for the same information. Important properties of chaotic systems are sensitivity to initial conditions and nonlinearity, which make two similar keys generate different cipher text, a source of confusion. In this paper a novel fast and secure chaotic map-based encryption technique using 2's complement (CET-2C) is proposed, which uses a logistic map in which a negligible difference in the parameters of the map generates different cipher text. Cryptanalysis of the proposed algorithm shows the strength and security of the algorithm and keys. The performance of the proposed algorithm has been analyzed in terms of running time, throughput and power consumption. Comparison graphs show that the proposed algorithm gives better results than other algorithms such as AES. Full article
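The sensitivity to initial conditions that the abstract relies on is easy to demonstrate with the logistic map itself. The sketch below is a generic illustration, not the CET-2C algorithm; the map parameter r = 3.99 and the two nearby "keys" are arbitrary choices in the chaotic regime:

```python
def logistic_trajectory(x0, r=3.99, n=50):
    """Iterate the logistic map x_{k+1} = r * x_k * (1 - x_k) for n steps."""
    x = x0
    for _ in range(n):
        x = r * x * (1 - x)
    return x

# Two keys (initial conditions) differing by a negligible amount.
a = logistic_trajectory(0.400000000)
b = logistic_trajectory(0.400000001)
print(abs(a - b))  # after 50 iterations the two trajectories have diverged
```

A 10⁻⁹ difference in the key is amplified exponentially at each step, so the two cipher streams quickly become uncorrelated.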
Review
Beyond Hypothesis Testing
Entropy 2016, 18(5), 199; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050199 - 20 May 2016
Cited by 1 | Viewed by 2267
Abstract
The extraordinary success of physicists in finding simple laws that explain many phenomena is beguiling. With the exception of quantum mechanics, it suggests a deterministic world in which theories are right or wrong, and the world is simple. However, attempts to apply such thinking to other phenomena have not been so successful. Individually and collectively we face many situations dominated by uncertainty: about weather and climate, about how wisely to raise children, and about how the economy should be managed. The controversy about hypothesis testing is dominated by the tension between simple explanations and the complexity of the world we live in. Full article
(This article belongs to the Special Issue Statistical Significance and the Logic of Hypothesis Testing)
Article
What Exactly is the Nusselt Number in Convective Heat Transfer Problems and are There Alternatives?
Entropy 2016, 18(5), 198; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050198 - 20 May 2016
Cited by 12 | Viewed by 3457
Abstract
The often used Nusselt number is critically questioned with respect to its physical meaning. Based on a rigorous dimensional analysis, alternative assessment numbers are found that in a systematic way separately account for the quantitative and qualitative aspect of a heat transfer process. The qualitative aspect is related to the entropy generated in the temperature field of a real, irreversible heat transfer. The irreversibility can be quantified by referring it to the so-called entropic potential of the energy involved in the transfer process. Full article
(This article belongs to the Section Thermodynamics)
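For reference, the Nusselt number questioned above has the textbook definition Nu = hL/k, with convective heat transfer coefficient h, characteristic length L, and fluid thermal conductivity k; the numerical values below are illustrative only:

```python
def nusselt(h, L, k):
    """Nu = h*L/k: ratio of convective to purely conductive heat transfer
    across a fluid layer of characteristic length L."""
    return h * L / k

# Illustrative values: h = 250 W/(m^2 K), L = 0.05 m, k = 0.6 W/(m K)
Nu = nusselt(250.0, 0.05, 0.6)
print(Nu)  # ≈ 20.83
```

Nu > 1 indicates that convection enhances the transfer relative to conduction alone, which is the "quantitative aspect" the paper contrasts with entropy-based measures.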
Article
Insights into Entropy as a Measure of Multivariate Variability
Entropy 2016, 18(5), 196; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050196 - 20 May 2016
Cited by 19 | Viewed by 3124
Abstract
Entropy has been widely employed as a measure of variability in problems such as machine learning and signal processing. In this paper, we provide some new insights into the behaviors of entropy as a measure of multivariate variability. The relationships between multivariate entropy (joint or total marginal) and traditional measures of multivariate variability, such as total dispersion and generalized variance, are investigated. It is shown that for the jointly Gaussian case, the joint entropy (or entropy power) is equivalent to the generalized variance, while total marginal entropy is equivalent to the geometric mean of the marginal variances and total marginal entropy power is equivalent to the total dispersion. The smoothed multivariate entropy (joint or total marginal) and the kernel density estimation (KDE)-based entropy estimator (with finite samples) are also studied, which, under certain conditions, will be approximately equivalent to the total dispersion (or a total dispersion estimator), regardless of the data distribution. Full article
(This article belongs to the Special Issue Information: Meanings and Interpretations)
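The Gaussian equivalence stated in the abstract can be checked numerically: for a d-dimensional Gaussian the joint entropy is H = ½ ln((2πe)^d det Σ), so the joint entropy power raised to the d-th power recovers the generalized variance det Σ. A small sketch (the covariance matrix here is an arbitrary positive-definite example):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3
A = rng.standard_normal((d, d))
Sigma = A @ A.T + d * np.eye(d)          # a valid (positive-definite) covariance

# Joint differential entropy of N(0, Sigma): H = 0.5 * ln((2*pi*e)^d * det(Sigma))
H = 0.5 * np.log((2 * np.pi * np.e) ** d * np.linalg.det(Sigma))

# Entropy power: N = exp(2H/d) / (2*pi*e).  For a Gaussian, N^d = det(Sigma),
# i.e., the entropy power recovers the generalized variance.
N = np.exp(2 * H / d) / (2 * np.pi * np.e)
print(np.isclose(N ** d, np.linalg.det(Sigma)))  # True
```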
Article
A Two-Stage Maximum Entropy Prior of Location Parameter with a Stochastic Multivariate Interval Constraint and Its Properties
Entropy 2016, 18(5), 188; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050188 - 20 May 2016
Cited by 4 | Viewed by 1782
Abstract
This paper proposes a two-stage maximum entropy prior to elicit uncertainty regarding a multivariate interval constraint of the location parameter of a scale mixture of normal model. Using Shannon’s entropy, this study demonstrates how the prior, obtained by using two stages of a prior hierarchy, appropriately accounts for the information regarding the stochastic constraint and suggests an objective measure of the degree of belief in the stochastic constraint. The study also verifies that the proposed prior plays the role of bridging the gap between the canonical maximum entropy prior of the parameter with no interval constraint and that with a certain multivariate interval constraint. It is shown that the two-stage maximum entropy prior belongs to the family of rectangle screened normal distributions that is conjugate for samples from a normal distribution. Some properties of the prior density, useful for developing a Bayesian inference of the parameter with the stochastic constraint, are provided. We also propose a hierarchical constrained scale mixture of normal model (HCSMN), which uses the prior density to estimate the constrained location parameter of a scale mixture of normal model and demonstrates the scope of its applicability. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
Article
Predicting China’s SME Credit Risk in Supply Chain Finance Based on Machine Learning Methods
Entropy 2016, 18(5), 195; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050195 - 19 May 2016
Cited by 23 | Viewed by 3686
Abstract
We propose a new integrated ensemble machine learning (ML) method, i.e., RS-RAB (Random Subspace–Real AdaBoost), for predicting the credit risk of China’s small and medium-sized enterprises (SMEs) in supply chain finance (SCF). The sample for the empirical analysis comprises two quarterly data sets covering the period 2012–2013: one includes 48 listed SMEs obtained from the SME Board of the Shenzhen Stock Exchange; the other consists of three listed core enterprises (CEs) and six listed CEs collected from the Main Boards of the Shenzhen Stock Exchange and the Shanghai Stock Exchange, respectively. The experimental results show that RS-RAB possesses outstanding prediction performance and is very suitable for forecasting the credit risk of China’s SMEs in SCF in comparison with the other three ML methods. Full article
(This article belongs to the Section Complexity)
Article
Detection of Left-Sided and Right-Sided Hearing Loss via Fractional Fourier Transform
Entropy 2016, 18(5), 194; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050194 - 19 May 2016
Cited by 36 | Viewed by 3389
Abstract
In order to detect hearing loss more efficiently and accurately, this study proposes a new method based on the fractional Fourier transform (FRFT). Three-dimensional volumetric magnetic resonance images were obtained from 15 patients with left-sided hearing loss (LHL), 20 healthy controls (HC), and 14 patients with right-sided hearing loss (RHL). Twenty-five FRFT spectra were reduced by principal component analysis with thresholds of 90%, 95%, and 98%, respectively. The classifier is a single-hidden-layer feed-forward neural network (SFN) trained by the Levenberg–Marquardt algorithm. The results showed that the accuracies for all three classes are higher than 95%. In all, our method is promising and may be of interest to other researchers. Full article
(This article belongs to the Special Issue Wavelets, Fractals and Information Theory II)
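The PCA reduction step described above (keeping enough principal components to reach a cumulative-variance threshold such as 90%) can be sketched generically; the data here are random stand-ins, not the MRI features:

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-in for a feature matrix: 100 samples, 25 correlated features.
X = rng.standard_normal((100, 25)) @ rng.standard_normal((25, 25))

# PCA via eigendecomposition of the covariance matrix.
Xc = X - X.mean(axis=0)
eigvals = np.linalg.eigvalsh(np.cov(Xc.T))[::-1]     # descending order
ratio = np.cumsum(eigvals) / eigvals.sum()           # cumulative explained variance

# Smallest number of components whose cumulative variance reaches 90%.
n_components = int(np.searchsorted(ratio, 0.90) + 1)
print(n_components)
```

Raising the threshold to 95% or 98% keeps more components, trading dimensionality for retained variance, which is the comparison the study performs.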
Article
Entropy and the Self-Organization of Information and Value
Entropy 2016, 18(5), 193; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050193 - 19 May 2016
Cited by 15 | Viewed by 2911
Abstract
Adam Smith, Charles Darwin, Rudolf Clausius, and Léon Brillouin considered certain “values” as key quantities in their descriptions of market competition, natural selection, thermodynamic processes, and information exchange, respectively. None of those values can be computed from elementary properties of the particular object they are attributed to, but rather values represent emergent, irreducible properties. In this paper, such values are jointly understood as information values in certain contexts. For this aim, structural information is distinguished from symbolic information. While the first can be associated with arbitrary physical processes or structures, the latter requires conventions which govern encoding and decoding of the symbols which form a message. As a value of energy, Clausius’ entropy is a universal measure of the structural information contained in a thermodynamic system. The structural information of a message, in contrast to its meaning, can be evaluated by Shannon’s entropy of communication. Symbolic information is found only in the realm of life, such as in animal behavior, human sociology, science, or technology, and is often cooperatively valuated by competition. Ritualization is described here as a universal scenario for the self-organization of symbols by which symbolic information emerges from structural information in the course of evolution processes. Emergent symbolic information exhibits the novel fundamental code symmetry which prevents the meaning of a message from being reducible to the physical structure of its carrier. While symbols turn arbitrary during the ritualization transition, their structures preserve information about their evolution history. Full article
(This article belongs to the Special Issue Information and Self-Organization)
Article
Common Probability Patterns Arise from Simple Invariances
Entropy 2016, 18(5), 192; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050192 - 19 May 2016
Cited by 13 | Viewed by 2345
Abstract
Shift and stretch invariance lead to the exponential-Boltzmann probability distribution. Rotational invariance generates the Gaussian distribution. Particular scaling relations transform the canonical exponential and Gaussian patterns into the variety of commonly observed patterns. The scaling relations themselves arise from the fundamental invariances of shift, stretch and rotation, plus a few additional invariances. Prior work described the three fundamental invariances as a consequence of the equilibrium canonical ensemble of statistical mechanics or the Jaynesian maximization of information entropy. By contrast, I emphasize the primacy and sufficiency of invariance alone to explain the commonly observed patterns. Primary invariance naturally creates the array of commonly observed scaling relations and associated probability patterns, whereas the classical approaches derived from statistical mechanics or information theory require special assumptions to derive commonly observed scales. Full article
(This article belongs to the Collection Advances in Applied Statistical Mechanics)
Article
Rotation of Galaxies within Gravity of the Universe
Entropy 2016, 18(5), 191; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050191 - 19 May 2016
Cited by 4 | Viewed by 2643
Abstract
Rotation of galaxies is examined by the general principle of least action. This law of nature describes a system in its surroundings, here specifically a galaxy in the surrounding Universe. According to this holistic theory the gravitational potential due to all matter in the expanding Universe relates to the universal curvature which, in turn, manifests itself as the universal acceleration. Then the orbital velocities from the central bulge to distant perimeters are understood to balance both the galactic and universal acceleration. Since the galactic acceleration decreases with distance from the galaxy’s center to its luminous edge, the orbital velocities of ever more distant stars and gas clouds tend toward a value that tallies the universal acceleration. This tiny term has been acknowledged earlier by including it as a parameter in the modified gravitational law, but here the tiny acceleration is understood to result from the gravitational potential that spans across the expanding Universe. This resolution of the galaxy rotation problem is compared with observations and contrasted with models of dark matter. Also, other astronomical observations that have been interpreted as evidence for dark matter are discussed in light of the least-action principle. Full article
(This article belongs to the Section Astrophysics, Cosmology, and Black Holes)
Article
Specific Differential Entropy Rate Estimation for Continuous-Valued Time Series
Entropy 2016, 18(5), 190; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050190 - 19 May 2016
Cited by 15 | Viewed by 2604
Abstract
We introduce a method for quantifying the inherent unpredictability of a continuous-valued time series via an extension of the differential Shannon entropy rate. Our extension, the specific entropy rate, quantifies the amount of predictive uncertainty associated with a specific state, rather than averaged over all states. We provide a data-driven approach for estimating the specific entropy rate of an observed time series. Finally, we consider three case studies of estimating the specific entropy rate from synthetic and physiological data relevant to the analysis of heart rate variability. Full article
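While the paper's specific entropy rate estimator is more involved, its basic ingredient, a data-driven differential entropy estimate, can be illustrated with a simple resubstitution kernel density estimator. The bandwidth and the standard-normal sample below are arbitrary choices for this sketch, not the authors' procedure:

```python
import numpy as np

def kde_entropy(x, bandwidth):
    """Resubstitution estimate of differential entropy:
    H_hat = -(1/N) * sum_i log p_hat(x_i), with a Gaussian KDE p_hat."""
    d2 = (x[:, None] - x[None, :]) ** 2
    k = np.exp(-d2 / (2 * bandwidth ** 2)) / (bandwidth * np.sqrt(2 * np.pi))
    p = k.mean(axis=1)          # KDE evaluated at each sample point
    return -np.mean(np.log(p))

rng = np.random.default_rng(2)
x = rng.standard_normal(2000)
est = kde_entropy(x, bandwidth=0.3)
h_true = 0.5 * np.log(2 * np.pi * np.e)   # exact entropy of N(0,1), ≈ 1.419
print(est, h_true)
```

The estimate approaches the closed-form value as the sample grows and the bandwidth shrinks appropriately; the specific entropy rate applies the same idea to predictive (conditional) densities.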
Article
MoNbTaV Medium-Entropy Alloy
Entropy 2016, 18(5), 189; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050189 - 19 May 2016
Cited by 67 | Viewed by 5282 | Correction
Abstract
Guided by CALPHAD (Calculation of Phase Diagrams) modeling, the refractory medium-entropy alloy MoNbTaV was synthesized by vacuum arc melting under a high-purity argon atmosphere. A body-centered cubic solid solution phase was experimentally confirmed in the as-cast ingot using X-ray diffraction and scanning electron microscopy. The measured lattice parameter of the alloy (3.208 Å) obeys the rule of mixtures (ROM), but the Vickers microhardness (4.95 GPa) and the yield strength (1.5 GPa) are about 4.5 and 4.6 times those estimated from the ROM, respectively. A simple solid-solution strengthening model predicts a yield strength of approximately 1.5 GPa. Thermodynamic analysis shows that the total entropy of the alloy is more than three times the configurational entropy at room temperature, and the entropy of mixing exhibits a small negative departure from ideal mixing. Full article
(This article belongs to the Special Issue High-Entropy Alloys and High-Entropy-Related Materials)
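The "more than three times the configurational entropy" comparison refers to the ideal configurational (mixing) entropy, which for an equiatomic N-component alloy is R ln N; a quick check for a quaternary alloy such as MoNbTaV:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def configurational_entropy(fractions):
    """Ideal configurational entropy of mixing: S = -R * sum(c_i * ln c_i)."""
    return -R * sum(c * math.log(c) for c in fractions)

# Equiatomic four-component alloy: S = R * ln(4)
S = configurational_entropy([0.25] * 4)
print(S)  # ≈ 11.53 J/(mol*K), i.e. about 1.39 R
```

By the common classification, 1 R ≤ S_conf < 1.5 R places an equiatomic quaternary alloy in the medium-entropy range, consistent with the title.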
Article
Charged, Rotating Black Objects in Einstein–Maxwell-Dilaton Theory in D ≥ 5
Entropy 2016, 18(5), 187; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050187 - 16 May 2016
Cited by 1 | Viewed by 1911
Abstract
We show that the general framework proposed by Kleihaus et al. (2015) for the study of asymptotically flat vacuum black objects with k + 1 equal magnitude angular momenta in D ≥ 5 spacetime dimensions (with 0 ≤ k ≤ ⌊(D − 5)/2⌋) can be extended to the case of Einstein–Maxwell-dilaton (EMd) theory. This framework can describe black holes with spherical horizon topology, the simplest solutions corresponding to a class of electrically charged (dilatonic) Myers–Perry black holes. Balanced charged black objects with S^(n+1) × S^(2k+1) horizon topology can also be studied (with D = 2k + n + 4). Black rings correspond to the case k = 0, while the solutions with k > 0 are black ringoids. The basic properties of EMd solutions are discussed for the special case of a Kaluza–Klein value of the dilaton coupling constant. We argue that all features of these solutions can be derived from those of the vacuum seed configurations. Full article
(This article belongs to the Special Issue Black Hole Thermodynamics II)
Article
Quantum Thermodynamics in Strong Coupling: Heat Transport and Refrigeration
Entropy 2016, 18(5), 186; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050186 - 16 May 2016
Cited by 45 | Viewed by 4300
Abstract
The performance characteristics of a heat rectifier and a heat pump are studied in a non-Markovian framework. The device is constructed from a molecule connected to a hot and cold reservoir. The heat baths are modelled using the stochastic surrogate Hamiltonian method. The molecule is modelled by an asymmetric double-well potential. Each well is semi-locally connected to a heat bath composed of spins. The dynamics are driven by a combined system–bath Hamiltonian. The temperature of the baths is regulated by a secondary spin bath composed of identical spins in thermal equilibrium. A random swap operation exchanges spins between the primary and secondary baths. The combined system is studied at various system–bath coupling strengths. In all cases, the average heat current always flows from the hot towards the cold bath in accordance with the second law of thermodynamics. The asymmetry of the double well generates a rectifying effect, meaning that when the left and right baths are exchanged the heat current follows the hot-to-cold direction. The heat current is larger when the high frequency is coupled to the hot bath. Adding an external driving field can reverse the transport direction. Such a refrigeration effect is modelled by a periodic driving field in resonance with the frequency difference of the two potential wells. A minimal driving amplitude is required to overcome the heat leak effect. In the strong driving regime the cooling power is non-monotonic with the system–bath coupling. Full article
(This article belongs to the Special Issue Quantum Thermodynamics)
Article
An Information Entropy-Based Animal Migration Optimization Algorithm for Data Clustering
Entropy 2016, 18(5), 185; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050185 - 16 May 2016
Cited by 6 | Viewed by 2420
Abstract
Data clustering is useful in a wide range of application areas. The Animal Migration Optimization (AMO) algorithm is one of the recently introduced swarm-based algorithms, and it has demonstrated good performance in solving numeric optimization problems. In this paper, we present a modified AMO algorithm with an entropy-based heuristic strategy for data clustering. The main contribution is that we calculate the information entropy of each attribute for a given data set and propose an adaptive strategy that can automatically balance convergence speed and global search efforts according to its entropy in both migration and updating steps. A series of well-known benchmark clustering problems are employed to evaluate the performance of our approach. We compare experimental results with k-means, Artificial Bee Colony (ABC), AMO, and the state-of-the-art algorithms for clustering and show that the proposed AMO algorithm generally performs better than the compared algorithms on the considered clustering problems. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
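The attribute-entropy ingredient can be sketched with the standard Shannon entropy of each attribute's empirical distribution; how the AMO migration and updating steps consume this value is specific to the paper, so the snippet below shows only the entropy computation, on made-up attribute values:

```python
import math
from collections import Counter

def attribute_entropy(values):
    """Shannon entropy (in bits) of one attribute's empirical distribution."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A nearly constant attribute carries little information ...
low = attribute_entropy(["a"] * 9 + ["b"])
# ... while a uniformly distributed one carries the most.
high = attribute_entropy(["a", "b", "c", "d"] * 5)
print(low, high)  # low < high; high = log2(4) = 2.0
```

An adaptive strategy can then, for example, search more aggressively along high-entropy attributes and converge faster along low-entropy ones.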
Article
Selection of Entropy Based Features for Automatic Analysis of Essential Tremor
Entropy 2016, 18(5), 184; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050184 - 16 May 2016
Cited by 16 | Viewed by 3138
Abstract
Biomedical systems produce biosignals that arise from interaction mechanisms. In general, those mechanisms occur across multiple scales, both spatial and temporal, and contain linear and non-linear information. In this framework, entropy measures are good candidates for providing useful evidence about disorder in the system, lack of information in time series and/or irregularity of the signals. The most common movement disorder is essential tremor (ET), which occurs 20 times more often than Parkinson’s disease. Interestingly, about 50%–70% of the cases of ET have a genetic origin. One of the most widely used standard tests for clinical diagnosis of ET is Archimedes’ spiral drawing. This work focuses on the selection of non-linear biomarkers from such drawings and handwriting, and is part of a wider cross study on the diagnosis of essential tremor, in which our piece of research presents the selection of entropy features for early ET diagnosis. Classic entropy features are compared with features based on permutation entropy. An automatic analysis system built on several machine learning paradigms is applied, while automatic feature selection is implemented by means of the ANOVA (analysis of variance) test. The obtained results for early detection are promising and appear applicable to real environments. Full article
(This article belongs to the Special Issue Entropy on Biosignals and Intelligent Systems)
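Permutation entropy (Bandt and Pompe), one of the feature families compared in the paper, maps each length-m window of the signal to its ordinal pattern and takes the Shannon entropy of the pattern distribution. A minimal sketch, normalized by log(m!) so the value lies in [0, 1]; the two test signals are synthetic stand-ins, not tremor recordings:

```python
import math
import random
from collections import Counter

def permutation_entropy(x, order=3):
    """Normalized permutation entropy: Shannon entropy of the distribution
    of ordinal patterns of length `order`, divided by log(order!)."""
    patterns = Counter(
        tuple(sorted(range(order), key=lambda k: x[i + k]))
        for i in range(len(x) - order + 1)
    )
    n = sum(patterns.values())
    h = -sum((c / n) * math.log(c / n) for c in patterns.values())
    return h / math.log(math.factorial(order))

regular = [i % 2 for i in range(100)]            # perfectly alternating signal
random.seed(0)
noisy = [random.random() for _ in range(100)]    # irregular signal
print(permutation_entropy(regular), permutation_entropy(noisy))
```

The regular signal uses only two of the six possible length-3 patterns and scores low; the irregular one uses all patterns nearly uniformly and scores close to 1.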
Article
Geometric Model of Black Hole Quantum N-portrait, Extradimensions and Thermodynamics
Entropy 2016, 18(5), 181; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050181 - 14 May 2016
Cited by 20 | Viewed by 2851
Abstract
Recently a short scale modified black hole metric, known as holographic metric, has been proposed in order to capture the self-complete character of gravity. In this paper we show that such a metric can reproduce some geometric features expected from the quantum N-portrait beyond the semi-classical limit. We show that for a generic N this corresponds to having an effective energy momentum tensor in Einstein equations or, equivalently, non-local terms in the gravity action. We also consider the higher dimensional extension of the metric and the case of an AdS cosmological term. We provide a detailed thermodynamic analysis of both cases, with particular reference to the repercussions on the Hawking-Page phase transition. Full article
(This article belongs to the Special Issue Entropy in Quantum Gravity and Quantum Cosmology)
Article
A Conjecture Regarding the Extremal Values of Graph Entropy Based on Degree Powers
Entropy 2016, 18(5), 183; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050183 - 13 May 2016
Cited by 3 | Viewed by 1959
Abstract
Many graph invariants have been used for the construction of entropy-based measures to characterize the structure of complex networks. The starting point has always been to assign a probability distribution to a network when using Shannon’s entropy. In particular, Cao et al. (2014 and 2015) defined special graph entropy measures which are based on degree powers. In this paper, we obtain some lower and upper bounds for these measures and characterize extremal graphs. Moreover, we resolve one part of a conjecture stated by Cao et al. Full article
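The degree-power entropies studied here assign each vertex the probability p_i = d_i^k / Σ_j d_j^k and take the Shannon entropy of that distribution; a minimal sketch of this construction (the two degree sequences are chosen purely for illustration):

```python
import math

def degree_power_entropy(degrees, k=1):
    """Graph entropy based on degree powers: p_i = d_i^k / sum_j d_j^k,
    followed by the Shannon entropy (in bits) of the distribution p."""
    weights = [d ** k for d in degrees]
    total = sum(weights)
    return -sum((w / total) * math.log2(w / total) for w in weights)

# Degree sequences of two 4-vertex graphs:
star = [3, 1, 1, 1]      # star K_{1,3}: degrees are unbalanced
cycle = [2, 2, 2, 2]     # cycle C_4: a regular graph

print(degree_power_entropy(star), degree_power_entropy(cycle))
# the regular graph attains the maximal value log2(n) = 2
```

Regular graphs maximize this entropy (uniform p), while highly unbalanced degree sequences such as stars push it toward the lower extremal values the paper bounds.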
Article
Entropy-Based Incomplete Cholesky Decomposition for a Scalable Spectral Clustering Algorithm: Computational Studies and Sensitivity Analysis
Entropy 2016, 18(5), 182; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050182 - 13 May 2016
Cited by 6 | Viewed by 2678
Abstract
Spectral clustering methods allow datasets to be partitioned into clusters by mapping the input datapoints into the space spanned by the eigenvectors of the Laplacian matrix. In this article, we make use of the incomplete Cholesky decomposition (ICD) to construct an approximation of the graph Laplacian and reduce the size of the related eigenvalue problem from N to m, with m ≪ N. In particular, we introduce a new stopping criterion based on normalized mutual information between consecutive partitions, which terminates the ICD when the change in the cluster assignments is below a given threshold. Compared with existing ICD-based spectral clustering approaches, the proposed method allows one to reduce the number m of selected pivots (i.e., to obtain a sparser model) while maintaining high clustering quality. The method scales linearly with respect to the number of input datapoints N and has low memory requirements, because only matrices of size N × m and m × m are calculated (in contrast to standard spectral clustering, where the construction of the full N × N similarity matrix is needed). Furthermore, we show that the number of clusters can be reliably selected based on the gap heuristic computed using just a small matrix R of size m × m instead of the entire graph Laplacian. The effectiveness of the proposed algorithm is tested on several datasets. Full article
(This article belongs to the Special Issue Information Theoretic Learning)
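The NMI-based stopping criterion described in the abstract above can be sketched in a few lines. This is an illustrative sketch only: the helper names (`nmi`, `icd_should_stop`) and the tolerance value are assumptions, not taken from the paper.

```python
import math
from collections import Counter

def nmi(labels_a, labels_b):
    """Normalized mutual information between two cluster assignments."""
    n = len(labels_a)
    pa, pb = Counter(labels_a), Counter(labels_b)
    pab = Counter(zip(labels_a, labels_b))
    # Mutual information I(A; B) from joint and marginal frequencies.
    mi = sum((c / n) * math.log((c / n) / ((pa[a] / n) * (pb[b] / n)))
             for (a, b), c in pab.items())
    ha = -sum((c / n) * math.log(c / n) for c in pa.values())
    hb = -sum((c / n) * math.log(c / n) for c in pb.values())
    if ha == 0.0 or hb == 0.0:          # degenerate single-cluster partition
        return 1.0 if labels_a == labels_b else 0.0
    return mi / math.sqrt(ha * hb)      # geometric-mean normalization

def icd_should_stop(prev_partition, new_partition, tol=1e-3):
    """Stop adding ICD pivots once consecutive partitions essentially agree."""
    return 1.0 - nmi(prev_partition, new_partition) < tol
```

In use, one would recluster after each new pivot is selected and compare the new assignment against the previous one; NMI is invariant to label permutations, so relabeled but identical partitions still trigger the stop.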
Article
Relationship between Population Dynamics and the Self-Energy in Driven Non-Equilibrium Systems
Entropy 2016, 18(5), 180; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050180 - 13 May 2016
Cited by 8 | Viewed by 2447
Abstract
We compare the decay rates of excited populations directly calculated within a Keldysh formalism to the equation of motion of the population itself for a Hubbard-Holstein model in two dimensions. While it is true that these two approaches must give the same answer, [...] Read more.
We compare the decay rates of excited populations directly calculated within a Keldysh formalism to the equation of motion of the population itself for a Hubbard-Holstein model in two dimensions. While it is true that these two approaches must give the same answer, it is common to make a number of simplifying assumptions within the differential equation for the populations that allow one to interpret the decay in terms of hot electrons interacting with a phonon bath. Here, we show how care must be taken to ensure an accurate treatment of the equation of motion for the populations, due to the fact that there are identities requiring cancellations of terms that naively look like they contribute to the decay rates. In particular, the average time dependence of the Green’s functions and self-energies plays a pivotal role in determining these decay rates. Full article
(This article belongs to the Special Issue Quantum Nonequilibrium Dynamics)
Article
Multi-Level Formation of Complex Software Systems
Entropy 2016, 18(5), 178; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050178 - 12 May 2016
Cited by 9 | Viewed by 2683
Abstract
We present a multi-level formation model for complex software systems. The previous works extract the software systems to software networks for further studies, but usually investigate the software networks at the class level. In contrast to these works, our treatment of software systems [...] Read more.
We present a multi-level formation model for complex software systems. Previous works abstract software systems into software networks for further study, but usually investigate these networks at the class level. In contrast to those works, our treatment of software systems as multi-level networks is more realistic. In particular, the software networks are organized at three levels of granularity, which represents the modularity and hierarchy in the formation process of real-world software systems. More importantly, simulations based on this model have generated more realistic structural properties of software networks, such as power-law degree distributions, clustering and modularization. On the basis of this model, we then explore how the structure of software systems affects software design principles, which could be helpful for understanding software evolution and software engineering practices. Full article
(This article belongs to the Special Issue Computational Complexity)
Article
Comparing the Models of Steepest Entropy Ascent Quantum Thermodynamics, Master Equation and the Difference Equation for a Simple Quantum System Interacting with Reservoirs
Entropy 2016, 18(5), 176; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050176 - 12 May 2016
Cited by 7 | Viewed by 2405
Abstract
There is increasing interest concerning the details about how quantum systems interact with their surroundings. A number of methodologies have been used to describe these interactions, including Master Equations (ME) based on a system-plus-reservoir (S + R) approach, and more recently, Steepest Entropy [...] Read more.
There is increasing interest concerning the details of how quantum systems interact with their surroundings. A number of methodologies have been used to describe these interactions, including Master Equations (ME) based on a system-plus-reservoir (S + R) approach, and more recently, Steepest Entropy Ascent Quantum Thermodynamics (SEAQT), which asserts that entropy is a fundamental physical property and that isolated quantum systems that are not at stable equilibrium may spontaneously relax without environmental influences. In this paper, the ME and SEAQT approaches, together with a simple linear difference equation (DE) model, are compared with each other and with experimental data in order to study the behavior of a single trapped ion as it interacts with one or more external heat reservoirs. The comparisons of the models present opportunities for additional study to verify the validity and limitations of these approaches. Full article
Article
Fatiguing Effects on the Multi-Scale Entropy of Surface Electromyography in Children with Cerebral Palsy
Entropy 2016, 18(5), 177; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050177 - 10 May 2016
Cited by 9 | Viewed by 2114
Abstract
The objective of this study was to investigate the effects of muscle fatigue on the multi-scale entropy of surface electromyography (EMG) in children with cerebral palsy (CP) and typical development (TD). Sixteen CP children and eighteen TD children participated in experiments where they [...] Read more.
The objective of this study was to investigate the effects of muscle fatigue on the multi-scale entropy of surface electromyography (EMG) in children with cerebral palsy (CP) and typical development (TD). Sixteen CP children and eighteen TD children participated in experiments where they performed upper limb cyclic lifting tasks following a muscle fatiguing process, while the surface EMG signals were recorded from their upper trapezius muscles. Multi-scale entropy (MSE) analyses of the surface EMG were applied by calculating sample entropy (SampEn) on individual intrinsic mode functions (IMFs) adaptively generated by empirical mode decomposition (EMD) of the original signal. The declining degree of the resultant MSE curve was found to reflect muscle fatigue level for all subjects, with its slope (purposely calculated over the first four scales) increasing significantly as the fatigue level increased. Further, such a slope increase was less significant for CP children as compared with TD children. Our findings confirmed that the decrease of muscle fiber conduction velocity (MFCV) and the increase of motor unit synchronization are two possible factors induced by muscle fatigue, and further indicated that some neuromuscular changes (such as MFCV decrease, motor unit synchronization increase, reduction of motor unit firing rates, and selective loss of larger motor units) appear to occur as a result of cerebral palsy. These changes may account for the experimentally observed difference in fatiguing effects between subject groups. Our study provides an investigative tool to assess muscle fatigue as well as to help reveal the complex neuropathological changes underlying the motor impairments of CP children. Full article
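As a rough illustration of the entropy measure underlying the MSE analysis above, here is a minimal sample entropy implementation. The function name, index conventions, and default tolerance (r = 0.2 × standard deviation, a common choice) are assumptions for this sketch, not the authors' code.

```python
import math
import random

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): -ln(A/B), where B counts pairs of matching templates
    of length m and A pairs of length m + 1 (Chebyshev distance <= r)."""
    n = len(x)
    if r is None:  # common default: 0.2 times the signal's standard deviation
        mean = sum(x) / n
        r = 0.2 * math.sqrt(sum((v - mean) ** 2 for v in x) / n)

    def matches(length):
        total = 0
        for i in range(n - length):
            for j in range(i + 1, n - length):
                if max(abs(x[i + k] - x[j + k]) for k in range(length)) <= r:
                    total += 1
        return total

    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A perfectly regular signal scores far lower than random noise.
random.seed(0)
regular = [float(i % 2) for i in range(200)]
noise = [random.random() for _ in range(200)]
```

In the paper's scheme, such a SampEn value would be computed per IMF scale produced by EMD, yielding the MSE curve whose slope tracks fatigue.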
Article
Analysis of the Chaotic Behavior of the Lower Hybrid Wave Propagation in Magnetised Plasma by Hamiltonian Theory
Entropy 2016, 18(5), 175; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050175 - 07 May 2016
Cited by 2 | Viewed by 2491
Abstract
The Hamiltonian character of the ray tracing equations describing the propagation of the Lower Hybrid Wave (LHW) in a magnetic confined plasma device (tokamak) is investigated in order to study the evolution of the parallel wave number along the propagation path. The chaotic [...] Read more.
The Hamiltonian character of the ray tracing equations describing the propagation of the Lower Hybrid Wave (LHW) in a magnetically confined plasma device (tokamak) is investigated in order to study the evolution of the parallel wave number along the propagation path. The chaotic diffusion of the “time-averaged” parallel wave number towards higher values (with respect to that launched by the antenna at the plasma edge) has been evaluated, in order to find an explanation of the filling of the spectral gap (Fisch, 1987) by “Hamiltonian chaos” in the Lower Hybrid Current Drive (LHCD) experiments (Fisch, 1978). The present work shows that the increase of the parallel wave number \(n_{\parallel}\) due to toroidal effects, in the case of the typical plasma parameters of the Frascati Tokamak Upgrade (FTU) experiment, is insufficient to explain the filling of the spectral gap and the consequent current drive, so another mechanism must come into play to justify the wave absorption by Landau damping. Analytical calculations have been supplemented by a numerical algorithm based on the symplectic integration of the ray equations, implemented in a ray tracing code in order to preserve exactly the symplectic character of the Hamiltonian flow. Full article
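The symplectic integration mentioned in the abstract above can be illustrated with the simplest such scheme, semi-implicit (symplectic) Euler. This generic sketch is not the algorithm of the paper's ray tracing code; it only shows why symplectic schemes are preferred for Hamiltonian flows: the energy error stays bounded rather than drifting.

```python
def symplectic_euler(dH_dq, dH_dp, q, p, dt, steps):
    """Semi-implicit Euler: a first-order symplectic map. Because the
    update is symplectic, a nearby 'shadow' Hamiltonian is conserved,
    so the energy error remains bounded over long integrations."""
    for _ in range(steps):
        p = p - dt * dH_dq(q)   # kick: update momentum with current position
        q = q + dt * dH_dp(p)   # drift: update position with new momentum
    return q, p

# Harmonic-oscillator check: H = (q^2 + p^2) / 2 stays near its initial 0.5
# even after 10,000 steps, unlike explicit Euler, whose energy grows.
q, p = symplectic_euler(lambda q: q, lambda p: p, 1.0, 0.0, 0.01, 10_000)
```

For ray tracing, q and p would be the ray position and wave vector, and the derivatives would come from the dispersion relation playing the role of the Hamiltonian.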
Article
Quantum Errors and Disturbances: Response to Busch, Lahti and Werner
Entropy 2016, 18(5), 174; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050174 - 06 May 2016
Cited by 7 | Viewed by 2672
Abstract
Busch, Lahti and Werner (BLW) have recently criticized the operator approach to the description of quantum errors and disturbances. Their criticisms are justified to the extent that the physical meaning of the operator definitions has not hitherto been adequately explained. We rectify that [...] Read more.
Busch, Lahti and Werner (BLW) have recently criticized the operator approach to the description of quantum errors and disturbances. Their criticisms are justified to the extent that the physical meaning of the operator definitions has not hitherto been adequately explained. We rectify that omission. We then examine BLW’s criticisms in the light of our analysis. We argue that, although the approach BLW favour (based on the Wasserstein two-deviation) has its uses, there are important physical situations where an operator approach is preferable. We also discuss the reason why the error-disturbance relation is still giving rise to controversies almost a century after Heisenberg first stated his microscope argument. We argue that the source of the difficulties is the problem of interpretation, which is not so wholly disconnected from experimental practicalities as is sometimes supposed. Full article
Article
Magnetically-Driven Quantum Heat Engines: The Quasi-Static Limit of Their Efficiency
Entropy 2016, 18(5), 173; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050173 - 06 May 2016
Cited by 10 | Viewed by 4814
Abstract
The concept of a quantum heat engine (QHEN) has been discussed in the literature, not only due to its intrinsic scientific interest, but also as an alternative to efficiently recover, on a nanoscale device, thermal energy in the form of useful work. The [...] Read more.
The concept of a quantum heat engine (QHEN) has been discussed in the literature, not only due to its intrinsic scientific interest, but also as an alternative to efficiently recover, on a nanoscale device, thermal energy in the form of useful work. The quantum character of a QHEN relies, for instance, on the fact that any of its intermediate states is determined by a density matrix operator. In particular, this matrix can represent a mixed state. For a classical heat engine, a theoretical upper bound for its efficiency is obtained by analyzing its quasi-static operation along a cycle drawn by a sequence of quasi-equilibrium states. A similar analysis can be carried out for a quantum engine, where quasi-static processes are driven by the evolution of ensemble-averaged observables, via variation of the corresponding operators or of the density matrix itself through a tunable physical parameter. We recently proposed two new conceptual designs for a magnetically-driven quantum engine, where the tunable parameter is the intensity of an external magnetic field. In this article, we shall present the general quantum thermodynamics formalism developed in order to analyze this type of QHEN, and moreover, we shall apply it to describe the theoretical efficiency of two different practical implementations of this concept: an array of semiconductor quantum dots and an ensemble of graphene flakes subjected to mechanical tension. Full article
(This article belongs to the Special Issue Quantum Thermodynamics)
Article
Stochastic Resonance, Self-Organization and Information Dynamics in Multistable Systems
Entropy 2016, 18(5), 172; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050172 - 04 May 2016
Cited by 9 | Viewed by 3029
Abstract
A class of complex self-organizing systems subjected to fluctuations of environmental or intrinsic origin and to nonequilibrium constraints in the form of an external periodic forcing is analyzed from the standpoint of information theory. Conditions under which the response of information entropy and [...] Read more.
A class of complex self-organizing systems subjected to fluctuations of environmental or intrinsic origin and to nonequilibrium constraints in the form of an external periodic forcing is analyzed from the standpoint of information theory. Conditions under which the response of information entropy and related quantities to the nonequilibrium constraint can be optimized via a stochastic resonance-type mechanism are identified, and the role of key parameters is assessed. Full article
(This article belongs to the Special Issue Information and Self-Organization)
Article
Forecasting Energy Value at Risk Using Multiscale Dependence Based Methodology
Entropy 2016, 18(5), 170; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050170 - 04 May 2016
Cited by 6 | Viewed by 2221
Abstract
In this paper, we propose a multiscale dependence-based methodology to analyze the dependence structure and to estimate the downside portfolio risk measures in the energy markets. More specifically, under this methodology, we formulate a new bivariate Empirical Mode Decomposition (EMD) copula based approach [...] Read more.
In this paper, we propose a multiscale dependence-based methodology to analyze the dependence structure and to estimate the downside portfolio risk measures in the energy markets. More specifically, under this methodology, we formulate a new bivariate Empirical Mode Decomposition (EMD) copula-based approach to analyze and model the multiscale dependence structure in the energy markets. The proposed model constructs the copula-based dependence structure in the Bivariate Empirical Mode Decomposition (BEMD)-based multiscale domain. Results from empirical studies using typical Australian electricity daily prices show that there exists a multiscale dependence structure between different regional markets across different scales. By taking the multiscale dependence structure into account, the proposed model demonstrates statistically significant improvements in terms of accuracy and reliability measures. Full article
(This article belongs to the Special Issue Computational Complexity)
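To make the idea of scale-wise dependence concrete: one simple copula-invariant dependence measure that could be evaluated between the paired IMFs of two markets at each BEMD scale is Kendall's tau, since rank correlations depend only on the copula, not on the marginals. The sketch below is illustrative only; the paper's actual copula modeling is richer, and the function name is our own.

```python
def kendall_tau(x, y):
    """Kendall's tau rank correlation: depends only on the ranks of the
    observations, hence only on the copula, not the marginal distributions."""
    n = len(x)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1   # pair ordered the same way in x and y
            elif s < 0:
                discordant += 1   # pair ordered oppositely
    return (concordant - discordant) / (n * (n - 1) / 2)
```

In a BEMD setting, one would decompose both price series, pair the IMFs scale by scale, and evaluate such a measure (or fit a copula) on each pair to obtain the multiscale dependence profile.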
Article
Pressure and Compressibility of Conformal Field Theories from the AdS/CFT Correspondence
Entropy 2016, 18(5), 169; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050169 - 03 May 2016
Cited by 15 | Viewed by 2081
Abstract
The equation of state associated with N = 4 supersymmetric Yang–Mills in four dimensions, for SU(N) in the large N limit, is investigated using the AdS/CFT correspondence. An asymptotically AdS black-hole on the gravity side provides a thermal background [...] Read more.
The equation of state associated with N = 4 supersymmetric Yang–Mills in four dimensions, for SU(N) in the large N limit, is investigated using the AdS/CFT correspondence. An asymptotically AdS black-hole on the gravity side provides a thermal background for the Yang–Mills theory on the boundary, in which the cosmological constant is equivalent to a volume. The thermodynamic variable conjugate to the cosmological constant is a pressure, and the P–V diagram of the quark-gluon plasma is studied. It is known that there is a critical point where the heat capacity diverges, and this is reflected in the isothermal compressibility. Critical exponents are derived and found to be mean field in the large N limit. The same analysis applied to three- and six-dimensional conformal field theories again yields mean field exponents associated with the compressibility at the critical point. Full article
(This article belongs to the Special Issue Black Hole Thermodynamics II)
Article
Quantum Private Query Protocol Based on Two Non-Orthogonal States
Entropy 2016, 18(5), 163; https://0-doi-org.brum.beds.ac.uk/10.3390/e18050163 - 03 May 2016
Cited by 6 | Viewed by 1961
Abstract
We propose a loss tolerant quantum private query (QPQ) protocol based on two non-orthogonal states and unambiguous state discrimination (USD) measurement. By analyzing a two-point attack by a third party, we find that our protocol has a stronger ability to resist external attacks [...] Read more.
We propose a loss-tolerant quantum private query (QPQ) protocol based on two non-orthogonal states and unambiguous state discrimination (USD) measurement. By analyzing a two-point attack by a third party, we find that our protocol has a stronger ability to resist external attacks than the G-protocol and Y-protocol. Our protocol requires a smaller number of compressions than the G-protocol (Gao et al., Opt. Exp. 2012, 20, 17411–17420) and the Y-protocol (Yan et al., Quant. Inf. Process. 2014, 13, 805–813), which means less post-processing. Our protocol also shows better database security and user privacy compared with the G-protocol. Full article
(This article belongs to the Special Issue Entanglement Entropy)