Entropy, Volume 17, Issue 1 (January 2015) – 25 articles, Pages 1–482

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • Papers are published in both HTML and PDF forms; PDF is the official format. To view the papers in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open them.
Article
Modeling and Analyzing the Interaction between Network Rumors and Authoritative Information
Entropy 2015, 17(1), 471-482; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010471 - 19 Jan 2015
Cited by 21 | Viewed by 3366
Abstract
In this paper, we propose a novel two-stage rumor spreading Susceptible-Infected-Authoritative-Removed (SIAR) model for complex homogeneous and heterogeneous networks. The interaction Markov chains (IMC) mean-field equations based on the SIAR model are derived to describe the dynamic interaction between the rumors and authoritative information. We use a Monte Carlo simulation method to characterize the dynamics of the Susceptible-Infected-Removed (SIR) and SIAR models, showing that the SIAR model, by taking authoritative information into account, gives a more realistic description of rumor propagation than the SIR model. The simulation results demonstrate that the critical threshold λc of the SIAR model is only slightly larger than that of the SIR model. The sooner the authoritative information is introduced, the less negative impact the rumors will have. We also find that heterogeneous networks are more prone to rumor spreading. Additionally, the inhibition of rumor spreading, a characteristic of the SIAR model itself, is instructive for later studies on rumor spreading models and controlling strategies.
(This article belongs to the Special Issue Recent Advances in Chaos Theory and Complex Networks)
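The paper's IMC mean-field equations are not reproduced here, but the qualitative effect of an authoritative channel can be illustrated with a minimal homogeneous mean-field sketch. All rates, and the extra removal term `alpha`, are illustrative assumptions, not the authors' equations:

```python
def simulate(lam=0.3, mu=0.1, alpha=0.0, steps=200):
    """Discrete-time homogeneous mean-field rumor dynamics.

    s, i, r: fractions of susceptible, infected (spreaders), removed.
    lam: spreading rate; mu: forgetting rate.
    alpha: extra removal rate once authoritative information circulates
           (alpha = 0 recovers a plain SIR-type rumor model).
    """
    s, i, r = 0.99, 0.01, 0.0
    peak = 0.0
    for _ in range(steps):
        new_inf = lam * s * i          # S -> I via contacts with spreaders
        new_rem = (mu + alpha) * i     # I -> R, faster when alpha > 0
        s -= new_inf
        i += new_inf - new_rem
        r += new_rem
        peak = max(peak, i)
    return peak

peak_sir = simulate(alpha=0.0)    # rumor spreading alone
peak_siar = simulate(alpha=0.2)   # authoritative information active
```

Consistent with the abstract's conclusion, a stronger authoritative channel lowers the rumor peak.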
Article
A Recipe for the Estimation of Information Flow in a Dynamical System
Entropy 2015, 17(1), 438-470; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010438 - 19 Jan 2015
Cited by 33 | Viewed by 5261
Abstract
Information-theoretic quantities, such as entropy and mutual information (MI), can be used to quantify the amount of information needed to describe a dataset or the information shared between two datasets. In the case of a dynamical system, the behavior of the relevant variables can be tightly coupled, such that information about one variable at a given instance in time may provide information about other variables at later instances in time. This is often viewed as a flow of information, and tracking such a flow can reveal relationships among the system variables. Since the MI is a symmetric quantity, an asymmetric quantity, called Transfer Entropy (TE), has been proposed to estimate the directionality of the coupling. However, accurate estimation of entropy-based measures is notoriously difficult. Every method has its own free tuning parameter(s), and there is no consensus on an optimal way of estimating the TE from a dataset. We propose a new methodology to estimate TE and apply a set of methods together as an accuracy cross-check to provide a reliable mathematical tool for any given dataset. We demonstrate both the variability in TE estimation across techniques as well as the benefits of the proposed methodology to reliably estimate the directionality of coupling among variables.
(This article belongs to the Special Issue Transfer Entropy)
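As a concrete reference point, the plug-in (histogram) TE estimator with history length one for discrete data fits in a few lines; it is only one of the several estimators such a cross-check would combine, and the test signals below are illustrative:

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Plug-in estimate (bits) of TE from x to y with history length 1:
    sum over (y_{t+1}, y_t, x_t) of p * log[ p(y_{t+1}|y_t,x_t) / p(y_{t+1}|y_t) ]."""
    n = len(y) - 1
    c_full = Counter((y[t + 1], y[t], x[t]) for t in range(n))
    c_yx = Counter((y[t], x[t]) for t in range(n))
    c_yy = Counter((y[t + 1], y[t]) for t in range(n))
    c_y = Counter(y[t] for t in range(n))
    te = 0.0
    for (y1, y0, x0), c in c_full.items():
        p_cond_full = c / c_yx[(y0, x0)]       # p(y1 | y0, x0)
        p_cond_self = c_yy[(y1, y0)] / c_y[y0]  # p(y1 | y0)
        te += (c / n) * log2(p_cond_full / p_cond_self)
    return te

# y is x delayed by one step, so information flows from x to y
x = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0] * 20
y = [0] + x[:-1]
```

The asymmetry of the estimate (TE from x to y far exceeds the reverse) is what identifies the direction of coupling.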
Article
Self-Similarity in Population Dynamics: Surname Distributions and Genealogical Trees
Entropy 2015, 17(1), 425-437; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010425 - 19 Jan 2015
Cited by 3 | Viewed by 2659
Abstract
The frequency distribution of surnames turns out to be a relevant issue not only in historical demography but also in population biology, and especially in genetics, since surnames tend to behave like neutral genes and propagate like Y chromosomes. The stochastic dynamics leading to the observed scale-invariant distributions has been studied as a Yule process, as a branching phenomenon and also by field-theoretical renormalization group techniques. In the absence of mutations the theoretical models are in good agreement with empirical evidence, but when mutations are present a discrepancy between the theoretical and the experimental exponents is observed. Hints for the possible origin of the mismatch are discussed, with some emphasis on the difference between the asymptotic frequency distribution of a full population and the frequency distributions observed in its samples. A precise connection is established between surname distributions and the statistical properties of genealogical trees. Ancestors tables, being obviously self-similar, may be investigated theoretically by renormalization group techniques, but they can also be studied empirically by exploiting the large online genealogical databases concerning European nobility.
(This article belongs to the Special Issue Entropic Aspects in Statistical Physics of Complex Systems)
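A toy version of the Yule/branching picture described above is easy to simulate: grow a population in which newborns usually inherit the surname of a randomly chosen existing individual (so common surnames grow preferentially) and occasionally found a new one. The parameters are illustrative, not fitted to any historical dataset:

```python
import random

def yule_surnames(n=20000, mu=0.05, seed=1):
    """Grow a population one birth at a time.

    With probability mu the newborn founds a new surname (mutation /
    immigration); otherwise it copies the surname of a random existing
    individual, which implements preferential growth of common names.
    Returns surname sizes, largest first.
    """
    random.seed(seed)
    bearers = [0]          # surname id of each individual
    counts = [1]           # counts[s] = current number of bearers of surname s
    for _ in range(n - 1):
        if random.random() < mu:
            counts.append(1)
            bearers.append(len(counts) - 1)
        else:
            s = bearers[random.randrange(len(bearers))]
            counts[s] += 1
            bearers.append(s)
    return sorted(counts, reverse=True)

counts = yule_surnames()
```

The output shows the heavy tail the abstract refers to: a handful of very common surnames next to a long tail of rare ones.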
Article
Entropy, Age and Time Operator
Entropy 2015, 17(1), 407-424; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010407 - 19 Jan 2015
Cited by 5 | Viewed by 3371
Abstract
The time operator and internal age are intrinsic features of entropy-producing innovation processes. The innovation spaces at each stage are the eigenspaces of the time operator. The internal age is the average innovation time, analogous to lifetime computation. Time operators were originally introduced for quantum systems and highly unstable dynamical systems. Extending the time operator theory to regular Markov chains allows one to relate internal age with norm distances from equilibrium. The goal of this work is to express the evolution of internal age in terms of Lyapunov functionals constructed from entropies. We selected the Boltzmann–Gibbs–Shannon entropy and more general entropy functions, namely the Tsallis entropies and the Kaniadakis entropies. Moreover, we compare the evolution of the distance of initial distributions from equilibrium, the evolution of the Lyapunov functionals constructed from norms, and the evolution of the Lyapunov functionals constructed from entropies. It is remarkable that the entropy functionals evolve, violating the second law of thermodynamics, while the norm functionals evolve thermodynamically.
(This article belongs to the Special Issue Entropic Aspects in Statistical Physics of Complex Systems)
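The Tsallis and Kaniadakis functionals studied in the paper are not reproduced here, but the basic objects are simple to set up: for a regular Markov chain, both the total-variation distance from equilibrium and the relative (Boltzmann–Gibbs–Shannon) entropy with respect to equilibrium are non-increasing along the evolution. A minimal sketch with an arbitrary 3-state chain:

```python
from math import log

def step(p, P):
    """One step of the chain: distribution row-vector times transition matrix."""
    return [sum(p[i] * P[i][j] for i in range(len(P))) for j in range(len(P))]

def kl(p, q):
    """Relative (Boltzmann-Gibbs-Shannon) entropy D(p || q) in nats."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

P = [[0.7, 0.2, 0.1],
     [0.1, 0.8, 0.1],
     [0.2, 0.3, 0.5]]

# equilibrium distribution by long iteration (adequate for a regular chain)
eq = [1 / 3, 1 / 3, 1 / 3]
for _ in range(2000):
    eq = step(eq, P)

p = [1.0, 0.0, 0.0]            # start far from equilibrium
kls, tvs = [], []
for _ in range(20):
    kls.append(kl(p, eq))
    tvs.append(0.5 * sum(abs(a - b) for a, b in zip(p, eq)))
    p = step(p, P)
```

Both sequences decay monotonically; the paper's point is that more general entropy functionals need not share this monotone, "thermodynamic" behavior.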
Letter
On an Objective Basis for the Maximum Entropy Principle
Entropy 2015, 17(1), 401-406; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010401 - 19 Jan 2015
Viewed by 2559
Abstract
In this letter, we elaborate on some of the issues raised by a recent paper by Neapolitan and Jiang concerning the maximum entropy (ME) principle and alternative principles for estimating probabilities consistent with known, measured constraint information. We argue that the ME solution for the “problematic” example introduced by Neapolitan and Jiang has stronger objective basis, rooted in results from information theory, than their alternative proposed solution. We also raise some technical concerns about the Bayesian analysis in their work, which was used to independently support their alternative to the ME solution. The letter concludes by noting some open problems involving maximum entropy statistical inference.
(This article belongs to the Section Information Theory, Probability and Statistics)
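For readers new to the debate, applying the ME principle itself is mechanical: under a mean constraint, the maximizer is an exponential family whose Lagrange multiplier can be found by bisection. A minimal sketch on the classic loaded-die example (the 4.5-mean die is a standard illustration, not the example from the letter):

```python
from math import exp, log

def maxent_mean(values, target, lo=-10.0, hi=10.0, iters=200):
    """Maximum-entropy distribution on `values` with a prescribed mean.

    The solution has the exponential-family form p_k proportional to
    exp(lam * v_k); the multiplier lam is found by bisection, using the
    fact that the resulting mean is increasing in lam.
    """
    def mean_for(lam):
        w = [exp(lam * v) for v in values]
        z = sum(w)
        return sum(v * wi for v, wi in zip(values, w)) / z

    for _ in range(iters):
        mid = (lo + hi) / 2
        if mean_for(mid) < target:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [exp(lam * v) for v in values]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_mean([1, 2, 3, 4, 5, 6], 4.5)
```

With a mean above 3.5 the multiplier is positive, so the probabilities increase geometrically toward the high faces.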
Article
Tsallis Distribution Decorated with Log-Periodic Oscillation
Entropy 2015, 17(1), 384-400; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010384 - 14 Jan 2015
Cited by 30 | Viewed by 2928
Abstract
In many situations, in all branches of physics, one encounters the power-like behavior of some variables, which is best described by a Tsallis distribution characterized by a nonextensivity parameter q and scale parameter T. However, there exist experimental results that can be described only by a Tsallis distribution that is additionally decorated by some log-periodic oscillating factor. We argue that such a factor can originate from allowing for a complex nonextensivity parameter q. The possible information conveyed by such an approach (like the occurrence of complex heat capacity, the notion of complex probability or complex multiplicative noise) will also be discussed.
(This article belongs to the Special Issue Entropic Aspects in Statistical Physics of Complex Systems)
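The mechanism can be demonstrated numerically: raising the q-exponential to a complex power splits it into a power law times a log-periodic factor. The parameter values below are illustrative, not fitted values from the paper:

```python
def tsallis(E, q, T=1.0):
    """Tsallis (q-exponential) shape, allowing a complex nonextensivity q.

    For real q > 1 this decays as a pure power law at large E; a complex
    q multiplies that power law by a log-periodic oscillating factor.
    """
    return (1 + (q - 1) * E / T) ** (-1 / (q - 1))

q_real = 1.1            # plain power-law tail
q_cplx = 1.1 + 0.03j    # complex q: similar tail, decorated by oscillation

Es = [10 ** (k / 10) for k in range(10, 41)]    # E from 10 to 10^4
ratio = [tsallis(E, q_cplx).real / tsallis(E, q_real) for E in Es]
```

The ratio oscillates in log E (it even changes sign over this range), which is exactly the kind of decoration observed on top of a pure Tsallis fit.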
Article
Message Authentication over Noisy Channels
Entropy 2015, 17(1), 368-383; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010368 - 14 Jan 2015
Cited by 5 | Viewed by 3834
Abstract
The essence of authentication is the transmission of unique and irreproducible information. In this paper, authentication is cast as a problem of the secure transmission of a secret key over noisy channels. A general analysis and design framework for message authentication is presented based on the results of Wyner’s wiretap channel. Impersonation and substitution attacks are primarily investigated. Information-theoretic lower and upper bounds on the opponent’s success probability are derived, and the two bounds are shown to match. In this way, the fundamental limits on message authentication over noisy channels are fully characterized. The analysis demonstrates that introducing noisy channels is a reliable way to enhance the security of authentication.
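The noisy-channel framework of the paper is not reproduced here, but the baseline it builds on is the classical noiseless setting, where unconditionally secure authentication via strongly universal hashing limits an impersonation attack to success probability 1/|tag space|. A small simulation of that baseline (field size and trial count are illustrative):

```python
import random

P = 101   # prime field size; messages, tags and key halves live in GF(P)

def tag(m, key):
    """Strongly universal hash family h_{a,b}(m) = a*m + b mod P."""
    a, b = key
    return (a * m + b) % P

random.seed(1)
trials = 20000
wins = 0
for _ in range(trials):
    key = (random.randrange(P), random.randrange(P))
    # impersonation: the opponent forges a message/tag pair having seen nothing
    m_guess, t_guess = random.randrange(P), random.randrange(P)
    wins += tag(m_guess, key) == t_guess
rate = wins / trials
```

The empirical success rate concentrates near 1/P, matching the information-theoretic limit for this family.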
Article
Robust H∞ Finite-Time Control for Discrete Markovian Jump Systems with Disturbances of Probabilistic Distributions
Entropy 2015, 17(1), 346-367; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010346 - 14 Jan 2015
Cited by 12 | Viewed by 3120
Abstract
This paper is concerned with the robust H∞ finite-time control for discrete delayed nonlinear systems with Markovian jumps and external disturbances. It is usually assumed that the disturbance affects the system states and outputs with the same influence degree of 100%, which is not flexible enough to reflect situations where the disturbance affects these two parts to different degrees. To tackle this problem, a probabilistic distribution described by binomial sequences is introduced to model the external disturbance. Throughout the paper, the definitions of finite-time boundedness (FTB) and H∞ FTB are first given. To extend the results further, a model that combines a linear dynamic system and a static nonlinear operator is used to describe the system under discussion. Then, by virtue of the state feedback control method, some new sufficient criteria are derived which guarantee the FTB and H∞ FTB performances for the considered system. Finally, an example is provided to demonstrate the effectiveness of the developed control laws.
(This article belongs to the Special Issue Complex Systems and Nonlinear Dynamics)
Article
Black-Box Optimization Using Geodesics in Statistical Manifolds
Entropy 2015, 17(1), 304-345; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010304 - 13 Jan 2015
Cited by 5 | Viewed by 2843
Abstract
Information geometric optimization (IGO) is a general framework for stochastic optimization problems aiming at limiting the influence of arbitrary parametrization choices: the initial problem is transformed into the optimization of a smooth function on a Riemannian manifold, defining a parametrization-invariant first order differential equation and, thus, yielding an approximately parametrization-invariant algorithm (up to second order in the step size). We define the geodesic IGO (GIGO) update, a fully parametrization-invariant algorithm using the Riemannian structure, and we compute it for the manifold of Gaussians, thanks to Noether’s theorem. However, in similar algorithms, such as CMA-ES (Covariance Matrix Adaptation Evolution Strategy) and xNES (exponential Natural Evolution Strategy), the time steps for the mean and the covariance are decoupled. We suggest two ways of decoupling them in GIGO: twisted GIGO and blockwise GIGO. Finally, we show that while the xNES algorithm is not GIGO, it is an instance of blockwise GIGO applied to the mean and covariance matrix separately. Therefore, xNES has an almost parametrization-invariant description.
(This article belongs to the Special Issue Information, Entropy and Their Geometric Structures)
Article
Information Decomposition in Bivariate Systems: Theory and Application to Cardiorespiratory Dynamics
Entropy 2015, 17(1), 277-303; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010277 - 12 Jan 2015
Cited by 81 | Viewed by 5687
Abstract
In the framework of information dynamics, the temporal evolution of coupled systems can be studied by decomposing the predictive information about an assigned target system into amounts quantifying the information stored inside the system and the information transferred to it. While information storage and transfer are computed through the known self-entropy (SE) and transfer entropy (TE), an alternative decomposition evidences the so-called cross entropy (CE) and conditional SE (cSE), quantifying the cross information and internal information of the target system, respectively. This study presents a thorough evaluation of SE, TE, CE and cSE as quantities related to the causal statistical structure of coupled dynamic processes. First, we investigate the theoretical properties of these measures, providing the conditions for their existence and assessing the meaning of the information theoretic quantity that each of them reflects. Then, we present an approach for the exact computation of information dynamics based on the linear Gaussian approximation, and exploit this approach to characterize the behavior of SE, TE, CE and cSE in benchmark systems with known dynamics. Finally, we exploit these measures to study cardiorespiratory dynamics measured from healthy subjects during head-up tilt and paced breathing protocols. Our main result is that the combined evaluation of the measures of information dynamics allows us to infer the causal effects associated with the observed dynamics and to interpret the alteration of these effects with changing experimental conditions.
(This article belongs to the Special Issue Entropy and Cardiac Physics)
Article
The Entropy of an Armco Iron under Irreversible Deformation
Entropy 2015, 17(1), 264-276; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010264 - 12 Jan 2015
Cited by 4 | Viewed by 2440
Abstract
This work is devoted to the development of a theoretical approach for the calculation of entropy in metals under plastic deformation. The thermodynamic analysis of the plastic deformation process allowed us to obtain the expression for determination of the entropy production. The value of the entropy production in an Armco iron specimen under plastic deformation was calculated on the basis of the proposed technique and infrared thermography data. This method also lets us determine, from the experimental data, the inelastic strain caused by the initiation and growth of defects, which was used as the internal variable in the considered thermomechanical model. In order to verify the obtained results, a theoretical analysis of the modeled situation was carried out.
(This article belongs to the Section Thermodynamics)
Article
A Comparative Study on Energy and Exergy Analyses of a CI Engine Performed with Different Multiple Injection Strategies at Part Load: Effect of Injection Pressure
Entropy 2015, 17(1), 244-263; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010244 - 12 Jan 2015
Cited by 19 | Viewed by 3455
Abstract
In this study, a four-stroke, four-cylinder direct injection CI engine was run using three different injection pressures. In all measurements, the fuel quantity per cycle, the pre-injection and main injection timing, the boost pressure and the engine speed were kept constant. The engine tests were performed under 130, 140 and 150 MPa rail pressure. In the theoretical part of the study, combustion, emission, energy and exergy analyses were made using the test results. An increase in the injection pressure increases combustion efficiency. The results show that combustion efficiency is not sufficient by itself, because the increase in the power demand of the injection pump decreases the thermal efficiency. The increase in the combustion temperature increases the cooling loss and decreases the exergetic efficiency. In addition, the NOx emissions increased by 12% and soot emissions decreased by 44% when the injection pressure was increased by 17%. The thermal and exergetic efficiencies are found to be inversely proportional to the injection pressure. Exergy destruction is found to be independent of the injection pressure, and its value is obtained as ~6%.
(This article belongs to the Special Issue Exergy: Analysis and Applications)
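The engine model itself is not reproduced here, but the central bookkeeping quantity of any exergy analysis, exergy destruction, follows from the textbook Gouy–Stodola relation: destroyed exergy equals the dead-state temperature times the entropy generated. A minimal sketch for heat crossing a finite temperature gap (all numbers are illustrative):

```python
def exergy_destruction(Q, T_hot, T_cold, T0=298.15):
    """Exergy destroyed (kJ) when heat Q (kJ) flows from T_hot to T_cold (K):
    X_dest = T0 * S_gen (Gouy-Stodola), with S_gen = Q*(1/T_cold - 1/T_hot).
    T0 is the dead-state (ambient) temperature."""
    s_gen = Q * (1.0 / T_cold - 1.0 / T_hot)
    return T0 * s_gen

# 100 kJ dropping from 1200 K to 600 K destroys roughly 25 kJ of exergy
x = exergy_destruction(100.0, 1200.0, 600.0)
```

The same balance, applied term by term to combustion, heat transfer and friction, is what yields per-process destruction shares like the ~6% reported above.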
Article
Multiscale Entropy Analysis of Heart Rate Variability for Assessing the Severity of Sleep Disordered Breathing
Entropy 2015, 17(1), 231-243; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010231 - 12 Jan 2015
Cited by 18 | Viewed by 3749
Abstract
Obstructive sleep apnea (OSA) is an independent cardiovascular risk factor to which autonomic nervous dysfunction has been reported to be an important contributor. Ninety subjects recruited from the sleep center of a single medical center were divided into four groups: normal snoring subjects without OSA (apnea hypopnea index, AHI < 5, n = 11), mild OSA (5 ≤ AHI < 15, n = 10), moderate OSA (15 ≤ AHI < 30, n = 24), and severe OSA (AHI ≥ 30, n = 45). Demographic (i.e., age, gender), anthropometric (i.e., body mass index, neck circumference), and polysomnographic (PSG) data were recorded and compared among the different groups. For each subject, R-R intervals (RRI) from 10 segments of 10-minute electrocardiogram recordings during non-rapid eye movement sleep at stage N2 were acquired and analyzed for heart rate variability (HRV) and sample entropy using multiscale entropy index (MEI) that was divided into small scale (MEISS, scale 1–5) and large scale (MEILS, scale 6–10). Our results not only demonstrated that MEISS could successfully distinguish normal snoring subjects and those with mild OSA from those with moderate and severe disease, but also revealed good correlation between MEISS and AHI with Spearman correlation analysis (r = −0.684, p < 0.001). Therefore, using the two parameters of EEG and ECG, MEISS may serve as a simple preliminary screening tool for assessing the severity of OSA before proceeding to PSG analysis.
(This article belongs to the Special Issue Entropy and Cardiac Physics)
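The multiscale entropy machinery referenced above (Costa-style coarse-graining followed by sample entropy at each scale) is compact enough to sketch. This simplified variant counts template matches slightly differently from the canonical definition, and the test signals are synthetic, not RRI series:

```python
import random
from math import log, sin

def coarse_grain(x, scale):
    """Non-overlapping window averages of length `scale`."""
    return [sum(x[i * scale:(i + 1) * scale]) / scale
            for i in range(len(x) // scale)]

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): -log of the conditional probability that template
    pairs matching for m points (Chebyshev distance <= r) still match
    at m + 1 points. r defaults to 0.2 * standard deviation."""
    if r is None:
        mean = sum(x) / len(x)
        r = 0.2 * (sum((v - mean) ** 2 for v in x) / len(x)) ** 0.5
    def matches(mm):
        n = len(x) - mm
        return sum(1
                   for i in range(n)
                   for j in range(i + 1, n)
                   if max(abs(x[i + k] - x[j + k]) for k in range(mm)) <= r)
    b, a = matches(m), matches(m + 1)
    return float("inf") if a == 0 or b == 0 else -log(a / b)

random.seed(0)
noise = [random.random() for _ in range(400)]
regular = [sin(0.5 * t) for t in range(400)]

# entropy profile across scales 1..3 (the multiscale index averages such values)
mse_noise = [sample_entropy(coarse_grain(noise, s)) for s in (1, 2, 3)]
```

As expected, the irregular signal scores far higher than the predictable one at scale 1; a multiscale index such as MEISS summarizes this profile over a range of scales.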
Article
An 18 Moments Model for Dense Gases: Entropy and Galilean Relativity Principles without Expansions
Entropy 2015, 17(1), 214-230; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010214 - 09 Jan 2015
Cited by 12 | Viewed by 4321
Abstract
The 14 moments model for dense gases, introduced in the last few years by Arima, Taniguchi, Ruggeri and Sugiyama, is here extended up to 18 moments. They found the closure of the balance equations up to a finite order with respect to equilibrium; it is also possible to impose for that model the entropy and Galilean relativity principles up to whatever order with respect to equilibrium, but only by using Taylor expansions. Here, the exact solution is found, without expansions, but a larger number of moments has to be considered, and reasons will be given suggesting that this number is at least 18.
(This article belongs to the Section Thermodynamics)
Article
Deduction of Lorentz Transformations from Classical Thermodynamics
Entropy 2015, 17(1), 197-213; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010197 - 08 Jan 2015
Cited by 3 | Viewed by 3136
Abstract
The Lorentz transformations are obtained by assuming that the laws of classical thermodynamics are invariant under changes of inertial reference frames. As Maxwell equations are used in order to deduce a wave equation that shows the constancy of the speed of light, by means of the laws of classical thermodynamics, the invariance of the Carnot cycle is deduced under reference frame changes. Starting with this result and the blackbody particle number density in a rest frame, the Lorentz transformations are obtained. A discussion about the universality of classical thermodynamics is given.
Article
An Image Encryption Scheme Based on Hyperchaotic Rabinovich and Exponential Chaos Maps
Entropy 2015, 17(1), 181-196; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010181 - 08 Jan 2015
Cited by 42 | Viewed by 3682
Abstract
This paper proposes a new four-dimensional hyperchaotic map based on the Rabinovich system to realize chaotic encryption in higher dimension and improve the security. The chaotic sequences generated by the Runge-Kutta method are combined with the chaotic sequences generated by an exponential chaos map to generate key sequences. The key sequences are used for image encryption. The security test results indicate that the new hyperchaotic system has high security and complexity. The comparison between the new hyperchaotic system and several low-dimensional chaotic systems shows that the proposed system performs more efficiently.
(This article belongs to the Special Issue Recent Advances in Chaos Theory and Complex Networks)
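The overall pipeline (chaotic trajectory, quantized to a keystream, combined with the image data) can be sketched generically. The logistic map below is a deliberately simple stand-in for the paper's hyperchaotic Rabinovich and exponential-map sequences, and offers none of their security; parameters are illustrative:

```python
def keystream(n, x0=0.37, mu=3.99):
    """Chaotic keystream bytes from a logistic map iteration
    (a stand-in here for the paper's hyperchaotic sequences)."""
    x, out = x0, []
    for _ in range(n):
        x = mu * x * (1 - x)           # logistic map in its chaotic regime
        out.append(int(x * 256) & 0xFF)  # quantize state to a byte
    return out

def xor_cipher(data, ks):
    """XOR the data with the keystream; applying it twice decrypts."""
    return bytes(d ^ k for d, k in zip(data, ks))

plain = bytes(range(32))               # toy stand-in for image pixel bytes
ks = keystream(len(plain))
cipher = xor_cipher(plain, ks)
```

Because the keystream is fully determined by the initial condition and map parameters, those values play the role of the encryption key.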
Article
A Clustering Method Based on the Maximum Entropy Principle
Entropy 2015, 17(1), 151-180; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010151 - 07 Jan 2015
Cited by 23 | Viewed by 4630
Abstract
Clustering is an unsupervised process to determine which unlabeled objects in a set share interesting properties. The objects are grouped into k subsets (clusters) whose elements optimize a proximity measure. Methods based on information theory have proven to be feasible alternatives. They are based on the assumption that a cluster is one subset with the minimal possible degree of “disorder”. They attempt to minimize the entropy of each cluster. We propose a clustering method based on the maximum entropy principle. Such a method explores the space of all possible probability distributions of the data to find one that maximizes the entropy subject to extra conditions based on prior information about the clusters. The prior information is based on the assumption that the elements of a cluster are “similar” to each other in accordance with some statistical measure. As a consequence of such a principle, those distributions of high entropy that satisfy the conditions are favored over others. Searching the space to find the optimal distribution of objects in the clusters represents a hard combinatorial problem, which disallows the use of traditional optimization techniques. Genetic algorithms are a good alternative to solve this problem. We benchmark our method relative to the best theoretical performance, which is given by the Bayes classifier when data are normally distributed, and a multilayer perceptron network, which offers the best practical performance when data are not normal. In general, a supervised classification method will outperform a non-supervised one, since, in the first case, the elements of the classes are known a priori. In what follows, we show that our method’s effectiveness is comparable to that of a supervised one. This clearly exhibits the strength of our method.
(This article belongs to the Section Information Theory, Probability and Statistics)
Editorial
Acknowledgement to Reviewers of Entropy in 2014
Entropy 2015, 17(1), 142-150; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010142 - 07 Jan 2015
Viewed by 2240
Abstract
The editors of Entropy would like to express their sincere gratitude to the following reviewers for assessing manuscripts in 2014: [...]
Article
Assessment of Time and Frequency Domain Entropies to Detect Sleep Apnoea in Heart Rate Variability Recordings from Men and Women
Entropy 2015, 17(1), 123-141; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010123 - 06 Jan 2015
Cited by 21 | Viewed by 3653
Abstract
Heart rate variability (HRV) provides useful information about heart dynamics both under healthy and pathological conditions. Entropy measures have shown their utility to characterize these dynamics. In this paper, we assess the ability of spectral entropy (SE) and multiscale entropy (MsE) to characterize the sleep apnoea-hypopnea syndrome (SAHS) in HRV recordings from 188 subjects. Additionally, we evaluate possible differences in these analyses depending on gender. We found that the SE computed from the very low frequency band and the low frequency band showed the ability to characterize SAHS regardless of gender, and that MsE features may be able to distinguish gender specificities. SE and MsE showed complementarity to detect SAHS, since several features from both analyses were automatically selected by the forward-selection backward-elimination algorithm. Finally, SAHS was modelled through logistic regression (LR) by using optimum sets of selected features. Modelling SAHS by gender reached significantly higher performance than doing it jointly. The highest diagnostic ability was reached by modelling SAHS in women. The LR classifier achieved 85.2% accuracy (Acc) and 0.951 area under the ROC curve (AROC). LR for men reached 77.6% Acc and 0.895 AROC, whereas LR for the whole set reached 72.3% Acc and 0.885 AROC. Our results show the usefulness of the SE and MsE analyses of HRV to detect SAHS, as well as suggest that, when using HRV, SAHS may be more accurately modelled if data are separated by gender.
(This article belongs to the Special Issue Entropy and Cardiac Physics)
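Of the two measures, spectral entropy is the simpler to state: normalize the power spectrum to a probability distribution and take its Shannon entropy. A self-contained sketch using a naive DFT (the test signals are synthetic, not HRV recordings, and band-restricted variants like those in the paper would simply select a subset of bins):

```python
import cmath
import random
from math import log, pi, sin

def spectral_entropy(x):
    """Normalized Shannon entropy of the power spectrum (naive DFT):
    close to 1 for a flat, noise-like spectrum, close to 0 when the
    power concentrates in a single frequency bin."""
    n = len(x)
    nbins = n // 2
    psd = []
    for k in range(1, nbins + 1):          # skip the DC bin
        s = sum(x[t] * cmath.exp(-2j * pi * k * t / n) for t in range(n))
        psd.append(abs(s) ** 2)
    total = sum(psd)
    h = -sum((v / total) * log(v / total) for v in psd if v > 0)
    return h / log(nbins)                  # normalize to [0, 1]

random.seed(0)
noise = [random.gauss(0, 1) for _ in range(256)]          # broadband
tone = [sin(2 * pi * 8 * t / 256) for t in range(256)]    # single frequency
```

A regular oscillation thus scores near zero while broadband variability scores near one, which is the contrast such features exploit.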
Article
Estimating the Entropy of a Weibull Distribution under Generalized Progressive Hybrid Censoring
Entropy 2015, 17(1), 102-122; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010102 - 05 Jan 2015
Cited by 29 | Viewed by 3880
Abstract
Recently, progressive hybrid censoring schemes have become quite popular in life-testing problems and reliability analysis. However, the limitation of the progressive hybrid censoring scheme is that it cannot be applied when few failures occur before time T. Therefore, a generalized progressive hybrid censoring scheme was introduced. In this paper, we consider the estimation of the entropy of a two-parameter Weibull distribution based on a generalized progressively hybrid censored sample. The Bayes estimators of the entropy of the Weibull distribution are provided under symmetric and asymmetric loss functions, namely the squared error, linex and general entropy loss functions. Since these Bayes estimators cannot be obtained in closed form, Lindley's approximation is used to compute them. Simulation experiments are performed to assess the effectiveness of the different estimators. Finally, a real dataset is analyzed for illustrative purposes. Full article
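For context, the quantity being estimated has a closed form: the differential entropy of a two-parameter Weibull distribution with shape k and scale lam is H = gamma*(1 - 1/k) + ln(lam/k) + 1, with gamma the Euler-Mascheroni constant. The sketch below (not the paper's Bayes/Lindley machinery, which works from the censored-sample likelihood) computes this and checks it against a naive Monte Carlo estimate from complete samples:

```python
import numpy as np

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def weibull_entropy(k, lam):
    """Closed-form differential entropy (nats) of Weibull(shape=k, scale=lam)."""
    return GAMMA * (1.0 - 1.0 / k) + np.log(lam / k) + 1.0

def weibull_entropy_mc(k, lam, n=200_000, seed=0):
    """Monte Carlo check: H ~ -E[log f(X)] from complete (uncensored) samples."""
    x = lam * np.random.default_rng(seed).weibull(k, n)
    log_pdf = np.log(k / lam) + (k - 1) * np.log(x / lam) - (x / lam) ** k
    return float(-log_pdf.mean())
```

For k = 1 the formula reduces to 1 + ln(lam), the entropy of an exponential distribution, which is a quick sanity check.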
Article
Entropy-Based Characterization of Internet Background Radiation
Entropy 2015, 17(1), 74-101; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010074 - 31 Dec 2014
Cited by 9 | Viewed by 3744
Abstract
Network security requires real-time monitoring of network traffic in order to detect new and unexpected attacks. Attack detection methods based on deep packet inspection are time-consuming and costly, due to their high computational demands. This paper proposes a fast, lightweight method to distinguish the different attack types observed in an IP darkspace monitor. The method is based on entropy measures of traffic-flow features and machine learning techniques. The explored data belong to a portion of the Internet background radiation from a large IP darkspace, i.e., real traffic captures that exclusively contain unsolicited traffic, ongoing attacks, attack preparation activities and attack aftermaths. Results from an in-depth traffic analysis based on packet headers and content are used as a reference to label the data and to evaluate the quality of the entropy-based classification. Full IP darkspace traffic captures from a three-week observation period in April 2012 are used to compare the entropy-based classification with the in-depth traffic analysis. Results show that several traffic types correlate strongly with their respective traffic-flow entropy signals and can even be fitted by polynomial regression models. Therefore, sudden changes in traffic types caused by new attacks or attack preparation activities can be identified from entropy variations. Full article
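A minimal sketch of the underlying idea, with hypothetical feature values (the paper's actual feature set, labeling and classifiers are far richer): compute the Shannon entropy of the empirical distribution of one traffic-flow feature per time window, and watch for sudden changes in the resulting entropy signal.

```python
import numpy as np
from collections import Counter

def flow_entropy(values):
    """Shannon entropy (bits) of the empirical distribution of one
    traffic-flow feature (e.g. destination ports) within a time window."""
    counts = np.array(list(Counter(values).values()), dtype=float)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

# A horizontal port scan spreads packets over many ports (high entropy),
# while a flood concentrates them on one service (low entropy).
scan_ports = list(range(1024))   # hypothetical window: one probe per port
flood_ports = [80] * 1024        # hypothetical window: everything to port 80
```

Tracking such per-window entropy values over days yields the traffic-flow entropy signals whose regime changes the paper associates with new attack activity.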
Review
The Big World of Nanothermodynamics
Entropy 2015, 17(1), 52-73; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010052 - 31 Dec 2014
Cited by 32 | Viewed by 5717
Abstract
Nanothermodynamics extends standard thermodynamics to incorporate finite-size effects on the scale of nanometers. A key ingredient is Hill’s subdivision potential that accommodates the non-extensive energy of independent small systems, similar to how Gibbs’ chemical potential accommodates distinct particles. Nanothermodynamics is essential for characterizing the thermal equilibrium distribution of independently relaxing regions inside bulk samples, as is found for the primary response of most materials using various experimental techniques. The subdivision potential ensures strict adherence to the laws of thermodynamics: total energy is conserved by including an instantaneous contribution from the entropy of local configurations, and total entropy remains maximized by coupling to a thermal bath. A unique feature of nanothermodynamics is the completely-open nanocanonical ensemble. Another feature is that particles within each region become statistically indistinguishable, which avoids non-extensive entropy and mimics quantum-mechanical behavior. Applied to mean-field theory, nanothermodynamics gives a heterogeneous distribution of regions that yields stretched-exponential relaxation and super-Arrhenius activation. Applied to Monte Carlo simulations, there is a nonlinear correction to Boltzmann’s factor that improves agreement between the Ising model and measured non-classical critical scaling in magnetic materials. Nanothermodynamics also provides a fundamental mechanism for the 1/f noise found in many materials. Full article
(This article belongs to the Special Issue Nanothermodynamics)
Article
Finite-Time Synchronization of Chaotic Complex Networks with Stochastic Disturbance
Entropy 2015, 17(1), 39-51; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010039 - 30 Dec 2014
Cited by 15 | Viewed by 3002
Abstract
This paper is concerned with the problem of finite-time synchronization in complex networks with stochastic noise perturbations. By using a novel finite-time ℒ-operator differential inequality and other inequality techniques, some novel sufficient conditions are obtained to ensure finite-time stochastic synchronization for the complex networks concerned, where the coupling matrix need not be symmetric. The effects of the control parameters on synchronization speed and time are also analyzed, and the synchronization time obtained here is shorter than that in the existing literature. The results are also applicable to both directed and undirected weighted networks, without requiring any further information about the coupling matrix. Finally, an example with numerical simulations is given to demonstrate the effectiveness of the proposed method. Full article
(This article belongs to the Special Issue Recent Advances in Chaos Theory and Complex Networks)
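As a toy illustration only (not the paper's finite-time controller or its network setting), a two-node drive-response pair of chaotic Lorenz oscillators under Euler-Maruyama integration shows the synchronization error collapsing once a simple linear coupling is strong enough to dominate the chaotic divergence and the noise; the coupling strength and noise intensity below are illustrative values.

```python
import numpy as np

def lorenz(v, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Classic chaotic Lorenz vector field, used here as the node dynamics."""
    x, y, z = v
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

rng = np.random.default_rng(42)
dt, steps = 1e-3, 2000
c = 100.0      # linear coupling (control) strength -- illustrative
noise = 0.01   # intensity of the stochastic disturbance -- illustrative

drive = np.array([1.0, 1.0, 1.0])
response = np.array([5.0, -5.0, 10.0])
err0 = np.linalg.norm(response - drive)   # initial synchronization error

# Euler-Maruyama integration of the noisy, linearly coupled pair
for _ in range(steps):
    new_drive = drive + lorenz(drive) * dt
    response = (response
                + (lorenz(response) + c * (drive - response)) * dt
                + noise * np.sqrt(dt) * rng.normal(size=3))
    drive = new_drive

err = np.linalg.norm(response - drive)    # error after t = 2 time units
```

With the noise present the error settles at a small noise floor rather than exactly zero, which is why the paper's stochastic analysis is phrased in probabilistic terms.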
Article
A Color Image Encryption Algorithm Based on a Fractional-Order Hyperchaotic System
Entropy 2015, 17(1), 28-38; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010028 - 23 Dec 2014
Cited by 27 | Viewed by 3581
Abstract
In this paper, a new color image encryption algorithm based on a fractional-order hyperchaotic system is proposed. Firstly, four chaotic sequences are generated by a fractional-order hyperchaotic system. The parameters of this system, together with its initial values, are regarded as the secret keys, and the plain image is encrypted by performing the XOR and shuffling operations simultaneously. The proposed encryption scheme is described in detail, along with security analyses including correlation analysis, histogram analysis, differential attacks and key sensitivity analysis. Experimental results show that the proposed scheme has a large key space and high key sensitivity, and resists statistical analysis and differential attacks, so it offers high security and is suitable for color image encryption. Full article
(This article belongs to the Special Issue Recent Advances in Chaos Theory and Complex Networks)
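A heavily simplified sketch of the XOR-plus-shuffling idea, substituting a one-dimensional logistic map for the fractional-order hyperchaotic system and applying the two operations in sequence rather than simultaneously; the key values x0 and r are illustrative, not from the paper:

```python
import numpy as np

def logistic_sequence(x0, r, n):
    """Chaotic keystream from a logistic map -- a deliberately simplified
    stand-in for the paper's fractional-order hyperchaotic system."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

def encrypt(img, x0=0.3141, r=3.9999):
    """XOR a uint8 image with a chaotic byte keystream, then shuffle pixels."""
    flat = img.ravel()
    n = flat.size
    ks = logistic_sequence(x0, r, 2 * n)
    key = (ks[:n] * 256).astype(np.uint8)   # byte keystream for XOR
    perm = np.argsort(ks[n:])               # key-dependent permutation
    return (flat ^ key)[perm].reshape(img.shape)

def decrypt(cipher, x0=0.3141, r=3.9999):
    """Regenerate keystream and permutation from the keys, invert both steps."""
    flat = cipher.ravel()
    n = flat.size
    ks = logistic_sequence(x0, r, 2 * n)
    key = (ks[:n] * 256).astype(np.uint8)
    perm = np.argsort(ks[n:])
    inv = np.empty_like(perm)
    inv[perm] = np.arange(n)                # inverse permutation
    return (flat[inv] ^ key).reshape(cipher.shape)
```

Because chaotic maps are sensitive to initial conditions, even a tiny perturbation of x0 produces a different keystream and permutation, which is the mechanism behind the key sensitivity the paper analyzes.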
Article
Complexity-Regularized Regression for Serially-Correlated Residuals with Applications to Stock Market Data
Entropy 2015, 17(1), 1-27; https://0-doi-org.brum.beds.ac.uk/10.3390/e17010001 - 23 Dec 2014
Cited by 1 | Viewed by 2930
Abstract
A popular approach in the investigation of the short-term behavior of a non-stationary time series is to assume that the time series decomposes additively into a long-term trend and short-term fluctuations. A first step towards investigating the short-term behavior requires estimation of the trend, typically via smoothing in the time domain. We propose a method for time-domain smoothing, called complexity-regularized regression (CRR). This method extends recent work, which infers a regression function that makes residuals from a model “look random”. Our approach operationalizes non-randomness in the residuals by applying ideas from computational mechanics, in particular the statistical complexity of the residual process. The method is compared to generalized cross-validation (GCV), a standard approach for inferring regression functions, and shown to outperform GCV when the error terms are serially correlated. Regression under serially-correlated residuals has applications to time series analysis, where the residuals may represent short timescale activity. We apply CRR to a time series drawn from the Dow Jones Industrial Average and examine how both the long-term and short-term behavior of the market have changed over time. Full article
(This article belongs to the Section Complexity)
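A minimal stand-in for the trend/fluctuation decomposition described above, using ordinary polynomial least squares for the trend and lag-1 autocorrelation (rather than the statistical complexity that CRR actually uses) to show that residuals from a too-simple smoother need not "look random"; all series here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 400)
trend = 2.0 + 3.0 * t - 1.5 * t ** 2     # synthetic long-term trend

# Short-term fluctuations: an AR(1) process, i.e. serially-correlated noise
noise = np.zeros_like(t)
for i in range(1, t.size):
    noise[i] = 0.8 * noise[i - 1] + rng.normal(scale=0.1)
series = trend + noise

# Estimate the trend by ordinary polynomial least squares; GCV and CRR
# instead select the amount of smoothing automatically
residuals = series - np.polyval(np.polyfit(t, series, deg=2), t)

def lag1_autocorr(x):
    """Lag-1 autocorrelation: a crude stand-in for the non-randomness
    test that CRR performs on the residual process."""
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))
```

Here the residuals retain strong serial correlation, which is exactly the regime in which the paper reports CRR outperforming GCV.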