Entropy, Volume 23, Issue 7 (July 2021) – 137 articles

Cover Story: The geometric properties of a continuous probability distribution give rise to a statistical manifold via the Fisher information metric and Riemannian geometry. Establishing a coordinate transformation that locally relates the manifold to the Euclidean space results in a coordinate system in which the distribution takes a particularly simple form. This enables an accurate variational approximation of the transformed distribution with a normal distribution, which is provided by the novel geometric Variational Inference (geoVI) algorithm. A computationally efficient implementation of the transformation equips geoVI with a high level of accuracy and an almost linear scaling with the problem size. This enables geoVI to solve non-linear, hierarchical Bayesian inverse problems in very high dimensions.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view the papers in PDF format, click on the "PDF Full-text" link, and use the free Adobe Reader to open them.
14 pages, 563 KiB  
Article
Representation of a Monotone Curve by a Contour with Regular Change in Curvature
by Yevhen Havrylenko, Yuliia Kholodniak, Serhii Halko, Oleksandr Vershkov, Oleksandr Miroshnyk, Olena Suprun, Olena Dereza, Taras Shchur and Mścisław Śrutek
Entropy 2021, 23(7), 923; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070923 - 20 Jul 2021
Cited by 19 | Viewed by 2221
Abstract
The problem of modelling a smooth contour with a regular change in curvature representing a monotone curve with specified accuracy is solved in this article. The contour was formed within the area of the possible location of a convex curve, which can interpolate a point series. The assumption that if a sequence of points can be interpolated by a monotone curve, then the reference curve on which these points have been assigned is monotone, provides the opportunity to implement the proposed approach to estimate the interpolation error of a point series of arbitrary configuration. The proposed methods for forming a convex regular contour by arcs of ellipses and B-splines ensure the interpolation of any point series in parts that can be interpolated by a monotone curve. At the same time, the deflection of the contour from the boundaries of the area of the possible location of the monotone curve can be controlled. The possibilities of the developed methods are tested by solving problems of the interpolation of point series belonging to monotone curves. The problems are solved in the CAD system SolidWorks with the use of a software application created on the basis of the methods developed in this research work. Full article
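As a rough illustration of the property this paper controls (not the authors' SolidWorks implementation), the sketch below interpolates a point series taken from a monotone-curvature curve with a cubic B-spline and checks how the curvature of the fitted contour changes; it assumes SciPy, and the sampled ellipse and point counts are illustrative only:

```python
import numpy as np
from scipy import interpolate

# Point series sampled from a curve with monotone curvature
# (a quarter of an ellipse with semi-axes 3 and 1.5).
t = np.linspace(0.0, np.pi / 2, 8)
points_x = 3.0 * np.cos(t)
points_y = 1.5 * np.sin(t)

# Fit an interpolating cubic B-spline (s=0 forces exact interpolation).
tck, u = interpolate.splprep([points_x, points_y], s=0, k=3)

# Evaluate the spline densely and compute the signed curvature
# k = (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2).
uu = np.linspace(0.0, 1.0, 400)
dx, dy = interpolate.splev(uu, tck, der=1)
ddx, ddy = interpolate.splev(uu, tck, der=2)
curvature = (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5

# Check whether the fitted contour exhibits a regular (monotone)
# change in curvature along the curve.
diffs = np.diff(curvature)
monotone = bool(np.all(diffs <= 0) or np.all(diffs >= 0))
print("monotone curvature:", monotone)
```

A plain spline fit offers no guarantee of monotone curvature, which is exactly why the paper constrains the contour to the region where a monotone interpolating curve can exist.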

17 pages, 1110 KiB  
Article
Redundant Information Neural Estimation
by Michael Kleinman, Alessandro Achille, Stefano Soatto and Jonathan C. Kao
Entropy 2021, 23(7), 922; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070922 - 20 Jul 2021
Cited by 4 | Viewed by 3399
Abstract
We introduce the Redundant Information Neural Estimator (RINE), a method that allows efficient estimation of the component of information about a target variable that is common to a set of sources, known as the “redundant information”. We show that existing definitions of the redundant information can be recast in terms of an optimization over a family of functions. In contrast to previous information decompositions, which can only be evaluated for discrete variables over small alphabets, we show that optimizing over functions enables the approximation of the redundant information for high-dimensional and continuous predictors. We demonstrate this on high-dimensional image classification and motor-neuroscience tasks. Full article
(This article belongs to the Special Issue Information Flow in Neural Systems)

22 pages, 950 KiB  
Article
Stabilization and Synchronization of a Complex Hidden Attractor Chaotic System by Backstepping Technique
by Jesus M. Munoz-Pacheco, Christos Volos, Fernando E. Serrano, Sajad Jafari, Jacques Kengne and Karthikeyan Rajagopal
Entropy 2021, 23(7), 921; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070921 - 20 Jul 2021
Cited by 11 | Viewed by 2411
Abstract
In this paper, the stabilization and synchronization of a complex hidden chaotic attractor is shown. This article begins with the dynamic analysis of a complex Lorenz chaotic system, considering the vector field properties of the analyzed system in the Cⁿ domain. Then, considering first the original domain of attraction of the complex Lorenz chaotic system at the equilibrium point, and using the required set topology of this domain of attraction, one hidden chaotic attractor is found as the intersection of two sets in which two of the parameters, r and b, can be varied. Then, a backstepping controller is derived by selecting extra state variables and establishing the required Lyapunov functionals in a recursive methodology. For the synchronization control law, a similar procedure is implemented, but this time taking into consideration the error variable, which comprises the difference between the response system and the drive system, in order to synchronize the response system with the original drive system, i.e., the original complex Lorenz system. Full article

20 pages, 3368 KiB  
Article
A Network Approach to the Study of the Dynamics of Risk Spillover in China’s Bond Market
by Zhewen Liao, Hongli Zhang, Kun Guo and Ning Wu
Entropy 2021, 23(7), 920; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070920 - 20 Jul 2021
Cited by 9 | Viewed by 2970
Abstract
Since 2018, the bond market has surpassed the stock market, becoming the biggest investment area in China’s security market, and the systemic risks of China’s bond market are of non-negligible importance. Based on daily interest rate data for representative bond categories, this study conducted a dynamic analysis based on generalized vector autoregressive volatility spillover variance decomposition, constructed a complex network, and adopted the minimum spanning tree method to clarify and analyze the risk propagation path between different bond types. It is found that the importance of each bond type is positively correlated with liquidity, transaction volume, and credit rating; the inter-bank market is the most important market in the entire bond market, while interest rate bonds, bank bonds and urban investment bonds are varieties of great systemic importance. In addition, the long-term trend of the dynamic spillover index of China’s bond market falls in line with the pace of interest rate adjustments. To hold the bottom line of preventing systemic financial risks in China’s bond market, standard management, strict supervision, and timely regulation of the bond markets are required, and structural entropy, as a useful indicator, should also be used in risk management and monitoring. Full article
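The network-construction step can be illustrated with a toy sketch: build a correlation-based distance matrix from simulated yield-change series and extract its minimum spanning tree with SciPy. The series names, factor loadings, and distance formula below are illustrative conventions, not the paper's generalized variance-decomposition spillover index:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(0)
# Toy daily yield-change series for five hypothetical bond categories.
names = ["treasury", "policy-bank", "local-gov", "corporate", "urban-investment"]
common = rng.normal(size=500)                      # shared market factor
series = np.array([0.7 * common + 0.5 * rng.normal(size=500) for _ in names])

# Map pairwise correlations to distances d = sqrt(2 * (1 - rho)),
# a standard choice for correlation-based financial networks.
rho = np.corrcoef(series)
dist = np.sqrt(np.maximum(2.0 * (1.0 - rho), 0.0))  # clip round-off negatives

# The minimum spanning tree keeps the n - 1 strongest links, exposing the
# backbone along which risk is most likely to propagate.
mst = minimum_spanning_tree(dist).toarray()
edges = [(names[i], names[j]) for i, j in zip(*np.nonzero(mst))]
print(edges)  # 4 edges connecting the 5 categories
```

In the paper, the edge weights come from a spillover variance decomposition rather than plain correlations, but the MST pruning step is the same idea.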
(This article belongs to the Special Issue Entropy-Based Applications in Economics, Finance, and Management)

10 pages, 380 KiB  
Article
Cross-Modality Person Re-Identification Based on Heterogeneous Center Loss and Non-Local Features
by Chengmei Han, Peng Pan, Aihua Zheng and Jin Tang
Entropy 2021, 23(7), 919; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070919 - 20 Jul 2021
Cited by 3 | Viewed by 2199
Abstract
Cross-modality person re-identification is the study of matching images of people across different modalities (RGB modality, IR modality). Given one RGB image of a pedestrian collected under visible light in the daytime, cross-modality person re-identification aims to determine whether the same pedestrian appears in infrared images (IR images) collected by infrared cameras at night, and vice versa. Cross-modality person re-identification can solve the task of pedestrian recognition in low light or at night. This paper aims to improve the degree of similarity for the same pedestrian in two modalities by improving the feature expression ability of the network and designing appropriate loss functions. To implement our approach, we introduce a deep neural network structure combining heterogeneous center loss (HC loss) and a non-local mechanism. On the one hand, this can heighten the performance of feature representation of the feature learning module; on the other hand, it can improve the cross-modality similarity within the class. Experimental data show that the network achieves excellent performance on the SYSU-MM01 dataset. Full article

15 pages, 2570 KiB  
Article
Ischemic Stroke Risk Assessment by Multiscale Entropy Analysis of Heart Rate Variability in Patients with Persistent Atrial Fibrillation
by Ghina Chairina, Kohzoh Yoshino, Ken Kiyono and Eiichi Watanabe
Entropy 2021, 23(7), 918; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070918 - 19 Jul 2021
Cited by 1 | Viewed by 2458
Abstract
It has been recognized that heart rate variability (HRV), defined as the fluctuation of ventricular response intervals in atrial fibrillation (AFib) patients, is not completely random, and its nonlinear characteristics, such as multiscale entropy (MSE), contain clinically significant information. We investigated the relationship between ischemic stroke risk and HRV in a large number of stroke-naïve AFib patients (628 patients), focusing on those who had never developed an ischemic/hemorrhagic stroke before the heart rate measurement. The CHA2DS2-VASc score was calculated from the baseline clinical characteristics, while the HRV analysis was made from morning, afternoon, and evening recordings. Subsequently, we performed the Kaplan–Meier method and the cumulative incidence function with mortality as a competing risk to estimate the survival time function. We found that patients with sample entropy SE(s) ≥ 0.68 at s = 210 s had a significantly higher risk of ischemic stroke occurrence in the morning recording. Meanwhile, the afternoon recording showed that those with SE(s) ≤ 0.76 at s = 240 s and SE(s) ≤ 0.78 at s = 270 s had a significantly lower risk of ischemic stroke occurrence. Therefore, SE(s) at s = 210 s (morning) and 240 s ≤ s ≤ 270 s (afternoon) demonstrated a statistically significant predictive value for ischemic stroke in stroke-naïve AFib patients. Full article
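Sample entropy, the quantity SE(s) thresholded above, measures how unpredictable a series is at a given template length. A minimal self-contained sketch (naive O(n²) implementation with the conventional defaults m = 2 and r = 0.2·SD; the toy signals are not the paper's R-R interval data):

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Naive O(n^2) sample entropy SampEn(m, r) of a 1-D series."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def count_pairs(length):
        # Embed the series into overlapping templates of the given length.
        templates = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
        # Chebyshev distance between every pair of templates.
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=-1)
        # Count pairs within tolerance r, excluding self-matches.
        return (np.sum(d <= r) - len(templates)) / 2

    # SampEn = -ln(A / B): A = matches at length m+1, B = matches at length m.
    return -np.log(count_pairs(m + 1) / count_pairs(m))

rng = np.random.default_rng(1)
irregular = rng.normal(size=300)                    # white noise
regular = np.sin(np.linspace(0, 20 * np.pi, 300))   # periodic signal
print(sample_entropy(irregular) > sample_entropy(regular))  # noise is less regular: True
```

The multiscale variant applies this after coarse-graining the series at successive scales s, which is how a scale-dependent SE(s) curve arises.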

20 pages, 13007 KiB  
Article
Generalized Reversible Data Hiding with Content-Adaptive Operation and Fast Histogram Shifting Optimization
by Limengnan Zhou, Hongyu Han and Hanzhou Wu
Entropy 2021, 23(7), 917; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070917 - 19 Jul 2021
Cited by 3 | Viewed by 2103
Abstract
Reversible data hiding (RDH) has become a hot spot in recent years as it allows both the secret data and the raw host to be perfectly reconstructed, which is quite desirable in sensitive applications requiring no degradation of the host. A lot of RDH algorithms have been designed in a sophisticated, empirical way. It is not easy to extend them to a general case, which, to a certain extent, may have limited their wide-range applicability. Therefore, it motivates us to revisit the conventional RDH algorithms and present a general framework of RDH in this paper. The proposed framework divides the system design of RDH at the data hider side into four important parts, i.e., binary-map generation, content prediction, content selection, and data embedding, so that the data hider can easily design and implement, as well as improve, an RDH system. For each part, we introduce content-adaptive techniques that can benefit the subsequent data-embedding procedure. We also analyze the relationships between these four parts and present different perspectives. In addition, we introduce a fast histogram shifting optimization (FastHiSO) algorithm for data embedding to keep the payload-distortion performance sufficient while reducing the computational complexity. Two RDH algorithms are presented to show the efficiency and applicability of the proposed framework. It is expected that the proposed framework can benefit the design of an RDH system, and the introduced techniques can be incorporated into the design of advanced RDH algorithms. Full article
(This article belongs to the Special Issue Information Hiding and Coding Theory)

21 pages, 1006 KiB  
Article
Low-Latency Short-Packet Transmission over a Large Spatial Scale
by Lei Huang, Xiaoyu Zhao, Wei Chen and H. Vincent Poor
Entropy 2021, 23(7), 916; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070916 - 19 Jul 2021
Cited by 4 | Viewed by 2434
Abstract
Short-packet transmission has attracted considerable attention due to its potential to achieve ultralow latency in automated driving, telesurgery, the Industrial Internet of Things (IIoT), and other applications emerging in the coming era of sixth-generation (6G) wireless networks. In 6G systems, a paradigm-shifting infrastructure is anticipated to provide seamless coverage by integrating low-Earth orbit (LEO) satellite networks, which enable long-distance wireless relaying. However, how to efficiently transmit short packets over a sizeable spatial scale remains an open problem. In this paper, we are interested in low-latency short-packet transmissions between two distant nodes, in which neither propagation delay nor propagation loss can be ignored. Decode-and-forward (DF) relays can be deployed to regenerate packets reliably during their delivery over a long distance, thereby reducing the signal-to-noise ratio (SNR) loss. However, they also cause decoding delay in each hop, the sum of which may become large and cannot be ignored given the stringent latency constraints. This paper presents an optimal relay deployment to minimize the error probability while meeting both the latency and transmission power constraints. Based on an asymptotic analysis, a theoretical performance bound for distant short-packet transmission is also characterized by the optimal distance–latency–reliability tradeoff, which is expected to provide insights into designing integrated LEO satellite communications in 6G. Full article
(This article belongs to the Special Issue Short Packet Communications for 5G and Beyond)

13 pages, 1187 KiB  
Communication
Earth’s Complexity Is Non-Computable: The Limits of Scaling Laws, Nonlinearity and Chaos
by Sergio Rubin and Michel Crucifix
Entropy 2021, 23(7), 915; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070915 - 19 Jul 2021
Cited by 6 | Viewed by 4700
Abstract
Current physics commonly qualifies the Earth system as ‘complex’ because it includes numerous different processes operating over a large range of spatial scales, often modelled as exhibiting non-linear chaotic response dynamics and power scaling laws. This characterization is based on the fundamental assumption that the Earth’s complexity could, in principle, be modelled by (surrogated by) a numerical algorithm if enough computing power were granted. Yet, similar numerical algorithms also surrogate different systems having the same processes and dynamics, such as Mars or Jupiter, despite these being qualitatively different from the Earth system. Here, we argue that understanding the Earth as a complex system requires a consideration of the Gaia hypothesis: the Earth is a complex system because it instantiates life—and therefore an autopoietic, metabolic-repair (M,R) organization—at a planetary scale. This implies that the Earth’s complexity has formal equivalence to a self-referential system that inherently is non-algorithmic and, therefore, cannot be surrogated and simulated in a Turing machine. We discuss the consequences of this, with reference to in-silico climate models, tipping points, planetary boundaries, and planetary feedback loops as units of adaptive evolution and selection. Full article
(This article belongs to the Special Issue Complexity and Evolution)

17 pages, 2542 KiB  
Article
Leadership Hijacking in Docker Swarm and Its Consequences
by Adi Farshteindiker and Rami Puzis
Entropy 2021, 23(7), 914; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070914 - 19 Jul 2021
Cited by 4 | Viewed by 2956
Abstract
With the advent of microservice-based software architectures, an increasing number of modern cloud environments and enterprises use operating system level virtualization, which is often referred to as container infrastructure. Docker Swarm is one of the most popular container orchestration infrastructures, providing high availability and fault tolerance. Occasionally, container escape vulnerabilities are discovered that allow adversaries to execute code on the host operating system and operate within the cloud infrastructure. We show that Docker Swarm is currently not secured against misbehaving manager nodes. This allows a high impact, high probability privilege escalation attack, which we refer to as leadership hijacking, the possibility of which is neglected by the current cloud security literature. Cloud lateral movement and defense evasion payloads allow an adversary to leverage the Docker Swarm functionality to control each and every host in the underlying cluster. We demonstrate an end-to-end attack, in which an adversary with access to an application running on the cluster achieves full control of the cluster. To reduce the probability of a successful high impact attack, container orchestration infrastructures must reduce the trust level of participating nodes and, in particular, incorporate adversary-immune leader election algorithms. Full article
(This article belongs to the Special Issue Swarms and Network Intelligence)

25 pages, 10888 KiB  
Article
Numerical Simulation of Swirl Flow Characteristics of CO2 Hydrate Slurry by Short Twisted Band
by Yongchao Rao, Zehui Liu, Shuli Wang, Lijun Li and Qi Sun
Entropy 2021, 23(7), 913; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070913 - 18 Jul 2021
Cited by 4 | Viewed by 1984
Abstract
The development of oil and gas resources is gradually transferring to the deep sea, and the hydrate plugging of submarine pipelines at high pressures and low temperatures is becoming an important problem for ensuring the safety of pipeline operations. The swirl flow is a new method to expand the boundary of safe hydrate flow. A numerical simulation of the hydrate slurry flow characteristics in a horizontal pipeline with a twisted band has been carried out, and the flow of CO2 hydrate slurry at low concentration has been simulated with the RSM and DPM models. The results show that the heat transfer efficiency is related to Re and the particle concentration. The velocity distribution has the form of symmetrical double peaks, and the peaks finally merge at the center of the pipeline. Vortexes first appear on both sides of the edge of the twisted band, and then move to the middle part of the twisted band. Finally, the vortex center almost coincides with the velocity center. The rotation direction of the hydrate particles is the same as the twist direction of the twisted band; the smaller the twist rate (Y) and the larger Re, the farther away the symmetric vortex lines merge. The initial swirl number is mainly related to Y, but not Re. The swirl flow attenuates exponentially, and its attenuation rate is mainly related to Re, but not Y. Compared with ordinary pipelines, the swirl flow can obviously improve the transportation distance of hydrate slurry. Full article
(This article belongs to the Special Issue Phase Transition and Heat-Mass Transfer of Gas Hydrate in Sediment)

19 pages, 468 KiB  
Article
Channel Quality-Based Optimal Status Update for Information Freshness in Internet of Things
by Fuzhou Peng, Xiang Chen and Xijun Wang
Entropy 2021, 23(7), 912; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070912 - 18 Jul 2021
Cited by 1 | Viewed by 2063
Abstract
This paper investigates the status updating policy for information freshness in Internet of Things (IoT) systems, where the channel quality is fed back to the sensor at the beginning of each time slot. Based on the channel quality, we aim to strike a balance between the information freshness and the update cost by minimizing the weighted sum of the age of information (AoI) and the energy consumption. The optimal status updating problem is formulated as a Markov decision process (MDP), and the structure of the optimal updating policy is investigated. We prove that, given the channel quality, the optimal policy is of a threshold type with respect to the AoI. In particular, the sensor remains idle when the AoI is smaller than the threshold, while the sensor transmits the update packet when the AoI is greater than the threshold. Moreover, the threshold is proven to be a non-increasing function of the channel state. A numerical algorithm for efficiently computing the optimal thresholds is proposed for a special case in which the channel is quantized into two states. Simulation results show that our proposed policy performs better than two baseline policies. Full article
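The threshold structure described above lends itself to a very small simulation. The sketch below computes the time-averaged weighted AoI-plus-energy cost of such a policy for a two-state channel; the thresholds, energy costs, channel probabilities, and weight are made-up illustrative numbers, and transmissions are assumed always successful, unlike the paper's model:

```python
import random

# Threshold-type updating policy: transmit only when the age of
# information (AoI) exceeds a channel-dependent threshold.
THRESHOLDS = {"good": 2, "bad": 4}    # lower threshold when channel is good
ENERGY_COST = {"good": 1.0, "bad": 2.5}
WEIGHT = 0.5                          # weight on energy relative to AoI

def simulate(steps=10_000, seed=0):
    """Average per-slot cost (AoI + weighted energy) of the policy."""
    rng = random.Random(seed)
    aoi, total = 1, 0.0
    for _ in range(steps):
        channel = "good" if rng.random() < 0.6 else "bad"
        if aoi > THRESHOLDS[channel]:
            total += WEIGHT * ENERGY_COST[channel]
            aoi = 1                   # a successful update resets the age
        else:
            aoi += 1                  # stay idle, age keeps growing
        total += aoi
    return total / steps

print(round(simulate(), 3))
```

Sweeping the two thresholds in such a simulation is a brute-force stand-in for the paper's numerical algorithm: the non-increasing-threshold structure means only pairs with the good-channel threshold at most the bad-channel one need be searched.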
(This article belongs to the Special Issue Age of Information: Concept, Metric and Tool for Network Control)

32 pages, 753 KiB  
Article
ϕ-Informational Measures: Some Results and Interrelations
by Steeve Zozor and Jean-François Bercher
Entropy 2021, 23(7), 911; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070911 - 18 Jul 2021
Cited by 2 | Viewed by 2223
Abstract
In this paper, we focus on extended informational measures based on a convex function ϕ: entropies, extended Fisher information, and generalized moments. Both the generalization of the Fisher information and the moments rely on the definition of an escort distribution linked to the (entropic) functional ϕ. We revisit the usual maximum entropy principle, more precisely its inverse problem, starting from the distribution and constraints, which leads to the introduction of state-dependent ϕ-entropies. Then, we examine interrelations between the extended informational measures and generalize relationships such as the Cramér–Rao inequality and the de Bruijn identity in this broader context. In this particular framework, the maximum entropy distributions play a central role. Of course, all the results derived in the paper include the usual ones as special cases. Full article
(This article belongs to the Special Issue Entropies, Divergences, Information, Identities and Inequalities)

23 pages, 5727 KiB  
Article
Robust Vehicle Speed Measurement Based on Feature Information Fusion for Vehicle Multi-Characteristic Detection
by Lei Yang, Jianchen Luo, Xiaowei Song, Menglong Li, Pengwei Wen and Zixiang Xiong
Entropy 2021, 23(7), 910; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070910 - 17 Jul 2021
Cited by 10 | Viewed by 3413
Abstract
A robust vehicle speed measurement system based on feature information fusion for vehicle multi-characteristic detection is proposed in this paper. A vehicle multi-characteristic dataset is constructed. With this dataset, seven CNN-based modern object detection algorithms are trained for vehicle multi-characteristic detection. The FPN-based YOLOv4 is selected as the best vehicle multi-characteristic detection algorithm, which applies feature information fusion of different scales with both rich high-level semantic information and detailed low-level location information. The YOLOv4 algorithm is improved by combining it with an attention mechanism, in which the residual module in YOLOv4 is replaced by the ECA channel attention module with cross-channel interaction. An improved ECA-YOLOv4 object detection algorithm based on both feature information fusion and cross-channel interaction is proposed, which improves the performance of YOLOv4 for vehicle multi-characteristic detection and reduces the model parameter size and FLOPs as well. A multi-characteristic fused speed measurement system based on license plate, logo, and light is designed accordingly. The system performance is verified by experiments. The experimental results show that the speed measurement error rate of the proposed system meets the requirement of the China national standard GB/T 21555-2007, in which the speed measurement error rate should be less than 6%. The proposed system can efficiently enhance the vehicle speed measurement accuracy and effectively improve the vehicle speed measurement robustness. Full article
(This article belongs to the Special Issue Advances in Image Fusion)

17 pages, 11049 KiB  
Article
Tsallis q-Stat and the Evidence of Long-Range Interactions in Soil Temperature Dynamics
by Babalola O. Ogunsua and John A. Laoye
Entropy 2021, 23(7), 909; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070909 - 17 Jul 2021
Viewed by 1903
Abstract
The complexities in the variations of soil temperature and thermal diffusion pose a physical problem that requires more understanding. The quest for a better understanding of the complexities of soil temperature variation has prompted the study of q-statistics in the soil temperature variation, with the view of understanding the underlying dynamics of the temperature variation and thermal diffusivity of the soil. In this work, the values of the Tsallis stationary state q index, known as q-stat, were computed from soil temperature measured at different stations in Nigeria. The intrinsic variations of the soil temperature were derived from the soil temperature time series by a detrending method to remove the influences of other types of variations from the atmosphere. The detrended soil temperature data sets were further analysed to fit the q-Gaussian model. Our results show that our datasets fit the Tsallis Gaussian distributions, with lower values of q-stat during the rainy season and around the wet soil regions of Nigeria; the values of q-stat obtained for the monthly data sets were mostly in the range 1.2 ≤ q ≤ 2.9 for all stations, with very few values of q close to 1.2 for a few stations in the wet season. The distributions obtained from the detrended soil temperature data were mostly found to belong to the class of asymmetric q-Gaussians. The ability of the soil temperature data sets to fit q-Gaussians might be due to the non-extensive statistical nature of the system and/or to the presence of superstatistics. The possible mechanisms responsible for this behaviour are further discussed. Full article

16 pages, 2286 KiB  
Article
Quasi-Static Variation of Power-Law and Log-Normal Distributions of Urban Population
by Atushi Ishikawa, Shouji Fujimoto, Arturo Ramos and Takayuki Mizuno
Entropy 2021, 23(7), 908; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070908 - 17 Jul 2021
Viewed by 2009
Abstract
We analytically derived and confirmed by empirical data the following three relations from the quasi-time-reversal symmetry, Gibrat’s law, and the non-Gibrat’s property observed in the urban population data of France. The first is the relation between the time variation of the power law and the quasi-time-reversal symmetry in the large-scale range of a system that changes quasi-statically. The second is the relation between the time variation of the log-normal distribution and the quasi-time-reversal symmetry in the mid-scale range. The third is the relation among the parameters of log-normal distribution, non-Gibrat’s property, and quasi-time-reversal symmetry. Full article
(This article belongs to the Special Issue Mathematical Analysis of Urban Spatial Networks)
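To make the two regimes above concrete, here is a hedged toy example: a synthetic size distribution with a log-normal mid-scale body and a power-law large-scale tail, and a Hill estimate of the tail exponent (synthetic data and a standard estimator only; not the paper's French population data or its analytical derivation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "city sizes": a log-normal mid-scale body plus a Pareto
# (power-law) large-scale tail, mimicking the two regimes discussed above.
body = rng.lognormal(mean=10.0, sigma=1.0, size=9000)
tail = (rng.pareto(a=1.0, size=1000) + 1.0) * np.quantile(body, 0.99)
sizes = np.concatenate([body, tail])

def hill_exponent(x, k):
    """Hill estimator of the power-law (Pareto) tail exponent from the
    k largest observations."""
    x = np.sort(x)[::-1]
    return 1.0 / np.mean(np.log(x[:k] / x[k]))

# The largest observations are tail-dominated, so the estimate should sit
# near the Pareto exponent used to generate the tail (here 1).
print("estimated tail exponent:", hill_exponent(sizes, k=500))
```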
18 pages, 4774 KiB  
Article
Adaptive Two-Step Bearing-Only Underwater Uncooperative Target Tracking with Uncertain Underwater Disturbances
by Xianghao Hou, Jianbo Zhou, Yixin Yang, Long Yang and Gang Qiao
Entropy 2021, 23(7), 907; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070907 - 16 Jul 2021
Cited by 10 | Viewed by 2176
Abstract
The bearing-only tracking of an underwater uncooperative target can protect maritime territories and allows for the utilization of sea resources. Considering the influences of an unknown underwater environment, this work aimed to estimate 2-D locations and velocities of an underwater target with uncertain underwater disturbances. In this paper, an adaptive two-step bearing-only underwater uncooperative target tracking filter (ATSF) for uncertain underwater disturbances is proposed. Considering the nonlinearities of the target’s kinematics and the bearing-only measurements, in addition to the uncertain noise caused by an unknown underwater environment, the proposed ATSF consists of two major components, namely, an online noise estimator and a robust extended two-step filter. First, using a modified Sage-Husa online noise estimator, the uncertain process and measurement noise are estimated at each tracking step. Then, by adopting an extended state and by using a robust negative matrix-correcting method in conjunction with a regularized Newton-Gauss iteration scheme, the current state of the underwater uncooperative target is estimated. Finally, the proposed ATSF was tested via simulations of a 2-D underwater uncooperative target tracking scenario. The Monte Carlo simulation results demonstrated the reliability and accuracy of the proposed ATSF in bearing-only underwater uncooperative tracking missions. Full article
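The modified Sage-Husa estimator at the core of the first component can be illustrated on a deliberately simple 1-D problem; the sketch below shows only the recursive measurement-noise update with a forgetting factor (the toy model and parameter values are my assumptions, not the paper's ATSF):

```python
import numpy as np

rng = np.random.default_rng(1)
true_R = 4.0
n = 2000
z = rng.normal(0.0, np.sqrt(true_R), size=n)   # noisy measurements of a constant

x, P = 0.0, 10.0     # state estimate and its variance
Q = 1e-4             # small assumed process-noise variance
R_hat = 1.0          # deliberately wrong initial measurement-noise guess
b = 0.995            # forgetting factor of the noise estimator

for k, zk in enumerate(z):
    P = P + Q                                   # time update (random-walk model)
    innov = zk - x                              # innovation
    d = (1.0 - b) / (1.0 - b ** (k + 1))        # Sage-Husa fading weight
    # Recursive noise estimate: innovation variance minus predicted state
    # variance, floored to keep R_hat positive during the transient.
    R_hat = (1.0 - d) * R_hat + d * max(innov ** 2 - P, 1e-6)
    K = P / (P + R_hat)                         # Kalman gain with the adapted R
    x = x + K * innov
    P = (1.0 - K) * P

print("estimated R:", R_hat, "true R:", true_R)
```

The estimator recovers the measurement-noise level online even from a poor initial guess, which is the property the abstract relies on for uncertain underwater disturbances.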
23 pages, 8240 KiB  
Article
Performance Analysis and Optimization of a Series Heat Exchangers Organic Rankine Cycle Utilizing Multi-Heat Sources from a Marine Diesel Engine
by Youyi Li and Tianhao Tang
Entropy 2021, 23(7), 906; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070906 - 16 Jul 2021
Cited by 4 | Viewed by 2220
Abstract
The Organic Rankine Cycle (ORC) is an effective way to recover the waste heat of a marine diesel engine. The aim of the present paper is to analyze and optimize the thermoeconomic performance of a Series Heat Exchangers ORC (SHEORC) for recovering energy from jacket water, scavenge air, and exhaust gas. The three sources are combined into three groups: jacket water (JW)→exhaust gas (EG), scavenge air (SA)→exhaust gas, and jacket water→scavenge air→exhaust gas. The influence of fluid mass flow rate, evaporation pressure, and heat source recovery proportion on the thermal and economic performance of the SHEORC was studied. A single-objective optimization with power output as the objective and a multi-objective optimization with exergy efficiency and levelized cost of energy (LCOE) as the objectives were carried out. The analysis shows that in the jacket water→exhaust gas and jacket water→scavenge air→exhaust gas combinations, there is an optimal heat recovery proportion at which the SHEORC obtains its best performance. The optimization results showed that R245ca has the best thermoeconomic performance in all three source combinations. With scavenge air→exhaust gas, the power output, exergy efficiency, and LCOE are 354.19 kW, 59.02%, and 0.1150 $/kWh, respectively. Integrating the jacket water into the SA→EG group would not increase the power output, but would decrease the LCOE. Full article
(This article belongs to the Section Thermodynamics)
20 pages, 4267 KiB  
Article
Memory Effects in Quantum Dynamics Modelled by Quantum Renewal Processes
by Nina Megier, Manuel Ponzi, Andrea Smirne and Bassano Vacchini
Entropy 2021, 23(7), 905; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070905 - 16 Jul 2021
Cited by 3 | Viewed by 2185
Abstract
Simple, controllable models play an important role in learning how to manipulate and control quantum resources. We focus here on quantum non-Markovianity and model the evolution of open quantum systems by quantum renewal processes. This class of quantum dynamics provides us with a phenomenological approach to characterise dynamics with a variety of non-Markovian behaviours, here described in terms of the trace distance between two reduced states. By adopting a trajectory picture for the open quantum system evolution, we analyse how non-Markovianity is influenced by the constituents defining the quantum renewal process, namely the time-continuous part of the dynamics, the type of jumps and the waiting time distributions. We focus not only on the mere value of the non-Markovianity measure, but also on how different features of the trace distance evolution are altered, including times and number of revivals. Full article
(This article belongs to the Special Issue Processes with Memory in Natural and Social Sciences)
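The trace distance used above as the non-Markovianity witness is straightforward to compute for finite-dimensional states; here is a minimal sketch of the generic definition (not the paper's renewal-process simulations):

```python
import numpy as np

def trace_distance(rho, sigma):
    """D(rho, sigma) = (1/2) ||rho - sigma||_1: half the sum of the absolute
    eigenvalues of the Hermitian difference. Revivals of D during an open-system
    evolution are the signature of memory effects in the sense used above."""
    return 0.5 * float(np.sum(np.abs(np.linalg.eigvalsh(rho - sigma))))

# Two orthogonal qubit states are perfectly distinguishable: D = 1.
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)
rho1 = np.array([[0, 0], [0, 1]], dtype=complex)
print(trace_distance(rho0, rho1))   # 1.0

# A pure state versus the maximally mixed state: D = 1/2.
mixed = np.eye(2, dtype=complex) / 2.0
print(trace_distance(rho0, mixed))  # 0.5
```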
33 pages, 6636 KiB  
Article
Secret Communication Systems Using Chaotic Wave Equations with Neural Network Boundary Conditions
by Yuhan Chen, Hideki Sano, Masashi Wakaiki and Takaharu Yaguchi
Entropy 2021, 23(7), 904; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070904 - 16 Jul 2021
Cited by 7 | Viewed by 2285
Abstract
In a secret communication system using chaotic synchronization, the communication information is embedded in a signal that behaves as chaos and is sent to the receiver to retrieve the information. In a previous study, a chaotic synchronous system was developed by integrating the wave equation with the van der Pol boundary condition, which has only three parameters, not enough for security. In this study, we replace the nonlinear boundary condition with an artificial neural network, thereby making the transmitted information difficult to leak. The neural network is divided into two parts; the first half is used as the left boundary condition of the wave equation and the second half as that on the right boundary, thus replacing the original nonlinear boundary condition. We also show the results for both monochrome and color images and evaluate the security performance. In particular, it is shown that the encrypted images are almost identical regardless of the input images. The learning performance of the neural network is also investigated. The calculated Lyapunov exponent shows that the learned neural network causes some chaotic vibration effect. The information in the original image is completely invisible when viewed through the image obtained after being concealed by the proposed system. Some security tests are also performed. The proposed method is designed in such a way that the transmitted images are encrypted into almost identical images of waves, thereby preventing the retrieval of information from the original image. The numerical results show that the encrypted images are certainly almost identical, which supports the security of the proposed method. Full article
(This article belongs to the Special Issue Dynamical Systems, Differential Equations and Applications)
13 pages, 318 KiB  
Article
Relative Entropy in Determining Degressive Proportional Allocations
by Katarzyna Cegiełka, Piotr Dniestrzański, Janusz Łyko, Arkadiusz Maciuk and Maciej Szczeciński
Entropy 2021, 23(7), 903; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070903 - 15 Jul 2021
Viewed by 1490
Abstract
The principle of degressively proportional apportionment of goods, being a compromise between equality and proportionality, admits many different allocation rules. Agents with smaller entitlements are more interested in an allocation that is as close to equality as possible, while those with greater entitlements prefer an allocation as close to proportionality as possible. Using relative entropy to quantify the inequity of an allocation, this paper identifies an allocation that reconciles these two contradictory approaches by symmetrizing the inequities perceived by the smallest and largest agents participating in the apportionment. First, based on selected properties, the set of potential allocation rules was reduced to those generated by power functions. Then, it was shown that there exists a power function whose exponent generates the allocation symmetrizing the relative entropy with respect to the equal and proportional allocations. As a result, all agents are more inclined to accept the proposed allocation regardless of the size of their entitlements. The exponent found in this way reveals a significant relationship between the problem under study and the well-known Theil indices of inequality, so the problem may also be viewed from that perspective. Full article
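A hedged numerical sketch of the idea: generate allocations from a power function of the entitlements and bisect for the exponent at which the relative entropies to the equal and proportional allocations coincide (the weights and the bisection are illustrative; the paper's axiomatic derivation is not reproduced here):

```python
import numpy as np

def kl(p, q):
    """Relative entropy (Kullback-Leibler divergence) D(p || q)."""
    return float(np.sum(p * np.log(p / q)))

def power_allocation(weights, t):
    """Allocation generated by a power function: shares proportional to
    weights**t, with t in [0, 1] (t=0: equal, t=1: proportional)."""
    a = weights ** t
    return a / a.sum()

def symmetrizing_exponent(weights, tol=1e-10):
    """Bisect for the exponent t at which the allocation is equally far
    (in relative entropy) from the equal and proportional allocations."""
    n = len(weights)
    equal = np.full(n, 1.0 / n)
    prop = weights / weights.sum()
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        t = 0.5 * (lo + hi)
        alloc = power_allocation(weights, t)
        if kl(alloc, equal) < kl(alloc, prop):
            lo = t        # still closer to equality: increase t
        else:
            hi = t
    return 0.5 * (lo + hi)

w = np.array([1.0, 2.0, 5.0, 20.0])   # hypothetical entitlements of four agents
t_star = symmetrizing_exponent(w)
a = power_allocation(w, t_star)
print(t_star, a)
```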
19 pages, 7723 KiB  
Article
3D Underwater Uncooperative Target Tracking for a Time-Varying Non-Gaussian Environment by Distributed Passive Underwater Buoys
by Xianghao Hou, Jianbo Zhou, Yixin Yang, Long Yang and Gang Qiao
Entropy 2021, 23(7), 902; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070902 - 15 Jul 2021
Cited by 5 | Viewed by 1724
Abstract
Accurate 3D passive tracking of an underwater uncooperative target is of great significance for making use of sea resources as well as for ensuring the safety of maritime areas. In this paper, a 3D passive underwater uncooperative target tracking problem for a time-varying non-Gaussian environment is studied. To overcome the low observability inherent in the passive target tracking problem, a distributed passive underwater buoy observing system is considered, and the optimal topology of the distributed measurement system is designed based on nonlinear system observability analysis theory and the Cramer–Rao lower bound (CRLB) analysis method. Then, considering that the unknown underwater environment leads to time-varying non-Gaussian disturbances for both the target's dynamics and the measurements, a robust optimal nonlinear estimator, namely the adaptive particle filter (APF), is proposed. Based on Bayesian posterior probability and Monte Carlo techniques, the proposed algorithm uses real-time optimal estimation to characterise the complex noise online and tackle the underwater uncooperative target tracking problem. Finally, the proposed algorithm was tested with simulated data, and comprehensive comparisons along with detailed discussions are made to demonstrate the effectiveness of the proposed APF. Full article
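The Monte Carlo machinery underlying the proposed APF can be illustrated with a minimal bootstrap particle filter on a 1-D target observed through a nonlinear, bearing-like measurement (an illustrative sketch with assumed noise levels, not the adaptive filter of the paper):

```python
import numpy as np

rng = np.random.default_rng(42)
T, N = 100, 2000            # time steps, particles
q_std, r_std = 0.1, 0.05    # process / measurement noise (assumed known here)

# Simulate a 1-D random-walk target observed through a nonlinear measurement.
x_true = np.cumsum(rng.normal(0.0, q_std, T))
z = np.arctan(x_true) + rng.normal(0.0, r_std, T)

particles = rng.normal(0.0, 1.0, N)
estimates = np.empty(T)
for t in range(T):
    particles = particles + rng.normal(0.0, q_std, N)   # propagate dynamics
    w = np.exp(-0.5 * ((z[t] - np.arctan(particles)) / r_std) ** 2)
    w /= w.sum()                                        # normalised weights
    estimates[t] = np.dot(w, particles)                 # posterior-mean estimate
    particles = particles[rng.choice(N, size=N, p=w)]   # multinomial resampling

rmse = np.sqrt(np.mean((estimates - x_true) ** 2))
print("RMSE:", rmse)
```

The adaptive element of the APF (estimating the noise statistics online rather than assuming them, as done here) is exactly what the abstract adds on top of this basic scheme.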
28 pages, 6086 KiB  
Article
Prediction of Liner Metal Temperature of an Aeroengine Combustor with Multi-Physics Scale-Resolving CFD
by Davide Bertini, Lorenzo Mazzei and Antonio Andreini
Entropy 2021, 23(7), 901; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070901 - 15 Jul 2021
Cited by 2 | Viewed by 2398
Abstract
Computational Fluid Dynamics is a fundamental tool to simulate the flow field and the multi-physics nature of the phenomena involved in gas turbine combustors, supporting their design since the very preliminary phases. Standard steady-state RANS turbulence models provide a reasonable prediction, despite some well-known limitations in reproducing turbulent mixing in highly unsteady flows. Their affordable cost is ideal in the preliminary design steps, whereas, in the detailed phase of the design process, turbulence scale-resolving methods (such as LES or similar approaches) can be preferred to significantly improve the accuracy. Nevertheless, in dealing with multi-physics and multi-scale problems, as for Conjugate Heat Transfer (CHT) in the presence of radiation, transient approaches are not always affordable, and appropriate numerical treatments are necessary to properly account for the huge range of characteristic scales in space and time that occur when turbulence is resolved and heat conduction is simulated contextually. The present work describes an innovative methodology to perform CHT simulations accounting for multi-physics and multi-scale problems. This methodology, named U-THERM3D, is applied to the metal temperature prediction of an annular aeroengine lean burn combustor. The theoretical formulations of the tool are described, together with its numerical implementation in the commercial CFD code ANSYS Fluent. The proposed approach is based on a time de-synchronization of the involved time-dependent physics, permitting a significant speed-up of the calculation with respect to a fully coupled strategy, while preserving the effect of unsteady heat transfer on the final time-averaged predicted metal temperature. The results of some preliminary assessment tests of its consistency and accuracy are reported before showing its exploitation on the real combustor. The results are compared against steady-state calculations and experimental data obtained by full annular tests at real-scale conditions. The work confirms the importance of high-fidelity CFD approaches for the aerothermal prediction of liner metal temperature. Full article
(This article belongs to the Special Issue Computational Fluid Dynamics and Conjugate Heat Transfer)
7 pages, 496 KiB  
Article
Possibilities of Practical Use of Historical Distributions of Ash, Sulfur and Mercury Contents in Commercial Steam Coal of the USCB
by Ireneusz Pyka, Wojciech Kempa and Krzysztof Wierzchowski
Entropy 2021, 23(7), 900; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070900 - 15 Jul 2021
Viewed by 1856
Abstract
In the process of extracting hard coal, extensive databases are created on its quality parameters. A statistical assessment was made of the ash, sulfur, and mercury content of commercial coals produced in the Upper Silesian Coal Basin (USCB). The statistical methods applied (non-parametric compatibility tests for two populations, parametric significance tests, and non-parametric compatibility tests for three populations) showed that the distributions of ash and sulfur content in 2014 and 2015 are comparable and that the average values are similar. Statistical tests indicated significant differences in the mercury content distributions and their variances. This demonstrates the need for ongoing monitoring of mercury content in commercial coals, as prediction of mercury content from historical data is hardly possible. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
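A non-parametric compatibility test for two populations of the kind applied above can be sketched with the two-sample Kolmogorov-Smirnov statistic on synthetic data (the samples and the 0.05 critical value below are illustrative; the paper's actual test battery and coal data are not reproduced):

```python
import numpy as np

def ks_statistic(x, y):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum vertical distance
    between the two empirical CDFs, evaluated at every observed point."""
    data = np.concatenate([x, y])
    cdf_x = np.searchsorted(np.sort(x), data, side="right") / len(x)
    cdf_y = np.searchsorted(np.sort(y), data, side="right") / len(y)
    return float(np.max(np.abs(cdf_x - cdf_y)))

rng = np.random.default_rng(7)
a = rng.normal(10.0, 2.0, 500)   # e.g. ash content in year 1 (synthetic)
b = rng.normal(10.0, 2.0, 500)   # year 2, drawn from the same distribution
c = rng.normal(12.0, 3.0, 500)   # a genuinely different distribution

# Asymptotic critical value at alpha = 0.05 for sample sizes n = m = 500:
# statistics above it reject the hypothesis of a common distribution.
crit = np.sqrt(-0.5 * np.log(0.05 / 2.0)) * np.sqrt((500 + 500) / (500 * 500))
print(ks_statistic(a, b), ks_statistic(a, c), crit)
```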
8 pages, 229 KiB  
Article
The Shannon–McMillan Theorem Proves Convergence to Equiprobability of Boltzmann’s Microstates
by Arnaldo Spalvieri
Entropy 2021, 23(7), 899; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070899 - 15 Jul 2021
Cited by 1 | Viewed by 1975
Abstract
This paper shows that, for a large number of particles and for distinguishable and non-interacting identical particles, convergence to equiprobability of the W microstates of the famous Boltzmann–Planck entropy formula S = k log(W) is proved by the Shannon–McMillan theorem, a cornerstone of information theory. This result further strengthens the link between information theory and statistical mechanics. Full article
(This article belongs to the Section Statistical Physics)
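The Shannon-McMillan (asymptotic equipartition) property invoked above can be demonstrated numerically: for long i.i.d. sequences, the per-symbol log-probability concentrates on the entropy, so essentially all probability mass sits on roughly 2^(nH) near-equiprobable typical sequences (a generic Bernoulli illustration, not the paper's derivation):

```python
import numpy as np

rng = np.random.default_rng(3)
p = 0.3
H = -p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p)   # binary entropy, ~0.8813 bits

for n in (100, 10_000, 1_000_000):
    seq = rng.random(n) < p                  # i.i.d. Bernoulli(p) sequence
    k = int(seq.sum())                       # number of ones
    log2_prob = k * np.log2(p) + (n - k) * np.log2(1.0 - p)
    # -(1/n) log2 P(sequence) concentrates on H as n grows (the AEP).
    print(n, -log2_prob / n, "->", H)
```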
16 pages, 3314 KiB  
Article
Supervised Domain Adaptation for Automated Semantic Segmentation of the Atrial Cavity
by Marta Saiz-Vivó, Adrián Colomer, Carles Fonfría, Luis Martí-Bonmatí and Valery Naranjo
Entropy 2021, 23(7), 898; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070898 - 14 Jul 2021
Cited by 3 | Viewed by 2445
Abstract
Atrial fibrillation (AF) is the most common cardiac arrhythmia. At present, cardiac ablation is the main treatment procedure for AF. To guide and plan this procedure, it is essential for clinicians to obtain patient-specific 3D geometrical models of the atria. For this, there is an interest in automatic image segmentation algorithms, such as deep learning (DL) methods, as opposed to manual segmentation, an error-prone and time-consuming method. However, to optimize DL algorithms, many annotated examples are required, increasing acquisition costs. The aim of this work is to develop automatic and high-performance computational models for left and right atrium (LA and RA) segmentation from a few labelled MRI volumetric images with a 3D Dual U-Net algorithm. For this, a supervised domain adaptation (SDA) method is introduced to transfer knowledge from late gadolinium enhanced (LGE) MRI volumetric training samples (80 LA annotated samples) to a network trained with balanced steady-state free precession (bSSFP) MR images with a limited number of annotations (19 RA and LA annotated samples). The resulting knowledge-transferred SDA model outperformed the same network trained from scratch in both RA (Dice = 0.9160) and LA (Dice = 0.8813) segmentation tasks. Full article
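The Dice coefficient used to score the segmentations above measures the overlap of two masks; here is a minimal sketch on toy 2-D masks (the masks are illustrative, not atrial data):

```python
import numpy as np

def dice(pred, target):
    """Dice coefficient between two binary segmentation masks:
    2|A ∩ B| / (|A| + |B|), the metric behind the scores reported above."""
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    denom = pred.sum() + target.sum()
    return 2.0 * inter / denom if denom else 1.0

# Toy 8x8 masks: a square "atrium" and a prediction shifted down by one pixel.
target = np.zeros((8, 8), dtype=bool)
target[2:6, 2:6] = True            # 16 pixels
pred = np.zeros((8, 8), dtype=bool)
pred[3:7, 2:6] = True              # 16 pixels, 12 overlapping
print(dice(pred, target))          # 2*12 / (16+16) = 0.75
```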
13 pages, 1607 KiB  
Article
The Renewed Role of Sweep Functions in Noisy Shortcuts to Adiabaticity
by Michele Delvecchio, Francesco Petiziol and Sandro Wimberger
Entropy 2021, 23(7), 897; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070897 - 14 Jul 2021
Cited by 1 | Viewed by 2309
Abstract
We study the robustness of different sweep protocols for accelerated adiabatic following in the presence of static errors and of dissipative and dephasing phenomena. While in the noise-free case counterdiabatic driving is, by definition, insensitive to the form of the original sweep function, this property may be lost when the quantum system is open. We indeed observe that, for the decay and dephasing channels investigated here, the performance of the system becomes highly dependent on the sweep function. Our findings are relevant for the experimental implementation of robust shortcuts-to-adiabaticity techniques for the control of quantum systems. Full article
(This article belongs to the Special Issue Shortcuts to Adiabaticity)
10 pages, 1911 KiB  
Article
Information Theory Based Evaluation of the RC4 Stream Cipher Outputs
by Evaristo José Madarro-Capó , Carlos Miguel Legón-Pérez , Omar Rojas and Guillermo Sosa-Gómez
Entropy 2021, 23(7), 896; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070896 - 14 Jul 2021
Cited by 3 | Viewed by 2262
Abstract
This paper presents a criterion, based on information theory, to measure the amount of average information provided by the sequences of outputs of the RC4 on the internal state. The test statistic used is the sum of the maximum plausible estimates of the entropies H(j_t|z_t), corresponding to the probability distributions P(j_t|z_t) of the sequences of random variables (j_t)_{t∈T} and (z_t)_{t∈T}, independent but not identically distributed, where z_t are the known values of the outputs, while j_t is one of the unknown elements of the internal state of the RC4. It is experimentally demonstrated that the test statistic allows for determining the most vulnerable RC4 outputs, and it is proposed to be used as a vulnerability metric for each RC4 output sequence concerning the iterative probabilistic attack. Full article
(This article belongs to the Special Issue Types of Entropies and Divergences with Their Applications)
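The conditional entropies summed in the test statistic can be illustrated with a plug-in estimator on synthetic data in which an output z leaks partial information about a hidden index j (the leakage model and the estimator here are illustrative assumptions, not the RC4 analysis itself):

```python
import numpy as np

def conditional_entropy(j, z):
    """Plug-in estimate of H(j | z) in bits from paired samples: the average,
    over observed z values, of the entropy of j within each z bin."""
    H = 0.0
    n = len(z)
    for zv in np.unique(z):
        mask = z == zv
        pz = mask.sum() / n
        _, counts = np.unique(j[mask], return_counts=True)
        pj = counts / counts.sum()
        H += pz * -np.sum(pj * np.log2(pj))
    return H

rng = np.random.default_rng(5)
n = 200_000
j = rng.integers(0, 256, n)             # hidden internal index (RC4-like)
z = (j + rng.integers(0, 4, n)) % 256   # output leaking partial information
# Given z, the hidden j is narrowed to 4 equally likely values: H(j|z) ~ 2 bits.
print(conditional_entropy(j, z))
```

Lower values of such an estimate single out the outputs that reveal the most about the internal state, which is the sense in which the abstract's statistic ranks RC4 output vulnerability.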
20 pages, 321 KiB  
Article
Entropy, Information, and the Updating of Probabilities
by Ariel Caticha
Entropy 2021, 23(7), 895; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070895 - 14 Jul 2021
Cited by 7 | Viewed by 2553
Abstract
This paper is a review of a particular approach to the method of maximum entropy as a general framework for inference. The discussion emphasizes pragmatic elements in the derivation. An epistemic notion of information is defined in terms of its relation to the Bayesian beliefs of ideally rational agents. The method of updating from a prior to posterior probability distribution is designed through an eliminative induction process. The logarithmic relative entropy is singled out as a unique tool for updating (a) that is of universal applicability, (b) that recognizes the value of prior information, and (c) that recognizes the privileged role played by the notion of independence in science. The resulting framework—the ME method—can handle arbitrary priors and arbitrary constraints. It includes the MaxEnt and Bayes’ rules as special cases and, therefore, unifies entropic and Bayesian methods into a single general inference scheme. The ME method goes beyond the mere selection of a single posterior, and also addresses the question of how much less probable other distributions might be, which provides a direct bridge to the theories of fluctuations and large deviations. Full article
(This article belongs to the Special Issue The Statistical Foundations of Entropy)
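The MaxEnt special case mentioned above can be sketched on Jaynes' classic die example: maximise entropy over the faces subject to a prescribed mean, which yields an exponential-family distribution whose Lagrange multiplier can be found by bisection (a generic illustration, not the paper's full ME framework):

```python
import numpy as np

x = np.arange(1, 7)          # faces of a die
target_mean = 4.5            # the prescribed expectation-value constraint

def mean_of(lam):
    """Mean of the MaxEnt distribution p_i ∝ exp(-lam * x_i); this mean is
    strictly decreasing in lam (its derivative is -Var(x) < 0)."""
    p = np.exp(-lam * x)
    p /= p.sum()
    return p @ x

# Bisect for the Lagrange multiplier that matches the constraint.
lo, hi = -10.0, 10.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_of(mid) > target_mean:
        lo = mid             # mean too large: need a larger lam
    else:
        hi = mid

lam = 0.5 * (lo + hi)
p = np.exp(-lam * x)
p /= p.sum()
print(lam, p, p @ x)
```

The distribution tilts exponentially toward the high faces, the standard MaxEnt answer for a die whose average roll is constrained to 4.5.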
16 pages, 394 KiB  
Article
Transparent Memory Tests Based on the Double Address Sequences
by Ireneusz Mrozek and Vyacheslav N. Yarmolik
Entropy 2021, 23(7), 894; https://0-doi-org.brum.beds.ac.uk/10.3390/e23070894 - 14 Jul 2021
Cited by 3 | Viewed by 1513
Abstract
An important achievement in the functional diagnostics of memory devices is the development and application of so-called transparent testing methods. This is especially important for modern computer systems, such as embedded systems, systems and networks on chips, on-board computer applications, network servers, and automated control systems that require periodic testing of their components. This article analyzes the effectiveness of existing transparent tests based on the use of the properties of data stored in the memory, such as changing data and their symmetry. As a new approach for constructing transparent tests, we propose to use modified address sequences with duplicate addresses to reduce the time complexity of tests and increase their diagnostic abilities. Full article
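The core idea of transparent testing, using the memory's existing contents and restoring them afterwards, can be sketched with a toy complement-and-restore test and a CRC signature (illustrative only; the paper's double-address-sequence tests are considerably more capable):

```python
import zlib

def transparent_march_test(memory, fault_injector=None):
    """Toy transparent test: complement every cell, then complement back, so a
    fault-free memory ends with its original contents. A fault occurring in
    between corrupts the restored data and shows up as a signature mismatch.
    Returns True if the memory passed (signature unchanged)."""
    reference = zlib.crc32(bytes(memory))   # signature of the initial contents
    n = len(memory)
    for a in range(n):                      # ascending pass: write complement
        memory[a] ^= 0xFF
    if fault_injector:                      # optional simulated fault
        fault_injector(memory)
    for a in reversed(range(n)):            # descending pass: restore data
        memory[a] ^= 0xFF
    return zlib.crc32(bytes(memory)) == reference

mem = bytearray(b"transparent test keeps user data intact")
print(transparent_march_test(bytearray(mem)))          # fault-free: passes
print(transparent_march_test(bytearray(mem),
      fault_injector=lambda m: m.__setitem__(5, 0x00)))  # injected fault: fails
```

Because the test works on whatever data the memory already holds, no save/restore of user data is needed, which is what makes such tests attractive for the periodic in-field testing the abstract describes.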