
Applications of Information Theory in the Geosciences

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory, Probability and Statistics".

Deadline for manuscript submissions: closed (31 July 2016) | Viewed by 140157

Special Issue Editor


Prof. Benjamin L. Ruddell
Guest Editor
School of Informatics, Computing, and Cyber Systems, Northern Arizona University, Flagstaff, AZ, USA
Interests: complex systems; information theory; climate and urban microclimate; ecohydrology; water resources; water policy; statistics; engineering ethics; environmental data informatics; engineering education

Special Issue Information

Dear Colleagues,

Information Theory is gaining many new applications in broad areas of science, particularly in the domain of Complex Adaptive Systems. These new applications often blend theoretical developments of Information Theory with innovative applications to complex-systems problems in the geosciences. This Special Issue emphasizes research that addresses geoscience problems using Information Theory approaches, whether by introducing a novel development of Information Theory for a specific application or by solving a new geoscience problem using the tools of Information Theory. Submissions at the boundaries of Information Theory, the geosciences, and other disciplines are also welcome.

Prof. Benjamin L. Ruddell
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • geoscience
  • life science
  • physics
  • complex adaptive systems
  • Shannon entropy
  • information theory
  • nonlinearity
  • statistics
  • applications

Published Papers (21 papers)


Research

Article
Entropy-Based Experimental Design for Optimal Model Discrimination in the Geosciences
by Wolfgang Nowak and Anneli Guthke
Entropy 2016, 18(11), 409; https://doi.org/10.3390/e18110409 - 17 Nov 2016
Cited by 27 | Viewed by 6280
Abstract
Choosing between competing models lies at the heart of scientific work, and is a frequent motivation for experimentation. Optimal experimental design (OD) methods maximize the benefit of experiments towards a specified goal. We advance and demonstrate an OD approach to maximize the information gained towards model selection. We make use of so-called model choice indicators, which are random variables with an expected value equal to Bayesian model weights. Their uncertainty can be measured with Shannon entropy. Since the experimental data are still random variables in the planning phase of an experiment, we use mutual information (the expected reduction in Shannon entropy) to quantify the information gained from a proposed experimental design. For implementation, we use the Preposterior Data Impact Assessor framework (PreDIA), because it is free of the lower-order approximations of mutual information often found in the geosciences. In comparison to other studies in statistics, our framework is not restricted to sequential design or to discrete-valued data, and it can handle measurement errors. As an application example, we optimize an experiment about the transport of contaminants in clay, featuring the problem of choosing between competing isotherms to describe sorption. We compare the results of optimizing towards maximum model discrimination with an alternative OD approach that minimizes the overall predictive uncertainty under model choice uncertainty.
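
As a toy illustration of the core quantity here, the sketch below estimates the mutual information between a model-choice indicator and a yet-unobserved measurement by Monte Carlo. The two rival Gaussian "models" and all numbers are invented; nothing of the authors' PreDIA framework is reproduced.

```python
# Sketch: mutual information between a model-choice indicator M and a
# hypothetical observation Y, estimated by Monte Carlo (toy example only).
import numpy as np

rng = np.random.default_rng(0)

# Two competing "models" predicting an observable at a candidate design
# point, each with Gaussian measurement noise (all numbers invented).
def predict(model, n):
    mean = {0: 1.0, 1: 1.6}[model]           # rival model predictions
    return rng.normal(mean, 0.5, size=n)      # noisy synthetic data

prior = np.array([0.5, 0.5])                  # Bayesian model weights
n = 100_000
models = rng.choice(2, size=n, p=prior)       # sample the indicator M
y = np.where(models == 0, predict(0, n), predict(1, n))

# Posterior P(M=0 | y); the Gaussian normalizing constant cancels.
def lik(y, mean):
    return np.exp(-0.5 * ((y - mean) / 0.5) ** 2)

p0 = prior[0] * lik(y, 1.0)
p1 = prior[1] * lik(y, 1.6)
post0 = p0 / (p0 + p1)

def H(p):                                     # binary Shannon entropy (bits)
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

prior_H = H(prior[0])                         # entropy of model weights
exp_post_H = H(post0).mean()                  # expected posterior entropy
print(f"I(M;Y) = {prior_H - exp_post_H:.3f} bits")  # information gained
```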

Article
Global Atmospheric Dynamics Investigated by Using Hilbert Frequency Analysis
by Dario A. Zappalà, Marcelo Barreiro and Cristina Masoller
Entropy 2016, 18(11), 408; https://doi.org/10.3390/e18110408 - 16 Nov 2016
Cited by 6 | Viewed by 4967
Abstract
The Hilbert transform is a well-known tool of time series analysis that has been widely used to investigate oscillatory signals that resemble a noisy periodic oscillation, because it allows instantaneous phase and frequency to be estimated, which in turn uncovers interesting properties of the underlying process that generates the signal. Here we use this tool to analyze atmospheric data: we consider daily-averaged Surface Air Temperature (SAT) time series recorded over a regular grid of locations covering the Earth’s surface. From each SAT time series, we calculate the instantaneous frequency time series by considering the Hilbert analytic signal. The properties of the obtained frequency data set are investigated by plotting the map of the average frequency and the map of the standard deviation of the frequency fluctuations. The average frequency map reveals well-defined large-scale structures: in the extra-tropics, the average frequency in general corresponds to the expected one-year period of solar forcing, while in the tropics, a different behaviour is found, with particular regions having a faster average frequency. In the standard deviation map, large-scale structures are also found, which tend to be located over regions of strong annual precipitation. Our results demonstrate that Hilbert analysis of SAT time series uncovers meaningful information, and is therefore a promising tool for the study of other climatological variables.
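
The Hilbert step itself is compact. Below is a minimal sketch on a synthetic annual temperature cycle; real SAT series would be preprocessed as the authors describe.

```python
# Sketch: instantaneous frequency of a noisy annual cycle via the Hilbert
# analytic signal. Synthetic stand-in data, not the SAT grid of the paper.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(1)
days = np.arange(4 * 365)
sat = 10 * np.cos(2 * np.pi * days / 365) + rng.normal(0, 1, days.size)

analytic = hilbert(sat - sat.mean())          # analytic signal x + i*H[x]
phase = np.unwrap(np.angle(analytic))         # instantaneous phase (rad)
freq = np.diff(phase) / (2 * np.pi)           # cycles per day

print("mean period (days):", 1 / freq.mean())    # ~365 for this toy series
print("std of frequency fluctuations:", freq.std())
```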

Article
Fuzzy Shannon Entropy: A Hybrid GIS-Based Landslide Susceptibility Mapping Method
by Majid Shadman Roodposhti, Jagannath Aryal, Himan Shahabi and Taher Safarrad
Entropy 2016, 18(10), 343; https://doi.org/10.3390/e18100343 - 27 Sep 2016
Cited by 74 | Viewed by 9954
Abstract
Assessing Landslide Susceptibility Mapping (LSM) contributes to reducing the risk of living with landslides. Handling the vagueness associated with LSM is a challenging task. Here we show the application of a hybrid GIS-based LSM. The hybrid approach embraces fuzzy membership functions (FMFs) in combination with Shannon entropy, a well-known information theory-based method. Nine landslide-related criteria, along with an inventory of landslides containing 108 recent and historic landslide points, are used to prepare a susceptibility map. A random split into training (≈70%) and testing (≈30%) samples is used for training and validation of the LSM model. The study area, Izeh, is located in the Khuzestan province of Iran, a highly susceptible landslide zone. The performance of the hybrid method is evaluated using receiver operating characteristic (ROC) curves in combination with the area under the curve (AUC). With an AUC of 0.934, the proposed hybrid method outperforms a previous study of the same study area and dataset that used an extended fuzzy multi-criteria evaluation built on decision makers’ subjective judgements (AUC = 0.894).
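
The Shannon-entropy half of the hybrid can be sketched as the classic entropy weighting of a criteria matrix; the fuzzy membership scoring is omitted here, and the matrix below is random stand-in data.

```python
# Sketch: Shannon-entropy weighting of landslide conditioning criteria.
# The fuzzy membership step of the paper is skipped; X is invented.
import numpy as np

# rows = map units (e.g., pixels), columns = conditioning criteria,
# already rescaled to [0, 1] (stand-in for fuzzy membership values)
X = np.random.default_rng(2).random((500, 9))

P = X / X.sum(axis=0)                         # normalize each criterion
k = 1.0 / np.log(X.shape[0])
e = -k * (P * np.log(P + 1e-12)).sum(axis=0)  # entropy of each criterion
w = (1 - e) / (1 - e).sum()                   # low entropy -> high weight

susceptibility = X @ w                        # weighted linear aggregation
print("criterion weights:", np.round(w, 3))
```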

Article
Application of Information Theory for an Entropic Gradient of Ecological Sites
by Kürşad Özkan
Entropy 2016, 18(10), 340; https://doi.org/10.3390/e18100340 - 22 Sep 2016
Cited by 4 | Viewed by 4603
Abstract
The present study was carried out to compute straightforward formulations of information entropy for ecological sites and to arrange the sites along ordination axes using the values of those entropic measures. Data on plant communities taken from six sites in the Dedegül Mountain sub-district and the Sultan Mountain sub-district of the Beyşehir Watershed were examined. Firstly, entropic measures (i.e., marginal entropy, joint entropy, conditional entropy and mutual entropy) were computed for each of the sites. Next, principal component analysis (PCA) was applied to the data composed of the values of those entropic measures. The arrangement of the sites along the first component axis of the PCA was found meaningful from an ecological point of view, because it illustrates the climatic differences between the sub-districts.
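
A minimal sketch of this pipeline, assuming the four entropic measures are computed from a two-way abundance table per site (the table shapes and counts are invented):

```python
# Sketch: marginal, joint, conditional and mutual entropy per site,
# followed by PCA over the resulting measure matrix.
import numpy as np

def entropic_measures(counts):
    """H(X), H(X,Y), H(X|Y) and I(X;Y) in bits from a two-way table."""
    p = counts / counts.sum()
    px, py = p.sum(axis=1), p.sum(axis=0)
    def H(q):
        q = q[q > 0]
        return -(q * np.log2(q)).sum()
    Hx, Hy, Hxy = H(px), H(py), H(p.ravel())
    return np.array([Hx, Hxy, Hxy - Hy, Hx + Hy - Hxy])

rng = np.random.default_rng(3)
sites = [rng.integers(0, 20, size=(12, 8)) for _ in range(6)]   # 6 sites
M = np.array([entropic_measures(s) for s in sites])

# PCA via SVD of the centered measure matrix; PC1 orders the sites.
Mc = M - M.mean(axis=0)
U, S, Vt = np.linalg.svd(Mc, full_matrices=False)
print("site scores on PC1:", np.round(U[:, 0] * S[0], 3))
```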

Article
Entropy-Based Estimation of Moisture Content of the Top 10-m Unsaturated Soil for the Badain Jaran Desert in Northwestern China
by Xiangyang Zhou, Wenjuan Lei and Jinzhu Ma
Entropy 2016, 18(9), 323; https://doi.org/10.3390/e18090323 - 03 Sep 2016
Cited by 8 | Viewed by 4692
Abstract
Estimation of soil moisture distribution in desert regions is challenged by the deep unsaturated zone and the extreme natural environment. In this study, an entropy-based method, consisting of information entropy, the principle of maximum entropy (PME), solutions to PME under constraints, and the determination of parameters, is used to estimate the soil moisture distribution in the 10 m deep vadose zone of a desert region. Firstly, the soil moisture distribution is described as a scaled probability density function (PDF), which is solved by PME under the constraints of normalization and known arithmetic and geometric means; the solution is the general form of the gamma distribution. A constant arithmetic mean is determined from the stable average recharge rate at the thousand-year scale, and an approximately constant geometric mean is determined from the low flow rate (about 1 cm per year). Next, the parameters of the scaled gamma PDF are determined from local environmental factors such as terrain and vegetation: multivariate linear equations are established to quantify the relationship between the parameters and the environmental factors on the basis of nineteen random soil moisture profiles, using fuzzy mathematics. Finally, the accuracy is tested using the correlation coefficient (CC) and the relative error. The method yields CC larger than 0.9 in more than half of the profiles and larger than 0.8 in most, with relative errors below 30% in most profiles and below 15% when the parameters are fitted appropriately. This study therefore provides an alternative way to estimate the soil moisture distribution in the top 0–10 m of the Badain Jaran Desert from local terrain and vegetation factors instead of drilled sand samples; the approach should be useful in desert regions with extreme natural conditions, since these environmental factors can be obtained from remote sensing data. It is, however, challenged in humid regions, where more intensive and frequent precipitation and denser vegetation cover make the system far more complex.
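
The MaxEnt step has a closed form worth spelling out: with a fixed arithmetic mean (AM) and geometric mean (GM), the maximum-entropy density is a gamma distribution whose shape k solves ln k − ψ(k) = ln(AM/GM). A sketch with invented moisture values:

```python
# Sketch: recover the MaxEnt gamma parameters from the two mean constraints.
# For Gamma(k, scale): E[x] = k*scale and E[ln x] = digamma(k) + ln(scale).
import numpy as np
from scipy.special import digamma
from scipy.optimize import brentq

theta_obs = np.array([0.031, 0.045, 0.052, 0.038, 0.060, 0.042])  # toy data
am = theta_obs.mean()                           # arithmetic-mean constraint
gm = np.exp(np.log(theta_obs).mean())           # geometric-mean constraint

# ln(k) - digamma(k) = ln(am/gm); solve for the shape k, then the scale.
c = np.log(am / gm)
k = brentq(lambda s: np.log(s) - digamma(s) - c, 1e-3, 1e4)
scale = am / k
print(f"MaxEnt gamma: shape={k:.2f}, scale={scale:.4f}")
```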

Article
A Geographically Temporal Weighted Regression Approach with Travel Distance for House Price Estimation
by Jiping Liu, Yi Yang, Shenghua Xu, Yangyang Zhao, Yong Wang and Fuhao Zhang
Entropy 2016, 18(8), 303; https://doi.org/10.3390/e18080303 - 16 Aug 2016
Cited by 22 | Viewed by 7171
Abstract
Previous studies have demonstrated that non-Euclidean distance metrics can improve model fit in the geographically weighted regression (GWR) model. However, the GWR model considers only spatial nonstationarity and does not address variation over time. Therefore, this paper explores a geographically temporal weighted regression (GTWR) approach that accounts for both spatial and temporal nonstationarity simultaneously to estimate house prices based on travel-time distance metrics. Using house price data collected between 1980 and 2016, the house price response and explanatory variables are modeled using both the GWR and the GTWR approaches. Comparing the GWR model with Euclidean and travel distance metrics, the GTWR model with travel distance obtains the highest value of the coefficient of determination (R²) and the lowest value of the Akaike information criterion (AIC). The results show that the GTWR model provides a relatively high goodness of fit and sufficient space-time explanatory power with non-Euclidean distance metrics. The results of this study can be used to formulate more effective policies for real estate management.
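
A single local GTWR fit reduces to weighted least squares with a space-time kernel. The sketch below assumes a Gaussian kernel, an ad hoc blend parameter, and synthetic travel times; it is not the authors' calibration.

```python
# Sketch: one local fit of a geographically temporal weighted regression,
# with weights from a combined travel-time/temporal Gaussian kernel.
import numpy as np

rng = np.random.default_rng(4)
n = 200
X = np.column_stack([np.ones(n), rng.random((n, 2))])    # intercept + 2 vars
price = X @ np.array([50.0, 30.0, -10.0]) + rng.normal(0, 2, n)
travel_min = rng.random(n) * 60          # travel time to regression point
dt_years = rng.random(n) * 10            # time gap to regression point

lam, h_s, h_t = 0.5, 20.0, 3.0           # space-time blend and bandwidths
d2 = lam * (travel_min / h_s) ** 2 + (1 - lam) * (dt_years / h_t) ** 2
w = np.exp(-d2)                          # Gaussian space-time kernel

# Weighted least squares: beta = (X'WX)^-1 X'Wy, local to this point.
W = np.diag(w)
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ price)
print("local coefficients:", np.round(beta, 2))
```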

Article
Characterization of Seepage Velocity beneath a Complex Rock Mass Dam Based on Entropy Theory
by Xixi Chen, Jiansheng Chen, Tao Wang, Huaidong Zhou and Linghua Liu
Entropy 2016, 18(8), 293; https://doi.org/10.3390/e18080293 - 11 Aug 2016
Cited by 4 | Viewed by 4969
Abstract
Owing to the randomness of the fracture flow system, the seepage system beneath a complex rock mass dam is inherently complex and highly uncertain, so investigating dam leakage by estimating the spatial distribution of the seepage field with conventional methods is quite difficult. In this paper, entropy theory, as a bridge between determinism and probability, is used to analyze the characteristics of the seepage system in a complex rock mass dam probabilistically. Based on the principle of maximum entropy, an equation for the vertical distribution of the seepage velocity in a dam borehole is derived. The derived distribution is tested against actual field data, and the results show good agreement. From the entropy of the flow velocity in boreholes, the degree of rupture of the dam bedrock is successfully estimated. Moreover, a new sampling scheme is presented, in which the sampling frequency is negatively correlated with the distance to the site of minimum velocity; this scheme is preferable to the traditional one. This paper demonstrates the significant advantage of applying entropy theory to seepage velocity analysis in a complex rock mass dam.

Article
Entropy-Weighted Instance Matching Between Different Sourcing Points of Interest
by Lin Li, Xiaoyu Xing, Hui Xia and Xiaoying Huang
Entropy 2016, 18(2), 45; https://doi.org/10.3390/e18020045 - 28 Jan 2016
Cited by 27 | Viewed by 5465
Abstract
The crucial problem in integrating geospatial data is finding the corresponding objects (the counterparts) in different sources. Most current studies focus on object matching with individual attributes such as spatial, name, or other attributes, which avoids the difficulty of integrating those attributes, but at the cost of less effective matching. In this study, we propose an approach for matching instances that integrates heterogeneous attributes, allocating suitable attribute weights via information entropy. First, a normalized similarity formula is developed, which simplifies the calculation of spatial attribute similarity. Second, sound-based and word segmentation-based methods are adopted to eliminate semantic ambiguity where geospatial data lack a normative coding standard for the name attribute. Third, category mapping is established to address the heterogeneity among different classifications. Finally, to address the non-linear characteristic of attribute similarity, the weights of the attributes are calculated from the entropy of the attributes. Experiments demonstrate that the Entropy-Weighted Approach (EWA) performs well in terms of both precision and recall for instance matching across different data sets.

Article
Hot Spots and Persistence of Nitrate in Aquifers Across Scales
by Dipankar Dwivedi and Binayak P. Mohanty
Entropy 2016, 18(1), 25; https://doi.org/10.3390/e18010025 - 13 Jan 2016
Cited by 20 | Viewed by 4903
Abstract
Nitrate-N (NO₃⁻-N) is one of the most pervasive contaminants in groundwater. Nitrate in groundwater exhibits long-term behavior due to complex interactions at multiple scales among various geophysical factors, such as sources of nitrate-N, characteristics of the vadose zone and aquifer attributes. To minimize contamination of nitrate-N in groundwater, it is important to estimate hot spots (>10 mg/L of NO₃⁻-N), trends and persistence of nitrate-N in groundwater. To analyze the trends and persistence of nitrate-N in groundwater at multiple spatio-temporal scales, we developed and used an entropy-based method along with the Hurst exponent in two different hydrogeologic settings: the Trinity and Ogallala Aquifers in Texas at fine (2 km × 2 km), intermediate (10 km × 10 km) and coarse (100 km × 100 km) scales. Results show that nitrate-N exhibits long-term persistence at the intermediate and coarse scales. In the Trinity Aquifer, overall mean nitrate-N has declined with a slight increase in normalized marginal entropy (NME) over each decade from 1940 to 2008; however, the number of hot spots has increased over time. In the Ogallala Aquifer, overall mean nitrate-N has increased with slight moderation in NME since 1940; however, the number of hot spots has significantly decreased for the same period at all scales.
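
The persistence half of the analysis rests on the Hurst exponent. Below is a minimal rescaled-range (R/S) estimator run on a synthetic series, not the authors' nitrate data.

```python
# Sketch: Hurst exponent by rescaled-range (R/S) analysis.
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Slope of log(R/S) vs log(window size); ~0.5 random, >0.5 persistent."""
    x = np.asarray(x, float)
    sizes = np.unique(np.logspace(np.log10(min_chunk),
                                  np.log10(len(x) // 2), 10).astype(int))
    rs = []
    for s in sizes:
        chunks = x[: len(x) // s * s].reshape(-1, s)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        R = dev.max(axis=1) - dev.min(axis=1)          # range of deviations
        S = chunks.std(axis=1)                         # chunk std deviation
        rs.append((R / np.where(S == 0, 1, S)).mean())
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(6)
series = np.cumsum(rng.normal(size=2000)) * 0.01 + 5   # persistent toy data
print(f"Hurst exponent: {hurst_rs(series):.2f}")
```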

Article
Nonlinear Predictive Control of a Hydropower System Model
by Runfan Zhang, Diyi Chen and Xiaoyi Ma
Entropy 2015, 17(9), 6129-6149; https://doi.org/10.3390/e17096129 - 01 Sep 2015
Cited by 21 | Viewed by 4718
Abstract
A six-dimensional nonlinear hydropower system controlled by a nonlinear predictive control method is presented in this paper. For the nonlinear predictive control method, a performance index with a terminal penalty function is selected. A simple method to find an appropriate terminal penalty function is introduced and its effectiveness is proved. The input-to-state stability of the controlled system is proved using a Lyapunov function. Subsequently, a six-dimensional model of the hydropower system is presented. Unlike other hydropower system models, this model includes the hydro-turbine system, the penstock system, the generator system, and the hydraulic servo system, accurately describing the operational process of a hydropower plant. Furthermore, numerical experiments show that the six-dimensional nonlinear hydropower system controlled by this method is stable. The numerical experiments also illustrate that the nonlinear predictive control method enjoys great advantages over a traditional control method for nonlinear systems. Finally, a strategy to combine the nonlinear predictive control method with other methods is proposed to further facilitate its application in practice.

Article
Probabilistic Forecasts: Scoring Rules and Their Decomposition and Diagrammatic Representation via Bregman Divergences
by Gareth Hughes and Cairistiona F.E. Topp
Entropy 2015, 17(8), 5450-5471; https://doi.org/10.3390/e17085450 - 31 Jul 2015
Cited by 5 | Viewed by 4743
Abstract
A scoring rule is a device for evaluation of forecasts that are given in terms of the probability of an event. In this article we will restrict our attention to binary forecasts. We may think of a scoring rule as a penalty attached to a forecast after the event has been observed. Thus a relatively small penalty will accrue if a high probability forecast that an event will occur is followed by occurrence of the event. On the other hand, a relatively large penalty will accrue if this forecast is followed by non-occurrence of the event. Meteorologists have been foremost in developing scoring rules for the evaluation of probabilistic forecasts. Here we use a published meteorological data set to illustrate diagrammatically the Brier score and the divergence score, and their statistical decompositions, as examples of Bregman divergences. In writing this article, we have in mind environmental scientists and modellers for whom meteorological factors are important drivers of biological, physical and chemical processes of interest. In this context, we briefly draw attention to the potential for probabilistic forecasting of the within-season component of nitrous oxide emissions from agricultural soils.
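
For binary forecasts issued in a few fixed categories, the Brier score and its classic reliability-resolution-uncertainty decomposition take only a few lines; the forecasts and outcomes below are simulated, not the meteorological data used in the paper.

```python
# Sketch: Brier score and Murphy's decomposition BS = REL - RES + UNC,
# exact when conditioning on the unique issued forecast values.
import numpy as np

p = np.array([0.1] * 40 + [0.5] * 30 + [0.9] * 30)    # issued forecasts
rng = np.random.default_rng(7)
o = (rng.random(p.size) < p).astype(float)            # observed outcomes

brier = np.mean((p - o) ** 2)

obar = o.mean()
rel = res = 0.0
for pk in np.unique(p):                               # forecast categories
    m = p == pk
    ok = o[m].mean()
    rel += m.mean() * (pk - ok) ** 2                  # reliability term
    res += m.mean() * (ok - obar) ** 2                # resolution term
unc = obar * (1 - obar)                               # uncertainty term

print(f"Brier={brier:.3f}  rel-res+unc={rel - res + unc:.3f}")  # identical
```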

Article
On the Use of Information Theory to Quantify Parameter Uncertainty in Groundwater Modeling
by Alston Noronha and Jejung Lee
Entropy 2013, 15(6), 2398-2414; https://doi.org/10.3390/e15062398 - 13 Jun 2013
Cited by 3 | Viewed by 6031
Abstract
We applied information theory to quantify parameter uncertainty in a groundwater flow model. A number of parameters in groundwater modeling are often used with a lack of knowledge of site conditions, owing to the heterogeneity of hydrogeologic properties and limited access to complex geologic structures. The present Information Theory-based (ITb) approach adopts entropy as a measure of uncertainty at the most probable state of hydrogeologic conditions. The most probable conditions are those at which the groundwater model is optimized with respect to the uncertain parameters. An analytical solution to estimate parameter uncertainty is derived by maximizing the entropy subject to constraints imposed by observation data. MODFLOW-2000 is implemented to simulate the groundwater system and to optimize the unknown parameters. The ITb approach is demonstrated with a three-dimensional synthetic model application and a case study of the Kansas City Plant. Hydraulic heads are the observations, and hydraulic conductivities are assumed to be the unknown parameters. The applications show that ITb is capable of identifying which inputs of a groundwater model are the most uncertain and what statistical information can be used for site exploration.

Article
Entropy of Shortest Distance (ESD) as Pore Detector and Pore-Shape Classifier
by Gabor Korvin, Boris Sterligov, Klaudia Oleschko and Sergey Cherkasov
Entropy 2013, 15(6), 2384-2397; https://doi.org/10.3390/e15062384 - 10 Jun 2013
Cited by 2 | Viewed by 6030
Abstract
The entropy of shortest distance (ESD) between geographic elements (“elliptical intrusions”, “lineaments”, “points”) on a map, or between “vugs”, “fractures” and “pores” in macro- or microscopic images of triple-porosity naturally fractured vuggy carbonates, provides a powerful new tool for the digital processing, analysis, classification and space/time distribution prognosis of mineral resources, as well as of the void space in carbonates and other rocks. The procedure is applicable at all scales, from outcrop photos and geophysical imaging techniques (FMI, UBI, USI) to micrographs, as we illustrate through several examples. Among the possible applications of the ESD concept, we discuss in detail sliding-window entropy filtering for nonlinear pore boundary enhancement, and propose this procedure as an unbiased thresholding technique.
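
A sliding-window entropy filter in the spirit of the boundary-enhancement step can be sketched directly; note this is plain local grey-level entropy, not the authors' shortest-distance formulation.

```python
# Sketch: local Shannon entropy of grey-level histograms in small windows;
# entropy peaks where textures mix, i.e., along pore boundaries.
import numpy as np

def window_entropy(img, win=5, levels=16):
    """Local entropy (bits) of grey-level histograms in win x win windows."""
    q = (img.astype(float) / img.max() * (levels - 1)).astype(int)
    pad = win // 2
    qp = np.pad(q, pad, mode="reflect")
    out = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = qp[i:i + win, j:j + win]
            counts = np.bincount(patch.ravel(), minlength=levels)
            pdf = counts[counts > 0] / counts.sum()
            out[i, j] = -(pdf * np.log2(pdf)).sum()
    return out

rng = np.random.default_rng(8)
img = np.zeros((64, 64)); img[20:44, 20:44] = 200      # a square "pore"
img += rng.normal(0, 10, img.shape).clip(0, 30)        # background texture
ent = window_entropy(img)
print("max local entropy (at boundaries):", round(float(ent.max()), 2))
```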

Article
Reliability of Inference of Directed Climate Networks Using Conditional Mutual Information
by Jaroslav Hlinka, David Hartman, Martin Vejmelka, Jakob Runge, Norbert Marwan, Jürgen Kurths and Milan Paluš
Entropy 2013, 15(6), 2023-2045; https://doi.org/10.3390/e15062023 - 24 May 2013
Cited by 88 | Viewed by 11167
Abstract
Across geosciences, many investigated phenomena relate to specific complex systems consisting of intricately intertwined interacting subsystems. Such dynamical complex systems can be represented by a directed graph, where each link denotes an existence of a causal relation, or information exchange between the nodes. For geophysical systems such as global climate, these relations are commonly not theoretically known but estimated from recorded data using causality analysis methods. These include bivariate nonlinear methods based on information theory and their linear counterpart. The trade-off between the valuable sensitivity of nonlinear methods to more general interactions and the potentially higher numerical reliability of linear methods may affect inference regarding structure and variability of climate networks. We investigate the reliability of directed climate networks detected by selected methods and parameter settings, using a stationarized model of dimensionality-reduced surface air temperature data from reanalysis of 60-year global climate records. Overall, all studied bivariate causality methods provided reproducible estimates of climate causality networks, with the linear approximation showing higher reliability than the investigated nonlinear methods. On the example dataset, optimizing the investigated nonlinear methods with respect to reliability increased the similarity of the detected networks to their linear counterparts, supporting the particular hypothesis of the near-linearity of the surface air temperature reanalysis data.
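
A plug-in estimate of the conditional mutual information I(X;Y|Z) underlying such directed-link tests can be sketched with equiquantal binning; the coupled toy system and the bin count are assumptions.

```python
# Sketch: binned estimator of I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z).
import numpy as np

def cmi_binned(x, y, z, bins=8):
    """Conditional mutual information (nats) with equiquantal bins."""
    def disc(v):
        edges = np.quantile(v, np.linspace(0, 1, bins + 1))[1:-1]
        return np.searchsorted(edges, v)              # bin labels 0..bins-1
    xd, yd, zd = disc(x), disc(y), disc(z)
    def H(*vs):
        codes = np.ravel_multi_index(vs, (bins,) * len(vs))
        p = np.bincount(codes) / codes.size
        p = p[p > 0]
        return -(p * np.log(p)).sum()
    return H(xd, zd) + H(yd, zd) - H(zd) - H(xd, yd, zd)

rng = np.random.default_rng(9)
z = rng.normal(size=20000)
x = z + 0.5 * rng.normal(size=z.size)     # x driven by z
y = x + 0.5 * rng.normal(size=z.size)     # y driven by x, not only by z
print(f"I(x;y|z) = {cmi_binned(x, y, z):.3f} nats")   # clearly > 0
```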

Article
Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models
by J. Florian Wellmann
Entropy 2013, 15(4), 1464-1485; https://doi.org/10.3390/e15041464 - 22 Apr 2013
Cited by 31 | Viewed by 12696
Abstract
The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i) which areas in a spatial analysis share information, and (ii) where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.

Article
Multiscale Interactions between Water and Carbon Fluxes and Environmental Variables in a Central U.S. Grassland
by Nathaniel A. Brunsell and Cassandra J. Wilson
Entropy 2013, 15(4), 1324-1341; https://doi.org/10.3390/e15041324 - 10 Apr 2013
Cited by 11 | Viewed by 5798
Abstract
The temporal interactions between water and carbon cycling and the controlling environmental variables are investigated using wavelets and information theory. We used 3.5 years of eddy covariance station observations from an abandoned agricultural field in the central U.S. Time series of the entropy of water and carbon fluxes exhibit pronounced annual cycles, primarily explained by the modulation of the diurnal flux amplitude by other variables, such as the net radiation. Entropies of soil moisture and precipitation show almost no annual cycle, but the data were collected during above-average precipitation years, which limits the role of moisture stress on the resultant fluxes. We also investigated the information contribution to resultant fluxes from selected environmental variables as a function of time-scale using relative entropy. The relative entropy of latent heat flux and ecosystem respiration show that the radiation terms contribute the most information to these fluxes at scales up to the diurnal scale. Vapor pressure deficit and air temperature contribute the most information for the gross primary productivity and net ecosystem exchange at the daily time-scale. The relative entropy between the fluxes and soil moisture illustrates that soil moisture contributes information at approximately weekly time-scales, while the relative entropy with precipitation contributes information predominantly at the monthly time-scale. The use of information theory metrics is a relatively new technique for assessing biosphere-atmosphere interactions, and this study illustrates the utility of the approach for assessing the dominant time-scales of these interactions.

Article
HydroZIP: How Hydrological Knowledge can Be Used to Improve Compression of Hydrological Data
by Steven V. Weijs, Nick Van de Giesen and Marc B. Parlange
Entropy 2013, 15(4), 1289-1310; https://doi.org/10.3390/e15041289 - 10 Apr 2013
Cited by 18 | Viewed by 8040
Abstract
From algorithmic information theory, which connects the information content of a data set to the shortest computer program that can produce it, it is known that there are strong analogies between compression, knowledge, inference and prediction. The more we know about a data generating process, the better we can predict and compress the data. A model that is inferred from data should ideally be a compact description of those data. In theory, this means that hydrological knowledge could be incorporated into compression algorithms to more efficiently compress hydrological data and to outperform general purpose compression algorithms. In this study, we develop such a hydrological data compressor, named HydroZIP, and test in practice whether it can outperform general purpose compression algorithms on hydrological data from 431 river basins from the Model Parameter Estimation Experiment (MOPEX) data set. HydroZIP compresses using temporal dependencies and parametric distributions. Resulting file sizes are interpreted as measures of information content, complexity and model adequacy. These results are discussed to illustrate points related to learning from data, overfitting and model complexity.
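
The general-purpose baseline that HydroZIP is designed to beat is easy to reproduce: compress a quantized series and read the compressed size as an information estimate. A sketch with a synthetic seasonal flow series, not MOPEX data:

```python
# Sketch: compressed file size as a rough information-content measure.
# Structure the compressor "understands" (seasonality, autocorrelation)
# shrinks the output; a domain-aware compressor should shrink it further.
import zlib
import numpy as np

rng = np.random.default_rng(10)
t = np.arange(3650)
flow = 5 + 4 * np.sin(2 * np.pi * t / 365) + rng.gamma(1.0, 1.0, t.size)

q = np.clip(flow / flow.max() * 255, 0, 255).astype(np.uint8)  # quantize
raw = q.tobytes()
comp = zlib.compress(raw, 9)

print(f"raw: {len(raw)} B, compressed: {len(comp)} B, "
      f"~{8 * len(comp) / len(q):.2f} bits/sample")
```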

Article
Derivation of 2D Power-Law Velocity Distribution Using Entropy Theory
by Vijay P. Singh, Gustavo Marini and Nicola Fontana
Entropy 2013, 15(4), 1221-1231; https://doi.org/10.3390/e15041221 - 08 Apr 2013
Cited by 19 | Viewed by 7646
Abstract
The one-dimensional (1D) power law velocity distribution, commonly used for computing velocities in open channel flow, has been derived empirically. However, a multitude of problems, such as scour around bridge piers, cutoffs and diversions, pollutant dispersion, and so on, require the velocity distribution in two dimensions. This paper employs the Shannon entropy theory to derive the power law velocity distribution in two dimensions (2D). The development encompasses the rectangular domain, but can be extended to any arbitrary domain, including a trapezoidal one. The derived methodology requires only a few parameters, and good agreement is confirmed by comparing the velocity values calculated using the proposed methodology with values derived from both the 1D power law model and a logarithmic velocity distribution available in the literature.

Article
Information Properties of Boundary Line Models for N₂O Emissions from Agricultural Soils
by Cairistiona F.E. Topp, Weijin Wang, Joanna M. Cloy, Robert M. Rees and Gareth Hughes
Entropy 2013, 15(3), 972-987; https://doi.org/10.3390/e15030972 - 05 Mar 2013
Cited by 9 | Viewed by 5754
Abstract
Boundary line models for N₂O emissions from agricultural soils provide a means of estimating emissions within defined ranges. Boundary line models partition a two-dimensional region of parameter space into sub-regions by means of thresholds based on relationships between N₂O emissions and explanatory variables, typically using soil data available from laboratory or field studies. Such models are intermediate in complexity between the use of IPCC emission factors and complex process-based models. Model calibration involves characterizing the extent to which observed data are correctly forecast. Writing the numerical results from graphical two-threshold boundary line models as 3×3 prediction-realization tables facilitates calculation of expected mutual information, a measure of the amount of information about the observations contained in the forecasts. Whereas mutual information characterizes the performance of a forecaster averaged over all forecast categories, specific information and relative entropy both characterize aspects of the amount of information contained in particular forecasts. We calculate and interpret these information quantities for experimental N₂O emissions data.
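
Expected mutual information from a 3×3 prediction-realization table is a direct calculation; the counts below are invented for illustration.

```python
# Sketch: expected mutual information (bits) of a prediction-realization
# table, I = sum p(f,o) * log2( p(f,o) / (p(f) p(o)) ).
import numpy as np

# rows = forecast category (low/mid/high), cols = observed category
N = np.array([[30,  8,  2],
              [ 6, 20,  9],
              [ 1,  7, 25]], dtype=float)

p = N / N.sum()
pf, po = p.sum(axis=1), p.sum(axis=0)         # forecast / observation margins

nz = p > 0
mi = (p[nz] * np.log2(p[nz] / np.outer(pf, po)[nz])).sum()
print(f"expected mutual information: {mi:.3f} bits")
```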

Article
Comparing Surface and Mid-Tropospheric CO₂ Concentrations from Central U.S. Grasslands
by Ferdouz V. Cochran, Nathaniel A. Brunsell and David B. Mechem
Entropy 2013, 15(2), 606-623; https://doi.org/10.3390/e15020606 - 06 Feb 2013
Cited by 3 | Viewed by 6435
Abstract
Comparisons of eddy covariance (EC) tower measurements of CO₂ concentration with mid-tropospheric observations from the Atmospheric Infrared Sounder (AIRS) allow for evaluation of the rising global signal of this greenhouse gas in relation to surface carbon dynamics. Using an information theory approach combining relative entropy and wavelet multi-resolution analysis, this study has explored correlations and divergences between mid-tropospheric and surface CO₂ concentrations in grasslands of northeastern Kansas. Results show that surface CO₂ measurements at the Kansas Field Station (KFS) and the Konza Prairie Biological Stations 1B (KZU) and 4B (K4B) with different land-cover types correlate well with mid-tropospheric CO₂ in this region at the 512-day timescale between 2007 and 2010. Relative entropy further reveals that AIRS observations are indicative of surface CO₂ concentrations for all land-cover types on monthly (32-day) and longer timescales. AIRS observations are also similar to CO₂ concentrations at shorter timescales at sites KFS and K4B experiencing woody encroachment, though these results require further investigation. Differences in species composition and microclimate add to the variability of surface concentrations compared with mid-tropospheric observations.

Article
Expanding the Algorithmic Information Theory Frame for Applications to Earth Observation
by Daniele Cerra and Mihai Datcu
Entropy 2013, 15(1), 407-415; https://doi.org/10.3390/e15010407 - 22 Jan 2013
Cited by 11 | Viewed by 6420
Abstract
Recent years have witnessed increased interest in compression-based methods and their applications to remote sensing, as these have a data-driven and parameter-free approach and can thus be successfully employed in several applications, especially in image information mining. This paper expands the algorithmic information theory frame on which these methods are based. On the one hand, algorithms originally defined in the pattern matching domain are reformulated, allowing a better understanding of the compression-based tools available for remote sensing applications. On the other hand, the use of existing compression algorithms is proposed to store satellite images with added semantic value.
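
A standard compression-based similarity from this frame is the normalized compression distance; below is a minimal sketch with bz2 as an arbitrary stand-in compressor.

```python
# Sketch: normalized compression distance, a practical proxy for the
# (uncomputable) Kolmogorov-complexity distance between two objects.
import bz2

def ncd(a: bytes, b: bytes) -> float:
    """NCD(a,b) = (C(ab) - min(C(a),C(b))) / max(C(a),C(b))."""
    C = lambda s: len(bz2.compress(s))
    ca, cb, cab = C(a), C(b), C(a + b)
    return (cab - min(ca, cb)) / max(ca, cb)

x = b"savanna grassland savanna grassland" * 50
y = b"grassland savanna grassland savanna" * 50
z = bytes(range(256)) * 7                      # unrelated byte pattern
print(f"NCD(x,y)={ncd(x, y):.2f}  NCD(x,z)={ncd(x, z):.2f}")  # y is closer
```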
