Bayesianism

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Multidisciplinary Applications".

Deadline for manuscript submissions: 15 June 2024 | Viewed by 6247

Special Issue Editors


Guest Editor
Institute of Mathematics and Statistics, University of São Paulo, Rua do Matão, 1010, São Paulo 05508-900, Brazil
Interests: Bayesian statistics; controversies and paradoxes in probability and statistics; Bayesian reliability; Bayesian analysis of discrete data (BADD); applied statistics

Guest Editor
Department of Statistics, Federal University of Bahia, Salvador 40170-110, Brazil
Interests: statistical learning; time series forecasting; robust statistics; data science; applied statistics

Guest Editor
Independent researcher
Interests: hypothesis testing; fundamentals of statistical inference; regression models; analysis of sports data; theoretical physics; probability theory and stochastic processes

Special Issue Information

Dear Colleagues,

Many statistics textbooks treat frequentist statistics as the main subject and relegate Bayesian statistics to a single chapter, often near the end of the book. Students are often told that there are two “schools” of statistics, the frequentist (or “classical”) and the Bayesian, and that the two are in opposition to each other. Students are implicitly, or even sometimes explicitly, asked to choose between them. The reality, of course, is much more complex. There are other views and tools—likelihoodism, decision theory, “objective Bayesianism,” and fiducial inference, for example—meant to sit between the supposed extremes of subjective Bayesianism and frequentism. Some use priors and posteriors that are not probability functions (they do not integrate to unity). There are statisticians, the chief editor of this Special Issue being one of them, who are considered Bayesians but who have used frequentist techniques or applied frequentist concepts in their work. It is worth remarking here that some of the main tools of Bayesian statistics used in the 21st century are based on frequencies in simulations, collectively known as Markov chain Monte Carlo. Bayesian methods have also been used by statisticians considered frequentists to solve problems that arise in frequentist statistics. Even Sir Ronald Fisher, who first proposed fiducial inference and is considered the “founding father” of frequentist inference, was in his later works moving toward some of the inductive arguments of Bayesian inference and placing stronger emphasis on the likelihood.

In the end, statistics comes down to trying to infer something about a larger population from a smaller sample, and any approach to such a task will have strengths and weaknesses, will surely have pathological cases it cannot resolve, and will be subject to valid criticisms. Therefore, it is not surprising that mixing the ideas of the different “schools” of inference has been a successful approach and has expanded and enriched the palette of tools available to researchers in every quantitative field of study.  

The idea of this Special Issue is to treat Bayesian statistics as the main topic, but that does not mean it is to be treated as superior to any other paradigm of inference. Contributions from practitioners and theoreticians using and advancing Bayesian thought and methods are obviously welcome, but so are contributions from those who use other paradigms, including criticisms of aspects of Bayesian inference in comparison with the authors' preferred methods. Our idea is to provide a snapshot of how Bayesian inference is understood and how it contributes to scientific endeavor today, late in the first quarter of the 21st century.

When we speak of Bayesianism, we are referring to a philosophical and statistical framework that represents degrees of belief or justification using probabilities. It is characterized by the idea that belief comes in degrees that can be formalized using the axioms of probability theory. Bayesianism involves assessing the rationality of degrees of belief against a set of rules, and those beliefs are updated via Bayes's theorem as new information or evidence arrives.
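In symbols, for a hypothesis H and evidence E, that updating rule is simply

    P(H | E) = P(E | H) P(H) / P(E),

so the posterior degree of belief in H is the prior P(H) reweighted by the likelihood P(E | H) and renormalized. As a toy illustration: a prior belief of 0.5 in H, combined with evidence that is three times as probable under H as under its negation, updates to a posterior of 0.75.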

Prof. Dr. Carlos Alberto De Bragança Pereira
Prof. Dr. Paulo Canas Rodrigues
Dr. Mark Andrew Gannon
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • prior distributions
  • posterior probabilities or densities
  • likelihood optimizations: weighted average or maximization

Published Papers (7 papers)


Research

19 pages, 2693 KiB  
Article
Bayesian Non-Parametric Inference for Multivariate Peaks-over-Threshold Models
by Peter Trubey and Bruno Sansó
Entropy 2024, 26(4), 335; https://doi.org/10.3390/e26040335 - 14 Apr 2024
Viewed by 474
Abstract
We consider a constructive definition of the multivariate Pareto that factorizes the random vector into a radial component and an independent angular component. The former follows a univariate Pareto distribution, and the latter is defined on the surface of the positive orthant of the infinity norm unit hypercube. We propose a method for inferring the distribution of the angular component by identifying its support as the limit of the positive orthant of the unit p-norm spheres and introduce a projected gamma family of distributions, defined through the normalization of a vector of independent random gammas to that space. This serves to construct a flexible family of distributions obtained as a Dirichlet process mixture of projected gammas. For model assessment, we discuss scoring methods appropriate to distributions on the unit hypercube. In particular, working with the energy score criterion, we develop a kernel metric that produces a proper scoring rule and present a simulation study to compare different modeling choices using the proposed metric. Using our approach, we describe the dependence structure of extreme values in the integrated vapor transport (IVT) data, which describe the flow of atmospheric moisture along the coast of California. We find clear but heterogeneous geographical dependence. Full article
(This article belongs to the Special Issue Bayesianism)
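For readers unfamiliar with the projected gamma construction described in the abstract, the core step—normalizing a vector of independent gamma draws by its p-norm so that each sample lands on the positive orthant of the unit p-norm sphere—can be sketched in a few lines. This is an illustrative simplification rather than the authors' implementation; the shape parameters and the value of p below are arbitrary choices for demonstration.

    import numpy as np

    def sample_projected_gamma(shapes, p, size, seed=None):
        # Draw vectors of independent gammas and normalize each by its p-norm,
        # so every sample lies on the positive orthant of the unit p-norm sphere.
        rng = np.random.default_rng(seed)
        g = rng.gamma(shape=shapes, size=(size, len(shapes)))    # independent gamma draws
        norms = np.linalg.norm(g, ord=p, axis=1, keepdims=True)  # p-norm of each row
        return g / norms

    # Example: three-dimensional angular samples; a large p approximates the infinity norm.
    angles = sample_projected_gamma(shapes=[2.0, 1.0, 0.5], p=10, size=1000, seed=42)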

16 pages, 439 KiB  
Article
Stochastic Volatility Models with Skewness Selection
by Igor Martins and Hedibert Freitas Lopes
Entropy 2024, 26(2), 142; https://doi.org/10.3390/e26020142 - 06 Feb 2024
Viewed by 829
Abstract
This paper expands traditional stochastic volatility models by allowing for time-varying skewness without imposing it. While dynamic asymmetry may capture the likely direction of future asset returns, it comes at the risk of leading to overparameterization. Our proposed approach mitigates this concern by leveraging sparsity-inducing priors to automatically select the skewness parameter as dynamic, static or zero in a data-driven framework. We consider two empirical applications. First, in a bond yield application, dynamic skewness captures interest rate cycles of monetary easing and tightening and is partially explained by central banks’ mandates. In a currency modeling framework, our model indicates no skewness in the carry factor after accounting for stochastic volatility. This supports the idea of carry crashes resulting from volatility surges instead of dynamic skewness. Full article
(This article belongs to the Special Issue Bayesianism)

24 pages, 394 KiB  
Article
On the Nuisance Parameter Elimination Principle in Hypothesis Testing
by Andrés Felipe Flórez Rivera, Luis Gustavo Esteves, Victor Fossaluza and Carlos Alberto de Bragança Pereira
Entropy 2024, 26(2), 117; https://doi.org/10.3390/e26020117 - 29 Jan 2024
Viewed by 598
Abstract
The Non-Informative Nuisance Parameter Principle concerns the problem of how inferences about a parameter of interest should be made in the presence of nuisance parameters. The principle is examined in the context of the hypothesis testing problem. We prove that the mixed test obeys the principle for discrete sample spaces. We also show how adherence of the mixed test to the principle can make performing the test much easier. These findings are illustrated with new solutions to well-known problems of testing hypotheses for count data. Full article
(This article belongs to the Special Issue Bayesianism)
25 pages, 1048 KiB  
Article
An Objective and Robust Bayes Factor for the Hypothesis Test One Sample and Two Population Means
by Israel A. Almodóvar-Rivera and Luis R. Pericchi-Guerra
Entropy 2024, 26(1), 88; https://doi.org/10.3390/e26010088 - 20 Jan 2024
Viewed by 974
Abstract
It has been over 100 years since the discovery of one of the most fundamental statistical tests: the Student’s t test. However, reliable conventional and objective Bayesian procedures are still essential for routine practice. In this work, we proposed an objective and robust Bayesian approach for hypothesis testing for one-sample and two-sample mean comparisons when the assumption of equal variances holds. The newly proposed Bayes factors are based on the intrinsic and Berger robust prior. Additionally, we introduced a corrected version of the Bayesian Information Criterion (BIC), denoted BIC-TESS, which is based on the effective sample size (TESS), for comparing two population means. We studied our developed Bayes factors in several simulation experiments for hypothesis testing. Our methodologies consistently provided strong evidence in favor of the null hypothesis in the case of equal means and variances. Finally, we applied the methodology to the original Gosset sleep data, concluding strong evidence favoring the hypothesis that the average sleep hours differed between the two treatments. These methodologies exhibit finite sample consistency and demonstrate consistent qualitative behavior, proving reasonably close to each other in practice, particularly for moderate to large sample sizes. Full article
(This article belongs to the Special Issue Bayesianism)

13 pages, 283 KiB  
Article
Linear Bayesian Estimation of Misrecorded Poisson Distribution
by Huiqing Gao, Zhanshou Chen and Fuxiao Li
Entropy 2024, 26(1), 62; https://doi.org/10.3390/e26010062 - 11 Jan 2024
Viewed by 775
Abstract
Parameter estimation is an important component of statistical inference, and how to improve the accuracy of parameter estimation is a key issue in research. This paper proposes a linear Bayesian estimation for estimating parameters in a misrecorded Poisson distribution. The linear Bayesian estimation method not only incorporates prior information but also avoids the cumbersome calculation of posterior expectations. While ensuring the accuracy and stability of the computational results, we derived the explicit solution of the linear Bayesian estimation. Its superiority was verified through numerical simulations and illustrative examples. Full article
(This article belongs to the Special Issue Bayesianism)
15 pages, 1540 KiB  
Article
Objective Priors for Invariant e-Values in the Presence of Nuisance Parameters
by Elena Bortolato and Laura Ventura
Entropy 2024, 26(1), 58; https://doi.org/10.3390/e26010058 - 09 Jan 2024
Viewed by 848
Abstract
This paper aims to contribute to refining the e-values for testing precise hypotheses, especially when dealing with nuisance parameters, leveraging the effectiveness of asymptotic expansions of the posterior. The proposed approach offers the advantage of bypassing the need for elicitation of priors and reference functions for the nuisance parameters, as well as the multidimensional integration step. For this purpose, starting from a Laplace approximation, only a posterior distribution for the parameter of interest is considered, and a suitable objective matching prior is then introduced, ensuring that the posterior mode aligns with an equivariant frequentist estimator. Consequently, both Highest Probability Density credible sets and the e-value remain invariant. Some targeted and challenging examples are discussed. Full article
(This article belongs to the Special Issue Bayesianism)

12 pages, 1108 KiB  
Article
Probabilistic Nearest Neighbors Classification
by Bruno Fava, Paulo C. Marques F. and Hedibert F. Lopes
Entropy 2024, 26(1), 39; https://doi.org/10.3390/e26010039 - 30 Dec 2023
Viewed by 865
Abstract
Analysis of the currently established Bayesian nearest neighbors classification model points to a connection between the computation of its normalizing constant and issues of NP-completeness. An alternative predictive model constructed by aggregating the predictive distributions of simpler nonlocal models is proposed, and analytic expressions for the normalizing constants of these nonlocal models are derived, ensuring polynomial time computation without approximations. Experiments with synthetic and real datasets showcase the predictive performance of the proposed predictive model. Full article
(This article belongs to the Special Issue Bayesianism)

Planned Papers

The list below represents only planned manuscripts. Some of these manuscripts have not been received by the Editorial Office yet. Papers submitted to MDPI journals are subject to peer review.

Title: Bayesian non-parametric inference for multivariate peaks-over-threshold models
Authors: Peter Trubey and Bruno Sansó
Affiliation: Department of Statistics, University of California, Santa Cruz, CA 95064, USA