Mathematical Tools and Techniques Applicable to Probability Theory and Statistics

A special issue of Axioms (ISSN 2075-1680).

Deadline for manuscript submissions: closed (30 September 2022) | Viewed by 31,854

Special Issue Editor


Prof. Dr. Hari Mohan Srivastava
Guest Editor
Department of Mathematics and Statistics, University of Victoria, Victoria, BC V8W 3R4, Canada
Interests: real and complex analysis; fractional calculus and its applications; integral equations and transforms; higher transcendental functions and their applications; q-series and q-polynomials; analytic number theory; analytic and geometric inequalities; probability and statistics; inventory modeling and optimization

Special Issue Information

Dear Colleagues,

Investigations involving the theory and applications of mathematical analytic tools and techniques are remarkably widespread in many diverse areas of the mathematical, physical, biological, chemical, engineering, and statistical sciences. In this Special Issue, we invite and welcome review, expository, and original research articles dealing with (but not limited to) recent advances in probability theory and statistics, among other related areas.

We are looking forward to your contribution to this Special Issue.

Prof. Dr. Hari Mohan Srivastava

Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Axioms is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • mathematical (or higher transcendental) functions and their applications in probability theory and statistics
  • probabilistic derivations and applications of generating functions
  • the notion of statistical convergence and related developments
  • stochastic and martingale sequences and associated approximation theorems
  • statistical inference, statistical mechanics and related areas
  • summability theory and statistical applications


Published Papers (13 papers)


Research

11 pages, 1027 KiB  
Article
Some Probabilistic Generalizations of the Cheney–Sharma and Bernstein Approximation Operators
by Seng Huat Ong, Choung Min Ng, Hong Keat Yap and Hari Mohan Srivastava
Axioms 2022, 11(10), 537; https://0-doi-org.brum.beds.ac.uk/10.3390/axioms11100537 - 08 Oct 2022
Cited by 5 | Viewed by 1186
Abstract
The objective of this paper is to give some probabilistic derivations of the Cheney–Sharma and Bernstein approximation operators. Motivated by these probabilistic derivations, generalizations of the Cheney–Sharma and Bernstein operators are defined. The convergence property of the Bernstein generalization is established. It is also shown that the Cheney–Sharma operator is the Szász–Mirakyan operator averaged by a certain probability distribution.
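For context, the classical Bernstein operator is B_n(f; x) = sum_{k=0}^{n} f(k/n) C(n, k) x^k (1 - x)^(n - k), and it converges uniformly to f on [0, 1]. The following is a minimal numerical sketch of this convergence; the test function sin(pi x) is an arbitrary illustrative choice, not taken from the paper:

```python
import math

def bernstein(f, n, x):
    # B_n(f; x) = sum_{k=0}^{n} f(k/n) * C(n, k) * x^k * (1 - x)^(n - k)
    return sum(f(k / n) * math.comb(n, k) * x ** k * (1 - x) ** (n - k)
               for k in range(n + 1))

f = lambda t: math.sin(math.pi * t)   # illustrative test function
errs = {}
for n in (10, 100, 1000):
    # maximum absolute error over a uniform grid on [0, 1]
    errs[n] = max(abs(bernstein(f, n, i / 50) - f(i / 50)) for i in range(51))
    print(n, round(errs[n], 5))
```

For smooth f the error decays like O(1/n), which the printed values reflect.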

20 pages, 3960 KiB  
Article
Mixture of Akash Distributions: Estimation, Simulation and Application
by Anum Shafiq, Tabassum Naz Sindhu, Showkat Ahmad Lone, Marwa K. H. Hassan and Kamsing Nonlaopon
Axioms 2022, 11(10), 516; https://0-doi-org.brum.beds.ac.uk/10.3390/axioms11100516 - 29 Sep 2022
Cited by 1 | Viewed by 1256
Abstract
In this paper, we propose a two-component mixture of the Akash model (TC-MAM). The behavior of the TC-MAM distribution is presented graphically. Moment-based measures, including the skewness, index of dispersion, kurtosis, and coefficient of variation, have been determined, and the hazard rate functions are presented graphically. The probability generating function, Mills ratio, characteristic function, cumulants, mean time to failure, and factorial moment generating function are all statistical aspects of the mixed model that we explore. Furthermore, we estimate the relevant parameters of the mixture model using suitable methods, such as the least squares, weighted least squares, and maximum likelihood mechanisms. Findings of simulation experiments conducted to examine the behavior of these estimates are presented graphically. Finally, a real-world dataset is examined in order to demonstrate the new model's practical perspectives. All of the evaluated metrics favor the new model and confirm the superiority of the proposed distribution over mixtures of the Lindley, Shanker, and exponential distributions.
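For reference, the one-parameter Akash density is f(x; theta) = theta^3 / (theta^2 + 2) * (1 + x^2) * exp(-theta * x) for x > 0, and a two-component mixture is a convex combination of two such densities. A minimal sketch with hypothetical mixing weight and rate parameters (not the paper's fitted values), numerically checking that the mixture density integrates to one:

```python
import math

def akash_pdf(x, theta):
    # Akash density: theta^3 / (theta^2 + 2) * (1 + x^2) * exp(-theta * x)
    return theta ** 3 / (theta ** 2 + 2) * (1 + x * x) * math.exp(-theta * x)

def tc_mam_pdf(x, p, theta1, theta2):
    # Two-component mixture: convex combination of two Akash densities
    return p * akash_pdf(x, theta1) + (1 - p) * akash_pdf(x, theta2)

# Sanity check: trapezoidal integration of the density on [0, 60]
h = 0.005
n = int(60 / h)
total = 0.0
for i in range(n + 1):
    w = 0.5 if i in (0, n) else 1.0
    total += w * tc_mam_pdf(i * h, 0.4, 0.8, 2.5)  # hypothetical parameters
total *= h
print(round(total, 4))
```

The truncation at x = 60 is harmless here because the exponential tail is negligible at that point for these rates.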

22 pages, 3889 KiB  
Article
Arctan-Based Family of Distributions: Properties, Survival Regression, Bayesian Analysis and Applications
by Omid Kharazmi, Morad Alizadeh, Javier E. Contreras-Reyes and Hossein Haghbin
Axioms 2022, 11(8), 399; https://0-doi-org.brum.beds.ac.uk/10.3390/axioms11080399 - 12 Aug 2022
Cited by 3 | Viewed by 1640
Abstract
In this paper, a new class of continuous distributions is established by compounding the arctangent function with a generalized log-logistic class of distributions. Some structural properties of the suggested model, such as the distribution function, hazard function, quantile function, asymptotics, and a useful expansion for the new class, are given in a general setting. Two special cases of this new class are considered by employing the Weibull and normal distributions as the parent distribution. Further, we derive a survival regression model based on a sub-model with the Weibull parent distribution and then estimate the parameters of the proposed regression model using Bayesian and frequentist approaches. We consider seven loss functions, namely the squared error, modified squared error, weighted squared error, K-loss, linear exponential, general entropy, and precautionary loss functions, for the Bayesian discussion. Bayesian numerical results, including the Bayes estimators, associated posterior risks, and credible and highest posterior density intervals, are provided. In order to explore the consistency property of the maximum likelihood estimators, a simulation study is presented via a Monte Carlo procedure. The parameters of two sub-models are estimated with maximum likelihood, and the usefulness of these sub-models and the proposed survival regression model is examined by means of three real datasets.

12 pages, 633 KiB  
Article
Some Notes for Two Generalized Trigonometric Families of Distributions
by Maria T. Vasileva
Axioms 2022, 11(4), 149; https://0-doi-org.brum.beds.ac.uk/10.3390/axioms11040149 - 24 Mar 2022
Cited by 3 | Viewed by 1641
Abstract
The paper deals with two general families of cumulative distribution functions based on the arctangent function. We provide an analysis of the error of the best one-sided Hausdorff approximation for some special cases of these families. We obtain precise estimates for the value of the Hausdorff distance that can be used as an additional criterion in practice. Further, a family of recurrence-generated adaptive functions is constructed and investigated. All new results are illustrated with suitable numerical experiments. Simple dynamic software modules show the applicability of the Hausdorff approximation.

14 pages, 1787 KiB  
Article
Generalized Beta Prime Distribution Applied to Finite Element Error Approximation
by Joël Chaskalovic and Franck Assous
Axioms 2022, 11(3), 84; https://0-doi-org.brum.beds.ac.uk/10.3390/axioms11030084 - 22 Feb 2022
Viewed by 1343
Abstract
In this paper, we propose a new family of probability laws based on the Generalized Beta Prime distribution to evaluate the relative accuracy between two Lagrange finite elements Pk1 and Pk2 (k1 < k2). Usually, the relative finite element accuracy is based on a comparison of the asymptotic speed of convergence as the mesh size h goes to zero. The new probability laws we propose here highlight that there exist, depending on h, cases where the Pk1 finite element is more likely accurate than the Pk2 element. To confirm this assertion, we highlight, using numerical examples, the quality of the fit between the statistical frequencies and the corresponding probabilities, as determined by the probability law. This illustrates that, as h moves away from zero, a finite element Pk1 may produce more precise results than a finite element Pk2, since the probability of the event "Pk1 is more accurate than Pk2" becomes greater than 0.5. In these cases, the Pk2 finite element is likely overqualified.

8 pages, 683 KiB  
Communication
Approximate Flow Friction Factor: Estimation of the Accuracy Using Sobol’s Quasi-Random Sampling
by Pavel Praks and Dejan Brkić
Axioms 2022, 11(2), 36; https://0-doi-org.brum.beds.ac.uk/10.3390/axioms11020036 - 19 Jan 2022
Cited by 3 | Viewed by 2302
Abstract
The unknown friction factor in the implicit Colebrook equation cannot be expressed explicitly in analytical form, and therefore, to simplify the calculation, many explicit approximations can be used instead. The accuracy of such approximations should be evaluated only throughout the domain of interest in engineering practice, where the test points can be chosen in many different ways, using uniform, quasi-uniform, random, and quasi-random patterns. To avoid leaving errors undetected, a sufficient minimal number of such points should be chosen, and they should be distributed using proper patterns. A properly chosen pattern can minimize the number of testing points required to detect the maximums of the error. The ability of the Sobol quasi-random vs. random distribution of testing points to capture the maximal relative error using a sufficiently small number of samples is evaluated. Quasi-randomly distributed Sobol testing points can cover the domain of interest more evenly, avoiding large gaps. Sobol sequences are quasi-random and are always the same, which allows the exact repetition of scientific results.
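The evaluation idea can be sketched in a few lines. Here a 2-D Halton sequence (radical inverses in bases 2 and 3) serves as a simple pure-Python stand-in for Sobol points (in practice one would use a true Sobol generator such as scipy.stats.qmc.Sobol), and the well-known Swamee–Jain formula plays the role of an example explicit approximation; neither choice reproduces the paper's exact setup:

```python
import math

def halton(i, base):
    # Radical-inverse (van der Corput) value of index i in the given base;
    # pairing bases 2 and 3 yields a 2-D Halton low-discrepancy sequence.
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def colebrook(re, rr, iters=50):
    # Fixed-point iteration for x = 1/sqrt(f) in the implicit Colebrook equation.
    x = 7.0
    for _ in range(iters):
        x = -2.0 * math.log10(rr / 3.7 + 2.51 * x / re)
    return 1.0 / x ** 2

def swamee_jain(re, rr):
    # A popular explicit approximation of the Colebrook friction factor.
    return 0.25 / math.log10(rr / 3.7 + 5.74 / re ** 0.9) ** 2

# Scan (Re, eps/D) with log-spaced low-discrepancy points and track the
# largest relative error of the approximation against the implicit solution.
max_err = 0.0
for i in range(1, 2049):
    re = 10 ** (math.log10(5e3) + halton(i, 2) * (8 - math.log10(5e3)))
    rr = 10 ** (-6 + halton(i, 3) * 4)   # eps/D in [1e-6, 1e-2]
    f_true = colebrook(re, rr)
    max_err = max(max_err, abs(swamee_jain(re, rr) - f_true) / f_true)
print(round(100 * max_err, 2), "% max relative error found")
```

With more evenly spread points, fewer samples are needed before the worst-case error region of the domain is hit, which is the point the paper quantifies.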

38 pages, 961 KiB  
Article
Spatial Statistical Models: An Overview under the Bayesian Approach
by Francisco Louzada, Diego Carvalho do Nascimento and Osafu Augustine Egbon
Axioms 2021, 10(4), 307; https://0-doi-org.brum.beds.ac.uk/10.3390/axioms10040307 - 17 Nov 2021
Cited by 10 | Viewed by 6800
Abstract
Spatial documentation is increasing exponentially, given the availability of Big Data in the Internet of Things, enabled by device miniaturization and data storage capacity. Bayesian spatial statistics is a useful statistical tool for determining the dependence structure and hidden patterns in space through prior knowledge and the data likelihood. However, this class of modeling is not yet well explored compared to the adoption of classification and regression in machine-learning models, in which the spatiotemporal independence of the data is often assumed, that is, an inexistent or very weak dependence. Thus, this systematic review aims to address the main models presented in the literature over the past 20 years, identifying gaps and research opportunities. Elements such as random fields, spatial domains, prior specification, the covariance function, and numerical approximations are discussed. This work explores the two subclasses of spatial smoothing: global and local.

16 pages, 344 KiB  
Article
On the Discrete Weibull Marshall–Olkin Family of Distributions: Properties, Characterizations, and Applications
by Jiju Gillariose, Oluwafemi Samson Balogun, Ehab M. Almetwally, Rehan Ahmad Khan Sherwani, Farrukh Jamal and Joshin Joseph
Axioms 2021, 10(4), 287; https://0-doi-org.brum.beds.ac.uk/10.3390/axioms10040287 - 30 Oct 2021
Cited by 11 | Viewed by 1752
Abstract
In this article, we introduce a new flexible discrete family of distributions, which accommodates a wide collection of monotone failure rates. A sub-model of the geometric distribution, or a discrete generalization of the exponential model, is proposed as a special case of the derived family. In addition, we provide a comprehensive account of some of its mathematical properties. Two distinct methods for parameter estimation and two different methods for constructing confidence intervals are explored for the proposed distribution. Furthermore, three extensive Monte Carlo simulation studies are conducted to compare the estimation methods. Finally, the utility of the new model is illustrated by means of two real datasets.

17 pages, 775 KiB  
Article
Is Football/Soccer Purely Stochastic, Made Out of Luck, or Maybe Predictable? How Does Bayesian Reasoning Assess Sports?
by Leonardo Barrios Blanco, Paulo Henrique Ferreira, Francisco Louzada and Diego Carvalho do Nascimento
Axioms 2021, 10(4), 276; https://0-doi-org.brum.beds.ac.uk/10.3390/axioms10040276 - 26 Oct 2021
Viewed by 2631
Abstract
Predicting game scores is a well-explored task for mathematical/statistical models. Nonetheless, by adopting a Bayesian methodology, this study aimed to probabilistically estimate the positions of the Chilean Premier League teams, considering them in a hierarchical structure. This approach enabled the evaluation of the main Chilean championship, which provides the major soccer players for the national team. Thus, a count (Poisson) regression structure was considered to explain each match as a combination of home advantage and the attacking and defensive power of each team, treating their performance in the championship as independent games. We were able to quantify the relationship between the defense and attack of each team and, in addition, to group/verify the performance of the entire 2020 Chilean Premier League. For model validation, we held out the last five games for prediction. We found that, in this league, the attack factors were statistically significant in influencing the scores (goals); however, all the teams showed low defensive power, and playing at home or away did not present a game advantage. Our model was able to predict the Chilean league position table with precision for the top five positions; for positions 6-11, there was a small shift caused by close performance in the championship and the similarity of the expected number of goals, which implied the same rank position. This type of model has been shown to be very competitive for soccer championship prediction.

17 pages, 792 KiB  
Article
A Meta-Analysis for Simultaneously Estimating Individual Means with Shrinkage, Isotonic Regression and Pretests
by Nanami Taketomi, Yoshihiko Konno, Yuan-Tsung Chang and Takeshi Emura
Axioms 2021, 10(4), 267; https://0-doi-org.brum.beds.ac.uk/10.3390/axioms10040267 - 20 Oct 2021
Cited by 9 | Viewed by 2458
Abstract
Meta-analyses combine the estimators of individual means to estimate the common mean of a population. However, the common mean could be undefined or uninformative in some scenarios where individual means are "ordered" or "sparse". Hence, assessments of individual means become relevant, rather than the common mean. In this article, we propose simultaneous estimation of individual means using the James–Stein shrinkage estimators, which improve upon individual studies' estimators. We also propose isotonic regression estimators for ordered means, and pretest estimators for sparse means. We provide theoretical explanations and simulation results demonstrating the superiority of the proposed estimators over the individual studies' estimators. The proposed methods are illustrated by two datasets: one comes from gastric cancer patients and the other from COVID-19 patients.
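The core of James–Stein shrinkage is that, for k >= 3 normal means, shrinking the raw estimates toward a common point reduces the total squared error. Below is a self-contained simulation sketch with hypothetical true study means, shrinking toward zero (a meta-analysis would more typically shrink toward a pooled mean):

```python
import math
import random

def james_stein(x, sigma2):
    # Positive-part James-Stein estimator shrinking toward zero (needs k >= 3)
    k = len(x)
    s = sum(v * v for v in x)
    shrink = max(0.0, 1.0 - (k - 2) * sigma2 / s)
    return [shrink * v for v in x]

random.seed(0)
k, sigma2, trials = 10, 1.0, 2000
theta = [0.5 * i for i in range(k)]     # hypothetical true study means
mse_mle = mse_js = 0.0
for _ in range(trials):
    x = [random.gauss(t, math.sqrt(sigma2)) for t in theta]  # raw estimates
    js = james_stein(x, sigma2)
    mse_mle += sum((a - t) ** 2 for a, t in zip(x, theta)) / trials
    mse_js += sum((a - t) ** 2 for a, t in zip(js, theta)) / trials
print(round(mse_mle, 2), round(mse_js, 2))
```

The simulated total squared error of the shrinkage estimator comes out below that of the raw per-study estimates, which is the dominance property the paper exploits.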

16 pages, 314 KiB  
Article
Statistical Riemann and Lebesgue Integrable Sequence of Functions with Korovkin-Type Approximation Theorems
by Hari Mohan Srivastava, Bidu Bhusan Jena and Susanta Kumar Paikray
Axioms 2021, 10(3), 229; https://0-doi-org.brum.beds.ac.uk/10.3390/axioms10030229 - 16 Sep 2021
Cited by 9 | Viewed by 1603
Abstract
In this work, we introduce and investigate the ideas of statistical Riemann integrability, statistical Riemann summability, statistical Lebesgue integrability, and statistical Lebesgue summability via deferred weighted means. We first establish some fundamental limit theorems connecting these beautiful and potentially useful notions. Furthermore, based upon our proposed techniques, we establish Korovkin-type approximation theorems with algebraic test functions. Finally, we present two illustrative examples under the consideration of positive linear operators associated with the Bernstein polynomials to exhibit the effectiveness of our findings.
22 pages, 2250 KiB  
Article
A Method for Visualizing Posterior Probit Model Uncertainty in the Early Prediction of Fraud for Sustainability Development
by Shih-Hsien Tseng and Tien Son Nguyen
Axioms 2021, 10(3), 178; https://0-doi-org.brum.beds.ac.uk/10.3390/axioms10030178 - 04 Aug 2021
Cited by 1 | Viewed by 1939
Abstract
Corporate fraud not only curtails investors' rights and privileges but also disrupts the overall market economy. For this reason, the formulation of a model that could help detect any unusual market fluctuations would be essential for investors. Thus, we propose an early warning system for predicting fraud associated with financial statements, based on the Bayesian probit model, examining historical data from 1999 to 2017 on 327 businesses in Taiwan to create a visual method to aid decision making. In this study, we utilize parametric estimation via Markov chain Monte Carlo (MCMC). The results show that this approach can reduce over- or under-confidence in the decision-making process compared with standard logistic regression. In addition, the Bayesian probit model in this study is found to offer more accurate calculations; it not only represents the predicted value of the responses but also the possible ranges of these responses via a simple plot.

16 pages, 2546 KiB  
Article
Water Particles Monitoring in the Atacama Desert: SPC Approach Based on Proportional Data
by Anderson Fonseca, Paulo Henrique Ferreira, Diego Carvalho do Nascimento, Rosemeire Fiaccone, Christopher Ulloa-Correa, Ayón García-Piña and Francisco Louzada
Axioms 2021, 10(3), 154; https://0-doi-org.brum.beds.ac.uk/10.3390/axioms10030154 - 13 Jul 2021
Cited by 9 | Viewed by 3162
Abstract
Statistical monitoring tools are well established in the literature, creating organizational cultures such as Six Sigma or Total Quality Management. Nevertheless, most of this literature is based on the normality assumption, e.g., based on the law of large numbers, and carries limitations regarding truncated processes, which remain open questions in this field. This work was motivated by records of water-particle monitoring (relative humidity), an important source of moisture for the Copiapó watershed in the Atacama region of Chile (the Atacama Desert), data that present high asymmetry for rates and proportions. This paper proposes a new control chart for interval data on rates and proportions (symbolic interval data) when they are not the result of a Bernoulli process. The unit-Lindley distribution has many interesting properties, such as having only one parameter, and from it we develop the unit-Lindley chart for both classical and symbolic data. The performance of the proposed control chart is analyzed using the average run length (ARL), median run length (MRL), and standard deviation of the run length (SDRL) metrics, calculated through an extensive Monte Carlo simulation study. Results from the real data applications reveal the tool's potential to be adopted to estimate the control limits in a Statistical Process Control (SPC) framework.
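The run-length metrics mentioned (ARL, MRL, SDRL) are easy to illustrate generically: for an in-control chart whose points independently fall outside the control limits with probability alpha, the run length is geometric with mean 1/alpha. A small simulation sketch, using the classical 3-sigma false-alarm rate alpha = 0.0027 for illustration (this is not the unit-Lindley chart itself):

```python
import random

def run_length(p_signal):
    # Number of in-control samples until a chart point exceeds the limits;
    # with independent samples this is geometric with mean 1/p_signal.
    n = 0
    while True:
        n += 1
        if random.random() < p_signal:
            return n

random.seed(1)
alpha = 0.0027                         # false-alarm rate of a 3-sigma chart
runs = [run_length(alpha) for _ in range(5000)]
arl = sum(runs) / len(runs)            # average run length
mrl = sorted(runs)[len(runs) // 2]     # median run length
sdrl = (sum((r - arl) ** 2 for r in runs) / len(runs)) ** 0.5
print(round(arl, 1), mrl, round(sdrl, 1))
```

For a geometric run length the theoretical ARL is 1/alpha (about 370 here), the MRL sits near 0.7/alpha, and the SDRL is close to the ARL, which the simulated values reproduce.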
