Bayesian Machine Learning

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Statistical Physics".

Deadline for manuscript submissions: closed (31 December 2022) | Viewed by 4359

Special Issue Editors


Dr. Naonori Ueda
Guest Editor
Machine Learning and Data Science Center, NTT Communication Science Laboratories, Kyoto 619-0237, Japan
Interests: statistical machine learning; Bayesian statistics; Bayesian inference; nonparametric Bayes

Dr. Issei Sato
Guest Editor
Department of Computer Science, Graduate School of Information Science and Technology, The University of Tokyo, Tokyo 113-0033, Japan
Interests: statistical machine learning; variational inference; representation learning; uncertainty estimation; medical image analysis

Special Issue Information

Dear Colleagues, 

Since the dawn of machine learning, Bayesian theory has played an important role: it enables practical learning from small amounts of data, quantifies uncertainty in predictions, and provides a principled way to build robust ensembles of models. It is therefore natural that many open problems in deep learning have been approached through the connection between Bayesian theory and deep learning, known as Bayesian deep learning. Many techniques that support deep learning in practice, such as stochastic gradient methods, dropout, batch normalization, parameter regularization, and noise injection, also have Bayesian interpretations. Bayesian machine learning is not limited to deep learning, however. Bayesian optimization, for example, exploits the uncertainty quantification inherent in nonparametric Bayesian models to optimize black-box functions, and is used in real-world applications such as hyperparameter tuning, materials development, and human interaction. Bayesian machine learning has been reworked many times over its history, constantly evolving and producing new techniques for a wide range of uncertain, practical problems.
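The Bayesian optimization loop mentioned above is easy to sketch: a Gaussian process surrogate supplies a posterior mean and variance, and an acquisition function such as expected improvement trades the two off. The minimal NumPy sketch below is purely illustrative; the objective `f`, the RBF length scale, and the grid search over the acquisition are all assumptions, not code from any submission.

```python
import numpy as np
from math import erf, sqrt, pi

def f(x):
    # Hypothetical black-box objective (e.g., validation loss vs. a hyperparameter).
    return (x - 0.5) ** 2 - np.sin(3 * x)

def rbf_kernel(a, b, length=0.2):
    # Squared-exponential covariance between 1-D input vectors.
    d = a.reshape(-1, 1) - b.reshape(1, -1)
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_tr, y_tr, x_te, jitter=1e-6):
    # Standard GP regression posterior mean and pointwise variance.
    K = rbf_kernel(x_tr, x_tr) + jitter * np.eye(len(x_tr))
    Ks = rbf_kernel(x_tr, x_te)
    sol = np.linalg.solve(K, Ks)
    mu = sol.T @ y_tr
    var = np.clip(1.0 - np.sum(Ks * sol, axis=0), 1e-12, None)
    return mu, var

def expected_improvement(mu, var, best):
    # EI acquisition for minimization: E[max(best - f(x), 0)].
    sigma = np.sqrt(var)
    z = (best - mu) / sigma
    Phi = np.array([0.5 * (1.0 + erf(v / sqrt(2.0))) for v in z])
    phi = np.exp(-0.5 * z ** 2) / sqrt(2.0 * pi)
    return (best - mu) * Phi + sigma * phi

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 200)
x_obs = rng.uniform(0.0, 1.0, 3)   # a few random initial evaluations
y_obs = f(x_obs)
for _ in range(10):
    mu, var = gp_posterior(x_obs, y_obs, grid)
    x_next = grid[np.argmax(expected_improvement(mu, var, y_obs.min()))]
    x_obs = np.append(x_obs, x_next)
    y_obs = np.append(y_obs, f(x_next))

best_x = x_obs[y_obs.argmin()]
```

Each iteration evaluates the black-box function only once, at the maximizer of expected improvement, which is how the method keeps the evaluation budget small in applications such as hyperparameter tuning.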

This Special Issue focuses on research at the intersection of Bayesian theory and machine learning. We welcome contributions on Bayesian theory for machine learning, Bayesian latent variable models, Bayesian deep neural networks, Bayesian optimization, and applications of these methods. Among applications, data analysis on COVID-19, for example, is also welcome.

Dr. Naonori Ueda
Dr. Issei Sato
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Bayesian theory for machine learning
  • Bayesian latent variable models
  • Bayesian deep neural networks
  • Bayesian optimization
  • applications

Published Papers (2 papers)

Research

24 pages, 3459 KiB  
Article
Click-through Rate Prediction and Uncertainty Quantification Based on Bayesian Deep Learning
by Xiaowei Wang and Hongbin Dong
Entropy 2023, 25(3), 406; https://doi.org/10.3390/e25030406 - 23 Feb 2023
Viewed by 2051
Abstract
Click-through rate (CTR) prediction is a central problem in evaluating recommendation systems and estimating ad traffic. Existing studies have shown that deep learning performs very well on prediction tasks, but most are based on deterministic models, which leaves a large gap in capturing uncertainty. Modeling uncertainty is a major challenge when machine learning is applied to real-world problems across domains. To quantify model uncertainty and obtain accurate and reliable predictions, this paper designs a CTR prediction framework combining feature selection and feature interaction, within which a prediction model based on Bayesian deep learning quantifies the uncertainty of the prediction model. On a parallel architecture combining a squeeze network and a DNN, Monte Carlo dropout is used to approximate the posterior distribution over parameters and to produce ensemble predictions. Epistemic and aleatoric uncertainty are defined, information entropy is used to compute their sum, and epistemic uncertainty is measured by mutual information. Experimental results show that the proposed model outperforms other models in prediction performance and can quantify its uncertainty.
(This article belongs to the Special Issue Bayesian Machine Learning)
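The entropy-based decomposition this abstract refers to is a standard construction: total predictive entropy splits into the expected per-pass entropy (aleatoric) plus the mutual information between the prediction and the weights (epistemic). The sketch below applies it to simulated Monte Carlo dropout outputs rather than a real network; the two synthetic cases and all numeric choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def entropy(p, axis=-1):
    # Shannon entropy in nats; epsilon guards log(0).
    return -np.sum(p * np.log(p + 1e-12), axis=axis)

def decompose(probs):
    # probs: (T, C) class probabilities from T stochastic (dropout) forward passes.
    total = entropy(probs.mean(axis=0))      # H[E_q p(y|x,w)]  (total uncertainty)
    aleatoric = entropy(probs).mean()        # E_q H[p(y|x,w)]  (data noise)
    epistemic = total - aleatoric            # mutual information I(y; w | x)
    return total, aleatoric, epistemic

T = 200
# Case A: passes agree and are confident -> all uncertainties low.
probs_a = softmax(np.array([3.0, 0.0]) + 0.1 * rng.normal(size=(T, 2)))
# Case B: individual passes are confident but disagree -> high mutual information.
signs = rng.choice([-1.0, 1.0], size=(T, 1))
probs_b = softmax(signs * np.array([3.0, -3.0]) + 0.1 * rng.normal(size=(T, 2)))

tot_a, ale_a, epi_a = decompose(probs_a)
tot_b, ale_b, epi_b = decompose(probs_b)
```

Case B is the signature of epistemic uncertainty: each dropout sample is confident, yet the samples disagree, so the mutual information term dominates while the aleatoric term stays small.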

17 pages, 1459 KiB  
Article
Multifidelity Model Calibration in Structural Dynamics Using Stochastic Variational Inference on Manifolds
by Panagiotis Tsilifis, Piyush Pandita, Sayan Ghosh and Liping Wang
Entropy 2022, 24(9), 1291; https://doi.org/10.3390/e24091291 - 13 Sep 2022
Cited by 1 | Viewed by 1221
Abstract
Bayesian techniques for engineering problems, which rely on Gaussian process (GP) regression, are known for their ability to quantify epistemic and aleatory uncertainties and for their data efficiency. The mathematical elegance of these methods usually comes at a high computational cost compared to deterministic and empirical Bayesian methods, and they become practically infeasible in scenarios with a large number of inputs and thousands of training data. This work focuses on enhancing GP-based metamodeling and model calibration when the training datasets are significantly large. To achieve this, the authors employ a stochastic variational inference algorithm that enables rapid statistical learning of the calibration parameters and hyperparameter tuning while retaining the rigor of Bayesian inference. The numerical performance of the algorithm is demonstrated on multiple metamodeling and model calibration problems with thousands of training data.
(This article belongs to the Special Issue Bayesian Machine Learning)
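The ingredient that lets stochastic variational inference scale to thousands of training points is the unbiased minibatch estimate of the ELBO gradient, combined with a reparameterized sample from the variational posterior. The sketch below illustrates that mechanism on a deliberately tiny model, a Bayesian linear regression with a single weight; the model, learning rate, and fixed noise variance are illustrative assumptions and not the GP setup used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data; N is large enough that full-batch updates
# would be wasteful, which is the setting SVI targets.
N = 5000
X = rng.uniform(-1.0, 1.0, N)
y = 2.0 * X + 0.1 * rng.normal(size=N)

# Variational posterior q(w) = Normal(m, s^2) over a single weight w,
# prior p(w) = Normal(0, 1), fixed likelihood noise variance sigma2.
m, log_s = 0.0, 0.0
lr, B, sigma2 = 1e-4, 100, 1.0

for _ in range(2000):
    idx = rng.integers(0, N, B)
    xb, yb = X[idx], y[idx]
    s = np.exp(log_s)
    eps = rng.normal()
    w = m + s * eps                       # reparameterized sample w ~ q(w)
    # Unbiased minibatch estimate of d log p(y|w) / dw, rescaled by N/B,
    # plus the prior gradient d log p(w) / dw = -w.
    dw = (N / B) * np.sum((yb - w * xb) * xb) / sigma2 - w
    m += lr * dw                          # pathwise gradient for the mean
    log_s += lr * (dw * s * eps + 1.0)    # the +1 is the entropy gradient

post_mean, post_std = m, np.exp(log_s)
```

After training, `post_mean` sits near the true weight and `post_std` has shrunk far below its initial value, reflecting the data having concentrated the posterior; each update touched only B of the N points, which is the whole point of the stochastic variant.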
