Journal Description
Computation is a peer-reviewed journal of computational science and engineering published monthly online by MDPI.
- Open Access: free for readers, with article processing charges (APC) paid by authors or their institutions.
- High Visibility: indexed within Scopus, ESCI (Web of Science), CAPlus / SciFinder, Inspec, dblp, and many other databases.
- Journal Rank: CiteScore - Q2 (Applied Mathematics)
- Rapid Publication: manuscripts are peer-reviewed and a first decision is provided to authors approximately 15.2 days after submission; acceptance to publication takes 4.2 days (median values for papers published in this journal in the second half of 2021).
- Recognition of Reviewers: reviewers who provide timely, thorough peer-review reports receive vouchers entitling them to a discount on the APC of their next publication in any MDPI journal, in appreciation of the work done.
Latest Articles
A Mathematical and Numerical Framework for Traffic-Induced Air Pollution Simulation in Bamako
Computation 2022, 10(5), 76; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10050076 - 17 May 2022
Abstract
We present a mathematical and numerical framework for the simulation of traffic-induced air pollution in Bamako. We consider a deterministic modeling approach where the spatio-temporal dynamics of the concentrations of air pollutants are governed by a so-called chemical transport model. The time integration and spatial discretization of the model are achieved using the forward Euler algorithm and the finite-element method, respectively. The traffic emissions are estimated using a road traffic simulation package called SUMO. The numerical results for two road traffic-induced air pollutants, namely carbon monoxide (CO) and fine particulate matter (PM2.5), support that the proposed framework is suited for reproducing the dynamics of the specified pollutants.
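As an illustration of the time integration described above, the sketch below advances a semi-discretized transport model with an explicit forward Euler update; the 1D grid, diffusion coefficient and roadside emission profile are placeholders rather than values from the study, with a constant source term standing in for SUMO-derived emissions.

```python
import numpy as np

def forward_euler_transport(c0, rhs, t_end, dt):
    """Advance a semi-discretized chemical transport model in time.

    c0  : initial nodal concentrations (after spatial discretization)
    rhs : callable rhs(t, c) returning dc/dt (transport, diffusion, emissions)
    """
    c, t = c0.copy(), 0.0
    while t < t_end:
        c = c + dt * rhs(t, c)   # explicit (forward) Euler update
        t += dt
    return c

# Toy example: 1D diffusion of CO with a constant roadside emission term.
n = 100
x = np.linspace(0.0, 1.0, n)
emission = np.where(np.abs(x - 0.5) < 0.05, 1.0, 0.0)  # stand-in for traffic emissions

def rhs(t, c):
    lap = np.zeros_like(c)
    lap[1:-1] = (c[:-2] - 2.0 * c[1:-1] + c[2:]) / (x[1] - x[0]) ** 2
    return 0.001 * lap + emission

c_final = forward_euler_transport(np.zeros(n), rhs, t_end=1.0, dt=1e-4)
```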
Full article
Open Access Article
A Data-Driven Framework for Probabilistic Estimates in Oil and Gas Project Cost Management: A Benchmark Experiment on Natural Gas Pipeline Projects
Computation 2022, 10(5), 75; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10050075 - 16 May 2022
Abstract
Nowadays, the Oil and Gas (O&G) industry faces significant challenges due to the relentless pressure for rationalization of project expenditure and cost reduction, the demand for greener and renewable energy solutions, and the recent outbreak of the pandemic and geopolitical crises. Despite these barriers, the O&G industry remains a key sector in the growth of the world economy, requiring huge capital investments in critical megaprojects. On the other hand, O&G projects traditionally experience cost overruns and delays with damaging consequences for both industry stakeholders and policy-makers. There is therefore an urgent need to adopt innovative project management methods and tools that facilitate the timely delivery of projects with high quality standards while complying with budgetary restrictions. Certainly, the success of a project is intrinsically associated with the ability of the decision-makers to estimate, in a compelling way, the monetary resources required throughout the project’s life cycle, an activity that involves various sources of uncertainty. In this study, we focus on the critical management task of evaluating project cost performance through the development of a framework aimed at handling the inherent uncertainty of the estimation process based on well-established data-driven concepts, tools and performance metrics. The proposed framework is demonstrated through a benchmark experiment on a publicly available dataset containing information related to the construction cost of natural gas pipeline projects. The findings derived from the benchmark study showed that the applied algorithm and the adoption of a different feature scaling mechanism presented an interaction effect on the distribution of loss functions, when used as point and interval estimators of the actual cost. Regarding the evaluation of point estimators, Support Vector Regression with different feature scaling mechanisms achieved superior performance in terms of both accuracy and bias, whereas both K-Nearest Neighbors and Classification and Regression Trees variants indicated noteworthy capabilities for producing narrow interval estimates that contain the actual cost value. Finally, the evaluation of the agreement between the performance rankings of the candidate models, when used as point and interval estimators, revealed a moderate agreement (a = 0.425).
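A minimal sketch of the point-estimation side of such a benchmark, assuming a scikit-learn setup: Support Vector Regression combined with two different feature-scaling mechanisms and scored by cross-validated mean absolute error. The features and costs below are synthetic stand-ins, not the natural gas pipeline dataset.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))          # stand-in project features (length, diameter, capacity, ...)
y = X @ np.array([3.0, 1.5, 0.5, 2.0, 1.0]) + rng.normal(scale=0.5, size=200)  # stand-in cost

# Compare the same point estimator under two feature-scaling mechanisms.
for scaler in (StandardScaler(), MinMaxScaler()):
    model = make_pipeline(scaler, SVR(kernel="rbf", C=10.0))
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
    print(type(scaler).__name__, -scores.mean())
```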
Full article
Open Access Article
Regression Machine Learning Models Used to Predict DFT-Computed NMR Parameters of Zeolites
Computation 2022, 10(5), 74; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10050074 - 13 May 2022
Abstract
Machine learning approaches can drastically decrease the computational time for predictions of spectroscopic properties in materials, while preserving the quality of the computational approaches. We studied the performance of kernel-ridge regression (KRR) and gradient boosting regressor (GBR) models trained on the isotropic shielding values, computed with density-functional theory (DFT), in a series of different known zeolites containing out-of-frame metal cations or fluorine anions and organic structure-directing cations. The smooth overlap of atomic positions (SOAP) descriptors were computed from the DFT-optimised Cartesian coordinates of each atom in the zeolite crystal cells. The use of these descriptors as inputs in both machine learning regression methods led to the prediction of the DFT isotropic shielding values with mean errors within 0.6 ppm. The results showed that the GBR model scales better than the KRR model.
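A hedged sketch of the regression step (not the authors' code): kernel-ridge regression and gradient boosting fitted on precomputed SOAP descriptor vectors to predict isotropic shieldings. Random arrays stand in for the real descriptors and DFT targets.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 120))                              # stand-in SOAP descriptors per atom
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.1, size=500)   # stand-in isotropic shieldings (ppm)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (KernelRidge(kernel="rbf", alpha=1e-3), GradientBoostingRegressor()):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, mean_absolute_error(y_te, model.predict(X_te)))
```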
Full article
(This article belongs to the Special Issue Commemorative Issue in Honor of Professor Karlheinz Schwarz on the Occasion of His 80th Birthday)
Open Access Article
Influence of the Chemical Pressure on the Magnetic Properties of the Mixed Anion Cuprates Cu2OX2 (X = Cl, Br, I)
Computation 2022, 10(5), 73; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10050073 - 12 May 2022
Abstract
In this study, we theoretically investigate the structural, electronic and magnetic properties of the Cu2OX2 (X = Cl, Br, I) compounds. Previous studies reported potential spin-driven ferroelectricity in Cu2OCl2, originating from a non-collinear magnetic phase existing below ∼70 K. However, the nature of this low-temperature magnetic phase is still under debate. Here, we focus on the calculation of J exchange couplings and enhance knowledge in the field by (i) characterizing the low-temperature magnetic order for Cu2OCl2 and (ii) evaluating the impact of the chemical pressure on the magnetic interactions, which leads us to consider the two new phases Cu2OBr2 and Cu2OI2. Our ab initio simulations notably demonstrate the coexistence of strong antiferromagnetic and ferromagnetic interactions, leading to spin frustration. The Néel temperatures were estimated on the basis of a quasi-1D AFM model using the J couplings. It nicely reproduces the value for Cu2OCl2 and allows us to predict an increase of the Néel temperature under chemical pressure, with TN = 120 K for the dynamically stable phase Cu2OBr2. This investigation suggests that chemical pressure is an effective key factor to open the door to room-temperature multiferroicity.
Full article
(This article belongs to the Special Issue Commemorative Issue in Honor of Professor Karlheinz Schwarz on the Occasion of His 80th Birthday)
Open Access Article
Inverse Modeling of Hydrologic Parameters in CLM4 via Generalized Polynomial Chaos in the Bayesian Framework
Computation 2022, 10(5), 72; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10050072 - 05 May 2022
Abstract
In this work, generalized polynomial chaos (gPC) expansion for land surface model parameter estimation is evaluated. We perform inverse modeling and compute the posterior distribution of the critical hydrological parameters that are subject to great uncertainty in the Community Land Model (CLM) for a given value of the output LH. The unknown parameters include those that have been identified as the most influential factors on the simulations of surface and subsurface runoff, latent and sensible heat fluxes, and soil moisture in CLM4.0. We set up the inversion problem in the Bayesian framework in two steps: (i) building a surrogate model expressing the input–output mapping, and (ii) performing inverse modeling and computing the posterior distributions of the input parameters using observation data for a given value of the output LH. The development of the surrogate model is carried out with a Bayesian procedure based on the variable selection methods that use gPC expansions. Our approach accounts for bases selection uncertainty and quantifies the importance of the gPC terms, and, hence, all of the input parameters, via the associated posterior probabilities.
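The two-step idea can be sketched in a few lines under simplifying assumptions (a single uniform input, a toy function in place of CLM4, a Legendre basis fitted by least squares rather than the Bayesian variable-selection procedure): build the gPC surrogate, then sample the posterior of the input with a Metropolis random walk against a noisy observation of the output.

```python
import numpy as np
from numpy.polynomial import legendre

def model(theta):   # stand-in for an expensive land-surface model output (e.g., LH)
    return np.sin(2.0 * theta) + 0.3 * theta**2

# (i) surrogate: fit Legendre (gPC-style) coefficients on samples of theta ~ U(-1, 1)
theta_train = np.random.uniform(-1, 1, 200)
coeffs = legendre.legfit(theta_train, model(theta_train), deg=6)
surrogate = lambda t: legendre.legval(t, coeffs)

# (ii) Bayesian inversion with a Metropolis random walk using the surrogate
obs, sigma = model(0.4) + 0.01, 0.05
def log_post(t):
    if not -1.0 <= t <= 1.0:                      # uniform prior on [-1, 1]
        return -np.inf
    return -0.5 * ((obs - surrogate(t)) / sigma) ** 2

samples, t = [], 0.0
for _ in range(5000):
    prop = t + 0.1 * np.random.randn()
    if np.log(np.random.rand()) < log_post(prop) - log_post(t):
        t = prop
    samples.append(t)
print("posterior mean:", np.mean(samples[1000:]))
```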
Full article
(This article belongs to the Special Issue Inverse Problems with Partial Data)
Open Access Article
Estimation Parameters of Dependence Meta-Analytic Model: New Techniques for the Hierarchical Bayesian Model
Computation 2022, 10(5), 71; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10050071 - 04 May 2022
Abstract
Dependence in meta-analytic models can arise when effect sizes come from the same collected data or from the same researchers. The hierarchical Bayesian linear model in a meta-analysis that allows dependence in effect sizes is investigated in this paper. The parameters of interest in the hierarchical Bayesian linear dependence (HBLD) model, which was developed using Bayesian techniques, are then estimated. The joint posterior distribution of all parameters of the HBLD model is obtained by applying the Gibbs sampling algorithm. Furthermore, in order to measure the robustness of the HBLD model, a sensitivity analysis is conducted using different prior distributions on the model. This is carried out by applying the Metropolis-within-Gibbs algorithm. A simulation study is performed for the estimation of all parameters in the model. The results show that the obtained parameter estimates are close to the true parameters, indicating the consistency of the estimators for the model. The model is also not sensitive to changes in the prior distribution, which shows its robustness. A case study, assessing the effects of native-language vocabulary aids on second-language reading, is conducted successfully in testing the parameters of the models.
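As a hedged illustration of the Gibbs updates involved, the sketch below samples a simplified random-effects meta-analysis model with independent effect sizes (not the dependent HBLD model itself); the effect sizes, variances and priors are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
y = np.array([0.12, 0.35, 0.20, 0.08, 0.41])      # observed effect sizes
s2 = np.array([0.02, 0.03, 0.02, 0.04, 0.03])     # known sampling variances
k, a, b = len(y), 1.0, 1.0                        # Inverse-Gamma(a, b) prior on tau^2

mu, tau2 = 0.0, 1.0
draws = []
for it in range(5000):
    prec = 1.0 / s2 + 1.0 / tau2                              # update each study effect theta_i
    theta = rng.normal((y / s2 + mu / tau2) / prec, np.sqrt(1.0 / prec))
    mu = rng.normal(theta.mean(), np.sqrt(tau2 / k))          # update overall mean (flat prior)
    tau2 = 1.0 / rng.gamma(a + k / 2, 1.0 / (b + 0.5 * ((theta - mu) ** 2).sum()))
    draws.append((mu, tau2))
print("posterior mean of mu:", np.mean([d[0] for d in draws[1000:]]))
```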
Full article

Open Access Article
LFDFT—A Practical Tool for Coordination Chemistry
Computation 2022, 10(5), 70; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10050070 - 02 May 2022
Abstract
The electronic structure of coordination compounds with lanthanide ions is studied by means of density functional theory (DFT) calculations. This work deals with the electronic structure and properties of open-shell systems based on the calculation of multiplet structure and ligand-field interaction, within the framework of the Ligand–Field Density-Functional Theory (LFDFT) method. Using an effective Hamiltonian in conjunction with DFT, we are able to reasonably calculate the low-lying excited states of the molecular [Eu(NO3)3(phenanthroline)2] complex, corresponding to the Eu3+ 4f6 configuration. The results are compared with available experimental data, revealing relative uncertainties of less than 5% for many energy levels. We also demonstrate the ability of the LFDFT method to simulate absorption spectra, considering cerocene as an example. Ce M-edge X-ray absorption spectra are simulated for the complexes [Ce(C8H8)2] and [Ce(C8H8)2][Li(tetrahydrofurane)4], which are approximated by the Ce(IV) and Ce(III) oxidation states, respectively. The results showed very good agreement with the experimental data for one of the two cerium compounds, while for the other a charge-transfer electronic structure is still missing in the theoretical model. Therefore, this presentation reports the benefits of having a theoretical method that is primarily dedicated to coordination chemistry, but it also outlines limitations and places the ongoing developmental efforts in the broader context of treating complex molecular systems.
Full article
(This article belongs to the Special Issue Commemorative Issue in Honor of Professor Karlheinz Schwarz on the Occasion of His 80th Birthday)
Open Access Article
In Silico Analysis of Peptide-Based Derivatives Containing Bifunctional Warheads Engaging Prime and Non-Prime Subsites to Covalent Binding SARS-CoV-2 Main Protease (Mpro)
Computation 2022, 10(5), 69; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10050069 - 01 May 2022
Abstract
Despite the progress of therapeutic approaches for treating COVID-19 infection, the interest in developing effective antiviral agents is still high, due to the possible emergence of viable SARS-CoV-2-resistant strains. Accordingly, in this article, we describe a computational protocol for identifying possible SARS-CoV-2 Mpro covalent inhibitors. Combining several in silico techniques, we evaluated the potential of the peptide-based scaffold with different warheads as a significant alternative to nitrile and aldehyde electrophilic groups. We rationally designed four potential inhibitors containing difluorostatone and Michael acceptor warheads. In silico analysis, based on molecular docking, covalent docking, molecular dynamics simulation, and FEP, indicated that the conceived compounds could act as covalent inhibitors of Mpro and that the investigated warheads can be used for designing covalent inhibitors against serine or cysteine proteases such as SARS-CoV-2 Mpro. Our work enriches the knowledge on SARS-CoV-2 Mpro, providing a novel potential strategy for its inhibition and paving the way for the development of effective antivirals.
Full article
(This article belongs to the Special Issue Computation to Fight SARS-CoV-2 (CoVid-19))
Open Access Communication
A Comprehensive DFT Investigation of the Adsorption of Polycyclic Aromatic Hydrocarbons onto Graphene
Computation 2022, 10(5), 68; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10050068 - 26 Apr 2022
Abstract
To better understand graphene and its interactions with polycyclic aromatic hydrocarbons (PAHs), density-functional-theory (DFT) computations were used. Adsorption energy is likely to rise with the number of aromatic rings in the adsorbates. The DFT results revealed that the distance between the PAH molecules adsorbed onto graphene (G) ranged between 2.47 and 3.98 Å, depending on the structure of the PAH molecule. The Non-Covalent Interactions (NCI) plot supports the concept that van der Waals interactions were involved in PAH adsorption onto the G structure. Based on the DFT-calculated adsorption energy data, a rapid and reliable method employing an empirical quantitative structure–activity relationship (QSAR) model was created and validated for estimating the adsorption energies of PAH molecules onto graphene.
Full article
(This article belongs to the Section Computational Chemistry)
Open Access Article
A Stochastic Model for Determining Static Stability Margins in Electric Power Systems
Computation 2022, 10(5), 67; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10050067 - 25 Apr 2022
Abstract
This paper aims to develop a method for determining margins of static aperiodic stability for electric power systems equipped with distributed generation plants. To this end, we used generalized equations of limiting modes in a stochastic formulation. Computer simulation showed that the developed methodology can be used in solving problems of operational control of the modes of electric power systems. On the basis of the results obtained, we arrived at the following conclusions: the modified equations do not allow the iterative process to converge to a trivial solution and, therefore, they ensure high reliability of results when determining stability margins in a stochastic statement; a technique based on the introduction of an additional variable can be used to improve the convergence of computational processes when determining the stability margins in a deterministic statement; the parameters of the limiting modes obtained in the deterministic and stochastic formulations may significantly differ; with an increase in the variance of the load graphs, the risk of stability violation significantly increases; at the same time, the amount of the margin determined on the basis of the Euclidean norm remains overly optimistic; in the illustrative example, a significant increase in the risk of stability violation takes place during planned and emergency shutdowns of the EPS elements.
Full article
(This article belongs to the Section Computational Engineering)
Open Access Article
Cognitive Hybrid Intelligent Diagnostic System: Typical Architecture
Computation 2022, 10(5), 66; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10050066 - 22 Apr 2022
Abstract
The research refers to the modeling of the meaningful and relatively stable visual-figurative and verbal-sign representation of real problems in medical diagnostics of human organs and systems. The core results of the research are presented. Here, a new visual metalanguage is proposed. It describes the solution of a diagnostic problem by combining several interconnected processes of reasoning in different languages defining “a state of human organs and systems”, “a diagnostic problem” and elements of its decomposition. In the paper, a subject-figurative model of the cognitive hybrid intelligent diagnostic system, its typical architecture, and a synthesis algorithm are provided. Due to the integration of imitation of an internal subject-figurative vision of medical diagnostic problems and the corresponding communication statements of private diagnoses with imitation of the behavior inherent in medical councils in problem situations, the future implementation of such system prototypes will reduce the number of medical errors. The further stage of this research is the approbation of all solutions on the problem of diagnosing diseases of the pancreas using the materials of the Kaliningrad Regional Clinical Hospital and an experimental study of the system. The research is limited to the subject area of medicine but can be generalized to other areas.
Full article
(This article belongs to the Special Issue Control Systems, Mathematical Modeling and Automation)
Open Access Article
Generation of Basis Sets for Accurate Molecular Calculations: Application to Helium Atom and Dimer
Computation 2022, 10(5), 65; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10050065 - 20 Apr 2022
Abstract
A new approach for basis set generation is reported and tested in helium atom and dimer. The basis sets thus computed, named sigma, range from DZ to 5Z and consist of the same composition as Dunning basis sets but with a different treatment of contractions. The performance of the sigma sets is analyzed for energy and other properties of He atom and He dimer, and the results are compared with those obtained with Dunning and ANO basis sets. The sigma basis sets and their extended versions up to triple augmented provide better energy values than Dunning basis sets of the same composition, and similar values to those attained with the currently available ANO. Extrapolation to complete basis set of correlation energy is compared between the sigma basis sets and those of Dunning, showing the better performance of the former in this respect.
Full article
(This article belongs to the Special Issue Commemorative Issue in Honor of Professor Karlheinz Schwarz on the Occasion of His 80th Birthday)
Open Access Article
Aggregating Composite Indicators through the Geometric Mean: A Penalization Approach
Computation 2022, 10(4), 64; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10040064 - 18 Apr 2022
Abstract
In this paper, we introduce a penalized version of the geometric mean. In analogy with the Mazziotta–Pareto Index, this composite indicator is derived as a product between the geometric mean and a penalization term accounting for the unbalance among indicators. The unbalance is measured in terms of the (horizontal) variability of the normalized indicators, opportunely scaled and transformed via the Box–Cox function of order zero. The penalized geometric mean is used to compute a penalized Human Development Index (HDI), and a comparison with the geometric mean approach is presented. Data come from the Human Development Data Center for 2019 and refer to the classical three dimensions of the HDI. The results show that the new method does not upset the original ranking produced by the HDI, but it has a greater impact on countries with poor performance. The paper has the merit of proposing a new reading of the Mazziotta–Pareto Index in terms of the reliability of the arithmetic mean, as well as of generalizing this reading to the geometric mean approach.
Full article
(This article belongs to the Special Issue Control Systems, Mathematical Modeling and Automation)
Open Access Article
Classifying the Degree of Bark Beetle-Induced Damage on Fir (Abies mariesii) Forests, from UAV-Acquired RGB Images
Computation 2022, 10(4), 63; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10040063 - 18 Apr 2022
Abstract
Bark beetle outbreaks are responsible for the loss of large areas of forests, and in recent years they appear to be increasing in frequency and magnitude as a result of climate change. The aim of this study is to develop a new standardized methodology for the automatic detection of the degree of damage on single fir trees caused by bark beetle attacks, using a simple GIS-based model. The classification approach is based on the degree of tree canopy defoliation observed (white pixels) in the UAV-acquired very high resolution RGB orthophotos. We defined six degrees (categories) of damage (healthy, four infested levels and dead) based on the ratio of white pixels to the total number of pixels of a given tree canopy. Category 1: <2.5% (no defoliation); Category 2: 2.5–10% (very low defoliation); Category 3: 10–25% (low defoliation); Category 4: 25–50% (medium defoliation); Category 5: 50–75% (high defoliation); and finally Category 6: >75% (dead). The definition of “white pixel” is crucial, since light conditions during image acquisition drastically affect pixel values. Thus, whiteness was defined as the ratio of the red pixel value to the blue pixel value of every single pixel, in relation to the ratio of the mean red and mean blue values of the whole orthomosaic. The results show that in an area of 4 ha, out of the 1376 trees, 277 were healthy, 948 were infested (Cat 2, 628; Cat 3, 244; Cat 4, 64; Cat 5, 12), and 151 were dead (Cat 6). The validation led to an average precision of 62%, with Cat 1 and Cat 6 reaching a precision of 73% and 94%, respectively.
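A minimal sketch of the stated classification rule, assuming crown and orthomosaic pixels are already extracted as RGB arrays: a pixel counts as white when its red/blue ratio exceeds the mosaic-wide mean-red/mean-blue ratio, and the crown's white-pixel share is mapped to the six damage categories.

```python
import numpy as np

# Upper bounds of the white-pixel share for Categories 1-6 described above.
BINS = [(0.025, 1), (0.10, 2), (0.25, 3), (0.50, 4), (0.75, 5), (1.01, 6)]

def damage_category(crown_rgb, mosaic_rgb):
    """crown_rgb, mosaic_rgb: arrays of shape (n_pixels, 3) with R, G, B values."""
    global_ratio = mosaic_rgb[:, 0].mean() / mosaic_rgb[:, 2].mean()   # mean R / mean B
    white = (crown_rgb[:, 0] / crown_rgb[:, 2]) > global_ratio          # per-pixel whiteness test
    share = white.mean()
    for upper, cat in BINS:
        if share < upper:
            return cat

# Toy usage with random pixel values standing in for a UAV orthophoto crown segment.
rng = np.random.default_rng(2)
mosaic = rng.uniform(30, 220, size=(100000, 3))
crown = rng.uniform(30, 220, size=(800, 3))
print(damage_category(crown, mosaic))
```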
Full article
(This article belongs to the Special Issue Computation and Analysis of Remote Sensing Imagery and Image Motion)
Open Access Article
Modeling the Territorial Structure Dynamics of the Northern Part of the Volga-Akhtuba Floodplain
Computation 2022, 10(4), 62; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10040062 - 11 Apr 2022
Abstract
The subject of our study is the tendency to reduce the floodplain area of regulated rivers and its impact on the degradation of the socio-environmental systems in the floodplain. The aim of the work is to create a new approach to the analysis and forecasting of the multidimensional degradation processes of floodplain territories under the influence of natural and technogenic factors. This approach uses methods of hydrodynamic and geoinformation modeling, statistical analysis of observational data and results of high-performance computational experiments. The basis of our approach is a model of the dynamics of the complex structure of the floodplain. This structure combines the characteristics of the frequency ranges of flooding and the socio-environmental features of various sites (cadastral data of land use). Modeling of the hydrological regime is based on numerical shallow water models. The regression model of the technogenic dynamics of the riverbed allowed us to calculate corrections to the parameters of real floods that imitate the effect of this factor. This made it possible to use digital maps of the modern topography for hydrodynamic modeling and the construction of flood maps for past and future decades. The technological basis of our study is a set of algorithms and software, consisting of three modules. The data module includes, first of all, the cadastres of the territory of the Volga-Akhtuba floodplain (VAF, this floodplain is the interfluve of the Volga and Akhtuba rivers for the last 400 km before flowing into the Caspian Sea), satellite and natural observation data, spatial distributions of parameters of geoinformation and hydrodynamic models. The second module provides the construction of a multilayer digital model of the floodplain area, digital maps of floods and their aggregated characteristics. The third module calculates a complex territorial structure, criteria for the state of the environmental and socio-economic system (ESES) and a forecast of its changes. We have shown that the degradation of the ESES of the northern part of the VAF is caused by the negative dynamics of the hydrological structure of its territory, due to the technogenic influence of the hydroelectric power station on the Volga riverbed. This dynamic manifests itself in a decrease in the stable flooded area and an increase in the unflooded and unstable flooded areas. An important result is the forecast of the complex territorial structure and criteria for the state of the interfluve until 2050.
Full article
(This article belongs to the Special Issue Control Systems, Mathematical Modeling and Automation)
Open Access Article
Co-Design of the Control and Power Stages of a Boost-Based Rectifier with Power Factor Correction Depending on Performance Criteria
Computation 2022, 10(4), 61; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10040061 - 11 Apr 2022
Abstract
Rectifiers with power factor correction are key devices to supply DC loads from AC sources, guaranteeing a power factor close to one and low total harmonic distortion. Boost-based power factor correction rectifiers are the most widely used topology and they are formed by a power stage (diode bridge and Boost converter) and a control system. However, there is a relevant control problem, because controllers are designed with linearized models of the converters for a specific operating point; consequently, the required dynamic performance and stability of the whole system for different operating points are not guaranteed. Another weak and common practice is to design the power and control stages independently. This paper proposes a co-design procedure for both the power stage and the control system of a Boost-based PFC rectifier, which is focused on guaranteeing the system’s stability in any operating conditions. Moreover, the design procedure assures a maximum switching frequency and the fulfillment of different design requirements for the output voltage: maximum overshoot and settling time before load disturbances, maximum ripple, and the desired damping ratio. The proposed control has a cascade structure, where the inner loop is a sliding-mode controller (SMC) to track the inductor current reference, and the outer loop is an adaptive PI regulator of the output voltage, which manipulates the amplitude of the inductor current reference. The paper includes the stability analysis of the SMC, the design procedure of the inductor to guarantee the system stability, and the design of the adaptive PI controller parameters and the capacitor to achieve the desired dynamic performance of the output voltage. The proposed rectifier is simulated in PSIM and the results validate the co-design procedures and show that the proposed system is stable for any operating conditions and satisfies the design requirements.
Full article
(This article belongs to the Section Computational Engineering)
Open Access Article
Natural Convection Flow over a Vertical Permeable Circular Cone with Uniform Surface Heat Flux in Temperature-Dependent Viscosity with Three-Fold Solutions within the Boundary Layer
Computation 2022, 10(4), 60; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10040060 - 09 Apr 2022
Abstract
The aim of this study is to investigate the effects of temperature-dependent viscosity on the natural convection flow from a vertical permeable circular cone with uniform heat flux. As part of numerical computation, the governing boundary layer equations are transformed into a non-dimensional form. The resulting nonlinear system of partial differential equations is then reduced to local non-similarity equations which are solved computationally by three different solution methodologies, namely, (i) perturbation solution for small transpiration parameter (ξ), (ii) asymptotic solution for large ξ, and (iii) the implicit finite difference method together with a Keller box scheme for all ξ. The numerical results of the velocity and viscosity profiles of the fluid are displayed graphically with heat transfer characteristics. The shearing stress in terms of the local skin-friction coefficient and the rate of heat transfer in terms of the local Nusselt number (Nu) are given in tabular form for the viscosity parameter (ε) and the Prandtl number (Pr). The viscosity is a linear function of temperature which is valid for small Prandtl numbers (Pr). The three-fold solutions were compared as part of the validations with various ranges of Pr numbers. Overall, good agreements were established. The major finding of the research provides a better demonstration of how temperature-dependent viscosity affects the natural convective flow. It was found that increasing Pr, ξ, and ε decrease the local skin-friction coefficient, but ξ has more influence on increasing the rate of heat transfer, as the effect of ε was erratic at small and large ξ. Furthermore, at the variable Pr, a large ξ increased the local maxima of viscosity at large extents, particularly at low Pr, but the effect on temperature distribution was found to be less significant under the same condition. However, at variable ε and fixed Pr, the temperature distribution was observed to be more influenced by ε at small ξ, whereas large ξ dominated this scheme significantly regardless of the variation in ε. The validations through three-fold solutions act as evidence of the accuracy and versatility of the current approach.
Full article
(This article belongs to the Special Issue Applications of Computational Mathematics to Simulation and Data Analysis)
Open Access Article
An Alternative Methodology to Compute the Geometric Tortuosity in 2D Porous Media Using the A-Star Pathfinding Algorithm
Computation 2022, 10(4), 59; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10040059 - 02 Apr 2022
Abstract
Geometric tortuosity is an essential characteristic to consider when studying a porous medium’s morphology. Knowing the material’s tortuosity allows us to understand and estimate the different diffusion transport properties of the analyzed material. Geometric tortuosity is useful to compute parameters, such as the effective diffusion coefficient, inertial factor, and diffusibility, which are commonly found in porous media materials. This study proposes an alternative method to estimate the geometric tortuosity of digitally created two-dimensional porous media. The porous microstructure is generated by using the PoreSpy library of Python and converted to a binary matrix for the computation of the parameters involved in this work. As a first step, porous media are digitally generated with porosity values from 0.5 to 0.9; then, the geometric tortuosity is determined using the A-star algorithm. This approach, commonly used in pathfinding problems, improves the use of computational resources and complies with the theory found in the literature. Based on the obtained results, the best geometric tortuosity–porosity correlations are proposed. The selection of the best correlation considers the coefficient of determination value (99.7%) with a confidence interval of 95%.
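A compact sketch of the approach under simplifying assumptions (a random binary matrix instead of a PoreSpy-generated medium, 4-connectivity, unit steps): A* finds the shortest pore-space path between inlet and outlet, and the geometric tortuosity is taken as that path length divided by the straight-line sample thickness.

```python
import heapq
import numpy as np

def astar(grid, start, goal):
    """Shortest path length through pore space (grid == 1) on a 2D binary matrix."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
    open_heap, g = [(h(start), start)], {start: 0}
    while open_heap:
        _, cur = heapq.heappop(open_heap)
        if cur == goal:
            return g[cur]
        for d in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + d[0], cur[1] + d[1])
            if 0 <= nxt[0] < grid.shape[0] and 0 <= nxt[1] < grid.shape[1] and grid[nxt]:
                if g[cur] + 1 < g.get(nxt, np.inf):
                    g[nxt] = g[cur] + 1
                    heapq.heappush(open_heap, (g[nxt] + h(nxt), nxt))
    return None                                               # no percolating path

rng = np.random.default_rng(3)
medium = (rng.random((60, 60)) < 0.8).astype(int)             # toy porous medium, porosity ~0.8
medium[0, :] = medium[-1, :] = 1                              # open inlet and outlet rows
length = astar(medium, (0, 0), (medium.shape[0] - 1, 0))
if length is not None:
    print("geometric tortuosity ≈", length / (medium.shape[0] - 1))
```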
Full article
(This article belongs to the Section Computational Engineering)
Open Access Article
Understanding the Complex Impacts of Seatbelt Use on Crash Outcomes
Computation 2022, 10(4), 58; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10040058 - 01 Apr 2022
Abstract
Despite the importance of seatbelt use in the reduction of injuries and fatalities, the majority of past studies failed to account for the complex nature of seatbelt effects on roadway safety. The complexity of seatbelt use is especially related to a possible association between seatbelt use and other factors at the time of crashes. Ignoring those interaction terms might result in unrealistic or biased point estimates regarding the underlying impact of seatbelt use on roadway safety. For instance, is the impact of seatbelt use on the severity of crashes stable, or does it vary based on other factors such as gender? Or does the impact of seatbelt use change based on whether a driver is under the influence of alcohol or not? The mixed logit model was used to model the severity of crashes. In this study, we focused on interaction terms between seatbelt use and all other plausible predictors of crashes. The results highlighted that there are important and significant interaction terms between seatbelt status and other predictors such as drivers under the influence (DUI), drivers with invalid driver’s licenses, lack of attention in crashes, having a citation record, ejected drivers, and other environmental and roadway characteristics. For instance, it was found that the impact of seatbelt use on the severity of crashes varies based on the actions that drivers took before crashes, such as improper lane changes or following too close. On the other hand, seatbelt use is more effective in crash severity reduction for ejected drivers and less effective for drivers under the influence of alcohol or unattended drivers. The results provide important information to gain a better understanding regarding the effectiveness of seatbelt use.
Full article

Open Access Article
Applying Machine Learning Methods and Models to Explore the Structure of Traffic Accident Data
Computation 2022, 10(4), 57; https://0-doi-org.brum.beds.ac.uk/10.3390/computation10040057 - 31 Mar 2022
Abstract
The problem of reducing the increasing number of road traffic accidents has become more relevant in recent years. According to the United Nations plan this number has to be halved by 2030. A very effective way to handle it is to apply the machine learning paradigm to retrospective road traffic accident datasets. This case study applies machine learning techniques to form typical “portraits” of drivers violating road traffic rules by clustering available data into seven homogeneous groups. The obtained results can be used in forming effective marketing campaigns for different target groups. Another relevant problem under consideration is to use available retrospective statistics on mechanical road traffic accidents without victims to estimate the probable type of road traffic accident for the driver taking into account her/his personal features (such as social characteristics, typical road traffic rule violations, driving experience, and age group). For this purpose several machine learning models were applied and the results were discussed.
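A hedged sketch of the clustering step (the study's actual features and preprocessing are not reproduced here): k-means with seven clusters on standardized driver attributes, yielding the homogeneous groups described above.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
# Illustrative driver attributes; not the retrospective accident dataset itself.
drivers = pd.DataFrame({
    "age": rng.integers(18, 75, 1000),
    "experience_years": rng.integers(0, 50, 1000),
    "speeding_violations": rng.poisson(1.5, 1000),
    "prior_accidents": rng.poisson(0.4, 1000),
})

X = StandardScaler().fit_transform(drivers)
labels = KMeans(n_clusters=7, n_init=10, random_state=0).fit_predict(X)
print(pd.Series(labels).value_counts().sort_index())   # size of each driver "portrait"
```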
Full article
(This article belongs to the Special Issue Control Systems, Mathematical Modeling and Automation)
Topics
Topic in
Entropy, Fractal Fract, Dynamics, Mathematics, Computation, Axioms
Advances in Nonlinear Dynamics: Methods and Applications
Topic Editors: Ravi P. Agarwal, Maria Alessandra Ragusa
Deadline: 5 December 2022
Topic in
Entropy, Algorithms, Computation, MAKE, Energies, Materials
Artificial Intelligence and Computational Methods: Modeling, Simulations and Optimization of Complex Systems
Topic Editors: Jaroslaw Krzywanski, Yunfei Gao, Marcin Sosnowski, Karolina Grabowska, Dorian Skrobek, Ghulam Moeen Uddin, Anna Kulakowska, Anna Zylka, Bachil El Fil
Deadline: 30 December 2022
Topic in
Applied Sciences, Computation, Materials, Polymers, Modelling
Computational Materials Science for Polymers
Topic Editors: Mikhail G. Kiselev, Yury Budkov
Deadline: 28 February 2023
Topic in
Chemistry, IJMS, Mathematics, Symmetry, Computation
Molecular Topology and Computation
Topic Editors: Lorentz Jäntschi, Dusanka Janezic
Deadline: 1 February 2024

Conferences
6 October 2021–6 October 2031
15th International Conference on Practical Applications of Computational Biology & Bioinformatics

Special Issues
Special Issue in
Computation
Integrated Computer Technologies in Mechanical Engineering – Synergetic Engineering
Guest Editors: Mykola Nechyporuk, Vladimir Pavlikov, Dmitriy Kritskiy
Deadline: 20 June 2022
Special Issue in
Computation
Applications of Statistics and Machine Learning in Electronics
Guest Editors: Stefan Hensel, Marin B. Marinov, Malinka Ivanova, Maya Dimitrova, Hiroaki Wagatsuma
Deadline: 25 July 2022
Special Issue in
Computation
Calculations in Solution
Guest Editor: Liliana Mammino
Deadline: 31 July 2022
Special Issue in
Computation
Recent Advances in Process Modeling and Optimisation
Guest Editors: George Tsakalidis, Kostas Vergidis
Deadline: 31 August 2022