Article

Compressive Strength Prediction of Fly Ash Concrete Using Machine Learning Techniques

1 School of Architecture and Civil Engineering, Zhengzhou University of Industrial Technology, Zhengzhou 451100, China
2 School of Civil Engineering, Southeast University, Nanjing 211189, China
3 School of Civil Engineering, Xinyang College, Xinyang 464000, China
* Author to whom correspondence should be addressed.
Submission received: 19 April 2022 / Revised: 3 May 2022 / Accepted: 19 May 2022 / Published: 22 May 2022
(This article belongs to the Special Issue Strength and Performance of Building Materials)

Abstract

It is time-consuming and uneconomical to estimate the strength properties of fly ash concrete using conventional compression experiments. For this reason, four machine learning models, namely the extreme learning machine, random forest, the original support vector regression (SVR), and an SVR model optimized by a grid search algorithm, were proposed to predict the compressive strength of fly ash concrete on a dataset of 270 groups. The prediction results of the proposed models were compared using five evaluation indices, and the relative importance and effect of each input variable on the output compressive strength were analyzed. The results showed that the optimized hybrid model exhibited the best predictive behavior of the four models and can be used to forecast the compressive strength of fly ash concrete for a specific mix design ratio before conducting laboratory compression tests, which saves the cost of specimens and laboratory testing. Among the eight input variables, age and water were the two relatively most important features, with superplasticizer and fly ash being of weaker relative importance.

1. Introduction

Concrete is one of the most commonly used building materials in the world. However, the production of concrete and its ingredients requires large amounts of energy; it not only consumes natural resources but also releases carbon dioxide, which places huge pressure on the environment [1,2,3]. The use of supplementary cementitious materials (SCMs) in concrete can help reduce CO2 emissions [4,5,6]. Therefore, utilizing SCMs in concrete may be an effective and sustainable approach [7]. Fly ash is an SCM that can partially replace cement in concrete. In addition to reducing greenhouse gas emissions, fly ash concrete has several advantages [8]: (1) it can improve the fluidity, cohesion, and water retention of concrete; (2) it can reduce the heat release of hydration in concrete and significantly reduce temperature cracking, which is especially beneficial for large-volume concrete projects; (3) owing to the secondary hydration reaction, the compactness of the concrete is increased and the interface structure is improved, which improves the impermeability and sulfate corrosion resistance of the concrete; (4) for concrete of the same strength, the addition of fly ash can reduce the amount of cement by about 10%, which greatly reduces the production cost. Moreover, the durability of fly ash concrete is higher than that of plain cement concrete due to the prolonged pozzolanic reaction. Therefore, the use of fly ash in concrete can help reduce CO2 emissions by lowering the cement demand while producing materials with improved mechanical properties.
Among the mechanical properties of concrete, compressive strength is one of the most basic yet important. To achieve the expected compressive strength of fly ash concrete through a reasonable mix ratio, the traditional method requires repeatedly adjusting the concrete mix ratio, producing laboratory specimens, and subjecting them to compression tests to obtain their compressive strength. If the strength does not reach the expected standard, the specimens must be recreated, which is a very time-consuming and labor-intensive task. If a method could roughly estimate the compressive strength of concrete for a given mix before the compression tests, it would greatly reduce the time and cost of testing and producing specimens.
With the advent of machine learning and its development in the field of civil engineering, various machine learning models have been applied and have achieved good results in predicting the mechanical properties of concrete [9,10,11,12,13,14,15,16,17]. Han et al. [18] proposed an improved random forest model to predict the compressive strength of high-performance concrete. The results showed that the prediction accuracy of the model was higher than that of other algorithms, and the method demonstrated a strong generalization capability. Saha et al. [19] employed the support vector regression (SVR) method to predict the compressive strength of self-compacting concrete and reached a high correlation coefficient R exceeding 0.97 between the predicted and actual strength. Moreover, the prediction performance of the SVR model was better than that of the artificial neural network (ANN) and multivariable regression analysis. Farooq et al. [20] introduced four machine learning algorithms (random forest, gene expression programming, ANN, and decision tree) to predict the compressive strength of concrete, and random forest showed the best prediction performance. Al-Shamiri et al. [21] developed an extreme learning machine model to predict the strength of high-strength concrete, and the model achieved high accuracy and good generalization ability. Chen et al. [22] utilized two models, the long short-term memory (LSTM) network and SVR, to predict the compressive strength of high-strength concrete, and the results showed that the LSTM outperformed the original SVR model with a higher correlation coefficient and smaller error metrics. To improve the performance of the original SVM, Ling et al. [23] proposed a combination of SVM and K-fold cross-validation to predict the concrete compressive strength. The prediction accuracy of the proposed hybrid model improved when compared to the original SVM model.
In general, these intelligent models have shown good applicability and performance in predicting the mechanical properties of concrete. However, it is undeniable that the predictive performance of many original models is limited, and for a fixed dataset, the performance of an original model often depends on its hyper-parameters. Therefore, one active branch of research is the application of optimization algorithms to tune machine learning models, and further exploration in this area is needed to better understand and apply them. Considering the good performance of the SVR model in regression prediction, this paper employs SVR as the base model and uses a grid search (GS) optimization algorithm to improve its prediction performance and achieve an accurate prediction of the compressive strength of fly ash concrete. The prediction results of this model are then compared with those of three other models: the extreme learning machine (ELM), the original SVR, and random forest.

2. Methodology

2.1. Extreme Learning Machine

The extreme learning machine is used to train single hidden layer feedforward neural networks (SLFNs). Unlike traditional SLFN training algorithms, the extreme learning machine (Figure 1) randomly selects the input layer weights and the hidden layer biases. The output layer weights are calculated by minimizing a loss function consisting of the training error term and the regularization term of the output layer weights, based on the Moore–Penrose generalized inverse matrix theory. Theoretical studies show that the ELM maintains the universal approximation capability of the SLFN even if the hidden layer nodes are generated randomly. In the past decade, the theory and applications of the ELM have been widely studied; from the perspective of learning efficiency, the extreme learning machine has the advantages of few training parameters, fast learning speed, and strong generalization ability. The output of an ELM with L hidden nodes can be expressed as follows.
f_L(x) = \sum_{i=1}^{L} \beta_i h_i(x) = H(x)\beta
where x is the input; β is the output weight between the hidden layer and the output layer; h_i(x) is the output of the i-th hidden layer node; and H(x) is the output of the hidden layer. The output functions of the hidden layer nodes are not unique, and different output functions can be used for different hidden layer neurons. In practical applications, h_i(x) is usually represented as follows.
h_i(x) = g(W_i, b_i, x) = g(W_i \cdot x + b_i)
where g(x) is the activation function; the sigmoid function is usually used:
g(x) = \frac{1}{1 + e^{-x}} = \frac{e^x}{e^x + 1}
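To make the procedure concrete, the following is a minimal sketch of an ELM regressor in Python. It is an assumed illustration consistent with the equations above, not the implementation used in this study: the input weights W_i and biases b_i are drawn at random, and the output weights β are computed from the Moore–Penrose pseudo-inverse of the hidden layer output matrix H.

```python
import numpy as np

class ELMRegressor:
    """Single-hidden-layer ELM: random input weights, analytic output weights."""

    def __init__(self, n_hidden=50, random_state=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(random_state)

    def _hidden(self, X):
        # h_i(x) = g(W_i . x + b_i) with the sigmoid activation g
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y):
        # Input weights W and biases b are drawn at random and never updated
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # Output weights beta via the Moore-Penrose pseudo-inverse of H
        self.beta = np.linalg.pinv(H) @ y
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Placeholder data: 270 samples, 8 mix-design features, strength in MPa
rng = np.random.default_rng(1)
X, y = rng.random((270, 8)), rng.random(270) * 80
print(ELMRegressor(n_hidden=40).fit(X, y).predict(X[:3]))
```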

2.2. Random Forest

Random forest is one of the most advanced ensemble algorithms, with the advantages of few model parameters and high resistance to overfitting. Decision trees are the basic predictors of a random forest. As shown in Figure 2, the random forest regression algorithm uses decision trees as the base model and produces a series of differentiated decision tree models by constructing different training datasets and different feature spaces, usually using voting or averaging to obtain the final result [18]. The mathematical expression is shown below.
H(x) = \frac{1}{k}\sum_{i=1}^{k} h_i(x)
where H(x) denotes the predicted value of the random forest regression model and h_i(x) denotes the output of the i-th decision tree model.
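In practice, this averaging is available off the shelf; the sketch below uses scikit-learn's RandomForestRegressor on placeholder data, and the settings shown are illustrative assumptions rather than the configuration used in this study.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Placeholder data: 270 samples, 8 mix-design features, strength in MPa
rng = np.random.default_rng(0)
X, y = rng.random((270, 8)), rng.random(270) * 80

# k = 100 decision trees, each trained on a bootstrap sample of the data
rf = RandomForestRegressor(n_estimators=100, random_state=0)
rf.fit(X, y)

# H(x) is the average of the k individual tree predictions h_i(x)
print(rf.predict(X[:3]))
```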

2.3. Support Vector Regression Model

Unlike traditional regression techniques, support vector machine regression is mainly based on the structural risk minimization principle instead of the empirical risk minimization principle [24]. As shown in Figure 3, consider a training set D = \{(x_i, y_i) \mid i = 1, 2, \ldots, n\}, x_i \in R^n, y_i \in R. When the regression problem is linear, a function f(x) is sought on R^n such that y_i = f(x_i), and f(x) can be described as follows.
f(x) = \omega \cdot x + b
where ω is the weight vector and b is the bias. In the nonlinear case, the regression problem can be transformed into the following optimization problem [25].
\min R(\omega, \xi_i, \xi_i^*) = \frac{1}{2}\omega \cdot \omega + C\sum_{i=1}^{n}(\xi_i + \xi_i^*)
\text{s.t.} \quad y_i - [\omega \cdot \phi(x_i) + b] \le \varepsilon + \xi_i, \quad \omega \cdot \phi(x_i) + b - y_i \le \varepsilon + \xi_i^*, \quad \xi_i, \xi_i^* \ge 0
where C is the penalty parameter (a larger C imposes a heavier penalty on errors); ε is the insensitive loss parameter; and ξ_i and ξ_i^* are the two slack variables.
The problem is generally solved by introducing the Lagrange function and working with its dual problem.
\max W(\alpha, \alpha^*) = -\frac{1}{2}\sum_{i,j=1}^{n}(\alpha_i - \alpha_i^*)(\alpha_j - \alpha_j^*)[\phi(x_i) \cdot \phi(x_j)] - \varepsilon\sum_{i=1}^{n}(\alpha_i + \alpha_i^*) + \sum_{i=1}^{n}(\alpha_i - \alpha_i^*)y_i
\text{s.t.} \quad \sum_{i=1}^{n}(\alpha_i - \alpha_i^*) = 0, \quad 0 \le \alpha_i, \alpha_i^* \le C, \quad i = 1, 2, \ldots, n
By introducing the kernel function K(x_i, x_j), the high-dimensional computation can be carried out implicitly in the support vector machine algorithm. The regression fitting function of the support vector machine is then:
f(x) = \omega \cdot \phi(x) + b = \sum_{i=1}^{n}(\alpha_i - \alpha_i^*)K(x_i, x) + b
Commonly used kernel functions include the linear kernel function, the polynomial kernel function, and the Gaussian kernel function, with the latter being the most widely utilized [26,27]. As a result, the Gaussian kernel function is adopted in this study.
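A corresponding sketch of an ε-SVR with the Gaussian (RBF) kernel is shown below using scikit-learn; the values of C, gamma, and epsilon are placeholders rather than the tuned values reported later, and the feature standardization step is an assumed (but common) preprocessing choice.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder data: 270 samples, 8 mix-design features, strength in MPa
rng = np.random.default_rng(0)
X, y = rng.random((270, 8)), rng.random(270) * 80

# epsilon-SVR with the Gaussian (RBF) kernel; C, gamma, epsilon are placeholders
model = make_pipeline(StandardScaler(),
                      SVR(kernel="rbf", C=1.0, gamma=0.1, epsilon=0.1))
model.fit(X, y)
print(model.predict(X[:3]))
```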

2.4. Grid Search Optimization Algorithm

It is generally known that the selection of hyper-parameters for the SVR model has a significant impact on the prediction results. Because the model's response to these parameters is highly nonlinear, determining a suitable combination of parameters such as the penalty parameter C and the kernel parameter g frequently requires many experiments. Although Chang and Lin [28] have done a great deal of work to make SVM easier to use, such as developing LIBSVM, the choice of the parameters C and g is still largely based on experience. As a result, there is a pressing need for automatic parameter tuning that returns the best combination once the search ranges have been specified. These issues can be solved using the grid search method. Grid search is perhaps the simplest and most widely used hyper-parameter search algorithm: it enumerates the possible values of each parameter and lists all possible combinations to generate a "grid". Each combination is then used for SVM training, and its performance is evaluated using cross-validation. In practice, the grid search method generally uses a wide search range and a large step size to locate the region of the global optimum, and then gradually reduces the search range and step size to find a more precise optimum, as sketched below.
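A coarse-to-fine grid search over C and g can be written with scikit-learn's GridSearchCV as follows; the exponential grids, the 2^0.2 refinement step, and the 5-fold cross-validation are illustrative assumptions rather than the exact settings of this study.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Placeholder data: 270 samples, 8 mix-design features, strength in MPa
rng = np.random.default_rng(0)
X, y = rng.random((270, 8)), rng.random(270) * 80

# Coarse grid: wide range, large step (powers of 2 from 2^-5 to 2^5)
coarse = {"C": 2.0 ** np.arange(-5, 6), "gamma": 2.0 ** np.arange(-5, 6)}
search = GridSearchCV(SVR(kernel="rbf"), coarse,
                      scoring="neg_mean_squared_error", cv=5)
search.fit(X, y)
c0 = np.log2(search.best_params_["C"])
g0 = np.log2(search.best_params_["gamma"])

# Fine grid: narrow range around the coarse optimum, smaller step of 2^0.2
fine = {"C": 2.0 ** np.arange(c0 - 1, c0 + 1.01, 0.2),
        "gamma": 2.0 ** np.arange(g0 - 1, g0 + 1.01, 0.2)}
search = GridSearchCV(SVR(kernel="rbf"), fine,
                      scoring="neg_mean_squared_error", cv=5)
search.fit(X, y)
print(search.best_params_, -search.best_score_)
```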

3. Materials and Dataset Description

3.1. Input and Output Variables

A dataset with a total of 270 groups was collected from the shared dataset of the UC Irvine Machine Learning Repository [29]. These data included eight input variables and one output variable. The input variables were cement, fly ash, water, superplasticizer, coarse aggregate, fine aggregate, age, and W/C (water-to-binder ratio). Table 1 and Figure 4 show the statistical characteristics and distribution of these variables. The distributions show that the variables are highly discrete. The Pearson coefficients shown in Figure 5 make it difficult to identify a clear relationship between the individual components and the output variable, which indirectly indicates a complex nonlinear correlation between the multiple input variables and the compressive strength. The purpose of this paper is to capture this nonlinear correlation. From all of the data samples, 216 sets were chosen at random as the training set and the remaining 54 as the test set.
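A sketch of the corresponding data preparation is given below; the file name and column names refer to a hypothetical local export of the dataset and are assumptions, not part of the repository itself.

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Hypothetical local export of the 270-group dataset (file/column names assumed)
df = pd.read_csv("fly_ash_concrete.csv")
X = df[["cement", "fly_ash", "water", "superplasticizer",
        "coarse_aggregate", "fine_aggregate", "age", "w_c"]]
y = df["strength"]

# 216 randomly chosen groups for training, the remaining 54 for testing
X_train, X_test, y_train, y_test = train_test_split(X, y, train_size=216,
                                                    random_state=42)
print(len(X_train), len(X_test))
```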

3.2. Evaluation Metrics

To quantify the effectiveness of the model training and prediction, five evaluation metrics were introduced [30]. The formulae for calculating these indicators are shown in Table 2 [31,32]. R reflects the correlation between the predicted and the actual compressive strength; the closer R is to 1, the stronger the correlation. The other four error indicators reflect the deviation and dispersion between the predicted and the measured compressive strength.
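These indices can be computed directly from the measured and predicted strengths; the helper below is a small sketch in which the function name and the sample values are illustrative.

```python
import numpy as np

def evaluate(y_true, y_pred):
    """Return the five indices of Table 2 for measured vs. predicted strength."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    r = np.corrcoef(y_true, y_pred)[0, 1]                      # correlation R
    mae = np.mean(np.abs(y_true - y_pred))                     # MAE
    mse = np.mean((y_true - y_pred) ** 2)                      # MSE
    rmse = np.sqrt(mse)                                        # RMSE
    mape = np.mean(np.abs((y_true - y_pred) / y_true)) * 100   # MAPE (%)
    return {"R": r, "MAE": mae, "MSE": mse, "RMSE": rmse, "MAPE": mape}

print(evaluate([30.0, 45.0, 60.0], [28.5, 47.0, 58.0]))
```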

4. Model Performance

For a grid search, a rough grid search is generally performed first, followed by a fine grid search. Figure 6 shows the corresponding MSE distribution for each search node under an exponential grid of [2^-5, 2^5] with a step of 2^0.2. After the grid search, the optimal combination of the hyper-parameters C and g was found.
For comparison and assessment purposes, Figure 7 shows the predicted and experimental values for the training and test sets in greater detail. Among the four models, the ELM model exhibited the worst prediction behavior, with correlation coefficients R less than 0.9 for both the training and testing sets. The prediction correlation coefficients R of the SVR and random forest models both exceeded 0.9, but the prediction performance of the random forest model on this dataset was better than that of the classical SVR model in terms of the correlation coefficient. However, the optimized SVR-GS model showed the best prediction capacity among these four models. With a correlation coefficient R above 0.94 for both the training and test sets, the predicted values matched the experimental values well at each sample point, confirming the SVR-GS model's correctness and effectiveness in capturing the intricate nonlinear relationships between the eight input variables and the compressive strength. It is worth mentioning that the predictive accuracy of the SVR model optimized by the grid search algorithm was greatly improved. This is mainly because, unlike the original SVR model, which uses default values or other arbitrary combinations of values for the hyper-parameters C and g, the optimized parameter combination achieves a global optimum, and this combination of optimal parameters greatly improves the predictive performance of the model.
Figure 8 compares the predicted and actual values for each sample for a more intuitive presentation, and the error metrics for model training and testing are shown in Figure 9. It can be observed that, for both the training and test sets, the highest values of RMSE, MSE, MAE, and MAPE were produced by the ELM model, while the hybrid SVR-GS model achieved the highest prediction accuracy, with the lowest RMSE = 4.814, MSE = 23.177, MAE = 3.432, and MAPE = 13.818%. Compared with the average relative errors of 23.71%, 16.51%, and 16.17% for the ELM, random forest, and SVR models, respectively, the optimized model achieved the smallest average relative error of 13.82%, which was less than 15%. This shows that the prediction accuracy of the SVR-GS model can basically meet engineering requirements and that it could serve as an efficient alternative method to predict the compressive strength of fly ash concrete.

5. Sensitivity Analysis of Input Variables

Each input variable influences the output compressive strength in the fly ash concrete mix design, although the degree of this influence varies. As a result, it is crucial to assess the relative importance and contribution of each variable to the final output. To analyze the feature importance of each variable, the Shapley additive explanations (SHAP) approach is introduced [33,34,35].
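A minimal SHAP sketch is shown below. The paper does not state which explainer was paired with the SVR-GS model, so the sketch trains a random forest surrogate and uses shap.TreeExplainer purely for illustration; the model-agnostic shap.KernelExplainer would be the alternative for explaining an SVR directly.

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

# Placeholder data: 270 samples, 8 mix-design features, strength in MPa
rng = np.random.default_rng(0)
X, y = rng.random((270, 8)), rng.random(270) * 80

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)

# Mean |SHAP| per feature gives the relative importance ranking (cf. Figure 10)
print(np.abs(shap_values).mean(axis=0))
# shap.summary_plot(shap_values, X)  # beeswarm plot analogous to Figure 11
```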
Figure 10 shows that age was the most important of the eight input factors in the current dataset, followed by W/C, water, cement, fine aggregate, and fly ash. The compressive strength was less affected by the coarse aggregate and the superplasticizer. Figure 11 shows whether these variables had a positive or negative effect on the output. Each point in Figure 11 represents an individual sample, and the variables are listed from top to bottom in order of decreasing relative importance. It can be seen that age, cement, fly ash, and superplasticizer were all beneficial to the compressive strength, and an increase in these variables caused an increase in the compressive strength. In contrast, water and W/C had a negative effect on the compressive strength.

6. Conclusions

In this work, an optimized SVR-GS model was introduced to predict the compressive strength of the fly ash concrete. The main findings are as follows.
(1)
The proposed hybrid model could effectively capture the complicated nonlinear correlations between the eight input variables and the output compressive strength of the fly ash concrete.
(2)
The prediction performance of the SVR-GS model was better than that of the other three machine learning models with a higher prediction accuracy and smaller error, and is recommended for the pre-estimation of the compressive strength of fly ash concrete before laboratory compression experiments.
(3)
Concerning the eight input variables, age was the most important, followed by W/C, water, cement, fine aggregate, and fly ash. Coarse aggregate and superplasticizer were less important for compressive strength. Moreover, age, cement, fly ash, and superplasticizer all played a positive role in the compressive strength and their increase led to an increase in the compressive strength, while water and W/C were negative for the compressive strength of fly ash concrete.
Due to the lack of sufficient data, the influence of other factors, such as the chemical composition of the concrete, aggregate type, and particle size, on the predictive performance of the model was not analyzed. As the corresponding datasets are collected, these influencing factors will be considered in the model as input features. Moreover, the effectiveness and validity of the model in practical applications will be further analyzed and discussed in future research.

Author Contributions

Y.J.: Writing response to comments, writing revised manuscript, software, validation. H.L.: Methodology, writing-original draft. Y.Z.: Data curation, resources. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Han, Y.; Lin, R.-S.; Wang, X.-Y. Compressive Strength Estimation and CO2 Reduction Design of Fly Ash Composite Concrete. Buildings 2022, 12, 139.
2. Ahmad, W.; Ahmad, A.; Ostrowski, K.A.; Aslam, F.; Joyklad, P.; Zajdel, P. Application of Advanced Machine Learning Approaches to Predict the Compressive Strength of Concrete Containing Supplementary Cementitious Materials. Materials 2021, 14, 5762.
3. Taji, I.; Ghorbani, S.; de Brito, J.; Tam, V.W.Y.; Sharifi, S.; Davoodi, A.; Tavakkolizadeh, M. Application of statistical analysis to evaluate the corrosion resistance of steel rebars embedded in concrete with marble and granite waste dust. J. Clean. Prod. 2019, 210, 837–846.
4. Vieira, G.L.; Schiavon, J.Z.; Borges, P.M.; da Silva, S.R.; de Oliveira Andrade, J.J. Influence of recycled aggregate replacement and fly ash content in performance of pervious concrete mixtures. J. Clean. Prod. 2020, 271, 122665.
5. Li, Y.; Qiao, C.; Ni, W. Green concrete with ground granulated blast-furnace slag activated by desulfurization gypsum and electric arc furnace reducing slag. J. Clean. Prod. 2020, 269, 122212.
6. Shubbar, A.A.; Jafer, H.; Dulaimi, A.; Hashim, K.; Atherton, W.; Sadique, M. The development of a low carbon binder produced from the ternary blending of cement, ground granulated blast furnace slag and high calcium fly ash: An experimental and statistical approach. Constr. Build. Mater. 2018, 187, 1051–1060.
7. Juenger, M.C.G.; Snellings, R.; Bernal, S.A. Supplementary cementitious materials: New sources, characterization, and performance insights. Cem. Concr. Res. 2019, 122, 257–273.
8. Diaz-Loya, I.; Juenger, M.; Seraj, S.; Minkara, R. Extending supplementary cementitious material resources: Reclaimed and remediated fly ash and natural pozzolans. Cem. Concr. Compos. 2019, 101, 44–51.
9. Barkhordari, M.S.; Armaghani, D.J.; Mohammed, A.S.; Ulrikh, D.V. Data-Driven Compressive Strength Prediction of Fly Ash Concrete Using Ensemble Learner Algorithms. Buildings 2022, 12, 132.
10. Moradi, M.J.; Khaleghi, M.; Salimi, J.; Farhangi, V.; Ramezanianpour, A.M. Predicting the compressive strength of concrete containing metakaolin with different properties using ANN. Measurement 2021, 183, 109790.
11. Tang, F.; Wu, Y.; Zhou, Y. Hybridizing Grid Search and Support Vector Regression to Predict the Compressive Strength of Fly Ash Concrete. Adv. Civ. Eng. 2022, 2022, 3601914.
12. Ahmad, A.; Ahmad, W.; Aslam, F.; Joyklad, P. Compressive strength prediction of fly ash-based geopolymer concrete via advanced machine learning techniques. Case Stud. Constr. Mater. 2022, 16, e00840.
13. Wu, Y.; Zhou, Y. Hybrid machine learning model and Shapley additive explanations for compressive strength of sustainable concrete. Constr. Build. Mater. 2022, 330, 127298.
14. Sonebi, M.; Cevik, A.; Grünewald, S.; Walraven, J. Modelling the fresh properties of self-compacting concrete using support vector machine approach. Constr. Build. Mater. 2016, 106, 55–64.
15. Yan, K.; Shi, C. Prediction of elastic modulus of normal and high strength concrete by support vector machine. Constr. Build. Mater. 2010, 24, 1479–1485.
16. Ahmad, A.; Chaiyasarn, K.; Farooq, F.; Ahmad, W.; Suparp, S.; Aslam, F. Compressive Strength Prediction via Gene Expression Programming (GEP) and Artificial Neural Network (ANN) for Concrete Containing RCA. Buildings 2021, 11, 324.
17. Han, B.; Wu, Y.; Liu, L. Prediction and uncertainty quantification of compressive strength of high-strength concrete using optimized machine learning algorithms. Struct. Concr. 2022, 1–14.
18. Han, Q.; Gui, C.; Xu, J.; Lacidogna, G. A generalized method to predict the compressive strength of high-performance concrete by improved random forest algorithm. Constr. Build. Mater. 2019, 226, 734–742.
19. Saha, P.; Debnath, P.; Thomas, P. Prediction of fresh and hardened properties of self-compacting concrete using support vector regression approach. Neural Comput. Appl. 2020, 32, 7995–8010.
20. Farooq, F.; Nasir Amin, M.; Khan, K.; Rehan Sadiq, M.; Faisal Javed, M.; Aslam, F.; Alyousef, R. A Comparative Study of Random Forest and Genetic Engineering Programming for the Prediction of Compressive Strength of High Strength Concrete (HSC). Appl. Sci. 2020, 10, 7330.
21. Al-Shamiri, A.K.; Kim, J.H.; Yuan, T.-F.; Yoon, Y.S. Modeling the compressive strength of high-strength concrete: An extreme learning approach. Constr. Build. Mater. 2019, 208, 204–219.
22. Chen, H.; Li, X.; Wu, Y.; Zuo, L.; Lu, M.; Zhou, Y. Compressive Strength Prediction of High-Strength Concrete Using Long Short-Term Memory and Machine Learning Algorithms. Buildings 2022, 12, 302.
23. Ling, H.; Qian, C.; Kang, W.; Liang, C.; Chen, H. Combination of Support Vector Machine and K-Fold cross validation to predict compressive strength of concrete in marine environment. Constr. Build. Mater. 2019, 206, 355–363.
24. Wu, Y.; Zhou, Y. Prediction and feature analysis of punching shear strength of two-way reinforced concrete slabs using optimized machine learning algorithm and Shapley additive explanations. Mech. Adv. Mater. Struct. 2022, 1–11.
25. Ghorbani, B.; Arulrajah, A.; Narsilio, G.; Horpibulsuk, S. Experimental investigation and modelling the deformation properties of demolition wastes subjected to freeze–thaw cycles using ANN and SVR. Constr. Build. Mater. 2020, 258, 119688.
26. Zhang, W.; Khan, A.; Huyan, J.; Zhong, J.; Peng, T.; Cheng, H. Predicting Marshall parameters of flexible pavement using support vector machine and genetic programming. Constr. Build. Mater. 2021, 306, 124924.
27. Wu, Y.; Li, S. Damage degree evaluation of masonry using optimized SVM-based acoustic emission monitoring and rate process theory. Measurement 2022, 190, 110729.
28. Chang, C.-C.; Lin, C.-J. Training v-Support Vector Regression: Theory and Algorithms. Neural Comput. 2002, 14, 1959–1977.
29. Lichman, M. UCI Machine Learning Repository; University of California, School of Information and Computer Science: Irvine, CA, USA, 2013. Available online: http://archive.ics.uci.edu/ml (accessed on 27 November 2020).
30. Jueyendah, S.; Lezgy-Nazargah, M.; Eskandari-Naddaf, H.; Emamian, S.A. Predicting the mechanical properties of cement mortar using the support vector machine approach. Constr. Build. Mater. 2021, 291, 123396.
31. Farooq, F.; Ahmed, W.; Akbar, A.; Aslam, F.; Alyousef, R. Predictive modeling for sustainable high-performance concrete from industrial wastes: A comparison and optimization of models using ensemble learners. J. Clean. Prod. 2021, 292, 126032.
32. Aslam, F.; Farooq, F.; Amin, M.N.; Khan, K.; Waheed, A.; Akbar, A.; Javed, M.F.; Alyousef, R.; Alabdulijabbar, H. Applications of Gene Expression Programming for Estimating Compressive Strength of High-Strength Concrete. Adv. Civ. Eng. 2020, 2020, 8850535.
33. Mangalathu, S.; Hwang, S.-H.; Jeon, J.-S. Failure mode and effects analysis of RC members based on machine-learning-based SHapley Additive exPlanations (SHAP) approach. Eng. Struct. 2020, 219, 110927.
34. Lyngdoh, G.A.; Zaki, M.; Krishnan, N.M.A.; Das, S. Prediction of concrete strengths enabled by missing data imputation and interpretable machine learning. Cem. Concr. Compos. 2022, 128, 104414.
35. Mangalathu, S.; Shin, H.; Choi, E.; Jeon, J.-S. Explainable machine learning models for punching shear strength estimation of flat slabs without transverse reinforcement. J. Build. Eng. 2021, 39, 102300.
Figure 1. Network structure of the extreme learning machine.
Figure 2. Schematic diagram of the random forest model (Reprinted with permission from [18]).
Figure 3. Schematic diagram of the support vector regression.
Figure 4. Histograms of the input and output variables.
Figure 5. Pearson’s correlation coefficient between any two variables.
Figure 6. Search process for the best combination of hyper-parameters C and g.
Figure 7. Scatter plot of the prediction results of the four models.
Figure 8. The training and test results of the models.
Figure 9. The performance criteria of the two models.
Figure 10. Each variable’s relative contribution to the compressive strength.
Figure 11. The SHAP summary plot of the compressive strength.
Table 1. Variable statistical characteristics.

Variable | Unit | Max | Min | Average | Standard Deviation | Kurtosis | Skewness
Cement | kg/m3 | 540 | 247 | 361 | 85.33 | −0.50 | 0.82
Fly ash | kg/m3 | 142 | 0 | 28 | 48.26 | −0.44 | 1.2
Water | kg/m3 | 228 | 140 | 184 | 19.13 | 0.29 | −0.38
Superplasticizer | kg/m3 | 28 | 0 | 4 | 5.94 | 3.52 | 1.77
Coarse aggregate | kg/m3 | 1125 | 801 | 997 | 77.12 | −0.19 | −0.26
Fine aggregate | kg/m3 | 900 | 594 | 776 | 79.77 | −0.07 | −0.67
Age | day | 365 | 1 | 53 | 75.91 | 7.01 | 2.62
W/C | - | 0.70 | 0.27 | 0.53 | 0.11 | −0.04 | −0.92
Strength | MPa | 80 | 6 | 36 | 14.97 | −0.13 | 0.45
Table 2. Equations of the evaluation indices.

Evaluation Metric | Equation
R | R = \frac{n\sum_{i=1}^{n} Y_e Y_p - \sum_{i=1}^{n} Y_e \sum_{i=1}^{n} Y_p}{\sqrt{\left[n\sum_{i=1}^{n} Y_e^2 - (\sum_{i=1}^{n} Y_e)^2\right]\left[n\sum_{i=1}^{n} Y_p^2 - (\sum_{i=1}^{n} Y_p)^2\right]}}
MAE | MAE = \frac{1}{n}\sum_{i=1}^{n} |Y_e - Y_p|
MSE | MSE = \frac{1}{n}\sum_{i=1}^{n} (Y_e - Y_p)^2
RMSE | RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n} (Y_e - Y_p)^2}
MAPE | MAPE = \frac{1}{n}\sum_{i=1}^{n} \left|\frac{Y_e - Y_p}{Y_e}\right| \times 100\%

where Y_e is the true result of compressive strength and Y_p is the predicted result of compressive strength.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
