Article

AI-Based Nano-Scale Material Property Prediction for Li-Ion Batteries

Mohit Anil Lal, Akashdeep Singh, Ryan Mzik, Amirmasoud Lanjan and Seshasai Srinivasan
1 W Booth School of Engineering Practice and Technology, McMaster University, Hamilton, ON L8S 4L8, Canada
2 Department of Mechanical Engineering, McMaster University, Hamilton, ON L8S 4L8, Canada
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Submission received: 20 November 2023 / Revised: 17 January 2024 / Accepted: 19 January 2024 / Published: 29 January 2024

Abstract
In this work, we propose a machine learning (ML)-based technique that can learn interatomic potential parameters for various particle–particle interactions employing quantum mechanics (QM) calculations. This ML model can be used as an alternative to QM calculations for predicting non-bonded interactions in a computationally efficient manner. Using these parameters as input to molecular dynamics simulations, we can predict a diverse range of properties, enabling researchers to design new and novel materials suitable for various applications in the absence of experimental data. We employ our ML-based technique to learn the Buckingham potential, a non-bonded interatomic potential. Subsequently, we utilize these predicted values to compute the densities of four distinct molecules, achieving an accuracy exceeding 93%. This serves as a strong demonstration of the efficacy of our proposed approach.

1. Introduction

The chemical compound space (CCS) is the theoretical space consisting of every possible compound known (and unknown) to us [1,2]. Even our largest databases, containing approximately 10⁸ known substances, are a mere drop in the ocean compared with an estimated 10¹⁸⁰ substances that possibly make up the CCS [3,4]. Needless to say, the next big discovery of a compound that can revolutionize the energy storage devices of the future is far from trivial.
The status quo for techniques used in the discovery of new and novel materials to enhance battery technologies has progressed from expensive and time-consuming empirical trial-and-error methods to the more recent first-principles approaches of quantum mechanics (QM) [5,6,7,8,9], Monte Carlo simulations and molecular dynamics (MD) [10,11,12,13,14]. QM calculations evaluate electron–electron interactions by solving the complex Schrödinger equation, thereby enabling accurate results for a wide variety of properties. However, the computational cost is a bottleneck for molecules larger than a few hundred atoms. Hence, for multi-component or multi-layer structures such as the solid electrolyte interface layer, QM is not a feasible approach. Additionally, many battery components, including ionic and polymer electrolytes, crystal structures and electrode–electrolyte interactions [11,15,16,17,18], are better analyzed on larger length and time scales that are inaccessible with QM. MD simulations simplify particle–particle interactions to five main types of interactions, namely non-bonded, bonded, angle, dihedral and improper interactions. These interactions, each described by a simple algebraic equation, reduce the computational cost significantly and are applicable to systems almost 10⁶ times larger. To analyze ion migration in a perovskite nickelate with 200 atoms, QM techniques, even using the density functional theory (DFT) approximation to reduce computational costs, require about 10⁵ core-hours of computational time for a picosecond-range simulation. In contrast, MD simulations with 10⁵ atoms required only 10⁴ core-hours of computational time [19]. Thus, MD simulations enable the analysis of a wide variety of properties and behaviors of materials at the atomic scale, such as the crystal structure, thermal properties and mechanical properties, which are often too complex to model using QM calculations. In a recent review, Sun et al. [20] presented the use of MD simulations to optimize lithium metal batteries, investigating in detail the transport structure of Li ions, the electrochemical processes at the electronic, atomic or molecular level, the Li+ transport mechanism and the Li deposition behavior.
Though MD simulations are widely used to investigate the properties of materials at the atomic level, these simulations rely on experimentally derived interatomic potential parameters that determine the forces between particles [21]. This dependence on prior experimental data poses a challenge in using MD to design new and novel materials. To address this issue, Lanjan et al. [22] recently proposed a novel computational framework that couples QM calculations with MD simulations. This generates a wide range of crystal structures by varying a single system parameter (e.g., bond length) while keeping other parameters relaxed at their minimum energy level. The QM calculations are then used to evaluate the system’s energy as a function of these changes, and the resulting data points are used to fit the interaction equations to estimate the potential parameters for each type of particle–particle interaction. Employing this framework enables the study of crystal structures with the accuracy of QM calculations but at the speed and system sizes permissible by MD techniques. While this framework enhances nano-based computational methods, the QM calculations still need massive amounts of computational power, which can be significantly reduced with the AI-based technique proposed in this work.
The emergence of ML, deep learning (DL) and artificial intelligence (AI) has helped alleviate the bottlenecks posed by QM and MD simulations and has made it possible to expand the scope of our search for novel materials in the CCS. ML and DL algorithms are orders of magnitude faster than ab initio techniques. Unlike the QM-based simulations, which can take days to complete, ML algorithms can produce results within seconds. The use of AI has brought a paradigm shift in research related to improving battery technology as well as molecular property prediction and material discovery in general. For example, Sandhu et al. [23] used DL to examine the optimal crystal structures of doped cathode materials in lithium manganese oxide (LMO) batteries. Failed or unsuccessful synthesis data were used to predict the reaction success rate for the crystallization of templated vanadium selenites [24]. Using QM and ML techniques, Lu et al. [25] developed a method to predict undiscovered hybrid organic-inorganic perovskites (HOIPs) for photovoltaics. Their screening technique was able to shortlist six HOIPs with ideal band gaps and thermal stabilities from 5158 unexplored candidates. To identify material compositions with suitable properties, Meredig et al. [26] built an ML model trained on thousands of ground state crystal structures and used this model to scan roughly 1.6 million candidate compositions of novel ternary compounds to produce a ranked list of 4500 stable ternary compositions that would possibly represent undiscovered materials.
The broad approach employed when using AI-based property prediction models consists of three overarching components: a reference database consisting of relevant quantum mechanical data which is used to fit the AI model; a mathematical representation that not only uniquely describes the attributes of the reference materials but also enables effective model training; and finally a suitable AI model that can accomplish the learning task itself. In the ensuing sections, we describe these components in further detail.

1.1. Database

The fundamental premise of AI is the ability to draw inferences from patterns in data and enable accurate predictions in unknown domains. Hence, the data, which make up the training examples for our learning task, become a critical aspect of successful prediction. With the introduction of the Materials Genome Initiative in 2011 [27], the United States signaled the importance of unifying the infrastructure for material innovation and harnessing the power of material data. Toward the same goal, there has been an advent of various materials databases, such as the Inorganic Crystal Structure Database (ICSD) [28], the Open Quantum Materials Database (OQMD) [29], the Cambridge Structural Database [30], the Harvard Clean Energy Project [31], the Materials Project [32] and AFLOWLIB [33]. Specifically, the size of the training set, the diversity of the dataset and the degrees of freedom all contribute to how effective the learning task for a specific objective can be [34]. In predicting properties such as the band gap energy and glass-forming ability for crystalline and amorphous materials, Ward et al. [35] methodically selected a chemically diverse set of attributes taken from the OQMD. Similarly, for electronic-structure problems, Schütt et al. [36] noted that the density of states at the Fermi energy is the critical property of concern. In predicting this property, around 7000 crystal structures from the ICSD were used, with higher predicted variance observed for certain configurations and a need to extend the training set in these specific areas. The process of material discovery is complex and diverse, and it is not surprising that there is no one-size-fits-all database that can accurately predict the properties of all materials. The physical and chemical characteristics of materials vary widely, requiring different methods and techniques for precise analysis and prediction. Moreover, current methodologies rely on the availability of well-curated data or the ability to manually generate such data, which is a daunting and often infeasible task, especially for new and unexplored materials. Thus, there is a need to develop generalizable and adaptable approaches that can efficiently handle a diverse range of materials, properties and configurations without the need for extensive data generation or curation.

1.2. Molecular Representation

ML algorithms draw inferences from data to establish a relationship between the atomic structure and the properties of a system. To enable the best possible structure–property approximation, a good representation of the material (also referred to as the ‘fingerprint’ or ‘descriptor’) is crucial. The first Hohenberg–Kohn theorem of DFT proves that the electron density of a system contains all the information needed to describe its ground state properties, making it a ‘universal descriptor’ that can be used to predict these properties without knowledge of the details of the interactions between the electrons [37]. Crucially, for ML, a good molecular representation must be invariant to rotation and translation of the system as well as to permutation of atomic indices [38]. Therefore, unfortunately, the electronic density is not a universally suitable representation of a system. Additionally, a good descriptor must be unique, continuous, compact and computationally cheap [38]. Often, there are multiple molecular geometries that possess similar values for a property. Hence, there is no single universal representation for all properties, leading to hundreds of molecular descriptors that are suitable only for a small subset of the CCS and a small subset of properties [39]. A commonly used molecular representation that satisfies the above-mentioned criteria of a good representation is the ‘Coulomb matrix’. It uses the same parameters that constitute the Hamiltonian for any given system, namely the set of Cartesian coordinates R_I and nuclear charges Z_I [40]. While the Coulomb matrix representation has shown tremendous success for property prediction in finite systems, it is unable to do the same for infinite periodic crystal structures [36]. Hansen et al. [41] proposed a new descriptor called ‘bag-of-bonds’ that performed better because it incorporates the many-body interactions of a system. In fact, the use of different descriptors in ML efforts for material property prediction is so common that there are open-source software packages providing implementations for a myriad of different descriptors [38]. Unfortunately, a lack of clarity on the right descriptor makes the use of AI inaccessible to researchers who possess domain expertise but lack the needed knowledge of AI. Additionally, the lack of generalizability of a chosen descriptor makes current AI-based techniques inaccurate and narrow in scope. To overcome these challenges, the novel technique proposed in this work makes material discovery and property prediction easier and more accessible without the time-consuming process of selecting a suitable descriptor. Specifically, our approach leverages a two-stage process combining AI with MD simulations.
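As a concrete illustration of the descriptor concept, the sketch below builds the Coulomb matrix of Rupp et al. [40] from nuclear charges and Cartesian coordinates; the water geometry used in the example is illustrative only and is not data from this work.

import numpy as np

def coulomb_matrix(Z, R):
    # Coulomb matrix: 0.5*Z_i^2.4 on the diagonal and Z_i*Z_j/|R_i - R_j|
    # off the diagonal (distances in atomic units).
    Z = np.asarray(Z, dtype=float)
    R = np.asarray(R, dtype=float)
    n = len(Z)
    M = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i == j:
                M[i, j] = 0.5 * Z[i] ** 2.4
            else:
                M[i, j] = Z[i] * Z[j] / np.linalg.norm(R[i] - R[j])
    return M

# Example: a water molecule (nuclear charges and coordinates in Bohr; the
# geometry is an illustrative placeholder)
Z = [8, 1, 1]
R = [[0.000, 0.000, 0.222],
     [0.000, 1.431, -0.886],
     [0.000, -1.431, -0.886]]
print(coulomb_matrix(Z, R))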

1.3. AI Model

In addition to an appropriate database and a precise molecular representation, a critical aspect of the material property prediction process is the choice of the AI algorithm. AI algorithms can be categorized into supervised learning, unsupervised learning and reinforcement learning. Supervised learning uses a standard fitting procedure that attempts to determine a mapping function between the known input features and the corresponding output labels. The goal is to make accurate predictions for new, unseen data. In contrast, unsupervised learning does not have prior knowledge of the desired output, and the goal is to find patterns and structures in unlabeled data. Reinforcement learning uses an iterative trial-and-error process where actions are determined based on reinforcement in the form of a reward–penalty system. The goal here is to maximize the cumulative reward over time. Supervised learning is the most widespread category of learning used in materials research. Different models may be better suited for certain types of materials or properties, and the choice of model often depends on the available data and the specific goals of the prediction task. Akbarpour et al. [42] found that artificial neural networks (ANNs) performed better than both multiple linear regression and experimental studies in predicting the interpore distance of nano-porous anodic aluminum oxide from its synthesis conditions. On the other hand, for modeling zeolite synthesis, Serra et al. [43] found that support vector regression (SVR) outperformed ANNs and decision trees. Fang et al. [44] proposed a novel hybrid methodology for forecasting the atmospheric corrosion of metallic materials where the optimal hyperparameters for an SVR model were automatically determined using a genetic algorithm. These examples highlight the need for AI expertise when choosing the right algorithm for a given application, which can be a barrier to making AI methods accessible for materials-based research.
In this work, we present an ML model to predict the non-bonded potential parameters for conventional elements in the periodic table. We propose a novel approach that uses ML to learn a common empirical non-bonded interatomic potential, the Buckingham potential [45], and we successfully demonstrate the ability of this machine-learned potential to predict a wide range of properties when used as an input to classical MD simulations. We also demonstrate a marked improvement in the time taken to determine such properties compared with a traditional first principles approach.

2. Materials and Methods

Due to the enormity of the CCS, it is impossible to generate exhaustive datasets, and it is consequently difficult for AI models to generalize well beyond the dimensional space of the training data. Additionally, the lack of a general representation that can scale well to very different properties results in AI-based techniques that fail to provide both the accuracy and generalizability that come with ab initio techniques. Therefore, it is essential to use techniques that combine the speed of AI with the accuracy and generalizability of QM and MD simulations [46]. Hence, while most AI-based approaches follow the three-step process described above to predict a confined set of properties in a narrow subset of the CCS, our approach generalizes well to a large set of properties. We delve into the details of our process in the ensuing paragraphs.

2.1. Database Generation

To train our ML model, we generated a database employing the QM approach and the Quantum Espresso (QE) software package [47,48,49]. QE uses the principles of QM and computational methods to solve the Schrödinger equation within the DFT approximation and can predict the electronic structure and properties of a system at the atomic scale. The database consists of QE-generated non-bonded atom pair energies for each element in the periodic table. Self-consistent field calculations (QE settings in Table 1) were performed for every identical atom pair system. Furthermore, for each atom pair, various plausible total charges, each with multiple interatomic distances, were considered. We refer to each such combination of an atom pair with its respective total charge as a system charge configuration. The interatomic distances in each simulation were chosen to cover different energy levels, from extremely close and unstable to far apart, with more data points around the equilibrium range and fewer at larger distances. By exploring identical atom pair combinations for each element in the periodic table and introducing a range of charge values for each atom pair, we conducted simulations for 340 unique system charge configurations. For each configuration, we evaluated the interactions over 20 different distances, totaling nearly 700 h of computational time. This effort resulted in a comprehensive database comprising non-bonded atom pair energies for 6400 distinct configurations. It must be noted that not all system charge configurations were simulated with the same number of distance values. Some systems with larger elements may be unstable at close distances, and the QM simulations will not work well for those cases. Similar behavior may be seen for systems with higher charge values, owing to the repulsion between atoms.
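A minimal sketch of how such a set of system charge configurations and separations could be enumerated is given below; the element list, equilibrium distances, charge values and grid shape are placeholders of our own, and the SCF call is a stub rather than a real Quantum Espresso interface.

import numpy as np

def distance_grid(r_eq, n_near=14, n_far=6):
    # Separations for one system charge configuration: denser sampling around
    # the equilibrium distance r_eq, sparser far away (grid shape is illustrative).
    near = np.linspace(0.7 * r_eq, 1.3 * r_eq, n_near)
    far = np.linspace(1.4 * r_eq, 3.0 * r_eq, n_far)
    return np.concatenate([near, far])

def scf_energy(element, total_charge, r):
    # Placeholder for a Quantum Espresso self-consistent field calculation on an
    # identical atom pair separated by r; returns a dummy value here.
    return 0.0

# Illustrative subset of elements (rough equilibrium separations in Angstrom)
# and total charges; the actual database spans the periodic table.
elements = {"H": 0.74, "C": 1.54, "O": 1.21}
total_charges = [-1.0, 0.0, 1.0, 2.0]

records = []
for element, r_eq in elements.items():
    for q in total_charges:                  # one system charge configuration
        for r in distance_grid(r_eq):        # ~20 separations per configuration
            records.append((element, q, r, scf_energy(element, q, r)))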

2.2. Data Preprocessing

2.2.1. Curve Fitting

For the system charge configuration of carbon–carbon, for instance, the plot of relative non-bonded energy versus distance is shown in Figure 1 for various partial charges. In order to learn the interatomic potential for each configuration, the Buckingham potential was selected as the appropriate measure:
U_non-bonded = A e^(−r/B) − C/r⁶. (1)
In the above equation, r is the interatomic distance for a non-bonded atom pair. By fitting each configuration’s energy and distance values to the above equation, the constants A, B and C in the Buckingham potential were obtained using the ‘Levenberg–Marquardt’ algorithm. This algorithm starts with an initial guess for the parameters (A, B and C) of the function and then calculates the gradient of the residuals with respect to the parameters and iterates until the residuals are minimized or a maximum number of iterations is reached. To optimize the initial starting values for this algorithm, the grid search technique was utilized. As a result, a set of Buckingham constants was obtained for each system charge configuration.
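A minimal sketch of this fitting step is shown below, using SciPy's Levenberg–Marquardt implementation; the candidate initial guesses in the grid and the synthetic example data are illustrative values, not the ones used in this work.

import itertools
import numpy as np
from scipy.optimize import curve_fit

def buckingham(r, A, B, C):
    # Buckingham potential: U(r) = A*exp(-r/B) - C/r^6
    return A * np.exp(-r / B) - C / r**6

def fit_buckingham(r, U):
    # Levenberg-Marquardt fit of (A, B, C), grid-searching the starting point;
    # the candidate starting values below are illustrative only.
    best_params, best_r2 = None, -np.inf
    grid = itertools.product([1e2, 1e3, 1e4],      # candidate A values
                             [0.1, 0.3, 0.5],      # candidate B values
                             [1.0, 10.0, 100.0])   # candidate C values
    for p0 in grid:
        try:
            params, _ = curve_fit(buckingham, r, U, p0=p0, method="lm", maxfev=10000)
        except RuntimeError:
            continue                               # this starting point did not converge
        residuals = U - buckingham(r, *params)
        r2 = 1.0 - np.sum(residuals**2) / np.sum((U - np.mean(U))**2)
        if r2 > best_r2:
            best_params, best_r2 = params, r2
    return best_params, best_r2

# Synthetic example: recover known constants from noisy energies
r = np.linspace(1.0, 6.0, 20)
U = buckingham(r, 2000.0, 0.35, 30.0) + np.random.normal(0.0, 0.01, r.size)
print(fit_buckingham(r, U))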

2.2.2. Clustering

From the Buckingham potential equation, it is intuitively clear that for a given value of energy and distance, there can be multiple combinations of Buckingham constants. It is therefore prudent to choose a combination that best enables the learning process of our model. This was accomplished by clustering the system charge configurations representing the same element. To cluster configurations for the same element, the configuration with the highest R² value was first chosen for each element. All the Buckingham potentials obtained previously were then recomputed while providing the algorithm with upper and lower bounds centered around this chosen highest-R² configuration. The original computed configurations and the resulting recomputed configurations are shown in Figure 2 and Figure 3, respectively.
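The bounded refit can be sketched as follows; the fractional window around the anchor configuration is an assumed value for illustration, and SciPy switches from Levenberg–Marquardt to a trust-region solver when bounds are supplied.

import numpy as np
from scipy.optimize import curve_fit

def refit_with_bounds(r, U, anchor_params, window=0.5):
    # Refit (A, B, C) constrained to lie within +/- `window` (as a fraction) of
    # the anchor configuration, i.e. the highest-R^2 configuration for this
    # element. The window size is an assumption, and all anchor parameters are
    # assumed positive so that lower < upper.
    anchor = np.asarray(anchor_params, dtype=float)
    lower, upper = anchor * (1.0 - window), anchor * (1.0 + window)
    params, _ = curve_fit(lambda x, A, B, C: A * np.exp(-x / B) - C / x**6,
                          r, U, p0=anchor, bounds=(lower, upper))
    return params

# Illustrative example around assumed anchor constants
r = np.linspace(1.0, 6.0, 20)
U = 1800.0 * np.exp(-r / 0.33) - 28.0 / r**6
print(refit_with_bounds(r, U, anchor_params=(2000.0, 0.35, 30.0)))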

2.3. AI Model and Training

2.3.1. Fingerprint

The task of training an AI model for predicting non-bonded interatomic potentials, such as the Buckingham potential, requires a careful selection of both the configuration representation and the algorithm. The goal of this work is to determine a potential that can be used as input for MD simulations, giving researchers the flexibility to adjust additional parameters for their specific applications. To this end, we have chosen the most basic properties of a system charge configuration as the input for our AI model. These include the atomic mass, atomic radius, atomic number and partial charge of each atom. The task of selecting the appropriate representation or ‘fingerprint’ of each configuration is thus simplified, as we are only concerned with modeling the non-bonded interactions between atoms.

2.3.2. Training

To achieve an optimal model, all labels generated using the Levenberg–Marquardt algorithm with an R² value of less than 90% were eliminated. The remaining dataset was then split into training and test sets at a ratio of 75% to 25%, respectively, owing to the small size of the dataset. Instead of further dividing the training data into training and validation sets, k-fold cross-validation was used to train and evaluate the model. The k-fold procedure divides the data into k subsets and trains the model k times, each time using a different subset for validation and the remaining ones for training. The performance is then averaged across all iterations to estimate the model’s performance on unseen data. This technique helps utilize all the data, reduces the impact of sampling bias and reduces the risk of overfitting, especially for small datasets.
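A sketch of this procedure with scikit-learn is given below; the synthetic placeholder data, the choice of k = 5 and the random seeds are illustrative assumptions rather than the settings used in this work.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score, train_test_split

# Placeholder data: one four-feature fingerprint (atomic mass, atomic radius,
# atomic number, partial charge) per system charge configuration, and one
# (A, B, C) label vector each. Random numbers stand in for the curated dataset.
rng = np.random.default_rng(0)
X = rng.random((300, 4))
y = rng.random((300, 3))

# 75%/25% train/test split, then k-fold cross-validation on the training set
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(random_state=0)
scores = cross_val_score(model, X_train, y_train, cv=5, scoring="r2")
print("mean cross-validated R^2:", scores.mean())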

2.3.3. Algorithm

The use of ML was determined to be the most appropriate choice for this dataset, as DL models are often prone to overfitting with smaller datasets. After evaluating several ML algorithms, the random forest regressor was selected as the most suitable candidate due to its enhanced accuracy and robustness compared with traditional decision tree algorithms. The random forest regressor operates by combining multiple decision trees, each of which is trained on a different subset of the training data and a randomly selected subset of features. The final prediction is made by averaging the predictions from all decision trees in the forest. The model was optimized for maximum accuracy with the grid search technique, which focused on tuning three hyperparameters: the number of estimators in the forest, the minimum number of samples required to split a node and the utilization of bootstrap samples for each tree. The trained model was able to predict the Buckingham potential constants for test data with an accuracy of 93%.
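The hyperparameter search can be sketched as follows, continuing from the split in the previous snippet; the candidate values in the grid are illustrative, not the ones tuned in this work.

from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

# Grid over the three hyperparameters named in the text; X_train, y_train,
# X_test and y_test come from the split in the previous sketch.
param_grid = {
    "n_estimators": [100, 300, 500],
    "min_samples_split": [2, 4, 8],
    "bootstrap": [True, False],
}
search = GridSearchCV(RandomForestRegressor(random_state=0), param_grid, cv=5, scoring="r2")
search.fit(X_train, y_train)
print("best hyperparameters:", search.best_params_)
print("held-out test R^2:", search.score(X_test, y_test))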

2.3.4. MD Simulations

With a trained model, we can predict the Buckingham potential parameters for the same element atom pairs for any given partial charge. We then use the mixing rule to calculate the Buckingham constants for dissimilar atom pairs using the following equations:
A_mn = (A_mm × A_nn)^0.5 (2)
B_mn = 1 / [(1/B_mm) × (1/B_nn)]^0.5 (3)
C_mn = (C_mm⁶ × C_nn⁶)^(1/12) (4)
where A, B and C represent the Buckingham potential parameters, and m and n are the indices of the atom types in the system.
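A direct transcription of Equations (2)–(4) is shown below; the numerical parameters in the example call are placeholders, not values predicted by the trained model.

def mix_buckingham(p_mm, p_nn):
    # Combine same-element Buckingham parameters (A, B, C) for atom types m and
    # n using the mixing rules of Equations (2)-(4).
    A_mm, B_mm, C_mm = p_mm
    A_nn, B_nn, C_nn = p_nn
    A_mn = (A_mm * A_nn) ** 0.5
    B_mn = 1.0 / ((1.0 / B_mm) * (1.0 / B_nn)) ** 0.5
    C_mn = (C_mm ** 6 * C_nn ** 6) ** (1.0 / 12.0)
    return A_mn, B_mn, C_mn

# Illustrative numbers only
print(mix_buckingham((1000.0, 0.30, 25.0), (2500.0, 0.25, 40.0)))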
The non-bonded potentials obtained could subsequently be used for computational investigations at the atomic-molecular scale. Potential constants for the other types of interactions apart from non-bonded interactions were taken from the work by Lanjan et al. [22]. In this work, the ‘LAMMPS’ software package [50] was employed with the settings and potentials described in Table 2 and Table 3, respectively.
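As a rough sketch of how such constants reach the simulation, the snippet below writes mixed Buckingham constants as LAMMPS pair_coeff lines for the buck/coul/long pair style, using the Ewald accuracy listed in Table 2; the atom-type numbering, cutoff, numerical values and file name are illustrative, and a complete input would also define the bonded, angle, dihedral and improper potentials of Table 3.

# Mixed constants keyed by LAMMPS atom-type pair; all values are placeholders.
params = {(1, 1): (1000.0, 0.30, 25.0),
          (1, 2): (1581.1, 0.274, 31.6),
          (2, 2): (2500.0, 0.25, 40.0)}

with open("buckingham.settings", "w") as f:
    f.write("pair_style buck/coul/long 10.0\n")
    f.write("kspace_style ewald 1.0e-5\n")
    for (i, j), (A, B, C) in sorted(params.items()):
        f.write(f"pair_coeff {i} {j} {A:.4f} {B:.4f} {C:.4f}\n")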

3. Results and Discussion

To evaluate the effectiveness of our method, we selected four molecules with different levels of complexity: (1) H2O, a simple molecule, (2) (CH2O)2CO ethylene carbonate (EC), a relatively complex molecule with a ring section, (3) C2H5OH (ethanol), a short-length hydrocarbon, and (4) C8H18 (octane), a long-chain molecule. Firstly, we used the partial charges from the literature [22] for all possible unique similar atom pair combinations for each molecule to predict the corresponding Buckingham potential parameters using our trained ML model. We then computed the Buckingham potential parameters for the dissimilar atom pair combinations using the mixing rules outlined in Equations (2)–(4). The accuracy of the predicted potential parameters is provided in Table 4. The comparison of the predicted values with the experimental values is shown in Figure 4. Next, we used these predicted potential parameters as inputs for the MD simulations to predict the density of these molecules.
Density is an important property of molecules as it can provide information about their packing and intermolecular forces. An accurate prediction of density requires an accurate modeling of interatomic forces and interactions, including both bonded and non-bonded interactions. Non-bonded interactions are sensitive to temperature and pressure changes and have a significant impact on the density of a molecule. As such, calculating the density precisely is a good indicator that the proposed ML-based technique can be employed to determine other molecular properties such as the mechanical properties, thermal properties and electrochemical properties, which are influenced by similar interatomic interactions. Furthermore, density is a thermodynamic property that can be measured experimentally and accurately calculated using QM techniques. Hence, comparing the predicted densities of materials with the experimental values is an effective approach to assessing the accuracy and reliability of our ML-based method. This comparison is summarized in Table 5, where our predicted densities are shown to have an accuracy greater than 93% with respect to the experimental data. Also, the densities obtained with MD simulations (specifications in Table 2) using our ML-predicted potential parameters are shown as a function of time in Figure 5. The density results in our MD simulations align closely with the expected values for ethylene carbonate (EC) and octane, with slight deviations well within the permissible range for computational models. The dynamic density fluctuations observed in the H2O and ethanol simulations are characteristic of the inherent complexities of molecular dynamics. Such variations are anticipated in MD simulations, reflecting the system’s responsiveness to changing conditions and interactions, while the overall trends remained consistent with the experimental expectations, demonstrating the reliability of our computational approach.

4. Conclusions

This work presents a novel ML-based technique that can learn the interatomic potential parameters for various particle–particle interactions with the accuracy of conventional computational techniques like QM. When used as input to MD simulations, these learned potential parameters can predict a diverse range of properties, enabling the rapid screening and comparison of large databases of material properties for battery applications.
In this study, we demonstrate the efficacy and validity of our proposed technique by learning a non-bonded interatomic potential: the Buckingham potential. We used the non-bonded potential parameters predicted in this work in conjunction with the potential parameters obtained from the literature for other types of interactions to predict the densities of four different complex molecules. The obtained values were in close agreement with the experimental values for all four molecules, establishing the accuracy and efficacy of our proposed technique for the nanoscale evaluation of new and novel materials. Our technique can help quickly eliminate materials that are unlikely to meet the desired criteria, narrowing down the list of potential candidates for further evaluation. By identifying the most promising battery compositions and materials for further testing and development, this technique can accelerate the discovery of novel materials and the improvement of existing battery technologies.
In conclusion, the proposed ML-based technique provides a promising path toward discovering and developing novel materials with enhanced properties for applications such as next-generation batteries with superior electrochemical performance. Our technique can accelerate the search for new materials with desirable properties, allowing for the rapid screening and comparison of large databases of material properties for such applications.

Author Contributions

Conceptualization: M.A.L., A.S., A.L. and S.S.; methodology: M.A.L. and A.S.; software: A.S.; validation: M.A.L., A.S. and R.M.; formal analysis: M.A.L., A.S. and A.L.; investigation: M.A.L. and A.S.; computational resources: S.S.; data curation: M.A.L., R.M. and A.S.; writing—original draft preparation: M.A.L.; writing—review and editing: A.L. and S.S.; supervision: A.L. and S.S.; project administration: S.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the NSERC Discovery grants program. The grant number is RGPIN-2022-04988.

Data Availability Statement

The dataset generated and analyzed as part of this study can be found here (accessed on 20 January 2024): https://github.com/sudo-singh/predmodel_microproperties/tree/main/sim_data. The source code can be found here: https://github.com/sudo-singh/predmodel_microproperties.

Acknowledgments

The authors would like to thank NSERC Canada for funding this research through their Discovery Grants program. The authors would also like to thank the reviewers for their constructive suggestions that helped improve the quality of this manuscript.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
MD      Molecular dynamics
QM      Quantum mechanics
ML      Machine learning
CCS     Chemical compound space
DFT     Density functional theory
DL      Deep learning
AI      Artificial intelligence
LMO     Lithium manganese oxide
HOIPs   Hybrid organic-inorganic perovskites
OQMD    Open Quantum Materials Database
ICSD    Inorganic Crystal Structure Database
ANNs    Artificial neural networks
SVR     Support vector regression
QE      Quantum Espresso
EC      Ethylene carbonate

References

  1. von Lilienfeld, O.A. First principles view on chemical compound space: Gaining rigorous atomistic control of molecular properties. Int. J. Quantum Chem. 2013, 113, 1676–1689. [Google Scholar] [CrossRef]
  2. Huang, B.; von Lilienfeld, O.A. Ab Initio Machine Learning in Chemical Compound Space. Chem. Rev. 2021, 121, 10001–10036. [Google Scholar] [CrossRef] [PubMed]
  3. Lemonick, S. Exploring chemical space: Can AI take us where no human has gone before? Chem. Eng. News 2020, 98, 30–35. [Google Scholar]
  4. Ruddigkeit, L.; van Deursen, R.; Blum, L.C.; Reymond, J.L. Enumeration of 166 Billion Organic Small Molecules in the Chemical Universe Database GDB-17. J. Chem. Inf. Model. 2012, 52, 2864–2875. [Google Scholar] [CrossRef] [PubMed]
  5. Moradi, Z.; Heydarinasab, A.; Shariati, F.P. First-principle study of doping effects (Ti, Cu, and Zn) on electrochemical performance of Li2MnO3 cathode materials for lithium-ion batteries. Int. J. Quantum Chem. 2021, 121, e26458. [Google Scholar] [CrossRef]
  6. Moradi, Z.; Lanjan, A.; Srinivasan, S. Multiscale Investigation into the Co-Doping Strategy on the Electrochemical Properties of Li2RuO3 Cathodes for Li-Ion Batteries. ChemElectroChem 2021, 8, 112–124. [Google Scholar] [CrossRef]
  7. Tyagi, R.; Lanjan, A.; Srinivasan, S. Co-Doping Strategies to Improve the Electrochemical Properties of LixMn2O4 Cathodes for Li-Ion Batteries. ChemElectroChem 2022, 9, e202101626. [Google Scholar] [CrossRef]
  8. Moradi, Z.; Lanjan, A.; Srinivasan, S. Enhancement of Electrochemical Properties of Lithium Rich Li2RuO3 Cathode Material. J. Electrochem. Soc. 2020, 167, 110537. [Google Scholar] [CrossRef]
  9. Xia, W.; Sakurai, M.; Balasubramanian, B.; Liao, T.; Wang, R.; Zhang, C.; Sun, H.; Ho, K.M.; Chelikowsky, J.R.; Sellmyer, D.J.; et al. Accelerating the discovery of novel magnetic materials using machine learning–guided adaptive feedback. Proc. Natl. Acad. Sci. USA 2022, 119, e2204485119. [Google Scholar] [CrossRef]
  10. Lanjan, A.; Srinivasan, S. An Enhanced Battery Aging Model Based on a Detailed Diffusing Mechanism in the SEI Layer. ECS Adv. 2022, 1, 030504. [Google Scholar] [CrossRef]
  11. Lanjan, A.; Moradi, Z.; Srinivasan, S. Multiscale Investigation of the Diffusion Mechanism within the Solid–Electrolyte Interface Layer: Coupling Quantum Mechanics, Molecular Dynamics, and Macroscale Mathematical Modeling. ACS Appl. Mater. Interfaces 2021, 13, 42220–42229. [Google Scholar] [CrossRef] [PubMed]
  12. Lanjan, A.; Choobar, B.G.; Amjad-Iranagh, S. First principle study on the application of crystalline cathodes Li2Mn0.5TM0.5O3 for promoting the performance of lithium-ion batteries. Comput. Mater. Sci. 2020, 173, 109417. [Google Scholar] [CrossRef]
  13. Lanjan, A.; Choobar, B.G.; Amjad-Iranagh, S. Promoting lithium-ion battery performance by application of crystalline cathodes LiXMn1-zFezPO4. J. Solid State Electrochem. 2020, 24, 157–171. [Google Scholar] [CrossRef]
  14. Stanev, V.; Choudhary, K.; Kusne, A.G.; Paglione, J.; Takeuchi, I. Artificial intelligence for search and discovery of quantum materials. Commun. Mater. 2021, 2, 105. [Google Scholar] [CrossRef]
  15. Aykol, M.; Herring, P.; Anapolsky, A. Machine learning for continuous innovation in battery technologies. Nat. Rev. Mater. 2020, 5, 725–727. [Google Scholar] [CrossRef]
  16. Ye, C.; Tan, R.; Wang, A.; Chen, J.; Comesaña Gándara, B.; Breakwell, C.; Alvarez-Fernandez, A.; Fan, Z.; Weng, J.; Bezzu, C.G.; et al. Long-Life Aqueous Organic Redox Flow Batteries Enabled by Amidoxime-Functionalized Ion-Selective Polymer Membranes. Angew. Chem. Int. Ed. 2022, 61, e202207580. [Google Scholar] [CrossRef]
  17. Spotte-Smith, E.W.C.; Blau, S.M.; Xie, X.; Patel, H.D.; Wen, M.; Wood, B.; Dwaraknath, S.; Persson, K.A. Quantum chemical calculations of lithium-ion battery electrolyte and interphase species. Sci. Data 2021, 8, 203. [Google Scholar] [CrossRef]
  18. Chattopadhyay, J.; Pathak, T.S.; Santos, D.M.F. Applications of Polymer Electrolytes in Lithium-Ion Batteries: A Review. Polymers 2023, 15, 3907. [Google Scholar] [CrossRef]
  19. Chan, H.; Narayanan, B.; Cherukara, M.J.; Sen, F.G.; Sasikumar, K.; Gray, S.K.; Chan, M.K.Y.; Sankaranarayanan, S.K.R.S. Machine Learning Classical Interatomic Potentials for Molecular Dynamics from First-Principles Training Data. J. Phys. Chem. C 2019, 123, 6941–6957. [Google Scholar] [CrossRef]
  20. Sun, Y.; Yang, T.; Ji, H.; Zhou, J.; Wang, Z.; Qian, T.; Yan, C. Boosting the Optimization of Lithium Metal Batteries by Molecular Dynamics Simulations: A Perspective. Adv. Energy Mater. 2020, 10, 2002373. [Google Scholar] [CrossRef]
  21. Behler, J. Perspective: Machine learning potentials for atomistic simulations. J. Chem. Phys. 2016, 145, 170901. [Google Scholar] [CrossRef]
  22. Lanjan, A.; Moradi, Z.; Srinivasan, S. A computational framework for evaluating molecular dynamics potential parameters employing quantum mechanics. Mol. Syst. Des. Eng. 2023, 8, 632–646. [Google Scholar] [CrossRef]
  23. Sandhu, S.; Tyagi, R.; Talaie, E.; Srinivasan, S. Using neurocomputing techniques to determine microstructural properties in a Li-ion battery. Neural Comput. Appl. 2022, 34, 9983–9999. [Google Scholar] [CrossRef]
  24. Raccuglia, P.; Elbert, K.C.; Adler, P.D.F.; Falk, C.; Wenny, M.B.; Mollo, A.; Zeller, M.; Friedler, S.A.; Schrier, J.; Norquist, A.J. Machine-learning-assisted materials discovery using failed experiments. Nature 2016, 533, 73–76. [Google Scholar] [CrossRef]
  25. Lu, S.; Zhou, Q.; Ouyang, Y.; Guo, Y.; Li, Q.; Wang, J. Accelerated discovery of stable lead-free hybrid organic-inorganic perovskites via machine learning. Nat. Commun. 2018, 9, 3405. [Google Scholar] [CrossRef]
  26. Meredig, B.; Agrawal, A.; Kirklin, S.; Saal, J.E.; Doak, J.W.; Thompson, A.; Zhang, K.; Choudhary, A.; Wolverton, C. Combinatorial screening for new materials in unconstrained composition space with machine learning. Phys. Rev. B 2014, 89, 094104. [Google Scholar] [CrossRef]
  27. National Academies of Sciences, Engineering, and Medicine. NSF Efforts to Achieve the Nation’s Vision for the Materials Genome Initiative: Designing Materials to Revolutionize and Engineer Our Future (DMREF); The National Academies Press: Washington, DC, USA, 2023. [Google Scholar]
  28. Belsky, A.; Hellenbrandt, M.; Karen, V.L.; Luksch, P. New developments in the Inorganic Crystal Structure Database (ICSD): Accessibility in support of materials research and design. Acta Crystallogr. Sect. B Struct. Sci. 2002, 58, 364–369. [Google Scholar] [CrossRef] [PubMed]
  29. Kirklin, S.; Saal, J.E.; Meredig, B.; Thompson, A.; Doak, J.W.; Aykol, M.; Rühl, S.; Wolverton, C. The Open Quantum Materials Database (OQMD): Assessing the accuracy of DFT formation energies. Npj Comput. Mater. 2015, 1, 15010. [Google Scholar] [CrossRef]
  30. Groom, C.R.; Bruno, I.J.; Lightfoot, M.P.; Ward, S.C. The Cambridge Structural Database. Acta Crystallogr. Sect. B Struct. Sci. Cryst. Eng. Mater. 2016, 72, 171–179. [Google Scholar] [CrossRef] [PubMed]
  31. Hachmann, J.; Olivares-Amaya, R.; Atahan-Evrenk, S.; Amador-Bedolla, C.; Sánchez-Carrera, R.S.; Gold-Parker, A.; Vogt, L.; Brockway, A.M.; Aspuru-Guzik, A. The Harvard Clean Energy Project: Large-Scale Computational Screening and Design of Organic Photovoltaics on the World Community Grid. J. Phys. Chem. Lett. 2011, 2, 2241–2251. [Google Scholar] [CrossRef]
  32. Jain, A.; Ong, S.P.; Hautier, G.; Chen, W.; Richards, W.D.; Dacek, S.; Cholia, S.; Gunter, D.; Skinner, D.; Ceder, G.; et al. Commentary: The Materials Project: A materials genome approach to accelerating materials innovation. APL Mater. 2013, 1, 011002. [Google Scholar] [CrossRef]
  33. Curtarolo, S.; Setyawan, W.; Wang, S.; Xue, J.; Yang, K.; Taylor, R.H.; Nelson, L.J.; Hart, G.L.; Sanvito, S.; Buongiorno-Nardelli, M.; et al. AFLOWLIB.ORG: A distributed materials properties repository from high-throughput ab initio calculations. Comput. Mater. Sci. 2012, 58, 227–235. [Google Scholar] [CrossRef]
  34. Zhang, Y.; Ling, C. A strategy to apply machine learning to small datasets in materials science. Npj Comput. Mater. 2018, 4, 25. [Google Scholar] [CrossRef]
  35. Ward, L.; Agrawal, A.; Choudhary, A.; Wolverton, C. A general-purpose machine learning framework for predicting properties of inorganic materials. Npj Comput. Mater. 2016, 2, 16028. [Google Scholar] [CrossRef]
  36. Schütt, K.T.; Glawe, H.; Brockherde, F.; Sanna, A.; Müller, K.R.; Gross, E.K.U. How to represent crystal structures for machine learning: Towards fast prediction of electronic properties. Phys. Rev. B 2014, 89, 205118. [Google Scholar] [CrossRef]
  37. Hohenberg, P.; Kohn, W. Inhomogeneous Electron Gas. Phys. Rev. 1964, 136, B864–B871. [Google Scholar] [CrossRef]
  38. Himanen, L.; Jäger, M.O.; Morooka, E.V.; Canova, F.F.; Ranawat, Y.S.; Gao, D.Z.; Rinke, P.; Foster, A.S. DScribe: Library of descriptors for machine learning in materials science. Comput. Phys. Commun. 2020, 247, 106949. [Google Scholar] [CrossRef]
  39. Sanchez-Lengeling, B.; Aspuru-Guzik, A. Inverse molecular design using machine learning: Generative models for matter engineering. Science 2018, 361, 360–365. [Google Scholar] [CrossRef] [PubMed]
  40. Rupp, M.; Tkatchenko, A.; Müller, K.R.; von Lilienfeld, O.A. Fast and Accurate Modeling of Molecular Atomization Energies with Machine Learning. Phys. Rev. Lett. 2012, 108, 058301. [Google Scholar] [CrossRef]
  41. Hansen, K.; Biegler, F.; Ramakrishnan, R.; Pronobis, W.; von Lilienfeld, O.A.; Müller, K.R.; Tkatchenko, A. Machine Learning Predictions of Molecular Properties: Accurate Many-Body Potentials and Nonlocality in Chemical Space. J. Phys. Chem. Lett. 2015, 6, 2326–2331. [Google Scholar] [CrossRef]
  42. Akbarpour, H.; Mohajeri, M.; Moradi, M. Investigation on the synthesis conditions at the interpore distance of nanoporous anodic aluminum oxide: A comparison of experimental study, artificial neural network, and multiple linear regression. Comput. Mater. Sci. 2013, 79, 75–81. [Google Scholar] [CrossRef]
  43. Serra, J.M.; Baumes, L.A.; Moliner, M.; Serna, P.; Corma, A. Zeolite Synthesis Modelling with Support Vector Machines: A Combinatorial Approach. Comb. Chem. High Throughput Screen. 2007, 10, 13–24. [Google Scholar] [CrossRef]
  44. Fang, S.; Wang, M.; Qi, W.; Zheng, F. Hybrid genetic algorithms and support vector regression in forecasting atmospheric corrosion of metallic materials. Comput. Mater. Sci. 2008, 44, 647–655. [Google Scholar] [CrossRef]
  45. Buckingham, R.A. The classical equation of state of gaseous helium, neon and argon. Proc. R. Soc. Lond. Ser. A Math. Phys. Sci. 1938, 168, 264–283. [CrossRef]
  46. Wei, J.; Chu, X.; Sun, X.; Xu, K.; Deng, H.; Chen, J.; Wei, Z.; Lei, M. Machine learning in materials science. InfoMat 2019, 1, 338–358. [Google Scholar] [CrossRef]
  47. Giannozzi, P.; Baroni, S.; Bonini, N.; Calandra, M.; Car, R.; Cavazzoni, C.; Ceresoli, D.; Chiarotti, G.L.; Cococcioni, M.; Dabo, I.; et al. QUANTUM ESPRESSO: A modular and open-source software project for quantum simulations of materials. J. Phys. Condens. Matter 2009, 21, 395502. [Google Scholar] [CrossRef] [PubMed]
  48. Giannozzi, P.; Andreussi, O.; Brumme, T.; Bunau, O.; Nardelli, M.B.; Calandra, M.; Car, R.; Cavazzoni, C.; Ceresoli, D.; Cococcioni, M.; et al. Advanced capabilities for materials modelling with Quantum ESPRESSO. J. Phys. Condens. Matter 2017, 29, 465901. [Google Scholar] [CrossRef] [PubMed]
  49. Giannozzi, P.; Baseggio, O.; Bonfà, P.; Brunato, D.; Car, R.; Carnimeo, I.; Cavazzoni, C.; de Gironcoli, S.; Delugas, P.; Ruffino, F.F.; et al. Quantum ESPRESSO toward the exascale. J. Chem. Phys. 2020, 152, 154105. [Google Scholar] [CrossRef] [PubMed]
  50. Thompson, A.P.; Aktulga, H.M.; Berger, R.; Bolintineanu, D.S.; Brown, W.M.; Crozier, P.S.; in ’t Veld, P.J.; Kohlmeyer, A.; Moore, S.G.; Nguyen, T.D.; et al. LAMMPS—A flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Comput. Phys. Commun. 2022, 271, 108171. [Google Scholar] [CrossRef]
Figure 1. The Buckingham potential for carbon atom pairs for various partial charges.
Figure 2. Original unbounded Buckingham potential constants show a lack of element clustering, thereby making model training difficult.
Figure 3. Clustered Buckingham potential constants show grouped system charge configurations for same elements.
Figure 4. A 3D plot of the Buckingham potential parameter values for different elements and partial charges from Table 4, obtained experimentally and using our trained ML model.
Figure 5. Density as a function of time. Comparison of density values of EC, H2O, octane and ethanol molecules obtained from MD simulations using the non-bonded potential parameters from the trained ML model with the corresponding experimental data.
Table 1. The summary of settings for QM calculations in this work.
Properties              Value/Method
XC Functional           PBE
Convergence Tolerance   1.0 × 10⁻⁶ Ry
W.F. Cutoff             1.0 × 10² Ry
Charge Cutoff           1.0 × 10² Ry
Maximum Force           1.0 × 10⁻³ Ry/Bohr
Smearing Factor         1.0 × 10⁻² Ry
K-Point Mesh Size       3 × 3 × 3
Table 2. A summary of the settings for MD simulations in this work.
Properties                 Description or Specification
Energy minimization        Conjugate gradient for 2 × 10⁴ steps
Equilibration              1 ns NVT run and 10 ns NPT run
Production run             10 ns
Motion integrator          Stoermer–Verlet, 1 fs time step
Temperature coupling       25 °C, Nose–Hoover thermostat
Pressure coupling          1 bar, Parrinello–Rahman barostat
Constraint solver          Constraining all bonds
Periodic boundary          x, y and z directions
Long-range interactions    Ewald summation with 1.0 × 10⁻⁵ accuracy
Trajectory output          Every 1000 time steps (fs)
Neighbor list updating     Every 10 fs
Dynamic load balancing     Yes
Table 3. The potentials used in the MD simulations using the LAMMPS software package.
Interaction Type    Potential Style             Equation
Non-bonded          Buckingham or Coulombic     E = A e^(−r/B) − C/r⁶
Bonded              Harmonic                    E = K(r − r₀)²
Angle               Harmonic                    E = K(θ − θ₀)²
Dihedral            Quadratic                   E = K(φ − φ₀)²
Improper            Harmonic                    E = K(χ − χ₀)²
Table 4. Buckingham potential prediction using the trained ML model for the atoms and their respective partial charges constituting the four molecules: water, ethylene carbonate, ethanol and octane.
Element Name    Partial Charge    R² (A)     R² (B)    R² (C)
Carbon          −0.4656           100.00%    97.75%    94.18%
Carbon          −0.0257           100.00%    97.75%    94.18%
Carbon           0.7305            99.15%    96.15%    94.18%
Carbon          −0.3101           100.00%    97.75%    94.18%
Carbon          −0.0714           100.00%    97.75%    94.18%
Hydrogen         0.222             37.19%    76.07%    47.82%
Hydrogen         0.4053            31.15%    99.83%    11.42%
Hydrogen         0.1899            37.19%    76.07%    47.82%
Hydrogen         0.1968            37.19%    76.07%    47.82%
Hydrogen         0.4153            31.15%    99.83%    11.42%
Hydrogen         0.1783            37.19%    76.07%    47.82%
Oxygen          −0.3745            99.75%    98.94%    98.04%
Oxygen          −0.711             98.91%    98.94%    98.04%
Oxygen          −0.5357            98.91%    98.94%    98.04%
Oxygen          −0.2865            99.75%    98.94%    98.04%
Table 5. Comparison of density results from this work with experimental values.
Density (g/cm³)    Experimental    This Work    Error
H2O                0.99            0.95         4.04%
Octane             0.70            0.73         4.29%
Ethanol            0.79            0.78         1.27%
EC                 1.33            1.42         6.77%

