Review

Entropy-Based Methods for Motor Fault Detection: A Review

by Sarahi Aguayo-Tapia, Gerardo Avalos-Almazan and Jose de Jesus Rangel-Magdaleno *
Digital Systems Group, National Institute of Astrophysics, Optics and Electronics, Puebla 72840, Mexico
* Author to whom correspondence should be addressed.
Submission received: 30 December 2023 / Revised: 21 March 2024 / Accepted: 26 March 2024 / Published: 28 March 2024

Abstract:
In the signal analysis context, the entropy concept can characterize signal properties for detecting anomalies or non-representative behaviors in physical systems. In motor fault detection theory, entropy can measure disorder or uncertainty, aiding in detecting and classifying faults or abnormal operating conditions. This is especially relevant in industrial processes, where early motor fault detection can prevent progressive damage, operational interruptions, or potentially dangerous situations. The study of motor fault detection based on entropy theory also holds significant academic relevance, effectively bridging theoretical frameworks with industrial exigencies. As industrial sectors progress, applying entropy-based methodologies becomes indispensable for ensuring machinery integrity through control and monitoring systems. This academic endeavor enhances the understanding of signal processing methodologies and accelerates progress in artificial intelligence and other modern knowledge areas. A wide variety of entropy-based methods have been employed for motor fault detection. This process involves assessing the complexity of measured signals from electrical motors, such as vibrations or stator currents, to form feature vectors. These vectors are then fed into artificial-intelligence-based classifiers to distinguish between healthy and faulty motor signals. This paper reviews recent entropy-based methods and summarizes the most relevant results reported for fault detection over the last 10 years.

1. Introduction

Recently, the pursuit of more reliable and accurate techniques for motor fault detection has intensified, driven by the critical role that electric machines play in various modern industrial applications. Among the many emerging methodologies, entropy-based methods have gained significant attention due to their unique ability to capture complex system behaviors and anomalies through mathematical algorithms.
Entropy, a foundational concept introduced by Rudolf Julius Emanuel Clausius, has become a fundamental tool in signal analysis for assessing the variability and sparsity of signals in different knowledge areas. This pioneering work laid the groundwork for new forms of entropy, like information entropy [1], fuzzy entropy [2], and sample entropy [3], which have become useful tools in fault diagnosis methodologies. Recent studies place a growing emphasis on the application of entropy-based methodologies for motor fault detection; for instance, [4] introduced a feature extraction approach named "weighted multi-scale fluctuation-based dispersion entropy (wtMFDE)". Designed for condition monitoring in planetary gearboxes (PGB), wtMFDE harnesses the intricacies of entropy to discern fault signatures from mixed noisy signals. This entropy-based technique integrates seamlessly with adaptive and non-adaptive signal processing methodologies, positioning it ahead of the previously established multi-scale fluctuation-based dispersion entropy (MSFDE) method. When evaluated alongside advanced classifiers, such as the multilayer perceptron (MLP), the wtMFDE approach achieved 100% classification accuracy for specific fault types, as exemplified by sun gear chipping.
In [5], a fault diagnosis method for rolling bearings leveraging entropy-based techniques is presented. Ensemble empirical mode decomposition (EEMD) initially dissects training samples, with dispersion entropy (DE) quantifying their features. Principal component analysis (PCA) further refines these features, and the Gath–Geva (GG) clustering method categorizes them. When tested against various data sets, including the Case Western Reserve University (CWRU) data set, the method demonstrated its robustness, particularly with DE’s superior stability over other entropy measures and GG’s efficacy in clear sample categorization.
In [6], a method to detect sparking faults in DC motors using stray flux signals is proposed. It employs spectral entropy for signal analysis and introduces a severity indicator based on Mel frequency cepstral coefficients. Evaluations under various motor conditions highlight the method’s consistent effectiveness, positioning it as a promising tool for integrating into DC motor diagnosis systems.
While entropy-based techniques have enriched our understanding of rotating machine dynamics, effectively capturing temporal details remains a challenge. To overcome these temporal limitations, authors have developed multi-scale and multi-modal techniques in order to obtain reliable results [7,8,9].

2. Entropy Methods

2.1. Shannon Entropy

The information-theoretic concept of entropy was first introduced by Shannon in order to quantify the irregularity and self-similarity of signals. The Shannon entropy H(x) of a random signal x with n possible outcomes is defined by

H(x) = -\sum_{i=1}^{n} p(x_i) \log_2 (p(x_i))

where p(x_i) is the probability of the outcome x_i [10,11].
Shannon entropy can be used to measure the complexity of a time series. By definition, Shannon entropy is continuous in the probabilities and increases monotonically with the number of equally likely outcomes. Lastly, it is additive: if a choice can be decomposed into successive sub-choices, the total entropy is the weighted sum of the entropies of those sub-choices.
Thanks to its characteristics, Shannon entropy is a popular method, not only for fault detection but also for other applications, such as for the analysis of biological signals [12], computational applications [13], and environmental data [14].
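As a brief illustration (not taken from the cited works), the definition above can be estimated from an amplitude histogram; the test signals and the 16-bin histogram are illustrative assumptions, since the definition itself does not fix how p(x_i) is estimated:

```python
import numpy as np

def shannon_entropy(signal, bins=16):
    """Estimate Shannon entropy from an amplitude histogram.
    The bin count parametrizes the estimate, not the definition;
    16 bins is an illustrative choice."""
    counts, _ = np.histogram(signal, bins=bins)
    p = counts / counts.sum()        # empirical probabilities p(x_i)
    p = p[p > 0]                     # drop empty bins (0 * log 0 := 0)
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
uniform = rng.uniform(size=100_000)  # highly irregular amplitudes
# two-level square wave: only two histogram bins are ever occupied
square = np.where(np.sin(2 * np.pi * 5 * np.linspace(0, 1, 1000)) >= 0, 1.0, -1.0)
print(shannon_entropy(square))   # ~1 bit
print(shannon_entropy(uniform))  # ~4 bits, approaching log2(16)
```

A uniformly random signal fills all 16 bins evenly and approaches the maximum of log2(16) = 4 bits, while a two-level signal stays near 1 bit.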

Reported Works That Used Shannon Entropy

The reported works that used Shannon entropy for fault detection are mostly devoted to analyzing vibration signals. Some of the most relevant works are listed in Table 1, where the methods, type of signals, type of faults, and accuracy of the classification are detailed. Notice that half of these works are proposed to detect bearing faults: inner race (IR), outer race (OR), and ball.

2.2. Approximate Entropy

Approximate entropy measures the probability of occurrence of a new pattern based on the observation of the embedding dimension m and the similarity coefficient r. ApEn is a scale-invariant indicator, given that it relies on the similarity coefficient, which is typically defined as a fraction of the standard deviation of the time series.
Table 1. Motor fault detection using Shannon entropy.

| Year and Author | Methods | Type of Signal (Database) | Type of Fault | Reported Accuracy |
|---|---|---|---|---|
| 2014. Hojat Heidari Bafroui, et al. [15] | Continuous wavelet transform + Shannon entropy + feed-forward MLP | Vibrations (Amirkabir University of Technology) | Gearbox: chipped and worn | 94.13–97.21% |
| 2016. David Camarena-Martinez, et al. [11] | K-means clustering + Shannon entropy | Current (own) | 1/2 BRB, 1 BRB, and 2 BRBs | 95–100% |
| 2017. Shaojiang Dong, et al. [16] | Local mean decomposition + Shannon entropy + fuzzy c-means clustering | Vibrations (CWRU and own) | IR, OR, and ball damage | 95% |
| 2022. Yongbo Li, et al. [17] | Local mean decomposition + Shannon entropy + fuzzy c-means clustering | Vibrations (CWRU and own) | IR, OR, and ball damage | 95% |
ApEn can be defined as follows [10,18]:
ApEn = \phi^m(r) - \phi^{m+1}(r)

where \phi^m(r) is the mean value of the logarithm of the pattern mean count and r is the similarity coefficient; \phi^m(r) and \phi^{m+1}(r) can be calculated with the following expression:

\phi^m(r) = \frac{1}{N-m+1} \sum_{i=1}^{N-m+1} \ln c_i^m(r)
where c_i^m(r), the fraction of patterns similar to the ith one within tolerance r, can be defined as follows:

c_i^m(r) = \frac{n}{N-m+1}; \quad i, j = 1, 2, \ldots, N-m+1, \; i \neq j

with n the number of patterns X_j whose distance from X_i does not exceed r.
Previous studies demonstrated the advantages of ApEn, such as its insensitivity to interference and noise, its suitability for both deterministic and stochastic signals, and its stable estimation without requiring large amounts of data.
ApEn is also employed for analyzing short data sets [19], computational applications [20], and brain signals [21,22,23,24].
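A compact NumPy sketch of the ApEn equations above follows; the Chebyshev (maximum) distance between patterns and r = 0.2 times the standard deviation are common, assumed choices rather than requirements of the definition:

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """ApEn(m, r) per the equations above. The Chebyshev (maximum) distance
    between patterns and r = 0.2 * std(x) are common, assumed choices."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def phi(m):
        # all patterns of length m: shape (N - m + 1, m)
        patterns = np.lib.stride_tricks.sliding_window_view(x, m)
        d = np.max(np.abs(patterns[:, None, :] - patterns[None, :, :]), axis=2)
        c = np.mean(d <= r, axis=1)   # c_i^m(r): fraction of similar patterns
        return np.mean(np.log(c))     # phi^m(r); c_i > 0 since d_ii = 0

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(1)
regular = np.sin(np.linspace(0, 20 * np.pi, 400))
random_sig = rng.normal(size=400)
print(approximate_entropy(regular) < approximate_entropy(random_sig))  # True
```

A periodic sine, whose patterns recur every cycle, yields a much lower ApEn than white noise of the same length.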

Reported Works That Used ApEn

ApEn emerged as an improvement of Shannon entropy, and its use in the fault detection area has been devoted mainly to analyzing vibration signals. In Table 2, some of the relevant works that used ApEn are listed. ApEn-improved methods, like refined composite multi-scale approximate entropy (RCMSAE), are commonly employed together with methods like empirical mode decomposition (EMD) and probabilistic neural network (PNN).

2.3. Permutation Entropy

Permutation entropy (PE) considers a signal's non-linear behavior and describes the complexity of the time series through a phase space reconstruction. PE requires only the order of the signal's amplitudes; for this reason, this type of entropy has a faster calculation time than others.
Table 2. Motor fault detection using ApEn.

| Year and Author | Methods | Type of Signal (Database) | Type of Fault | Reported Accuracy |
|---|---|---|---|---|
| 2007. Ruqiang Yan, et al. [25] | ApEn | Vibrations (own) | Structural bearing damage | Not reported |
| 2013. ShuanFeng Zhao, et al. [26] | EMD + ApEn | Vibrations (own) | Bearing: spall-like faults | Not reported |
| 2016. Diego Luchesi Sampaio, et al. [18] | ApEn | Vibrations (own) | Cracked shaft and misalignment | Not reported |
| 2017. Xueli An, et al. [27] | ApEn + k-nearest neighbor + adaptive local iterative filtering | Vibrations (own) | IR, OR, and ball bearing fault | 100% |
| 2021. Jianpeng Ma, et al. [28] | RCMSAE + improved coyote optimization-PNN | Vibrations (CWRU and own) | IR, OR (CWRU and own), and ball bearing faults | 94.9% (CWRU) and 93.9% (own) |
PE can be expressed in terms of the relative frequency p(\pi) of each permutation \pi as follows [10,29]:

PE = -\sum_{\pi} p(\pi) \log_2 p(\pi)

p(\pi) = \frac{\mathrm{num}\{ i \mid i \le N-m+1, \; X_i^m \text{ has type } \pi \}}{N-m+1}
PE is an adequate indicator of the complexity of signals from nonlinear processes; furthermore, other works have highlighted PE's advantages, such as its high calculation efficiency, its robustness against noise, and its good complexity estimation [10].
Some other applications of PE include the analysis of electroencephalographic signals [30,31,32] and financial time series [33,34].
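The ordinal-pattern counting behind PE can be sketched as follows; normalizing by log2(m!) is an optional, commonly used convention, and the choice m = 3 is an illustrative assumption:

```python
import numpy as np
from math import factorial

def permutation_entropy(x, m=3, delay=1, normalize=True):
    """PE from ordinal patterns of embedding dimension m.
    Normalizing by log2(m!) is an optional, commonly used convention."""
    x = np.asarray(x, dtype=float)
    n_windows = len(x) - (m - 1) * delay
    # each window keeps only the ordinal ranking of its m amplitudes
    windows = np.array([x[i:i + (m - 1) * delay + 1:delay] for i in range(n_windows)])
    patterns = np.argsort(windows, axis=1)
    # relative frequency p(pi) of each distinct ordinal pattern
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    pe = -np.sum(p * np.log2(p))
    return pe / np.log2(factorial(m)) if normalize else pe

ramp = np.arange(100.0)                    # strictly increasing: one pattern only
rng = np.random.default_rng(2)
noise = rng.normal(size=5000)
print(permutation_entropy(ramp) == 0)      # True: zero complexity
print(permutation_entropy(noise) > 0.99)   # True: near the maximum of 1
```

A monotonic ramp produces a single ordinal pattern and hence zero PE, while white noise visits all m! patterns almost uniformly.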

Reported Works That Used PE

Permutation-entropy-based methods for fault detection are listed in Table 3. Notice that all of these works are devoted to the detection of bearing faults by using vibration signals, as is common in most of the works that use PE [35,36,37,38,39,40].
Together with PE, methods for signal processing and fault classification are used, such as continuous wavelet coefficients (CWC), ensemble empirical mode decomposition (EEMD), support vector machine (SVM), adaptive neuro-fuzzy inference system (ANFIS), flexible analytical wavelet transform (FAWT), composite multi-scale weighted permutation entropy (CMSWPE), grey wolf optimizer (GWO) SVM, composite multi-scale permutation entropy (CMSPE), reverse cognitive fruit fly optimization algorithm (RCFOA), particle swarm optimization (PSO), improved multi-scale permutation entropy (IMSPE), and extreme learning machine (ELM). Moreover, not only is PE employed but also improved variants, such as multi-scale permutation entropy (MSPE), generalized composite multi-scale permutation entropy (GCMSPE), and time-shift multi-scale weighted permutation entropy (TSMSWPE) [41].

2.4. Sample Entropy

Sample entropy (SE) measures the irregularity of a signal; unlike ApEn, it excludes self-matches, which makes its estimate largely independent of the data length for a given similarity coefficient r and embedding dimension m.
Table 3. Motor fault detection using PE.

| Year and Author | Methods | Type of Signal (Database) | Type of Fault | Reported Accuracy |
|---|---|---|---|---|
| 2013. Shuen-De Wu, et al. [29] | MSE, MSPE, MBSE, and MSRMS + SVM | Vibrations (CWRU) | IR, OR, and ball bearing damage | 96.01–99.79% |
| 2014. Vakharia, et al. [42] | CWC + PE + SVM | Vibrations (CWRU) | IR, OR, and ball bearing damage | 97.5% |
| 2016. Yongbo Li, et al. [43] | Local mean decomposition + MSPE + Laplacian score + improved SVM based on binary tree | Vibrations (CWRU) | IR, OR, and ball bearing damage | 97.5% |
| 2017. Jinde Zheng, et al. [44] | GCMSPE + Laplacian score + PSO-based SVM | Vibrations (CWRU and own) | IR, OR (CWRU and own), and ball bearing damage (CWRU) | 88.89–100% (CWRU) and 96.67–100% (own) |
| 2018. Moshen Kuai, et al. [45] | Complete EEMD with adaptive noise + PE + ANFIS | Vibrations (own) | Gear faults: broken, one missing tooth, and tooth root crack | 80–100% |
| 2019. Wenhua Du, et al. [46] | SOF logic classifier + MSPE + LDA | Vibrations (CWRU and own) | IR, OR, and ball damage (CWRU); cracked and peeled bearing (own) | 92.66–100% (CWRU) and 97.75–99.25% (own) |
| 2019. Jinde Zheng, et al. [47] | CMSWPE + ELM | Vibrations (CWRU and Suzhou University) | IR, OR (CWRU and Suzhou U.), and ball bearing damage (CWRU) | 90.48–100% (CWRU) and 100% (Suzhou U.) |
| 2019. Xiaoming Xue, et al. [48] | PE + VMD + RF | Vibrations (CWRU) | IR, OR, and ball damage (CWRU) | 98.44% and 99.09% for different loads |
| 2019. Zhilin Dong, et al. [49] | TSMSWPE + GWO-SVM | Vibrations (CWRU) | IR, OR, and ball damage (CWRU and Soochow University) | 100% (CWRU) and 93.5–100% (Soochow U.) |
| 2020. Snehsheel Sharma, et al. [50] | PE + FAWT + SVM | Vibrations (CWRU) | IR, OR, and ball damage | 95–100% |
| 2020. Cheng He, et al. [51] | CMSPE + RCFOA-ELM + PSO-VMD | Vibrations (CWRU) | IR, OR, and ball damage | 97.33–98.67% |
| 2021. Amrinder Singh Minhas, et al. [52] | IMSPE + dominant statistical parameters + extreme gradient boosting | Vibrations (CWRU and own) | IR, OR (CWRU and own), and ball damage (CWRU) | 96.6–100% (CWRU) and 96.2–100% (own) |
| 2021. Govind Vashishtha, et al. [53] | ELM + SWD + PE | Vibrations (CWRU and own) | IR, OR (CWRU and own), and ball damage (CWRU) | 100% |
Consider a signal S of data length N expressed by S = \{x_1, x_2, \ldots, x_N\}. A pattern is formed by m sequential points of the signal S; for example, X_i = [x_i, x_{i+1}, \ldots, x_{i+m-1}] represents the ith pattern. Hence, the pattern space X is defined as follows:

X = \begin{bmatrix} x_1 & x_2 & \cdots & x_m \\ x_2 & x_3 & \cdots & x_{m+1} \\ \vdots & \vdots & & \vdots \\ x_{N-m+1} & x_{N-m+2} & \cdots & x_N \end{bmatrix}
SE can be calculated as follows:

SE = -\ln\left( \frac{B^{m+1}(r)}{B^m(r)} \right)

where B^m(r) represents the mean value of the pattern match count; B^m(r) and B^{m+1}(r) are calculated according to the following expression:

B^m(r) = \frac{1}{(N-m+1)(N-m)} \sum_{i=1}^{N-m+1} \sum_{j=1,\, j \neq i}^{N-m+1} G(d_{ij}, r)

where d_{ij} = \lVert X_i - X_j \rVert and G(\cdot) is the Heaviside function. In the context of SE, the suggested value of r is 0.2 times the standard deviation of the data set [54].
Besides motor fault detection, other applications that rely on SE's properties include biomedical signals [55,56], electric vehicles [57,58], and weather data series [59].
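The SE equations above can be sketched compactly as follows; the Chebyshev (maximum) norm for the pattern distance is an assumed, commonly used choice, and the exact normalization of the counts varies slightly across references (it cancels in the ratio):

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn per the equations above, with r = 0.2 * std(x) as suggested
    in [54]. Unlike ApEn, self-matches (i = j) are excluded."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def matches(m):
        patterns = np.lib.stride_tricks.sliding_window_view(x, m)
        d = np.max(np.abs(patterns[:, None, :] - patterns[None, :, :]), axis=2)
        return np.sum(d <= r) - len(patterns)   # drop the i = j diagonal

    B, A = matches(m), matches(m + 1)
    return -np.log(A / B)

rng = np.random.default_rng(3)
regular = np.sin(np.linspace(0, 20 * np.pi, 400))
noise = rng.normal(size=400)
print(sample_entropy(regular) < sample_entropy(noise))  # True
```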

Reported Works That Used SE

In Table 4, a summary of some of the most relevant works that utilized SE and improved variants, such as generalized refined composite multi-scale sample entropy (GRCMSSE), for motor fault detection is presented. Most of them aim to detect bearing faults, although gear and structural faults are also addressed.
Table 4. Motor fault detection using SE.

| Year and Author | Methods | Type of Signal (Database) | Type of Fault | Reported Accuracy |
|---|---|---|---|---|
| 2015. Minghong Han, et al. [60] | Local mean decomposition + SE + SVM | Vibrations (CWRU) | IR, OR, and ball bearing damage | 100% |
| 2017. Qing Ni, et al. [61] | SE, root-mean-square value, crest, and kurtosis | Vibrations (Lu Nan wind farm) | IR bearing fault | Not reported |
| 2019. Yongbo Li, et al. [62] | MSSE + Vold–Kalman filter + least squares SVM | Vibrations (UESTC) | Gear fault: cracked tooth and distributed wear | 100% |
| 2019. Zhaoyi Guan, et al. [63] | EMD + SE + deep belief network | Vibrations (own) | Structural faults | 99–100% |
| 2020. Zhenya Wang, et al. [64] | GRCMSSE + S-isomap + grasshopper optimization algorithm-SVM | Vibrations (drivetrain diagnostics simulator) | IR, OR, and ball bearing faults | 100% |

2.5. Fuzzy Entropy

Fuzzy entropy (FE) emerged as an improvement of sample entropy: FE measures the similarity between two time series with a continuous fuzzy membership function instead of the Heaviside step function that SE uses.
Given a signal u(i), i = 1, 2, \ldots, N of N samples, a vector set \{X_i^m, \; i = 1, 2, \ldots, N-m+1\} is formed. Each vector has m sequential elements from the signal u(i) in the form of

X_i^m = \{ u(i), u(i+1), \ldots, u(i+m-1) \} - u_0(i)

where u_0(i) represents the average of the vector X_i^m.
Then, the similarity degree D_{ij}^m between two vectors is defined as follows:

D_{ij}^m = \mu(d_{ij}^m, n, r) = e^{-\ln(2) \left( d_{ij}^m / r \right)^n}

where d_{ij}^m is the distance between X_i^m and X_j^m, r represents the similarity tolerance, and \mu(d_{ij}^m, n, r) is a fuzzy membership function.
On the other hand, the function \varphi^m(n, r) is expressed as

\varphi^m(n, r) = \frac{1}{N-m} \sum_{i=1}^{N-m} \left( \frac{1}{N-m-1} \sum_{j=1,\, j \neq i}^{N-m} D_{ij}^m \right)
Finally, FE can be defined as follows [10]:

FE(m, n, r, N) = \ln \varphi^m(n, r) - \ln \varphi^{m+1}(n, r)
FE considers the ambiguous uncertainties from the highly irregular time series, making it insensitive to background noise.
Fuzzy entropy has been applied in different fields, like image processing [65,66], the analysis of biomedical signals [67,68,69,70], and decision making [31,71,72].
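The FE equations above can be sketched as follows; the choices n = 2, r = 0.2 times the standard deviation, and the Chebyshev distance are assumed, commonly used defaults rather than part of the definition:

```python
import numpy as np

def fuzzy_entropy(x, m=2, n=2, r_factor=0.2):
    """FuzzyEn per the equations above: an exponential fuzzy membership
    function replaces the hard Heaviside threshold of SE."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def phi(m):
        windows = np.lib.stride_tricks.sliding_window_view(x, m)
        windows = windows - windows.mean(axis=1, keepdims=True)  # subtract u0(i)
        d = np.max(np.abs(windows[:, None, :] - windows[None, :, :]), axis=2)
        D = np.exp(-np.log(2.0) * (d / r) ** n)   # fuzzy similarity D_ij
        k = len(windows)
        return (D.sum() - k) / (k * (k - 1))      # average over i != j

    return np.log(phi(m)) - np.log(phi(m + 1))

rng = np.random.default_rng(4)
regular = np.sin(np.linspace(0, 20 * np.pi, 300))
noise = rng.normal(size=300)
print(fuzzy_entropy(regular) < fuzzy_entropy(noise))  # True
```

Because the membership function is continuous, FE degrades gracefully when patterns fall just outside the tolerance r, which is what makes it more robust than SE on noisy data.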

Reported Works That Used FE

In Table 5, a summary of some of the most relevant works that utilized FE for motor fault detection is presented. Most of them aim to detect bearing faults, but two of the cited works propose the detection of gear and impeller faults.
Some of the improved methods based on FE are multi-scale fuzzy entropy (MSFE), refined composite multi-scale fuzzy entropy (RCMSFE), generalized composite multi-scale fuzzy entropy (GCMSFE), multi-scale refined composite standard deviation fuzzy entropy (MSRCSDFE), and multivariable multi-scale fuzzy distribution entropy (MMSFDE). Although these methods extend the scope of FE by adding, for example, the multi-scale or the generalized analysis, all of them are still driven by FE principles [73,74,75,76,77,78,79,80,81,82,83].
Table 5. Motor fault detection using FE.

| Year and Author | Methods | Type of Signal (Database) | Type of Fault | Reported Accuracy |
|---|---|---|---|---|
| 2016. Huimin Zhao, et al. [84] | EEMD + MSFE + SVM | Vibrations (CWRU) | IR, OR, and ball bearing damage | 95–100% |
| 2018. Wu Deng, et al. [85] | EWT + FE + SVM | Vibrations (simulated signals) | IR, OR, and ball bearing damage | 90–100% |
| 2018. Jinde Zheng, et al. [86] | Sigmoid-based RCMSFE + t-SNE + VPMCD | Vibrations (CWRU) | IR, OR, and ball bearing damage | 100% |
| 2018. Yu Wei, et al. [74] | Intrinsic characteristic-scale decomposition + GCMSFE + Laplacian score + PSO-SVM | Vibrations (Harbin Institute of Technology and CWRU) | IR, OR (Harbin I.T. and CWRU), and impeller faults (Harbin I.T.) | 98.13–100% (Harbin I.T.) and 100% (CWRU) |
| 2019. Amrinder Singh Minhas, et al. [87] | MSRCSDFE + EEMD | Vibrations (own) | IR and OR | 92.77–100% |
| 2021. Xu Chen, et al. [88] | RCMSFE + out-of-sample embedding + MPA-SVM | Vibrations (CWRU and own) | IR and OR | 100% |
| 2021. Yanli Ma, et al. [28] | MMSFDE + Fisher score + SVM | Vibrations (Hunan University and own) | Drive gear (case 1) and bearing + gear fault (case 2) | 97.71–100% (case 1) and 92.5–99.5% (case 2) |
| 2022. Yongbo Li, et al. [17] | SFE and MSFE | Vibrations (ADVC Laboratory and Paderborn University) | OR faults: sharp trench, drilling, pitting (ADVC), and rubbing (Paderborn U.) | 99.88% (ADVC) and 99.3% (Paderborn U.) |

2.6. Energy Entropy

Energy entropy (EE) estimates a signal’s complexity based on its intrinsic mode functions (IMFs). Its calculation starts with the energy of the ith IMF as follows:
E_i = \sum_{j=1}^{m} | c_{ij} |^2
where m is the length of the IMF. Then, the total energy of the n IMFs is given by
E = \sum_{i=1}^{n} E_i
Finally, the energy entropy H_{en} of the signal is calculated based on the following expression:

H_{en} = -\sum_{i=1}^{n} p_i \log(p_i)

where p_i = E_i / E represents the fraction of the total energy contained in the ith IMF [10].
The energy entropy provides very good results when analyzing non-stationary and nonlinear complex signals; for example, if a fault in the motor provokes a change in the signal’s frequency, the energy distribution will change. Hence, energy entropy can be used to effectively portray the signal’s characteristics [89].
Other fields besides fault detection where the EE has been applied are milling chatter detection [90], computational chemistry [91], and thermomechanics applications [92].
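The EE computation above reduces to a few lines once the IMFs are available. In the sketch below, two hand-built oscillatory components stand in for true IMFs (a real pipeline would obtain them via EMD); the 50 Hz and 120 Hz components and their amplitudes are illustrative assumptions:

```python
import numpy as np

def energy_entropy(components):
    """Energy entropy of a set of signal components, per the equations above.
    Any decomposition passed in (e.g., IMFs from EMD) is treated the same way."""
    energies = np.array([np.sum(np.abs(c) ** 2) for c in components])  # E_i
    p = energies / energies.sum()                                      # p_i = E_i / E
    p = p[p > 0]
    return -np.sum(p * np.log(p))

t = np.linspace(0, 1, 1000)
# hand-built stand-ins for IMFs: a fault grows the 120 Hz component,
# spreading energy across components and raising the entropy
healthy = [np.sin(2 * np.pi * 50 * t), 0.05 * np.sin(2 * np.pi * 120 * t)]
faulty = [np.sin(2 * np.pi * 50 * t), 0.8 * np.sin(2 * np.pi * 120 * t)]
print(energy_entropy(healthy) < energy_entropy(faulty))  # True
```

This mirrors the observation above: a fault that redistributes energy across frequency components changes the energy distribution and, with it, the entropy.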

Reported Works That Used EE

Some of the latest relevant works that used energy entropy for fault detection are listed in Table 6. Unlike the previously mentioned methods, by using EE, more types of faults have been detected, such as misalignment, imbalance, and bearing faults. It is also important to recall that one of these works relied on current signals for the analysis [93,94].
Table 6. Motor fault detection using EE.

| Year and Author | Methods | Type of Signal (Database) | Type of Fault | Reported Accuracy |
|---|---|---|---|---|
| 2017. Yancai Xiao, et al. [89] | IEMD energy entropy + PSO + SVM | Vibrations (own) | Parallel, angle, and comprehensive misalignment | 98.913% |
| 2017. Yancai Xiao, et al. [93] | Dual-tree complex wavelet transform + EE + PSO | Current (simulation) | Parallel, angle, and comprehensive misalignment | 96% |
| 2018. Bin Pang, et al. [95] | CFBEE + improved singular spectrum decomposition + Hilbert transform + SVM | Vibrations (own) | Local rubbing, oil film whirl, and imbalance fault | 100% |
| 2021. Shuzhi Gao, et al. [96] | IEE + triangulation of amplitude attenuation + correlation analysis | Vibrations (own) | IR, OR, and ball bearing damage | 91–99.67% |
Improved methods for EE are also proposed for fault detection, such as characteristic frequency band energy entropy (CFBEE) and improved energy entropy (IEE).

2.7. Dispersion Entropy

The dispersion entropy (DE) of a signal x of n samples can be calculated with the following steps [97,98]:
First, the signal x is normalized between 0 and 1. To do so, a sigmoid function is usually employed for this mapping. Some works have reported using normal cumulative distribution functions (NCDF) for this step [97,98]. Hence, the time series y is obtained from the NCDF of the signal x, which is defined as follows:
y_i = \frac{1}{\sigma \sqrt{2\pi}} \int_{-\infty}^{x_i} e^{-\frac{(t-\mu)^2}{2\sigma^2}} \, dt
where σ represents the standard deviation and μ is the mean of the signal x.
The second step consists of mapping the time series y to c classes by multiplying y i by c, then adding 0.5 and rounding to the nearest integer, as follows:
z i c = round ( c × y i + 0.5 )
where z i c represents the ith term of the classified time series z c .
In the third step, the time series z j m , c is constructed based on the embedding dimension m and the time delay d:
z_j^{m,c} = \{ z_j^c, z_{j+d}^c, \ldots, z_{j+(m-1)d}^c \}, \quad j = 1, 2, \ldots, N-(m-1)d
Then, z_j^{m,c} is mapped into a dispersion pattern \pi_{v_0 v_1 \ldots v_{m-1}}, where

z_j^c = v_0, \; z_{j+d}^c = v_1, \; z_{j+2d}^c = v_2, \; \ldots, \; z_{j+(m-1)d}^c = v_{m-1}
Here, the number of feasible dispersion patterns is c^m, given that each z_j^{m,c} comprises m elements, each of which can be an integer from 1 to c.
The fourth step corresponds to the calculation of the relative frequency of each dispersion pattern π v 0 v 1 . . . v m 1 , which is given by
p(\pi_{v_0 v_1 \ldots v_{m-1}}) = \frac{\mathrm{num}\{ j \mid j \le N-(m-1)d, \; z_j^{m,c} \text{ has type } \pi_{v_0 v_1 \ldots v_{m-1}} \}}{N-(m-1)d}
Finally, the DE is calculated as follows:
DE(x, m, c, d) = -\sum_{\pi=1}^{c^m} p(\pi_{v_0 v_1 \ldots v_{m-1}}) \times \ln\left( p(\pi_{v_0 v_1 \ldots v_{m-1}}) \right)
where m represents the embedding dimension, c is the number of classes, and d is the time delay.
Some works prefer to express the DE in its normalized form, which is given by
NDE(x, m, c, d) = \frac{DE(x, m, c, d)}{\ln(c^m)}
The advantages of DE have been used for other applications, such as the analysis of biomedical signals [70] and image processing [99].
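The four DE steps above can be sketched directly; the parameter values m = 2, c = 6, d = 1 are illustrative assumptions, and the NCDF of step 1 is evaluated with the standard error function:

```python
import numpy as np
from math import erf, log, sqrt

def dispersion_entropy(x, m=2, c=6, d=1, normalize=True):
    """DE following the four steps in the text (m, c, d as defined there)."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    # Step 1: map the signal to (0, 1) with the NCDF
    y = np.array([0.5 * (1.0 + erf((xi - mu) / (sigma * sqrt(2.0)))) for xi in x])
    # Step 2: assign every sample to one of c classes
    z = np.clip(np.round(c * y + 0.5).astype(int), 1, c)
    # Step 3: embedded vectors z_j^{m,c} and their dispersion patterns
    n = len(x) - (m - 1) * d
    patterns = np.array([z[j:j + (m - 1) * d + 1:d] for j in range(n)])
    # Step 4: relative frequency of every observed pattern
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    de = -np.sum(p * np.log(p))
    return de / log(c ** m) if normalize else de

rng = np.random.default_rng(5)
regular = np.sin(np.linspace(0, 20 * np.pi, 2000))
noise = rng.normal(size=2000)
print(dispersion_entropy(regular) < dispersion_entropy(noise))  # True
```

White noise spreads its samples almost uniformly over the c^m possible patterns, so its normalized DE approaches 1, while a smooth periodic signal concentrates on a few patterns.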

Reported Works That Used DE

In Table 7, relevant works that used DE for motor fault detection are listed. Notice that DE has become popular, especially during the last few years; authors rely on this method due to its high stability.
Some of the improved methods based on DE are hierarchical symbolic dynamic entropy (HSDE), improved multi-scale dispersion entropy (IMSDE), refined composite multi-scale dispersion entropy (RCMSDE), weighted refined composite multi-scale dispersion entropy (WRCMSDE), time-shift multi-scale dispersion entropy (TSMSDE), multi-scale dispersion entropy (MSDE), and stacking modified composite multi-scale dispersion entropy (SMCMSDE).
Table 7. Motor fault detection using DE.

| Year and Author | Methods | Type of Signal (Database) | Type of Fault | Reported Accuracy |
|---|---|---|---|---|
| 2018. Mostafa Rostaghi, et al. [97] | HSDE | Vibrations (CWRU, University of Tabriz) | IR, OR, ball bearing faults (CWRU), and medium worn and broken teeth of a spur gear of the gearbox (U. of Tabriz) | Not reported |
| 2018. Xiaoan Yan, et al. [98] | IMSDE + mRMR + ELM | Vibration (CWRU) | IR, OR, and ball bearing faults | +98% |
| 2019. Weibo Zhang, et al. [100] | RCMSDE + fast EEMD + mRMR + random forest classifier | Vibration (CWRU) | IR, OR, and ball bearing faults | 96.6–100% |
| 2020. Amrinder Singh Minhas, et al. [101] | Complementary EEMD + WRCMSDE, WRCMSFE, WRCMSPE + SVM | Vibration (CWRU and own) and acoustics (own) | IR, OR (CWRU and own), and ball bearing faults (CWRU) | 70–100% |
| 2020. Kaixuan Shao, et al. [102] | VMD + TSMSDE + SVM + vibrational Harris hawks optimization | Vibration (CWRU and Cincinnati IMS) | IR, OR, and ball bearing faults | 96.56–98.81% (CWRU) and 79–100% (IMS) |
| 2021. Snehsheel Sharma, et al. [7] | Multi-scale fluctuation-based DE + local mean decomposition + SVM | Vibration (CWRU) | IR, OR, and ball bearing faults | 98–100% |
| 2021. Xiong Zhang, et al. [5] | EEMD + MSDE + PCA + Gath–Geva clustering method | Vibration (CWRU, QPZZ-II, and Cincinnati IMS) | OR (all), IR, and ball bearing faults (CWRU and QPZZ-II) | 100% |
| 2021. Hongchuang Tan, et al. [103] | SMCMSDE + equilibrium optimizer-SVM + complete EEMD with adaptive noise | Vibration (CWRU and own) | IR, OR, and ball bearing fault | 99.75% (CWRU) and 99.9% (own) |
| 2021. Qiang Xue, et al. [104] | HDE + joint approximate diagonalization of eigenmatrices | Vibration (CWRU and own) | IR, OR, and ball bearing faults | 100% |
| 2021. Fuming Zhou, et al. [105] | MHMSFDE + multi-cluster feature selection + GWO-based kernel ELM | Vibration (CWRU and QPZZ-II) | IR, OR, ball bearing faults (CWRU), pinion wear, gearwheel pitting, gearwheel tooth breaking, and gearwheel pitting + pinion wear (QPZZ-II) | 100% (CWRU) and 98.5–99.24% (QPZZ-II) |

2.8. Multi-Scale Entropy

The multi-scale version of any entropy method consists of calculating the entropy at different scales. To this end, a coarse-grained data sequence y_j^{(s)} is obtained by a coarse-graining process of the original signal x. Then, y_j^{(s)} can be expressed as follows [64]:

y_j^{(s)} = \frac{1}{s} \sum_{i=(j-1)s+1}^{js} x_i; \quad j = 1, 2, \ldots, \frac{N}{s}

where s represents the scale factor. Therefore, the signal x is transformed into a coarse-grained sequence of length N/s.
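The coarse-graining step above can be sketched as follows; the helper accepts any single-scale entropy estimator as a parameter, and the scale range 1 to 5 is an illustrative assumption:

```python
import numpy as np

def coarse_grain(x, s):
    """y_j^(s): mean of consecutive non-overlapping blocks of length s."""
    x = np.asarray(x, dtype=float)
    n_blocks = len(x) // s
    return x[:n_blocks * s].reshape(n_blocks, s).mean(axis=1)

def multiscale_entropy(x, entropy_fn, scales=range(1, 6)):
    """Apply any single-scale entropy estimator at several scales."""
    return [entropy_fn(coarse_grain(x, s)) for s in scales]

print(coarse_grain(np.array([1.0, 3.0, 2.0, 4.0]), 2))  # [2. 3.]
# the coarse-grained series shrinks to length N // s at each scale
print(multiscale_entropy(np.arange(1.0, 101.0), entropy_fn=len))
```

In practice, `entropy_fn` would be one of the single-scale estimators described earlier (e.g., sample or permutation entropy); here `len` is passed only to show how the series shortens with the scale factor.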
The accuracy of multi-scale entropy (MSE) is constrained by that of the underlying single-scale method; however, MSE is usually preferred over a single-scale analysis because it provides more information, despite the increase in calculation time.
The type of applications where the MSE can be used are as vast as the applications of each single-scale method, such as the analysis of time series [106,107]; biological signals, such as heartbeats and encephalographics [108,109,110]; image processing [111]; and hydrologic applications [112]. There are some interesting works on improvements around the MSE, as presented in [113], where the authors successfully diagnosed gearbox and milling tool faults. The method utilizes a novel technique that combines MPE with contrastive learning (LE), yielding results that improve the accuracy of traditional entropy-based methods.
Finally, in Table 8, a summary of each method’s advantages and disadvantages is presented to provide a wider panorama of its characteristics.

2.9. Practical Example: Applied Entropy Methods for Broken Bar Detection

To exemplify the use of different entropy methods and their effect on classification accuracy, three of the methods presented in this paper were implemented: Shannon entropy, approximate entropy, and energy entropy. These three methods were applied, without any preprocessing, to the same signals from a motor with a healthy bar (HB) and with a broken bar (BB). A set of 50 steady-state current signals was analyzed, as shown in Figure 1. Further explanation of the experimental setup used to acquire these signals can be found in [114], where the authors performed early broken bar detection. As can be observed from Figure 1, the signals are quite similar; therefore, an entropy method could be helpful to discern between the two conditions of the motor.
The results are displayed in Figure 2, comparing the entropy for the two conditions of the motor. Notice that the use of entropy allowed for a separability of the data similar to that provided by traditional methods, such as motor current signature analysis. Depending on the nature of the signal and the aim of the analysis, a certain entropy method can be more useful than others. For example, energy entropy could be more suitable for this application since it depends only on the intrinsic characteristics of the signal; it is worth noticing from Figure 2 that the separability of the data using this type of entropy is better than that obtained with Shannon or approximate entropy. Approximate entropy, in fact, is commonly employed for vibration signals, which are usually more irregular than current signals. Also, according to the characteristics of the phenomenon, certain entropy methods can be discarded; for example, dispersion entropy is not adequate for analyzing high-frequency phenomena.
The selection of the type of entropy also depends on the application. Signals with higher separability, such as those of a healthy motor versus a motor with a medium level of damage, could be successfully classified with more straightforward methods, such as Shannon entropy, or faster methods, such as permutation entropy. For a more complex analysis, however, a multi-scale approach could be necessary.

3. The Role of Entropy in the Fault Diagnosis of Electromechanical Systems: Challenges and Advances

As a statistical measure, entropy is capable of quantifying the complexity of signals, which is closely related to the functional status of an electromechanical system. Consequently, entropy emerges as a promising non-parametric tool to extract characteristics from a system. Recently, several studies applied entropy indices for fault diagnosis, detection, and prediction in electric machines. Some of them employed more than one entropy index to obtain a multi-modal analysis. Despite the existence of several entropy-based algorithms for fault detection, most of them are based on Shannon entropy for random or deterministic behavior detection in signals from electric machines. The different forms of entropy employed for fault detection are usually based on the assessment of aleatory and complexity metrics of the signals, and any change in these indices could be related to important changes in the system behavior.
Depending on the nature of the signals, a specific index may be more useful than other; for this reason, it is necessary to apply different entropy metrics in combination with different classification methods, with the aim to cover all the possible faults. The entropy indices described in Section 2.1, Section 2.2, Section 2.3, Section 2.4, Section 2.5, Section 2.6 and Section 2.7 are commonly used for fault detection. Unfortunately, the classical models of these entropy-based indices are only useful for analyzing signals at one level (monoscale analysis), which does not provide the complete feature extraction of the signal.
To overcome the limitations of a monoscale analysis, multiscale-entropy-based methods were proposed, such as the method presented in Section 2.8. Despite their advantages, these methods suffer from some problems, such as indeterminacy and instability for short signals, in addition to low sensitivity for high-frequency systems. Accordingly, the main challenge and current research focus of entropy-based methods is multiresolution analysis, which is needed to obtain indices that entirely describe the dynamics of the signal under study based on all of its oscillatory components [115,116,117,118]. In general, new entropy-based methods aim to provide information about the signal’s state at various levels of oscillation and thus better extract the characteristics of the signals under study in order to detect a fault. It is important to mention that the current trend is the combination of entropy indices with artificial intelligence methods to improve the accuracy of control systems and fault classification. Another important aspect of entropy-based methods is their computational complexity, which determines whether online hardware implementations are feasible.
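The coarse-graining step that underlies most multiscale variants can be sketched as follows (a simplified, SampEn-based illustration under hypothetical parameters `m` and `r_factor`; not the refined or composite variants discussed in the reviewed works):

```python
import math

def coarse_grain(signal, scale):
    """Non-overlapping averages of length `scale`: the classic coarse-graining step."""
    n = len(signal) // scale
    return [sum(signal[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(signal, m=2, r_factor=0.2):
    """A plain (slightly simplified) SampEn; m and r_factor are hypothetical choices."""
    n = len(signal)
    mean = sum(signal) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in signal) / n) or 1.0
    r = r_factor * sd

    def matches(length):
        # Count template pairs whose Chebyshev distance stays within r.
        templates = [signal[i:i + length] for i in range(n - length + 1)]
        return sum(
            1
            for i in range(len(templates))
            for j in range(i + 1, len(templates))
            if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r
        )

    b, a = matches(m), matches(m + 1)
    return -math.log(a / b) if a and b else float("inf")

def multiscale_entropy(signal, scales=(1, 2, 3, 4)):
    """SampEn of the coarse-grained series at each scale."""
    return [sample_entropy(coarse_grain(signal, s)) for s in scales]
```

The instability for short signals mentioned above is visible here: each coarse-graining step divides the number of samples by the scale factor, so at large scales too few templates remain for reliable match counts.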
Nevertheless, the advantages of entropy-based methods over other methodologies are evident, owing to the capability of entropy indices to provide information about the dynamics of electromechanical systems at different abstraction levels. Some of the aspects captured by entropy indices are system complexity, stability and regularity, change detection, resilience to disturbances, hidden patterns and structures, anomaly detection, prediction of future events, and model validation, among others. In contrast to other methods, the calculation of entropy indices does not require a large amount of data, nor does it depend on the model and parameters of the electric machines.
Table 8. Advantages and disadvantages of different entropy methods.
Method | Advantages | Disadvantages
ShanEn | Allows for the assessment of the quantity of information in a signal. It is the basis of the following methods. | Its value only depends on the elements with probability ≠ 0; therefore, some elements could be neglected.
ApEn | Uncertainty estimation regarding future observations based on past observations. | Dependent on the selection of the hyperparameters. Dependent on the length of the signal. Self-similarity feature [87].
SE | Better performance and less sensitivity to data length compared with ApEn. | Dependent on the selection of the hyperparameters. Similarity criteria dependent on the Heaviside function [50].
FuzzyEn | Better consistency and less dependence on the signal length compared with SE. Reflects the complexity and self-similarity features of a signal better than SE and ApEn. | Dependent on the selection of parameters.
PerEn | High computational speed. Suitable for stationary and non-stationary signals. | Low discrimination capacity, given that it does not consider amplitude values.
DE | Faster calculation speed than PerEn. High stability. | Only analyzes the low-frequency part of the signal.
MSE | Analyzes the signal on multiple scales. | Efficiency dependent on the single-scale entropy method. Slower, given the entropy calculation within a range of scales.
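As a concrete instance of one entry in Table 8, the sketch below implements a basic dispersion entropy in pure Python (following the normal-CDF mapping of Rostaghi and Azami; the embedding dimension `m` and the number of `classes` are hypothetical choices). Its pattern counting avoids the per-window sorting that PerEn requires, which is one reason for its speed advantage:

```python
import math
from collections import Counter

def dispersion_entropy(signal, m=2, classes=6):
    """Basic dispersion entropy: map samples to `classes` levels through the
    normal CDF, then count patterns of m consecutive levels. The parameters
    m and `classes` are hypothetical choices."""
    n = len(signal)
    mean = sum(signal) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in signal) / n) or 1.0
    ncdf = lambda x: 0.5 * (1.0 + math.erf((x - mean) / (sd * math.sqrt(2.0))))
    levels = [min(classes, max(1, math.ceil(ncdf(x) * classes))) for x in signal]
    patterns = Counter(tuple(levels[i:i + m]) for i in range(n - m + 1))
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(classes ** m)  # normalized to [0, 1]
```

Because the mapping keeps amplitude information (unlike PerEn's purely ordinal patterns), DE can separate signals that share the same ordinal structure but differ in level.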

4. Future Trends

Over the years, the use of entropy methods has evolved with the aim of obtaining more accurate and robust results. To this end, improved methods have been proposed, such as generalized, multi-scale, composite, hierarchical, and multivariable entropy methods.
Some works have also proposed combined methods to overcome the drawbacks of using only one type of entropy. Most importantly, entropy methods are usually employed together with signal processing techniques, such as PCA, EMD, and EWT. Artificial-intelligence-based classifiers are also commonly used with entropy methods to achieve good classification accuracy when more than one type of fault is analyzed.
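A minimal version of this feature-vector-plus-classifier pipeline can be sketched as follows (pure Python, with a nearest-centroid classifier as a toy stand-in for the SVM/ELM classifiers cited in the reviewed works; the synthetic signals, segment length, and two-element feature vector are hypothetical):

```python
import math
from collections import Counter

def permutation_entropy(signal, m=3):
    """Normalized ordinal-pattern entropy, one possible entropy feature."""
    pats = Counter(tuple(sorted(range(m), key=lambda k: signal[i + k]))
                   for i in range(len(signal) - m + 1))
    n = sum(pats.values())
    h = -sum((c / n) * math.log2(c / n) for c in pats.values())
    return h / math.log2(math.factorial(m))

def features(signal):
    """Feature vector: (entropy, RMS) -- a toy stand-in for richer vectors."""
    rms = math.sqrt(sum(x * x for x in signal) / len(signal))
    return (permutation_entropy(signal), rms)

def train_nearest_centroid(labelled_segments):
    """labelled_segments: {label: [segment, ...]}; returns a predict(signal) function."""
    centroids = {
        lab: tuple(sum(col) / len(col) for col in zip(*(features(s) for s in segs)))
        for lab, segs in labelled_segments.items()
    }
    def predict(signal):
        vec = features(signal)
        return min(centroids, key=lambda lab: math.dist(vec, centroids[lab]))
    return predict

# Synthetic "healthy" (smooth sine) vs "faulty" (sine plus irregular component) segments.
healthy = [[math.sin(0.2 * i + p) for i in range(300)] for p in (0.0, 0.7, 1.4)]
faulty = [[math.sin(0.2 * i + p) + 0.6 * math.sin(1.7 * i + p) for i in range(300)]
          for p in (0.0, 0.7, 1.4)]
predict = train_nearest_centroid({"healthy": healthy, "faulty": faulty})
# A held-out faulty segment should land near the "faulty" centroid.
print(predict([math.sin(0.2 * i + 2.1) + 0.6 * math.sin(1.7 * i + 2.1) for i in range(300)]))
```

In the reviewed works the same structure holds, but with decomposition steps (EMD, EWT) before feature extraction and a trained classifier (SVM, ELM, ANFIS) in place of the centroid rule.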
As a summary, some of the trends observed during the elaboration of this work are listed below:
  • Most of the entropy methods are applied to vibration signals. This can be attributed to the nature of the signal and the straightforward acquisition. The presence of a fault in a motor usually increases the complexity of the vibration signal, given that it would introduce abnormal components in the spectrum. In this regard, it is expected that vibration analysis remains the preferred type of signal for entropy-based fault detection techniques.
  • Bearing fault detection is the fault type most often covered in entropy-based works. Other faults analyzed with entropy methods are gearbox faults, misalignment, and broken rotor bars, among other less common faults. However, these types of fault represent less than 10% of the works, compared with those that analyze bearing faults.
  • PE and FE are the most popular methods for motor fault detection. During the last few years, DE has also gained attention. Therefore, it is expected that these would remain the preferred methods, along with their variations, such as composite, weighted, refined, generalized, and multi-variable approaches.
  • The development of new entropy-based methods for multiresolution analysis to cover more than one oscillation pattern.
  • Multimodal analysis in combination with artificial intelligence techniques for monitoring, control, and multiple fault detection.
  • Adaptive entropy-based techniques capable of dynamically adjusting to changing operational conditions of electric motors.
  • Emphasis on computational complexity improvements based on algorithmic optimization techniques.
  • Hardware implementation of entropy-based methodologies for online monitoring.
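The online-monitoring and adaptivity trends above can be illustrated with a sliding-window entropy tracker (a hypothetical minimal design, not a published method; the window size and alarm threshold are arbitrary placeholders that a real deployment would tune to the motor and sampling rate):

```python
import math
from collections import Counter, deque

def window_pe(window, m=3):
    """Normalized permutation entropy of one buffered window."""
    pats = Counter(tuple(sorted(range(m), key=lambda k: window[i + k]))
                   for i in range(len(window) - m + 1))
    n = sum(pats.values())
    h = -sum((c / n) * math.log2(c / n) for c in pats.values())
    return h / math.log2(math.factorial(m))

class OnlineEntropyMonitor:
    """Streams samples through a fixed-size buffer and reports the current PE."""

    def __init__(self, size=128, threshold=0.8):
        self.buf = deque(maxlen=size)
        self.threshold = threshold

    def update(self, sample):
        """Returns (pe, alarm) once the buffer is full, otherwise None."""
        self.buf.append(sample)
        if len(self.buf) < self.buf.maxlen:
            return None
        pe = window_pe(list(self.buf))
        return pe, pe > self.threshold

# Feed a smooth signal: the entropy stays low, so no alarm is expected.
monitor = OnlineEntropyMonitor()
last = None
for i in range(300):
    last = monitor.update(math.sin(0.2 * i))
pe_smooth, alarm_smooth = last
```

The per-sample cost is dominated by recomputing the window entropy, which is the kind of algorithmic bottleneck that the optimization and hardware-implementation trends above target (e.g., with incremental pattern counts).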
It is important to mention that aspects such as algorithmic optimization and hardware implementation are fundamental areas of study. These areas aim to adapt fault detection technology to the emerging trends in electrical systems, particularly in line with the philosophy of smart systems that embrace trends like Industry 4.0 and the Internet of Things.

5. Conclusions

Different entropy methods have been proposed over the years, some of them aiming to improve the performance of older ones. In general, entropy methods are used to extract characteristics from the motor’s signals, which then feed a classification stage that is commonly based on artificial intelligence.
Vibration analysis stands out as the preferred signal type among all the entropy methods reported in this work. In the future, it would be valuable for the state of the art to propose analyses based on other physical variables, such as current or flux.
In the same regard, the analysis of a wider range of faults would be valuable, given that, over the years, the focus has remained on bearing fault detection.
Fuzzy entropy and dispersion entropy are among the most reliable methods for entropy-based fault detection thanks to their high stability, and they are less dependent on the selection of parameters than sample entropy and approximate entropy. Permutation entropy is another popular method, and it has shown very good classification accuracy when combined with a classifier such as SVM or ELM.
Multi-scale entropy has been preferred in recent years, given that it provides more accurate results than a one-scale entropy analysis. Although a multi-scale analysis has the drawback of a slower calculation, this is usually not critical, given that the progression of a fault, such as a bearing fault, is rather slow compared with the computation time of the method.
As machinery includes more sophisticated technologies and the demand for uninterrupted services by society increases, it is imperative to find new efficient and accurate mechanisms for fault detection and classification. Entropy-based methods are poised to play a pivotal role in the next generation of monitoring and control systems in conjunction with machine learning methods due to their capability to detect changes in dynamic systems over time.

Author Contributions

Conceptualization, S.A.-T., G.A.-A. and J.d.J.R.-M.; methodology, S.A.-T. and G.A.-A.; validation, S.A.-T.; investigation, S.A.-T. and G.A.-A.; resources, J.d.J.R.-M.; writing—original draft preparation, S.A.-T. and G.A.-A.; writing—review and editing, S.A.-T., G.A.-A. and J.d.J.R.-M.; visualization, S.A.-T. and G.A.-A.; supervision, J.d.J.R.-M. All authors read and agreed to the published version of the manuscript.

Funding

This research received no funding.

Data Availability Statement

Not applicable.

Acknowledgments

The authors acknowledge the financial support from CONACYT in the program of Doctor of Science.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ANFIS: Adaptive neuro-fuzzy inference system
ApEn: Approximate entropy
CFBEE: Characteristic frequency band energy entropy
CMSPE: Composite multi-scale permutation entropy
CMSWPE: Composite multi-scale weighted permutation entropy
CWC: Continuous wavelet coefficients
CWRU: Case Western Reserve University
DE: Dispersion entropy
EE: Energy entropy
EEMD: Ensemble empirical mode decomposition
ELM: Extreme learning machine
EMD: Empirical mode decomposition
EWT: Empirical wavelet transform
FAWT: Flexible analytical wavelet transform
GCMSFE: Generalized composite multi-scale fuzzy entropy
GCMSPE: Generalized composite multi-scale permutation entropy
GRCMSE: Generalized refined composite multi-scale sample entropy
GCMSSDE: Generalized composite multi-scale symbol dynamic entropy
GCMSWPE: Generalized composite multi-scale weighted permutation entropy
GRCMSSE: Generalized refined composite multi-scale sample entropy
GWO: Grey wolf optimizer
HSDE: Hierarchical symbolic dynamic entropy
HDE: Hierarchical dispersion entropy
HSE: Hierarchical sample entropy
HPE: Hierarchical permutation entropy
IEE: Improved energy entropy
IEMD: Improved empirical mode decomposition
IMSDE: Improved multi-scale dispersion entropy
IMSPE: Improved multi-scale permutation entropy
IR: Inner race
ISSD: Improved singular spectrum decomposition
IMS: Intelligence maintenance systems
LDA: Linear discriminant analysis
MBSE: Multi-band spectrum entropy
MHSE: Marginal Hilbert spectrum entropy
MLP: Multi-layer perceptron
WSST: Wavelet semi-soft threshold
MPA: Marine predators algorithm
MSE: Multi-scale entropy
MSDE: Multi-scale dispersion entropy
MSFDE: Multi-scale fluctuation-based dispersion entropy
MSFE: Multi-scale fuzzy entropy
MSSFE: Multi-scale symbolic fuzzy entropy
MMSFDE: Multivariable multi-scale fuzzy distribution entropy
MSPE: Multi-scale permutation entropy
MSSE: Multi-scale sample entropy
MSSDE: Multi-scale symbolic dynamic entropy
MSRCSDFE: Multi-scale refined composite standard deviation fuzzy entropy
MHMSFDE: Multivariable hierarchical multi-scale fluctuation dispersion entropy
mRMR: Max-relevance min-redundancy
OR: Outer race
PCA: Principal component analysis
PE: Permutation entropy
PGB: Planetary gearboxes
PSO: Particle swarm optimization
PNN: Probabilistic neural network
RCFOA: Reverse cognitive fruit fly optimization algorithm
RCMSAE: Refined composite multi-scale approximate entropy
RCMSDE: Refined composite multi-scale dispersion entropy
RCMSFE: Refined composite multi-scale fuzzy entropy
RF: Random forest
SFE: Symbolic fuzzy entropy
SVM: Support vector machine
SWD: Swarm decomposition
SMCMSDE: Stacking modified composite multi-scale dispersion entropy
t-SNE: t-distributed stochastic neighbor embedding
TSMSWPE: Time-shift multi-scale weighted permutation entropy
TSMSDE: Time-shift multi-scale dispersion entropy
VPMCD: Variable predictive models based discrimination
WMSFDE: Weighted multi-scale fluctuation-based dispersion entropy
WRCMSDE: Weighted refined composite multi-scale dispersion entropy
WRCMSFE: Weighted refined composite multi-scale fuzzy entropy
WRCMSPE: Weighted refined composite multi-scale permutation entropy

References

  1. Shannon, C.; Weaver, W. The Mathematical Theory of Communication; University of Illinois Press: Champaign, IL, USA, 1949. [Google Scholar]
  2. Chen, W.; Wang, Z.; Xie, H.; Yu, W. Characterization of Surface EMG Signal Based on Fuzzy Entropy. IEEE Trans. Neural Syst. Rehabil. Eng. 2007, 15, 266–272. [Google Scholar] [CrossRef] [PubMed]
  3. Richman, J.S. Sample Entropy Statistics and Testing for Order in Complex Physiological Signals. Commun. Stat. Theory Methods 2007, 36, 1005–1019. [Google Scholar] [CrossRef]
  4. Sharma, S.; Tiwari, S. A novel feature extraction method based on weighted multi-scale fluctuation based dispersion entropy and its application to the condition monitoring of rotary machines. Mech. Syst. Signal Process. 2022, 171, 108909. [Google Scholar] [CrossRef]
  5. Zhang, X.; Zhang, M.; Wan, S.; He, Y.; Wang, X. A bearing fault diagnosis method based on multiscale dispersion entropy and GG clustering. Measurement 2021, 185, 110023. [Google Scholar] [CrossRef]
  6. Martínez, M.; Guerra Carmenate, J.; Antonino-Daviu, J.; Dunai, L.; Fernández de Córdoba, P.; Velasco-Pla, P.; Conejero, A. Spectral Entropy and Frequency Cepstral Coefficients of Stray Flux Signals for Sparking Detection in DC Motors. In Proceedings of the 2023 IEEE 14th International Symposium on Diagnostics for Electrical Machines, Power Electronics and Drives, Chania, Greece, 28–31 August 2023; pp. 524–529. [Google Scholar] [CrossRef]
  7. Sharma, S.; Tiwari, S.; Singh, S. The rotary machine fault detection by hybrid method based on local mean decomposition and fluctuation based dispersion entropy. Mater. Today Proc. 2021, 43, 700–705. [Google Scholar] [CrossRef]
  8. Minhas, A.; Singh, S.; Sharma, N.; Kankar, P. Improvement in classification accuracy and computational speed in bearing fault diagnosis using multiscale fuzzy entropy. J. Braz. Soc. Mech. Sci. Eng. 2020, 42, 1–21. [Google Scholar] [CrossRef]
  9. Rostaghi, M.; Azami, H. Dispersion Entropy: A Measure for Time-Series Analysis. IEEE Signal Process. Lett. 2016, 23, 610–614. [Google Scholar] [CrossRef]
  10. Li, Y.; Wang, X.; Liu, Z.; Liang, X.; Si, S. The Entropy Algorithm and Its Variants in the Fault Diagnosis of Rotating Machinery: A Review. IEEE Access 2018, 6, 66723–66741. [Google Scholar] [CrossRef]
  11. Camarena-Martinez, D.; Valtierra-Rodriguez, M.; Amezquita-Sanchez, J.; Lieberman, D.; Romero-Troncoso, R.; Garcia-Perez, A. Shannon Entropy and K -Means Method for Automatic Diagnosis of Broken Rotor Bars in Induction Motors Using Vibration Signals. Shock Vib. 2016, 2016, 1–10. [Google Scholar] [CrossRef]
  12. Eskov, V.; Eskov, V.; Vochmina, Y.V.; Gorbunov, D.; Ilyashenko, L. Shannon entropy in the research on stationary regimes and the evolution of complexity. Mosc. Univ. Phys. Bull. 2017, 72, 309–317. [Google Scholar] [CrossRef]
  13. Zenil, H.; Hernández-Orozco, S.; Kiani, N.A.; Soler-Toscano, F.; Rueda-Toicen, A.; Tegnér, J. A Decomposition Method for Global Evaluation of Shannon Entropy and Local Estimations of Algorithmic Complexity. Entropy 2018, 20, 605. [Google Scholar] [CrossRef] [PubMed]
  14. De Queiroz, M.M.; Silva, R.W.; Loschi, R.H. Shannon entropy and Kullback–Leibler divergence in multivariate log fundamental skew-normal and related distributions. Can. J. Stat. 2016, 44, 219–237. [Google Scholar] [CrossRef]
  15. Bafroui, H.H.; Ohadi, A. Application of wavelet energy and Shannon entropy for feature extraction in gearbox fault detection under varying speed conditions. Neurocomputing 2014, 133, 437–445. [Google Scholar] [CrossRef]
  16. Dong, S.; Xu, X.; Luo, J. Mechanical Fault Diagnosis Method Based on LMD Shannon Entropy and Improved Fuzzy C-means Clustering. Int. J. Acoust. Vib. 2017, 22, 211–217. [Google Scholar] [CrossRef]
  17. Li, Y.; Wang, S.; Yang, Y.; Deng, Z. Multiscale symbolic fuzzy entropy: An entropy denoising method for weak feature extraction of rotating machinery. Mech. Syst. Signal Process. 2022, 162, 108052. [Google Scholar] [CrossRef]
  18. Sampaio, D.L.; Nicoletti, R. Detection of cracks in shafts with the Approximated Entropy algorithm. Mech. Syst. Signal Process. 2016, 72, 286–302. [Google Scholar] [CrossRef]
  19. Yentes, J.M.; Hunt, N.; Schmid, K.K.; Kaipust, J.P.; McGrath, D.; Stergiou, N. The appropriate use of approximate entropy and sample entropy with short data sets. Ann. Biomed. Eng. 2013, 41, 349–365. [Google Scholar] [CrossRef] [PubMed]
  20. Tomčala, J. New fast ApEn and SampEn entropy algorithms implementation and their application to supercomputer power consumption. Entropy 2020, 22, 863. [Google Scholar] [CrossRef] [PubMed]
  21. Vega, C.H.F.; Noel, J.; Fernández, J.R. Cognitive task discrimination using approximate entropy (ApEn) on EEG signals. In Proceedings of the 2013 ISSNIP Biosignals and Biorobotics Conference: Biosignals and Robotics for Better and Safer Living (BRC), Rio de Janerio, Brazil, 18–20 February 2013; pp. 1–4. [Google Scholar]
  22. Alù, F.; Miraglia, F.; Orticoni, A.; Judica, E.; Cotelli, M.; Rossini, P.M.; Vecchio, F. Approximate entropy of brain network in the study of hemispheric differences. Entropy 2020, 22, 1220. [Google Scholar] [CrossRef] [PubMed]
  23. Chuckravanen, D. Approximate entropy as a measure of cognitive fatigue: An eeg pilot study. Int. J. Emerg. Trends Sci. Technol. 2014, 1, 1036–1042. [Google Scholar]
  24. Lee, G.M.; Fattinger, S.; Mouthon, A.L.; Noirhomme, Q.; Huber, R. Electroencephalogram approximate entropy influenced by both age and sleep. Front. Neuroinform. 2013, 7, 33. [Google Scholar] [CrossRef] [PubMed]
  25. Yan, R.; Gao, R. Approximate Entropy as a diagnostic tool for machine health monitoring. Mech. Syst. Signal Process. 2007, 21, 824–839. [Google Scholar] [CrossRef]
  26. Zhao, S.; Liang, L.; Xu, G.; Wang, J.; Zhang, W. Quantitative diagnosis of a spall-like fault of a rolling element bearing by empirical mode decomposition and the approximate entropy method. Mech. Syst. Signal Process. 2013, 40, 154–177. [Google Scholar] [CrossRef]
  27. An, X.; Pan, L. Wind turbine bearing fault diagnosis based on adaptive local iterative filtering and approximate entropy. Proc. Inst. Mech. Eng. Part C J. Mech. Eng. Sci. 2017, 231, 3228–3237. [Google Scholar] [CrossRef]
  28. Ma, Y.; Cheng, J.; Wang, P.; Wang, J.; Yang, Y. Rotating machinery fault diagnosis based on multivariate multiscale fuzzy distribution entropy and Fisher score. Measurement 2021, 179, 109495. [Google Scholar] [CrossRef]
  29. Wu, S.D.; Wu, C.W.; Wu, T.Y.; Wang, C.C. Multi-Scale Analysis Based Ball Bearing Defect Diagnostics Using Mahalanobis Distance and Support Vector Machine. Entropy 2013, 15, 416–433. [Google Scholar] [CrossRef]
  30. Ouyang, G.; Li, J.; Liu, X.; Li, X. Dynamic characteristics of absence EEG recordings with multiscale permutation entropy analysis. Epilepsy Res. 2013, 104, 246–252. [Google Scholar] [CrossRef] [PubMed]
  31. Yang, Y.; Zhou, M.; Niu, Y.; Li, C.; Cao, R.; Wang, B.; Yan, P.; Ma, Y.; Xiang, J. Epileptic seizure prediction based on permutation entropy. Front. Comput. Neurosci. 2018, 12, 55. [Google Scholar] [CrossRef]
  32. Ferlazzo, E.; Mammone, N.; Cianci, V.; Gasparini, S.; Gambardella, A.; Labate, A.; Latella, M.A.; Sofia, V.; Elia, M.; Morabito, F.C.; et al. Permutation entropy of scalp EEG: A tool to investigate epilepsies: Suggestions from absence epilepsies. Clin. Neurophysiol. 2014, 125, 13–20. [Google Scholar] [CrossRef] [PubMed]
  33. Yin, Y.; Shang, P. Weighted multiscale permutation entropy of financial time series. Nonlinear Dyn. 2014, 78, 2921–2939. [Google Scholar] [CrossRef]
  34. Henry, M.; Judge, G. Permutation entropy and information recovery in nonlinear dynamic economic time series. Econometrics 2019, 7, 10. [Google Scholar] [CrossRef]
  35. He, C.; Wu, T.; Liu, C.; Chen, T. A novel method of composite multiscale weighted permutation entropy and machine learning for fault complex system fault diagnosis. Measurement 2020, 158, 107748. [Google Scholar] [CrossRef]
  36. Jinde, Z.; Junsheng, C.; Yang, Y. Multiscale Permutation Entropy Based Rolling Bearing Fault Diagnosis. Shock Vib. 2014, 2014, 154291. [Google Scholar] [CrossRef]
  37. Chen, Y.; Zhang, T.; Zhao, W.; Luo, Z.; Sun, K. Fault Diagnosis of Rolling Bearing Using Multiscale Amplitude-Aware Permutation Entropy and Random Forest. Algorithms 2019, 12, 184. [Google Scholar] [CrossRef]
  38. Yasir, M.N.; Koh, B.H. Data Decomposition Techniques with Multi-Scale Permutation Entropy Calculations for Bearing Fault Diagnosis. Sensors 2018, 18, 1278. [Google Scholar] [CrossRef] [PubMed]
  39. Rajabi, S.; Saman Azari, M.; Santini, S.; Flammini, F. Fault diagnosis in industrial rotating equipment based on permutation entropy, signal processing and multi-output neuro-fuzzy classifier. Expert Syst. Appl. 2022, 206, 117754. [Google Scholar] [CrossRef]
  40. Tiwari, R.; Gupta, V.; Kankar, P. Bearing fault diagnosis based on multi-scale permutation entropy and adaptive neuro fuzzy classifier. J. Vib. Control 2013, 21, 461–467. [Google Scholar] [CrossRef]
  41. Xu, F.; Tse, P. A method combining refined composite multiscale fuzzy entropy with PSO-SVM for roller bearing fault diagnosis. J. Cent. South Univ. 2019, 26, 2404–2417. [Google Scholar] [CrossRef]
  42. Vakharia, V.; Gupta, V.; Kankar, P. A multiscale permutation entropy based approach to select wavelet for fault diagnosis of ball bearings. J. Vib. Control 2015, 21, 3123–3131. [Google Scholar] [CrossRef]
  43. Li, Y.; Xu, M.; Wei, Y.; Huang, W. A new rolling bearing fault diagnosis method based on multiscale permutation entropy and improved support vector machine based binary tree. Measurement 2016, 77, 80–94. [Google Scholar] [CrossRef]
  44. Zheng, J.; Pan, H.; Yang, S.; Cheng, J. Generalized composite multiscale permutation entropy and Laplacian score based rolling bearing fault diagnosis. Mech. Syst. Signal Process. 2018, 99, 229–243. [Google Scholar] [CrossRef]
  45. Kuai, M.; Cheng, G.; Pang, Y.; Li, Y. Research of planetary gear fault diagnosis based on permutation entropy of CEEMDAN and ANFIS. Sensors 2018, 18, 782. [Google Scholar] [CrossRef] [PubMed]
  46. Du, W.; Guo, X.; Wang, Z.; Wang, J.; Yu, M.; Li, C.; Wang, G.; Wang, L.; Guo, H.; Zhou, J.; et al. A New Fuzzy Logic Classifier Based on Multiscale Permutation Entropy and Its Application in Bearing Fault Diagnosis. Entropy 2020, 22, 27. [Google Scholar] [CrossRef] [PubMed]
  47. Zheng, J.; Dong, Z.; Pan, H.; Ni, Q.; Liu, T.; Zhang, J. Composite multi-scale weighted permutation entropy and extreme learning machine based intelligent fault diagnosis for rolling bearing. Measurement 2019, 143, 69–80. [Google Scholar] [CrossRef]
  48. Xue, X.; Li, C.; Cao, S.; Sun, J.; Liu, L. Fault Diagnosis of Rolling Element Bearings with a Two-Step Scheme Based on Permutation Entropy and Random Forests. Entropy 2019, 21, 96. [Google Scholar] [CrossRef] [PubMed]
  49. Dong, Z.; Zheng, J.; Huang, S.; Pan, H.; Liu, Q. Time-Shift Multi-scale Weighted Permutation Entropy and GWO-SVM Based Fault Diagnosis Approach for Rolling Bearing. Entropy 2019, 21, 621. [Google Scholar] [CrossRef] [PubMed]
  50. Sharma, S.; Tiwari, S.; Singh, S. Integrated approach based on flexible analytical wavelet transform and permutation entropy for fault detection in rotary machines. Measurement 2021, 169, 108389. [Google Scholar] [CrossRef]
  51. He, C.; Wu, T.; Gu, R.; Jin, Z.; Ma, R.; Qu, H. Rolling bearing fault diagnosis based on composite multiscale permutation entropy and reverse cognitive fruit fly optimization algorithm—Extreme learning machine. Measurement 2021, 173, 108636. [Google Scholar] [CrossRef]
  52. Minhas, A.S.; Singh, S. A new bearing fault diagnosis approach combining sensitive statistical features with improved multiscale permutation entropy method. Knowl.-Based Syst. 2021, 218, 106883. [Google Scholar] [CrossRef]
  53. Vashishtha, G.; Chauhan, S.; Singh, M.; Kumar, R. Bearing defect identification by swarm decomposition considering permutation entropy measure and opposition-based slime mould algorithm. Measurement 2021, 178, 109389. [Google Scholar] [CrossRef]
  54. Richman, J.S.; Lake, D.E.; Moorman, J.R. Sample entropy. In Methods in Enzymology; Elsevier: Amsterdam, The Netherlands, 2004; Volume 384, pp. 172–184. [Google Scholar]
  55. Jie, X.; Cao, R.; Li, L. Emotion recognition based on the sample entropy of EEG. Bio-Med Mater. Eng. 2014, 24, 1185–1192. [Google Scholar] [CrossRef] [PubMed]
  56. Aktaruzzaman, M.; Sassi, R. Parametric estimation of sample entropy in heart rate variability analysis. Biomed. Signal Process. Control 2014, 14, 141–147. [Google Scholar] [CrossRef]
  57. Hu, X.; Jiang, J.; Cao, D.; Egardt, B. Battery health prognosis for electric vehicles using sample entropy and sparse Bayesian predictive modeling. IEEE Trans. Ind. Electron. 2015, 63, 2645–2656. [Google Scholar] [CrossRef]
  58. Mahajan, R.; Morshed, B.I. Unsupervised eye blink artifact denoising of EEG data with modified multiscale sample entropy, kurtosis, and wavelet-ICA. IEEE J. Biomed. Health Inform. 2014, 19, 158–165. [Google Scholar] [CrossRef] [PubMed]
  59. Xavier, S.F.A.; da Silva Jale, J.; Stosic, T.; dos Santos, C.A.C.; Singh, V.P. An application of sample entropy to precipitation in Paraíba State, Brazil. Theor. Appl. Climatol. 2019, 136, 429–440. [Google Scholar] [CrossRef]
  60. Han, M.; Pan, J. A fault diagnosis method combined with LMD, sample entropy and energy ratio for roller bearings. Measurement 2015, 76, 7–19. [Google Scholar] [CrossRef]
  61. Ni, Q.; Feng, K.; Wang, K.; Yang, B.; Wang, Y. A case study of sample entropy analysis to the fault detection of bearing in wind turbine. Case Stud. Eng. Fail. Anal. 2017, 9, 99–111. [Google Scholar] [CrossRef]
  62. Li, Y.; Feng, K.; Liang, X.; Zuo, M.J. A fault diagnosis method for planetary gearboxes under non-stationary working conditions using improved Vold-Kalman filter and multi-scale sample entropy. J. Sound Vib. 2019, 439, 271–286. [Google Scholar] [CrossRef]
  63. Guan, Z.; Liao, Z.; Li, K.; Chen, P. A precise diagnosis method of structural faults of rotating machinery based on combination of empirical mode decomposition, sample entropy, and deep belief network. Sensors 2019, 19, 591. [Google Scholar] [CrossRef]
  64. Wang, Z.; Yao, L.; Cai, Y. Rolling bearing fault diagnosis using generalized refined composite multiscale sample entropy and optimized support vector machine. Measurement 2020, 156, 107574. [Google Scholar] [CrossRef]
  65. Versaci, M.; Morabito, F.C. Image edge detection: A new approach based on fuzzy entropy and fuzzy divergence. Int. J. Fuzzy Syst. 2021, 23, 918–936. [Google Scholar] [CrossRef]
  66. Oliva, D.; Abd Elaziz, M.; Hinojosa, S.; Oliva, D.; Abd Elaziz, M.; Hinojosa, S. Fuzzy entropy approaches for image segmentation. Metaheuristic Algorithms for Image Segmentation: Theory and Applications; Springer: Berlin/Heidelberg, Germany, 2019; pp. 141–147. [Google Scholar]
  67. Xiang, J.; Li, C.; Li, H.; Cao, R.; Wang, B.; Han, X.; Chen, J. The detection of epileptic seizure signals based on fuzzy entropy. J. Neurosci. Methods 2015, 243, 18–25. [Google Scholar] [CrossRef] [PubMed]
  68. Cao, Z.; Lin, C.T. Inherent fuzzy entropy for the improvement of EEG complexity evaluation. IEEE Trans. Fuzzy Syst. 2017, 26, 1032–1035. [Google Scholar] [CrossRef]
  69. Liu, C.; Li, K.; Zhao, L.; Liu, F.; Zheng, D.; Liu, C.; Liu, S. Analysis of heart rate variability using fuzzy measure entropy. Comput. Biol. Med. 2013, 43, 100–108. [Google Scholar] [CrossRef] [PubMed]
  70. Azami, H.; Fernández, A.; Escudero, J. Refined multiscale fuzzy entropy based on standard deviation for biomedical signal analysis. Med Biol. Eng. Comput. 2017, 55, 2037–2052. [Google Scholar] [CrossRef] [PubMed]
  71. Joshi, D.; Kumar, S. Intuitionistic fuzzy entropy and distance measure based TOPSIS method for multi-criteria decision making. Egypt. Inform. J. 2014, 15, 97–104. [Google Scholar] [CrossRef]
  72. Song, Y.; Fu, Q.; Wang, Y.F.; Wang, X. Divergence-based cross entropy and uncertainty measures of Atanassov’s intuitionistic fuzzy sets with their application in decision making. Appl. Soft Comput. 2019, 84, 105703. [Google Scholar] [CrossRef]
  73. Zheng, J.; Pan, H.; Cheng, J. Rolling bearing fault detection and diagnosis based on composite multiscale fuzzy entropy and ensemble support vector machines. Mech. Syst. Signal Process. 2017, 85, 746–759. [Google Scholar] [CrossRef]
  74. Wei, Y.; Li, Y.; Xu, M.; Huang, W. Intelligent Fault Diagnosis of Rotating Machinery Using ICD and Generalized Composite Multi-Scale Fuzzy Entropy. IEEE Access 2019, 7, 38983–38995. [Google Scholar] [CrossRef]
  75. Li, Y.; Miao, B.; Zhang, W.; Chen, P.; Liu, J.; Jiang, X. Refined composite multiscale fuzzy entropy: Localized defect detection of rolling element bearing. J. Mech. Sci. Technol. 2019, 33, 109–120. [Google Scholar] [CrossRef]
  76. Zheng, J.; Cheng, J.; Yang, Y.; Luo, S. A rolling bearing fault diagnosis method based on multi-scale fuzzy entropy and variable predictive model-based class discrimination. Mech. Mach. Theory 2014, 78, 187–200. [Google Scholar] [CrossRef]
  77. Gao, S.; Wang, Q.; Zhang, Y. Rolling Bearing Fault Diagnosis Based on CEEMDAN and Refined Composite Multiscale Fuzzy Entropy. IEEE Trans. Instrum. Meas. 2021, 70, 1–8. [Google Scholar] [CrossRef]
  78. Chen, X.; Cheng, G.; Li, H.; Zhang, M. Diagnosing planetary gear faults using the fuzzy entropy of LMD and ANFIS. J. Mech. Sci. Technol. 2016, 30, 2453–2462. [Google Scholar] [CrossRef]
  79. Zair, M.; Rahmoune, C.; Djamel, B. Multi-fault diagnosis of rolling bearing using fuzzy entropy of empirical mode decomposition, principal component analysis, and SOM neural network. Proc. Inst. Mech. Eng. Part C J. Mech. Eng. Sci. 2018, 233, 095440621880551. [Google Scholar] [CrossRef]
  80. Malhotra, A.; Singh Minhas, A.; Singh, S.; Zuo, M.J.; Kumar, R.; Kankar, P.K. Bearing fault diagnosis based on flexible analytical wavelet transform and fuzzy entropy approach. Mater. Today Proc. 2021, 43, 629–635. [Google Scholar] [CrossRef]
  81. Li, Y.; Xu, M.; Zhao, H.; Huang, W. Hierarchical fuzzy entropy and improved support vector machine based binary tree approach for rolling bearing fault diagnosis. Mech. Mach. Theory 2016, 98, 114–132. [Google Scholar] [CrossRef]
  82. Vallejo, M.; Gallego, C.J.; Duque-Muñoz, L.; Delgado-Trejos, E. Neuromuscular disease detection by neural networks and fuzzy entropy on time-frequency analysis of electromyography signals. Expert Syst. 2018, 35, e12274. [Google Scholar] [CrossRef]
  83. Tran, M.Q.; Elsisi, M.; Liu, M.K. Effective feature selection with fuzzy entropy and similarity classifier for chatter vibration diagnosis. Measurement 2021, 184, 109962. [Google Scholar] [CrossRef]
  84. Zhao, H.; Sun, M.; Deng, W.; Yang, X. A New Feature Extraction Method Based on EEMD and Multi-Scale Fuzzy Entropy for Motor Bearing. Entropy 2017, 19, 14. [Google Scholar] [CrossRef]
  85. Deng, W.; Zhang, S.; Zhao, H.; Yang, X. A Novel Fault Diagnosis Method Based on Integrating Empirical Wavelet Transform and Fuzzy Entropy for Motor Bearing. IEEE Access 2018, 6, 35042–35056. [Google Scholar] [CrossRef]
  86. Zheng, J.; Jiang, Z.; Pan, H. Sigmoid-based refined composite multiscale fuzzy entropy and t-SNE based fault diagnosis approach for rolling bearing. Measurement 2018, 129, 332–342. [Google Scholar] [CrossRef]
  87. Minhas, A.S.; Singh, G.; Singh, J.; Kankar, P.; Singh, S. A novel method to classify bearing faults by integrating standard deviation to refined composite multi-scale fuzzy entropy. Measurement 2020, 154, 107441. [Google Scholar] [CrossRef]
  88. Chen, X.; Qi, X.; Wang, Z.; Cui, C.; Wu, B.; Yang, Y. Fault diagnosis of rolling bearing using marine predators algorithm-based support vector machine and topology learning and out-of-sample embedding. Measurement 2021, 176, 109116. [Google Scholar] [CrossRef]
  89. Xiao, Y.; Kang, N.; Hong, Y.; Zhang, G. Misalignment Fault Diagnosis of DFWT Based on IEMD Energy Entropy and PSO-SVM. Entropy 2017, 19, 6. [Google Scholar] [CrossRef]
  90. Liu, C.; Zhu, L.; Ni, C. Chatter detection in milling process based on VMD and energy entropy. Mech. Syst. Signal Process. 2018, 105, 169–182. [Google Scholar] [CrossRef]
  91. Ali, H.S.; Chakravorty, A.; Kalayan, J.; de Visser, S.P.; Henchman, R.H. Energy–entropy method using multiscale cell correlation to calculate binding free energies in the SAMPL8 host–guest challenge. J. Comput.-Aided Mol. Des. 2021, 35, 911–921. [Google Scholar] [CrossRef] [PubMed]
  92. Portillo, D.; García Orden, J.; Romero, I. Energy–entropy–momentum integration schemes for general discrete non-smooth dissipative problems in thermomechanics. Int. J. Numer. Methods Eng. 2017, 112, 776–802. [Google Scholar] [CrossRef]
  93. Xiao, Y.; Hong, Y.; Chen, X.; Chen, W. The Application of Dual-Tree Complex Wavelet Transform (DTCWT) Energy Entropy in Misalignment Fault Diagnosis of Doubly-Fed Wind Turbine (DFWT). Entropy 2017, 19, 587. [Google Scholar] [CrossRef]
  94. Yang, Z.; Kong, C.; Wang, Y.; Rong, X.; Wei, L. Fault diagnosis of mine asynchronous motor based on MEEMD energy entropy and ANN. Comput. Electr. Eng. 2021, 92, 107070. [Google Scholar] [CrossRef]
  95. Pang, B.; Tang, G.; Zhou, C.; Tian, T. Rotor Fault Diagnosis Based on Characteristic Frequency Band Energy Entropy and Support Vector Machine. Entropy 2018, 20, 932. [Google Scholar] [CrossRef] [PubMed]
  96. Gao, S.; Ren, Y.; Zhang, Y.; Li, T. Fault diagnosis of rolling bearings based on improved energy entropy and fault location of triangulation of amplitude attenuation outer raceway. Measurement 2021, 185, 109974. [Google Scholar] [CrossRef]
  97. Rostaghi, M.; Ashory, M.R.; Azami, H. Application of dispersion entropy to status characterization of rotary machines. J. Sound Vib. 2019, 438, 291–308. [Google Scholar] [CrossRef]
  98. Yan, X.; Jia, M. Intelligent fault diagnosis of rotating machinery using improved multiscale dispersion entropy and mRMR feature selection. Knowl.-Based Syst. 2019, 163, 450–471. [Google Scholar] [CrossRef]
  99. Azami, H.; da Silva, L.E.V.; Omoto, A.C.M.; Humeau-Heurtier, A. Two-dimensional dispersion entropy: An information-theoretic method for irregularity analysis of images. Signal Process. Image Commun. 2019, 75, 178–187. [Google Scholar] [CrossRef]
  100. Zhang, W.; Zhou, J. A Comprehensive Fault Diagnosis Method for Rolling Bearings Based on Refined Composite Multiscale Dispersion Entropy and Fast Ensemble Empirical Mode Decomposition. Entropy 2019, 21, 680. [Google Scholar] [CrossRef]
  101. Minhas, A.S.; Kankar, P.; Kumar, N.; Singh, S. Bearing fault detection and recognition methodology based on weighted multiscale entropy approach. Mech. Syst. Signal Process. 2021, 147, 107073. [Google Scholar] [CrossRef]
  102. Shao, K.; Fu, W.; Tan, J.; Wang, K. Coordinated approach fusing time-shift multiscale dispersion entropy and vibrational Harris hawks optimization-based SVM for fault diagnosis of rolling bearing. Measurement 2021, 173, 108580. [Google Scholar] [CrossRef]
  103. Tan, H.; Xie, S.; Liu, R.; Ma, W. Bearing fault identification based on stacking modified composite multiscale dispersion entropy and optimised support vector machine. Measurement 2021, 186, 110180. [Google Scholar] [CrossRef]
  104. Xue, Q.; Xu, B.; He, C.; Liu, F.; Ju, B.; Lu, S.; Liu, Y. Feature Extraction Using Hierarchical Dispersion Entropy for Rolling Bearing Fault Diagnosis. IEEE Trans. Instrum. Meas. 2021, 70, 3521311. [Google Scholar] [CrossRef]
  105. Zhou, F.; Han, J.; Yang, X. Multivariate hierarchical multiscale fluctuation dispersion entropy: Applications to fault diagnosis of rotating machinery. Appl. Acoust. 2021, 182, 108271. [Google Scholar] [CrossRef]
  106. Wu, S.D.; Wu, C.W.; Lee, K.Y.; Lin, S.G. Modified multiscale entropy for short-term time series analysis. Phys. A Stat. Mech. Its Appl. 2013, 392, 5865–5873. [Google Scholar] [CrossRef]
  107. Wu, S.D.; Wu, C.W.; Lin, S.G.; Lee, K.Y.; Peng, C.K. Analysis of complex time series using refined composite multiscale entropy. Phys. Lett. A 2014, 378, 1369–1374. [Google Scholar] [CrossRef]
  108. Miskovic, V.; MacDonald, K.J.; Rhodes, L.J.; Cote, K.A. Changes in EEG multiscale entropy and power-law frequency scaling during the human sleep cycle. Hum. Brain Mapp. 2019, 40, 538–551. [Google Scholar] [CrossRef]
  109. Yang, A.C.; Huang, C.C.; Yeh, H.L.; Liu, M.E.; Hong, C.J.; Tu, P.C.; Chen, J.F.; Huang, N.E.; Peng, C.K.; Lin, C.P.; et al. Complexity of spontaneous BOLD activity in default mode network is correlated with cognitive function in normal male elderly: A multiscale entropy analysis. Neurobiol. Aging 2013, 34, 428–438. [Google Scholar] [CrossRef] [PubMed]
  110. Bhattacharyya, A.; Pachori, R.B.; Upadhyay, A.; Acharya, U.R. Tunable-Q wavelet transform based multiscale entropy measure for automated classification of epileptic EEG signals. Appl. Sci. 2017, 7, 385. [Google Scholar] [CrossRef]
  111. Silva, L.E.; Duque, J.J.; Felipe, J.C.; Murta Jr, L.O.; Humeau-Heurtier, A. Two-dimensional multiscale entropy analysis: Applications to image texture evaluation. Signal Process. 2018, 147, 224–232. [Google Scholar] [CrossRef]
  112. Agarwal, A.; Maheswaran, R.; Sehgal, V.; Khosa, R.; Sivakumar, B.; Bernhofer, C. Hydrologic regionalization using wavelet-based multiscale entropy method. J. Hydrol. 2016, 538, 22–32. [Google Scholar] [CrossRef]
  113. Zhou, Y.; Wang, H.; Wang, G.; Kumar, A.; Sun, W.; Xiang, J. Semi-Supervised Multiscale Permutation Entropy-Enhanced Contrastive Learning for Fault Diagnosis of Rotating Machinery. IEEE Trans. Instrum. Meas. 2023, 72, 3525610. [Google Scholar] [CrossRef]
  114. Lizarraga-Morales, R.A.; Rodriguez-Donate, C.; Cabal-Yepez, E.; Lopez-Ramirez, M.; Ledesma-Carrillo, L.M.; Ferrucho-Alvarez, E.R. Novel FPGA-based methodology for early broken rotor bar detection and classification through homogeneity estimation. IEEE Trans. Instrum. Meas. 2017, 66, 1760–1769. [Google Scholar] [CrossRef]
  115. Germán-Salló, Z. Entropy indices based fault detection. Procedia Manuf. 2020, 46, 549–554. [Google Scholar] [CrossRef]
  116. Dhandapani, R.; Mitiche, I.; McMeekin, S.; Morison, G. Bearing Faults Diagnosis and Classification Using Generalized Gaussian Distribution Multiscale Dispersion Entropy Features. In Proceedings of the 2022 30th European Signal Processing Conference (EUSIPCO), Belgrade, Serbia, 29 August–2 September 2022; pp. 1452–1456. [Google Scholar]
  117. Chen, J.; Wen, L.; Jiang, B.; Lu, N.; Liu, J. Multi-feature fusion and IGWO-LSSVM based fault diagnosis of rolling bearings. In Proceedings of the 2023 CAA Symposium on Fault Detection, Supervision and Safety for Technical Processes (SAFEPROCESS), Yibin, China, 22–24 September 2023; pp. 1–6. [Google Scholar]
  118. Ma, C.; Cai, Z.; Li, Y.; Wang, X. Bearing Fault Detection Based on Multiresolution Permutation Entropy. In Proceedings of the 2023 5th International Conference on System Reliability and Safety Engineering (SRSE), Beijing, China, 20–23 October 2023; pp. 244–249. [Google Scholar]
Figure 1. Current signals from the two conditions of a motor: healthy and one broken bar.
Figure 2. Comparison of three different entropy methods for a practical case of broken bar fault detection.
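To make the kind of comparison shown in Figure 2 concrete, the following is a minimal sketch of normalized permutation entropy (Bandt–Pompe), one of the entropy families surveyed above. The "healthy" and "faulty" signals here are hypothetical stand-ins (a clean 50 Hz sinusoid versus the same sinusoid with added noise), not the measured stator currents of Figures 1 and 2; a noisier, more irregular signal yields a higher entropy value.

```python
import math
import random
from collections import Counter

def permutation_entropy(signal, m=3, tau=1):
    """Normalized permutation entropy (Bandt-Pompe) of a 1-D signal."""
    counts = Counter()
    n = len(signal) - (m - 1) * tau
    for i in range(n):
        window = signal[i:i + m * tau:tau]
        # Ordinal pattern: the ranks of the m samples within the window
        counts[tuple(sorted(range(m), key=window.__getitem__))] += 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values())
    return h / math.log(math.factorial(m))  # normalized to [0, 1]

# Hypothetical stand-ins for the two motor conditions: a clean 50 Hz
# sinusoid ("healthy") vs. the same sinusoid plus noise ("faulty").
random.seed(0)
healthy = [math.sin(2 * math.pi * 50 * t / 5000) for t in range(2000)]
faulty = [s + 0.3 * random.gauss(0, 1) for s in healthy]

print(permutation_entropy(healthy), permutation_entropy(faulty))
```

Multiscale variants such as those in [106,107] apply the same measure to coarse-grained copies of the signal, producing a feature vector over several time scales rather than a single value.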

Share and Cite

MDPI and ACS Style

Aguayo-Tapia, S.; Avalos-Almazan, G.; Rangel-Magdaleno, J.d.J. Entropy-Based Methods for Motor Fault Detection: A Review. Entropy 2024, 26, 299. https://doi.org/10.3390/e26040299



