Article

An Improved Moth-Flame Optimization Algorithm with Adaptation Mechanism to Solve Numerical and Mechanical Engineering Problems

1 Faculty of Computer Engineering, Najafabad Branch, Islamic Azad University, Najafabad 8514143131, Iran
2 Big Data Research Center, Najafabad Branch, Islamic Azad University, Najafabad 8514143131, Iran
3 Centre for Artificial Intelligence Research and Optimisation, Torrens University Australia, Brisbane 4006, Australia
4 Yonsei Frontier Lab, Yonsei University, Seoul 03722, Korea
5 Faculty of Computer Sciences and Informatics, Amman Arab University, Amman 11953, Jordan
6 School of Computer Sciences, Universiti Sains Malaysia, Pulau Pinang 11800, Malaysia
* Authors to whom correspondence should be addressed.
Submission received: 10 October 2021 / Revised: 18 November 2021 / Accepted: 25 November 2021 / Published: 6 December 2021
(This article belongs to the Special Issue Advances in Information Sciences and Applications)

Abstract:
The moth-flame optimization (MFO) algorithm, inspired by the transverse orientation of moths toward light sources, is an effective approach for solving global optimization problems. However, the MFO algorithm suffers from premature convergence, low population diversity, local optima entrapment, and an imbalance between exploration and exploitation. In this study, therefore, an improved moth-flame optimization (I-MFO) algorithm is proposed that copes with the canonical MFO’s issues by defining a memory for each moth and using it to identify moths trapped in local optima. The trapped moths then try to escape the local optima by taking advantage of the adapted wandering around search (AWAS) strategy. The efficiency of the proposed I-MFO is evaluated on the CEC 2018 benchmark functions and compared against other well-known metaheuristic algorithms. Moreover, the obtained results are statistically analyzed by the Friedman test on 30, 50, and 100 dimensions. Finally, the ability of the I-MFO algorithm to find the best optimal solutions for mechanical engineering problems is evaluated with three problems from the latest test suite, CEC 2020. The experimental and statistical results demonstrate that the proposed I-MFO is significantly superior to the contender algorithms and successfully remedies the shortcomings of the canonical MFO.

1. Introduction

In the majority of real-world optimization problems, a large number of decision variables interact with one another, which makes finding an exact solution very time-consuming [1,2,3,4,5,6,7]. Metaheuristic algorithms have been widely used in recent years to approximate near-optimal solutions for real-world problems in various applications such as discrete optimization [8,9,10,11,12,13,14,15,16,17], continuous optimization [18,19,20,21,22], and constrained engineering problems [23,24,25,26,27,28,29,30,31,32,33]. Moreover, a novel research field has emerged in this area that successfully combines machine learning and swarm intelligence approaches to obtain outstanding results in different areas [34,35,36]. Metaheuristic algorithms can be classified into two main categories: non-nature-inspired and nature-inspired [2]. Simulated annealing (SA) [37], tabu search (TS) [38], adaptive dimensional search (ADS) [39], and iterated local search (ILS) [40] are well-known non-nature-inspired metaheuristic algorithms. Although these algorithms have demonstrated remarkable local search capabilities, they may easily become trapped in local optima on complex problems [2,3].
Nature-inspired algorithms consist of three main categories: evolutionary, physics-based, and swarm intelligence (SI). Evolutionary algorithms are mostly inspired by Darwin’s theory of evolution. Some examples are: genetic algorithm (GA) [41,42], genetic programming (GP) [43], differential evolution (DE) [44], evolution strategy (ES) [45], and quantum-based avian navigation optimizer (QANA) [46]. Ensemble of mutation strategies and parameters in differential evolution (EPSDE) algorithm [47], multi-population ensemble DE (MPEDE) [48], ensemble of differential evolution variants (EDEV) [49], and multi-trial vector-based differential evolution (MTDE) [50] are some successful improvements on evolutionary algorithms. Physics-based algorithms propose meaningful search strategies inspired by physics and mathematics laws to solve optimization problems. The big bang–big crunch (BB-BC) [51], charged system search (CSS) [52], ray optimization (RO) [53], atom search optimization (ASO) [54], arithmetic optimization algorithm (AOA) [55], and atomic orbital search (AOS) [56] are some of the successful physics-based algorithms in the literature.
Swarm intelligence (SI) algorithms are mainly derived from the social interaction behavior of terrestrial animals, aquatic animals, birds, and insects in nature [57]. Grey wolf optimizer (GWO) [58], chimp optimization algorithm (ChOA) [59], and gorilla troops optimizer (GTO) [60] are inspired by the behavior of terrestrial animals to solve optimization problems. Despite their simplicity and broad use, they may suffer from common drawbacks such as low population diversity, sinking into local optimum, and premature convergence problems. Therefore, there have been many improvements with different approaches applied to overcome their weaknesses [61,62,63,64]. Some SI algorithms are inspired by the behavior of aquatic animals such as prey besieging and foraging, which have been modeled in krill herd (KH) [65], dolphin echolocation (DE) [66], and whale optimization algorithm (WOA) [67]. The social intelligence behaviors of birds and insects encourage researchers to propose a new generation of SI algorithms that are modeled by foraging, hunting, and navigation behavior. Ant colony optimization (ACO) [68], ant lion optimizer (ALO) [69], social spider algorithm (SSA) [70], crow search algorithm (CSA) [71], African vultures optimization algorithm (AVOA) [72], Aquila optimizer (AO) [73], and moth-flame optimization (MFO) [74] are popular SI algorithms inspired by the behaviors of birds and insects.
The moth-flame optimizer (MFO) is a prominent SI algorithm inspired by moths’ locomotion toward light sources. The MFO is appealing because of its ease of implementation, acceptable time complexity, and small number of parameters, which make it applicable to real-world optimization problems such as image segmentation [75,76,77], feature selection [78,79,80,81,82], food processing [83,84,85], and engineering optimization [86,87,88,89,90,91]. Consequently, many MFO variants have been developed, such as Lévy-flight moth-flame optimization (LMFO) [92], non-dominated sorting moth flame optimization (NS-MFO) [93], enhanced moth-flame optimization (EMFO) [94], water cycle–moth-flame optimization (WCMFO) [95], and sine-cosine moth-flame optimization (SMFO) [96]. However, MFO and its variants cannot satisfy the needs of the optimization process for challenging problems, and they still suffer from weaknesses such as low population diversity [97,98], premature convergence, local optima trapping [99,100], and an imbalance between exploration and exploitation [101]. The main reason for these weaknesses is that the majority of moths become trapped in local optima in the early iterations, which results in low population diversity. The question addressed in this study is how trapped moths can escape local optima and move to promising zones of the search space.
This paper proposes an improved MFO algorithm named I-MFO, which uses a moth memory mechanism and an adapted version of the wandering around search (WAS) strategy introduced in our prior study [57] to detect and possibly free trapped moths. In the I-MFO algorithm, trapped moths are detected by comparing their best experienced flame fitness (Fbest) with their current positions’ fitness (OM). If the current position of a moth is not better than its memory, the moth is considered trapped, and the adapted wandering around search (AWAS) strategy is employed to possibly free it from the local optimum by performing some random short flights, which also alleviates premature convergence. Moreover, the fl parameter is used to strike a balance between exploration and exploitation by limiting the moths’ flight range. Finally, the new position and its fitness value replace the previous ones if the new fitness value is better than the moth’s Fbest.
The proposed I-MFO is comprehensively investigated on the CEC 2018 benchmark functions [102] in dimensions 30, 50, and 100, and compared with state-of-the-art metaheuristic algorithms (SA, the continuous genetic algorithm (CGA) [42], GWO, WOA, ChOA, AOA, and the canonical MFO) and three MFO variants (LMFO, WCMFO, and SMFO). The results demonstrate that I-MFO can avoid local optima trapping, maintain population diversity, mitigate premature convergence, and strike a balance between exploration and exploitation. The I-MFO algorithm is statistically evaluated by the Friedman test and post hoc analysis to prove its superiority. Moreover, the applicability of I-MFO to real-world optimization problems is evaluated on three mechanical engineering problems from the latest test suite, CEC 2020 [103]. All experimental evaluations and statistical tests indicate that the I-MFO algorithm outperforms the contender algorithms with an overall effectiveness of 92%.
In the rest of this study, the MFO and its variants are discussed in Section 2. Section 3 illustrates the structure of the proposed I-MFO. The results of I-MFO in solving CEC 2018 benchmark test functions are given and analyzed in Section 4, and the results are proven by statistical analysis in Section 5. The applicability of the proposed I-MFO for solving real-world mechanical engineering problems is tested by three problems from the latest test-suite CEC 2020 in Section 6. Finally, conclusions and future directions are given in Section 7.

2. Related Works

In this section, the MFO algorithm is first described in detail, and then its variants are reviewed from the perspective of overcoming the MFO weaknesses.
The MFO algorithm proposed in [74] is a population-based SI algorithm that mimics moths’ navigation behavior toward light sources. Moths navigate toward a real light source (the moon) along a straight path by maintaining a fixed angle to it, a mechanism called transverse orientation. Moreover, moths are highly attracted to artificial lights such as flames; because of the close distance, they continuously change their flight angle, which forms a spiral path. This behavior is modeled by the MFO algorithm to solve optimization problems and is reviewed in [104,105,106]. In this algorithm, both moths and flames are considered solutions. First, moths are scattered randomly in the search space and their positions are saved in a matrix M, where rows indicate the number of moths (N) and columns represent dimensions (D). Then, the fitness value of each row of M is evaluated and stored in an array OM, as represented below.
$$M = \begin{bmatrix} m_{1,1} & m_{1,2} & \cdots & m_{1,D} \\ m_{2,1} & m_{2,2} & \cdots & m_{2,D} \\ \vdots & \vdots & \ddots & \vdots \\ m_{N,1} & m_{N,2} & \cdots & m_{N,D} \end{bmatrix}, \qquad OM = \begin{bmatrix} OM_1 \\ OM_2 \\ \vdots \\ OM_N \end{bmatrix}$$
The positions and fitness values of flames are stored in a matrix F and an array OF, respectively. In the first iteration, OF is initialized by sorting OM in ascending order, and the correspondingly sorted positions are assigned to the matrix F. In subsequent iterations, F is updated with the best N search agents from the current F and M populations. In the optimization process, each moth Mi(t) in the current iteration t moves around its corresponding flame Fj using the logarithmic spiral defined in Equation (1), where Disi is computed by Equation (2), b determines the shape of the logarithmic spiral, and k is a random number in the interval (−1, 1).
$$M_i(t) = Dis_i \cdot e^{bk} \cdot \cos(2\pi k) + F_j(t) \qquad (1)$$
$$Dis_i(t) = \left| F_j(t) - M_i(t) \right| \qquad (2)$$
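As a minimal sketch, the spiral movement of Equations (1) and (2) can be written in a few lines of NumPy. The function name and the default value of b are illustrative, not taken from the paper's implementation:

```python
import numpy as np

def spiral_update(moth, flame, b=1.0, rng=None):
    """One MFO step: move a moth around its flame on a logarithmic spiral."""
    rng = rng if rng is not None else np.random.default_rng()
    dis = np.abs(flame - moth)                    # Eq. (2): per-dimension distance to the flame
    k = rng.uniform(-1.0, 1.0, size=moth.shape)   # random spiral coefficient in (-1, 1)
    return dis * np.exp(b * k) * np.cos(2 * np.pi * k) + flame  # Eq. (1)
```

Because k is drawn per dimension, each coordinate of the moth follows its own point on the spiral, which is how most public MFO implementations vectorize the update.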
In this algorithm, the number of flames is computed by Equation (3), where t indicates the current iteration, and N and MaxIt are the number of search agents and the maximum number of iterations, respectively.
$$flame\_no = \mathrm{round}\left( N - t \times \frac{N-1}{MaxIt} \right) \qquad (3)$$
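Equation (3) reduces to a one-line helper (the function name is illustrative); the flame count shrinks linearly from N at the start of the run to 1 at the end:

```python
def flame_count(t, N, max_it):
    """Eq. (3): the number of flames decreases linearly from N to 1 over the run.
    Note: Python's round() rounds ties to even, unlike MATLAB's round()."""
    return round(N - t * (N - 1) / max_it)
```

For example, with N = 100 and MaxIt = 100, the count drops from 99 at t = 1 to 1 at t = 100.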
The MFO algorithm is an effective problem solver that is widely applied to real-world optimization problems. However, MFO is prone to being trapped in local optima and suffers from premature convergence due to its loss of population diversity and the imbalance between the two tendencies of exploration and exploitation. Therefore, many variants have been proposed to boost the MFO algorithm; they can be categorized into improvements based on new search strategies or operators, and hybridizations with other algorithms. Figure 1 shows the classification of SI algorithms and MFO variants.
Many algorithms have been proposed by introducing new search strategies or operators into the canonical MFO. Li et al. [92] proposed the LMFO algorithm, employing the Lévy-flight strategy to increase population diversity. Savsani et al. [93] proposed the effective non-dominated moth-flame optimization algorithm (NS-MFO) to solve multi-objective problems using the elitist non-dominated sorting method. The opposition-based moth-flame optimization (OMFO) [107] introduces an opposition-based scheme into the canonical MFO to avoid local optima and increase global exploration. In the EMFO [94], Gaussian mutation (GM) was added to MFO to increase diversity. In LGCMFO [100], Xu et al. used operators such as Gaussian mutation (GM), Lévy mutation (LM), and Cauchy mutation (CM) to boost exploration and exploitation capabilities and counter the local optima trapping of MFO. In addition, Hongwei et al. [99] presented the chaos-enhanced moth-flame optimization (CMFO) with ten chaotic maps to cope with MFO’s deficiencies. Sapre et al. [108] also proposed an OMFO to cope with premature convergence and local optima trapping by combining opposition-based learning, Cauchy mutation, and evolutionary boundary constraint handling. In 2020, Kaur et al. [97] proposed E-MFO, adding a Cauchy mutation (CM) to improve the distribution of the algorithm in the search space. The improved moth-flame optimization (IMFO) [98] algorithm proposes a new flame generation strategy and divides the optimization iterations into three phases to counter low population diversity and enhance MFO’s search balance, respectively. An improved MFO algorithm called QSMFO was proposed in [109] to boost MFO’s exploitation capability and exploration rate by introducing a simulated annealing strategy and a quantum rotation gate, respectively.
Some variants proposed hybrid-based improvements to the MFO algorithm to boost its performance. MFOGSA [83] combines MFO with the gravitational search algorithm (GSA) to exploit MFO’s exploration and GSA’s exploitation capabilities. Bhesdadiya et al. [110] proposed a hybrid PSO-MFO algorithm to solve optimization problems. SA-MFO [111] combines MFO with simulated annealing (SA) to overcome local optima trapping and a low convergence rate. Khalilpourazari et al. [95] proposed WCMFO to counter MFO’s entrapment in local optima and low convergence rate by taking advantage of the water cycle algorithm (WCA). A combination of MFO and an artificial neural network (ANN-MFO) was proposed by Singh et al. [112] to solve multi-objective problems in magnetic abrasive finishing of aluminum. Chen et al. [96] introduced SMFO to improve the exploration capability of MFO by integrating it with the sine cosine strategy. An enhanced MFO algorithm was proposed by MP Dang et al. [113], which hybridizes MFO with three different methods to solve the design problem of a flexure hinge. Mittal [114] introduced an enhanced moth-flame optimization that integrates MFO and variable neighborhood search to boost the search capabilities and convergence accuracy of the canonical MFO. In a recent study, Abd Elaziz et al. [115] proposed the FCHMD algorithm, a hybridization of the Harris hawks optimizer and MFO. In this algorithm, fractional-order Gauss and 2xmod1 chaotic maps are used to generate the initial population. Moreover, the FCHMD algorithm ameliorates premature convergence and stagnation in local optima by applying evolutionary population dynamics. Ahmed et al. [116] introduced DMFO-DE, a discrete hybrid algorithm that integrates differential evolution and MFO to counter the local optima problem and improve convergence speed. Li et al. [117] proposed the ODSFMFO algorithm, which consists of an improved flame generation mechanism based on opposition-based learning (OBL) and the differential evolution (DE) algorithm, and an enhanced local search mechanism based on the shuffled frog leaping algorithm (SFLA) and a death mechanism.
Based on the above review of MFO and its variants, the most serious drawbacks of MFO are premature convergence, getting stuck in local optima, low population diversity, and a deficient balance between exploration and exploitation. Therefore, in this study, the improved moth-flame optimization (I-MFO) algorithm is proposed to address MFO’s shortcomings by introducing a memory mechanism and an adapted version of the wandering around search (WAS) strategy [57], called the AWAS strategy, into the canonical MFO.

3. Proposed Algorithm

The proposed improved moth-flame optimization (I-MFO) algorithm is boosted by a moth memory mechanism and the adapted wandering around search (AWAS) strategy to overcome the mentioned shortcomings of the canonical MFO algorithm. The moth memory mechanism is inspired by moths’ ability in nature to remember their experiences [118] and is given by Definition 1. Moreover, the AWAS strategy, introduced in Definition 2, helps trapped moths escape from local optima and alleviates premature convergence. The pseudo-code and the flowchart of the proposed I-MFO are shown in Algorithm 1 and Figure 2, respectively.
Definition 1 (Moth memory construction).
Suppose Mem = {Mem1, Mem2, …, Memi, …, MemN} is a finite set of N moths’ memories. The memory of moth Mi is denoted by Memi = (Mbesti, Fbesti), where Mbesti is the best position of Mi obtained so far, and Fbesti is the fitness value of Mbesti. In the first iteration (t = 1), Mbesti ← Mi and Fbesti ← OMi. For the remaining iterations, Mbesti(t > 1) ← Mi(P) and Fbesti(t > 1) ← OMi(P), where P ∈ {2, …, t} is the iteration in which Mi achieved its best fitness. The moth memory construction is formulated in Equation (4).
$$\begin{cases} Fbest_i(t) \leftarrow OM_i(t) \ \text{and} \ Mbest_i(t) \leftarrow M_i(t), & \text{if } t = 1 \\ Fbest_i(t) \leftarrow OM_i(P) \ \text{and} \ Mbest_i(t) \leftarrow M_i(P), & \text{if } t > 1 \end{cases} \quad \text{such that } \nexists\, s \in \{2, \dots, t\}: OM_i(s) < OM_i(P) \qquad (4)$$
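Definition 1 amounts to an elementwise best-so-far update over the whole population. A NumPy sketch (minimization assumed; array and function names are illustrative, not the paper's):

```python
import numpy as np

def update_memory(m_best, f_best, positions, fitness):
    """Definition 1: keep each moth's best position and fitness found so far
    (minimization is assumed). Arrays are updated in place and returned."""
    improved = fitness < f_best            # moths whose current position beats their memory
    m_best[improved] = positions[improved]
    f_best[improved] = fitness[improved]
    return m_best, f_best
```

Calling this once per iteration keeps (Mbest, Fbest) consistent with Equation (4) without storing the full history of positions.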
Algorithm 1. The pseudo-code of the improved moth-flame optimization (I-MFO)
Input: Maximum iterations (MaxIt), number of moths (N), and dimension size (D).
Output: The best flame position and its fitness value.
1:  Begin
2:    Randomly distribute N moths in the D-dimensional search space.
3:    Calculate moths’ fitness (OM).
4:    Set t = 1.
5:    OF ← sort (OM).
6:    F ← sort (M).
7:    Define the moth memory Mbest and Fbest using Definition 1.
8:    While t ≤ MaxIt
9:      Update F and OF by the best N moths from F and current M.
10:     Update flame_no using Equation (3).
11:     For i = 1 : N
12:       Compute the distance between moth Mi(t) and flame Fj(t) using Equation (2).
13:       Update the position of Mi(t) using Equation (1).
14:       Compute the fitness value of Mi(t) and update OMi(t).
15:       If Fbesti(t) < OMi(t)
16:         Select a random moth Mr(t).
17:         Update the position of Mi(t) using the AWAS strategy defined in Definition 2.
18:         Update the fitness value OMi(t).
19:       End if
20:       Update the moth memory Memi using Definition 1.
21:     End for
22:     Update the position and fitness value of the global best flame.
23:     t = t + 1.
24:   End while
Definition 2 (AWAS strategy).
Consider TM(t) = {M1, …, Mi, …} as a finite set of moths trapped in the current iteration t, i.e., moths that could not dominate their memories Memi (OMi(t) > Fbesti). Then, to possibly free the trapped moth Mi from the local optimum, its new position at t + 1 is computed by Equation (5), where Fgbestj(t) is the jth dimension of the global best flame, ri is a random number in the interval (0, 1), and Mrj(t) is the jth dimension of a randomly selected moth. The flight length fli(t) for moth Mi is computed by Equation (6), where δ1 and δ2 are user-defined parameters, NF is the number of flights determined randomly in [1, D], and q is the current flight number. Using the AWAS strategy with a random NF gives the trapped moth Mi an opportunity to move to a better position.
$$M_i^j(t+1) = Fgbest^j(t) + r_i \times fl_i(t) \times \left( M_r^j(t) - M_i^j(t) \right) \qquad (5)$$
$$fl_i(t) = \delta_1 - q \times \left( \frac{\delta_2}{NF} \right) \qquad (6)$$
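A minimal sketch of the AWAS escape loop, under stated assumptions: minimization is assumed, the δ1/δ2 defaults are placeholders rather than the paper's settings, Equation (6) is read here as fl = δ1 − q·(δ2/NF), and each flight modifies one randomly chosen dimension, following the description in Section 4.3:

```python
import numpy as np

def awas_escape(moth, f_gbest, rand_moth, fitness_fn, delta1=2.0, delta2=1.0, rng=None):
    """Definition 2 sketch: try up to NF short flights for a trapped moth and
    keep the best position found (minimization assumed)."""
    rng = rng if rng is not None else np.random.default_rng()
    D = moth.size
    best = moth.copy()
    best_fit = fitness_fn(best)
    nf = int(rng.integers(1, D + 1))       # NF drawn randomly from [1, D]
    trial = moth.copy()
    for q in range(1, nf + 1):             # q is the current flight number
        fl = delta1 - q * (delta2 / nf)    # flight length, Eq. (6) as reconstructed here
        j = int(rng.integers(D))           # pick one dimension to modify
        r = rng.uniform()                  # r_i in (0, 1)
        trial[j] = f_gbest[j] + r * fl * (rand_moth[j] - trial[j])  # Eq. (5)
        fit = fitness_fn(trial)
        if fit < best_fit:                 # accept only improvements
            best, best_fit = trial.copy(), fit
    return best, best_fit
```

Because the loop starts from the trapped moth's own fitness and only accepts improvements, the returned position is never worse than the original one.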

4. Numerical Experiment and Analysis

In this section, the performance of the proposed I-MFO is evaluated using the CEC 2018 benchmark [102]. The proposed algorithm is compared with state-of-the-art metaheuristic algorithms, including SA [37], CGA [42], GWO [58], WOA [67], ChOA [59], AOA [55], and the canonical MFO [74], as well as its variants LMFO [92], WCMFO [95], and SMFO [96]. The parameter settings of the comparative algorithms are adjusted as in their original papers and are reported in Table 1. The obtained results are reported in Table 2, Table 3 and Table 4, where bold values indicate the winning algorithm. Furthermore, at the end of each table, the symbols W, T, and L denote the number of wins, ties, and losses of each algorithm, respectively.

4.1. Benchmark Test Functions and Experimental Environment

The performance of the proposed algorithm is evaluated using the CEC 2018 benchmark functions with various dimensions of 30, 50, and 100. This benchmark contains 29 test functions with a diverse set of characteristics: unimodal, simple multimodal, hybrid, and composition. Test functions F1 and F3 are unimodal functions and they are adequate for evaluating the exploitation of algorithms. Test functions F4–F10 are multimodal with many local optima which are suitable to assess the exploration abilities of algorithms. Test functions F11–F20 are hybrid and F21–F30 are composition functions that can evaluate the local optima avoidance ability and balance between exploration and exploitation.
Due to the stochastic nature of SI algorithms and to guarantee fair comparisons, all experiments for each function were repeated 30 times on a laptop with an Intel Core i7-10750H CPU (2.60 GHz) and 24 GB of memory. MATLAB R2020a and the Windows 10 operating system were used to conduct all experiments. All algorithms were run under the same conditions, with a population size (N) of 100 and a maximum number of iterations (MaxIt) of (D × 10^4)/N.

4.2. Exploitation and Exploration Analysis

In this experimental evaluation, the unimodal functions F1 and F3 are considered to assess the exploitation abilities, while the multimodal test functions F4–F10 are dedicated to evaluating the exploration capabilities.
According to Table 2, the results show that I-MFO provides superior exploitation around the optimum solution for all dimensions, particularly on test function F1. The results on the multimodal test functions are evidence that the exploration capability of I-MFO significantly outperforms that of all other algorithms in different dimensions. The comparison and reported results lead to the conclusion that the proposed I-MFO has an effective exploration ability due to the random movements of trapped moths produced by the AWAS strategy. Meanwhile, considering the best flame position and a limited flight length (fl) keeps the exploitation ability functional over the course of the iterations. Moreover, the comparison of fitness distributions is shown by box-and-whisker diagrams in Figure 3. The diagrams predominantly demonstrate that the proposed I-MFO can find the best solutions during the optimization process, which verifies that its exploration and exploitation abilities are superior to those of the other competitors.

4.3. Local Optima Avoidance Evaluation

This experimental evaluation benchmarks the ability of the proposed algorithm against the contender algorithms in terms of local optima avoidance and striking a balance between exploration and exploitation, based on the hybrid and composition function results. The results tabulated in Table 3 and Table 4 indicate that the proposed I-MFO algorithm is superior to the contender algorithms in dimensions 30, 50, and 100. The main reason is that the AWAS strategy helps trapped moths escape local optima and obtain better positions by replacing random dimensions of trapped moths with the corresponding dimensions of the best flame and a random moth’s position. The random moth causes the trapped moth to explore the search space and increases population diversity, while considering the best flame simultaneously enhances the exploitation capability of the algorithm. Furthermore, Figure 4 visualizes the comparison of fitness distributions using box-and-whisker diagrams, in which almost all diagrams demonstrate that the proposed I-MFO can find the best solutions during the optimization process. This verifies that I-MFO provides a satisfactory equilibrium between exploration and exploitation.

4.4. I-MFO Overall Effectiveness

The overall effectiveness (OE) [50] of the I-MFO and other contender algorithms is computed by using results reported in Table 2, Table 3 and Table 4. The OE results tabulated in Table 5 are calculated using Equation (7), where N indicates the number of test functions and L is the number of losses of each algorithm. The results reveal that I-MFO with overall effectiveness of 92% is the most effective algorithm for all dimensions: 30, 50, and 100.
$$OE\,(\%) = \left( \frac{N - L}{N} \right) \times 100 \qquad (7)$$
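Equation (7) is a simple percentage of comparisons an algorithm does not lose; for instance, losing 8 of 100 comparisons yields an OE of 92% (the function name is illustrative):

```python
def overall_effectiveness(n_functions, n_losses):
    """Eq. (7): OE (%) = ((N - L) / N) * 100, where N is the number of test
    functions and L is the number of losses."""
    return (n_functions - n_losses) / n_functions * 100
```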

4.5. Convergence Behavior Analysis

In this section, the convergence behavior of I-MFO is assessed and compared with contender algorithms on some selected functions with dimensions 30, 50, and 100. The convergence curves of the best fitness values obtained by each algorithm on unimodal and multimodal test functions are plotted in Figure 5. Moreover, the convergence curves of hybrid and composition test functions are plotted in Figure 6.
Investigating the convergence behavior of I-MFO reveals several patterns. The most common is an accelerated descent toward the promising area in the early iterations, which can be seen in 30D (F5, F15, F16, F26), 50D (F6, F15, F16, F26, F30), and 100D (F5, F7, F8, F15, F16, F26). For some functions such as 30D (F1, F7, F8, F18, F30), 50D (F1, F8, F18), and 100D (F1, F10, F18, F30), I-MFO shows abrupt movements in the first half of the iterations and very low variation in the second half, which demonstrates an efficient balance between exploration and exploitation. Finally, for 30D (F3, F7, F10, F12, F20), 50D (F3, F5, F8, F10, F12, F18, F20), and 100D (F3, F12, F18, F20), I-MFO starts its convergence with a steep descent and then changes to a gradual trend toward the optimum solution until the final iterations. This behavior demonstrates the ability of I-MFO to escape from local optima and take advantage of the last iterations.

4.6. Population Diversity Analysis

In metaheuristic algorithms, maintaining population diversity is important throughout the optimization process. Low diversity among search agents may cause the algorithm to plunge into a local optimum. In this experiment, the population diversity of the proposed I-MFO and the contender algorithms is measured by the moment of inertia (Ic) [119], which quantifies the spread of individuals around their mass center, as given by Equation (8); the mass center cj for j = 1, 2, …, D is calculated by Equation (9).
$$I_c = \sum_{i=1}^{N} \sum_{j=1}^{D} \left( M_{ij} - c_j \right)^2 \qquad (8)$$
$$c_j = \frac{1}{N} \sum_{i=1}^{N} M_{ij} \qquad (9)$$
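Equations (8) and (9) can be computed directly with NumPy, taking the mass center as the mean of the N individuals in each dimension (the function name is illustrative):

```python
import numpy as np

def moment_of_inertia(M):
    """Eqs. (8)-(9): sum of squared deviations of the N individuals (rows of M)
    from their mass center, taken here as the column-wise mean."""
    c = M.mean(axis=0)                 # mass center c_j over the N individuals
    return float(np.sum((M - c) ** 2))
```

A fully converged population (all rows identical) yields Ic = 0, so a curve of Ic over iterations directly visualizes diversity loss.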
The presented population diversity metric measures the distribution of search agents; the diversity curves of the proposed algorithm and the contender algorithms are plotted in Figure 7. This experiment is conducted on some CEC 2018 benchmark functions with dimensions 30, 50, and 100. Comparing the convergence curves in Figure 5 and Figure 6 with the diversity curves plotted in Figure 7 reveals that I-MFO can effectively maintain diversity among solutions until a near-optimal solution is reached.

4.7. Sensitivity Analysis on the Number of Flight (NF) Parameter

As discussed in Definition 2, the NF parameter is the number of opportunities each trapped moth has to fly in the search space and possibly obtain a better position. Hence, in this experiment, the impact of different values of the NF parameter is evaluated and discussed. The curves plotted in Figure 8 illustrate the convergence behavior of MFO compared with different variants of the I-MFO algorithm. In I-MFO-NF1 and I-MFO-NF5, NF is set to 1 and 5, respectively, while in I-MFO, NF is set to a random number in [1, D]. The results obtained for dimensions 30, 50, and 100 reveal that setting NF to a random number bounded by the dimension gives trapped moths a better chance of escaping from local optima on different test functions.

4.8. Impact Analysis of Applying AWAS Strategy

In this experiment, the impact of applying the AWAS strategy is analyzed on some selected functions of the CEC 2018 benchmark for different dimensions 30, 50, and 100. The proposed AWAS strategy can ameliorate the MFO’s weaknesses described in Section 2. To adequately assess the impact of applying the AWAS strategy, in this experiment we consider MFO, I-MFO, and its three variations including I-MFO-10%, I-MFO-40%, and I-MFO-80% which indicate the percentage of trapped moths that are randomly selected to possibly escape from the local optima using the proposed AWAS strategy.
The first row of Figure 9 shows the convergence curves for the unimodal function F1, where I-MFO and its variations outperform MFO in all dimensions. Specifically, for dimension 100, I-MFO-10% offers superior outcomes at a lower computational cost than I-MFO. The curves for the multimodal functions F5 and F10 indicate that I-MFO offers better solutions, while in the next ranks, I-MFO-80%, I-MFO-40%, and I-MFO-10% outperform MFO. The hybrid test function is shown in the fourth row of Figure 9, where I-MFO and its variations keep converging toward the global optimum with a steep slope until the final iterations. I-MFO and its variations also find better solutions for the composition functions F22 and F26; for these functions, as the number of dimensions grows, the significance of the AWAS strategy in guiding the population toward the global optimum region and avoiding local optima entrapment becomes clearer. Although the results demonstrate that I-MFO, with 100% of trapped moths subjected to the AWAS strategy, mostly provides better solutions across dimensions and search spaces, the other variations of I-MFO also deliver competitive performance at a lower computational cost than I-MFO.

5. Statistical Analysis

In this section, the results obtained in the preceding section are first statistically analyzed using the non-parametric Friedman test. The Bonferroni and Tukey post hoc procedures are then conducted to establish proper comparisons between the proposed algorithm and the comparative algorithms.

5.1. Non-Parametric Friedman Test

The Friedman test is performed to statistically rank the algorithms by their performance [120,121]. The results for the unimodal and multimodal test functions are tabulated in Table 6, and the results for the hybrid and composition functions are reported in Table 7. This statistical analysis shows that the I-MFO ranks first on all test functions for dimensions 30, 50, and 100.
Figure 10 contains six charts that visually show the ranking of the I-MFO and the contender algorithms in different dimensions. The left side illustrates the ranking of the algorithms on the various functions of the CEC 2018 benchmark, while the right side shows a bar chart of the average Friedman test results. The radar graphs show that the I-MFO outperforms the other algorithms in different dimensions, as the smaller area enclosed by its curve indicates first or second rank on all functions. The bar chart on the right reveals that the I-MFO is superior to the other comparative algorithms, as it has the shortest bar for all dimensions 30, 50, and 100.
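The Friedman ranking underlying Table 6 and Table 7 can be computed without any statistics package. The stdlib-only Python sketch below is illustrative: the error data are hypothetical and the function name is ours. It ranks k algorithms on each problem (lower error gets a lower, better rank, with ties receiving average ranks) and returns the chi-square-distributed Friedman statistic.

```python
def friedman_statistic(results):
    """Friedman chi-square statistic for k algorithms over n problems.

    results[i][j] is algorithm j's error on problem i (lower is
    better). Ties within a row receive average ranks."""
    n, k = len(results), len(results[0])
    rank_sums = [0.0] * k
    for row in results:
        order = sorted(range(k), key=lambda j: row[j])
        ranks = [0.0] * k
        i = 0
        while i < k:
            j = i
            # extend j over a group of tied values
            while j + 1 < k and row[order[j + 1]] == row[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1          # average rank of the tie group
            for m in range(i, j + 1):
                ranks[order[m]] = avg
            i = j + 1
        for j in range(k):
            rank_sums[j] += ranks[j]
    return (12.0 / (n * k * (k + 1))) * sum(r * r for r in rank_sums) \
        - 3.0 * n * (k + 1)

# Hypothetical mean errors of three algorithms on five functions;
# the first algorithm is best on every function.
errors = [[1.2, 1.9, 2.5], [0.8, 1.1, 1.4], [3.1, 3.6, 4.0],
          [0.5, 0.9, 1.3], [2.2, 2.8, 3.1]]
print(friedman_statistic(errors))   # 10.0: maximal separation for k=3, n=5
```

The statistic is compared against a chi-square distribution with k − 1 degrees of freedom; the per-algorithm average ranks (rank_sums divided by n) are what the bar charts in Figure 10 display.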

5.2. Post Hoc Analysis

In the post hoc analysis [120], we evaluated the proposed hypothesis between the control method and the rest of the compared methods in Table 8 by employing the Bonferroni and Tukey multiple comparison procedures. In this experiment, the level of significance is α = 0.05, which determines whether a hypothesis is acceptable by comparing the significant difference (p-value) between each pair of algorithms. Since the obtained p-values for all dimensions 30, 50, and 100 are less than α = 0.05, there are significant differences between the performance of the I-MFO and the other compared algorithms.
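The Bonferroni procedure amounts to comparing each raw p-value against α/m, where m is the number of pairwise comparisons against the control method. A minimal Python sketch, using hypothetical p-values purely for illustration:

```python
def bonferroni(p_values, alpha=0.05):
    """Bonferroni-adjusted significance decisions.

    Each raw p-value is compared with alpha / m, where m is the
    number of comparisons; True means the difference between the
    control method and that contender is significant (sketch)."""
    m = len(p_values)
    return [p < alpha / m for p in p_values]

# Hypothetical raw p-values for a control method vs. seven contenders.
raw = [0.0004, 0.0011, 0.0030, 0.0062, 0.0018, 0.0007, 0.0025]
print(bonferroni(raw))   # all True: each p < 0.05 / 7
```

Dividing α by m controls the family-wise error rate, which is why each individual comparison must clear a stricter threshold than 0.05.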

6. Applicability of I-MFO Algorithm to Solve Mechanical Engineering Problems

In this section, three constrained mechanical engineering problems from the latest test-suite CEC 2020 [103] are considered to evaluate the applicability of the I-MFO algorithm to real-world applications. To achieve a fair comparison, the algorithms were run 20 times with population size (N) 20 and maximum iterations (MaxIt) of (D × 10⁴)/N. In this experimental evaluation, the proposed algorithm and the contender algorithms compete to solve three different problems: the gas transmission compressor design problem, the three-bar truss problem, and the tension/compression spring design problem.
  • P1: Gas transmission compressor design problem
Minimizing the objective function over four design variables is the main goal of the gas transmission compressor design problem. This problem is illustrated and formulated in Figure 11 and Equation (10). The performance of the proposed algorithm in solving this problem is evaluated against the contender algorithms, and the obtained results are tabulated in Table 9. As shown in this table, the I-MFO is superior in solving this problem.
Minimize   \( f(\bar{x}) = 8.61 \times 10^{5}\, x_1^{1/2}\, x_2\, x_3^{-2/3}\, x_4^{-1/2} + 3.69 \times 10^{4}\, x_3 + 7.72 \times 10^{8}\, x_1^{-1}\, x_2^{0.219} - 765.43 \times 10^{6}\, x_1^{-1} \)
Subject to   \( x_4\, x_2^{-2} + x_2^{-2} - 1 \le 0 \)
Variable range   \( 20 \le x_1 \le 50, \quad 1 \le x_2 \le 10, \quad 20 \le x_3 \le 45, \quad 0.1 \le x_4 \le 60 \)
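Equation (10) can be evaluated directly. The following Python sketch is our own transcription of the formulation, with an assumed near-optimal test point taken as an illustration; it returns the objective value and the single constraint, which is feasible when g ≤ 0.

```python
def gas_compressor(x):
    """Objective and constraint of the gas transmission compressor
    design problem, Equation (10) (sketch; a penalty-based constraint
    handler would add violations of g to f)."""
    x1, x2, x3, x4 = x
    f = (8.61e5 * x1**0.5 * x2 * x3**(-2.0 / 3.0) * x4**(-0.5)
         + 3.69e4 * x3
         + 7.72e8 / x1 * x2**0.219
         - 765.43e6 / x1)
    g = x4 / x2**2 + x2**(-2) - 1.0   # feasible when g <= 0
    return f, g

# Assumed near-optimal point for illustration only.
f, g = gas_compressor([50.0, 1.178, 24.6, 0.388])
```

An optimizer such as I-MFO would minimize f over the variable ranges in Equation (10) while keeping g non-positive.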
  • P2: Three-bar truss problem
In this problem, three constraints and two variables are used to formulate the objective function, which is the weight of the bar structure. The schematic and formulation of this problem are represented in Figure 12 and Equation (11), respectively. The proposed I-MFO algorithm and the comparative algorithms are compared in solving this problem. The results attained from this experiment are tabulated in Table 10, in which the I-MFO algorithm outperforms the other algorithms in approximating the optimal variable values with minimum weight.
Minimize   \( f(x) = l \times \left( 2\sqrt{2}\, x_1 + x_2 \right) \)
Subject to   \( g_1(x) = \dfrac{\sqrt{2}\, x_1 + x_2}{\sqrt{2}\, x_1^2 + 2 x_1 x_2}\, p - \sigma \le 0 \)
   \( g_2(x) = \dfrac{x_2}{\sqrt{2}\, x_1^2 + 2 x_1 x_2}\, p - \sigma \le 0 \)
   \( g_3(x) = \dfrac{1}{x_1 + \sqrt{2}\, x_2}\, p - \sigma \le 0 \)
where   \( l = 100~\mathrm{cm},\; p = 2~\mathrm{kN/cm^2},\; \sigma = 2~\mathrm{kN/cm^2} \)
Variable range   \( 0 \le x_1 \le 1, \quad 0 \le x_2 \le 1 \)
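Equation (11) translates into a few lines of code. The sketch below is our own transcription of the standard three-bar truss formulation (assumed test point included for illustration); all three stress constraints must satisfy g ≤ 0 for a feasible design.

```python
import math

def three_bar_truss(x, l=100.0, p=2.0, sigma=2.0):
    """Weight objective and the three stress constraints of the
    three-bar truss problem, Equation (11) (sketch; feasible when
    every g <= 0)."""
    x1, x2 = x
    r2 = math.sqrt(2.0)
    f = l * (2.0 * r2 * x1 + x2)          # structural weight
    denom = r2 * x1**2 + 2.0 * x1 * x2
    g1 = (r2 * x1 + x2) / denom * p - sigma
    g2 = x2 / denom * p - sigma
    g3 = 1.0 / (x1 + r2 * x2) * p - sigma
    return f, (g1, g2, g3)

# Assumed near-optimal point, for illustration only.
f, g = three_bar_truss([0.7887, 0.4082])
```

At a good design the first constraint is nearly active (g1 ≈ 0), which is typical for stress-limited truss problems.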
  • P3: Tension/compression spring design problem
In the tension/compression spring design problem, the objective is to minimize the weight of the tension/compression spring by considering three variables and four constraints. As shown in Figure 13, the variables are the wire diameter (d), the number of active coils (N), and the mean coil diameter (D). The problem and its constraints are described in Equation (12), and the results are reported in Table 11.
Minimize   \( f(x) = x_1^2\, x_2\, (x_3 + 2) \)
Subject to   \( g_1(x) = 1 - \dfrac{x_2^3 x_3}{71{,}785\, x_1^4} \le 0 \)
   \( g_2(x) = \dfrac{4 x_2^2 - x_1 x_2}{12{,}566\, (x_2 x_1^3 - x_1^4)} + \dfrac{1}{5108\, x_1^2} - 1 \le 0 \)
   \( g_3(x) = 1 - \dfrac{140.45\, x_1}{x_2^2 x_3} \le 0 \)
   \( g_4(x) = \dfrac{x_1 + x_2}{1.5} - 1 \le 0 \)
Variable range   \( 0.05 \le x_1 \le 2.00, \quad 0.25 \le x_2 \le 1.30, \quad 2.00 \le x_3 \le 15.0 \)
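Equation (12) can likewise be coded directly. The sketch below is our own transcription (assumed near-optimal test point for illustration); it maps x1 to the wire diameter d, x2 to the mean coil diameter D, and x3 to the number of active coils N, the usual convention for this benchmark.

```python
def spring_design(x):
    """Weight objective and the four constraints of the tension/
    compression spring problem, Equation (12) (sketch; feasible
    when every g <= 0)."""
    x1, x2, x3 = x   # wire diameter d, mean coil diameter D, coils N
    f = x1**2 * x2 * (x3 + 2.0)
    g1 = 1.0 - x2**3 * x3 / (71785.0 * x1**4)
    g2 = ((4.0 * x2**2 - x1 * x2) / (12566.0 * (x2 * x1**3 - x1**4))
          + 1.0 / (5108.0 * x1**2) - 1.0)
    g3 = 1.0 - 140.45 * x1 / (x2**2 * x3)
    g4 = (x1 + x2) / 1.5 - 1.0
    return f, (g1, g2, g3, g4)

# Assumed near-optimal point, for illustration only.
f, g = spring_design([0.0517, 0.3567, 11.288])
```

The first constraint (minimum surge-wave deflection) is nearly active at good designs, so small changes in d strongly affect feasibility.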
The results of the mechanical engineering problems tabulated in Table 9, Table 10 and Table 11 demonstrate that the I-MFO is superior to the other algorithms in solving real-world mechanical engineering problems.

7. Conclusions and Future Works

The transverse orientation behavior of moths encountering artificial lights is the main inspiration behind the MFO algorithm for solving optimization problems. However, as with most SI algorithms, the MFO suffers from premature convergence, local optima entrapment, low population diversity, and an imbalance between exploration and exploitation. These drawbacks make the MFO uncompetitive in solving complex and real-world optimization problems. Therefore, an improved version of the MFO named I-MFO is proposed to improve the MFO algorithm by alleviating premature convergence, maintaining population diversity, avoiding local optima entrapment, and striking a balance between exploration and exploitation.
To detect moths trapped in local optima, a memory mechanism is defined for each moth. Then, the adapted wandering around search (AWAS) strategy is introduced to possibly free detected trapped moths from local optima by changing their positions while considering the best flame and a random moth position. Experiments on the CEC 2018 benchmark functions were conducted to evaluate the performance of the I-MFO, where the results reported in Table 2, Table 3 and Table 4 and the OE in Table 5 show I-MFO's superior performance on 92% of the test functions. The multimodal test function results reported in Table 2 are clear evidence that the I-MFO boosts the exploration rate, especially on more complex problems. The hybrid and composition test function results tabulated in Table 3 and Table 4 support the claim that the proposed I-MFO enhances the balance between exploration and exploitation, by which the I-MFO can escape local optima effectively. The convergence curves also show the local optima avoidance ability and the enhanced balance between exploration and exploitation. Moreover, it can be deduced from the population diversity plots that the I-MFO successfully maintains population diversity until a near-optimal solution emerges.
The sensitivity of the AWAS strategy and the NF parameter is evaluated on selected CEC 2018 benchmark functions for different dimensions, where the results reveal that although the I-MFO offers better solutions on most test functions, its other variations can also provide competitive outcomes on some functions and dimensions. The statistical efficiency of the I-MFO is investigated by the Friedman test and post hoc analysis, which revealed that the proposed I-MFO outperforms the other contender algorithms on various test functions. Finally, the outcomes of the mechanical engineering problems from the latest test-suite CEC 2020 demonstrate that the proposed I-MFO is applicable to solving real-world mechanical engineering problems. Although the I-MFO provides competitive results for global optimization and engineering tasks, like most such improvements it consumes more time than the canonical MFO because of the AWAS strategy. Hence, in practice, the I-MFO may not be suitable for solving large-scale real-time problems. For future work, a multi-objective version of I-MFO can be developed for solving continuous multi-objective problems. Moreover, extending the I-MFO to a discrete version for solving discrete optimization tasks such as the community detection problem is a worthwhile direction.

Author Contributions

Conceptualization, M.H.N.-S.; methodology, M.H.N.-S. and H.Z.; software, M.H.N.-S., A.F. and H.Z.; validation, M.H.N.-S. and H.Z.; formal analysis, M.H.N.-S., A.F. and H.Z.; investigation, M.H.N.-S., A.F. and H.Z.; resources, M.H.N.-S., S.M. and L.A.; data curation, M.H.N.-S., A.F. and H.Z.; writing, M.H.N.-S., A.F. and H.Z.; original draft preparation, M.H.N.-S., A.F. and H.Z.; writing—review and editing, M.H.N.-S., A.F., H.Z., S.M. and L.A.; visualization, M.H.N.-S., A.F. and H.Z.; supervision, M.H.N.-S. and S.M.; project administration, M.H.N.-S. and S.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data and code used in the research may be obtained from the corresponding author upon request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Del Ser, J.; Osaba, E.; Molina, D.; Yang, X.; Salcedo-Sanz, S.; Camacho, D.; Das, S.; Suganthan, P.; Coello, C.C.; Herrera, F. Bio-inspired computation: Where we stand and what’s next. Swarm Evol. Comput 2019, 48, 220–250. [Google Scholar] [CrossRef]
  2. Talbi, E.-G. Metaheuristics: From Design to Implementation; John Wiley & Sons: New York, NY, USA, 2009; Volume 74. [Google Scholar]
  3. Kar, A.K. Bio inspired computing—A review of algorithms and scope of applications. Expert Syst. Appl. 2016, 59, 20–32. [Google Scholar] [CrossRef]
  4. Dezfouli, M.B.; Nadimi-Shahraki, M.H.; Zamani, H. A novel tour planning model using big data. In Proceedings of the 2018 International Conference on Artificial Intelligence and Data Processing (IDAP), Malatya, Turkey, 28–30 September 2018; pp. 1–6. [Google Scholar]
  5. Zahrani, H.K.; Nadimi-Shahraki, M.H.; Sayarshad, H.R. An intelligent social-based method for rail-car fleet sizing problem. J. Rail Transp. Plan. Manag. 2021, 17, 100231. [Google Scholar] [CrossRef]
  6. Javadian, N.; Sayarshad, H.R.; Najafi, S. Using simulated annealing for determination of the capacity of yard stations in a railway industry. Appl. Soft Comput. 2011, 11, 1899–1907. [Google Scholar] [CrossRef]
  7. Sayarshad, H.R.; Javadian, N.; Tavakkoli-Moghaddam, R.; Forghani, N. Solving multi-objective optimization formulation for fleet planning in a railway industry. Ann. Oper. Res. 2010, 181, 185–197. [Google Scholar] [CrossRef]
  8. Abdollahzadeh, B.; Gharehchopogh, F.S. A multi-objective optimization algorithm for feature selection problems. Eng. Comput. 2021, 1–19. [Google Scholar] [CrossRef]
  9. Ewees, A.A.; Al-qaness, M.A.A.; Abualigah, L.; Oliva, D.; Algamal, Z.Y.; Anter, A.M.; Ali Ibrahim, R.; Ghoniem, R.M.; Abd Elaziz, M. Boosting Arithmetic Optimization Algorithm with Genetic Algorithm Operators for Feature Selection: Case Study on Cox Proportional Hazards Model. Mathematics 2021, 9, 2321. [Google Scholar] [CrossRef]
  10. Mienye, I.D.; Sun, Y. Improved Heart Disease Prediction Using Particle Swarm Optimization Based Stacked Sparse Autoencoder. Electronics 2021, 10, 2347. [Google Scholar] [CrossRef]
  11. Taghian, S.; Nadimi-Shahraki, M.H.; Zamani, H. Comparative analysis of transfer function-based binary Metaheuristic algorithms for feature selection. In Proceedings of the 2018 International Conference on Artificial Intelligence and Data Processing (IDAP), Malatya, Turkey, 28–30 September 2018; pp. 1–6. [Google Scholar]
  12. Zamani, H.; Nadimi-Shahraki, M.H. Swarm intelligence approach for breast cancer diagnosis. Int. J. Comput. Appl. 2016, 151, 40–44. [Google Scholar] [CrossRef]
  13. Zamani, H.; Nadimi-Shahraki, M.H. Feature selection based on whale optimization algorithm for diseases diagnosis. Int. J. Comput. Sci. Inf. Secur. 2016, 14, 1243. [Google Scholar]
  14. Ibrahim, R.A.; Abualigah, L.; Ewees, A.A.; Al-Qaness, M.A.; Yousri, D.; Alshathri, S.; Abd Elaziz, M. An Electric Fish-Based Arithmetic Optimization Algorithm for Feature Selection. Entropy 2021, 23, 1189. [Google Scholar] [CrossRef] [PubMed]
  15. Wang, L.; Shi, R.; Dong, J. A Hybridization of Dragonfly Algorithm Optimization and Angle Modulation Mechanism for 0-1 Knapsack Problems. Entropy 2021, 23, 598. [Google Scholar] [CrossRef]
  16. Lee, J.; Park, J.; Kim, H.-C.; Kim, D.-W. Competitive Particle Swarm Optimization for Multi-Category Text Feature Selection. Entropy 2019, 21, 602. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  17. Nadimi-Shahraki, M.H.; Moeini, E.; Taghian, S.; Mirjalili, S. DMFO-CD: A Discrete Moth-Flame Optimization Algorithm for Community Detection. Algorithms 2021, 14, 314. [Google Scholar] [CrossRef]
  18. Alsalibi, B.; Abualigah, L.; Khader, A.T. A novel bat algorithm with dynamic membrane structure for optimization problems. Appl. Intell. 2021, 51, 1992–2017. [Google Scholar] [CrossRef]
  19. Asghari, K.; Masdari, M.; Gharehchopogh, F.S.; Saneifard, R. A chaotic and hybrid gray wolf-whale algorithm for solving continuous optimization problems. Prog. Artif. Intell. 2021, 10, 349–374. [Google Scholar] [CrossRef]
  20. Goldanloo, M.J.; Gharehchopogh, F.S. A hybrid OBL-based firefly algorithm with symbiotic organisms search algorithm for solving continuous optimization problems. J. Supercomput. 2021, 1–34. [Google Scholar] [CrossRef]
  21. Zaman, H.R.R.; Gharehchopogh, F.S. An improved particle swarm optimization with backtracking search optimization algorithm for solving continuous optimization problems. Eng. Comput. 2021, 1–35. [Google Scholar] [CrossRef]
  22. Doumari, S.A.; Givi, H.; Dehghani, M.; Montazeri, Z.; Leiva, V.; Guerrero, J.M. A New Two-Stage Algorithm for Solving Optimization Problems. Entropy 2021, 23, 491. [Google Scholar] [CrossRef] [PubMed]
  23. Abd Elaziz, M.; Elsheikh, A.H.; Oliva, D.; Abualigah, L.; Lu, S.; Ewees, A.A. Advanced Metaheuristic Techniques for Mechanical Design Problems. Arch. Comput. Methods Eng. 2021, 1–22. [Google Scholar] [CrossRef]
  24. Akay, B.; Karaboga, D. Artificial bee colony algorithm for large-scale problems and engineering design optimization. J. Intell. Manuf. 2012, 23, 1001–1014. [Google Scholar] [CrossRef]
  25. Aloui, M.; Hamidi, F.; Jerbi, H.; Omri, M.; Popescu, D.; Abbassi, R. A Chaotic Krill Herd Optimization Algorithm for Global Numerical Estimation of the Attraction Domain for Nonlinear Systems. Mathematics 2021, 9, 1743. [Google Scholar] [CrossRef]
  26. Gharehchopogh, F.S.; Farnad, B.; Alizadeh, A. A farmland fertility algorithm for solving constrained engineering problems. Concurr. Comput. Pract. Exp. 2021, 33, e6310. [Google Scholar] [CrossRef]
  27. Ivanov, O.; Neagu, B.-C.; Grigoraș, G.; Scarlatache, F.; Gavrilaș, M. A Metaheuristic Algorithm for Flexible Energy Storage Management in Residential Electricity Distribution Grids. Mathematics 2021, 9, 2375. [Google Scholar] [CrossRef]
  28. Wang, S.; Jia, H.; Abualigah, L.; Liu, Q.; Zheng, R. An Improved Hybrid Aquila Optimizer and Harris Hawks Algorithm for Solving Industrial Engineering Optimization Problems. Processes 2021, 9, 1551. [Google Scholar] [CrossRef]
  29. Ziadeh, A.; Abualigah, L.; Abd Elaziz, M.; Şahin, C.B.; Almazroi, A.A.; Omari, M. Augmented grasshopper optimization algorithm by differential evolution: A power scheduling application in smart homes. Multimed. Tools Appl. 2021, 80, 31569–31597. [Google Scholar] [CrossRef]
  30. Varaee, H.; Ghasemi, M.R. Engineering optimization based on ideal gas molecular movement algorithm. Eng. Comput. 2017, 33, 71–93. [Google Scholar] [CrossRef]
  31. Ghasemi, M.R.; Varaee, H. A fast multi-objective optimization using an efficient ideal gas molecular movement algorithm. Eng. Comput. 2017, 33, 477–496. [Google Scholar] [CrossRef]
  32. Hua, Z.; Xiao, Y.; Cao, J. Misalignment Fault Prediction of Wind Turbines Based on Improved Artificial Fish Swarm Algorithm. Entropy 2021, 23, 692. [Google Scholar] [CrossRef]
  33. Wang, S.; Liu, Q.; Liu, Y.; Jia, H.; Abualigah, L.; Zheng, R.; Wu, D. A Hybrid SSA and SMA with Mutation Opposition-Based Learning for Constrained Engineering Problems. Comput. Intell. Neurosci. 2021, 2021, 6379469. [Google Scholar] [CrossRef]
  34. Selvaraj, S.; Choi, E. Swarm Intelligence Algorithms in Text Document Clustering with Various Benchmarks. Sensors 2021, 21, 3196. [Google Scholar] [CrossRef]
  35. Bacanin, N.; Bezdan, T.; Venkatachalam, K.; Al-Turjman, F. Optimized convolutional neural network by firefly algorithm for magnetic resonance image classification of glioma brain tumor grade. J. Real-Time Image Process. 2021, 18, 1085–1098. [Google Scholar] [CrossRef]
  36. Bacanin, N.; Bezdan, T.; Tuba, E.; Strumberger, I.; Tuba, M. Optimizing convolutional neural network hyperparameters by enhanced swarm intelligence metaheuristics. Algorithms 2020, 13, 67. [Google Scholar] [CrossRef] [Green Version]
  37. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680. [Google Scholar] [CrossRef]
  38. Glover, F.; Laguna, M. Tabu search. In Handbook of Combinatorial Optimization; Springer: Boston, MA, USA, 1998; pp. 2093–2229. [Google Scholar]
  39. Hasançebi, O.; Azad, S.K. Adaptive dimensional search: A new metaheuristic algorithm for discrete truss sizing optimization. Comput. Struct. 2015, 154, 1–16. [Google Scholar] [CrossRef]
  40. Lourenço, H.R.; Martin, O.C.; Stützle, T. Iterated local search: Framework and applications. In Handbook of Metaheuristics; Springer: Cham, Switzerland, 2019; pp. 129–168. [Google Scholar]
  41. Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar] [CrossRef]
  42. Chelouah, R.; Siarry, P. A continuous genetic algorithm designed for the global optimization of multimodal functions. J. Heuristics 2000, 6, 191–213. [Google Scholar] [CrossRef]
  43. Koza, J.R. Genetic programming as a means for programming computers by natural selection. Stat. Comput. 1994, 4, 87–112. [Google Scholar] [CrossRef]
  44. Storn, R.; Price, K. Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  45. Beyer, H.-G.; Schwefel, H.-P. Evolution strategies—A comprehensive introduction. Nat. Comput. 2002, 1, 3–52. [Google Scholar] [CrossRef]
  46. Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. QANA: Quantum-based avian navigation optimizer algorithm. Eng. Appl. Artif. Intell. 2021, 104, 104314. [Google Scholar] [CrossRef]
  47. Mallipeddi, R.; Suganthan, P.N.; Pan, Q.-K.; Tasgetiren, M.F. Differential evolution algorithm with ensemble of parameters and mutation strategies. Appl. Soft Comput. 2011, 11, 1679–1696. [Google Scholar] [CrossRef]
  48. Wu, G.; Mallipeddi, R.; Suganthan, P.N.; Wang, R.; Chen, H. Differential evolution with multi-population based ensemble of mutation strategies. Inf. Sci. 2016, 329, 329–345. [Google Scholar] [CrossRef]
  49. Wu, G.; Shen, X.; Li, H.; Chen, H.; Lin, A.; Suganthan, P.N. Ensemble of differential evolution variants. Inf. Sci. 2018, 423, 172–186. [Google Scholar] [CrossRef]
  50. Nadimi-Shahraki, M.H.; Taghian, S.; Mirjalili, S.; Faris, H. MTDE: An effective multi-trial vector-based differential evolution algorithm and its applications for engineering design problems. Appl. Soft Comput. 2020, 97, 106761. [Google Scholar] [CrossRef]
  51. Erol, O.K.; Eksin, I. A new optimization method: Big bang—Big crunch. Adv. Eng. Softw. 2006, 37, 106–111. [Google Scholar] [CrossRef]
  52. Kaveh, A.; Talatahari, S. A novel heuristic optimization method: Charged system search. Acta Mech. 2010, 213, 267–289. [Google Scholar] [CrossRef]
  53. Kaveh, A.; Khayatazad, M. A new meta-heuristic method: Ray optimization. Comput. Struct. 2012, 112, 283–294. [Google Scholar] [CrossRef]
  54. Zhao, W.; Wang, L.; Zhang, Z. Atom search optimization and its application to solve a hydrogeologic parameter estimation problem. Knowl.-Based Syst. 2019, 163, 283–304. [Google Scholar] [CrossRef]
  55. Abualigah, L.; Diabat, A.; Mirjalili, S.; Abd Elaziz, M.; Gandomi, A.H. The arithmetic optimization algorithm. Comput. Methods Appl. Mech. Eng. 2021, 376, 113609. [Google Scholar] [CrossRef]
  56. Azizi, M. Atomic orbital search: A novel metaheuristic algorithm. Appl. Math. Model. 2021, 93, 657–683. [Google Scholar] [CrossRef]
  57. Zamani, H.; Nadimi-Shahraki, M.H.; Gandomi, A.H. CCSA: Conscious neighborhood-based crow search algorithm for solving global optimization problems. Appl. Soft Comput. 2019, 85, 105583. [Google Scholar] [CrossRef]
  58. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef] [Green Version]
  59. Khishe, M.; Mosavi, M.R. Chimp optimization algorithm. Expert Syst. Appl. 2020, 149, 113338. [Google Scholar] [CrossRef]
  60. Abdollahzadeh, B.; Soleimanian Gharehchopogh, F.; Mirjalili, S. Artificial gorilla troops optimizer: A new nature-inspired metaheuristic algorithm for global optimization problems. Int. J. Intell. Syst. 2021, 36, 5887–5958. [Google Scholar] [CrossRef]
  61. Nadimi-Shahraki, M.H.; Taghian, S.; Mirjalili, S. An improved grey wolf optimizer for solving engineering problems. Expert Syst. Appl. 2021, 166, 113917. [Google Scholar] [CrossRef]
  62. Jia, H.; Sun, K.; Zhang, W.; Leng, X. An enhanced chimp optimization algorithm for continuous optimization domains. Complex Intell. Syst. 2021, 1–18. [Google Scholar] [CrossRef]
  63. Liu, Y.; Sun, J.; Yu, H.; Wang, Y.; Zhou, X. An Improved Grey Wolf Optimizer Based on Differential Evolution and OTSU Algorithm. Appl. Sci. 2020, 10, 6343. [Google Scholar] [CrossRef]
  64. Chen, C.; Wang, X.; Chen, H.; Wu, C.; Mafarja, M.; Turabieh, H. Towards Precision Fertilization: Multi-Strategy Grey Wolf Optimizer Based Model Evaluation and Yield Estimation. Electronics 2021, 10, 2183. [Google Scholar] [CrossRef]
  65. Gandomi, A.H.; Alavi, A.H. Krill herd: A new bio-inspired optimization algorithm. Commun. Nonlinear Sci. Numer. Simul. 2012, 17, 4831–4845. [Google Scholar] [CrossRef]
  66. Kaveh, A.; Farhoudi, N. A new optimization method: Dolphin echolocation. Adv. Eng. Softw. 2013, 59, 53–70. [Google Scholar] [CrossRef]
  67. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  68. Dorigo, M.; Di Caro, G. Ant colony optimization: A new meta-heuristic. In Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406), Washington, DC, USA, 6–9 July 1999; pp. 1470–1477. [Google Scholar]
  69. Mirjalili, S. The ant lion optimizer. Adv. Eng. Softw. 2015, 83, 80–98. [Google Scholar] [CrossRef]
  70. James, J.; Li, V.O. A social spider algorithm for global optimization. Appl. Soft Comput. 2015, 30, 614–627. [Google Scholar]
  71. Askarzadeh, A. A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm. Comput. Struct. 2016, 169, 1–12. [Google Scholar] [CrossRef]
  72. Abdollahzadeh, B.; Gharehchopogh, F.S.; Mirjalili, S. African vultures optimization algorithm: A new nature-inspired metaheuristic algorithm for global optimization problems. Comput. Ind. Eng. 2021, 158, 107408. [Google Scholar] [CrossRef]
  73. Abualigah, L.; Yousri, D.; Abd Elaziz, M.; Ewees, A.A.; Al-qaness, M.A.; Gandomi, A.H. Aquila Optimizer: A novel meta-heuristic optimization Algorithm. Comput. Ind. Eng. 2021, 157, 107250. [Google Scholar] [CrossRef]
  74. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  75. Abd El Aziz, M.; Ewees, A.A.; Hassanien, A.E. Whale optimization algorithm and moth-flame optimization for multilevel thresholding image segmentation. Expert Syst. Appl. 2017, 83, 242–256. [Google Scholar] [CrossRef]
  76. Jia, H.; Ma, J.; Song, W. Multilevel thresholding segmentation for color image using modified moth-flame optimization. IEEE Access 2019, 7, 44097–44134. [Google Scholar] [CrossRef]
  77. Khan, M.A.; Sharif, M.; Akram, T.; Damaševičius, R.; Maskeliūnas, R. Skin Lesion Segmentation and Multiclass Classification Using Deep Learning Features and Improved Moth Flame Optimization. Diagnostics 2021, 11, 811. [Google Scholar] [CrossRef] [PubMed]
  78. Abd Elaziz, M.; Ewees, A.A.; Ibrahim, R.A.; Lu, S. Opposition-based moth-flame optimization improved by differential evolution for feature selection. Math. Comput. Simul. 2020, 168, 48–75. [Google Scholar] [CrossRef]
  79. Gupta, D.; Ahlawat, A.K.; Sharma, A.; Rodrigues, J.J. Feature selection and evaluation for software usability model using modified moth-flame optimization. Computing 2020, 102, 1503–1520. [Google Scholar] [CrossRef]
  80. Tumar, I.; Hassouneh, Y.; Turabieh, H.; Thaher, T. Enhanced binary moth flame optimization as a feature selection algorithm to predict software fault prediction. IEEE Access 2020, 8, 8041–8055. [Google Scholar] [CrossRef]
  81. Abu Khurmaa, R.; Aljarah, I.; Sharieh, A. An intelligent feature selection approach based on moth flame optimization for medical diagnosis. Neural Comput. Appl. 2021, 33, 7165–7204. [Google Scholar] [CrossRef]
  82. Nadimi-Shahraki, M.H.; Banaie-Dezfouli, M.; Zamani, H.; Taghian, S.; Mirjalili, S. B-MFO: A Binary Moth-Flame Optimization for Feature Selection from Medical Datasets. Computers 2021, 10, 136. [Google Scholar] [CrossRef]
  83. Sarma, A.; Bhutani, A.; Goel, L. Hybridization of moth flame optimization and gravitational search algorithm and its application to detection of food quality. In Proceedings of the 2017 Intelligent Systems Conference (IntelliSys), London, UK, 7–8 September 2017; pp. 52–60. [Google Scholar]
  84. Hassanien, A.E.; Gaber, T.; Mokhtar, U.; Hefny, H. An improved moth flame optimization algorithm based on rough sets for tomato diseases detection. Comput. Electron. Agric. 2017, 136, 86–96. [Google Scholar] [CrossRef]
  85. Lei, X.; Fang, M.; Fujita, H. Moth–flame optimization-based algorithm with synthetic dynamic PPI networks for discovering protein complexes. Knowl.-Based Syst. 2019, 172, 76–85. [Google Scholar] [CrossRef]
  86. Li, C.; Li, S.; Liu, Y. A least squares support vector machine model optimized by moth-flame optimization algorithm for annual power load forecasting. Appl. Intell. 2016, 45, 1166–1178. [Google Scholar] [CrossRef]
  87. Mei, R.N.S.; Sulaiman, M.H.; Mustaffa, Z.; Daniyal, H. Optimal reactive power dispatch solution by loss minimization using moth-flame optimization technique. Appl. Soft Comput. 2017, 59, 210–222. [Google Scholar]
  88. Allam, D.; Yousri, D.; Eteiba, M. Parameters extraction of the three diode model for the multi-crystalline solar cell/module using Moth-Flame Optimization Algorithm. Energy Convers. Manag. 2016, 123, 535–548. [Google Scholar] [CrossRef]
  89. Ebrahim, M.A.; Becherif, M.; Abdelaziz, A.Y. Dynamic performance enhancement for wind energy conversion system using Moth-Flame Optimization based blade pitch controller. Sustain. Energy Technol. Assess. 2018, 27, 206–212. [Google Scholar] [CrossRef]
  90. Raju, K.; Madurai Elavarasan, R.; Mihet-Popa, L. An assessment of onshore and offshore wind energy potential in India using moth flame optimization. Energies 2020, 13, 3063. [Google Scholar] [CrossRef]
  91. Rezk, H.; Ali, Z.M.; Abdalla, O.; Younis, O.; Gomaa, M.R.; Hashim, M. Hybrid moth-flame optimization algorithm and incremental conductance for tracking maximum power of solar PV/thermoelectric system under different conditions. Mathematics 2019, 7, 875. [Google Scholar] [CrossRef] [Green Version]
  92. Li, Z.; Zhou, Y.; Zhang, S.; Song, J. Lévy-flight moth-flame algorithm for function optimization and engineering design problems. Math. Probl. Eng. 2016, 2016, 1423930. [Google Scholar] [CrossRef] [Green Version]
  93. Savsani, V.; Tawhid, M.A. Non-dominated sorting moth flame optimization (NS-MFO) for multi-objective problems. Eng. Appl. Artif. Intell. 2017, 63, 20–32. [Google Scholar] [CrossRef]
  94. Xu, L.; Li, Y.; Li, K.; Beng, G.H.; Jiang, Z.; Wang, C.; Liu, N. Enhanced moth-flame optimization based on cultural learning and Gaussian mutation. J. Bionic Eng. 2018, 15, 751–763. [Google Scholar] [CrossRef]
  95. Khalilpourazari, S.; Khalilpourazary, S. An efficient hybrid algorithm based on Water Cycle and Moth-Flame Optimization algorithms for solving numerical and constrained engineering optimization problems. Soft Comput. 2019, 23, 1699–1722. [Google Scholar] [CrossRef]
  96. Chen, C.; Wang, X.; Yu, H.; Wang, M.; Chen, H. Dealing with multi-modality using synthesis of Moth-flame optimizer with sine cosine mechanisms. Math. Comput. Simul. 2021, 188, 291–318. [Google Scholar] [CrossRef]
  97. Kaur, K.; Singh, U.; Salgotra, R. An enhanced moth flame optimization. Neural Comput. Appl. 2020, 32, 2315–2349. [Google Scholar] [CrossRef]
  98. Pelusi, D.; Mascella, R.; Tallini, L.; Nayak, J.; Naik, B.; Deng, Y. An Improved Moth-Flame Optimization algorithm with hybrid search phase. Knowl.-Based Syst. 2020, 191, 105277. [Google Scholar] [CrossRef]
  99. Hongwei, L.; Jianyong, L.; Liang, C.; Jingbo, B.; Yangyang, S.; Kai, L. Chaos-enhanced moth-flame optimization algorithm for global optimization. J. Syst. Eng. Electron. 2019, 30, 1144–1159. [Google Scholar]
  100. Xu, Y.; Chen, H.; Luo, J.; Zhang, Q.; Jiao, S.; Zhang, X. Enhanced Moth-flame optimizer with mutation strategy for global optimization. Inf. Sci. 2019, 492, 181–203. [Google Scholar] [CrossRef]
  101. Li, Y.; Zhu, X.; Liu, J. An improved moth-flame optimization algorithm for engineering problems. Symmetry 2020, 12, 1234. [Google Scholar] [CrossRef]
  102. Awad, N.; Ali, M.; Liang, J.; Qu, B.; Suganthan, P. Problem definitions and evaluation criteria for the cec 2017 special sessionand competition on single objective bound constrained real-parameter numerical optimization. In Technical Report; Nanyang Technological University: Singapore, 2016. [Google Scholar]
  103. Kumar, A.; Wu, G.; Ali, M.Z.; Mallipeddi, R.; Suganthan, P.N.; Das, S. A test-suite of non-convex constrained optimization problems from the real-world and some baseline results. Swarm Evol. Comput. 2020, 56, 100693. [Google Scholar] [CrossRef]
  104. Hussien, A.G.; Amin, M.; Abd El Aziz, M. A comprehensive review of moth-flame optimisation: Variants, hybrids, and applications. J. Exp. Theor. Artif. Intell. 2020, 32, 705–725. [Google Scholar] [CrossRef]
  105. Mehne, S.H.H.; Mirjalili, S. Moth-flame optimization algorithm: Theory, literature review, and application in optimal nonlinear feedback control design. In Nature-Inspired Optimizers; Springer: Cham, Switzerland, 2020; pp. 143–166. [Google Scholar]
  106. Shehab, M.; Abualigah, L.; Al Hamad, H.; Alabool, H.; Alshinwan, M.; Khasawneh, A.M. Moth–flame optimization algorithm: Variants and applications. Neural Comput. Appl. 2020, 32, 9859–9884. [Google Scholar] [CrossRef]
  107. Apinantanakon, W.; Sunat, K. Omfo: A new opposition-based moth-flame optimization algorithm for solving unconstrained optimization problems. In Proceedings of the International Conference on Computing and Information Technology, Singapore, 27–29 December 2017; pp. 22–31. [Google Scholar]
  108. Sapre, S.; Mini, S. Opposition-based moth flame optimization with Cauchy mutation and evolutionary boundary constraint handling for global optimization. Soft Comput. 2019, 23, 6023–6041. [Google Scholar] [CrossRef]
  109. Yu, C.; Heidari, A.A.; Chen, H. A quantum-behaved simulated annealing algorithm-based moth-flame optimization method. Appl. Math. Model. 2020, 87, 1–19. [Google Scholar] [CrossRef]
  110. Bhesdadiya, R.; Trivedi, I.N.; Jangir, P.; Kumar, A.; Jangir, N.; Totlani, R. A novel hybrid approach particle swarm optimizer with moth-flame optimizer algorithm. In Advances in Computer and Computational Sciences; Springer: Singapore, 2017; pp. 569–577. [Google Scholar]
  111. Sayed, G.I.; Hassanien, A.E. A hybrid SA-MFO algorithm for function optimization and engineering design problems. Complex Intell. Syst. 2018, 4, 195–212. [Google Scholar] [CrossRef] [Green Version]
  112. Singh, R.K.; Gangwar, S.; Singh, D.; Pathak, V.K. A novel hybridization of artificial neural network and moth-flame optimization (ANN–MFO) for multi-objective optimization in magnetic abrasive finishing of aluminium 6060. J. Braz. Soc. Mech. Sci. Eng. 2019, 41, 270. [Google Scholar] [CrossRef]
  113. Dang, M.P.; Le, H.G.; Chau, N.L.; Dao, T.-P. Optimization for a flexure hinge using an effective hybrid approach of fuzzy logic and moth-flame optimization algorithm. Math. Probl. Eng. 2021, 2021, 6622655. [Google Scholar] [CrossRef]
  114. Mittal, T. A hybrid moth flame optimization and variable neighbourhood search technique for optimal design of IIR filters. Neural Comput. Appl. 2021, 1–16. [Google Scholar] [CrossRef]
  115. Abd Elaziz, M.; Yousri, D.; Mirjalili, S. A hybrid Harris hawks-moth-flame optimization algorithm including fractional-order chaos maps and evolutionary population dynamics. Adv. Eng. Softw. 2021, 154, 102973. [Google Scholar] [CrossRef]
  116. Ahmed, O.H.; Lu, J.; Xu, Q.; Ahmed, A.M.; Rahmani, A.M.; Hosseinzadeh, M. Using differential evolution and Moth–Flame optimization for scientific workflow scheduling in fog computing. Appl. Soft Comput. 2021, 112, 107744. [Google Scholar] [CrossRef]
  117. Li, Z.; Zeng, J.; Chen, Y.; Ma, G.; Liu, G. Death mechanism-based moth–flame optimization with improved flame generation mechanism for global optimization tasks. Expert Syst. Appl. 2021, 183, 115436. [Google Scholar] [CrossRef]
  118. Blackiston, D.J.; Silva Casey, E.; Weiss, M.R. Retention of memory through metamorphosis: Can a moth remember what it learned as a caterpillar? PLoS ONE 2008, 3, e1736. [Google Scholar] [CrossRef] [Green Version]
  119. Morrison, R.W. Designing Evolutionary Algorithms for Dynamic Environments; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2004. [Google Scholar]
  120. Derrac, J.; García, S.; Molina, D.; Herrera, F. A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evol. Comput. 2011, 1, 3–18. [Google Scholar] [CrossRef]
  121. Friedman, M. A comparison of alternative tests of significance for the problem of m rankings. Ann. Math. Stat. 1940, 11, 86–92. [Google Scholar] [CrossRef]
Figure 1. The classification of nature-inspired algorithms and MFO variants.
Figure 2. The flowchart of improved moth-flame optimization (I-MFO) algorithm.
Figure 3. The comparison of fitness distribution on unimodal and multimodal functions.
Figure 4. The comparison of fitness distribution on hybrid and composition functions.
Figure 5. The convergence curves of algorithms in unimodal and multimodal test functions.
Figure 6. The convergence curves of algorithms in hybrid and composition test functions.
Figure 7. The population diversity of algorithms in different test functions.
Figure 8. The sensitivity analysis on the NF parameter.
Figure 9. Impact analysis of applying AWAS strategy on different percentages of trapped moths.
Figure 10. The radar graphs and bar charts of algorithms in different dimensions.
Figure 11. Gas transmission compressor design problem.
Figure 12. Three-bar truss problem.
Figure 13. Tension/compression spring design problem.
Table 1. Parameter settings of the I-MFO and other contender algorithms.
Algorithms | Parameter Settings
SA | T0 = 10.
CGA | PMut = 0.9, PXcross = 0.5.
GWO | The parameter a is linearly decreased from 2 to 0.
MFO | b = 1, a is decreased linearly from −1 to −2.
WOA | The variable α decreases linearly from 2 to 0, b = 1.
LMFO | β = 1.5, µ and v are normal distributions, Γ is the gamma function.
WCMFO | The number of rivers and sea = 4.
ChOA | f decreases linearly from 2 to 0.
AOA | µ = 0.5, α = 5.
SMFO | r4 = random number in the interval (0, 1).
I-MFO | δ1 = 2.02, δ2 = 1.08, NF = random number between 1 and D.
Table 2. Comparison of optimization results obtained from unimodal and multimodal test functions.
F | D | Metrics | SA (1983) | CGA (2000) | GWO (2014) | MFO (2015) | WOA (2016) | LMFO (2016) | WCMFO (2019) | ChOA (2020) | AOA (2021) | SMFO (2021) | I-MFO
F1 | 30 | Avg | 2.251 × 10^10 | 3.794 × 10^10 | 8.223 × 10^8 | 6.952 × 10^9 | 1.906 × 10^6 | 2.402 × 10^7 | 1.328 × 10^4 | 2.238 × 10^10 | 3.943 × 10^10 | 3.091 × 10^10 | 5.859 × 10^3
F1 | 30 | Min | 1.897 × 10^10 | 2.437 × 10^10 | 4.404 × 10^7 | 1.027 × 10^9 | 5.654 × 10^5 | 1.731 × 10^7 | 1.214 × 10^2 | 1.123 × 10^10 | 2.791 × 10^10 | 2.010 × 10^10 | 1.488 × 10^2
F1 | 50 | Avg | 7.170 × 10^10 | 1.126 × 10^11 | 4.522 × 10^9 | 3.099 × 10^10 | 7.172 × 10^6 | 1.091 × 10^8 | 2.826 × 10^4 | 4.407 × 10^10 | 9.968 × 10^10 | 6.933 × 10^10 | 1.430 × 10^4
F1 | 50 | Min | 6.279 × 10^10 | 9.612 × 10^10 | 1.231 × 10^9 | 7.095 × 10^9 | 1.980 × 10^6 | 7.284 × 10^7 | 6.883 × 10^2 | 3.201 × 10^10 | 8.424 × 10^10 | 4.945 × 10^10 | 1.948 × 10^2
F1 | 100 | Avg | 2.830 × 10^11 | 3.547 × 10^11 | 3.207 × 10^10 | 1.173 × 10^11 | 3.677 × 10^7 | 7.525 × 10^8 | 2.017 × 10^5 | 1.457 × 10^11 | 2.617 × 10^11 | 1.908 × 10^11 | 2.881 × 10^4
F1 | 100 | Min | 2.638 × 10^11 | 3.147 × 10^11 | 1.634 × 10^10 | 6.748 × 10^10 | 1.409 × 10^7 | 6.332 × 10^8 | 1.093 × 10^4 | 1.282 × 10^11 | 2.343 × 10^11 | 1.527 × 10^11 | 1.033 × 10^2
F3 | 30 | Avg | 1.439 × 10^5 | 1.095 × 10^5 | 2.993 × 10^4 | 1.009 × 10^5 | 1.715 × 10^5 | 2.786 × 10^3 | 1.887 × 10^3 | 5.221 × 10^4 | 7.488 × 10^4 | 8.189 × 10^4 | 6.106 × 10^2
F3 | 30 | Min | 1.029 × 10^5 | 9.312 × 10^4 | 1.576 × 10^4 | 1.920 × 10^3 | 8.481 × 10^4 | 1.424 × 10^3 | 3.092 × 10^2 | 3.954 × 10^4 | 5.445 × 10^4 | 7.186 × 10^4 | 3.388 × 10^2
F3 | 50 | Avg | 2.994 × 10^5 | 2.241 × 10^5 | 7.147 × 10^4 | 1.650 × 10^5 | 6.180 × 10^4 | 3.151 × 10^4 | 1.150 × 10^4 | 1.306 × 10^5 | 1.648 × 10^5 | 1.775 × 10^5 | 1.460 × 10^4
F3 | 50 | Min | 2.647 × 10^5 | 1.648 × 10^5 | 3.628 × 10^4 | 1.176 × 10^4 | 3.098 × 10^4 | 2.291 × 10^4 | 7.428 × 10^2 | 1.006 × 10^5 | 1.249 × 10^5 | 1.273 × 10^5 | 7.827 × 10^3
F3 | 100 | Avg | 8.110 × 10^5 | 5.742 × 10^5 | 2.023 × 10^5 | 4.556 × 10^5 | 5.928 × 10^5 | 2.495 × 10^5 | 7.361 × 10^4 | 3.071 × 10^5 | 3.330 × 10^5 | 3.366 × 10^5 | 7.625 × 10^4
F3 | 100 | Min | 7.298 × 10^5 | 4.725 × 10^5 | 1.595 × 10^5 | 1.191 × 10^5 | 3.355 × 10^5 | 1.819 × 10^5 | 3.430 × 10^4 | 2.701 × 10^5 | 3.027 × 10^5 | 3.182 × 10^5 | 6.767 × 10^4
F4 | 30 | Avg | 1.587 × 10^3 | 7.327 × 10^3 | 5.441 × 10^2 | 9.082 × 10^2 | 5.476 × 10^2 | 4.928 × 10^2 | 4.886 × 10^2 | 2.971 × 10^3 | 8.808 × 10^3 | 5.977 × 10^3 | 4.868 × 10^2
F4 | 30 | Min | 1.354 × 10^3 | 6.137 × 10^3 | 4.963 × 10^2 | 5.424 × 10^2 | 4.995 × 10^2 | 4.755 × 10^2 | 4.239 × 10^2 | 1.134 × 10^3 | 3.824 × 10^3 | 3.030 × 10^3 | 4.704 × 10^2
F4 | 50 | Avg | 7.832 × 10^3 | 2.608 × 10^4 | 8.767 × 10^2 | 4.097 × 10^3 | 6.676 × 10^2 | 5.907 × 10^2 | 5.493 × 10^2 | 9.176 × 10^3 | 2.582 × 10^4 | 1.879 × 10^4 | 5.550 × 10^2
F4 | 50 | Min | 6.386 × 10^3 | 1.598 × 10^4 | 6.745 × 10^2 | 1.216 × 10^3 | 5.138 × 10^2 | 5.084 × 10^2 | 4.849 × 10^2 | 5.017 × 10^3 | 1.686 × 10^4 | 1.005 × 10^4 | 4.210 × 10^2
F4 | 100 | Avg | 4.778 × 10^4 | 1.033 × 10^5 | 2.812 × 10^3 | 2.348 × 10^4 | 9.992 × 10^2 | 7.215 × 10^2 | 6.423 × 10^2 | 2.760 × 10^4 | 7.703 × 10^4 | 5.544 × 10^4 | 6.318 × 10^2
F4 | 100 | Min | 3.845 × 10^4 | 8.305 × 10^4 | 1.870 × 10^3 | 6.742 × 10^3 | 8.615 × 10^2 | 6.726 × 10^2 | 5.980 × 10^2 | 2.116 × 10^4 | 6.019 × 10^4 | 3.485 × 10^4 | 5.772 × 10^2
F5 | 30 | Avg | 8.120 × 10^2 | 8.777 × 10^2 | 5.855 × 10^2 | 6.894 × 10^2 | 8.044 × 10^2 | 6.278 × 10^2 | 6.744 × 10^2 | 7.877 × 10^2 | 7.905 × 10^2 | 8.721 × 10^2 | 5.499 × 10^2
F5 | 30 | Min | 7.865 × 10^2 | 8.259 × 10^2 | 5.508 × 10^2 | 6.280 × 10^2 | 7.242 × 10^2 | 5.707 × 10^2 | 6.104 × 10^2 | 7.471 × 10^2 | 7.217 × 10^2 | 8.041 × 10^2 | 5.308 × 10^2
F5 | 50 | Avg | 1.167 × 10^3 | 1.273 × 10^3 | 6.892 × 10^2 | 8.934 × 10^2 | 9.209 × 10^2 | 8.152 × 10^2 | 8.940 × 10^2 | 1.037 × 10^3 | 1.078 × 10^3 | 1.118 × 10^3 | 6.348 × 10^2
F5 | 50 | Min | 1.142 × 10^3 | 1.184 × 10^3 | 6.379 × 10^2 | 7.731 × 10^2 | 8.081 × 10^2 | 7.287 × 10^2 | 7.743 × 10^2 | 9.853 × 10^2 | 9.951 × 10^2 | 1.053 × 10^3 | 5.836 × 10^2
F5 | 100 | Avg | 2.180 × 10^3 | 2.374 × 10^3 | 1.058 × 10^3 | 1.666 × 10^3 | 1.413 × 10^3 | 1.456 × 10^3 | 1.726 × 10^3 | 1.789 × 10^3 | 1.955 × 10^3 | 1.985 × 10^3 | 8.613 × 10^2
F5 | 100 | Min | 2.098 × 10^3 | 2.255 × 10^3 | 9.864 × 10^2 | 1.455 × 10^3 | 1.329 × 10^3 | 1.226 × 10^3 | 1.328 × 10^3 | 1.724 × 10^3 | 1.842 × 10^3 | 1.875 × 10^3 | 7.885 × 10^2
F6 | 30 | Avg | 6.582 × 10^2 | 6.734 × 10^2 | 6.043 × 10^2 | 6.267 × 10^2 | 6.671 × 10^2 | 6.038 × 10^2 | 6.225 × 10^2 | 6.604 × 10^2 | 6.655 × 10^2 | 6.830 × 10^2 | 6.000 × 10^2
F6 | 30 | Min | 6.468 × 10^2 | 6.612 × 10^2 | 6.011 × 10^2 | 6.144 × 10^2 | 6.410 × 10^2 | 6.017 × 10^2 | 6.095 × 10^2 | 6.386 × 10^2 | 6.476 × 10^2 | 6.615 × 10^2 | 6.000 × 10^2
F6 | 50 | Avg | 6.835 × 10^2 | 6.936 × 10^2 | 6.105 × 10^2 | 6.437 × 10^2 | 6.760 × 10^2 | 6.094 × 10^2 | 6.400 × 10^2 | 6.720 × 10^2 | 6.837 × 10^2 | 6.890 × 10^2 | 6.000 × 10^2
F6 | 50 | Min | 6.691 × 10^2 | 6.784 × 10^2 | 6.052 × 10^2 | 6.270 × 10^2 | 6.638 × 10^2 | 6.034 × 10^2 | 6.165 × 10^2 | 6.608 × 10^2 | 6.747 × 10^2 | 6.780 × 10^2 | 6.000 × 10^2
F6 | 100 | Avg | 7.210 × 10^2 | 7.176 × 10^2 | 6.275 × 10^2 | 6.648 × 10^2 | 6.768 × 10^2 | 6.398 × 10^2 | 6.664 × 10^2 | 6.860 × 10^2 | 7.028 × 10^2 | 7.030 × 10^2 | 6.000 × 10^2
F6 | 100 | Min | 7.164 × 10^2 | 7.143 × 10^2 | 6.229 × 10^2 | 6.466 × 10^2 | 6.676 × 10^2 | 6.222 × 10^2 | 6.526 × 10^2 | 6.761 × 10^2 | 6.970 × 10^2 | 6.865 × 10^2 | 6.000 × 10^2
F7 | 30 | Avg | 1.728 × 10^3 | 1.965 × 10^3 | 8.418 × 10^2 | 1.011 × 10^3 | 1.238 × 10^3 | 8.735 × 10^2 | 8.985 × 10^2 | 1.190 × 10^3 | 1.295 × 10^3 | 1.349 × 10^3 | 7.964 × 10^2
F7 | 30 | Min | 1.593 × 10^3 | 1.798 × 10^3 | 7.801 × 10^2 | 8.671 × 10^2 | 1.089 × 10^3 | 8.438 × 10^2 | 8.402 × 10^2 | 1.063 × 10^3 | 1.154 × 10^3 | 1.175 × 10^3 | 7.595 × 10^2
F7 | 50 | Avg | 3.572 × 10^3 | 3.656 × 10^3 | 1.015 × 10^3 | 1.701 × 10^3 | 1.684 × 10^3 | 1.092 × 10^3 | 1.141 × 10^3 | 1.663 × 10^3 | 1.860 × 10^3 | 1.919 × 10^3 | 9.208 × 10^2
F7 | 50 | Min | 3.305 × 10^3 | 3.300 × 10^3 | 9.654 × 10^2 | 1.113 × 10^3 | 1.500 × 10^3 | 1.065 × 10^3 | 1.020 × 10^3 | 1.464 × 10^3 | 1.744 × 10^3 | 1.769 × 10^3 | 8.168 × 10^2
F7 | 100 | Avg | 1.014 × 10^4 | 9.052 × 10^3 | 1.710 × 10^3 | 4.169 × 10^3 | 3.250 × 10^3 | 1.736 × 10^3 | 1.988 × 10^3 | 3.320 × 10^3 | 3.712 × 10^3 | 3.891 × 10^3 | 1.356 × 10^3
F7 | 100 | Min | 9.035 × 10^3 | 8.338 × 10^3 | 1.542 × 10^3 | 2.576 × 10^3 | 2.814 × 10^3 | 1.649 × 10^3 | 1.531 × 10^3 | 3.127 × 10^3 | 3.579 × 10^3 | 3.536 × 10^3 | 1.140 × 10^3
F8 | 30 | Avg | 1.116 × 10^3 | 1.166 × 10^3 | 8.713 × 10^2 | 9.790 × 10^2 | 1.000 × 10^3 | 9.379 × 10^2 | 9.841 × 10^2 | 1.032 × 10^3 | 1.041 × 10^3 | 1.096 × 10^3 | 8.574 × 10^2
F8 | 30 | Min | 1.074 × 10^3 | 1.144 × 10^3 | 8.435 × 10^2 | 8.938 × 10^2 | 9.488 × 10^2 | 8.797 × 10^2 | 9.344 × 10^2 | 9.726 × 10^2 | 1.002 × 10^3 | 1.058 × 10^3 | 8.418 × 10^2
F8 | 50 | Avg | 1.467 × 10^3 | 1.573 × 10^3 | 9.792 × 10^2 | 1.229 × 10^3 | 1.249 × 10^3 | 1.119 × 10^3 | 1.213 × 10^3 | 1.305 × 10^3 | 1.426 × 10^3 | 1.404 × 10^3 | 9.201 × 10^2
F8 | 50 | Min | 1.427 × 10^3 | 1.519 × 10^3 | 9.384 × 10^2 | 1.118 × 10^3 | 1.132 × 10^3 | 1.062 × 10^3 | 1.087 × 10^3 | 1.251 × 10^3 | 1.339 × 10^3 | 1.320 × 10^3 | 8.796 × 10^2
F8 | 100 | Avg | 2.499 × 10^3 | 2.751 × 10^3 | 1.397 × 10^3 | 1.968 × 10^3 | 1.897 × 10^3 | 1.740 × 10^3 | 2.026 × 10^3 | 2.157 × 10^3 | 2.404 × 10^3 | 2.422 × 10^3 | 1.160 × 10^3
F8 | 100 | Min | 2.448 × 10^3 | 2.620 × 10^3 | 1.225 × 10^3 | 1.717 × 10^3 | 1.716 × 10^3 | 1.531 × 10^3 | 1.756 × 10^3 | 2.052 × 10^3 | 2.248 × 10^3 | 2.276 × 10^3 | 1.087 × 10^3
F9 | 30 | Avg | 1.079 × 10^4 | 1.239 × 10^4 | 1.384 × 10^3 | 6.278 × 10^3 | 7.233 × 10^3 | 1.015 × 10^3 | 8.747 × 10^3 | 6.612 × 10^3 | 5.570 × 10^3 | 9.591 × 10^3 | 9.882 × 10^2
F9 | 30 | Min | 9.017 × 10^3 | 8.805 × 10^3 | 1.025 × 10^3 | 4.471 × 10^3 | 4.425 × 10^3 | 9.056 × 10^2 | 5.118 × 10^3 | 4.627 × 10^3 | 4.101 × 10^3 | 7.754 × 10^3 | 9.065 × 10^2
F9 | 50 | Avg | 3.302 × 10^4 | 4.335 × 10^4 | 4.571 × 10^3 | 1.644 × 10^4 | 1.783 × 10^4 | 1.306 × 10^3 | 2.195 × 10^4 | 2.591 × 10^4 | 2.277 × 10^4 | 3.066 × 10^4 | 1.305 × 10^3
F9 | 50 | Min | 2.622 × 10^4 | 3.603 × 10^4 | 2.135 × 10^3 | 8.748 × 10^3 | 1.187 × 10^4 | 9.481 × 10^2 | 1.190 × 10^4 | 1.969 × 10^4 | 1.804 × 10^4 | 1.925 × 10^4 | 9.853 × 10^2
F9 | 100 | Avg | 1.197 × 10^5 | 1.243 × 10^5 | 2.638 × 10^4 | 4.507 × 10^4 | 3.820 × 10^4 | 1.421 × 10^4 | 5.208 × 10^4 | 6.813 × 10^4 | 5.374 × 10^4 | 6.959 × 10^4 | 7.081 × 10^3
F9 | 100 | Min | 1.034 × 10^5 | 1.079 × 10^5 | 1.102 × 10^4 | 3.679 × 10^4 | 2.557 × 10^4 | 3.037 × 10^3 | 3.986 × 10^4 | 5.806 × 10^4 | 4.673 × 10^4 | 5.852 × 10^4 | 4.038 × 10^3
F10 | 30 | Avg | 7.764 × 10^3 | 8.149 × 10^3 | 3.909 × 10^3 | 5.130 × 10^3 | 6.156 × 10^3 | 4.422 × 10^3 | 4.808 × 10^3 | 8.037 × 10^3 | 6.487 × 10^3 | 8.363 × 10^3 | 2.745 × 10^3
F10 | 30 | Min | 6.721 × 10^3 | 7.474 × 10^3 | 2.718 × 10^3 | 3.575 × 10^3 | 4.506 × 10^3 | 3.149 × 10^3 | 3.332 × 10^3 | 7.199 × 10^3 | 5.410 × 10^3 | 7.473 × 10^3 | 1.941 × 10^3
F10 | 50 | Avg | 1.426 × 10^4 | 1.446 × 10^4 | 6.428 × 10^3 | 8.566 × 10^3 | 9.478 × 10^3 | 7.081 × 10^3 | 7.956 × 10^3 | 1.419 × 10^4 | 1.223 × 10^4 | 1.368 × 10^4 | 4.799 × 10^3
F10 | 50 | Min | 1.337 × 10^4 | 1.342 × 10^4 | 4.582 × 10^3 | 6.288 × 10^3 | 6.969 × 10^3 | 6.045 × 10^3 | 6.204 × 10^3 | 1.301 × 10^4 | 1.058 × 10^4 | 1.234 × 10^4 | 3.734 × 10^3
F10 | 100 | Avg | 3.136 × 10^4 | 3.121 × 10^4 | 1.497 × 10^4 | 1.728 × 10^4 | 2.012 × 10^4 | 1.758 × 10^4 | 1.618 × 10^4 | 3.141 × 10^4 | 2.783 × 10^4 | 3.071 × 10^4 | 1.168 × 10^4
F10 | 100 | Min | 3.040 × 10^4 | 2.990 × 10^4 | 1.141 × 10^4 | 1.417 × 10^4 | 1.687 × 10^4 | 1.631 × 10^4 | 1.147 × 10^4 | 3.035 × 10^4 | 2.582 × 10^4 | 2.843 × 10^4 | 9.083 × 10^3
Ranking | 30 | W/T/L | 0/0/9 | 0/0/9 | 0/0/9 | 0/0/9 | 0/0/9 | 0/0/9 | 0/0/9 | 0/0/9 | 0/0/9 | 0/0/9 | 9/0/0
Ranking | 50 | W/T/L | 0/0/9 | 0/0/9 | 0/0/9 | 0/0/9 | 0/0/9 | 0/0/9 | 2/0/7 | 0/0/9 | 0/0/9 | 0/0/9 | 7/0/2
Ranking | 100 | W/T/L | 0/0/9 | 0/0/9 | 0/0/9 | 0/0/9 | 0/0/9 | 0/0/9 | 1/0/8 | 0/0/9 | 0/0/9 | 0/0/9 | 8/0/1
Table 3. Comparison of optimization results obtained from hybrid test functions.
F | D | Metrics | SA (1983) | CGA (2000) | GWO (2014) | MFO (2015) | WOA (2016) | LMFO (2016) | WCMFO (2019) | ChOA (2020) | AOA (2021) | SMFO (2021) | I-MFO
F11 | 30 | Avg | 3.124 × 10^3 | 5.262 × 10^3 | 1.406 × 10^3 | 3.749 × 10^3 | 1.462 × 10^3 | 1.292 × 10^3 | 1.336 × 10^3 | 3.361 × 10^3 | 3.325 × 10^3 | 5.265 × 10^3 | 1.188 × 10^3
F11 | 30 | Min | 2.512 × 10^3 | 3.284 × 10^3 | 1.271 × 10^3 | 1.363 × 10^3 | 1.282 × 10^3 | 1.177 × 10^3 | 1.254 × 10^3 | 1.731 × 10^3 | 1.739 × 10^3 | 2.547 × 10^3 | 1.119 × 10^3
F11 | 50 | Avg | 1.128 × 10^4 | 1.978 × 10^4 | 3.078 × 10^3 | 7.297 × 10^3 | 1.591 × 10^3 | 1.532 × 10^3 | 1.491 × 10^3 | 8.609 × 10^3 | 1.605 × 10^4 | 1.426 × 10^4 | 1.326 × 10^3
F11 | 50 | Min | 9.143 × 10^3 | 1.250 × 10^4 | 1.480 × 10^3 | 1.574 × 10^3 | 1.421 × 10^3 | 1.380 × 10^3 | 1.344 × 10^3 | 6.220 × 10^3 | 9.287 × 10^3 | 8.985 × 10^3 | 1.212 × 10^3
F11 | 100 | Avg | 1.883 × 10^5 | 2.174 × 10^5 | 3.531 × 10^4 | 1.257 × 10^5 | 7.762 × 10^3 | 3.319 × 10^3 | 2.191 × 10^3 | 7.211 × 10^4 | 1.625 × 10^5 | 2.086 × 10^5 | 1.776 × 10^3
F11 | 100 | Min | 1.465 × 10^5 | 1.629 × 10^5 | 1.647 × 10^4 | 2.137 × 10^4 | 4.463 × 10^3 | 2.955 × 10^3 | 1.840 × 10^3 | 6.100 × 10^4 | 1.167 × 10^5 | 1.399 × 10^5 | 1.464 × 10^3
F12 | 30 | Avg | 7.895 × 10^8 | 3.540 × 10^9 | 3.900 × 10^7 | 6.158 × 10^7 | 3.770 × 10^7 | 5.460 × 10^6 | 1.254 × 10^6 | 3.360 × 10^9 | 7.828 × 10^9 | 4.462 × 10^9 | 2.499 × 10^5
F12 | 30 | Min | 3.859 × 10^8 | 1.934 × 10^9 | 2.109 × 10^6 | 7.305 × 10^4 | 2.509 × 10^6 | 1.046 × 10^6 | 3.718 × 10^4 | 6.620 × 10^8 | 3.034 × 10^9 | 1.749 × 10^9 | 5.318 × 10^4
F12 | 50 | Avg | 9.643 × 10^9 | 2.939 × 10^10 | 4.764 × 10^8 | 2.475 × 10^9 | 1.861 × 10^8 | 4.882 × 10^7 | 7.229 × 10^6 | 1.887 × 10^10 | 5.350 × 10^10 | 3.075 × 10^10 | 1.599 × 10^6
F12 | 50 | Min | 5.659 × 10^9 | 2.161 × 10^10 | 7.558 × 10^7 | 1.646 × 10^7 | 5.114 × 10^7 | 2.169 × 10^7 | 1.549 × 10^6 | 1.045 × 10^10 | 2.948 × 10^10 | 1.600 × 10^10 | 3.874 × 10^5
F12 | 100 | Avg | 6.862 × 10^10 | 1.398 × 10^11 | 4.919 × 10^9 | 3.523 × 10^10 | 6.875 × 10^8 | 3.370 × 10^8 | 3.428 × 10^7 | 6.764 × 10^10 | 1.766 × 10^11 | 9.544 × 10^10 | 3.131 × 10^6
F12 | 100 | Min | 5.696 × 10^10 | 1.132 × 10^11 | 1.450 × 10^9 | 1.435 × 10^10 | 2.918 × 10^8 | 2.296 × 10^8 | 3.806 × 10^6 | 4.928 × 10^10 | 1.296 × 10^11 | 5.096 × 10^10 | 1.250 × 10^6
F13 | 30 | Avg | 9.518 × 10^7 | 8.620 × 10^8 | 8.368 × 10^5 | 7.958 × 10^6 | 1.463 × 10^5 | 4.494 × 10^5 | 1.047 × 10^5 | 8.863 × 10^8 | 4.348 × 10^4 | 8.738 × 10^8 | 1.994 × 10^4
F13 | 30 | Min | 4.048 × 10^7 | 3.361 × 10^8 | 1.991 × 10^4 | 1.122 × 10^4 | 2.283 × 10^4 | 2.705 × 10^5 | 1.436 × 10^4 | 3.327 × 10^7 | 2.158 × 10^4 | 2.189 × 10^8 | 1.396 × 10^3
F13 | 50 | Avg | 1.722 × 10^9 | 8.709 × 10^9 | 1.532 × 10^8 | 2.427 × 10^8 | 1.657 × 10^5 | 2.510 × 10^6 | 8.895 × 10^4 | 5.260 × 10^9 | 3.917 × 10^9 | 1.288 × 10^10 | 1.434 × 10^4
F13 | 50 | Min | 1.103 × 10^9 | 5.824 × 10^9 | 1.312 × 10^5 | 1.136 × 10^5 | 4.764 × 10^4 | 1.415 × 10^6 | 2.174 × 10^4 | 5.002 × 10^8 | 1.041 × 10^7 | 1.435 × 10^9 | 1.582 × 10^3
F13 | 100 | Avg | 8.436 × 10^9 | 2.634 × 10^10 | 4.163 × 10^8 | 4.053 × 10^9 | 8.423 × 10^4 | 1.168 × 10^7 | 1.378 × 10^5 | 1.915 × 10^10 | 3.479 × 10^10 | 1.965 × 10^10 | 1.105 × 10^4
F13 | 100 | Min | 6.140 × 10^9 | 1.992 × 10^10 | 1.579 × 10^6 | 2.629 × 10^8 | 3.701 × 10^4 | 9.896 × 10^6 | 3.658 × 10^4 | 1.137 × 10^10 | 2.155 × 10^10 | 1.073 × 10^10 | 1.651 × 10^3
F14 | 30 | Avg | 1.022 × 10^5 | 3.619 × 10^5 | 1.438 × 10^5 | 8.969 × 10^4 | 9.075 × 10^5 | 2.614 × 10^4 | 2.073 × 10^4 | 3.244 × 10^5 | 4.223 × 10^4 | 1.548 × 10^6 | 1.671 × 10^4
F14 | 30 | Min | 3.932 × 10^4 | 7.165 × 10^4 | 3.679 × 10^3 | 2.197 × 10^3 | 1.364 × 10^5 | 2.724 × 10^3 | 6.252 × 10^3 | 4.503 × 10^4 | 2.213 × 10^3 | 7.879 × 10^4 | 5.615 × 10^3
F14 | 50 | Avg | 1.285 × 10^6 | 5.160 × 10^6 | 4.016 × 10^5 | 3.086 × 10^5 | 6.358 × 10^5 | 1.086 × 10^5 | 8.151 × 10^4 | 1.206 × 10^6 | 3.933 × 10^5 | 2.634 × 10^7 | 8.426 × 10^4
F14 | 50 | Min | 8.972 × 10^5 | 2.700 × 10^6 | 4.749 × 10^4 | 1.071 × 10^4 | 9.639 × 10^4 | 2.360 × 10^4 | 1.194 × 10^4 | 5.706 × 10^5 | 4.727 × 10^4 | 8.185 × 10^5 | 8.798 × 10^3
F14 | 100 | Avg | 2.710 × 10^7 | 6.024 × 10^7 | 3.480 × 10^6 | 7.558 × 10^6 | 1.876 × 10^6 | 1.207 × 10^6 | 3.627 × 10^5 | 7.928 × 10^6 | 2.267 × 10^7 | 3.200 × 10^7 | 3.439 × 10^5
F14 | 100 | Min | 2.008 × 10^7 | 3.524 × 10^7 | 1.056 × 10^6 | 3.097 × 10^5 | 6.461 × 10^5 | 1.938 × 10^5 | 1.387 × 10^5 | 4.302 × 10^6 | 4.241 × 10^6 | 7.455 × 10^6 | 1.557 × 10^5
F15 | 30 | Avg | 2.966 × 10^6 | 4.746 × 10^7 | 3.637 × 10^5 | 3.412 × 10^4 | 8.683 × 10^4 | 9.006 × 10^4 | 3.448 × 10^4 | 5.434 × 10^6 | 2.498 × 10^4 | 4.469 × 10^7 | 1.983 × 10^4
F15 | 30 | Min | 1.113 × 10^6 | 6.971 × 10^6 | 1.847 × 10^4 | 3.640 × 10^3 | 1.368 × 10^4 | 5.002 × 10^4 | 2.547 × 10^3 | 1.019 × 10^6 | 1.454 × 10^4 | 2.375 × 10^6 | 2.006 × 10^3
F15 | 50 | Avg | 1.739 × 10^8 | 1.276 × 10^9 | 9.314 × 10^6 | 2.145 × 10^7 | 7.839 × 10^4 | 5.206 × 10^5 | 7.164 × 10^4 | 1.070 × 10^8 | 3.131 × 10^4 | 8.129 × 10^8 | 9.247 × 10^3
F15 | 50 | Min | 6.074 × 10^7 | 5.105 × 10^8 | 1.565 × 10^4 | 4.235 × 10^4 | 2.225 × 10^4 | 3.594 × 10^5 | 1.422 × 10^4 | 5.379 × 10^7 | 1.979 × 10^4 | 1.237 × 10^8 | 1.622 × 10^3
F15 | 100 | Avg | 2.250 × 10^9 | 8.426 × 10^9 | 9.478 × 10^7 | 1.045 × 10^9 | 2.527 × 10^5 | 2.824 × 10^6 | 9.337 × 10^4 | 4.851 × 10^9 | 4.659 × 10^9 | 8.332 × 10^9 | 7.383 × 10^3
F15 | 100 | Min | 1.743 × 10^9 | 6.068 × 10^9 | 5.864 × 10^5 | 1.058 × 10^5 | 2.549 × 10^4 | 1.949 × 10^6 | 1.223 × 10^4 | 1.096 × 10^9 | 1.070 × 10^9 | 1.272 × 10^9 | 1.752 × 10^3
F16 | 30 | Avg | 3.496 × 10^3 | 4.179 × 10^3 | 2.287 × 10^3 | 2.995 × 10^3 | 3.519 × 10^3 | 2.640 × 10^3 | 2.807 × 10^3 | 3.475 × 10^3 | 3.676 × 10^3 | 4.402 × 10^3 | 1.928 × 10^3
F16 | 30 | Min | 3.252 × 10^3 | 3.709 × 10^3 | 1.744 × 10^3 | 2.487 × 10^3 | 2.728 × 10^3 | 2.110 × 10^3 | 2.095 × 10^3 | 2.940 × 10^3 | 2.867 × 10^3 | 3.607 × 10^3 | 1.617 × 10^3
F16 | 50 | Avg | 5.575 × 10^3 | 6.744 × 10^3 | 2.791 × 10^3 | 4.150 × 10^3 | 4.689 × 10^3 | 3.621 × 10^3 | 3.778 × 10^3 | 5.240 × 10^3 | 6.261 × 10^3 | 6.930 × 10^3 | 2.546 × 10^3
F16 | 50 | Min | 5.216 × 10^3 | 6.027 × 10^3 | 2.209 × 10^3 | 3.133 × 10^3 | 3.895 × 10^3 | 2.949 × 10^3 | 3.014 × 10^3 | 4.488 × 10^3 | 3.693 × 10^3 | 5.302 × 10^3 | 2.186 × 10^3
F16 | 100 | Avg | 1.236 × 10^4 | 1.660 × 10^4 | 5.610 × 10^3 | 8.085 × 10^3 | 9.811 × 10^3 | 6.439 × 10^3 | 6.869 × 10^3 | 1.231 × 10^4 | 1.814 × 10^4 | 1.679 × 10^4 | 4.533 × 10^3
F16 | 100 | Min | 1.111 × 10^4 | 1.563 × 10^4 | 4.748 × 10^3 | 6.389 × 10^3 | 7.512 × 10^3 | 5.301 × 10^3 | 4.978 × 10^3 | 1.047 × 10^4 | 1.301 × 10^4 | 1.394 × 10^4 | 3.471 × 10^3
F17 | 30 | Avg | 2.410 × 10^3 | 2.789 × 10^3 | 1.956 × 10^3 | 2.411 × 10^3 | 2.520 × 10^3 | 2.203 × 10^3 | 2.315 × 10^3 | 2.598 × 10^3 | 2.620 × 10^3 | 2.752 × 10^3 | 1.875 × 10^3
F17 | 30 | Min | 2.242 × 10^3 | 2.467 × 10^3 | 1.777 × 10^3 | 1.975 × 10^3 | 1.931 × 10^3 | 1.801 × 10^3 | 1.942 × 10^3 | 2.275 × 10^3 | 2.085 × 10^3 | 2.359 × 10^3 | 1.736 × 10^3
F17 | 50 | Avg | 4.770 × 10^3 | 5.784 × 10^3 | 2.676 × 10^3 | 3.708 × 10^3 | 3.892 × 10^3 | 3.155 × 10^3 | 3.758 × 10^3 | 4.205 × 10^3 | 4.226 × 10^3 | 5.316 × 10^3 | 2.573 × 10^3
F17 | 50 | Min | 4.087 × 10^3 | 4.805 × 10^3 | 2.257 × 10^3 | 2.866 × 10^3 | 3.106 × 10^3 | 2.538 × 10^3 | 2.931 × 10^3 | 3.304 × 10^3 | 3.228 × 10^3 | 3.873 × 10^3 | 2.176 × 10^3
F17 | 100 | Avg | 1.132 × 10^4 | 9.223 × 10^4 | 4.439 × 10^3 | 7.668 × 10^3 | 7.212 × 10^3 | 5.693 × 10^3 | 6.345 × 10^3 | 1.240 × 10^4 | 2.886 × 10^5 | 4.082 × 10^5 | 4.247 × 10^3
F17 | 100 | Min | 1.036 × 10^4 | 1.996 × 10^4 | 3.338 × 10^3 | 5.623 × 10^3 | 5.421 × 10^3 | 4.630 × 10^3 | 4.935 × 10^3 | 9.483 × 10^3 | 1.665 × 10^4 | 1.263 × 10^4 | 2.980 × 10^3
F18 | 30 | Avg | 2.207 × 10^6 | 7.273 × 10^6 | 6.631 × 10^5 | 3.177 × 10^6 | 2.408 × 10^6 | 3.682 × 10^5 | 1.734 × 10^5 | 1.487 × 10^6 | 7.850 × 10^5 | 2.844 × 10^7 | 8.793 × 10^4
F18 | 30 | Min | 1.112 × 10^6 | 1.967 × 10^6 | 8.000 × 10^4 | 3.737 × 10^4 | 1.933 × 10^5 | 8.629 × 10^4 | 3.793 × 10^4 | 4.340 × 10^5 | 1.205 × 10^5 | 2.007 × 10^6 | 3.279 × 10^3
F18 | 50 | Avg | 1.242 × 10^7 | 4.494 × 10^7 | 3.300 × 10^6 | 3.443 × 10^6 | 4.272 × 10^6 | 7.009 × 10^5 | 4.064 × 10^5 | 8.349 × 10^6 | 2.081 × 10^7 | 7.061 × 10^7 | 3.192 × 10^5
F18 | 50 | Min | 5.637 × 10^6 | 1.275 × 10^7 | 2.968 × 10^5 | 1.807 × 10^5 | 1.009 × 10^6 | 3.224 × 10^5 | 1.508 × 10^5 | 3.517 × 10^6 | 8.364 × 10^5 | 1.412 × 10^7 | 3.532 × 10^4
F18 | 100 | Avg | 5.093 × 10^7 | 1.121 × 10^8 | 4.158 × 10^6 | 1.162 × 10^7 | 2.020 × 10^6 | 2.306 × 10^6 | 8.326 × 10^5 | 1.088 × 10^7 | 3.135 × 10^7 | 5.663 × 10^7 | 1.164 × 10^6
F18 | 100 | Min | 3.392 × 10^7 | 6.823 × 10^7 | 7.431 × 10^5 | 4.881 × 10^5 | 8.476 × 10^5 | 1.032 × 10^6 | 3.782 × 10^5 | 5.042 × 10^6 | 9.728 × 10^6 | 9.819 × 10^6 | 2.003 × 10^5
F19 | 30 | Avg | 1.278 × 10^7 | 9.064 × 10^7 | 2.913 × 10^5 | 4.071 × 10^6 | 2.647 × 10^6 | 6.193 × 10^4 | 3.223 × 10^4 | 4.950 × 10^7 | 1.071 × 10^6 | 1.072 × 10^8 | 2.012 × 10^4
F19 | 30 | Min | 6.005 × 10^6 | 3.289 × 10^7 | 9.466 × 10^3 | 2.093 × 10^3 | 1.744 × 10^5 | 1.764 × 10^4 | 2.168 × 10^3 | 2.507 × 10^6 | 8.696 × 10^5 | 5.192 × 10^6 | 1.940 × 10^3
F19 | 50 | Avg | 9.348 × 10^7 | 6.178 × 10^8 | 2.362 × 10^6 | 6.151 × 10^6 | 2.457 × 10^6 | 2.445 × 10^5 | 2.361 × 10^4 | 3.031 × 10^8 | 4.614 × 10^5 | 9.214 × 10^8 | 1.409 × 10^4
F19 | 50 | Min | 4.313 × 10^7 | 2.829 × 10^8 | 6.908 × 10^4 | 5.030 × 10^3 | 1.534 × 10^5 | 1.272 × 10^5 | 2.700 × 10^3 | 3.918 × 10^7 | 4.438 × 10^5 | 1.580 × 10^8 | 2.081 × 10^3
F19 | 100 | Avg | 2.062 × 10^9 | 8.505 × 10^9 | 1.003 × 10^8 | 3.561 × 10^8 | 1.528 × 10^7 | 4.435 × 10^6 | 7.032 × 10^4 | 3.211 × 10^9 | 4.723 × 10^9 | 5.975 × 10^9 | 1.009 × 10^4
F19 | 100 | Min | 1.411 × 10^9 | 6.413 × 10^9 | 2.250 × 10^6 | 2.761 × 10^6 | 5.273 × 10^6 | 2.123 × 10^6 | 1.223 × 10^4 | 7.255 × 10^8 | 1.529 × 10^9 | 2.984 × 10^9 | 2.081 × 10^3
F20 | 30 | Avg | 2.533 × 10^3 | 2.656 × 10^3 | 2.288 × 10^3 | 2.600 × 10^3 | 2.702 × 10^3 | 2.498 × 10^3 | 2.468 × 10^3 | 2.921 × 10^3 | 2.638 × 10^3 | 2.847 × 10^3 | 2.116 × 10^3
F20 | 30 | Min | 2.461 × 10^3 | 2.473 × 10^3 | 2.154 × 10^3 | 2.215 × 10^3 | 2.327 × 10^3 | 2.180 × 10^3 | 2.072 × 10^3 | 2.560 × 10^3 | 2.327 × 10^3 | 2.454 × 10^3 | 2.018 × 10^3
F20 | 50 | Avg | 3.891 × 10^3 | 3.930 × 10^3 | 2.736 × 10^3 | 3.557 × 10^3 | 3.628 × 10^3 | 3.060 × 10^3 | 3.431 × 10^3 | 3.932 × 10^3 | 3.346 × 10^3 | 3.929 × 10^3 | 2.297 × 10^3
F20 | 50 | Min | 3.535 × 10^3 | 3.587 × 10^3 | 2.422 × 10^3 | 2.897 × 10^3 | 2.664 × 10^3 | 2.586 × 10^3 | 2.655 × 10^3 | 3.576 × 10^3 | 2.634 × 10^3 | 3.493 × 10^3 | 2.097 × 10^3
F20 | 100 | Avg | 7.466 × 10^3 | 7.382 × 10^3 | 4.469 × 10^3 | 5.692 × 10^3 | 5.875 × 10^3 | 5.054 × 10^3 | 5.740 × 10^3 | 6.931 × 10^3 | 5.751 × 10^3 | 6.923 × 10^3 | 3.566 × 10^3
F20 | 100 | Min | 6.910 × 10^3 | 6.809 × 10^3 | 3.301 × 10^3 | 4.194 × 10^3 | 4.326 × 10^3 | 4.139 × 10^3 | 4.438 × 10^3 | 6.030 × 10^3 | 4.700 × 10^3 | 6.187 × 10^3 | 3.093 × 10^3
Ranking | 30 | W/T/L | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 10/0/0
Ranking | 50 | W/T/L | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 1/0/9 | 0/0/10 | 0/0/10 | 0/0/10 | 9/0/1
Ranking | 100 | W/T/L | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 1/0/9 | 0/0/10 | 0/0/10 | 0/0/10 | 9/0/1
Table 4. Comparison of optimization results obtained from composition test functions.
F | D | Metrics | SA (1983) | CGA (2000) | GWO (2014) | MFO (2015) | WOA (2016) | LMFO (2016) | WCMFO (2019) | ChOA (2020) | AOA (2021) | SMFO (2021) | I-MFO
F21 | 30 | Avg | 2.593 × 10^3 | 2.655 × 10^3 | 2.383 × 10^3 | 2.476 × 10^3 | 2.558 × 10^3 | 2.439 × 10^3 | 2.493 × 10^3 | 2.565 × 10^3 | 2.604 × 10^3 | 2.653 × 10^3 | 2.363 × 10^3
F21 | 30 | Min | 2.565 × 10^3 | 2.626 × 10^3 | 2.352 × 10^3 | 2.421 × 10^3 | 2.463 × 10^3 | 2.378 × 10^3 | 2.398 × 10^3 | 2.503 × 10^3 | 2.515 × 10^3 | 2.551 × 10^3 | 2.334 × 10^3
F21 | 50 | Avg | 2.945 × 10^3 | 3.065 × 10^3 | 2.485 × 10^3 | 2.694 × 10^3 | 2.888 × 10^3 | 2.609 × 10^3 | 2.694 × 10^3 | 2.886 × 10^3 | 3.002 × 10^3 | 3.064 × 10^3 | 2.430 × 10^3
F21 | 50 | Min | 2.893 × 10^3 | 3.026 × 10^3 | 2.440 × 10^3 | 2.575 × 10^3 | 2.744 × 10^3 | 2.542 × 10^3 | 2.580 × 10^3 | 2.819 × 10^3 | 2.885 × 10^3 | 2.935 × 10^3 | 2.399 × 10^3
F21 | 100 | Avg | 4.058 × 10^3 | 4.393 × 10^3 | 2.845 × 10^3 | 3.594 × 10^3 | 3.884 × 10^3 | 3.280 × 10^3 | 3.539 × 10^3 | 4.044 × 10^3 | 4.581 × 10^3 | 4.394 × 10^3 | 2.738 × 10^3
F21 | 100 | Min | 3.956 × 10^3 | 4.275 × 10^3 | 2.751 × 10^3 | 3.262 × 10^3 | 3.502 × 10^3 | 3.106 × 10^3 | 3.233 × 10^3 | 3.804 × 10^3 | 4.161 × 10^3 | 4.128 × 10^3 | 2.652 × 10^3
F22 | 30 | Avg | 6.618 × 10^3 | 6.787 × 10^3 | 4.413 × 10^3 | 5.842 × 10^3 | 5.949 × 10^3 | 5.006 × 10^3 | 6.637 × 10^3 | 9.124 × 10^3 | 7.785 × 10^3 | 8.654 × 10^3 | 2.889 × 10^3
F22 | 30 | Min | 4.837 × 10^3 | 5.363 × 10^3 | 2.420 × 10^3 | 3.150 × 10^3 | 2.315 × 10^3 | 2.325 × 10^3 | 5.330 × 10^3 | 8.503 × 10^3 | 5.492 × 10^3 | 5.677 × 10^3 | 2.300 × 10^3
F22 | 50 | Avg | 1.569 × 10^4 | 1.600 × 10^4 | 8.634 × 10^3 | 1.029 × 10^4 | 1.208 × 10^4 | 8.858 × 10^3 | 1.001 × 10^4 | 1.655 × 10^4 | 1.468 × 10^4 | 1.616 × 10^4 | 6.525 × 10^3
F22 | 50 | Min | 1.464 × 10^4 | 1.489 × 10^4 | 7.065 × 10^3 | 7.958 × 10^3 | 8.721 × 10^3 | 7.176 × 10^3 | 8.609 × 10^3 | 1.554 × 10^4 | 1.304 × 10^4 | 1.529 × 10^4 | 5.551 × 10^3
F22 | 100 | Avg | 3.336 × 10^4 | 3.346 × 10^4 | 1.777 × 10^4 | 2.032 × 10^4 | 2.397 × 10^4 | 1.948 × 10^4 | 1.943 × 10^4 | 3.374 × 10^4 | 3.092 × 10^4 | 3.277 × 10^4 | 1.393 × 10^4
F22 | 100 | Min | 3.264 × 10^4 | 3.169 × 10^4 | 1.413 × 10^4 | 1.778 × 10^4 | 2.087 × 10^4 | 1.791 × 10^4 | 1.671 × 10^4 | 3.233 × 10^4 | 2.790 × 10^4 | 3.043 × 10^4 | 1.189 × 10^4
F23 | 30 | Avg | 2.916 × 10^3 | 3.149 × 10^3 | 2.731 × 10^3 | 2.801 × 10^3 | 3.032 × 10^3 | 2.759 × 10^3 | 2.785 × 10^3 | 3.011 × 10^3 | 3.312 × 10^3 | 3.283 × 10^3 | 2.700 × 10^3
F23 | 30 | Min | 2.819 × 10^3 | 3.102 × 10^3 | 2.695 × 10^3 | 2.762 × 10^3 | 2.886 × 10^3 | 2.710 × 10^3 | 2.721 × 10^3 | 2.930 × 10^3 | 3.093 × 10^3 | 3.027 × 10^3 | 2.680 × 10^3
F23 | 50 | Avg | 3.380 × 10^3 | 3.816 × 10^3 | 2.907 × 10^3 | 3.135 × 10^3 | 3.592 × 10^3 | 3.027 × 10^3 | 3.104 × 10^3 | 3.515 × 10^3 | 4.310 × 10^3 | 3.929 × 10^3 | 2.858 × 10^3
F23 | 50 | Min | 3.345 × 10^3 | 3.644 × 10^3 | 2.835 × 10^3 | 3.046 × 10^3 | 3.377 × 10^3 | 2.990 × 10^3 | 2.980 × 10^3 | 3.373 × 10^3 | 3.850 × 10^3 | 3.594 × 10^3 | 2.820 × 10^3
F23 | 100 | Avg | 4.322 × 10^3 | 5.475 × 10^3 | 3.405 × 10^3 | 3.716 × 10^3 | 4.823 × 10^3 | 3.475 × 10^3 | 3.545 × 10^3 | 4.661 × 10^3 | 6.745 × 10^3 | 6.024 × 10^3 | 3.035 × 10^3
F23 | 100 | Min | 4.238 × 10^3 | 5.225 × 10^3 | 3.289 × 10^3 | 3.547 × 10^3 | 4.263 × 10^3 | 3.366 × 10^3 | 3.306 × 10^3 | 4.424 × 10^3 | 6.011 × 10^3 | 5.104 × 10^3 | 2.971 × 10^3
F24 | 30 | Avg | 3.084 × 10^3 | 3.342 × 10^3 | 2.904 × 10^3 | 2.974 × 10^3 | 3.167 × 10^3 | 2.927 × 10^3 | 2.978 × 10^3 | 3.201 × 10^3 | 3.682 × 10^3 | 3.433 × 10^3 | 2.871 × 10^3
F24 | 30 | Min | 3.063 × 10^3 | 3.270 × 10^3 | 2.855 × 10^3 | 2.910 × 10^3 | 3.021 × 10^3 | 2.897 × 10^3 | 2.928 × 10^3 | 3.128 × 10^3 | 3.473 × 10^3 | 3.217 × 10^3 | 2.852 × 10^3
F24 | 50 | Avg | 3.459 × 10^3 | 4.039 × 10^3 | 3.087 × 10^3 | 3.227 × 10^3 | 3.733 × 10^3 | 3.136 × 10^3 | 3.231 × 10^3 | 3.713 × 10^3 | 4.749 × 10^3 | 4.359 × 10^3 | 3.008 × 10^3
F24 | 50 | Min | 3.416 × 10^3 | 3.856 × 10^3 | 3.000 × 10^3 | 3.152 × 10^3 | 3.545 × 10^3 | 3.071 × 10^3 | 3.135 × 10^3 | 3.588 × 10^3 | 4.340 × 10^3 | 3.875 × 10^3 | 2.964 × 10^3
F24 | 100 | Avg | 5.059 × 10^3 | 8.073 × 10^3 | 3.962 × 10^3 | 4.272 × 10^3 | 5.854 × 10^3 | 4.086 × 10^3 | 4.293 × 10^3 | 5.913 × 10^3 | 1.065 × 10^4 | 8.875 × 10^3 | 3.651 × 10^3
F24 | 100 | Min | 4.961 × 10^3 | 7.435 × 10^3 | 3.819 × 10^3 | 4.124 × 10^3 | 5.238 × 10^3 | 3.976 × 10^3 | 4.048 × 10^3 | 5.524 × 10^3 | 8.928 × 10^3 | 6.869 × 10^3 | 3.556 × 10^3
F25 | 30 | Avg | 4.148 × 10^3 | 5.436 × 10^3 | 2.957 × 10^3 | 3.107 × 10^3 | 2.945 × 10^3 | 2.889 × 10^3 | 2.887 × 10^3 | 4.099 × 10^3 | 4.426 × 10^3 | 3.940 × 10^3 | 2.888 × 10^3
F25 | 30 | Min | 3.828 × 10^3 | 4.615 × 10^3 | 2.913 × 10^3 | 2.889 × 10^3 | 2.898 × 10^3 | 2.888 × 10^3 | 2.884 × 10^3 | 3.456 × 10^3 | 3.635 × 10^3 | 3.463 × 10^3 | 2.887 × 10^3
F25 | 50 | Avg | 1.054 × 10^4 | 1.824 × 10^4 | 3.371 × 10^3 | 4.930 × 10^3 | 3.155 × 10^3 | 3.043 × 10^3 | 3.041 × 10^3 | 8.621 × 10^3 | 1.388 × 10^4 | 1.083 × 10^4 | 3.000 × 10^3
F25 | 50 | Min | 7.933 × 10^3 | 1.397 × 10^4 | 3.055 × 10^3 | 3.159 × 10^3 | 3.039 × 10^3 | 2.994 × 10^3 | 2.962 × 10^3 | 6.928 × 10^3 | 1.101 × 10^4 | 7.534 × 10^3 | 2.978 × 10^3
F25 | 100 | Avg | 5.160 × 10^4 | 5.583 × 10^4 | 5.277 × 10^3 | 1.123 × 10^4 | 3.590 × 10^3 | 3.456 × 10^3 | 3.321 × 10^3 | 1.363 × 10^4 | 2.328 × 10^4 | 2.002 × 10^4 | 3.262 × 10^3
F25 | 100 | Min | 4.633 × 10^4 | 4.701 × 10^4 | 4.686 × 10^3 | 4.792 × 10^3 | 3.464 × 10^3 | 3.365 × 10^3 | 3.206 × 10^3 | 1.142 × 10^4 | 1.986 × 10^4 | 1.680 × 10^4 | 3.116 × 10^3
F26 | 30 | Avg | 6.408 × 10^3 | 8.876 × 10^3 | 4.424 × 10^3 | 5.689 × 10^3 | 7.599 × 10^3 | 5.012 × 10^3 | 5.447 × 10^3 | 6.328 × 10^3 | 9.412 × 10^3 | 8.871 × 10^3 | 4.300 × 10^3
F26 | 30 | Min | 5.542 × 10^3 | 7.735 × 10^3 | 3.954 × 10^3 | 4.921 × 10^3 | 5.975 × 10^3 | 4.607 × 10^3 | 4.955 × 10^3 | 5.882 × 10^3 | 7.702 × 10^3 | 5.057 × 10^3 | 2.900 × 10^3
F26 | 50 | Avg | 1.063 × 10^4 | 1.594 × 10^4 | 5.735 × 10^3 | 8.121 × 10^3 | 1.306 × 10^4 | 7.041 × 10^3 | 8.059 × 10^3 | 1.028 × 10^4 | 1.546 × 10^4 | 1.587 × 10^4 | 5.179 × 10^3
F26 | 50 | Min | 1.002 × 10^4 | 1.454 × 10^4 | 5.192 × 10^3 | 6.910 × 10^3 | 9.977 × 10^3 | 6.161 × 10^3 | 7.062 × 10^3 | 9.047 × 10^3 | 1.326 × 10^4 | 1.396 × 10^4 | 4.512 × 10^3
F26 | 100 | Avg | 2.452 × 10^4 | 4.461 × 10^4 | 1.263 × 10^4 | 1.741 × 10^4 | 3.111 × 10^4 | 1.493 × 10^4 | 1.752 × 10^4 | 2.492 × 10^4 | 4.995 × 10^4 | 4.315 × 10^4 | 9.748 × 10^3
F26 | 100 | Min | 2.363 × 10^4 | 4.099 × 10^4 | 1.124 × 10^4 | 1.526 × 10^4 | 2.326 × 10^4 | 1.333 × 10^4 | 1.518 × 10^4 | 2.276 × 10^4 | 4.219 × 10^4 | 3.583 × 10^4 | 9.123 × 10^3
F27 | 30 | Avg | 3.279 × 10^3 | 3.667 × 10^3 | 3.229 × 10^3 | 3.236 × 10^3 | 3.346 × 10^3 | 3.221 × 10^3 | 3.228 × 10^3 | 3.493 × 10^3 | 4.286 × 10^3 | 3.688 × 10^3 | 3.213 × 10^3
F27 | 30 | Min | 3.250 × 10^3 | 3.497 × 10^3 | 3.212 × 10^3 | 3.208 × 10^3 | 3.282 × 10^3 | 3.200 × 10^3 | 3.201 × 10^3 | 3.355 × 10^3 | 3.633 × 10^3 | 3.397 × 10^3 | 3.184 × 10^3
F27 | 50 | Avg | 3.730 × 10^3 | 5.183 × 10^3 | 3.471 × 10^3 | 3.550 × 10^3 | 4.305 × 10^3 | 3.356 × 10^3 | 3.504 × 10^3 | 4.272 × 10^3 | 6.565 × 10^3 | 5.306 × 10^3 | 3.337 × 10^3
F27 | 50 | Min | 3.669 × 10^3 | 4.697 × 10^3 | 3.342 × 10^3 | 3.407 × 10^3 | 3.678 × 10^3 | 3.249 × 10^3 | 3.377 × 10^3 | 3.997 × 10^3 | 5.687 × 10^3 | 4.453 × 10^3 | 3.231 × 10^3
F27 | 100 | Avg | 4.858 × 10^3 | 8.855 × 10^3 | 3.854 × 10^3 | 3.867 × 10^3 | 4.945 × 10^3 | 3.500 × 10^3 | 3.607 × 10^3 | 5.656 × 10^3 | 1.177 × 10^4 | 9.335 × 10^3 | 3.467 × 10^3
F27 | 100 | Min | 4.700 × 10^3 | 7.573 × 10^3 | 3.594 × 10^3 | 3.655 × 10^3 | 3.909 × 10^3 | 3.389 × 10^3 | 3.482 × 10^3 | 5.033 × 10^3 | 9.541 × 10^3 | 5.884 × 10^3 | 3.381 × 10^3
F28 | 30 | Avg | 4.040 × 10^3 | 5.557 × 10^3 | 3.339 × 10^3 | 3.721 × 10^3 | 3.303 × 10^3 | 3.255 × 10^3 | 3.194 × 10^3 | 4.295 × 10^3 | 5.958 × 10^3 | 5.524 × 10^3 | 3.226 × 10^3
F28 | 30 | Min | 3.927 × 10^3 | 4.829 × 10^3 | 3.269 × 10^3 | 3.318 × 10^3 | 3.269 × 10^3 | 3.209 × 10^3 | 3.100 × 10^3 | 3.565 × 10^3 | 4.603 × 10^3 | 4.419 × 10^3 | 3.155 × 10^3
F28 | 50 | Avg | 8.098 × 10^3 | 1.147 × 10^4 | 3.873 × 10^3 | 8.080 × 10^3 | 3.424 × 10^3 | 3.316 × 10^3 | 3.298 × 10^3 | 6.101 × 10^3 | 1.102 × 10^4 | 9.557 × 10^3 | 3.278 × 10^3
F28 | 50 | Min | 6.921 × 10^3 | 8.816 × 10^3 | 3.653 × 10^3 | 5.324 × 10^3 | 3.344 × 10^3 | 3.274 × 10^3 | 3.259 × 10^3 | 5.216 × 10^3 | 9.574 × 10^3 | 8.008 × 10^3 | 3.259 × 10^3
F28 | 100 | Avg | 2.499 × 10^4 | 3.958 × 10^4 | 6.692 × 10^3 | 1.749 × 10^4 | 3.721 × 10^3 | 1.149 × 10^4 | 7.644 × 10^3 | 1.204 × 10^4 | 2.938 × 10^4 | 2.320 × 10^4 | 3.357 × 10^3
F28 | 100 | Min | 2.393 × 10^4 | 3.584 × 10^4 | 4.771 × 10^3 | 1.485 × 10^4 | 3.598 × 10^3 | 3.439 × 10^3 | 3.333 × 10^3 | 9.983 × 10^3 | 2.587 × 10^4 | 1.831 × 10^4 | 3.321 × 10^3
F29 | 30 | Avg | 4.387 × 10^3 | 5.321 × 10^3 | 3.645 × 10^3 | 4.003 × 10^3 | 4.751 × 10^3 | 3.785 × 10^3 | 3.965 × 10^3 | 4.348 × 10^3 | 5.689 × 10^3 | 5.698 × 10^3 | 3.465 × 10^3
F29 | 30 | Min | 4.114 × 10^3 | 4.426 × 10^3 | 3.459 × 10^3 | 3.603 × 10^3 | 4.062 × 10^3 | 3.596 × 10^3 | 3.650 × 10^3 | 4.057 × 10^3 | 4.626 × 10^3 | 4.728 × 10^3 | 3.343 × 10^3
F29 | 50 | Avg | 6.503 × 10^3 | 9.644 × 10^3 | 4.214 × 10^3 | 5.076 × 10^3 | 7.281 × 10^3 | 4.337 × 10^3 | 4.671 × 10^3 | 6.906 × 10^3 | 1.548 × 10^4 | 1.622 × 10^4 | 3.589 × 10^3
F29 | 50 | Min | 5.954 × 10^3 | 7.327 × 10^3 | 3.750 × 10^3 | 4.271 × 10^3 | 6.025 × 10^3 | 3.820 × 10^3 | 3.992 × 10^3 | 5.462 × 10^3 | 8.398 × 10^3 | 8.290 × 10^3 | 3.288 × 10^3
F29 | 100 | Avg | 1.758 × 10^4 | 5.024 × 10^4 | 7.229 × 10^3 | 1.370 × 10^4 | 1.413 × 10^4 | 6.953 × 10^3 | 7.986 × 10^3 | 1.940 × 10^4 | 8.567 × 10^4 | 5.425 × 10^4 | 5.805 × 10^3
F29 | 100 | Min | 1.557 × 10^4 | 2.701 × 10^4 | 6.385 × 10^3 | 7.555 × 10^3 | 1.053 × 10^4 | 5.760 × 10^3 | 7.019 × 10^3 | 1.268 × 10^4 | 3.350 × 10^4 | 1.727 × 10^4 | 5.194 × 10^3
F30 | 30 | Avg | 1.398 × 10^7 | 1.149 × 10^8 | 7.020 × 10^6 | 3.271 × 10^5 | 6.709 × 10^6 | 1.579 × 10^5 | 2.811 × 10^4 | 3.527 × 10^7 | 4.703 × 10^7 | 3.278 × 10^8 | 1.064 × 10^4
F30 | 30 | Min | 4.880 × 10^6 | 2.608 × 10^7 | 8.829 × 10^5 | 1.393 × 10^4 | 4.463 × 10^5 | 4.934 × 10^4 | 1.582 × 10^4 | 1.030 × 10^7 | 1.875 × 10^6 | 3.212 × 10^7 | 5.336 × 10^3
F30 | 50 | Avg | 3.257 × 10^8 | 1.521 × 10^9 | 6.713 × 10^7 | 8.852 × 10^7 | 8.101 × 10^7 | 5.293 × 10^6 | 2.475 × 10^6 | 5.299 × 10^8 | 5.682 × 10^8 | 2.207 × 10^9 | 1.323 × 10^6
F30 | 50 | Min | 1.906 × 10^8 | 7.735 × 10^8 | 3.536 × 10^7 | 2.389 × 10^6 | 4.041 × 10^7 | 3.797 × 10^6 | 1.155 × 10^6 | 1.890 × 10^8 | 1.863 × 10^8 | 2.782 × 10^8 | 7.972 × 10^5
F30 | 100 | Avg | 3.841 × 10^9 | 1.351 × 10^10 | 3.958 × 10^8 | 1.283 × 10^9 | 1.922 × 10^8 | 1.283 × 10^7 | 1.932 × 10^6 | 1.185 × 10^10 | 3.055 × 10^10 | 1.581 × 10^10 | 7.578 × 10^3
F30 | 100 | Min | 3.148 × 10^9 | 8.053 × 10^9 | 5.455 × 10^7 | 3.821 × 10^7 | 7.264 × 10^7 | 7.913 × 10^6 | 3.637 × 10^5 | 8.263 × 10^9 | 1.450 × 10^10 | 4.669 × 10^9 | 5.286 × 10^3
Ranking | 30 | W/T/L | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 2/0/8 | 0/0/10 | 0/0/10 | 0/0/10 | 8/0/2
Ranking | 50 | W/T/L | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 10/0/0
Ranking | 100 | W/T/L | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 0/0/10 | 10/0/0
Table 5. The overall effectiveness of the I-MFO and contender algorithms.
Algorithms | SA (W/T/L) | CGA (W/T/L) | GWO (W/T/L) | MFO (W/T/L) | WOA (W/T/L) | LMFO (W/T/L) | WCMFO (W/T/L) | ChOA (W/T/L) | AOA (W/T/L) | SMFO (W/T/L) | I-MFO (W/T/L)
D = 30 | 0/0/29 | 0/0/29 | 0/0/29 | 0/0/29 | 0/0/29 | 0/0/29 | 2/0/27 | 0/0/29 | 0/0/29 | 0/0/29 | 27/0/2
D = 50 | 0/0/29 | 0/0/29 | 0/0/29 | 0/0/29 | 0/0/29 | 0/0/29 | 3/0/26 | 0/0/29 | 0/0/29 | 0/0/29 | 26/0/3
D = 100 | 0/0/29 | 0/0/29 | 0/0/29 | 0/0/29 | 0/0/29 | 0/0/29 | 2/0/27 | 0/0/29 | 0/0/29 | 0/0/29 | 27/0/2
Total | 0/0/87 | 0/0/87 | 0/0/87 | 0/0/87 | 0/0/87 | 0/0/87 | 7/0/80 | 0/0/87 | 0/0/87 | 0/0/87 | 80/0/7
OE | 0% | 0% | 0% | 0% | 0% | 0% | 8% | 0% | 0% | 0% | 92%
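The overall effectiveness (OE) row in Table 5 can be reproduced from the W/T/L totals themselves: it appears to be the share of win cases over all 87 comparisons (29 functions across the three dimensions), rounded to the nearest percent. The helper name below is ours, not the paper's:

```python
# OE as it appears to be computed in Table 5: wins over all comparisons,
# expressed as a rounded percentage (87 = 29 functions x 3 dimensions).
def overall_effectiveness(wins: int, ties: int, losses: int) -> int:
    total = wins + ties + losses
    return round(100 * wins / total)

# Totals from the "Total" row of Table 5.
assert overall_effectiveness(80, 0, 7) == 92   # I-MFO
assert overall_effectiveness(7, 0, 80) == 8    # WCMFO
assert overall_effectiveness(0, 0, 87) == 0    # every other contender
```

This matches the 92%/8%/0% figures reported in the table.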
Table 6. Friedman test for unimodal and multimodal functions of the CEC 2018.
Entries are the average Friedman rank, with the overall rank in parentheses.
Algorithms | Unimodal, D = 30 | Unimodal, D = 50 | Unimodal, D = 100 | Multimodal, D = 30 | Multimodal, D = 50 | Multimodal, D = 100
SA | 9.22 (10) | 9.77 (10) | 10.50 (11) | 8.67 (9) | 9.45 (10) | 10.15 (10)
CGA | 9.75 (11) | 10.42 (11) | 10.25 (10) | 10.49 (11) | 10.74 (11) | 10.29 (11)
GWO | 4.57 (4) | 5.12 (5) | 4.12 (3) | 2.52 (2) | 2.66 (2) | 2.578 (2)
MFO | 6.30 (6) | 6.07 (6) | 6.30 (7) | 4.65 (5) | 4.67 (5) | 5.26 (6)
WOA | 6.70 (7) | 3.60 (4) | 6.10 (5) | 6.21 (6) | 5.45 (6) | 4.53 (4)
LMFO | 3.45 (3) | 3.57 (3) | 4.15 (4) | 2.99 (3) | 2.85 (3) | 3.29 (3)
WCMFO | 1.77 (2) | 1.62 (2) | 1.70 (2) | 4.61 (4) | 4.52 (4) | 4.67 (5)
ChOA | 6.17 (5) | 6.70 (7) | 6.17 (6) | 7.19 (7) | 7.27 (7) | 7.34 (7)
AOA | 8.12 (8) | 9.02 (9) | 7.80 (9) | 7.67 (8) | 8.04 (8) | 7.98 (8)
SMFO | 8.57 (9) | 8.57 (8) | 7.57 (8) | 9.75 (10) | 9.14 (9) | 8.82 (9)
I-MFO | 1.35 (1) | 1.50 (1) | 1.32 (1) | 1.20 (1) | 1.17 (1) | 1.06 (1)
Table 7. Friedman test for hybrid and composition functions of the CEC 2018.
Entries are the average Friedman rank, with the overall rank in parentheses.
Algorithms | Hybrid, D = 30 | Hybrid, D = 50 | Hybrid, D = 100 | Composition, D = 30 | Composition, D = 50 | Composition, D = 100
SA | 7.96 (8) | 8.74 (9) | 8.52 (8) | 7.20 (7) | 8.74 (9) | 7.68 (8)
CGA | 9.94 (10) | 10.19 (11) | 10.20 (11) | 9.50 (9) | 10.19 (11) | 9.65 (10)
GWO | 3.79 (2) | 3.79 (3) | 4.17 (5) | 3.12 (2) | 3.795 (3) | 3.14 (2)
MFO | 3.87 (4) | 4.56 (5) | 5.46 (6) | 4.36 (5) | 4.56 (5) | 4.95 (5)
WOA | 6.49 (7) | 5.14 (6) | 4.04 (4) | 6.05 (6) | 5.14 (6) | 5.83 (6)
LMFO | 4.63 (5) | 4.06 (4) | 3.65 (3) | 3.39 (3) | 4.06 (4) | 3.23 (3)
WCMFO | 3.86 (3) | 3.57 (2) | 2.95 (2) | 3.99 (4) | 3.57 (2) | 3.55 (4)
ChOA | 8.18 (9) | 7.74 (8) | 7.48 (7) | 7.70 (8) | 7.74 (8) | 7.4 (7)
AOA | 5.81 (6) | 6.96 (7) | 8.8 (9) | 9.64 (10) | 6.96 (7) | 10.15 (11)
SMFO | 10.21 (11) | 10.12 (10) | 9.62 (10) | 9.90 (11) | 10.12 (10) | 9.36 (9)
I-MFO | 1.23 (1) | 1.10 (1) | 1.09 (1) | 1.12 (1) | 1.10 (1) | 1.03 (1)
Table 8. Adjusted p-values for the Friedman test on different dimensions (I-MFO is the control method).
Dimensions3050100
AlgorithmsBonferroni
p-Value
Tukey
p-Value
Bonferroni
p-Value
Tukey
p-Value
Bonferroni
p-Value
Tukey
p-Value
SA7.238 × 10−87.247 × 10−87.238 × 10−87.247 × 10−87.238 × 10−87.247 × 10-08
CGA7.238 × 10−87.247 × 10−87.238 × 10−87.247 × 10−87.238 × 10−87.247 × 10−8
GWO5.337 × 10−75.338 × 10−75.337 × 10−75.338 × 10−77.238 × 10−87.247 × 10−8
MFO5.337 × 10−75.338 × 10−77.238 × 10−87.247 × 10−87.238 × 10−87.247 × 10−8
WOA7.238 × 10−87.247 × 10−87.238 × 10−87.247 × 10−87.238 × 10−87.247 × 10−8
LMFO3.444 × 10−63.444 × 10−65.337 × 10−75.338 × 10−75.337 × 10−75.338 × 10−7
WCMFO1.595 × 10−31.595 × 10−33.444 × 10−63.444 × 10−63.444 × 10−63.444 × 10−6
ChOA7.238 × 10−87.247 × 10−87.238 × 10−87.247 × 10−87.238 × 10−87.247 × 10−8
AOA5.337 × 10−75.338 × 10−77.238 × 10−87.247 × 10−87.238 × 10−87.247 × 10−8
SMFO7.238 × 10−87.247 × 10−87.238 × 10−87.247 × 10−87.238 × 10−87.247 × 10−8
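As a rough sketch of how the "Avg. Rank" columns of Tables 6 and 7 and a Bonferroni-adjusted p-value of the kind reported in Table 8 can be computed, the snippet below uses synthetic stand-in error values (the data, seed, and variable names are illustrative, not taken from the paper):

```python
import numpy as np

# Synthetic stand-in scores: errors[i, j] = best error of algorithm j on
# benchmark function i (29 CEC 2018 functions, 11 algorithms).
rng = np.random.default_rng(0)
errors = rng.random((29, 11))
n, k = errors.shape

# Per-function ranks (1 = lowest error); random floats, so no ties occur.
ranks = errors.argsort(axis=1).argsort(axis=1) + 1
avg_ranks = ranks.mean(axis=0)  # corresponds to the "Avg. Rank" columns

# Friedman statistic with df = k - 1:
#   chi2_F = 12n / (k(k+1)) * (sum_j Rbar_j^2 - k(k+1)^2 / 4)
chi2_f = 12 * n / (k * (k + 1)) * (np.sum(avg_ranks**2) - k * (k + 1)**2 / 4)

# Bonferroni adjustment for the 10 pairwise comparisons against the control:
p_raw = 7.2e-9  # hypothetical unadjusted p-value for one comparison
p_bonf = min(p_raw * (k - 1), 1.0)
```

The double `argsort` trick is valid only because the synthetic values are tie-free; with ties, an average-rank method would be needed.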
Table 9. Results of the gas transmission compressor design problem.

| Algorithms | x1 | x2 | x3 | x4 | Optimal Cost |
|---|---|---|---|---|---|
| SA | 46.76 | 1.62 | 25.79 | 0.55 | 4.390311 × 106 |
| CGA | 49.97 | 20.01 | 31.47 | 49.83 | 1.735023 × 107 |
| GWO | 20.00 | 7.81 | 20.00 | 60.00 | 2.964974 × 106 |
| MFO | 50.00 | 1.18 | 24.57 | 0.39 | 2.964902 × 106 |
| WOA | 50.00 | 1.18 | 24.86 | 0.39 | 2.965002 × 106 |
| LMFO | 49.46 | 1.18 | 24.64 | 0.39 | 2.965456 × 106 |
| WCMFO | 50.00 | 1.18 | 24.61 | 0.39 | 2.964897 × 106 |
| ChOA | 50.00 | 1.19 | 24.24 | 0.41 | 2.966828 × 106 |
| AOA | 50.00 | 1.23 | 20.00 | 0.51 | 3.014615 × 106 |
| SMFO | 23.66 | 1.09 | 23.66 | 0.19 | 3.052254 × 106 |
| I-MFO | 50.00 | 1.18 | 24.60 | 0.39 | 2.964896 × 106 |
Table 10. Results of the three-bar truss problem.

| Algorithms | x1 | x2 | Optimal Weight |
|---|---|---|---|
| SA | 0.768630 | 0.474232 | 2.6482456 × 102 |
| CGA | 0.792428 | 0.397752 | 2.6390770 × 102 |
| GWO | 0.787771 | 0.410872 | 2.6389619 × 102 |
| MFO | 0.789186 | 0.406806 | 2.6389603 × 102 |
| WOA | 0.787713 | 0.410977 | 2.6389653 × 102 |
| LMFO | 0.791713 | 0.399909 | 2.6392114 × 102 |
| WCMFO | 0.788472 | 0.408822 | 2.6389589 × 102 |
| ChOA | 0.787802 | 0.410724 | 2.6389653 × 102 |
| AOA | 0.792789 | 0.396906 | 2.6392526 × 102 |
| SMFO | 0.792044 | 0.398859 | 2.6390973 × 102 |
| I-MFO | 0.788792 | 0.407919 | 2.6389585 × 102 |
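The reported optima can be sanity-checked against the objective of the three-bar truss problem, which in its common formulation (assumed here) minimizes f(x1, x2) = (2·√2·x1 + x2)·l with bar length l = 100 cm:

```python
import math

# Three-bar truss objective (common benchmark formulation, assumed here):
# minimize f(x1, x2) = (2*sqrt(2)*x1 + x2) * l, with l = 100 cm.
def truss_weight(x1, x2, l=100.0):
    return (2.0 * math.sqrt(2.0) * x1 + x2) * l

# Plugging in I-MFO's variables from Table 10 reproduces the reported
# optimal weight of about 2.63896 x 10^2 (up to rounding of the variables).
w_truss = truss_weight(0.788792, 0.407919)
```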
Table 11. Results for the tension/compression spring design problem.

| Algorithms | d | D | N | Optimum Weight |
|---|---|---|---|---|
| SA | 0.075935 | 0.993094 | 3.879891 | 0.033670 |
| CGA | 0.071031 | 1.019975 | 1.726076 | 0.019749 |
| GWO | 0.051231 | 0.345699 | 11.970135 | 0.012676 |
| MFO | 0.053064 | 0.390718 | 9.542437 | 0.012699 |
| WOA | 0.050451 | 0.327675 | 13.219341 | 0.012694 |
| LMFO | 0.050000 | 0.317154 | 14.107156 | 0.012771 |
| WCMFO | 0.051509 | 0.352411 | 11.545969 | 0.012666 |
| ChOA | 0.051069 | 0.341746 | 12.251078 | 0.012702 |
| AOA | 0.050000 | 0.310475 | 15.000000 | 0.013195 |
| SMFO | 0.050000 | 0.314692 | 14.696505 | 0.013136 |
| I-MFO | 0.051710 | 0.357217 | 11.259785 | 0.012665 |
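The same check can be done for the spring problem: in its standard formulation (assumed here), the objective minimizes the weight f(d, D, N) = (N + 2)·D·d², where d is the wire diameter, D the mean coil diameter, and N the number of active coils:

```python
# Tension/compression spring objective (standard benchmark formulation,
# assumed here): minimize f(d, D, N) = (N + 2) * D * d^2.
def spring_weight(d, D, N):
    return (N + 2.0) * D * d**2

# I-MFO's variables from Table 11 reproduce the reported optimum weight
# of about 0.012665.
w_spring = spring_weight(0.051710, 0.357217, 11.259785)
```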
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Nadimi-Shahraki, M.H.; Fatahi, A.; Zamani, H.; Mirjalili, S.; Abualigah, L. An Improved Moth-Flame Optimization Algorithm with Adaptation Mechanism to Solve Numerical and Mechanical Engineering Problems. Entropy 2021, 23, 1637. https://0-doi-org.brum.beds.ac.uk/10.3390/e23121637
