Article

Deep Ensemble of Slime Mold Algorithm and Arithmetic Optimization Algorithm for Global Optimization

1
School of Information Engineering, Sanming University, Sanming 365004, China
2
Research and Innovation Department, Skyline University College, Sharjah 1797, United Arab Emirates
3
Faculty of Computer Sciences and Informatics, Amman Arab University, Amman 11953, Jordan
4
School of Computer Science, Universiti Sains Malaysia, Gelugor 11800, Malaysia
5
School of Computer Science and Technology, Hainan University, Haikou 570228, China
*
Authors to whom correspondence should be addressed.
Submission received: 18 August 2021 / Revised: 25 September 2021 / Accepted: 2 October 2021 / Published: 4 October 2021
(This article belongs to the Special Issue Evolutionary Process for Engineering Optimization)

Abstract

In this paper, a new hybrid algorithm based on two meta-heuristic algorithms is presented to improve the optimization capability of the original algorithms. This hybrid algorithm, called DESMAOA, is realized by the deep ensemble of two recently proposed meta-heuristic methods, i.e., the slime mold algorithm (SMA) and the arithmetic optimization algorithm (AOA). To be specific, a preliminary hybrid method was applied to obtain an improved SMA, called SMAOA. Then, two strategies, extracted from the SMA and AOA, respectively, were embedded into SMAOA to boost the optimization speed and the accuracy of the solution. The optimization performance of the proposed DESMAOA was analyzed using 23 classical benchmark functions. Firstly, the impacts of the different components are discussed. Then, the exploitation and exploration capabilities, convergence behavior, and performance are evaluated in detail. Cases at different dimensions were also investigated. Compared with the SMA, the AOA, and five other well-known optimization algorithms, the proposed method showed clearly superior performance. Finally, three classical engineering design problems were employed to illustrate the capability of the proposed algorithm for solving practical problems. The results also indicate that DESMAOA has very promising performance when solving these problems.

1. Introduction

Nowadays, optimization problems arise in various scenarios, for instance, engineering design problems. The objective of these problems is to find extreme values under given constraint conditions, typically so that the cost is reduced as much as possible. To tackle these problems, researchers have proposed many optimization algorithms [1,2,3,4]. Generally speaking, traditional optimization algorithms, such as gradient-based methods, are sensitive to the initial positions and have difficulty dealing with non-convex problems that may contain a large number of local optima. In practice, when faced with complex constraint conditions in the real world, it is more essential to obtain optimal solutions within limited time and cost. At this point, the important thing is not to find the theoretical optimal result but to obtain as good an approximate solution as possible under restricted conditions. For this purpose, many stochastic optimizers have been developed and employed to solve complex optimization problems. As the name implies, the random operator is the main feature of a stochastic optimizer; it allows the algorithm to avoid stagnation and search the whole region for the global optimum.
The meta-heuristic algorithms (MAs) have shown very powerful capability in the fields of computational sciences. In general, MAs have four types according to the sources of inspiration, namely, physics-inspired (PI), evolution-inspired (EI), swarm-inspired (SI), and human-inspired (HI). Some representative algorithms are shown below:
  • Physics-inspired: multi-verse optimizer (MVO) [5], gravitational search algorithm (GSA) [6], thermal exchange optimization (TEO) [7], heat transfer relation-based optimization algorithm (HTOA) [8].
  • Evolution-inspired: genetic algorithm (GA) [9], differential evolution (DE) [10], evolutionary programming (EP) [11].
  • Swarm-inspired: particle swarm optimization (PSO) [12], emperor penguin optimizer (EPO) [13], Aquila optimizer (AO) [14], remora optimization algorithm (ROA) [4], marine predators algorithm (MPA) [15].
  • Human-inspired: teaching–learning-based optimization (TLBO) [16], social group optimization (SGO) [17], β-hill climbing (βHC) [18], coronavirus optimization algorithm (COA) [19].
For a meta-heuristic algorithm, one important requirement is to balance global search and local search [20]. It is known that the search agents are first randomly generated within the search space. Then, the positions of these search agents are updated according to the formulas of the algorithm. In the early stage, exploration of the search space should be performed as widely as possible; in the later phase, more local exploitation should be conducted to improve the accuracy of the obtained optimal solution. Although hundreds of MAs have been proposed for optimization problems, there is still a need for new algorithms. According to the No-Free-Lunch (NFL) theorem [21], no single optimization algorithm can solve all optimization problems. Generally speaking, it is common that MAs suffer from local optimum stagnation and poor convergence speed as a result of limited optimization ability. Thus, it is very important to develop new optimization algorithms or to improve existing MAs by taking effective measures. Up until now, there have been three primary methods for improving existing algorithms, which are listed in Table 1.
The slime mold algorithm (SMA) [33] and the arithmetic optimization algorithm (AOA) [34] are two recently proposed MAs, and both have the merits of simplicity, efficiency, and flexibility. The SMA has good population diversity and stable performance when solving optimization problems; however, its limited global search capability means that it sometimes gets stuck in local optima. On the contrary, the AOA has powerful exploration capability thanks to its arithmetic operators, but its performance is not stable because of poor population diversity. Therefore, the SMA and AOA are hybridized in this paper for solving global optimization problems. To evaluate the performance of the proposed algorithm, we employed 23 classical benchmark functions and three constrained engineering design problems. The main contributions of this work are as follows:
  • Hybridizing the slime mold algorithm (SMA) [33] and the arithmetic optimization algorithm (AOA) [34] into SMAOA to improve the exploration capability of the original SMA.
  • Applying the random contraction strategy (RCS), inspired by the SMA, to help the SMAOA escape from local optima.
  • Applying the subtraction and addition strategy (SAS), extracted from the AOA, to enhance the exploitation ability of the SMAOA.
  • Applying the RCS and SAS to the SMAOA to finally obtain the DESMAOA. Comparison with seven well-known optimization algorithms shows that the proposed DESMAOA is powerful according to the experimental results.
The rest of this paper is organized as follows: The basics of SMA and AOA are described in Section 2. Then, the hybrid method is presented in Section 3, including two strategies that are obtained from these two algorithms. In Section 4, a series of experimental tests are conducted to evaluate the performance of proposed DESMAOA. In Section 5, three engineering design problems are employed to assess the applicability of proposed algorithm in practice. Finally, Section 6 concludes this paper and provides some directions for meaningful future research.

2. Preliminaries

2.1. Slime Mold Algorithm (SMA)

The slime mold algorithm (SMA) is a recent meta-heuristic algorithm proposed by Li et al. in 2020 [33]. The basic idea of the SMA is based on the foraging behavior of slime mold, which exhibits different feedback characteristics according to the food quality. Three special behaviors of the slime mold are mathematically formulated in the SMA, i.e., approaching food, wrapping food, and finally grabbling food. First, the process of approaching food can be expressed as
$$X_i(t+1) = \begin{cases} X_b(t) + v_b \cdot \big( W \cdot X_A(t) - X_B(t) \big), & r_1 < p \\ v_c \cdot X_i(t), & r_1 \ge p \end{cases} \tag{1}$$
where t is the number of current iteration, Xi(t + 1) is the newly generated position, Xb(t) denotes the best position found by slime mold in iteration t, XA(t) and XB(t) are two random positions selected from the population of slime mold, and r1 is a random value in [0, 1].
vb and vc are the coefficients that simulate the oscillation and contraction mode of slime mold, respectively, and vc is designed to linearly decrease from one to zero during the iterations. The range of vb is from −a to a, and the computational formula of a is
$$a = \operatorname{arctanh}\!\left(1 - \frac{t}{T}\right) \tag{2}$$
where T is the maximum number of iterations.
According to Equations (1) and (2), it can be seen that as the number of iterations increases, the slime mold will wrap the food.
W is a very important factor that indicates the weight of slime mold, and it is calculated as follows:
$$W(SmellIndex(i)) = \begin{cases} 1 + rand \cdot \log\!\left( \dfrac{bF - S(i)}{bF - wF} + 1 \right), & i \le N/2 \\ 1 - rand \cdot \log\!\left( \dfrac{bF - S(i)}{bF - wF} + 1 \right), & i > N/2 \end{cases} \tag{3}$$
$$SmellIndex(i) = \operatorname{sort}(S(i)) \tag{4}$$
where rand is a random value between 0 and 1; bF and wF are the best and worst fitness values, respectively, obtained so far; S(i) is the fitness value of the ith slime mold; N is the size of the population; and SmellIndex is a ranking of the fitness values of the individuals in the population.
In Equation (1), it is also worth noting that p is the probability of determining the update location for slime mold, which is related to the fitness values of slime mold and food and can be calculated as follows:
$$p = \tanh\left| S(i) - DF \right| \tag{5}$$
where DF denotes the best fitness obtained by population.
Finally, when the slime mold has found the food (i.e., grabble food), it still has a certain chance (z) to search other new food, which is formulated as
$$X(t+1) = rand \cdot (UB - LB) + LB, \quad r_2 < z \tag{6}$$
where UB and LB are the upper boundary and lower boundary, respectively, and r2 implies a random value in the region [0, 1].
In general, z should be very small; thus, it is set to 0.03 in SMA. Finally, the pseudo-code of SMA is given in Algorithm 1.
Algorithm 1. Pseudo-code of SMA
Initialize the parameters popsize (N) and maximum iterations (T)
Initialize the positions of all slime mold Xi (i = 1, 2, …, N)
While (t ≤ T)
  Calculate the fitness of all slime mold
  Update bestFitness, Xb
  Calculate the weight W by Equations (3) and (4)
  For each search agent
    If r2 < z
      Update position by Equation (6)
    Else
      Update p, vb, and vc
      Update position by Equation (1)
    End if
  End for
  t = t + 1
End While
Return bestFitness, Xb
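For concreteness, one full position-update pass of the SMA (Equations (1)–(6)) can be sketched in Python as below. This is an illustrative reimplementation, not the authors' code; minimization is assumed, and the base-10 logarithm in Equation (3) as well as the shrinking uniform range of vc follow common SMA implementations.

```python
import numpy as np

def sma_step(X, S, X_b, bF, wF, t, T, lb, ub, z=0.03):
    """One SMA update pass. X: (N, D) positions, S: (N,) fitness values,
    X_b: best position so far, bF/wF: best/worst fitness so far,
    t: current iteration (1-based), T: maximum iterations."""
    N, D = X.shape
    eps = np.finfo(float).eps

    a = np.arctanh(1.0 - t / T)                      # Eq. (2); t runs from 1 to T
    order = np.argsort(S)                            # SmellIndex, Eq. (4)
    # Eq. (3): for minimization, (bF - S)/(bF - wF) lies in [0, 1]
    ratio = np.log10((bF - S[order]) / (bF - wF + eps) + 1.0)
    r = np.random.rand(N, D)
    W = np.empty((N, D))
    half = N // 2
    W[order[:half]] = 1 + r[:half] * ratio[:half, None]   # better half of population
    W[order[half:]] = 1 - r[half:] * ratio[half:, None]   # worse half

    p = np.tanh(np.abs(S - bF))                      # Eq. (5), with DF = bF
    vb = np.random.uniform(-a, a, (N, D))            # oscillation coefficient
    c = 1.0 - t / T                                  # vc range shrinks linearly to 0
    vc = np.random.uniform(-c, c, (N, D))

    X_new = X.copy()
    for i in range(N):
        if np.random.rand() < z:                     # Eq. (6): random restart
            X_new[i] = np.random.rand(D) * (ub - lb) + lb
        else:
            A, B = np.random.randint(0, N, size=2)   # two random individuals
            if np.random.rand() < p[i]:              # Eq. (1), first branch
                X_new[i] = X_b + vb[i] * (W[i] * X[A] - X[B])
            else:                                    # Eq. (1), second branch
                X_new[i] = vc[i] * X[i]
    return np.clip(X_new, lb, ub)
```

In a full optimizer this pass is wrapped in the loop of Algorithm 1, re-evaluating fitness and updating Xb after each call.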

2.2. Arithmetic Optimization Algorithm (AOA)

The arithmetic optimization algorithm (AOA) is a very new meta-heuristic method proposed by Abualigah and others in 2021 [34]. The main inspiration of this algorithm is to combine the four traditional arithmetic operators of mathematics, i.e., multiplication (M), division (D), subtraction (S), and addition (A). Similar to the sine-cosine algorithm (SCA) [35], the AOA also has a very simple structure and low computational complexity. Considering that the M and D operators can produce large steps in the iterations, M and D are mainly applied in the exploration phase. The expression is as follows:
$$X_i(t+1) = \begin{cases} X_b(t) \div (MOP + eps) \cdot \big( (UB - LB) \cdot \mu + LB \big), & rand < 0.5 \\ X_b(t) \times MOP \cdot \big( (UB - LB) \cdot \mu + LB \big), & rand \ge 0.5 \end{cases} \tag{7}$$
where eps is a very small positive number, and μ is a constant coefficient (0.499) that is carefully designed for this algorithm.
MOP is non-linearly decreased from 1 to 0 during the iterations, and the expression is as follows:
$$MOP = 1 - \left( \frac{t}{T} \right)^{1/\alpha} \tag{8}$$
where α is a constant value, which is set to 5 according to the AOA.
From Equation (7), it can be seen that both the M and D operators can generate highly stochastic positions for a search agent on the basis of the best position. By contrast, the S and A operators are applied to emphasize local exploitation, generating smaller steps in the search space. The mathematical expression is defined as
$$X_i(t+1) = \begin{cases} X_b(t) - MOP \cdot \big( (UB - LB) \cdot \mu + LB \big), & rand < 0.5 \\ X_b(t) + MOP \cdot \big( (UB - LB) \cdot \mu + LB \big), & rand \ge 0.5 \end{cases} \tag{9}$$
There is no doubt about the importance of the balance between exploration and exploitation for an optimization algorithm. In the AOA, the parameter MOA is utilized to switch between exploration and exploitation over the course of the iterations; it is expressed as
$$MOA(t) = Min + t \cdot \left( \frac{Max - Min}{T} \right) \tag{10}$$
where Min and Max are constant values.
According to Equation (10), MOA increases from Min to Max. Thus, in the early phase, the search agents have more chances to explore the search space, while in the later stage, they are more likely to search near the best position. The pseudo-code of AOA is shown in Algorithm 2.
Algorithm 2. Pseudo-code of AOA
Initialize the parameters popsize (N) and maximum iterations (T)
Initialize the positions of all search agents Xi (i = 1, 2, …, N)
Set the parameters α, μ, Min, and Max
While (t ≤ T)
  Calculate the fitness of all search agents
  Update bestFitness, Xb
  Calculate the MOP by Equation (8)
  Calculate the MOA by Equation (10)
  For each search agent
    If rand > MOA
      Update position by Equation (7)
    Else
      Update position by Equation (9)
    End if
  End for
  t = t + 1
End While
Return bestFitness, Xb
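A minimal Python sketch of one AOA update pass (Equations (7)–(10)), structured as in Algorithm 2, is given below. This is an illustrative reimplementation; the values Min = 0.2 and Max = 1.0 for MOA are an assumption taken from the commonly used AOA settings, not from this paper.

```python
import numpy as np

def aoa_step(X, X_b, t, T, lb, ub, alpha=5.0, mu=0.499, Min=0.2, Max=1.0):
    """One AOA update pass. X: (N, D) positions, X_b: best position so far,
    t: current iteration, T: maximum iterations. Min/Max are assumed defaults."""
    N, D = X.shape
    eps = np.finfo(float).eps
    MOP = 1.0 - (t / T) ** (1.0 / alpha)    # Eq. (8): decreases from 1 to 0
    MOA = Min + t * ((Max - Min) / T)       # Eq. (10): increases from Min to Max
    scale = (ub - lb) * mu + lb             # shared step term in Eqs. (7) and (9)

    X_new = np.empty_like(X)
    for i in range(N):
        for j in range(D):
            if np.random.rand() > MOA:      # exploration: division / multiplication
                if np.random.rand() < 0.5:
                    X_new[i, j] = X_b[j] / (MOP + eps) * scale   # D operator
                else:
                    X_new[i, j] = X_b[j] * MOP * scale           # M operator
            else:                           # exploitation: subtraction / addition
                if np.random.rand() < 0.5:
                    X_new[i, j] = X_b[j] - MOP * scale           # S operator
                else:
                    X_new[i, j] = X_b[j] + MOP * scale           # A operator
    return np.clip(X_new, lb, ub)
```

Note that every candidate is derived from the best position Xb only, which is exactly the low-population-diversity behavior discussed in Section 3.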

3. The Proposed Hybridized Algorithm (DESMAOA)

It is well known that MAs have the merits of concision, flexibility, and, especially, utility. Hence, many scholars are working on developing new meta-heuristic-based approaches for optimization problems. However, several optimization algorithms, such as the slime mold algorithm and the arithmetic optimization algorithm, still have some drawbacks. For instance, when dealing with complex optimization problems, the SMA tends to fall into local optima and also converges slowly. Similarly, the AOA only utilizes the information of the best position in the population, which may cause low precision. Therefore, this paper aims to develop a new hybrid algorithm composed of the SMA and AOA for better optimization performance.
In this paper, the SMA and AOA are first integrated to form a hybridized algorithm named SMAOA. Then, the preliminary hybrid algorithm is further enhanced by adding two strategies. One is the random contraction strategy (RCS), which is an improved version of the contraction formula in the SMA. The other is the subtraction and addition strategy (SAS), which is extracted from the local search of the AOA. Finally, the deep ensemble of the SMA and AOA is accomplished, and the hybridized algorithm (i.e., DESMAOA) is obtained. The detailed implementation of the proposed algorithm is delineated in the following.

3.1. The Hybridization of SMA and AOA

In the SMA, the contraction formula (see Equations (1) and (2)) is utilized to help the slime mold jump out of local minima; however, it tends to zero in the later iterations and thus no longer contributes to global exploration. On the other hand, the multiplication and division operators in the AOA display a powerful capability in global exploration. Therefore, the multiplication and division formulas (see Equation (7)) replace the contraction equation, so that the hybrid algorithm SMAOA performs a good global search throughout all stages. To be specific, for a search agent that is close to the best position, the multiplication and division operators make it more likely to search other spaces.

3.2. Random Contraction Strategy (RCS)

In this work, we present the RCS on the basis of the mathematical formula of contraction mode in SMA, which is applied to expand exploration space and avoid local optimum. The coefficient vc is replaced by a random value lying between −1 and 1. The position update formula is calculated as follows:
$$V_{i2}(t+1) = (2 \cdot rand - 1) \cdot X_i(t) \tag{11}$$
From Equation (11), we should note that the position generated by the RCS lies within the range [−|Xi(t)|, |Xi(t)|] with a uniform distribution, which adds more flexibility for the search agents in the proposed algorithm. Note that the generated position of the RCS is taken as a candidate solution.
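As a minimal sketch, Equation (11) together with the greedy acceptance of the candidate can be written as follows (an illustrative snippet; minimization is assumed):

```python
import numpy as np

def rcs_candidate(X_i):
    """Random contraction strategy, Eq. (11): scale the current position by a
    uniform factor in [-1, 1], yielding a candidate in [-|X_i|, |X_i|]."""
    return (2.0 * np.random.rand(*X_i.shape) - 1.0) * X_i

def accept_if_better(X_i, V, f):
    """Greedy selection: keep the candidate only if its fitness improves."""
    return V if f(V) < f(X_i) else X_i
```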

3.3. Subtraction and Addition Strategy (SAS)

The other strategy proposed here is the SAS, which is also the exploitation method of the AOA. According to the AOA, the SAS can be performed locally and effectively increases the accuracy of the solutions. It is worth mentioning that the SAS is conducted after the RCS.
In the same way, the position generated by SAS is treated as a candidate solution, and if a better position is found, then it will be adopted.

3.4. The Deep Ensemble of SMA and AOA

As mentioned above, the SMA and AOA are first hybridized to obtain the SMAOA. Then, two strategies are introduced into the SMAOA, namely, the random contraction strategy and the subtraction and addition strategy. In order to achieve a better balance between exploration and exploitation, we utilize an iteration-dependent parameter, b, to represent the probability of conducting the strategies. Its computational formula is given below:
$$b = 1 - \frac{t}{T} \tag{12}$$
The pseudo-code of DESMAOA is shown in Algorithm 3. Moreover, the flowchart of proposed method is shown in Figure 1.
Algorithm 3. Pseudo-code of DESMAOA
Initialize the parameters popsize (N) and maximum iterations (T)
Initialize the positions of all search agents Xi (i = 1, 2, …, N)
Set the parameters α, μ, Min, and Max
While (t ≤ T)
  Calculate the fitness of all search agents
  Update bestFitness, Xb
  Calculate a, b, p, and W by Equations (2)–(5)
  Calculate the MOP by Equation (8)
  Update vb
  For each search agent
    If r2 < z
      Update position by Equation (6)
    Else
      If r1 < p
        Update position Vi1 by Equation (1)
      Else
        Update position Vi1 by Equation (7)
      End if
      If f(Vi1) < f(Xi)
        Xi = Vi1
      End if
      If rand < b
        Apply RCS and generate candidate position Vi2 by Equation (11)
        If f(Vi2) < f(Xi)
          Xi = Vi2
        End if
      End if
      If rand > b
        Apply SAS and generate candidate position Vi3 by Equation (9)
        If f(Vi3) < f(Xi)
          Xi = Vi3
        End if
      End if
    End if
  End for
  t = t + 1
End While
Return bestFitness, Xb
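The inner loop of Algorithm 3 for a single agent can be sketched as below. This is an illustrative reimplementation under the assumptions of minimization and scalar bounds; helper quantities (W_i, vb, p_i) are assumed to be precomputed per iteration from Equations (2)–(5), and f is the objective function.

```python
import numpy as np

def desmaoa_update_agent(X_i, f, X_b, W_i, X_A, X_B, vb, p_i, t, T,
                         lb, ub, mu=0.499, alpha=5.0):
    """Per-agent body of Algorithm 3 with greedy acceptance at every step."""
    eps = np.finfo(float).eps
    MOP = 1.0 - (t / T) ** (1.0 / alpha)           # Eq. (8)
    b = 1.0 - t / T                                # Eq. (12): strategy probability
    scale = (ub - lb) * mu + lb

    # SMAOA move: SMA exploitation branch or AOA M/D exploration branch
    if np.random.rand() < p_i:
        V1 = X_b + vb * (W_i * X_A - X_B)          # Eq. (1), first branch
    elif np.random.rand() < 0.5:
        V1 = X_b / (MOP + eps) * scale             # Eq. (7), division
    else:
        V1 = X_b * MOP * scale                     # Eq. (7), multiplication
    if f(V1) < f(X_i):
        X_i = V1

    if np.random.rand() < b:                       # RCS, Eq. (11)
        V2 = (2.0 * np.random.rand(*X_i.shape) - 1.0) * X_i
        if f(V2) < f(X_i):
            X_i = V2
    if np.random.rand() > b:                       # SAS, Eq. (9)
        V3 = X_b - MOP * scale if np.random.rand() < 0.5 else X_b + MOP * scale
        if f(V3) < f(X_i):
            X_i = V3
    return np.clip(X_i, lb, ub)
```

Because every candidate is accepted only if it improves the fitness, the agent's fitness is non-increasing within one call (up to boundary clipping), which mirrors the greedy selections in Algorithm 3.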

3.5. The Computational Complexity of DESMAOA

The computational complexity of DESMAOA depends on the population size (N), dimension size (D), and maximum iterations (T). First, the computational complexity of initialization is O(N × D). Then, in the iterations, the computational complexity of calculating the fitness values of all search agents is O(N). The computational complexity of sorting is O(N × logN). Moreover, the computational complexity of updating the positions of search agents in SMAOA is O(N × D). Considering the worst cases, the computational complexity of RCS and SAS is O(2N × D). In summary, the final computational complexity of the DESMAOA is O(N × D + T × N(1 + logN + 3D)).

4. Experimental Results and Discussions

In this section, we provide the results of a series of comparative experiments conducted on 23 classical benchmark functions and 10 IEEE CEC2021 single-objective optimization functions to evaluate the performance of the proposed DESMAOA [36,37]. Table 2 lists the detailed parameter values of these test functions. The classical test functions include unimodal functions (F1–F7), multimodal functions (F8–F13), and fixed-dimension multimodal functions (F14–F23). Moreover, the CEC2021 test functions contain four types of functions: a unimodal function, basic functions, hybrid functions, and composition functions. The unimodal functions are suitable for testing the exploitation capability of algorithms, while the other types of test functions, which contain a large number of local minima, can reveal the exploration capability and stability of algorithms.
In the experiments, the impacts of the two applied strategies were first analyzed using the classical test functions. Then, the results of DESMAOA on the classical test functions were compared with those of seven well-known algorithms; multiple aspects of the analysis, including exploitation capability, exploration capability, qualitative analysis, and convergence behavior, are described. Moreover, the results on the CEC2021 test functions were also analyzed to investigate the performance of the proposed algorithm.

4.1. Impacts of Components

The impacts of the different versions are investigated in this section. The SMA shows very outstanding performance on optimization problems; however, it still suffers from premature convergence and local optima. According to the work of Abualigah et al. [34], the AOA shows powerful global exploration and local exploitation capability. Hence, we first hybridized the SMA and AOA to obtain the SMAOA. Then, in order to help the search agents jump out of local optima, we integrated the RCS into the SMAOA. Moreover, the SAS was introduced into the SMAOA to improve the local search capability. The different combinations of the SMAOA and the two strategies are listed below:
  • SMAOA;
  • SMAOA combined with RCS (SMAOA1);
  • SMAOA combined with SAS (SMAOA2);
  • SMAOA combined with RCS and SAS (DESMAOA).
For an impartial comparison, the number of iterations and the population size for all tests were set to 500 and 30, respectively. Moreover, we conducted 30 independent tests for each algorithm. The averages and standard deviations were used for the analysis and comparison of these algorithms. The results are listed in Table 3. Note that the dimension of F1–F13 was set to 30.
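The test protocol above (30 independent runs, then average and standard deviation of the best value found) can be illustrated with the following snippet. A trivial random-sampling search on the sphere function F1 stands in for the actual optimizers, purely to show how the statistics in Table 3 are computed.

```python
import numpy as np

def random_search_sphere(dim=30, iters=500, pop=30, lb=-100.0, ub=100.0, rng=None):
    """Toy stand-in optimizer: best sphere (F1) value found by pure random
    sampling over `iters` iterations of `pop` candidates each."""
    if rng is None:
        rng = np.random.default_rng()
    best = np.inf
    for _ in range(iters):
        X = rng.uniform(lb, ub, (pop, dim))
        best = min(best, float(np.min(np.sum(X**2, axis=1))))
    return best

# 30 independent runs; report average and standard deviation, as in Table 3
results = [random_search_sphere(rng=np.random.default_rng(s)) for s in range(30)]
print(np.mean(results), np.std(results))
```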
From Table 3, it can be seen that the four improved algorithms obtained the same optimal fitness in F1–F4, F9, F10, F14, and F16–F18. In particular, the theoretical optimal values were obtained in F1–F4, F9, and F18. Compared to SMAOA, SMAOA1, and SMAOA2, DESMAOA won in F5–F8, F11, F13, F15, and F20–F23. This demonstrates the significant effect of combining the RCS and SAS. In addition, it is worth mentioning that the results of DESMAOA in F12 and F19 were very close to the best ones. The standard deviations also show that DESMAOA had good stability and strong robustness in solving these test functions.

4.2. The Classical Benchmark Functions

This section outlines the 23 classical test functions that were employed for the experiments. The performance of DESMAOA was compared with the two recently proposed algorithms (SMA and AOA) and five other well-known optimization algorithms (GWO [38], WOA [39], SSA [40], MVO [5], and PSO [12]). Table 4 lists the main parameter values used in each algorithm. Note that the parameter values used in DESMAOA were the same as those used in the two original algorithms; therefore, stable performance could be guaranteed to some extent for the proposed algorithm. In addition, the test conditions were kept the same as before for a fair comparison.

4.2.1. Exploration and Exploitation Capability Analysis

Table 5 lists the experimental results of these algorithms. The performance of DESMAOA was not only better than the original SMA and AOA but also superior to the other comparative algorithms on 20 out of 23 benchmark functions. In F1–F5, F7–F15, F22, and F23, DESMAOA had the lowest average values and standard deviations. This reveals that DESMAOA possesses very good stability and can also find the optimal solution. It is worth noting that the proposed algorithm obtained the theoretical optimal values in test functions F1–F4, F9, F11, and F18. In F6, F19, and F20, the results of DESMAOA were very close to the best ones. Therefore, these results demonstrate the remarkable effect of the proposed hybrid method. With the help of the RCS, the proposed algorithm can jump out of local minima and obtain the global optimal solution; in the meantime, high-precision results can be obtained by using the SAS.
In addition, the Wilcoxon signed-rank test was utilized to confirm the statistical superiority of DESMAOA [41]; it reveals the statistical differences between two algorithms. The results are given in Table 6. On the basis of these results and the results in Table 5, DESMAOA outperformed SMA on 15 benchmark functions (except F1, F3, F7, F9, F10, F11, F15, and F20) and AOA on 20 benchmark functions (except F7, F15, and F17). Moreover, DESMAOA was found to be better than the other comparative algorithms on most of the functions. Furthermore, the results were also evaluated using the Friedman ranking test [42], which reveals the overall performance ranking of the comparative algorithms on the test functions. As can be seen from Figure 2, the proposed DESMAOA achieved the first rank among these algorithms. In summary, DESMAOA had excellent optimization performance that was significantly better than that of SMA and AOA.
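Both statistical procedures are available off the shelf; the snippet below sketches how such tests are typically run on per-run best-fitness samples. The data here are synthetic stand-ins, not results from this paper.

```python
import numpy as np
from scipy.stats import wilcoxon, friedmanchisquare

rng = np.random.default_rng(0)
# Hypothetical per-run best fitness for three algorithms on one function (30 runs each)
a = rng.normal(1.0, 0.1, 30)            # stand-in for DESMAOA
b = rng.normal(1.5, 0.1, 30)            # stand-in for SMA
c = rng.normal(1.6, 0.1, 30)            # stand-in for AOA

stat, p = wilcoxon(a, b)                # paired signed-rank test: DESMAOA vs. SMA
chi2, p_f = friedmanchisquare(a, b, c)  # rank-based comparison across all three
print(p < 0.05, p_f < 0.05)
```

A p-value below the significance level (commonly 0.05) indicates a statistically significant difference between the paired algorithms.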

4.2.2. Qualitative Analysis

Figure 3 shows the qualitative results of the proposed algorithm on F4, F5, F6, F8, F12, F13, F15, and F21. From the scatter plot of the search history, we can see that the search agents were distributed over the whole search space in the early stage. As the iterations progressed, they converged quickly. The density of the distribution for different functions indicates that DESMAOA had a balanced performance between exploration and exploitation. Moreover, some sudden changes in amplitude can be observed clearly in the trajectory of the first search agent, which reveals that DESMAOA had strong exploration capability over the course of the iterations when handling these test functions. The drastic fluctuation of the average fitness also shows that DESMAOA can jump out of local optima and explore more of the space when dealing with different types of optimization problems. Hence, local optimal solutions can be avoided effectively. Finally, DESMAOA was able to find better solutions on most of the functions compared with SMA and AOA, which demonstrates the effectiveness of the proposed method.

4.2.3. Analysis of Convergence Behavior

It is important to study the convergence behavior of optimization algorithms as they search for the optimal solution. In general, fast convergence is required in the early exploration, which implies that the algorithm has powerful exploration capability. On the other hand, local optima should also be avoided, which can be seen from the convergence curve. Figure 4 shows the convergence curves of DESMAOA and the other compared algorithms on 30 dimensions for benchmark functions F4, F5, F6, F8, F12, F13, F15, and F21. It can be seen that the initial convergence speed of DESMAOA was the fastest in most cases. Step-like or cliff-like declines can be observed in the convergence curves of DESMAOA, which suggests that DESMAOA has a prominent exploration capability. On F5, F6, F12, and F13, the precision of the solutions of DESMAOA was further improved during the iterations with the help of the SAS. In sum, DESMAOA achieved the best solutions on these functions.

4.2.4. Scalability Test

The performance fluctuations of optimization algorithms can be revealed by a scalability test. In this work, the performance of DESMAOA in different dimensions (D = 50, 200, 1000) was also tested. It is easy to understand that a higher dimension makes it harder for an algorithm to find the global optimal solution. Note that only F1–F13 of the 23 benchmark functions were selected for this test. As mentioned previously, F1–F7 are unimodal functions that have only one optimum. In contrast, F8–F13 are multimodal functions that have many local optima. Moreover, the experimental parameters were kept the same as in the previous experiments. Table 7 and Table 8 show the results of DESMAOA and the other algorithms in different dimensions.
The results on both the unimodal and multimodal functions indicate that DESMAOA had excellent performance at high dimensions. Compared with SMA, AOA, and the other well-known algorithms, DESMAOA ranked first on all functions except F7, on which AOA was better and gave more stable results across dimensions. It is also noted that the comparative algorithms (GWO, WOA, SSA, MVO, and PSO) presented poor optimization capability in some cases, especially at higher dimensions. Furthermore, the Wilcoxon signed-rank test and the Friedman ranking test were utilized to analyze the differences between DESMAOA and the other algorithms, as listed in Table 9, Table 10, Table 11 and Table 12. From Table 9, Table 10 and Table 11, it can be seen that DESMAOA had significant differences from the comparative algorithms. Moreover, in Table 12, the proposed DESMAOA ranked first among the algorithms in the different dimensions, and the gap between the first and second ranks was evident. In summary, the proposed DESMAOA had better optimization behavior and stability when dealing with high-dimensional problems.

4.3. The IEEE CEC2021 Standard Test Functions

This section describes the IEEE CEC2021 test functions that were employed to further analyze the performance of proposed DESMAOA on solving global optimization problems. The comparative algorithms included the SMA, AOA, GWO, WOA, SSA, MVO, and PSO. To achieve the statistical results, 30 repeated independent tests were conducted for each function. The experimental results are given in Table 13.
From Table 13, it can be observed that DESMAOA obtained the best results on six functions: CEC_02, CEC_05, CEC_06, CEC_08, CEC_09, and CEC_10. Thus, DESMAOA has good performance on the hybrid and composition test functions. Compared with the other optimization algorithms, DESMAOA showed very competitive performance on these CEC2021 test functions. Moreover, the Friedman ranking test was also used to evaluate the performance of DESMAOA; the average rank and the final rank are also given in Table 13. It can be seen that DESMAOA obtained the best statistical ranking among these algorithms.
Therefore, the results on the CEC2021 test functions also demonstrate the high performance of DESMAOA in solving optimization problems.

5. Applicability for Solving Engineering Design Problems

This section reports the three classical engineering design problems we employed to evaluate the capability of DESMAOA to solve practical problems, which were the pressure vessel design problem, three-bar truss design problem, and tension/compression spring design problem. In the same way, 30 search agents and 500 iterations were utilized in the design procedure of engineering problems for a fair comparison. Meanwhile, other related results of optimization algorithms proposed by scholars are also given and compared with proposed algorithm here. Detailed descriptions are shown below.

5.1. Pressure Vessel Design

The design of the pressure vessel is an optimization problem with four variables and four constraints from the industrial field [43]; the ultimate goal is the lowest cost of the pressure vessel. The structure of the pressure vessel is shown in Figure 5. The four design variables are the thickness of the shell (Ts), the thickness of the head (Th), the inner radius (R), and the length of the cylindrical section (L). Table 14 lists the comparison between DESMAOA and the other competitor algorithms. From Table 14, we can see that DESMAOA was capable of finding the optimal solution with the lowest cost.
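For reference, the standard pressure-vessel formulation commonly used with [43] is sketched below. This is the widely published objective and constraint set, not taken from this paper's tables, and the design evaluated at the end is a commonly reported near-optimal solution from the literature, given for illustration only.

```python
import math

def pressure_vessel_cost(Ts, Th, R, L):
    """Total cost: material, forming, and welding terms (standard formulation)."""
    return (0.6224 * Ts * R * L + 1.7781 * Th * R**2
            + 3.1661 * Ts**2 * L + 19.84 * Ts**2 * R)

def pressure_vessel_constraints(Ts, Th, R, L):
    """Constraints g1-g4; each must be <= 0 for a feasible design."""
    return [
        -Ts + 0.0193 * R,                                # shell thickness vs. radius
        -Th + 0.00954 * R,                               # head thickness vs. radius
        -math.pi * R**2 * L - (4.0 / 3.0) * math.pi * R**3 + 1296000.0,  # volume
        L - 240.0,                                       # length limit
    ]

# A commonly reported near-optimal design (illustrative, not this paper's solution)
print(pressure_vessel_cost(0.8125, 0.4375, 42.0984, 176.6366))   # ≈ 6.06e3
```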

5.2. Three-Bar Truss Design

The aim of the three-bar truss design, a problem from the field of civil engineering [48], is to achieve the lowest weight of the truss under stress, deflection, and buckling constraints. This design problem involves two parameters, x1 (or A1) and x2 (or A2), as shown in Figure 6. The solutions obtained by DESMAOA and other representative algorithms are listed in Table 15; the proposed hybrid method apparently outperformed the other approaches. Moreover, 30 repeated runs were performed to evaluate the robustness of the proposed algorithm. The worst value, mean value, best value, and standard deviation were 2.639079 × 102, 2.638562 × 102, 2.638523 × 102, and 1.0451 × 10−2, respectively. Hence, the statistical results reveal that the proposed algorithm had very stable and superior performance on this design problem.
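The underlying formulation [48] is compact; a sketch with the parameter values usually used for this benchmark (bar length l = 100 cm, load P = 2 kN/cm2, stress limit sigma = 2 kN/cm2) is:

```python
import math

def truss_weight(x):
    """Weight of the three-bar truss for cross-sections x = [A1, A2]."""
    A1, A2 = x
    return (2.0 * math.sqrt(2.0) * A1 + A2) * 100.0

def truss_constraints(x, P=2.0, sigma=2.0):
    """Stress constraints on the three bars; feasible when all <= 0."""
    A1, A2 = x
    s2 = math.sqrt(2.0)
    denom = s2 * A1 ** 2 + 2.0 * A1 * A2
    return [
        (s2 * A1 + A2) / denom * P - sigma,
        A2 / denom * P - sigma,
        1.0 / (A1 + s2 * A2) * P - sigma,
    ]
```

At the solution commonly reported in the literature, A1 ≈ 0.788675 and A2 ≈ 0.408248, the weight evaluates to about 263.896, consistent with the statistics quoted above.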

5.3. Tension/Compression Spring Design

In the design of a tension/compression spring [51], the objective is to minimize the spring weight under three constraints: (1) shear stress, (2) surge frequency, and (3) deflection. As shown in Figure 7, three variables need to be considered: the wire diameter (d), the mean coil diameter (D), and the number of active coils (N). The results of DESMAOA and other comparative algorithms are listed in Table 16. By comparison, the proposed DESMAOA achieved the best solution for this problem, namely 5.44827 × 10−2, 4.83109 × 10−1, and 5.746128 × 100 for d, D, and N, respectively, with an optimal weight of 1.11083 × 10−2.
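The standard formulation of this problem [51] can be sketched as follows, with x = [d, D, N] and feasibility meaning every constraint value is <= 0. Note that, besides the three constraints named above, formulations in the literature usually also include an outer-diameter limit (the fourth entry below):

```python
def spring_weight(x):
    """Weight of the tension/compression spring for x = [d, D, N]."""
    d, D, N = x
    return (N + 2.0) * D * d ** 2

def spring_constraints(x):
    """Deflection, shear stress, surge frequency, and outer-diameter
    constraints; the design is feasible when all values are <= 0."""
    d, D, N = x
    return [
        1.0 - D ** 3 * N / (71785.0 * d ** 4),
        (4.0 * D ** 2 - d * D) / (12566.0 * (D ** 3 * d - d ** 4))
            + 1.0 / (5108.0 * d ** 2) - 1.0,
        1.0 - 140.45 * d / (D ** 2 * N),
        (D + d) / 1.5 - 1.0,
    ]
```

Evaluating spring_weight at the solution reported above, (5.44827 × 10−2, 4.83109 × 10−1, 5.746128), reproduces the stated weight of about 1.11083 × 10−2.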

6. Conclusions and Future Works

To overcome the shortcomings of basic meta-heuristic algorithms, this paper presented an effective deep ensemble of two very recent optimization algorithms, the SMA and AOA. A preliminary hybrid of these two algorithms was first constructed to enhance the exploration capability. Then, two strategies were integrated into the hybridized algorithm to help it escape local minima and improve solution accuracy. The performance of the proposed DESMAOA was extensively analyzed using 23 classical test functions.
First, different combinations of SMAOA and the two strategies were analyzed and discussed; the results revealed the effectiveness of the proposed strategies. Then, the results of DESMAOA were compared with those of SMA, AOA, and five well-known algorithms. The comparison showed that the proposed method inherits the advantages of both SMA and AOA and is evidently better than the other comparison algorithms. Afterward, experiments in high-dimensional settings (50, 200, and 1000 dimensions) were conducted for these algorithms, and the scalability tests also confirmed the superior performance of the proposed method. Finally, DESMAOA was applied to three engineering design problems. The results show that the proposed method solves these problems well; in particular, it was very stable on the three-bar truss design problem.
As future perspectives, DESMAOA can be applied to optimization problems in other disciplines, such as feature selection, the training of multi-layer perceptron neural networks, and image processing. Another direction is to apply this hybridization scheme to other optimization algorithms for better optimization performance.

Author Contributions

Conceptualization, R.Z. and H.J.; methodology, R.Z. and H.J.; software, R.Z. and S.W.; validation, R.Z., H.J. and L.A.; formal analysis, R.Z. and S.W.; investigation, R.Z. and H.J.; resources, R.Z., H.J. and L.A.; data curation, R.Z.; writing—original draft preparation, R.Z.; writing—review and editing, R.Z. and H.J.; visualization, Q.L.; supervision, H.J. and L.A.; project administration, R.Z. and H.J.; funding acquisition, R.Z. and H.J. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Sanming University High-level Talents Scientific Research Start-up Funding Project (21YG01, 20YG14), the Fujian Natural Science Foundation Project (2021J011128), the Guiding Science and Technology Projects in Sanming City (2021-S-8), the Educational Research Projects of Young and Middle-aged Teachers in Fujian Province (JAT200618), the Scientific Research and Development Fund of Sanming University (B202009), and the Open Research Fund Program of the Fujian Provincial Key Laboratory of Agriculture Internet of Things Application (ZD2101).

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Abualigah, L.; Diaba, A. Advances in sine cosine algorithm: A comprehensive survey. Artif. Intell. Rev. 2021, 54, 2567–2608. [Google Scholar] [CrossRef]
  2. Abualigah, L.; Diaba, A. A comprehensive survey of the Grasshopper optimization algorithm: Results, variants, and applications. Neural Comput. Appl. 2020, 32, 15533–15556. [Google Scholar] [CrossRef]
  3. Jia, H.; Lang, C.; Oliva, D.; Song, W.; Peng, X. Dynamic Harris Hawks Optimization with Mutation Mechanism for Satellite Image Segmentation. Remote Sens. 2019, 11, 1421. [Google Scholar] [CrossRef] [Green Version]
  4. Jia, H.; Peng, X.; Lang, C. Remora optimization algorithm. Expert Syst. Appl. 2021, 185, 115665. [Google Scholar] [CrossRef]
  5. Mirjalili, S.; Mirjalili, S.M.; Hatamlou, A. Multi-verse optimizer: A nature-inspired algorithm for global optimization. Neural Comput. Appl. 2016, 27, 495–513. [Google Scholar] [CrossRef]
  6. Rashedi, E.; Nezamabadi-pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  7. Kaveh, A.; Dadras, A. A novel meta-heuristic optimization algorithm: Thermal exchange optimization. Adv. Eng. Softw. 2017, 110, 69–84. [Google Scholar] [CrossRef]
  8. Asef, F.; Majidnezhad, V.; Feizi-Derakhshi, M.R.; Parsa, S. Heat transfer relation-based optimization algorithm (HTOA). Soft. Comput. 2021, 25, 8129–8158. [Google Scholar] [CrossRef]
  9. Corriveau, G.; Guilbault, R.; Tahan, A.; Sabourin, R. Bayesian network as an adaptive parameter setting approach for genetic algorithms. Complex Intell. Syst. 2016, 2, 1–22. [Google Scholar] [CrossRef] [Green Version]
  10. Storn, R.; Price, K. Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar]
  11. Yao, X.; Liu, Y.; Lin, G. Evolutionary Programming Made Faster. IEEE Trans. Evol. Comput. 1999, 3, 82–102. [Google Scholar]
  12. Chen, G.; Yu, J. Particle swarm optimization algorithm. Inf. Control 2005, 186, 454–458. [Google Scholar]
  13. Gaurav, D.; Vijay, K. Emperor penguin optimizer: A bio-inspired algorithm for engineering problems. Knowl. Based Syst. 2018, 159, 20–50. [Google Scholar]
  14. Abualigah, L.; Yousri, D.; Elaziz, M.A.; Ewees, A.A.; Al-qaness, M.A.A.; Gandomi, A.H. Aquila optimizer: A novel meta-heuristic optimization Algorithm. Comput. Ind. Eng. 2021, 157, 107250. [Google Scholar] [CrossRef]
  15. Faramarzi, A.; Heidarinejad, M.; Mirjalili, S.; Gandomi, A.H. Marine predators algorithm: A nature-inspired metaheuristic. Expert Syst. Appl. 2020, 152, 113377. [Google Scholar] [CrossRef]
  16. Rao, R.V.; Savsani, V.J.; Vakharia, D.P. Teaching-learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput. Aided Des. 2011, 43, 303–315. [Google Scholar] [CrossRef]
  17. Satapathy, S.; Naik, A. Social group optimization (SGO): A new population evolutionary optimization technique. Complex Intell. Syst. 2016, 2, 173–203. [Google Scholar] [CrossRef] [Green Version]
  18. Al-Betar, M.A. β-hill climbing: An exploratory local search. Neural Comput. Appl. 2017, 28, 153–168. [Google Scholar] [CrossRef]
  19. Martínez-Lvarez, F.; Asencio-Cortés, G.; Torres, J.F.; Gutiérrez-Avilés, D.; Melgar-García, L.; Pérez-Chacón, R.; Rubio-Escudero, C.; Riquelme, J.C.; Troncoso, A. Coronavirus Optimization Algorithm: A bioinspired metaheuristic based on the COVID-19 propagation model. Big Data 2020, 8, 308–322. [Google Scholar] [CrossRef] [PubMed]
  20. Hussain, A.; Muhammad, Y.S. Trade-off between exploration and exploitation with genetic algorithm using a novel selection operator. Complex Intell. Syst. 2019, 6, 1–14. [Google Scholar] [CrossRef] [Green Version]
  21. Wolpert, D.H.; Macready, W.G. No Free Lunch Theorems for Optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef] [Green Version]
  22. Shehadeh, H.A. A hybrid sperm swarm optimization and gravitational search algorithm (HSSOGSA) for global optimization. Neural Comput. Appl. 2021, 33, 11739–11752. [Google Scholar]
  23. Kaveh, A.; Rahmani, P.; Eslamlou, A.D. An efficient hybrid approach based on Harris Hawks optimization and imperialist competitive algorithm for structural optimization. Eng. Comput. 2021, 277, 1–29. [Google Scholar]
  24. Dhiman, G.; Kaur, A. A hybrid algorithm based on particle swarm and spotted hyena optimizer for global optimization. In Soft Computing for Problem Solving; Springer: Singapore, 2019; Volume 1, pp. 599–615. [Google Scholar]
  25. Dhiman, G. SSC: A hybrid nature-inspired meta-heuristic optimization algorithm for engineering applications. Knowl. Based Syst. 2021, 222, 106926. [Google Scholar] [CrossRef]
  26. Banaie-Dezfouli, M.; Nadimi-Shahraki, M.H.; Beheshti, Z. R-GWO: Representative-based grey wolf optimizer for solving engineering problems. Appl. Soft Comput. 2021, 106, 107328. [Google Scholar] [CrossRef]
  27. Zhang, H.; Wang, Z.; Chen, W.; Heidari, A.A.; Wang, M.; Zhao, X.; Liang, G.; Chen, H.; Zhang, X. Ensemble mutation-driven salp swarm algorithm with restart mechanism: Framework and fundamental analysis. Expert Syst. Appl. 2021, 165, 113897. [Google Scholar] [CrossRef]
  28. Yu, C.; Heidari, A.A.; Xue, X.; Zhang, L.; Chen, H.; Chen, W. Boosting quantum rotation gate embedded slime mould algorithm. Expert Syst. Appl. 2021, 181, 115082. [Google Scholar] [CrossRef]
  29. Zhang, H.; Cai, Z.; Ye, X.; Wang, M.; Kuang, F.; Chen, H.; Li, C.; Li, Y. A multi-strategy enhanced salp swarm algorithm for global optimization. Eng. Comput. 2020. [Google Scholar] [CrossRef]
  30. Che, Y.; He, D. A Hybrid Whale Optimization with Seagull Algorithm for Global Optimization Problems. Math. Probl. Eng. 2021, 2021, 1–31. [Google Scholar]
  31. Hassan, B.A. CSCF: A chaotic sine cosine firefly algorithm for practical application problems. Neural Comput. Appl. 2021, 33, 7011–7030. [Google Scholar] [CrossRef]
  32. Yue, S.; Zhang, H. A hybrid grasshopper optimization algorithm with bat algorithm for global optimization. Multimed. Tools Appl. 2021, 80, 3863–3884. [Google Scholar] [CrossRef]
  33. Li, S.; Chen, H.; Wang, M.; Heidari, A.A.; Mirjalili, S. Slime Mould Algorithm: A new method for stochastic optimization. Future Gener. Comput. Syst. 2020, 111, 300–323. [Google Scholar] [CrossRef]
  34. Abualigah, L.; Diabat, A.; Mirjalili, S.; Abd Elaziz, M.; Gandomi, A.H. The Arithmetic Optimization Algorithm. Comput. Methods Appl. Mech. Eng. 2021, 376, 113609. [Google Scholar] [CrossRef]
  35. Mirjalili, S. SCA: A Sine Cosine algorithm for solving optimization problems. Knowl. Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  36. Molga, M.; Smutnicki, C. Test Functions for Optimization Needs. 2005. Available online: http://www.zsd.ict.pwr.wroc.pl/files/docs/functions.pdf (accessed on 1 October 2021).
  37. Mohamed, A.W.; Hadi, A.A.; Mohamed, A.K.; Agrawal, P.; Kumar, A.; Suganthan, P.N. Problem Definitions and Evaluation Criteria for the CEC 2021 Special Session and Competition on Single Objective Bound Constrained Numerical Optimization. Cairo University. Tech. Rep. 2020. Available online: http://home.elka.pw.edu.pl/~ewarchul/cec2021-specification.pdf (accessed on 1 October 2021).
  38. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef] [Green Version]
  39. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  40. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp swarm algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [Google Scholar] [CrossRef]
  41. Demsar, J. Statistical comparisons of classifiers over multiple data sets. J. Mach. Learn. Res. 2006, 7, 1–30. [Google Scholar]
  42. Garcia, S.; Fernandez, A.; Luengo, J.; Herrera, F. Advanced nonparametric tests for multiple comparisons in the design of experiments in computational intelligence and data mining: Experimental analysis of power. Inform. Sci. 2010, 180, 2044–2064. [Google Scholar] [CrossRef]
  43. Kannan, B.; Kramer, S.N. An augmented lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. J. Mech. Des. 1994, 116, 405–411. [Google Scholar] [CrossRef]
  44. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl. Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  45. Rizk-Allah, R.M. Hybridizing sine cosine algorithm with multi-orthogonal search strategy for engineering design problems. J. Comput. Des. Eng. 2018, 5, 249–273. [Google Scholar] [CrossRef]
  46. Zhou, Y.; Ling, Y.; Luo, Q. Lévy flight trajectory-based whale optimization algorithm for engineering optimization. Eng. Comput. 2018, 35, 2406–2428. [Google Scholar] [CrossRef]
  47. Pelusi, D.; Mascella, R.; Tallini, L.; Nayak, J.; Naik, B.; Deng, Y. An Improved Moth-Flame Optimization algorithm with hybrid search phase. Knowl. Based Syst. 2020, 191, 105277. [Google Scholar] [CrossRef]
  48. Sadollah, A.; Bahreininejad, A.; Eskandar, H.; Hamdi, M. Mine blast algorithm: A new population based algorithm for solving constrained engineering optimization problems. Appl. Soft Comput. 2013, 13, 2592–2612. [Google Scholar] [CrossRef]
  49. Liu, H.; Cai, Z.; Wang, Y. Hybridizing particle swarm optimization with differential evolution for constrained numerical and engineering optimization. Appl. Soft Comput. 2010, 10, 629–640. [Google Scholar] [CrossRef]
  50. Singh, N.; Kaur, J. Hybridizing sine-cosine algorithm with harmony search strategy for optimization design problems. Soft. Comput. 2021. [Google Scholar] [CrossRef]
  51. Baykasoğlu, A.; Akpinar, S. Weighted superposition attraction (WSA): A swarm intelligence algorithm for optimization problems–part2: Constrained optimization. Appl. Soft Comput. 2015, 37, 396–415. [Google Scholar] [CrossRef]
Figure 1. Flowchart of the proposed DESMAOA.
Figure 2. Average Friedman ranking values of DESMAOA and other comparative algorithms on 30 dimensions.
Figure 3. Qualitative results for the benchmark functions F4, F5, F6, F8, F12, F13, F15, and F21.
Figure 4. The convergence curves of F4, F5, F6, F8, F12, F13, F15, and F21.
Figure 5. Pressure vessel design problem: model diagram (left) and structure parameters (right).
Figure 6. Three-bar truss design problem: model diagram (left) and structure parameters (right).
Figure 7. Tension/compression spring design problem: model diagram (left) and structure parameters (right).
Table 1. A summary of methods for improving the optimization algorithms developed in the literature.
Name of Method | Representative Algorithm | Description
Hybridize two or more algorithms | Hybrid sperm swarm optimization and gravitational search algorithm (HSSOGSA) [22] | The capability of exploitation in SSO and the capability of exploration in GSA are combined for better performance.
 | Imperialist competitive Harris hawks optimization (ICHHO) [23] | The exploration of ICA is utilized to improve the HHO for global optimization.
 | Hybrid particle swarm and spotted hyena optimizer (HPSSHO) [24] | Particle swarm algorithm is used to improve the hunting strategy of spotted hyena optimizer.
 | Sine-cosine and spotted hyena-based chimp optimization algorithm (SSC) [25] | Sine-cosine functions and the attacking strategy of SHO are embedded in ChOA for better exploration and exploitation.
Add one or more strategies onto an algorithm | Representative-based grey wolf optimizer (R-GWO) [26] | A search strategy named representative-based hunting (RH) is utilized to improve the exploration and diversity of the population.
 | Reinforced salp swarm algorithm (CMSRSSSA) [27] | An ensemble/composite mutation strategy (CMS) is applied to boost the exploitation and exploration speed of SSA, while a restart strategy (RS) is used to escape local optima.
 | Boosting quantum rotation gate embedded slime mold algorithm (WQSMA) [28] | The quantum rotation gate mechanism and an operation from the water cycle are applied to balance the exploration and exploitation inclinations.
 | Enhanced salp swarm algorithm (ESSA) [29] | Orthogonal learning, quadratic interpolation, and generalized oppositional learning are embedded into SSA to boost the global exploration and local exploitation.
Hybridize two or more algorithms, further improved by one or more strategies | Whale optimization with seagull algorithm (WSOA) [30] | WOA's contraction surrounding mechanism and SOA's spiral attack behavior work together, and a Levy flight strategy is employed in the search process of SOA.
 | Chaotic sine-cosine firefly (CSCF) algorithm [31] | Chaotic forms of SCA and FA are integrated to improve the convergence speed and efficiency.
 | Hybrid grasshopper optimization algorithm with bat algorithm (BGOA) [32] | Levy flight, the local search part of BA, and a random strategy are introduced into the basic GOA.
Table 2. Benchmark function properties (D indicates dimension).
Function Type | Function | Dimension | Range | Theoretical Optimization Value
Unimodal test functions | F1 | 30, 50, 200, 1000 | [−100, 100] | 0
 | F2 | 30, 50, 200, 1000 | [−10, 10] | 0
 | F3 | 30, 50, 200, 1000 | [−100, 100] | 0
 | F4 | 30, 50, 200, 1000 | [−100, 100] | 0
 | F5 | 30, 50, 200, 1000 | [−30, 30] | 0
 | F6 | 30, 50, 200, 1000 | [−100, 100] | 0
 | F7 | 30, 50, 200, 1000 | [−1.28, 1.28] | 0
Multimodal test functions | F8 | 30, 50, 200, 1000 | [−500, 500] | −418.9829 × D
 | F9 | 30, 50, 200, 1000 | [−5.12, 5.12] | 0
 | F10 | 30, 50, 200, 1000 | [−32, 32] | 0
 | F11 | 30, 50, 200, 1000 | [−600, 600] | 0
 | F12 | 30, 50, 200, 1000 | [−50, 50] | 0
 | F13 | 30, 50, 200, 1000 | [−50, 50] | 0
Fixed-dimension multimodal test functions | F14 | 2 | [−65, 65] | 0.998004
 | F15 | 4 | [−5, 5] | 0.0003075
 | F16 | 2 | [−5, 5] | −1.03163
 | F17 | 2 | [−5, 5] | 0.398
 | F18 | 2 | [−2, 2] | 3
 | F19 | 3 | [−1, 2] | −3.8628
 | F20 | 6 | [0, 1] | −3.3220
 | F21 | 4 | [0, 10] | −10.1532
 | F22 | 4 | [0, 10] | −10.4028
 | F23 | 4 | [0, 10] | −10.5363
CEC2021 unimodal test functions | CEC_01 | 10 | [−100, 100] | 100
CEC2021 basic test functions | CEC_02 | 10 | [−100, 100] | 1100
 | CEC_03 | 10 | [−100, 100] | 700
 | CEC_04 | 10 | [−100, 100] | 1900
CEC2021 hybrid test functions | CEC_05 | 10 | [−100, 100] | 1700
 | CEC_06 | 10 | [−100, 100] | 1600
 | CEC_07 | 10 | [−100, 100] | 2100
CEC2021 composition test functions | CEC_08 | 10 | [−100, 100] | 2200
 | CEC_09 | 10 | [−100, 100] | 2400
 | CEC_10 | 10 | [−100, 100] | 2500
Table 3. Comparison of the SMAOA, SMAOA1, SMAOA2, and DESMAOA.
Function | SMAOA (Mean, Std) | SMAOA1 (Mean, Std) | SMAOA2 (Mean, Std) | DESMAOA (Mean, Std)
F10.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 100
F20.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 100
F30.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 100
F40.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 100
F52.82 × 1007.00 × 1005.85 × 10−11.07 × 1002.46 × 10−11.33 × 1001.17 × 10−31.55 × 10−3
F62.44 × 10−22.34 × 10−27.77 × 10−31.21 × 10−25.80 × 10−61.92 × 10−64.95 × 10−62.01 × 10−6
F71.16 × 10−49.72 × 10−55.67 × 10−55.61 × 10−56.72 × 10−56.77 × 10−54.27 × 10−54.76 × 10−5
F8−12,569.23611.89 × 10−1−12,569.32291.38 × 10−1−12,569.48664.96 × 10−6−12,569.48664.11 × 10−6
F90.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 1000.00 × 100
F108.8818 × 10−160.00 × 1008.8818 × 10−160.00 × 1008.8818 × 10−160.00 × 1008.8818 × 10−160.00 × 100
F112.13 × 10−12.92 × 10−10.00 × 1000.00 × 1008.85 × 10−32.30 × 10−20.00 × 1000.00 × 100
F122.63 × 10−34.65 × 10−35.08 × 10−58.33 × 10−54.84 × 10−87.54 × 10−81.09 × 10−71.59 × 10−7
F131.70 × 10−23.67 × 10−28.82 × 10−41.15 × 10−32.50 × 10−36.04 × 10−35.91 × 10−71.06 × 10−6
F149.98 × 10−12.60 × 10−119.98 × 10−19.58 × 10−129.98 × 10−19.91 × 10−169.98 × 10−18.05 × 10−16
F154.16 × 10−41.56 × 10−43.63 × 10−49.61 × 10−54.07 × 10−41.90 × 10−43.34 × 10−48.63 × 10−5
F16−1.0316 × 1003.97 × 10−8−1.0316 × 1009.46 × 10−8−1.0316 × 1001.67 × 10−11−1.0316 × 1001.87 × 10−11
F173.9789 × 10−16.82 × 10−73.9789 × 10−13.97 × 10−73.9789 × 10−15.63 × 10−123.9789 × 10−15.94 × 10−12
F183.00 × 1002.79 × 10−93.00 × 1006.69 × 10−103.00 × 1008.56 × 10−113.00 × 1009.42 × 10−11
F19−3.8627 × 1004.32 × 10−5−3.8628 × 1004.68 × 10−5−3.8628 × 1005.61 × 10−5−3.8627 × 1007.91 × 10−5
F20−3.25 × 1005.98 × 10−2−3.2859 × 1005.59 × 10−2−3.2583 × 1006.06 × 10−2−3.286 × 1005.59 × 10−2
F21−1.01528 × 1014.26 × 10−4−1.01529 × 1013.67 × 10−4−1.01531 × 1019.07 × 10−5−1.01531 × 1011.42 × 10−4
F22−1.04023 × 1014.61 × 10−4−1.04025 × 1014.47 × 10−4−1.04028 × 1018.31 × 10−5−1.04028 × 1017.38 × 10−5
F23−1.0536 × 1013.47 × 10−4−1.05362 × 1012.47 × 10−4−1.05363 × 1019.44 × 10−5−1.05363 × 1018.26 × 10−5
Table 4. Parameter values for the optimization algorithms.
Algorithm | Parameter Settings
DESMAOA | z = 0.03; α = 5; μ = 0.499
SMA [33] | z = 0.03
AOA [34] | α = 5; μ = 0.499; Min = 0.2; Max = 1
GWO [38] | a = [2, 0]
WOA [39] | a1 = [2, 0]; a2 = [−2, −1]; b = 1
SSA [40] | c1 ∈ [0, 1]; c2 ∈ [0, 1]
MVO [5] | WEP ∈ [0.2, 1]; TDR ∈ [0, 1]; r1, r2, r3 ∈ [0, 1]
PSO [12] | c1 = 2; c2 = 2; W ∈ [0.2, 0.9]; vMax = 6
Table 5. The result statistics of benchmark functions for the DESMAOA and competitor algorithms.
Function | Metric | DESMAOA | SMA | AOA | GWO | WOA | SSA | MVO | PSO
F1Mean0.00 × 1009.93 × 10−3025.37 × 10−67.21 × 10−282.42 × 10−733.96 × 10−71.34 × 1001.76 × 10−4
Std0.00 × 1000.00 × 1002.14 × 10−61.17 × 10−278.81 × 10−739.50 × 10−75.38 × 10−11.82 × 10−4
F2Mean0.00 × 1005.05 × 10−1381.74 × 10−38.26 × 10−178.81 × 10−522.07 × 1002.20 × 1007.05 × 100
Std0.00 × 1002.77 × 10−1372.08 × 10−36.54 × 10−172.46 × 10−511.33 × 1007.31 × 1007.01 × 100
F3Mean0.00 × 1005.43 × 10−3231.24 × 10−31.55 × 10−54.41 × 1041.66 × 1032.04 × 1027.93 × 101
Std0.00 × 1000.00 × 1008.14 × 10−43.50 × 10−51.08 × 1049.24 × 1026.63 × 1012.57 × 101
F4Mean0.00 × 1007.56 × 10−1541.53 × 10−28.03 × 10−74.68 × 1011.15 × 1012.16 × 1001.12 × 100
Std0.00 × 1004.14 × 10−1531.06 × 10−26.71 × 10−72.77 × 1014.04 × 1008.66 × 10−12.40 × 10−1
F5Mean1.17 × 10−38.56 × 1002.79 × 1012.71 × 1012.82 × 1012.90 × 1027.89 × 1028.16 × 101
Std1.55 × 10−31.21 × 1013.01 × 10−18.49 × 10−14.97 × 10−14.77 × 1028.74 × 1027.03 × 101
F6Mean4.95 × 10−65.74 × 10−33.06 × 1007.58 × 10−13.72 × 10−11.78 × 10−71.34 × 1001.37 × 10−4
Std2.01 × 10−63.38 × 10−32.69 × 10−14.94 × 10−12.18 × 10−11.51 × 10−73.43 × 10−11.65 × 10−4
F7Mean4.27 × 10−51.24 × 10−46.74 × 10−51.69 × 10−33.15 × 10−31.73 × 10−13.21 × 10−22.55 × 100
Std4.76 × 10−51.07 × 10−47.11 × 10−58.95 × 10−43.61 × 10−35.61 × 10−21.32 × 10−24.54 × 100
F8Mean−12,569.4866−12,569.1799−5.48 × 103−6.01 × 103−1.06 × 104−7.47 × 103−7.55 × 103−4.69 × 103
Std4.11 × 10−62.66 × 10−13.69 × 1026.42 × 1021.69 × 1038.76 × 1026.27 × 1021.21 × 103
F9Mean0.00 × 1000.00 × 1001.66 × 10−62.26 × 1003.79 × 10−155.53 × 1011.20 × 1021.02 × 102
Std0.00 × 1000.00 × 1001.27 × 10−63.27 × 1002.08 × 10−141.83 × 1013.29 × 1013.19 × 101
F10Mean8.8818 × 10−168.8818 × 10−164.36 × 10−41.01 × 10−133.85 × 10−152.56 × 1002.03 × 1001.69 × 10−2
Std0.00 × 1000.00 × 1001.62 × 10−41.81 × 10−142.10 × 10−156.94 × 10−15.47 × 10−11.30 × 10−2
F11Mean0.00 × 1000.00 × 1008.42 × 10−46.19 × 10−31.68 × 10−21.88 × 10−28.60 × 10−14.39 × 10−3
Std0.00 × 1000.00 × 1003.12 × 10−38.92 × 10−36.38 × 10−21.46 × 10−28.21 × 10−26.85 × 10−3
F12Mean1.09 × 10−75.81 × 10−37.44 × 10−14.42 × 10−22.81 × 10−27.54 × 1002.43 × 1002.07 × 10−2
Std1.59 × 10−76.50 × 10−33.03 × 10−21.86 × 10−22.18 × 10−23.43 × 1001.39 × 1004.22 × 10−2
F13Mean5.91 × 10−76.35 × 10−32.96 × 1006.94 × 10−16.36 × 10−11.38 × 1011.96 × 10−15.55 × 10−3
Std1.06 × 10−67.05 × 10−31.03 × 10−22.45 × 10−13.53 × 10−11.10 × 1011.26 × 10−19.01 × 10−3
F14Mean9.98 × 10−19.98 × 10−19.87 × 1004.16 × 1002.54 × 1001.36 × 1009.98 × 10−12.97 × 100
Std8.05 × 10−163.93 × 10−133.89 × 1004.28 × 1002.91 × 1008.82 × 10−14.31 × 10−112.55 × 100
F15Mean3.34 × 10−44.84 × 10−48.39 × 10−33.15 × 10−38.25 × 10−42.91 × 10−35.24 × 10−37.22 × 10−3
Std8.63 × 10−52.15 × 10−41.29 × 10−26.88 × 10−35.40 × 10−45.93 × 10−31.26 × 10−29.03 × 10−3
F16Mean−1.0316 × 100−1.0316 × 100−1.0316 × 100−1.0316 × 100−1.0316 × 100−1.0316 × 100−1.0316 × 100−1.0316 × 100
Std1.87 × 10−118.36 × 10−102.28 × 10−113.13 × 10−81.68 × 10−93.89 × 10−144.19 × 10−76.25 × 10−16
F17Mean3.9789 × 10−13.9789 × 10−14.0217 × 10−13.9789 × 10−13.9789 × 10−13.9789 × 10−13.9789 × 10−13.9789 × 10−1
Std5.94 × 10−125.40 × 10−91.52 × 10−28.10 × 10−76.71 × 10−67.99 × 10−151.27 × 10−70.00 × 100
F18Mean3.0000 × 1003.0000 × 1004.8000 × 1005.7000 × 1003.0001 × 1003.0000 × 1003.0000 × 1003.0000 × 100
Std9.42 × 10−111.17 × 10−96.85 × 1001.48 × 1018.14 × 10−52.13 × 10−133.49 × 10−61.79 × 10−15
F19Mean−3.8627 × 100−3.8628 × 100−3.8627 × 100−3.8605 × 100−3.8572 × 100−3.8628 × 100−3.8628 × 100−3.8628 × 100
Std7.91 × 10−51.58 × 10−72.62 × 10−44.08 × 10−31.02 × 10−21.17 × 10−127.73 × 10−62.58 × 10−15
F20Mean−3.286 × 100−3.2503 × 100−3.2942 × 100−3.2339 × 100−3.2225 × 100−3.2255 × 100−3.2454 × 100−3.2402 × 100
Std5.59 × 10−25.95 × 10−25.12 × 10−27.30 × 10−21.10 × 10−15.45 × 10−25.93 × 10−28.13 × 10−2
F21Mean−1.01531 × 101−1.01531 × 101−7.8781 × 100−8.8066 × 100−8.4438 × 100−6.9755 × 100−7.048 × 100−6.3883 × 100
Std1.42 × 10−41.05 × 10−42.68 × 1002.54 × 1002.44 × 1003.35 × 1003.28 × 1003.27 × 100
F22Mean−1.04028 × 101−1.04028 × 101−7.2814 × 100−1.02239 × 101−7.0271 × 100−8.938 × 100−9.0327 × 100−8.71250 × 100
Std7.38 × 10−51.82 × 10−43.52 × 1009.70 × 10−13.08 × 1002.99 × 1002.83 × 1002.91 × 100
F23Mean−1.05363 × 101−1.05363 × 101−6.6743 × 100−1.05349 × 101−7.7815 × 100−8.1138 × 100−8.5201 × 100−9.1233 × 100
Std8.26 × 10−59.71 × 10−53.31 × 1008.48 × 10−43.28 × 1003.51 × 1003.20 × 1002.93 × 100
Table 6. p-values of the Wilcoxon signed-rank test between DESMAOA and other competitor algorithms.
Function | DESMAOA vs. SMA | DESMAOA vs. AOA | DESMAOA vs. GWO | DESMAOA vs. WOA | DESMAOA vs. SSA | DESMAOA vs. MVO | DESMAOA vs. PSO
F11.00 × 1006.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−5
F26.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−5
F31.00 × 1006.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−5
F46.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−5
F56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−5
F61.22 × 10−46.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−58.54 × 10−4
F72.52 × 10−11.88 × 10−16.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−5
F86.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−5
F91.00 × 1001.22 × 10−46.10 × 10−51.00 × 1006.10 × 10−56.10 × 10−56.10 × 10−5
F101.00 × 1006.10 × 10−56.10 × 10−59.77 × 10−46.10 × 10−56.10 × 10−56.10 × 10−5
F111.00 × 1006.10 × 10−52.50 × 10−11.00 × 1006.10 × 10−56.10 × 10−56.10 × 10−5
F126.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−58.36 × 10−3
F136.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−51.22 × 10−4
F146.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−58.14 × 10−26.10 × 10−57.93 × 10−3
F153.30 × 10−15.54 × 10−24.54 × 10−15.54 × 10−26.10 × 10−51.16 × 10−31.22 × 10−4
F162.56 × 10−23.36 × 10−36.10 × 10−52.52 × 10−16.10 × 10−56.10 × 10−56.10 × 10−5
F176.10 × 10−46.39 × 10−16.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−56.10 × 10−5
F181.81 × 10−27.62 × 10−18.36 × 10−38.36 × 10−36.10 × 10−58.36 × 10−36.10 × 10−5
F196.10 × 10−51.03 × 10−22.56 × 10−26.10 × 10−56.10 × 10−54.27 × 10−36.10 × 10−5
F206.39 × 10−11.81 × 10−22.52 × 10−12.52 × 10−11.51 × 10−25.99 × 10−12.08 × 10−1
F213.05 × 10−43.02 × 10−26.10 × 10−51.22 × 10−41.88 × 10−11.53 × 10−38.04 × 10−1
F226.10 × 10−54.27 × 10−36.10 × 10−41.22 × 10−48.04 × 10−13.03 × 10−17.62 × 10−1
F236.10 × 10−41.21 × 10−11.22 × 10−46.10 × 10−53.30 × 10−16.79 × 10−16.79 × 10−1
Table 7. Unimodal benchmark function result statistics of the DESMAOA and competitor algorithms in different dimensions.
Function | D | Metric | DESMAOA | SMA | AOA | GWO | WOA | SSA | MVO | PSO
F150Mean0.00 × 1003.94 × 10−3104.05 × 10−55.96 × 10−203.97 × 10−716.27 × 10−19.07 × 1002.05 × 10−1
Std0.00 × 1000.00 × 1001.33 × 10−55.86 × 10−202.17 × 10−705.32 × 10−12.48 × 1001.82 × 10−1
200Mean0.00 × 1005.15 × 10−2444.63 × 10−21.09 × 10−72.33 × 10−711.76 × 1042.84 × 1033.31 × 102
Function | D | Metric | DESMAOA | SMA | AOA | GWO | WOA | SSA | MVO | PSO
F1 | 200 | Std | 0.00 × 10^0 | 0.00 × 10^0 | 1.24 × 10^−2 | 7.08 × 10^−8 | 9.24 × 10^−71 | 1.60 × 10^3 | 3.15 × 10^2 | 4.11 × 10^1
F1 | 1000 | Mean | 0.00 × 10^0 | 2.20 × 10^−246 | 1.50 × 10^0 | 2.53 × 10^−1 | 3.57 × 10^−68 | 2.29 × 10^5 | 7.94 × 10^5 | 4.11 × 10^4
F1 | 1000 | Std | 0.00 × 10^0 | 0.00 × 10^0 | 4.79 × 10^−2 | 5.65 × 10^−2 | 1.95 × 10^−67 | 1.19 × 10^4 | 2.70 × 10^4 | 2.32 × 10^3
F2 | 50 | Mean | 0.00 × 10^0 | 1.50 × 10^−145 | 6.70 × 10^−3 | 2.60 × 10^−12 | 1.56 × 10^−49 | 9.29 × 10^0 | 3.50 × 10^3 | 2.58 × 10^1
F2 | 50 | Std | 0.00 × 10^0 | 8.24 × 10^−145 | 3.05 × 10^−3 | 1.52 × 10^−12 | 7.19 × 10^−49 | 3.59 × 10^0 | 1.73 × 10^4 | 1.89 × 10^1
F2 | 200 | Mean | 0.00 × 10^0 | 7.90 × 10^−138 | 7.23 × 10^−2 | 3.25 × 10^−5 | 2.93 × 10^−48 | 1.55 × 10^2 | 5.08 × 10^77 | 4.66 × 10^2
F2 | 200 | Std | 0.00 × 10^0 | 3.91 × 10^−137 | 1.18 × 10^−2 | 7.69 × 10^−6 | 1.17 × 10^−47 | 1.44 × 10^1 | 2.73 × 10^78 | 6.40 × 10^1
F2 | 1000 | Mean | 0.00 × 10^0 | 5.92 × 10^−1 | 1.58 × 10^0 | 6.78 × 10^−1 | 1.44 × 10^−47 | 1.19 × 10^3 | 3.59 × 10^278 | 1.41 × 10^3
F2 | 1000 | Std | 0.00 × 10^0 | 2.92 × 10^0 | 1.08 × 10^−1 | 5.77 × 10^−1 | 7.74 × 10^−47 | 2.48 × 10^1 | Inf | 6.51 × 10^1
F3 | 50 | Mean | 0.00 × 10^0 | 1.03 × 10^−293 | 1.81 × 10^−2 | 3.84 × 10^−1 | 2.01 × 10^5 | 9.10 × 10^3 | 6.50 × 10^3 | 1.48 × 10^3
F3 | 50 | Std | 0.00 × 10^0 | 0.00 × 10^0 | 8.60 × 10^−3 | 1.01 × 10^0 | 4.50 × 10^4 | 4.69 × 10^3 | 1.95 × 10^3 | 4.64 × 10^2
F3 | 200 | Mean | 0.00 × 10^0 | 3.94 × 10^−219 | 7.32 × 10^−1 | 1.98 × 10^4 | 4.55 × 10^6 | 2.05 × 10^5 | 3.16 × 10^5 | 8.34 × 10^4
F3 | 200 | Std | 0.00 × 10^0 | 0.00 × 10^0 | 1.86 × 10^−1 | 9.23 × 10^3 | 1.51 × 10^6 | 7.25 × 10^4 | 3.10 × 10^4 | 2.24 × 10^4
F3 | 1000 | Mean | 0.00 × 10^0 | 4.81 × 10^−125 | 3.35 × 10^1 | 1.53 × 10^6 | 1.36 × 10^8 | 5.19 × 10^6 | 7.98 × 10^6 | 2.27 × 10^6
F3 | 1000 | Std | 0.00 × 10^0 | 2.64 × 10^−124 | 6.49 × 10^0 | 3.16 × 10^5 | 6.32 × 10^7 | 2.46 × 10^6 | 8.76 × 10^5 | 4.56 × 10^5
F4 | 50 | Mean | 0.00 × 10^0 | 3.80 × 10^−158 | 3.56 × 10^−2 | 7.25 × 10^−4 | 7.35 × 10^1 | 1.94 × 10^1 | 1.67 × 10^1 | 3.74 × 10^0
F4 | 50 | Std | 0.00 × 10^0 | 2.06 × 10^−157 | 7.14 × 10^−3 | 1.42 × 10^−3 | 1.99 × 10^1 | 3.14 × 10^0 | 4.24 × 10^0 | 7.18 × 10^−1
F4 | 200 | Mean | 0.00 × 10^0 | 3.11 × 10^−114 | 9.10 × 10^−2 | 2.39 × 10^1 | 8.41 × 10^1 | 3.52 × 10^1 | 8.32 × 10^1 | 1.93 × 10^1
F4 | 200 | Std | 0.00 × 10^0 | 1.19 × 10^−113 | 1.18 × 10^−2 | 5.51 × 10^0 | 1.89 × 10^1 | 3.49 × 10^0 | 3.80 × 10^0 | 1.45 × 10^0
F4 | 1000 | Mean | 0.00 × 10^0 | 3.86 × 10^−101 | 1.54 × 10^−1 | 7.88 × 10^1 | 7.94 × 10^1 | 4.43 × 10^1 | 9.81 × 10^1 | 3.31 × 10^1
F4 | 1000 | Std | 0.00 × 10^0 | 1.90 × 10^−100 | 7.55 × 10^−3 | 3.25 × 10^0 | 2.09 × 10^1 | 3.19 × 10^0 | 6.42 × 10^−1 | 1.52 × 10^0
F5 | 50 | Mean | 4.84 × 10^0 | 1.89 × 10^1 | 4.83 × 10^1 | 4.72 × 10^1 | 4.83 × 10^1 | 1.64 × 10^3 | 7.66 × 10^2 | 4.21 × 10^2
F5 | 50 | Std | 1.47 × 10^1 | 1.93 × 10^1 | 1.40 × 10^−1 | 7.35 × 10^−1 | 4.05 × 10^−1 | 3.66 × 10^3 | 7.47 × 10^2 | 2.18 × 10^2
F5 | 200 | Mean | 1.29 × 10^1 | 6.23 × 10^1 | 1.98 × 10^2 | 1.98 × 10^2 | 1.98 × 10^2 | 3.79 × 10^6 | 3.91 × 10^5 | 5.98 × 10^5
F5 | 200 | Std | 3.68 × 10^1 | 7.20 × 10^1 | 7.40 × 10^−2 | 4.19 × 10^−1 | 1.65 × 10^−1 | 9.40 × 10^5 | 1.21 × 10^5 | 1.04 × 10^5
F5 | 1000 | Mean | 1.11 × 10^2 | 4.01 × 10^2 | 1.00 × 10^3 | 1.05 × 10^3 | 9.94 × 10^2 | 1.21 × 10^8 | 2.33 × 10^9 | 2.95 × 10^8
F5 | 1000 | Std | 2.50 × 10^2 | 4.15 × 10^2 | 2.71 × 10^−1 | 2.55 × 10^1 | 1.03 × 10^0 | 1.12 × 10^7 | 1.85 × 10^8 | 4.28 × 10^7
F6 | 50 | Mean | 1.66 × 10^−4 | 8.78 × 10^−2 | 7.29 × 10^0 | 2.57 × 10^0 | 1.20 × 10^0 | 8.30 × 10^−1 | 9.61 × 10^0 | 1.97 × 10^−1
F6 | 50 | Std | 5.60 × 10^−5 | 6.54 × 10^−2 | 4.04 × 10^−1 | 4.02 × 10^−1 | 5.39 × 10^−1 | 7.25 × 10^−1 | 1.88 × 10^0 | 1.74 × 10^−1
F6 | 200 | Mean | 3.73 × 10^−2 | 8.26 × 10^0 | 3.59 × 10^1 | 2.91 × 10^1 | 1.11 × 10^1 | 1.76 × 10^4 | 2.99 × 10^3 | 3.21 × 10^2
F6 | 200 | Std | 3.21 × 10^−2 | 8.04 × 10^0 | 1.19 × 10^0 | 1.28 × 10^0 | 2.90 × 10^0 | 2.45 × 10^3 | 4.10 × 10^2 | 4.66 × 10^1
F6 | 1000 | Mean | 3.34 × 10^0 | 6.80 × 10^1 | 2.42 × 10^2 | 2.02 × 10^2 | 6.68 × 10^1 | 2.37 × 10^5 | 8.04 × 10^5 | 4.02 × 10^4
F6 | 1000 | Std | 4.80 × 10^0 | 8.81 × 10^1 | 1.23 × 10^0 | 2.57 × 10^0 | 1.52 × 10^1 | 1.11 × 10^4 | 2.79 × 10^4 | 2.17 × 10^3
F7 | 50 | Mean | 1.36 × 10^−4 | 1.96 × 10^−4 | 6.63 × 10^−5 | 3.54 × 10^−3 | 3.97 × 10^−3 | 4.86 × 10^−1 | 1.07 × 10^−1 | 3.97 × 10^1
F7 | 50 | Std | 1.18 × 10^−4 | 1.69 × 10^−4 | 5.90 × 10^−5 | 1.90 × 10^−3 | 4.79 × 10^−3 | 1.53 × 10^−1 | 2.32 × 10^−2 | 2.76 × 10^1
F7 | 200 | Mean | 1.29 × 10^−4 | 4.32 × 10^−4 | 5.35 × 10^−5 | 1.63 × 10^−2 | 4.14 × 10^−3 | 1.72 × 10^1 | 5.40 × 10^0 | 2.95 × 10^3
F7 | 200 | Std | 1.52 × 10^−4 | 3.04 × 10^−4 | 5.09 × 10^−5 | 5.34 × 10^−3 | 4.22 × 10^−3 | 4.14 × 10^0 | 7.17 × 10^−1 | 4.68 × 10^2
F7 | 1000 | Mean | 1.17 × 10^−4 | 6.93 × 10^−4 | 8.25 × 10^−5 | 1.55 × 10^−1 | 3.29 × 10^−3 | 1.74 × 10^3 | 2.88 × 10^4 | 2.39 × 10^5
F7 | 1000 | Std | 1.28 × 10^−4 | 4.85 × 10^−4 | 6.96 × 10^−5 | 3.32 × 10^−2 | 3.73 × 10^−3 | 1.75 × 10^2 | 2.72 × 10^3 | 7.66 × 10^3
Table 8. Multimodal benchmark function result statistics of the DESMAOA and competitor algorithms in different dimensions.
Function | D | Metric | DESMAOA | SMA | AOA | GWO | WOA | SSA | MVO | PSO
F8 | 50 | Mean | −2.0949 × 10^4 | −2.0947 × 10^4 | −8.3989 × 10^3 | −9.1468 × 10^3 | −1.8030 × 10^4 | −1.2107 × 10^4 | −1.2350 × 10^4 | −7.6172 × 10^3
F8 | 50 | Std | 1.54 × 10^−4 | 2.27 × 10^0 | 5.06 × 10^2 | 1.59 × 10^3 | 2.78 × 10^3 | 1.00 × 10^3 | 1.11 × 10^3 | 2.18 × 10^3
F8 | 200 | Mean | −8.3796 × 10^4 | −8.3757 × 10^4 | −2.1657 × 10^4 | −2.7533 × 10^4 | −7.1281 × 10^4 | −3.4381 × 10^4 | −4.0399 × 10^4 | −1.5591 × 10^4
F8 | 200 | Std | 6.62 × 10^−1 | 6.42 × 10^1 | 1.27 × 10^3 | 5.65 × 10^3 | 1.28 × 10^4 | 2.25 × 10^3 | 2.30 × 10^3 | 6.41 × 10^3
F8 | 1000 | Mean | −4.1892 × 10^5 | −4.1862 × 10^5 | −5.4566 × 10^4 | −8.4602 × 10^4 | −3.5941 × 10^5 | −8.8084 × 10^4 | −1.1041 × 10^5 | −3.3236 × 10^4
F8 | 1000 | Std | 1.25 × 10^2 | 5.75 × 10^2 | 2.29 × 10^3 | 2.28 × 10^4 | 5.92 × 10^4 | 7.25 × 10^3 | 3.94 × 10^3 | 1.51 × 10^4
F9 | 50 | Mean | 0.00 × 10^0 | 0.00 × 10^0 | 1.61 × 10^−5 | 5.24 × 10^0 | 1.89 × 10^−15 | 8.82 × 10^1 | 2.54 × 10^2 | 2.84 × 10^2
F9 | 50 | Std | 0.00 × 10^0 | 0.00 × 10^0 | 4.42 × 10^−6 | 7.71 × 10^0 | 1.04 × 10^−14 | 2.32 × 10^1 | 5.65 × 10^1 | 5.01 × 10^1
F9 | 200 | Mean | 0.00 × 10^0 | 0.00 × 10^0 | 1.34 × 10^−3 | 2.41 × 10^1 | 7.58 × 10^−15 | 8.27 × 10^2 | 1.90 × 10^3 | 2.02 × 10^3
F9 | 200 | Std | 0.00 × 10^0 | 0.00 × 10^0 | 1.82 × 10^−4 | 9.14 × 10^0 | 4.15 × 10^−14 | 8.74 × 10^1 | 1.30 × 10^2 | 1.25 × 10^2
F9 | 1000 | Mean | 0.00 × 10^0 | 0.00 × 10^0 | 3.79 × 10^−2 | 2.06 × 10^2 | 0.00 × 10^0 | 7.63 × 10^3 | 1.46 × 10^4 | 1.41 × 10^4
F9 | 1000 | Std | 0.00 × 10^0 | 0.00 × 10^0 | 1.94 × 10^−3 | 5.67 × 10^1 | 0.00 × 10^0 | 2.12 × 10^2 | 2.44 × 10^2 | 2.98 × 10^2
F10 | 50 | Mean | 8.8818 × 10^−16 | 8.8818 × 10^−16 | 1.14 × 10^−3 | 4.3720 × 10^−11 | 4.3225 × 10^−15 | 4.83 × 10^0 | 3.56 × 10^0 | 1.69 × 10^0
F10 | 50 | Std | 0.00 × 10^0 | 0.00 × 10^0 | 1.93 × 10^−4 | 2.44 × 10^−11 | 2.38 × 10^−15 | 1.23 × 10^0 | 3.13 × 10^0 | 5.70 × 10^−1
F10 | 200 | Mean | 8.8818 × 10^−16 | 8.8818 × 10^−16 | 1.06 × 10^−2 | 2.18 × 10^−5 | 4.09 × 10^−15 | 1.30 × 10^1 | 2.04 × 10^1 | 6.61 × 10^0
F10 | 200 | Std | 0.00 × 10^0 | 0.00 × 10^0 | 1.01 × 10^−3 | 6.01 × 10^−6 | 2.70 × 10^−15 | 4.61 × 10^−1 | 2.15 × 10^−1 | 3.38 × 10^−1
F10 | 1000 | Mean | 8.8818 × 10^−16 | 8.8818 × 10^−16 | 3.32 × 10^−2 | 1.89 × 10^−2 | 4.91 × 10^−15 | 1.45 × 10^1 | 2.10 × 10^1 | 1.60 × 10^1
F10 | 1000 | Std | 0.00 × 10^0 | 0.00 × 10^0 | 7.69 × 10^−4 | 3.23 × 10^−3 | 2.42 × 10^−15 | 1.94 × 10^−1 | 3.27 × 10^−2 | 2.33 × 10^−1
F11 | 50 | Mean | 0.00 × 10^0 | 0.00 × 10^0 | 7.70 × 10^−3 | 2.94 × 10^−3 | 1.34 × 10^−2 | 5.55 × 10^−1 | 1.09 × 10^0 | 1.62 × 10^−2
F11 | 50 | Std | 0.00 × 10^0 | 0.00 × 10^0 | 1.91 × 10^−2 | 6.06 × 10^−3 | 5.11 × 10^−2 | 2.74 × 10^−1 | 2.30 × 10^−2 | 1.19 × 10^−2
F11 | 200 | Mean | 0.00 × 10^0 | 0.00 × 10^0 | 7.85 × 10^0 | 6.27 × 10^−3 | 0.00 × 10^0 | 1.46 × 10^2 | 2.73 × 10^1 | 2.28 × 10^0
F11 | 200 | Std | 0.00 × 10^0 | 0.00 × 10^0 | 1.19 × 10^1 | 1.48 × 10^−2 | 0.00 × 10^0 | 1.88 × 10^1 | 3.08 × 10^0 | 2.70 × 10^0
F11 | 1000 | Mean | 0.00 × 10^0 | 0.00 × 10^0 | 1.33 × 10^4 | 2.37 × 10^−2 | 0.00 × 10^0 | 2.12 × 10^3 | 7.25 × 10^3 | 2.74 × 10^2
F11 | 1000 | Std | 0.00 × 10^0 | 0.00 × 10^0 | 2.64 × 10^3 | 3.67 × 10^−2 | 0.00 × 10^0 | 8.35 × 10^1 | 3.01 × 10^2 | 1.86 × 10^1
F12 | 50 | Mean | 6.67 × 10^−7 | 6.02 × 10^−3 | 9.06 × 10^−1 | 1.22 × 10^−1 | 3.46 × 10^−2 | 1.26 × 10^1 | 5.51 × 10^0 | 8.03 × 10^−2
F12 | 50 | Std | 6.28 × 10^−7 | 1.22 × 10^−2 | 2.39 × 10^−2 | 7.35 × 10^−2 | 1.86 × 10^−2 | 4.57 × 10^0 | 1.37 × 10^0 | 1.53 × 10^−1
F12 | 200 | Mean | 2.01 × 10^−5 | 5.76 × 10^−3 | 8.41 × 10^−1 | 5.42 × 10^−1 | 7.03 × 10^−2 | 7.55 × 10^3 | 2.30 × 10^3 | 4.84 × 10^1
F12 | 200 | Std | 1.57 × 10^−5 | 8.11 × 10^−3 | 5.60 × 10^−2 | 6.68 × 10^−2 | 3.52 × 10^−2 | 1.01 × 10^4 | 2.88 × 10^3 | 3.71 × 10^1
F12 | 1000 | Mean | 1.53 × 10^−4 | 9.67 × 10^−3 | 1.04 × 10^0 | 1.26 × 10^0 | 1.05 × 10^−1 | 1.16 × 10^7 | 4.19 × 10^9 | 9.21 × 10^6
F12 | 1000 | Std | 2.46 × 10^−4 | 1.70 × 10^−2 | 1.12 × 10^−2 | 2.99 × 10^−1 | 5.30 × 10^−2 | 4.67 × 10^6 | 4.67 × 10^8 | 2.30 × 10^6
F13 | 50 | Mean | 1.08 × 10^−3 | 2.52 × 10^−2 | 4.94 × 10^0 | 2.03 × 10^0 | 1.14 × 10^0 | 8.07 × 10^1 | 7.29 × 10^0 | 1.84 × 10^−1
F13 | 50 | Std | 4.27 × 10^−3 | 3.02 × 10^−2 | 6.56 × 10^−4 | 2.80 × 10^−1 | 4.84 × 10^−1 | 1.61 × 10^1 | 1.15 × 10^1 | 1.16 × 10^−1
F13 | 200 | Mean | 1.85 × 10^−3 | 4.73 × 10^−1 | 1.97 × 10^1 | 1.67 × 10^1 | 6.18 × 10^0 | 1.61 × 10^6 | 1.11 × 10^5 | 5.27 × 10^3
F13 | 200 | Std | 1.29 × 10^−3 | 7.17 × 10^−1 | 9.97 × 10^−2 | 4.32 × 10^−1 | 1.75 × 10^0 | 7.60 × 10^5 | 1.03 × 10^5 | 2.58 × 10^3
F13 | 1000 | Mean | 6.06 × 10^−2 | 3.82 × 10^0 | 1.00 × 10^2 | 1.21 × 10^2 | 4.02 × 10^1 | 1.47 × 10^8 | 9.13 × 10^9 | 8.26 × 10^7
F13 | 1000 | Std | 8.56 × 10^−2 | 3.59 × 10^0 | 3.49 × 10^−1 | 7.98 × 10^0 | 1.12 × 10^1 | 2.91 × 10^7 | 8.64 × 10^8 | 1.28 × 10^7
Table 9. p-values of the Wilcoxon signed-rank test between DESMAOA and other competitor algorithms on 50 dimensions.
Function | DESMAOA vs. SMA | DESMAOA vs. AOA | DESMAOA vs. GWO | DESMAOA vs. WOA | DESMAOA vs. SSA | DESMAOA vs. MVO | DESMAOA vs. PSO
F1 | 5.00 × 10^−1 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F2 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F3 | 5.00 × 10^−1 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F5 | 5.37 × 10^−3 | 6.10 × 10^−5 | 1.22 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F6 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F7 | 8.33 × 10^−2 | 7.30 × 10^−2 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F8 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F9 | 1.00 × 10^0 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.00 × 10^0 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F10 | 1.00 × 10^0 | 6.10 × 10^−5 | 6.10 × 10^−5 | 2.44 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F11 | 1.00 × 10^0 | 6.10 × 10^−5 | 2.50 × 10^−1 | 1.00 × 10^0 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F12 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F13 | 2.01 × 10^−3 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
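A detail worth noting in Tables 9–11: 6.10 × 10^−5, the value that dominates them, is the smallest two-sided p-value an exact Wilcoxon signed-rank test can produce from 15 paired samples (2/2^15 ≈ 6.10 × 10^−5), consistent with 15 independent runs per algorithm. A minimal pure-Python sketch of the exact test by full sign enumeration (an assumption-laden illustration: no tied or zero differences; in practice a library routine such as scipy.stats.wilcoxon would be used):

```python
from itertools import product

def exact_wilcoxon_p(diffs):
    """Exact two-sided Wilcoxon signed-rank p-value by enumerating all
    2^n sign assignments (feasible for n <= ~20; assumes no ties/zeros)."""
    n = len(diffs)
    # rank |d_i| in ascending order; ranks[i] is the rank of diffs[i]
    order = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0] * n
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    w_plus = sum(r for r, d in zip(ranks, diffs) if d > 0)
    total_rank = n * (n + 1) // 2
    tail = min(w_plus, total_rank - w_plus)  # distance to the nearer extreme
    # count sign assignments at least as extreme as the observed statistic
    count = sum(
        1
        for signs in product((0, 1), repeat=n)
        if (w := sum(r for r, s in zip(range(1, n + 1), signs) if s)) <= tail
        or w >= total_rank - tail
    )
    return count / 2 ** n

# 15 runs where one algorithm wins every time -> the floor p-value 2/2^15
p = exact_wilcoxon_p([0.1 * (i + 1) for i in range(15)])
print(f"{p:.2e}")  # 6.10e-05
```

When one algorithm beats another on all 15 runs, every difference has the same sign and the test bottoms out at exactly 2/32768, which is why so many cells share that value.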
Table 10. p-values of the Wilcoxon signed-rank test between DESMAOA and other competitor algorithms on 200 dimensions.
Function | DESMAOA vs. SMA | DESMAOA vs. AOA | DESMAOA vs. GWO | DESMAOA vs. WOA | DESMAOA vs. SSA | DESMAOA vs. MVO | DESMAOA vs. PSO
F1 | 2.50 × 10^−1 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F2 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F3 | 6.10 × 10^−5 | 6.10 × 10^−5 | 9.77 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 8.54 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5
F6 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.22 × 10^−4 | 6.10 × 10^−5
F7 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F8 | 2.56 × 10^−2 | 6.10 × 10^−5 | 1.22 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F9 | 1.00 × 10^0 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F10 | 1.00 × 10^0 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F11 | 1.00 × 10^0 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.00 × 10^0 | 6.10 × 10^−5 | 4.88 × 10^−4 | 6.10 × 10^−5
F12 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.22 × 10^−4
F13 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
Table 11. p-values of the Wilcoxon signed-rank test between DESMAOA and other competitor algorithms on 1000 dimensions.
Function | DESMAOA vs. SMA | DESMAOA vs. AOA | DESMAOA vs. GWO | DESMAOA vs. WOA | DESMAOA vs. SSA | DESMAOA vs. MVO | DESMAOA vs. PSO
F1 | 3.91 × 10^−3 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F2 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F3 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.22 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5
F6 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.22 × 10^−4 | 6.10 × 10^−5
F7 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 8.36 × 10^−3
F8 | 4.21 × 10^−1 | 6.10 × 10^−5 | 1.22 × 10^−4 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F9 | 1.00 × 10^0 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.00 × 10^0 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F10 | 1.00 × 10^0 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
F11 | 1.00 × 10^0 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.00 × 10^0 | 6.10 × 10^−5 | 9.77 × 10^−4 | 6.10 × 10^−5
F12 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 1.16 × 10^−3
F13 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5 | 6.10 × 10^−5
Table 12. Experimental results of the Friedman test on 50, 200, and 1000 dimensions.
Algorithm | Mean (D = 50) | Rank | Mean (D = 200) | Rank | Mean (D = 1000) | Rank
DESMAOA | 1.1923 | 1 | 1.2308 | 1 | 1.2692 | 1
SMA | 1.9615 | 2 | 2.0000 | 2 | 2.1923 | 2
AOA | 4.7692 | 5 | 4.5385 | 5 | 4.4615 | 4
GWO | 4.3077 | 3 | 4.3846 | 4 | 4.6923 | 5
WOA | 4.3846 | 4 | 3.7692 | 3 | 3.4615 | 3
SSA | 6.7692 | 7 | 7.0000 | 8 | 6.1538 | 7
MVO | 6.8462 | 8 | 6.7692 | 7 | 7.4615 | 8
PSO | 5.7692 | 6 | 6.3077 | 6 | 6.3077 | 6
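Mean ranks such as those in Table 12 are obtained by ranking the eight algorithms on each benchmark function (rank 1 = best result, lower is better) and then averaging the per-function ranks for each algorithm. A minimal sketch with hypothetical score matrices, resolving ties by average rank:

```python
def friedman_mean_ranks(scores):
    """scores[f][a] = result of algorithm a on function f (lower is better).
    Returns the average rank of each algorithm; ties share the mean rank."""
    n_alg = len(scores[0])
    totals = [0.0] * n_alg
    for row in scores:
        order = sorted(range(n_alg), key=lambda a: row[a])
        ranks = [0.0] * n_alg
        i = 0
        while i < n_alg:
            # extend j over a run of tied values, then assign the average rank
            j = i
            while j + 1 < n_alg and row[order[j + 1]] == row[order[i]]:
                j += 1
            avg = (i + 1 + j + 1) / 2
            for k in range(i, j + 1):
                ranks[order[k]] = avg
            i = j + 1
        for a in range(n_alg):
            totals[a] += ranks[a]
    return [t / len(scores) for t in totals]

# two hypothetical functions, three algorithms
print(friedman_mean_ranks([[0.0, 1.0, 2.0], [0.0, 2.0, 1.0]]))  # [1.0, 2.5, 2.5]
```

The final ordinal rank column in the table is then simply the ordering of these averages.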
Table 13. The result statistics of CEC2021 test functions for the DESMAOA and competitor algorithms.
Function | Metric | DESMAOA | SMA | AOA | GWO | WOA | SSA | MVO | PSO
CEC_01 | Mean | 3.4126 × 10^3 | 8.4790 × 10^3 | 1.4400 × 10^10 | 8.5800 × 10^7 | 8.8200 × 10^7 | 3.0497 × 10^3 | 2.0700 × 10^4 | 2.6509 × 10^3
CEC_01 | Std | 3.2484 × 10^3 | 4.3942 × 10^3 | 5.4800 × 10^9 | 2.0600 × 10^8 | 1.0800 × 10^8 | 2.6586 × 10^3 | 1.2300 × 10^4 | 2.8147 × 10^3
CEC_02 | Mean | 1.6460 × 10^3 | 1.7230 × 10^3 | 2.3794 × 10^3 | 1.7650 × 10^3 | 2.3408 × 10^3 | 1.9202 × 10^3 | 1.7883 × 10^3 | 2.0353 × 10^3
CEC_02 | Std | 1.8064 × 10^2 | 2.0356 × 10^2 | 2.5500 × 10^2 | 4.0002 × 10^2 | 2.8907 × 10^2 | 3.0299 × 10^2 | 2.8914 × 10^2 | 3.3248 × 10^2
CEC_03 | Mean | 7.4197 × 10^2 | 7.3268 × 10^2 | 8.0255 × 10^2 | 7.3186 × 10^2 | 7.9430 × 10^2 | 7.4133 × 10^2 | 7.3257 × 10^2 | 7.3096 × 10^2
CEC_03 | Std | 1.3713 × 10^1 | 9.9242 × 10^0 | 8.5999 × 10^0 | 1.0358 × 10^1 | 2.9899 × 10^1 | 1.5608 × 10^1 | 9.4278 × 10^0 | 1.1527 × 10^1
CEC_04 | Mean | 1.9024 × 10^3 | 1.9015 × 10^3 | 3.9200 × 10^5 | 1.9028 × 10^3 | 1.9116 × 10^3 | 1.9015 × 10^3 | 1.9014 × 10^3 | 1.9011 × 10^3
CEC_04 | Std | 1.5132 × 10^0 | 4.7478 × 10^−1 | 2.0400 × 10^5 | 1.1137 × 10^0 | 8.0854 × 10^0 | 4.6367 × 10^−1 | 6.4272 × 10^−1 | 6.8410 × 10^−1
CEC_05 | Mean | 4.7672 × 10^3 | 2.0900 × 10^4 | 4.6200 × 10^5 | 1.1600 × 10^5 | 5.8000 × 10^5 | 3.2100 × 10^4 | 6.7593 × 10^3 | 5.0581 × 10^3
CEC_05 | Std | 4.2784 × 10^3 | 4.6800 × 10^4 | 1.1100 × 10^5 | 1.9300 × 10^5 | 8.5200 × 10^5 | 7.2900 × 10^4 | 4.2953 × 10^3 | 3.3481 × 10^3
CEC_06 | Mean | 1.7312 × 10^3 | 1.7684 × 10^3 | 2.2222 × 10^3 | 1.7776 × 10^3 | 1.8431 × 10^3 | 1.7573 × 10^3 | 1.7587 × 10^3 | 1.8632 × 10^3
CEC_06 | Std | 1.2285 × 10^2 | 9.1820 × 10^1 | 2.0670 × 10^2 | 1.1171 × 10^2 | 1.0223 × 10^2 | 8.6052 × 10^1 | 1.0387 × 10^2 | 1.0503 × 10^2
CEC_07 | Mean | 7.0311 × 10^3 | 6.5603 × 10^3 | 2.9700 × 10^6 | 1.8000 × 10^4 | 3.4500 × 10^5 | 7.3733 × 10^3 | 7.5164 × 10^3 | 6.0549 × 10^3
CEC_07 | Std | 7.7063 × 10^3 | 6.4319 × 10^3 | 3.8500 × 10^6 | 3.7100 × 10^4 | 5.3700 × 10^5 | 4.9714 × 10^3 | 6.1260 × 10^3 | 2.6469 × 10^3
CEC_08 | Mean | 2.2974 × 10^3 | 2.4032 × 10^3 | 3.5700 × 10^3 | 2.3384 × 10^3 | 2.4289 × 10^3 | 2.3012 × 10^3 | 2.3861 × 10^3 | 2.4193 × 10^3
CEC_08 | Std | 2.2368 × 10^1 | 3.1043 × 10^2 | 3.7790 × 10^2 | 9.1745 × 10^1 | 3.3946 × 10^2 | 1.4399 × 10^1 | 2.6287 × 10^2 | 3.7983 × 10^2
CEC_09 | Mean | 2.7018 × 10^3 | 2.7599 × 10^3 | 2.9038 × 10^3 | 2.7449 × 10^3 | 2.7752 × 10^3 | 2.7330 × 10^3 | 2.7514 × 10^3 | 2.7922 × 10^3
CEC_09 | Std | 1.0302 × 10^2 | 1.0211 × 10^1 | 9.9242 × 10^1 | 4.4570 × 10^1 | 6.2479 × 10^1 | 6.4081 × 10^1 | 9.5351 × 10^0 | 1.0509 × 10^2
CEC_10 | Mean | 2.9231 × 10^3 | 2.9323 × 10^3 | 3.6576 × 10^3 | 2.9366 × 10^3 | 2.9545 × 10^3 | 2.9289 × 10^3 | 2.9290 × 10^3 | 2.9234 × 10^3
CEC_10 | Std | 2.2799 × 10^1 | 3.1577 × 10^1 | 4.0387 × 10^2 | 2.4483 × 10^1 | 6.9125 × 10^1 | 2.4243 × 10^1 | 2.9085 × 10^1 | 2.3865 × 10^1
Average rank | — | 2.3 | 3.9 | 7.9 | 4.6 | 6.9 | 3.3 | 3.7 | 3.4
Rank | — | 1 | 5 | 8 | 6 | 7 | 2 | 4 | 3
Table 14. Optimal results for comparative algorithms on the pressure vessel design problem.
Algorithm | Ts | Th | R | L | Optimal Cost
DESMAOA | 7.943124 × 10^−1 | 3.927124 × 10^−1 | 4.288001 × 10^1 | 1.671866 × 10^2 | 5.8363262 × 10^3
SMA [33] | 7.931 × 10^−1 | 3.932 × 10^−1 | 4.06711 × 10^1 | 1.962178 × 10^2 | 5.9941857 × 10^3
AOA [34] | 8.303737 × 10^−1 | 4.162057 × 10^−1 | 4.275127 × 10^1 | 1.693454 × 10^2 | 6.0487844 × 10^3
MVO [5] | 8.125 × 10^−1 | 4.375 × 10^−1 | 4.2090738 × 10^1 | 1.7673869 × 10^2 | 6.0608066 × 10^3
WOA [39] | 8.12500 × 10^−1 | 4.37500 × 10^−1 | 4.2098209 × 10^1 | 1.76638998 × 10^2 | 6.0597410 × 10^3
MFO [44] | 8.125 × 10^−1 | 4.375 × 10^−1 | 4.2098445 × 10^1 | 1.76636596 × 10^2 | 6.0597143 × 10^3
GWO [38] | 8.125 × 10^−1 | 4.345 × 10^−1 | 4.20892 × 10^1 | 1.767587 × 10^2 | 6.0515639 × 10^3
MOSCA [45] | 7.781909 × 10^−1 | 3.830476 × 10^−1 | 4.03207539 × 10^1 | 1.999841994 × 10^2 | 5.88071150 × 10^3
LWOA [46] | 7.78858 × 10^−1 | 3.85321 × 10^−1 | 4.032609 × 10^1 | 2.00 × 10^2 | 5.893339 × 10^3
IMFO [47] | 7.781948 × 10^−1 | 3.846621 × 10^−1 | 4.032097 × 10^1 | 1.999812 × 10^2 | 5.8853778 × 10^3
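The costs in Table 14 follow the widely used pressure vessel formulation, whose objective is f(Ts, Th, R, L) = 0.6224·Ts·R·L + 1.7781·Th·R² + 3.1661·Ts²·L + 19.84·Ts²·R, subject to four inequality constraints. A sketch that re-evaluates a tabulated design (re-evaluating, e.g., the WOA row reproduces its reported cost to within rounding of the tabulated variables):

```python
import math

def vessel_cost(ts, th, r, l):
    """Standard pressure vessel objective: material, forming, and welding costs."""
    return (0.6224 * ts * r * l + 1.7781 * th * r**2
            + 3.1661 * ts**2 * l + 19.84 * ts**2 * r)

def vessel_constraints(ts, th, r, l):
    """g_i <= 0 means feasible: shell/head thickness codes, minimum volume, length cap."""
    return [
        -ts + 0.0193 * r,
        -th + 0.00954 * r,
        -math.pi * r**2 * l - (4.0 / 3.0) * math.pi * r**3 + 1296000.0,
        l - 240.0,
    ]

# WOA design from Table 14
cost = vessel_cost(0.8125, 0.4375, 42.098209, 176.638998)
print(round(cost, 1))  # approx. 6059.7, cf. the tabulated 6.0597410 x 10^3
```

Note that the constraint g3 (minimum enclosed volume of 1,296,000 in³) sits essentially at zero for the best designs, so small rounding of R and L can nudge a re-evaluated point marginally across the boundary.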
Table 15. Optimal results for comparative algorithms on the three-bar truss design problem.
Algorithm | x1 | x2 | Optimal Weight
DESMAOA | 7.882549 × 10^−1 | 4.085642 × 10^−1 | 2.638523657 × 10^2
SMA [33] | 7.729316 × 10^−1 | 4.718874 × 10^−1 | 2.658067955 × 10^2
AOA [34] | 7.9369 × 10^−1 | 3.9426 × 10^−1 | 2.639154 × 10^2
MBA [48] | 7.885650 × 10^−1 | 4.085597 × 10^−1 | 2.638958522 × 10^2
SSA [40] | 7.88665414 × 10^−1 | 4.08275784 × 10^−1 | 2.638958434 × 10^2
MFO [44] | 7.88244771 × 10^−1 | 4.09466906 × 10^−1 | 2.638959797 × 10^2
PSO-DE [49] | 7.886751 × 10^−1 | 4.082482 × 10^−1 | 2.638958433 × 10^2
HSCAHS [50] | 7.885721 × 10^−1 | 4.084012 × 10^−1 | 2.63881992 × 10^2
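The three-bar truss objective is the structure weight f(x1, x2) = (2√2·x1 + x2)·l with bar length l = 100 cm, where x1 is the cross-sectional area of the two diagonal bars and x2 that of the vertical bar. Substituting a tabulated design back into it reproduces the reported weight; a minimal check:

```python
import math

def truss_weight(x1, x2, length=100.0):
    """Three-bar truss weight: two diagonal bars of area x1, one vertical bar of area x2."""
    return (2.0 * math.sqrt(2.0) * x1 + x2) * length

# PSO-DE design from Table 15
print(round(truss_weight(0.7886751, 0.4082482), 4))  # 263.8958, matching the table
```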
Table 16. Optimal results for comparative algorithms on the tension/compression spring design problem.
Algorithm | d | D | p | Optimal Weight
DESMAOA | 5.44827 × 10^−2 | 4.83109 × 10^−1 | 5.746128 × 10^0 | 1.11083 × 10^−2
SMA [33] | 5.8992 × 10^−2 | 6.23402 × 10^−1 | 3.590304 × 10^0 | 1.2128 × 10^−2
AOA [34] | 5.00 × 10^−2 | 3.49809 × 10^−1 | 1.18637 × 10^1 | 1.2124 × 10^−2
MVO [5] | 5.251 × 10^−2 | 3.7602 × 10^−1 | 1.033513 × 10^1 | 1.2790 × 10^−2
AO [14] | 5.02439 × 10^−2 | 3.5262 × 10^−1 | 1.05425 × 10^1 | 1.1165 × 10^−2
SSA [40] | 5.1207 × 10^−2 | 3.45215 × 10^−1 | 1.2004032 × 10^1 | 1.26763 × 10^−2
GWO [38] | 5.169 × 10^−2 | 3.56737 × 10^−1 | 1.128885 × 10^1 | 1.2666 × 10^−2
GSA [6] | 5.0276 × 10^−2 | 3.23680 × 10^−1 | 1.3525410 × 10^1 | 1.27022 × 10^−2
WSA [51] | 5.168626 × 10^−2 | 3.5665047 × 10^−1 | 1.129291654 × 10^1 | 1.267061 × 10^−2
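The tension/compression spring objective is commonly written as f(d, D, N) = (N + 2)·D·d², i.e., the wire volume of the active coils plus two end coils, with wire diameter d, mean coil diameter D, and number of active coils N. Re-evaluating the DESMAOA design (d ≈ 5.44827 × 10^−2, D ≈ 4.83109 × 10^−1, N ≈ 5.746128) reproduces the tabulated optimum of 1.11083 × 10^−2:

```python
def spring_weight(d, D, n):
    """Tension/compression spring weight: (N + 2) coils of mean diameter D, wire diameter d."""
    return (n + 2.0) * D * d**2

# DESMAOA design from Table 16
print(round(spring_weight(5.44827e-2, 4.83109e-1, 5.746128), 6))  # 0.011108
```

The same check holds for the other rows (e.g., AOA's design gives 0.012124), which is a convenient way to confirm the sign of each exponent in the table.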
Cite as: Zheng, R.; Jia, H.; Abualigah, L.; Liu, Q.; Wang, S. Deep Ensemble of Slime Mold Algorithm and Arithmetic Optimization Algorithm for Global Optimization. Processes 2021, 9, 1774. https://doi.org/10.3390/pr9101774
