Article

Individual Disturbance and Attraction Repulsion Strategy Enhanced Seagull Optimization for Engineering Design

1 College of Information Technology, Jilin Agricultural University, Changchun 130118, China
2 Department of Computer Science and Artificial Intelligence, Wenzhou University, Wenzhou 325035, China
* Authors to whom correspondence should be addressed.
Submission received: 10 November 2021 / Revised: 26 December 2021 / Accepted: 10 January 2022 / Published: 16 January 2022
(This article belongs to the Special Issue Evolutionary Computation 2022)

Abstract
The seagull optimization algorithm (SOA) is a swarm intelligence algorithm proposed in recent years. However, the algorithm has some defects in its search process. To overcome the poor convergence accuracy of the SOA and its tendency to fall into local optima, this paper proposes a new SOA variant based on an individual disturbance (ID) and attraction-repulsion (AR) strategy, called IDARSOA, which employs ID to enhance the ability to jump out of local optima and adopts AR to increase the diversity of the population and make the exploration of the solution space more efficient. The effectiveness of IDARSOA has been verified using representative comprehensive benchmark functions and six practical engineering optimization problems. The experimental results show that the proposed IDARSOA has better convergence accuracy and a stronger optimization ability than the original SOA.

1. Introduction

With the emergence of the concept of swarm intelligence in 1989 [1], many scholars have proposed various swarm intelligence optimization algorithms in recent years, which have proven efficient and stable in solving complex practical problems. Compared with traditional gradient descent algorithms, intelligent algorithms, such as the whale optimization algorithm (WOA) [2], hunger games search (HGS) [3], colony predation algorithm (CPA) [4], slime mould algorithm (SMA) [5], Runge Kutta optimizer (RUN) [6], Harris hawks optimization (HHO) [7], bat algorithm (BA) [8], teaching-learning-based pathfinder algorithm (TLPFA) [9], wind-driven optimization algorithm (WDO) [10], salp swarm algorithm (SSA) [11,12], grey wolf optimizer (GWO) [13], and its variants I-GWO and Ex-GWO [14], usually have stronger optimization capabilities. These algorithms can effectively solve complex optimization problems and have strong flexibility, robustness, and self-organization. Furthermore, these algorithms have been applied in many fields, such as neural network training [15], multi-attribute decision making [16,17,18,19], the traveling salesman problem [20], object tracking [21,22], image segmentation [23,24], feature selection [25,26,27,28,29], engineering design problems [30,31,32,33], scheduling problems [34,35], medical data classification [36,37,38,39], bankruptcy prediction [40,41,42], parameter optimization [43,44,45,46], gate resource allocation [47,48], cloud workflow scheduling [49,50], fault diagnosis of rolling bearings [51,52], power electronic circuit design [53,54], detection of foreign fiber in cotton [55,56], and energy vehicle dispatch [57]. However, common problems remain, such as slow convergence, a tendency to fall into local optima, and poor convergence accuracy [23,58].
In 2019, Dhiman et al. [59] proposed the seagull optimization algorithm (SOA) based on seagull migration and attack behavior. The authors verified the performance of the SOA on 44 well-known benchmark functions and applied the SOA to optical buffer, pressure vessel, speed reducer, welded beam, tension/compression spring, 25-bar truss, and rolling element bearing design problems. The results illustrate the effectiveness and practical value of the SOA. However, like other swarm intelligence algorithms, the SOA also suffers from slow convergence and low solution accuracy. Since the SOA was proposed, Lei et al. [60] introduced the Lévy flight strategy and singer function to alleviate its slow convergence speed and applied the improved SOA to a lowest-cost problem. To alleviate the premature convergence of the SOA, Cao et al. [61] proposed the balanced SOA, which was used to identify the best parameters of a proton exchange membrane fuel cell (PEMFC) stack. In 2021, Dhiman et al. [62] introduced the concept of a dynamic archive to the SOA for multi-objective problems. They then proposed the multi-objective SOA, relying on roulette selection to determine the effective archive solutions, and applied it to six constrained engineering design problems. Because the SOA was proposed only recently, it does not yet have as many variants as other swarm intelligence algorithms, which shows that the SOA has considerable room for improvement. There are likewise few cases where the SOA has been applied to practical problems, leaving many application areas open. Since the SOA requires few parameters and is easy to implement, it offers a large optimization space and promising exploration prospects.
The idea of attraction and repulsion appeared in the attraction and repulsion particle swarm optimization (ARPSO) [63]. By alternating between the attraction and repulsion phases, ARPSO enhances the ability of the particle swarm to jump out of local optima, improves the diversity of the search space, and largely prevents premature convergence. Since ARPSO has a good ability to escape local optima, it has a strong ability to find the global optimum. On this basis, Pant et al. [64] proposed a diversity-guided particle swarm optimizer with three phases: attraction, repulsion, and attraction-repulsion. Mohamed et al. [65] proposed a modified multi-objective imperialist competitive algorithm to address the shortcomings of the single-objective imperialist competitive algorithm on high-dimensional or complex multimodal problems. The algorithm introduced the concept of attraction and repulsion in the assimilation stage and improved performance in finding the global optimum.
Inspired by this previous work, and to address the SOA's poor optimization accuracy and tendency to fall into local optima, this paper proposes an improved SOA variant based on individual disturbance and an attraction-repulsion strategy, called IDARSOA. The original SOA easily falls into local optima when searching for the optimal forward direction. Adding an individual disturbance strategy to the search for the forward direction of the seagull population effectively increases the exploration ability of the algorithm and its ability to jump out of local optima. The attraction-repulsion strategy adopted in this paper makes the seagulls migrate in the optimal direction under the combined effect of attraction by the globally best seagull and repulsion by the globally worst seagull, which enhances the diversity and optimization ability of the population and makes the search of the solution space more comprehensive in the exploitation stage. To evaluate the performance of IDARSOA, this paper uses 10 benchmark functions from IEEE CEC 2019 and 10 functions from IEEE CEC 2020. The comparison experiments include parameter sensitivity analyses, a comparison of the added mechanisms against the original algorithm, a comparison with widely used algorithms, and a comparison with excellent variant algorithms. According to the Wilcoxon signed-rank test and the Friedman test, the performance of IDARSOA is better than that of the original algorithm.
The structure of the paper is as follows. Section 2 gives an overview of the SOA. Section 3 introduces IDARSOA. The experimental results are described and discussed in Section 4. Section 5 applies IDARSOA to engineering problems and analyzes the results. Section 6 concludes the paper and summarizes future work.

2. Overview of SOA

The SOA is a recent meta-heuristic algorithm, first proposed in 2019 [59]. It mainly simulates two critical behaviors in the social life of seagulls: migration and attack. During migration, each seagull occupies a different position to avoid collisions, and the entire population always migrates towards the optimal position, which guides the forward position of each seagull. During the migration process, seagulls attack migratory birds with a spiral motion.

2.1. Population Initialization

Let the size of the population space be $N \times D$, where $N$ represents the population size (the number of solutions) and $D$ represents the dimension. The fitness is expressed as $F = [F_1\ F_2 \cdots F_N]^T$, and the position of the $n$-th seagull is represented as $X_n = [x_{n,1}\ x_{n,2} \cdots x_{n,D}]$, $n = 1, 2, \ldots, N$. The upper bound of the search range is $ub = [ub_1\ ub_2 \cdots ub_D]$ and the lower bound is $lb = [lb_1\ lb_2 \cdots lb_D]$. The initialization, Equation (1), is shown below:
$X_{N \times D} = rand(N, D) \times (ub - lb) + lb$ (1)
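For illustration, a minimal NumPy sketch of this initialization is given below; the function and variable names are ours, not from the authors' implementation.

import numpy as np

def initialize_population(N, D, lb, ub, seed=None):
    # Equation (1): X = rand(N, D) x (ub - lb) + lb
    rng = np.random.default_rng(seed)
    lb = np.asarray(lb, dtype=float)  # per-dimension lower bounds
    ub = np.asarray(ub, dtype=float)  # per-dimension upper bounds
    return rng.random((N, D)) * (ub - lb) + lb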

2.2. Migration Behavior

During the migration process, the seagull moves from its current position to a new position computed by the position update equation, while avoiding collisions with other seagulls. An auxiliary variable A is introduced to calculate the new position of the seagull.
$C_S(t) = A \times X(t)$ (2)
where $C_S(t)$ represents the new position of the seagull, which does not collide with the positions of other seagulls; $X(t)$ denotes the seagull position before updating; $t$ represents the current iteration; and $A$ represents the seagull's motion behavior in the search space. The value range of $A$ is $[0, f_C]$, and its equation is as follows:
$A = f_C - t \times \frac{f_C}{Max_{iteration}}$ (3)
where $Max_{iteration}$ is the maximum number of iterations and $f_C$ is set to 2, so the value of $A$ decreases linearly from 2 to 0.
In the process of migration, seagulls will move towards the optimal position, and the optimal direction expression is:
$M_S = B \times (X_{best}(t) - X(t))$ (4)
where $X_{best}(t)$ is the optimal position of the seagulls at the current iteration, and $B$ is a randomly generated number that balances global and local search. Its equation is as follows:
$B = 2 \times A^2 \times rd$ (5)
where $rd$ is a random number in [0, 1]. The seagull flies in the optimal direction to migrate to a better position. The updated position is expressed as follows:
$D_S(t) = |C_S(t) + M_S(t)|$ (6)

2.3. Attack Behavior

When seagulls are migrating, they rely on their wings and their own weight to maintain the required height, and they constantly change their flight angle and speed according to the position of the prey, thereby launching an attack. When prey is found, the seagull attacks it in a spiral manner in three-dimensional space. The spiral behavior in the $x$, $y$, $z$ coordinates is expressed as follows:
$x = r \times \cos(\theta)$ (7)
$y = r \times \sin(\theta)$ (8)
$z = r \times \theta$ (9)
$r = u \times e^{\theta v}$ (10)
where $r$ represents the radius of the seagull's spiral, $\theta$ is a random angle in the range $[0, 2\pi]$, $u$ and $v$ are constants defining the spiral shape, and $e$ is the base of the natural logarithm. The equation for the position change of the seagull during the attack is as follows:
$X(t) = D_S(t) \times x \times y \times z + X_{best}(t)$ (11)
The pseudo-code of the traditional SOA is given as follows in Algorithm 1.
Algorithm 1. Pseudocode of SOA.
Set the size N, dim, maximum iterations, u, v, fc
Initialize seagulls’ positions X
t = 0
while (t < Maxiteration) do
The default global optimal solution is the position of the first seagull
  for i = 1: size(X,1) do
    update additional variable A using Equation (3)
    Calculate Cs using Equation (2)
    rd takes a random value on (0, 1)
    Calculate Ms using Equation (4)
    Calculate Ds using Equation (6)
    Update r, x, y, z using Equations (7)–(10)
    Calculate new seagull position using Equation (11)
  end for
  for i = 1: size(X,1) do
    for j = 1: size(X,2) do
     Border control
   end for
  end for
  for i = 1: size(X,1) do
    Calculate the fitness value of the new seagull position
  end for
  Sort the fitness value and update the optimal position and fitness value of the seagull
  t ← t + 1
end while
return the best solution
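For concreteness, a compact Python sketch of one SOA position update (Equations (2)-(11)) is shown below. It is a schematic re-implementation based on the equations above, not the authors' released code; u = v = 1 is assumed for the spiral constants.

import numpy as np

def soa_update(X, X_best, t, max_iter, fc=2.0, u=1.0, v=1.0, rng=np.random):
    N, D = X.shape
    A = fc - t * fc / max_iter                      # Equation (3)
    Cs = A * X                                      # Equation (2)
    B = 2 * A**2 * rng.random((N, D))               # Equation (5)
    Ms = B * (X_best - X)                           # Equation (4)
    Ds = np.abs(Cs + Ms)                            # Equation (6)
    theta = rng.random((N, D)) * 2 * np.pi          # random angle in [0, 2*pi]
    r = u * np.exp(theta * v)                       # Equation (10)
    x, y, z = r * np.cos(theta), r * np.sin(theta), r * theta  # Equations (7)-(9)
    return Ds * x * y * z + X_best                  # Equation (11)

After this update, each row of the returned matrix is clipped to [lb, ub] (the border control step in Algorithm 1) and re-evaluated.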

3. Improvement Methods Based on SOA

The improved SOA has two effective strategies. First, an individual disturbance strategy is added to improve the optimization ability of the algorithm. Second, an attraction-repulsion strategy is embedded into the original SOA to increase the chance of the population approaching the optimal solution.

3.1. Individual Disturbance

When searching for the optimal direction, the original algorithm updates the direction according to each seagull's own position and the optimal position only. This can cause the search to fall into local optima, making the seagull population lose its direction during migration and misleading the group away from the optimal migration route. In this paper, when seagulls look for the migration direction, a further, randomly chosen seagull is used in addition to the seagull's own position and the optimal position, and a weight is added to coordinate the seagulls' exploration ability in finding the optimal direction. The updated optimization direction is computed as follows:
$M_S = X(t) - m \times B \times (X_{best}(t) - X_K(t))$ (12)
where $X(t)$ is the position of the seagull at the current iteration, $X_{best}(t)$ is the current optimal position, and $X_K(t)$ is the position of a random seagull. The weight is expressed as follows:
$m = \frac{Max_{iteration} - t}{Max_{iteration}}$ (13)
$m$ is a linear weight, which decreases linearly as the number of iterations increases, balancing global and local search.
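A brief sketch of this disturbed direction update (Equations (12) and (13)) is given below; the sign convention follows our reconstruction of Equation (12), and all names are illustrative.

import numpy as np

def disturbed_direction(X, X_best, B, t, max_iter, rng=np.random):
    N, _ = X.shape
    m = (max_iter - t) / max_iter        # Equation (13): linearly decreasing weight
    K = rng.randint(0, N, size=N)        # one random partner index per seagull
    # Equation (12): perturb the direction with the random individual X_K
    return X - m * B * (X_best - X[K])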

3.2. Attraction-Repulsion Strategy

The migration of the seagull population is usually guided by the globally best individual moving towards the optimal solution. However, if the globally best individual falls into a local optimum and cannot escape, the whole population is likely to stagnate. To solve this problem, this paper adopts an attraction-repulsion strategy. The idea of attraction-repulsion first appeared in the attraction-repulsion particle swarm optimization in 2002 [63]. In this paper, the global best solution and the global worst solution are introduced in Equation (14) through the attraction-repulsion strategy, which allows the seagull population to move under the combined effect of attraction and repulsion while searching for the optimal solution. As the algorithm enters the later iterations, the diversity of the population decreases significantly, and premature convergence eventually occurs. The global worst position introduced here helps increase population diversity, makes the local search more comprehensive, and mitigates premature convergence. The position search equation using the attraction-repulsion strategy is as follows:
$newD_S = r \times D_S + (\omega_1 \times (1 - r)) \times (GBESTX - D_S) - (\omega_2 \times (1 - r)) \times (GWORSTX - D_S)$ (14)
where $GWORSTX$ is the global worst position, $GBESTX$ is the global best position, $D_S$ is the position given by Equation (6) at the current iteration, and $r$ is a random number between 0 and 1. Experiments show that when $\omega_1$ and $\omega_2$ are 0.5 and 0.4, respectively, the seagulls benefit most from the interaction of attraction and repulsion during migration: they move close to the optimal seagull, the diversity of the population is enhanced, the optimization ability is improved, and the risk of falling into local optima is reduced. The pseudo-code of IDARSOA is shown in Algorithm 2.
Algorithm 2. Pseudocode of IDARSOA.
Set the size N, dim, maximum iterations, u, v, fc, ω1, ω2
Initialize seagulls’ positions X
t = 0
while (t < Maxiteration) do
  Calculate and rank the fitness value of the seagull population
  Get the best and worst positions in the population
  for i = 1: size(X,1) do
    Update additional variable A using Equation (3)
    Calculate Cs using Equation (2)
    Update m using Equation (13)
    Randomly generate an integer in (1, D) and assign it to K
    rd takes a random value on (0, 1)
    Calculate Ms using Equation (4)
    Calculate Ds using Equation (6)
    Generate a random number at (0, 1) and assign it to R
    Calculate new Ds according to the attraction and repulsion strategy using Equation (14)
    Update r, x, y, z using Equations (7)–(10)
    Calculate new seagull position using Equation (11)
  end for
  for i = 1: size(X,1) do
    for j = 1: size(X,2) do
     Border control
    end for
  end for
  for i = 1: size(X,1) do
    Calculate the fitness value of the new seagull position
  end for
  Sort the fitness value and update the optimal position and fitness value of the seagull
  t ← t + 1
end while
return the best solution
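The attraction-repulsion update of Equation (14) can be sketched as follows; this is a schematic re-implementation, with the weights w1 = 0.5 and w2 = 0.4 tuned in this paper.

import numpy as np

def attraction_repulsion(Ds, gbest, gworst, w1=0.5, w2=0.4, rng=np.random):
    # Equation (14): attract Ds towards the global best, repel it from the global worst
    r = rng.random(Ds.shape)
    return (r * Ds
            + w1 * (1 - r) * (gbest - Ds)    # attraction by the best individual
            - w2 * (1 - r) * (gworst - Ds))  # repulsion by the worst individual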
To better understand the idea and algorithm flow of this optimization algorithm, the flow chart of IDARSOA is shown in Figure 1.
The time complexity of IDARSOA depends on the number of iterations (S), the total number of seagulls (n), and the dimension of the problem at hand (d). The overall time complexity is O(IDARSOA) = O(initialization) + O(evaluating the initial fitness) + O(selecting the best fitness) + S × (O(calculating CS) + O(individual disturbance strategy) + O(attraction-repulsion strategy) + O(attack) + O(boundary control) + O(updating the seagull positions)). Initialization costs O(n × d); evaluating the initial fitness costs O(n); finding the best fitness by sorting costs O(n × log₂ n); calculating CS costs O(n); the individual disturbance strategy costs O(n); the attraction-repulsion strategy costs O(n); the attack behavior and boundary control of the local search cost O(n × d); and updating the seagull positions costs O(n). Therefore, the final time complexity of IDARSOA is as follows:
$O(\mathrm{IDARSOA}) = O(n \times d) + O(n) + O(n \times \log_2 n) + S \times (O(n) + O(n) + O(n) + O(n \times d) + O(n)) = O(n \times d) + O(n) + O(n \times \log_2 n) + S \times (4O(n) + O(n \times d))$

4. Experimental Results and Discussion

In this part, 20 well-known functions are used to test the efficiency of the proposed optimizer and verify the performance of IDARSOA. There are four experiments. The first is a sensitivity analysis of the parameters in IDARSOA. The second compares IDARSOA with IDSOA, ARSOA, and the original SOA, showing that the SOA variant improves on the original algorithm and that the improvement strategies are effective. The third compares IDARSOA with popular swarm intelligence optimization algorithms to verify that IDARSOA is superior to them. The last compares IDARSOA with other algorithm variants. To ensure the fairness of the experiments, all methods are tested under the same conditions [22]. All experiments in this paper use MATLAB 2018; the dimension is set to 30, the number of independent runs is 30, and the number of search agents is set to 30. The 20 functions are described in Table A1: F1-F10 are taken from CEC 2019 [66,67,68,69,70,71], and F11-F20 are taken from CEC 2020. The bound is the search range of the test function, and F(min) is its minimum value.

4.1. IDARSOA’s Parameters Sensitivity Analyses

A in Equation (3) represents the motion behavior of seagulls in the search space, which is mainly controlled by the parameter fc. To explore the influence of fc on the performance of the seagull optimization algorithm, we set fc to 1, 2, 3, 5, 7, and 9, denoted IDARSOAfc1, IDARSOAfc2, IDARSOAfc3, IDARSOAfc5, IDARSOAfc7, and IDARSOAfc9, respectively. Table 1 shows how these settings find the optimal solution on the 20 test functions. On F4, F19, and F20, IDARSOA finds the same optimal solution regardless of the parameter value. On F1, the average optimal solution found is the same for all settings, but a comparison of the STD shows that IDARSOAfc1 is the most stable. On the other functions, different values of fc lead to different optimization performance. Over all 20 test functions, IDARSOAfc2 performs best. Therefore, this paper sets fc in IDARSOA to 2.
To find the best combination of the attraction weight ω1 of the best individual and the repulsion weight ω2 of the worst individual in the attraction-repulsion strategy, and considering that attraction and repulsion are a pair of interacting forces, this section fixes one weight at 0.5 and varies the other between 0.1 and 0.9 to obtain the most suitable weight. As shown in Table 2, combining the different weights yields 17 configurations; the specific ω1 and ω2 values are listed in the table.
The comparison of the different weight values on the 20 test functions is displayed in Table 3, where different weight combinations behave differently across functions. Mean level in the table indicates the average ranking of the algorithm over the 20 functions, and rank is the final ranking obtained from the mean level. The data show that too much or too little attraction, and too much or too little repulsion, hurt the search capability. When the attraction weight is too large, it suppresses the effect of repulsion; if the globally best individual falls into a local optimum, the weight given to the repulsion is then not enough to escape the local solution space, while giving the repulsion a much larger weight instead can cause the current individual to cross the search boundary, so the solution found is not valid. When the attraction is too small, the current individual still approaches the optimal solution, but if the repulsion weight is also small, the effect of the attraction-repulsion strategy is weakened, whereas a repulsion weight that is too large drives the individual away from the optimal solution. IDARSOA04 achieves the best average ranking among all combinations and is ranked first. This shows that the attraction-repulsion strategy performs best when the attraction weight is 0.5 and the repulsion weight is 0.4.

4.2. Study of the Proposed Method

This section examines the effects of the two optimization mechanisms added to the SOA: individual disturbance and the attraction-repulsion strategy. Four different SOA configurations were compared to examine the impact of every combination of the mechanisms. As shown in Table 4, "ID" and "AR" denote "individual disturbance" and "attraction-repulsion strategy", respectively. In Table 4, "1" indicates that the SOA uses the mechanism, and "0" indicates that it does not. For example, IDSOA denotes the SOA combined with "individual disturbance" but without the "attraction-repulsion strategy". The combinations of the two strategies are shown in Table 4.
The four SOA configurations were tested on the 20 functions in the test function table, with the results shown in Table 5. This paper uses the non-parametric Wilcoxon signed-rank test at the 5% significance level to assess the differences between IDARSOA and the other three algorithms. The "+", "−", and "=" in the table indicate superior to, inferior to, and equal to IDARSOA, respectively. According to the average ranking (ARV) in Table 5, IDARSOA outperforms the other three algorithms with a score of 1.4, showing that IDARSOA performs best on the 20 test functions. In addition, IDSOA and ARSOA have better average rankings than the SOA. This is because the individual disturbance strategy perturbs each search for the optimal direction with a different random agent position, enhancing the ability of the algorithm to jump out of local optima, while the attraction-repulsion strategy makes the search of the solution space more comprehensive through the interaction of attraction towards the best solution and repulsion from the worst solution.
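As an aside, such a pairwise comparison can be reproduced with standard tools. A hedged sketch using SciPy is shown below; the two arrays are placeholders standing in for the 30 recorded best-fitness values per algorithm, not the paper's data.

import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
idarsoa_runs = rng.normal(1.0, 0.1, size=30)  # placeholder results of IDARSOA
soa_runs = rng.normal(1.2, 0.2, size=30)      # placeholder results of SOA

stat, p = wilcoxon(idarsoa_runs, soa_runs)    # paired, non-parametric test
print(f"p = {p:.4f}; significant at the 5% level: {p < 0.05}")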
Figure 2 compares IDARSOA and the two single-strategy variants of the SOA with the original SOA. It can be seen from Figure 2 that on F3, F6, F8, and F12, IDARSOA converges more slowly than the SOA, but the best solution it finds on these functions is closer to the theoretical optimum of each function. It performs better in terms of optimization accuracy, indicating the strong exploration performance of IDARSOA. Overall, IDARSOA optimizes better than IDSOA, ARSOA, and SOA, which shows that adding "individual disturbance" and the "attraction-repulsion strategy" is very helpful to the search and improves SOA performance. IDARSOA handles these different types of functions best.
To explore how the performance of IDARSOA changes as the problem size increases, and to ensure the reliability of the experiments, this section varies one factor at a time. Under the same operating environment, we set dim to 50 and 100, the number of evaluations to 300,000, and the number of trials to 30. Because the CEC 2019 test functions have fixed dimensions, this part uses the CEC 2020 test functions for validation. The Wilcoxon signed-rank test data for the SOA with the different mechanisms in different dimensions are shown in Table 6. When dim is 50, IDARSOA performs better than the SOA on seven test functions, while on the other three test functions IDARSOA and SOA obtain the same optimal solution. The SOA variants with the ID and AR strategies both outperform the SOA in average ranking. When dim is set to 100, IDARSOA still ranks first among these algorithms with an ARV of 1.3, and the optimal values it obtains on seven functions are better than those of the SOA. Combined with Table 5 above, the increase in problem size does not diminish the performance improvement that the ID and AR strategies bring to the SOA, as IDARSOA sufficiently demonstrates.
To explore the impact of the two mechanisms on SOA performance in high dimensions, this section uses box plots to show the data distributions of the different algorithms. As shown in Figure 3, when dim = 50, the median of IDARSOA on F11 is smaller than those of the other three algorithms, and the range between the upper and lower edges is also very small, indicating that the optimal values found by IDARSOA are stable. On F14, the data distributions show that all four algorithms find the theoretical optimum. When dim = 100, the ranges between the upper and lower edges and between the upper and lower quartiles of IDARSOA on F15 and F17 are smaller than those of the other algorithms, demonstrating stable search performance. As a whole, the original SOA is not very stable in finding the optimal solution, and the optima it finds are rather scattered, whereas IDARSOA, IDSOA, and ARSOA perform more stably.
To explore the impact of the two strategies adopted in this paper on the SOA, this section analyzes the balance and diversity of IDARSOA and SOA. As shown in Figure 4, F1, F2, F14, and F18 are selected from the 20 test functions for discussion. The first column in Figure 4 shows the balance diagram of IDARSOA, the second column shows the balance diagram of SOA, and the third column shows the diversity analysis. The balance diagrams contain three curves: exploration, exploitation, and their incremental-decremental trend. The figure shows that the exploration ability of the original SOA is weak and that exploitation accounts for a large proportion of the whole search process. Owing to its early entry into the exploitation stage and its long local exploitation phase, the SOA has weak global search ability and cannot obtain a good optimal solution. The balance analysis of IDARSOA shows that its global search ability is significantly improved.
Comparing the population diversity of IDARSOA and SOA shows that the two mechanisms used in this paper significantly increase population diversity. Furthermore, the diversity of IDARSOA oscillates much more than that of the SOA, which indicates that IDARSOA explores more candidate solutions in the solution space, effectively reducing stagnation. This is because the perturbation by random individuals when seagulls search for the optimal direction increases population diversity, and during local search the attraction-repulsion strategy makes the search of the space more comprehensive. At the same time, however, the population diversity of IDARSOA declines very slowly and the particles remain scattered, which affects the convergence speed of IDARSOA. This phenomenon arises because we introduce other individuals for perturbation when finding the optimal migration direction of the population: although the perturbation reduces the risk of falling into local optima, it slows the decline in diversity, so the population diversity does not decrease rapidly as the number of iterations increases.

4.3. Comparative Study with Swarm Intelligence Algorithm

This part selects five popular metaheuristic algorithms, the sine cosine algorithm (SCA) [72], firefly algorithm (FA) [73], whale optimization algorithm (WOA), bat algorithm (BA) [74], and moth-flame optimization (MFO) [75], to compare with IDARSOA on the 20 functions. The main parameter settings of these algorithms are shown in Table 7. Since the previous part already showed that the variant IDARSOA performs better than the original SOA, the following comparative experiments do not include the SOA.
To demonstrate the optimization performance of IDARSOA, Table 8 shows the average value and standard deviation of the six algorithms, including IDARSOA, on F1 to F20. On most functions, the standard deviation of IDARSOA is small overall, reflecting its stability and superiority. In this comparison, IDARSOA ranks first among the six algorithms with ARV = 2.55, which shows its superiority.
To show the convergence curves of IDARSOA and the other five algorithms under the same experimental conditions more clearly, 9 of the 20 functions are selected: F1, F2, F4, F8, F9, F13, F16, F19, and F20. Figure 5 shows that on F1 and F2, IDARSOA converges rapidly and is closer to the optimal value in accuracy than the other five algorithms, reflecting the advantages of IDARSOA in exploration. On F4, F9, and F13, although IDARSOA is initially not as good as MFO at finding the optimal solution, it still finds a good optimum through continuous exploration. On F19 and F20, IDARSOA matches the other algorithms in convergence speed but is better at finding the optimal value. Overall, IDARSOA shows clear advantages in finding the optimal values of the functions.

4.4. Comparative Study with Variants of Novel Intelligent Algorithms

In order to verify the effectiveness of IDARSOA, this paper selects CBA [76], FSTPSO [77], CDLOBA [78], PPPSO [79], CESCA [80], CMFO [81], SCAPSO [82], CCMWOA [83], and BSSFOA [84] to compare with IDARSOA. The specific parameter settings in these algorithms are shown in Table 9 below.
Table 10 shows the average value and standard deviation of the optimal solutions obtained by IDARSOA and the advanced variant algorithms on the 20 test functions. Among these 10 algorithms, IDARSOA ranks first with an ARV of 3.05. Compared with the well-performing PSO variants, it is stronger than FSTPSO on 15 functions, PPPSO on 12 functions, and SCAPSO on 7 functions. CCMWOA, a typical WOA variant, ranks third among the 10 algorithms but is stronger than IDARSOA on only four test functions. On the three functions F14, F19, and F20, IDARSOA, BSSFOA, SCAPSO, and CCMWOA achieve the same optimal value, showing that IDARSOA, like these three advanced algorithms, can effectively find the optimum.
To understand the convergence of IDARSOA relative to the advanced algorithms clearly and intuitively, Figure 6 shows the convergence plots for nine test functions: F2, F4, F6, F8, F13, F16, F18, F19, and F20. On F4, F6, and F8, the advantages of IDARSOA's optimization ability are clearly displayed. IDARSOA gradually enters the convergence state only late in the iterations: because of the added individual disturbance strategy, the search is influenced by random individuals, which reduces the risk of falling into local optima and enhances exploration, but also makes IDARSOA converge more slowly than the other algorithms. On F19 and F20, as the data in the table above show, IDARSOA, BSSFOA, SCAPSO, and CCMWOA obtain the same optimal values, so their curves overlap in the figure. Owing to the great potential of the proposed method, it can also be extended in the future to tackle other practical problems, such as medical diagnosis [85,86,87,88], microgrid planning [89], engineering optimization problems [31,33], energy storage planning and scheduling [90], active surveillance [91], kayak cycle phase segmentation [92], location-based services [93,94], image dehazing [95], information retrieval services [96,97,98], human motion capture [99], and video deblurring [100].

5. Engineering Design Issues

In this section, the performance of IDARSOA is verified on six well-known engineering design optimization problems: the tension/compression spring, pressure vessel, I-beam, speed reducer, welded beam, and three-bar truss design problems. It is worth noting that the optimal solution must satisfy many constraints that should not be violated [62].

5.1. Tension/Compression Spring Design Problem

This problem aims to design a tension/compression spring with the smallest weight while satisfying the constraints. In this model, the design parameters are wire diameter (d), average coil diameter (D), and effective coil number (N). The specific model is as follows:
Consider
$\vec{x} = [x_1\ x_2\ x_3] = [d\ D\ N]$
Minimize
$f(x) = x_1^2 x_2 (x_3 + 2)$
Subject to
$g_1(x) = 1 - \frac{x_2^3 x_3}{71785 x_1^4} \le 0$
$g_2(x) = \frac{4x_2^2 - x_1 x_2}{12566(x_2 x_1^3 - x_1^4)} + \frac{1}{5108 x_1^2} - 1 \le 0$
$g_3(x) = 1 - \frac{140.45 x_1}{x_2^2 x_3} \le 0$
$g_4(x) = \frac{x_1 + x_2}{1.5} - 1 \le 0$
Variable range:
$0.05 \le x_1 \le 2,\ 0.25 \le x_2 \le 1.3,\ 2 \le x_3 \le 15$
IDARSOA and 10 other algorithms were applied to the tension/compression spring design problem, with the results shown in Table 11. IDARSOA and DE obtain the lowest cost of 0.012670, which shows the enhancement the proposed IDARSOA brings in practical engineering applications.
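To make the problem statement concrete, a penalty-based evaluation of this spring design problem might be written as below; the static-penalty form and all names are our illustrative choices, not necessarily what was used in the experiments.

def spring_cost(x, penalty=1e6):
    d, D, N = x                                   # wire diameter, coil diameter, coil count
    f = (N + 2) * D * d**2                        # objective: spring weight
    g = [
        1 - (D**3 * N) / (71785 * d**4),
        (4*D**2 - d*D) / (12566 * (D*d**3 - d**4)) + 1 / (5108 * d**2) - 1,
        1 - 140.45 * d / (D**2 * N),
        (d + D) / 1.5 - 1,
    ]
    # static penalty: add a large cost for every violated constraint g_i(x) > 0
    return f + penalty * sum(max(0.0, gi)**2 for gi in g)

Any of the optimizers discussed above can then minimize spring_cost directly over the box bounds given in the model.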

5.2. Pressure Vessel Design Problem

For the design of a cylindrical pressure vessel, the main difficulty is to reduce the manufacturing cost while handling the four design parameters of the vessel, namely the thickness of the head (Th), the inner radius (R), the thickness of the shell (Ts), and the length of the cylindrical section excluding the head (L). The model can be described as:
Consider
$\vec{x} = [x_1\ x_2\ x_3\ x_4] = [T_s\ T_h\ R\ L]$
Minimize
$f(x) = 0.6224 x_1 x_3 x_4 + 1.7781 x_2 x_3^2 + 3.1661 x_1^2 x_4 + 19.84 x_1^2 x_3$
Subject to
$g_1(x) = -x_1 + 0.0193 x_3 \le 0$
$g_2(x) = -x_2 + 0.00954 x_3 \le 0$
$g_3(x) = -\pi x_3^2 x_4 - \frac{4}{3}\pi x_3^3 + 1{,}296{,}000 \le 0$
$g_4(x) = x_4 - 240 \le 0$
Variable ranges:
$0 \le x_1 \le 99,\ 0 \le x_2 \le 99,\ 10 \le x_3 \le 200,\ 10 \le x_4 \le 200$
Applying IDARSOA and several other algorithms to this engineering problem gives the results shown in Table 12. The data show that IDARSOA ranks second among these algorithms with a cost of 6072.4301, indicating that IDARSOA works well on the pressure vessel design problem.

5.3. I-Beam Design Problem

The goal of the I-beam structural design problem is to minimize the vertical deflection. The problem involves four structural parameters: two thicknesses, one length, and one height. The specific problem model is as follows:
Consider:
$\vec{x} = [x_1\ x_2\ x_3\ x_4] = [b\ h\ t_w\ t_f]$
The value ranges of the four parameters:
$10 \le x_1 \le 50,\ 10 \le x_2 \le 80,\ 0.9 \le x_3 \le 5,\ 0.9 \le x_4 \le 5$
Minimize:
$f(x) = \frac{5000}{\frac{t_w (h - 2t_f)^3}{12} + \frac{b t_f^3}{6} + 2 b t_f \left(\frac{h - t_f}{2}\right)^2}$
Subject to:
$g_1(x) = 2 b t_f + t_w (h - 2 t_f) \le 300$
$g_2(x) = \frac{18 h \times 10^4}{t_w (h - 2 t_f)^3 + 2 b t_f (4 t_f^2 + 3 h (h - 2 t_f))} + \frac{15 b \times 10^3}{(h - 2 t_f) t_w^3 + 2 t_f b^3} - 6 \le 0$
The results of IDARSOA and six other algorithms on the I-beam design problem are shown in Table 13. The data in the table show that IDARSOA and SOS both solve this problem effectively.

5.4. Speed Reducer Design Problem

The goal of this problem is to minimize the weight of the speed reducer while keeping each parameter of the engineering design model within its valid range. The parameters involved are: x1 is the face width (b), x2 is the module of the teeth (m), x3 is the number of gear teeth (z), x4 is the length of the first shaft between bearings (l1), x5 is the length of the second shaft between bearings (l2), x6 is the diameter of the first shaft (d1), and x7 is the diameter of the second shaft (d2). The specific mathematical model is shown below.
Consider
$\vec{z} = [x_1\ x_2\ x_3\ x_4\ x_5\ x_6\ x_7] = [b\ m\ z\ l_1\ l_2\ d_1\ d_2]$
Minimize
$f(x) = 0.7854 x_1 x_2^2 (3.3333 x_3^2 + 14.9334 x_3 - 43.0934) - 1.508 x_1 (x_6^2 + x_7^2) + 7.4777 (x_6^3 + x_7^3) + 0.7854 (x_4 x_6^2 + x_5 x_7^2)$
Subject to:
$g_1(x) = \frac{27}{x_1 x_2^2 x_3} - 1 \le 0$
$g_2(x) = \frac{397.5}{x_1 x_2^2 x_3^2} - 1 \le 0$
$g_3(x) = \frac{1.93 x_4^3}{x_2 x_6^4 x_3} - 1 \le 0$
$g_4(x) = \frac{1.93 x_5^3}{x_2 x_7^4 x_3} - 1 \le 0$
$g_5(x) = \frac{[(745 x_4 / (x_2 x_3))^2 + 16.9 \times 10^6]^{1/2}}{110 x_6^3} - 1 \le 0$
$g_6(x) = \frac{[(745 x_5 / (x_2 x_3))^2 + 157.5 \times 10^6]^{1/2}}{85 x_7^3} - 1 \le 0$
$g_7(x) = \frac{x_2 x_3}{40} - 1 \le 0$
$g_8(x) = \frac{5 x_2}{x_1} - 1 \le 0$
$g_9(x) = \frac{x_1}{12 x_2} - 1 \le 0$
$g_{10}(x) = \frac{1.5 x_6 + 1.9}{x_4} - 1 \le 0$
$g_{11}(x) = \frac{1.1 x_7 + 1.9}{x_5} - 1 \le 0$
where
$2.6 \le x_1 \le 3.6,\ 0.7 \le x_2 \le 0.8,\ 17 \le x_3 \le 28,\ 7.3 \le x_4 \le 8.3,\ 7.3 \le x_5 \le 8.3,\ 2.9 \le x_6 \le 3.9,\ 5.0 \le x_7 \le 5.5$
As the data in Table 14 show, IDARSOA performs well on this problem, proving its advantage in solving constrained problems. The advantage is outstanding compared to hHHO-SCA, SCA, and GSA.

5.5. Welded Beam Design Problem

The objective of this engineering problem is to reduce the manufacturing cost of a welded beam, where the variables involved are the weld thickness (h), the length of the welded joint (l), the width of the beam (t), and the thickness of the beam (b). The detailed model is shown below.
Consider
$\vec{x} = [x_1\ x_2\ x_3\ x_4] = [h\ l\ t\ b]$
Minimize
$f(x) = 1.10471 x_1^2 x_2 + 0.04811 x_3 x_4 (14.0 + x_2)$
Subject to
$g_1(x) = \tau(x) - \tau_{max} \le 0$
$g_2(x) = \sigma(x) - \sigma_{max} \le 0$
$g_3(x) = \delta(x) - \delta_{max} \le 0$
$g_4(x) = x_1 - x_4 \le 0$
$g_5(x) = P - P_c(x) \le 0$
$g_6(x) = 0.125 - x_1 \le 0$
$g_7(x) = 1.10471 x_1^2 x_2 + 0.04811 x_3 x_4 (14.0 + x_2) - 5.0 \le 0$
Variable range: $0.1 \le x_1 \le 2,\ 0.1 \le x_2 \le 10,\ 0.1 \le x_3 \le 10,\ 0.1 \le x_4 \le 2$, where
$\tau(x) = \sqrt{(\tau')^2 + 2\tau'\tau''\frac{x_2}{2R} + (\tau'')^2},\ \tau' = \frac{P}{\sqrt{2} x_1 x_2},\ \tau'' = \frac{MR}{J},\ M = P\left(L + \frac{x_2}{2}\right)$
$R = \sqrt{\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2}$
$J = 2\left\{\sqrt{2} x_1 x_2 \left[\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2\right]\right\}$
$\sigma(x) = \frac{6PL}{x_4 x_3^2},\ \delta(x) = \frac{6PL^3}{E x_3^2 x_4}$
$P_c(x) = \frac{4.013 E \sqrt{x_3^2 x_4^6 / 36}}{L^2}\left(1 - \frac{x_3}{2L}\sqrt{\frac{E}{4G}}\right)$
$P = 6000\ \mathrm{lb},\ L = 14\ \mathrm{in.},\ \delta_{max} = 0.25\ \mathrm{in.}$
$E = 30 \times 10^6\ \mathrm{psi},\ G = 12 \times 10^6\ \mathrm{psi}$
$\tau_{max} = 13{,}600\ \mathrm{psi},\ \sigma_{max} = 30{,}000\ \mathrm{psi}$
On this problem, IDARSOA performs somewhat worse than the EO and RO methods when solving the same problem. However, it has advantages over HS, FSA, SCA, and SBM (see Table 15).

5.6. Three-Bar Truss Design Problem

The three-bar truss design problem is a typical constrained engineering problem that requires minimizing the weight subject to constraints on the two design variables x1 and x2. The specific mathematical model is as follows.
Objective function:
$f(x) = (2\sqrt{2} x_1 + x_2) \times l$
Subject to:
$g_1(x) = \frac{\sqrt{2} x_1 + x_2}{\sqrt{2} x_1^2 + 2 x_1 x_2} P - \sigma \le 0$
$g_2(x) = \frac{x_2}{\sqrt{2} x_1^2 + 2 x_1 x_2} P - \sigma \le 0$
$g_3(x) = \frac{1}{\sqrt{2} x_2 + x_1} P - \sigma \le 0$
where
$0 \le x_i \le 1,\ i = 1, 2;\ l = 100\ \mathrm{cm},\ P = 2\ \mathrm{kN/cm^2},\ \sigma = 2\ \mathrm{kN/cm^2}$
IDARSOA is compared with four metaheuristic algorithms, and the minimum weight it obtains for this problem is 263.8960. As shown in Table 16, IDARSOA has an advantage and handles the three-bar truss design problem well.

6. Conclusions and Future Works

The IDARSOA proposed in this paper is designed to overcome the limited search ability of the original SOA. When the seagulls look for the optimal migration direction, an individual disturbance mechanism is added to enhance the ability to jump out of local optima through the disturbance of different individual seagull positions. At the same time, the attraction-repulsion strategy is introduced to guide the seagulls towards the position of the optimal solution. The combination of these two mechanisms improves the optimization accuracy of the algorithm, makes up for the limited search ability of the original algorithm, enhances the diversity of the population, and makes the exploration of the solution space more comprehensive. Results on 20 representative benchmark functions show that the performance of the proposed optimizer is significantly improved compared with the original SOA and that it can effectively solve function optimization problems. Applied to six engineering examples, IDARSOA produces sound results and solves the practical engineering problems well, showing that it can improve the accuracy of the calculated results and has practical value.
Although our proposed method effectively improves the optimization performance of the SOA, IDARSOA takes more time when dealing with complex and large-scale problems. Therefore, we will consider combining IDARSOA with distributed platforms, such as Hadoop, to improve its parallel performance and speed up the solution of real industrial problems. In addition, many problems remain worthy of further study. On the one hand, IDARSOA suffers from slow convergence. In the next stage of research, we will consider balancing the relationship between population diversity and the number of iterations by adding complementary strategies to speed up the convergence of IDARSOA while preserving a diverse population. At the same time, under the core idea of the SOA, how to enrich the algorithm model and improve its performance so that the improved SOA matches the superior performance of algorithms such as SASS [122], COLSHADE [123], and CMA-ES [124] is also a key topic of our subsequent work. On the other hand, our goal is to integrate the optimized SOA into real-life problems and make full use of its advantages. Owing to the good performance of IDARSOA on the benchmark functions, we plan to combine IDARSOA with machine learning to solve more complex real-world problems and then apply IDARSOA to other scenarios, such as image enhancement optimization, image segmentation and classification, and handling dynamic landscapes. Moreover, learning techniques can be used to further boost the proposed method [5,125,126], and the proposed method can also be extended to multi/many-objective optimization [127,128,129,130,131].

Author Contributions

Funding acquisition, H.C.; Investigation, C.B.; Methodology, A.A.H.; Writing—original draft, H.Y.; Writing—review & editing, S.Q. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Science and Technology Development Program of Jilin Province (20190301024NY).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Description of the benchmark functions.

No.  Function  Dim  F(min)
CEC 2019 benchmark functions
F1  Storn's Chebyshev Polynomial Fitting Problem  9  1
F2  Inverse Hilbert Matrix Problem  16  1
F3  Lennard-Jones Minimum Energy Cluster  18  1
F4  Rastrigin's Function  10  1
F5  Griewangk's Function  10  1
F6  Weierstrass Function  10  1
F7  Modified Schwefel's Function  10  1
F8  Expanded Schaffer's F6 Function  10  1
F9  Happy Cat Function  10  1
F10  Ackley Function  10  1
CEC 2020 benchmark functions
F11  Shifted and Rotated Bent Cigar Function (CEC 2017 F1)  30  100
F12  Shifted and Rotated Schwefel's Function (CEC 2014 F11)  30  1100
F13  Shifted and Rotated Lunacek bi-Rastrigin Function (CEC 2017 F7)  30  700
F14  Expanded Rosenbrock's plus Griewangk's Function (CEC 2017 F19)  30  1900
F15  Hybrid Function 1 (n = 3) (CEC 2014 F17)  30  1700
F16  Hybrid Function 2 (n = 4) (CEC 2017 F16)  30  1600
F17  Hybrid Function 3 (n = 5) (CEC 2014 F21)  30  2100
F18  Composition Function 1 (n = 3) (CEC 2017 F22)  30  2200
F19  Composition Function 2 (n = 4) (CEC 2017 F24)  30  2400
F20  Composition Function 3 (n = 5) (CEC 2017 F25)  30  2500

References

  1. Beni, G.; Wang, J. Swarm intelligence in cellular robotic systems. In Robots and Biological Systems: Towards a New Bionics? Springer: Cham, Switzerland, 1993; pp. 703–712. [Google Scholar]
  2. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  3. Yang, Y.; Chen, H.; Heidari, A.A.; Gandomi, A.H. Hunger games search: Visions, conception, implementation, deep analysis, perspectives, and towards performance shifts. Expert Syst. Appl. 2021, 177, 114864. [Google Scholar] [CrossRef]
  4. Tu, J.; Chen, H.; Wang, M.; Gandomi, A.H. The Colony Predation Algorithm. J. Bionic Eng. 2021, 18, 674–710. [Google Scholar] [CrossRef]
  5. Li, S.; Chen, H.; Wang, M.; Heidari, A.A.; Mirjalili, S. Slime mould algorithm: A new method for stochastic optimization. Future Gener. Comput. Syst. 2020, 111, 300–323. [Google Scholar] [CrossRef]
  6. Ahmadianfar, I.; Asghar Heidari, A.; Gandomi, A.H.; Chu, X.; Chen, H. RUN Beyond the Metaphor: An Efficient Optimization Algorithm Based on Runge Kutta Method. Expert Syst. Appl. 2021, 181, 115079. [Google Scholar] [CrossRef]
  7. Piri, J.; Mohapatra, P. An analytical study of modified multi-objective Harris Hawk Optimizer towards medical data feature selection. Comput. Biol. Med. 2021, 135, 104558. [Google Scholar] [CrossRef] [PubMed]
  8. Chen, M.-R.; Huang, Y.-Y.; Zeng, G.-Q.; Lu, K.-D.; Yang, L.-Q. An improved bat algorithm hybridized with extremal optimization and Boltzmann selection. Expert Syst. Appl. 2021, 175, 114812. [Google Scholar] [CrossRef]
  9. Tang, C.; Zhou, Y.; Tang, Z.; Luo, Q. Teaching-learning-based pathfinder algorithm for function and engineering optimization problems. Appl. Intell. 2021, 51, 5040–5066. [Google Scholar] [CrossRef]
  10. Zhong, L.; Zhou, Y.; Luo, Q.; Zhong, K. Wind driven dragonfly algorithm for global optimization. Concurr. Comput. Pract. Exp. 2021, 33, e6054. [Google Scholar] [CrossRef]
  11. Salgotra, R.; Singh, U.; Singh, S.; Singh, G.; Mittal, N. Self-adaptive salp swarm algorithm for engineering optimization problems. Appl. Math. Model. 2021, 89, 188–207. [Google Scholar] [CrossRef]
  12. Mirjalili, S.; Gandomi, A.H.; Mirjalili, S.Z.; Saremi, S.; Faris, H.; Mirjalili, S.M. Salp Swarm Algorithm: A bio-inspired optimizer for engineering design problems. Adv. Eng. Softw. 2017, 114, 163–191. [Google Scholar] [CrossRef]
  13. Kapoor, S.; Zeya, I.; Singhal, C.; Nanda, S.J. A Grey Wolf Optimizer Based Automatic Clustering Algorithm for Satellite Image Segmentation. Procedia Comput. Sci. 2017, 115, 415–422. [Google Scholar] [CrossRef]
  14. Seyyedabbasi, A.; Kiani, F. I-GWO and Ex-GWO: Improved algorithms of the Grey Wolf Optimizer to solve global optimization problems. Eng. Comput. 2021, 37, 509–532. [Google Scholar] [CrossRef]
  15. Yang, Z.; Li, K.; Guo, Y.; Ma, H.; Zheng, M. Compact real-valued teaching-learning based optimization with the applications to neural network training. Knowl.-Based Syst. 2018, 159, 51–62. [Google Scholar] [CrossRef]
  16. Fan, C.; Hu, K.; Feng, S.; Ye, J.; Fan, E. Heronian mean operators of linguistic neutrosophic multisets and their multiple attribute decision-making methods. Int. J. Distrib. Sens. Netw. 2019, 15, 1550147719843059. [Google Scholar] [CrossRef]
  17. Cui, W.-H.; Ye, J. Logarithmic similarity measure of dynamic neutrosophic cubic sets and its application in medical diagnosis. Comput. Ind. 2019, 111, 198–206. [Google Scholar] [CrossRef]
  18. Fan, C.; Fan, E.; Hu, K. New form of single valued neutrosophic uncertain linguistic variables aggregation operators for decision-making. Cogn. Syst. Res. 2018, 52, 1045–1055. [Google Scholar] [CrossRef] [Green Version]
  19. Ye, J.; Cui, W. Modeling and stability analysis methods of neutrosophic transfer functions. Soft Comput. 2020, 24, 9039–9048. [Google Scholar] [CrossRef]
  20. Lai, X.; Zhou, Y. Analysis of multiobjective evolutionary algorithms on the biobjective traveling salesman problem (1, 2). Multimed. Tools Appl. 2020, 79, 30839–30860. [Google Scholar] [CrossRef]
  21. Hu, K.; Ye, J.; Fan, E.; Shen, S.; Huang, L.; Pi, J. A novel object tracking algorithm by fusing color and depth information based on single valued neutrosophic cross-entropy. J. Intell. Fuzzy Syst. 2017, 32, 1775–1786. [Google Scholar] [CrossRef] [Green Version]
  22. Hu, K.; He, W.; Ye, J.; Zhao, L.; Peng, H.; Pi, J. Online Visual Tracking of Weighted Multiple Instance Learning via Neutrosophic Similarity-Based Objectness Estimation. Symmetry 2019, 11, 832. [Google Scholar] [CrossRef] [Green Version]
  23. Zhao, D.; Liu, L.; Yu, F.; Heidari, A.A.; Wang, M.; Liang, G.; Muhammad, K.; Chen, H. Chaotic random spare ant colony optimization for multi-threshold image segmentation of 2D Kapur entropy. Knowl.-Based Syst. 2020, 216, 106510. [Google Scholar] [CrossRef]
  24. Zhao, D.; Liu, L.; Yu, F.; Asghar Heidari, A.; Wang, M.; Oliva, D.; Muhammad, K.; Chen, H. Ant Colony Optimization with Horizontal and Vertical Crossover Search: Fundamental Visions for Multi-threshold Image Segmentation. Expert Syst. Appl. 2020, 167, 114122. [Google Scholar] [CrossRef]
  25. Zhang, Y.; Liu, R.; Wang, X.; Chen, H.; Li, C. Boosted binary Harris hawks optimizer and feature selection. Eng. Comput. 2020, 37, 3741–3770. [Google Scholar] [CrossRef]
  26. Hu, J.; Chen, H.; Heidari, A.A.; Wang, M.; Zhang, X.; Chen, Y.; Pan, Z. Orthogonal learning covariance matrix for defects of grey wolf optimizer: Insights, balance, diversity, and feature selection. Knowl.-Based Syst. 2021, 213, 106684. [Google Scholar] [CrossRef]
  27. Zhang, X.; Xu, Y.; Yu, C.; Heidari, A.A.; Li, S.; Chen, H.; Li, C. Gaussian mutational chaotic fruit fly-built optimization and feature selection. Expert Syst. Appl. 2020, 141, 112976. [Google Scholar] [CrossRef]
28. Li, Q.; Chen, H.; Huang, H.; Zhao, X.; Cai, Z.; Tong, C.; Liu, W.; Tian, X. An Enhanced Grey Wolf Optimization Based Feature Selection Wrapped Kernel Extreme Learning Machine for Medical Diagnosis. Comput. Math. Methods Med. 2017, 2017, 9512741.
29. Liu, T.; Hu, L.; Ma, C.; Wang, Z.-Y.; Chen, H.-L. A fast approach for detection of erythemato-squamous diseases based on extreme learning machine with maximum relevance minimum redundancy feature selection. Int. J. Syst. Sci. 2015, 46, 919–931.
30. Gupta, S.; Deep, K.; Heidari, A.A.; Moayedi, H.; Chen, H. Harmonized salp chain-built optimization. Eng. Comput. 2019, 37, 1049–1079.
31. Ba, A.F.; Huang, H.; Wang, M.; Ye, X.; Gu, Z.; Chen, H.; Cai, X. Levy-based antlion-inspired optimizers with orthogonal learning scheme. Eng. Comput. 2020, 1–22.
32. Zhang, H.; Cai, Z.; Ye, X.; Wang, M.; Kuang, F.; Chen, H.; Li, C.; Li, Y. A multi-strategy enhanced salp swarm algorithm for global optimization. Eng. Comput. 2020, 1–27.
33. Liang, X.; Cai, Z.; Wang, M.; Zhao, X.; Chen, H.; Li, C. Chaotic oppositional sine–cosine method for solving global optimization problems. Eng. Comput. 2020, 1–17.
34. Pang, J.; Zhou, H.; Tsai, Y.-C.; Chou, F.-D. A scatter simulated annealing algorithm for the bi-objective scheduling problem for the wet station of semiconductor manufacturing. Comput. Ind. Eng. 2018, 123, 54–66.
35. Zhou, H.; Pang, J.; Chen, P.-K.; Chou, F.-D. A modified particle swarm optimization algorithm for a batch-processing machine scheduling problem with arbitrary release times and non-identical job sizes. Comput. Ind. Eng. 2018, 123, 67–81.
36. Hu, L.; Li, H.; Cai, Z.; Lin, F.; Hong, G.; Chen, H.; Lu, Z. A new machine-learning method to prognosticate paraquat poisoned patients by combining coagulation, liver, and kidney indices. PLoS ONE 2017, 12, e0186427.
37. Li, C.; Hou, L.; Sharma, B.Y.; Li, H.; Chen, C.; Li, Y.; Zhao, X.; Huang, H.; Cai, Z.; Chen, H. Developing a new intelligent system for the diagnosis of tuberculous pleural effusion. Comput. Methods Programs Biomed. 2018, 153, 211–225.
38. Zhao, X.; Zhang, X.; Cai, Z.; Tian, X.; Wang, X.; Huang, Y.; Chen, H.; Hu, L. Chaos enhanced grey wolf optimization wrapped ELM for diagnosis of paraquat-poisoned patients. Comput. Biol. Chem. 2019, 78, 481–490.
39. Huang, H.; Zhou, S.; Jiang, J.; Chen, H.; Li, Y.; Li, C. A new fruit fly optimization algorithm enhanced support vector machine for diagnosis of breast cancer based on high-level features. BMC Bioinform. 2019, 20, 290.
40. Zhang, Y.; Liu, R.; Heidari, A.A.; Wang, X.; Chen, Y.; Wang, M.; Chen, H. Towards augmented kernel extreme learning models for bankruptcy prediction: Algorithmic behavior and comprehensive analysis. Neurocomputing 2020, 430, 185–212.
41. Yu, C.; Chen, M.; Cheng, K.; Zhao, X.; Ma, C.; Kuang, F.; Chen, H. SGOA: Annealing-behaved grasshopper optimizer for global tasks. Eng. Comput. 2021.
42. Cai, Z.; Gu, J.; Luo, J.; Zhang, Q.; Chen, H.; Pan, Z.; Li, Y.; Li, C. Evolving an optimal kernel extreme learning machine by using an enhanced grey wolf optimization strategy. Expert Syst. Appl. 2019, 138, 112814.
43. Heidari, A.A.; Abbaspour, R.A.; Chen, H. Efficient boosted grey wolf optimizers for global search and kernel extreme learning machine training. Appl. Soft Comput. 2019, 81, 105521.
44. Shen, L.; Chen, H.; Yu, Z.; Kang, W.; Zhang, B.; Li, H.; Yang, B.; Liu, D. Evolving support vector machines using fruit fly optimization for medical data classification. Knowl.-Based Syst. 2016, 96, 61–75.
45. Wang, M.; Chen, H.; Yang, B.; Zhao, X.; Hu, L.; Cai, Z.; Huang, H.; Tong, C. Toward an optimal kernel extreme learning machine using a chaotic moth-flame optimization strategy with applications in medical diagnoses. Neurocomputing 2017, 267, 69–84.
46. Wang, M.; Chen, H. Chaotic multi-swarm whale optimizer boosted support vector machine for medical diagnosis. Appl. Soft Comput. 2020, 88, 105946.
47. Deng, W.; Xu, J.; Zhao, H.; Song, Y. A Novel Gate Resource Allocation Method Using Improved PSO-Based QEA. IEEE Trans. Intell. Transp. Syst. 2020.
48. Deng, W.; Xu, J.; Song, Y.; Zhao, H. An Effective Improved Co-evolution Ant Colony Optimization Algorithm with Multi-Strategies and Its Application. Int. J. Bio-Inspired Comput. 2020, 16, 158–170.
49. Chen, Z.-G.; Zhan, Z.-H.; Lin, Y.; Gong, Y.-J.; Gu, T.-L.; Zhao, F.; Yuan, H.-Q.; Chen, X.; Li, Q.; Zhang, J. Multiobjective cloud workflow scheduling: A multiple populations ant colony system approach. IEEE Trans. Cybern. 2018, 49, 2912–2926.
50. Wang, Z.-J.; Zhan, Z.-H.; Yu, W.-J.; Lin, Y.; Zhang, J.; Gu, T.-L.; Zhang, J. Dynamic group learning distributed particle swarm optimization for large-scale optimization and its application in cloud workflow scheduling. IEEE Trans. Cybern. 2019, 50, 2715–2729.
51. Deng, W.; Liu, H.; Xu, J.; Zhao, H.; Song, Y. An improved quantum-inspired differential evolution algorithm for deep belief network. IEEE Trans. Instrum. Meas. 2020, 69, 7319–7327.
52. Zhao, H.; Liu, H.; Xu, J.; Deng, W. Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine. IEEE Trans. Instrum. Meas. 2019, 69, 4165–4172.
53. Liu, X.-F.; Zhan, Z.-H.; Zhang, J. Resource-Aware Distributed Differential Evolution for Training Expensive Neural-Network-Based Controller in Power Electronic Circuit. IEEE Trans. Neural Netw. Learn. Syst. 2021.
54. Zhan, Z.-H.; Liu, X.-F.; Zhang, H.; Yu, Z.; Weng, J.; Li, Y.; Gu, T.; Zhang, J. Cloudde: A heterogeneous differential evolution algorithm and its distributed cloud version. IEEE Trans. Parallel Distrib. Syst. 2016, 28, 704–716.
55. Zhao, X.; Li, D.; Yang, B.; Ma, C.; Zhu, Y.; Chen, H. Feature selection based on improved ant colony optimization for online detection of foreign fiber in cotton. Appl. Soft Comput. 2014, 24, 585–596.
56. Zhao, X.; Li, D.; Yang, B.; Chen, H.; Yang, X.; Yu, C.; Liu, S. A two-stage feature selection method with its application. Comput. Electr. Eng. 2015, 47, 114–125.
57. Liang, D.; Zhan, Z.-H.; Zhang, Y.; Zhang, J. An efficient ant colony system approach for new energy vehicle dispatch problem. IEEE Trans. Intell. Transp. Syst. 2019, 21, 4784–4797.
58. Ridha, H.M.; Heidari, A.A.; Wang, M.; Chen, H. Boosted mutation-based Harris hawks optimizer for parameters identification of single-diode solar cell models. Energy Convers. Manag. 2020, 209, 112660.
59. Dhiman, G.; Kumar, V. Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems. Knowl.-Based Syst. 2019, 165, 169–196.
60. Lei, G.; Song, H.; Rodriguez, D. Power generation cost minimization of the grid-connected hybrid renewable energy system through optimal sizing using the modified seagull optimization technique. Energy Rep. 2020, 6, 3365–3376.
61. Cao, Y.; Li, Y.; Zhang, G.; Jermsittiparsert, K.; Razmjooy, N. Experimental modeling of PEM fuel cells using a new improved seagull optimization algorithm. Energy Rep. 2019, 5, 1616–1625.
62. Dhiman, G.; Singh, K.K.; Soni, M.; Nagar, A.; Dehghani, M.; Slowik, A.; Kaur, A.; Sharma, A.; Houssein, E.H.; Cengiz, K. MOSOA: A new multi-objective seagull optimization algorithm. Expert Syst. Appl. 2021, 167, 114150.
63. Riget, J.; Vesterstrøm, J.S. A Diversity-Guided Particle Swarm Optimizer—The ARPSO; Technical Report No. 2; Department of Computer Science, University of Aarhus: Aarhus, Denmark, 2002.
64. Pant, M.; Radha, T.; Singh, V.P. A simple diversity guided particle swarm optimization. In Proceedings of the 2007 IEEE Congress on Evolutionary Computation, Singapore, 25–28 September 2007; pp. 3294–3299.
65. Mohamed, N.; Bilel, N.; Alsagri, A.S. A multi-objective methodology for multi-criteria engineering design. Appl. Soft Comput. 2020, 91, 106204.
66. Price, K.; Awad, N.; Ali, M.; Suganthan, P. Problem Definitions and Evaluation Criteria for the 100-Digit Challenge Special Session and Competition on Single Objective Numerical Optimization; Technical Report; Nanyang Technological University: Singapore, 2018.
67. Ahmed, A.M.; Rashid, T.A.; Saeed, S.A.M. Cat swarm optimization algorithm: A survey and performance evaluation. Comput. Intell. Neurosci. 2020, 2020, 4854895.
68. Rahman, C.M.; Rashid, T.A. A new evolutionary algorithm: Learner performance based behavior algorithm. Egypt. Inform. J. 2021, 22, 213–223.
69. Abdullah, J.M.; Ahmed, T. Fitness dependent optimizer: Inspired by the bee swarming reproductive process. IEEE Access 2019, 7, 43473–43486.
70. Rahman, C.M.; Rashid, T.A. Dragonfly algorithm and its applications in applied science survey. Comput. Intell. Neurosci. 2019, 2019, 9293617.
71. Li, Z.; Tam, V. A novel meta-heuristic optimization algorithm inspired by the spread of viruses. arXiv 2020, arXiv:2006.06282.
72. Mirjalili, S. SCA: A sine cosine algorithm for solving optimization problems. Knowl.-Based Syst. 2016, 96, 120–133.
73. Yang, X.-S. Firefly algorithms for multimodal optimization. In International Symposium on Stochastic Algorithms; Springer: Berlin/Heidelberg, Germany, 2009; pp. 169–178.
74. Yang, X.-S. A new metaheuristic bat-inspired algorithm. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2010); Springer: Berlin/Heidelberg, Germany, 2010; pp. 65–74.
75. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249.
76. Adarsh, B.; Raghunathan, T.; Jayabarathi, T.; Yang, X.-S. Economic dispatch using chaotic bat algorithm. Energy 2016, 96, 666–675.
77. Nobile, M.S.; Cazzaniga, P.; Besozzi, D.; Colombo, R.; Mauri, G.; Pasi, G. Fuzzy Self-Tuning PSO: A settings-free algorithm for global optimization. Swarm Evol. Comput. 2018, 39, 70–85.
78. Yong, J.; He, F.; Li, H.; Zhou, W. A novel bat algorithm based on collaborative and dynamic learning of opposite population. In Proceedings of the 2018 IEEE 22nd International Conference on Computer Supported Cooperative Work in Design (CSCWD), Nanjing, China, 9–11 May 2018; pp. 541–546.
79. Zhang, H.; Yuan, M.; Liang, Y.; Liao, Q. A novel particle swarm optimization based on prey–predator relationship. Appl. Soft Comput. 2018, 68, 202–218.
80. Lin, A.; Wu, Q.; Heidari, A.A.; Xu, Y.; Chen, H.; Geng, W.; Li, C. Predicting intentions of students for master programs using a chaos-induced sine cosine-based fuzzy K-nearest neighbor classifier. IEEE Access 2019, 7, 67235–67248.
81. Hongwei, L.; Jianyong, L.; Liang, C.; Jingbo, B.; Yangyang, S.; Kai, L.J. Chaos-enhanced moth-flame optimization algorithm for global optimization. J. Syst. Eng. Electron. 2019, 30, 1144–1159.
82. Nenavath, H.; Jatoth, R.K.; Das, S. A synergy of the sine-cosine algorithm and particle swarm optimizer for improved global optimization and object tracking. Swarm Evol. Comput. 2018, 43, 1–30.
83. Luo, J.; Chen, H.; Heidari, A.A.; Xu, Y.; Zhang, Q.; Li, C. Multi-strategy boosted mutative whale-inspired optimization approaches. Appl. Math. Model. 2019, 73, 109–123.
84. Fan, Y.; Wang, P.; Mafarja, M.; Wang, M.; Zhao, X.; Chen, H. A bioinformatic variant fruit fly optimizer for tackling optimization problems. Knowl.-Based Syst. 2021, 213, 106704.
85. Hu, Z.; Wang, J.; Zhang, C.; Luo, Z.; Luo, X.; Xiao, L.; Shi, J. Uncertainty Modeling for Multicenter Autism Spectrum Disorder Classification Using Takagi–Sugeno–Kang Fuzzy Systems. IEEE Trans. Cogn. Dev. Syst. 2021.
86. Chen, C.; Wu, Q.; Li, Z.; Xiao, L.; Hu, Z.Y. Diagnosis of Alzheimer's disease based on Deeply-Fused Nets. Comb. Chem. High Throughput Screen. 2020, 24, 781–789.
87. Fei, X.; Wang, J.; Ying, S.; Hu, Z.; Shi, J. Projective parameter transfer based sparse multiple empirical kernel learning machine for diagnosis of brain disease. Neurocomputing 2020, 413, 271–283.
88. Saber, A.; Sakr, M.; Abo-Seida, O.M.; Keshk, A.; Chen, H. A Novel Deep-Learning Model for Automatic Detection and Classification of Breast Cancer Using the Transfer-Learning Technique. IEEE Access 2021, 9, 71194–71209.
89. Cao, X.Y.; Wang, J.X.; Wang, J.H.; Zeng, B. A Risk-Averse Conic Model for Networked Microgrids Planning With Reconfiguration and Reorganizations. IEEE Trans. Smart Grid 2020, 11, 696–709.
90. Cao, X.; Cao, T.; Gao, F.; Guan, X. Risk-Averse Storage Planning for Improving RES Hosting Capacity under Uncertain Siting Choice. IEEE Trans. Sustain. Energy 2021.
91. Pei, H.; Yang, B.; Liu, J.; Chang, K. Active Surveillance via Group Sparse Bayesian Learning. IEEE Trans. Pattern Anal. Mach. Intell. 2020.
92. Qiu, S.; Hao, Z.; Wang, Z.; Liu, L.; Liu, J.; Zhao, H.; Fortino, G. Sensor Combination Selection Strategy for Kayak Cycle Phase Segmentation Based on Body Sensor Networks. IEEE Internet Things J. 2021.
93. Wu, Z.; Li, G.; Shen, S.; Cui, Z.; Lian, X.; Xu, G. Constructing dummy query sequences to protect location privacy and query privacy in location-based services. World Wide Web 2021, 24, 25–49.
94. Wu, Z.; Wang, R.; Li, Q.; Lian, X.; Xu, G. A location privacy-preserving system based on query range cover-up for location-based services. IEEE Trans. Veh. Technol. 2020, 69, 5244–5254.
95. Zhang, X.; Wang, T.; Wang, J.; Tang, G.; Zhao, L. Pyramid channel-based feature attention network for image dehazing. Comput. Vis. Image Underst. 2020, 197, 103003.
96. Wu, Z.; Li, R.; Xie, J.; Zhou, Z.; Guo, J.; Xu, X. A user sensitive subject protection approach for book search service. J. Assoc. Inf. Sci. Technol. 2020, 71, 183–195.
97. Wu, Z.; Shen, S.; Lian, X.; Su, X.; Chen, E. A dummy-based user privacy protection approach for text information retrieval. Knowl.-Based Syst. 2020, 195, 105679.
98. Wu, Z.; Shen, S.; Zhou, H.; Li, H.; Lu, C.; Zou, D. An effective approach for the protection of user commodity viewing privacy in e-commerce website. Knowl.-Based Syst. 2021, 220, 106952.
99. Qiu, S.; Zhao, H.; Jiang, N.; Wu, D.; Song, G.; Zhao, H.; Wang, Z. Sensor network oriented human motion capture via wearable intelligent system. Int. J. Intell. Syst. 2022, 137, 1646–1673.
100. Zhang, X.; Jiang, R.; Wang, T.; Wang, J. Recursive Neural Network for Video Deblurring. IEEE Trans. Circuits Syst. Video Technol. 2021, 31, 3025–3036.
101. Mahdavi, M.; Fesanghary, M.; Damangir, E. An improved harmony search algorithm for solving optimization problems. Appl. Math. Comput. 2007, 188, 1567–1579.
102. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN'95-International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; pp. 1942–1948.
103. Kaveh, A.; Khayatazad, M. A new meta-heuristic method: Ray optimization. Comput. Struct. 2012, 112, 283–294.
104. Mezura-Montes, E.; Coello, C.A.C. An empirical study about the usefulness of evolution strategies to solve constrained optimization problems. Int. J. Gen. Syst. 2008, 37, 443–473.
105. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248.
106. Coello, C.A.C. Use of a self-adaptive penalty approach for engineering optimization problems. Comput. Ind. 2000, 41, 113–127.
107. Kannan, B.; Kramer, S.N. An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. In International Design Engineering Technical Conferences and Computers and Information in Engineering Conference; American Society of Mechanical Engineers: New York, NY, USA, 1994; pp. 103–112.
108. Sandgren, E. Nonlinear Integer and Discrete Programming in Mechanical Design Optimization. J. Mech. Des. 1990, 112, 223–229.
109. Cheng, M.-Y.; Prayogo, D. Symbiotic organisms search: A new metaheuristic optimization algorithm. Comput. Struct. 2014, 139, 98–112.
110. Gandomi, A.H.; Yang, X.-S.; Alavi, A.H. Cuckoo search algorithm: A metaheuristic approach to solve structural optimization problems. Eng. Comput. 2013, 29, 17–35.
111. Wang, G.; Heidari, A.A.; Wang, M.; Kuang, F.; Zhu, W.; Chen, H. Chaotic arc adaptive grasshopper optimization. IEEE Access 2021, 9, 17672–17706.
112. Wang, G.G. Adaptive response surface method using inherited latin hypercube design points. J. Mech. Des. 2003, 125, 210–220.
113. Kamboj, V.K.; Nandi, A.; Bhadoria, A.; Sehgal, S. An intensify Harris Hawks optimizer for numerical and engineering optimization problems. Appl. Soft Comput. 2020, 89, 106018.
114. Faramarzi, A.; Heidarinejad, M.; Stephens, B.; Mirjalili, S. Equilibrium optimizer: A novel optimization algorithm. Knowl.-Based Syst. 2020, 191, 105190.
115. Lee, K.S.; Geem, Z.W. A new meta-heuristic algorithm for continuous engineering optimization: Harmony search theory and practice. Comput. Methods Appl. Mech. Eng. 2005, 194, 3902–3933.
116. Hedar, A.-R.; Fukushima, M. Derivative-free filter simulated annealing method for constrained continuous global optimization. J. Glob. Optim. 2006, 35, 521–549.
117. Ray, T.; Liew, K.-M. Society and civilization: An optimization algorithm based on the simulation of social behavior. IEEE Trans. Evol. Comput. 2003, 7, 386–396.
118. Akhtar, S.; Tai, K.; Ray, T. A socio-behavioural simulation model for engineering design optimization. Eng. Optim. 2002, 34, 341–354.
119. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61.
120. Nenavath, H.; Jatoth, R.K. Hybridizing sine cosine algorithm with differential evolution for global optimization and object tracking. Appl. Soft Comput. 2018, 62, 1019–1043.
121. Chen, W.-N.; Zhang, J.; Lin, Y.; Chen, N.; Zhan, Z.-H.; Chung, H.S.-H.; Li, Y.; Shi, Y.-H. Particle swarm optimization with an aging leader and challengers. IEEE Trans. Evol. Comput. 2012, 17, 241–258.
122. Kumar, A.; Das, S.; Zelinka, I. A self-adaptive spherical search algorithm for real-world constrained optimization problems. In Proceedings of the 2020 Genetic and Evolutionary Computation Conference Companion, Cancún, Mexico, 8–12 July 2020; pp. 13–14.
123. Gurrola-Ramos, J.; Hernàndez-Aguirre, A.; Dalmau-Cedeño, O. COLSHADE for real-world single-objective constrained optimization problems. In Proceedings of the 2020 IEEE Congress on Evolutionary Computation (CEC), Glasgow, UK, 19–24 July 2020; pp. 1–8.
124. Kumar, A.; Das, S.; Zelinka, I. A modified covariance matrix adaptation evolution strategy for real-world constrained optimization problems. In Proceedings of the 2020 Genetic and Evolutionary Computation Conference Companion, Cancún, Mexico, 8–12 July 2020; pp. 11–12.
125. Li, J.; Xiao, D.-D.; Zhang, T.; Liu, C.; Li, Y.-X.; Wang, G.-G. Multi-swarm cuckoo search algorithm with Q-learning model. Comput. J. 2020, 64, 108–131.
126. Nan, X.; Bao, L.; Zhao, X.; Zhao, X.; Sangaiah, A.K.; Wang, G.-G.; Ma, Z. EPuL: An enhanced positive-unlabeled learning algorithm for the prediction of pupylation sites. Molecules 2017, 22, 1463.
127. Liu, D.; Fan, Z.; Fu, Q.; Li, M.; Faiz, M.A.; Ali, S.; Li, T.; Zhang, L.; Khan, M.I. Random forest regression evaluation model of regional flood disaster resilience based on the whale optimization algorithm. J. Clean. Prod. 2020, 250, 119468.
128. Gu, Z.-M.; Wang, G.-G. Improving NSGA-III algorithms with information feedback models for large-scale many-objective optimization. Future Gener. Comput. Syst. 2020, 107, 49–69.
129. Yi, J.-H.; Deb, S.; Dong, J.; Alavi, A.H.; Wang, G.-G. An improved NSGA-III algorithm with adaptive mutation operator for big data optimization problems. Future Gener. Comput. Syst. 2018, 88, 571–585.
130. Wang, G.-G.; Cai, X.; Cui, Z.; Min, G.; Chen, J. High performance computing for cyber physical social systems by using evolutionary multi-objective optimization algorithm. IEEE Trans. Emerg. Top. Comput. 2020, 8, 20–30.
131. Sun, J.; Miao, Z.; Gong, D.; Zeng, X.-J.; Li, J.; Wang, G.-G. Interval multi-objective optimization with memetic algorithms. IEEE Trans. Cybern. 2020, 50, 3444–3457.
Figure 1. Flow chart of IDARSOA.
Figure 2. Comparison and convergence curve of mechanism.
Figure 3. Box plots of SOAs.
Figure 4. Balance and diversity analysis diagram of IDARSOA and SOA.
Figure 5. Convergence curve of IDARSOA and original algorithms.
Figure 6. Convergence curve of IDARSOA and advanced algorithms.
Table 1. Comparison of different fc values of IDARSOA.

| Variant | F1 AVG | F1 STD | F2 AVG | F2 STD | F3 AVG | F3 STD |
|---|---|---|---|---|---|---|
| IDARSOA (fc = 1) | 1.000000E+00 | 5.758899E-14 | 4.952258E+00 | 1.817900E-01 | 2.433492E+00 | 1.186997E+00 |
| IDARSOA (fc = 2) | 1.000000E+00 | 2.667535E-13 | 4.416804E+00 | 2.696345E-01 | 2.125574E+00 | 9.741309E-01 |
| IDARSOA (fc = 3) | 1.000000E+00 | 1.427971E-12 | 4.620769E+00 | 3.408565E-01 | 2.422004E+00 | 1.022597E+00 |
| IDARSOA (fc = 5) | 1.000000E+00 | 7.603133E-13 | 4.468247E+00 | 2.786393E-01 | 3.079520E+00 | 1.460601E+00 |
| IDARSOA (fc = 7) | 1.000000E+00 | 3.477511E-14 | 4.630774E+00 | 3.173943E-01 | 4.326896E+00 | 1.815038E+00 |
| IDARSOA (fc = 9) | 1.000000E+00 | 2.433234E-11 | 4.729463E+00 | 3.398487E-01 | 4.073180E+00 | 1.644750E+00 |

| Variant | F4 AVG | F4 STD | F5 AVG | F5 STD | F6 AVG | F6 STD |
|---|---|---|---|---|---|---|
| IDARSOA (fc = 1) | 3.688039E+01 | 1.373013E+01 | 5.821549E+00 | 9.047358E+00 | 5.491320E+00 | 1.275813E+00 |
| IDARSOA (fc = 2) | 2.463451E+01 | 1.139996E+01 | 2.194699E+00 | 6.875180E-01 | 5.664204E+00 | 1.692936E+00 |
| IDARSOA (fc = 3) | 2.239873E+01 | 8.125202E+00 | 1.998788E+00 | 3.113314E-01 | 4.818130E+00 | 1.701835E+00 |
| IDARSOA (fc = 5) | 2.424563E+01 | 6.106529E+00 | 2.061292E+00 | 4.047150E-01 | 5.185845E+00 | 1.352847E+00 |
| IDARSOA (fc = 7) | 2.731944E+01 | 1.041899E+00 | 2.108973E+00 | 6.882413E-01 | 5.632649E+00 | 1.915015E+00 |
| IDARSOA (fc = 9) | 2.859710E+01 | 1.035596E+01 | 2.131835E+00 | 6.668301E-01 | 5.873998E+00 | 2.092584E+00 |

| Variant | F7 AVG | F7 STD | F8 AVG | F8 STD | F9 AVG | F9 STD |
|---|---|---|---|---|---|---|
| IDARSOA (fc = 1) | 1.263735E+03 | 3.365559E+02 | 4.074083E+00 | 4.138044E-01 | 1.212927E+00 | 7.951421E-02 |
| IDARSOA (fc = 2) | 1.406561E+03 | 4.167857E+02 | 3.953226E+00 | 3.750406E-01 | 1.181665E+00 | 7.032048E-02 |
| IDARSOA (fc = 3) | 1.509095E+03 | 3.868018E+02 | 3.920128E+00 | 4.127622E-01 | 1.185820E+00 | 8.267084E-02 |
| IDARSOA (fc = 5) | 1.468453E+03 | 5.192703E+02 | 4.002200E+00 | 3.019612E-01 | 1.224425E+00 | 7.752889E-02 |
| IDARSOA (fc = 7) | 1.619661E+03 | 4.745775E+02 | 4.158420E+00 | 3.717104E-01 | 1.261381E+00 | 8.587691E-02 |
| IDARSOA (fc = 9) | 1.646884E+03 | 4.487442E+02 | 4.139863E+00 | 3.270712E-01 | 1.260103E+00 | 9.578651E-02 |

| Variant | F10 AVG | F10 STD | F11 AVG | F11 STD | F12 AVG | F12 STD |
|---|---|---|---|---|---|---|
| IDARSOA (fc = 1) | 2.124024E+01 | 1.382577E-01 | 8.891585E+09 | 6.785941E+09 | 6.663063E+03 | 6.116660E+02 |
| IDARSOA (fc = 2) | 2.098561E+01 | 1.830573E+00 | 4.385372E+09 | 4.046630E+09 | 7.024390E+03 | 6.603009E+02 |
| IDARSOA (fc = 3) | 2.125866E+01 | 1.031388E-01 | 2.122388E+09 | 1.018590E+09 | 6.815012E+03 | 6.312405E+02 |
| IDARSOA (fc = 5) | 2.130665E+01 | 1.963167E-01 | 2.727998E+09 | 2.547515E+09 | 7.080456E+03 | 5.757104E+02 |
| IDARSOA (fc = 7) | 2.142574E+01 | 2.103051E-01 | 2.055958E+09 | 1.768645E+09 | 7.111943E+03 | 6.878647E+02 |
| IDARSOA (fc = 9) | 2.138810E+01 | 2.077814E-01 | 2.791952E+09 | 1.488388E+09 | 7.212336E+03 | 8.694651E+02 |

| Variant | F13 AVG | F13 STD | F14 AVG | F14 STD | F15 AVG | F15 STD |
|---|---|---|---|---|---|---|
| IDARSOA (fc = 1) | 1.205610E+03 | 7.625843E+01 | 1.900000E+03 | 0.000000E+00 | 2.114255E+07 | 5.741210E+07 |
| IDARSOA (fc = 2) | 1.051132E+03 | 8.194302E+01 | 1.900000E+03 | 0.000000E+00 | 1.031563E+07 | 2.481495E+07 |
| IDARSOA (fc = 3) | 1.019698E+03 | 5.671150E+01 | 1.900000E+03 | 0.000000E+00 | 7.595739E+07 | 1.634042E+08 |
| IDARSOA (fc = 5) | 1.053406E+03 | 6.639036E+01 | 1.900000E+03 | 0.000000E+00 | 4.953248E+07 | 1.150416E+08 |
| IDARSOA (fc = 7) | 1.087131E+03 | 8.449326E+01 | 1.900000E+03 | 0.000000E+00 | 3.026353E+07 | 1.108006E+08 |
| IDARSOA (fc = 9) | 1.097117E+03 | 1.210897E+02 | 1.900000E+03 | 0.000000E+00 | 1.629272E+07 | 2.692786E+07 |

| Variant | F16 AVG | F16 STD | F17 AVG | F17 STD | F18 AVG | F18 STD |
|---|---|---|---|---|---|---|
| IDARSOA (fc = 1) | 3.134750E+03 | 3.749393E+02 | 1.286831E+08 | 2.581530E+08 | 2.459859E+03 | 2.249968E+01 |
| IDARSOA (fc = 2) | 2.929809E+03 | 4.989991E+02 | 8.429436E+07 | 1.946799E+08 | 2.438595E+03 | 2.656722E+01 |
| IDARSOA (fc = 3) | 2.823617E+03 | 3.599560E+02 | 1.542068E+08 | 2.822871E+08 | 2.443144E+03 | 2.455074E+01 |
| IDARSOA (fc = 5) | 3.009103E+03 | 4.150459E+02 | 1.561803E+08 | 2.815656E+08 | 2.439564E+03 | 3.676659E+01 |
| IDARSOA (fc = 7) | 2.951470E+03 | 2.636037E+02 | 2.118624E+08 | 2.940915E+08 | 2.446642E+03 | 3.351741E+01 |
| IDARSOA (fc = 9) | 3.098547E+03 | 3.829978E+02 | 2.434406E+08 | 4.983535E+08 | 2.489955E+03 | 6.573209E+01 |

| Variant | F19 AVG | F19 STD | F20 AVG | F20 STD | Mean Level | Rank |
|---|---|---|---|---|---|---|
| IDARSOA (fc = 1) | 2.600000E+03 | 0.000000E+00 | 2.700000E+03 | 0.000000E+00 | 3.35 | 4 |
| IDARSOA (fc = 2) | 2.600000E+03 | 0.000000E+00 | 2.700000E+03 | 0.000000E+00 | 2.1 | 1 |
| IDARSOA (fc = 3) | 2.600000E+03 | 0.000000E+00 | 2.700000E+03 | 0.000000E+00 | 2.2 | 2 |
| IDARSOA (fc = 5) | 2.600000E+03 | 0.000000E+00 | 2.700000E+03 | 0.000000E+00 | 2.9 | 3 |
| IDARSOA (fc = 7) | 2.600000E+03 | 0.000000E+00 | 2.700000E+03 | 0.000000E+00 | 3.75 | 5 |
| IDARSOA (fc = 9) | 2.600000E+03 | 0.000000E+00 | 2.700000E+03 | 0.000000E+00 | 4.45 | 6 |
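For context, fc is the control parameter that the underlying SOA [59] uses in its migration (exploration) phase; in the standard SOA formulation it is decayed linearly to zero over the run. A sketch of that usual definition, assuming the notation of the original SOA paper (A is the movement coefficient, t the current iteration, and t_max the iteration budget):

```latex
% Linear decay of the SOA movement coefficient (standard form; see [59]).
A(t) = f_c - t \cdot \frac{f_c}{t_{\max}}, \qquad t = 0, 1, \dots, t_{\max}
```

Larger fc values therefore keep the exploratory step large for longer; by the mean levels in Table 1, fc = 2 is the best compromise, and it is the setting used in the later experiments (Tables 7 and 9).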
Table 2. Parameter settings of IDARSOA.

| Algorithm | Parameters | Algorithm | Parameters |
|---|---|---|---|
| IDARSOA01 | ω1 = 0.5; ω2 = 0.1 | IDARSOA10 | ω1 = 0.1; ω2 = 0.5 |
| IDARSOA02 | ω1 = 0.5; ω2 = 0.2 | IDARSOA11 | ω1 = 0.2; ω2 = 0.5 |
| IDARSOA03 | ω1 = 0.5; ω2 = 0.3 | IDARSOA12 | ω1 = 0.3; ω2 = 0.5 |
| IDARSOA04 | ω1 = 0.5; ω2 = 0.4 | IDARSOA13 | ω1 = 0.4; ω2 = 0.5 |
| IDARSOA05 | ω1 = 0.5; ω2 = 0.5 | IDARSOA14 | ω1 = 0.6; ω2 = 0.5 |
| IDARSOA06 | ω1 = 0.5; ω2 = 0.6 | IDARSOA15 | ω1 = 0.7; ω2 = 0.5 |
| IDARSOA07 | ω1 = 0.5; ω2 = 0.7 | IDARSOA16 | ω1 = 0.8; ω2 = 0.5 |
| IDARSOA08 | ω1 = 0.5; ω2 = 0.8 | IDARSOA17 | ω1 = 0.9; ω2 = 0.5 |
| IDARSOA09 | ω1 = 0.5; ω2 = 0.9 | — | — |
Table 3. Comparison of parameter settings.

| Variant | F1 AVG | F1 STD | F2 AVG | F2 STD | F3 AVG | F3 STD |
|---|---|---|---|---|---|---|
| IDARSOA01 | 1.0000E+00 | 7.1098E-13 | 4.5221E+00 | 3.0290E-01 | 2.2998E+00 | 9.4177E-01 |
| IDARSOA02 | 1.0000E+00 | 3.8050E-12 | 4.6309E+00 | 3.3643E-01 | 2.5054E+00 | 8.1540E-01 |
| IDARSOA03 | 1.0000E+00 | 2.1336E-13 | 4.5643E+00 | 3.4439E-01 | 2.7870E+00 | 1.3151E+00 |
| IDARSOA04 | 1.0000E+00 | 1.7623E-12 | 4.4649E+00 | 2.7740E-01 | 2.1095E+00 | 9.5977E-01 |
| IDARSOA05 | 1.0000E+00 | 6.2818E-12 | 4.4148E+00 | 2.4449E-01 | 1.9846E+00 | 8.1660E-01 |
| IDARSOA06 | 1.0000E+00 | 9.8255E-13 | 4.4005E+00 | 2.4310E-01 | 2.8128E+00 | 1.4789E+00 |
| IDARSOA07 | 1.0000E+00 | 5.1951E-12 | 4.4686E+00 | 3.0250E-01 | 2.7536E+00 | 1.3484E+00 |
| IDARSOA08 | 1.0000E+00 | 2.5142E-12 | 4.5113E+00 | 3.2751E-01 | 3.1051E+00 | 1.6605E+00 |
| IDARSOA09 | 1.0000E+00 | 1.9889E-13 | 4.5311E+00 | 3.4123E-01 | 3.1209E+00 | 1.4098E+00 |
| IDARSOA10 | 1.0000E+00 | 0.0000E+00 | 4.6013E+00 | 3.3527E-01 | 3.8233E+00 | 1.4639E+00 |
| IDARSOA11 | 1.0000E+00 | 1.3380E-15 | 4.5414E+00 | 3.3407E-01 | 3.6497E+00 | 1.5097E+00 |
| IDARSOA12 | 1.0000E+00 | 4.1233E-17 | 4.5305E+00 | 3.4018E-01 | 3.0959E+00 | 1.2182E+00 |
| IDARSOA13 | 1.0000E+00 | 6.6097E-15 | 4.4954E+00 | 3.3859E-01 | 2.5881E+00 | 1.3068E+00 |
| IDARSOA14 | 1.0000E+00 | 9.3677E-12 | 4.5112E+00 | 3.0444E-01 | 2.1808E+00 | 8.4346E-01 |
| IDARSOA15 | 1.0000E+00 | 9.0943E-13 | 4.6305E+00 | 3.1351E-01 | 2.8134E+00 | 1.5691E+00 |
| IDARSOA16 | 1.0000E+00 | 1.2898E-11 | 4.4263E+00 | 2.4384E-01 | 2.9569E+00 | 1.4764E+00 |
| IDARSOA17 | 1.0000E+00 | 2.6900E-12 | 4.6626E+00 | 3.1935E-01 | 2.6036E+00 | 1.0676E+00 |

| Variant | F4 AVG | F4 STD | F5 AVG | F5 STD | F6 AVG | F6 STD |
|---|---|---|---|---|---|---|
| IDARSOA01 | 2.4581E+01 | 1.0205E+01 | 2.6664E+00 | 1.0053E+00 | 5.3757E+00 | 1.3387E+00 |
| IDARSOA02 | 2.4620E+01 | 9.5404E+00 | 2.3394E+00 | 7.2482E-01 | 5.6362E+00 | 1.7517E+00 |
| IDARSOA03 | 2.6353E+01 | 1.0136E+01 | 2.3466E+00 | 8.4829E-01 | 5.6669E+00 | 1.6688E+00 |
| IDARSOA04 | 2.8527E+01 | 1.1019E+01 | 2.1741E+00 | 9.0196E-01 | 5.6726E+00 | 1.6060E+00 |
| IDARSOA05 | 2.8420E+01 | 1.0021E+01 | 2.1779E+00 | 1.3784E+00 | 5.4521E+00 | 1.3099E+00 |
| IDARSOA06 | 2.6725E+01 | 9.2999E+00 | 3.2507E+00 | 3.1355E+00 | 5.6015E+00 | 1.8411E+00 |
| IDARSOA07 | 2.6327E+01 | 8.6022E+00 | 3.2597E+00 | 1.5464E+00 | 5.9124E+00 | 1.3527E+00 |
| IDARSOA08 | 2.5881E+01 | 6.1845E+00 | 3.2389E+00 | 2.0344E+00 | 5.6661E+00 | 1.2747E+00 |
| IDARSOA09 | 2.6593E+01 | 8.0807E+00 | 3.4279E+00 | 1.4143E+00 | 6.0165E+00 | 1.4066E+00 |
| IDARSOA10 | 3.8186E+01 | 7.9266E+00 | 6.1616E+00 | 2.0080E+00 | 6.6093E+00 | 9.2928E-01 |
| IDARSOA11 | 3.3962E+01 | 7.3061E+00 | 4.1841E+00 | 1.4833E+00 | 6.2598E+00 | 1.1306E+00 |
| IDARSOA12 | 2.9104E+01 | 6.2466E+00 | 3.5271E+00 | 1.2830E+00 | 5.7546E+00 | 1.4611E+00 |
| IDARSOA13 | 2.7372E+01 | 7.7938E+00 | 2.8897E+00 | 1.1336E+00 | 5.7283E+00 | 1.9894E+00 |
| IDARSOA14 | 2.8092E+01 | 8.7542E+00 | 3.2801E+00 | 3.5517E+00 | 5.9541E+00 | 1.8852E+00 |
| IDARSOA15 | 2.4409E+01 | 8.0677E+00 | 2.8339E+00 | 1.6447E+00 | 5.5501E+00 | 1.5394E+00 |
| IDARSOA16 | 2.9955E+01 | 1.1018E+01 | 3.5394E+00 | 2.3300E+00 | 5.9731E+00 | 1.9348E+00 |
| IDARSOA17 | 2.6693E+01 | 1.0648E+01 | 3.0238E+00 | 1.2134E+00 | 5.9119E+00 | 2.0688E+00 |

| Variant | F7 AVG | F7 STD | F8 AVG | F8 STD | F9 AVG | F9 STD |
|---|---|---|---|---|---|---|
| IDARSOA01 | 1.4524E+03 | 3.9054E+02 | 4.0606E+00 | 4.0185E-01 | 1.2206E+00 | 7.9805E-02 |
| IDARSOA02 | 1.3058E+03 | 3.7027E+02 | 3.9924E+00 | 4.0475E-01 | 1.1941E+00 | 6.3549E-02 |
| IDARSOA03 | 1.4908E+03 | 4.2485E+02 | 4.0050E+00 | 3.5385E-01 | 1.2189E+00 | 1.0129E-01 |
| IDARSOA04 | 1.3631E+03 | 3.0353E+02 | 4.0836E+00 | 3.4300E-01 | 1.1955E+00 | 7.6053E-02 |
| IDARSOA05 | 1.4481E+03 | 4.6512E+02 | 3.9618E+00 | 4.4447E-01 | 1.2555E+00 | 1.1981E-01 |
| IDARSOA06 | 1.5382E+03 | 4.0746E+02 | 4.0030E+00 | 3.6576E-01 | 1.2387E+00 | 1.0696E-01 |
| IDARSOA07 | 1.5988E+03 | 4.3166E+02 | 4.2008E+00 | 2.4802E-01 | 1.2737E+00 | 7.0067E-02 |
| IDARSOA08 | 1.6605E+03 | 3.6934E+02 | 4.2143E+00 | 3.4708E-01 | 1.3011E+00 | 6.8430E-02 |
| IDARSOA09 | 1.6913E+03 | 4.4211E+02 | 4.2161E+00 | 3.0293E-01 | 1.3086E+00 | 7.1227E-02 |
| IDARSOA10 | 1.5330E+03 | 3.8835E+02 | 4.4975E+00 | 4.4491E-01 | 1.4773E+00 | 2.6855E-01 |
| IDARSOA11 | 1.5015E+03 | 3.5299E+02 | 4.3507E+00 | 3.2739E-01 | 1.3384E+00 | 7.9786E-02 |
| IDARSOA12 | 1.5489E+03 | 4.5487E+02 | 4.2247E+00 | 2.2252E-01 | 1.3136E+00 | 6.0874E-02 |
| IDARSOA13 | 1.5232E+03 | 4.0966E+02 | 4.0739E+00 | 3.4958E-01 | 1.2415E+00 | 6.6822E-02 |
| IDARSOA14 | 1.4081E+03 | 4.2644E+02 | 3.9631E+00 | 3.8229E-01 | 1.2280E+00 | 1.0433E-01 |
| IDARSOA15 | 1.4451E+03 | 4.3183E+02 | 4.0201E+00 | 3.7615E-01 | 1.2287E+00 | 1.0141E-01 |
| IDARSOA16 | 1.5349E+03 | 4.1635E+02 | 3.9816E+00 | 2.6288E-01 | 1.2260E+00 | 1.2272E-01 |
| IDARSOA17 | 1.6059E+03 | 4.9431E+02 | 4.0404E+00 | 3.5546E-01 | 1.2206E+00 | 1.1546E-01 |

| Variant | F10 AVG | F10 STD | F11 AVG | F11 STD | F12 AVG | F12 STD |
|---|---|---|---|---|---|---|
| IDARSOA01 | 2.1280E+01 | 1.7507E-01 | 2.1382E+08 | 3.4416E+08 | 2.0200E+03 | 2.8335E+02 |
| IDARSOA02 | 2.1286E+01 | 1.6340E-01 | 1.2274E+08 | 1.4623E+08 | 2.1510E+03 | 3.2795E+02 |
| IDARSOA03 | 2.1269E+01 | 1.5574E-01 | 4.2864E+08 | 1.6445E+09 | 2.0298E+03 | 3.3485E+02 |
| IDARSOA04 | 2.1279E+01 | 1.2501E-01 | 1.1905E+08 | 3.1222E+08 | 1.9318E+03 | 3.0480E+02 |
| IDARSOA05 | 2.1321E+01 | 1.7899E-01 | 1.8711E+08 | 6.2487E+08 | 1.9360E+03 | 3.2684E+02 |
| IDARSOA06 | 2.1322E+01 | 1.8987E-01 | 9.8061E+08 | 2.2807E+09 | 2.1234E+03 | 3.8321E+02 |
| IDARSOA07 | 2.1420E+01 | 2.1176E-01 | 3.6423E+08 | 1.2382E+09 | 2.1887E+03 | 3.9374E+02 |
| IDARSOA08 | 2.1464E+01 | 2.0693E-01 | 4.6418E+08 | 1.6441E+09 | 2.1188E+03 | 3.4011E+02 |
| IDARSOA09 | 2.1392E+01 | 1.9553E-01 | 8.3583E+08 | 2.2192E+09 | 2.3045E+03 | 3.2662E+02 |
| IDARSOA10 | 2.1570E+01 | 1.5581E-01 | 9.3577E+08 | 2.2582E+09 | 2.1975E+03 | 2.7247E+02 |
| IDARSOA11 | 2.1482E+01 | 2.0087E-01 | 5.0671E+08 | 1.6407E+09 | 2.2381E+03 | 2.6707E+02 |
| IDARSOA12 | 2.1438E+01 | 1.8690E-01 | 2.2028E+08 | 2.1073E+08 | 2.1718E+03 | 2.1766E+02 |
| IDARSOA13 | 2.1385E+01 | 2.0300E-01 | 4.1066E+08 | 1.6480E+09 | 2.1200E+03 | 3.3548E+02 |
| IDARSOA14 | 2.1291E+01 | 1.8167E-01 | 3.9643E+08 | 1.6447E+09 | 2.2573E+03 | 4.5010E+02 |
| IDARSOA15 | 2.1225E+01 | 1.1628E-01 | 2.4735E+08 | 5.3249E+08 | 2.1028E+03 | 4.2773E+02 |
| IDARSOA16 | 2.1260E+01 | 1.4831E-01 | 1.4818E+08 | 1.4395E+08 | 2.1507E+03 | 3.4058E+02 |
| IDARSOA17 | 2.1298E+01 | 1.6992E-01 | 2.2210E+08 | 4.2484E+08 | 2.1251E+03 | 3.6144E+02 |

| Variant | F13 AVG | F13 STD | F14 AVG | F14 STD | F15 AVG | F15 STD |
|---|---|---|---|---|---|---|
| IDARSOA01 | 7.3352E+02 | 7.8137E+00 | 1.9000E+03 | 0.0000E+00 | 1.9923E+06 | 9.7803E+06 |
| IDARSOA02 | 7.3522E+02 | 8.2661E+00 | 1.9000E+03 | 0.0000E+00 | 1.9171E+06 | 9.7928E+06 |
| IDARSOA03 | 7.3240E+02 | 9.9243E+00 | 1.9000E+03 | 0.0000E+00 | 5.2395E+04 | 1.8689E+05 |
| IDARSOA04 | 7.3718E+02 | 1.2167E+01 | 1.9000E+03 | 0.0000E+00 | 1.6897E+05 | 3.0976E+05 |
| IDARSOA05 | 7.3812E+02 | 1.4718E+01 | 1.9000E+03 | 0.0000E+00 | 3.7445E+06 | 1.3596E+07 |
| IDARSOA06 | 7.3422E+02 | 9.1288E+00 | 1.9000E+03 | 0.0000E+00 | 2.1241E+05 | 3.2841E+05 |
| IDARSOA07 | 7.3583E+02 | 7.0423E+00 | 1.9000E+03 | 0.0000E+00 | 1.9924E+06 | 9.7802E+06 |
| IDARSOA08 | 7.3996E+02 | 1.0126E+01 | 1.9000E+03 | 0.0000E+00 | 2.7562E+06 | 1.0101E+07 |
| IDARSOA09 | 7.3726E+02 | 6.7043E+00 | 1.9000E+03 | 0.0000E+00 | 9.6037E+06 | 2.0197E+07 |
| IDARSOA10 | 7.4737E+02 | 6.7846E+00 | 1.9000E+03 | 0.0000E+00 | 2.4328E+06 | 9.9372E+06 |
| IDARSOA11 | 7.4906E+02 | 6.9347E+00 | 1.9000E+03 | 0.0000E+00 | 3.0081E+05 | 3.3392E+05 |
| IDARSOA12 | 7.4096E+02 | 7.4914E+00 | 1.9000E+03 | 0.0000E+00 | 2.0581E+06 | 9.7678E+06 |
| IDARSOA13 | 7.3780E+02 | 7.5538E+00 | 1.9000E+03 | 0.0000E+00 | 3.7639E+06 | 1.3590E+07 |
| IDARSOA14 | 7.3852E+02 | 1.0807E+01 | 1.9000E+03 | 0.0000E+00 | 7.5295E+04 | 2.2042E+05 |
| IDARSOA15 | 7.3825E+02 | 1.1380E+01 | 1.9000E+03 | 0.0000E+00 | 9.3968E+04 | 2.3985E+05 |
| IDARSOA16 | 7.3843E+02 | 1.2874E+01 | 1.9000E+03 | 0.0000E+00 | 2.2068E+05 | 3.2102E+05 |
| IDARSOA17 | 7.3505E+02 | 8.0908E+00 | 1.9000E+03 | 0.0000E+00 | 1.3849E+05 | 2.5673E+05 |

| Variant | F16 AVG | F16 STD | F17 AVG | F17 STD | F18 AVG | F18 STD |
|---|---|---|---|---|---|---|
| IDARSOA01 | 1.6346E+03 | 2.6021E+01 | 2.4613E+05 | 2.0420E+05 | 2.2977E+03 | 1.4230E+01 |
| IDARSOA02 | 1.6415E+03 | 3.7938E+01 | 2.2126E+06 | 1.0799E+07 | 2.3003E+03 | 1.3598E-01 |
| IDARSOA03 | 1.6415E+03 | 4.2627E+01 | 1.6688E+05 | 1.9092E+05 | 2.2925E+03 | 2.3775E+01 |
| IDARSOA04 | 1.6335E+03 | 4.4810E+01 | 2.3329E+05 | 4.5377E+05 | 2.2949E+03 | 2.0431E+01 |
| IDARSOA05 | 1.6336E+03 | 4.0635E+01 | 2.6040E+05 | 2.0104E+05 | 2.2937E+03 | 2.0205E+01 |
| IDARSOA06 | 1.6291E+03 | 2.2961E+01 | 2.8097E+05 | 4.5706E+05 | 2.2820E+03 | 3.1238E+01 |
| IDARSOA07 | 1.6339E+03 | 3.2790E+01 | 4.3703E+05 | 7.0714E+05 | 2.2840E+03 | 3.0200E+01 |
| IDARSOA08 | 1.6304E+03 | 2.3174E+01 | 2.8207E+05 | 4.5525E+05 | 2.2801E+03 | 3.2190E+01 |
| IDARSOA09 | 1.6467E+03 | 4.0193E+01 | 4.2740E+05 | 5.7028E+05 | 2.2827E+03 | 3.0705E+01 |
| IDARSOA10 | 1.6450E+03 | 3.1481E+01 | 3.2174E+05 | 1.8418E+05 | 2.2830E+03 | 2.6971E+01 |
| IDARSOA11 | 1.6380E+03 | 2.8113E+01 | 2.7357E+05 | 2.1122E+05 | 2.2817E+03 | 3.0474E+01 |
| IDARSOA12 | 1.6298E+03 | 2.2318E+01 | 3.8072E+05 | 5.8818E+05 | 2.2811E+03 | 3.1022E+01 |
| IDARSOA13 | 1.6515E+03 | 1.2246E+02 | 2.0171E+05 | 2.0005E+05 | 2.2817E+03 | 3.2621E+01 |
| IDARSOA14 | 1.6362E+03 | 3.8043E+01 | 1.3530E+05 | 1.8946E+05 | 2.2983E+03 | 1.0829E+01 |
| IDARSOA15 | 1.6569E+03 | 4.6309E+01 | 2.5786E+05 | 4.5928E+05 | 2.2980E+03 | 1.2827E+01 |
| IDARSOA16 | 1.6468E+03 | 4.3076E+01 | 2.9995E+05 | 4.4376E+05 | 2.3003E+03 | 1.4027E-01 |
| IDARSOA17 | 1.6395E+03 | 4.2421E+01 | 3.4032E+05 | 4.3474E+05 | 2.2979E+03 | 1.3421E+01 |

| Variant | F19 AVG | F19 STD | F20 AVG | F20 STD | Mean Level | Rank |
|---|---|---|---|---|---|---|
| IDARSOA01 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 5 | 2 |
| IDARSOA02 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 6.6 | 6 |
| IDARSOA03 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 5.3 | 3 |
| IDARSOA04 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 4.65 | 1 |
| IDARSOA05 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 5.8 | 4 |
| IDARSOA06 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 6.5 | 5 |
| IDARSOA07 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 8.85 | 13 |
| IDARSOA08 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 8.65 | 11 |
| IDARSOA09 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 10.95 | 16 |
| IDARSOA10 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 11.9 | 17 |
| IDARSOA11 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 10.65 | 15 |
| IDARSOA12 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 9.25 | 14 |
| IDARSOA13 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 7.5 | 9 |
| IDARSOA14 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 7.45 | 8 |
| IDARSOA15 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 6.6 | 6 |
| IDARSOA16 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 8.75 | 12 |
| IDARSOA17 | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | 8.15 | 10 |
Table 4. The performance of the two strategies on SOA (1 = strategy enabled, 0 = disabled).

| Algorithm | ID | AR |
|---|---|---|
| SOA | 0 | 0 |
| ARSOA | 0 | 1 |
| IDSOA | 1 | 0 |
| IDARSOA | 1 | 1 |
Table 5. Comparison of Wilcoxon signed-rank test results of different SOAs.

| Algorithm | F1 | F2 | F3 | F4 | F5 | F6 |
|---|---|---|---|---|---|---|
| IDARSOA | N/A | N/A | N/A | N/A | N/A | N/A |
| IDSOA | 9.7656E-04 | 1.0201E-01 | 3.5152E-06 | 1.9729E-05 | 3.7243E-05 | 1.2506E-04 |
| ARSOA | 9.7656E-04 | 2.7016E-05 | 1.3820E-03 | 1.3601E-05 | 1.0246E-05 | 4.9916E-03 |
| SOA | 9.7656E-04 | 2.7016E-05 | 1.9209E-06 | 1.9209E-06 | 1.7344E-06 | 3.5152E-06 |

| Algorithm | F7 | F8 | F9 | F10 | F11 | F12 |
|---|---|---|---|---|---|---|
| IDARSOA | N/A | N/A | N/A | N/A | N/A | N/A |
| IDSOA | 1.9209E-06 | 1.7344E-06 | 4.0715E-05 | 2.6033E-06 | 1.6046E-04 | 2.6033E-06 |
| ARSOA | 7.1889E-01 | 2.7116E-01 | 2.5967E-05 | 3.6826E-02 | 7.5137E-05 | 5.4463E-02 |
| SOA | 5.7517E-06 | 1.7344E-06 | 2.1266E-06 | 2.6033E-06 | 1.9729E-05 | 1.7344E-06 |

| Algorithm | F13 | F14 | F15 | F16 | F17 | F18 |
|---|---|---|---|---|---|---|
| IDARSOA | N/A | N/A | N/A | N/A | N/A | N/A |
| IDSOA | 7.8126E-01 | 1.0000E+00 | 9.2710E-03 | 1.7344E-06 | 2.2102E-01 | 2.5967E-05 |
| ARSOA | 6.3391E-06 | 1.0000E+00 | 1.9569E-02 | 1.5658E-02 | 8.5896E-02 | 2.1827E-02 |
| SOA | 5.2165E-06 | 1.0000E+00 | 1.6046E-04 | 1.7344E-06 | 1.5286E-01 | 1.7344E-06 |

| Algorithm | F19 | F20 | +/−/= | ARV | RANK |
|---|---|---|---|---|---|
| IDARSOA | N/A | N/A | — | 1.4 | 1 |
| IDSOA | 9.7656E-04 | 4.3778E-04 | 13/3/4 | 2.5 | 3 |
| ARSOA | 1.0000E+00 | 1.0000E+00 | 11/2/7 | 2.05 | 2 |
| SOA | 4.8828E-04 | 4.3778E-04 | 17/1/2 | 3.45 | 4 |
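The p-values above (and in Table 6 below) come from a paired, two-sided Wilcoxon signed-rank test between IDARSOA and each competitor over the independent runs on the same function; p < 0.05 is the usual significance threshold, and N/A marks the reference algorithm itself. A minimal sketch of how one such entry could be computed, assuming SciPy is available and using hypothetical run logs `runs_idarsoa` and `runs_soa`:

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
# Hypothetical best-fitness-per-run logs for two algorithms on one function
# (30 paired runs each); replace with the real experimental records.
runs_idarsoa = rng.normal(loc=1.00, scale=1e-3, size=30)
runs_soa = rng.normal(loc=1.05, scale=1e-2, size=30)

# Paired, two-sided Wilcoxon signed-rank test, as commonly reported in
# metaheuristics papers.
stat, p = wilcoxon(runs_idarsoa, runs_soa)
print(f"statistic = {stat:.1f}, p-value = {p:.4e}")
```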
Table 6. Comparison of Wilcoxon signed-rank test results of different SOAs in high dimension.

dim = 50:

| Algorithm | F11 | F12 | F13 | F14 | F15 | F16 |
|---|---|---|---|---|---|---|
| IDARSOA | N/A | N/A | N/A | N/A | N/A | N/A |
| IDSOA | 1.7344E-06 | 1.7344E-06 | 1.7344E-06 | 1.0000E+00 | 4.0715E-05 | 4.7292E-06 |
| ARSOA | 1.7344E-06 | 2.4308E-02 | 1.7344E-06 | 1.0000E+00 | 3.1817E-06 | 1.7344E-06 |
| SOA | 1.7344E-06 | 1.7344E-06 | 1.7344E-06 | 1.0000E+00 | 1.7344E-06 | 1.7344E-06 |

| Algorithm | F17 | F18 | F19 | F20 | +/−/= | ARV | RANK |
|---|---|---|---|---|---|---|---|
| IDARSOA | N/A | N/A | N/A | N/A | — | 1.1 | 1 |
| IDSOA | 1.4936E-05 | 1.7344E-06 | 1.0000E+00 | 1.0000E+00 | 6/1/3 | 1.9 | 2 |
| ARSOA | 4.4493E-05 | 2.8786E-06 | 1.0000E+00 | 1.0000E+00 | 7/0/3 | 2.2 | 3 |
| SOA | 1.7344E-06 | 1.7344E-06 | 1.0000E+00 | 1.0000E+00 | 7/0/3 | 3 | 4 |

dim = 100:

| Algorithm | F11 | F12 | F13 | F14 | F15 | F16 |
|---|---|---|---|---|---|---|
| IDARSOA | N/A | N/A | N/A | N/A | N/A | N/A |
| IDSOA | 1.7344E-06 | 1.7344E-06 | 2.7653E-03 | 1.0000E+00 | 3.3173E-04 | 4.1955E-04 |
| ARSOA | 1.7344E-06 | 4.2843E-01 | 1.9209E-06 | 1.0000E+00 | 8.1878E-05 | 4.8603E-05 |
| SOA | 1.7344E-06 | 1.7344E-06 | 1.7344E-06 | 1.0000E+00 | 1.7344E-06 | 1.7344E-06 |

| Algorithm | F17 | F18 | F19 | F20 | +/−/= | ARV | RANK |
|---|---|---|---|---|---|---|---|
| IDARSOA | N/A | N/A | N/A | N/A | — | 1.3 | 1 |
| IDSOA | 2.7653E-03 | 4.1955E-04 | 1.0000E+00 | 1.0000E+00 | 5/2/3 | 2.1 | 2 |
| ARSOA | 9.2710E-03 | 2.1266E-06 | 1.0000E+00 | 1.0000E+00 | 6/0/4 | 2.1 | 2 |
| SOA | 3.3173E-04 | 1.7344E-06 | 1.0000E+00 | 1.0000E+00 | 7/0/3 | 3 | 4 |
Table 7. Parameter settings of original algorithms.

| Algorithm | Population Size | Maximum Evaluation Times | Other Parameters |
|---|---|---|---|
| IDARSOA | 30 | 300,000 | fc = 2; k ∈ [1, 30]; rd ∈ [0, 1]; u = 1; v = 1; R ∈ [0, 1]; θ ∈ [0, 2π]; ω1 = 0.5; ω2 = 0.4 |
| SCA | 30 | 300,000 | a = 2; r1 ∈ [0, 2]; r2 ∈ [0, 2π]; r3 ∈ [0, 2]; r4 ∈ [0, 1] |
| FA | 30 | 300,000 | alpha = 0.5; beta = 0.2; gamma = 1 |
| WOA | 30 | 300,000 | a1 ∈ [0, 2]; a2 ∈ [−2, −1]; b = 1; p ∈ [0, 1]; r1 ∈ [0, 1]; r2 ∈ [0, 1] |
| BA | 30 | 300,000 | A = 0.5; r = 0.5 |
| MFO | 30 | 300,000 | b = 1; a ∈ [−2, −1]; t ∈ [−1, 1] |
Table 8. Comparison of IDARSOA and original algorithms.

| Algorithm | F1 AVG | F1 STD | F2 AVG | F2 STD | F3 AVG | F3 STD |
|---|---|---|---|---|---|---|
| IDARSOA | 1.0000E+00 | 9.9190E-14 | 4.4624E+00 | 2.6850E-01 | 2.4698E+00 | 1.3612E+00 |
| SCA | 1.6438E+05 | 4.7520E+05 | 1.5949E+03 | 9.1850E+02 | 7.3779E+00 | 1.5579E+00 |
| FA | 1.9599E+07 | 7.4561E+06 | 4.8321E+03 | 5.5803E+02 | 8.7553E+00 | 3.7347E-01 |
| WOA | 5.8467E+05 | 1.0555E+06 | 7.1961E+03 | 2.7167E+03 | 2.2456E+00 | 1.0804E+00 |
| BA | 1.7570E+08 | 1.8006E+08 | 1.2794E+04 | 7.2381E+03 | 9.0701E+00 | 9.9474E-01 |
| MFO | 7.5475E+06 | 7.7712E+06 | 1.8202E+03 | 2.7692E+03 | 6.9546E+00 | 2.1818E+00 |

| Algorithm | F4 AVG | F4 STD | F5 AVG | F5 STD | F6 AVG | F6 STD |
|---|---|---|---|---|---|---|
| IDARSOA | 2.8324E+01 | 1.1700E+01 | 2.0808E+00 | 7.2777E-01 | 5.5185E+00 | 1.3272E+00 |
| SCA | 3.5327E+01 | 6.6120E+00 | 5.5044E+00 | 2.1222E+00 | 6.2583E+00 | 1.0812E+00 |
| FA | 3.6286E+01 | 4.4263E+00 | 9.4106E+00 | 1.6023E+00 | 7.4503E+00 | 4.8768E-01 |
| WOA | 4.8000E+01 | 1.8752E+01 | 1.7457E+00 | 3.2824E-01 | 7.0760E+00 | 1.9836E+00 |
| BA | 7.1763E+01 | 2.2951E+01 | 1.4952E+00 | 8.8400E-02 | 9.4079E+00 | 2.0098E+00 |
| MFO | 2.8907E+01 | 9.5872E+00 | 2.3353E+00 | 3.9566E+00 | 4.4310E+00 | 1.7569E+00 |

| Algorithm | F7 AVG | F7 STD | F8 AVG | F8 STD | F9 AVG | F9 STD |
|---|---|---|---|---|---|---|
| IDARSOA | 1.3741E+03 | 3.6634E+02 | 3.9526E+00 | 3.2795E-01 | 1.2210E+00 | 9.9084E-02 |
| SCA | 1.1619E+03 | 2.1455E+02 | 3.9556E+00 | 2.8269E-01 | 1.3891E+00 | 8.2047E-02 |
| FA | 1.1666E+03 | 1.5428E+02 | 4.2398E+00 | 1.4230E-01 | 1.6918E+00 | 9.4065E-02 |
| WOA | 1.1236E+03 | 3.8056E+02 | 4.2289E+00 | 3.5462E-01 | 1.3197E+00 | 1.7161E-01 |
| BA | 1.4699E+03 | 3.0922E+02 | 4.5483E+00 | 2.8625E-01 | 1.3975E+00 | 1.9923E-01 |
| MFO | 1.0700E+03 | 3.8785E+02 | 4.4744E+00 | 2.9561E-01 | 1.3434E+00 | 1.5116E-01 |

| Algorithm | F10 AVG | F10 STD | F11 AVG | F11 STD | F12 AVG | F12 STD |
|---|---|---|---|---|---|---|
| IDARSOA | 2.1250E+01 | 1.3100E-01 | 5.5253E+07 | 7.9504E+07 | 2.2202E+03 | 4.4254E+02 |
| SCA | 2.1102E+01 | 1.0959E+00 | 4.9127E+08 | 2.4689E+08 | 2.1567E+03 | 1.5194E+02 |
| FA | 2.1023E+01 | 6.0846E-01 | 5.6923E+08 | 1.5375E+08 | 2.1972E+03 | 1.5409E+02 |
| WOA | 2.1065E+01 | 8.9335E-02 | 2.2785E+03 | 1.1634E+03 | 2.0847E+03 | 3.3080E+02 |
| BA | 2.1306E+01 | 7.8357E-02 | 1.0418E+05 | 5.1233E+04 | 2.4523E+03 | 3.1571E+02 |
| MFO | 2.1168E+01 | 1.7784E-01 | 7.9222E+07 | 3.0168E+08 | 2.0807E+03 | 3.3741E+02 |

| Algorithm | F13 AVG | F13 STD | F14 AVG | F14 STD | F15 AVG | F15 STD |
|---|---|---|---|---|---|---|
| IDARSOA | 7.3703E+02 | 1.2169E+01 | 1.9000E+03 | 0.0000E+00 | 1.8669E+06 | 9.8014E+06 |
| SCA | 7.5499E+02 | 7.4892E+00 | 1.9001E+03 | 5.8440E-01 | 8.8925E+04 | 1.2521E+05 |
| FA | 7.9477E+02 | 8.1099E+00 | 1.9104E+03 | 2.1422E+00 | 1.8894E+04 | 8.4283E+03 |
| WOA | 7.7068E+02 | 2.3497E+01 | 1.9000E+03 | 6.1056E-02 | 5.3733E+03 | 3.3812E+03 |
| BA | 8.5547E+02 | 5.4071E+01 | 1.9025E+03 | 1.1437E+00 | 3.5577E+03 | 1.1950E+03 |
| MFO | 7.3963E+02 | 1.7116E+01 | 1.9016E+03 | 1.6222E+00 | 8.3721E+04 | 1.4693E+05 |

| Algorithm | F16 AVG | F16 STD | F17 AVG | F17 STD | F18 AVG | F18 STD |
|---|---|---|---|---|---|---|
| IDARSOA | 1.6286E+03 | 2.4444E+01 | 2.6228E+05 | 2.0391E+05 | 2.2955E+03 | 1.8224E+01 |
| SCA | 1.6233E+03 | 1.4381E+01 | 4.4090E+03 | 1.1815E+03 | 2.2840E+03 | 2.4760E+01 |
| FA | 1.6511E+03 | 1.5534E+01 | 4.5973E+03 | 1.0688E+03 | 2.2893E+03 | 1.0275E+01 |
| WOA | 1.7174E+03 | 6.8250E+01 | 4.7778E+03 | 2.2965E+03 | 2.2982E+03 | 1.0879E+01 |
| BA | 1.8946E+03 | 1.3756E+02 | 2.8174E+03 | 3.1854E+02 | 2.3172E+03 | 1.3160E+01 |
| MFO | 1.7649E+03 | 1.1782E+02 | 3.4778E+04 | 8.8401E+04 | 2.2960E+03 | 1.5197E+01 |

| Algorithm | F19 AVG | F19 STD | F20 AVG | F20 STD | +/−/= | ARV | Rank |
|---|---|---|---|---|---|---|---|
| IDARSOA | 2.6000E+03 | 0.0000E+00 | 2.7000E+03 | 0.0000E+00 | — | 2.55 | 1 |
| SCA | 2.8366E+03 | 6.1747E+00 | 2.9578E+03 | 2.2701E+01 | 11/4/5 | 3.25 | 3 |
| FA | 2.8317E+03 | 6.3671E+00 | 2.9833E+03 | 1.1435E+01 | 14/4/2 | 4.35 | 5 |
| WOA | 2.7221E+03 | 1.3152E+02 | 2.9248E+03 | 7.9355E+01 | 11/5/4 | 2.85 | 2 |
| BA | 2.7663E+03 | 1.1570E+02 | 2.9257E+03 | 7.9028E+01 | 15/3/2 | 4.65 | 6 |
| MFO | 2.8201E+03 | 7.2237E+00 | 2.9526E+03 | 3.9734E+01 | 12/5/3 | 3.35 | 4 |
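In tables of this kind, +/−/= typically counts the functions on which IDARSOA is significantly better than, worse than, or equivalent to the row algorithm, and ARV is the average ranking value: the algorithms are ranked per function by mean error (1 = best) and the ranks are averaged over all 20 functions. A small sketch of the ARV computation, assuming a hypothetical `results` matrix of mean errors (the three rows below reuse the F1–F3 means of IDARSOA, SCA, and FA from Table 8):

```python
import numpy as np
from scipy.stats import rankdata

# Hypothetical mean-error matrix: rows = benchmark functions,
# columns = algorithms (here IDARSOA, SCA, FA).
results = np.array([
    [1.0000, 1.6438e5, 1.9599e7],  # F1
    [4.4624, 1.5949e3, 4.8321e3],  # F2
    [2.4698, 7.3779, 8.7553],      # F3
])

# Rank the algorithms on each function (1 = best; ties share the averaged
# rank), then average over functions to obtain the ARV per algorithm.
ranks = rankdata(results, axis=1)
print(ranks.mean(axis=0))  # -> [1. 2. 3.]
```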
Table 9. Parameter settings of advanced algorithms.

| Algorithm | Population Size | Maximum Evaluation Times | Other Parameters |
|---|---|---|---|
| IDARSOA | 30 | 300,000 | fc = 2; k ∈ [1, 30]; rd ∈ [0, 1]; u = 1; v = 1; R ∈ [0, 1]; θ ∈ [0, 2π]; ω1 = 0.5; ω2 = 0.4 |
| CBA | 30 | 300,000 | Qmin = 0; Qmax = 2 |
| FSTPSO | 30 | 300,000 | Vmax = 6; Vmin = −6; c1 = 2; c2 = 2; ω = 0.9 |
| CDLOBA | 30 | 300,000 | Qmin = 0; Qmax = 2 |
| BSSFOA | 30 | 300,000 | delta = 0.9; Fmin = 0; Fmax = 2 |
| PPPSO | 30 | 300,000 | Vmax = 6; Vmin = 0; c1 = 2; c2 = 2; k1 = 0.5; k2 = 0.3; kp = 0.45; ki = 0.3; X = 0.7298 |
| CESCA | 30 | 300,000 | a = 2; beta = 1.5; ChaosVec(1) = 0.7 |
| CMFO | 30 | 300,000 | CC(1) = 0.7; b = 1 |
| SCAPSO | 30 | 300,000 | Vmax = 4; ωmax = 0.9; ωmin = 0.4; c1 = 2; c2 = 2; a = 2 |
| CCMWOA | 30 | 300,000 | m = 1500; b = 1 |
Table 10. Comparison of IDARSOA and other advanced algorithms.

| Algorithm | F1 AVG | F1 STD | F2 AVG | F2 STD | F3 AVG | F3 STD |
|---|---|---|---|---|---|---|
| IDARSOA | 1.000000E+00 | 9.503525E-13 | 4.484137E+00 | 3.007712E-01 | 2.467507E+00 | 1.378922E+00 |
| CBA | 2.045314E+05 | 3.169591E+05 | 7.063138E+03 | 4.005939E+03 | 9.473201E+00 | 1.441603E+00 |
| FSTPSO | 5.416835E+06 | 7.040553E+06 | 2.823097E+03 | 1.665675E+03 | 8.854926E+00 | 1.712659E+00 |
| CDLOBA | 5.366078E+08 | 3.447411E+08 | 2.126215E+04 | 5.968429E+03 | 8.539816E+00 | 1.479686E+00 |
| BSSFOA | 1.000000E+00 | 3.955114E-11 | 5.000000E+00 | 2.366243E-06 | 5.313317E+16 | 2.834720E+17 |
| PPPSO | 4.063737E+07 | 3.657041E+07 | 6.747908E+03 | 3.576540E+03 | 4.287855E+00 | 2.220982E+00 |
| CESCA | 1.000000E+00 | 0.000000E+00 | 1.209563E+03 | 7.315292E+02 | 9.812875E+00 | 6.840541E-01 |
| CMFO | 2.495589E+07 | 1.852789E+07 | 8.484967E+03 | 3.251125E+03 | 2.102516E+00 | 8.270586E-01 |
| SCAPSO | 1.000003E+00 | 1.283621E-05 | 5.000000E+00 | 0.000000E+00 | 8.806864E+00 | 5.061346E-01 |
| CCMWOA | 1.000000E+00 | 0.000000E+00 | 5.000000E+00 | 0.000000E+00 | 3.963592E+00 | 1.168292E+00 |

| Algorithm | F4 AVG | F4 STD | F5 AVG | F5 STD | F6 AVG | F6 STD |
|---|---|---|---|---|---|---|
| IDARSOA | 2.717251E+01 | 1.254797E+01 | 2.139273E+00 | 7.451321E-01 | 5.271715E+00 | 1.476269E+00 |
| CBA | 6.694629E+01 | 2.317729E+01 | 1.490043E+00 | 5.325698E-01 | 1.073698E+01 | 1.808086E+00 |
| FSTPSO | 5.043442E+01 | 1.343329E+01 | 6.034827E+00 | 3.838452E+00 | 6.986759E+00 | 1.597126E+00 |
| CDLOBA | 5.746809E+01 | 2.240174E+01 | 1.242134E+00 | 1.954810E-01 | 1.043361E+01 | 1.157089E+00 |
| BSSFOA | 1.442075E+02 | 4.588835E+00 | 1.668396E+02 | 2.200394E+01 | 1.697462E+01 | 5.541694E-01 |
| PPPSO | 3.839310E+01 | 1.074413E+01 | 1.283875E+00 | 1.518439E-01 | 6.462171E+00 | 1.531601E+00 |
| CESCA | 9.448928E+01 | 9.699659E+00 | 8.953531E+01 | 1.596622E+01 | 1.108156E+01 | 8.666638E-01 |
| CMFO | 3.749406E+01 | 1.589558E+01 | 2.574459E+00 | 3.660344E+00 | 7.706524E+00 | 1.621081E+00 |
| SCAPSO | 5.057120E+01 | 1.642438E+01 | 1.565520E+00 | 8.810254E-02 | 6.913048E+00 | 1.777494E+00 |
| CCMWOA | 4.986932E+01 | 1.008877E+01 | 3.479725E+00 | 1.161293E+00 | 7.444709E+00 | 1.245598E+00 |

| Algorithm | F7 AVG | F7 STD | F8 AVG | F8 STD | F9 AVG | F9 STD |
|---|---|---|---|---|---|---|
| IDARSOA | 1.528968E+03 | 4.727661E+02 | 3.940307E+00 | 4.320032E-01 | 1.235618E+00 | 1.113516E-01 |
| CBA | 1.374383E+03 | 3.420663E+02 | 4.853054E+00 | 2.295465E-01 | 1.405100E+00 | 1.721617E-01 |
| FSTPSO | 1.203606E+03 | 3.716102E+02 | 4.538657E+00 | 4.262319E-01 | 1.340619E+00 | 1.231064E-01 |
| CDLOBA | 1.380348E+03 | 3.537207E+02 | 4.896797E+00 | 1.828565E-01 | 1.479007E+00 | 2.249296E-01 |
| BSSFOA | 3.192738E+03 | 2.412846E+02 | 5.611770E+00 | 9.256059E-02 | 4.769840E+00 | 7.678918E-01 |
| PPPSO | 1.318088E+03 | 2.810489E+02 | 4.498047E+00 | 3.964417E-01 | 1.249141E+00 | 3.765369E+00 |
| CESCA | 1.993929E+03 | 1.805498E+02 | 5.028661E+00 | 1.188186E-01 | 9.121988E-02 | 4.013362E-01 |
| CMFO | 1.336276E+03 | 3.394988E+02 | 4.722025E+00 | 2.486458E-01 | 1.249141E+00 | 3.765369E+00 |
| SCAPSO | 1.166070E+03 | 2.627874E+02 | 4.100872E+00 | 3.769670E-01 | 9.121988E-02 | 4.013362E-01 |
| CCMWOA | 1.141992E+03 | 3.629329E+02 | 4.476690E+00 | 3.195163E-01 | 1.249141E+00 | 3.765369E+00 |

| Algorithm | F10 AVG | F10 STD | F11 AVG | F11 STD | F12 AVG | F12 STD |
|---|---|---|---|---|---|---|
| IDARSOA | 2.128730E+01 | 1.542929E-01 | 2.055593E+08 | 8.387840E+08 | 2.167846E+03 | 5.225523E+02 |
| CBA | 2.104854E+01 | 9.277244E-02 | 1.393178E+03 | 7.324373E+02 | 2.402761E+03 | 3.703850E+02 |
| FSTPSO | 2.103147E+01 | 3.766262E-02 | 6.699923E+08 | 5.783327E+08 | 2.213364E+03 | 2.199824E+02 |
| CDLOBA | 2.128063E+01 | 7.271713E-02 | 2.264579E+03 | 9.065271E+02 | 2.415362E+03 | 2.767366E+02 |
| BSSFOA | 2.152992E+01 | 1.151529E-02 | 3.246366E+10 | 4.869552E+09 | 4.034769E+03 | 1.913518E+02 |
| PPPSO | 2.109848E+01 | 6.506809E-02 | 4.246755E+05 | 1.279981E+10 | 2.158117E+03 | 3.566239E+02 |
| CESCA | 2.151104E+01 | 1.304537E-01 | 2.316693E+06 | 2.045188E+09 | 2.885617E+03 | 1.487243E+02 |
| CMFO | 2.129575E+01 | 2.345907E-01 | 4.246755E+05 | 1.279981E+10 | 2.325639E+03 | 3.842820E+02 |
| SCAPSO | 2.129200E+01 | 8.592595E-02 | 2.316693E+06 | 2.045188E+09 | 2.188654E+03 | 2.606792E+02 |
| CCMWOA | 2.074818E+01 | 1.984757E+00 | 4.246755E+05 | 1.279981E+10 | 2.066235E+03 | 2.230539E+02 |

| Algorithm | F13 AVG | F13 STD | F14 AVG | F14 STD | F15 AVG | F15 STD |
|---|---|---|---|---|---|---|
| IDARSOA | 7.365436E+02 | 1.047151E+01 | 1.900000E+03 | 0.000000E+00 | 1.905660E+06 | 9.794928E+06 |
| CBA | 9.157041E+02 | 8.172201E+01 | 1.909141E+03 | 3.830691E+00 | 4.250929E+03 | 2.427470E+03 |
| FSTPSO | 7.607505E+02 | 1.897089E+01 | 1.903777E+03 | 2.134699E+00 | 1.443403E+04 | 5.269551E+04 |
| CDLOBA | 9.592444E+02 | 8.807963E+01 | 1.908928E+03 | 5.950454E+00 | 4.725657E+03 | 2.539698E+03 |
| BSSFOA | 8.771444E+02 | 1.008231E+01 | 1.900000E+03 | 0.000000E+00 | 1.373129E+07 | 2.834251E+07 |
| PPPSO | 7.532335E+02 | 1.710264E+01 | 1.901063E+03 | 5.062910E-01 | 1.011323E+04 | 1.127936E+04 |
| CESCA | 8.456961E+02 | 1.331773E+01 | 1.900604E+03 | 7.846227E-01 | 1.154661E+06 | 6.668748E+05 |
| CMFO | 7.614705E+02 | 2.633592E+01 | 1.903336E+03 | 3.169298E+00 | 8.447643E+04 | 4.428584E+05 |
| SCAPSO | 7.526187E+02 | 9.735354E+00 | 1.900000E+03 | 0.000000E+00 | 4.110251E+03 | 2.153134E+03 |
| CCMWOA | 7.666253E+02 | 2.147281E+01 | 1.900000E+03 | 0.000000E+00 | 3.088870E+04 | 6.547435E+04 |

| Algorithm | F16 AVG | F16 STD | F17 AVG | F17 STD | F18 AVG | F18 STD |
|---|---|---|---|---|---|---|
| IDARSOA | 1.625281E+03 | 1.773744E+01 | 2.026475E+05 | 2.060346E+05 | 2.297973E+03 | 1.264591E+01 |
| CBA | 1.870947E+03 | 1.636097E+02 | 3.128701E+03 | 5.301465E+02 | 2.300030E+03 | 4.910720E-02 |
| FSTPSO | 1.807712E+03 | 1.274306E+02 | 4.046043E+03 | 2.391059E+03 | 2.347481E+03 | 1.181476E+00 |
| CDLOBA | 1.883110E+03 | 1.966674E+02 | 3.480247E+03 | 1.252334E+03 | 2.301347E+03 | 1.185150E+00 |
| BSSFOA | 2.498101E+03 | 1.056882E+01 | 3.893858E+07 | 1.048266E+08 | 2.335064E+03 | 4.808892E-02 |
| PPPSO | 1.782462E+03 | 1.091577E+02 | 3.608611E+03 | 1.151662E+03 | 2.302283E+03 | 1.167129E+00 |
| CESCA | 1.811594E+03 | 1.063348E+02 | 3.837579E+05 | 2.882279E+05 | 2.341942E+03 | 4.403841E+00 |
| CMFO | 1.759178E+03 | 1.097689E+02 | 3.918881E+03 | 2.766490E+03 | 2.305250E+03 | 2.638451E+01 |
| SCAPSO | 1.743337E+03 | 8.170653E+01 | 2.972308E+03 | 3.940947E+02 | 2.338823E+03 | 2.257253E+00 |
| CCMWOA | 1.736167E+03 | 1.335177E+02 | 5.400884E+03 | 2.830385E+03 | 2.301539E+03 | 4.067334E-01 |

| Algorithm | F19 AVG | F19 STD | F20 AVG | F20 STD | +/−/= | ARV | RANK |
|---|---|---|---|---|---|---|---|
| IDARSOA | 2.600000E+03 | 0.000000E+00 | 2.700000E+03 | 0.000000E+00 | — | 3.05 | 1 |
| CBA | 2.737212E+03 | 1.384116E+02 | 2.986157E+03 | 5.742234E+01 | 13/4/3 | 6.15 | 7 |
| FSTPSO | 2.734298E+03 | 9.769952E+01 | 2.971398E+03 | 2.729669E+01 | 15/3/2 | 5.95 | 6 |
| CDLOBA | 2.805355E+03 | 9.421156E+01 | 2.980184E+03 | 6.246217E+01 | 14/3/3 | 6.7 | 8 |
| BSSFOA | 2.600000E+03 | 0.000000E+00 | 2.700000E+03 | 0.000000E+00 | 17/0/3 | 7.8 | 10 |
| PPPSO | 2.645155E+03 | 1.051386E+02 | 2.926862E+03 | 5.154461E+01 | 12/4/4 | 4.3 | 4 |
| CESCA | 2.613790E+03 | 7.941062E+00 | 2.750519E+03 | 3.357604E+01 | 18/2/0 | 7.55 | 9 |
| CMFO | 2.804255E+03 | 1.237585E+02 | 2.962768E+03 | 3.912437E+01 | 13/1/6 | 5.9 | 5 |
| SCAPSO | 2.600000E+03 | 0.000000E+00 | 2.700000E+03 | 0.000000E+00 | 7/4/9 | 3.2 | 2 |
| CCMWOA | 2.600000E+03 | 0.000000E+00 | 2.700000E+03 | 0.000000E+00 | 11/4/5 | 3.4 | 3 |
Table 11. Comparison results of the tension-compression spring problem.

| Algorithm | d | D | N | Optimum Cost |
|---|---|---|---|---|
| IDARSOA | 0.051960 | 0.363240 | 10.91947 | 0.012670 |
| DE | 0.051609 | 0.354714 | 11.41083 | 0.012670 |
| Improved HS [101] | 0.051154 | 0.349871 | 12.07643 | 0.012671 |
| PSO [102] | 0.051728 | 0.357644 | 11.24454 | 0.012675 |
| WOA [2] | 0.051207 | 0.345215 | 12.00430 | 0.012676 |
| RO [103] | 0.051370 | 0.349096 | 11.76279 | 0.012679 |
| ES [104] | 0.051989 | 0.363965 | 10.89052 | 0.012681 |
| GSA [105] | 0.050276 | 0.323680 | 13.52541 | 0.012702 |
| GA [106] | 0.051480 | 0.351661 | 11.63220 | 0.012705 |
| Mathematical optimization | 0.053396 | 0.399180 | 9.185400 | 0.012730 |
| Constraint correction | 0.050000 | 0.315900 | 14.25000 | 0.012833 |
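As a sanity check, the costs in Table 11 are reproduced by the standard tension-compression spring objective, the spring weight f(d, D, N) = (N + 2)Dd², where d is the wire diameter, D the mean coil diameter, and N the number of active coils. A minimal sketch, assuming that standard formulation:

```python
def spring_weight(d: float, D: float, N: float) -> float:
    """Standard spring-design objective: weight (N + 2) * D * d**2."""
    return (N + 2) * D * d ** 2

# IDARSOA row of Table 11: evaluates to ~0.012670, matching the table.
print(spring_weight(d=0.051960, D=0.363240, N=10.91947))
```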
Table 12. Comparison results of the pressure vessel design problem.

| Algorithm | Ts | Th | R | L | Optimum Cost |
|---|---|---|---|---|---|
| PSO (He et al.) | 0.812500 | 0.437500 | 42.091266 | 176.746500 | 6061.0777 |
| IDARSOA | 0.812500 | 0.4375 | 42.09711 | 177.1901 | 6072.4301 |
| GA [106] | 0.93750 | 0.500000 | 48.32900 | 112.67900 | 6410.381 |
| Lagrangian multiplier [107] | 1.12500 | 0.625000 | 58.29100 | 43.69000 | 7198.043 |
| BA [74] | 98.80150 | 98.10897 | 10.98606 | 200.0000 | 7258.564 |
| Branch-and-bound [108] | 1.12500 | 0.625000 | 47.70000 | 117.7100 | 8129.104 |
| GSA [105] | 1.125000 | 0.625000 | 55.988659 | 84.4542025 | 8538.8359 |
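The costs in Table 12 follow the usual pressure vessel cost function (material, forming, and welding costs); a sketch of that standard formulation, with Ts and Th the shell and head thicknesses, R the inner radius, and L the cylindrical length:

```latex
% Standard pressure-vessel design objective.
f(T_s, T_h, R, L) = 0.6224\,T_s R L + 1.7781\,T_h R^2
                  + 3.1661\,T_s^2 L + 19.84\,T_s^2 R
```

Evaluating it at the IDARSOA row gives approximately 6072.4, consistent with the reported optimum.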
Table 13. Comparison results of I-beam problem.

| Algorithm | b | h | tw | tf | Optimum Cost |
|---|---|---|---|---|---|
| IDARSOA | 50.0000 | 80.0000 | 0.9000 | 2.321769 | 0.013074 |
| SOS [109] | 50.0000 | 80.0000 | 0.9000 | 2.3218 | 0.013074 |
| CS [110] | 50.0000 | 80.0000 | 0.9000 | 2.3217 | 0.013075 |
| AGOA [111] | 43.12663 | 79.91247 | 0.932602 | 2.671865 | 0.013295 |
| ARSM [112] | 37.0500 | 80.0000 | 1.7100 | 2.3100 | 0.015700 |
| IARSM [112] | 48.4200 | 79.9900 | 0.9000 | 2.4000 | 0.131000 |
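The I-beam objective minimized here is, in its common formulation, the vertical deflection of the beam under a 5000-unit load, with b and h the flange width and section height and tw, tf the web and flange thicknesses; a sketch of that usual expression:

```latex
% Standard I-beam design objective: vertical deflection.
f(b, h, t_w, t_f) = \frac{5000}{\dfrac{t_w (h - 2 t_f)^3}{12}
  + \dfrac{b\, t_f^3}{6} + 2\, b\, t_f \left(\dfrac{h - t_f}{2}\right)^{2}}
```

The IDARSOA row evaluates to approximately 0.013074, matching the table.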
Table 14. Comparison results of speed reducer design problem.

| Algorithm | b | m | z | l1 | l2 | d1 | d2 | Optimum Cost |
|---|---|---|---|---|---|---|---|---|
| IDARSOA | 3.50608 | 0.7 | 17 | 7.3 | 7.719262 | 3.353154 | 5.288364 | 2998.7797 |
| PSO [102] | 3.50001 | 0.7 | 17 | 8.3 | 7.8 | 3.352412 | 5.286715 | 3005.7630 |
| hHHO-SCA [113] | 3.56061 | 0.7 | 17 | 7.3 | 7.991410 | 3.452569 | 5.286749 | 3029.8731 |
| SCA [72] | 3.50875 | 0.7 | 17 | 7.3 | 7.8 | 3.461020 | 5.289213 | 3030.5630 |
| GSA [105] | 3.6 | 0.7 | 17 | 8.3 | 7.8 | 3.369658 | 5.289224 | 3051.1200 |
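The speed reducer objective is, in its standard form, the weight of the gearbox as a function of the face width b, module m, number of pinion teeth z, shaft lengths l1 and l2, and shaft diameters d1 and d2; a sketch of that usual formulation:

```latex
% Standard speed-reducer design objective (weight).
f = 0.7854\, b m^2 \left(3.3333 z^2 + 14.9334 z - 43.0934\right)
  - 1.508\, b \left(d_1^2 + d_2^2\right)
  + 7.4777 \left(d_1^3 + d_2^3\right)
  + 0.7854 \left(l_1 d_1^2 + l_2 d_2^2\right)
```

The IDARSOA row evaluates to approximately 2998.8, in line with the reported 2998.7797.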
Table 15. Comparison results of the welded beam design problem.

| Algorithm | h | l | t | b | Optimum Cost |
|---|---|---|---|---|---|
| EO [114] | 0.2057 | 3.4705 | 9.03664 | 0.2057 | 1.7249 |
| RO [103] | 0.203687 | 3.528467 | 9.004233 | 0.207241 | 1.735344 |
| IDARSOA | 0.2275 | 5.8045 | 8.261455 | 0.247557 | 2.280517 |
| HS [115] | 0.244200 | 6.223100 | 8.291500 | 0.243300 | 2.380700 |
| FSA [116] | 0.244356 | 6.125792 | 8.293905 | 0.244356 | 2.38119 |
| SCA [117] | 0.244438 | 6.237967 | 8.288576 | 0.244566 | 2.385435 |
| SBM [118] | 0.2407 | 6.4851 | 8.2399 | 0.2497 | 2.4426 |
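The welded beam costs are consistent with the standard fabrication-cost objective over the weld height h, weld length l, bar thickness t, and bar width b; a sketch of that usual formulation:

```latex
% Standard welded-beam design objective (fabrication cost).
f(h, l, t, b) = 1.10471\, h^2 l + 0.04811\, t b \,(14.0 + l)
```

For example, the EO row gives approximately 1.7246 and the IDARSOA row approximately 2.2805, matching the reported costs to within rounding.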
Table 16. Comparison results of the three-bar truss design problem.

| Algorithm | x1 | x2 | Optimum Cost |
|---|---|---|---|
| IDARSOA | 0.788906 | 0.40760 | 263.8960 |
| GWO [119] | 0.79477 | 0.39192 | 263.987 |
| SCADE [120] | 0.73942 | 0.5691 | 266.0501 |
| RCBA [79] | 0.56544 | 0.64079 | 266.6156 |
| ALCPSO [121] | 0.999924 | 0.000108 | 282.8427 |
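Likewise, the three-bar truss costs are reproduced by the standard volume objective f(x1, x2) = (2√2·x1 + x2)·l with bar length l = 100. A minimal sketch, assuming that standard formulation:

```python
import math

def truss_volume(x1: float, x2: float, length: float = 100.0) -> float:
    """Standard three-bar truss objective: structural volume."""
    return (2.0 * math.sqrt(2.0) * x1 + x2) * length

# Rows of Table 16: ~263.896 (IDARSOA) and ~263.99 (GWO).
print(truss_volume(0.788906, 0.40760))
print(truss_volume(0.79477, 0.39192))
```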