CHAPTER 5

Design optimization techniques

CHAPTER OUTLINE
5.1 Population-Based Pareto Algorithms
    5.1.1 Mixed Integer Nonlinear Programming
    5.1.2 Genetic Algorithm
    5.1.3 Particle Swarm Optimization Algorithm
    5.1.4 Nondominated Sorting
5.2 Normal Boundary Intersection and Normalized Normal Constraint
References

5.1 POPULATION-BASED PARETO ALGORITHMS

5.1.1 MIXED INTEGER NONLINEAR PROGRAMMING
Mixed Integer Nonlinear Programming (MINLP) is widely employed in engineering practice, including the thermohydraulic designs of, for instance, Sections 2.2.1 and 2.2.4: both the density change of the pin fin array in the parametric analysis and the trade-off between the number of heat sources and the length of a single heat source reflect the combinatorial nature of MINLP, whose general form is

$$\min f(x, x_i) \tag{5.1}$$

subject to

$$\begin{cases}
h(x, x_i) = 0, & h \in \mathbb{R}^m \\
g(x, x_i) \le 0, & g \in \mathbb{R}^p \\
x \in \mathbb{R}^n, & x_l \le x \le x_u \\
x_i \in \mathbb{Z}^l
\end{cases} \tag{5.2}$$

Conventional algorithms are usually gradient-based techniques, which guarantee satisfaction of optimality conditions and efficient convergence. Nevertheless, gradient-based algorithms reportedly fall short on nonlinear, nonsmooth, and multimodal problems [1], which are often encountered in engineering design practice. Population-based heuristic methods arose as effective alternatives. One of the most famous is the genetic algorithm (GA), whose pioneering work can be attributed to John Holland. Traditionally in GAs, the design variables of any single candidate solution are represented as binary strings. Each individual of the randomly generated initial population is first evaluated according to its fitness function, which governs reproduction. The next generation is obtained in part through the operators, that is, crossover and mutation, that act on the selected parents. The process continues until an appropriate population size is reached. The loop is then reinitiated by the evaluation of population fitness before the operators start to work on the new generation. The most widely used GA is the NSGA-II proposed by Deb et al. [2]; the multiobjective optimization of heat exchangers at either macroscale [3] or microscale [4] is one exemplary case. Since 2002, the particle swarm optimization (PSO) algorithm, a distinguished member of the swarm intelligence family, has also drawn attention from numerous investigators. The population of candidate solutions, namely the particle swarm, is assigned an array of initial positions and velocities, which are updated iteratively in connection with the present global optima. Later developments in PSO brought collision-avoiding techniques, either by bouncing off particles that tend to gather in close proximity to a local optimum [5] or, more generally, by introducing a repulsive course when the diversity of the whole population falls below a set threshold [6]. GA and PSO are elaborated on in the following two sections.

Thermohydrodynamic Programming and Constructal Design in Microsystems. DOI: https://doi.org/10.1016/B978-0-12-813191-6.00005-7 © 2019 Elsevier Inc. All rights reserved.

5.1.2 GENETIC ALGORITHM
With the binary representation, GA starts with a group of randomly generated individuals. The next generation is essentially the outcome of the combination of the following three operators:

1. Selection. Following the convention of the MINLP defined in Eqs. (5.1) and (5.2), the fitness function accounting for the equality and inequality constraints [7] is

$$F(x, x_i) = \begin{cases}
f(x, x_i), & g(x, x_i) \le 0 \text{ and } h(x, x_i) = 0 \\
\max f(x, x_i) + \displaystyle\sum_{j=1}^{p} \max^2\left(0, g_j(x, x_i)\right) + \displaystyle\sum_{k=1}^{m} h_k^2(x, x_i), & g(x, x_i) > 0 \text{ or } h(x, x_i) \ne 0
\end{cases} \tag{5.3}$$

where $\max f(x, x_i)$ is the maximum objective value among the feasible solutions found so far, that is, solutions satisfying all the constraints in Eq. (5.2). The selection is implemented with respect to the fitness function, of which Eq. (5.3) is one of the most popular forms that incurs no additional parameters.

2. Crossover. The two encoded parents exchange one or several string bits, again in a binary convention (albeit not necessarily so, as shown in the next example), at corresponding positions. Both the length of the string bits and the position(s) are chosen randomly.
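As a concrete illustration of Eq. (5.3), a minimal sketch follows, in Python rather than the chapter's MATLAB; the functions `f`, `g`, and `h` and the value `f_max` are illustrative placeholders, not part of the book's listing:

```python
def fitness(x, f, g, h, f_max, tol=1e-9):
    """Penalty fitness of Eq. (5.3): feasible candidates keep their raw
    objective value f(x); infeasible ones are pushed above f_max, the
    worst objective value among the feasible solutions found so far."""
    gv, hv = g(x), h(x)
    if all(gj <= 0.0 for gj in gv) and all(abs(hk) <= tol for hk in hv):
        return f(x)
    # squared-violation penalties for inequality and equality constraints
    return (f_max
            + sum(max(0.0, gj) ** 2 for gj in gv)
            + sum(hk ** 2 for hk in hv))
```

Because every infeasible candidate scores no better than the worst feasible one, the selection operator needs no hand-tuned penalty coefficients, which is the appeal noted above.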


FIGURE 5.1 Flow chart of genetic algorithm (GA) in general.

3. Mutation. The mutation operator also works on one or several string bits with randomly selected length and position(s). But instead of being exchanged, the string bit(s) is (are) changed directly, as in a binary scenario from 0 to 1 or vice versa. In the real-number case, different implementations apply.

The searching process is primarily terminated by setting either the maximum number of generations or the number of generations without considerable progress in the fitness function (also referred to as stall generations). Other criteria are possible, for instance setting a time duration after which the algorithm stops. Let's consider the 2D fitness function

$$\min f(x) = 100\left(x_1^2 - x_2\right)^2 + \left(1 - x_1\right)^2 \tag{5.4}$$

subject to

$$\begin{cases}
x_1 x_2 + x_1 - x_2 + 1.5 \le 0 \\
10 - x_1 x_2 \le 0 \\
0 \le x_1 \le 1 \\
0 \le x_2 \le 13
\end{cases} \tag{5.5}$$

of which the contour plot is shown in Fig. 5.2, with the shadowed, triangle-like feasible region located at roughly the 12 o'clock position relative to the figure center. The minimum lies at the lower tip of the triangle and is obtained, as listed below, by


FIGURE 5.2 Contour plot of the sample fitness function, the minimum marked by a pentagram.

FIGURE 5.3 Detailed view of the operators acting between the parents' generation and the children's generation.

solving the equality parts of the first two constraints simultaneously, within the feasible range of the design variables defined by the third and the fourth constraints.

$$x_1 = 0.8122, \quad x_2 = 12.3122 \tag{5.6}$$
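The stated values can be checked directly: the second equality gives $x_1 x_2 = 10$, which turns the first into $x_2 = x_1 + 11.5$ and hence the quadratic $x_1^2 + 11.5 x_1 - 10 = 0$:

```python
import math

# Solve x1**2 + 11.5*x1 - 10 = 0 (after eliminating x1*x2 = 10) and keep
# the root lying inside the bound 0 <= x1 <= 1 of Eq. (5.5).
x1 = (-11.5 + math.sqrt(11.5 ** 2 + 4.0 * 10.0)) / 2.0
x2 = x1 + 11.5

print(round(x1, 4), round(x2, 4))  # 0.8122 12.3122
```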


In this specific case, elitism (i.e., a certain percentage of the population in the ith generation is selected based on fitness and copied to the (i+1)th generation), intermediate crossover, and inbound mutation are employed, each yielding its corresponding type of individuals that together constitute the population of the next generation. Fig. 5.3 shows the operators and their respective functions between generations, which for the sample fitness function is in essence a detail-enhanced view of the dashed-line frame in Fig. 5.1. More elaborate information is available in the MATLAB code as follows:
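The authors' MATLAB listing is not reproduced here; the following is a rough Python re-implementation of the configuration just described (elitism, intermediate crossover, inbound mutation), with a static quadratic penalty standing in for Eq. (5.3). All parameter values (population size, penalty weight, mutation rate) are illustrative choices, not the book's:

```python
import random

random.seed(7)

LB, UB = [0.0, 0.0], [1.0, 13.0]        # bounds of Eq. (5.5)

def f(x):                               # fitness function of Eq. (5.4)
    return 100.0 * (x[0] ** 2 - x[1]) ** 2 + (1.0 - x[0]) ** 2

def g(x):                               # inequality constraints of Eq. (5.5)
    return [x[0] * x[1] + x[0] - x[1] + 1.5, 10.0 - x[0] * x[1]]

def penalized(x):                       # static quadratic penalty (illustrative)
    return f(x) + 1e6 * sum(max(0.0, gj) ** 2 for gj in g(x))

def clip(x):                            # keep mutated genes inside the bounds
    return [min(max(v, lo), hi) for v, lo, hi in zip(x, LB, UB)]

def ga(pop_size=150, gens=300, n_elite=2, mut_rate=0.2):
    pop = [[random.uniform(lo, hi) for lo, hi in zip(LB, UB)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=penalized)
        nxt = [list(p) for p in pop[:n_elite]]            # elitism
        while len(nxt) < pop_size:
            p1, p2 = random.sample(pop[:pop_size // 2], 2)  # truncation selection
            a = random.random()                           # intermediate crossover
            child = [a * u + (1.0 - a) * v for u, v in zip(p1, p2)]
            if random.random() < mut_rate:                # inbound mutation
                i = random.randrange(len(child))
                child[i] += random.gauss(0.0, 0.1 * (UB[i] - LB[i]))
            nxt.append(clip(child))
        pop = nxt
    return min(pop, key=penalized)

best = ga()
```

With the fixed seed the run is deterministic and the best individual should settle near the lower tip of the feasible triangle, around $x_1 \approx 0.81$, $x_2 \approx 12.31$.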


Fig. 5.4 shows snapshots of individual locations in the population across different generations. The general trend indicates that the elite group gathers quickly after the 7th generation and gradually becomes invisible after the 14th generation as its members form a cluster in the proximity of the global optimum, that is, the lower


FIGURE 5.4 Snapshots during the evolution process; asterisks in green, blue, and black represent the best solution, the elite cluster, and the rest of the population in the (A) 2nd, (B) 7th, (C) 14th, and (D) 23rd generations, respectively.

tip of the triangle. The rest of the individuals in the population are still somewhat scattered, since the GA is by nature a stochastic algorithm, always trying to avoid being trapped in a local minimum. This entails a side effect in that the acquired solution would theoretically keep drifting indefinitely. An auxiliary stabilizer is thus possible, and sometimes necessary, in the form of other algorithms, gradient-based or not, depending on the actual form of the fitness function.

5.1.3 PARTICLE SWARM OPTIMIZATION ALGORITHM
The PSO algorithm, another representative of the emerging population-based algorithms, was inspired by the behavior of bird flocks, fish schools, or, more generally, the dynamics of social animals. The particle coordinates in the n-dimensional Euclidean space correspond to the n variables in the design space, while velocities are assigned to all particles, which grants them the capability of searching the design space for the best solution. After eliminating the noisy


FIGURE 5.5 Flow chart of particle swarm optimization (PSO) algorithm.

hypotheses, for example, the nearest-neighbor velocity matching [8], suggested by empirical observation, the essence of the algorithm for any single particle lies in the velocity adjustment based on the best-known coordinates within its neighborhood and those the entire swarm has experienced up to the current iteration, respectively, which is formulated as


FIGURE 5.6 Contour plot of the sample fitness function, with the feasible region shadowed and the minimum marked by a pentagram.

$$v_i' = w v_i + y_p u_p \left(p_i - x_i\right) + y_g u_g \left(g_i - x_i\right) \tag{5.7}$$

where the adjusted velocity vector $v_i'$ (the subscript $i$ representing the ith particle in the swarm) is the weighted summation of $v_i$, the original velocity; $p_i - x_i$, the difference between the coordinates of the ith particle and the best solution so far witnessed by all its neighbors; and $g_i - x_i$, the difference between the coordinates of the ith particle and the globally best solution known to the swarm. In addition to the weight constants $w$, $y_p$, and $y_g$, the latter two terms of the weighted summation are further relaxed respectively by two random numbers $u_p$ and $u_g$, uniformly distributed between 0 and 1. The coordinates are updated for each particle in the swarm until the stop criteria, often the maximum number of iterations and/or stall iterations, are reached. The structure of the algorithm is depicted in the flow chart in Fig. 5.5. Later investigations [9,10] indicate that, in practice, the weight $w$ and the neighborhood size should change dynamically depending on whether the globally known best solution is updated or not. Further details can be seen in the following 2D multimodal fitness function

$$\min \left(x_1^2 + x_2 - 11\right)^2 + \left(x_1 + x_2^2 - 7\right)^2 \tag{5.8}$$


FIGURE 5.7 Sequences of convergence; asterisks in green and black represent the best solution and the rest of the particles in the swarm, with subplots (A), (B), (C), and (D) corresponding to the 1st, 5th, 50th, and 500th iterations, respectively.

subject to

$$\begin{cases}
x_2 - \left(x_1 + 2\right)^2 \le 0 \\
4 x_1 - 10 x_2 \le 0 \\
-5 \le x_1 \le 5 \\
-5 \le x_2 \le 5
\end{cases} \tag{5.9}$$

for which the global minimum within the feasible region is

$$x_1 = 3, \quad x_2 = 2 \tag{5.10}$$

as is also marked in the contour plot in Fig. 5.6. The MATLAB code for solving the problem stated in Eqs. (5.8) and (5.9) is listed below. Fig. 5.7 shows the sequence of convergence, with the particles of the population all settled at the expected location after the 500th iteration.
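Again, the MATLAB listing is replaced here by an illustrative Python sketch of the scheme in Fig. 5.5, using the velocity update of Eq. (5.7) with a constant weight w (the dynamic adjustments of [9,10] are omitted) and a quadratic penalty for the constraints of Eq. (5.9); all parameter values are assumptions:

```python
import random

random.seed(3)

LB, UB = [-5.0, -5.0], [5.0, 5.0]       # bounds of Eq. (5.9)

def f(x):                               # fitness function of Eq. (5.8)
    return (x[0] ** 2 + x[1] - 11.0) ** 2 + (x[0] + x[1] ** 2 - 7.0) ** 2

def g(x):                               # inequality constraints of Eq. (5.9)
    return [x[1] - (x[0] + 2.0) ** 2, 4.0 * x[0] - 10.0 * x[1]]

def penalized(x):                       # quadratic penalty (illustrative)
    return f(x) + 1e6 * sum(max(0.0, gj) ** 2 for gj in g(x))

def pso(n=40, iters=500, w=0.7, yp=1.5, yg=1.5):
    xs = [[random.uniform(lo, hi) for lo, hi in zip(LB, UB)] for _ in range(n)]
    vs = [[0.0, 0.0] for _ in range(n)]
    pbest = [list(x) for x in xs]                   # per-particle best p_i
    gbest = min(pbest, key=penalized)               # swarm-wide best g_i
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                up, ug = random.random(), random.random()
                vs[i][d] = (w * vs[i][d]                     # Eq. (5.7)
                            + yp * up * (pbest[i][d] - xs[i][d])
                            + yg * ug * (gbest[d] - xs[i][d]))
                xs[i][d] = min(max(xs[i][d] + vs[i][d], LB[d]), UB[d])
            if penalized(xs[i]) < penalized(pbest[i]):
                pbest[i] = list(xs[i])
        gbest = min(pbest, key=penalized)
    return gbest

best = pso()
```

The swarm is expected to settle near the feasible minimum (3, 2) of Eq. (5.10); in unlucky runs a penalized local basin can capture it, which is exactly the premature-convergence issue the diversity-guided variants [5,6] address.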


5.1.4 NONDOMINATED SORTING
Where multiobjective optimization is concerned, the concept of the Pareto optimum, or the Pareto frontier (a hypersurface in an n-dimensional Euclidean space in general), is inevitable. Suppose the objective function (basically a vector) has N outputs (components) and there are M such vectors produced by a certain generation or iteration during the run of some optimization


algorithm. The Pareto optimum regarding that generation or iteration is then defined such that the following scenario does not exist between any two vectors, say $o_i$ and $o_j$ ($1 \le i, j \le M$; $i \ne j$), in it:

$$\begin{cases}
o_{i1} < o_{j1} \\
o_{i2} < o_{j2} \\
\quad \vdots \\
o_{iN} < o_{jN}
\end{cases} \tag{5.11}$$

where

$$o_i = \left[o_{i1}, o_{i2}, \ldots, o_{iN}\right], \quad o_j = \left[o_{j1}, o_{j2}, \ldots, o_{jN}\right] \tag{5.12}$$

In this scenario, $o_j$ is said to be dominated by $o_i$, since each of its components is inferior to (greater than) its counterpart in $o_i$ when the simultaneous minimization of all N components is expected. Among all the population vectors in each generation or iteration, we set a mark (initially zero) for each vector $o_j$, and the mark is incremented by one each time the vector is dominated, as depicted in Eqs. (5.11) and (5.12), by another vector; a higher mark means more instances of being dominated. Marking the vectors thus creates a ranking system for the entire population, in which those marked zero form the Pareto frontier of Rank No. 1 in the present generation or iteration. Note that this ranking system serves as a necessary indicator in multiobjective optimization, even though it is never an issue in conventional problems with a single

FIGURE 5.8 Simulated Pareto frontiers with different ranks, with the abscissa o1 and the ordinate o2 representing two different dimensions in the space of the objective function.


output of the objective function. The following MATLAB code shows the elaborated procedure of this nondominated sorting (ranking) [2] for a simulated population in the two-dimensional space, the result of which is shown in Fig. 5.8.
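The MATLAB listing is again replaced by a short Python sketch of the marking scheme just described; the sample points are made up for illustration:

```python
def domination_marks(points):
    """For each objective vector, count how many other vectors dominate
    it in the sense of Eq. (5.11), i.e. are strictly smaller in every
    component; vectors marked zero form the Rank No. 1 Pareto frontier."""
    marks = []
    for p in points:
        m = sum(1 for q in points
                if all(qc < pc for qc, pc in zip(q, p)))
        marks.append(m)
    return marks

# a small simulated 2D population
pts = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0), (4.0, 4.0)]
print(domination_marks(pts))  # [0, 0, 0, 1, 2]
```

Sorting the population by these marks reproduces the ranked frontiers of Fig. 5.8; the exhaustive pairwise comparison used here is what the fast nondominated sorting of NSGA-II [2] organizes more efficiently.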


5.2 NORMAL BOUNDARY INTERSECTION AND NORMALIZED NORMAL CONSTRAINT
To recap, this chapter started with the definition of MINLP, one of the most intricate problem types in the world of multiobjective optimization, then introduced the gradient-free algorithms that triggered a series of follow-up investigations, as well as the ranking system needed when simultaneous optimization is to be implemented over different dimensions of the objective function. While these methods primarily deal with the design space, the space of the objective function has so far been ignored. In operating on objective functions, neither conventional methods, for instance the Weighted Sum (WS) and Compromise Programming (CP) methods [11], nor the more popular techniques, for example the family of evolutionary algorithms (EAs) [12,13], perform well in controlling the uniformity of distribution of the generated Pareto sets. Although the stochastic nature may excuse the EAs, this is not the case for the methods originating in the period of gradient-based techniques. A more advanced treatment, the normal boundary intersection (NBI) [14], was found to generate dominated or non-Pareto solutions, owing to its utilization of equality constraints. Of all the methods discussed here, NBI nevertheless seems the most promising candidate, since it is not flawed by ill-distributed Pareto sets, provided the non-Pareto solutions it generates can be eliminated properly; this yields the normalized normal constraint (NNC) method with a Pareto filter [16]. The working principle of NBI, along with its derivative, NNC, is discussed as follows. For simplicity of illustration, a bi-objective case is addressed, assuming that the problem is to minimize two objective functions, o1 and o2, simultaneously. The first step is to solve the single-objective minimization problems regarding o1 and o2 separately.
The acquired objective function values, denoted o1* and o2* respectively, are normalized to a 0-1 scale by a linear transformation and marked as E and A in Fig. 5.9. The line segment AE is then evenly divided into n sections by n + 1 nodes (node 0 is A and node n is E). In the NBI method, the single-objective minimization with respect to o2 is implemented under the linear equality constraint passing through node (and step) i (i going sequentially from 1 to n - 1) and perpendicular to line segment AE, using the Pareto solution from step i - 1 as the initial guess. After a thorough traverse of all the intermediate nodes, a curve passing through A to E alphabetically (see Fig. 5.9) can be expected as the Pareto frontier. However, this is not exactly the case if the curve consists of both convex and concave parts, as depicted in Fig. 5.9 (e.g., C is dominated by D). If instead the NNC method, which intentionally prescribes a feasible region by using a linear inequality constraint defined in the same way as for NBI, is applied, the minimization at step i will arrive at D, leaving a gap between C and D that technically does not belong to the expected Pareto frontier. Nevertheless, the curve segment B-C (B excluded), which is obviously dominated by D, still needs elimination by a Pareto filter, which, on the other hand,


FIGURE 5.9 Graphical representation of the NBI and NNC methods [15].

only keeps the nondominated pairs through an exhaustive comparison, much the same procedure by which the fast nondominated sorting algorithm (see Section 5.1.4) finds the first-ranked Pareto frontier.
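The procedure can be sketched on a toy bi-objective problem; the objectives below (o1 = x², o2 = (x − 2)², scalar x) and the brute-force grid search that stands in for the inner single-objective solver are illustrative assumptions, not the book's example:

```python
def nnc_front(n=10):
    """Sketch of the NNC procedure for min(o1, o2): normalize both
    objectives to the 0-1 scale, step a node along the line between the
    anchor points, and at each node minimize o2 subject to the
    normal-line inequality constraint; a Pareto filter then removes any
    dominated solutions."""
    o1 = lambda x: x * x                 # minimized at x = 0 (anchor E)
    o2 = lambda x: (x - 2.0) ** 2        # minimized at x = 2 (anchor A)
    n1 = lambda x: o1(x) / 4.0           # o1 normalized by its value at x = 2
    n2 = lambda x: o2(x) / 4.0           # o2 normalized by its value at x = 0
    grid = [-1.0 + 4.0 * i / 4000.0 for i in range(4001)]
    front = []
    for k in range(n + 1):
        a = k / n                        # node k on the segment between anchors
        # inequality constraint: stay on one side of the normal through
        # node (a, 1 - a); for these objectives it reduces to
        # n1(x) - n2(x) <= 2*a - 1
        feas = [x for x in grid if n1(x) - n2(x) <= 2.0 * a - 1.0 + 1e-12]
        front.append(min(feas, key=n2))  # inner single-objective step
    pts = [(o1(x), o2(x)) for x in front]
    # Pareto filter: keep a point only if no other generated point is at
    # least as good in both objectives
    return [x for x, p in zip(front, pts)
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in pts)]
```

Here the trade-off curve is convex, so the filter removes nothing and the n + 1 solutions spread evenly over the front (x ≈ 0, 0.2, ..., 2.0); on a front with concave parts, as in Fig. 5.9, the filter is what discards segments like B-C.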

REFERENCES
[1] Chowdhury S, Tong W, Messac A, et al. A mixed-discrete particle swarm optimization algorithm with explicit diversity-preservation. Struct Multidiscip Optim 2013;47(3):367-88.
[2] Deb K, Pratap A, Agarwal S, et al. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans Evol Comput 2002;6(2):182-97.
[3] Sanaye S, Hajabdollahi H. Thermal-economic multi-objective optimization of plate fin heat exchanger using genetic algorithm. Appl Energy 2010;87(6):1893-902.
[4] Halelfadl S, Adham AM, Mohd-Ghazali N, et al. Optimization of thermal performances and pressure drop of rectangular microchannel heat sink using aqueous carbon nanotubes based nanofluid. Appl Therm Eng 2014;62(2):492-9.
[5] Krink T, Vesterstrøm JS, Riget J. Particle swarm optimisation with spatial particle extension. In: Proceedings of the 2002 Congress on Evolutionary Computation (CEC'02), vol. 2. IEEE; 2002. p. 1474-1479.
[6] Riget J, Vesterstrøm JS. A diversity-guided particle swarm optimizer - the ARPSO. Tech. Rep., vol. 2. Aarhus, Denmark: Dept. Comput. Sci., Univ. of Aarhus; 2002.
[7] Deb K. An efficient constraint handling method for genetic algorithms. Comput Methods Appl Mech Eng 2000;186(2):311-38.


[8] Kennedy J, Eberhart R. Particle swarm optimization. In: Proceedings of the IEEE International Conference on Neural Networks, vol. 4. IEEE; 1995. p. 1942-1948.
[9] Hu X, Eberhart R. Multiobjective optimization using dynamic neighborhood particle swarm optimization. In: Proceedings of the 2002 Congress on Evolutionary Computation (CEC'02), vol. 2. IEEE; 2002. p. 1677-1681.
[10] Pedersen MEH. Good parameters for particle swarm optimization. Tech. Rep. HL1001. Copenhagen, Denmark: Hvass Lab.; 2010.
[11] Chen W, Wiecek MM, Zhang J. Quality utility - a compromise programming approach to robust design. J Mech Des 1999;121(2):179-87.
[12] Cheng FY, Li D. Genetic algorithm development for multiobjective optimization of structures. AIAA J 1998;36(6).
[13] Srinivasan D, Tettamanzi A. Heuristics-guided evolutionary approach to multiobjective generation scheduling. IEE Proc Gen Transm Distrib 1996;143(6):553-9.
[14] Das I, Dennis JE. Normal-boundary intersection: a new method for generating the Pareto surface in nonlinear multicriteria optimization problems. SIAM J Optim 1998;8(3):631-57.
[15] Shi ZY, Dong T. Synthetic optimization of air turbine for dental handpieces. In: Proceedings of the 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). IEEE; 2014. p. 4815-4818.
[16] Messac A, Ismail-Yahaya A, Mattson CA. The normalized normal constraint method for generating the Pareto frontier. Struct Multidiscip Optim 2003;25(2):86-98.