Differential Evolution
Swagatam Das and Ponnuthurai Nagaratnam Suganthan, "Differential Evolution: A Survey of the State-of-the-Art," IEEE Transactions on Evolutionary Computation, vol. 15, no. 1, February 2011.
- Compared to most other EAs, DE is much simpler and more straightforward to implement. The main body of the algorithm takes only four to five lines to code in any programming language. Simplicity of coding matters for practitioners from other fields, who may not be expert programmers and want an algorithm that can be simply implemented and tuned to solve their domain-specific problems. Note that although PSO (particle swarm optimization) is also very easy to code, many studies indicate that DE and its variants largely outperform PSO variants over a wide variety of problems.
- As indicated by recent studies, despite its simplicity DE performs much better than several other algorithms of current interest, such as G3 with PCX, MA-S2, ALEP, and CPSO-H, on a wide variety of problems: uni-modal, multi-modal, separable, non-separable, and so on. Although some very strong EAs, such as restart CMA-ES, were able to beat DE at the CEC 2005 competition on non-separable objective functions, the overall performance of DE in terms of accuracy, convergence speed, and robustness still makes it attractive for real-world optimization problems, where finding an approximate solution in a reasonable amount of computational time is highly valued.
- The number of control parameters in DE is very small (Cr, F, and NP in classical DE), and the effects of these parameters on the performance of the algorithm are well-studied. Simple adaptation rules for F and Cr have been devised that improve the performance of the algorithm to a large extent without imposing any serious computational burden (a sketch of one such rule appears after this list).
- The space complexity of DE is low compared to some of the most competitive real-parameter optimizers such as CMA-ES. This feature helps in extending DE to large-scale and expensive optimization problems. Although CMA-ES remains very competitive on problems of up to 100 variables, it is difficult to extend to higher-dimensional problems because of its storage, update, and inversion operations over square matrices whose size equals the number of variables.
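As one concrete (and hedged) illustration of such adaptation rules, here is a minimal Python sketch in the style of jDE (Brest et al.), in which each individual carries its own F and Cr that are occasionally regenerated at random. The constants tau1 and tau2 and the ranges below are the commonly quoted jDE settings, not something taken from this post.

    import random

    TAU1, TAU2 = 0.1, 0.1        # probabilities of regenerating F and Cr
    F_LO, F_HI = 0.1, 1.0

    def adapt(F, Cr):
        # jDE-style rule: before building an individual's trial vector, sometimes
        # draw fresh F and Cr; the new values survive only if the trial wins
        if random.random() < TAU1:
            F = random.uniform(F_LO, F_HI)
        if random.random() < TAU2:
            Cr = random.random()
        return F, Cr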
Single Objective Differential Evolution
R. Storn and K. V. Price, "Differential evolution: A simple and efficient adaptive scheme for global optimization over continuous spaces," ICSI, USA, Tech. Rep. TR-95-012, 1995. Available: http://icsi.berkeley.edu/~storn/litera.html

Globals
- F : combination factor (try F=0.3)
- Cr : crossover rate (try Cr=0.5)
- NP : number of items in the population (try NP=100)
de
function de(NP, F)
  generate NP random solutions
  for each item P
    value[P] = score(P)
  Stop = false
  while not Stop
    for each item P
      C = candidate(P)
      value[C] = score(C)
      if better(value[C], value[P])    # here, better = higher score
        replace P with C
  print BestScore, Best
score
function score(X)
  tmp = objectiveFunction(X)    # domain specific; higher is better
  if tmp > BestScore
    BestScore = tmp
    Best = X
  if BestScore >= GoodEnough
    Stop = true
  Count++
  if Count > TooMuch
    Stop = true
  return tmp
candidate
function candidate(original)
  pick three other items x, y, z at random
  candidate = ()
  for each attribute A
    new = x[A]
    if numeric(A)
      new = x[A] + F*(y[A] - z[A])
    else
      if (rand() < F) new = (rand() < 0.5) ? y[A] : z[A]
    candidate[A] = new
  for each attribute A
    candidate[A] = (rand() < Cr) ? original[A] : candidate[A]
  k = a random attribute
  candidate[k] = original[k]    # must use at least one parent attribute
  return candidate
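For readers who want something runnable, here is a minimal Python sketch of the pseudocode above for purely numeric attributes. The objective function, bounds, dimensionality, and evaluation budget are assumptions made for illustration (the pseudocode leaves them domain-specific); F, Cr, and NP use the "try" values suggested in the Globals section.

    import random

    F, CR, NP = 0.3, 0.5, 100            # the "try" values from Globals
    DIM, LO, HI = 10, -5.0, 5.0          # assumed: 10 numeric attributes in [-5, 5]
    MAX_EVALS = 20000                    # assumed budget ("TooMuch")

    def objective(x):                    # assumed, domain-specific; higher is better
        return -sum(v * v for v in x)    # maximize the negated sphere function

    def candidate(pop, i):
        # pick three other items at random and build a DE/rand/1 style trial vector
        x, y, z = random.sample([p for j, p in enumerate(pop) if j != i], 3)
        trial = [xa + F * (ya - za) for xa, ya, za in zip(x, y, z)]
        parent = pop[i]
        # crossover, as in the pseudocode: with probability Cr keep the parent's value
        trial = [pa if random.random() < CR else ta for pa, ta in zip(parent, trial)]
        k = random.randrange(DIM)
        trial[k] = parent[k]             # must use at least one parent attribute
        return trial

    def de():
        pop = [[random.uniform(LO, HI) for _ in range(DIM)] for _ in range(NP)]
        scores = [objective(p) for p in pop]
        evals = NP
        while evals < MAX_EVALS:
            for i in range(NP):
                c = candidate(pop, i)
                s = objective(c)
                evals += 1
                if s > scores[i]:        # better() for a maximization problem
                    pop[i], scores[i] = c, s
        best = max(range(NP), key=lambda i: scores[i])
        return scores[best], pop[best]

    print(de())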
Variants
Variations on a Theme
Storn and Price suggested a total of ten different working strategies for DE, along with some guidelines for applying these strategies to any given problem.

K. Price, R. Storn, and J. Lampinen, Differential Evolution: A Practical Approach to Global Optimization. Berlin, Germany: Springer, 2005.
R. Storn and K. Price, "Differential evolution: A simple and efficient heuristic for global optimization over continuous spaces," J. Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.
None work best in all circumstances. Suck it and see!
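To make the naming convention concrete, here is a small hedged sketch (not taken from the papers) of how three of the classic strategy names map onto different donor-vector formulas; the vectors are plain Python lists, and the caller is assumed to pick distinct population members r1, r2, r3, best, and so on.

    def de_rand_1(r1, r2, r3, F):
        # DE/rand/1: donor = r1 + F*(r2 - r3)
        return [a + F * (b - c) for a, b, c in zip(r1, r2, r3)]

    def de_best_1(best, r1, r2, F):
        # DE/best/1: donor = best + F*(r1 - r2)
        return [a + F * (b - c) for a, b, c in zip(best, r1, r2)]

    def de_current_to_best_1(current, best, r1, r2, F):
        # DE/current-to-best/1: donor = current + F*(best - current) + F*(r1 - r2)
        return [x + F * (b - x) + F * (p - q)
                for x, b, p, q in zip(current, best, r1, r2)]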
Opposition-based DE
When building the candidate, select the opposite value within each numeric range:
- For an attribute ranging over 2..12, the opposite of 4 is 10 (opposite = low + high - x).
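A tiny Python sketch of that idea; the per-attribute bounds are assumed to be known.

    def opposite(x, lo, hi):
        # the opposite of x within [lo, hi] is lo + hi - x
        return lo + hi - x

    def opposite_solution(solution, bounds):
        # bounds: one (lo, hi) pair per numeric attribute
        return [lo + hi - x for x, (lo, hi) in zip(solution, bounds)]

    print(opposite(4, 2, 12))   # -> 10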
Multiple objective DE
Tea Robič and Bogdan Filipič, 2005. In standard single-objective DE, the candidate is compared to its parent (so the "original" is the parent) and there is only a single objective function.
In the next two variants, we allow multiple goals and we apply domination.
Domination (here, lower objective values are better): y1 dominates y2 if
  for every objective o, o(y1) <= o(y2)
  and for at least one objective o, o(y1) < o(y2)
- If the child dominates, replace the original with the child.
- If the original dominates, keep the original and discard the child.
- If neither dominates, keep the original and also add the child to the population.
The third variant, DEMO/closest/obj, works in the same way, except that the "original" is the closest item in the objective space rather than the parent.
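A hedged Python sketch of the domination test and the parent-vs-child selection rule described above, assuming every objective is to be minimized and that objective values come back as plain lists of numbers; the truncation step DEMO uses to shrink the grown population back to NP is omitted.

    def dominates(y1, y2):
        # y1 dominates y2: no worse on every objective, strictly better on at least one
        return all(a <= b for a, b in zip(y1, y2)) and any(a < b for a, b in zip(y1, y2))

    def select(population, i, child, objectives):
        # compare the child against its parent, population[i]
        parent_obj, child_obj = objectives(population[i]), objectives(child)
        if dominates(child_obj, parent_obj):
            population[i] = child            # child wins: replace the parent
        elif dominates(parent_obj, child_obj):
            pass                             # parent wins: discard the child
        else:
            population.append(child)         # neither dominates: keep both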