Monday, September 2, 2013

Conversations with Deb


Thanks to Marouane Kessentini, I got to spend some time Friday in a Skype meeting with Kalyanmoy Deb of NSGA-II fame.

Deb told me of a new specialty in optimization: multi-objective optimization fails in high-dimensional objective spaces, at which point we go from...
  • multi-objective to
  • many-objective
Standard many-objective techniques are (e.g.) to:
  • learn correlations between objectives, so as to reduce N objectives down to some smaller M (see the sketch after this list)
  • use some domain knowledge to find combinations of objectives
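As a rough illustration of the first trick (not Deb's method, just the general idea), here is a minimal Python sketch that drops any objective strongly correlated with one already kept; the 0.9 threshold is an arbitrary assumption:

import numpy as np

def reduce_objectives(F, threshold=0.9):
    # F is an (n_solutions x n_objectives) matrix of objective scores.
    # Greedily keep an objective only if it is not strongly correlated
    # (|r| >= threshold) with some objective kept earlier.
    corr = np.corrcoef(F, rowvar=False)          # objective-by-objective correlations
    kept = []
    for j in range(F.shape[1]):
        if all(abs(corr[j, k]) < threshold for k in kept):
            kept.append(j)
    return kept, F[:, kept]

# Toy run: objective 2 is (nearly) a copy of objective 0, so it gets dropped.
F = np.random.rand(100, 3)
F[:, 2] = 2 * F[:, 0] + 0.01 * np.random.rand(100)
kept, F_small = reduce_objectives(F)
print("kept objectives:", kept)                  # e.g. [0, 1]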
Deb has been building NSGA-III, which uses "aspiration points" (supplied by the users) for many-objective problems (e.g. a 14-objective problem for land usage in New Zealand). NSGA-III has methods for recognizing impractical aspirations, then moving them to more achievable aspirations near the Pareto frontier; see e.g. http://link.springer.com/chapter/10.1007%2F978-3-642-37140-0_25
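I have not seen the NSGA-III code, but the "move an impractical aspiration" idea can be caricatured as: snap a user-supplied aspiration point to the closest solution on the current non-dominated front. A minimal sketch, assuming minimization and a simple Euclidean nearest-point rule (my assumption, not Deb's actual procedure):

import numpy as np

def nondominated(F):
    # Indices of the non-dominated rows of F (all objectives minimized).
    keep = []
    for i, fi in enumerate(F):
        dominated = any(np.all(fj <= fi) and np.any(fj < fi)
                        for j, fj in enumerate(F) if j != i)
        if not dominated:
            keep.append(i)
    return keep

def repair_aspiration(aspiration, F):
    # If the aspiration lies beyond the front, pull it back to the
    # nearest point on the current non-dominated front.
    front = F[nondominated(F)]
    dists = np.linalg.norm(front - aspiration, axis=1)
    return front[np.argmin(dists)]

# Toy run: an "everything perfect" aspiration gets pulled back to the front.
F = np.random.rand(200, 3)
wish = np.array([0.0, 0.0, 0.0])
print(repair_aspiration(wish, F))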

Other things said in that meeting:
  • HV (hypervolume) is ungood for high-objective problems: it gets computationally expensive to compute exactly as objectives are added (see the sketch after this list).
  • Spread is also ungood in high objective spaces: diversity is kinda irrelevant there, since the space between the aspiration points can be vast.
  • When users state multiple objectives, they often disagree on those objectives. So a real many-objective optimizer has to recognize diversity and clusters of objectives that might be mutually exclusive.
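On the hypervolume point: exact HV algorithms blow up as objectives are added, which is why people fall back on sampling. Here is a minimal Monte Carlo HV sketch for minimization problems (the reference point and sample count are arbitrary assumptions); its cost grows only linearly with the number of objectives:

import numpy as np

def hv_estimate(front, ref, samples=100000, seed=1):
    # Monte Carlo estimate of the hypervolume dominated by `front`
    # (minimization) relative to the reference point `ref`.
    rng = np.random.default_rng(seed)
    front = np.asarray(front, dtype=float)
    ref = np.asarray(ref, dtype=float)
    lo = front.min(axis=0)
    pts = rng.uniform(lo, ref, size=(samples, len(ref)))   # sample the box [lo, ref]
    dominated = np.zeros(samples, dtype=bool)
    for f in front:
        dominated |= np.all(pts >= f, axis=1)              # dominated by this front point?
    return dominated.mean() * np.prod(ref - lo)

front = np.array([[0.1, 0.8], [0.4, 0.4], [0.8, 0.1]])
print(hv_estimate(front, ref=[1.0, 1.0]))                  # roughly the 2-D dominated area (about 0.48)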
Afterwards, I was thinking:
  • about Abdel's stuff: is it really an aspiration-based system?
  • about Joe's stuff: if we FASTMAPed on objective space, would we be able to handle higher objective dimensionality? (A sketch of that idea follows this list.)
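To make the FASTMAP hunch concrete: FastMap picks two far-apart "pivot" solutions and projects every other solution onto the line between them, so an N-dimensional objective space collapses to one (or a few) axes in near-linear time. A minimal sketch of one FastMap pass over objective vectors (applying it to objective space is my speculation here, not something Deb suggested):

import numpy as np

def fastmap_axis(F, seed=1):
    # One FastMap pass over F (n_solutions x n_objectives):
    # find two far-apart pivots, then place every solution on the pivot line.
    rng = np.random.default_rng(seed)
    d = lambda a, b: np.linalg.norm(a - b)
    a = F[rng.integers(len(F))]                    # random start...
    b = F[np.argmax([d(a, x) for x in F])]         # ...farthest point from it...
    a = F[np.argmax([d(b, x) for x in F])]         # ...farthest point from that.
    ab = d(a, b)
    # Cosine rule gives each solution's position along the a->b axis.
    return np.array([(d(a, x)**2 + ab**2 - d(b, x)**2) / (2 * ab) for x in F])

# Toy run: 1000 solutions scored on 14 objectives collapse to one axis.
F = np.random.rand(1000, 14)
print(fastmap_axis(F)[:5])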
Anyway, some references on many-objective optimization:
