Chapter 2 (literature review) is currently halfway done at 6 pages + 1. After a quick break to finish deliverables for my classes, I intend to finish the other half by Saturday morning.
The current state of the thesis draft can be found at http://unbox.org/stuff/var/jcraig/thesis/ .
The good news is that the more I read, the more ideas I have about how things might be done differently. For example, a comparison of NSGA-II against a very similar algorithm, SPEA2, shows that NSGA-II performs better on lower-dimensional problems because it converges somewhat faster than SPEA2. SPEA2, however, does better on higher-dimensional problems and achieves better distributions along the Pareto front. The two algorithms differ mainly in their density estimation and archiving, so it should be a simple matter to tinker with NSGA-II to see whether it can achieve SPEA2's distribution while keeping its speed.
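The diversity-preservation side is where the two mainly differ: NSGA-II uses crowding distance, while SPEA2 uses a density estimate based on the distance to the k-th nearest neighbor. A minimal sketch of both estimators, to make the "tinkering" target concrete (function names and defaults are my own):

```python
import math

def crowding_distance(front):
    """NSGA-II: for each point, sum the normalized side lengths of the
    cuboid formed by its neighbors along each objective; boundary points
    get infinite distance so they are always kept."""
    n = len(front)
    m = len(front[0])
    dist = [0.0] * n
    for obj in range(m):
        order = sorted(range(n), key=lambda i: front[i][obj])
        lo, hi = front[order[0]][obj], front[order[-1]][obj]
        span = (hi - lo) or 1.0
        dist[order[0]] = dist[order[-1]] = float("inf")
        for rank in range(1, n - 1):
            prev_v = front[order[rank - 1]][obj]
            next_v = front[order[rank + 1]][obj]
            dist[order[rank]] += (next_v - prev_v) / span
    return dist

def spea2_density(front, k=None):
    """SPEA2: density of a point is 1 / (sigma_k + 2), where sigma_k is
    the distance to its k-th nearest neighbor (k defaults to
    sqrt(population size)); lower density is better."""
    n = len(front)
    k = k or max(1, int(math.sqrt(n)))
    dens = []
    for i, p in enumerate(front):
        dists = sorted(math.dist(p, q) for j, q in enumerate(front) if j != i)
        dens.append(1.0 / (dists[min(k, len(dists)) - 1] + 2.0))
    return dens
```

Swapping `crowding_distance` for `spea2_density` inside NSGA-II's truncation step is one direct way to test whether the distribution improves without touching the fast non-dominated sort.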
On the local search front, the algorithm EPLS (Evolutionary Parallel Local Search) performs similarly to my modified NSGA-II, but its authors report performing well against other algorithms (oddly, not including NSGA-II). I can try using kNN clustering as they do, and a different breeding operator for the local search such as their simplex crossover, to see whether that gets better results than our clunky FastMap + V-optimal recursive splits with binary uniform crossover.
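As a reminder of what swapping the operator would mean: simplex crossover (SPX) takes a set of parents, expands the simplex they span about its centroid, and samples offspring inside the expanded region. The sketch below is a simplified variant, not the canonical SPX (which uses a specific recursive sampling scheme); the expansion factor `epsilon` and the random-convex-weights sampling are my own simplifications:

```python
import random

def simplex_crossover(parents, epsilon=1.0):
    """Simplified SPX sketch: expand the parent simplex about its
    centroid by (1 + epsilon), then return a random convex combination
    of the expanded vertices as the offspring."""
    m = len(parents[0])  # number of decision variables
    centroid = [sum(p[d] for p in parents) / len(parents) for d in range(m)]
    # Push each vertex away from the centroid.
    expanded = [[centroid[d] + (1 + epsilon) * (p[d] - centroid[d])
                 for d in range(m)] for p in parents]
    # Random convex weights over the expanded vertices.
    w = [random.random() for _ in parents]
    total = sum(w)
    w = [x / total for x in w]
    return [sum(w[i] * expanded[i][d] for i in range(len(parents)))
            for d in range(m)]
```

Unlike binary uniform crossover, this operates directly on real-valued vectors and keeps offspring near the cluster the parents came from, which is presumably why it pairs well with clustering-based local search.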