Last week we discussed the concept of multi-start optimization and a simple, effective ILIAD workflow for finding the global maximum of a surface. This week we will use ILIAD’s toolkit to look deeper into the strengths and core principles of this technique and compare it with other prevalent methods for nonlinear optimization problems.

Figure 1: Example surface with a single global maximum and several local extrema.
Many proponents of non-gradient-based optimization algorithms cite their relative resilience to “getting stuck” on local optima when compared with gradient-based algorithms. Multi-start optimization algorithms facilitate the use of gradient-based algorithms while significantly increasing the odds of finding the true global optimum, but as engineers and scientists we must ask: is launching several gradient-based optimizations really faster than a single non-gradient-based optimization?
To test this, we put the optimization workflow from last week’s blog head-to-head with two of the most popular non-gradient-based optimization techniques, the Non-dominated Sorting Genetic Algorithm (NSGA2) and Particle Swarm Optimization (PSO), tasking each with finding the maximum value of the test surface. In this benchmark, the multi-start workflow made 237 calls to the analysis program and found a point within machine precision of the global optimum, whereas NSGA2 and PSO took 601 and 399 analysis calls respectively just to find values within 0.01 of the global optimum. For inexpensive analyses, this two- to three-fold speedup can save minutes, but for analyses that involve complex finite element simulations, it can potentially translate to hundreds of CPU hours saved!
So what makes multi-start optimization so fast for this type of problem? To answer this, let’s first take a conceptual look at the non-gradient-based methods.
NSGA2 is an evolutionary algorithm in which the “fittest” individuals in a “population” of design points recombine and mutate to create a (usually) fitter subsequent generation. The premise of PSO is to launch a multitude of particles “traveling” through the search space that iteratively adjust their course toward the current best particle.
NSGA2 can create a “generation” of design points clustered near the optimum by recombining the variables of the best points, but some progress relies on random mutation to explore the search space and yield improvement. PSO relies on the trajectory of one of the particles passing through or very close to the optimum. Ultimately, both algorithms depend considerably on chance to find the optimum in a reasonable amount of time, and neither has a natural criterion for when to stop searching.
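To make the PSO mechanics above concrete, here is a minimal sketch of a global-best particle swarm maximizer. This is an illustration only, not ILIAD’s implementation; the test surface, bounds, and all parameter values are placeholders chosen for the example.

```python
import numpy as np

def pso_maximize(f, bounds, n_particles=30, n_iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best PSO for maximization (illustrative only)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    x = rng.uniform(lo, hi, (n_particles, lo.size))   # particle positions
    v = np.zeros_like(x)                              # particle velocities
    pbest = x.copy()                                  # each particle's best-known point
    pbest_val = np.array([f(p) for p in x])
    gbest = pbest[np.argmax(pbest_val)].copy()        # swarm's best-known point
    for _ in range(n_iters):
        r1 = rng.random(x.shape)
        r2 = rng.random(x.shape)
        # inertia + pull toward personal best + pull toward the swarm's best
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        better = vals > pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[np.argmax(pbest_val)].copy()
    return gbest, pbest_val.max()

# toy multimodal surface (a stand-in for the blog's test surface)
surface = lambda p: np.exp(-(p[0]**2 + p[1]**2)) + 0.3 * np.sin(3*p[0]) * np.cos(3*p[1])
best_x, best_val = pso_maximize(surface, ([-2.0, -2.0], [2.0, 2.0]))
```

Note how improvement hinges on a particle’s trajectory happening to pass near the optimum, exactly the element of chance described above.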

Figure 2: Approximated contour plot showing all design points from the multi-start optimization.
By contrast, gradient-based algorithms maximally exploit local knowledge of the design space, seeking the maximum improvement with each step. A multi-start gradient-based optimization is akin to several blindfolded hikers on a mountain range; each will find a peak by climbing upwards, but none are certain that theirs is the tallest peak. Figure 2 shows the design points of each optimizer in the multi-start algorithm. Lines show an optimizer climbing toward a peak and clusters result from convergence to local optima.
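The “blindfolded hikers” idea can be sketched in a few lines: launch an independent gradient-based local search from each random start and keep the best peak found. This is a conceptual sketch using SciPy’s general-purpose L-BFGS-B optimizer, not ILIAD’s actual workflow; the surface and start count are placeholder assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def multistart_maximize(f, bounds, n_starts=15, seed=0):
    """One gradient-based local search per random start; keep the best result."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    best_x, best_val = None, -np.inf
    for _ in range(n_starts):
        x0 = rng.uniform(lo, hi)                      # random "hiker" starting point
        # L-BFGS-B minimizes, so negate f to maximize; gradients come
        # from finite differences on the analysis function
        res = minimize(lambda x: -f(x), x0, method="L-BFGS-B",
                       bounds=list(zip(lo, hi)))
        if -res.fun > best_val:
            best_x, best_val = res.x, -res.fun        # tallest peak found so far
    return best_x, best_val

surface = lambda p: np.exp(-(p[0]**2 + p[1]**2)) + 0.3 * np.sin(3*p[0]) * np.cos(3*p[1])
x_star, f_star = multistart_maximize(surface, ([-2.0, -2.0], [2.0, 2.0]))
```

Each hiker climbs deterministically to the nearest peak; the only randomness is in where the hikers start, which is why each local search terminates quickly and cleanly.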

Figure 3: Approximation of the design space reconstructed from multi-start optimization data.
Figure 3 shows a surface constructed from the starting and ending points of the multi-start optimization. This is approximately what the optimizer “thinks” the surface looks like based on its limited dataset. Though the sparsity of starting locations leaves part of the design space under-explored, we can still discern several key features of the true surface with high accuracy, including the global maximum! These visualizations help illustrate how multi-start gradient-based optimization uses mathematical approximation to quickly find the optimum even in design spaces with many local extrema.
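One common way to reconstruct an approximate surface from scattered optimization samples, as in Figure 3, is to fit a radial-basis-function surrogate and evaluate it on a grid. The sketch below uses SciPy’s `RBFInterpolator` on hypothetical sample points; the blog’s actual figure was produced with ILIAD’s post-processing tools, and the sample data here is a placeholder.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# hypothetical scattered samples, standing in for the start/end
# points collected during a multi-start optimization run
rng = np.random.default_rng(1)
pts = rng.uniform(-2.0, 2.0, (40, 2))
surface = lambda P: (np.exp(-(P[:, 0]**2 + P[:, 1]**2))
                     + 0.3 * np.sin(3*P[:, 0]) * np.cos(3*P[:, 1]))
vals = surface(pts)

# fit a thin-plate-spline surrogate that passes through the scattered data
surrogate = RBFInterpolator(pts, vals, kernel="thin_plate_spline")

# evaluate the surrogate on a grid to reconstruct an approximate surface
gx, gy = np.meshgrid(np.linspace(-2, 2, 50), np.linspace(-2, 2, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
approx = surrogate(grid).reshape(gx.shape)
```

The reconstruction is only as good as the coverage of the samples, which mirrors the figure’s caveat: sparsely visited regions of the design space remain under-resolved in the surrogate.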
This week we compared the nested multi-start workflow on a nonlinear test surface with two popular non-gradient-based optimization algorithms and tangibly explored the concept of multi-start optimization using ILIAD’s modeling and post-processing tools. Check back in weekly for more information on multi-start optimization and other concepts and capabilities found in OmniQuest’s software suite!
Connect with us now for complimentary webinars and evaluation software.
Our engineering team can work with you to conduct a Test Case showing how OmniQuest™ will improve your designs, processes, and overall business.