When minimizing a nonconvex function $f : \Omega \rightarrow \mathbb{R}$ that may have multiple minima, there are some very simple strategies to improve the odds of finding the global minimum point. Singlestart scatters points $x_1,\ldots,x_n$ in $\Omega$ (deterministically or randomly, using rejection sampling if necessary) and starts some local minimization algorithm from a single point $x_k$ with $f(x_k) \leq f(x_j)$ for $1 \leq j \leq n$. Multistart starts the local minimization algorithm from all the points $x_1,\ldots,x_n$, and is obviously much more computationally intensive. One way to reduce the computational load in multistart is to do just one iteration of the local minimization algorithm from each of the points $x_1,\ldots,x_n$, then run a geometric clustering algorithm on the new collection of points, trying to identify clusters of points that belong to the same basin of attraction of a local minimum point; one then saves computational effort by starting a single local minimization from the center of mass of each cluster.
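In case it helps to make the two baseline strategies concrete, here is a minimal sketch in Python. The quartic test function (two minima, at $x = \pm 1$), the step size, and plain gradient descent standing in for the local minimizer are all illustrative assumptions, not part of the question:

```python
import random

def f(x):
    # Illustrative 1-D test function with two local minima, at x = -1 and x = 1.
    return (x * x - 1.0) ** 2

def grad_f(x):
    return 4.0 * x * (x * x - 1.0)

def local_minimize(x, steps=200, lr=0.05):
    """Plain gradient descent, standing in for the local minimization algorithm."""
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

def singlestart(points):
    # Run the local algorithm only from the sample with the smallest f value.
    best = min(points, key=f)
    return local_minimize(best)

def multistart(points):
    # Run the local algorithm from every sample; keep the best result.
    results = [local_minimize(x) for x in points]
    return min(results, key=f)

random.seed(0)
samples = [random.uniform(-2.0, 2.0) for _ in range(20)]
print(singlestart(samples), multistart(samples))
```

Both routines should return a point near $-1$ or $+1$ here; the point of the contrast is only the cost, $1$ versus $n$ local searches.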
I now have a very specific question: can anyone provide me with a reference for the variant (of singlestart or multistart, depending on how one sees it) where one runs a single iteration of the local minimization algorithm from each of the points $x_1,\ldots,x_n$, and then lets it run further only from the one of the $n$ new points where the function value is smallest? (If it helps, think of the local minimization algorithm as BFGS, whose first iteration is a steepest-descent step.)
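To pin down exactly which variant I mean, here is a sketch under the same illustrative assumptions as above (quartic test function with minima at $x = \pm 1$, gradient descent standing in for the local method, step size chosen arbitrarily):

```python
import random

def f(x):
    # Illustrative 1-D test function with two local minima, at x = -1 and x = 1.
    return (x * x - 1.0) ** 2

def grad_f(x):
    return 4.0 * x * (x * x - 1.0)

def one_step(x, lr=0.05):
    # A single gradient-descent iteration, standing in for the first
    # iteration of the local minimization algorithm.
    return x - lr * grad_f(x)

def variant(points, steps=200, lr=0.05):
    # One iteration of the local algorithm from every sample point...
    advanced = [one_step(x, lr) for x in points]
    # ...then continue the local minimization only from the new point
    # with the smallest function value.
    x = min(advanced, key=f)
    for _ in range(steps):
        x = one_step(x, lr)
    return x

random.seed(1)
samples = [random.uniform(-2.0, 2.0) for _ in range(20)]
print(variant(samples))
```

The only difference from singlestart is that the "pick the best point" decision is made after one local iteration rather than before any.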
Note: I am not asking for advice on global optimization, either stochastic or deterministic. I don't need that, only an answer to a very specific query.