A name for a numerical phenomenon when using numerical methods
An answer to this question on the Scientific Computing Stack Exchange.
Question
I have a nonlinear solver for the equation $$g= c_1f(x_1,y_1)+c_2f(x_2,y_2)$$
Note that $c_1$ is much bigger than $c_2$. After using the Levenberg–Marquardt algorithm, the solver seemed to optimize only $x_1$ and $y_1$, while $c_2f(x_2,y_2)$ was effectively ignored.
Is there a proper name for this phenomenon?
Answer
You could call this a "scaling" problem. Your $c$ variables essentially form the weights of a multi-objective optimization problem. If $c_1\gg c_2$, then the objective associated with $c_1$ becomes, in effect, the only thing that matters.
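A minimal sketch of that dominance, using toy quadratic objectives sharing a single variable (the functions and targets are assumptions for illustration, not the questioner's actual $f$):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def combined(x, c1, c2):
    # Two toy objectives with minima at x = 3 and x = 5.
    return c1 * (x - 3.0) ** 2 + c2 * (x - 5.0) ** 2

# Balanced weights: the minimizer sits midway between the two targets.
x_balanced = minimize_scalar(lambda x: combined(x, 1.0, 1.0)).x   # ~4.0

# c1 >> c2: the minimizer collapses onto the c1 objective's target,
# and the c2 objective is effectively ignored.
x_skewed = minimize_scalar(lambda x: combined(x, 1e6, 1.0)).x     # ~3.0
```

The weights act like votes: with $c_1 = 10^6$ and $c_2 = 1$, the combined minimizer lands within $\sim 2\times10^{-6}$ of the $c_1$ target.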
You have a few options for dealing with this:
- Decrease $c_1$
- Increase $c_2$
- Use a more aggressive form of $f(x_2,y_2)$. For instance, choosing $\left(f(x_2,y_2)\right)^2$ will greatly exaggerate $f(x_2,y_2)$ which may cause your solver to pay more attention to it. However, once $f(x_2,y_2)$ is brought low enough, the $c$ weights will become more important again.
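A sketch of the third option with the same toy objectives as above (assumed for illustration): squaring the second term pulls the solution slightly toward its minimizer while its residual is large.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def combined(x, power):
    # c1 >> c2, as in the question; power=2 squares the second term,
    # amplifying it while its residual is large.
    return 1e6 * (x - 3.0) ** 2 + 1.0 * ((x - 5.0) ** 2) ** power

x_plain = minimize_scalar(lambda x: combined(x, 1)).x
x_boost = minimize_scalar(lambda x: combined(x, 2)).x
# x_boost sits slightly closer to 5 than x_plain: the squared term
# exerts a larger gradient while far from its own minimum.
```

The effect fades as the second residual shrinks below 1, at which point squaring deflates rather than exaggerates it, and the $c$ weights dominate again.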
You could also try to extract the Pareto frontier: the set of solutions where an improvement in the $c_1$ objective would cause a loss in the $c_2$ objective, and vice versa. You can trace it by solving the problem for a range of $c$ values and taking the convex hull of the resulting points, or by slowly decreasing $c_1$ while increasing $c_2$.
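A sketch of that weight-sweep idea, again with toy quadratic objectives (assumptions for illustration):

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy objectives standing in for the two terms of the original problem.
f1 = lambda x: (x - 3.0) ** 2
f2 = lambda x: (x - 5.0) ** 2

# Solve the weighted problem for a range of weights and record both
# objective values; the lower-left envelope of these points traces an
# approximation to the Pareto frontier.
frontier = []
for w in np.linspace(0.01, 0.99, 25):   # w plays the role of c1/(c1+c2)
    x = minimize_scalar(lambda x: w * f1(x) + (1.0 - w) * f2(x)).x
    frontier.append((f1(x), f2(x)))
# As w grows, f1 at the solution falls while f2 rises: each point
# trades one objective against the other.
```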
Note that this kind of problem shows up in ridge regression and the Lasso, and more generally in regularization, where the objective functions take the form $$ \lVert y-X\beta\rVert^2 + \lambda\lVert \beta \rVert^2\text{ (Ridge)} $$ and $$ \lVert y-X\beta\rVert^2 + \lambda\lVert \beta \rVert_1\text{ (Lasso)} $$ So looking into how the regularization parameter $\lambda$ is chosen might also be useful.
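For a concrete ridge example, the closed-form solution $\beta = (X^\top X + \lambda I)^{-1} X^\top y$ applied to synthetic data (an assumption for illustration) shows $\lambda$ playing exactly the role of the $c$ weights above:

```python
import numpy as np

# Synthetic regression data (assumed for illustration).
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
beta_true = np.array([2.0, -1.0, 0.5])
y = X @ beta_true + 0.1 * rng.standard_normal(50)

def ridge(X, y, lam):
    # Closed-form ridge solution: (X^T X + lam * I)^{-1} X^T y.
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

b_small = ridge(X, y, 0.01)    # weak penalty: close to least squares
b_large = ridge(X, y, 100.0)   # strong penalty: coefficients shrink
# The heavily regularized fit has a smaller coefficient norm: the
# penalty term increasingly dominates the data-fit term as lam grows.
```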