Many situations give rise to sparse linear least-squares problems, often with bounds on the variables. The next problem requires that the variables be nonnegative. This problem comes from fitting a piecewise linear spline approximation to a function. Specifically, particles are scattered on the unit square, the function to be approximated is evaluated at these points, and a piecewise linear spline approximation is constructed under the condition that the (linear) coefficients are nonnegative. There are 2000 equations in 400 variables:
load particle   % Get C, d
lb = zeros(400,1);
[x,resnorm,residual,exitflag,output] = ...
    lsqlin(C,d,[],[],[],[],lb);
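If the particle data set is not available to you, the following minimal sketch sets up a problem of the same shape so you can run the rest of the example. The names C_demo and d_demo are illustrative stand-ins, not the particle data shipped with the toolbox, and the values are random rather than spline-fitting data:

rng default                        % reproducible stand-in data
C_demo = sprandn(2000,400,0.01);   % sparse 2000-by-400 coefficient matrix, ~1% nonzeros
d_demo = randn(2000,1);            % target values
lb = zeros(400,1);                 % nonnegativity bounds on the coefficients
[x,resnorm] = lsqlin(C_demo,d_demo,[],[],[],[],lb);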
The default diagonal preconditioning works fairly well:
exitflag,resnorm,output

exitflag =
     3
resnorm =
   22.5794
output = 
        iterations: 10
         algorithm: 'trust-region-reflective'
     firstorderopt: 2.7870e-05
      cgiterations: 42
           message: 'Optimization terminated: relative function value changing by less…'
For bound constrained problems, the first-order optimality is the infinity norm of v.*g, where v is defined as in Box Constraints, and g is the gradient.
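As a concrete sketch, you can recompute this measure from the solution returned above. The definition of v used here (v_i = x_i - lb_i where g_i >= 0, and |v_i| = 1 where g_i < 0, since the problem has no upper bounds) is an assumption paraphrasing Box Constraints, and the result may differ from output.firstorderopt by a constant scale factor depending on how the objective is normalized:

% Sketch: recompute the first-order optimality measure at the solution x,
% using C, d, lb, and x from the run above.
g = C'*(C*x - d);                     % gradient of the least-squares objective
v = -ones(size(x));                   % |v_i| = 1 where g_i < 0 (no upper bound)
v(g >= 0) = x(g >= 0) - lb(g >= 0);   % distance to the lower bound where g_i >= 0
measure = norm(v.*g, inf)             % compare with output.firstorderopt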
You can improve (decrease) the first-order optimality measure by using a sparse QR factorization in each iteration. To do this, set PrecondBandWidth to inf:
options = optimoptions('lsqlin','PrecondBandWidth',inf);
[x,resnorm,residual,exitflag,output] = ...
    lsqlin(C,d,[],[],[],[],lb,[],[],options);
The first-order optimality measure decreases:
exitflag,resnorm,output

exitflag =
     1
resnorm =
   22.5794
output = 
        iterations: 12
         algorithm: 'trust-region-reflective'
     firstorderopt: 5.5907e-15
      cgiterations: 0
           message: 'Optimization terminated: first order optimality with optimality…'
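Note that PrecondBandWidth applies to the trust-region-reflective algorithm. On releases where lsqlin defaults to a different algorithm, you may need to select trust-region-reflective explicitly for the option to take effect; whether this is necessary depends on your installed release, so the following is a sketch rather than part of the original example:

% Sketch, assuming a release whose default lsqlin algorithm is not
% trust-region-reflective: select the algorithm explicitly so that
% PrecondBandWidth takes effect.
options = optimoptions('lsqlin', ...
    'Algorithm','trust-region-reflective', ...
    'PrecondBandWidth',inf);
[x,resnorm,residual,exitflag,output] = ...
    lsqlin(C,d,[],[],[],[],lb,[],[],options);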