Some solvers, such as fsolve and lsqcurvefit, have objective functions that are vectors or matrices. The main difference in usage between these types of objective functions and scalar objective functions is the way to write their derivatives. The matrix of first-order partial derivatives of a vector-valued or matrix-valued function is called a Jacobian; the vector of first-order partial derivatives of a scalar function is called a gradient.
For information on complex-valued objective functions, see Complex Numbers in Optimization Toolbox Solvers.
If x is a vector of independent variables and F(x) is a vector function, the Jacobian J(x) is

$$J_{ij}(x) = \frac{\partial F_i(x)}{\partial x_j}.$$

If F has m components and x has k components, J is an m-by-k matrix.

For example, if

$$F(x) = \begin{bmatrix} x_1^2 + x_2 x_3 \\ \sin(x_1 + 2x_2 - 3x_3) \end{bmatrix},$$

then the Jacobian is

$$J(x) = \begin{bmatrix} 2x_1 & x_3 & x_2 \\ \cos(x_1 + 2x_2 - 3x_3) & 2\cos(x_1 + 2x_2 - 3x_3) & -3\cos(x_1 + 2x_2 - 3x_3) \end{bmatrix}.$$
The function file associated with this example is:
function [F jacF] = vectorObjective(x)
F = [x(1)^2 + x(2)*x(3);
     sin(x(1) + 2*x(2) - 3*x(3))];
if nargout > 1   % need Jacobian
    jacF = [2*x(1), x(3), x(2);
            cos(x(1)+2*x(2)-3*x(3)), 2*cos(x(1)+2*x(2)-3*x(3)), ...
            -3*cos(x(1)+2*x(2)-3*x(3))];
end
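If you want to sanity-check a hand-coded Jacobian, one option is to compare it against a forward finite-difference approximation. The following sketch assumes vectorObjective.m is on the path; the test point x0 and step size h are arbitrary values chosen only for illustration:

x0 = [1; 2; 3];            % arbitrary test point (assumption)
[F0, J0] = vectorObjective(x0);
h = 1e-6;                  % finite-difference step (assumption)
Jfd = zeros(numel(F0), numel(x0));
for k = 1:numel(x0)
    xk = x0;
    xk(k) = xk(k) + h;     % perturb the k-th variable
    Jfd(:,k) = (vectorObjective(xk) - F0)/h;
end
disp(max(max(abs(J0 - Jfd))))   % should be close to zero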
To indicate to your solver that your objective function includes a Jacobian, set the SpecifyObjectiveGradient option to true. For example,
options = optimoptions('lsqnonlin','SpecifyObjectiveGradient',true);
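With these options, a complete solver call using the objective function above might look like the following sketch; the starting point x0 is an arbitrary assumption for illustration:

x0 = [1; 2; 3];            % arbitrary starting point (assumption)
[x,resnorm] = lsqnonlin(@vectorObjective,x0,[],[],options);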
The Jacobian of a matrix-valued function F(x) is defined by changing the matrix to a vector, column by column. For example, rewrite the matrix

$$F = \begin{bmatrix} F_{11} & F_{12} \\ F_{21} & F_{22} \end{bmatrix}$$

as a vector f by stacking its columns:

$$f = \begin{bmatrix} F_{11} \\ F_{21} \\ F_{12} \\ F_{22} \end{bmatrix}.$$

The Jacobian of F is then defined as the Jacobian of f,

$$J_{ij} = \frac{\partial f_i}{\partial x_j}.$$

If x is a matrix, define the Jacobian of F(x) by changing the matrix x to a vector, column by column. For example, if

$$X = \begin{bmatrix} x_{11} & x_{12} \\ x_{21} & x_{22} \end{bmatrix},$$

then take the partial derivatives with respect to the vector

$$x = \begin{bmatrix} x_{11} \\ x_{21} \\ x_{12} \\ x_{22} \end{bmatrix}.$$

So, for example,

$$J(3,2) = \frac{\partial f_3}{\partial x_2} = \frac{\partial F_{12}}{\partial x_{21}}.$$
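As a minimal sketch of this column-by-column convention, the following function returns a 2-by-2 matrix F of a 2-element vector x together with the Jacobian of f = F(:). The function name matrixObjective and the particular F are hypothetical examples made up for illustration, not part of the example above:

function [F jacF] = matrixObjective(x)
% F(x) is 2-by-2; its Jacobian is written against f = F(:),
% that is, the stacked columns [F11; F21; F12; F22]
F = [x(1)^2,  x(1)*x(2);
     x(2),    x(1) + x(2)];
if nargout > 1   % need Jacobian (4-by-2)
    jacF = [2*x(1), 0;      % d F11 / d[x1 x2]
            0,      1;      % d F21 / d[x1 x2]
            x(2),   x(1);   % d F12 / d[x1 x2]
            1,      1];     % d F22 / d[x1 x2]
end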