My Old Nemesis: Linear Least Squares
I stumbled across this today and decided to build some more intuition on the topic, but first I had to figure out what the damn J was. After hunting around for an explanation I realized that the Jacobian is just a gradient. Or, that's backwards: the gradient is a special case of the Jacobian! So as a recap, an image is just a function [x,y] |-> f(x,y), and the gradient turns the scalar value f(x,y) into a 2-d vector via f(x,y) |-> [df/dx, df/dy]. All we're doing is taking partial derivatives: no brain surgery. The Jacobian matrix is a generalization of this principle. Let's say we have an image I such that [x,y] |-> x+y. This would look pretty boring:
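As a quick numerical sanity check (my own Python sketch, not part of the original post), we can approximate the gradient of f(x,y) = x+y with central finite differences. Since the function is linear, both partials should come out as 1 everywhere:

```python
def grad(f, x, y, h=1e-6):
    """Approximate the gradient [df/dx, df/dy] with central differences."""
    dfdx = (f(x + h, y) - f(x - h, y)) / (2 * h)
    dfdy = (f(x, y + h) - f(x, y - h)) / (2 * h)
    return [dfdx, dfdy]

f = lambda x, y: x + y
print(grad(f, 3.0, 5.0))  # both partials of x+y are 1, at any point
```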
Here's where wiki comes in: "[The Jacobian matrix] is the matrix of all first-order partial derivatives of a vector- or scalar-valued function with respect to another vector"
Isn't that what we just did? We took the Jacobian of f(x,y) with respect to the vector [x y]. In MATLAB, it would look like:
syms x y;             % declare symbolic variables
f = [x + y];          % a scalar-valued function of two variables
jacobian(f, [x y])    % ans = [1, 1], a row of first-order partials
The jacobian operation pairs up its two arguments like an outer product, but instead of multiplying, it differentiates one with respect to the other. The dimensionality grows the same way: m functions and n variables give an m-by-n matrix. In the code above, jacobian(1x1, 1x2) => 1x2.
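That m-by-n shape rule can be checked numerically as well. Here's a small Python sketch of my own (a finite-difference stand-in for MATLAB's symbolic jacobian) that builds the Jacobian of a 2-output function of 2 variables and gets a 2x2 matrix:

```python
def jacobian(funcs, point, h=1e-6):
    """Finite-difference Jacobian: one row per function, one column per variable."""
    J = []
    for f in funcs:
        row = []
        for i in range(len(point)):
            plus = list(point)
            plus[i] += h
            minus = list(point)
            minus[i] -= h
            row.append((f(*plus) - f(*minus)) / (2 * h))
        J.append(row)
    return J

# [x+y, x*y]: 2 outputs, 2 inputs -> a 2x2 Jacobian
J = jacobian([lambda x, y: x + y, lambda x, y: x * y], [3.0, 5.0])
# analytically the Jacobian is [[1, 1], [y, x]] = [[1, 1], [5, 3]] at (3, 5)
```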