[REBOL] Re: GPS: minimizing sum of squares algorithm
From: jjmmes:ya:hoo:es at: 11-Nov-2002 21:21
Hi Tom,
Thanks for helping me out. I think the general
approach to solving this is Jan's code, especially
when you want to account for 4+ data points.
It turns out that with 3 data points I only have 2
sets of independent equations, u: f(x,y) and v:
f'(x,y), so I only have to solve one set of 3
equations with 3 unknowns and can get the unknowns
expressed as a function of u, v.
Regards,
Jose
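The three-point case described above (one square 3x3 system per
output coordinate) can be sketched as follows. This is Python, not
the thread's REBOL (the REBOL code is in the matrix.r script Tom
links below), and the calibration values are invented for
illustration:

```python
# Solve the square 3x3 calibration system  a + lat_i*b + lon_i*c = x_i
# for one output coordinate, using Gaussian elimination with partial
# pivoting.  The lat/lon/x sample values below are made up.

def solve(A, b):
    """Solve A x = b for a small square system."""
    n = len(A)
    # build the augmented matrix [A : b]
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # partial pivoting: bring the largest entry in this column up
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # back substitution
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = s / M[r][r]
    return x

# three calibration points (made-up numbers)
pts = [(40.0, -3.0), (40.1, -3.1), (40.2, -3.05)]
xs  = [100.0, 120.0, 140.0]

A = [[1.0, lat, lon] for lat, lon in pts]
a, b_, c = solve(A, xs)   # the three constants for the x coordinate
```

With exactly three (non-degenerate) points the fit is exact, so the
recovered plane reproduces every calibration point.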
--- Tom Conlin <[tomc--darkwing--uoregon--edu]> wrote:
> Hi Jose,
>
> Hmmm
>
> I recall "root mean square" from calculus, not
> linear algebra; it was the square root of the mean
> of f(x)^2 between a couple of points. If this is
> the correct interpretation, I am not seeing how it
> will help you find your constants.
>
> I think you are closer to the mark with your
> subject line, and may be looking for a
> "linear-least-squares" fit (or, if you are less
> lucky, a "non-linear-least-squares" fit).
>
> disclaimer: I have only been through standard
> undergraduate classes and have not used this stuff
> in several years (winter '99, and never in the real
> world).
>
> if I were you I would try to view what I needed as
> two problems, each with 3 unknowns, since your x
> and y equations seem independent and the amount of
> work normally grows with the square of the number
> of unknowns; that is, 2*3^2 < 6^2.
>
> looking at just the x half, for a few GPS samples
> you could have something similar to your notation:
>
> 1*a + lat1*b + lon1*c = x1
> 1*a + lat2*b + lon2*c = x2
> 1*a + lat3*b + lon3*c = x3
>
> or in matrix form
>
> | 1 lat1 lon1 | |a| |x1|
> | 1 lat2 lon2 | * |b| = |x2|
> | 1 lat3 lon3 | |c| |x3|
>
> in a more standard notation this is typically
> represented as
>
> A x = b
>
> the linear-least-squares solution 'x' to Ax = b
> is
>
> x = (A_t * A)^-1 * A_t * b
>
> where A_t is the transpose of A
> (transpose means the indices for a row and column
> are swapped
> or that a matrix is reflected about its diagonal)
>
> and the (A_t A)^-1 means the inverse of the matrix
> that results from
> multiplying the transpose of A by A (order matters)
>
> more specifically it is the matrix you would have
> to multiply (A_t * A)
> by to get the "Identity matrix" where the identity
> matrix is
> a square matrix with 1's from the top-left to
> lower-right and 0's
> elsewhere
>
> a matrix must be square to have an inverse
> There is no guarantee a square matrix has an
> inverse.
>
> I am not going into inverting a matrix because there
> are alternatives.
>
> all in all it is pretty expensive to go this way,
> but at least it is straightforward if you want to
> plow through it.
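The recipe above, x = (A_t * A)^-1 * A_t * b, can be sketched in
Python (again, the thread's own code is the REBOL matrix.r linked
further down). The sample points are invented, and the sketch solves
the normal equations (A_t A) x = A_t b directly rather than forming
the inverse explicitly:

```python
# Linear-least-squares fit of  a + lat*b + lon*c ~= x  to 4+ samples,
# via the normal equations (A_t A) x = A_t b.  Sample values invented.

def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(P, Q):
    Qt = transpose(Q)
    return [[sum(p * q for p, q in zip(row, col)) for col in Qt]
            for row in P]

def solve(A, b):
    """Gaussian elimination with partial pivoting on [A : b]."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n]
                - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# four GPS samples (made-up): each row of A is [1, lat, lon]
A = [[1.0, 40.0,  -3.0],
     [1.0, 40.1,  -3.1],
     [1.0, 40.2,  -3.05],
     [1.0, 40.15, -2.95]]
b = [100.0, 120.0, 140.0, 135.0]

At  = transpose(A)
AtA = matmul(At, A)                                   # 3x3
Atb = [sum(r * v for r, v in zip(row, b)) for row in At]
a_, b_, c_ = solve(AtA, Atb)          # least-squares coefficients
```

With exactly 3 samples this degenerates to the exact square solve;
with more, it gives the coefficients minimizing the sum of squared
residuals (which is exactly why the residual is orthogonal to every
column of A at the solution).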
>
> If you are going to do a lot of these, there is a
> shortcut which stems from the rearrangement
>
> A_t A x = A_t b
>
> where the two sides of the equation can be expressed
> as
> "Augmented Matrices"
>
> going back to something that resembles your notation
> translated to matrix
> form we get
>
> ||  1    1    1   |   | 1 lat1 lon1 | : a |   |  1    1    1   : x1 |
> || lat1 lat2 lat3 | * | 1 lat2 lon2 | : b | = | lat1 lat2 lat3 : x2 |
> || lon1 lon2 lon3 |   | 1 lat3 lon3 | : c |   | lon1 lon2 lon3 : x3 |
>
> where the [a b c] & [x1 x2 x3] vectors are
> augmenting the square matrices
>
> these augmented matrices (representing your
> simultaneous equations)
> may be solved by adding multiples of the rows in a
> matrix to one another.
> to end up with 1s, 0s, and solutions (if there are
> any)
> there are methods by Gauss, and an improvement
> called "Gauss-Jordan"
> which have been around for hundreds of years.
>
> in the end you end up with identity matrices
> augmented with the solutions
>
> |1 0 0 : i*a|   |1 0 0 : p|
> |0 1 0 : j*b| = |0 1 0 : q|
> |0 0 1 : k*c|   |0 0 1 : r|
>
> so a = p/i
> b = q/j
> c = r/k
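The reduction just described (normalize each pivot row, clear its
column above and below, until the identity remains and the augmented
column holds the solution) looks like this in a Python sketch; the
example system is made up, and the REBOL counterpart is the
naive-gauss / gauss-jordan code in the matrix.r script linked below:

```python
# Gauss-Jordan: reduce the augmented matrix [A : b] to [I : x] by
# elementary row operations.  Example system is invented.

def gauss_jordan(A, b):
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # partial pivoting, then scale the pivot row so the pivot is 1
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        p = M[col][col]
        M[col] = [v / p for v in M[col]]
        # clear this column in every other row (above and below)
        for r in range(n):
            if r != col and M[r][col] != 0.0:
                f = M[r][col]
                M[r] = [rv - f * pv for rv, pv in zip(M[r], M[col])]
    return [row[n] for row in M]   # augmented column = solution

# 2x + y = 5,  x + 3y = 10   ->   x = 1, y = 3
sol = gauss_jordan([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0])
```

Unlike plain Gaussian elimination, no back-substitution step is
needed: the answers can be read straight off the augmented column.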
>
> there is a function "naive-gauss" in code I wrote
> to double-check the longhand results required in my
> homework (the professor would not even allow a
> four-function calc in the class); it was not
> finished beyond the point where it served my
> immediate purpose
>
> http://darkwing.uoregon.edu/~tomc/core/math/matrix.r
>
> that should work to reduce your augmented
> equations.
>
> there is another function, "gauss-jordan", which
> would be better if it did not have a bug I never
> got around to tracking down
>
> note:
> the "comp" function in the file was derived from
> Eric Long's compare
>
> best of luck
>
> On Sat, 9 Nov 2002, jose wrote:
>
> > a,b,c,d,e,f are constants.
> >
> > I basically want to solve a set of 6 equations
> > with six unknowns, using one of the existing
> > linear algebra algorithms. The algorithm
> > implementation I'm looking for is also known as
> > RMS, root mean square, but I couldn't find a good
> > algorithm description on the web.
> >
> > Regards
> > Jose
> > --- Jan Skibinski <[jan--skibinski--sympatico--ca]>
> > wrote:
> > > Jose,
> > > I could certainly help you with this, but I
> > > would need to know more about the shape of the
> > > functions b, c, e, f in order to model them
> > > appropriately. For example, depending on what
> > > those functions represent, a polynomial
> > > approximation might be of no use or it might
> > > actually be quite good.
> > > A reference to existing georeference algorithms
> > > could be a good starting point.
> > > Regards,
> > > Jan
> > >
> > >
> > > jose wrote:
> > >
> > > > Has anybody written this algorithm in REBOL?
> > > >
> > > > I need this to georeference GPS data, i.e.
> > > > find a, b, c, d, e, f below given three
> > > > calibration points
>
=== message truncated ===