DiffCorr:

--------------------------------------------------------------------------
   Solves the least squares problem for

   y = h(x)

   by linearizing about the reference vector xA:

   y = h(xA) + H(xA)(x-xA)

   Uses the singular value decomposition to solve the least squares
   problem.
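
   A minimal sketch of one correction step, assuming rho is the
   measurement residual y - h(xA), H is the Jacobian dh/dx, and W is a
   diagonal weight matrix (variable names here are illustrative only):

      [rho, H, W] = feval( f, xA );      % residual, Jacobian, weights at xA
      A           = sqrt(W)*H;           % weighted Jacobian
      b           = sqrt(W)*rho;         % weighted residual
      [U, S, V]   = svd( A, 'econ' );    % singular value decomposition
      s           = diag(S);
      use         = s > tolSVD*max(s);   % drop directions below the SVD tolerance
      dx          = V(:,use)*((U(:,use)'*b)./s(use));   % least squares correction
      xA          = xA + dx;             % updated reference state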
--------------------------------------------------------------------------
   Form:
   [x, k, rsvd, cHWH, rank, P, wmr, sr, J, sig, nz] = DiffCorr( f, S0, xA, kx, tol, tolSVD, initCHWH )
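
   For example, with a hypothetical measurement model myMeas that
   returns [rho,H,W], and illustrative tolerances:

      f = @(x) myMeas( x, y );                  % y are the measurements
      [x, k, rsvd, cHWH, rank, P, wmr, sr, J, sig, nz] = ...
                      DiffCorr( f, S0, xA, kx, 1e-6, 1e-10, 0 );
      xEst = x(:,end);                          % estimate from the last iteration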
--------------------------------------------------------------------------

   ------
   Inputs
   ------
   f                      Function returning [rho,H,W] = f(xA)
   S0                     A priori state covariance matrix
   xA                     A priori state
   kx                     States to be found
   tol                    Error tolerance
   tolSVD                 SVD tolerance
   initCHWH               1 = show condition number of initial H'WH

   -------
   Outputs
   -------
   x                      Matrix of state vectors. Each column is one iteration.
   k                      Number of iterations
   rsvd                   Residuals from the least squares solution
   cHWH                   Condition number of H'WH
   rank                   Rank of the A matrix
   P                      Covariance matrix: inv[S0 + H'WH]
   wmr                    Weighted mean of the residuals
   sr                     Weighted rms deviation of the residuals
   J                      Loss estimate
   sig                    Uncertainty in the estimates
   nz                     Number of measurements used

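   As an illustration only, the residual statistics could be formed from
   a residual vector r and diagonal weights w as follows (not necessarily
   the internal definitions):

      wmr = sum( w.*r )/sum( w );               % weighted mean of the residuals
      sr  = sqrt( sum( w.*r.^2 )/sum( w ) );    % weighted rms deviation
      J   = r'*diag(w)*r;                       % quadratic loss estimate
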
--------------------------------------------------------------------------
     References: Wertz, J.R. (ed.), Spacecraft Attitude Determination and Control,
                 Kluwer, 1978, pp. 447-454.
--------------------------------------------------------------------------

Children:

Math: Linear/LSSVD