Armijo:

--------------------------------------------------------------------------
   Use the Armijo rule to compute a stepsize, alpha:

      alpha = s * beta ^ m        (beta < 1)

   where "s" is the initial stepsize (usually 1), and "m" is the first 
   nonnegative integer (0,1,2,...) such that:

      f(x+alpha*d) - f(x) <= sigma * alpha * g(x)' * d

   where f(x) is the cost function, g(x) is the gradient of f(x), x is the
   current state, and d is the search direction for the current optimization
   iteration (a descent direction, so that g(x)'*d < 0).
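
   A minimal sketch of the backtracking loop this rule implies, using the
   input and output names from the Form below (the implementation in this
   file may differ in detail):

      f0    = f(x);          % cost at the current state
      gd    = g(x)'*d;       % directional derivative (negative for descent)
      m     = 0;
      alpha = s;
      while f(x + alpha*d) - f0 > sigma*alpha*gd
         m     = m + 1;
         alpha = s*beta^m;
      end
      xnew  = x + alpha*d;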

   Create a function handle for the cost function like this:
      f = @(x) x(1)^2 + 3*(x(2)-2)^3
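
   The gradient handle g should return the gradient of the cost as a column
   vector; for the example cost above it would be (an illustrative handle,
   not part of this file):
      g = @(x) [ 2*x(1); 9*(x(2)-2)^2 ]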

   Since version 8.
--------------------------------------------------------------------------
   Form:
   [alpha,xnew,m] = Armijo( x, s, beta, sigma, d, f, g )
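
   For example, with the handles above and illustrative values for s, beta,
   and sigma:

      x              = [1;3];
      d              = -g(x);                 % steepest-descent direction
      [alpha,xnew,m] = Armijo( x, 1, 0.5, 0.1, d, f, g );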

   See also:  NewtonsMethod.m
--------------------------------------------------------------------------

   ------
   Inputs
   ------
   x          Initial state
   s          Initial (maximum) stepsize, usually 1
   beta       Reduction factor (0 < beta < 1)
   sigma      Scale factor (0 < sigma < 1)
   d          Search direction (a descent direction)
   f          Function handle for cost function
   g          Function handle for gradient of cost function

   -------
   Outputs
   -------
   alpha      Stepsize
   xnew       New state, x + alpha*d
   m          Exponent in alpha = s*beta^m (number of stepsize reductions)

--------------------------------------------------------------------------