Ceres Solver Documentation (Chinese Translation)
Powell’s Function


Consider now a slightly more complicated example: minimizing Powell's function. Let $x = \left[x_{1}, x_{2}, x_{3}, x_{4}\right]$. The residual terms are:

$$\begin{aligned} f_{1}(x) &= x_{1} + 10 x_{2} \\ f_{2}(x) &= \sqrt{5}\left(x_{3} - x_{4}\right) \\ f_{3}(x) &= \left(x_{2} - 2 x_{3}\right)^{2} \\ f_{4}(x) &= \sqrt{10}\left(x_{1} - x_{4}\right)^{2} \\ F(x) &= \left[f_{1}(x), f_{2}(x), f_{3}(x), f_{4}(x)\right] \end{aligned}$$

In this example, each $f_i(x)$ is a ResidualBlock and $x$ is the full set of parameters. For instance, the parameter blocks for $f_1(x)$ are $[x_1, x_2]$, and its cost is computed as $x_1 + 10 x_2$. $F(x)$ is a function of four parameters and has four residual terms, and we wish to minimize $\frac{1}{2}\|F(x)\|^2$. Let us now solve this problem using Ceres.

As before, the first step is to define a functor that evaluates each residual term in the objective. Here is the functor that evaluates $f_4(x_1, x_4)$:

struct F4 {
  template <typename T>
  bool operator()(const T* const x1, const T* const x4, T* residual) const {
    // f4 = sqrt(10) * (x1 - x4)^2
    residual[0] = sqrt(10.0) * (x1[0] - x4[0]) * (x1[0] - x4[0]);
    return true;
  }
};
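Following the same pattern as F4, the remaining functors can be written as below. This mirrors the structure of examples/powell.cc; each functor takes only the two parameter blocks its residual actually depends on:

```cpp
#include <cmath>

// Residual functors for the remaining terms of Powell's function.
// Each one depends only on the two parameters that appear in its
// residual, not on all four.
struct F1 {
  template <typename T>
  bool operator()(const T* const x1, const T* const x2, T* residual) const {
    // f1 = x1 + 10 * x2
    residual[0] = x1[0] + 10.0 * x2[0];
    return true;
  }
};

struct F2 {
  template <typename T>
  bool operator()(const T* const x3, const T* const x4, T* residual) const {
    // f2 = sqrt(5) * (x3 - x4)
    residual[0] = std::sqrt(5.0) * (x3[0] - x4[0]);
    return true;
  }
};

struct F3 {
  template <typename T>
  bool operator()(const T* const x2, const T* const x3, T* residual) const {
    // f3 = (x2 - 2 * x3)^2
    residual[0] = (x2[0] - 2.0 * x3[0]) * (x2[0] - 2.0 * x3[0]);
    return true;
  }
};
```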

Similarly, we define F1, F2 and F3 to evaluate $f_1(x_1, x_2)$, $f_2(x_3, x_4)$ and $f_3(x_2, x_3)$ respectively. Using these, the optimization problem is constructed as follows:

double x1 =  3.0; double x2 = -1.0; double x3 =  0.0; double x4 = 1.0;

Problem problem;

// Add residual terms to the problem using the autodiff
// wrapper to get the derivatives automatically.
problem.AddResidualBlock(
  new AutoDiffCostFunction<F1, 1, 1, 1>(), nullptr, &x1, &x2);
problem.AddResidualBlock(
  new AutoDiffCostFunction<F2, 1, 1, 1>(), nullptr, &x3, &x4);
problem.AddResidualBlock(
  new AutoDiffCostFunction<F3, 1, 1, 1>(), nullptr, &x2, &x3);
problem.AddResidualBlock(
  new AutoDiffCostFunction<F4, 1, 1, 1>(), nullptr, &x1, &x4);
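The snippet above stops at building the problem. In examples/powell.cc the solver is then invoked along the following lines; this is a sketch of the usual Ceres calling sequence, and assumes the same includes and using-declarations as the example file:

```cpp
// Configure the solver, run it, and print the report shown below.
Solver::Options options;
options.minimizer_progress_to_stdout = true;

Solver::Summary summary;
Solve(options, &problem, &summary);
std::cout << summary.FullReport() << "\n";
```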
Compiling and running examples/powell.cc gives:

Initial x1 = 3, x2 = -1, x3 = 0, x4 = 1
iter      cost      cost_change  |gradient|   |step|    tr_ratio  tr_radius  ls_iter  iter_time  total_time
   0  1.075000e+02    0.00e+00    1.55e+02   0.00e+00   0.00e+00  1.00e+04        0    2.91e-05    3.40e-04
   1  5.036190e+00    1.02e+02    2.00e+01   0.00e+00   9.53e-01  3.00e+04        1    4.98e-05    3.99e-04
   2  3.148168e-01    4.72e+00    2.50e+00   6.23e-01   9.37e-01  9.00e+04        1    2.15e-06    4.06e-04
   3  1.967760e-02    2.95e-01    3.13e-01   3.08e-01   9.37e-01  2.70e+05        1    9.54e-07    4.10e-04
   4  1.229900e-03    1.84e-02    3.91e-02   1.54e-01   9.37e-01  8.10e+05        1    1.91e-06    4.14e-04
   5  7.687123e-05    1.15e-03    4.89e-03   7.69e-02   9.37e-01  2.43e+06        1    1.91e-06    4.18e-04
   6  4.804625e-06    7.21e-05    6.11e-04   3.85e-02   9.37e-01  7.29e+06        1    1.19e-06    4.21e-04
   7  3.003028e-07    4.50e-06    7.64e-05   1.92e-02   9.37e-01  2.19e+07        1    1.91e-06    4.25e-04
   8  1.877006e-08    2.82e-07    9.54e-06   9.62e-03   9.37e-01  6.56e+07        1    9.54e-07    4.28e-04
   9  1.173223e-09    1.76e-08    1.19e-06   4.81e-03   9.37e-01  1.97e+08        1    9.54e-07    4.32e-04
  10  7.333425e-11    1.10e-09    1.49e-07   2.40e-03   9.37e-01  5.90e+08        1    9.54e-07    4.35e-04
  11  4.584044e-12    6.88e-11    1.86e-08   1.20e-03   9.37e-01  1.77e+09        1    9.54e-07    4.38e-04
  12  2.865573e-13    4.30e-12    2.33e-09   6.02e-04   9.37e-01  5.31e+09        1    2.15e-06    4.42e-04
  13  1.791438e-14    2.69e-13    2.91e-10   3.01e-04   9.37e-01  1.59e+10        1    1.91e-06    4.45e-04
  14  1.120029e-15    1.68e-14    3.64e-11   1.51e-04   9.37e-01  4.78e+10        1    2.15e-06    4.48e-04

Solver Summary (v 2.2.0-eigen-(3.4.0)-lapack-suitesparse-(7.1.0)-metis-(5.1.0)-acceleratesparse-eigensparse)

                                     Original                  Reduced
Parameter blocks                            4                        4
Parameters                                  4                        4
Residual blocks                             4                        4
Residuals                                   4                        4

Minimizer                        TRUST_REGION

Dense linear algebra library            EIGEN
Trust region strategy     LEVENBERG_MARQUARDT
                                        Given                     Used
Linear solver                        DENSE_QR                 DENSE_QR
Threads                                     1                        1
Linear solver ordering              AUTOMATIC                        4

Cost:
Initial                          1.075000e+02
Final                            1.120029e-15
Change                           1.075000e+02

Minimizer iterations                       15
Successful steps                           15
Unsuccessful steps                          0

Time (in seconds):
Preprocessor                         0.000311

  Residual only evaluation           0.000002 (14)
  Jacobian & residual evaluation     0.000023 (15)
  Linear solver                      0.000043 (14)
Minimizer                            0.000163

Postprocessor                        0.000012
Total                                0.000486

Termination:                      CONVERGENCE (Gradient tolerance reached. Gradient max norm: 3.642190e-11 <= 1.000000e-10)

Final x1 = 0.000146222, x2 = -1.46222e-05, x3 = 2.40957e-05, x4 = 2.40957e-05

Footnotes

The template parameters of AutoDiffCostFunction<F1, 1, 1, 1> are, in order, the functor that computes the cost, the dimension of the residual, and the dimensions of the first and second parameter blocks. Note that each ResidualBlock depends only on the two parameters that its residual function actually uses, not on all four; for example, $f_1(x_1, x_2)$ depends only on $x_1$ and $x_2$.

It is easy to see that the optimal solution lies at $x_1 = 0, x_2 = 0, x_3 = 0, x_4 = 0$, where the objective function value is 0. After 15 iterations, Ceres finds a solution with a cost of 1.120029e-15.
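As a sanity check on the solver log, the initial cost can be reproduced by hand directly from the residual definitions. The helper below (PowellCost is not part of Ceres; it is purely illustrative) evaluates $\frac{1}{2}\|F(x)\|^2$:

```cpp
#include <cmath>

// Evaluate 1/2 * ||F(x)||^2 for Powell's function directly from the
// four residual definitions. Illustration only; not a Ceres API.
double PowellCost(double x1, double x2, double x3, double x4) {
  const double f1 = x1 + 10.0 * x2;
  const double f2 = std::sqrt(5.0) * (x3 - x4);
  const double f3 = (x2 - 2.0 * x3) * (x2 - 2.0 * x3);
  const double f4 = std::sqrt(10.0) * (x1 - x4) * (x1 - x4);
  return 0.5 * (f1 * f1 + f2 * f2 + f3 * f3 + f4 * f4);
}
```

At the initial point $(3, -1, 0, 1)$ this returns 107.5, i.e. the 1.075000e+02 reported at iteration 0 of the log, and it vanishes at the optimum $(0, 0, 0, 0)$.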


examples/powell.cc