A log of everything I find useful. If you find anything inappropriate, please contact cuijinqiang@gmail.com
Sunday, March 24, 2013
Saturday, March 23, 2013
German 1
wir haben uns kennengelernt (we got to know each other)
Da begrüßte er mich, wir hatten uns persönlich nie kennengelernt, aber er legte Wert darauf, mich am 4. November kennenzulernen.
He greeted me, we had never met in person, but it was important to him that he meet me on the 4th of November.
Solitude, too, is an ability.
— Zhou Guoping (周国平)
[15 tips to help you succeed] 1. Learn to see things from others' perspectives; 2. Learn to adapt to your environment; 3. Be generous; 4. Keep a low profile; 5. Speak kindly; 6. Be polite; 7. The more you say, the more you risk getting wrong; 8. Learn gratitude; 9. Be punctual; 10. Keep your promises; 11. Learn patience; 12. Keep a level head; 13. Learn to praise others; 14. Treat superiors with respect and subordinates with tolerance; 15. Examine yourself regularly.
[Motivational quotes] 1. Many people fail to be themselves because they are always trying to be someone else! 2. Never ask life for the best; only persist in seeking what suits you best! 3. Better to run and be tripped countless times than to walk cautiously for a lifetime; even if you fall, laugh boldly. 4. Don't get angry, strive instead; don't give up, break through instead; don't envy, appreciate instead; don't procrastinate, act instead; don't just feel moved, take action.
Saturday, March 2, 2013
Convex Optimization
References on convex optimization
- Stephen Boyd and Lieven Vandenberghe, Convex Optimization (book, available in PDF)
- EE364a: Convex Optimization I and EE364b: Convex Optimization II, Stanford course homepages
- 6.253: Convex Analysis and Optimization, an MIT OCW course homepage
- Brian Borchers, An overview of software for convex optimization
Examples
The following problems are all convex minimization problems, or can be transformed into convex minimization problems via a change of variables (a small modeling sketch follows the list):
- Least squares
- Linear programming
- Convex quadratic minimization with linear constraints
- Quadratically constrained quadratic programming (convex quadratic minimization with convex quadratic constraints)
- Conic optimization
- Geometric programming
- Second order cone programming
- Semidefinite programming
- Entropy maximization with appropriate constraints
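As a concrete illustration (my own sketch, not from the references above), here is how one of these problems, constrained least squares, can be posed in CVXPY, the Python modeling tool from Boyd's group; the data are random placeholders:

import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))   # placeholder problem data
b = rng.standard_normal(20)

x = cp.Variable(5)
# Least squares with linear constraints: still a convex problem.
objective = cp.Minimize(cp.sum_squares(A @ x - b))
constraints = [cp.sum(x) == 1, x >= 0]
prob = cp.Problem(objective, constraints)
prob.solve()
print("optimal value:", prob.value)
print("x:", x.value)

Dropping the constraints recovers ordinary least squares; replacing the objective with a linear function gives a linear program.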
Methods
Convex minimization problems can be solved by the following contemporary methods:[4]
- "Bundle methods" (Wolfe, Lemaréchal, Kiwiel), and
- Subgradient projection methods (Polyak),
- Interior-point methods (Nemirovskii and Nesterov).
Other methods of interest:
- Cutting-plane methods
- Ellipsoid method
- Subgradient method
- Dual subgradients and the drift-plus-penalty method
Subgradient methods can be implemented simply and so are widely used.[5] Dual subgradient methods are subgradient methods applied to a dual problem. The drift-plus-penalty method is similar to the dual subgradient method, but takes a time average of the primal variables.
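For intuition, here is a minimal sketch (my own, with random placeholder data) of the basic subgradient method applied to the nondifferentiable convex function f(x) = ||Ax - b||_1; the diminishing step size and best-iterate tracking are the standard ingredients:

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))  # placeholder problem data
b = rng.standard_normal(30)

x = np.zeros(10)
f_best, x_best = np.inf, x.copy()
for k in range(1, 2001):
    g = A.T @ np.sign(A @ x - b)      # a subgradient of ||Ax - b||_1 at x
    x = x - (1.0 / k) * g             # diminishing step size 1/k
    f = np.abs(A @ x - b).sum()
    if f < f_best:                    # subgradient steps are not monotone,
        f_best, x_best = f, x.copy()  # so track the best iterate seen
print("best objective:", f_best)

Unlike gradient descent, the objective need not decrease at every step, which is why the best iterate is tracked separately.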
Friday, March 1, 2013
OpenOF Framework for Sparse Non-linear Least Squares Optimization on a GPU
2013-3-2
With OpenOF, the authors present a framework that enables developers to design sparse optimizations over parameters and measurements and to utilize the parallel power of a GPU.
The framework is written in Python and builds on three major libraries: Thrust, CUSP, and SymPy. Although written in Python, it can also generate C++ code.
Code website
https://github.com/OpenOF/OpenOF
Process of nonlinear least squares optimization (a minimal sketch follows this list):
1. It is an iterative method.
2. The cost function is linearized in each iteration.
3. The Levenberg-Marquardt (LM) algorithm is the standard choice, combining the Gauss-Newton algorithm with the gradient descent approach; the damping makes convergence robust.
4. In each iteration, solving the linear system Ax = b is the most computationally intensive step.
5. Sparse matrix representations are used by sparseLM (Lourakis, 2010) and g2o (Kümmerle et al., 2011), but on the CPU.
6. Many algorithms can solve Ax = b, e.g. Cholesky decomposition A = LDL'.
7. This paper uses a conjugate gradient (CG) approach on the GPU.
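To make steps 1-7 concrete, here is a minimal dense NumPy sketch (my own illustration, not OpenOF code) of an LM loop whose inner Ax = b solve uses conjugate gradient; a real implementation like OpenOF would use sparse matrices on the GPU instead:

import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=200):
    # Solve Ax = b for symmetric positive definite A (step 7).
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

def levenberg_marquardt(residual, jacobian, x0, max_iter=50, lam=1e-3):
    # Minimize ||r(x)||^2 iteratively (steps 1-4); a Cholesky
    # factorization A = LDL' could replace CG here (step 6).
    x = x0.copy()
    for _ in range(max_iter):
        r = residual(x)
        J = jacobian(x)
        A = J.T @ J + lam * np.eye(len(x))   # damped normal equations
        dx = conjugate_gradient(A, J.T @ r)  # the expensive linear solve
        x_new = x - dx
        if np.sum(residual(x_new) ** 2) < np.sum(r ** 2):
            x, lam = x_new, lam * 0.5        # accept step, reduce damping
        else:
            lam *= 10.0                      # reject step, increase damping
        if np.linalg.norm(dx) < 1e-12:
            break
    return x

For example, fitting y ≈ exp(a·t) amounts to residual(x) = np.exp(x[0] * t) - y with jacobian(x) = (t * np.exp(x[0] * t))[:, None].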
Nonlinear least squares optimization is widely used in SLAM (simultaneous localization and mapping) and BA (bundle adjustment).
Some of the authors' comments on related libraries:
1. The SBA library (Lourakis and Argyros, 2009) takes advantage of the special structure of the Hessian matrix to apply the Schur complement when solving the linear system. Nevertheless, it has several drawbacks: integrating additional parameters that remain identical across all measurements (e.g. camera calibration) is not possible, as the structure would change such that the Schur complement could no longer be applied.
2. sparseLM (Lourakis, 2010) is slow.
3. g2o: the Jacobian is evaluated by numerical differentiation, which is time-consuming and also degrades the convergence rate.
4. iSAM (Kaess et al., 2011) was presented previously for least squares optimization, but it addresses only a subset of problems.
Overall comment: this paper claims to present an open-source framework for sparse nonlinear optimization in which the cost functions are described in a high-level scripting language. It cannot be used without a GPU yet. It seems to me that g2o or iSAM would be more useful on the CPU.