Thursday, August 15, 2013

Image stitching in OpenCV

Reference: http://ramsrigoutham.com/2012/11/22/panorama-image-stitching-in-opencv/

Main steps of the code:
  1. Load the two input images;
  2. Convert them to grayscale;
  3. Detect SURF keypoints and compute SURF descriptors in both images;
  4. Match the SURF descriptors using the FLANN matcher;
  5. Post-process the matches to keep only the good matches;
  6. Estimate the homography matrix from the good matches using RANSAC;
  7. Warp the images based on the homography matrix.

The image shows the definition of a homography, which maps points on a 2D plane to the image plane.
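
In equation form, a homography H maps a point (x, y) on one plane (here, the first image) to its corresponding point (x', y') in the second image, up to a scale factor s:

$$
s \begin{bmatrix} x' \\ y' \\ 1 \end{bmatrix}
= H \begin{bmatrix} x \\ y \\ 1 \end{bmatrix},
\qquad
H = \begin{bmatrix} h_{11} & h_{12} & h_{13} \\ h_{21} & h_{22} & h_{23} \\ h_{31} & h_{32} & h_{33} \end{bmatrix}
$$

H is defined only up to scale (8 degrees of freedom), so at least 4 point correspondences are needed to estimate it; RANSAC makes the estimate robust to mismatched features.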


The following image shows the two input images with their matched SURF features. The homography is estimated from the left image to the right image.


Stitching result.

Source Code
#include <stdio.h>
#include <iostream>
#include "opencv2/opencv.hpp"
#include "opencv2/nonfree/nonfree.hpp"

using namespace cv;
using namespace std;

int main()
{
    // Load the two input images (imread returns BGR images)
    Mat image1 = imread("image1.jpg", 1);
    Mat image2 = imread("image2.jpg", 1);
    if (!image1.data || !image2.data)
    {
        cout << " -- (!) Error reading images" << endl;
        return -1;
    }

    // Convert to grayscale (imread loads BGR, so use CV_BGR2GRAY)
    Mat img1_gray, img2_gray;
    cvtColor(image1, img1_gray, CV_BGR2GRAY);
    cvtColor(image2, img2_gray, CV_BGR2GRAY);

    imshow("Image1", image1);
    imshow("Image2", image2);

    // -- Step 1: Detect the keypoints using the SURF detector
    int minHessian = 400;
    SurfFeatureDetector detector(minHessian);
    vector<KeyPoint> keypoints_object, keypoints_scene;
    detector.detect(img1_gray, keypoints_object);
    detector.detect(img2_gray, keypoints_scene);

    // -- Step 2: Calculate the descriptors (feature vectors)
    SurfDescriptorExtractor extractor;
    Mat descriptors_object, descriptors_scene;
    extractor.compute(img1_gray, keypoints_object, descriptors_object);
    extractor.compute(img2_gray, keypoints_scene, descriptors_scene);

    // -- Step 3: Match the descriptor vectors using the FLANN matcher
    FlannBasedMatcher matcher;
    vector<DMatch> matches;
    matcher.match(descriptors_object, descriptors_scene, matches);

    // -- Quick calculation of the max and min distances between matched keypoints
    double max_dist = 0, min_dist = 100;
    for (int i = 0; i < descriptors_object.rows; i++)
    {
        double dist = matches[i].distance;
        if (dist < min_dist) min_dist = dist;
        if (dist > max_dist) max_dist = dist;
    }
    printf("-- Max dist: %f \n", max_dist);
    printf("-- Min dist: %f \n", min_dist);

    // -- Use only the "good" matches (i.e. those with distance smaller than 3*min_dist)
    vector<DMatch> good_matches;
    for (int i = 0; i < descriptors_object.rows; i++)
    {
        if (matches[i].distance < 3 * min_dist)
        {
            good_matches.push_back(matches[i]);
        }
    }

    // -- Draw the good matches
    Mat img_goodmatches;
    drawMatches(image1, keypoints_object, image2, keypoints_scene,
                good_matches, img_goodmatches, Scalar::all(-1), Scalar::all(-1),
                vector<char>(), DrawMatchesFlags::NOT_DRAW_SINGLE_POINTS);
    imshow("GoodMatches", img_goodmatches);
    imwrite("MatchingResult.jpg", img_goodmatches);

    // -- Get the keypoint coordinates from the good matches
    vector<Point2f> obj;
    vector<Point2f> scene;
    for (size_t i = 0; i < good_matches.size(); i++)
    {
        obj.push_back(keypoints_object[good_matches[i].queryIdx].pt);
        scene.push_back(keypoints_scene[good_matches[i].trainIdx].pt);
    }

    // -- Find the homography matrix with RANSAC (maps image1 points to image2 points)
    Mat H = findHomography(obj, scene, CV_RANSAC);
    cout << "Homography Matrix:" << endl << H << endl;

    // -- Use the homography matrix to warp image1 into image2's frame
    Mat result;
    warpPerspective(image1, result, H, Size(image1.cols + image2.cols, image1.rows));
    // IMPORTANT: 'half' is a header that shares data with the left part of 'result',
    // so copying image2 into 'half' writes directly into 'result'
    Mat half(result, Rect(0, 0, image2.cols, image2.rows));
    image2.copyTo(half);

    imshow("Stitching Result", result);
    imshow("Stitching Result2", half);
    imwrite("StitchingResult.jpg", result);

    waitKey(0);
    return 0;
}
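To build the program (a sketch, assuming OpenCV 2.4.x with the nonfree module installed and visible to pkg-config; the file name stitching.cpp is just an example):

g++ stitching.cpp -o stitching $(pkg-config --cflags --libs opencv)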

Tuesday, August 6, 2013

Resources to learn optimization

http://www.quora.com/Mathematical-Optimization/What-are-some-good-resources-to-learn-about-optimization#


Theory behind MPC


Model predictive control (MPC) is based on iterative, finite-horizon optimization of a plant model. At time t the current plant state is sampled and a cost-minimizing control strategy is computed (via a numerical minimization algorithm) for a relatively short time horizon in the future, [t, t+T]. Specifically, an online, on-the-fly calculation is used to explore state trajectories that emanate from the current state and to find (via the solution of Euler-Lagrange equations) a cost-minimizing control strategy until time t+T. Only the first step of the control strategy is implemented; then the plant state is sampled again and the calculations are repeated starting from the new current state, yielding a new control and a new predicted state path. Because the prediction horizon keeps being shifted forward, MPC is also called receding horizon control.
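
As a rough illustration of the receding-horizon loop, here is a minimal sketch for a scalar linear plant x_{k+1} = a*x_k + b*u_k with a quadratic cost. The plant parameters, cost weights, horizon length, and the brute-force grid search over input sequences are all illustrative assumptions; a real MPC solves the finite-horizon problem with a proper optimizer (e.g. a QP solver or the Euler-Lagrange conditions mentioned above).

#include <cstdio>
#include <cmath>
#include <vector>
#include <limits>

int main()
{
    // Illustrative plant and cost (assumptions, not from the post)
    const double a = 1.1, b = 0.5;   // unstable scalar plant x_{k+1} = a*x + b*u
    const double q = 1.0, r = 0.1;   // state / input cost weights
    const int    N = 3;              // short prediction horizon

    // Coarse grid of candidate inputs u in [-2, 2]
    std::vector<double> grid;
    for (int i = -10; i <= 10; ++i) grid.push_back(0.2 * i);
    const int M = (int)grid.size();

    double x = 5.0;                  // measured initial state
    for (int t = 0; t < 20; ++t)     // closed-loop simulation
    {
        double bestCost = std::numeric_limits<double>::max();
        double bestU0   = 0.0;

        // Enumerate all N-step input sequences over the grid and keep the
        // first input of the cheapest one (the receding-horizon principle).
        const long total = (long)std::pow((double)M, N);
        for (long code = 0; code < total; ++code)
        {
            long c = code;
            double xs = x, cost = 0.0;
            double u0 = grid[c % M];         // first input of this sequence
            for (int k = 0; k < N; ++k)
            {
                double u = grid[c % M]; c /= M;
                cost += q * xs * xs + r * u * u;
                xs = a * xs + b * u;         // predict one step ahead
            }
            cost += q * xs * xs;             // terminal cost
            if (cost < bestCost) { bestCost = cost; bestU0 = u0; }
        }

        // Apply only the first input, then re-sample the state and repeat
        x = a * x + b * bestU0;
        printf("t=%2d  u=%+.2f  x=%+.4f\n", t, bestU0, x);
    }
    return 0;
}

The key point is inside the loop: the whole horizon is optimized at every step, but only the first input bestU0 is applied before the state is re-sampled and the optimization is repeated from the new state.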