The fisheye distortPoints and undistortPoints functions are not inverses of each other

Vah*_*agn 3 opencv fisheye camera-calibration

I am trying to understand camera calibration / 3D reconstruction and ran into strange behaviour of the cv::fisheye::distortPoints / cv::fisheye::undistortPoints functions. I expected the fisheye model to move a point along the ray connecting it to the principal point (cx, cy); however, that is not the case. Moreover, the functions cv::fisheye::distortPoints and cv::fisheye::undistortPoints are not inverses of each other (as one would expect).

The following code creates a camera matrix with distortion coefficients, undistorts an arbitrary point and then distorts it back. The values of the camera intrinsics and distortion coefficients are taken from a public dataset.

cv::Mat camera_matrix = cv::Mat::zeros(3,3,CV_64F);
camera_matrix.at<double>(0,0) = 190.9784;
camera_matrix.at<double>(1,1) = 190.9733;
camera_matrix.at<double>(0,2) = 254.9317;
camera_matrix.at<double>(1,2) = 256.8974;
camera_matrix.at<double>(2,2) = 1;

std::cout << "Camera matrix: \n" << camera_matrix << "\n" <<std::endl;

cv::Mat distortion_coefficients(4,1,CV_64F);
distortion_coefficients.at<double>(0) = 0.003482;
distortion_coefficients.at<double>(1) = 0.000715;
distortion_coefficients.at<double>(2) = -0.0020532;
distortion_coefficients.at<double>(3) = 0.000203;

std::cout << "Distortion coefficients\n"<< distortion_coefficients<< "\n" << std::endl;

cv::Mat original_point(1,1,CV_64FC2);
original_point.at<cv::Point2d>(0).x= 7.7;
original_point.at<cv::Point2d>(0).y= 9.9;
cv::Mat undistorted, distorted;
cv::fisheye::undistortPoints(original_point, undistorted, camera_matrix, 
            distortion_coefficients, cv::Mat(), camera_matrix);
cv::fisheye::distortPoints(undistorted, distorted, camera_matrix, distortion_coefficients);

std:: cout << "Original point: " << original_point.at<cv::Point2d>(0).x << " " << original_point.at<cv::Point2d>(0).y << std::endl;
std:: cout << "Undistorted point: " << undistorted.at<cv::Point2d>(0).x << " " << undistorted.at<cv::Point2d>(0).y<< std::endl;
std:: cout << "Distorted point: " << distorted.at<cv::Point2d>(0).x << " " << distorted.at<cv::Point2d>(0).y;

The output of this is

Camera matrix: 
[190.9784, 0, 254.9317;
 0, 190.9733, 256.8974;
 0, 0, 1]

Distortion coefficients
[0.003482;
 0.000715;
 -0.0020532;
 0.000203]

Original point: 7.7 9.9
Undistorted point: 8905.69 8899.45
Distorted point: 464.919 466.732

A point near the top-left corner was moved far towards the bottom-right corner.

Is this a bug, or is there something I don't understand?

cv::fisheye::undistortImage works fine on the dataset images: curved lines become straight again.
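
For reference, the image-level undistortion can be invoked along these lines (the file name is only a placeholder, and passing the original camera matrix as Knew is just one possible choice, not something prescribed by the dataset):

cv::Mat frame = cv::imread("dataset_frame.png");   // placeholder file name
cv::Mat frame_undistorted;
cv::fisheye::undistortImage(frame, frame_undistorted,
                            camera_matrix, distortion_coefficients,
                            camera_matrix /* Knew */, frame.size());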

What am I missing?

ele*_*ris 5

You are missing two things.

1. You used incorrect arguments for the fisheye::distortPoints() function: you need to pass normalized points. From the documentation (though it is not very clear):

   Note that the function assumes the camera intrinsic matrix of the undistorted points to be identity. This means if you want to transform back points undistorted with undistortPoints() you have to multiply them with P⁻¹.

2. You need to realize that not all points of the distorted image end up in the undistorted image. Extrapolation only works up to a certain point; beyond that, undistorting and re-distorting will not be inverses of each other.

To normalize the points, you first need to homogenize them (convert them to 3D) and multiply each homogeneous point with the inverse camera matrix to get the normalized points.

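As a condensed sketch of just that step (the full listing below does the same thing with convertPointsToHomogeneous and a loop), a single undistorted pixel can be normalized like this; the pixel values are the ones printed above, and the matrix plays the role of the P/Knew matrix passed to undistortPoints:

cv::Matx33d K(190.9784, 0, 254.9317,
              0, 190.9733, 256.8974,
              0,        0,        1);
cv::Vec3d undistorted_px(8905.69, 8899.45, 1.0);   // undistorted pixel, homogenized
cv::Vec3d n = K.inv() * undistorted_px;            // multiply with P^-1
cv::Point2d normalized(n[0] / n[2], n[1] / n[2]);  // normalized image coordinates
// 'normalized' is what cv::fisheye::distortPoints expects as input.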

You can use fisheye::estimateNewCameraMatrixForUndistortRectify for the new camera matrix to tune the balance between valid pixels in the source and in the destination. However, if you want the undistorted points to match the points of an undistorted image, you need to use this new camera matrix as Knew in undistortImage as well.

cv::Mat k = cv::Mat::zeros(3,3,CV_64F);
k.at<double>(0,0) = 190.9784;
k.at<double>(1,1) = 190.9733;
k.at<double>(0,2) = 254.9317;
k.at<double>(1,2) = 256.8974;
k.at<double>(2,2) = 1;

std::cout << "Camera matrix: \n" << k << "\n" << std::endl;

cv::Mat d(4,1,CV_64F);
d.at<double>(0) = 0.003482;
d.at<double>(1) = 0.000715;
d.at<double>(2) = -0.0020532;
d.at<double>(3) = 0.000203;

std::cout << "Distortion coefficients\n" << d << "\n" << std::endl;

cv::Mat points_original(1,4,CV_64FC2);
points_original.at<cv::Point2d>(0).x = 7.7;
points_original.at<cv::Point2d>(0).y = 9.9;
points_original.at<cv::Point2d>(1).x = 30;
points_original.at<cv::Point2d>(1).y = 30;
points_original.at<cv::Point2d>(2).x = 40;
points_original.at<cv::Point2d>(2).y = 40;
points_original.at<cv::Point2d>(3).x = 50;
points_original.at<cv::Point2d>(3).y = 50;

cv::Mat nk;

// float balance = 1;
// cv::fisheye::estimateNewCameraMatrixForUndistortRectify(k, d, cv::Size(512,512), cv::Mat::eye(3,3,CV_64FC1), nk, balance);

nk = k;

std::cout << "New Camera matrix: \n" << nk << "\n" << std::endl;

cv::Mat points_undistorted, points_redistorted;
cv::fisheye::undistortPoints(points_original, points_undistorted, k, d, cv::Mat(), nk);

// {x,y} -> {x,y,1}
std::vector<cv::Point3d> points_undistorted_homogeneous;
cv::convertPointsToHomogeneous(points_undistorted, points_undistorted_homogeneous);

cv::Mat cam_intr_inv = nk.inv();

for (size_t i = 0; i < points_undistorted_homogeneous.size(); ++i) {
    cv::Mat p(cv::Size(1,3), CV_64FC1);
    p.at<double>(0,0) = points_undistorted_homogeneous[i].x;
    p.at<double>(1,0) = points_undistorted_homogeneous[i].y;
    p.at<double>(2,0) = points_undistorted_homogeneous[i].z;

    cv::Mat q = cam_intr_inv * p;

    points_undistorted_homogeneous[i].x = q.at<double>(0,0);
    points_undistorted_homogeneous[i].y = q.at<double>(1,0);
    points_undistorted_homogeneous[i].z = q.at<double>(2,0);
}

std::vector<cv::Point2d> points_undistorted_normalized;
cv::convertPointsFromHomogeneous(points_undistorted_homogeneous, points_undistorted_normalized);

cv::fisheye::distortPoints(points_undistorted_normalized, points_redistorted, k, d);

for (int i = 0; i < points_original.size().width; ++i) {
    std::cout << "Original point: " << points_original.at<cv::Point2d>(i) << "\n";
    std::cout << "Undistorted point: " << points_undistorted.at<cv::Point2d>(i) << "\n";
    std::cout << "Redistorted point: " << points_redistorted.at<cv::Point2d>(i) << "\n\n";
}

Result:

Original point: [7.7, 9.9]
Undistorted point: [8905.69, 8899.45]
Redistorted point: [463.048, 464.816]

Original point: [30, 30]
Undistorted point: [8125.4, 8196.15]
Redistorted point: [461.864, 465.638]

Original point: [40, 40]
Undistorted point: [7775.49, 7846.24]
Redistorted point: [461.725, 465.582]

Original point: [50, 50]
Undistorted point: [-3848.98, -3886.37]
Redistorted point: [50, 50]

As you can see, the points near the corner, up to (40, 40), all fail to round-trip. Maybe you can improve this by calibrating again and adding more checkerboard corners near the edges of the image (at different angles).
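
If it helps, such a re-calibration call could look roughly like this. This is only a sketch: the corner containers are assumed to be filled from checkerboard detections (including detections near the image borders), and the flags are just a common choice.

// Sketch: re-calibrate the fisheye model from collected checkerboard corners.
static double recalibrateFisheye(const std::vector<std::vector<cv::Point3d>>& object_points,
                                 const std::vector<std::vector<cv::Point2d>>& image_points,
                                 cv::Size image_size, cv::Mat& K, cv::Mat& D)
{
    std::vector<cv::Mat> rvecs, tvecs;
    int flags = cv::fisheye::CALIB_RECOMPUTE_EXTRINSIC | cv::fisheye::CALIB_FIX_SKEW;
    // Returns the RMS reprojection error of the fit.
    return cv::fisheye::calibrate(object_points, image_points, image_size,
                                  K, D, rvecs, tvecs, flags);
}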
