
Robot Hand-Eye Calibration AX = XB (eye-to-hand and eye-in-hand) and Planar Nine-Point Calibration


1. Background

Calibration is a perennial pain for robot developers. The methods have been around for decades, yet everyone who needs to use a camera still has to wade through a round of painful pitfalls; there is no package you can simply drop in and expect good results from.

In robot vision applications, hand-eye calibration is a very basic and critical problem. Simply put, its purpose is to obtain the relationship between the robot coordinate system and the camera coordinate system, so that visual recognition results can ultimately be expressed in the robot coordinate system.

In practice, hand-eye calibration comes in two forms, depending on where the camera is mounted: if the camera is fixed to the robot end-effector, it is called "eye in hand"; if the camera is fixed on a stand outside the robot, it is called "eye to hand".

eye to hand: "eye outside the hand", camera fixed off the robot
eye in hand: "eye on the hand", camera mounted on the robot end-effector

2. Mathematical Description of the Hand-Eye Relationship

1. Eye in hand: in this configuration, across any two robot motions, the relationship between the robot base and the calibration board stays fixed. The quantity to solve for is the pose relationship between the camera frame and the robot end-effector frame (a derivation sketch follows these two cases).

 

2. Eye to hand: in this configuration, across any two robot motions, the pose relationship between the robot end-effector and the calibration board stays fixed. The quantity to solve for is the pose relationship between the camera frame and the robot base frame.
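To make the AX = XB form explicit, here is a brief derivation sketch. The notation is mine, not from the original post: T_{a,b} denotes the 4x4 homogeneous transform of frame b expressed in frame a, and (1), (2) index two robot poses.

    Eye in hand (X = T_{end,cam}; the board is fixed, so T_{base,board} is constant):
        T_{base,end}(1) \, X \, T_{cam,board}(1) = T_{base,end}(2) \, X \, T_{cam,board}(2)
        \Rightarrow \; T_{base,end}(2)^{-1} T_{base,end}(1) \, X = X \, T_{cam,board}(2) \, T_{cam,board}(1)^{-1}
        \text{i.e. } A X = X B, \quad A = T_{base,end}(2)^{-1} T_{base,end}(1), \; B = T_{cam,board}(2) \, T_{cam,board}(1)^{-1}

    Eye to hand (X = T_{base,cam}; the board rides on the end-effector, so T_{end,board} is constant):
        T_{end,base}(1) \, X \, T_{cam,board}(1) = T_{end,base}(2) \, X \, T_{cam,board}(2)
        \Rightarrow \; T_{end,base}(2)^{-1} T_{end,base}(1) \, X = X \, T_{cam,board}(2) \, T_{cam,board}(1)^{-1}

Both cases reduce to AX = XB; only the meaning of A (end-in-base vs. base-in-end relative motion) and of X changes.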

 

3. Solving the AX = XB Problem

Solving rotation and translation separately (two-step methods):

  • Y. Shiu, S. Ahmad Calibration of Wrist-Mounted Robotic Sensors by Solving Homogeneous Transform Equations of the Form AX = XB. In IEEE Transactions on Robotics and Automation, 5(1):16-29, 1989. 
  • R. Tsai, R. Lenz A New Technique for Fully Autonomous and Efficient 3D Robotics Hand/Eye Calibration. In IEEE Transactions on Robotics and Automation, 5(3):345-358, 1989. 

For iterative solutions and related material, see the English tutorial at http://math.loyola.edu/~mili/Calibration/index.html, which also provides some MATLAB code for AX = XB that can be used directly.

There are also several ROS packages that can be used:

https://github.com/IFL-CAMP/easy_handeye

http://wiki.ros.org/handeye

https://visp-doc.inria.fr/doxygen/visp-daily/tutorial-calibration-extrinsic.html

http://campar.in.tum.de/Chair/HandEyeCalibration (TUM also maintains many reference links on solving hand-eye calibration)

4. Other References

https://mp.weixin.qq.com/s/nJx1dlpBXaL2_iT_J4W5Kg  (WeChat article by 邱强Flyqq)

https://blog.csdn.net/u011089570/article/details/47945733  (nice figures)

https://blog.csdn.net/qq_16481211/article/details/79764730  (partial Halcon code)

https://blog.csdn.net/qq_16481211/article/details/79767100  (Halcon code)

https://blog.csdn.net/happyjume/article/details/80847822  (some of the underlying theory)

https://blog.csdn.net/zhang970187013/article/details/81098175  (UR5 with easy_handeye)

The basic equation is usually solved with a "two-step" method: first solve the rotation part from the equation above, then substitute it back in to solve for the translation part.
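Written out explicitly, with A = [R_A, t_A; 0, 1], X = [R_X, t_X; 0, 1] and B = [R_B, t_B; 0, 1], the equation AX = XB separates into:

    R_A R_X = R_X R_B
    R_A t_X + t_A = R_X t_B + t_X \;\Rightarrow\; (R_A - I)\, t_X = R_X t_B - t_A

The first equation is solved for R_X (Tsai-Lenz turn it into a linear system in the rotation axes, as tsai.m below does); R_X is then substituted into the second equation, which gives t_X by linear least squares.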

https://blog.csdn.net/yunlinwang/article/details/51622143

============== Halcon official example: hand-eye calibration ==================

5. Solving Hand-Eye Calibration in Matlab

The camera and robot are in eye-to-hand mode. The robot is a Kinova 6-axis arm (Canada). Each robot pose is the base relative to the end-effector, given as x, y, z, rx, ry, rz, rw; positions are in metres and the orientation is a unit quaternion.

2017.08.29Kinova_pose_all_8_1.txt


  
    0.4285232296,0.1977596461,-0.5595823047,0.4738016082,0.3209333657,-0.2268795901,-0.7880605703
    0.2265712272,0.0261943502,-0.6661910850,0.5222343809,0.0461668455,-0.1315038186,-0.8413362107
    0.1058885008,0.0527681997,-0.7419267756,0.5081638565,-0.0263721340,-0.1696570449,-0.8439730402
    0.1127224767,-0.0420359377,-0.7413071786,0.5101706595,-0.0322252464,-0.1864402870,-0.8390038445
    0.2592932751,-0.0329068529,-0.7162865014,0.5101641882,0.1265739325,-0.0153077050,-0.8505746380
    -0.1724239544,0.0084144761,-0.6998332592,0.4989369998,-0.0973210400,-0.1244194561,-0.8521210503
    -0.0534271258,0.0771779706,-0.6845820453,0.5195499916,0.0511217081,-0.1691595167,-0.8359661686
    0.2598500029,0.0933230213,-0.6638257382,0.5673912325,0.1686973705,-0.1427337629,-0.7932436318

Each pattern pose is the calibration board relative to the camera, given as x, y, z, rx, ry, rz, rw; positions are in metres and the orientation is a unit quaternion.

2017.08.29Pattern_pose_all_8_1.txt


  
    0.4277459616,-0.0267754900,0.0822384718,0.3555198802,-0.2313832954,-0.6738796808,-0.6049409568
    0.4610891450,-0.0089776615,0.0394863697,-0.4168581229,0.4389318994,0.4953132086,0.6230833961
    0.4159190432,0.0112235849,0.0112504715,-0.3559197420,0.4898053987,0.4542453721,0.6535081871
    0.4143974465,0.0094036803,0.1073298638,-0.3438949365,0.4875942762,0.4507613906,0.6639294113
    0.3473267442,-0.1866610227,0.0651366494,0.4966101920,-0.4291949558,-0.5361047392,-0.5308123169
    0.3883165305,-0.2249779584,0.0093256602,-0.3595692001,0.5529429756,0.4090559739,0.6305848604
    0.4309364801,-0.2354261800,-0.0925819097,-0.3831460128,0.4244707870,0.5043032275,0.6470718186
    0.4362292324,-0.0778107597,-0.1007970399,0.4568889439,-0.3144470665,-0.5302104578,-0.6412896427

The Matlab program below loads this data and solves offline. Under "Registering Two Sets of 6DoF Data with 1 Unknown" at http://math.loyola.edu/~mili/Calibration/index.html there is code to download; save it as shiu.m and tsai.m so the program below can call them. I paste it here:

tsai.m


  
    function X = tsai(A, B)
    % Calculates the least-squares solution of AX = XB
    %
    % "A New Technique for Fully Autonomous and Efficient 3D Robotics
    %  Hand/Eye Calibration", Lenz and Tsai
    %
    % Mili Shah, July 2014

    [m, n] = size(A); n = n/4;
    S = zeros(3*n, 3);
    v = zeros(3*n, 1);
    % Calculate the best rotation R
    for i = 1:n
        A1 = logm(A(1:3, 4*i-3:4*i-1));
        B1 = logm(B(1:3, 4*i-3:4*i-1));
        a = [A1(3,2) A1(1,3) A1(2,1)]'; a = a/norm(a);
        b = [B1(3,2) B1(1,3) B1(2,1)]'; b = b/norm(b);
        S(3*i-2:3*i, :) = skew(a + b);
        v(3*i-2:3*i, :) = a - b;
    end
    x = S\v;
    theta = 2*atan(norm(x));
    x = x/norm(x);
    R = (eye(3)*cos(theta) + sin(theta)*skew(x) + (1-cos(theta))*x*x')';
    % Calculate the best translation t
    C = zeros(3*n, 3);
    d = zeros(3*n, 1);
    I = eye(3);
    for i = 1:n
        C(3*i-2:3*i, :) = I - A(1:3, 4*i-3:4*i-1);
        d(3*i-2:3*i, :) = A(1:3, 4*i) - R*B(1:3, 4*i);
    end
    t = C\d;
    % Put everything together to form X
    X = [R t; 0 0 0 1];

Jaco_handeye_test_10.m: the test program uses Peter Corke's Robotics Toolbox. My Matlab version is R2013a; the toolbox provides the conversion functions used here (quaternion construction, Euler-angle conversion, etc.). For installation and basic usage see: https://blog.csdn.net/yaked/article/details/48933603


  
    clear;
    close all;
    clc;

    % Robot poses: x, y, z plus a quaternion stored as (x, y, z, w)
    JacoCartesianPose = importdata('E:\Matlab_work\handeye\yake_handeye\2017.08.29Kinova_pose_all_8_1.txt');
    [m, n] = size(JacoCartesianPose);    % 8 x 7
    A = cell(1, m);                      % 1 x 8
    for i = 1:m
        % The Quaternion constructor expects [w x y z]
        robotHtool_qua = Quaternion([JacoCartesianPose(i,7), JacoCartesianPose(i,4), JacoCartesianPose(i,5), JacoCartesianPose(i,6)]);
        A{1,i} = transl(JacoCartesianPose(i,1), JacoCartesianPose(i,2), JacoCartesianPose(i,3)) * robotHtool_qua.T;
    end

    % Pattern poses (converted to homogeneous matrices) stored in cell B
    patternInCamPose = importdata('E:\Matlab_work\handeye\yake_handeye\2017.08.29Pattern_pose_all_8_1.txt');
    [melem, nelem] = size(patternInCamPose);   % 8 x 7
    B = cell(1, melem);
    for x = 1:melem
        camHmarker_qua = Quaternion([patternInCamPose(x,7), patternInCamPose(x,4), patternInCamPose(x,5), patternInCamPose(x,6)]);
        B{1,x} = transl(patternInCamPose(x,1), patternInCamPose(x,2), patternInCamPose(x,3)) * camHmarker_qua.T;
    end

    n2 = m;
    TA = cell(1, n2);
    TB = cell(1, n2);
    % --------------------- 8 -----------------------------------
    M1 = [];
    M2 = [];
    for j = 1:m-1   % relative motions between consecutive poses (1..7)
        TA{1,j} = inv(A{1,j}) * A{1,j+1};
        M1 = [M1 TA{1,j}];
        TB{1,j} = B{1,j} * inv(B{1,j+1});
        M2 = [M2 TB{1,j}];
    end
    % M1 = [TA{1,1} TA{1,2} TA{1,3} TA{1,4} TA{1,5} TA{1,6} TA{1,7}];
    % M2 = [TB{1,1} TB{1,2} TB{1,3} TB{1,4} TB{1,5} TB{1,6} TB{1,7}];
    % --------------------- 8 -----------------------------------
    C_Tsai = tsai(M1, M2)
    T_Tsai = (transl(C_Tsai))';
    C_Tsai_rad = tr2rpy(C_Tsai);
    C_Tsai_rpy_rx_ry_rz = rad2deg(C_Tsai_rad);
    Q_Tsai_Qxyzw = Quaternion(C_Tsai);
    fprintf('Tsai(rad) = \n%f, %f, %f, %f, %f, %f\n', T_Tsai(1,1), T_Tsai(1,2), T_Tsai(1,3), C_Tsai_rad(1,1), C_Tsai_rad(1,2), C_Tsai_rad(1,3));
    fprintf('Tsai(deg) = \n%f, %f, %f, %f, %f, %f\n\n', T_Tsai(1,1), T_Tsai(1,2), T_Tsai(1,3), C_Tsai_rpy_rx_ry_rz(1,1), C_Tsai_rpy_rx_ry_rz(1,2), C_Tsai_rpy_rx_ry_rz(1,3));
    fprintf('Tsai(Qxyzw) = \n %f, %f, %f, %f\n\n', Q_Tsai_Qxyzw.v(1), Q_Tsai_Qxyzw.v(2), Q_Tsai_Qxyzw.v(3), Q_Tsai_Qxyzw.s);

A brief explanation: the program reads the paired robot and camera poses, converts each into a 4x4 homogeneous transformation matrix, forms the relative motions, and feeds them to tsai.m for the solution.
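As a sanity check (my addition, not part of the original post), every relative-motion pair should approximately satisfy TA{j} * X = X * TB{j}. A small helper along these lines, run after the script above as handeye_residual(TA(1:m-1), TB(1:m-1), C_Tsai), reports the translation residual:

    function err = handeye_residual(TA, TB, X)
    % Rough consistency check for AX = XB: compare A*X against X*B for each
    % relative-motion pair and report the translation part of the mismatch.
    n = numel(TA);
    err = zeros(1, n);
    for j = 1:n
        E = TA{j} * X - X * TB{j};
        err(j) = norm(E(1:3,4));   % metres
    end
    fprintf('mean/max translation residual: %f / %f m\n', mean(err), max(err));
    end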

The difference between eye-in-hand and eye-to-hand calibration is mainly on the robot side: one uses the end-effector pose relative to the base, the other the base pose relative to the end-effector. Be very careful about this.
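A hypothetical helper (the function name and mode strings are mine) that makes the convention explicit when building the robot-side matrices:

    function T_robot = robot_pose_for_handeye(T_end_in_base, mode)
    % T_end_in_base: 4x4 pose of the end-effector in the base frame, as most
    % controllers report it. For eye-to-hand the AX = XB setup above wants the
    % inverse (base relative to end); for eye-in-hand it is used as-is.
    if strcmp(mode, 'eye_to_hand')
        T_robot = inv(T_end_in_base);
    else   % 'eye_in_hand'
        T_robot = T_end_in_base;
    end
    end

The Kinova poses above were already recorded as base relative to end, so they could be fed directly into the eye-to-hand solve.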

==================== Planar Nine-Point Calibration ====================

When an RGB camera is used and the robot only performs planar grasping (i.e., grasping with a fixed orientation: the 6-DOF pose is reduced to translation only, the orientation is given manually and held fixed, and the robot simply moves above the target point), the problem simplifies to the point-pair relationship between target pixel points A(x1, y1) in the planar RGB image and robot coordinates (X1, Y1) in the plane. In practice, the camera detects the pixel coordinates, which form A; the robot coordinates in the base frame, read from the teach pendant, form B.

Writing the camera and robot coordinates in homogeneous form, the mapping X is a 3x3 matrix; here it is effectively an affine transform, since its third column comes out as (0, 0, 1). Each point pair contributes two independent equations, so three non-collinear pairs already determine the six unknowns; in practice more pairs are collected (hence "nine points") and the over-determined system is solved in the least-squares sense, for example via SVD.

D:\opencv_work\cubeSolver\cv_solver\ConsoleApplication1\CV_SVD.cpp

D:\Matlab_work\handeye\NinePoints_Calibration.m


  
    // Solve equation: AX = B
    #include <cv.h>
    #include <opencv2/opencv.hpp>

    using namespace std;
    using namespace cv;

    int main(int argc, char** argv)
    {
        printf("\nSolve equation: AX = B\n\n");

        //Mat A = (Mat_<float>(6, 3) <<
        //    480.8, 639.4, 1,
        //    227.1, 317.5, 1,
        //    292.4, 781.6, 1,
        //    597.4, 1044.1, 1,
        //    557.7, 491.6, 1,
        //    717.8, 263.7, 1);   // 6x3
        //Mat B = (Mat_<float>(6, 3) <<
        //    197170, 185349, 1,
        //    195830, 186789, 1,
        //    196174, 184591, 1,
        //    197787, 183176, 1,
        //    197575, 186133, 1,
        //    198466, 187335, 1);

        // Pixel coordinates (u, v, 1)
        Mat A = (Mat_<float>(4, 3) <<
            2926.36, 2607.79, 1,
            587.093, 2616.89, 1,
            537.031, 250.311, 1,
            1160.53, 1265.21, 1);   // 4x3

        // Corresponding robot coordinates (X, Y, 1)
        Mat B = (Mat_<float>(4, 3) <<
            320.389, 208.197, 1,
            247.77, 209.726, 1,
            242.809, 283.182, 1,
            263.152, 253.715, 1);

        Mat X;
        cout << "A=" << endl << A << endl;
        cout << "B=" << endl << B << endl;
        solve(A, B, X, CV_SVD);   // least-squares solution of A*X = B via SVD
        cout << "X=" << endl << X << endl;

        // Map a new pixel point and compare with the measured robot coordinates
        Mat a1 = (Mat_<float>(1, 3) << 1864, 1273, 1);
        Mat b1 = a1 * X;
        cout << "b1=" << endl << b1 << endl;
        cout << "Ground truth: " << "283.265, 253.049, 1" << endl;
        getchar();
        return 0;
    }

https://docs.opencv.org/3.1.0/d2/de8/group__core__array.html#ga12b43690dbd31fed96f213eefead2373

Result comparison: Halcon C# on the left (the third column, 0, 0, 1, is not shown), OpenCV C++ on the right, and the Matlab result below; all three agree, so the algorithm passes the check.
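The NinePoints_Calibration.m mentioned above is not reproduced in the post; a minimal Matlab sketch of the same least-squares solve, using the four point pairs from the C++ example, would look like this:

    % Pixel coordinates (u, v, 1) and corresponding robot coordinates (X, Y, 1)
    A = [2926.36 2607.79 1; 587.093 2616.89 1; 537.031 250.311 1; 1160.53 1265.21 1];
    B = [320.389 208.197 1; 247.77 209.726 1; 242.809 283.182 1; 263.152 253.715 1];
    X = A \ B;                    % least-squares solution of A*X = B
    b1 = [1864 1273 1] * X;       % map a new pixel point into robot coordinates
    disp(X);
    disp(b1);                     % compare with the measured 283.265, 253.049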

=============

https://blog.csdn.net/Stones1025/article/details/100085338

This method uses point pairs to compute the affine transformation matrix.


  
    #include "pch.h"
    #include <iostream>
    #include <cv.h>
    #include <opencv2/opencv.hpp>

    using namespace cv;

    int main(int argc, char** argv)
    {
        printf("\nSolve equation: AX = b\n\n");

        // Pixel coordinates of three points
        Point2f srcTri[3];
        srcTri[0] = Point2f(480.8f, 639.4f);
        srcTri[1] = Point2f(227.1f, 317.5f);
        srcTri[2] = Point2f(292.4f, 781.6f);

        // Corresponding robot coordinates
        Point2f dstTri[3];
        dstTri[0] = Point2f(197170.0f, 185349.0f);
        dstTri[1] = Point2f(195830.0f, 186789.0f);
        dstTri[2] = Point2f(196174.0f, 184591.0f);

        // 2x3 affine transform computed from the three point pairs
        Mat warp_mat = getAffineTransform(srcTri, dstTri);

        // Print the point pairs and the resulting matrix
        std::cout << "img_corners:\n" << Mat(3, 1, CV_32FC2, srcTri) << std::endl;
        std::cout << "robot_corners:\n" << Mat(3, 1, CV_32FC2, dstTri) << std::endl;
        std::cout << warp_mat << std::endl;
        // Expected output:
        // [5.284835627018051, -0.00236967559639317, 194630.5662011061;
        //  0.4056177440315953, -4.793119669651512, 188218.6997054448]
        return 0;
    }

================= Eye-in-hand Data and Ground Truth =========================

Marker in Camera, eight samples; units are metres and radians, and the orientation is given as a rotation vector (RotVector).

0.11725,0.05736,0.32176,0.02860,-0.1078,3.06934
0.07959,0.06930,0.29829,0.66126,-0.2127,2.99042
0.14434,0.03227,0.40377,-0.9465,-0.1744,2.80668
0.11008,0.05605,0.35730,0.12422,-1.0665,2.87604
0.1131,0.0438171,0.299624,-0.058278,-0.514154,-3.06309
0.057039,0.109464,0.415275,0.414777,0.750109,-2.49495
0.13702,0.00520,0.38190,0.26431,-0.7554,2.26885
-0.03670,-0.04433,0.237292,0.807501,-0.27347,0.614594

Robot end-effector in Base, eight samples; units are metres and radians, and the orientation is given as a rotation vector (RotVector).

0.400422,-0.0262847,0.365594,3.12734,-0.0857978,-0.0168582
0.423347,-0.16857,0.340184,-2.73844,0.089013,0.123284
0.42540,0.19543,0.29062,2.44351,-0.1777,-0.0722
0.58088,-0.0541,0.32633,-2.9303,0.06957,0.98985
0.2760,-0.033,0.3301,3.0813,0.0724,0.6077
0.54011,0.09071,0.34623,2.62279,0.75876,-0.6927
0.57732,-0.1346,0.37990,-2.6764,1.04868,0.82177
0.27844,-0.1371,0.28799,-0.8233,2.26319,0.36110
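To feed these rotation-vector poses into the same Matlab pipeline, each row has to be converted to a 4x4 homogeneous matrix. A sketch using Robotics Toolbox functions (transl and r2t/angvec2r; the example row is the first Marker-in-Camera line above):

    % One row: x, y, z (m) followed by a rotation vector (rad)
    p = [0.11725 0.05736 0.32176 0.02860 -0.1078 3.06934];
    theta = norm(p(4:6));                    % rotation angle
    k = p(4:6) / theta;                      % unit rotation axis
    T = transl(p(1), p(2), p(3)) * r2t(angvec2r(theta, k));
    disp(T);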

Ground truth: Camera in end-effector

Calibration Result:

-0.005816946773 -0.997849042178 -0.065295115862  0.053457653403
 0.999355366046 -0.003487653357 -0.035730779845 -0.009463725064
 0.035426197714 -0.065460868457  0.997226082298 -0.045113271057
 0               0               0               1

 

RotVector(rad): -0.023440 -0.079412 1.57466

AngVector(rad): 1.576836 <-0.014865, -0.050362, 0.998620>

AngVector(deg): 90.346026 <-0.014865, -0.050362, 0.998620>

rx_ry_rz_rw: -0.010543 -0.079412 1.574660 0.704968

rx_ry_rz_rw_norm: -0.010543 -0.035718 0.708260 0.704968
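The different orientation readouts above can be reproduced from the rotation part of the Calibration Result; a sketch with the same Robotics Toolbox (Quaternion was used earlier, tr2angvec is another toolbox function):

    R = [-0.005816946773 -0.997849042178 -0.065295115862;
          0.999355366046 -0.003487653357 -0.035730779845;
          0.035426197714 -0.065460868457  0.997226082298];
    [theta, k] = tr2angvec(R);               % angle-axis, cf. AngVector above
    q = Quaternion(R);                       % unit quaternion
    fprintf('AngVector(rad): %f <%f, %f, %f>\n', theta, k(1), k(2), k(3));
    fprintf('rx_ry_rz_rw_norm: %f %f %f %f\n', q.v(1), q.v(2), q.v(3), q.s);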


Reposted from: https://blog.csdn.net/yaked/article/details/77161160