How to map x,y pixels to world coordinates

This article explains how to map x,y pixels to world coordinates; hopefully it is a useful reference for anyone facing the same problem.

Question

In my project I am calculating x, y in pixels and Z, the distance from the camera, in mm. So I want to make use of the depth and calculate x, y, z.

Can anyone tell me how I can do that?

I have the following information:

  1. x, y pixel coordinates obtained from the image
  2. Distance from the camera to the object (this distance keeps varying, since I am taking images from different distances)

Answer

If you have x, y in image coordinates, a camera matrix, and z in world coordinates, then you need to put your image coordinates into a homogeneous vector, multiply by the inverse of the camera matrix, and then multiply by your z_world coordinate. Something that might not be intuitive at first is that your units in world coordinates do not matter. After multiplying by the inverse of the camera matrix you have defined the ratio x/z, which is unitless. You give the result units by multiplying by z_world. You can measure it in mm, inches, miles, whatever, and your resulting vector will have the same units.

cv::Matx31f world_cord(x_im,y_im,1);         //here measured in pixels
world_cord = camera_matrix.inv()*world_cord; //representing a ratio x/z,y/z
world_cord *= z_world;                       //now x,y,z are measured in units of z_world

world_cord now contains x_world, y_world, z_world.
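For reference, below is a minimal self-contained sketch of the same computation, assuming a simple pinhole camera matrix. The intrinsic values (fx, fy, cx, cy), the pixel coordinates and the depth are made-up numbers for illustration only and should be replaced with your calibrated camera matrix and measured values.

// Minimal self-contained sketch (assumed pinhole intrinsics, illustrative values only).
#include <opencv2/core.hpp>
#include <iostream>

int main()
{
    // Hypothetical intrinsics -- replace with your calibrated camera matrix.
    const float fx = 800.0f, fy = 800.0f, cx = 320.0f, cy = 240.0f;
    cv::Matx33f camera_matrix(fx, 0,  cx,
                              0,  fy, cy,
                              0,  0,  1);

    const float x_im = 400.0f, y_im = 300.0f; // pixel coordinates from the image
    const float z_world = 1500.0f;            // measured depth, here in mm

    // Homogeneous pixel vector -> unitless ratios (x/z, y/z, 1) -> scaled by the depth.
    cv::Matx31f world_cord(x_im, y_im, 1.0f);
    world_cord = camera_matrix.inv() * world_cord;
    world_cord *= z_world;

    // For a pinhole matrix this is equivalent to:
    //   x_world = z_world * (x_im - cx) / fx
    //   y_world = z_world * (y_im - cy) / fy
    std::cout << "x: " << world_cord(0)
              << "  y: " << world_cord(1)
              << "  z: " << world_cord(2) << std::endl;
    return 0;
}

With these example numbers the output is x: 150, y: 112.5, z: 1500, all in the same unit as z_world (mm here).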

This concludes the article on how to map x,y pixels to world coordinates; hopefully the recommended answer is helpful.
