Problem description
I am trying to do something similar to this, but on Android: http://docs.opencv.org/doc/tutorials/imgproc/imgtrans/warp_affine/warp_affine.html

I have an image drawn to an ImageView and I want to be able to select 6 points (3 source and 3 target) and use OpenCV's warpAffine method like in the example above. I'm able to reproduce that example, but when I pass the X,Y coordinates from the onTouch event, it doesn't give me what I expect. I think the problem is how to convert the Android touch coordinates to OpenCV Mat columns/rows. Maybe I need to display the image on an OpenCV surface?

I hope I was clear. Thanks in advance.
Recommended answer
The touch coordinates refer to a point in your Android device's screen resolution (the Samsung Galaxy S3 is 720x1280). On the other hand, the OpenCV image could be larger or smaller than that, which means a touch coordinate can't be mapped directly to an image coordinate.
What needs to be done to convert from one resolution to the other is to rescale the touch coordinates from the 720x1280 range to the 640x480 range of the image. Therefore, to find the real coordinates on the image you would do:
real_x = (touch_x * 640) / 720;
real_y = (touch_y * 480) / 1280;
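For reference, here is a minimal Java sketch of how this scaling can be wired into an onTouch listener. It assumes the listener is attached to the ImageView itself (so getX()/getY() are already relative to the view rather than the whole screen) and that the image fills the view (e.g. android:scaleType="fitXY"); the class and field names (TouchToMatMapper, srcMat) are made up for this example.

```java
import android.view.MotionEvent;
import android.view.View;
import android.widget.ImageView;

import org.opencv.core.Mat;
import org.opencv.core.Point;

import java.util.ArrayList;
import java.util.List;

// Sketch: collect touch points on an ImageView as OpenCV Mat coordinates.
// Assumes the ImageView shows srcMat scaled to fill the whole view.
public class TouchToMatMapper implements View.OnTouchListener {

    private final Mat srcMat;                        // the image you will warp
    private final List<Point> points = new ArrayList<>();

    public TouchToMatMapper(ImageView imageView, Mat srcMat) {
        this.srcMat = srcMat;
        imageView.setOnTouchListener(this);
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            // Touch coordinates are in view pixels, relative to the view.
            float touchX = event.getX();
            float touchY = event.getY();

            // Same idea as the formulas above: scale from the view's size
            // to the Mat's size (x maps to columns, y maps to rows).
            double realX = touchX * srcMat.cols() / (double) v.getWidth();
            double realY = touchY * srcMat.rows() / (double) v.getHeight();

            points.add(new Point(realX, realY));
        }
        return true;
    }

    public List<Point> getPoints() {
        return points;
    }
}
```

Once 3 source and 3 target points have been collected this way, they can be packed into MatOfPoint2f objects and passed to Imgproc.getAffineTransform and Imgproc.warpAffine, as in the tutorial linked in the question.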