Problem description
Hi,
I am facing a strange issue with OpenCV image drawing in NSImageView controls using Objective-C.
The code below is the actual snippet that reproduces the issue (just substitute a real path for the image):
I create two Mat objects: one by reading an image file, and a second one of the same size, but black.
A timer is supposed to update both images in their respective NSImageView controls at the same time.
Drawing is done by converting the file image to RGB, as OpenCV images are stored in BGR order.
Hence, the program is supposed to display the file image in the first control, and the black image in the second one, each time the timer is triggered.
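(As an aside, a minimal illustration of the channel-order point above; the pixel value is made up for the example.)
// OpenCV stores channels in B,G,R order: a pure blue pixel is {255, 0, 0}.
cv::Vec3b pixel(255, 0, 0);   // blue in BGR
// Handed to an RGB consumer without cvtColor, the same three bytes are
// read as {R=255, G=0, B=0}, i.e. red -- hence the swapped colors
// mentioned further below.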
For testing purposes, two options are defined:
SYNCHRONIZE: both drawings are done on the same timer trigger. If not defined, the controls are refreshed one after the other, on two successive timer triggers.
USE_CONVERT: converts the original image from BGR to RGB and draws the converted Mat image. If not defined, the original Mat image is used for drawing.
My goal is to have USE_CONVERT and SYNCHRONIZE activated.
The issue is that, in that case, the file image is displayed in both controls, whereas I would like the file image in one control and the black image in the other.
Moreover, if USE_CONVERT is defined but SYNCHRONIZE is not, it works correctly!
As soon as I do not use cvtColor (USE_CONVERT not defined), I actually get the black frame in the second control and the file image in the first one, but of course the file image is displayed with the wrong colors.
It looks like some object is still alive (and reused) when the two calls happen back to back, while it is properly released when they happen on separate timer ticks.
Can someone explain to me what is wrong with this implementation?
Thanks in advance.
#import "AppDelegate.h"
#import <opencv2/opencv.hpp>
using namespace cv;
#define USE_CONVERT
#define SYNCHRONIZE
@implementation AppDelegate
NSTimer *timer = nil;
Mat matFrame1;
Mat matFrame2;
bool first = false;
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
    // Insert code here to initialize your application
    matFrame1 = imread("/Users/..../image.bmp");
    matFrame2 = Mat::zeros(matFrame1.rows, matFrame1.cols, CV_8UC3);
    timer = [NSTimer scheduledTimerWithTimeInterval:.1 target:self selector:@selector(timerGUI) userInfo:nil repeats:YES];
}
-(void)timerGUI
{
    NSLog(@"TimerGUI");
#ifndef SYNCHRONIZE
    if (first)
#endif
        [self drawImage : matFrame2 : self.imageView2];
#ifndef SYNCHRONIZE
    else
#endif
        [self drawImage : matFrame1 : self.imageView1];
    first = first ? false : true;
}
- (void)drawImage : (Mat)matImage : (NSImageView *)View
{
    NSImage *img = nil;
    NSBitmapImageRep *bitmapRep = nil;
#ifdef USE_CONVERT
    Mat dispImage;
    cvtColor(matImage, dispImage, CV_BGR2RGB);
    bitmapRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:&dispImage.data
                                                        pixelsWide:dispImage.cols
                                                        pixelsHigh:dispImage.rows
                                                     bitsPerSample:8
                                                   samplesPerPixel:3
                                                          hasAlpha:NO
                                                          isPlanar:NO
                                                    colorSpaceName:NSCalibratedRGBColorSpace
                                                       bytesPerRow:dispImage.step
                                                      bitsPerPixel:0];
    img = [[NSImage alloc] initWithSize:NSMakeSize(dispImage.cols, dispImage.rows)];
#else
    bitmapRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:&matImage.data
                                                        pixelsWide:matImage.cols
                                                        pixelsHigh:matImage.rows
                                                     bitsPerSample:8
                                                   samplesPerPixel:3
                                                          hasAlpha:NO
                                                          isPlanar:NO
                                                    colorSpaceName:NSCalibratedRGBColorSpace
                                                       bytesPerRow:matImage.step
                                                      bitsPerPixel:0];
    img = [[NSImage alloc] initWithSize:NSMakeSize(matImage.cols, matImage.rows)];
#endif
    [img addRepresentation:bitmapRep];
    [View setImage:img];
#ifdef USE_CONVERT
    dispImage.release();
#endif
    bitmapRep = nil;
    img = nil;
}
@end
// AppDelegate.h
#import <Cocoa/Cocoa.h>
@interface AppDelegate : NSObject <NSApplicationDelegate>
@property (assign) IBOutlet NSWindow *window;
@property (weak) IBOutlet NSImageView *imageView1;
@property (weak) IBOutlet NSImageView *imageView2;
@end
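Editor's note: one plausible reading of the symptom, offered as an assumption rather than a confirmed diagnosis. Apple documents that initWithBitmapDataPlanes: does not copy a supplied buffer, so the NSBitmapImageRep keeps pointing into dispImage.data, which drawImage then frees via dispImage.release() before the view has necessarily drawn. Below is a minimal sketch of a variant that keeps the converted pixels alive between timer ticks; the names keepAlive1/keepAlive2 and the extra Mat& parameter are hypothetical, and the method would replace drawImage inside the @implementation above.

// Hypothetical persistent buffers, one per view, so the converted pixels
// outlive the drawImage call (the rep references them without copying).
static Mat keepAlive1, keepAlive2;

- (void)drawImage:(Mat)matImage :(NSImageView *)View :(Mat &)storage
{
    cvtColor(matImage, storage, CV_BGR2RGB);   // storage is NOT released here
    NSBitmapImageRep *bitmapRep =
        [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:&storage.data
                                                pixelsWide:storage.cols
                                                pixelsHigh:storage.rows
                                             bitsPerSample:8
                                           samplesPerPixel:3
                                                  hasAlpha:NO
                                                  isPlanar:NO
                                            colorSpaceName:NSCalibratedRGBColorSpace
                                               bytesPerRow:storage.step
                                              bitsPerPixel:0];
    NSImage *img = [[NSImage alloc] initWithSize:NSMakeSize(storage.cols, storage.rows)];
    [img addRepresentation:bitmapRep];
    [View setImage:img];
}

// Called from timerGUI as, e.g.:
//   [self drawImage:matFrame1 :self.imageView1 :keepAlive1];
//   [self drawImage:matFrame2 :self.imageView2 :keepAlive2];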
Hi,
I am still facing the issue, but have managed to simplify it with the code below.
Although it produces other side effects, they seem related to the same problem.
The code below is supposed to display an OpenCV image (matFrame) in an NSImageView control and then allocate another Mat (matAux) right after.
The issue is that writing into the newly allocated Mat (matAux) overwrites matFrame, as the displayed frame is no longer the loaded one. Moreover, the display is not green but blue, hence a channel mismatch...
It looks like the newly allocated matAux data space is overlapping with matFrame, where it should not, unless I missed something.
Things I have checked:
- Removing the data initialization of matAux (setTo, or whatever other writing method) makes the problem disappear
- Using copyTo instead of cvCreateMat works. But this is not my purpose, as the new matAux has nothing to do with matFrame (see the editor's note after the snippet below)
- Allocating matAux as a smaller Mat partly fills matFrame with blue
- The code is called when pushing a button. If the matAux creation and initialization is triggered from a separate button, it works...
- If matAux is created at application start (in applicationDidFinishLaunching), it works. But this is also not my purpose...
Any help, or pointing out a Mac OS / OpenCV beginner's stupid mistake, would really be appreciated.
Thanks in advance.
@implementation AppDelegate
Mat matFrame;
Mat matAux;
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
    // Insert code here to initialize your application
}

- (IBAction)go:(id)sender {
    // Load the image to be displayed
    matFrame = imread("/Users/.../image.bmp");
    // Display the image in the control
    NSImage *img = nil;
    NSBitmapImageRep *bitmapRep = nil;
    Mat dispImage;
    cvtColor(matFrame, dispImage, CV_BGR2RGB);
    bitmapRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:&dispImage.data
                                                        pixelsWide:dispImage.cols
                                                        pixelsHigh:dispImage.rows
                                                     bitsPerSample:8
                                                   samplesPerPixel:3
                                                          hasAlpha:NO
                                                          isPlanar:NO
                                                    colorSpaceName:NSCalibratedRGBColorSpace
                                                       bytesPerRow:dispImage.step
                                                      bitsPerPixel:0];
    img = [[NSImage alloc] initWithSize:NSMakeSize(dispImage.cols, dispImage.rows)];
    [img addRepresentation:bitmapRep];
    [_imageView setImage:img];
    dispImage.release();
    bitmapRep = nil;
    img = nil;
    // Allocate another Mat
    matAux = cvCreateMat(matFrame.rows, matFrame.cols, CV_8UC3);
    // Fill the Mat with green
    matAux.setTo(Scalar(0,255,0));
}
Recommended answer
// Display image in control
Mat dispImage;
cvtColor(matImage, dispImage, CV_BGR2RGB);
// Copy the pixels into an NSData object (dataWithBytes:length: makes a copy)
NSData *data = [NSData dataWithBytes:dispImage.data length:dispImage.elemSize() * dispImage.total()];
CGColorSpaceRef colorSpace;
if (dispImage.elemSize() == 1) {
    colorSpace = CGColorSpaceCreateDeviceGray();
} else {
    colorSpace = CGColorSpaceCreateDeviceRGB();
}
CGDataProviderRef provider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data); // __bridge needed when compiling under ARC
// Creating CGImage from cv::Mat
CGImageRef imageRef = CGImageCreate(dispImage.cols,           // width
                                    dispImage.rows,           // height
                                    8,                        // bits per component
                                    8 * dispImage.elemSize(), // bits per pixel
                                    dispImage.step.p[0],      // bytesPerRow
                                    colorSpace,               // colorspace
                                    kCGImageAlphaNone | kCGBitmapByteOrderDefault, // bitmap info
                                    provider,                 // CGDataProviderRef
                                    NULL,                     // decode
                                    false,                    // should interpolate
                                    kCGRenderingIntentDefault // intent
                                    );
NSImage *img = [[NSImage alloc] initWithCGImage:imageRef size:NSZeroSize];
CGImageRelease(imageRef);
CGDataProviderRelease(provider);
CGColorSpaceRelease(colorSpace);
[View setImage:img];
img = nil;
This of course does not explain the reason for the overwrite, but at least it gives a workaround.
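A plausible explanation for the overwrite, offered here as an editor's assumption rather than a confirmed diagnosis: dataWithBytes:length: copies the pixels, whereas initWithBitmapDataPlanes: is documented not to, so the original code left the view holding a pointer into a buffer that dispImage.release() had already freed. The very next OpenCV allocation of the same size (cvtColor's output, or cvCreateMat) can then be handed that same block back by the allocator, which would produce exactly the observed aliasing. A minimal sketch of the mechanism (the reuse is allocator-dependent, not guaranteed):

// Whether the freed block is recycled is up to the allocator; this only
// illustrates why two logically distinct Mats can end up aliased.
Mat a(480, 640, CV_8UC3);   // hypothetical size
uchar *old = a.data;        // remember where the pixels lived
a.release();                // frees the pixel buffer
Mat b(480, 640, CV_8UC3);   // same size: the allocator may recycle the block
if (b.data == old) {
    // Anything still pointing at the old buffer (e.g. an NSBitmapImageRep
    // created with initWithBitmapDataPlanes:) now reads b's contents.
}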