Hi,
I am facing a strange issue when drawing OpenCV images into NSImageView controls from Objective-C++.
The code below is a snippet that reproduces the issue (just set a real path for the image):
I create two Mat objects: the first by reading an image file, the second of the same size but filled with black.
A timer is supposed to update both images in their respective NSImageView controls at the same time.
Drawing is done by converting the file image to RGB, since OpenCV stores images in BGR order.
Hence, the program is supposed to display the file image in the first control and the black image in the second one each time the timer fires.
For testing purposes, two options are defined:
SYNCHRONIZE: both drawings are done on the same timer trigger. If not defined, the controls are refreshed one after the other, on two successive timer triggers.
USE_CONVERT: converts the original image from BGR to RGB and draws the converted Mat. If not defined, the original Mat is drawn directly.
My goal is to have both USE_CONVERT and SYNCHRONIZE activated.
The issue is that, in that case, the file image is displayed in both controls, whereas I would expect the file image in one control and the black image in the other.
Moreover, if USE_CONVERT is defined but SYNCHRONIZE is not, it works correctly!
As soon as I do not use cvtColor (USE_CONVERT not defined), I actually get the black frame in the second control and the file image in the first one, although of course the file image then has the wrong colors.
It looks like some object is still alive (and reused) when the calls happen back to back, but properly released when they happen on separate timer ticks.
Can someone explain what is wrong with this implementation?
Thanks in advance.
// AppDelegate.mm
#import "AppDelegate.h"
#import <opencv2/opencv.hpp>

using namespace cv;

#define USE_CONVERT
#define SYNCHRONIZE

@implementation AppDelegate

NSTimer *timer = nil;
Mat matFrame1;
Mat matFrame2;
bool first = false;

- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
    matFrame1 = imread("/Users/..../image.bmp");
    matFrame2 = Mat::zeros(matFrame1.rows, matFrame1.cols, CV_8UC3);
    timer = [NSTimer scheduledTimerWithTimeInterval:.1
                                             target:self
                                           selector:@selector(timerGUI)
                                           userInfo:nil
                                            repeats:YES];
}

- (void)timerGUI
{
    NSLog(@"TimerGUI");
#ifndef SYNCHRONIZE
    if (first)
#endif
        [self drawImage:matFrame2 :self.imageView2];
#ifndef SYNCHRONIZE
    else
#endif
        [self drawImage:matFrame1 :self.imageView1];
    first = !first;
}

- (void)drawImage:(Mat)matImage :(NSImageView *)View
{
    NSImage *img = nil;
    NSBitmapImageRep *bitmapRep = nil;
#ifdef USE_CONVERT
    Mat dispImage;
    cvtColor(matImage, dispImage, CV_BGR2RGB);
    bitmapRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:&dispImage.data
                                                        pixelsWide:dispImage.cols
                                                        pixelsHigh:dispImage.rows
                                                     bitsPerSample:8
                                                   samplesPerPixel:3
                                                          hasAlpha:NO
                                                          isPlanar:NO
                                                    colorSpaceName:NSCalibratedRGBColorSpace
                                                       bytesPerRow:dispImage.step
                                                      bitsPerPixel:0];
    img = [[NSImage alloc] initWithSize:NSMakeSize(dispImage.cols, dispImage.rows)];
#else
    bitmapRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:&matImage.data
                                                        pixelsWide:matImage.cols
                                                        pixelsHigh:matImage.rows
                                                     bitsPerSample:8
                                                   samplesPerPixel:3
                                                          hasAlpha:NO
                                                          isPlanar:NO
                                                    colorSpaceName:NSCalibratedRGBColorSpace
                                                       bytesPerRow:matImage.step
                                                      bitsPerPixel:0];
    img = [[NSImage alloc] initWithSize:NSMakeSize(matImage.cols, matImage.rows)];
#endif
    [img addRepresentation:bitmapRep];
    [View setImage:img];
#ifdef USE_CONVERT
    dispImage.release();
#endif
    bitmapRep = nil;
    img = nil;
}

@end

// AppDelegate.h
#import <Cocoa/Cocoa.h>

@interface AppDelegate : NSObject <NSApplicationDelegate>

@property (assign) IBOutlet NSWindow *window;
@property (weak) IBOutlet NSImageView *imageView1;
@property (weak) IBOutlet NSImageView *imageView2;

@end
Hi,
I am still facing the issue, but was able to simplify it with the code below.
Although it produces other side effects, they seem related to the same problem.
The code below is supposed to display an OpenCV image (matFrame) in an NSImageView control and allocate another Mat (matAux) right after.
The issue is that writing into the newly allocated Mat (matAux) overwrites matFrame, since the displayed frame is no longer the loaded one. Moreover, the display is not green but blue, hence a channel mismatch ...
It looks like matAux's newly allocated data space overlaps matFrame's, which should not happen unless I missed something.
Things I have checked :
- Removing the data initialization of matAux (setTo, or any other write method) makes the problem disappear
- Using copyTo instead of cvCreateMat works. But that is not my purpose, as the new matAux has nothing to do with matFrame
- Allocating matAux as a smaller Mat partly fills matFrame with blue
- The code is called from a button press. If matAux creation and initialization is triggered from a separate button, it works ...
- If matAux is created at application start (in applicationDidFinishLaunching), it works. But this is also not my purpose ...
Any help or eye-opening hint about a macOS / OpenCV beginner's mistake would really be appreciated.
Thanks in advance.
#import "AppDelegate.h"
#import <opencv2/opencv.hpp>

using namespace cv;

@implementation AppDelegate

Mat matFrame;
Mat matAux;

- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
}

- (IBAction)go:(id)sender
{
    matFrame = imread("/Users/.../image.bmp");

    NSImage *img = nil;
    NSBitmapImageRep *bitmapRep = nil;
    Mat dispImage;
    cvtColor(matFrame, dispImage, CV_BGR2RGB);
    bitmapRep = [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:&dispImage.data
                                                        pixelsWide:dispImage.cols
                                                        pixelsHigh:dispImage.rows
                                                     bitsPerSample:8
                                                   samplesPerPixel:3
                                                          hasAlpha:NO
                                                          isPlanar:NO
                                                    colorSpaceName:NSCalibratedRGBColorSpace
                                                       bytesPerRow:dispImage.step
                                                      bitsPerPixel:0];
    img = [[NSImage alloc] initWithSize:NSMakeSize(dispImage.cols, dispImage.rows)];
    [img addRepresentation:bitmapRep];
    [_imageView setImage:img];
    dispImage.release();
    bitmapRep = nil;
    img = nil;

    matAux = cvCreateMat(matFrame.rows, matFrame.cols, CV_8UC3);
    matAux.setTo(Scalar(0, 255, 0));
}

@end