Posts by jun*_*cat

Getting an NSView's frame/bounds relative to the screen on Mac OS X 10.6

I need to get an NSView's frame/bounds relative to the screen. In other words, I need the x and y coordinates to be the position on the screen, not the position relative to the view's superview.

Based on the comments, I came up with the following solution.

// Step 1: the view's bounds expressed in its window's coordinate space.
NSRect frameRelativeToWindow = [self.view
    convertRect:self.view.bounds toView:nil
];

// Step 2: window coordinates -> screen coordinates.
// convertRectToScreen: exists from 10.7 on; convertBaseToScreen: is the
// pre-10.7 equivalent (deprecated in 10.7).
#if MAC_OS_X_VERSION_MAX_ALLOWED > MAC_OS_X_VERSION_10_6
NSPoint pointRelativeToScreen = [self.view.window
    convertRectToScreen:frameRelativeToWindow
].origin;
#else
NSPoint pointRelativeToScreen = [self.view.window
    convertBaseToScreen:frameRelativeToWindow.origin
];
#endif

// The view's own frame, with its origin replaced by the screen position.
NSRect frame = self.view.frame;
frame.origin.x = pointRelativeToScreen.x;
frame.origin.y = pointRelativeToScreen.y;
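For reference only (not part of the original post): the same conversion can be done with a runtime check instead of the compile-time #if, assuming the project is built against the 10.7 or later SDK with a 10.6 deployment target, so that a single binary runs on both OS versions:

// Sketch: choose the conversion at runtime rather than at compile time.
NSRect rectInWindow = [self.view convertRect:self.view.bounds toView:nil];
NSPoint originOnScreen;

if ([self.view.window respondsToSelector:@selector(convertRectToScreen:)]) {
    // 10.7 and later
    originOnScreen = [self.view.window convertRectToScreen:rectInWindow].origin;
} else {
    // 10.6, where only convertBaseToScreen: is available
    originOnScreen = [self.view.window convertBaseToScreen:rectInWindow.origin];
}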

cocoa

12 votes · 1 answer · 9955 views

initWithCVPixelBuffer failed because the CVPixelBufferRef is not non-IOSurface backed

I am receiving YUV frames (kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange), and when I create a CIImage from the CVPixelBufferRef I get:

initWithCVPixelBuffer failed because the CVPixelBufferRef is not non-IOSurface backed.

// width, height, data and bytesPerRow describe the incoming YUV frame.
CVPixelBufferRef pixelBuffer = NULL;

// Per-plane geometry for the bi-planar format (not used by the
// CVPixelBufferCreateWithBytes call below).
size_t planeWidth[] = { width, width / 2 };
size_t planeHeight[] = { height, height / 2 };
size_t planeBytesPerRow[] = { width, width / 2 };

CVReturn ret = CVPixelBufferCreateWithBytes(
    kCFAllocatorDefault, width, height, kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
    data, bytesPerRow, NULL, NULL, NULL, &pixelBuffer
);

if (ret != kCVReturnSuccess)
{
    NSLog(@"FAILED");

    CVPixelBufferRelease(pixelBuffer);

    return;
}

CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// Fails with the error quoted above.
CIImage * image = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer];

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

CVPixelBufferRelease(pixelBuffer);

[image release];
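Not part of the original post, but for context: pixel buffers wrapped around caller-supplied memory (CVPixelBufferCreateWithBytes / CVPixelBufferCreateWithPlanarBytes) are not IOSurface backed, which appears to be what the error is complaining about. A minimal sketch of the usual workaround, assuming the same width, height, data and bytesPerRow as above and that data holds the Y plane immediately followed by the interleaved CbCr plane (both with bytesPerRow bytes per row), is to let Core Video allocate an IOSurface-backed buffer and copy the planes into it:

#import <CoreImage/CoreImage.h>
#import <CoreVideo/CoreVideo.h>

// Ask Core Video for an IOSurface-backed buffer of the same format.
NSDictionary *attrs = @{ (NSString *)kCVPixelBufferIOSurfacePropertiesKey : @{} };
CVPixelBufferRef surfaceBuffer = NULL;
CVReturn ret = CVPixelBufferCreate(
    kCFAllocatorDefault, width, height, kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
    (CFDictionaryRef)attrs, &surfaceBuffer
);
if (ret != kCVReturnSuccess)
{
    return;
}

CVPixelBufferLockBaseAddress(surfaceBuffer, 0);

// Copy the Y plane, then the CbCr plane, honouring the destination stride.
const uint8_t *src = (const uint8_t *)data;
size_t srcRows[] = { height, height / 2 };
for (size_t plane = 0; plane < 2; plane++)
{
    uint8_t *dst = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(surfaceBuffer, plane);
    size_t dstBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(surfaceBuffer, plane);
    for (size_t row = 0; row < srcRows[plane]; row++)
    {
        memcpy(dst + row * dstBytesPerRow, src + row * bytesPerRow, bytesPerRow);
    }
    src += bytesPerRow * srcRows[plane];
}

CVPixelBufferUnlockBaseAddress(surfaceBuffer, 0);

// This buffer is IOSurface backed, so initWithCVPixelBuffer: should accept it.
CIImage *image = [[CIImage alloc] initWithCVPixelBuffer:surfaceBuffer];

CVPixelBufferRelease(surfaceBuffer);
[image release];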

core-image ios

0 votes · 1 answer · 3481 views

Tag statistics

cocoa ×1

core-image ×1

ios ×1