How can I convert an iOS camera image to grayscale using the Accelerate framework?

Bre*_*ett 6 iphone image-processing objective-c accelerate-framework vimage

This seems like it should be simpler than I'm finding it to be.

AVFoundation delivers a frame in the standard delegate method:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection

This is where I'd like to use the Accelerate framework to convert the frame to grayscale.

The framework has a collection of conversion functions, including vImageConvert_RGBA8888toPlanar8(), which looks like it might be what I'm after; however, I can't find any examples of how to use them!

So far, I have this code:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{

      @autoreleasepool {
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            /*Lock the image buffer*/
            CVPixelBufferLockBaseAddress(imageBuffer,0);
            /*Get information about the image*/
            uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
            size_t width = CVPixelBufferGetWidth(imageBuffer);
            size_t height = CVPixelBufferGetHeight(imageBuffer);
            size_t stride = CVPixelBufferGetBytesPerRow(imageBuffer);

            // vImage In
            Pixel_8 *bitmap = (Pixel_8 *)malloc(width * height * sizeof(Pixel_8));
            const vImage_Buffer inImage = { bitmap, height, width, stride };

            //How can I take this inImage and convert it to greyscale?????
            //vImageConvert_RGBA8888toPlanar8()??? Is the correct starting format here??
      }    
}

So I have two questions: (1) In the code above, is RGBA8888 the correct starting format? (2) What is the right Accelerate framework call to actually convert to grayscale?

Tar*_*ark 5

There's a much simpler option here. If you change the camera capture format to YUV, you already have a grayscale frame (the Y plane) that you can use directly. When setting up your data output, use something like:

dataOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };

Then you can access the Y plane in your capture callback with:

CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
uint8_t *yPlane = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);

// ... do stuff with your greyscale camera image ...

CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);