Tags: objective-c, avfoundation, ios, cmsamplebufferref
I need to get a UIImage from a CMSampleBufferRef containing uncompressed image data. I'm using this code:
[captureStillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                                      completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    // that famous function from Apple docs found on a lot of websites
    // does NOT work for still images
    UIImage *capturedImage = [self imageFromSampleBuffer:imageSampleBuffer];
}];
http://developer.apple.com/library/ios/#qa/qa1702/_index.html is the link for the imageFromSampleBuffer function.
But it doesn't work correctly. :(
There is a jpegStillImageNSDataRepresentation:imageSampleBuffer method, but it gives compressed data (well, because it's JPEG).
How can I create a UIImage from the rawest, uncompressed data after capturing a still image?
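For completeness, this is the compressed route I mean (a minimal sketch of what I would do inside the same completion handler):

// Compressed route: works, but hands back JPEG data, not raw pixels.
NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *capturedImage = [UIImage imageWithData:jpegData];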
Maybe I should specify some settings for the video output? I'm currently using these:
captureStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
captureStillImageOutput.outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
I've noticed that the output has a default value for AVVideoCodecKey, which is AVVideoCodecJPEG. Can it be avoided in any way, and does it even matter when capturing a still image?
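One quick way to check whether JPEG can be avoided at all is to ask the output what it supports (a diagnostic sketch; both properties are read-only on AVCaptureStillImageOutput):

NSLog(@"codecs:        %@", captureStillImageOutput.availableImageDataCodecTypes);
NSLog(@"pixel formats: %@", captureStillImageOutput.availableImageDataCVPixelFormatTypes);

If kCVPixelFormatType_32BGRA shows up in the pixel format list, an uncompressed request should be possible.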
I found something there: Raw image data from camera like "645 PRO", but I need just a UIImage, without using OpenCV or OGLES or other third parties.
The imageFromSampleBuffer method does in fact work, and I'm using a changed version of it, but if I remember correctly you need to set the outputSettings right. I think you need to set the key kCVPixelBufferPixelFormatTypeKey to the value kCVPixelFormatType_32BGRA.
For example:
NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
NSDictionary* outputSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[newStillImageOutput setOutputSettings:outputSettings];
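To verify the setting actually took effect, you can inspect what arrives in the completion handler (a sketch; the image buffer will be NULL if the output is still handing you compressed samples):

CVImageBufferRef buffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
if (buffer != NULL) {
    // kCVPixelFormatType_32BGRA is the value requested in outputSettings above
    OSType format = CVPixelBufferGetPixelFormatType(buffer);
    NSLog(@"got pixel format %u, expected %u", (unsigned)format, (unsigned)kCVPixelFormatType_32BGRA);
}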
EDIT:
I'm using those settings to take still images, not video. Is your sessionPreset AVCaptureSessionPresetPhoto? There may be problems with that:
AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];
[newCaptureSession setSessionPreset:AVCaptureSessionPresetPhoto];
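For context, here is roughly how that preset fits into a whole capture pipeline (a sketch with error handling trimmed; newStillImageOutput is the output configured above):

AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];
[newCaptureSession setSessionPreset:AVCaptureSessionPresetPhoto];

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
if (input && [newCaptureSession canAddInput:input]) {
    [newCaptureSession addInput:input];
}
if ([newCaptureSession canAddOutput:newStillImageOutput]) {
    [newCaptureSession addOutput:newStillImageOutput];
}
[newCaptureSession startRunning];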
EDIT 2:
The part about saving it to a UIImage is identical to the one from the documentation. That's why I was asking for other origins of the problem, but I guess that was just grasping at straws. There is another way I know of, but that requires OpenCV.
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}
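One more thought on why this could fail for your still images: if the output is still delivering JPEG samples (the default AVVideoCodecKey you noticed), CMSampleBufferGetImageBuffer returns NULL, and everything after it reads garbage. A small guard at the top of the function would make that visible (a sketch that falls back to the JPEG path):

CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
if (imageBuffer == NULL) {
    // No pixel buffer: the sample holds compressed data, not raw pixels.
    NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
    return [UIImage imageWithData:jpegData];
}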
I guess that won't help you, sorry. I don't know of any other origins for your problem.