I'm using Apple's AVCaptureSession sample code, and the UIImage it creates is completely blank. This only happens on the iPhone 3G, and a distinctive error shows up in the console:
ERROR: CGDataProviderCreateWithCopyOfData: vm_copy failed: status 2.
I've researched the error online and found a StackOverflow answer that gets rid of it... but the image is still blank.
Has anyone else run into this and figured out how to fix it?
Thanks in advance.
My code:
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
CGImageRef quartzImage = CGBitmapContextCreateImage(context);
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);
UIImage *image = [UIImage imageWithCGImage:quartzImage];
CGImageRelease(quartzImage);
return image;
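(Not part of the original question, but a common cause of this symptom: on older devices the video data output defaults to a biplanar YUV pixel format, so the base address is not the packed BGRA data the CGBitmapContext code above expects, and the rendered image comes out blank. A sketch of the usual fix, assuming `output` is the AVCaptureVideoDataOutput already added to the session:)

```objc
// Force packed 32-bit BGRA output so CVPixelBufferGetBaseAddress()
// returns data in the layout the bitmap-context code above assumes.
output.videoSettings = [NSDictionary
    dictionaryWithObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
                  forKey:(id)kCVPixelBufferPixelFormatTypeKey];
```

Set this before calling `startRunning` on the session.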
I'm using ZXing in an app; this is mostly the same code as the original ZXing sample, except that I allow several consecutive scans (i.e., the ZXingWidgetController is not dismissed as soon as something is detected).
When I press the dismiss button, I experience a very long freeze (sometimes it never ends). The button calls:
- (void)cancelled {
// if (!self.isStatusBarHidden) {
// [[UIApplication sharedApplication] setStatusBarHidden:NO];
// }
[self stopCapture];
wasCancelled = YES;
if (delegate != nil) {
[delegate zxingControllerDidCancel:self];
}
}
with:
- (void)stopCapture {
decoding = NO;
#if HAS_AVFF
if([captureSession isRunning])[captureSession stopRunning];
AVCaptureInput* input = [captureSession.inputs objectAtIndex:0];
[captureSession removeInput:input];
AVCaptureVideoDataOutput* output = (AVCaptureVideoDataOutput*)[captureSession.outputs objectAtIndex:0];
[captureSession removeOutput:output];
[self.prevLayer removeFromSuperlayer];
/*
// heebee jeebees here ... is iOS still writing into the layer?
if (self.prevLayer) {
layer.session = nil;
AVCaptureVideoPreviewLayer* layer = prevLayer; …

How can I take a picture from the camera while the iOS app is minimized?
(i.e., after applicationDidEnterBackground: / applicationWillResignActive:)
AppDelegate.m (thanks to this link):
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
//To make the code block asynchronous
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
//### background task starts
NSLog(@"Running in the background\n");
while(TRUE)
{
printf("Called"); // works fine
[self.window.rootViewController captureNow]; // capture a picture
[NSThread sleepForTimeInterval: 10.0]; // wait 10 seconds
}
});
return YES;
}
OurViewController.m (thanks to this link):
-(IBAction)captureNow {
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in _stillImageOutput.connections)
{
for (AVCaptureInputPort *port in [connection inputPorts])
{
if ([[port …

I'm using AVCaptureSession to take a photo and save it to the photo album. When I tap the button it takes a snapshot and stores it in the album. But when I'm in landscape mode and then tap the button, the stored still image comes out upside down.
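(A sketch of a likely fix, not from the original post: set the videoOrientation on the still-image connection to match the device orientation just before capturing, so the saved photo is rotated correctly in landscape. Assumes `stillImageOutput` is the AVCaptureStillImageOutput used in the code below.)

```objc
AVCaptureConnection *connection =
    [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
if ([connection isVideoOrientationSupported]) {
    // Map UIDeviceOrientation to AVCaptureVideoOrientation; note that
    // landscape left/right are swapped between the two enums.
    UIDeviceOrientation deviceOrientation = [[UIDevice currentDevice] orientation];
    AVCaptureVideoOrientation videoOrientation = AVCaptureVideoOrientationPortrait;
    switch (deviceOrientation) {
        case UIDeviceOrientationLandscapeLeft:
            videoOrientation = AVCaptureVideoOrientationLandscapeRight; break;
        case UIDeviceOrientationLandscapeRight:
            videoOrientation = AVCaptureVideoOrientationLandscapeLeft; break;
        case UIDeviceOrientationPortraitUpsideDown:
            videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown; break;
        default: break;
    }
    connection.videoOrientation = videoOrientation;
}
```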

Code:
- (void)viewDidLoad
{
[super viewDidLoad];
// Do any additional setup after loading the view.
[self setCaptureSession:[[AVCaptureSession alloc] init]];
[self addVideoInputFrontCamera:NO]; // set to YES for Front Camera, No for Back camera
[self addStillImageOutput];
[self setPreviewLayer:[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]] ];
[[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
CGRect layerRect = [[[self view] layer] bounds];
[[self previewLayer]setBounds:layerRect];
[[self previewLayer] setPosition:CGPointMake(CGRectGetMidX(layerRect),CGRectGetMidY(layerRect))];
[[[self view] layer] addSublayer:[self previewLayer]];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(saveImageToPhotoAlbum) name:kImageCapturedSuccessfully object:nil];
[[self captureSession] startRunning];
camera=[UIButton buttonWithType:UIButtonTypeCustom];
[camera setImage:[UIImage imageNamed:@"button.png"] forState:UIControlStateNormal];
[camera setFrame:CGRectMake(150, 10, 40, …

I'm trying to implement video recording with GPUImage on iOS.
I have a GPUImageVideoCamera and I'm trying to set up recording like this:
-(void)toggleVideoRecording {
if (!self.recording) {
self.recording = true;
GPUImageMovieWriter * mw = [self createMovieWriter: self.filter1];
[self.filter1 addTarget:mw];
mw.shouldPassthroughAudio = YES;
[mw startRecording];
}
else {
self.recording = false;
[self finishWritingMovie];
}
}
-(GPUImageMovieWriter *) createMovieWriter:(GPUImageFilter *) forFilter {
NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/CurrentMovie.m4v"];
unlink([pathToMovie UTF8String]);
NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
GPUImageMovieWriter * mw = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:forFilter->currentFilterSize];
self.movieWriter = mw;
self.movieURL = movieURL;
return mw;
}
This actually mostly works. However, the MovieWriter needs the CGSize of the input media, and I don't know how to pull it out of the VideoCamera. First, the size will vary because the camera uses sessionPreset:AVCaptureSessionPresetHigh, which means it changes with the device's capabilities. Second, orientation changes the relevant CGSize: if the user starts recording in landscape mode, I'd like to transpose the dimensions.
For anyone who knows how to do this with vanilla CoreVideo: I can pull the AVCaptureSession out of my GPUImageVideoCamera. (But I've read the AVCaptureSession API, and nothing looks promising.)
Even pointing me to a sample project that handles this would be very helpful.
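(A sketch of one way to read the resolved dimensions, under the assumption that you can reach the underlying AVCaptureDevice, e.g. via GPUImageVideoCamera's `inputCamera` property: the device's active format description carries the pixel dimensions, whatever the session preset resolved to on this hardware.)

```objc
// Read the capture dimensions from the device's active format
// (available on iOS 7 and later), then transpose for portrait UI.
CMFormatDescriptionRef desc = device.activeFormat.formatDescription;
CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(desc);
CGSize size = CGSizeMake(dims.width, dims.height);
if (UIInterfaceOrientationIsPortrait(
        [[UIApplication sharedApplication] statusBarOrientation])) {
    // Sensor dimensions are landscape; swap for a portrait recording.
    size = CGSizeMake(dims.height, dims.width);
}
```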
I'm using an AVCaptureSession to add a custom overlay on the camera screen. I added an AVCaptureVideoPreviewLayer:
AVCaptureVideoPreviewLayer *layer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
layer.frame = self.view.layer.bounds;
But on the iPhone 4S the camera preview is not full screen. It works fine on the iPhone 5.
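(A sketch of a likely fix, not from the original post: the preview layer defaults to aspect-fit gravity, which letterboxes the 4:3 camera feed on the iPhone 4S's 3:2 screen. Filling the layer, at the cost of cropping the edges, makes the preview full screen:)

```objc
// Fill the whole layer, cropping overflow, instead of letterboxing.
layer.videoGravity = AVLayerVideoGravityResizeAspectFill;
layer.frame = self.view.layer.bounds;
```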
I'm trying to capture an image:
var stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
[...]
if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
videoConnection.videoOrientation = AVCaptureVideoOrientation.Portrait
println("enabled = \(videoConnection.enabled)") //enabled = true
println("active = \(videoConnection.active)") //active = true
stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: {
(sampleBuffer, error) in
if error != nil {
println(error)
}
})
}
It works perfectly most of the time!
But sometimes I get an error when calling captureStillImageAsynchronouslyFromConnection:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation couldn't be completed."
UserInfo=0x14682110 {NSLocalizedDescription=The operation couldn't be completed.,
NSUnderlyingError=0x1676be30 "The operation couldn't be completed.
(OSStatus error -16400.)",
NSLocalizedFailureReason=An unknown error has occured (-16400)}
I just want to anticipate this error before it happens. I've tried testing …
I'm trying to build an app that captures frames from the camera, processes them with OpenCV, and then saves them to the device, but at a specific frame rate.
What I'm stuck on right now is the fact that AVCaptureVideoDataOutputSampleBufferDelegate doesn't seem to respect the AVCaptureDevice.activeVideoMinFrameDuration or AVCaptureDevice.activeVideoMaxFrameDuration settings.
With the settings shown in the code below, captureOutput runs far faster than 2 frames per second.
Do you happen to know how this can be done, with or without a delegate?
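(A sketch of a manual workaround, assuming the delegate callback really does fire at the sensor rate regardless of activeVideoMin/MaxFrameDuration: drop buffers until at least 0.5 s, i.e. 2 fps, has elapsed since the last frame that was processed. `lastFrameTime` is assumed to be an ivar of type CMTime, initialized to kCMTimeInvalid.)

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CMTime now = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (CMTIME_IS_VALID(lastFrameTime) &&
        CMTimeGetSeconds(CMTimeSubtract(now, lastFrameTime)) < 0.5) {
        return; // too soon: skip this frame
    }
    lastFrameTime = now;
    // ... process the frame here ...
}
```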
View controller:
override func viewDidLoad() {
super.viewDidLoad()
}
override func viewDidAppear(animated: Bool) {
setupCaptureSession()
}
func setupCaptureSession() {
let session : AVCaptureSession = AVCaptureSession()
session.sessionPreset = AVCaptureSessionPreset1280x720
let videoDevices : [AVCaptureDevice] = AVCaptureDevice.devices() as! [AVCaptureDevice]
for device in videoDevices {
if device.position == AVCaptureDevicePosition.Back {
let captureDevice : AVCaptureDevice = device
do {
try captureDevice.lockForConfiguration()
captureDevice.activeVideoMinFrameDuration = CMTimeMake(1, 2)
captureDevice.activeVideoMaxFrameDuration = CMTimeMake(1, 2)
captureDevice.unlockForConfiguration()
let input : AVCaptureDeviceInput = try …

I'm implementing tap-to-focus and am confused about how to use the different AVCaptureFocusModes. Doing this:
[device setFocusPointOfInterest:focusPoint];
[device setFocusMode:AVCaptureFocusModeAutoFocus];
focuses successfully, but because the focal distance is then locked, moving the camera loses focus for good. If instead I do this:
[device setFocusPointOfInterest:focusPoint];
[device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
the autofocus engine seems to discard my point of interest and just focuses on whatever it thinks looks best. The Camera app manages to focus on your point of interest while still keeping continuous autofocus as the camera moves; how is that done?
Here is my full code so far:
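(A sketch of the pattern used by Apple's AVCam sample, which is an assumption about what the Camera app does internally: focus once at the tapped point, then let a subject-area-change notification restore continuous autofocus.)

```objc
if ([device lockForConfiguration:nil]) {
    [device setFocusPointOfInterest:focusPoint];
    [device setFocusMode:AVCaptureFocusModeAutoFocus];
    // Ask AVFoundation to post a notification when the scene changes
    // enough that the locked focus is probably stale.
    device.subjectAreaChangeMonitoringEnabled = YES;
    [device unlockForConfiguration];
}
// Elsewhere, observe AVCaptureDeviceSubjectAreaDidChangeNotification and,
// in the handler, re-center the point of interest and switch back to
// AVCaptureFocusModeContinuousAutoFocus.
```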
- (void)setFocusPointOfInterest:(CGPoint)point
{
Class captureDeviceClass = NSClassFromString(@"AVCaptureDevice");
if (captureDeviceClass != nil) {
AVCaptureDevice *device = [captureDeviceClass defaultDeviceWithMediaType:AVMediaTypeVideo];
if([device isFocusPointOfInterestSupported] &&
[device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
CGRect screenRect = [[UIScreen mainScreen] bounds];
double screenWidth = screenRect.size.width;
double screenHeight = screenRect.size.height;
double focus_x = point.x/screenWidth;
double focus_y = point.y/screenHeight;
CGPoint focusPoint = CGPointMake(focus_x,focus_y);
if([device lockForConfiguration:nil]) {
[device setFocusPointOfInterest:focusPoint];
[device setFocusMode:AVCaptureFocusModeAutoFocus];
[device setExposurePointOfInterest:focusPoint];
if ([device isExposureModeSupported:AVCaptureExposureModeAutoExpose]){
[device setExposureMode:AVCaptureExposureModeAutoExpose];
}
[device unlockForConfiguration]; …

I'm playing around with Swift and an iPhone 7 Plus. I'm using the builtInWideAngleCamera and the builtInTelephotoCamera. This is great, even though I can't get both images at the same time.
I saw in Apple's documentation that AVCaptureDeviceType includes a builtInDualCamera entry. What is the purpose of this device in AVFoundation, given that we can't do anything with it through Apple's APIs (zoom, depth effect)?
In other words, when using AVCaptureDeviceType, AVCaptureSession, and so on, I can't see the difference between builtInDualCamera and builtInWideAngleCamera.
Thanks