Related troubleshooting questions (0)

Capturing images with AVFoundation on iOS

I am using this code to capture images:

#pragma mark - image capture

// Create and configure a capture session and start it running
- (void)setupCaptureSession 
{
    NSError *error = nil;

    // Create the session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your 
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice
                           defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the …
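The excerpt above uses the older Objective-C still-capture path and cuts off mid-setup. For comparison, a minimal sketch of the same idea in current Swift, using `AVCapturePhotoOutput` (the modern replacement for still-image capture; the class name `PhotoCaptureController` and the error handling are illustrative assumptions, and this only runs on a real device with a camera):

```swift
import AVFoundation
import UIKit

final class PhotoCaptureController: NSObject, AVCapturePhotoCaptureDelegate {
    // Keep a strong reference: a session that gets deallocated stops delivering.
    let session = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()

    func configure() throws {
        session.beginConfiguration()
        session.sessionPreset = .medium  // lower resolution, as in the excerpt

        guard let device = AVCaptureDevice.default(for: .video) else {
            // No camera available (e.g. the Simulator)
            throw NSError(domain: "Capture", code: -1)
        }
        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
        session.commitConfiguration()
        session.startRunning()
    }

    func capture() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    // Delegate callback delivering the captured photo.
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        _ = image  // use the captured image here
    }
}
```

Note the `canAddInput:`/`canAddOutput:` guards mirror the Objective-C pattern in the excerpt; skipping them fails silently rather than crashing.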

iphone objective-c avfoundation ios

41 votes · 3 answers · 30k views

AVCaptureDeviceOutput not calling the delegate method captureOutput

I'm building an iOS app (my first) that processes video still frames on the fly. To get my feet wet, I started from a sample provided in Apple's AV* documentation.

The process involves setting up an input (the camera) and an output. The output works with a delegate, which in this case is the controller itself (it conforms to the protocol and implements the required method).

The problem I'm having is that the delegate method never gets called. The code below is the controller's implementation, with a few NSLog calls. I can see the "started" message, but "delegate method called" never shows up.

All of this code is in a controller that implements the "AVCaptureVideoDataOutputSampleBufferDelegate" protocol.

- (void)viewDidLoad {

    [super viewDidLoad];

    // Initialize AV session
    AVCaptureSession *session = [AVCaptureSession new];

    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone)
        [session setSessionPreset:AVCaptureSessionPreset640x480];
    else
        [session setSessionPreset:AVCaptureSessionPresetPhoto];

    // Initialize back camera input
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];

    if ([session canAddInput:input]) {
        [session addInput:input];
    }

    // Initialize image output
    AVCaptureVideoDataOutput *output = [AVCaptureVideoDataOutput new];

    NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:
                                       [NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    [output setVideoSettings:rgbOutputSettings];
    [output …
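The excerpt cuts off right before the step that is most often missing in this situation: wiring the output to its delegate. The sample-buffer delegate must be registered together with a serial dispatch queue, or `captureOutput:didOutputSampleBuffer:fromConnection:` is simply never invoked. A sketch of that wiring in current Swift (assuming `session` is already configured with a camera input and `self` conforms to `AVCaptureVideoDataOutputSampleBufferDelegate`; the queue label is arbitrary):

```swift
import AVFoundation

let videoOutput = AVCaptureVideoDataOutput()
videoOutput.videoSettings =
    [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]

// Without this call the delegate method never fires: the output needs
// both a delegate object and a serial queue to deliver frames on.
let frameQueue = DispatchQueue(label: "camera.frames")
videoOutput.setSampleBufferDelegate(self, queue: frameQueue)

if session.canAddOutput(videoOutput) {
    session.addOutput(videoOutput)
}
session.startRunning()
```

Frames arrive on `frameQueue`, not the main thread, so any UI work inside the delegate callback has to be dispatched back to the main queue.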

iphone avfoundation ios5

13 votes · 2 answers · 10k views

iOS: captureOutput:didOutputSampleBuffer:fromConnection is not called

I want to extract frames from the live feed of an AVCaptureSession, and I am using Apple's AVCam as a test case. Here is the link to AVCam:

https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html

I found that captureOutput:didOutputSampleBuffer:fromConnection is not being called, and I would like to know why, or what I am doing wrong.

Here is what I have done:

(1) I made AVCamViewController a delegate:

@interface AVCamViewController () <AVCaptureFileOutputRecordingDelegate, AVCaptureVideoDataOutputSampleBufferDelegate>

(2) I created an AVCaptureVideoDataOutput object and added it to the session:

AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:videoDataOutput]) {
    [session addOutput:videoDataOutput];
}

(3) I added the delegate method and tested it by logging a random string:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"I am called");

}

The test app works, but captureOutput:didOutputSampleBuffer:fromConnection is never called.

(4) I read on SO that having the session as a local variable in viewDidLoad (AVCaptureSession *session = [[AVCaptureSession alloc] init];) is a possible reason the delegate is not called, so I made it an instance variable of the AVCamViewController class, but it still is not called.

This is the viewDidLoad method I am testing (taken from AVCam); I added the AVCaptureDataOutput at the end of the method:

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Create the AVCaptureSession
    session = [[AVCaptureSession alloc] init];
    [self setSession:session];

    // Setup the preview view
    [[self previewView] setSession:session];

    // Check for …
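Steps (1)–(4) cover the delegate declaration, the output, and the session's lifetime, but the excerpt never shows `setSampleBufferDelegate:queue:` being called on the output. One way to narrow this class of bug down is a hypothetical post-configuration sanity check, sketched here in Swift (the function name `diagnose` and its assertion messages are illustrative, not part of AVCam):

```swift
import AVFoundation

// Run once after configuration; each condition below must hold,
// or didOutputSampleBuffer will never fire.
func diagnose(session: AVCaptureSession, output: AVCaptureVideoDataOutput) {
    assert(session.outputs.contains(output),
           "output was never added to the session")
    assert(output.sampleBufferDelegate != nil,
           "setSampleBufferDelegate(_:queue:) was never called")
    assert(session.isRunning,
           "startRunning() was never called")
    if let connection = output.connection(with: .video) {
        assert(connection.isActive, "the video connection is inactive")
    }
}
```

In the AVCam case specifically, adding the output without registering a sample-buffer delegate and queue would trip the second assertion while everything else looks correctly configured.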

objective-c ios avcapture avcapturesession avcapturedevice

5 votes · 2 answers · 7183 views

didOutputSampleBuffer delegate not being called

The didOutputSampleBuffer function in my code is never called. I don't know why this is happening. Here is the code:

import UIKit
import AVFoundation
import Accelerate

class ViewController: UIViewController {

    var captureSession: AVCaptureSession?
    var dataOutput: AVCaptureVideoDataOutput?
    var customPreviewLayer: AVCaptureVideoPreviewLayer?

    @IBOutlet weak var camView: UIView!

    override func viewWillAppear(animated: Bool) {
        super.viewWillAppear(animated)
        captureSession?.startRunning()
        //setupCameraSession()
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
        //captureSession?.startRunning()
        setupCameraSession()
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    func setupCameraSession() {
        // Session
        self.captureSession = AVCaptureSession()
        captureSession!.sessionPreset = …
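Note that the class above only subclasses `UIViewController`; nothing in the visible code declares conformance to `AVCaptureVideoDataOutputSampleBufferDelegate` or registers the controller as the output's delegate, and either omission keeps the callback from firing. A sketch of the usual missing wiring in current Swift (the queue label is an assumption; the method name follows the Swift 3+ rename, whereas older Swift used `captureOutput(_:didOutputSampleBuffer:fromConnection:)`, and a signature that doesn't match the protocol is silently never called):

```swift
import AVFoundation
import UIKit

extension ViewController: AVCaptureVideoDataOutputSampleBufferDelegate {

    // Exact signature matters: a near-miss compiles but never runs.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        print("didOutputSampleBuffer fired")
    }
}

// Inside setupCameraSession(), after creating dataOutput, the output
// still has to be pointed at this delegate on a serial queue:
// dataOutput?.setSampleBufferDelegate(self,
//     queue: DispatchQueue(label: "sample.buffer"))
```

With the conformance, the delegate registration, and `startRunning()` all in place, frames are delivered on the registered queue.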

buffer delegates avfoundation ios swift

5 votes · 1 answer · 3582 views