Tag: avcapturesession

AVCaptureSession output not displaying correctly, and the AVCaptureMetadataOutputObjectsDelegate method is never called

I have a single-view application and am trying to test iOS 7's AVCaptureMetadataOutput based on this explanation. My ViewController conforms to AVCaptureMetadataOutputObjectsDelegate, and the code looks like this (almost identical to Mattt's):

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    // Testing the VIN Scanner before I make it part of the library
    NSLog(@"Setting up the vin scanner");
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (input) {
        [session addInput:input];
    } else {
        NSLog(@"Error: %@", error);
    }

    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] …
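A common cause of this delegate never firing (a sketch under that assumption, not a confirmed fix for this specific question) is assigning `metadataObjectTypes` before the output has been added to the session; the available types are empty until then. In modern Swift the required ordering looks like this:

```swift
import AVFoundation

// Hypothetical delegate class, just to make the sketch self-contained.
final class MetadataHandler: NSObject, AVCaptureMetadataOutputObjectsDelegate {
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        print("Detected \(metadataObjects.count) metadata objects")
    }
}

// Sketch: order matters. Add the output to the session (which already has a
// camera input) first; only then is availableMetadataObjectTypes populated,
// and only then may metadataObjectTypes be set.
let session = AVCaptureSession()          // assumed to already have an input
let handler = MetadataHandler()
let output = AVCaptureMetadataOutput()
if session.canAddOutput(output) {
    session.addOutput(output)
    // Delegate callbacks arrive on the queue given here.
    output.setMetadataObjectsDelegate(handler, queue: DispatchQueue.main)
    output.metadataObjectTypes = [.qr, .code39]   // after addOutput, not before
}
session.startRunning()
```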

avfoundation ios avcapturesession ios7

1 vote · 1 answer · 7740 views

When is the NSStream NSStreamEventHasSpaceAvailable event called?

I don't really understand this event. I would expect it to be called when the send queue (or some similar internal structure) has finished sending the previously written data.

Is that a correct assumption?

I am working on video streaming over Multipeer Connectivity, and I want to use this event to decide whether I should drop a camera frame (when NSStreamEventHasSpaceAvailable has not fired) or can submit it to the NSOutputStream.

Imagine a Bluetooth connection: I would really need to drop a lot of camera frames rather than submit every single frame to the NSStream.
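The event does mean the stream can accept more bytes without blocking, and it fires again only after the internal buffer drains. A minimal sketch of the drop-or-write decision, using a memory-backed OutputStream purely for illustration (not a Multipeer stream):

```swift
import Foundation

// Poll-style sketch: write a frame only when the stream reports space,
// otherwise drop it. A real Multipeer stream would react to the delegate
// event instead of polling.
let stream = OutputStream(toMemory: ())
stream.open()

let frame = [UInt8](repeating: 0xAB, count: 1024)  // stand-in camera frame
var written = 0
var dropped = 0
if stream.hasSpaceAvailable {
    written = frame.withUnsafeBufferPointer {
        stream.write($0.baseAddress!, maxLength: $0.count)
    }
} else {
    dropped += 1  // on a slow (e.g. Bluetooth) link this branch fires often
}
stream.close()
print(written, dropped)
```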

camera nsstream ios avcapturesession multipeer-connectivity

1 vote · 1 answer · 2744 views

connectionWithMediaType returns nil

I am trying to implement video input with a GLKView, but whenever I try to handle rotation, connectionWithMediaType always returns nil.

Here is my setup:

override public func viewDidLoad() {
    super.viewDidLoad()
    // Do any additional setup after loading the view.
    videoFeed = GLKView(frame: self.view.bounds, context: EAGLContext(API: .OpenGLES2))
    videoFeed.autoresizingMask = [UIViewAutoresizing.FlexibleWidth, UIViewAutoresizing.FlexibleHeight]
    videoFeed.translatesAutoresizingMaskIntoConstraints = true
    videoFeed.contentScaleFactor = 1.0
    self.view.addSubview(videoFeed)
    renderContext  = CIContext(EAGLContext: videoFeed.context)
    sessionQueue = dispatch_queue_create("dCamSession", DISPATCH_QUEUE_SERIAL)
    videoFeed.bindDrawable()
}

override public func viewDidAppear(animated: Bool) {
    super.viewDidAppear(animated)
    startSession()
}

func createSession() -> AVCaptureSession {
    let cam = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    var input:AVCaptureInput
    do {
        input = try AVCaptureDeviceInput(device: cam)
    } catch _ as NSError {
        print("Cannot Init …
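One likely culprit (an assumption, since the excerpt is cut off before the rotation code) is asking an output for its connection before that output has been added to the session; connections only come into existence at addOutput time. A sketch of querying the connection for orientation, in modern Swift:

```swift
import AVFoundation

// Sketch: a connection exists only after the output joins the session.
let session = AVCaptureSession()
let videoOutput = AVCaptureVideoDataOutput()
if session.canAddOutput(videoOutput) {
    session.addOutput(videoOutput)
}

// Modern spelling of the question's connectionWithMediaType(AVMediaTypeVideo):
// nil here almost always means the output was never added to the session.
if let connection = videoOutput.connection(with: .video),
   connection.isVideoOrientationSupported {
    connection.videoOrientation = .portrait
}
```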

ios avcapture avcapturesession glkview swift

1 vote · 1 answer · 518 views

Corrupt video when capturing audio and video with AVAssetWriter

I am using an AVCaptureSession to capture video and audio input and encoding them with an AVAssetWriter.

If I don't write the audio, the video is encoded as expected. But if I write the audio, I get a corrupt video.

If I inspect the CMSampleBuffer containing the audio that is supplied to the AVAssetWriter, it shows this information:

invalid = NO
dataReady = YES
makeDataReadyCallback = 0x0
makeDataReadyRefcon = 0x0
formatDescription = <CMAudioFormatDescription 0x17410ba30 [0x1b3a70bb8]> {
mediaType:'soun' 
mediaSubType:'lpcm' 
mediaSpecific: {
    ASBD: {
        mSampleRate: 44100.000000 
        mFormatID: 'lpcm' 
        mFormatFlags: 0xc 
        mBytesPerPacket: 2 
        mFramesPerPacket: 1 
        mBytesPerFrame: 2 
        mChannelsPerFrame: 1 
        mBitsPerChannel: 16     } 
    cookie: {(null)} 
    ACL: {(null)}
    FormatList Array: {(null)} 
} 
extensions: {(null)}

Since it supplies LPCM audio, I configured the AVAssetWriterInput for sound with these settings (I tried both one and two channels):

var channelLayout = AudioChannelLayout()
memset(&channelLayout, 0, MemoryLayout<AudioChannelLayout>.size);
channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Mono

let audioOutputSettings:[String: …
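Writing raw LPCM into an MP4/MOV container is a frequent source of exactly this kind of corruption. A commonly used alternative (a sketch, not necessarily the asker's eventual fix) is to let the AVAssetWriterInput encode the LPCM samples to AAC, matching the buffer's 44.1 kHz mono format:

```swift
import AVFoundation

// Sketch: AAC output settings matching the CMSampleBuffer above
// (44.1 kHz, one channel). The writer encodes the incoming LPCM itself.
let audioOutputSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVSampleRateKey: 44_100,
    AVNumberOfChannelsKey: 1,
    AVEncoderBitRateKey: 64_000
]
let audioInput = AVAssetWriterInput(mediaType: .audio,
                                    outputSettings: audioOutputSettings)
audioInput.expectsMediaDataInRealTime = true  // required for live capture
```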

audio ios avcapturesession avassetwriter swift

1 vote · 1 answer · 1658 views

Real-time face detection with the camera in Swift 3

How can I do real-time face detection the way the Camera app does, with a white circular shape drawn over each face? I am using AVCaptureSession. The examples I found run face detection on an image I have saved. Below is my current code: it only captures an image when I press a button and saves it to the photo library. Please help me draw the circular shape over faces in real time!

Code

class CameraFaceRecongnitionVC: UIViewController {

    @IBOutlet weak var imgOverlay: UIImageView!
    @IBOutlet weak var btnCapture: UIButton!

    let captureSession = AVCaptureSession()
    let stillImageOutput = AVCaptureStillImageOutput()
    var previewLayer : AVCaptureVideoPreviewLayer?

    // If we find a device we'll store it here for later use
    var captureDevice : AVCaptureDevice?

    override func viewDidLoad() {
        super.viewDidLoad()
        btnCapture.CameraButton()
        roundButton.RoundButtonForFaceRecong()

        // Do any additional setup after loading the view, typically from a nib.
        captureSession.sessionPreset = AVCaptureSessionPresetHigh

        if let devices = AVCaptureDevice.devices() as? [AVCaptureDevice] {
            // Loop …
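For live circles, rather than detecting faces in a saved photo, AVCaptureMetadataOutput with the `.face` type delivers face bounds on every frame. A sketch of the delegate side, using the modern delegate signature (the drawing of the circle itself is left as a comment):

```swift
import AVFoundation
import UIKit

// Sketch: receive face bounds continuously instead of detecting on a photo.
// Assumes an AVCaptureMetadataOutput with metadataObjectTypes = [.face] has
// been added to captureSession, with this class as its delegate.
extension CameraFaceRecongnitionVC: AVCaptureMetadataOutputObjectsDelegate {
    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        for object in metadataObjects where object.type == .face {
            // Convert device coordinates to preview-layer coordinates,
            // then position a white circular overlay over face.bounds.
            if let face = previewLayer?.transformedMetadataObject(for: object) {
                print("Face at \(face.bounds)")
            }
        }
    }
}
```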

face-detection ios avcapturesession avcapturedevice swift3

1 vote · 1 answer · 10k views

iPhone 7+, ios 11.2: Depth data delivery is not supported in the current configuration

This bug is driving me mad. I'm trying to produce the absolute minimal code to get AVDepthData from an iPhone 7+ using its DualCam.

I have this code:


//
//  RecorderViewController.swift
//  ios-recorder-app


import UIKit
import AVFoundation


class RecorderViewController: UIViewController {

    @IBOutlet weak var previewView: UIView!

    @IBAction func onTapTakePhoto(_ sender: Any) {

        guard let capturePhotoOutput = self.capturePhotoOutput else { return }

        let photoSettings = AVCapturePhotoSettings()

        photoSettings.isDepthDataDeliveryEnabled = true //Error

        capturePhotoOutput.capturePhoto(with: photoSettings, delegate: self)

    }

    var session: AVCaptureSession?
    var videoPreviewLayer: AVCaptureVideoPreviewLayer? …
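Depth delivery must first be enabled on the AVCapturePhotoOutput itself, after the session has been configured with the dual-camera input; only then may an AVCapturePhotoSettings request it. Setting it on the settings alone throws exactly this exception. A sketch of the required order:

```swift
import AVFoundation

// Sketch: enable depth on the output first, then on the per-shot settings.
let session = AVCaptureSession()
session.beginConfiguration()
session.sessionPreset = .photo

if let dualCamera = AVCaptureDevice.default(.builtInDualCamera,
                                            for: .video, position: .back),
   let input = try? AVCaptureDeviceInput(device: dualCamera),
   session.canAddInput(input) {
    session.addInput(input)
}

let photoOutput = AVCapturePhotoOutput()
if session.canAddOutput(photoOutput) {
    session.addOutput(photoOutput)
    // Only valid once the dual-camera input and the output are in place.
    photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
}
session.commitConfiguration()

// Now the per-shot settings may ask for depth without throwing.
let settings = AVCapturePhotoSettings()
settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
```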

ios avcapturesession avcapturedevice swift avcaptureoutput

1 vote · 1 answer · 485 views

View controller takes 5-10 seconds to appear

I am using Swift 4 and have an app where people can open their phone camera. I have a view controller called CameraController with a default UIView, and on top of it a view called CameraView that shows the camera feed and other buttons.

When I tap one of the buttons, a segue takes me to another view controller (PlacesController). When I dismiss PlacesController and return to CameraController, the subviews now take about 8 or 10 seconds to appear again.

Is there some way to go to another controller while keeping the current subviews alive?

The problem is that after going to PlaceController through the segue and then coming back to CameraController, it takes about 8 or 10 seconds before the camera and sublayer become visible. It is the code below in particular; I would like to keep my sublayer running, because waiting 10 seconds for it to show is far too long.

self.CameraView.layer.insertSublayer(previewLayer!, at: 0)

Here is my code:

class CameraController: UIViewController {
    @IBOutlet weak var CameraView: UIView!
     var previewLayer: AVCaptureVideoPreviewLayer?
     let captureSession = AVCaptureSession()


  override func viewDidLoad() {
        super.viewDidLoad()


    }
 override func viewDidAppear(_ animated: Bool) {
           DispatchQueue.main.async {
             self.beginSession()
        }

    func beginSession() {
   // gets the camera showing and displays buttons on top of it
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        self.CameraView.layer.insertSublayer(previewLayer!, at: …
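One way to avoid rebuilding the preview on every return (a sketch under the assumption that the session is otherwise kept alive between segues) is to configure the session and preview layer once in viewDidLoad, and only start the session on appearance, keeping startRunning() off the main thread since it blocks:

```swift
// Sketch: build the preview once, start/stop the session cheaply.
override func viewDidLoad() {
    super.viewDidLoad()
    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer!.frame = CameraView.bounds
    CameraView.layer.insertSublayer(previewLayer!, at: 0)
}

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    // startRunning() blocks until capture is live, so keep it off the
    // main queue; the preview layer already exists and shows immediately.
    DispatchQueue.global(qos: .userInitiated).async {
        self.captureSession.startRunning()
    }
}
```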

camera ios avcapturesession segue swift

1 vote · 2 answers · 3015 views

What is the nature of a class property declared with "= {}()" in Swift?

I got a Swift class from some sample code, and it has a captureSession property declared like this:

private lazy var captureSession: AVCaptureSession = {
    let session = AVCaptureSession()

    guard
        let backCamera = AVCaptureDevice.default(for: .video),
        let input = try? AVCaptureDeviceInput(device: backCamera)
        else { return session }
    session.addInput(input)
    return session
}()

I don't think captureSession is a computed property, and it isn't a closure either. So what is it?
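It is an ordinary stored property whose default value comes from an immediately-applied closure: because of `lazy`, the closure body runs once, on first access, and its return value is stored; later reads reuse the stored value and never re-run the closure. A self-contained sketch of the same pattern without AVFoundation:

```swift
struct Recorder {
    // Same "= { ... }()" pattern as captureSession: the closure body runs on
    // first access to `log`, its result is stored, and subsequent reads
    // return the stored array without executing the closure again.
    lazy var log: [String] = {
        var lines = ["configured"]
        lines.append("ready")
        return lines
    }()
}

var recorder = Recorder()
print(recorder.log.count)
```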

ios avcapturesession swift

1 vote · 1 answer · 94 views

AVCapturePhotoOutput - settings may not be re-used

I am running iOS 12 with Swift 4.2.

I have implemented a basic camera capture session and am taking pictures with it. Everything works fine until I switch the flash between auto/on/off modes. After taking the first photo and then changing the flash mode, the app crashes with the error:

[AVCapturePhotoOutput capturePhotoWithSettings:delegate:] Settings may not be re-used'

Here is the camera implementation:

var captureSession: AVCaptureSession!
var videoPreviewLayer: AVCaptureVideoPreviewLayer!
var capturePhotoOutput: AVCapturePhotoOutput!
let capturePhotoSettings = AVCapturePhotoSettings()


var previewView: UIView!


override func viewDidLoad() {
    startCameraSession()
    setupCaptureOutput()
}

@objc // Tap on a button to capture
func takePhotoOnTap() {
    guard let capturePhotoOutput = self.capturePhotoOutput else { return }

    capturePhotoSettings.isAutoStillImageStabilizationEnabled = true
    capturePhotoSettings.isHighResolutionPhotoEnabled = true
    capturePhotoSettings.flashMode = .auto
    let _ = getSettings(camera: captureDevice!, flashMode: spotmiCameraOptions.flashMode)
    capturePhotoOutput.capturePhoto(with: capturePhotoSettings, delegate: self)
}


    //This is …
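AVCapturePhotoSettings is single-use, so the stored `capturePhotoSettings` property above is the likely problem: the second capture hands the output an already-consumed settings object. A sketch of the tap handler that builds a fresh settings object per shot (the flash-mode plumbing from the question is simplified away):

```swift
@objc // Tap on a button to capture
func takePhotoOnTap() {
    guard let capturePhotoOutput = self.capturePhotoOutput else { return }

    // Sketch: settings objects may not be reused, so create one per capture
    // instead of keeping a `let capturePhotoSettings` property.
    let settings = AVCapturePhotoSettings()
    settings.isAutoStillImageStabilizationEnabled = true
    settings.isHighResolutionPhotoEnabled = true
    settings.flashMode = .auto  // safe to vary per shot on a fresh object

    capturePhotoOutput.capturePhoto(with: settings, delegate: self)
}
```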

camera avfoundation ios avcapturesession swift

1 vote · 1 answer · 2156 views

Unable to get an AVCaptureDevice

Another Mac Catalyst porting problem. Code that works perfectly in iOS to get an AVCaptureDevice for video fails and returns nil when run on macOS.

The simplest way to demonstrate the issue is with Apple's own AVCam demo app. It is available in both Swift and Objective-C; both fail the same way. Check the "Mac" checkbox, build and run on a Mac, grant camera permission, and then watch the error in the console.

In the Swift version, look at the configureSession method of CameraViewController. These lines:

var defaultVideoDevice: AVCaptureDevice?

// Choose the back dual camera, if available, otherwise default to a wide angle camera.

if let dualCameraDevice = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back) {
    defaultVideoDevice = dualCameraDevice
} else if let backCameraDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
    // If a rear dual camera is not available, default to …
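On Catalyst the dual-camera device type does not exist, so the first branch returns nil. A discovery-session sketch that falls back to whatever camera the machine actually has (the exact set of device types Catalyst exposes is an assumption here):

```swift
import AVFoundation

// Sketch: list every device type we can use and take the first match,
// rather than hard-coding .builtInDualCamera, which is absent on the Mac.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInDualCamera, .builtInWideAngleCamera],
    mediaType: .video,
    position: .unspecified)
let defaultVideoDevice: AVCaptureDevice? = discovery.devices.first
```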

ios avcapturesession mac-catalyst

1 vote · 1 answer · 1187 views