Tag: avcapturesession

iPhone flashlight app crashes when the button is tapped rapidly

My flashlight app works fine, except that whenever I tap the flashlight button rapidly, the app just freezes and does nothing. It appears to freeze when I call AVCaptureSession stopRunning. Below is the code for my toggle-flashlight method. I would also eventually like this method to be called by a strobe feature.

- (void)toggleFlashlight {
    if (isTorchOn) {
        // Start session configuration
        [session beginConfiguration];

        [device lockForConfiguration:nil];
        // Set torch to on
        [device setTorchMode:AVCaptureTorchModeOn];
        [device unlockForConfiguration];

        [session commitConfiguration];
        [session startRunning];
    }
    else {
        [session stopRunning];
        [session release];
        session = nil;

        session = [[AVCaptureSession alloc] init];

        // Create device input and add to current session
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:nil];
        [session addInput:input];

        // Create video output and add to current session
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        [session addOutput:output];
        [output release]; …
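For reference, a common cause of this kind of freeze is tearing the session down and restarting it on every tap; the torch can usually be flipped with lockForConfiguration alone while the session keeps running. Below is a minimal sketch of that per-tap logic. The `Torch` class is a hypothetical stand-in for AVCaptureDevice so the sketch is self-contained (AVFoundation only runs on-device); the real calls are noted in comments.

```swift
// Sketch: keep the session running and flip only the torch state per tap.
// `Torch` is a stand-in for AVCaptureDevice; the real on-device calls are
// noted in the comments below.
final class Torch {
    private(set) var isOn = false

    func toggle() {
        // On-device this would be roughly:
        //   try device.lockForConfiguration()
        //   device.torchMode = isOn ? .off : .on
        //   device.unlockForConfiguration()
        // No beginConfiguration / stopRunning / startRunning per tap.
        isOn.toggle()
    }
}

let torch = Torch()
torch.toggle()       // first tap: torch on
torch.toggle()       // second tap: torch off
print(torch.isOn)    // false
```

Because nothing is recreated per tap, rapid tapping only flips a flag instead of blocking on session teardown.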

iphone avcapturesession avcapturedevice flashlight

0 votes · 2 answers · 1378 views

Objective-C to Swift translation problem

I'm trying to convert this simple line of code to Swift, but can't figure out how to write it:

AVCaptureConnection *videoConnection = nil;

I tried:

    let videoConnection: AVCaptureConnection = nil

    let videoConnection: AVCaptureConnection = false

    var videoConnection: AVCaptureConnection = AVCaptureConnection()
    videoConnection = nil

    var videoConnection: AVCaptureConnection = AVCaptureConnection()
    videoConnection = false

    var videoConnection: AVCaptureConnection = AVCaptureConnection()
    videoConnection.active = false

    var videoConnection: AVCaptureConnection = AVCaptureConnection()
    videoConnection.active = nil

Any suggestions on how to write this in Swift would be greatly appreciated.
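The Obj-C pattern `Type *x = nil` maps to a Swift optional, `Type?`. A small self-contained sketch of the idea, using a hypothetical `Connection` stub in place of AVCaptureConnection (on iOS the declaration would read `var videoConnection: AVCaptureConnection? = nil`):

```swift
// `Connection` is a hypothetical stand-in for AVCaptureConnection so this
// sketch runs anywhere; only optionals (`Type?`) can hold nil in Swift.
class Connection {}

var videoConnection: Connection? = nil   // starts out nil, like the Obj-C line
print(videoConnection == nil)            // true

videoConnection = Connection()           // assigned later, e.g. in the connection-search loop
print(videoConnection != nil)            // true
```

Note it must be `var`, not `let`, since the value is reassigned later, and none of the non-optional attempts above can compile because a non-optional AVCaptureConnection can never hold nil.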

translation objective-c avcapturesession swift

0 votes · 1 answer · 74 views

Automatically use the front camera for the AVCaptureDevice preview layer, like Snapchat or Houseparty (Swift 3)

Basically, what I'm trying to accomplish is to have the front camera of the AVCaptureDevice be the first and only option in the app during an AVCaptureSession.

I've looked around StackOverflow, and all of the methods and answers provided are deprecated as of iOS 10, Swift 3, and Xcode 8.

I know you're supposed to enumerate devices with AVCaptureDeviceDiscoverySession and look through them to distinguish front from back, but I'm not sure how to do so.

Could anyone help? It would be amazing if so!

Here is my code:

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        previewLayer.frame = singleViewCameraSlot.bounds
        self.singleViewCameraSlot.layer.addSublayer(previewLayer)
        captureSession.startRunning()
    }



lazy var captureSession: AVCaptureSession = {
    let capture = AVCaptureSession()
    capture.sessionPreset = AVCaptureSessionPreset1920x1080
    return capture
}()

lazy var previewLayer: AVCaptureVideoPreviewLayer = {
    let preview =  AVCaptureVideoPreviewLayer(session: self.captureSession)

    preview?.videoGravity = AVLayerVideoGravityResizeAspect
    preview?.connection.videoOrientation = AVCaptureVideoOrientation.portrait
    preview?.bounds = CGRect(x: 0, y: 0, width: self.view.bounds.width, height: self.view.bounds.height)
    preview?.position = CGPoint(x: self.view.bounds.midX, y: …
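The non-deprecated route in iOS 10 / Swift 3 is AVCaptureDeviceDiscoverySession; the selection itself boils down to "first device whose position is front". A sketch of just that logic, with a stub `Device` type so it is self-contained (on-device, the array would come from `AVCaptureDeviceDiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: AVMediaTypeVideo, position: .front)?.devices`):

```swift
// Sketch of the selection logic only. `Device` is a stand-in for
// AVCaptureDevice; on-device the list comes from a discovery session.
enum Position { case front, back }
struct Device { let position: Position }

// Pick the first front-facing device, or nil if none exists.
func frontCamera(in devices: [Device]) -> Device? {
    return devices.first { $0.position == .front }
}

let devices = [Device(position: .back), Device(position: .front)]
print(frontCamera(in: devices)?.position == .front)   // true
```

The resulting device is then wrapped in an AVCaptureDeviceInput and added to the capture session before `startRunning()`.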

avfoundation avcapturesession swift swift3 xcode8

0 votes · 1 answer · 6058 views

didOutputSampleBuffer delegate never gets called (iOS/Swift 3)

I'm trying to track the sample buffer rate of a video recording.

I have a view controller that conforms to AVCaptureFileOutputRecordingDelegate and AVCaptureVideoDataOutputSampleBufferDelegate, and I set up the buffer output like this:

sessionQueue.async { [weak self] in
        if let `self` = self {
            let movieFileOutput = AVCaptureMovieFileOutput()

            let bufferQueue = DispatchQueue(label: "bufferRate", qos: .userInteractive, attributes: .concurrent)

            let theOutput = AVCaptureVideoDataOutput()
                theOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString): NSNumber(value:kCVPixelFormatType_32BGRA)]
                theOutput.alwaysDiscardsLateVideoFrames = true
                theOutput.setSampleBufferDelegate(self, queue: bufferQueue)

            if self.session.canAddOutput(theOutput) {
                self.session.addOutput(theOutput)
                print("ADDED BUFFER OUTPUT")
            }

            if self.session.canAddOutput(movieFileOutput) {
                self.session.beginConfiguration()
                self.session.addOutput(movieFileOutput)
                self.session.sessionPreset = AVCaptureSessionPresetHigh
                if let connection = movieFileOutput.connection(withMediaType: AVMediaTypeVideo) {
                    if connection.isVideoStabilizationSupported {
                        connection.preferredVideoStabilizationMode = .auto
                    }
                }

                self.session.commitConfiguration()

                self.movieFileOutput = …

ios avcapturesession swift

0 votes · 1 answer · 746 views

Does Swift manage the memory of the CVPixelBuffer I create with CVPixelBufferCreate?

Suppose I want to store a frame from the camera output:

let imageBuffer:CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
some_list.append(imageBuffer.copy())


Here is how the copy function is defined, via an extension on CVPixelBuffer:

extension CVPixelBuffer {
    func copy() -> CVPixelBuffer {
        precondition(CFGetTypeID(self) == CVPixelBufferGetTypeID(), "copy() cannot be called on a non-CVPixelBuffer")
        var _copy : CVPixelBuffer?
        CVPixelBufferCreate(
            nil,
            CVPixelBufferGetWidth(self),
            CVPixelBufferGetHeight(self),
            CVPixelBufferGetPixelFormatType(self),
            CVBufferGetAttachments(self, CVAttachmentMode.shouldPropagate),
            &_copy)
        guard let copy = _copy else { fatalError() }
        CVPixelBufferLockBaseAddress(self, CVPixelBufferLockFlags.readOnly)
        CVPixelBufferLockBaseAddress(copy, CVPixelBufferLockFlags(rawValue: 0))
        let dest = CVPixelBufferGetBaseAddress(copy)
        let source = CVPixelBufferGetBaseAddress(self)
        let height = CVPixelBufferGetHeight(self)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(self)
        memcpy(dest, source, height * bytesPerRow)
        CVPixelBufferUnlockBaseAddress(copy, CVPixelBufferLockFlags(rawValue: 0))
        CVPixelBufferUnlockBaseAddress(self, CVPixelBufferLockFlags.readOnly) …
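For what it's worth: CVPixelBuffer is a Core Foundation type that Swift imports as a class managed by ARC, so the buffer returned by CVPixelBufferCreate is released once the last strong reference to it goes away (there is no CVPixelBufferRelease to call from Swift). The lifetime rule in question can be sketched with a plain class:

```swift
// Sketch of the lifetime rule with a plain class: ARC frees an object when
// the last strong reference disappears, which is also how Swift manages the
// CVPixelBuffer returned by CVPixelBufferCreate.
var deinitCount = 0

final class Buffer {              // stand-in for a pixel buffer
    deinit { deinitCount += 1 }   // runs when the last reference goes away
}

var someList: [Buffer] = []
someList.append(Buffer())         // like some_list.append(imageBuffer.copy())
someList.removeAll()              // last reference gone -> deinit fires
print(deinitCount)                // 1
```

So copies appended to the array stay alive as long as the array holds them, and are freed when removed.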

avfoundation avcapturesession swift

0 votes · 1 answer · 715 views

AVFoundation - AVCaptureSession only stops and starts running again after entering the background and returning when using breakpoints

This problem did not occur with Xcode 10.2.1 and iOS 12. It started with Xcode 11.1 and iOS 13.

My app records video. When the app goes to the background, I stop running the capture session and remove the preview layer. When the app returns to the foreground, I restart the capture session and add the preview layer back:

let captureSession = AVCaptureSession()
var previewLayer: AVCaptureVideoPreviewLayer?
var movieFileOutput = AVCaptureMovieFileOutput()

// *** I initially didn't remove the preview layer in this example but I did remove it in the other 2 examples below ***
@objc fileprivate func stopCaptureSession() {
    DispatchQueue.main.async {
        [weak self] in
        if self?.captureSession.isRunning == true {
            self?.captureSession.stopRunning()
        }
    }
}

@objc func restartCaptureSession() {
    DispatchQueue.main.async {
        [weak self] in
        if self?.captureSession.isRunning == …

avfoundation ios avcapturesession swift

0 votes · 1 answer · 10k views

Crop an AVCapturePhoto to the rectangle shown on screen

I'm trying to take a photo of a thin piece of metal and crop it to the outline shown on screen. I've looked at nearly every other post here, but nothing has worked for me. The image will then be used by a library for analysis. I can get some cropping to happen, just not to the rectangle that is displayed. I've tried rotating the image before cropping, and computing the crop rect from the rectangle on screen.

[Screenshot: video layer]

Here is my capture code. PreviewView is the container, and videoLayer is the container for the AVCapture video.

    // Photo capture delegate
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        guard let imgData = photo.fileDataRepresentation(), let uiImg = UIImage(data: imgData), let cgImg = uiImg.cgImage else {
            return
        }
        print("Original image size: ", uiImg.size, "\nCGHeight: ", cgImg.height, " width: ", cgImg.width)
        print("Orientation: ", uiImg.imageOrientation.rawValue)
        
        guard let img = cropImage(image: uiImg) else {
            return
        }
        
        showImage(image: img)
    }

    
    func cropImage(image: UIImage) -> UIImage? {
        print("Image size before crop: ", image.size)
        //Get the croppedRect from function below …
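One approach that tends to work here: first convert the on-screen rectangle to a normalized (0...1) rect, on-device via `previewLayer.metadataOutputRectConverted(fromLayerRect:)`, which accounts for the layer's videoGravity, and then scale that rect by the captured photo's pixel size. The final scaling step, sketched with a plain `Rect` type so it is self-contained (the 4032x3024 image size below is just an example):

```swift
// Sketch of the final scaling step; the normalized rect would come from
// metadataOutputRectConverted(fromLayerRect:) on-device.
struct Rect { var x, y, width, height: Double }

// Scale a normalized (0...1) rect up to pixel coordinates in the photo.
func cropRect(normalized: Rect, imageWidth: Double, imageHeight: Double) -> Rect {
    return Rect(x: normalized.x * imageWidth,
                y: normalized.y * imageHeight,
                width: normalized.width * imageWidth,
                height: normalized.height * imageHeight)
}

let r = cropRect(normalized: Rect(x: 0.25, y: 0.25, width: 0.5, height: 0.5),
                 imageWidth: 4032, imageHeight: 3024)
print(r.width, r.height)   // 2016.0 1512.0
```

The resulting pixel rect can then be passed to `cgImg.cropping(to:)`, after accounting for the UIImage's orientation.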

camera ios avcapturesession swift avcapturephotooutput

0 votes · 1 answer · 1314 views

Why does this code sometimes work, and sometimes not?

I created a "mirror" view in my app that uses the front camera to show the user a "mirror". The problem is that I haven't touched this code in a few weeks (it did work back then), but now I'm testing it again and it doesn't work. The code is the same as before, no errors show up, and the view in the storyboard is exactly the same as before. I have no idea what's going on, so I'm hoping this site will help.

Here is my code:

if([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront]) {
        //If the front camera is available, show the camera


        AVCaptureSession *session = [[AVCaptureSession alloc] init];
        AVCaptureOutput *output = [[AVCaptureStillImageOutput alloc] init];
        [session addOutput:output];

        //Setup camera input
        NSArray *possibleDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
        //You could check for front or back camera here, but for simplicity just grab the first device
        AVCaptureDevice *device = [possibleDevices objectAtIndex:1];
        NSError *error = nil;
        // create an input and add it to the session
        AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; //Handle errors …

camera objective-c uiimagepickercontroller avcapturesession

-1 votes · 1 answer · 457 views