Tag: avcapture

AVCaptureVideoPreviewLayer orientation - need landscape

My app is landscape-only. I present the AVCaptureVideoPreviewLayer like this:

self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
[self.previewLayer setBackgroundColor:[[UIColor blackColor] CGColor]];
[self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];                    
NSLog(@"previewView: %@", self.previewView);
CALayer *rootLayer = [self.previewView layer];
[rootLayer setMasksToBounds:YES];
[self.previewLayer setFrame:[rootLayer bounds]];
NSLog(@"previewlayer: %f, %f, %f, %f", self.previewLayer.frame.origin.x, self.previewLayer.frame.origin.y, self.previewLayer.frame.size.width, self.previewLayer.frame.size.height);
[rootLayer addSublayer:self.previewLayer];
[session startRunning];

The frame of self.previewView is (0, 0, 568, 320), which is correct. self.previewLayer logs a frame of (0, 0, 568, 320), which in theory is correct. However, the camera preview appears as a portrait rectangle in the middle of the landscape screen, and the preview image is oriented 90 degrees wrong. What am I doing wrong? I need the camera preview layer to fill the screen in landscape, with the image correctly oriented.
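A common fix (a sketch, assuming the `session` and `previewLayer` from the question) is to set the preview connection's orientation explicitly after creating the layer:

```objectivec
// Match the preview connection's orientation to the interface
// orientation (landscape here); by default it stays portrait.
AVCaptureConnection *previewConnection = self.previewLayer.connection;
if (previewConnection.supportsVideoOrientation) {
    previewConnection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
}
```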

avfoundation orientation ios avcapture avcapturesession

56 votes · 6 answers · 50k views

AVCaptureDevice camera zoom

I have a simple AVCaptureSession running to access the camera in my app and take pictures. How can I implement a "pinch to zoom" feature for the camera using a UIGestureRecognizer?
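A sketch of one approach, using `videoZoomFactor` (iOS 7+). The names `self.videoDevice` and the recognizer wiring are assumptions, not from the question:

```objectivec
// Pinch handler attached to the preview view; assumes self.videoDevice
// is the active AVCaptureDevice.
- (void)handlePinch:(UIPinchGestureRecognizer *)pinch
{
    NSError *error = nil;
    if (![self.videoDevice lockForConfiguration:&error]) {
        NSLog(@"Could not lock device: %@", error);
        return;
    }
    CGFloat desiredZoom = self.videoDevice.videoZoomFactor * pinch.scale;
    // Clamp to the range the active format supports.
    CGFloat maxZoom = self.videoDevice.activeFormat.videoMaxZoomFactor;
    self.videoDevice.videoZoomFactor = MAX(1.0, MIN(desiredZoom, maxZoom));
    [self.videoDevice unlockForConfiguration];
    pinch.scale = 1.0; // reset so each callback scales incrementally
}
```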

iphone objective-c ios avcapture avcapturesession

19 votes · 4 answers · 20k views

How to find the device camera's resolution on iOS

What is the best way to find the resolution of the image that will be captured when using AVCaptureSessionPresetPhoto?
I want to find the resolution before capturing the image.
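One way (a sketch; `device` is assumed to be the AVCaptureDevice attached to the session, and the preset should already be applied so the active format reflects it) is to read the active format's dimensions via Core Media:

```objectivec
// Read the capture dimensions of the currently active format.
// Requires CoreMedia; the values depend on device and preset.
CMFormatDescriptionRef description = device.activeFormat.formatDescription;
CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(description);
NSLog(@"Capture resolution: %d x %d", dimensions.width, dimensions.height);
```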

iphone avfoundation ios avcapture avcapturesession

17 votes · 3 answers · 20k views

Running multiple AVCaptureSessions or adding multiple inputs

I want to display the streams from the iPad 2's front and back cameras in two UIViews side by side. To stream the image from one device I use the following code:

AVCaptureDeviceInput *captureInputFront = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];

AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session addInput:captureInputFront];
[session setSessionPreset:AVCaptureSessionPresetMedium];
[session startRunning];

AVCaptureVideoPreviewLayer *prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
prevLayer.frame = self.view.frame;
[self.view.layer addSublayer:prevLayer];

This works with either camera on its own. To display the streams in parallel I tried creating another session, but as soon as the second session is set up, the first one freezes.

Then I tried adding two AVCaptureDeviceInputs to one session, but at most one camera input is currently supported.

Any helpful ideas on how to stream from both cameras?
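For context: on iPad 2-era hardware, simultaneous front and back capture is not supported at all. Much later devices (iOS 13+) gained AVCaptureMultiCamSession for exactly this. A sketch, assuming supporting hardware:

```objectivec
// AVCaptureMultiCamSession (iOS 13+) allows front and back inputs in
// one session; always check support first, since it is hardware-gated.
if (AVCaptureMultiCamSession.multiCamSupported) {
    AVCaptureMultiCamSession *multiSession = [[AVCaptureMultiCamSession alloc] init];
    // Add one AVCaptureDeviceInput per camera, then attach a separate
    // AVCaptureVideoPreviewLayer to each view.
}
```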

video-capture objective-c avcapture avcapturesession ipad-2

16 votes · 1 answer · 7048 views

AVCaptureVideoPreviewLayer not filling the screen

I've read about a million threads on how to make a VideoPreviewLayer fill the iPhone's entire screen, but nothing works... Maybe you can help me, because I'm really stuck.

This is my preview layer init:

if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPad)
{
    // Choosing bigger preset for bigger screen.
    _sessionPreset = AVCaptureSessionPreset1280x720;
}
else
{
    _sessionPreset = AVCaptureSessionPresetHigh;
}

[self setupAVCapture];

AVCaptureSession *captureSession = _session;
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
UIView *aView = self.view;
previewLayer.frame = aView.bounds;
previewLayer.connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
[aView.layer addSublayer:previewLayer];

This is my setupAVCapture method:

//-- Setup Capture Session.
_session = [[AVCaptureSession alloc] init];
[_session beginConfiguration];

//-- Set preset session size.
[_session setSessionPreset:_sessionPreset];

//-- Create a video device and input from that device.  Add the input to …
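One common cause (a sketch, assuming the `previewLayer` and `aView` from the snippet above): with the default resize-aspect gravity, a 16:9 preview letterboxes inside a non-16:9 view. Aspect-fill gravity crops instead:

```objectivec
// Fill the whole view, cropping the preview instead of letterboxing it.
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
previewLayer.frame = aView.bounds;
// If the view can resize (rotation, layout passes), reassign the
// layer's frame again in viewDidLayoutSubviews.
```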

objective-c avfoundation ios avcapture avcapturesession

15 votes · 2 answers · 9973 views

AVCaptureSession: specify resolution and quality of the captured image (obj-c iPhone app)

Hi, I want to set up an AV capture session to capture images with a specific resolution (and, if possible, a specific quality) using the iPhone camera. Here is the session setup code:

// Create and configure a capture session and start it running
- (void)setupCaptureSession 
{
    NSError *error = nil;

    // Create the session
    self.captureSession = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your 
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice
    NSArray *cameras=[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *device;
    if ([UserDefaults camera]==UIImagePickerControllerCameraDeviceFront)
    {
        device =[cameras objectAtIndex:1];
    }
    else …
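Session presets are the supported way to request a capture size; the exact pixel dimensions they map to are device-dependent. A sketch of requesting a fixed-size preset with a fallback (preset names are real AVFoundation constants):

```objectivec
// Request a specific resolution via a fixed-size preset, falling back
// if this device cannot provide it.
if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    self.captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
} else {
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;
}
```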

iphone objective-c avcapture avcapturesession

12 votes · 1 answer · 40k views

Swift 3 - AVCapture custom camera view

I'm following this video to build a custom camera view: https://www.youtube.com/watch?v=w0O3ZGUS3pk

But with the changes in iOS 10 and Swift 3, a lot of it is no longer relevant.

Below is the code I have after changing the deprecated functions to the new ones. There are no errors, but no preview shows up on the UIView either.

import UIKit
import AVFoundation

class ViewController: UIViewController, AVCapturePhotoCaptureDelegate, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
    @IBOutlet weak var cameraView: UIView!
    var captureSession = AVCaptureSession();
    var sessionOutput = AVCapturePhotoOutput();
    var sessionOutputSetting = AVCapturePhotoSettings(format: [AVVideoCodecKey:AVVideoCodecJPEG]);
    var previewLayer = AVCaptureVideoPreviewLayer();

    override func viewWillAppear(_ animated: Bool) {
        let deviceDiscoverySession = AVCaptureDeviceDiscoverySession(deviceTypes: [AVCaptureDeviceType.builtInDuoCamera, AVCaptureDeviceType.builtInTelephotoCamera,AVCaptureDeviceType.builtInWideAngleCamera], mediaType: AVMediaTypeVideo, position: AVCaptureDevicePosition.unspecified)
        for device in (deviceDiscoverySession?.devices)! {
            if(device.position == AVCaptureDevicePosition.front){
                do{
                    let input = try AVCaptureDeviceInput(device: device)
                    if(captureSession.canAddInput(input)){
                        captureSession.addInput(input);

                        if(captureSession.canAddOutput(sessionOutput)){
                            captureSession.addOutput(sessionOutput);
                            previewLayer = AVCaptureVideoPreviewLayer(session: captureSession);
                            previewLayer.videoGravity = …

avfoundation avcapture avcapturesession swift ios10

12 votes · 1 answer · 10k views

Set grayscale on AVCaptureDevice output in iOS

I want to implement a custom camera in my app, so I'm creating the camera with AVCaptureDevice.

Now I want to show only grayscale output in my custom camera. So I'm trying to use setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains: with AVCaptureWhiteBalanceGains. I'm working from AVCamManual: Extending AVCam to Use Manual Capture.

- (void)setWhiteBalanceGains:(AVCaptureWhiteBalanceGains)gains
{
    NSError *error = nil;

    if ( [videoDevice lockForConfiguration:&error] ) {
        AVCaptureWhiteBalanceGains normalizedGains = [self normalizedGains:gains]; // Conversion can yield out-of-bound values, cap to limits
        [videoDevice setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:normalizedGains completionHandler:nil];
        [videoDevice unlockForConfiguration];
    }
    else {
        NSLog( @"Could not lock device for configuration: %@", error );
    }
}

But for this I have to pass RGB gain values between 1 and 4, so I created this method to clamp them between the MAX and MIN values.

- (AVCaptureWhiteBalanceGains)normalizedGains:(AVCaptureWhiteBalanceGains) gains
{
    AVCaptureWhiteBalanceGains g = gains;

    g.redGain = MAX( 1.0, g.redGain );
    g.greenGain = MAX( 1.0, g.greenGain ); …
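Worth noting: white-balance gains only shift the color temperature, so they cannot produce a true grayscale image. A common alternative (a sketch, assuming an AVCaptureVideoDataOutput delegate feeds the preview rather than a plain preview layer) is to filter each frame with Core Image:

```objectivec
// Grayscale each frame in the video data output delegate using the
// built-in CIPhotoEffectMono filter.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIFilter *mono = [CIFilter filterWithName:@"CIPhotoEffectMono"];
    [mono setValue:image forKey:kCIInputImageKey];
    CIImage *grayscale = mono.outputImage;
    // Render `grayscale` to the screen, e.g. via a CIContext or MTKView.
}
```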

objective-c ios avcapture avcapturedevice swift

10 votes · 1 answer · 1657 views

iOS: torch level on iPhone 11 Pro

I'm using the AVCaptureDevice.setTorchModeOn(level) method to turn on the flashlight at variable brightness.

On my old iPhone SE it works well: as I change level from 0 to 1, I can clearly see 4 different brightness levels.

But on the iPhone 11 Pro the flashlight only shines at full brightness when the level is 1.0! At any level below the maximum it is much dimmer (compared to the Control Center flashlight).

I tried using the maxAvailableTorchLevel constant, but the result is the same as with 1.0.
I also tried values greater than 1.0, which throws an exception (as expected).

Has anyone else run into this? Maybe there is a workaround?
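For reference, an Objective-C sketch of the call in question, using the AVCaptureMaxAvailableTorchLevel constant mentioned above (the device lookup is illustrative):

```objectivec
// Turn the torch on at the highest level the device currently allows;
// AVCaptureMaxAvailableTorchLevel accounts for thermal throttling.
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
if (device.hasTorch && [device lockForConfiguration:&error]) {
    [device setTorchModeOnWithLevel:AVCaptureMaxAvailableTorchLevel error:&error];
    [device unlockForConfiguration];
}
```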

iphone ios avcapture avcapturedevice flashlight

10 votes · 1 answer · 627 views

AVCaptureVideoPreviewLayer front camera: flip (unmirror) the pixel buffer before passing it to an OpenGL shader

I'm using AVCaptureVideoPreviewLayer to pass live video to openGL and apply shaders in real time. When using the front camera the video is mirrored, and I want to unmirror it before applying the shaders.

Can anyone help?

Added: the code for switching to the front camera:

-(void)showFrontCamera{
    NSLog(@"inside showFrontCamera");
    [captureSession removeInput:videoInput];
    // Grab the front-facing camera
    AVCaptureDevice *frontFacingCamera = nil;
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == AVCaptureDevicePositionFront) {
            frontFacingCamera = device;
        }
    }
    // Add the video input
    NSError *error = nil;
    videoInput = [[[AVCaptureDeviceInput alloc] initWithDevice:frontFacingCamera error:&error] autorelease];

    if ([captureSession canAddInput:videoInput]) {
        [captureSession addInput:videoInput];
    }

}
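One approach (a sketch, not from the question; `videoOutput` is assumed to be the AVCaptureVideoDataOutput feeding OpenGL) is to disable mirroring on the capture connection itself, so the pixel buffers already arrive unmirrored:

```objectivec
// Turn off automatic mirroring on the video connection so the
// front-camera pixel buffers reach the shader unmirrored.
AVCaptureConnection *connection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.supportsVideoMirroring) {
    connection.automaticallyAdjustsVideoMirroring = NO;
    connection.videoMirrored = NO;
}
```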

ios avcapture opengl-es-2.0

9 votes · 1 answer · 4151 views