I'm using the code below to focus the iPhone camera, but it doesn't work. I took it from Apple's AVCam sample code. Am I doing something wrong? And is there any way to detect whether the iPhone has actually focused?
- (void)focusAtPoint:(CGPoint)point
{
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (device != nil) {
        if ([device isFocusPointOfInterestSupported] && [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
            NSError *error;
            if ([device lockForConfiguration:&error]) {
                [device setFocusPointOfInterest:point];
                [device setFocusMode:AVCaptureFocusModeAutoFocus];
                [device unlockForConfiguration];
            } else {
                NSLog(@"Error in Focus Mode: %@", error);
            }
        }
    }
}
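A common reason `setFocusPointOfInterest:` appears to do nothing is the coordinate space: it expects a point normalized to (0,0)–(1,1) relative to the unrotated (landscape) sensor, not view coordinates. `AVCaptureVideoPreviewLayer`'s `captureDevicePointOfInterestForPoint:` performs this conversion for you; as a rough illustration, here is the classic manual conversion for a portrait, full-screen preview, sketched in plain Swift (the function name and tuple types are mine, standing in for `CGPoint`):

```swift
// Convert a tap in portrait view coordinates to the normalized
// (0,0)-(1,1) space expected by focusPointOfInterest.
// Because the sensor's native orientation is landscape, view-y maps
// to sensor-x, and view-x maps (inverted) to sensor-y.
func focusPoint(forTapAt tap: (x: Double, y: Double),
                viewSize: (width: Double, height: Double)) -> (x: Double, y: Double) {
    return (x: tap.y / viewSize.height,
            y: 1.0 - tap.x / viewSize.width)
}
```

As for detecting whether the camera actually focused: you can key-value observe the device's `adjustingFocus` property, which becomes YES while the lens is hunting and returns to NO once focus settles.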
Use case: I want to capture input from the camera, draw on top of the captured frames (and sound), and save the result as a .mov file.
Problem: I can't see how to modify the input before it is saved to a file.
I've tried many things to solve it, such as adding an MPVolumeView. Some people say that if you add an MPVolumeView instance to the current view, the system volume HUD is hidden, but that didn't work.
I added an observer to watch for volume changes and take a photo, but now I'm stuck on how to hide the volume HUD.
I've already tried the solution of adding an MPVolumeView instance and it doesn't work, so please suggest another workaround.
Any advice would be appreciated.
So I'm using AVCaptureSession to take pictures with the front camera. I also create a preview layer from this session to display the current image on screen.
previewLayer = AVCaptureVideoPreviewLayer(session: session)
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
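For reference, `resizeAspectFill` scales the video by the larger of the two axis ratios, so the layer is fully covered and the overflow is cropped (`resizeAspect` would pick the smaller ratio and letterbox instead). A quick sketch of that rule in plain Swift, with made-up names:

```swift
// How resizeAspectFill chooses its scale factor: the larger of the
// two axis ratios, so the layer is completely covered and whatever
// spills over the edges is cropped.
func aspectFillScale(video: (w: Double, h: Double),
                     layer: (w: Double, h: Double)) -> Double {
    return max(layer.w / video.w, layer.h / video.h)
}
```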
Everything works as it should. But now I have a problem, because I need to implement a button that flips/mirrors (transforms) this preview layer, so the user can choose between taking a normal selfie or a mirrored one.
I've already tried transforming the preview layer, and it sort of works. The problem is that if you rotate the device, the preview image rotates the wrong way because it has been transformed. (In the default camera app, or any other camera app, the picture rotates with the camera.) Does anyone know how to achieve this?
Mirroring the preview layer (I tried transforming the layer, and later the view, with the same result):
@IBAction func mirrorCamera(_ sender: AnyObject) {
    cameraMirrored = !cameraMirrored
    if cameraMirrored {
        // TRANSFORMING VIEW
        self.videoPreviewView.transform = CGAffineTransform(scaleX: -1, y: 1)
        // OR LAYER
        self.previewLayer.transform = CATransform3DMakeScale(-1, 1, 1)
    } else {
        self.videoPreviewView.transform = CGAffineTransform(scaleX: 1, y: 1)
        self.previewLayer.transform = CATransform3DMakeScale(1, 1, 1)  // was mistakenly assigned to the view
    }
}
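Why does the mirrored preview rotate "the other way"? It falls out of the matrix algebra: composing a horizontal flip with a rotation negates the rotation's angle, i.e. flip · rotate(θ) = rotate(−θ) · flip. A plain-Swift sketch with 2×2 matrices (no CoreGraphics, all names mine) demonstrates this:

```swift
import Foundation

// A 2x2 matrix (a b; c d), column-vector convention: M * v.
typealias Mat2 = (a: Double, b: Double, c: Double, d: Double)

// Matrix product m * n.
func mul(_ m: Mat2, _ n: Mat2) -> Mat2 {
    return (a: m.a * n.a + m.b * n.c, b: m.a * n.b + m.b * n.d,
            c: m.c * n.a + m.d * n.c, d: m.c * n.b + m.d * n.d)
}

// Counter-clockwise rotation by theta radians.
func rotation(_ theta: Double) -> Mat2 {
    return (a: cos(theta), b: -sin(theta), c: sin(theta), d: cos(theta))
}

// Same mirror as CGAffineTransform(scaleX: -1, y: 1).
let flip: Mat2 = (a: -1, b: 0, c: 0, d: 1)

// flip * rotation(θ) == rotation(-θ) * flip:
// once mirrored, every later rotation runs backwards.
```

A workaround that avoids transforms altogether, assuming the connection supports it (check `isVideoMirroringSupported`), is to mirror the pixels at the connection level: set `previewLayer.connection.automaticallyAdjustsVideoMirroring = false` and `videoMirrored = true`, so the system's rotation handling still sees an untransformed layer.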
Currently I have a view controller that modally presents another view controller containing the camera. However, whenever I transition, the preview layer animates: it grows from the top-left corner to fill the rest of the screen. I tried disabling CALayer implicit animations, but without success. Here is the code for when the view appears.
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    previewLayer?.frame = self.view.frame
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    capturedImageView.center = self.view.center
    captureSession = AVCaptureSession()
    if usingFrontCamera == true {
        captureSession?.sessionPreset = AVCaptureSession.Preset.hd1920x1080
    } else {
        captureSession?.sessionPreset = AVCaptureSession.Preset.hd1280x720
    }
    captureDevice = AVCaptureDevice.default(for: AVMediaType.video)
    do {
        let input = try AVCaptureDeviceInput(device: captureDevice!)
        if captureSession?.canAddInput(input) == true {  // `!= nil` was always true for an optional Bool
            captureSession?.addInput(input)
            stillImageOutput = AVCapturePhotoOutput()
            captureSession?.addOutput(stillImageOutput!)
            previewLayer = AVCaptureVideoPreviewLayer(session: captureSession!)
            previewLayer?.videoGravity = AVLayerVideoGravity.resizeAspect
            self.view.layer.addSublayer(previewLayer!)
            captureSession?.startRunning()
        }
    } catch {
    } …

The following code works perfectly on an iPhone, switching back and forth between the back and front cameras. However, when running on an iPad, the canAddInput: method always returns NO when the front camera is selected (the back camera works fine). Any ideas?
- (void)addVideoInput:(BOOL)isFront {
    AVCaptureDevice *videoDevice;
    //NSLog(@"Adding Video input - front: %i", isFront);
    [self.captureSession removeInput:self.currentInput];
    if (isFront == YES) {
        self.isFrontCam = YES;
        videoDevice = [self frontFacingCameraIfAvailable];
    } else {
        self.isFrontCam = NO;
        videoDevice = [self backCamera];
    }
    if (videoDevice) {
        NSError *error;
        AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (!error) {
            // Everything's fine up to here.
            // the next line always resolves to NO and thus the
            // Video input isn't added.
            if ([[self …

I'm learning AVCaptureSession and how to capture multiple images using its delegate method:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection
My goal is to capture one or more images per second at a predefined rate, for example 1 or 2 images every second. So I set:
AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
captureOutput.alwaysDiscardsLateVideoFrames = YES;
captureOutput.minFrameDuration = CMTimeMake(1, 1);
When [self.captureSession startRunning]; starts, my log file shows the delegate being called 20 times. Where does this come from, and how can I capture images at my expected interval?
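Those 20 calls are simply the camera's default frame rate; `minFrameDuration` is only a hint on older SDKs (and was later deprecated in favor of `activeVideoMinFrameDuration` on AVCaptureDevice, set inside `lockForConfiguration:`). A robust alternative is to keep the delegate as-is and throttle by each sample buffer's presentation timestamp (from `CMSampleBufferGetPresentationTimeStamp`). A plain-Swift sketch of that throttle, with Doubles standing in for CMTime and all names mine:

```swift
// Decides, per frame, whether enough time has elapsed since the last
// frame we kept. Feed it each buffer's presentation timestamp in
// seconds; plain Doubles stand in for CMTime here.
struct FrameThrottle {
    let interval: Double            // seconds between kept frames, e.g. 1.0
    private var lastKept: Double? = nil

    init(interval: Double) { self.interval = interval }

    // Returns true when the frame at `timestamp` should be processed.
    mutating func shouldKeep(at timestamp: Double) -> Bool {
        if let last = lastKept, timestamp - last < interval { return false }
        lastKept = timestamp
        return true
    }
}
```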
I use the following code to set up an AVCaptureSession, record a video file, and play it back. Sometimes this works fine, and sometimes I get a black screen on playback. As far as I can tell, it's completely random.
When the error occurs, if I try to open the file in QuickTime, I get a "Cannot open file, format not recognized" message. This leads me to believe it's a recording problem rather than a playback problem.
Also, if I comment out the part of the code that adds the microphone input, the error never happens (but of course my video file has no audio track), so maybe the audio source randomly corrupts the file?
- (void)viewDidLoad {
    [super viewDidLoad];
    ....
    captureSession = [[AVCaptureSession alloc] init];
    [captureSession setSessionPreset:AVCaptureSessionPresetHigh];
    NSArray *devices = [AVCaptureDevice devices];
    AVCaptureDevice *frontCamera;
    AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    for (AVCaptureDevice *device in devices) {
        NSLog(@"Device name: %@", [device localizedName]);
        if ([device hasMediaType:AVMediaTypeVideo]) {
            if ([device position] == AVCaptureDevicePositionFront) {
                NSLog(@"Device position : front");
                frontCamera = device;
            }
        }
    }
    NSError *error = nil;
    AVCaptureDeviceInput *microphone_input = [AVCaptureDeviceInput deviceInputWithDevice:mic error:&error];
    AVCaptureDeviceInput *frontFacingCameraDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
    if (!error) {
        if …

In my application, I use AVFoundation to capture images.
I made a button to switch between the front and back cameras, but it doesn't work.
Here is the code I use:
if (captureDevice.position == AVCaptureDevicePositionFront) {
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == AVCaptureDevicePositionBack) {
            NSError *error;
            AVCaptureDeviceInput *newDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
            [captureSesion beginConfiguration];
            for (AVCaptureDeviceInput *oldInput in [captureSesion inputs]) {
                [captureSesion removeInput:oldInput];
            }
            if ([captureSesion canAddInput:newDeviceInput]) {
                [captureSesion addInput:newDeviceInput];
            }
            [captureSesion commitConfiguration];
            break;
        }
    }
}
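Note that the snippet above only handles the front-to-back direction; when the back camera is active, the outer if never fires, so nothing happens on tap. The selection logic reduces to "find the first device on the opposite side," sketched here in plain Swift with a stub type standing in for AVCaptureDevice (all names mine):

```swift
// Stand-ins for AVCaptureDevicePosition and AVCaptureDevice.
enum Position { case front, back }
struct Device { let name: String; let position: Position }

// Pick the device to switch to: the first one on the opposite side,
// or nil if the hardware doesn't have one.
func oppositeDevice(of current: Position, in devices: [Device]) -> Device? {
    let target: Position = (current == .front) ? .back : .front
    return devices.first { $0.position == target }
}
```

In the real code, both branches would then do the same remove-old-inputs / canAddInput / addInput dance between beginConfiguration and commitConfiguration.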
Thanks.
I'm recording video with AVCaptureSession, but I can't set a maximum video length. With a UIImagePickerController there is a videoMaximumDuration property for setting the maximum duration, but how do I set a maximum duration with AVCaptureSession? Please help me. Thank you.
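The duration limit doesn't live on AVCaptureSession but on the file output: AVCaptureMovieFileOutput inherits `maxRecordedDuration` (a CMTime) from AVCaptureFileOutput, and recording stops automatically when the limit is reached (the delegate receives an error with code AVErrorMaximumDurationReached). A CMTime is just a value/timescale pair; a plain-Swift stand-in (type name mine) shows the arithmetic:

```swift
// Minimal stand-in for CMTime: duration = value / timescale seconds.
// The real call would be something like:
//   movieFileOutput.maxRecordedDuration = CMTimeMake(value: 30, timescale: 1)
struct Time {
    let value: Int64       // number of ticks
    let timescale: Int32   // ticks per second
    var seconds: Double { Double(value) / Double(timescale) }
}

// A 30-second cap expressed two equivalent ways.
let capCoarse = Time(value: 30, timescale: 1)
let capFine = Time(value: 18000, timescale: 600)   // 600 is a common video timescale
```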
xcode avfoundation ios avcapturesession avcapturemoviefileoutput