So, I'm capturing a video session with AVCaptureSession, following Apple's instructions here: http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html. One problem I'm facing is that even though the orientation of the camera / iPhone device is vertical (and the AVCaptureVideoPreviewLayer shows a vertical camera stream), the output image seems to be in landscape mode. I checked the width and height of the imageBuffer inside imageFromSampleBuffer: from the sample code, and I got 640px and 480px respectively. Does anyone know why this is?
Thanks!
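For readers hitting the same thing: the sample buffers from AVCaptureVideoDataOutput are delivered in the sensor's native landscape orientation no matter how the device is held, which is why the buffer reports 640×480. One commonly suggested fix, sketched here untested in Swift 3-era API with `videoOutput` as a placeholder for the output from the linked Q&A, is to set the orientation on the output's connection:

```swift
import AVFoundation

// Untested sketch: ask the video connection to rotate frames to portrait.
// `videoOutput` is a placeholder for the AVCaptureVideoDataOutput instance.
if let connection = videoOutput.connection(withMediaType: AVMediaTypeVideo),
   connection.isVideoOrientationSupported {
    connection.videoOrientation = .portrait
}
```

Note that rotating buffers this way has a performance cost for video data outputs; many apps instead leave the buffers in landscape and apply the orientation when rendering or saving.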
The only way I know of to turn the flash on and keep it on, on an iPhone 4, is by turning the video camera on. I'm not too sure about the code, though. Here is what I'm trying:
- (IBAction)turnTorchOn {
    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
    AVCaptureDevice *videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:&error];
    if (videoInput) {
        [captureSession addInput:videoInput];

        AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        [videoOutput setSampleBufferDelegate:self queue:dispatch_get_current_queue()];
        [captureSession addOutput:videoOutput];

        [captureSession startRunning];
        videoCaptureDevice.torchMode = AVCaptureTorchModeOn;
    }
}
Does anyone know whether this will work, or am I missing something? (I don't have an iPhone 4 yet to test on; I'm just trying out some of the new APIs.)
Thanks
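In case it helps later readers: the torch can typically be toggled without running a capture session at all, by locking the device for configuration first. A minimal, untested Swift sketch using current AVFoundation names (it needs a real torch-equipped device):

```swift
import AVFoundation

// Untested sketch: toggle the torch directly, no AVCaptureSession required.
func setTorch(on: Bool) {
    guard let device = AVCaptureDevice.default(for: .video), device.hasTorch else { return }
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    } catch {
        print("Could not lock device for configuration: \(error)")
    }
}
```

The lock/unlock pair is the important part; setting torchMode without holding the configuration lock is undefined.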
Before iOS 10 came out, I used the following code to get the video and audio capture devices for a video recorder:
for device in AVCaptureDevice.devices()
{
    if (device as AnyObject).hasMediaType( AVMediaTypeAudio )
    {
        self.audioCapture = device as? AVCaptureDevice
    }
    else if (device as AnyObject).hasMediaType( AVMediaTypeVideo )
    {
        if (device as AnyObject).position == AVCaptureDevicePosition.back
        {
            self.backCameraVideoCapture = device as? AVCaptureDevice
        }
        else
        {
            self.frontCameraVideoCapture = device as? AVCaptureDevice
        }
    }
}
When iOS 10 finally came out, I received the following warning when running my code. Note that my video recorder still worked fine for about two weeks:

'devices()' was deprecated in iOS 10.0: Use AVCaptureDeviceDiscoverySession instead.

When I ran my code this morning, my video recorder stopped working. Xcode 8 doesn't give me any errors, but the previewLayer for the camera capture is completely white. When I then start recording, I receive the following error:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x17554440 {Error Domain=NSOSStatusErrorDomain Code=-12780 "(null)"}, NSLocalizedFailureReason=An unknown error occurred (-12780)}

I believe this is related to my use of the deprecated method AVCaptureDevice.devices(). So I would like to know how to use AVCaptureDeviceDiscoverySession instead?
Thank you in advance for your help!
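For reference, the deprecated loop above maps fairly directly onto the discovery-session API. An untested Swift 3 / iOS 10-era sketch; the `self.` property names are placeholders taken from the question:

```swift
import AVFoundation

// Untested sketch: enumerate cameras and microphones with
// AVCaptureDeviceDiscoverySession instead of AVCaptureDevice.devices().
// Passing nil for mediaType returns devices of all listed device types.
let discovery = AVCaptureDeviceDiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .builtInMicrophone],
    mediaType: nil,
    position: .unspecified)

for device in discovery?.devices ?? [] {
    if device.hasMediaType(AVMediaTypeAudio) {
        // self.audioCapture = device
    } else if device.position == .back {
        // self.backCameraVideoCapture = device
    } else {
        // self.frontCameraVideoCapture = device
    }
}
```

Unlike devices(), a discovery session only returns devices matching the device types you list, so the types need to cover everything the old loop relied on.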
My app is set up to record video from the camera using AVCaptureSession, but there is no audio. What do I need to do to record audio as well and add it to the videoOutput for the file? Here is my code for recording the video:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session beginConfiguration];
session.sessionPreset = AVCaptureSessionPresetMedium;

CALayer *viewLayer = self.vImagePreview.layer;
NSLog(@"viewLayer = %@", viewLayer);

AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

AVCaptureDevice *device = [self frontFacingCameraIfAvailable];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    // Handle the error appropriately.
    NSLog(@"ERROR: trying to open camera: %@", error);
}

NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectoryPath = [paths objectAtIndex:0];

AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput …
objective-c audio-recording ios avcapturesession avcapturedevice
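The piece usually missing from a video-only session like this is a second device input for the microphone. A hedged, untested Swift sketch (modern API names; `session` stands in for the question's capture session):

```swift
import AVFoundation

// Untested sketch: add a microphone input alongside the camera input so
// AVCaptureMovieFileOutput writes an audio track as well as video.
let session = AVCaptureSession()

if let mic = AVCaptureDevice.default(for: .audio),
   let micInput = try? AVCaptureDeviceInput(device: mic),
   session.canAddInput(micInput) {
    session.addInput(micInput)
}
```

With both inputs present, AVCaptureMovieFileOutput muxes the audio and video tracks into the same file automatically; no separate merging step is needed.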
I can't figure this one out. Everything works fine while the app is active, but sometimes when I move the app to the background (by pressing the home button) and then come back, the preview layer freezes / gets stuck. I use viewWillAppear and viewDidAppear for the setup. Here is how I set everything up:
var backCamera = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo)
var global_device : AVCaptureDevice!
var captureSession: AVCaptureSession?

override func viewWillAppear(animated: Bool) {
    super.viewWillAppear(animated)

    captureSession = AVCaptureSession()
    captureSession!.sessionPreset = AVCaptureSessionPresetPhoto
    CorrectPosition = AVCaptureDevicePosition.Back
    for device in backCamera {
        if device.position == AVCaptureDevicePosition.Back {
            global_device = device as! AVCaptureDevice
            CorrectPosition = AVCaptureDevicePosition.Back
            break
        }
    }
    configureCamera()

    var error: NSError?
    var input = AVCaptureDeviceInput(device: global_device, error: &error)
    if error == nil && captureSession!.canAddInput(input) {
        captureSession!.addInput(input)
        stillImageOutput = AVCaptureStillImageOutput()
        stillImageOutput!.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
        if captureSession!.canAddOutput(stillImageOutput) {
            captureSession!.addOutput(stillImageOutput) …
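A common remedy for a frozen preview after backgrounding (not from the question itself) is to stop the session when the app leaves the foreground and start it again on return, rather than relying only on view-appearance callbacks. A hedged, untested Swift sketch using current notification names, written as if inside the view controller that owns `captureSession`:

```swift
import AVFoundation
import UIKit

// Untested sketch: register once (e.g. in viewDidLoad) and bounce the
// session around backgrounding. `captureSession` is the question's property.
NotificationCenter.default.addObserver(forName: UIApplication.willEnterForegroundNotification,
                                       object: nil, queue: .main) { [weak self] _ in
    self?.captureSession?.startRunning()
}
NotificationCenter.default.addObserver(forName: UIApplication.didEnterBackgroundNotification,
                                       object: nil, queue: .main) { [weak self] _ in
    self?.captureSession?.stopRunning()
}
```

Recreating the session in viewWillAppear on every appearance, as the question's code does, can also race against the teardown of the previous session; reusing one session tends to be more robust.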
I'm coding for the iPhone 5s, which now has two LED lights for the rear camera. I don't know the official LED color names, but one LED is white and the other LED is yellow. Apple calls this "True Tone".
I'm trying to access these camera LED lights individually.
I believe this is possible, because when I open the iOS 7 Control Center (swipe up from the bottom) and press the built-in flashlight, only the white LED lights up.
This is different from applying the torch-light code below: when I execute the code below, both LEDs light up.
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if ([device hasTorch] && [device isTorchAvailable] && [device isTorchModeSupported:AVCaptureTorchModeOn]) {
    [device lockForConfiguration:nil];
    [device setTorchMode: onOff ? AVCaptureTorchModeOn : AVCaptureTorchModeOff];
    [device unlockForConfiguration];
}
I have been searching the AVCaptureDevice class reference and I don't see anything specific about accessing the two LED lights individually. I mean turning on only the white LED, or only the yellow LED, not both LEDs at the same time.
I'm assuming both lights are LEDs.
Any ideas on how to do this?
Many thanks for any information on this.
Whenever I start an AVCaptureSession running with the microphone as an input, it cancels whatever background music is currently playing (iPod music, for example). If I comment out the line that adds the audio input, the background audio continues.
Does anyone know a way to record video clips with the microphone while continuing to allow background audio to play? There is also an error when you try to record video while music is currently playing.
I tried to do this:
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayback error: nil];
UInt32 doSetProperty = 1;
AudioSessionSetProperty (kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(doSetProperty), &doSetProperty);
[[AVAudioSession sharedInstance] setActive: YES error: nil];
But I got: 'AudioSessionSetProperty' is deprecated: first deprecated in iOS 7.0.
So I tried this instead:
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
NSError *setCategoryError = nil;
[audioSession setCategory:AVAudioSessionCategoryPlayback
              withOptions:AVAudioSessionCategoryOptionMixWithOthers
                    error:&setCategoryError];
[audioSession setActive:YES error:nil];
But in the end it didn't work. Any help is appreciated!
objective-c ios avcapturesession avcapturedevice avaudiosession
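One detail worth knowing here: AVCaptureSession reconfigures the app's audio session on its own by default, which can silence background music even when a mixable category was set beforehand. Opting out of that automatic configuration (available since iOS 7) and then setting a mixable record-capable category is a commonly used combination. An untested Swift 3-era sketch:

```swift
import AVFoundation

// Untested sketch: stop AVCaptureSession from overriding the app's
// audio session, then mix with other audio while recording.
let session = AVCaptureSession()
session.automaticallyConfiguresApplicationAudioSession = false

let audioSession = AVAudioSession.sharedInstance()
try? audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord,
                              with: .mixWithOthers)
try? audioSession.setActive(true)
```

Note the category is PlayAndRecord rather than Playback, since the microphone input needs recording permission from the audio session.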
I have the microphone set up on an AVCaptureSession, and I need a microphone on/off switch. How should I go about this? Do I really need captureSession?.removeInput(microphone), or is there an easier way?
let microphone = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio)
do {
    let micInput = try AVCaptureDeviceInput(device: microphone)
    if captureSession.canAddInput(micInput) {
        captureSession.addInput(micInput)
    }
} catch {
    print("Error setting device audio input: \(error)")
    return false
}
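One lighter-weight alternative to removing the input (offered here as an untested suggestion, not from the question) is to disable the audio connection on the output that consumes the microphone. A Swift 3-era sketch, with `movieOutput` as a placeholder for whatever output the session uses:

```swift
import AVFoundation

// Untested sketch: mute/unmute by toggling the audio connection instead
// of tearing the input out of the session.
func setMicrophone(enabled: Bool, on movieOutput: AVCaptureMovieFileOutput) {
    if let audioConnection = movieOutput.connection(withMediaType: AVMediaTypeAudio) {
        audioConnection.isEnabled = enabled
    }
}
```

Toggling a connection avoids the session reconfiguration (and the brief glitch) that removeInput/addInput can cause mid-recording.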
On Apple's iOS 6.0 feature page, it used to say:
Take advantage of the advanced features of the built-in cameras. New APIs let you control focus, exposure, and region of interest. You can also access and display faces with face detection APIs, and leverage hardware-enabled video stabilization.

This text has since been removed, and I cannot find new methods in the API for controlling exposure. In the AVCaptureDevice class, under "Exposure Settings", there are no new properties/methods for iOS 6.0. Do you know where I can find these new features of the API?
I want to implement a custom camera in my app, so I'm creating this camera with AVCaptureDevice.
Now I want to show only grayscale output in my custom camera, so I'm trying to do it using setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains: and AVCaptureWhiteBalanceGains. I'm working from AVCamManual: Extending AVCam to Use Manual Capture.
- (void)setWhiteBalanceGains:(AVCaptureWhiteBalanceGains)gains
{
    NSError *error = nil;
    if ( [videoDevice lockForConfiguration:&error] ) {
        AVCaptureWhiteBalanceGains normalizedGains = [self normalizedGains:gains]; // Conversion can yield out-of-bound values, cap to limits
        [videoDevice setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:normalizedGains completionHandler:nil];
        [videoDevice unlockForConfiguration];
    }
    else {
        NSLog( @"Could not lock device for configuration: %@", error );
    }
}
But for this, I have to pass RGB gain values between 1 and 4. So I created this method to check the MAX and MIN values:
- (AVCaptureWhiteBalanceGains)normalizedGains:(AVCaptureWhiteBalanceGains)gains
{
    AVCaptureWhiteBalanceGains g = gains;
    g.redGain = MAX( 1.0, g.redGain );
    g.greenGain = MAX( 1.0, g.greenGain ); …
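For reference, the clamping that normalizedGains: performs can be expressed as pure logic. A small Swift sketch using a hypothetical stand-in struct mirroring AVCaptureWhiteBalanceGains (on a real device, the upper bound would come from videoDevice.maxWhiteBalanceGain rather than a constant):

```swift
// Stand-in for AVCaptureWhiteBalanceGains, so the clamping logic can be
// shown without AVFoundation. All names here are illustrative.
struct WhiteBalanceGains {
    var redGain: Float
    var greenGain: Float
    var blueGain: Float
}

// Clamp each channel to the valid [1, maxGain] range, matching the
// intent of the question's normalizedGains: method.
func normalized(_ gains: WhiteBalanceGains, maxGain: Float = 4.0) -> WhiteBalanceGains {
    var g = gains
    g.redGain   = min(max(1.0, g.redGain),   maxGain)
    g.greenGain = min(max(1.0, g.greenGain), maxGain)
    g.blueGain  = min(max(1.0, g.blueGain),  maxGain)
    return g
}

// Example: out-of-range inputs are pulled back into [1, 4].
let g = normalized(WhiteBalanceGains(redGain: 0.5, greenGain: 2.0, blueGain: 9.0))
// g.redGain == 1.0, g.greenGain == 2.0, g.blueGain == 4.0
```

Note the question's original code only applies the MIN side (MAX(1.0, …)); the upper cap with min(…, maxGain) is equally required, since passing a gain above maxWhiteBalanceGain raises an exception.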