Following http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html, I am able to capture video frames from the camera using AVCaptureSession. However, AVCaptureSession seems to capture frames from the camera without displaying the camera stream on screen. I would like to display the camera stream the way UIImagePicker does, so that the user knows the camera is on and can see where it is pointing. Any help or pointers would be appreciated!
Hey, I am trying to access raw data from the iPhone camera using AVCaptureSession. I followed the guide provided by Apple (link here).
The raw data from the sample buffer is in YUV format (am I correct here about the raw video frame format?). How can I directly obtain the data for the Y component from the raw data stored in the sample buffer?
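For reference, a minimal sketch of one common approach: if the AVCaptureVideoDataOutput is configured for the bi-planar 4:2:0 pixel format (kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange via the output's videoSettings), then plane 0 of the pixel buffer is the Y (luma) plane and can be read directly. This is an illustrative sketch, not the asker's code.

```objectivec
// Sketch: reading the Y (luma) plane from a captured sample buffer,
// assuming a bi-planar 4:2:0 pixel format. Plane 0 is luma, plane 1 is chroma.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    uint8_t *yPlane = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    size_t width       = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
    size_t height      = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);

    // Walk the luma plane row by row; bytesPerRow may exceed width due to padding.
    for (size_t row = 0; row < height; row++) {
        uint8_t *rowStart = yPlane + row * bytesPerRow;
        // rowStart[0 .. width-1] are the Y values for this row.
        (void)rowStart;
    }

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}
```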
I am trying to get the camera input to display on a preview layer view.
self.cameraPreviewView is wired to a UIView in IB.
Here is the code I currently have, based on the AV Foundation Programming Guide, but the preview never shows.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (!input) {
    NSLog(@"Couldn't create video capture device: %@", error);
    return;
}
[session addInput:input];

// Create the video preview layer and add it to the UI
AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
UIView *view = self.cameraPreviewView;
CALayer *viewLayer = [view layer];
newCaptureVideoPreviewLayer.frame = view.bounds;
[viewLayer addSublayer:newCaptureVideoPreviewLayer];
self.cameraPreviewLayer = newCaptureVideoPreviewLayer;

[session startRunning];
I need to implement the ability to repeatedly pause and resume video capture within a single session, but have each new segment (captured after each pause) added to the same video file, using AVFoundation. Currently, every time I press "Stop" and then "Record" again, it just saves a new video file to my iPhone's Documents directory and starts capturing to a new file. I need to be able to press the "Record/Stop" button and capture video and audio only while recording is active... then, when the "Done" button is pressed, end up with a single AV file containing all the segments. And all of this needs to happen within the same capture session / preview session.
I am not using AVAssetWriterInput.
The only way I can think of to attempt this is, when the "Done" button is pressed, to take each individual output file and combine them into a single file.
This code works on iOS 5 but not on iOS 6. Actually, on iOS 6, the first time I pause recording (stop recording) the AVCaptureFileOutputRecordingDelegate method (captureOutput: didFinishRecordingToOutputFileAtURL: fromConnections: error:) is called, but after that, when I start recording again, the delegate method is called once more on start, yet it is not called when I stop recording.
I need a solution to this problem. Please help me.
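For the "combine on Done" idea, a minimal sketch using AVMutableComposition and AVAssetExportSession is below. It assumes the per-segment file URLs are collected in order in an array such as the `arrOutputUrl` from the code that follows; `finalURL` is an assumed destination in the Documents directory.

```objectivec
// Sketch: stitch the per-segment movie files into one file when "Done" is tapped.
AVMutableComposition *composition = [AVMutableComposition composition];
CMTime insertTime = kCMTimeZero;
for (NSURL *url in arrOutputUrl) {
    AVAsset *asset = [AVAsset assetWithURL:url];
    NSError *error = nil;
    // Append the whole segment at the current end of the composition.
    [composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                         ofAsset:asset
                          atTime:insertTime
                           error:&error];
    if (error) { NSLog(@"Insert failed: %@", error); }
    insertTime = CMTimeAdd(insertTime, asset.duration);
}

AVAssetExportSession *export =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetHighestQuality];
export.outputURL = finalURL;                       // assumed destination URL
export.outputFileType = AVFileTypeQuickTimeMovie;
[export exportAsynchronouslyWithCompletionHandler:^{
    if (export.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Merged file written");
    }
}];
```

Note that this re-encodes on export; it does not address the iOS 6 delegate-callback issue itself.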
//View LifeCycle
- (void)viewDidLoad
{
    [super viewDidLoad];
    self.finalRecordedVideoName = [self stringWithNewUUID];
    arrVideoName = [[NSMutableArray alloc] initWithCapacity:0];
    arrOutputUrl = [[NSMutableArray alloc] initWithCapacity:0];
    CaptureSession = [[AVCaptureSession alloc] init];
    captureDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    if ([captureDevices count] > 0)
    {
        NSError *error;
        VideoInputDevice = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:&error];
        if (!error)
        {
            if ([CaptureSession canAddInput:VideoInputDevice])
                [CaptureSession addInput:VideoInputDevice];
            else
                NSLog(@"Couldn't add video …

I am interested in recording media with AVCaptureSession in iOS while simultaneously playing media with AVPlayer (specifically, I am playing audio and recording video, but I am not sure whether that matters).
The problem is that when I play the resulting media back together later, they are out of sync. Is it possible to synchronize them, either by ensuring that playback and recording start at the same time, or by discovering the offset between them? I probably need the synchronization to be accurate to about 10 ms. It is unreasonable to assume I can always capture the played audio (since the user may be using headphones), so synchronizing by analyzing the original and recorded audio is not an option.
This question suggests it is possible to end playback and recording simultaneously and determine the initial offset from the final one, but I am not clear on how to make them end at the same time. I have two cases: 1) the audio playback runs out, and 2) the user hits the "stop recording" button.
This question suggests starting them and then applying a fixed, though possibly device-dependent, delay, which is obviously a hack, but if that is good enough for audio it is clearly worth considering for video.
Is there another media layer I can use to perform the desired synchronization?
Related: this question has no answers.
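One possible angle, sketched below: AVCaptureSession timestamps its sample buffers against the host clock, and AVPlayer can be told to start at an explicit host time via setRate:time:atHostTime:, so both sides can be related to a common clock. This is an assumption-laden sketch, not a verified 10 ms solution; it presumes the player item is ready to play (and, on newer iOS versions, that automatic stall-avoidance is disabled), and the clock used for capture timestamps can vary by device and configuration.

```objectivec
// Sketch: anchor playback to the host clock, then relate capture timestamps
// to the same clock. `player` and `sampleBuffer` are assumed to exist.
CMTime hostNow = CMClockGetTime(CMClockGetHostTimeClock());
CMTime startHostTime = CMTimeAdd(hostNow, CMTimeMakeWithSeconds(0.5, 1000000000));
[player setRate:1.0 time:kCMTimeZero atHostTime:startHostTime];

// In the capture callback, each buffer's presentation timestamp is stamped
// against the host clock, so an offset into playback can be computed directly:
CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
CMTime offsetIntoPlayback = CMTimeSubtract(pts, startHostTime);
```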
I want to create a custom keyboard that works as a barcode scanner. I have finished all the coding, but the output is not as expected: I am asked for camera permission (the first time), but the camera does not deliver video to the view.
I suspect that, perhaps for security reasons, there are some restrictions on what a keyboard extension may use?!?
1.) Turning on the flashlight
- (void)turnFlashOn
{
    AVCaptureDevice *flashLight = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([flashLight isTorchAvailable] && [flashLight isTorchModeSupported:AVCaptureTorchModeOn])
    {
        BOOL success = [flashLight lockForConfiguration:nil];
        if (success) {
            NSError *error;
            [flashLight setTorchMode:AVCaptureTorchModeOn];
            [flashLight setTorchModeOnWithLevel:1.0 error:&error];
            NSLog(@"Error: %@", error);
            [flashLight unlockForConfiguration];
            NSLog(@"flash turned on -> OK");
        }
        else
        {
            NSLog(@"flash turn on -> ERROR");
        }
    }
}
This gives me the following log output, but nothing happens with the flash:
Error: (null)
flash turned on -> OK
2.) Scanning a barcode (part of viewDidLoad)
// SCANNER PART
self.captureSession = [[AVCaptureSession alloc] init];
AVCaptureDevice *videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *videoInput = …

After heavy use of my application, a running AVCaptureSession instance starts suffering from
DroppedFrameReason(P) = OutOfBuffers
Here are the details from the sample buffer object delivered to - (void)captureOutput:(AVCaptureOutput *)captureOutput didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection:
CMSampleBuffer 0x10de70770 retainCount: 1 allocator: 0x1b45e2bb8
invalid = NO
dataReady = YES
makeDataReadyCallback = 0x0
makeDataReadyRefcon = 0x0
buffer-level attachments:
DroppedFrameReason(P) = OutOfBuffers
formatDescription = <CMVideoFormatDescription 0x174441e90 [0x1b45e2bb8]> {
mediaType:'vide'
mediaSubType:'BGRA'
mediaSpecific: {
codecType: 'BGRA' dimensions: 480 x 360
}
extensions: {<CFBasicHash 0x174a61100 [0x1b45e2bb8]>{type = immutable dict, count = 5,
entries =>
0 : <CFString 0x1ae9fa7c8 [0x1b45e2bb8]>{contents = "CVImageBufferYCbCrMatrix"} = <CFString 0x1ae9fa808 [0x1b45e2bb8]>{contents = "ITU_R_601_4"}
1 : <CFString 0x1ae9fa928 …

After reading this question: AVAudioSession AVAudioSessionCategoryPlayAndRecord glitch, I tried to get video recording working while background music keeps playing normally. I have worked around the audio glitch when recording starts, and the first recording ends fine, but if I try to record again, the music stops.
Any ideas?
Here is a snippet of my code:
captureSession = AVCaptureSession()
captureSession?.automaticallyConfiguresApplicationAudioSession = false
captureSession?.usesApplicationAudioSession = true
guard let captureSession = self.captureSession else {
print("Error making capture session")
return;
}
captureSession.sessionPreset = AVCaptureSessionPresetHigh
self.camera = self.defaultBackCamera()
self.audioDeviceInput = try? AVCaptureDeviceInput(device: getAudioDevice())
cameraInput = try AVCaptureDeviceInput(device: camera)
captureSession.beginConfiguration()
if captureSession.inputs.count > 0 {
return
}
if captureSession.canAddInput(cameraInput) {
captureSession.addInput(cameraInput)
if captureSession.outputs.count == 0 {
photoOutput = AVCapturePhotoOutput()
if captureSession.canAddOutput(photoOutput!) {
captureSession.addOutput(self.photoOutput!)
}
}
captureSession.commitConfiguration()
if !captureSession.isRunning {
captureSession.startRunning()
self.previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
self.previewLayer!.videoGravity …

I use the standard AVFoundation classes to capture video and display a preview (http://developer.apple.com/library/ios/#qa/qa1702/_index.html).
Here is my code:
- (void)setupCaptureSession {
    NSError *error = nil;
    [self setCaptureSession:[[AVCaptureSession alloc] init]];
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;
    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [device lockForConfiguration:&error]) {
        [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [device unlockForConfiguration];
    }
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                       error:&error];
    if (!input) {
        // TODO: Handle the error when the input cannot be created
    }
    [[self captureSession] addInput:input];
    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [[self captureSession] addOutput:output];
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
    output.videoSettings =
        [NSDictionary …

I am cloning Apple's Camera app using AVCaptureSession, based on Apple's AppCam sample application. The problem is that I cannot see the focus rectangle in the video preview screen. I use the following code to set the focus, but the focus rectangle is still not shown.
AVCaptureDevice *device = [[self videoInput] device];
if ([device isFocusModeSupported:focusMode] && [device focusMode] != focusMode) {
    NSError *error;
    printf(" setFocusMode \n");
    if ([device lockForConfiguration:&error]) {
        [device setFocusMode:focusMode];
        [device unlockForConfiguration];
    } else {
        id delegate = [self delegate];
        if ([delegate respondsToSelector:@selector(acquiringDeviceLockFailedWithError:)]) {
            [delegate acquiringDeviceLockFailedWithError:error];
        }
    }
}
When I use UIImagePickerController, autofocus and tap-to-focus are supported by default, and the focus rectangle is visible. Is it not possible to show a focus rectangle in the video preview layer when using AVCaptureSession?
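For reference, AVCaptureVideoPreviewLayer does not draw a focus rectangle for you; with AVCaptureSession you set the focus point yourself and draw your own indicator (e.g. an animated CALayer border). A hedged sketch follows; property names such as `cameraPreviewLayer` and `device` are assumed from the snippets above.

```objectivec
// Sketch: tap-to-focus with AVCaptureSession. Converts the tap location from
// layer coordinates to the device's point-of-interest space, then sets focus.
- (void)handleTap:(UITapGestureRecognizer *)tap
{
    CGPoint layerPoint = [tap locationInView:tap.view];
    CGPoint devicePoint =
        [self.cameraPreviewLayer captureDevicePointOfInterestForPoint:layerPoint];

    NSError *error = nil;
    if ([self.device lockForConfiguration:&error]) {
        if ([self.device isFocusPointOfInterestSupported]) {
            self.device.focusPointOfInterest = devicePoint;
        }
        if ([self.device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
            self.device.focusMode = AVCaptureFocusModeAutoFocus;
        }
        [self.device unlockForConfiguration];
    }
    // Draw your own focus rectangle at `layerPoint` here.
}
```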