My use case is as follows: I'm trying to add audio recording with an AVCaptureSession to an app. The app may be playing music/sounds, and I don't want to change anything about the app's AVAudioSession; that is, the shared audio session may have a category other than PlayAndRecord. No problem, I thought: I can set usesApplicationAudioSession to false on the AVCaptureSession, and it will use a new audio session separate from the app's. However, the documentation states that this
"may cause interruptions if your app uses its own audio session for playback."
And indeed it does. My question is: is there any way to access the AVCaptureSession's private audio session (and configure it so that it mixes with other audio sources)? As I said, for this case I don't want to change anything about the app's shared audio session configuration.
Any help or hints would be much appreciated.
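As far as I know there is no public API for reaching the capture session's private audio session. A hedged sketch of the workaround usually suggested instead (it does touch the shared session, but only to add mixing; Swift 3 era API): keep usesApplicationAudioSession enabled, set the mixWithOthers option yourself, and stop the capture session from reconfiguring the category:

import AVFoundation

func configureSharedSessionForMixing(captureSession: AVCaptureSession) throws {
    let audioSession = AVAudioSession.sharedInstance()
    // .mixWithOthers keeps other audio playing instead of interrupting it.
    try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord,
                                 with: [.mixWithOthers])
    try audioSession.setActive(true)

    // Record through the shared session we just configured...
    captureSession.usesApplicationAudioSession = true
    // ...and keep AVCaptureSession from overriding the category.
    captureSession.automaticallyConfiguresApplicationAudioSession = false
}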
I want to view a 240 fps video preview on the iPhone display.
My (simplified) code is below: it creates the session, activates the camera, and shows the live video preview on screen.
var session: AVCaptureSession?
var stillImageOutput: AVCaptureStillImageOutput?
var videoPreviewLayer: AVCaptureVideoPreviewLayer?

override func viewDidLoad() {
    super.viewDidLoad()
    session = AVCaptureSession()
    let backCamera = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)
    var error: NSError?
    var input: AVCaptureDeviceInput!
    do {
        input = try AVCaptureDeviceInput(device: backCamera)
    } catch let error1 as NSError {
        error = error1
        input = nil
        print(error!.localizedDescription)
    }
    configureCameraForHighestFrameRate(device: backCamera!)
    if error == nil && session!.canAddInput(input) {
        session!.addInput(input)
    }
    videoPreviewLayer = AVCaptureVideoPreviewLayer(session: session)
    videoPreviewLayer!.videoGravity = AVLayerVideoGravityResizeAspect
    videoPreviewLayer!.frame = CGRect(x: 0.0, …
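For reference, a sketch of the configureCameraForHighestFrameRate(device:) helper called above, following the pattern from Apple's AVCaptureDevice documentation (Swift 3 era API; an illustration, not the poster's actual implementation):

func configureCameraForHighestFrameRate(device: AVCaptureDevice) {
    var bestFormat: AVCaptureDeviceFormat?
    var bestRange: AVFrameRateRange?
    // Walk every format the device offers and remember the one whose
    // frame-rate range peaks highest (240 fps on supporting hardware).
    for case let format as AVCaptureDeviceFormat in device.formats {
        for case let range as AVFrameRateRange in format.videoSupportedFrameRateRanges {
            if bestRange == nil || range.maxFrameRate > bestRange!.maxFrameRate {
                bestFormat = format
                bestRange = range
            }
        }
    }
    if let format = bestFormat, let range = bestRange {
        do {
            try device.lockForConfiguration()
            device.activeFormat = format
            device.activeVideoMinFrameDuration = range.minFrameDuration
            device.activeVideoMaxFrameDuration = range.minFrameDuration
            device.unlockForConfiguration()
        } catch {
            print("Could not lock device for configuration: \(error)")
        }
    }
}

Note that on iOS the session preset can override activeFormat, so this helper is usually called after the device has been added as an input (the code above calls it before addInput, which may be part of the problem).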
Environment: Xcode 4, iOS 5, iPod touch 4th generation with Retina display
I'm trying to write a simple app that displays a video preview and captures photos. To display the preview, I use the following code:
// Create the session
session = [[AVCaptureSession alloc] init];
// Set preset to the highest available
session.sessionPreset = AVCaptureSessionPresetPhoto;
// Give the frame for preview
CALayer *viewLayer = self.vPreview.layer;
NSLog(@"viewLayer = %@", viewLayer);
AVCaptureVideoPreviewLayer *captureVideoPreviewLayer =
    [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
captureVideoPreviewLayer.frame = self.vPreview.bounds;
// Note: the bounds members are CGFloat, so the format specifier must be %f, not %d.
NSLog(@"Bounds: x:%f, y:%f, height:%f, width:%f",
      self.vPreview.bounds.origin.x, self.vPreview.bounds.origin.y,
      self.vPreview.bounds.size.height, self.vPreview.bounds.size.width);
[self.vPreview.layer addSublayer:captureVideoPreviewLayer];
// Get AV Device
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
// Add Input from the above device
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device …
I'm following this tutorial to get pixel data from the iPhone camera.
While I have no problem running and using this code, I need to take the camera output (which is in BGRA) and convert it to ARGB so that I can use it with an external library. How do I do this?
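One way to do the channel swap is vImagePermuteChannels_ARGB8888 from Accelerate, which reorders the four 8-bit channels in place. A minimal sketch, assuming a BGRA CVPixelBuffer straight from the capture callback (written in Swift for brevity, though the tutorial is Objective-C; the C function takes the same arguments there):

import Accelerate
import CoreVideo

func convertBGRAToARGB(_ pixelBuffer: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    var buffer = vImage_Buffer(
        data: CVPixelBufferGetBaseAddress(pixelBuffer),
        height: vImagePixelCount(CVPixelBufferGetHeight(pixelBuffer)),
        width: vImagePixelCount(CVPixelBufferGetWidth(pixelBuffer)),
        rowBytes: CVPixelBufferGetBytesPerRow(pixelBuffer))
    // Destination channel i is filled from source channel permuteMap[i]:
    // dest A,R,G,B <- source indices 3,2,1,0 of B,G,R,A.
    let permuteMap: [UInt8] = [3, 2, 1, 0]
    vImagePermuteChannels_ARGB8888(&buffer, &buffer, permuteMap, vImage_Flags(kvImageNoFlags))
    CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
}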
I can capture images from the iOS rear camera. Everything is working flawlessly, except that I want the picture to be taken according to the bounds of my UIView.
My code is as follows:
- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;
    captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    captureVideoPreviewLayer.frame = vImagePreview.bounds;
    [vImagePreview.layer addSublayer:captureVideoPreviewLayer];
    AVCaptureDevice *device = [self backFacingCameraIfAvailable];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately; adding a nil input would throw.
        NSLog(@"ERROR: trying to open camera: %@", error);
    } else {
        [session addInput:input];
    }
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = …
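The still image output captures the full sensor frame, not just the part the preview layer shows, so the photo has to be cropped afterwards. A sketch of one approach (Swift for brevity; relies on metadataOutputRectOfInterest(for:), iOS 7+, and glosses over image-orientation corner cases):

func cropToPreview(image: UIImage, previewLayer: AVCaptureVideoPreviewLayer) -> UIImage? {
    // Map the layer's visible rect into the output's normalized (0...1)
    // coordinate space, then scale it up to pixel coordinates.
    let outputRect = previewLayer.metadataOutputRectOfInterest(for: previewLayer.bounds)
    guard let cgImage = image.cgImage else { return nil }
    let width = CGFloat(cgImage.width)
    let height = CGFloat(cgImage.height)
    let cropRect = CGRect(x: outputRect.origin.x * width,
                          y: outputRect.origin.y * height,
                          width: outputRect.size.width * width,
                          height: outputRect.size.height * height)
    guard let cropped = cgImage.cropping(to: cropRect) else { return nil }
    return UIImage(cgImage: cropped, scale: image.scale,
                   orientation: image.imageOrientation)
}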
When I start recording video, I'm trying to play the "beep" sound that Apple requires. Through SO and other sources I've found that you can't play sounds while your audio input is running without some configuration.
Here is my attempt at that configuration method:
private void SetupAudio()
{
    beepSound = AssetBank.GetSystemSoundWav("video_record", "video_beep");
    AudioSession.Initialize();
    AudioSession.Interrupted += delegate
    {
        Console.WriteLine("Interrupted handler");
    };
    AudioSession.Category = AudioSessionCategory.PlayAndRecord;
    AudioSession.OverrideCategoryMixWithOthers = true;
    NSError err;
    AVAudioSession.SharedInstance().SetActive(true, out err);
}
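For comparison, a minimal native sketch of the same setup (the snippet above uses the deprecated AudioSession binding; AVAudioSession is the current equivalent, and video_beep.wav is a stand-in for whatever asset holds the tone):

import AVFoundation

var beepPlayer: AVAudioPlayer?   // keep a strong reference, or playback stops

func setupAudio() throws {
    let audioSession = AVAudioSession.sharedInstance()
    // PlayAndRecord plus MixWithOthers lets the beep play while the
    // microphone input is live.
    try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord,
                                 with: [.mixWithOthers, .defaultToSpeaker])
    try audioSession.setActive(true)

    if let url = Bundle.main.url(forResource: "video_beep", withExtension: "wav") {
        beepPlayer = try AVAudioPlayer(contentsOf: url)
        beepPlayer?.prepareToPlay()
    }
}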
Here is the code where I set up the recording session:
public void SetupVideoCaptureSession(AVCaptureDevicePosition position)
{
    // Setup devices
    foreach (var device in AVCaptureDevice.Devices)
    {
        if (device.HasMediaType(AVMediaType.Video))
        {
            if (device.Position == AVCaptureDevicePosition.Front)
            {
                frontCam = device;
            }
            else if (device.Position == AVCaptureDevicePosition.Back)
            {
                backCam = device;
            }
        }
    }
    // Create capture session
    captureSession = new AVCaptureSession();
    captureSession.BeginConfiguration();
    captureSession.SessionPreset = …
I'm developing a video streaming app in which I need to capture front camera video frames, encode them, and then transfer them to the other end. The typical pipeline is:
AVCaptureSession -> AVCaptureDeviceInput -> AVCaptureVideoDataOutput -> capture frame -> encode frame -> send frame to the other end
It works fine, and I've set kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange as the frame format.
There is also a preview layer for displaying the preview.
The problem appears when the device orientation changes: if the device moves from portrait to landscape, the frames arrive rotated 90 degrees at the other end. Since the preview layer handles orientation, I expected to automatically receive rotated buffers in the capture callback. But it looks like the preview layer only rotates what it displays, while the capture callback (and therefore the other end) still gets buffers in the sensor's native orientation.
So I'd like to know: is there any configuration that changes this, or do I need to rotate/transform the buffers myself in the capture-buffer callback?
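The preview layer indeed only rotates what it draws; AVCaptureVideoDataOutput keeps delivering buffers in the sensor's native (landscape) orientation. Two options: set videoOrientation on the output's connection whenever the interface rotates, so the session hands you already-rotated buffers, or transform the buffers yourself before encoding. A sketch of the connection route (Swift 3 era API):

// Call this from your rotation handler; videoDataOutput is the
// AVCaptureVideoDataOutput already attached to the session.
func updateOrientation(of videoDataOutput: AVCaptureVideoDataOutput,
                       to orientation: AVCaptureVideoOrientation) {
    if let connection = videoDataOutput.connection(withMediaType: AVMediaTypeVideo),
       connection.isVideoOrientationSupported {
        connection.videoOrientation = orientation
    }
}

For a streaming pipeline it is often cheaper to keep the capture orientation fixed and signal the rotation to the far end as metadata instead.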
I have an app using AVCaptureSession that worked fine on previous iOS versions, but when I tried running it on an iOS 8 device the app crashes occasionally, and the problem remains unsolved. It throws an exception at "[session addInput:input];". Please advise how to fix it; my code below fails at [session addInput:input] with this error:
Printed error description: Error Domain=AVFoundationErrorDomain Code=-11852 "Cannot use Back Camera" UserInfo=0x17c076e0 {NSLocalizedDescription=Cannot use Back Camera, AVErrorDeviceKey=, NSLocalizedFailureReason=This app is not authorized to use Back Camera.}
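Code -11852 is AVErrorApplicationIsNotAuthorizedToUseDevice: on iOS 8 the user (or a Restrictions/parental-controls profile) has denied this app access to the camera, so addInput: fails. Check the authorization status before building the session and request access when it is undetermined. A sketch (Swift for brevity, though the question is Objective-C; setupCaptureSession() stands in for your existing setup path):

switch AVCaptureDevice.authorizationStatus(forMediaType: AVMediaTypeVideo) {
case .authorized:
    setupCaptureSession()   // hypothetical: your existing session setup
case .notDetermined:
    AVCaptureDevice.requestAccess(forMediaType: AVMediaTypeVideo) { granted in
        if granted { setupCaptureSession() }
    }
case .denied, .restricted:
    // Adding the input can only fail; direct the user to
    // Settings > Privacy > Camera (or Restrictions) instead.
    break
}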
#import "CameraViewController.h"
#import "MAImagePickerControllerAdjustViewController.h"
#import "PopupViewController.h"
#import "MAImagePickerFinalViewController.h"
@implementation CameraViewController
@synthesize vImagePreview;
@synthesize vImage;
@synthesize stillImageOutput;
@synthesize lFrameCount;
@synthesize session;
@synthesize device;
@synthesize oneOff;
@synthesize captureManager = _captureManager;
@synthesize flashButton = _flashButton;
@synthesize vImage1;
@synthesize vImage2;
@synthesize vImage3;
@synthesize vImage4;
@synthesize vImage5;
@synthesize vImage6;
/////////////////////////////////////////////////////////////////////
#pragma mark - UI Actions
/////////////////////////////////////////////////////////////////////
-(IBAction) captureNow
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port …
I'm trying to capture an image and save it into a variable when I press myButton. How should I do that?
My code is as follows:
import UIKit
import AVFoundation
import MobileCoreServices

class ViewController: UIViewController {

    let captureSession = AVCaptureSession()
    var previewLayer : AVCaptureVideoPreviewLayer?
    var captureDevice : AVCaptureDevice?

    @IBOutlet var myTap: UITapGestureRecognizer!
    @IBOutlet weak var myButton: UIButton!

    @IBAction func shotPress(sender: UIButton) {
        //Save image to variable somehow
        var stillImageOutput = AVCaptureStillImageOutput()
        stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
        if captureSession.canAddOutput(stillImageOutput) {
            captureSession.addOutput(stillImageOutput)
        }
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        captureSession.sessionPreset = AVCaptureSessionPresetHigh
        let devices = AVCaptureDevice.devices()
        for device in devices {
            if (device.hasMediaType(AVMediaTypeVideo)) {
                if(device.position == …
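For the shotPress body above, a sketch of the missing capture call, using the same AVCaptureStillImageOutput API as the question (Swift 3 spelling; deprecated since iOS 10 in favor of AVCapturePhotoOutput). capturedImage is a hypothetical property you would add to the view controller:

if let connection = stillImageOutput.connection(withMediaType: AVMediaTypeVideo) {
    stillImageOutput.captureStillImageAsynchronously(from: connection) { sampleBuffer, error in
        guard error == nil, let buffer = sampleBuffer,
              let data = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
        else { return }
        // Keep the result in a property so it outlives the callback.
        self.capturedImage = UIImage(data: data)
    }
}

Note that stillImageOutput must be the same instance that was added to the session, so it should live as a property rather than a local variable inside shotPress.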