I am trying to encode audio buffers received from an AVCaptureSession using AudioConverter, and then append them to an AVAssetWriter.

I am not getting any errors (including OSStatus responses), and the CMSampleBuffers generated seem to have valid data, but the resulting file has no playable audio at all. When writing together with video, the video frames stop being appended after a couple of frames (appendSampleBuffer() returns false, but there is no AVAssetWriter.error), presumably because the asset writer is waiting for the audio to catch up. I suspect it has something to do with the way I'm setting up the priming for AAC.

The app uses RxSwift, but I've removed the RxSwift parts so that it's easier for a wider audience to understand.

Please check the comments in the code below for more... comments.
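For anyone unfamiliar with the term: AAC encoders emit a fixed number of priming frames at the start of the stream, and the usual way to express that is to trim the beginning of the first output sample buffer. The following is only a sketch of that attachment, not my actual code; the sampleBuffer parameter is hypothetical, 2112 frames is the standard AAC priming count, and 44.1 kHz is an assumed sample rate.

#import <CoreMedia/CoreMedia.h>

// Hypothetical helper: attach the encoder's priming to the first converted AAC
// buffer so that players know to skip those frames.
static void MarkAACPriming(CMSampleBufferRef sampleBuffer)
{
    CMTime primingDuration = CMTimeMake(2112, 44100);
    CFDictionaryRef trim = CMTimeCopyAsDictionary(primingDuration, kCFAllocatorDefault);
    CMSetAttachment(sampleBuffer,
                    kCMSampleBufferAttachmentKey_TrimDurationAtStart,
                    trim,
                    kCMAttachmentMode_ShouldPropagate);
    CFRelease(trim);
}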
Given a settings struct:
import Foundation
import AVFoundation
import CleanroomLogger

public struct AVSettings {

    let orientation: AVCaptureVideoOrientation = .Portrait
    let sessionPreset = AVCaptureSessionPreset1280x720
    let videoBitrate: Int = 2_000_000
    let videoExpectedFrameRate: Int = 30
    let videoMaxKeyFrameInterval: Int = 60
    let audioBitrate: Int = 32 * 1024

    /// Settings that are `0` mean variable rate.
    /// The `mSampleRate` and `mChannelsPerFrame` are overwritten at run-time
    /// to values …
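The struct is cut off above; the fields mentioned in that last comment (mSampleRate, mChannelsPerFrame) belong to an AudioStreamBasicDescription. For context, a typical AAC output description looks roughly like the following, written in C terms with assumed values rather than my exact settings.

#import <AudioToolbox/AudioToolbox.h>

// A typical output AudioStreamBasicDescription for AAC (assumed values).
static AudioStreamBasicDescription MakeAACOutputDescription(void)
{
    AudioStreamBasicDescription out = {0};
    out.mSampleRate       = 44100;                 // overwritten at run time from the capture format
    out.mChannelsPerFrame = 1;                     // likewise
    out.mFormatID         = kAudioFormatMPEG4AAC;
    out.mFramesPerPacket  = 1024;                  // AAC always encodes 1024 frames per packet
    // mBytesPerPacket, mBytesPerFrame, mBitsPerChannel and mFormatFlags are left
    // at 0, which means "variable" / "encoder decides" for a compressed format.
    return out;
}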
I have some code that creates CMBlockBuffers, then creates a CMSampleBuffer and passes it to an AVAssetWriterInput.

Are there any gotchas regarding memory management here? According to the Apple documentation, anything with "Create" in the name should be released with CFRelease. However, if I use CFRelease, my app aborts with 'malloc: *** error for object 0xblahblah: pointer being freed was not allocated'.
CMBlockBufferRef tmp_bbuf = NULL;
CMBlockBufferRef bbuf = NULL;
CMSampleBufferRef sbuf = NULL;

status = CMBlockBufferCreateWithMemoryBlock(
             kCFAllocatorDefault,
             samples,
             buflen,
             kCFAllocatorDefault,
             NULL,
             0,
             buflen,
             0,
             &tmp_bbuf);

if (status != noErr || !tmp_bbuf) {
    NSLog(@"CMBlockBufferCreateWithMemoryBlock error");
    return -1;
}

// Copy the buffer so that we get a copy of the samples in memory.
// CMBlockBufferCreateWithMemoryBlock does not actually copy the data!
//
status = CMBlockBufferCreateContiguous(
             kCFAllocatorDefault,
             tmp_bbuf,
             kCFAllocatorDefault,
             NULL,
             0,
             buflen,
             kCMBlockBufferAlwaysCopyDataFlag,
             &bbuf);
//CFRelease(tmp_bbuf); // causes the "pointer being freed was not allocated" abort described above …
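For what it's worth, this is how I read the Create Rule together with the blockAllocator parameter of CMBlockBufferCreateWithMemoryBlock; the buffer name and length below are made up for illustration and are not my real code.

#import <CoreMedia/CoreMedia.h>
#import <stdlib.h>

// How I understand the ownership rules; names and sizes are illustrative only.
static void BlockBufferOwnershipSketch(void)
{
    size_t buflen = 4096;
    void *samples = malloc(buflen);    // we own this allocation...
    CMBlockBufferRef bbuf = NULL;
    OSStatus status = CMBlockBufferCreateWithMemoryBlock(
        kCFAllocatorDefault,
        samples,
        buflen,
        kCFAllocatorDefault,           // ...but this blockAllocator takes it over:
                                       // it frees `samples` when the block buffer
                                       // is finalized. kCFAllocatorNull would mean
                                       // "never deallocate; the caller keeps ownership".
        NULL,
        0,
        buflen,
        0,
        &bbuf);
    if (status == kCMBlockBufferNoErr && bbuf) {
        // ... wrap in a CMSampleBuffer, append, etc. ...
        CFRelease(bbuf);               // releases the "Create" reference; the block
                                       // allocator then frees `samples`.
    } else {
        free(samples);                 // creation failed, so ownership never transferred
    }
}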
Tags: aac, avfoundation, core-audio, core-media, encoding, ios, iphone, objective-c, swift