Play audio from a CMSampleBuffer

Alo*_*edi 8 avaudioplayer twilio ios cmsamplebuffer swift

I have built a group video chat app for iOS. I have been looking for a way to control the volume of each participant separately. I found a way to mute and unmute a participant using isPlaybackEnabled on RemoteAudioTrack, but no way to control the volume.
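
For reference, the mute/unmute part looks roughly like this; a minimal sketch, assuming the Twilio Video SDK's RemoteParticipant and RemoteAudioTrackPublication types:

// Minimal sketch (assumed Twilio Video types): mute or unmute one participant.
// isPlaybackEnabled only toggles playback on or off; it does not give gradual volume control.
func setMuted(_ muted: Bool, for participant: RemoteParticipant) {
    for publication in participant.remoteAudioTracks {
        publication.remoteTrack?.isPlaybackEnabled = !muted
    }
}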

I also wondered whether I could play the audio through AVAudioPlayer instead. I found addSink, and this is what I tried:

import AVFoundation
import CoreMedia
import TwilioVideo

class Audio: NSObject, AudioSink {
    var a = 1

    func renderSample(_ audioSample: CMSampleBuffer!) {
        print("audio found", a)
        a += 1

        // Pull the raw audio bytes out of the sample buffer.
        var audioBufferList = AudioBufferList()
        var data = Data()
        var blockBuffer: CMBlockBuffer?

        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
            audioSample,
            bufferListSizeNeededOut: nil,
            bufferListOut: &audioBufferList,
            bufferListSize: MemoryLayout<AudioBufferList>.size,
            blockBufferAllocator: nil,
            blockBufferMemoryAllocator: nil,
            flags: 0,
            blockBufferOut: &blockBuffer)

        let buffers = UnsafeBufferPointer<AudioBuffer>(start: &audioBufferList.mBuffers, count: Int(audioBufferList.mNumberBuffers))

        for audioBuffer in buffers {
            let frame = audioBuffer.mData?.assumingMemoryBound(to: UInt8.self)
            data.append(frame!, count: Int(audioBuffer.mDataByteSize))
        }

        let player = try! AVAudioPlayer(data: data) // crash here
        player.play()
    }
}

But it crashes at let player = try! AVAudioPlayer(data: data).


Edit:
This is the error: Fatal error: 'try!' expression unexpectedly raised an error: Error Domain=NSOSStatusErrorDomain Code=-39 "(null)": file

This is data, so I guess it is not being converted:

▿ 0 bytes
  - count : 0
  ▿ pointer : 0x000000016d7ae160
    - pointerValue : 6131736928
  - bytes : 0 elements

And this is the format description of audioSample:

<CMAudioFormatDescription 0x2815a3de0 [0x1bb2ef830]> {
    mediaType:'soun' 
    mediaSubType:'lpcm' 
    mediaSpecific: {
        ASBD: {
            mSampleRate: 16000.000000 
            mFormatID: 'lpcm' 
            mFormatFlags: 0xc 
            mBytesPerPacket: 2 
            mFramesPerPacket: 1 
            mBytesPerFrame: 2 
            mChannelsPerFrame: 1 
            mBitsPerChannel: 16     } 
        cookie: {(null)} 
        ACL: {(null)}
        FormatList Array: {(null)} 
    } 
    extensions: {(null)}
}

Pav*_*lov 5

You can get the full data buffer from the CMSampleBuffer and convert it to Data:

// Get the block buffer that backs the sample buffer.
let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer)
let blockBufferDataLength = CMBlockBufferGetDataLength(blockBuffer!)
// Copy its bytes into a plain [UInt8] array, then wrap that in Data.
var blockBufferData = [UInt8](repeating: 0, count: blockBufferDataLength)
let status = CMBlockBufferCopyDataBytes(blockBuffer!, atOffset: 0, dataLength: blockBufferDataLength, destination: &blockBufferData)
guard status == noErr else { return }
let data = Data(bytes: blockBufferData, count: blockBufferDataLength)
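Note that this Data still holds raw LPCM samples rather than a complete audio file, which is one reason AVAudioPlayer(data:) rejects it. A minimal sketch of turning those bytes into an AVAudioPCMBuffer, assuming the 16 kHz / mono / 16-bit format from the question (the helper name makePCMBuffer is mine):

import AVFoundation

// Sketch: convert raw 16-bit mono PCM bytes (16 kHz, as in the question's ASBD)
// into an AVAudioPCMBuffer in the engine's standard Float32 format.
func makePCMBuffer(from data: Data) -> AVAudioPCMBuffer? {
    guard let format = AVAudioFormat(standardFormatWithSampleRate: 16000, channels: 1) else { return nil }
    let frameCount = AVAudioFrameCount(data.count / MemoryLayout<Int16>.size)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount) else { return nil }
    buffer.frameLength = frameCount

    data.withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
        let samples = raw.bindMemory(to: Int16.self)
        let channel = buffer.floatChannelData![0]
        for i in 0..<Int(frameCount) {
            // Scale Int16 samples to the -1.0 ... 1.0 range expected by Float32 buffers.
            channel[i] = Float(samples[i]) / Float(Int16.max)
        }
    }
    return buffer
}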

Also see the overview in the AVAudioPlayer documentation:

Use this class for audio playback unless you are playing audio captured from a network stream or require very low I/O latency.

So I don't think it will work for you. You would be better off using AVAudioEngine or Audio Queue Services.
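
To sketch the AVAudioEngine route together with the per-participant volume the question asks for: a minimal example, assuming one AVAudioPlayerNode per remote participant and the makePCMBuffer(from:) helper above (the class name ParticipantPlayer is mine, not part of any SDK):

import AVFoundation

// Sketch: one AVAudioPlayerNode per participant; playerNode.volume is the
// per-participant volume control.
final class ParticipantPlayer {
    private let engine = AVAudioEngine()
    private let playerNode = AVAudioPlayerNode()
    private let format = AVAudioFormat(standardFormatWithSampleRate: 16000, channels: 1)!

    init() {
        engine.attach(playerNode)
        engine.connect(playerNode, to: engine.mainMixerNode, format: format)
        try? engine.start()
        playerNode.play()
    }

    // Call from renderSample(_:) with the Data extracted from the CMSampleBuffer.
    func enqueue(_ data: Data) {
        guard let buffer = makePCMBuffer(from: data) else { return }
        playerNode.scheduleBuffer(buffer, completionHandler: nil)
    }

    // 0.0 ... 1.0, independent of the other participants.
    var volume: Float {
        get { return playerNode.volume }
        set { playerNode.volume = newValue }
    }
}

Because each participant gets its own player node, changing one node's volume leaves the others untouched.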


Man*_*nav -4

Try saving the audio to a file in the documents directory and then playing that sound. That worked for me.

    func playMusic() {
        // Load a bundled "Audio.mp3" resource and play it with AVAudioPlayer.
        guard let url = Bundle.main.url(forResource: "Audio", withExtension: "mp3"),
              let data = try? Data(contentsOf: url) else { return }
        try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try? AVAudioSession.sharedInstance().setActive(true)
        audioPlayer = try? AVAudioPlayer(data: data, fileTypeHint: AVFileType.mp3.rawValue)
        audioPlayer?.prepareToPlay()
        audioPlayer?.play()
    }

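A minimal sketch of the save-then-play idea itself, assuming data already contains a complete audio file (for example an MP3 or WAV with its header) rather than raw LPCM samples; the saveAndPlay name and the audio.mp3 file name are mine:

    func saveAndPlay(_ data: Data) {
        // Write the data into the documents directory, then play it from the file URL.
        let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
        let fileURL = documents.appendingPathComponent("audio.mp3")
        do {
            try data.write(to: fileURL)
            audioPlayer = try AVAudioPlayer(contentsOf: fileURL)
            audioPlayer?.prepareToPlay()
            audioPlayer?.play()
        } catch {
            print("Could not save or play audio:", error)
        }
    }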