Post by JCu*_*ng8

Manipulating AudioBuffer.mData to display an audio visualization

I am trying to process audio data in real time so that I can display an on-screen spectrum analyzer/visualization driven by microphone input. I am using AVFoundation's AVCaptureAudioDataOutputSampleBufferDelegate to capture the audio data, which fires the delegate function captureOutput. The function is as follows:

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

    autoreleasepool {

        //The delegate parameters are non-optional in this Swift 4 signature,
        //so only the buffer-readiness check is needed here
        guard CMSampleBufferDataIsReady(sampleBuffer) else { return }

        //Check this is AUDIO (and not VIDEO) being received
        if connection.audioChannels.count > 0
        {
            //Determine number of frames in buffer
            let numFrames = CMSampleBufferGetNumSamples(sampleBuffer)

            //Get AudioBufferList
            var audioBufferList = AudioBufferList(mNumberBuffers: 1, mBuffers: AudioBuffer(mNumberChannels: 0, mDataByteSize: 0, mData: nil))
            var blockBuffer: CMBlockBuffer?
            CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, nil, &audioBufferList, …
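The final call above is cut off in the post. As a reference for where that flow usually goes next, here is a minimal sketch (not from the original post) that completes the CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer call with the Swift 4 positional signature, reads the returned mData, and computes FFT magnitudes with Accelerate's vDSP, the usual input for a spectrum analyzer. The helper name spectrumMagnitudes(from:) is hypothetical, and the sketch assumes the capture format is 16-bit signed integer PCM; verify that against the stream's AudioStreamBasicDescription (via CMSampleBufferGetFormatDescription and CMAudioFormatDescriptionGetStreamBasicDescription) before relying on it.

import Accelerate
import CoreMedia

//Hypothetical helper, not from the original post. Assumes 16-bit signed
//integer PCM samples in mData; check the format description before use.
func spectrumMagnitudes(from sampleBuffer: CMSampleBuffer) -> [Float]? {

    var audioBufferList = AudioBufferList(mNumberBuffers: 1, mBuffers: AudioBuffer(mNumberChannels: 0, mDataByteSize: 0, mData: nil))
    var blockBuffer: CMBlockBuffer?

    //Swift 4 positional signature; the returned block buffer retains the
    //audio data, so keep it alive while mData is being read
    let status = CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        nil,                                //bufferListSizeNeededOut
        &audioBufferList,
        MemoryLayout<AudioBufferList>.size,
        nil,                                //blockBufferStructureAllocator
        nil,                                //blockBufferMemoryAllocator
        kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,
        &blockBuffer)
    guard status == noErr, blockBuffer != nil,
        let mData = audioBufferList.mBuffers.mData else { return nil }

    //Reinterpret mData as Int16 samples and convert them to Float for vDSP
    let sampleCount = Int(audioBufferList.mBuffers.mDataByteSize) / MemoryLayout<Int16>.size
    guard sampleCount >= 2 else { return nil }
    let int16Samples = mData.assumingMemoryBound(to: Int16.self)
    var floatSamples = [Float](repeating: 0, count: sampleCount)
    vDSP_vflt16(int16Samples, 1, &floatSamples, 1, vDSP_Length(sampleCount))

    //Radix-2 real FFT over the largest power-of-two prefix of the samples
    let log2n = vDSP_Length(log2(Float(sampleCount)).rounded(.down))
    let fftLength = 1 << Int(log2n)
    guard let fftSetup = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2)) else { return nil }
    defer { vDSP_destroy_fftsetup(fftSetup) }

    var real = [Float](repeating: 0, count: fftLength / 2)
    var imag = [Float](repeating: 0, count: fftLength / 2)
    var magnitudes = [Float](repeating: 0, count: fftLength / 2)

    real.withUnsafeMutableBufferPointer { realPtr in
        imag.withUnsafeMutableBufferPointer { imagPtr in
            var splitComplex = DSPSplitComplex(realp: realPtr.baseAddress!, imagp: imagPtr.baseAddress!)
            //Pack the interleaved real samples into split-complex form
            floatSamples.withUnsafeBufferPointer { samplesPtr in
                samplesPtr.baseAddress!.withMemoryRebound(to: DSPComplex.self, capacity: fftLength / 2) {
                    vDSP_ctoz($0, 2, &splitComplex, 1, vDSP_Length(fftLength / 2))
                }
            }
            //In-place forward FFT, then squared magnitude per frequency bin
            vDSP_fft_zrip(fftSetup, &splitComplex, 1, log2n, FFTDirection(kFFTDirection_Forward))
            vDSP_zvmags(&splitComplex, 1, &magnitudes, 1, vDSP_Length(fftLength / 2))
        }
    }
    return magnitudes
}

Note that vDSP_fft_zrip scales its output by a factor of 2, so normalize if absolute levels matter; for a visualizer, bucketing the magnitudes into a few dozen bands and drawing one bar per band is usually enough. Also remember that this delegate only fires once an AVCaptureAudioDataOutput has been added to a running AVCaptureSession and setSampleBufferDelegate(_:queue:) has been called with a serial queue.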

core-audio avfoundation ios swift4

Score: 2 · Solutions: 1 · Views: 713
