AVAudioEngine inputNode's format changes when playing an AVAudioPlayerNode

Won*_*ray 4 avfoundation ios swift avaudioengine avaudioplayernode

I'll start with a simple "playground" view controller class I've made that demonstrates my problem:

import UIKit
import AVFoundation

class AudioEnginePlaygroundViewController: UIViewController {
    private var audioEngine: AVAudioEngine!
    private var micTapped = false
    override func viewDidLoad() {
        super.viewDidLoad()
        configureAudioSession()
        audioEngine = AVAudioEngine()
    }

    @IBAction func toggleMicTap(_ sender: Any) {
        guard let mic = audioEngine.inputNode else {
            return
        }
        if micTapped {
            mic.removeTap(onBus: 0)
            micTapped = false
            return
        }
        stopAudioPlayback()

        let micFormat = mic.inputFormat(forBus: 0)
        print("installing tap: \(micFormat.sampleRate) -- \(micFormat.channelCount)")
        mic.installTap(onBus: 0, bufferSize: 2048, format: micFormat) { (buffer, when) in
            print("in tap completion")
            let sampleData = UnsafeBufferPointer(start: buffer.floatChannelData![0], count: Int(buffer.frameLength))
        }
        micTapped = true
        startEngine()
    }

    @IBAction func playAudioFile(_ sender: Any) {
        stopAudioPlayback()
        let playerNode = AVAudioPlayerNode()

        let audioUrl = Bundle.main.url(forResource: "test_audio", withExtension: "wav")!
        let audioFile = readableAudioFileFrom(url: audioUrl)
        audioEngine.attach(playerNode)
        audioEngine.connect(playerNode, to: audioEngine.outputNode, format: audioFile.processingFormat)
        startEngine()
        playerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)
        playerNode.play()
    }

    // MARK: Internal Methods

    private func configureAudioSession() {
        do {
            try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, with: [.mixWithOthers, .defaultToSpeaker])
            try AVAudioSession.sharedInstance().setActive(true)
        } catch { }
    }

    private func readableAudioFileFrom(url: URL) -> AVAudioFile {
        var audioFile: AVAudioFile!
        do {
            try audioFile = AVAudioFile(forReading: url)
        } catch { }
        return audioFile
    }

    private func startEngine() {
        guard !audioEngine.isRunning else {
            return
        }

        do {
            try audioEngine.start()
        } catch { }
    }

    private func stopAudioPlayback() {
        audioEngine.stop()
        audioEngine.reset()
    }
}

The VC above holds a single AVAudioEngine instance and has two UIButton actions: one plays an audio file found at a hardcoded URL, and the other toggles installing/removing a tap on the engine's inputNode.

My goal is for live mic tapping and audio file playback to work at the same time, but completely independently of one another. That is, I want to be able to trigger playback regardless of the current state of the mic tap, and vice versa. Everything works as expected if I install the tap before triggering audio file playback. But if I play the audio file first and then try to install the tap, I get the following crash:

[avae] AVAEInternal.h:70:_AVAE_Check: required condition is false: [AVAEGraphNode.mm:810:CreateRecordingTap: (IsFormatSampleRateAndChannelCountValid(format))]

That led me to check the mic format data via the log statement just above the installTap call. Sure enough, when the tap is installed before any playback, I get the expected sample rate of 44100.0 and a channel count of 1. But when I play the audio file first and then install the mic tap, my log shows a sample rate of 0 and a channel count of 2, which produces the error shown above.

I've tried tweaking AVAudioEngine's start/reset flow, tried different AVAudioSession category/mode combinations (see the configureAudioSession method), and tried manually building the tap format like so:

let micFormat = mic.inputFormat(forBus: 0)
var trueFormat: AVAudioFormat!
if micFormat.sampleRate == 0 {
    trueFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)
} else {
    trueFormat = micFormat
}
print("installing tap: \(micFormat.sampleRate) -- \(micFormat.channelCount)")
mic.installTap(onBus: 0, bufferSize: 2048, format: trueFormat) { (buffer, when) in
    print("in tap completion")
    let sampleData = UnsafeBufferPointer(start: buffer.floatChannelData![0], count: Int(buffer.frameLength))
}

That gives me a similar, but different, error:

[avae] AVAEInternal.h:70:_AVAE_Check: required condition is false: [AVAudioIONodeImpl.mm:896:SetOutputFormat: (IsFormatSampleRateAndChannelCountValid(hwFormat))]

I can't see any reason why the mic's format data would differ depending on whether an AVAudioPlayerNode has been played or not.

Won*_*ray 5

After some digging I found the problem. The issue lies with the audio engine's inputNode singleton. From the documentation:

The audio engine creates a singleton on demand when inputNode is first accessed. To receive input, connect another audio node from the output of the input audio node, or create a recording tap on it.

Along with this reference to the format issue I was running into:

Check the input format of the input node (specifically, the hardware format) for a non-zero sample rate and channel count to see if input is enabled.
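
That check translates directly into code. Here's a minimal sketch (the inputIsEnabled helper name is mine, not something from the class above), written against the same SDK as the rest of this post, where inputNode is still optional:

import AVFoundation

// Sketch only: returns true if the engine's input node reports a usable
// hardware format, i.e. a non-zero sample rate and channel count.
private func inputIsEnabled(on engine: AVAudioEngine) -> Bool {
    guard let input = engine.inputNode else {
        return false
    }
    let hwFormat = input.inputFormat(forBus: 0)
    return hwFormat.sampleRate > 0 && hwFormat.channelCount > 0
}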

In my playground class, the flow that triggers audio file playback never touches the engine's inputNode before building an "active chain" with:

audioEngine.connect(playerNode, to: audioEngine.outputNode, format: audioFile.processingFormat)

It seems that AVAudioEngine's inputNode must be accessed before start() if you want the engine to configure itself internally for input. Even stop()ing and reset()ing the engine does not cause a later access of inputNode to reconfigure it. (I suspect that manually breaking the active chain via disconnectNode calls would allow the internal reconfiguration, but I'm not sure yet.)
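
Put differently, the ordering sketched below is what matters; it's just a condensed version of the fix (the makeEngineConfiguredForInput helper name is mine), not the full class that follows:

// Condensed sketch of the ordering that matters (helper name is mine):
private func makeEngineConfiguredForInput() throws -> AVAudioEngine {
    let engine = AVAudioEngine()
    // Touch inputNode before the engine ever starts so it configures its
    // I/O for input as well as output. (inputNode is optional on the SDK
    // used here, hence the force unwrap.)
    let mic = engine.inputNode!
    // Taps and player connections can now be set up in any order, e.g.:
    mic.installTap(onBus: 0, bufferSize: 2048,
                   format: mic.inputFormat(forBus: 0)) { buffer, _ in
        // handle captured audio
    }
    try engine.start()
    return engine
}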

So the fix, code-wise, is simple: access the engine's input node immediately after instantiating it so that the engine is configured for audio input. Here's the entire class, with file playback and mic tapping working together:

import UIKit
import AVFoundation

class AudioEnginePlaygroundViewController: UIViewController {
    private var audioEngine: AVAudioEngine!
    private var mic: AVAudioInputNode!
    private var micTapped = false

    override func viewDidLoad() {
        super.viewDidLoad()
        configureAudioSession()
        audioEngine = AVAudioEngine()
        mic = audioEngine.inputNode!
    }

    @IBAction func toggleMicTap(_ sender: Any) {
        if micTapped {
            mic.removeTap(onBus: 0)
            micTapped = false
            return
        }

        let micFormat = mic.inputFormat(forBus: 0)
        mic.installTap(onBus: 0, bufferSize: 2048, format: micFormat) { (buffer, when) in
            let sampleData = UnsafeBufferPointer(start: buffer.floatChannelData![0], count: Int(buffer.frameLength))
        }
        micTapped = true
        startEngine()
    }

    @IBAction func playAudioFile(_ sender: Any) {
        stopAudioPlayback()
        let playerNode = AVAudioPlayerNode()

        let audioUrl = Bundle.main.url(forResource: "test_audio", withExtension: "wav")!
        let audioFile = readableAudioFileFrom(url: audioUrl)
        audioEngine.attach(playerNode)
        audioEngine.connect(playerNode, to: audioEngine.outputNode, format: audioFile.processingFormat)
        startEngine()
        playerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)
        playerNode.play()
    }

    // MARK: Internal Methods

    private func configureAudioSession() {
        do {
            try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord, with: [.mixWithOthers, .defaultToSpeaker])
            try AVAudioSession.sharedInstance().setActive(true)
        } catch { }
    }

    private func readableAudioFileFrom(url: URL) -> AVAudioFile {
        var audioFile: AVAudioFile!
        do {
            try audioFile = AVAudioFile(forReading: url)
        } catch { }
        return audioFile
    }

    private func startEngine() {
        guard !audioEngine.isRunning else {
            return
        }

        do {
            try audioEngine.start()
        } catch { }
    }

    private func stopAudioPlayback() {
        audioEngine.stop()
        audioEngine.reset()
    }
}
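
For what it's worth, on newer SDKs the same class needs a couple of mechanical updates, since inputNode is no longer optional and the session category API changed. Something along these lines (treat it as a sketch, not a tested drop-in):

import AVFoundation

// Sketch of configureAudioSession using the newer AVAudioSession API; the
// rest of the class stays the same, apart from `mic = audioEngine.inputNode`
// losing its force unwrap (inputNode is non-optional on newer SDKs).
private func configureAudioSession() {
    do {
        try AVAudioSession.sharedInstance().setCategory(.playAndRecord,
                                                        options: [.mixWithOthers, .defaultToSpeaker])
        try AVAudioSession.sharedInstance().setActive(true)
    } catch { }
}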

  • Thanks for posting this follow-up! (2 upvotes)