Tob*_*mpe 5 avfoundation avaudioengine avaudioplayernode
I'm receiving a stream of 16-bit / 48 kHz stereo PCM samples as Int16 values and trying to play them with AVAudioEngine, but I hear nothing at all. I suspect the problem is either in how I set up the player or in how I push the data into the buffer.
I've read a lot about alternative solutions using Audio Queue Services, but all the sample code I can find is either Objective-C or iOS-only.
If I had some kind of frame-size problem or similar, shouldn't I at least hear garbage coming out of the speakers?
Here is my code:
import Foundation
import AVFoundation

class VoicePlayer {
    var engine: AVAudioEngine
    let format = AVAudioFormat(commonFormat: AVAudioCommonFormat.pcmFormatInt16, sampleRate: 48000.0, channels: 2, interleaved: true)!
    let playerNode: AVAudioPlayerNode!
    var audioSession: AVCaptureSession = AVCaptureSession()

    init() {
        self.audioSession = AVCaptureSession()
        self.engine = AVAudioEngine()
        self.playerNode = AVAudioPlayerNode()
        self.engine.attach(self.playerNode)
        //engine.connect(self.playerNode, to: engine.mainMixerNode, format: AVAudioFormat.init(standardFormatWithSampleRate: 48000, channels: 2))
        /* If I set my custom format here, AVFoundation complains about the format not being available */
        engine.connect(self.playerNode, to: engine.outputNode, format: AVAudioFormat.init(standardFormatWithSampleRate: 48000, channels: 2))
        engine.prepare()
        try! engine.start()
        self.playerNode.play()
    }

    func play(buffer: [Int16]) {
        let interleavedChannelCount = 2
        let frameLength = buffer.count / interleavedChannelCount
        let audioBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(frameLength))!
        print("audio buffer size in frames is \(AVAudioFrameCount(frameLength))")
        // buffer contains 2 channel interleaved data
        // audioBuffer contains 2 channel interleaved data
        var buf = buffer
        let size = MemoryLayout<Int16>.stride * interleavedChannelCount * frameLength
        memcpy(audioBuffer.mutableAudioBufferList.pointee.mBuffers.mData, &buf, size)
        audioBuffer.frameLength = AVAudioFrameCount(frameLength)
        /* Implemented an AVAudioConverter for testing
           Input: 16 bit PCM 48 kHz stereo interleaved
           Output: whatever the standard format for the system is
           Maybe this is somehow needed as my audio interface doesn't directly support 16 bit audio and can only run at 24 bit?
        */
        let normalBuffer = AVAudioPCMBuffer(pcmFormat: AVAudioFormat.init(standardFormatWithSampleRate: 48000, channels: 2)!, frameCapacity: AVAudioFrameCount(frameLength))
        normalBuffer?.frameLength = AVAudioFrameCount(frameLength)
        let converter = AVAudioConverter(from: format, to: AVAudioFormat.init(standardFormatWithSampleRate: 48000, channels: 2)!)
        var gotData = false
        let inputBlock: AVAudioConverterInputBlock = { inNumPackets, outStatus in
            if gotData {
                outStatus.pointee = .noDataNow
                return nil
            }
            gotData = true
            outStatus.pointee = .haveData
            return audioBuffer
        }
        var error: NSError? = nil
        let status: AVAudioConverterOutputStatus = converter!.convert(to: normalBuffer!, error: &error, withInputFrom: inputBlock)
        // Play the output buffer, in this case the audioBuffer, otherwise the normalBuffer
        // Playing the raw audio buffer causes an EXC_BAD_ACCESS on playback, playing back the buffer from the converter doesn't, but it still doesn't sound anything like a human voice
        self.playerNode.scheduleBuffer(audioBuffer) {
            print("Played")
        }
    }
}
Any help would be greatly appreciated.
After copying the data into the AVAudioPCMBuffer, you need to set its frameLength property to indicate how much valid audio it contains:
func play(buffer: [Int16]) {
    let interleavedChannelCount = 2
    let frameLength = buffer.count / interleavedChannelCount
    let audioBuffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(frameLength))!
    // buffer contains 2 channel interleaved data
    // audioBuffer contains 2 channel interleaved data
    var buf = buffer
    memcpy(audioBuffer.mutableAudioBufferList.pointee.mBuffers.mData, &buf, MemoryLayout<Int16>.stride * interleavedChannelCount * frameLength)
    audioBuffer.frameLength = AVAudioFrameCount(frameLength)
    self.playerNode.scheduleBuffer(audioBuffer) {
        print("Played")
    }
}
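As an aside, the `memcpy(…, &buf, …)` call works because Swift implicitly bridges `&buf` to a pointer to the array's first element. Here is a self-contained sketch of that same raw copy, with a manually allocated destination standing in for the buffer's `mData` (the variable names are illustrative, not from the code above):

```swift
import Foundation

let samples: [Int16] = [100, -200, 300, -400]
let byteCount = samples.count * MemoryLayout<Int16>.stride

// Allocate a raw destination, standing in for mBuffers.mData
let dest = UnsafeMutableRawPointer.allocate(byteCount: byteCount,
                                            alignment: MemoryLayout<Int16>.alignment)
defer { dest.deallocate() }

// An explicit alternative to memcpy(dest, &samples, byteCount)
samples.withUnsafeBytes { src in
    dest.copyMemory(from: src.baseAddress!, byteCount: byteCount)
}

// Read the bytes back as Int16 to verify the copy
let copied = Array(UnsafeBufferPointer(start: dest.assumingMemoryBound(to: Int16.self),
                                       count: samples.count))
print(copied)  // [100, -200, 300, -400]
```

The explicit `withUnsafeBytes` form makes the pointer's lifetime obvious, which the implicit `&buf` bridging does not.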
Edit: updated for the changes to the question. The old, now-irrelevant part:
Part of the problem is that your formats don't agree. format is declared as non-interleaved, but buffer is a single array of Int16, so it presumably represents interleaved data. Copying one directly into the other is likely incorrect.
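For reference, interleaved stereo lays samples out as L0 R0 L1 R1 …, so frame f of channel c sits at index f * channelCount + c, whereas the engine's "standard" format is non-interleaved Float32 with one plane per channel. A minimal pure-Swift sketch of the reshaping-and-scaling that AVAudioConverter performs in this direction (the function name and the 32768 scale factor are illustrative assumptions, not taken from the post):

```swift
// Deinterleave a stereo Int16 buffer into per-channel Float arrays,
// mirroring the layout change from interleaved Int16 to the engine's
// non-interleaved Float32 "standard" format.
func deinterleave(_ interleaved: [Int16], channels: Int) -> [[Float]] {
    let frames = interleaved.count / channels
    var out = Array(repeating: [Float](repeating: 0, count: frames), count: channels)
    for f in 0..<frames {
        for c in 0..<channels {
            // Frame f, channel c lives at index f * channels + c;
            // divide by 32768 to map Int16 into roughly [-1.0, 1.0)
            out[c][f] = Float(interleaved[f * channels + c]) / 32768.0
        }
    }
    return out
}

let stereo: [Int16] = [0, 16384, -16384, 32767]  // L0 R0 L1 R1
let split = deinterleave(stereo, channels: 2)
print(split[0])  // left channel:  [0.0, -0.5]
print(split[1])  // right channel: 0.5 and just under 1.0
```

Copying an interleaved array straight into a non-interleaved buffer skips exactly this reshaping, which is why the two formats must match before a raw memcpy is valid.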