Tags: audio, ios, avaudiosession, swift, avaudioengine
I have two classes, MicrophoneHandler and AudioPlayer. I have successfully used AVCaptureSession to tap microphone data, following an accepted answer here, and converted the CMSampleBuffer to NSData with this function:
```swift
func sendDataToDelegate(buffer: CMSampleBuffer!)
{
    let block = CMSampleBufferGetDataBuffer(buffer)
    var length = 0
    var data: UnsafeMutablePointer<Int8> = nil

    var status = CMBlockBufferGetDataPointer(block!, 0, nil, &length, &data) // TODO: check for errors

    let result = NSData(bytesNoCopy: data, length: length, freeWhenDone: false)
    self.delegate.handleBuffer(result)
}
```
I would now like to play the audio through the speaker by converting the NSData produced above to an AVAudioPCMBuffer and playing it with AVAudioEngine. My AudioPlayer class is below:
```swift
var engine: AVAudioEngine!
var playerNode: AVAudioPlayerNode!
var mixer: AVAudioMixerNode!

override init()
{
    super.init()
    self.setup()
    self.start()
}

func handleBuffer(data: NSData)
{
    let newBuffer = self.toPCMBuffer(data)
    print(newBuffer)
    self.playerNode.scheduleBuffer(newBuffer, completionHandler: nil)
}

func setup()
{
    self.engine = AVAudioEngine()
    self.playerNode = AVAudioPlayerNode()
    self.engine.attachNode(self.playerNode)
    self.mixer = engine.mainMixerNode
    engine.connect(self.playerNode, to: self.mixer, format: self.mixer.outputFormatForBus(0))
}

func start()
{
    do {
        try self.engine.start()
    }
    catch {
        print("error couldn't start engine")
    }
    self.playerNode.play()
}

func toPCMBuffer(data: NSData) -> AVAudioPCMBuffer
{
    let audioFormat = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: 8000, channels: 2, interleaved: false) // given NSData audio format
    let PCMBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: UInt32(data.length) / audioFormat.streamDescription.memory.mBytesPerFrame)
    PCMBuffer.frameLength = PCMBuffer.frameCapacity
    let channels = UnsafeBufferPointer(start: PCMBuffer.floatChannelData, count: Int(PCMBuffer.format.channelCount))
    data.getBytes(UnsafeMutablePointer<Void>(channels[0]), length: data.length)
    return PCMBuffer
}
```
The buffer reaches handleBuffer: when self.delegate.handleBuffer(result) is called in the first snippet above.
I can print(newBuffer) and see the converted buffer's memory location, but nothing comes out of the speaker. I can only imagine something is inconsistent in the conversion to and from NSData. Any ideas? Thanks in advance.
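One thing worth checking before the conversion: toPCMBuffer assumes the NSData holds non-interleaved Float32 stereo at 8 kHz, but AVCaptureSession typically delivers audio as interleaved integer PCM, which would make the raw byte copy into floatChannelData produce silence or noise. Below is a sketch (Swift 2 style, matching the code above; logFormat is a hypothetical debugging helper, not part of the question's classes) that logs the actual capture format so the AVAudioFormat can be made to match it:

```swift
import AVFoundation

// Hypothetical helper: print the stream description of a captured
// CMSampleBuffer. If mBitsPerChannel is 16 and the float flag is absent,
// the bytes cannot be copied straight into a Float32 AVAudioPCMBuffer.
func logFormat(buffer: CMSampleBuffer)
{
    guard let desc = CMSampleBufferGetFormatDescription(buffer) else { return }
    let asbdPtr = CMAudioFormatDescriptionGetStreamBasicDescription(desc)
    if asbdPtr != nil {
        let asbd = asbdPtr.memory
        print("sampleRate: \(asbd.mSampleRate)")
        print("channels: \(asbd.mChannelsPerFrame)")
        print("bitsPerChannel: \(asbd.mBitsPerChannel)")
        print("isFloat: \(asbd.mFormatFlags & kAudioFormatFlagIsFloat != 0)")
    }
}
```

Calling this from sendDataToDelegate would show whether the hard-coded 8000 Hz / 2-channel / Float32 format in toPCMBuffer actually matches the data.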
Skip the NSData format entirely; why not use AVAudioPlayer all the way? If you really need NSData, you can always load it from the soundURL below. In this example, the disk buffer is something like:
```swift
let soundURL = documentDirectory.URLByAppendingPathComponent("sound.m4a")
```
In any case, recording directly to a file makes sense for memory and resource management. You can get NSData from your recording like this:
```swift
let data = NSFileManager.defaultManager().contentsAtPath(soundURL.path!)
```
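If you do end up holding NSData (say, received over a network), note that AVAudioPlayer can also be initialized from data directly, skipping the AVAudioPCMBuffer conversion entirely, provided the bytes are a complete audio file in a format it understands (such as the m4a recorded here). A sketch, reusing the soundURL above:

```swift
import AVFoundation

// Sketch: play NSData directly with AVAudioPlayer(data:). The data must be
// a full audio file (e.g. the recorded m4a), not raw PCM samples.
if let data = NSFileManager.defaultManager().contentsAtPath(soundURL.path!) {
    do {
        let player = try AVAudioPlayer(data: data)
        player.play()
    } catch {
        print("could not create player: \(error)")
    }
}
```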
The code below is all you need:
Record
```swift
if !audioRecorder.recording {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setActive(true)
        audioRecorder.record()
    } catch {}
}
```
Play
```swift
if !audioRecorder.recording {
    do {
        try audioPlayer = AVAudioPlayer(contentsOfURL: audioRecorder.url)
        audioPlayer.play()
    } catch {}
}
```
Setup
```swift
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
    try audioRecorder = AVAudioRecorder(URL: self.directoryURL()!,
                                        settings: recordSettings)
    audioRecorder.prepareToRecord()
} catch {}
```
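The setup above calls self.directoryURL(), which is not shown. A minimal sketch of what such a helper might look like (the sound.m4a file name is an assumption carried over from the soundURL example earlier):

```swift
import Foundation

// Hypothetical helper: build a file URL in the app's Documents directory
// for the recorder to write to.
func directoryURL() -> NSURL?
{
    let fileManager = NSFileManager.defaultManager()
    let urls = fileManager.URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
    guard let documentDirectory = urls.first else { return nil }
    return documentDirectory.URLByAppendingPathComponent("sound.m4a")
}
```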
Settings
```swift
let recordSettings = [AVSampleRateKey: NSNumber(float: Float(44100.0)),
                      AVFormatIDKey: NSNumber(int: Int32(kAudioFormatMPEG4AAC)),
                      AVNumberOfChannelsKey: NSNumber(int: 1),
                      AVEncoderAudioQualityKey: NSNumber(int: Int32(AVAudioQuality.Medium.rawValue))]
```
Download the Xcode project: you can find this example at Swift Recipes. Download the complete project, which records and plays on both the simulator and a device.