I'm trying to use the new AVAudioEngine in iOS 8.
It looks like the completionHandler of player.scheduleFile() is called before the sound file has finished playing.
I'm using a sound file that is 5 seconds long, and the println() message appears about 1 second before the end of the sound.
Am I doing something wrong, or am I misunderstanding the idea of the completionHandler?
Thanks!
Here is some code:
import AVFoundation

class SoundHandler {
    let engine: AVAudioEngine
    let player: AVAudioPlayerNode
    let mainMixer: AVAudioMixerNode

    init() {
        engine = AVAudioEngine()
        player = AVAudioPlayerNode()
        engine.attachNode(player)
        mainMixer = engine.mainMixerNode

        var error: NSError?
        if !engine.startAndReturnError(&error) {
            if let e = error {
                println("error \(e.localizedDescription)")
            }
        }
        engine.connect(player, to: mainMixer, format: mainMixer.outputFormatForBus(0))
    }

    func playSound() {
        var soundUrl = NSBundle.mainBundle().URLForResource("Test", withExtension: "m4a")
        var soundFile = AVAudioFile(forReading: soundUrl, error: nil)
        player.scheduleFile(soundFile, atTime: nil, completionHandler: { println("Finished!") })
        player.play()
    }
}
You can always use AVAudioTime to compute the future time at which playback of the audio will finish. The current behavior is actually useful, because it lets you schedule additional buffers/segments/files from the callback before the current one ends, avoiding gaps in playback. This makes it possible to build a simple loop player without much work. Here's an example:
class Latch {
    var value: Bool = true
}

func loopWholeFile(file: AVAudioFile, player: AVAudioPlayerNode) -> Latch {
    let looping = Latch()
    let frames = file.length
    let sampleRate = file.processingFormat.sampleRate
    var segmentTime: AVAudioFramePosition = 0
    var segmentCompletion: AVAudioNodeCompletionHandler!
    segmentCompletion = {
        if looping.value {
            segmentTime += frames
            player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
        }
    }
    player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
    segmentCompletion()
    player.play()
    return looping
}
The code above schedules the entire file twice before player.play() is called. As each segment is about to finish, its callback schedules another whole file in the future, avoiding any gap in playback. To stop the loop, use the returned Latch, like this:
let looping = loopWholeFile(file, player)
sleep(1000)
looping.value = false
player.stop()
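The first point above, that you can compute the finish time yourself, is worth spelling out. Here is a minimal sketch in modern Swift, assuming file is the AVAudioFile that was scheduled and player is the AVAudioPlayerNode rendering it:

// Estimate the remaining playback time by comparing the file's duration
// with the player's elapsed render time (sketch; the names are assumptions).
let duration = Double(file.length) / file.processingFormat.sampleRate
if let nodeTime = player.lastRenderTime,
   let playerTime = player.playerTime(forNodeTime: nodeTime) {
    let elapsed = Double(playerTime.sampleTime) / playerTime.sampleRate
    let remaining = max(duration - elapsed, 0)
    // Schedule your own "finished" logic `remaining` seconds from now.
    print("about \(remaining)s of audio left")
}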
Another answer:
I'm seeing the same behavior.
From my experiments, I believe the callback is fired once the buffer/segment/file has been scheduled, not when it has finished playing.
The docs, however, explicitly state: "Called after the buffer has completely played or the player is stopped. May be nil."
So I think it's either a bug or incorrect documentation; I'm not sure which. Either way, it seems like a bug, and we should file a radar! http://bugreport.apple.com
In the meantime, as a workaround, I noticed that if you use scheduleBuffer:atTime:options:completionHandler: instead, the callback fires as expected (after playback finishes).
Sample code:
AVAudioFile *file = [[AVAudioFile alloc] initForReading:_fileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:nil];
AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:file.processingFormat frameCapacity:(AVAudioFrameCount)file.length];
[file readIntoBuffer:buffer error:&error];

[_player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
    // reminder: we're not on the main thread in here
    dispatch_async(dispatch_get_main_queue(), ^{
        NSLog(@"done playing, as expected!");
    });
}];
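For readers working in Swift, as the question is, a rough modern-Swift equivalent of this workaround might look like the following sketch; fileURL and player are assumed to come from your own setup:

import AVFoundation

// Load the whole file into a PCM buffer and schedule it with .interrupts,
// mirroring the Objective-C workaround above (error handling simplified).
func playWholeFile(fileURL: URL, on player: AVAudioPlayerNode) throws {
    let file = try AVAudioFile(forReading: fileURL)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: AVAudioFrameCount(file.length)) else { return }
    try file.read(into: buffer)

    player.scheduleBuffer(buffer, at: nil, options: .interrupts) {
        // reminder: we're not on the main thread in here
        DispatchQueue.main.async {
            print("done playing, as expected!")
        }
    }
    player.play()
}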
My bug report was closed as "works as intended," but Apple pointed me to the new variants of the scheduleFile, scheduleSegment, and scheduleBuffer methods added in iOS 11. These take an additional completionCallbackType parameter that you can use to specify that you want the completion callback when playback has finished:
[self.audioUnitPlayer
    scheduleSegment:self.audioUnitFile
    startingFrame:sampleTime
    frameCount:(int)sampleLength
    atTime:0
    completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
    completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
        // do something here
    }];
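In Swift, the same iOS 11 variant looks roughly like this sketch, with file and player assumed from your own setup:

// Ask for the completion callback only once the data has actually been
// played back, rather than when it has merely been consumed by the engine.
player.scheduleFile(file, at: nil, completionCallbackType: .dataPlayedBack) { callbackType in
    // still not called on the main thread
    DispatchQueue.main.async {
        print("playback finished")
    }
}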
The documentation doesn't say anything about how this works, but I tested it and it works for me.
I had been using this workaround on iOS 8-10:
- (void)playRecording {
    [self.audioUnitPlayer scheduleSegment:self.audioUnitFile startingFrame:sampleTime frameCount:(int)sampleLength atTime:0 completionHandler:^() {
        float totalTime = [self recordingDuration];
        float elapsedTime = [self recordingCurrentTime];
        float remainingTime = totalTime - elapsedTime;
        [self performSelector:@selector(doSomethingHere) withObject:nil afterDelay:remainingTime];
    }];
}

- (float)recordingDuration {
    float duration = self.audioUnitFile.length / self.audioUnitFile.processingFormat.sampleRate;
    if (isnan(duration)) {
        duration = 0;
    }
    return duration;
}

- (float)recordingCurrentTime {
    AVAudioTime *nodeTime = self.audioUnitPlayer.lastRenderTime;
    AVAudioTime *playerTime = [self.audioUnitPlayer playerTimeForNodeTime:nodeTime];
    AVAudioFramePosition sampleTime = playerTime.sampleTime;
    if (sampleTime == 0) { return self.audioUnitLastKnownTime; } // this happens when the player isn't playing
    sampleTime += self.audioUnitStartingFrame; // if we trimmed from the start, or moved the location slider, the time before that point isn't included in the player time, so we track it ourselves and add it here
    float time = sampleTime / self.audioUnitFile.processingFormat.sampleRate;
    self.audioUnitLastKnownTime = time;
    return time;
}