objective-c ios webrtc replaykit
I am using WebRTC to create a peer-to-peer connection for sharing the screen and audio. I capture the screen with ReplayKit, which produces a CMSampleBufferRef; from that I can build an RTCVideoFrame.
To obtain the CMSampleBufferRef I am using:
[[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer, RPSampleBufferType bufferType, NSError * _Nullable error)
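(Roughly, the full call looks like the sketch below; the filtering on RPSampleBufferTypeVideo, the error logging, and the forwarding into -didCaptureSampleBuffer: are only a sketch of the wiring, not necessarily identical to my actual code.)

#import <ReplayKit/ReplayKit.h>

// Start in-app screen capture and forward only video buffers to the
// conversion method shown further down; audio buffer types are ignored here.
[[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef _Nonnull sampleBuffer,
                                                             RPSampleBufferType bufferType,
                                                             NSError * _Nullable error) {
    if (error != nil) {
        NSLog(@"ReplayKit capture error: %@", error);
        return;
    }
    if (bufferType == RPSampleBufferTypeVideo) {
        [self didCaptureSampleBuffer:sampleBuffer];
    }
} completionHandler:^(NSError * _Nullable error) {
    if (error != nil) {
        NSLog(@"Failed to start ReplayKit capture: %@", error);
    }
}];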
So far, everything works perfectly.
The problem appears when I send the app to the background and bring it back a few times; after that, ReplayKit stops calling its capture handler. This only happens when I am forwarding the CMSampleBufferRef frames to WebRTC, so the ReplayKit issue is clearly related to WebRTC. If I remove the following line from my code, the problem does not occur (although then, obviously, WebRTC receives no video):
[self->source capturer:self->capturer didCaptureVideoFrame:videoFrame];
The only way I can get it working again is to restart the device. Even killing the app and relaunching it does not help.
This is how I create the RTCVideoTrack in my view controller:
- (RTCVideoTrack *)createLocalVideoTrack {
    // Create a video source and a generic capturer that feeds it.
    self->source = [_factory videoSource];
    self->capturer = [[RTCVideoCapturer alloc] initWithDelegate:self->source];
    // Limit the outgoing stream to 441x736 at 15 fps.
    [self->source adaptOutputFormatToWidth:441 height:736 fps:15];
    return [_factory videoTrackWithSource:self->source trackId:@"ARDAMSv0"];
}
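(For reference, the returned track is then attached to the peer connection; with the GoogleWebRTC Objective-C API that is roughly the following, where the peerConnection property and the stream id are just illustrative assumptions:)

// Sketch: self.peerConnection is assumed to be an already-configured RTCPeerConnection.
RTCVideoTrack *videoTrack = [self createLocalVideoTrack];
[self.peerConnection addTrack:videoTrack streamIds:@[ @"ARDAMS" ]];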
And this is how I convert the CMSampleBufferRef into an RTCVideoFrame and send it to WebRTC:
- (void)didCaptureSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Ignore invalid or not-yet-ready buffers.
    if (CMSampleBufferGetNumSamples(sampleBuffer) != 1 || !CMSampleBufferIsValid(sampleBuffer) ||
        !CMSampleBufferDataIsReady(sampleBuffer)) {
        return;
    }

    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return;
    }

    // Wrap the pixel buffer and convert the presentation timestamp to nanoseconds.
    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
    int64_t timeStampNs =
        CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * NSEC_PER_SEC;
    RTCVideoFrame *videoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                             rotation:RTCVideoRotation_0
                                                          timeStampNs:timeStampNs];

    // Forward the frame to WebRTC; this is the line that correlates with the ReplayKit stall.
    [self->source capturer:self->capturer didCaptureVideoFrame:videoFrame];
}
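(As a structural note, the WebRTC Objective-C API models custom capture sources as RTCVideoCapturer subclasses that push frames to their delegate; RTCVideoSource conforms to RTCVideoCapturerDelegate. A minimal sketch of that pattern with the same conversion as above follows; the ScreenCapturer class name is only illustrative:)

#import <WebRTC/WebRTC.h>
#import <CoreMedia/CoreMedia.h>

@interface ScreenCapturer : RTCVideoCapturer
- (void)pushSampleBuffer:(CMSampleBufferRef)sampleBuffer;
@end

@implementation ScreenCapturer

- (void)pushSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return;
    }
    RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
    int64_t timeStampNs =
        (int64_t)(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * NSEC_PER_SEC);
    RTCVideoFrame *videoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                             rotation:RTCVideoRotation_0
                                                          timeStampNs:timeStampNs];
    // self.delegate is the RTCVideoSource passed to -initWithDelegate:.
    [self.delegate capturer:self didCaptureVideoFrame:videoFrame];
}

@end

(Putting the conversion in a subclass keeps it out of the view controller, but the forwarded call is ultimately the same -capturer:didCaptureVideoFrame: delegate call as in the code above.)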