Sending an iOS device's screen capture over WebRTC using ReplayKit

Nir*_*iro 7 screensharing ios webrtc apprtc replaykit

We want to send an iOS device's screen capture over WebRTC, using ReplayKit. ReplayKit provides a processSampleBuffer callback, which delivers a CMSampleBuffer.
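
For context, this callback is the override on the broadcast upload extension's RPBroadcastSampleHandler subclass. A minimal sketch of that entry point (the WebRTC forwarding itself is left as a comment, since that wiring is what the question is about):

    import ReplayKit

    class SampleHandler: RPBroadcastSampleHandler {

        override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                          with sampleBufferType: RPSampleBufferType) {
            switch sampleBufferType {
            case RPSampleBufferType.video:
                // screen frames arrive here, one CMSampleBuffer per frame;
                // convert and hand them to WebRTC at this point
                break
            default:
                // app/mic audio buffers are ignored in this sketch
                break
            }
        }
    }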


This is where we're stuck: we can't seem to send the CMSampleBuffer to the connected peer. We tried creating a pixelBuffer from the sampleBuffer, and then creating an RTCVideoFrame from it.
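
Depending on the WebRTC build being linked, the RTCVideoFrame constructor differs: older GoogleWebRTC binaries take the CVPixelBuffer directly (as in the snippet below), while newer ones expect it wrapped in an RTCCVPixelBuffer. A hedged sketch of the wrapped variant, with the timestamp derived in nanoseconds from the sample buffer rather than the wall clock (makeVideoFrame is just an illustrative helper name):

    import WebRTC
    import CoreMedia

    func makeVideoFrame(from sampleBuffer: CMSampleBuffer) -> RTCVideoFrame? {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

        // timeStampNs is in nanoseconds, so derive it from the buffer's presentation time
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        let timeStampNs = Int64(CMTimeGetSeconds(pts) * 1_000_000_000)

        // wrap the CVPixelBuffer before constructing the frame
        let rtcPixelBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)
        return RTCVideoFrame(buffer: rtcPixelBuffer,
                             rotation: RTCVideoRotation._0,
                             timeStampNs: timeStampNs)
    }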


We also extracted the RTCVideoSource from the RTCPeerConnectionFactory, then created an RTCVideoCapturer and streamed the frames into the localVideoSource.


Any idea what we are doing wrong?

    var peerConnectionFactory: RTCPeerConnectionFactory?

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case RPSampleBufferType.video:

            // create the CVPixelBuffer
            let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!

            // create the RTCVideoFrame
            var videoFrame: RTCVideoFrame?
            let timestamp = NSDate().timeIntervalSince1970 * 1000
            videoFrame = RTCVideoFrame(pixelBuffer: pixelBuffer, rotation: RTCVideoRotation._0, timeStampNs: Int64(timestamp))

            // connect the video frames to the WebRTC
            let localVideoSource = self.peerConnectionFactory!.videoSource()
            let videoCapturer = RTCVideoCapturer()
            localVideoSource.capturer(videoCapturer, didCapture: videoFrame!)

            let videoTrack: RTCVideoTrack = self.peerConnectionFactory!.videoTrack(with: localVideoSource, trackId: "100")

            let mediaStream: RTCMediaStream = (self.peerConnectionFactory?.mediaStream(withStreamId: "1"))!
            mediaStream.addVideoTrack(videoTrack)
            self.newPeerConnection!.add(mediaStream)

            break
        }
    }

Sum*_*ena 5

This is a good way to implement it; you just need to render the RTCVideoFrame in the method you used in your snippet, while all the other objects should be initialized once, outside that method. That is the better approach. For better understanding, here is a snippet:

    var peerConnectionFactory: RTCPeerConnectionFactory?
    var localVideoSource: RTCVideoSource?
    var videoCapturer: RTCVideoCapturer?

    func setupVideoCapturer() {
        // localVideoSource and videoCapturer are created once and reused for every frame
        localVideoSource = self.peerConnectionFactory!.videoSource()
        videoCapturer = RTCVideoCapturer()
        // the capturer call moves into processSampleBuffer below
        // localVideoSource.capturer(videoCapturer, didCapture: videoFrame!)

        let videoTrack: RTCVideoTrack = self.peerConnectionFactory!.videoTrack(with: localVideoSource!, trackId: "100")

        let mediaStream: RTCMediaStream = (self.peerConnectionFactory?.mediaStream(withStreamId: "1"))!
        mediaStream.addVideoTrack(videoTrack)
        self.newPeerConnection!.add(mediaStream)
    }


    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case RPSampleBufferType.video:

            // create the CVPixelBuffer
            let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!

            // create the RTCVideoFrame
            let timestamp = NSDate().timeIntervalSince1970 * 1000
            let videoFrame = RTCVideoFrame(pixelBuffer: pixelBuffer, rotation: RTCVideoRotation._0, timeStampNs: Int64(timestamp))

            // hand the frame to WebRTC through the already-initialized source and capturer
            localVideoSource!.capturer(videoCapturer!, didCapture: videoFrame)

        default:
            break
        }
    }
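
For completeness, setupVideoCapturer() is meant to run once, before frames start arriving; in a broadcast upload extension the natural place is the ReplayKit start callback (the peer connection and signaling setup are assumed to happen there as well):

    override func broadcastStarted(withSetupInfo setupInfo: [String : NSObject]?) {
        // one-time setup: create the peer connection, then the reusable
        // video source, capturer, track and stream
        setupVideoCapturer()
    }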

Hope this helps.