WebRTC cannot capture the screen

Kub*_*bik · 2 · android, screen-capture, webrtc

I'm trying to build a screen-sharing application with WebRTC. I have code that can capture and share a video stream from the camera, and I need to modify it to capture the screen via the MediaProjection API instead. Based on this post, I changed the code to use org.webrtc.ScreenCapturerAndroid, but no video is shown, only a black screen. If I use the camera, everything works fine (I can see the camera output on the screen). Could someone review my code and point me in the right direction? I've been stuck on this for three days.


Here is my code:

```java
public class MainActivity extends AppCompatActivity {

    private static final String TAG = "VIDEO_CAPTURE";

    private static final int CAPTURE_PERMISSION_REQUEST_CODE = 1;
    private static final String VIDEO_TRACK_ID = "video_stream";

    PeerConnectionFactory peerConnectionFactory;

    SurfaceViewRenderer localVideoView;
    ProxyVideoSink localSink;

    VideoSource videoSource;
    VideoTrack localVideoTrack;

    EglBase rootEglBase;

    boolean camera = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        rootEglBase = EglBase.create();
        localVideoView = findViewById(R.id.local_gl_surface_view);

        localVideoView.init(rootEglBase.getEglBaseContext(), null);

        startScreenCapture();
    }

    @TargetApi(21)
    private void startScreenCapture() {
        MediaProjectionManager mMediaProjectionManager = (MediaProjectionManager) getApplication().getSystemService(Context.MEDIA_PROJECTION_SERVICE);
        startActivityForResult(mMediaProjectionManager.createScreenCaptureIntent(), CAPTURE_PERMISSION_REQUEST_CODE);
    }

    @Override
    public void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode != CAPTURE_PERMISSION_REQUEST_CODE) { return; }

        start(data);
    }

    private void start(Intent permissionData) {

        // Initialize PeerConnectionFactory globals.
        PeerConnectionFactory.InitializationOptions initializationOptions =
                PeerConnectionFactory.InitializationOptions.builder(this)
                        .setEnableVideoHwAcceleration(true)
                        .createInitializationOptions();
        PeerConnectionFactory.initialize(initializationOptions);

        // Create a new PeerConnectionFactory instance - using hardware encoder and decoder.
        PeerConnectionFactory.Options options = new PeerConnectionFactory.Options();
        DefaultVideoEncoderFactory defaultVideoEncoderFactory = new DefaultVideoEncoderFactory(
                rootEglBase.getEglBaseContext(), true, true);
        DefaultVideoDecoderFactory defaultVideoDecoderFactory = new DefaultVideoDecoderFactory(rootEglBase.getEglBaseContext());

        peerConnectionFactory = PeerConnectionFactory.builder()
                .setOptions(options)
                .setVideoDecoderFactory(defaultVideoDecoderFactory)
                .setVideoEncoderFactory(defaultVideoEncoderFactory)
                .createPeerConnectionFactory();

        VideoCapturer videoCapturerAndroid;
        if (camera) {
            videoCapturerAndroid = createCameraCapturer(new Camera1Enumerator(false));
        } else {
            videoCapturerAndroid = new ScreenCapturerAndroid(permissionData, new MediaProjection.Callback() {
                @Override
                public void onStop() {
                    super.onStop();
                    Log.e(TAG, "user has revoked permissions");
                }
            });
        }

        videoSource = peerConnectionFactory.createVideoSource(videoCapturerAndroid);

        DisplayMetrics metrics = new DisplayMetrics();
        MainActivity.this.getWindowManager().getDefaultDisplay().getRealMetrics(metrics);
        videoCapturerAndroid.startCapture(metrics.widthPixels, metrics.heightPixels, 30);

        localVideoTrack = peerConnectionFactory.createVideoTrack(VIDEO_TRACK_ID, videoSource);
        localVideoTrack.setEnabled(true);

        //localVideoTrack.addRenderer(new VideoRenderer(localRenderer));
        localSink = new ProxyVideoSink().setTarget(localVideoView);
        localVideoTrack.addSink(localSink);
    }

    // Find the first camera; this works without problems.
    private VideoCapturer createCameraCapturer(CameraEnumerator enumerator) {
        final String[] deviceNames = enumerator.getDeviceNames();

        // First, try to find a front facing camera.
        Logging.d(TAG, "Looking for front facing cameras.");
        for (String deviceName : deviceNames) {
            if (enumerator.isFrontFacing(deviceName)) {
                Logging.d(TAG, "Creating front facing camera capturer.");
                VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);

                if (videoCapturer != null) {
                    return videoCapturer;
                }
            }
        }

        // Front facing camera not found, try something else.
        Logging.d(TAG, "Looking for other cameras.");
        for (String deviceName : deviceNames) {
            if (!enumerator.isFrontFacing(deviceName)) {
                Logging.d(TAG, "Creating other camera capturer.");
                VideoCapturer videoCapturer = enumerator.createCapturer(deviceName, null);

                if (videoCapturer != null) {
                    return videoCapturer;
                }
            }
        }

        return null;
    }
}
```
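One side note on the MediaProjection flow (not part of the original code): `onActivityResult` above forwards `data` to `start()` without checking `resultCode`, so a denied capture prompt would hand a null `Intent` to `ScreenCapturerAndroid`. A minimal guard, assuming the same request code as above:

```java
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode != CAPTURE_PERMISSION_REQUEST_CODE) { return; }

    // Only continue if the user actually granted screen-capture permission;
    // otherwise 'data' is null and the capturer cannot be created.
    if (resultCode != Activity.RESULT_OK || data == null) {
        Log.e(TAG, "Screen capture permission was not granted.");
        return;
    }

    start(data);
}
```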

ProxyVideoSink:

```java
public class ProxyVideoSink implements VideoSink {

    private VideoSink target;

    synchronized ProxyVideoSink setTarget(VideoSink target) { this.target = target; return this; }

    @Override
    public void onFrame(VideoFrame videoFrame) {

        if (target == null) {
            Log.w("VideoSink", "Dropping frame in proxy because target is null.");
            return;
        }

        target.onFrame(videoFrame);
    }
}
```
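Not from the original post, but one way to narrow this down is to log what kind of buffer reaches the sink: `ScreenCapturerAndroid` delivers texture-backed frames, which can only be drawn or encoded when the rendering/encoding side has a suitable EGL context. A rough diagnostic sketch of `onFrame`, assuming the `ProxyVideoSink` above:

```java
@Override
public void onFrame(VideoFrame videoFrame) {
    // Texture-backed buffers are the normal case for ScreenCapturerAndroid;
    // if they arrive here but the screen stays black, the rendering/encoding
    // side is usually missing an EGL context rather than the frames themselves.
    VideoFrame.Buffer buffer = videoFrame.getBuffer();
    Log.d("VideoSink", "Frame " + videoFrame.getRotatedWidth() + "x" + videoFrame.getRotatedHeight()
            + ", buffer: " + (buffer instanceof VideoFrame.TextureBuffer ? "texture" : "I420/other"));

    if (target == null) {
        Log.w("VideoSink", "Dropping frame in proxy because target is null.");
        return;
    }
    target.onFrame(videoFrame);
}
```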

In logcat I can see that frames are being rendered, but nothing is displayed (black screen).

```
06-18 17:42:44.750 11357-11388/com.archona.webrtcscreencapturetest I/org.webrtc.Logging: EglRenderer: local_gl_surface_viewDuration: 4000 ms. Frames received: 117. Dropped: 0. Rendered: 117. Render fps: 29.2. Average render time: 4754 μs. Average swapBuffer time: 2913 μs.
06-18 17:42:48.752 11357-11388/com.archona.webrtcscreencapturetest I/org.webrtc.Logging: EglRenderer: local_gl_surface_viewDuration: 4001 ms. Frames received: 118. Dropped: 0. Rendered: 118. Render fps: 29.5. Average render time: 5015 μs. Average swapBuffer time: 3090 μs.
```

I'm using the latest version of the WebRTC library: implementation 'org.webrtc:google-webrtc:1.0.23546'. My device is API level 24 (Android 7.0), but I have tested this code on 3 different devices with different API levels, so I don't suspect a device-specific problem. I also built another app that uses the MediaProjection API (without WebRTC), and there I can see the correct output in a SurfaceView. I have tried downgrading the webrtc library, but nothing seems to help.


Thanks for your help.


小智 · 5

I ran into the same problem with the WebRTC library org.webrtc:google-webrtc:1.0.22672, on an Android 7.0 device. Video calling worked fine; the problem was screen sharing, which always showed a black screen.

Then I added the following:

```java
peerConnectionFactory.setVideoHwAccelerationOptions(rootEglBase.getEglBaseContext(), rootEglBase.getEglBaseContext());
```

Now it works fine.
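For context on where this call fits, here is a minimal sketch based on the question's `start()` method (not code from the original answer). Note that `setVideoHwAccelerationOptions` is only present on `PeerConnectionFactory` in older releases such as 1.0.22672; it was removed in later versions, which is likely why it cannot be found in newer builds.

```java
peerConnectionFactory = PeerConnectionFactory.builder()
        .setOptions(options)
        .setVideoDecoderFactory(defaultVideoDecoderFactory)
        .setVideoEncoderFactory(defaultVideoEncoderFactory)
        .createPeerConnectionFactory();

// Hand the factory the local and remote EGL contexts so that texture-backed
// frames (which ScreenCapturerAndroid produces) can be handled with hardware
// acceleration. Available in older releases such as 1.0.22672; removed later.
peerConnectionFactory.setVideoHwAccelerationOptions(
        rootEglBase.getEglBaseContext(),
        rootEglBase.getEglBaseContext());

// ...then create the capturer, video source and track exactly as in the question.
```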

  • Where do I need to put this? I can't find a setVideoHwAccelerationOptions method on PeerConnectionFactory. (2)