I am trying to build a screen-sharing app with WebRTC. I have working code that captures and shares the video stream from the camera. Now I need to modify it to capture video through the MediaProjection API instead. Based on this article, I changed the code to use org.webrtc.ScreenCapturerAndroid, but no video output appears, only a black screen. With the camera everything works fine (I can see the camera output on screen). Can someone review my code and point me in the right direction? I have been stuck on this for three days.
Here is my code:
```java
public class MainActivity extends AppCompatActivity {

    private static final String TAG = "VIDEO_CAPTURE";

    private static final int CAPTURE_PERMISSION_REQUEST_CODE = 1;
    private static final String VIDEO_TRACK_ID = "video_stream";

    PeerConnectionFactory peerConnectionFactory;

    SurfaceViewRenderer localVideoView;
    ProxyVideoSink localSink;

    VideoSource videoSource;
    VideoTrack localVideoTrack;

    EglBase rootEglBase;

    boolean camera = false;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        rootEglBase = EglBase.create();
        localVideoView = findViewById(R.id.local_gl_surface_view);

        localVideoView.init(rootEglBase.getEglBaseContext(), null);

        startScreenCapture();
    }

    @TargetApi(21)
    private void startScreenCapture() {
        MediaProjectionManager mMediaProjectionManager …
```
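For reference, here is roughly how I understand the capturer is supposed to be wired up once the MediaProjection permission result comes back (a sketch based on my reading of the WebRTC Android sources, using the fields from my class above; `metrics` and the callback body are my own additions, not code I have verified). Is there a step like this that I am missing, for example binding the `SurfaceTextureHelper` to the renderer's EGL context?

```java
// Sketch only: assumes the fields from MainActivity above
// (rootEglBase, peerConnectionFactory, videoSource, localVideoTrack,
// localVideoView) and that `data` is the Intent returned by the
// MediaProjection permission dialog.
@Override
public void onActivityResult(int requestCode, int resultCode, Intent data) {
    if (requestCode != CAPTURE_PERMISSION_REQUEST_CODE) return;

    VideoCapturer capturer = new ScreenCapturerAndroid(
            data, new MediaProjection.Callback() {
                @Override
                public void onStop() {
                    // Fired when the user revokes the screen-capture permission.
                    Log.e(TAG, "Screen capture stopped");
                }
            });

    // The helper's EGL context should be shared with the renderer's context;
    // if they do not match, frames can be captured but never displayed.
    SurfaceTextureHelper surfaceTextureHelper = SurfaceTextureHelper.create(
            "ScreenCaptureThread", rootEglBase.getEglBaseContext());

    // isScreencast = true tells WebRTC to use screencast encoding settings.
    videoSource = peerConnectionFactory.createVideoSource(true);
    capturer.initialize(surfaceTextureHelper, getApplicationContext(),
            videoSource.getCapturerObserver());

    // Capture at the real display resolution, 30 fps.
    DisplayMetrics metrics = new DisplayMetrics();
    getWindowManager().getDefaultDisplay().getRealMetrics(metrics);
    capturer.startCapture(metrics.widthPixels, metrics.heightPixels, 30);

    localVideoTrack = peerConnectionFactory.createVideoTrack(VIDEO_TRACK_ID, videoSource);
    localVideoTrack.addSink(localVideoView);
}
```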