chrome.desktopCapture - can't record system audio and the microphone at the same time?


I've built a Chrome extension that captures screen activity and microphone input and outputs a video file. Since chrome.desktopCapture can't record microphone input as part of the screen capture, I put the microphone on its own separate stream. So:

//get screen stream
chrome.desktopCapture.chooseDesktopMedia(['screen'], null, (stream_id, opts) => {
    let constraints = {video: {mandatory: {
        chromeMediaSource: 'desktop',
        chromeMediaSourceId: stream_id
    }}};
    navigator.mediaDevices.getUserMedia(constraints).then((stream) => {
        video_stream = stream;
    });
});

//get mic stream
navigator.mediaDevices.getUserMedia({audio: true}).then((stream) => {
    audio_stream = stream;
});

Then, later, once I have both streams and before starting the recording, I merge them by creating a master stream and adding the relevant tracks from the separate video and audio streams to it.

let master_stream = new MediaStream(video_stream);
master_stream.addTrack(audio_stream.getTracks()[0]);
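The recording step itself is just a MediaRecorder pointed at master_stream; roughly like this (a sketch, not the exact code from the extension):

let recorder = new MediaRecorder(master_stream, {mimeType: 'video/webm'});
let chunks = [];
recorder.ondataavailable = (e) => chunks.push(e.data);
recorder.onstop = () => {
    let blob = new Blob(chunks, {type: 'video/webm'});
    // save or download the blob to produce the video file
};
recorder.start();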

This works fine: the video file I ultimately get contains both the screen and the microphone.

The question: why does this technique stop working when I ask Chrome to record system audio as well?

That is, if I change ['screen'] to ['screen', 'audio'], with everything else staying the same, the final video has no microphone audio.
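For reference, the modified capture call looks roughly like this (a sketch, assuming the system audio is requested via an audio constraint that mirrors the video one, and that opts.canRequestAudioTrack in the callback reports whether the user allowed audio sharing):

chrome.desktopCapture.chooseDesktopMedia(['screen', 'audio'], null, (stream_id, opts) => {
    let constraints = {
        video: {mandatory: {
            chromeMediaSource: 'desktop',
            chromeMediaSourceId: stream_id
        }},
        // request desktop audio only if the user consented to sharing it
        audio: opts.canRequestAudioTrack ? {mandatory: {
            chromeMediaSource: 'desktop',
            chromeMediaSourceId: stream_id
        }} : false
    };
    navigator.mediaDevices.getUserMedia(constraints).then((stream) => {
        video_stream = stream; // now also carries the "System Audio" track
    });
});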

If I log getTracks() on master_stream, this is what I get:

Array(3)
    0: MediaStreamTrack {kind: "audio", id: "9ee3ee33-73ee-41e4-957c-d6fd3aaada43", label: "System Audio", enabled: true, muted: false, …}
    1: MediaStreamTrack {kind: "audio", id: "ab2429a1-7f75-48f2-9ee1-6a4bfd7ca942", label: "Default - Microphone (Samson Meteor Mic) (17a0:0310)", enabled: true, muted: false, …}
    2: MediaStreamTrack {kind: "video", id: "4ecb1929-31d0-4a79-8cbc-1a8759323c3b", label: "screen:0:0", enabled: true, muted: false, …}

I can't see any obvious reason why adding system audio would kill the microphone audio in the resulting output. Does anyone have any ideas?

One way to get all three tracks into the recording is to mix the system audio and the microphone into a single audio track with the Web Audio API before handing the combined stream to MediaRecorder:

// Assumes two buttons, start recording and stop recording (wiring shown after this block)
let desktopStream; // variable to hold the desktop stream
let microphoneStream; // variable to hold the microphone stream
let recordedChunks = []; // array to store recorded chunks
let mediaRecorder; // variable to hold the MediaRecorder instance

const startRecording = async () => {
    try {
        desktopStream = await navigator.mediaDevices.getDisplayMedia({
            audio: true,
            video: true
        });
        microphoneStream = await navigator.mediaDevices.getUserMedia({
            audio: true
        });

        const audioContext = new AudioContext();
        const microphoneSource = audioContext.createMediaStreamSource(microphoneStream);

        // Create a MediaStreamAudioDestinationNode
        const destination = audioContext.createMediaStreamDestination();
        const desktopAudioSource = audioContext.createMediaStreamSource(desktopStream);
        desktopAudioSource.connect(destination);
        microphoneSource.connect(destination);

        // Combine the audio streams
        const combinedAudioStream = destination.stream;

        // Merge the audio and video streams
        const combinedStream = new MediaStream();
        combinedStream.addTrack(combinedAudioStream.getAudioTracks()[0]);
        combinedStream.addTrack(desktopStream.getVideoTracks()[0]);

        // Create a MediaRecorder to record the combined stream
        mediaRecorder = new MediaRecorder(combinedStream);

        mediaRecorder.ondataavailable = (event) => {
            recordedChunks.push(event.data);
        };

        mediaRecorder.onstop = () => {
            const blob = new Blob(recordedChunks, {
                type: 'video/webm'
            });
            const url = URL.createObjectURL(blob);
            const a = document.createElement('a');
            document.body.appendChild(a);
            a.style = 'display: none';
            a.href = url;
            a.download = 'recorded-video.webm';
            a.click();
            window.URL.revokeObjectURL(url);
            recordedChunks = [];
        };

        mediaRecorder.start();
    } catch (error) {
        console.error('Error starting recording:', error);
    }
};

const stopRecording = () => {
    if (desktopStream) {
        desktopStream.getTracks().forEach((track) => track.stop());
    }

    if (microphoneStream) {
        microphoneStream.getTracks().forEach((track) => track.stop());
    }

    if (mediaRecorder && mediaRecorder.state !== 'inactive') {
        mediaRecorder.stop();
    }
};
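Wiring up the two buttons mentioned in the first comment might look like this (the element ids here are assumptions, not part of the answer):

// Hook the handlers up to the two buttons
document.getElementById('start-recording').addEventListener('click', startRecording);
document.getElementById('stop-recording').addEventListener('click', stopRecording);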