I want to record a video from an HTML canvas element at a specific frame rate.

I am using a CanvasCaptureMediaStream via canvas.captureStream(fps), and I also have access to the video track via const track = stream.getVideoTracks()[0], so I call track.requestFrame() to write frames out through a MediaRecorder.

I want to capture exactly one frame at a time and then change the canvas contents. Changing the canvas contents can take some time (images need to load, etc.), so I cannot capture the canvas in real time. Some of the changes on the canvas would also happen over 500 ms in real time, so those need to be slowed down as well to render one frame at a time.
The MediaRecorder API is meant to record live streams; editing is not what it was designed for, and frankly, it doesn't do it very well...

MediaRecorder itself has no concept of frame rate; that is normally defined by the MediaStreamTrack. However, a CanvasCaptureMediaStreamTrack doesn't really make clear what its frame rate is.

We can pass a parameter to HTMLCanvasElement.captureStream(), but that only tells it the maximum number of frames we want per second; it is not really an fps parameter.

Also, even if we stop drawing on the canvas, the recorder will keep extending the duration of the recorded video in real time (technically recording a single long frame in that case, I believe).

So... we have to hack around it...

One thing we can do with a MediaRecorder is pause() and resume() it.

So pausing just before the long drawing operation and resuming right after sounds easy? Yes... and also not that easy...

Once again, the frame rate is dictated by the MediaStreamTrack, and a MediaStreamTrack cannot be paused.

Well, actually there is one kind of MediaStreamTrack that can be paused, and luckily it is the CanvasCaptureMediaStreamTrack we are talking about here.

When we call captureStream() with a parameter of 0, we basically take manual control of when new frames are added to the stream.
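For illustration, here is a minimal sketch of that manual mode (browser-only; the variable names are mine, not from the answer's code):

```javascript
// A stream captured with frameRequestRate = 0 delivers no frames on its own.
const canvas = document.createElement('canvas');
canvas.getContext('2d').fillRect(0, 0, 1, 1); // the canvas must have been drawn on first
const stream = canvas.captureStream(0);       // 0 => no automatic frames
const [track] = stream.getVideoTracks();
// Nothing reaches consumers of the stream until we explicitly push a frame:
track.requestFrame();
```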
So here we can synchronize the MediaRecorder and the MediaStreamTrack to whatever frame rate we want. The basic workflow is:
await the_long_drawing_task;
resumeTheRecorder();
writeTheFrameToStream(); // track.requestFrame();
await wait( time_per_frame );
pauseTheRecorder();
Doing it this way, the recorder is awakened only for the per-frame time we decide, and a single frame is passed to the MediaStream during that interval, effectively mocking a constant-FPS drawing as far as the MediaRecorder is concerned.

But as always, hacks in this still-experimental area come with a lot of browser weirdness, and the demo below really only works in current Chrome...

For whatever reason, Firefox always produces a file with twice as many frames as requested, and it also occasionally prepends a very long first frame...

Also note that Chrome has a bug where it updates the canvas stream at drawing time, even though we initialized the stream with a frameRequestRate of 0. This means that if you start drawing before everything is ready, or if a single drawing on the canvas takes too long, the recorder will record half-baked frames we didn't ask for.

To work around that bug, we need a second canvas used only for streaming. All we do on that canvas is draw the source canvas onto it, which is always a fast enough operation not to hit the bug.
class FrameByFrameCanvasRecorder {
  constructor(source_canvas, FPS = 30) {
    this.FPS = FPS;
    this.source = source_canvas;
    const canvas = this.canvas = source_canvas.cloneNode();
    const ctx = this.drawingContext = canvas.getContext('2d');
    // we need to draw something on our canvas
    ctx.drawImage(source_canvas, 0, 0);
    const stream = this.stream = canvas.captureStream(0);
    const track = this.track = stream.getVideoTracks()[0];
    // Firefox still uses a non-standard CanvasCaptureMediaStream
    // instead of CanvasCaptureMediaStreamTrack
    if (!track.requestFrame) {
      track.requestFrame = () => stream.requestFrame();
    }
    // prepare our MediaRecorder
    const rec = this.recorder = new MediaRecorder(stream);
    const chunks = this.chunks = [];
    rec.ondataavailable = (evt) => chunks.push(evt.data);
    rec.start();
    // we need to be in 'paused' state
    waitForEvent(rec, 'start')
      .then((evt) => rec.pause());
    // expose a Promise for when it's done
    this._init = waitForEvent(rec, 'pause');
  }
  async recordFrame() {
    await this._init; // we have to wait for the recorder to be paused
    const rec = this.recorder;
    const canvas = this.canvas;
    const source = this.source;
    const ctx = this.drawingContext;
    if (canvas.width !== source.width ||
        canvas.height !== source.height) {
      canvas.width = source.width;
      canvas.height = source.height;
    }
    // start our timer now so whatever happens in between is not taken into account
    const timer = wait(1000 / this.FPS);
    // wake up the recorder
    rec.resume();
    await waitForEvent(rec, 'resume');
    // draw the current state of source on our internal canvas (triggers requestFrame in Chrome)
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    ctx.drawImage(source, 0, 0);
    // force write the frame
    this.track.requestFrame();
    // wait until our frame-time elapsed
    await timer;
    // sleep recorder
    rec.pause();
    await waitForEvent(rec, 'pause');
  }
  async export() {
    this.recorder.stop();
    this.stream.getTracks().forEach((track) => track.stop());
    await waitForEvent(this.recorder, 'stop');
    return new Blob(this.chunks);
  }
}
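Note that the class above relies on two small promise helpers, wait and waitForEvent, that are not included in the excerpt. A minimal sketch of what they presumably look like:

```javascript
// Resolve after ms milliseconds.
const wait = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Resolve with the event object the first time `type` fires on `target`.
const waitForEvent = (target, type) =>
  new Promise((resolve) => target.addEventListener(type, resolve, { once: true }));
```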
///////////////////
// how to use:
(async () => {
  const FPS = 30;
  const duration = 5; // seconds
  let x = 0;
  let frame = 0;
  const ctx = canvas.getContext('2d');
  ctx.textAlign = 'right';
  draw(); // we must have drawn on our canvas context before creating the recorder
  const recorder = new FrameByFrameCanvasRecorder(canvas, FPS);
  // draw one frame at a time
  while (frame++ < FPS * duration) {
    await longDraw(); // do the long drawing
    await recorder.recordFrame(); // record at constant FPS
  }
  // now all the frames have been drawn
  const recorded = await recorder.export(); // we can get our final video file
  vid.src = URL.createObjectURL(recorded);
  vid.onloadedmetadata = (evt) => vid.currentTime = 1e100; // workaround https://crbug.com/642012
  download(vid.src, 'movie.webm');

  // Fake long drawing operations that make real-time recording impossible
  function longDraw() {
    x = (x + 1) % canvas.width;
    draw(); // this triggers the bug in Chrome
    return wait(Math.random() * 300)
      .then(draw);
  }
  function draw() {
    ctx.fillStyle = 'white';
    ctx.fillRect(0, 0, canvas.width, canvas.height);
    ctx.fillStyle = 'black';
    ctx.fillRect(x, 0, 50, 50);
    ctx.fillText(frame + " / " + FPS * duration, 290, 140);
  }
})().catch(console.error);
I asked a similar question related to this one. In the meantime, I came up with a solution that overlaps with Kaiido's, and which I think is worth reading.

I added two tricks:
const recordFrames = (onstop, canvas, fps = 30) => {
  const chunks = [];
  // get Firefox to initialise the canvas
  canvas.getContext('2d').fillRect(0, 0, 0, 0);
  const stream = canvas.captureStream();
  const recorder = new MediaRecorder(stream);
  recorder.addEventListener('dataavailable', ({ data }) => chunks.push(data));
  recorder.addEventListener('stop', () => onstop(new Blob(chunks)));
  const frameDuration = 1000 / fps;
  const frame = (next, start) => {
    recorder.pause();
    api.error += Date.now() - start - frameDuration;
    setTimeout(next, 0); // helps Firefox record the right frame duration
  };
  const api = {
    error: 0,
    init() {
      recorder.start();
      recorder.pause();
    },
    step(next) {
      recorder.resume();
      setTimeout(frame, frameDuration, next, Date.now());
    },
    stop: () => recorder.stop()
  };
  return api;
};
How to use it:
const fps = 30;
const duration = 5000;
const animation = Something;
const videoOutput = blob => {
  const video = document.createElement('video');
  video.src = URL.createObjectURL(blob);
  document.body.appendChild(video);
};
const recording = recordFrames(videoOutput, canvas, fps);
const startRecording = () => {
  recording.init();
  animation.play();
};
// I am assuming you can call these from your library
const onAnimationRender = nextFrame => recording.step(nextFrame);
const onAnimationEnd = () => recording.step(recording.stop);
let now = 0;
const progression = () => {
  now = now + 1 + recording.error * fps / 1000;
  recording.error = 0;
  return now * 1000 / fps / duration;
};
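The compensation in progression() can be sanity-checked with plain numbers: each step normally advances the clock by one frame, and any accumulated timer overrun (error, in ms) is converted into extra frame progress. A standalone sketch of that arithmetic (the names here are mine):

```javascript
// Sketch of the drift-compensation arithmetic: advance by one frame per
// step, plus however many extra frames the accumulated overrun represents.
const fps = 30;
const frameMs = 1000 / fps;

const advance = (frames, errorMs) => frames + 1 + errorMs * fps / 1000;

let frames = 0;
frames = advance(frames, 0);       // an exact step: advance by 1 frame
frames = advance(frames, frameMs); // a full frame of overrun: advance by 2
console.log(frames);               // ≈ 3 frames of progress after two steps
```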
I found this solution satisfactory at 30fps in both Chrome and Firefox. I did not run into the Chrome bugs Kaiido mentions, and so did not implement anything to handle them.