Camera2 video recording without preview on Android: mp4 output file not fully playable

Asked by Mar*_*ark (score 10) · tags: video, mp4, codec, mediarecorder, android-camera2

I am trying to record video on my Samsung Galaxy S6 (which supports 1920x1080 at roughly 30 fps) from the rear camera (the one facing away from the user). I don't want to use any surface for a preview if I don't have to, since this is meant to happen purely in the background.

I seem to have it working, but the output file does not play back correctly. On my Windows 10 PC, Windows Media Player shows the first frame and then plays only the audio; VLC does not show any frames at all. On my phone, the recorded file is playable, but not entirely: it stays on the first frame for 5 to 8 seconds, then right at the end the remaining time drops to 0, the total time displayed changes, and only then do the actual video frames start playing. On my Mac (10.9.5), QuickTime does not show the video (although it reports no error), yet Google Picasa plays it perfectly. I wanted to try Picasa on my PC to see whether it works there too, but I cannot download it because Picasa has been discontinued.

I tried installing a Windows codec pack I found, but that did not fix anything. MediaInfo v0.7.85 reports the following about the file:

General
Complete name               : C:\...\1465655479915.mp4
Format                      : MPEG-4
Format profile              : Base Media / Version 2
Codec ID                    : mp42 (isom/mp42)
File size                   : 32.2 MiB
Duration                    : 15s 744ms
Overall bit rate            : 17.1 Mbps
Encoded date                : UTC 2016-06-11 14:31:50
Tagged date                 : UTC 2016-06-11 14:31:50
com.android.version         : 6.0.1

Video
ID                          : 1
Format                      : AVC
Format/Info                 : Advanced Video Codec
Format profile              : High@L4
Format settings, CABAC      : Yes
Format settings, ReFrames   : 1 frame
Format settings, GOP        : M=1, N=30
Codec ID                    : avc1
Codec ID/Info               : Advanced Video Coding
Duration                    : 15s 627ms
Bit rate                    : 16.2 Mbps
Width                       : 1 920 pixels
Height                      : 1 080 pixels
Display aspect ratio        : 16:9
Frame rate mode             : Variable
Frame rate                  : 0.000 (0/1000) fps
Minimum frame rate          : 0.000 fps
Maximum frame rate          : 30.540 fps
Color space                 : YUV
Chroma subsampling          : 4:2:0
Bit depth                   : 8 bits
Scan type                   : Progressive
Stream size                 : 0.00 Byte (0%)
Source stream size          : 31.7 MiB (98%)
Title                       : VideoHandle
Language                    : English
Encoded date                : UTC 2016-06-11 14:31:50
Tagged date                 : UTC 2016-06-11 14:31:50
mdhd_Duration               : 15627

Audio
ID                          : 2
Format                      : AAC
Format/Info                 : Advanced Audio Codec
Format profile              : LC
Codec ID                    : 40
Duration                    : 15s 744ms
Bit rate mode               : Constant
Bit rate                    : 256 Kbps
Channel(s)                  : 2 channels
Channel positions           : Front: L R
Sampling rate               : 48.0 KHz
Frame rate                  : 46.875 fps (1024 spf)
Compression mode            : Lossy
Stream size                 : 492 KiB (1%)
Title                       : SoundHandle
Language                    : English
Encoded date                : UTC 2016-06-11 14:31:50
Tagged date                 : UTC 2016-06-11 14:31:50

The code I use to create it is:

package invisiblevideorecorder;

import android.content.Context;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.media.CamcorderProfile;
import android.media.MediaRecorder;
import android.os.Environment;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Log;
import android.view.Surface;

import java.io.File;
import java.io.IOException;
import java.util.Arrays;

/**
 * @author Mark
 * @since 6/10/2016
 */
public class InvisibleVideoRecorder {
    private static final String TAG = "InvisibleVideoRecorder";
    private final CameraCaptureSessionStateCallback cameraCaptureSessionStateCallback = new CameraCaptureSessionStateCallback();
    private final CameraDeviceStateCallback cameraDeviceStateCallback = new CameraDeviceStateCallback();
    private MediaRecorder mediaRecorder;
    private CameraManager cameraManager;
    private Context context;

    private CameraDevice cameraDevice;

    private HandlerThread handlerThread;
    private Handler handler;

    public InvisibleVideoRecorder(Context context) {
        this.context = context;
        handlerThread = new HandlerThread("camera");
        handlerThread.start();
        handler = new Handler(handlerThread.getLooper());

        try {
            mediaRecorder = new MediaRecorder();

            mediaRecorder.setAudioSource(MediaRecorder.AudioSource.CAMCORDER);
            mediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);

            final String filename = context.getExternalFilesDir(Environment.DIRECTORY_MOVIES).getAbsolutePath() + File.separator + System.currentTimeMillis() + ".mp4";
            mediaRecorder.setOutputFile(filename);
            Log.d(TAG, "start: " + filename);

            // by using the profile, I don't think I need to do any of these manually:
//            mediaRecorder.setVideoEncodingBitRate(16000000);
//            mediaRecorder.setVideoFrameRate(30);
//            mediaRecorder.setCaptureRate(30);
//            mediaRecorder.setVideoSize(1920, 1080);
//            mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.MPEG_4_SP);
//            mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);

//            Log.d(TAG, "start: 1 " + CamcorderProfile.hasProfile(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_1080P));
            // true
//            Log.d(TAG, "start: 2 " + CamcorderProfile.hasProfile(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_HIGH_SPEED_1080P));
            // false
//            Log.d(TAG, "start: 3 " + CamcorderProfile.hasProfile(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_HIGH));
            // true

            CamcorderProfile profile = CamcorderProfile.get(CameraMetadata.LENS_FACING_BACK, CamcorderProfile.QUALITY_1080P);
            Log.d(TAG, "start: profile " + ToString.inspect(profile));
//          start: 0 android.media.CamcorderProfile@114016694 {
//                audioBitRate: 256000
//                audioChannels: 2
//                audioCodec: 3
//                audioSampleRate: 48000
//                duration: 30
//                fileFormat: 2
//                quality: 6
//                videoBitRate: 17000000
//                videoCodec: 2
//                videoFrameHeight: 1080
//                videoFrameRate: 30
//                videoFrameWidth: 1920
//            }
            mediaRecorder.setOrientationHint(0);
            mediaRecorder.setProfile(profile);
            mediaRecorder.prepare();
        } catch (IOException e) {
            Log.d(TAG, "start: exception" + e.getMessage());
        }

    }

    public void start() {
        Log.d(TAG, "start: ");

        cameraManager = (CameraManager) context.getSystemService(Context.CAMERA_SERVICE);
        try {
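            // Note: CameraMetadata.LENS_FACING_BACK is just the integer constant 1, so passing its
            // string value as a camera ID only selects the intended camera by coincidence. Camera IDs
            // are normally taken from CameraManager#getCameraIdList() and matched against
            // CameraCharacteristics#get(CameraCharacteristics.LENS_FACING).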
            cameraManager.openCamera(String.valueOf(CameraMetadata.LENS_FACING_BACK), cameraDeviceStateCallback, handler);
        } catch (CameraAccessException | SecurityException e) {
            Log.d(TAG, "start: exception " + e.getMessage());
        }

    }

    public void stop() {
        Log.d(TAG, "stop: ");
        mediaRecorder.stop();
        mediaRecorder.reset();
        mediaRecorder.release();
        cameraDevice.close();
        try {
            handlerThread.join();
        } catch (InterruptedException e) {

        }
    }

    private class CameraCaptureSessionStateCallback extends CameraCaptureSession.StateCallback {
        private final static String TAG = "CamCaptSessionStCb";

        @Override
        public void onActive(CameraCaptureSession session) {
            Log.d(TAG, "onActive: ");
            super.onActive(session);
        }

        @Override
        public void onClosed(CameraCaptureSession session) {
            Log.d(TAG, "onClosed: ");
            super.onClosed(session);
        }

        @Override
        public void onConfigured(CameraCaptureSession session) {
            Log.d(TAG, "onConfigured: ");
        }

        @Override
        public void onConfigureFailed(CameraCaptureSession session) {
            Log.d(TAG, "onConfigureFailed: ");
        }

        @Override
        public void onReady(CameraCaptureSession session) {
            Log.d(TAG, "onReady: ");
            super.onReady(session);
            try {
                CaptureRequest.Builder builder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
                builder.addTarget(mediaRecorder.getSurface());
                CaptureRequest request = builder.build();
                session.setRepeatingRequest(request, null, handler);
                mediaRecorder.start();
            } catch (CameraAccessException e) {
                Log.d(TAG, "onConfigured: " + e.getMessage());

            }
        }

        @Override
        public void onSurfacePrepared(CameraCaptureSession session, Surface surface) {
            Log.d(TAG, "onSurfacePrepared: ");
            super.onSurfacePrepared(session, surface);
        }
    }

    private class CameraDeviceStateCallback extends CameraDevice.StateCallback {
        private final static String TAG = "CamDeviceStateCb";

        @Override
        public void onClosed(CameraDevice camera) {
            Log.d(TAG, "onClosed: ");
            super.onClosed(camera);
        }

        @Override
        public void onDisconnected(CameraDevice camera) {
            Log.d(TAG, "onDisconnected: ");
        }

        @Override
        public void onError(CameraDevice camera, int error) {
            Log.d(TAG, "onError: ");
        }

        @Override
        public void onOpened(CameraDevice camera) {
            Log.d(TAG, "onOpened: ");
            cameraDevice = camera;
            try {
                camera.createCaptureSession(Arrays.asList(mediaRecorder.getSurface()), cameraCaptureSessionStateCallback, handler);
            } catch (CameraAccessException e) {
                Log.d(TAG, "onOpened: " + e.getMessage());
            }
        }
    }

}
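For illustration, here is a minimal sketch of how a class like this might be driven from an Activity or a Service. The caller below is hypothetical (it is not shown in the question) and assumes the CAMERA and RECORD_AUDIO runtime permissions have already been granted:

import android.content.Context;
import android.os.Handler;
import android.os.Looper;

// Hypothetical caller, for illustration only; not part of the original question.
public class RecorderDriver {
    public static void recordForFifteenSeconds(Context context) {
        InvisibleVideoRecorder recorder = new InvisibleVideoRecorder(context);
        recorder.start();  // opens the camera; MediaRecorder.start() runs once the capture session is ready

        // Stop after roughly 15 seconds, matching the duration of the sample file above.
        new Handler(Looper.getMainLooper()).postDelayed(recorder::stop, 15000);
    }
}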

I followed the Android source code (tests and apps) as well as a few examples I found on GitHub, since the camera2 API is not well documented yet.

Is there something obvious I am doing wrong? Or am I just missing a codec for QuickTime on my Mac and for Windows Media Player and VLC on my PC? I have not tried playing the files on Linux yet, so I do not know what happens there. Oh, and if I upload the mp4 files to photos.google.com, they also play back completely correctly there.

Thanks! Mark

Answered by Gra*_*per (score 8)

My team ran into a similar problem while developing a plugin based on the Camera2 API, but it only affected the Samsung Galaxy S7 (an S6 we also use for testing did not show this behavior).

The problem appears to be caused by a bug in Samsung's camera firmware that is triggered when the device comes out of deep sleep (the ultra-low-power mode in Android 6.0 Marshmallow). After resuming from deep sleep, the first frame of any video captured and encoded with the Camera2 MediaRecorder has an extremely long frame duration, sometimes as long as or longer than the total duration of the video itself.

So on playback, that first frame is displayed for its whole duration while the audio keeps playing; once the first frame has finished displaying, the remaining frames play back normally.
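One way to confirm this numerically on the device is to compare the presentation timestamps of the first two video samples: at 30 fps the gap should be roughly 33 ms, but on an affected recording the first gap is enormous. The following is only a hedged diagnostic sketch using the standard MediaExtractor API, not something from the original answer:

import android.media.MediaExtractor;
import android.media.MediaFormat;

import java.io.IOException;

// Hypothetical helper: reports the duration of the first video sample as the gap between
// the first two sample timestamps. The recordings above have no B-frames (ReFrames: 1),
// so presentation order matches sample order and this gap is the first frame's duration.
public class FirstFrameProbe {
    public static long firstVideoFrameDurationUs(String path) throws IOException {
        MediaExtractor extractor = new MediaExtractor();
        try {
            extractor.setDataSource(path);
            for (int i = 0; i < extractor.getTrackCount(); i++) {
                MediaFormat format = extractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                if (mime != null && mime.startsWith("video/")) {
                    extractor.selectTrack(i);
                    long first = extractor.getSampleTime();  // presentation time of sample 0, in microseconds
                    if (!extractor.advance()) {
                        return -1;                           // track has only one sample
                    }
                    long second = extractor.getSampleTime();
                    return second - first;                   // ~33,333 us is normal at 30 fps
                }
            }
            return -1;                                       // no video track found
        } finally {
            extractor.release();
        }
    }
}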

We found other people with similar problems discussing this in a GitHub issue:

The problem is that some devices running Marshmallow have an issue with deep sleep. It seems to be CPU-related: an S7 on Verizon does not have the problem, but an S7 on AT&T does. I saw it on my Verizon S6 after updating to Marshmallow.

To reproduce it, restart the device while it is connected over USB and run the sample; everything should be fine. Then disconnect the device, let it go into deep sleep (screen off, no movement for 5 minutes), and try again. Once the device has entered deep sleep, the problem shows up.

We ended up using the workaround proposed by cybaker: when the video file has been created, check the duration of its first video frame, and if it looks wrong, rewrite the video with a reasonable frame duration:

DataSource channel = new FileDataSourceImpl(rawFile);
IsoFile isoFile = new IsoFile(channel);

List<TrackBox> trackBoxes = isoFile.getMovieBox().getBoxes(TrackBox.class);
boolean sampleError = false;
for (TrackBox trackBox : trackBoxes) {
    TimeToSampleBox.Entry firstEntry = trackBox.getMediaBox().getMediaInformationBox().getSampleTableBox().getTimeToSampleBox().getEntries().get(0);

    // Detect if first sample is a problem and fix it in isoFile
    // This is a hack. The audio deltas are 1024 for my files, and video deltas about 3000
    // 10000 seems sufficient since for 30 fps the normal delta is about 3000
    if(firstEntry.getDelta() > 10000) {
        sampleError = true;
        firstEntry.setDelta(3000);
    }
}

if(sampleError) {
    Movie movie = new Movie();
    for (TrackBox trackBox : trackBoxes) {
        movie.addTrack(new Mp4TrackImpl(channel.toString() + "[" + trackBox.getTrackHeaderBox().getTrackId() + "]", trackBox));
    }
    movie.setMatrix(isoFile.getMovieBox().getMovieHeaderBox().getMatrix());
    Container out = new DefaultMp4Builder().build(movie);

    //delete file first!
    FileChannel fc = new RandomAccessFile(rawFile.getName(), "rw").getChannel();
    out.writeContainer(fc);
    fc.close();
    Log.d(TAG, "Finished correcting raw video");
}
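For reference, the same workaround packaged as a self-contained helper. This is a sketch under a few assumptions: it targets the com.googlecode.mp4parser "isoparser" library that the snippet above appears to be written against (the import paths below come from that library), it writes the corrected movie to a separate output file instead of reusing rawFile.getName(), and the 10000/3000 delta thresholds are simply carried over from the snippet:

import android.util.Log;

import com.coremedia.iso.IsoFile;
import com.coremedia.iso.boxes.Container;
import com.coremedia.iso.boxes.TimeToSampleBox;
import com.coremedia.iso.boxes.TrackBox;
import com.googlecode.mp4parser.DataSource;
import com.googlecode.mp4parser.FileDataSourceImpl;
import com.googlecode.mp4parser.authoring.Movie;
import com.googlecode.mp4parser.authoring.Mp4TrackImpl;
import com.googlecode.mp4parser.authoring.builder.DefaultMp4Builder;

import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.util.List;

public class FirstFrameFixer {
    private static final String TAG = "FirstFrameFixer";

    /**
     * Checks the first sample delta of every track in rawFile and, if one looks broken,
     * writes a corrected copy to fixedFile. Returns true if a corrected copy was written.
     */
    public static boolean fixIfNeeded(File rawFile, File fixedFile) throws IOException {
        DataSource channel = new FileDataSourceImpl(rawFile);
        IsoFile isoFile = new IsoFile(channel);

        List<TrackBox> trackBoxes = isoFile.getMovieBox().getBoxes(TrackBox.class);
        boolean sampleError = false;
        for (TrackBox trackBox : trackBoxes) {
            TimeToSampleBox.Entry firstEntry = trackBox.getMediaBox().getMediaInformationBox()
                    .getSampleTableBox().getTimeToSampleBox().getEntries().get(0);
            // Audio deltas are ~1024 and normal 30 fps video deltas ~3000 in these files,
            // so anything above 10000 is treated as the broken first frame.
            if (firstEntry.getDelta() > 10000) {
                sampleError = true;
                firstEntry.setDelta(3000);
            }
        }

        if (sampleError) {
            Movie movie = new Movie();
            for (TrackBox trackBox : trackBoxes) {
                movie.addTrack(new Mp4TrackImpl(channel.toString() + "["
                        + trackBox.getTrackHeaderBox().getTrackId() + "]", trackBox));
            }
            movie.setMatrix(isoFile.getMovieBox().getMovieHeaderBox().getMatrix());
            Container out = new DefaultMp4Builder().build(movie);

            // Write the corrected movie to a separate file rather than over the still-open source.
            FileChannel fc = new RandomAccessFile(fixedFile, "rw").getChannel();
            out.writeContainer(fc);
            fc.close();
            Log.d(TAG, "Finished correcting raw video");
        }

        isoFile.close();
        return sampleError;
    }
}

A caller would typically run this right after MediaRecorder.stop() on the file that was just recorded and, if the method returns true, replace the original file with the corrected copy.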

Hopefully this points you in the right direction!