Getting the NV21 format from Android's ImageReader?

Goo*_*oey 8 android imaging image android-mediaprojection

I have no background in imaging or graphics, so please bear with me :)

I'm using JavaCV in one of my projects. In the examples, a Frame is constructed with a buffer of a certain size.

When using the public void onPreviewFrame(byte[] data, Camera camera) function in Android, copying this data byte array is no problem if you declare the Frame as new Frame(frameWidth, frameHeight, Frame.DEPTH_UBYTE, 2);, where frameWidth and frameHeight are declared as

Camera.Size previewSize = cameraParam.getPreviewSize();
int frameWidth = previewSize.width;
int frameHeight = previewSize.height;
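For context, a minimal sketch of that camera path (assuming the preview stays in the default NV21 format, and frame is the Frame instantiated with frameWidth and frameHeight as above):

@Override
public void onPreviewFrame(byte[] data, Camera camera) {
    // data holds one NV21 frame: width * height luma bytes followed by
    // width * height / 2 interleaved V/U bytes.
    ((ByteBuffer) frame.image[0].position(0)).put(data);
}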

Recently, Android added a way to capture your screen. Naturally, I want to grab those images and convert them to Frames as well. I modified Google's screen-capture sample code to use an ImageReader.

The ImageReader is constructed as ImageReader.newInstance(DISPLAY_WIDTH, DISPLAY_HEIGHT, PixelFormat.RGBA_8888, 2);, so it currently uses the RGBA_8888 pixel format. I use the following code to copy the bytes into the Frame, which is instantiated as new Frame(DISPLAY_WIDTH, DISPLAY_HEIGHT, Frame.DEPTH_UBYTE, 2);:

ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.remaining()];
buffer.get(bytes);
mImage.close();
((ByteBuffer) frame.image[0].position(0)).put(bytes);

But this gives me a java.nio.BufferOverflowException. I printed the sizes of both buffers: the Frame's buffer is 691200 bytes, while the bytes array above is 1413056 bytes. Figuring out how that latter number is built up failed, because I ran into this native call. So clearly this isn't going to work.
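For what it's worth, both numbers are consistent with the buffer layouts involved. One plausible breakdown, assuming a 720x480 capture and a row stride of 2944 bytes (736 pixels * 4 bytes), neither of which is stated above:

// Frame buffer: width * height * channels (DEPTH_UBYTE, 2 channels).
int frameBufferSize = 720 * 480 * 2;              // = 691200
// RGBA plane: rowStride * (height - 1) + pixelStride * width (no padding after the last row).
int rgbaBufferSize = 2944 * (480 - 1) + 4 * 720;  // = 1413056

In other words, the RGBA buffer uses four bytes per pixel plus row padding, so it can never fit into a Frame sized for two bytes per pixel.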

After quite a bit of digging I found out that the NV21 image format is "the default format for Camera preview images, when not otherwise set with setPreviewFormat(int)", but the ImageReader class does not support the NV21 format (see the format parameter). That's unfortunate. The documentation also reads: "For the android.hardware.camera2 API, the YUV_420_888 format is recommended for YUV output instead."

So I tried creating an ImageReader like this: ImageReader.newInstance(DISPLAY_WIDTH, DISPLAY_HEIGHT, ImageFormat.YUV_420_888, 2);, but that gives me java.lang.UnsupportedOperationException: The producer output buffer format 0x1 doesn't match the ImageReader's configured buffer format 0x23., so that doesn't work either.

As a last resort I tried converting RGBA_8888 to YUV myself, using for example this post, but I couldn't figure out how to obtain the int[] rgba that the answer requires.

So, TL;DR: how can I get NV21 image data, like you get in Android's public void onPreviewFrame(byte[] data, Camera camera) camera function, to instantiate my Frame and have it work while using Android's ImageReader (and MediaProjection)?

Edit (25-10-2016)

I created the following conversion to convert from RGBA to the NV21 format:

private class updateImage implements Runnable {

    private final Image mImage;

    public updateImage(Image image) {
        mImage = image;
    }

    @Override
    public void run() {

        int mWidth = mImage.getWidth();
        int mHeight = mImage.getHeight();

        // Four bytes per pixel: width * height * 4.
        byte[] rgbaBytes = new byte[mWidth * mHeight * 4];
        // put the data into the rgbaBytes array.
        mImage.getPlanes()[0].getBuffer().get(rgbaBytes);

        mImage.close(); // Access to the image is no longer needed, release it.

        // Create a YUV byte array: width * height * 3 / 2 (NV21 uses 12 bits per pixel).
        byte[] yuv = new byte[mWidth * mHeight * 3 / 2];
        RGBtoNV21(yuv, rgbaBytes, mWidth, mHeight);
        ((ByteBuffer) yuvImage.image[0].position(0)).put(yuv);
    }

    void RGBtoNV21(byte[] yuv420sp, byte[] argb, int width, int height) {
        final int frameSize = width * height;

        int yIndex = 0;
        int uvIndex = frameSize;

        int A, R, G, B, Y, U, V;
        int index = 0;
        int rgbIndex = 0;

        for (int i = 0; i < height; i++) {
            for (int j = 0; j < width; j++) {

                R = argb[rgbIndex++];
                G = argb[rgbIndex++];
                B = argb[rgbIndex++];
                A = argb[rgbIndex++]; // Ignored right now.

                // RGB to YUV conversion according to
                // https://en.wikipedia.org/wiki/YUV#Y.E2.80.B2UV444_to_RGB888_conversion
                Y = ((66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
                U = ((-38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
                V = ((112 * R - 94 * G - 18 * B + 128) >> 8) + 128;

                // NV21 has a plane of Y and interleaved planes of VU each sampled by a factor
                // of 2 meaning for every 4 Y pixels there are 1 V and 1 U.
                // Note the sampling is every other pixel AND every other scanline.
                yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
                if (i % 2 == 0 && index % 2 == 0) {
                    yuv420sp[uvIndex++] = (byte) ((V < 0) ? 0 : ((V > 255) ? 255 : V));
                    yuv420sp[uvIndex++] = (byte) ((U < 0) ? 0 : ((U > 255) ? 255 : U));
                }
                index++;
            }
        }
    }
}

The yuvImage object is initialized as yuvImage = new Frame(DISPLAY_WIDTH, DISPLAY_HEIGHT, Frame.DEPTH_UBYTE, 2);, where DISPLAY_WIDTH and DISPLAY_HEIGHT are simply two integers specifying the display size. This is the code where the background handler handles the onImageAvailable callback:

private final ImageReader.OnImageAvailableListener mOnImageAvailableListener
            = new ImageReader.OnImageAvailableListener() {

        @Override
        public void onImageAvailable(ImageReader reader) {
            mBackgroundHandler.post(new updateImage(reader.acquireNextImage()));
        }

    };

...

mImageReader = ImageReader.newInstance(DISPLAY_WIDTH, DISPLAY_HEIGHT, PixelFormat.RGBA_8888, 2);
mImageReader.setOnImageAvailableListener(mOnImageAvailableListener, mBackgroundHandler);

This approach works, at least I'm not getting any errors, but the output image is malformed. What is wrong with my conversion? An example of the image being created:

(image: example of the malformed output)

Edit (15-11-2016)

I have modified the RGBtoNV21 function to the following:

void RGBtoNV21(byte[] yuv420sp, int width, int height) {
    try {
        final int frameSize = width * height;

        int yIndex = 0;
        int uvIndex = frameSize;
        int pixelStride = mImage.getPlanes()[0].getPixelStride();
        int rowStride = mImage.getPlanes()[0].getRowStride();
        int rowPadding = rowStride - pixelStride * width;
        ByteBuffer buffer = mImage.getPlanes()[0].getBuffer();

        Bitmap bitmap = Bitmap.createBitmap(getResources().getDisplayMetrics(), width, height, Bitmap.Config.ARGB_8888);

        int A, R, G, B, Y, U, V;
        int offset = 0;

        for (int i = 0; i < height; i++) {
            for (int j = 0; j < width; j++) {

                // Useful link: https://stackoverflow.com/questions/26673127/android-imagereader-acquirelatestimage-returns-invalid-jpg

                R = (buffer.get(offset) & 0xff) << 16;     // R
                G = (buffer.get(offset + 1) & 0xff) << 8;  // G
                B = (buffer.get(offset + 2) & 0xff);       // B
                A = (buffer.get(offset + 3) & 0xff) << 24; // A
                offset += pixelStride;

                int pixel = 0;
                pixel |= R;     // R
                pixel |= G;  // G
                pixel |= B;       // B
                pixel |= A; // A
                bitmap.setPixel(j, i, pixel);

                // RGB to YUV conversion according to
                // https://en.wikipedia.org/wiki/YUV#Y.E2.80.B2UV444_to_RGB888_conversion
//                        Y = ((66 * R + 129 * G + 25 * B + 128) >> 8) + 16;
//                        U = ((-38 * R - 74 * G + 112 * B + 128) >> 8) + 128;
//                        V = ((112 * R - 94 * G - 18 * B + 128) >> 8) + 128;

                Y = (int) Math.round(R *  .299000 + G *  .587000 + B *  .114000);
                U = (int) Math.round(R * -.168736 + G * -.331264 + B *  .500000 + 128);
                V = (int) Math.round(R *  .500000 + G * -.418688 + B * -.081312 + 128);

                // NV21 has a plane of Y and interleaved planes of VU each sampled by a factor
                // of 2 meaning for every 4 Y pixels there are 1 V and 1 U.
                // Note the sampling is every other pixel AND every other scanline.
                yuv420sp[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
                if (i % 2 == 0 && j % 2 == 0) {
                    yuv420sp[uvIndex++] = (byte) ((V < 0) ? 0 : ((V > 255) ? 255 : V));
                    yuv420sp[uvIndex++] = (byte) ((U < 0) ? 0 : ((U > 255) ? 255 : U));
                }
            }
            offset += rowPadding;
        }

        File file = new File(Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_PICTURES).getAbsolutePath(), "/Awesomebitmap.png");
        FileOutputStream fos = new FileOutputStream(file);
        bitmap.compress(Bitmap.CompressFormat.PNG, 100, fos);
    } catch (Exception e) {
        Timber.e(e, "Converting image to NV21 went wrong.");
    }
}

Now the image is no longer malformed, but the chroma is off.

(image: output with incorrect chroma)

The right side is the bitmap created in that loop; the left side is the NV21 data saved to an image. So the RGB pixels are being handled correctly. Clearly the chroma is off, yet the RGB-to-YUV conversion should be the same as the one described on Wikipedia. What could be wrong here?

fad*_*den 5

In general, the point of ImageReader is to give you raw access to the pixels sent to a Surface with minimal overhead, so trying to have it perform color conversions doesn't make sense.

For the camera you get to pick one of two output formats (NV21 or YV12), so pick YV12; that's your raw YUV data. For screen capture the output will always be RGB, so you need to configure the ImageReader with RGBA_8888 (format 0x1) rather than YUV_420_888 (format 0x23). If you need YUV for that, you will have to do the conversion yourself. The ImageReader hands you a series of Plane objects rather than a byte[], so you will need to adapt to that.
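For example, a minimal sketch of that adaptation: copy the single RGBA_8888 plane into a tightly packed byte[], honoring pixelStride and rowStride (width and height here are assumed to match the values passed to ImageReader.newInstance):

Image image = reader.acquireLatestImage();
Image.Plane plane = image.getPlanes()[0];     // RGBA_8888 has a single plane
ByteBuffer buffer = plane.getBuffer();
int pixelStride = plane.getPixelStride();     // 4 bytes per pixel
int rowStride = plane.getRowStride();         // may be larger than width * pixelStride

byte[] packed = new byte[width * height * 4];
int dst = 0;
for (int row = 0; row < height; row++) {
    int src = row * rowStride;
    for (int col = 0; col < width; col++) {
        packed[dst++] = buffer.get(src);      // R
        packed[dst++] = buffer.get(src + 1);  // G
        packed[dst++] = buffer.get(src + 2);  // B
        packed[dst++] = buffer.get(src + 3);  // A
        src += pixelStride;
    }
}
image.close();
// packed now holds width * height RGBA pixels with no row padding,
// ready for a CPU-side RGBA -> NV21 conversion.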

