Tag: cvpixelbuffer

How do I correctly address pixels in a CVPixelBuffer?

The short question is: what is the formula to address a pixel value in a CVPixelBuffer?

I'm trying to convert a CVPixelBuffer into a flat byte array and noticed a few odd things. The CVPixelBuffer is obtained from a CMSampleBuffer. Its width and height are 852x640 pixels. That adds up to 545280 pixels, which, at 4 bytes per pixel, would need 2181120 bytes.

Now the first odd thing is that the CVPixelBuffer reports 3456 bytes per row, which is enough room for 864 pixels. Where do those extra 12 pixels come from? If a row in the final image is only 852 pixels wide, but a row in the CVPixelBuffer actually holds 864 pixels, how do I know which bytes I need to copy, and which bytes are unused? Are they in fact unused, by the way?

The other thing is the data size reported by the CVPixelBuffer, which is 2211848 bytes. If we multiply the 3456 bytes per row by the 640 rows, we get 2211840 bytes. Again we are left with 8 extra bytes. What is going on with those 8 bytes? Are they unused? Do they sit at the end?

Any advice that sheds light on this is welcome, thanks.
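
For what it's worth, a minimal sketch of the addressing arithmetic in question, assuming a packed 32-bit pixel format such as BGRA (the exact format is not stated above): the offset of pixel (x, y) is y * bytesPerRow + x * 4, where bytesPerRow comes from CVPixelBufferGetBytesPerRow and may exceed width * 4 because of row padding.

import CoreVideo

// Reads the four bytes of the pixel at (x, y), assuming a packed 32-bit pixel format.
// Rows are strided by bytesPerRow (3456 in the question), not by width * 4 (3408),
// so only the first width * 4 bytes of each row are image data; the rest is padding.
func pixelBytes(in pixelBuffer: CVPixelBuffer, x: Int, y: Int) -> [UInt8]? {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer),
          x < CVPixelBufferGetWidth(pixelBuffer),
          y < CVPixelBufferGetHeight(pixelBuffer) else { return nil }

    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let offset = y * bytesPerRow + x * 4
    let bytes = base.assumingMemoryBound(to: UInt8.self)
    return [bytes[offset], bytes[offset + 1], bytes[offset + 2], bytes[offset + 3]]
}

When flattening to a plain width * height * 4 byte array, copying width * 4 bytes from each row and skipping the remainder of the row (and any trailing bytes beyond height * bytesPerRow) avoids the padding entirely.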

ios cmsamplebuffer cvpixelbuffer

5
Votes
1
Answer
1271
Views

Creating a CVPixelBuffer from pixel data, but the resulting image is distorted

I grab pixels with an OpenGL ES call (glReadPixels) or by other means, then create a CVPixelBuffer (with or without a CGImage) for video recording, but the final picture is distorted. Testing on iPhone 5c, 5s and 6, it happens on the iPhone 6.

It looks like this:

[image: distorted output]

Here is the code:

CGSize viewSize=self.glView.bounds.size;
NSInteger myDataLength = viewSize.width * viewSize.height * 4;

// allocate array and read pixels into it.
GLubyte *buffer = (GLubyte *) malloc(myDataLength);
glReadPixels(0, 0, viewSize.width, viewSize.height, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

// gl renders "upside down" so swap top to bottom into new array.
// there's gotta be a better way, but this works.
GLubyte *buffer2 = (GLubyte *) malloc(myDataLength);
for(int y = 0; y < viewSize.height; y++)
{
    for(int x = 0; x < viewSize.width* …
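
The skew in screenshots like this is the classic symptom of a bytes-per-row mismatch: the destination CVPixelBuffer may pad each row to a larger stride than the tightly packed width * 4 bytes produced by glReadPixels, so copying everything in one block shears the image on devices where the strides differ. A sketch of a row-by-row copy that respects the destination stride (the 32BGRA format and the attribute keys here are assumptions, not the poster's exact setup):

import CoreVideo
import Foundation

// Copies tightly packed 4-byte-per-pixel rows (width * 4 bytes each) into a CVPixelBuffer
// whose rows may be padded to a larger stride, which is what prevents the diagonal shear.
func makePixelBuffer(from packedPixels: UnsafeRawPointer, width: Int, height: Int) -> CVPixelBuffer? {
    let attrs = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue,
                 kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue]
    var pixelBuffer: CVPixelBuffer?
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_32BGRA, attrs as CFDictionary,
                              &pixelBuffer) == kCVReturnSuccess,
          let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }
    guard let dest = CVPixelBufferGetBaseAddress(buffer) else { return nil }

    let destBytesPerRow = CVPixelBufferGetBytesPerRow(buffer)   // may be larger than width * 4
    let sourceBytesPerRow = width * 4
    for row in 0..<height {
        memcpy(dest + row * destBytesPerRow,
               packedPixels + row * sourceBytesPerRow,
               sourceBytesPerRow)
    }
    return buffer
}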

image opengl-es avfoundation ios cvpixelbuffer

5
Votes
1
Answer
1497
Views

CVMetalTextureCacheCreateTextureFromImage always returns null

I'm trying to render I420 (YCbCr planar) via MetalKit.

Most examples use a CMSampleBuffer coming from the camera,

but my goal is to use given I420 bytes.

I do something like this:

let data = NSMutableData(contentsOfURL: NSBundle.mainBundle().URLForResource("yuv_640_360", withExtension: "yuv")!)

// Cache for Y

CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, self.device!, nil, &videoTextureCache)

var pixelBuffer: CVPixelBuffer?

CVPixelBufferCreateWithBytes(kCFAllocatorDefault, Int(size.width), Int(size.height), kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, data.mutableBytes, Int(size.width), nil, nil, [
    "kCVPixelBufferMetalCompatibilityKey": true,
    "kCVPixelBufferOpenGLCompatibilityKey": true,
    "kCVPixelBufferIOSurfacePropertiesKey": []
    ]
    , &pixelBuffer)

// Y texture

var yTextureRef : Unmanaged<CVMetalTexture>?

let yWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)

let yHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)

let result = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, (videoTextureCache?.takeUnretainedValue())!, pixelBuffer, nil, MTLPixelFormat.R8Unorm, yWidth, yHeight, 0, &yTextureRef);

Basically the code is almost the same as the other samples, except that I create my own CVPixelBuffer.

I get no error when creating the CVPixelBuffer and the CVMetalTexture,

but it always returns null for yTexture.

How do I create a correct CVPixelBuffer and use it for rendering?
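
One frequently cited cause of a null texture here is that CVPixelBufferCreateWithBytes wraps caller-owned memory and cannot attach an IOSurface, while CVMetalTextureCacheCreateTextureFromImage needs an IOSurface-backed buffer; another is that the attribute dictionary above passes the key names as string literals rather than the actual constants. A sketch under those assumptions, creating an IOSurface-backed buffer with CVPixelBufferCreate and copying the plane bytes into it (it assumes the chroma bytes are already interleaved CbCr; interleaving I420's separate Cb and Cr planes is an extra step not shown):

import CoreVideo
import Foundation

// Creates an IOSurface-backed bi-planar 4:2:0 buffer and copies the given bytes into it.
// yBytes is width*height luma bytes; cbcrBytes is width*(height/2) interleaved CbCr bytes.
func makeBiPlanarPixelBuffer(yBytes: UnsafeRawPointer, cbcrBytes: UnsafeRawPointer,
                             width: Int, height: Int) -> CVPixelBuffer? {
    let attrs: [CFString: Any] = [
        kCVPixelBufferMetalCompatibilityKey: true,          // the constants themselves, not their names as strings
        kCVPixelBufferIOSurfacePropertiesKey: NSDictionary() // empty dictionary requests a default IOSurface
    ]
    var pixelBuffer: CVPixelBuffer?
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                              attrs as CFDictionary, &pixelBuffer) == kCVReturnSuccess,
          let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    // Plane 0: full-resolution luma, copied row by row against the buffer's own stride.
    if let dst = CVPixelBufferGetBaseAddressOfPlane(buffer, 0) {
        let stride = CVPixelBufferGetBytesPerRowOfPlane(buffer, 0)
        for row in 0..<height {
            memcpy(dst + row * stride, yBytes + row * width, width)
        }
    }
    // Plane 1: half-height interleaved CbCr, width bytes per row for 4:2:0.
    if let dst = CVPixelBufferGetBaseAddressOfPlane(buffer, 1) {
        let stride = CVPixelBufferGetBytesPerRowOfPlane(buffer, 1)
        for row in 0..<(height / 2) {
            memcpy(dst + row * stride, cbcrBytes + row * width, width)
        }
    }
    return buffer
}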

core-video ios metal cvpixelbuffer metalkit

5
Votes
1
Answer
1438
Views

Memory leak with CVPixelBuffer in a dispatch queue when using Vision

I'm using a renderer class in my capture pipeline to add CI filters to video. In the renderer's copyRenderedPixelBuffer I also want to copy the pixel buffer and send it to Vision to detect facial landmarks.

I created a singleton for Vision with a serial dispatch queue. The problem is that as soon as I add the dispatch queue, the pixel buffers are never freed from memory, so the leaks pile up (even though the pixel buffer is released in the Objective-C code). With the dispatch queue taken out, the memory leak disappears (but the video preview lags badly because of the Vision work).

Any help is much appreciated!

- (CVPixelBufferRef)copyRenderedPixelBuffer:(CVPixelBufferRef)pixelBuffer
{
    OSStatus err = noErr;
    CVPixelBufferRef renderedOutputPixelBuffer = NULL;

    CVPixelBufferRef visionOutputPixelBuffer;

    CIImage *sourceImage = nil;

    err = CVPixelBufferPoolCreatePixelBuffer( kCFAllocatorDefault, _bufferPool, &renderedOutputPixelBuffer );
    if ( err ) {
        NSLog(@"Cannot obtain a pixel buffer from the buffer pool (%d)", (int)err );
        goto bail;
    }

    err = CVPixelBufferPoolCreatePixelBuffer( kCFAllocatorDefault, _bufferPool, &visionOutputPixelBuffer );
    if ( err ) {
        NSLog(@"Cannot obtain a pixel buffer from the buffer pool (%d)", (int)err );

    }

    // Vision

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    int bufferHeight …
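
For comparison, a Swift sketch of the dispatch pattern, under the assumption that the growth is really back-pressure: every captured frame retains its pixel buffer inside a block queued for Vision, and a serial queue that cannot keep up accumulates those blocks. Dropping frames while one is still being analysed keeps at most one buffer alive in pending work at a time (the class, queue label and landmarks request are illustrative, not the poster's renderer API):

import Foundation
import CoreVideo
import Vision

// A singleton that runs Vision on a serial queue but drops frames while a request is
// still in flight, so at most one pixel buffer is retained by pending work at a time.
// Not strictly thread-safe around the flag; kept minimal to show the shape of the idea.
final class VisionFaceDetector {
    static let shared = VisionFaceDetector()
    private let visionQueue = DispatchQueue(label: "vision.landmarks.queue")
    private var isProcessing = false

    func detectLandmarks(in pixelBuffer: CVPixelBuffer) {
        guard !isProcessing else { return }   // skip this frame; a previous one is still being analysed
        isProcessing = true

        visionQueue.async { [weak self] in
            defer { self?.isProcessing = false }
            autoreleasepool {
                let request = VNDetectFaceLandmarksRequest()
                let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
                try? handler.perform([request])
                // Consume request.results here; the pixel buffer is released once this
                // block and its autorelease pool end.
            }
        }
    }
}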

memory-leaks grand-central-dispatch swift cvpixelbuffer apple-vision

5
Votes
0
Answers
392
Views

How to convert a CVImageBuffer to a UIImage?

I have a temporary variable tmpPixelBuffer holding pixel buffer data, and it is not nil; when a metadata object is detected I want to create an image from that buffer, so I can crop the metadata image out of it.

The image is always nil. What am I doing wrong?

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {

    tmpPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
}


func captureOutput(captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [AnyObject]!, fromConnection connection: AVCaptureConnection!) {

    let image = CIImage(CVPixelBuffer: tmpPixelBuffer)
    let context = CIContext()
    let cgiImage = context.createCGImage(image, fromRect: image.extent())
    let capturedImage = UIImage(CGImage: cgiImage)
    ...
}

I also tried this:

func captureOutput(captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [AnyObject]!, fromConnection connection: AVCaptureConnection!) {

    let image = CIImage(CVPixelBuffer: tmpPixelBuffer)
    let context = CIContext(options: nil)

    let cgiImage = context.createCGImage(image, fromRect: …
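
For reference, the same conversion in current Swift, with the CIContext created once and reused rather than per call; whether this addresses the nil result in the Swift 1.x code above may also depend on what tmpPixelBuffer actually holds at the moment the metadata callback fires, since buffers coming out of didOutputSampleBuffer belong to a pool that the capture system recycles:

import UIKit
import CoreImage
import CoreVideo

// One CIContext, created once; building a new context per frame is expensive.
private let sharedCIContext = CIContext()

// Converts a pixel buffer to a UIImage, returning nil only if CGImage creation fails.
func makeUIImage(from pixelBuffer: CVPixelBuffer) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    guard let cgImage = sharedCIContext.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}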

core-image ios swift cvpixelbuffer avcaptureoutput

4
Votes
3
Answers
4047
Views

Replacing part of a pixel buffer with white pixels in iOS

I'm capturing live video with the iPhone camera and feeding the pixel buffers into a network that does some object recognition. Here is the relevant code (I'm not posting the code that sets up the AVCaptureSession and so on, since that part is completely standard):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    OSType sourcePixelFormat = CVPixelBufferGetPixelFormatType( pixelBuffer );
    int doReverseChannels;
    if ( kCVPixelFormatType_32ARGB == sourcePixelFormat ) {
        doReverseChannels = 1;
    } else if ( kCVPixelFormatType_32BGRA == sourcePixelFormat ) {
        doReverseChannels = 0;
    } else {
        assert(false);
    }

    const int sourceRowBytes = (int)CVPixelBufferGetBytesPerRow( pixelBuffer );
    const int width = (int)CVPixelBufferGetWidth( pixelBuffer );
    const int fullHeight = (int)CVPixelBufferGetHeight( pixelBuffer );
    CVPixelBufferLockBaseAddress( pixelBuffer, 0 );
    unsigned char* sourceBaseAddr = CVPixelBufferGetBaseAddress( pixelBuffer …
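
A sketch of the fill itself, assuming the buffer is one of the two 32-bit formats checked above: writing 0xFF into every byte of the region produces white in either BGRA or ARGB order, so the doReverseChannels distinction does not matter for this particular operation, and only the bytes-per-row stride has to be respected (the rect parameter and the clamping are illustrative additions):

import CoreVideo
import CoreGraphics
import Foundation

// Fills a rectangular region of a 32-bit (BGRA or ARGB) pixel buffer with white.
func fillRectWithWhite(_ pixelBuffer: CVPixelBuffer, rect: CGRect) {
    CVPixelBufferLockBaseAddress(pixelBuffer, [])
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, []) }
    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return }

    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)

    // Clamp the requested rect to the buffer bounds.
    let x0 = max(0, Int(rect.minX)), x1 = min(width, Int(rect.maxX))
    let y0 = max(0, Int(rect.minY)), y1 = min(height, Int(rect.maxY))
    guard x1 > x0, y1 > y0 else { return }

    for row in y0..<y1 {
        // Each row starts at base + row * bytesPerRow; each pixel is 4 bytes wide.
        memset(base + row * bytesPerRow + x0 * 4, 0xFF, (x1 - x0) * 4)
    }
}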

objective-c avfoundation ios cvpixelbuffer

4
Votes
2
Answers
1825
Views

What are planes in a CVPixelBuffer?

A CVPixelBuffer object contains one or more planes. (Reference) There are methods to get the number of planes, and each plane's height and base address.

So what exactly is a plane? And how is it laid out inside a CVPixelBuffer?

Sample:

<CVPixelBuffer 0x1465f8b30 width=1280 height=720 pixelFormat=420v iosurface=0x14a000008 planes=2>
    <Plane 0 width=1280 height=720 bytesPerRow=1280>
    <Plane 1 width=640 height=360 bytesPerRow=1280>
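
In the 420v dump above, plane 0 is the full-resolution luma (Y) plane and plane 1 is the half-resolution chroma plane with Cb and Cr interleaved, which is why plane 1 is 640x360 yet still has 1280 bytes per row (two bytes per chroma sample). A short sketch that walks the planes and prints the same per-plane information:

import CoreVideo

// Walks the planes of a pixel buffer and prints the same per-plane information
// as the debug description above.
func dumpPlanes(of pixelBuffer: CVPixelBuffer) {
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    let planeCount = CVPixelBufferGetPlaneCount(pixelBuffer)   // 0 for packed formats, 2 for 420v
    for plane in 0..<planeCount {
        let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, plane)
        let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, plane)
        let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, plane)
        let base = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, plane)
        print("Plane \(plane): \(width)x\(height), bytesPerRow=\(bytesPerRow), base=\(String(describing: base))")
    }
}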

ios cvpixelbuffer

4
Votes
1
Answer
820
Views

Converting a UIImage to an 8-Gray pixel buffer

I'm able to convert a UIImage to an ARGB CVPixelBuffer, but now I'm trying to convert the UIImage to a grayscale buffer. I thought I had it, since the code goes through, but the Core ML model complains:

"Error Domain = com.apple.CoreML Code = 1"图像不是预期的类型8-Gray,而是Unsupported(40)"

This is the grayscale CGContext I have so far:

public func pixelBufferGray(width: Int, height: Int) -> CVPixelBuffer? {

        var pixelBuffer : CVPixelBuffer?
        let attributes = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue, kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue]

        let status = CVPixelBufferCreate(kCFAllocatorDefault, Int(width), Int(height), kCVPixelFormatType_8IndexedGray_WhiteIsZero, attributes as CFDictionary, &pixelBuffer)

        guard status == kCVReturnSuccess, let imageBuffer = pixelBuffer else {
            return nil
        }

        CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))

        let imageData =  CVPixelBufferGetBaseAddress(imageBuffer)

        guard let context = CGContext(data: imageData, width: Int(width), height:Int(height),
                                      bitsPerComponent: 8, bytesPerRow: …
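
For comparison, a sketch that creates the buffer as kCVPixelFormatType_OneComponent8 (one 8-bit luminance byte per pixel) and draws into a device-gray CGContext; this single-component format is what Core ML image inputs declared as 8-Gray are generally reported to accept, and the "Unsupported (40)" in the error matches the raw value of kCVPixelFormatType_8IndexedGray_WhiteIsZero (0x28 = 40), i.e. the indexed format itself appears to be what Core ML rejects:

import UIKit
import CoreVideo
import CoreGraphics

// Builds a single-component 8-bit grayscale buffer and draws the image into a
// device-gray CGContext; width/height are the model's expected input dimensions.
func grayPixelBuffer(from image: UIImage, width: Int, height: Int) -> CVPixelBuffer? {
    let attributes = [kCVPixelBufferCGImageCompatibilityKey: kCFBooleanTrue,
                      kCVPixelBufferCGBitmapContextCompatibilityKey: kCFBooleanTrue]
    var pixelBuffer: CVPixelBuffer?
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_OneComponent8,
                              attributes as CFDictionary, &pixelBuffer) == kCVReturnSuccess,
          let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    guard let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                                  width: width, height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                  space: CGColorSpaceCreateDeviceGray(),
                                  bitmapInfo: CGImageAlphaInfo.none.rawValue),
          let cgImage = image.cgImage else { return nil }

    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    return buffer
}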

uiimage ios swift cvpixelbuffer coreml

4
Votes
1
Answer
1693
Views

Orientation issue when converting an ARFrame's captured image to a UIImage

I want to detect a ball and have the AR model interact with it. I used OpenCV for the ball detection and send the center of the ball, which I can use in a hitTest to get coordinates in the sceneView. I've been converting the CVPixelBuffer to a UIImage with the following function:

static func convertToUIImage(buffer: CVPixelBuffer) -> UIImage?{
    let ciImage = CIImage(cvPixelBuffer: buffer)
    let temporaryContext = CIContext(options: nil)
    if let temporaryImage = temporaryContext.createCGImage(ciImage, from: CGRect(x: 0, y: 0, width: CVPixelBufferGetWidth(buffer), height: CVPixelBufferGetHeight(buffer)))
    {
        let capturedImage = UIImage(cgImage: temporaryImage)
        return capturedImage
    }
    return nil
}

This gives me a rotated image:

[image: rotated captured image]

Then I found I could change the orientation with:

let capturedImage = UIImage(cgImage: temporaryImage, scale: 1.0, orientation: .right)

While this gives the correct orientation when the device is in portrait, rotating the device to landscape gives a rotated image again.

Now I'm thinking of handling it in viewWillTransition. But before that I'd like to know:

  1. Is there another way to convert the image so the orientation comes out correct? (One possible approach is sketched below.)
  2. Why does this happen?
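
On the first point, a sketch of one possible approach, assuming the root cause is that ARFrame.capturedImage is always delivered in the sensor's native landscape orientation, so the rotation to apply depends on the current interface orientation rather than being a fixed .right:

import UIKit
import CoreImage
import CoreVideo

// Maps the current interface orientation to the rotation needed when wrapping the
// sensor-native (landscape) captured image in a UIImage. The mapping below is an
// assumption to illustrate the idea; verify it against your own session setup.
func imageOrientation(for interfaceOrientation: UIInterfaceOrientation) -> UIImage.Orientation {
    switch interfaceOrientation {
    case .portrait:           return .right
    case .portraitUpsideDown: return .left
    case .landscapeRight:     return .up
    case .landscapeLeft:      return .down
    default:                  return .right
    }
}

func convertToUIImage(buffer: CVPixelBuffer, interfaceOrientation: UIInterfaceOrientation) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: buffer)
    let context = CIContext(options: nil)
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    return UIImage(cgImage: cgImage, scale: 1.0,
                   orientation: imageOrientation(for: interfaceOrientation))
}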

uiimage ios swift cvpixelbuffer arkit

4
Votes
1
Answer
3125
Views

Copying a CVPixelBuffer on any iOS device

I'm having a hard time coming up with code that reliably copies a CVPixelBuffer on any iOS device. My first attempt worked fine until I tried it on an iPad Pro:

extension CVPixelBuffer {
    func deepcopy() -> CVPixelBuffer? {
        let width = CVPixelBufferGetWidth(self)
        let height = CVPixelBufferGetHeight(self)
        let format = CVPixelBufferGetPixelFormatType(self)
        var pixelBufferCopyOptional:CVPixelBuffer?
        CVPixelBufferCreate(nil, width, height, format, nil, &pixelBufferCopyOptional)
        if let pixelBufferCopy = pixelBufferCopyOptional {
            CVPixelBufferLockBaseAddress(self, kCVPixelBufferLock_ReadOnly)
            CVPixelBufferLockBaseAddress(pixelBufferCopy, 0)
            let baseAddress = CVPixelBufferGetBaseAddress(self)
            let dataSize = CVPixelBufferGetDataSize(self)
            let target = CVPixelBufferGetBaseAddress(pixelBufferCopy)
            memcpy(target, baseAddress, dataSize)
            CVPixelBufferUnlockBaseAddress(pixelBufferCopy, 0)
            CVPixelBufferUnlockBaseAddress(self, kCVPixelBufferLock_ReadOnly)
        }
        return pixelBufferCopyOptional
    }
}

The above crashes on the iPad Pro because CVPixelBufferGetDataSize(self) is slightly larger than CVPixelBufferGetDataSize(pixelBufferCopy), so the memcpy writes into unallocated memory. …
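
A stride-aware variant is one way around this kind of mismatch: instead of a single memcpy of the whole data size, copy each plane row by row and only as many bytes per row as both buffers actually have. A sketch under those assumptions (the method name is illustrative; propagating attachments such as IOSurface backing is not shown):

import CoreVideo
import Foundation

// A stride-aware deep copy: rather than one memcpy of CVPixelBufferGetDataSize
// (which can differ between source and copy), copy each plane row by row using
// the smaller of the two strides.
extension CVPixelBuffer {
    func deepCopyByRows() -> CVPixelBuffer? {
        let width = CVPixelBufferGetWidth(self)
        let height = CVPixelBufferGetHeight(self)
        let format = CVPixelBufferGetPixelFormatType(self)
        var copyOut: CVPixelBuffer?
        CVPixelBufferCreate(nil, width, height, format, nil, &copyOut)
        guard let copy = copyOut else { return nil }

        CVPixelBufferLockBaseAddress(self, .readOnly)
        CVPixelBufferLockBaseAddress(copy, [])
        defer {
            CVPixelBufferUnlockBaseAddress(copy, [])
            CVPixelBufferUnlockBaseAddress(self, .readOnly)
        }

        let planeCount = max(CVPixelBufferGetPlaneCount(self), 1)
        for plane in 0..<planeCount {
            let planar = CVPixelBufferIsPlanar(self)
            let src = planar ? CVPixelBufferGetBaseAddressOfPlane(self, plane)
                             : CVPixelBufferGetBaseAddress(self)
            let dst = planar ? CVPixelBufferGetBaseAddressOfPlane(copy, plane)
                             : CVPixelBufferGetBaseAddress(copy)
            let srcStride = planar ? CVPixelBufferGetBytesPerRowOfPlane(self, plane)
                                   : CVPixelBufferGetBytesPerRow(self)
            let dstStride = planar ? CVPixelBufferGetBytesPerRowOfPlane(copy, plane)
                                   : CVPixelBufferGetBytesPerRow(copy)
            let rows = planar ? CVPixelBufferGetHeightOfPlane(self, plane) : height
            guard let source = src, let target = dst else { continue }

            // Copy only the bytes both buffers are guaranteed to have in each row.
            let rowBytes = min(srcStride, dstStride)
            for row in 0..<rows {
                memcpy(target + row * dstStride, source + row * srcStride, rowBytes)
            }
        }
        return copy
    }
}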

ios cvpixelbuffer

4
Votes
2
Answers
1376
Views