mrp*_*nts 16 · core-image · ios · quartz-core
This is different from the countless questions about converting a CMSampleBuffer to a UIImage. I simply want to know why I can't convert it like this:
CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage * imageFromCoreImageLibrary = [CIImage imageWithCVPixelBuffer: pixelBuffer];
UIImage * imageForUI = [UIImage imageWithCIImage: imageFromCoreImageLibrary];
It seems much simpler, because it works for the YCbCr color space as well as RGBA and other color spaces. Is there something wrong with this code?
Ale*_*kov 25
For JPEG images:

Swift 4:
let buff: CMSampleBuffer ...  // assume you already have a CMSampleBuffer
if let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: buff, previewPhotoSampleBuffer: nil) {
    let image = UIImage(data: imageData)  // here you have a UIImage
}
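Note that jpegPhotoDataRepresentation(forJPEGSampleBuffer:previewPhotoSampleBuffer:) is deprecated as of iOS 11. On the newer AVCapturePhotoCaptureDelegate path you receive an AVCapturePhoto rather than a sample buffer, and it can hand you the encoded data directly; a minimal sketch of that variant:

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    // fileDataRepresentation() returns the processed JPEG/HEIF bytes
    if let imageData = photo.fileDataRepresentation() {
        let image = UIImage(data: imageData)  // here you have a UIImage
    }
}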
小智 19
With Swift 3 and iOS 10 AVCapturePhotoOutput, include:
import UIKit
import CoreData
import CoreMotion
import AVFoundation
Create a UIView for the preview and link it to the main class:
@IBOutlet var preview: UIView!
Create this setup for the camera session (kCVPixelFormatType_32BGRA is important!):
lazy var cameraSession: AVCaptureSession = {
    let s = AVCaptureSession()
    s.sessionPreset = AVCaptureSessionPresetHigh
    return s
}()

lazy var previewLayer: AVCaptureVideoPreviewLayer = {
    let previewl: AVCaptureVideoPreviewLayer = AVCaptureVideoPreviewLayer(session: self.cameraSession)
    previewl.frame = self.preview.bounds
    return previewl
}()

func setupCameraSession() {
    let captureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo) as AVCaptureDevice

    do {
        let deviceInput = try AVCaptureDeviceInput(device: captureDevice)

        cameraSession.beginConfiguration()

        if (cameraSession.canAddInput(deviceInput) == true) {
            cameraSession.addInput(deviceInput)
        }

        let dataOutput = AVCaptureVideoDataOutput()
        dataOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString): NSNumber(value: kCVPixelFormatType_32BGRA as UInt32)]
        dataOutput.alwaysDiscardsLateVideoFrames = true

        if (cameraSession.canAddOutput(dataOutput) == true) {
            cameraSession.addOutput(dataOutput)
        }

        cameraSession.commitConfiguration()

        let queue = DispatchQueue(label: "fr.popigny.videoQueue", attributes: [])
        dataOutput.setSampleBufferDelegate(self, queue: queue)
    }
    catch let error as NSError {
        NSLog("\(error), \(error.localizedDescription)")
    }
}
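Before the session will actually deliver frames, the app also needs camera permission: an NSCameraUsageDescription entry in Info.plist plus a runtime request. A minimal sketch using the same Swift 3 era API names as above:

AVCaptureDevice.requestAccess(forMediaType: AVMediaTypeVideo) { granted in
    guard granted else { return }  // the user declined camera access
    DispatchQueue.main.async {
        self.setupCameraSession()
        self.cameraSession.startRunning()
    }
}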
In viewWillAppear:
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    setupCameraSession()
}
In viewDidAppear:
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    preview.layer.addSublayer(previewLayer)
    cameraSession.startRunning()
}
Create a function that captures the output:
func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    // Here you collect each frame and process it
    let ts: CMTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)  // frame timestamp, if you need it
    self.mycapturedimage = imageFromSampleBuffer(sampleBuffer: sampleBuffer)  // mycapturedimage is a UIImage? property on your class
}
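For this callback to be invoked at all, the class that owns it must also conform to the sample-buffer delegate protocol; a one-line sketch, assuming your controller is named ViewController (a hypothetical name):

extension ViewController: AVCaptureVideoDataOutputSampleBufferDelegate {}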
Below is the code that converts a kCVPixelFormatType_32BGRA CMSampleBuffer into a UIImage. The key point is that the bitmapInfo must correspond to 32BGRA: 32-bit little-endian, with premultiplied-first alpha info:
func imageFromSampleBuffer(sampleBuffer: CMSampleBuffer) -> UIImage
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)

    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer!, CVPixelBufferLockFlags.readOnly)

    // Get the base address of the pixel buffer
    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer!)

    // Get the number of bytes per row for the pixel buffer
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer!)

    // Get the pixel buffer width and height
    let width = CVPixelBufferGetWidth(imageBuffer!)
    let height = CVPixelBufferGetHeight(imageBuffer!)

    // Create a device-dependent RGB color space
    let colorSpace = CGColorSpaceCreateDeviceRGB()

    // Create a bitmap graphics context with the sample buffer data:
    // 32BGRA means little-endian byte order with premultiplied-first alpha
    var bitmapInfo: UInt32 = CGBitmapInfo.byteOrder32Little.rawValue
    bitmapInfo |= CGImageAlphaInfo.premultipliedFirst.rawValue & CGBitmapInfo.alphaInfoMask.rawValue
    let context = CGContext(data: baseAddress, width: width, height: height, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: bitmapInfo)

    // Create a Quartz image from the pixel data in the bitmap graphics context
    let quartzImage = context?.makeImage()

    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer!, CVPixelBufferLockFlags.readOnly)

    // Create an image object from the Quartz image
    let image = UIImage(cgImage: quartzImage!)

    return image
}
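If the goal is to display each frame, remember that this delegate runs on the background video queue, so hop to the main queue before touching UIKit; a short sketch, where imageView is a hypothetical outlet:

let image = imageFromSampleBuffer(sampleBuffer: sampleBuffer)
DispatchQueue.main.async {
    self.imageView.image = image  // imageView is a hypothetical IBOutlet
}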
Dip*_*ara 13
Option 1: convert the image from the pixel buffer with the following code:
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef myImage = [context
                      createCGImage:ciImage
                      fromRect:CGRectMake(0, 0,
                                          CVPixelBufferGetWidth(pixelBuffer),
                                          CVPixelBufferGetHeight(pixelBuffer))];
UIImage *uiImage = [UIImage imageWithCGImage:myImage];
CGImageRelease(myImage); // createCGImage: returns a +1 reference, so release it
Option 2:
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly); // lock before reading the base address
int w = (int)CVPixelBufferGetWidth(pixelBuffer);
int h = (int)CVPixelBufferGetHeight(pixelBuffer);
int r = (int)CVPixelBufferGetBytesPerRow(pixelBuffer);
int bytesPerPixel = r / w;

unsigned char *buffer = CVPixelBufferGetBaseAddress(pixelBuffer);

UIGraphicsBeginImageContext(CGSizeMake(w, h));

CGContextRef c = UIGraphicsGetCurrentContext();

unsigned char *data = CGBitmapContextGetData(c);
if (data != NULL) {
    int maxY = h;
    for (int y = 0; y < maxY; y++) {
        for (int x = 0; x < w; x++) {
            int offset = bytesPerPixel * ((w * y) + x);
            // Note: a 32BGRA pixel buffer stores channels in B, G, R, A order
            data[offset]     = buffer[offset];
            data[offset + 1] = buffer[offset + 1];
            data[offset + 2] = buffer[offset + 2];
            data[offset + 3] = buffer[offset + 3];
        }
    }
}

UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
I wrote a simple extension for use with Swift 4.x/3.x that produces a UIImage from a CMSampleBuffer.

It also handles scaling and orientation, though you can just accept the defaults if they work for you.
import UIKit
import AVFoundation

extension CMSampleBuffer {
    func image(orientation: UIImageOrientation = .up,
               scale: CGFloat = 1.0) -> UIImage? {
        if let buffer = CMSampleBufferGetImageBuffer(self) {
            let ciImage = CIImage(cvPixelBuffer: buffer)
            return UIImage(ciImage: ciImage,
                           scale: scale,
                           orientation: orientation)
        }
        return nil
    }
}
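Usage is then a one-liner inside the delegate callback (the .right orientation here is only an assumption for a portrait back camera):

let frame = sampleBuffer.image(orientation: .right, scale: UIScreen.main.scale)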
The CIImage is passed to the UIImage initializer along with the scale and orientation values. If they are not provided, the defaults of .up and 1.0 are used, respectively.

小智 5
Swift 5.0:
if let cvImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
    let ciimage = CIImage(cvImageBuffer: cvImageBuffer)
    let context = CIContext()
    if let cgImage = context.createCGImage(ciimage, from: ciimage.extent) {
        let uiImage = UIImage(cgImage: cgImage)  // here you have a UIImage
    }
}
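One caveat with this approach: CIContext is expensive to create, so if you convert every frame it is worth creating the context once and reusing it. A minimal sketch of that idea (FrameConverter is a hypothetical name):

import UIKit
import AVFoundation

final class FrameConverter {
    private let context = CIContext()  // created once, reused for every frame

    func uiImage(from sampleBuffer: CMSampleBuffer) -> UIImage? {
        guard let cvImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        let ciImage = CIImage(cvImageBuffer: cvImageBuffer)
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }
}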