Tags: bluetooth, ios, avcapturesession, core-bluetooth, multipeer-connectivity
How can we efficiently stream the camera feed from one iOS device to another over Bluetooth or Wi-Fi on iOS 7? Below is the code that receives the stream's sample buffers:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
}
// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    // (assumes the capture output delivers 32BGRA pixels)
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}
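For reference, the conversion above assumes the capture output delivers 32BGRA pixel buffers. A minimal capture-session setup along those lines (the queue label and session preset are illustrative choices, not from the original post) might look like:

#import <AVFoundation/AVFoundation.h>

- (void)setupCaptureSession
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Use the default camera as input
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    if (input && [session canAddInput:input]) [session addInput:input];

    // Deliver 32BGRA frames so imageFromSampleBuffer: can build an RGB bitmap context
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    output.alwaysDiscardsLateVideoFrames = YES;

    // captureOutput:didOutputSampleBuffer:fromConnection: will arrive on this serial queue
    [output setSampleBufferDelegate:self
                              queue:dispatch_queue_create("camera.frames", DISPATCH_QUEUE_SERIAL)];
    if ([session canAddOutput:output]) [session addOutput:output];

    [session startRunning];
}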
This gives us the image captured by the iOS camera.

Can we send the sample buffer data directly to another device over Multipeer Connectivity, or is there some other efficient way to stream it to other iOS devices?

Thanks.
I found a way to do this: we can use Multipeer Connectivity to stream compressed images so the result looks like a live camera feed.

The peer sending the stream uses this code in the captureOutput delegate method:
// Convert the sample buffer to a UIImage (see imageFromSampleBuffer: above),
// then compress it aggressively to keep the payload small
UIImage *cgBackedImage = [self imageFromSampleBuffer:sampleBuffer];
NSData *imageData = UIImageJPEGRepresentation(cgBackedImage, 0.2);

// Timestamp of this frame, taken from the sample buffer
NSNumber *timestamp = @(CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)));

// Maybe not always the correct input? Just using this to send the current FPS...
AVCaptureInputPort *inputPort = connection.inputPorts[0];
AVCaptureDeviceInput *deviceInput = (AVCaptureDeviceInput *)inputPort.input;
CMTime frameDuration = deviceInput.device.activeVideoMaxFrameDuration;

NSDictionary *dict = @{
    @"image": imageData,
    @"timestamp": timestamp,
    @"framesPerSecond": @(frameDuration.timescale)
};
NSData *data = [NSKeyedArchiver archivedDataWithRootObject:dict];

[_session sendData:data toPeers:_session.connectedPeers withMode:MCSessionSendDataReliable error:nil];
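The snippet above assumes `_session` is an already-connected MCSession. A minimal way to set one up and advertise it (the service type and the use of MCAdvertiserAssistant are placeholder choices, not from the original answer):

#import <MultipeerConnectivity/MultipeerConnectivity.h>

// Create a peer ID and session, then advertise so nearby browsers can invite us.
// @"camera-stream" is a placeholder service type (1-15 lowercase letters, digits, hyphens).
MCPeerID *peerID = [[MCPeerID alloc] initWithDisplayName:[UIDevice currentDevice].name];
_session = [[MCSession alloc] initWithPeer:peerID];
_session.delegate = self;

MCAdvertiserAssistant *assistant = [[MCAdvertiserAssistant alloc] initWithServiceType:@"camera-stream"
                                                                        discoveryInfo:nil
                                                                              session:_session];
[assistant start];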
On the receiving side:
- (void)session:(MCSession *)session didReceiveData:(NSData *)data fromPeer:(MCPeerID *)peerID
{
    // NSLog(@"(%@) Read %lu bytes", peerID.displayName, (unsigned long)data.length);
    NSDictionary *dict = (NSDictionary *)[NSKeyedUnarchiver unarchiveObjectWithData:data];
    // Rebuild the frame and read back the sender's FPS
    UIImage *image = [UIImage imageWithData:dict[@"image"] scale:2.0];
    NSNumber *framesPerSecond = dict[@"framesPerSecond"];
}
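Note that MCSession delegate callbacks arrive on a background queue, so any UI update with the decoded frame has to hop to the main queue. A sketch, assuming a hypothetical imageView property showing the remote feed:

// Inside session:didReceiveData:fromPeer: -- UIKit must only be touched on the main thread.
// self.imageView is a hypothetical UIImageView, not part of the original answer.
dispatch_async(dispatch_get_main_queue(), ^{
    self.imageView.image = image;
});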
We also receive the sender's FPS value, and we can tune our parameters accordingly to manage the streamed images.
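If the stream saturates the link, the sender can also drop frames to cap the outgoing rate. A sketch for the top of the captureOutput delegate method, assuming a hypothetical _lastSendTime ivar and a 10 fps target:

// Send at most ~10 frames per second; skip frames that arrive too soon.
// _lastSendTime is a hypothetical CFAbsoluteTime ivar, initially 0.
static const CFTimeInterval kMinFrameInterval = 1.0 / 10.0;
CFAbsoluteTime now = CFAbsoluteTimeGetCurrent();
if (now - _lastSendTime < kMinFrameInterval) {
    return;
}
_lastSendTime = now;
// ...then compress and send the frame as shown above.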
Hope it helps.

Thanks.