How do I calculate the FOV?

Asked by Hum*_*tda · tags: iphone, camera, fieldofview, ios, avcapturesession

Background

I'm building a location-based augmented reality application and I need to get the field of view (FOV). (I only update the value when the device orientation changes, so I'm looking for a method that returns this value whenever I call it.)

The goal is to make a "degree ruler" match reality, as shown here: Degree Ruler - AR App

I'm already using an AVCaptureSession to display the camera feed, together with a path attached to a CAShapeLayer to draw the ruler. This works quite well, but now I need the field-of-view value to place my elements at the correct positions (for example, to choose the right spacing between 160° and 170°!).
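(For illustration only: once the FOV is known, mapping a compass bearing to an x position on screen is a small pinhole-projection calculation. The sketch below is not part of the question's code; the function name and parameters are mine.)

#import <UIKit/UIKit.h>

// Illustrative helper (not from the question): x position of a bearing on screen,
// given the device heading, the horizontal FOV and the view width, assuming a
// rectilinear (pinhole) projection.
static CGFloat XPositionForBearing(CGFloat bearingDeg, CGFloat headingDeg,
                                   CGFloat horizontalFOVDeg, CGFloat viewWidth)
{
    CGFloat offsetRad  = (bearingDeg - headingDeg) * M_PI / 180.0;
    CGFloat halfFOVRad = (horizontalFOVDeg / 2.0) * M_PI / 180.0;
    // tan(offset) / tan(FOV/2) spans [-1, 1] across the visible width.
    CGFloat normalized = tan(offsetRad) / tan(halfFOVRad);
    return viewWidth * (0.5 + 0.5 * normalized);
}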

At the moment I hard-code those values from this source: https://stackoverflow.com/a/3594424/3198096 (special thanks to @hotpaw2!), but I'm not sure they are fully accurate, and they don't cover the iPhone 5 and later. I couldn't get the values from an official source (Apple!), but there is a link listing the values for all the iDevices I think I need (4, 4S, 5, 5S): AnandTech | Some thoughts about the iPhone 5s camera improvements.

Note: after my own testing and further research online, I'm pretty sure these values are inaccurate! This also forces me to use an external library to check which iPhone model I'm running on, in order to initialize my FOV manually... and I'd have to verify the values for every supported device.

I would really prefer a "code solution"!

After reading this post: iPhone: Real-time video color info, focal length, aperture?, I'm trying to get the exif data from an AVCaptureStillImageOutput, as suggested. Then I could read the focal length from the exif data and compute the horizontal and vertical field of view with a formula! (Or maybe get the FOV directly, as shown here: http://www.brianklug.org/2011/11/a-quick-analysis-of-exif-data-from-apples-iphone-4s-camera-samples/ -- note: after a number of updates, it seems we can no longer get the field of view directly from the exif!)


Current status

From http://iphonedevsdk.com/forum/iphone-sdk-development/112225-camera-app-working-well-on-3gs-but-not-on-4s.html: modified EXIF data cannot be saved correctly.

Here is the code I'm using:

AVCaptureDevice* camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (camera != nil)
{
    captureSession = [[AVCaptureSession alloc] init];

    AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:nil];

    [captureSession addInput:newVideoInput];

    captureLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
    captureLayer.frame = overlayCamera.bounds;
    [captureLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    previewLayerConnection=captureLayer.connection;
    [self setCameraOrientation:[[UIApplication sharedApplication] statusBarOrientation]];
    [overlayCamera.layer addSublayer:captureLayer];
    [captureSession startRunning];

    // Add a still image output so a JPEG can be captured and its EXIF metadata read.
    AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [captureSession addOutput:stillImageOutput];

    // Find the still image output's video connection to capture from.
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo] )
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                         completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
     {
         NSData *imageNSData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];

         // Read the image properties (including the EXIF dictionary) via ImageIO.
         CGImageSourceRef imgSource = CGImageSourceCreateWithData((__bridge CFDataRef)imageNSData, NULL);

         NSDictionary *metadata = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(imgSource, 0, NULL);
         CFRelease(imgSource);

         NSMutableDictionary *metadataAsMutable = [metadata mutableCopy];

         NSMutableDictionary *EXIFDictionary = [[metadataAsMutable objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];

         if (!EXIFDictionary)
             EXIFDictionary = [NSMutableDictionary dictionary];

         [metadataAsMutable setObject:EXIFDictionary forKey:(NSString *)kCGImagePropertyExifDictionary];

         NSLog(@"%@", EXIFDictionary);
     }];
}

Here is the output:

{
    ApertureValue = "2.52606882168926";
    BrightnessValue = "0.5019629837352776";
    ColorSpace = 1;
    ComponentsConfiguration =     (
        1,
        2,
        3,
        0
    );
    ExifVersion =     (
        2,
        2,
        1
    );
    ExposureMode = 0;
    ExposureProgram = 2;
    ExposureTime = "0.008333333333333333";
    FNumber = "2.4";
    Flash = 16;
    FlashPixVersion =     (
        1,
        0
    );
    FocalLenIn35mmFilm = 40;
    FocalLength = "4.28";
    ISOSpeedRatings =     (
        50
    );
    LensMake = Apple;
    LensModel = "iPhone 4S back camera 4.28mm f/2.4";
    LensSpecification =     (
        "4.28",
        "4.28",
        "2.4",
        "2.4"
    );
    MeteringMode = 5;
    PixelXDimension = 1920;
    PixelYDimension = 1080;
    SceneCaptureType = 0;
    SceneType = 1;
    SensingMethod = 2;
    ShutterSpeedValue = "6.906947890818858";
    SubjectDistance = "69.999";
    UserComment = "[S.D.] kCGImagePropertyExifUserComment";
    WhiteBalance = 0;
}
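For reference, the focal-length fields can be read back out of that EXIF dictionary with the standard ImageIO keys; a small sketch (the variable names are mine):

// Inside the completion handler above, after building EXIFDictionary:
NSNumber *focalLength    = EXIFDictionary[(NSString *)kCGImagePropertyExifFocalLength];        // 4.28 (mm)
NSNumber *focalLenIn35mm = EXIFDictionary[(NSString *)kCGImagePropertyExifFocalLenIn35mmFilm]; // 40 (35mm equivalent)
NSLog(@"focal length: %@ mm, 35mm equivalent: %@ mm", focalLength, focalLenIn35mm);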

I think I have everything I need to compute the FOV. But are these the right values? After seeing many different websites giving different focal length values, I'm a bit confused! My PixelDimensions also seem wrong!

Based on http://en.wikipedia.org/wiki/Angle_of_view, here is the formula I plan to use:

FOV = (IN_DEGREES(   2*atan( (d) / (2  * f) )   ));
// d = sensor dimensions (mm)
// f = focal length (mm)
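A minimal Objective-C sketch of that formula. Note that the sensor dimension d does not appear in the EXIF output above, so the value in the example call is only a placeholder, not a measured iPhone 4S sensor width:

#include <math.h>

// Angle of view (in degrees) from a sensor dimension and the focal length, both in mm.
static double FOVInDegrees(double sensorDimensionMM, double focalLengthMM)
{
    return 2.0 * atan(sensorDimensionMM / (2.0 * focalLengthMM)) * 180.0 / M_PI;
}

// Example with the EXIF focal length above (f = 4.28 mm) and a placeholder width d = 4.54 mm:
// double horizontalFOV = FOVInDegrees(4.54, 4.28);   // ~56 degrees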

My question

Do my approach and my formula look correct, and if so, which values should I pass to the function?


Clarifications

  • If you have any suggestion about how to make the ruler match reality, I will accept that answer; I just think I need to use the FOV!
  • Zoom is disabled in the augmented reality view controller, so my field of view is fixed once the camera is initialized and cannot change until the user rotates the phone!

And sorry for my English mistakes, I'm French...

Answer by Wil*_*ker:

In iOS 7 and later you can do something along these lines:

float FOV = camera.activeFormat.videoFieldOfView;

Here, camera is your AVCaptureDevice. The value can change even on the same device, depending on the preset you choose for your video session. It's the horizontal field of view (in degrees), so you'll have to compute the vertical field of view from the display dimensions.
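One possible way to derive the vertical angle, sketched under the assumption of a rectilinear projection and using the video frame's own aspect ratio (e.g. 1920×1080, as in the EXIF output above); the variable names are mine, not part of the answer:

// Vertical FOV from the horizontal FOV reported by activeFormat.
float hFOV    = camera.activeFormat.videoFieldOfView;          // horizontal FOV, in degrees
float aspect  = 1080.0f / 1920.0f;                             // frame height / frame width
float vFOVRad = 2.0f * atanf(tanf(hFOV * (float)M_PI / 360.0f) * aspect);
float vFOV    = vFOVRad * 180.0f / (float)M_PI;                // vertical FOV, in degrees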

Here is the Apple reference.