Converting the CGPoint results returned from CIFaceFeature

Tags: face-detection, uiimage, ios, uiimageorientation, cifacefeature

I am trying to figure out how to transform the CGPoint results returned from CIFaceFeature so that I can use them to draw into a CALayer. Previously I normalized my images to zero rotation to make things easier, but that causes problems for images captured with the device held in landscape mode.

I have been working on this for a while without success, and I am not sure whether my understanding of the task is wrong, my approach is wrong, or both. Here is what I believe to be correct:

Raw image from the camera

According to the documentation for the CIDetector featuresInImage:options: method:

A dictionary that specifies the orientation of the image. The detection is 
adjusted to account for the image orientation but the coordinates in the 
returned feature objects are based on those of the image.

Image as displayed in the UIImageView

In the code below I am trying to rotate the CGPoint so that I can draw it via a CAShapeLayer overlaid on the UIImageView.

What I am doing (... or what I think I am doing ...) is translating the left-eye CGPoint to the center of the view, rotating it by 90 degrees, and then translating the point back to its original position. That is not correct, but I cannot see where I am going wrong. Is my approach wrong, or the way I have implemented it?

#define DEGREES_TO_RADIANS(angle) ((angle) / 180.0 * M_PI)

- leftEyePosition is a CGPoint

CGAffineTransform  transRot = CGAffineTransformMakeRotation(DEGREES_TO_RADIANS(90));

float x = self.center.x;
float y = self.center.y;
CGAffineTransform tCenter = CGAffineTransformMakeTranslation(-x, -y);
CGAffineTransform tOffset = CGAffineTransformMakeTranslation(x, y);

leftEyePosition = CGPointApplyAffineTransform(leftEyePosition, tCenter);
leftEyePosition = CGPointApplyAffineTransform(leftEyePosition, transRot);
leftEyePosition = CGPointApplyAffineTransform(leftEyePosition, tOffset);
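
For what it is worth, the same translate-rotate-translate idea can be collapsed into a single matrix with CGAffineTransformConcat, which applies its first argument before its second; this is only a compact restatement of the three steps above, not a corrected version:

CGAffineTransform t = CGAffineTransformMakeTranslation(-x, -y);                 // move the pivot to the origin
t = CGAffineTransformConcat(t, CGAffineTransformMakeRotation(DEGREES_TO_RADIANS(90)));
t = CGAffineTransformConcat(t, CGAffineTransformMakeTranslation(x, y));         // move the pivot back
leftEyePosition = CGPointApplyAffineTransform(leftEyePosition, t);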

From this post: https://stackoverflow.com/a/14491293/840992, I need to rotate based on the imageOrientation.

Orientations

Apple UIImage.imageOrientation  =  enum value  =  device orientation  =  JPEG/EXIF kCGImagePropertyOrientation

UIImageOrientationUp    = 0  =  Landscape left  = 1
UIImageOrientationDown  = 1  =  Landscape right = 3
UIImageOrientationLeft  = 2  =  Portrait  down  = 8
UIImageOrientationRight = 3  =  Portrait  up    = 6
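
As an aside, that mapping can also be written as a small lookup table indexed by UIImageOrientation, whose raw values run 0-7 in the order Up, Down, Left, Right, UpMirrored, DownMirrored, LeftMirrored, RightMirrored; this is just a sketch equivalent to the switch statement in the answer below (the table name is mine):

// Hypothetical lookup table, equivalent to the exifOrientation switch further down
static const int kExifFromUIImageOrientation[8] = { 1, 3, 8, 6, 2, 4, 5, 7 };
int exifOrientation = kExifFromUIImageOrientation[image.imageOrientation];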


Answer from Xia*_*ang:

I needed to figure out exactly the same problem. The Apple sample "SquareCam" works directly on the video output, but I needed the results from a UIImage as well, so I extended the CIFaceFeature class with some conversion methods to get the correct point locations and bounds with respect to the UIImage and its UIImageView (or the CALayer of a UIView). The full implementation is posted here: https://gist.github.com/laoyang/5747004. You can use it directly.

Here is the most basic conversion for a point from CIFaceFeature; the returned CGPoint is converted according to the image's orientation:

- (CGPoint) pointForImage:(UIImage*) image fromPoint:(CGPoint) originalPoint {
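    // Core Image points have their origin in the lower-left corner of the image,
    // while UIKit/CALayer coordinates use the upper-left, so each case below flips
    // and/or swaps axes to land in UIKit space for the given imageOrientation.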

    CGFloat imageWidth = image.size.width;
    CGFloat imageHeight = image.size.height;

    CGPoint convertedPoint;

    switch (image.imageOrientation) {
        case UIImageOrientationUp:
            convertedPoint.x = originalPoint.x;
            convertedPoint.y = imageHeight - originalPoint.y;
            break;
        case UIImageOrientationDown:
            convertedPoint.x = imageWidth - originalPoint.x;
            convertedPoint.y = originalPoint.y;
            break;
        case UIImageOrientationLeft:
            convertedPoint.x = imageWidth - originalPoint.y;
            convertedPoint.y = imageHeight - originalPoint.x;
            break;
        case UIImageOrientationRight:
            convertedPoint.x = originalPoint.y;
            convertedPoint.y = originalPoint.x;
            break;
        case UIImageOrientationUpMirrored:
            convertedPoint.x = imageWidth - originalPoint.x;
            convertedPoint.y = imageHeight - originalPoint.y;
            break;
        case UIImageOrientationDownMirrored:
            convertedPoint.x = originalPoint.x;
            convertedPoint.y = originalPoint.y;
            break;
        case UIImageOrientationLeftMirrored:
            convertedPoint.x = imageWidth - originalPoint.y;
            convertedPoint.y = originalPoint.x;
            break;
        case UIImageOrientationRightMirrored:
            convertedPoint.x = originalPoint.y;
            convertedPoint.y = imageHeight - originalPoint.x;
            break;
        default:
            break;
    }
    return convertedPoint;
}

And here are the category methods based on the conversion above:

// Get converted features with respect to the imageOrientation property
- (CGPoint) leftEyePositionForImage:(UIImage *)image;
- (CGPoint) rightEyePositionForImage:(UIImage *)image;
- (CGPoint) mouthPositionForImage:(UIImage *)image;
- (CGRect) boundsForImage:(UIImage *)image;

// Get normalized features (0-1) with respect to the imageOrientation property
- (CGPoint) normalizedLeftEyePositionForImage:(UIImage *)image;
- (CGPoint) normalizedRightEyePositionForImage:(UIImage *)image;
- (CGPoint) normalizedMouthPositionForImage:(UIImage *)image;
- (CGRect) normalizedBoundsForImage:(UIImage *)image;

// Get feature location inside of a given UIView size with respect to the imageOrientation property
- (CGPoint) leftEyePositionForImage:(UIImage *)image inView:(CGSize)viewSize;
- (CGPoint) rightEyePositionForImage:(UIImage *)image inView:(CGSize)viewSize;
- (CGPoint) mouthPositionForImage:(UIImage *)image inView:(CGSize)viewSize;
- (CGRect) boundsForImage:(UIImage *)image inView:(CGSize)viewSize;
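
For reference, a minimal usage sketch for these category methods might look like this; the `features` array comes from the detector call shown further down, and `self.imageView` plus the dot-drawing code are assumptions for illustration, not part of the gist:

// Sketch only: assumes the category above is available and that self.imageView
// displays the same UIImage that was run through the CIDetector.
for (CIFaceFeature *feature in features) {
    // Convert the left eye into the coordinate space of the image view
    CGPoint eye = [feature leftEyePositionForImage:self.imageView.image
                                            inView:self.imageView.bounds.size];
    CAShapeLayer *dot = [CAShapeLayer layer];
    dot.path = [UIBezierPath bezierPathWithArcCenter:eye
                                              radius:4.0
                                          startAngle:0
                                            endAngle:2.0 * M_PI
                                           clockwise:YES].CGPath;
    dot.fillColor = [UIColor greenColor].CGColor;
    [self.imageView.layer addSublayer:dot];  // overlay directly on the image view's layer
}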

(One other thing to note is that you need to specify the correct EXIF orientation when extracting the face features from the UIImage's orientation. It is quite confusing... here is what I did:

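// Map UIImageOrientation to the corresponding EXIF/kCGImagePropertyOrientation
// value (1-8), which is what the CIDetectorImageOrientation option expects.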
int exifOrientation;
switch (self.image.imageOrientation) {
    case UIImageOrientationUp:
        exifOrientation = 1;
        break;
    case UIImageOrientationDown:
        exifOrientation = 3;
        break;
    case UIImageOrientationLeft:
        exifOrientation = 8;
        break;
    case UIImageOrientationRight:
        exifOrientation = 6;
        break;
    case UIImageOrientationUpMirrored:
        exifOrientation = 2;
        break;
    case UIImageOrientationDownMirrored:
        exifOrientation = 4;
        break;
    case UIImageOrientationLeftMirrored:
        exifOrientation = 5;
        break;
    case UIImageOrientationRightMirrored:
        exifOrientation = 7;
        break;
    default:
        break;
}

NSDictionary *detectorOptions = @{ CIDetectorAccuracy : CIDetectorAccuracyHigh };
CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];

NSArray *features = [faceDetector featuresInImage:[CIImage imageWithCGImage:self.image.CGImage]
                                          options:@{CIDetectorImageOrientation:[NSNumber numberWithInt:exifOrientation]}];

)