Post by Vim*_*lan

How can I square-crop a photo from the camera roll?

I want to experiment with some image filtering on the iPhone, Instagram-style. I get photos from the camera roll with a UIImagePickerController. As far as I know, the image the picker returns is scaled down to save memory, so loading the full original image into a UIImage is unwise. But how can I process the image and then save it at its original pixel dimensions? I am using an iPhone 4S as my development device.

The original photo in the camera roll is 3264*2448.

The image returned for UIImagePickerControllerOriginalImage is 1920*1440.

The image returned for UIImagePickerControllerEditedImage is 640*640.

imageViewOld (the image returned for UIImagePickerControllerOriginalImage, cropped with the UIImagePickerControllerCropRect [80, 216, 1280, 1280]) is 1280*1224.

imageViewNew (the image returned for UIImagePickerControllerOriginalImage, cropped with the crop rect doubled to [80, 216, 2560, 2560]) is 1840*1224.

I checked the same photo posted to Instagram: it is 1280*1280.

My questions are:

  1. Why doesn't UIImagePickerControllerOriginalImage return the "original" photo? Why is it reduced to 1920*1440?

  2. Why doesn't UIImagePickerControllerEditedImage return a 1280*1280 image, given that UIImagePickerControllerCropRect indicates a 1280*1280 square crop?

  3. How can I square-crop the original photo to get a 2448*2448 image?
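For question 3, one possible approach (a sketch only, assuming the AssetsLibrary framework of that era and the UIImagePickerControllerReferenceURL key in the picker's info dictionary) is to bypass the downscaled picker image entirely: reload the asset at full resolution and crop a centered square with CGImageCreateWithImageInRect:

    #import <AssetsLibrary/AssetsLibrary.h>

    // Inside didFinishPickingMediaWithInfo:, `info` is the picker's dictionary.
    NSURL *assetURL = [info objectForKey:UIImagePickerControllerReferenceURL];
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library assetForURL:assetURL resultBlock:^(ALAsset *asset) {
        ALAssetRepresentation *rep = [asset defaultRepresentation];
        // fullResolutionImage returns the true pixels (3264x2448 here),
        // unlike the reduced UIImagePickerControllerOriginalImage.
        CGImageRef fullImage = [rep fullResolutionImage];

        // Center a square of the short side (2448x2448 for this photo).
        size_t w = CGImageGetWidth(fullImage);
        size_t h = CGImageGetHeight(fullImage);
        size_t side = MIN(w, h);
        CGRect square = CGRectMake((w - side) / 2, (h - side) / 2, side, side);

        CGImageRef croppedRef = CGImageCreateWithImageInRect(fullImage, square);
        // fullResolutionImage ignores orientation, so reapply it here.
        UIImage *cropped = [UIImage imageWithCGImage:croppedRef
                                               scale:1.0
                                         orientation:(UIImageOrientation)[rep orientation]];
        CGImageRelease(croppedRef);
        // ... filter and save `cropped` ...
    } failureBlock:^(NSError *error) {
        NSLog(@"Could not load asset: %@", error);
    }];

Note that this centers the square rather than honoring the user's chosen crop rect; mapping the picker's crop rect onto the full-resolution image would need the rect scaled up by the ratio between the two sizes.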

Thanks in advance. Here is my code:

  - (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
  {
      NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
      if ([mediaType isEqualToString:@"public.image"])
      {
          UIImage *imageEdited = [info objectForKey:UIImagePickerControllerEditedImage];
          UIImage *imagePicked = [info objectForKey:UIImagePickerControllerOriginalImage];

          CGRect cropRect = [[info objectForKey:UIImagePickerControllerCropRect] CGRectValue];

          NSLog(@"Original width = %f height = %f", imagePicked.size.width, imagePicked.size.height);
          // Original width = 1440.000000 height = 1920.000000

          NSLog(@"imageEdited width = %f height = %f", imageEdited.size.width, imageEdited.size.height);
          // imageEdited width = 640.000000 height = 640.000000

          NSLog(@"cropRect %f %f %f %f", cropRect.origin.x, cropRect.origin.y,
                cropRect.size.width, cropRect.size.height);
      }
  }
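If the full-resolution image is available (e.g. loaded via AssetsLibrary), the picker's crop rect can be mapped onto it. This is a minimal sketch, assuming `fullImage` is a UIImage holding the full 3264x2448 photo (a hypothetical variable, not part of the code above) and that the crop rect is expressed in the coordinate space of the reduced `imagePicked` image:

    // Scale the crop rect from the reduced image up to the full-size image.
    CGFloat ratio = fullImage.size.width / imagePicked.size.width; // e.g. 2448/1440 = 1.7
    CGRect scaledRect = CGRectMake(cropRect.origin.x * ratio,
                                   cropRect.origin.y * ratio,
                                   cropRect.size.width * ratio,
                                   cropRect.size.height * ratio);

    // Clamp to the image bounds: the picker's crop rect can extend past the
    // edges, which is why the doubled 2560x2560 rect came back as 1840x1224.
    CGRect bounds = CGRectMake(0, 0, fullImage.size.width, fullImage.size.height);
    scaledRect = CGRectIntersection(scaledRect, bounds);

    CGImageRef croppedRef = CGImageCreateWithImageInRect(fullImage.CGImage, scaledRect);
    UIImage *squareImage = [UIImage imageWithCGImage:croppedRef
                                               scale:fullImage.scale
                                         orientation:fullImage.imageOrientation];
    CGImageRelease(croppedRef);

The clamping step also explains the 1280*1224 result in the question: the rect [80, 216, 1280, 1280] runs off the bottom of the 1440*1920 image once Core Graphics intersects it with the image bounds.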
