This article covers how to save a TIFF photo from AVFoundation's captureStillImageAsynchronouslyFromConnection to a file with EXIF metadata on iPhone (iOS).

Problem description


With this question I only ask for the possibilities I have with Xcode and iOS without external libraries. I am already exploring the possibility of using libtiff in another question.

Question


I have been sieving stack overflow for weeks and found working solutions for every one of my problems on its own. I have 4 things that need to work:

  1. I need the RGBA data from the camera, without any compression
  2. I need as much metadata as possible, especially EXIF
  3. I need to save in TIFF format, for lossless output and compatibility with other software
  4. I need to save to a file rather than the photo library, to keep the images from casual viewing


I can have 2 and 4 by using JPEG. I can have 1, 3 and 4 with raw data (respectively NSData) made from the camera buffer. Can I have all 4 of my prerequisites with Xcode and iOS? I am about to give up and am looking for your input as a last resort.


While still exploring this, I am also stuck on the other avenue I tried, libtiff. I am still trying, though...


Here is the list of great advice I have tried; my own code is just put together from Stack Overflow sources like these:

  • How to write exif metadata to an image (not the camera roll, just a UIImage or JPEG) (makes me wish I could use JPEG format, it is so effortless when doing what Apple prefers)
  • Raw image data from camera like "645 PRO" (this would be the point to use e.g. libtiff)
  • Saving CGImageRef to a png file? (works with kUTTypeTIFF, too, but no metadata)

Solution


The complete sequence of actions from captureStillImageAsynchronouslyFromConnection:

[[self myAVCaptureStillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
 {
     // get all the metadata in the image
     CFDictionaryRef metadata = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, imageSampleBuffer, kCMAttachmentMode_ShouldPropagate);
     // get image reference
     CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
     // >>>>>>>>>> lock buffer address
     CVPixelBufferLockBaseAddress(imageBuffer, 0);

     // get information about the image
     uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
     size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
     size_t width = CVPixelBufferGetWidth(imageBuffer);
     size_t height = CVPixelBufferGetHeight(imageBuffer);

     // create suitable color space
     CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

     // create suitable context (suitable for camera output setting kCVPixelFormatType_32BGRA)
     CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

     // <<<<<<<<<< unlock buffer address
     CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

     // release color space
     CGColorSpaceRelease(colorSpace);

     // create a CGImageRef from the CVImageBufferRef
     CGImageRef newImage = CGBitmapContextCreateImage(newContext);

     // release context
     CGContextRelease(newContext);

     // create destination and write image with metadata
     CFURLRef url = (__bridge CFURLRef)[NSURL fileURLWithPath:filePath isDirectory:NO];
     CGImageDestinationRef destination = CGImageDestinationCreateWithURL(url, kUTTypeTIFF, 1, NULL);
     CGImageDestinationAddImage(destination, newImage, metadata);

     // finalize and release destination, image and copied metadata
     CGImageDestinationFinalize(destination);
     CFRelease(destination);
     CGImageRelease(newImage);
     CFRelease(metadata);
 }];


The still image output related camera settings were:

[[self myAVCaptureSession] setSessionPreset:AVCaptureSessionPresetPhoto];

NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA], (id)kCVPixelBufferPixelFormatTypeKey, nil];
[myAVCaptureStillImageOutput setOutputSettings:outputSettings];
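For completeness, the session wiring these settings assume might look like the sketch below. The property names match the snippets above, but the device/input lookup and error handling are assumptions, not part of the original post:

```objc
// Hypothetical setup sketch: create the session, attach the back camera,
// and add the still image output configured for uncompressed BGRA frames.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
if (input && [session canAddInput:input]) {
    [session addInput:input];
}

AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
stillOutput.outputSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                @(kCVPixelFormatType_32BGRA) };
if ([session canAddOutput:stillOutput]) {
    [session addOutput:stillOutput];
}
[session startRunning];
```

Requesting kCVPixelFormatType_32BGRA here is what makes the kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst context flags in the capture handler correct.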


I get a nice, nominally uncompressed image in nominal TIFF format with all metadata. (It is mirrored on other systems, but now that I can write the EXIF and other metadata, I am sure I can fine-tune that, too.)


Thanks again to Wildaker for his help!

Accepted answer


As you've already cracked 1, 3 and 4, it seems the only hurdle you're missing is saving the data and metadata together. Try this (assuming the unprocessed data is in a CMSampleBufferRef called myImageDataSampleBuffer and you've done the heavy lifting of putting the graphical data into a CGImageRef called myImage):

CFDictionaryRef metadata = CMCopyDictionaryOfAttachments(kCFAllocatorDefault,
  myImageDataSampleBuffer,
  kCMAttachmentMode_ShouldPropagate);
NSFileManager* fm = [[NSFileManager alloc] init];
NSURL* pathUrl = [fm URLForDirectory:saveDir
  inDomain:NSUserDomainMask
  appropriateForURL:nil
  create:YES
  error:nil];
NSURL* saveUrl = [pathUrl URLByAppendingPathComponent:@"myfilename.tif"];
CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)saveUrl,
  (CFStringRef)@"public.tiff", 1, NULL);
CGImageDestinationAddImage(destination, myImage, metadata);
CGImageDestinationFinalize(destination);
CFRelease(destination);
CFRelease(metadata);  // the attachments dictionary was copied, so release it too
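To fine-tune the EXIF before writing, as the asker mentions wanting to do, one option is to take a mutable copy of the attachment dictionary and edit its kCGImagePropertyExifDictionary entry, then pass the edited dictionary instead of the raw attachments to CGImageDestinationAddImage. A sketch under those assumptions (the user-comment value is an invented example):

```objc
// Make the propagated attachments mutable so individual EXIF keys can be edited.
NSMutableDictionary *mutableMetadata =
    [(__bridge NSDictionary *)metadata mutableCopy];
NSMutableDictionary *exif =
    [mutableMetadata[(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
if (!exif) exif = [NSMutableDictionary dictionary];

// Example edit: add a user comment (hypothetical value).
exif[(NSString *)kCGImagePropertyExifUserComment] = @"Captured uncompressed as TIFF";
mutableMetadata[(NSString *)kCGImagePropertyExifDictionary] = exif;

// Use the edited dictionary in place of the raw attachments.
CGImageDestinationAddImage(destination, myImage,
                           (__bridge CFDictionaryRef)mutableMetadata);
```

Any key in the dictionary hierarchy (TIFF, GPS, EXIF) can be adjusted the same way before the destination is finalized.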

