Problem Description
I am developing a location-based augmented reality application and I need to get the field of view (FOV). (I only update the value when the orientation changes, so I am looking for a method I can call to get this value.)
The goal is to make a "degree ruler" that matches reality, like the following:
I am already using AVCaptureSession to display the camera stream, and a path coupled with a CAShapeLayer to draw the ruler. This works pretty well, but now I have to use the field-of-view value to place my elements in the right place (choosing the right space between 160° and 170°, for example!).
Currently, I am hardcoding these values from this source: http://stackoverflow.com/a/3594424/3198096 (special thanks to @hotpaw2!). But I am not sure they are fully precise, and this does not handle the iPhone 5, etc. I was unable to obtain values from official Apple sources, but here is a link showing values for all the iDevices I think I need (4, 4S, 5, 5S): AnandTech | Some thoughts about the iPhone 5s camera improvements.
Note: after personal tests and some other research online, I am pretty sure these values are inaccurate! This also forces me to use an external library to check which iPhone model I am running on, in order to initialize my FOV manually... and I have to verify my values for every supported device.
After reading this post: iPhone: Real-time video color info, focal length, aperture?, I am trying to get EXIF data from AVCaptureStillImageOutput as suggested. Then I should be able to read the focal length from the EXIF data and calculate the horizontal and vertical field of view via a formula! (Or maybe directly obtain the FOV as shown here: http://www.brianklug.org/2011/11/a-quick-analysis-of-exif-data-from-apples-iphone-4s-camera-samples/ -- note: after a certain number of updates, it seems that we can't get the field of view directly from EXIF!)
Here is the code I am using:
AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
if (camera != nil)
{
    captureSession = [[AVCaptureSession alloc] init];

    NSError *inputError = nil;
    AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&inputError];
    if (newVideoInput)
        [captureSession addInput:newVideoInput];

    captureLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
    captureLayer.frame = overlayCamera.bounds;
    [captureLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    previewLayerConnection = captureLayer.connection;
    [self setCameraOrientation:[[UIApplication sharedApplication] statusBarOrientation]];
    [overlayCamera.layer addSublayer:captureLayer];
    [captureSession startRunning];

    AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = @{ AVVideoCodecKey : AVVideoCodecJPEG };
    [stillImageOutput setOutputSettings:outputSettings];
    [captureSession addOutput:stillImageOutput];

    // Find the video connection for the still image output.
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections)
    {
        for (AVCaptureInputPort *port in [connection inputPorts])
        {
            if ([[port mediaType] isEqual:AVMediaTypeVideo])
            {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                                  completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error)
    {
        NSData *imageNSData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];

        CGImageSourceRef imgSource = CGImageSourceCreateWithData((__bridge CFDataRef)imageNSData, NULL);
        // CGImageSourceCopyPropertiesAtIndex returns a +1 object;
        // __bridge_transfer hands ownership to ARC so it isn't leaked.
        NSDictionary *metadata = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(imgSource, 0, NULL);
        CFRelease(imgSource);

        NSMutableDictionary *metadataAsMutable = [metadata mutableCopy];
        NSMutableDictionary *EXIFDictionary = [[metadataAsMutable objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
        if (!EXIFDictionary)
            EXIFDictionary = [NSMutableDictionary dictionary];
        [metadataAsMutable setObject:EXIFDictionary forKey:(NSString *)kCGImagePropertyExifDictionary];

        NSLog(@"%@", EXIFDictionary);
    }];
}
Here is the output:
{
    ApertureValue = "2.52606882168926";
    BrightnessValue = "0.5019629837352776";
    ColorSpace = 1;
    ComponentsConfiguration = (
        1,
        2,
        3,
        0
    );
    ExifVersion = (
        2,
        2,
        1
    );
    ExposureMode = 0;
    ExposureProgram = 2;
    ExposureTime = "0.008333333333333333";
    FNumber = "2.4";
    Flash = 16;
    FlashPixVersion = (
        1,
        0
    );
    FocalLenIn35mmFilm = 40;
    FocalLength = "4.28";
    ISOSpeedRatings = (
        50
    );
    LensMake = Apple;
    LensModel = "iPhone 4S back camera 4.28mm f/2.4";
    LensSpecification = (
        "4.28",
        "4.28",
        "2.4",
        "2.4"
    );
    MeteringMode = 5;
    PixelXDimension = 1920;
    PixelYDimension = 1080;
    SceneCaptureType = 0;
    SceneType = 1;
    SensingMethod = 2;
    ShutterSpeedValue = "6.906947890818858";
    SubjectDistance = "69.999";
    UserComment = "[S.D.] kCGImagePropertyExifUserComment";
    WhiteBalance = 0;
}
I think I have everything I need to calculate the FOV. But are these the right values? After reading a lot of different websites giving different focal-length values, I am a bit confused! Also, my PixelDimensions seem to be wrong!
Via http://en.wikipedia.org/wiki/Angle_of_view, this is the formula I planned to use:
FOV = (IN_DEGREES( 2*atan( (d) / (2 * f) ) ));
// d = sensor dimensions (mm)
// f = focal length (mm)
My Question
Do my method and my formula look right, and if yes, which values do I pass to the function?
- FOV is what I want to use, but if you have any suggestion on how the ruler could match reality, I would accept that answer!
- Zoom is disabled in the augmented reality view controller, so my field of view is fixed once the camera is initialized and can't change until the user rotates the phone!
Also sorry for my English mistakes, I'm French...
Recommended Answer
In iOS 7 and above you can do something along these lines:
float FOV = camera.activeFormat.videoFieldOfView;
where camera is your AVCaptureDevice. Depending on which preset you choose for the video session, this can change even on the same device. It is the horizontal field of view (in degrees), so you'll need to calculate the vertical field of view from the display dimensions.
Here is Apple's reference documentation.