I'm setting up an AVCaptureSession in a Xamarin.iOS project, and I'm dropping a high percentage of frames no matter what frame rate I set on the device. Even after stripping the session down to just the back camera and a single output (AVCaptureVideoDataOutput), my delegate's DidOutputSampleBuffer method is only called for roughly 2/3 of the frames. That holds even with the maximum and minimum frame durations set to 1/3 of a second, and with the delegate method gutted so that it does nothing but increment a counter. Apologies for the wall of code below, but I have no idea where the problem might be.
Capture session setup:
AVCaptureDeviceType[] deviceTypes = new AVCaptureDeviceType[] {
    AVCaptureDeviceType.BuiltInTrueDepthCamera,
    AVCaptureDeviceType.BuiltInDualCamera,
    AVCaptureDeviceType.BuiltInWideAngleCamera };
AVCaptureDeviceDiscoverySession discoverySession =
    AVCaptureDeviceDiscoverySession.Create(deviceTypes,
        AVMediaType.Video, AVCaptureDevicePosition.Back);
AVCaptureDevice[] devices = discoverySession.Devices;
AVCaptureDevice device = devices.FirstOrDefault();
if (device == null)
{
    return false;
}

previewView.VideoPreviewLayer.Session = null;
m_CaptureSession?.Dispose();
m_CaptureSession = new AVCaptureSession();

NSError error;
AVCaptureDeviceInput input = new AVCaptureDeviceInput(device, out error);
if (error == null)
{
    if (m_CaptureSession.CanAddInput(input))
    {
        m_CaptureSession.AddInput(input);
    }

    // note: will try for 120fps on my iPhone 5S, but I've
    // also overridden the settings with a hard-coded 3fps
    // and get the same results (dropping a large percentage
    // of frames)
    ConfigureCameraForHighestFrameRate(device);

    AVCaptureVideoDataOutput output = new AVCaptureVideoDataOutput();
    AVVideoSettingsUncompressed settings = new AVVideoSettingsUncompressed();
    settings.PixelFormatType = CVPixelFormatType.CV32BGRA;
    output.UncompressedVideoSetting = settings;

    m_OutputQueue = new DispatchQueue("outputQueue", false);
    m_OutputRecorder = new OutputRecorder(device);
    output.SetSampleBufferDelegateQueue(m_OutputRecorder, m_OutputQueue);
    if (m_CaptureSession.CanAddOutput(output))
    {
        m_CaptureSession.AddOutput(output);
    }

    // don't hook up preview view, so we can test the video data
    // output on its own...
    // previewView.VideoPreviewLayer.Session = m_CaptureSession;

    m_CaptureSession.StartRunning();
}
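The ConfigureCameraForHighestFrameRate helper is called above but its body isn't shown. As a point of reference, a minimal sketch of what such a method typically looks like in Xamarin.iOS is below; the implementation is an assumption on my part, not the asker's actual code.

```csharp
// Hypothetical sketch (not the asker's code): pick the device format
// whose frame-rate range has the highest maximum, then lock the device
// to that format and rate.
void ConfigureCameraForHighestFrameRate(AVCaptureDevice device)
{
    AVCaptureDeviceFormat bestFormat = null;
    AVFrameRateRange bestRange = null;
    foreach (AVCaptureDeviceFormat format in device.Formats)
    {
        foreach (AVFrameRateRange range in format.VideoSupportedFrameRateRanges)
        {
            if (bestRange == null || range.MaxFrameRate > bestRange.MaxFrameRate)
            {
                bestFormat = format;
                bestRange = range;
            }
        }
    }
    if (bestFormat == null)
    {
        return;
    }

    NSError error;
    if (device.LockForConfiguration(out error))
    {
        device.ActiveFormat = bestFormat;
        // Pin both min and max duration to the fastest supported rate.
        device.ActiveVideoMinFrameDuration = bestRange.MinFrameDuration;
        device.ActiveVideoMaxFrameDuration = bestRange.MinFrameDuration;
        device.UnlockForConfiguration();
    }
}
```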
The delegate methods:
public override void DidOutputSampleBuffer(AVCaptureOutput captureOutput,
    CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    lock (m_Mutex)
    {
        if (!m_Recording)
        {
            return;
        }
        m_FrameCount++;
    }
}
public override void DidDropSampleBuffer(AVCaptureOutput captureOutput,
    CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    lock (m_Mutex)
    {
        if (!m_Recording)
        {
            return;
        }
        m_DroppedFrameCount++;
    }
}
Am I doing something wrong?
Best answer
So the problem, evidently, is that disposing of the sampleBuffer is the delegate's job, and while I was doing that for frames produced during recording, I was not doing it for frames captured before recording started. The solution is to make sure the sampleBuffer is disposed in every callback, whether or not the app is recording. This may be a Xamarin-specific problem and solution; I don't know whether it is relevant to Swift developers.
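Concretely, the fix described above amounts to disposing the buffer unconditionally in the callback. A sketch of the corrected delegate method, assuming the same m_Mutex/m_Recording/m_FrameCount fields as in the question (the unconditional Dispose is the point; the rest mirrors the original code):

```csharp
public override void DidOutputSampleBuffer(AVCaptureOutput captureOutput,
    CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    // Dispose the buffer on every callback, even when not recording.
    // CMSampleBuffer wraps a native buffer from a small fixed pool; if
    // the managed wrapper isn't disposed promptly, the pool is exhausted
    // and AVFoundation starts dropping frames.
    using (sampleBuffer)
    {
        lock (m_Mutex)
        {
            if (!m_Recording)
            {
                return;
            }
            m_FrameCount++;
        }
    }
}
```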
Regarding "c# - How to avoid dropped frames in an AVCaptureSession using AVCaptureVideoDataOutput", a similar question can be found on Stack Overflow: https://stackoverflow.com/questions/51314705/