Problem Description
I'm trying to have a video / camera view in the background while also allowing haptic feedback in my app for various actions, but it seems that AVFoundation is not playing nicely with any of the haptic calls I'm making:
if #available(iOS 10.0, *) {
    // Impact feedback
    let generator = UIImpactFeedbackGenerator(style: .light)
    generator.prepare()
    generator.impactOccurred()

    // Also used elsewhere: selection feedback
    let feedbackGenerator = UISelectionFeedbackGenerator()
    feedbackGenerator.selectionChanged()
}
Haptic feedback works great and as expected as long as the AVFoundation stuff is commented out. Any ideas?
Using:
captureSession = AVCaptureSession()
AND:
Thanks
Answer
I assume that if you are using AVCaptureSession then you probably have code like this:
do {
    // Attach the default microphone to the capture session
    let audioDevice = AVCaptureDevice.default(for: .audio)
    let audioDeviceInput = try AVCaptureDeviceInput(device: audioDevice!)
    if captureSession.canAddInput(audioDeviceInput) {
        captureSession.addInput(audioDeviceInput)
    } else {
        print("Could not add audio device input to the session")
    }
} catch {
    print("Could not create audio device input: \(error)")
}
So the audio input is what is not playing well with the haptic engine. You have to remove the audio input from the capture session before playing the haptic and then add it back, as sketched below.
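A minimal sketch of that remove / play / re-add pattern, assuming captureSession and audioDeviceInput are the session and audio input created in the snippet above (the function name and parameters here are illustrative, not from the original answer):

import UIKit
import AVFoundation

// Sketch: detach the audio input, fire the haptic, then re-attach the input.
func playHapticWithoutAudioConflict(captureSession: AVCaptureSession,
                                    audioDeviceInput: AVCaptureDeviceInput) {
    // Temporarily remove the audio input so it does not block the haptic engine.
    captureSession.beginConfiguration()
    captureSession.removeInput(audioDeviceInput)
    captureSession.commitConfiguration()

    // Fire the haptic as usual.
    let generator = UIImpactFeedbackGenerator(style: .light)
    generator.prepare()
    generator.impactOccurred()

    // Re-attach the audio input once the haptic has played.
    captureSession.beginConfiguration()
    if captureSession.canAddInput(audioDeviceInput) {
        captureSession.addInput(audioDeviceInput)
    }
    captureSession.commitConfiguration()
}

Depending on timing, you may need to delay re-adding the input slightly (for example with DispatchQueue.main.asyncAfter) so the haptic has time to fire; the original answer does not specify the exact timing.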