Question
I'm building an app that has to track the input amplitude of the user's mic. AudioKit has a bunch of handy objects for my needs: AKAmplitudeTracker and so on. I haven't found any viable info on how I'm supposed to start AudioKit, begin tracking, etc.

For now, all the code related to AudioKit initialization sits in the viewDidLoad method of the root VC of my audio recorder module. That isn't correct, because random errors occur and I can't track down what's wrong. The code below shows how I use AudioKit now.
var silence: AKBooster!
var tracker: AKAmplitudeTracker!
var mic: AKMicrophone!

...

override func viewDidLoad() {
    super.viewDidLoad()
    switch AVAudioSession.sharedInstance().recordPermission() {
    case AVAudioSessionRecordPermission.granted:
        self.mic = AKMicrophone()
        self.tracker = AKAmplitudeTracker(self.mic)
        AKSettings.audioInputEnabled = true
        AudioKit.output = self.tracker
        AudioKit.start()
        self.mic.start()
        self.tracker.start()
        break
    case AVAudioSessionRecordPermission.undetermined:
        AVAudioSession.sharedInstance().requestRecordPermission { (granted) in
            if granted {
                self.mic = AKMicrophone()
                self.tracker = AKAmplitudeTracker(self.mic)
                AKSettings.audioInputEnabled = true
                AudioKit.output = self.tracker
                AudioKit.start()
                self.mic.start()
                self.tracker.start()
            }
        }
    case AVAudioSessionRecordPermission.denied:
        AVAudioSession.sharedInstance().requestRecordPermission { (granted) in
            if granted {
                self.mic = AKMicrophone()
                self.tracker = AKAmplitudeTracker(self.mic)
                AKSettings.audioInputEnabled = true
                AudioKit.output = self.tracker
                AudioKit.start()
                self.mic.start()
                self.tracker.start()
            }
        }
    default:
        print("")
    }
    ...
}
Please help me figure out how to correctly manage AudioKit.
Answer
Alexey,
My recommendation for managing AudioKit's lifecycle is to house it within a singleton class. This is how it's set up in some of the AudioKit examples included in the repo, such as Analog Synth X and Drums. That way, it's not bound to a specific ViewController's viewDidLoad and can be accessed from multiple ViewControllers, or from the AppDelegate that manages the app's state. It also ensures that you only create one instance of it.
Here's an example where AudioKit is initialized within a class called Conductor (it could also be called AudioManager, etc.):
import AudioKit
import AudioKitUI

// Treat the conductor like a manager for the audio engine.
class Conductor {

    // Singleton of the Conductor class to avoid multiple instances of the audio engine
    static let sharedInstance = Conductor()

    // Create instance variables
    var mic: AKMicrophone!
    var tracker: AKAmplitudeTracker!

    // Add effects
    var delay: AKDelay!
    var reverb: AKCostelloReverb!

    // Balance between the delay and reverb mix.
    var reverbAmountMixer = AKDryWetMixer()

    init() {
        // Allow audio to play while the iOS device is muted.
        AKSettings.playbackWhileMuted = true
        AKSettings.defaultToSpeaker = true

        // Capture mic input
        mic = AKMicrophone()

        // Pull mic output into the tracker node.
        tracker = AKAmplitudeTracker(mic)

        // Pull the tracker output into the delay effect node.
        delay = AKDelay(tracker)
        delay.time = 2.0
        delay.feedback = 0.1
        delay.dryWetMix = 0.5

        // Pull the delay output into the reverb effect node.
        reverb = AKCostelloReverb(delay)
        reverb.presetShortTailCostelloReverb()

        // Mix the amount of reverb to the delay output node.
        reverbAmountMixer = AKDryWetMixer(delay, reverb, balance: 0.8)

        // Assign the reverbAmountMixer output to be the final audio output
        AudioKit.output = reverbAmountMixer

        // Start the AudioKit engine.
        // This is in its own method so that the audio engine can be started and
        // stopped from the AppDelegate as the app's state changes.
        startAudioEngine()
    }

    internal func startAudioEngine() {
        AudioKit.start()
        print("Audio engine started")
    }

    internal func stopAudioEngine() {
        AudioKit.stop()
        print("Audio engine stopped")
    }
}
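To tie the engine to the app's lifecycle, as the comment in init() suggests, you can call the start/stop methods above from the AppDelegate. Here is a minimal sketch using the standard UIApplicationDelegate callbacks; it assumes the same non-throwing AudioKit.start()/stop() API used in the answer (later AudioKit 4.x versions made these throwing, in which case you'd wrap them in try):

```swift
import UIKit

@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {

    var window: UIWindow?

    func applicationDidBecomeActive(_ application: UIApplication) {
        // Resume the audio engine whenever the app returns to the foreground.
        // Accessing sharedInstance also triggers the Conductor's init() on first use.
        Conductor.sharedInstance.startAudioEngine()
    }

    func applicationWillResignActive(_ application: UIApplication) {
        // Release the audio hardware while the app is inactive or backgrounded.
        Conductor.sharedInstance.stopAudioEngine()
    }
}
```

This keeps all engine state changes in one place, so a ViewController never has to guess whether the engine is running.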
Here's how to access the amplitude tracking data produced within the Conductor singleton class from a ViewController:
import UIKit

class ViewController: UIViewController {

    var conductor = Conductor.sharedInstance

    override func viewDidLoad() {
        super.viewDidLoad()

        Timer.scheduledTimer(withTimeInterval: 0.01, repeats: true) { [unowned self] (timer) in
            print(self.conductor.tracker.amplitude)
        }
    }
}
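One detail worth adding to the ViewController above: keep a reference to the timer and invalidate it when the view goes away. A repeating Timer is retained by the run loop, so the closure keeps firing after the view disappears, and with [unowned self] it would crash once the controller is deallocated. A sketch of that variant (the property name amplitudeTimer is my own, not from the original answer):

```swift
import UIKit

class ViewController: UIViewController {

    var conductor = Conductor.sharedInstance
    var amplitudeTimer: Timer?

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Poll the tracker 100 times per second while the view is visible.
        amplitudeTimer = Timer.scheduledTimer(withTimeInterval: 0.01, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            print(self.conductor.tracker.amplitude)
        }
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Stop polling so the run loop no longer retains or fires the closure.
        amplitudeTimer?.invalidate()
        amplitudeTimer = nil
    }
}
```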
You can download this GitHub repo from here:
I hope this helps.

Take care,
Mark