Question
Could someone explain to me how OpenAL fits in with the scheme of sound on the iPhone?
There seem to be APIs at different levels for handling sound. The higher-level ones are easy enough to understand.
But my understanding gets murky towards the bottom. There are Core Audio, Audio Units, and OpenAL.
What is the connection between these? Is OpenAL the substratum, upon which rests Core Audio (which contains Audio Units as one of its lower-level objects)?
OpenAL doesn't seem to be documented in Xcode, yet I can run code that uses its functions.
Answer
Here is what I have figured out:
The substratum is Core Audio. Specifically, Audio Units.
So Audio Units form the base layer, and some low-level frameworks have been built on top of them. The whole caboodle is termed Core Audio.
OpenAL is a multiplatform API -- the creators are trying to mirror the portability of OpenGL. A few companies sponsor OpenAL, including Creative Labs and Apple!
So Apple has provided this API, basically as a thin wrapper over Core Audio. I am guessing this is to allow developers to port code over easily. Be warned that it is an incomplete implementation: if you want OpenAL to do something that Core Audio can do, it will do it, but otherwise it won't.
Kind of counterintuitive -- just looking at the source, it looks as if OpenAL is the lower level. Not so!
This concludes the article on iOS: Audio Units vs OpenAL vs Core Audio; we hope the answer above helps.