I am trying to connect an AudioFilePlayer AudioUnit to an AU3DMixerEmbedded audio unit, but without success.

Here is what I am doing:

  • Create a new AUGraph with NewAUGraph()
  • Open the graph
  • Initialize the graph
  • Add 3 nodes:
      • outputNode: kAudioUnitSubType_RemoteIO
      • mixerNode: kAudioUnitSubType_AU3DMixerEmbedded
      • filePlayerNode: kAudioUnitSubType_AudioFilePlayer
  • Connect the nodes:
      • filePlayerNode -> mixerNode
      • mixerNode -> outputNode
  • Configure the filePlayer audio unit to play the desired file
  • Start the graph

  • This does not work: it fails at AUGraphInitialize with error 10868 (kAudioUnitErr_FormatNotSupported). I think the problem is an audio format mismatch between the file player and the mixer, because:
    - If I comment out the connection from filePlayerNode to mixerNode (AUGraphConnectNodeInput(_graph, filePlayerNode, 0, mixerNode, 0)) and also comment out step 6, no error is reported.
    - If I replace step 3 with connecting filePlayerNode directly to outputNode (AUGraphConnectNodeInput(_graph, filePlayerNode, 0, outputNode, 0)), the audio plays.

What steps am I missing when connecting filePlayerNode to mixerNode?
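
(To check the mismatch hypothesis, here is a rough sketch of how the two formats could be queried and compared. fileAU and mixerAU are assumed to have been fetched with AUGraphNodeInfo after AUGraphOpen, and CheckError is the same helper used in the code below.)

    AudioStreamBasicDescription playerOutputFormat = {0};
    AudioStreamBasicDescription mixerInputFormat = {0};
    UInt32 size = sizeof(AudioStreamBasicDescription);

    // format the file player produces on its output bus 0
    CheckError(AudioUnitGetProperty(fileAU, kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Output, 0,
                                    &playerOutputFormat, &size),
               "get file player output format");

    // format the mixer expects on its input bus 0
    size = sizeof(AudioStreamBasicDescription);
    CheckError(AudioUnitGetProperty(mixerAU, kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Input, 0,
                                    &mixerInputFormat, &size),
               "get mixer input format");

    NSLog(@"player out: %.0f Hz, %u ch, flags 0x%x / mixer in: %.0f Hz, %u ch, flags 0x%x",
          playerOutputFormat.mSampleRate, (unsigned)playerOutputFormat.mChannelsPerFrame,
          (unsigned)playerOutputFormat.mFormatFlags,
          mixerInputFormat.mSampleRate, (unsigned)mixerInputFormat.mChannelsPerFrame,
          (unsigned)mixerInputFormat.mFormatFlags);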

Here is the complete code. It is based on Apple's sample code and other examples I found on the internet. (AUGraphStart is called later.)
    - (id)init
    {
        self = [super init];
        if (self != nil)
        {
            {
                //create a new AUGraph
                CheckError(NewAUGraph(&_graph), "NewAUGraph failed");
                // opening the graph opens all contained audio units but does not allocate any resources yet
                CheckError(AUGraphOpen(_graph), "AUGraphOpen failed");
                // now initialize the graph (causes resources to be allocated)
                CheckError(AUGraphInitialize(_graph), "AUGraphInitialize failed");
            }
    
            AUNode outputNode;
            {
                AudioComponentDescription outputAudioDesc = {0};
                outputAudioDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
                outputAudioDesc.componentType = kAudioUnitType_Output;
                outputAudioDesc.componentSubType = kAudioUnitSubType_RemoteIO;
                // adds a node with above description to the graph
                CheckError(AUGraphAddNode(_graph, &outputAudioDesc, &outputNode), "AUGraphAddNode[kAudioUnitSubType_DefaultOutput] failed");
            }
    
            AUNode mixerNode;
            {
                AudioComponentDescription mixerAudioDesc = {0};
                mixerAudioDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
                mixerAudioDesc.componentType = kAudioUnitType_Mixer;
                mixerAudioDesc.componentSubType = kAudioUnitSubType_AU3DMixerEmbedded;
                mixerAudioDesc.componentFlags = 0;
                mixerAudioDesc.componentFlagsMask = 0;
                // adds a node with above description to the graph
                CheckError(AUGraphAddNode(_graph, &mixerAudioDesc, &mixerNode), "AUGraphAddNode[kAudioUnitSubType_AU3DMixerEmbedded] failed");
            }
    
            AUNode filePlayerNode;
            {
                AudioComponentDescription fileplayerAudioDesc = {0};
                fileplayerAudioDesc.componentType = kAudioUnitType_Generator;
                fileplayerAudioDesc.componentSubType = kAudioUnitSubType_AudioFilePlayer;
                fileplayerAudioDesc.componentManufacturer = kAudioUnitManufacturer_Apple;
                // adds a node with above description to the graph
                CheckError(AUGraphAddNode(_graph, &fileplayerAudioDesc, &filePlayerNode), "AUGraphAddNode[kAudioUnitSubType_AudioFilePlayer] failed");
            }
    
            //Connect the nodes
            {
                // connect the output source of the file player AU to the input source of the output node
    //            CheckError(AUGraphConnectNodeInput(_graph, filePlayerNode, 0, outputNode, 0), "AUGraphConnectNodeInput");
    
                CheckError(AUGraphConnectNodeInput(_graph, filePlayerNode, 0, mixerNode, 0), "AUGraphConnectNodeInput");
                CheckError(AUGraphConnectNodeInput(_graph, mixerNode, 0, outputNode, 0), "AUGraphConnectNodeInput");
            }
    
    
    
            // configure the file player
            // tell the file player unit to load the file we want to play
            {
                //?????
                AudioStreamBasicDescription inputFormat; // input file's data stream description
                AudioFileID inputFile; // reference to your input file
    
                // open the input audio file and store the AU ref in _player
                CFURLRef songURL = (__bridge CFURLRef)[[NSBundle mainBundle] URLForResource:@"monoVoice" withExtension:@"aif"];
                CheckError(AudioFileOpenURL(songURL, kAudioFileReadPermission, 0, &inputFile), "AudioFileOpenURL failed");
    
                // the file player audio unit, fetched from the graph below
                AudioUnit fileAU;
    
                // get the reference to the AudioUnit object for the file player graph node
                CheckError(AUGraphNodeInfo(_graph, filePlayerNode, NULL, &fileAU), "AUGraphNodeInfo failed");
    
                // get and store the audio data format from the file
                UInt32 propSize = sizeof(inputFormat);
                CheckError(AudioFileGetProperty(inputFile, kAudioFilePropertyDataFormat, &propSize, &inputFormat), "couldn't get file's data format");
    
                CheckError(AudioUnitSetProperty(fileAU, kAudioUnitProperty_ScheduledFileIDs, kAudioUnitScope_Global, 0, &(inputFile), sizeof((inputFile))), "AudioUnitSetProperty[kAudioUnitProperty_ScheduledFileIDs] failed");
    
                UInt64 nPackets;
                UInt32 propsize = sizeof(nPackets);
                CheckError(AudioFileGetProperty(inputFile, kAudioFilePropertyAudioDataPacketCount, &propsize, &nPackets), "AudioFileGetProperty[kAudioFilePropertyAudioDataPacketCount] failed");
    
                // tell the file player AU to play the entire file
                ScheduledAudioFileRegion rgn;
                memset (&rgn.mTimeStamp, 0, sizeof(rgn.mTimeStamp));
                rgn.mTimeStamp.mFlags = kAudioTimeStampSampleTimeValid;
                rgn.mTimeStamp.mSampleTime = 0;
                rgn.mCompletionProc = NULL;
                rgn.mCompletionProcUserData = NULL;
                rgn.mAudioFile = inputFile;
                rgn.mLoopCount = 1;
                rgn.mStartFrame = 0;
                rgn.mFramesToPlay = nPackets * inputFormat.mFramesPerPacket;
    
                CheckError(AudioUnitSetProperty(fileAU, kAudioUnitProperty_ScheduledFileRegion, kAudioUnitScope_Global, 0,&rgn, sizeof(rgn)), "AudioUnitSetProperty[kAudioUnitProperty_ScheduledFileRegion] failed");
    
                // prime the file player AU with default values
                UInt32 defaultVal = 0;
                CheckError(AudioUnitSetProperty(fileAU, kAudioUnitProperty_ScheduledFilePrime, kAudioUnitScope_Global, 0, &defaultVal, sizeof(defaultVal)), "AudioUnitSetProperty[kAudioUnitProperty_ScheduledFilePrime] failed");
    
                // tell the file player AU when to start playing (-1 sample time means next render cycle)
                AudioTimeStamp startTime;
                memset (&startTime, 0, sizeof(startTime));
                startTime.mFlags = kAudioTimeStampSampleTimeValid;
                startTime.mSampleTime = -1;
                CheckError(AudioUnitSetProperty(fileAU, kAudioUnitProperty_ScheduleStartTimeStamp, kAudioUnitScope_Global, 0, &startTime, sizeof(startTime)), "AudioUnitSetProperty[kAudioUnitProperty_ScheduleStartTimeStamp]");
    
                // file duration
                //double duration = (nPackets * _player.inputFormat.mFramesPerPacket) / _player.inputFormat.mSampleRate;
            }
    
    
        }
        return self;
    }
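
(For reference, the start method mentioned above is not shown in the question; a minimal sketch of it might look like this. The method name is hypothetical and it assumes the _graph ivar created in -init.)

    - (void)start
    {
        // dump the graph's state to the console; handy when chasing format errors
        CAShow(_graph);
        CheckError(AUGraphStart(_graph), "AUGraphStart failed");
    }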
    

Best Answer

I don't see anywhere in your code where you set an appropriate kAudioUnitProperty_StreamFormat for the audio units. You also have to check the error result codes to see whether the stream format you chose is actually supported by the audio unit you are configuring. If it isn't, try another format.
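
A minimal sketch of one way to act on that advice, reusing _graph, filePlayerNode and mixerNode from the question; this would typically run after AUGraphOpen and before AUGraphInitialize. The 3D mixer is not guaranteed to accept the file player's format, hence the explicit result check:

    AudioUnit fileAU, mixerAU;
    CheckError(AUGraphNodeInfo(_graph, filePlayerNode, NULL, &fileAU), "AUGraphNodeInfo[filePlayer] failed");
    CheckError(AUGraphNodeInfo(_graph, mixerNode, NULL, &mixerAU), "AUGraphNodeInfo[mixer] failed");

    // ask the file player which format it produces on output bus 0
    AudioStreamBasicDescription playerOut = {0};
    UInt32 size = sizeof(playerOut);
    CheckError(AudioUnitGetProperty(fileAU, kAudioUnitProperty_StreamFormat,
                                    kAudioUnitScope_Output, 0, &playerOut, &size),
               "get file player output format");

    // try to set that format on the mixer's input bus 0 and check the result;
    // if the mixer rejects it (kAudioUnitErr_FormatNotSupported), try another ASBD
    OSStatus err = AudioUnitSetProperty(mixerAU, kAudioUnitProperty_StreamFormat,
                                        kAudioUnitScope_Input, 0,
                                        &playerOut, sizeof(playerOut));
    if (err != noErr) {
        NSLog(@"mixer rejected the file player's format (error %d); pick a different format", (int)err);
    }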

A similar question, "ios - How to connect an AudioFilePlayer AudioUnit to the 3DMixer?", can be found on Stack Overflow: https://stackoverflow.com/questions/10300965/
