Getting PCM data from an HLS stream with AVPlayer


Problem description

This question seems to have been asked a few times over the last few years, but none of those threads has an answer. I am trying to process PCM data from an HLS stream, and I have to use AVPlayer.

This post taps local files:
https://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/

and this tap works with remote files, but not with .m3u8 HLS files:
http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/

I can play the first two tracks in the playlist, but the callbacks needed to get the PCM never start. When the file is local or remote (not a stream) I can still get the PCM; it is only HLS that does not work, and I need HLS to work.

Here is my code:

// AVPlayer tap attempt
- (void)viewDidLoad {
    [super viewDidLoad];

    NSURL*testUrl= [NSURL URLWithString:@"http://playlists.ihrhls.com/c5/1469/playlist.m3u8"];

    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:testUrl];
    self.player = [AVPlayer playerWithPlayerItem:item];

    // Watch the status property - when this is good to go, we can access the
    // underlying AVAssetTrack we need.
    [item addObserver:self forKeyPath:@"status" options:0 context:nil];

}

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if(![keyPath isEqualToString:@"status"])
        return;

    AVPlayerItem *item = (AVPlayerItem *)object;
    if(item.status != AVPlayerItemStatusReadyToPlay)
        return;

    NSArray *tracks = [self.player.currentItem tracks];
    for(AVPlayerItemTrack *track in tracks) {
        if([track.assetTrack.mediaType isEqualToString:AVMediaTypeAudio]) {
            NSLog(@"Got the audio track");
            [self beginRecordingAudioFromTrack:track.assetTrack];
            [self.player play];
        }
    }
}

- (void)beginRecordingAudioFromTrack:(AVAssetTrack *)audioTrack
{
    // Configure an MTAudioProcessingTap to handle things.
    MTAudioProcessingTapRef tap;
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalize;

    OSStatus err = MTAudioProcessingTapCreate(
                                              kCFAllocatorDefault,
                                              &callbacks,
                                              kMTAudioProcessingTapCreationFlag_PostEffects,
                                              &tap
                                              );

    if(err) {
        NSLog(@"Unable to create the Audio Processing Tap %d", (int)err);
        return;
    }

    // Create an AudioMix and assign it to our currently playing "item", which
    // is just the stream itself.
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters
                                                     audioMixInputParametersWithTrack:audioTrack];

    inputParams.audioTapProcessor = tap;
    audioMix.inputParameters = @[inputParams];
    self.player.currentItem.audioMix = audioMix;
    // The input parameters retain the tap, so release our reference to avoid a leak.
    CFRelease(tap);
}

void process(MTAudioProcessingTapRef tap, CMItemCount numberFrames,
             MTAudioProcessingTapFlags flags, AudioBufferList *bufferListInOut,
             CMItemCount *numberFramesOut, MTAudioProcessingTapFlags *flagsOut)
{
    OSStatus err = MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut,
                                                      flagsOut, NULL, numberFramesOut);
    if (err) NSLog(@"Error from GetSourceAudio: %d", (int)err);

    NSLog(@"Process");

}

void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut)
{
    NSLog(@"Initialising the Audio Tap Processor");
    *tapStorageOut = clientInfo;
}

void finalize(MTAudioProcessingTapRef tap)
{
    NSLog(@"Finalizing the Audio Tap Processor");
}

void prepare(MTAudioProcessingTapRef tap, CMItemCount maxFrames, const AudioStreamBasicDescription *processingFormat)
{
    NSLog(@"Preparing the Audio Tap Processor");
}

void unprepare(MTAudioProcessingTapRef tap)
{
    NSLog(@"Unpreparing the Audio Tap Processor");
}

The init callback is called, but prepare and process need to be called as well.

How can I do this?

Recommended answer

I would suggest using:

"Really fast audio in iOS and Mac OS X using Audio Units is hard, and will leave you scarred and bloody. What used to take days can now be done with just a few lines of code."


08-14 01:11