I need to seek within an audio file and extract chunks of it. I'm trying to use AVAssetReader. The problem I'm seeing is that if I read the audio over the same period from different offsets, the resulting averages (chunks) differ.

For example, if I read the audio from 0.1s to 0.5s, the chunk values I get differ from those I get when reading from 0.2s to 0.5s.

Here is a code sample that demonstrates it:

#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MediaPlayer.h>

+ (void) test
{
    NSURL* path = [[NSBundle mainBundle] URLForResource:@"music" withExtension:@"mp3"];

    [self test:path sample:1 showChunks:5];
    [self test:path sample:2 showChunks:4];
    [self test:path sample:3 showChunks:3];
}

+(void) test:(NSURL*) url sample:(NSInteger) sample showChunks:(NSInteger) chunkCount
{
#define CHUNK 800
#define SAMPLE_RATE 8000
    AVURLAsset* asset = [AVURLAsset URLAssetWithURL:url options:nil];
    NSError *assetError = nil;
    AVAssetReader* assetReader = [AVAssetReader assetReaderWithAsset:asset error:&assetError];

    CMTime startTime = CMTimeMake(sample*CHUNK, SAMPLE_RATE);
    CMTimeShow(startTime);

    CMTimeRange timeRange = CMTimeRangeMake(startTime, kCMTimePositiveInfinity);
    assetReader.timeRange = timeRange;

    NSDictionary* dict = nil;
    dict = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInteger:SAMPLE_RATE], AVSampleRateKey, [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey, nil];


    AVAssetReaderAudioMixOutput* assetReaderOutput = [AVAssetReaderAudioMixOutput assetReaderAudioMixOutputWithAudioTracks:asset.tracks audioSettings: dict];
    if (! [assetReader canAddOutput: assetReaderOutput]) {
        NSLog (@"error: Cannot add output reader");
        assetReader = nil;
        return;
    }

    [assetReader addOutput: assetReaderOutput];

    [assetReader startReading];

    CMSampleBufferRef nextBuffer;

    if (!(nextBuffer = [assetReaderOutput copyNextSampleBuffer]))
    {
        return;
    }
    CMSampleBufferGetTotalSampleSize (nextBuffer);
    // Extract bytes from buffer
    CMBlockBufferRef dataBuffer = CMSampleBufferGetDataBuffer(nextBuffer);

    NSInteger len = CMBlockBufferGetDataLength(dataBuffer);
    if (len < chunkCount*CHUNK)
    {
        printf("CHUNK is too big\n");
        CFRelease(nextBuffer);
        return;
    }
    UInt8* buf = malloc(len);
    CMBlockBufferCopyDataBytes(dataBuffer, 0, len, buf);

    for (int ii = 0; ii < chunkCount*CHUNK; ii+=CHUNK)
    {
        CGFloat av = 0;
        for (int jj = 0; jj < CHUNK; jj++)
        {
            av += (CGFloat) buf[jj+ii];
        }
        printf("Time: %f av: %f\n", (CGFloat)(ii+CHUNK*sample)/(CGFloat)SAMPLE_RATE,  av/(CGFloat)CHUNK);
    }
    printf("\n");

    free(buf);
    // Release the buffer returned by copyNextSampleBuffer.
    CFRelease(nextBuffer);
}

Here is the output:
{800/8000 = 0.100}
Time: 0.100000 av: 149.013748
Time: 0.200000 av: 100.323753
Time: 0.300000 av: 146.991257
Time: 0.400000 av: 106.763748
Time: 0.500000 av: 145.020004

{1600/8000 = 0.200}
Time: 0.200000 av: 145.011246
Time: 0.300000 av: 110.718750
Time: 0.400000 av: 154.543747
Time: 0.500000 av: 112.025002

{2400/8000 = 0.300}
Time: 0.300000 av: 149.278748
Time: 0.400000 av: 104.477501
Time: 0.500000 av: 158.162506

Please help.

Best Answer

It seems to me that your problem is the assumption that the following code seeks exactly to startTime:

CMTimeRange timeRange = CMTimeRangeMake(startTime, kCMTimePositiveInfinity);
assetReader.timeRange = timeRange;

You can test this with:

CMSampleBufferGetOutputPresentationTimeStamp(nextBuffer);

From that you will be able to see the exact time at which the buffer actually starts (it returns a CMTime; use CMTimeGetSeconds to read it in seconds).
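
For illustration, here is a minimal sketch (not part of the original answer) of how you could compare the requested start time with the buffer's actual presentation timestamp and work out how many frames to drop. It reuses startTime, nextBuffer, and SAMPLE_RATE from the code above; the byte offset you skip also depends on the bytes per frame of your output settings:

CMTime requested = startTime;   // the time passed to assetReader.timeRange
CMTime actual = CMSampleBufferGetOutputPresentationTimeStamp(nextBuffer);

NSLog(@"requested %f s, buffer actually starts at %f s",
      CMTimeGetSeconds(requested), CMTimeGetSeconds(actual));

// If the buffer starts earlier than requested, drop the leading frames
// before averaging, so every run starts at the same audio position.
Float64 offsetSeconds = CMTimeGetSeconds(CMTimeSubtract(requested, actual));
if (offsetSeconds > 0) {
    NSInteger framesToSkip = (NSInteger)(offsetSeconds * SAMPLE_RATE);
    // Multiply framesToSkip by the bytes per frame of the reader's output
    // format to get the byte offset into the copied data.
    NSLog(@"skip the first %ld frames of this buffer", (long)framesToSkip);
}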

Regarding iphone - AVAssetReader seeking, there is a similar question on Stack Overflow: https://stackoverflow.com/questions/6083162/
