I'm trying to build a video mash-up app, and I need the user to be able to set up one track with constant video, plus a second B-roll track that cuts away from the main content from time to time.
I have the first track working: I can use an AVMutableComposition with AVMutableVideoCompositionInstructions to arrange the clips from the timeline in the composition, but I can't get my head around how to work in an independently controlled B-roll track. I've been fighting this for days!
Here is the code that builds the first track's worth of content; for now it just goes to black between clips. Can any AVFoundation experts give me a hint?

    CMTime nextClipStartTime = kCMTimeZero;
    NSInteger i;
    // One-second cross-fade (expressed with a timescale of 30).
    CMTime transitionDuration = CMTimeMakeWithSeconds(1, 30);
    // Two video/audio track pairs ("A" and "B") plus a separate bed-music track.
    AVMutableCompositionTrack *compositionVideoTrack[2];
    AVMutableCompositionTrack *compositionAudioTrack[2];
    compositionVideoTrack[0] = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    compositionAudioTrack[0] = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    compositionAudioTrack[1] = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    compositionVideoTrack[1] = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *bedMusicTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    i = 0;
    // Per-track audio mix parameters, used to adjust clip volume around each cut.
    NSMutableArray *allAudioParams = [NSMutableArray array];
    AVMutableAudioMixInputParameters *audioInputParams[2];
    audioInputParams[0] = [AVMutableAudioMixInputParameters audioMixInputParameters];
    audioInputParams[1] = [AVMutableAudioMixInputParameters audioMixInputParameters];
    [audioInputParams[0] setTrackID: compositionAudioTrack[0].trackID];
    [audioInputParams[1] setTrackID: compositionAudioTrack[1].trackID];
    float lastVol = 0;
    NSMutableArray *instructions = [NSMutableArray array];
    for(ClipInfo *info in videoLine.items){
        AVAsset *asset = [AVAsset assetWithURL:info.url];
        CMTimeRange timeRangeInAsset = CMTimeRangeMake(info.inTime, info.duration);
        AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

        // Append the clip's video and audio to track A, back to back on the timeline.
        [compositionVideoTrack[0] insertTimeRange:timeRangeInAsset ofTrack:clipVideoTrack atTime:nextClipStartTime error:nil];
        AVAssetTrack *clipAudioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        [compositionAudioTrack[0] insertTimeRange:timeRangeInAsset ofTrack:clipAudioTrack atTime:nextClipStartTime error:nil];
        if(i != 0){
            // Keep the previous clip's volume up to one second before the cut; the new clip's volume takes effect at the boundary below.
            [audioInputParams[0] setVolume:lastVol atTime:CMTimeSubtract(nextClipStartTime, CMTimeMakeWithSeconds(1, 30))];
        }
        [audioInputParams[0] setVolume:info.volume atTime:nextClipStartTime];
        lastVol = info.volume;

        // Pass-through covers the clip minus the fade-in (every clip but the first) and the fade-out (every clip but the last).
        CMTime clipStartTime = (i == 0) ? nextClipStartTime : CMTimeAdd(nextClipStartTime, transitionDuration);
        CMTime clipDuration = (i == 0 || i == (videoLine.items.count - 1)) ? CMTimeSubtract(timeRangeInAsset.duration, transitionDuration) : CMTimeSubtract(timeRangeInAsset.duration, CMTimeMultiply(transitionDuration, 2));
        if([videoLine.items count] == 1){
            clipDuration = timeRangeInAsset.duration;
        }
        if(i != 0){
            // Fade this clip in over the transition (currently fades up from black, since only this clip's layer is included).
            AVMutableVideoCompositionInstruction *inInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
            inInstruction.timeRange = CMTimeRangeMake(nextClipStartTime, transitionDuration);
            AVMutableVideoCompositionLayerInstruction *fadeIn = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack[0]];
            [fadeIn setOpacityRampFromStartOpacity:0 toEndOpacity:1 timeRange:CMTimeRangeMake(nextClipStartTime, transitionDuration)];
            inInstruction.layerInstructions = [NSArray arrayWithObject:fadeIn];
            [instructions addObject:inInstruction];
        }

        AVMutableVideoCompositionInstruction *passThroughInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
        passThroughInstruction.timeRange = CMTimeRangeMake(clipStartTime,clipDuration);
        AVMutableVideoCompositionLayerInstruction *passThroughLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack[0]];
        passThroughInstruction.layerInstructions = [NSArray arrayWithObject:passThroughLayer];
        [instructions addObject:passThroughInstruction];

        if(i < (videoLine.items.count - 1)){
            // Fade this clip out at the end of its pass-through range (again, to black).
            AVMutableVideoCompositionInstruction *outInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
            outInstruction.timeRange = CMTimeRangeMake(CMTimeAdd(clipStartTime,clipDuration), transitionDuration);
            AVMutableVideoCompositionLayerInstruction *fadeOut = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack[0]];
            [fadeOut setOpacityRampFromStartOpacity:1.0 toEndOpacity:0 timeRange:CMTimeRangeMake(CMTimeAdd(clipStartTime,clipDuration), transitionDuration)];
            outInstruction.layerInstructions = [NSArray arrayWithObject:fadeOut];
            [instructions addObject:outInstruction];
        }
        nextClipStartTime = CMTimeAdd(nextClipStartTime,timeRangeInAsset.duration);
        if(i == ([videoLine.items count] - 1)){
            [audioInputParams[0] setVolume:info.volume atTime:nextClipStartTime];
        }
        i++;
    }
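(For context: the excerpt stops before the collected instructions array and the audioInputParams are attached to anything. They would normally be wired into an AVMutableVideoComposition and an AVMutableAudioMix roughly as in this sketch; the render size and frame duration here are placeholder values, not taken from the original code.)

    // Attach the collected pieces (sketch; renderSize/frameDuration are assumed placeholders).
    [allAudioParams addObject:audioInputParams[0]];
    [allAudioParams addObject:audioInputParams[1]];

    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = allAudioParams;

    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.instructions = instructions;
    videoComposition.frameDuration = CMTimeMake(1, 30);   // assumed 30 fps
    videoComposition.renderSize = CGSizeMake(1280, 720);  // assumed output size

    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
    playerItem.videoComposition = videoComposition;
    playerItem.audioMix = audioMix;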

Best Answer

You need to alternate back and forth between video/audio composition tracks "A" and "B", which in your code are compositionVideoTrack[0] and compositionVideoTrack[1]. Each iteration of the loop switches which track you are compositing onto.
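In practice that means the track index you write to alternates inside the loop (for example i % 2), and the transition instruction carries layer instructions for both composition tracks, so the incoming clip fades in over the outgoing one instead of over black. A minimal sketch of that idea, reusing the names from the question (currentTrack and otherTrack are illustrative, not from the original code):

    // Alternate which composition track receives this clip.
    NSInteger trackIndex = i % 2;  // 0 = track A, 1 = track B
    AVMutableCompositionTrack *currentTrack = compositionVideoTrack[trackIndex];
    AVMutableCompositionTrack *otherTrack = compositionVideoTrack[1 - trackIndex];

    [currentTrack insertTimeRange:timeRangeInAsset ofTrack:clipVideoTrack atTime:nextClipStartTime error:nil];

    // During the overlap, ramp the incoming track up while the outgoing track ramps down.
    AVMutableVideoCompositionInstruction *transition = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    transition.timeRange = CMTimeRangeMake(nextClipStartTime, transitionDuration);

    AVMutableVideoCompositionLayerInstruction *fadeIn = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:currentTrack];
    [fadeIn setOpacityRampFromStartOpacity:0.0 toEndOpacity:1.0 timeRange:transition.timeRange];

    AVMutableVideoCompositionLayerInstruction *fadeOut = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:otherTrack];
    [fadeOut setOpacityRampFromStartOpacity:1.0 toEndOpacity:0.0 timeRange:transition.timeRange];

    transition.layerInstructions = [NSArray arrayWithObjects:fadeIn, fadeOut, nil];
    [instructions addObject:transition];

The pass-through instructions outside the overlap then reference only the track that currently holds the visible clip, and the same alternation applies to compositionAudioTrack[0]/[1] and their volume settings. Note that for the clips to overlap at all, nextClipStartTime has to advance by the clip duration minus transitionDuration rather than by the full clip duration.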

A similar question about two-track editing with AVFoundation in Objective-C can be found on Stack Overflow: https://stackoverflow.com/questions/12015337/
