
Blank frames when merging videos using AVMutableComposition

This question has been asked many times before, but nothing has helped me. I am merging multiple videos using AVMutableComposition. After merging, I get blank frames in 30 to 40% of the videos; the others merge fine. I play the composition directly with AVPlayer as an AVPlayerItem. The code is below:

AVMutableComposition *mutableComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];

NSMutableArray *instructions = [NSMutableArray new];
CGSize size = CGSizeZero;

CMTime time = kCMTimeZero;
for (AVURLAsset *asset in assets)
{
    AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *audioAssetTrack = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;

    NSError *error = nil;
    [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
                                   ofTrack:assetTrack
                                    atTime:time
                                     error:&error];
    if (error) {
        NSLog(@"asset url :: %@", assetTrack.asset);
        NSLog(@"Error - %@", error.debugDescription);
    }

    error = nil;
    [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTrack.timeRange.duration)
                                   ofTrack:audioAssetTrack
                                    atTime:time
                                     error:&error];
    if (error) {
        NSLog(@"Error - %@", error.debugDescription);
    }

    AVMutableVideoCompositionInstruction *videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    videoCompositionInstruction.timeRange = CMTimeRangeMake(time, assetTrack.timeRange.duration);
    videoCompositionInstruction.layerInstructions = @[[AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack]];
    [instructions addObject:videoCompositionInstruction];

    time = CMTimeAdd(time, assetTrack.timeRange.duration);

    if (CGSizeEqualToSize(size, CGSizeZero)) {
        size = assetTrack.naturalSize;
    }
}

AVMutableVideoComposition *mutableVideoComposition = [AVMutableVideoComposition videoComposition];
mutableVideoComposition.instructions = instructions;
mutableVideoComposition.frameDuration = CMTimeMake(1, 30);
mutableVideoComposition.renderSize = size;

playerItem = [AVPlayerItem playerItemWithAsset:mutableComposition];
playerItem.videoComposition = mutableVideoComposition;

Your layerInstructions are not correct; you can see this by commenting out the last line: 'playerItem.videoComposition = mutableVideoComposition;' –


You mean they are not correct? What exactly is wrong with the instructions? After commenting out that line, I get a black frame between all the videos. – blancos

Answer


As far as I know, an AVMutableVideoCompositionLayerInstruction cannot simply be "added" or "appended" the way your code does it.

From your code, I believe you want to keep the video-instruction information when merging the video assets, but the instructions cannot simply be "copied" across.

If you want to do that, consult the docs for AVVideoCompositionLayerInstruction, e.g.

    getTransformRampForTime:startTransform:endTransform:timeRange:
    setTransformRampFromStartTransform:toEndTransform:timeRange:
    setTransform:atTime:

    getOpacityRampForTime:startOpacity:endOpacity:timeRange:
    setOpacityRampFromStartOpacity:toEndOpacity:timeRange:
    setOpacity:atTime:

    getCropRectangleRampForTime:startCropRectangle:endCropRectangle:timeRange:
    setCropRectangleRampFromStartCropRectangle:toEndCropRectangle:timeRange:
    setCropRectangle:atTime:

You should use the getFoo... methods on the source asset's layer instructions, then calculate the insertTime or timeRange for the final track, then call setFoo..., and finally add the result to the layerInstructions of the final videoComposition.
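For illustration, here is a minimal sketch of that get/shift/set flow. `sourceInstruction` (a layer instruction taken from the source asset's video composition) and `insertTime` (where the asset lands in the merged timeline) are hypothetical names, not from the question:

    CGAffineTransform startTransform, endTransform;
    CMTimeRange sourceRange;
    if ([sourceInstruction getTransformRampForTime:kCMTimeZero
                                    startTransform:&startTransform
                                      endTransform:&endTransform
                                         timeRange:&sourceRange]) {
        // Shift the ramp so it starts where this asset sits in the merged timeline.
        CMTimeRange shiftedRange = CMTimeRangeMake(CMTimeAdd(sourceRange.start, insertTime),
                                                   sourceRange.duration);
        AVMutableVideoCompositionLayerInstruction *newInstruction =
            [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];
        [newInstruction setTransformRampFromStartTransform:startTransform
                                            toEndTransform:endTransform
                                                 timeRange:shiftedRange];
        // ...then add newInstruction to the final videoComposition's layerInstructions.
    }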

Yes, a bit complicated... Also, most importantly, you may not be able to reproduce every video effect that was applied to the source asset.

So what is your goal? And what do your source assets consist of?

If you just want to merge some mp4/mov files, simply loop over the tracks and append them to an AVMutableCompositionTrack; don't use a videoComposition at all (see the sketch below). I tested your code this way, and it works.
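A minimal sketch under that assumption (plain files, nothing but track insertion; `assets` is an NSArray of AVURLAsset, as in the question):

    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    CMTime cursor = kCMTimeZero;
    for (AVURLAsset *asset in assets) {
        AVAssetTrack *video = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
        AVAssetTrack *audio = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;
        CMTimeRange range = CMTimeRangeMake(kCMTimeZero, asset.duration);
        NSError *error = nil;
        if (video) {
            [videoTrack insertTimeRange:range ofTrack:video atTime:cursor error:&error];
        }
        if (audio) {
            [audioTrack insertTimeRange:range ofTrack:audio atTime:cursor error:&error];
        }
        cursor = CMTimeAdd(cursor, asset.duration);
    }

    // Play the bare composition; no videoComposition is set.
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];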

If you want to merge AVAssets that carry video instructions, see the explanation above and the docs. My best practice: before merging, flatten those AVAssets to files using AVAssetExportSession, then just merge the resulting video files (a sketch follows).
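A minimal sketch of that flatten-then-merge step (`asset`, `videoComposition`, and `outputURL` are placeholders for your own values):

    AVAssetExportSession *exportSession =
        [[AVAssetExportSession alloc] initWithAsset:asset
                                         presetName:AVAssetExportPresetHighestQuality];
    exportSession.videoComposition = videoComposition; // bake the instructions into the file
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (AVAssetExportSessionStatusCompleted == exportSession.status) {
            // The exported file can now be merged like any other plain video file.
        }
    }];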

P.S. There may also be some problems with your test files or source assets.

Code from my Vine-like project:

- (BOOL)generateComposition 
    { 
      [self cleanComposition]; 

      NSUInteger segmentsCount = self.segmentsCount; 
      if (0 == segmentsCount) { 
        return NO; 
      } 

      AVMutableComposition *composition = [AVMutableComposition composition]; 
      AVMutableVideoComposition *videoComposition = nil; 
      AVMutableVideoCompositionInstruction *videoCompositionInstruction = nil; 
      AVMutableVideoCompositionLayerInstruction *videoCompositionLayerInstruction = nil; 
      AVMutableAudioMix *audioMix = nil; 

      AVMutableCompositionTrack *videoTrack = nil; 
      AVMutableCompositionTrack *audioTrack = nil; 
      AVMutableCompositionTrack *musicTrack = nil; 
      CMTime currentTime = kCMTimeZero; 

      for (MVRecorderSegment *segment in self.segments) { 
        AVURLAsset *asset = segment.asset; 
        NSArray *videoAssetTracks = [asset tracksWithMediaType:AVMediaTypeVideo]; 
        NSArray *audioAssetTracks = [asset tracksWithMediaType:AVMediaTypeAudio]; 

        CMTime maxBounds = kCMTimeInvalid; 

        CMTime videoTime = currentTime; 
        for (AVAssetTrack *videoAssetTrack in videoAssetTracks) { 
          if (!videoTrack) { 
            videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; 
            videoTrack.preferredTransform = CGAffineTransformIdentity; 

            videoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction]; 
            videoCompositionLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack]; 
          } 

          /* Fix orientation */ 
          CGAffineTransform transform = videoAssetTrack.preferredTransform; 
          if (AVCaptureDevicePositionFront == segment.cameraPosition) { 
            transform = CGAffineTransformMakeTranslation(self.config.videoSize, 0); 
            transform = CGAffineTransformScale(transform, -1.0, 1.0); 
          } else if (AVCaptureDevicePositionBack == segment.cameraPosition) { 
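            // Back camera: keep the track's preferredTransform as-is.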

          } 
          [videoCompositionLayerInstruction setTransform:transform atTime:videoTime]; 

          /* Append track */ 
          videoTime = [MVHelper appendAssetTrack:videoAssetTrack toCompositionTrack:videoTrack atTime:videoTime withBounds:maxBounds]; 
          maxBounds = videoTime; 
        } 

        if (self.sessionConfiguration.originalVoiceOn) { 
          CMTime audioTime = currentTime; 
          for (AVAssetTrack *audioAssetTrack in audioAssetTracks) { 
            if (!audioTrack) { 
              audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid]; 
            } 
            audioTime = [MVHelper appendAssetTrack:audioAssetTrack toCompositionTrack:audioTrack atTime:audioTime withBounds:maxBounds]; 
          } 
        } 

        currentTime = composition.duration; 
      } 

      if (videoCompositionInstruction && videoCompositionLayerInstruction) { 
        videoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration); 
        videoCompositionInstruction.layerInstructions = @[videoCompositionLayerInstruction]; 

        videoComposition = [AVMutableVideoComposition videoComposition]; 
        videoComposition.renderSize = CGSizeMake(self.config.videoSize, self.config.videoSize); 
        videoComposition.frameDuration = CMTimeMake(1, self.config.videoFrameRate); 
        videoComposition.instructions = @[videoCompositionInstruction]; 
      } 


      // Add the background music track (musicTrack)
      NSURL *musicFileURL = self.sessionConfiguration.musicFileURL; 
      if (musicFileURL && musicFileURL.isFileExists) { 
        AVAsset *musicAsset = [AVAsset assetWithURL:musicFileURL]; 
        AVAssetTrack *musicAssetTrack = [musicAsset tracksWithMediaType:AVMediaTypeAudio].firstObject; 
        if (musicAssetTrack) { 
          musicTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid]; 
          if (CMTIME_COMPARE_INLINE(musicAsset.duration, >=, composition.duration)) { 
            // If the music is at least as long as the composition, insert it directly
            [musicTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, composition.duration) ofTrack:musicAssetTrack atTime:kCMTimeZero error:NULL]; 
          } else { 
            // Otherwise, loop the background music
            CMTime musicTime = kCMTimeZero; 
            CMTime bounds = composition.duration; 
            while (true) { 
              musicTime = [MVHelper appendAssetTrack:musicAssetTrack toCompositionTrack:musicTrack atTime:musicTime withBounds:bounds]; 
              if (CMTIME_COMPARE_INLINE(musicTime, >=, composition.duration)) { 
                break; 
              } 
            } 
          } 
        } 
      } 

      // Set up the audio mix
      if (musicTrack) { 
        AVMutableAudioMixInputParameters *audioMixParameters = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:musicTrack]; 

        /* Fade the background music in and out */
        AVAsset *musicAsset = musicTrack.asset; 
        CMTime crossfadeDuration = CMTimeMake(15, 10); // 1.5 seconds at each end
        CMTime halfDuration = CMTimeMultiplyByFloat64(musicAsset.duration, 0.5); 
        crossfadeDuration = CMTimeMinimum(crossfadeDuration, halfDuration); 
        CMTimeRange crossfadeRangeBegin = CMTimeRangeMake(kCMTimeZero, crossfadeDuration); 
        CMTimeRange crossfadeRangeEnd = CMTimeRangeMake(CMTimeSubtract(musicAsset.duration, crossfadeDuration), crossfadeDuration); 
        [audioMixParameters setVolumeRampFromStartVolume:0.0 toEndVolume:self.sessionConfiguration.musicVolume timeRange:crossfadeRangeBegin]; 
        [audioMixParameters setVolumeRampFromStartVolume:self.sessionConfiguration.musicVolume toEndVolume:0.0 timeRange:crossfadeRangeEnd]; 

        audioMix = [AVMutableAudioMix audioMix]; 
        [audioMix setInputParameters:@[audioMixParameters]]; 
      } 

      _composition = composition; 
      _videoComposition = videoComposition; 
      _audioMix = audioMix; 

      return YES; 
    } 


    - (AVPlayerItem *)playerItem 
    { 
      AVPlayerItem *playerItem = nil; 
      if (self.composition) { 
        playerItem = [AVPlayerItem playerItemWithAsset:self.composition]; 
        if (!self.videoComposition.animationTool) { 
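          // A videoComposition that uses an animationTool (Core Animation) is
          // only valid for export, not for AVPlayerItem playback.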
          playerItem.videoComposition = self.videoComposition; 
        } 
        playerItem.audioMix = self.audioMix; 
      } 
      return playerItem; 
    } 

    ///============================================= 
    /// MVHelper 
    ///============================================= 

    + (CMTime)appendAssetTrack:(AVAssetTrack *)track toCompositionTrack:(AVMutableCompositionTrack *)compositionTrack atTime:(CMTime)atTime withBounds:(CMTime)bounds 
    { 
      // Guard against nil inputs before reading the track's time range.
      if (!track || !compositionTrack) {
        return atTime;
      }

      CMTimeRange timeRange = track.timeRange;
      atTime = CMTimeAdd(atTime, timeRange.start);

      if (CMTIME_IS_VALID(bounds)) { 
        CMTime currentBounds = CMTimeAdd(atTime, timeRange.duration); 
        if (CMTIME_COMPARE_INLINE(currentBounds, >, bounds)) { 
          timeRange = CMTimeRangeMake(timeRange.start, CMTimeSubtract(timeRange.duration, CMTimeSubtract(currentBounds, bounds))); 
        } 
      } 
      if (CMTIME_COMPARE_INLINE(timeRange.duration, >, kCMTimeZero)) { 
        NSError *error = nil; 
        [compositionTrack insertTimeRange:timeRange ofTrack:track atTime:atTime error:&error]; 
        if (error) { 
          MVLog(@"Failed to append %@ track: %@", compositionTrack.mediaType, error); 
        } 
        return CMTimeAdd(atTime, timeRange.duration); 
      } 

      return atTime; 
    } 
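For completeness, a short usage sketch of the two methods above (`recorder` is a hypothetical instance of the class they belong to):

    if ([recorder generateComposition]) {
        AVPlayerItem *item = [recorder playerItem];
        AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
        [player play];
    }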