
I'm writing an MP4 video file with an AVAssetWriter using an AVAssetWriterInputPixelBufferAdaptor. The AVAssetWriter sometimes fails with the status AVAssetWriterStatusFailed. It seems random.

The source is a video from a UIImagePickerController, either freshly captured with the camera or taken from the asset library. The quality is currently UIImagePickerControllerQualityTypeMedium.

Sometimes the writer fails. Its status is AVAssetWriterStatusFailed, and the AVAssetWriter object's error property is:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" 
UserInfo=0xf5d8990 {NSLocalizedFailureReason=An unknown error occurred (-536870210), 
NSUnderlyingError=0x4dd8e0 "The operation couldn’t be completed. (OSStatus error -536870210.)", 
NSLocalizedDescription=The operation could not be completed 

The error occurs roughly 20% of the times the code runs. It seems to fail more frequently on the iPhone 4/4S than on the iPhone 5.

It also occurs more frequently when the source video quality is higher. With UIImagePickerControllerQualityTypeLow the error doesn't happen as often; with UIImagePickerControllerQualityTypeHigh it occurs a bit more frequently.

I've also noticed something else: it seems to come in waves. When it fails, the following runs often fail too, even if I delete the app and reinstall it. That makes me wonder whether my program leaks memory, and whether that memory could somehow stay alive even after the app is killed (is that even possible?).

Here is the code I use to render my video:

- (void)writeVideo 
{ 
    offlineRenderingInProgress = YES; 

/* --- Writer Setup --- */ 

    [locationQueue cancelAllOperations]; 

    [self stopWithoutRewinding]; 

    NSError *writerError = nil; 

    NSError *fileError = nil;

    BOOL success = [[NSFileManager defaultManager] removeItemAtURL:self.outputURL error:&fileError];

    // DLog(@"Url: %@, success: %i, error: %@", self.outputURL, success, fileError);

    writer = [AVAssetWriter assetWriterWithURL:self.outputURL fileType:(NSString *)kUTTypeQuickTimeMovie error:&writerError]; 
    //writer.shouldOptimizeForNetworkUse = NO; 

    if (writerError) { 
     DLog(@"Writer error: %@", writerError); 
     return; 
    } 

    float bitsPerPixel; 
    CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions((__bridge CMVideoFormatDescriptionRef)([readerVideoOutput.videoTracks[0] formatDescriptions][0])); 
    int numPixels = dimensions.width * dimensions.height; 
    int bitsPerSecond; 

    // Assume that lower-than-SD resolutions are intended for streaming, and use a lower bitrate 
    if (numPixels < (640 * 480)) 
     bitsPerPixel = 4.05; // This bitrate matches the quality produced by AVCaptureSessionPresetMedium or Low. 
    else 
     bitsPerPixel = 11.4; // This bitrate matches the quality produced by AVCaptureSessionPresetHigh. 

    bitsPerSecond = numPixels * bitsPerPixel; 
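    // Note: bitsPerSecond is computed above but never applied to the compression
    // settings below (e.g. via AVVideoAverageBitRateKey), so the encoder will
    // pick its own default bitrate.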

    NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
              AVVideoCodecH264, AVVideoCodecKey, 
              [NSNumber numberWithFloat:videoSize.width], AVVideoWidthKey, 
              [NSNumber numberWithInteger:videoSize.height], AVVideoHeightKey, 
              [NSDictionary dictionaryWithObjectsAndKeys: 
              [NSNumber numberWithInteger:30], AVVideoMaxKeyFrameIntervalKey, 
              nil], AVVideoCompressionPropertiesKey, 
              nil]; 

    writerVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoCompressionSettings]; 
    writerVideoInput.transform = movie.preferredTransform; 
    writerVideoInput.expectsMediaDataInRealTime = YES; 
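    // Note: expectsMediaDataInRealTime = YES is meant for live capture sources;
    // when reading from a file, NO lets AVFoundation buffer more aggressively.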
    [writer addInput:writerVideoInput]; 

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: 
                 [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil]; 

    writerPixelAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerVideoInput 
                         sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary]; 
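    // Note: once writing starts the adaptor exposes a pixelBufferPool, but the
    // render loop below allocates its own buffers with CVPixelBufferCreate.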
    BOOL couldStart = [writer startWriting]; 

    if (!couldStart) { 
     DLog(@"Could not start AVAssetWriter!"); 
     abort = YES; 
     [locationQueue cancelAllOperations]; 
     return; 
    } 

    [self configureFilters]; 

    CIContext *offlineRenderContext = [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @NO}]; 


    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    if (!self.canEdit) { 
     [self createVideoReaderWithAsset:movie timeRange:CMTimeRangeFromTimeToTime(kCMTimeZero, kCMTimePositiveInfinity) forOfflineRender:YES]; 
    } else { 
     [self createVideoReaderWithAsset:movie timeRange:CMTimeRangeWithNOVideoRangeInDuration(self.thumbnailEditView.range, movie.duration) forOfflineRender:YES]; 
    } 

    CMTime startOffset = reader.timeRange.start; 

    DLog(@"startOffset: %llu", startOffset.value); 

    [self.thumbnailEditView removeFromSuperview]; 
    // self.thumbnailEditView = nil; 

    [glLayer removeFromSuperlayer]; 
    glLayer = nil; 

    [playerView removeFromSuperview]; 
    playerView = nil; 

    glContext = nil; 



    [writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{ 

     @try { 


     BOOL didWriteSomething = NO; 

     DLog(@"Preparing to write..."); 

     while ([writerVideoInput isReadyForMoreMediaData]) { 

      if (abort) { 
       NSLog(@"Abort == YES"); 
       [locationQueue cancelAllOperations]; 
       [writerVideoInput markAsFinished]; 
       videoConvertCompletionBlock(NO, writer.error.localizedDescription); 
       return; 
      } 

      if (writer.status == AVAssetWriterStatusFailed) { 
       DLog(@"Writer.status: AVAssetWriterStatusFailed, error: %@", writer.error); 

       [[NSUserDefaults standardUserDefaults] setObject:[NSNumber numberWithInt:1] forKey:@"QualityOverride"]; 
       [[NSUserDefaults standardUserDefaults] synchronize]; 

       abort = YES; 
       [locationQueue cancelAllOperations]; 
       videoConvertCompletionBlock(NO, writer.error.localizedDescription); 
       DLog(@"Source file exists: %i", [[NSFileManager defaultManager] fileExistsAtPath:movie.URL.relativePath]); 
       return; 
      } 

      DLog(@"Writing started..."); 

      CMSampleBufferRef buffer = nil; 

      if (reader.status != AVAssetReaderStatusUnknown) { 

       if (reader.status == AVAssetReaderStatusReading) { 
        buffer = [readerVideoOutput copyNextSampleBuffer]; 
        if (didWriteSomething == NO) { 
         DLog(@"Copying sample buffers..."); 
        } 
       } 

       if (!buffer) { 

        [writerVideoInput markAsFinished]; 

        DLog(@"Finished..."); 

        CGColorSpaceRelease(colorSpace); 

        [self offlineRenderingDidFinish]; 


        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{ 

         [writer finishWriting]; 
         if (writer.error != nil) { 
          DLog(@"Error: %@", writer.error); 
         } else { 
          DLog(@"Succes!"); 
         } 

         if (writer.status == AVAssetWriterStatusCompleted) { 

          videoConvertCompletionBlock(YES, nil); 
         } 

         else { 
          abort = YES; 
          videoConvertCompletionBlock(NO, writer.error.localizedDescription); 
         } 

        }); 


        return; 
       } 

       didWriteSomething = YES; 
      } 
      else { 

       DLog(@"Still waiting..."); 
       //Reader just needs a moment to get ready... 
       continue; 
      } 

      CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer); 

      if (pixelBuffer == NULL) { 
       DLog(@"Pixelbuffer == NULL"); 
       CFRelease(buffer); 
       continue; 
      } 

      //DLog(@"Sample call back! Pixelbuffer: %lu", CVPixelBufferGetHeight(pixelBuffer)); 

      //NSDictionary *options = [NSDictionary dictionaryWithObject:(__bridge id)CGColorSpaceCreateDeviceRGB() forKey:kCIImageColorSpace]; 

      CIImage *ciimage = [CIImage imageWithCVPixelBuffer:pixelBuffer options:nil]; 

      CIImage *outputImage = [self filteredImageWithImage:ciimage]; 


      CVPixelBufferRef outPixelBuffer = NULL; 
      CVReturn status; 

      CFDictionaryRef empty; // empty value for attr value. 
      CFMutableDictionaryRef attrs; 
      empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary 
             NULL, 
             NULL, 
             0, 
             &kCFTypeDictionaryKeyCallBacks, 
             &kCFTypeDictionaryValueCallBacks); 

      attrs = CFDictionaryCreateMutable(kCFAllocatorDefault, 
               1, 
               &kCFTypeDictionaryKeyCallBacks, 
               &kCFTypeDictionaryValueCallBacks); 

      CFDictionarySetValue(attrs, 
           kCVPixelBufferIOSurfacePropertiesKey, 
           empty); 

      CFDictionarySetValue(attrs, 
           kCVPixelBufferCGImageCompatibilityKey, 
           (__bridge const void *)([NSNumber numberWithBool:YES])); 

      CFDictionarySetValue(attrs, 
           kCVPixelBufferCGBitmapContextCompatibilityKey, 
           (__bridge const void *)([NSNumber numberWithBool:YES])); 


      status = CVPixelBufferCreate(kCFAllocatorDefault, ciimage.extent.size.width, ciimage.extent.size.height, kCVPixelFormatType_32BGRA, attrs, &outPixelBuffer); 

      CFRelease(attrs); 
      CFRelease(empty); 

      //DLog(@"Output image size: %f, %f, pixelbuffer height: %lu", outputImage.extent.size.width, outputImage.extent.size.height, CVPixelBufferGetHeight(outPixelBuffer)); 

      if (status != kCVReturnSuccess) { 
       DLog(@"Couldn't allocate output pixelBufferRef!"); 
       CFRelease(buffer); 
       continue; 
      } 

      [offlineRenderContext render:outputImage toCVPixelBuffer:outPixelBuffer bounds:outputImage.extent colorSpace:colorSpace]; 

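      // Compute the writer-side timestamp relative to the reader's start,
      // then derive progress by converting duration to the same timescale.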
      CMTime currentSourceTime = CMSampleBufferGetPresentationTimeStamp(buffer); 
      CMTime currentTime = CMTimeSubtract(currentSourceTime, startOffset); 
      CMTime duration = reader.timeRange.duration; 
      if (CMTIME_IS_POSITIVE_INFINITY(duration)) { 
       duration = movie.duration; 
      } 
      CMTime durationConverted = CMTimeConvertScale(duration, currentTime.timescale, kCMTimeRoundingMethod_Default); 

      float durationFloat = (float)durationConverted.value; 
      float progress = ((float) currentTime.value)/durationFloat; 

      //DLog(@"duration : %f, progress: %f", durationFloat, progress); 

      [self updateOfflineRenderProgress:progress]; 

      if (writerVideoInput.readyForMoreMediaData) { 
       [writerPixelAdaptor appendPixelBuffer:outPixelBuffer withPresentationTime:currentTime]; 
      } else { 
       CFRelease(buffer); 
       CVPixelBufferRelease(outPixelBuffer); 
       continue; 
      } 

      if (writer.status == AVAssetWriterStatusWriting) { 
       DLog(@"Writer.status: AVAssetWriterStatusWriting"); 
      } 

      CFRelease(buffer); 
      CVPixelBufferRelease(outPixelBuffer); 
     } 

     } 

     @catch (NSException *exception) { 
      DLog(@"Catching exception: %@", exception); 
     } 

    }]; 

} 

Your CIContext options are backwards. I imagine you meant to write 'CIContext *offlineRenderContext = [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @NO}];'


Yes, you're right. I've corrected it in the post.

Answer

OK, I think I solved it myself. The culprit was this line:

[writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{ .... 

The global queue I was passing in is a concurrent queue. It allows a new callback to be invoked before the previous one has finished, and the asset writer is not designed to be written to from more than one thread at a time.

Creating and using a new serial queue seems to solve the problem:

assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL); 

[writerVideoInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{... 
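For reference, here is a minimal sketch of how the fixed setup fits together. It reuses the identifiers from the question (writerVideoInput, readerVideoOutput), and the drain-loop body is an assumption standing in for the filtering and appending code above:

    // Sketch only: assumes writerVideoInput and readerVideoOutput from the
    // question are already configured and [writer startWriting] has been called.
    dispatch_queue_t assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL);

    [writerVideoInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{
        // On a serial queue this block can never run twice concurrently,
        // so the writer is only ever touched by one thread at a time.
        while ([writerVideoInput isReadyForMoreMediaData]) {
            CMSampleBufferRef buffer = [readerVideoOutput copyNextSampleBuffer];
            if (!buffer) {
                [writerVideoInput markAsFinished]; // source is exhausted
                break;
            }
            // ... filter, render and append the frame as in the question ...
            CFRelease(buffer);
        }
    }];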