
ios9 - Problems with cropping a CVImageBuffer

I am running into some cropping issues with the iOS 9 SDK.

I have the following code to resize an image (converting from 4:3 to 16:9 by cropping at the center). It worked fine up to the iOS 8 SDK. With iOS 9, the bottom area of the output frame is blank.

- (CMSampleBufferRef)resizeImage:(CMSampleBufferRef)sampleBuffer
{
     CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
     CVPixelBufferLockBaseAddress(imageBuffer,0); 

     int target_width = (int)CVPixelBufferGetWidth(imageBuffer); 
     int target_height = (int)CVPixelBufferGetHeight(imageBuffer); 
     int height = (int)CVPixelBufferGetHeight(imageBuffer); 
     int width = (int)CVPixelBufferGetWidth(imageBuffer); 

     int x=0, y=0; 

     // Crop a 4:3 landscape frame to 16:9 
     if (((target_width*3)/target_height) == 4) 
     { 
      target_height = ((target_width*9)/16); 
      target_height = ((target_height + 15)/16) * 16; 
      y = (height - target_height)/2; 
     } 
     else 
     if ((target_width == 352) && (target_height == 288)) 
     { 
      target_height = ((target_width*9)/16); 
      target_height = ((target_height + 15)/16) * 16; 
      y = (height - target_height)/2; 
     } 
     else 
     if (((target_height*3)/target_width) == 4) 
     { 
      target_width = ((target_height*9)/16); 
      target_width = ((target_width + 15)/16) * 16; 
       x = ((width - target_width)/2); 
     } 
     else 
     if ((target_width == 288) && (target_height == 352)) 
     { 
      target_width = ((target_height*9)/16); 
      target_width = ((target_width + 15)/16) * 16; 
       x = ((width - target_width)/2); 
     } 

     CGRect cropRect; 

     NSLog(@"resizeImage x %d, y %d, target_width %d, target_height %d", x, y, target_width, target_height); 
     cropRect = CGRectMake(x, y, target_width, target_height); 
     CFDictionaryRef empty; // empty value for attr value. 
     CFMutableDictionaryRef attrs; 
     empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary 
            NULL, 
            NULL, 
            0, 
            &kCFTypeDictionaryKeyCallBacks, 
            &kCFTypeDictionaryValueCallBacks); 
     attrs = CFDictionaryCreateMutable(kCFAllocatorDefault, 
              1, 
              &kCFTypeDictionaryKeyCallBacks, 
              &kCFTypeDictionaryValueCallBacks); 

     CFDictionarySetValue(attrs, 
           kCVPixelBufferIOSurfacePropertiesKey, 
           empty); 

     OSStatus status; 
     CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer]; //options: [NSDictionary dictionaryWithObjectsAndKeys:[NSNull null], kCIImageColorSpace, nil]]; 
     CVPixelBufferRef pixelBuffer; 
     status = CVPixelBufferCreate(kCFAllocatorSystemDefault, target_width, target_height, kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, attrs, &pixelBuffer); 
     if (status != 0) 
     { 
      NSLog(@"CVPixelBufferCreate error %d", (int)status); 
     } 

     [ciContext render:ciImage toCVPixelBuffer:pixelBuffer bounds:cropRect colorSpace:nil]; 
     // Note: pixelBuffer was never locked, so it does not need to be unlocked here. 
     CVPixelBufferUnlockBaseAddress(imageBuffer, 0); 

     CMSampleTimingInfo sampleTime = { 
      .duration = CMSampleBufferGetDuration(sampleBuffer), 
      .presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer), 
      .decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer) 
     }; 

     CMVideoFormatDescriptionRef videoInfo = NULL; 
     status = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &videoInfo); 
     if (status != 0) 
     { 
      NSLog(@"CMVideoFormatDescriptionCreateForImageBuffer error %d", (int)status); 
     } 
     CMSampleBufferRef oBuf; 
     status = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, true, NULL, NULL, videoInfo, &sampleTime, &oBuf); 
     if (status != 0) 
     { 
      NSLog(@"CMSampleBufferCreateForImageBuffer error %d", (int)status); 
     } 
     CFRelease(pixelBuffer); 
     CFRelease(videoInfo); 
     CFRelease(attrs); 
     CFRelease(empty); 
     ciImage = nil; 
     pixelBuffer = nil; 
     return oBuf; 
}
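For context, `ciContext` is not shown in the snippet above. Here is a minimal sketch of how it might be created and how the method would typically be driven from a capture delegate; the class name `FrameCropper` and the AVCapture wiring are assumptions, not part of the original code:

    // Hypothetical wrapper class; names and delegate wiring are assumptions.
    @import AVFoundation;
    @import CoreImage;

    @interface FrameCropper : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
    {
        CIContext *ciContext;   // the context used by -resizeImage: above
    }
    - (CMSampleBufferRef)resizeImage:(CMSampleBufferRef)sampleBuffer; // method from the question
    @end

    @implementation FrameCropper

    - (instancetype)init
    {
        if ((self = [super init])) {
            // The question never shows how ciContext is created; a default context is assumed.
            ciContext = [CIContext contextWithOptions:nil];
        }
        return self;
    }

    // -resizeImage: from the question would be implemented here as well.

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Crop each incoming frame, then hand the result on.
        CMSampleBufferRef cropped = [self resizeImage:sampleBuffer];
        if (cropped) {
            // ... pass the 16:9 frame on (preview, encoder, writer, ...) ...
            CFRelease(cropped);
        }
    }

    @end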

Any ideas or suggestions on this? I have tried changing the crop rectangle, but it had no effect.

Thanks


If I keep x and y at zero, i.e. x = 0, y = 0, it works, but it does not crop from the center; instead it removes the top part of the frame. – manishg

Answer


Are you aware that the documentation comment for [CIContext render:toCVPixelBuffer:bounds:colorSpace:] says different things for iOS 8 and earlier versus iOS 9 and later? (I could not find any online resource to link to, though.)

/* Render 'image' to the given CVPixelBufferRef. 
* The 'bounds' parameter has the following behavior: 
* In OS X and iOS 9 and later: The 'image' is rendered into 'buffer' so that 
*  point (0,0) of 'image' aligns to the lower left corner of 'buffer'. 
*  The 'bounds' acts like a clip rect to limit what region of 'buffer' is modified. 
* In iOS 8 and earlier: The 'bounds' parameter acts to specify the region of 'image' to render. 
*  This region (regardless of its origin) is rendered at upper-left corner of 'buffer'. 
*/ 

Taking this into account, I solved my problem, which seems to be the same as yours.
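On iOS 9 the bounds parameter clips the destination buffer and point (0,0) of the image is pinned to its lower-left corner, so a center crop has to be expressed by moving the image rather than the crop rect. Below is a minimal sketch of one possible adaptation of the render call from the question, under that assumption; it is not necessarily the exact change made here:

    // iOS 9+ semantics: translate the source so the desired crop region starts at (0,0),
    // then render into the full extent of the destination pixel buffer.
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:imageBuffer];
    ciImage = [ciImage imageByApplyingTransform:
                  CGAffineTransformMakeTranslation(-cropRect.origin.x, -cropRect.origin.y)];
    [ciContext render:ciImage
      toCVPixelBuffer:pixelBuffer
               bounds:CGRectMake(0, 0, target_width, target_height)
           colorSpace:nil];

Because the crop is symmetric about the center, translating by -cropRect.origin works the same way whether the origin is interpreted top-left or bottom-left.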