2012-02-16

AVCaptureSession: specify the resolution and quality of captured images (obj-c iPhone app)

Hi. I want to configure an AV capture session to capture images at a specific resolution (and, if possible, at a specific quality) using the iPhone camera. Here is the AV session setup code:

// Create and configure a capture session and start it running 
- (void)setupCaptureSession 
{ 
    NSError *error = nil; 

    // Create the session 
    self.captureSession = [[AVCaptureSession alloc] init]; 

    // Configure the session to produce lower resolution video frames, if your 
    // processing algorithm can cope. We'll specify medium quality for the 
    // chosen device. 
    captureSession.sessionPreset = AVCaptureSessionPresetMedium; 

    // Find a suitable AVCaptureDevice 
    NSArray *cameras=[AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]; 
    AVCaptureDevice *device; 
    if ([UserDefaults camera] == UIImagePickerControllerCameraDeviceFront) 
    { 
        device = [cameras objectAtIndex:1]; 
    } 
    else 
    { 
        device = [cameras objectAtIndex:0]; 
    } 

    // Create a device input with the device and add it to the session. 
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; 
    if (!input) 
    { 
        NSLog(@"PANIC: no media input"); 
        return; 
    } 
    [captureSession addInput:input]; 

    // Create a VideoDataOutput and add it to the session 
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init]; 
    [captureSession addOutput:output]; 
    NSLog(@"connections: %@", output.connections); 

    // Configure your output. 
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL); 
    [output setSampleBufferDelegate:self queue:queue]; 
    dispatch_release(queue); 

    // Specify the pixel format 
    output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]; 


    // If you wish to cap the frame rate to a known value, such as 15 fps, set 
    // minFrameDuration, e.g. output.minFrameDuration = CMTimeMake(1, 15); 

    // Assign session to an ivar. 
    [self setSession:captureSession]; 
    [self.captureSession startRunning]; 
} 

and the setSession: method:

-(void)setSession:(AVCaptureSession *)session 
{ 
    NSLog(@"setting session..."); 
    self.captureSession=session; 
    NSLog(@"setting camera view"); 
    self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session]; 
    //UIView *aView = self.view; 
    CGRect videoRect = CGRectMake(20.0, 20.0, 280.0, 255.0); 
    previewLayer.frame = videoRect; // Assume you want the preview layer to fill the view. 
    [previewLayer setBackgroundColor:[[UIColor grayColor] CGColor]]; 
    [self.view.layer addSublayer:previewLayer]; 
    //[aView.layer addSublayer:previewLayer]; 
} 

and the output methods:

// Delegate routine that is called when a sample buffer was written 
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
    fromConnection:(AVCaptureConnection *)connection 
{ 
    //NSLog(@"captureOutput: didOutputSampleBufferFromConnection"); 

    // Create a UIImage from the sample buffer data 
    self.currentImage = [self imageFromSampleBuffer:sampleBuffer]; 

    //< Add your code here that uses the image > 
} 

// Create a UIImage from sample buffer data 
- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer 
{ 
    //NSLog(@"imageFromSampleBuffer: called"); 
    // Get a CMSampleBuffer's Core Video image buffer for the media data 
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
    // Lock the base address of the pixel buffer 
    CVPixelBufferLockBaseAddress(imageBuffer, 0); 

    // Get the base address of the pixel buffer 
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer); 

    // Get the number of bytes per row for the pixel buffer 
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
    // Get the pixel buffer width and height 
    size_t width = CVPixelBufferGetWidth(imageBuffer); 
    size_t height = CVPixelBufferGetHeight(imageBuffer); 

    // Create a device-dependent RGB color space 
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    // Create a bitmap graphics context with the sample buffer data 
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
    // Create a Quartz image from the pixel data in the bitmap graphics context 
    CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
    // Unlock the pixel buffer 
    CVPixelBufferUnlockBaseAddress(imageBuffer,0); 


    // Free up the context and color space 
    CGContextRelease(context); 
    CGColorSpaceRelease(colorSpace); 

    // Create an image object from the Quartz image 
    UIImage *image = [UIImage imageWithCGImage:quartzImage]; 

    // Release the Quartz image 
    CGImageRelease(quartzImage); 

    return (image); 
} 

Everything is fairly standard. But where, and what, should I change to specify the resolution and quality of the captured image? Please help.

+0

[Check out a similar question](http://stackoverflow.com/questions/24758407/ios-capture-high-resolution-photo-while-using-a-low-avcapturesessionpreset-for-v/40609268#40609268). This might help. –

Answer

11

See the Capturing Still Images section of Apple's guide for the dimensions you will get if you set one preset or another.

The parameter you should change is captureSession.sessionPreset
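For example, a minimal sketch (assuming an existing captureSession ivar, as in the question's code) that requests 640x480 frames and falls back to medium quality when the preset is unavailable:

```objectivec
// Pick the capture resolution via the session preset. Always check
// canSetSessionPreset: first, since not every device supports every preset.
if ([captureSession canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    captureSession.sessionPreset = AVCaptureSessionPreset640x480;
} else {
    captureSession.sessionPreset = AVCaptureSessionPresetMedium;
}
```

Set the preset before (or inside a beginConfiguration/commitConfiguration pair around) startRunning, so the change applies to the frames you receive.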

+0

I have a 'UISlider' in my application that the user uses to specify whether they want a higher or lower value. But which values are the higher ones? – Oleg

+0

'NSString * const AVCaptureSessionPresetPhoto; NSString * const AVCaptureSessionPresetHigh; NSString * const AVCaptureSessionPresetMedium; NSString * const AVCaptureSessionPresetLow; NSString * const AVCaptureSessionPreset320x240; NSString * const AVCaptureSessionPreset352x288; NSString * const AVCaptureSessionPreset640x480; NSString * const AVCaptureSessionPreset960x540; NSString * const AVCaptureSessionPreset1280x720; ' – Oleg

+0

Well, apparently the better-quality ones are the higher ones. – Eugene
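To connect that to the UISlider question above, one possible sketch: keep the presets in an array ordered from lowest to highest resolution and index into it with the slider's value. The helper name and the exact ordering are assumptions; verify the preset list against the AVCaptureSession documentation for your target devices.

```objectivec
// Hypothetical helper: map a slider value in [0.0, 1.0] to a session
// preset, where larger slider values select higher-resolution presets.
- (NSString *)presetForSliderValue:(float)value
{
    NSArray *presets = [NSArray arrayWithObjects:
                        AVCaptureSessionPresetLow,
                        AVCaptureSessionPreset352x288,
                        AVCaptureSessionPresetMedium,
                        AVCaptureSessionPreset640x480,
                        AVCaptureSessionPresetHigh,
                        nil];
    NSUInteger index = (NSUInteger)(value * ([presets count] - 1));
    return [presets objectAtIndex:index];
}
```

The chosen preset should still be gated through canSetSessionPreset: before being assigned to the session.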

0

Try going with something like this, where cx and cy are your custom resolution:

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
                              AVVideoScalingModeResizeAspectFill, AVVideoScalingModeKey, 
                              AVVideoCodecH264, AVVideoCodecKey, 
                              [NSNumber numberWithInt:cx], AVVideoWidthKey, 
                              [NSNumber numberWithInt:cy], AVVideoHeightKey, 
                              nil]; 
_videoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings]; 
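Note that these are output settings for an AVAssetWriterInput, which scales frames while writing a movie file; they do not change the capture session's own resolution. A minimal sketch of attaching the input to a writer (the file URL here is a placeholder assumption):

```objectivec
NSError *error = nil;
// Hypothetical destination URL for the recorded movie.
NSURL *fileURL = [NSURL fileURLWithPath:@"/tmp/capture.mov"];
AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:fileURL
                                                  fileType:AVFileTypeQuickTimeMovie
                                                     error:&error];
if (writer && [writer canAddInput:_videoInput]) {
    [writer addInput:_videoInput];
}
```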