
Getting the camera preview into an AVCaptureVideoPreviewLayer

I'm trying to display the camera input in a preview layer view.

self.cameraPreviewView is wired up to a UIView in Interface Builder.
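
The relevant declarations look roughly like this; a sketch with assumed attributes, since only the property names appear in my code:

    @property (nonatomic, weak) IBOutlet UIView *cameraPreviewView;
    @property (nonatomic, strong) AVCaptureVideoPreviewLayer *cameraPreviewLayer;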

Here is my current code, put together from the AVFoundation Programming Guide, but nothing shows up in the preview:

AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetHigh;

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];

if (!input) {
    NSLog(@"Couldn't create video capture device");
}
[session addInput:input];

// Create the video preview layer and add it to the UI
AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
UIView *view = self.cameraPreviewView;
CALayer *viewLayer = [view layer];

newCaptureVideoPreviewLayer.frame = view.bounds;

[viewLayer addSublayer:newCaptureVideoPreviewLayer];

self.cameraPreviewLayer = newCaptureVideoPreviewLayer;

[session startRunning];
William Smith

After a lot of trial and error, and after looking at Apple's AVCam sample code, I found that wrapping the preview layer code and the session's startRunning call in a Grand Central Dispatch block, like this, made it start working:

dispatch_async(dispatch_get_main_queue(), ^{
    AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    UIView *view = self.cameraPreviewView;
    CALayer *viewLayer = [view layer];

    newCaptureVideoPreviewLayer.frame = view.bounds;

    [viewLayer addSublayer:newCaptureVideoPreviewLayer];

    self.cameraPreviewLayer = newCaptureVideoPreviewLayer;

    [session startRunning];
});
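
As a side note, startRunning is documented as a blocking call, and later revisions of Apple's AVCam sample start the session on a private serial queue instead, keeping only the layer work on the main queue. A minimal sketch of that split, assuming a sessionQueue property you create yourself (the queue name is illustrative):

    // Created once beforehand, e.g. in viewDidLoad (illustrative name):
    // self.sessionQueue = dispatch_queue_create("com.example.camera.session", DISPATCH_QUEUE_SERIAL);

    dispatch_async(dispatch_get_main_queue(), ^{
        // Layer and view work must stay on the main queue.
        AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
        previewLayer.frame = self.cameraPreviewView.bounds;
        [self.cameraPreviewView.layer addSublayer:previewLayer];
        self.cameraPreviewLayer = previewLayer;
    });

    dispatch_async(self.sessionQueue, ^{
        // -startRunning blocks until the session is live, so keep it off the main thread.
        [session startRunning];
    });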
William Smith

Here is my code. It works perfectly for me; feel free to use it as a reference:

- (void)initCapture
{
    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:nil];
    if (!captureInput) {
        return;
    }
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    // Frames are delivered to the captureOutput:didOutputSampleBuffer:fromConnection: delegate method (see the sketch below).
    [captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
    NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];
    self.captureSession = [[AVCaptureSession alloc] init];
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;
    if ([self.captureSession canAddInput:captureInput]) {
        [self.captureSession addInput:captureInput];
    }
    if ([self.captureSession canAddOutput:captureOutput]) {
        [self.captureSession addOutput:captureOutput];
    }

    // Create the preview layer once, if it doesn't exist yet
    if (!self.captureVideoPreviewLayer) {
        self.captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    }

    // Adjust the preview layer's frame here if needed
    self.captureVideoPreviewLayer.frame = self.view.bounds;
    self.captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer: self.captureVideoPreviewLayer];
    [self.captureSession startRunning];
}
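
The delegate method referenced above isn't shown; here is a minimal sketch of it, assuming the class adopts AVCaptureVideoDataOutputSampleBufferDelegate (the logging body is just illustrative):

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Each frame arrives as a CVPixelBuffer in 32BGRA, per the videoSettings above.
        CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        if (!pixelBuffer) {
            return;
        }

        // Width and height of the frame; actual pixel access would additionally require
        // CVPixelBufferLockBaseAddress / CVPixelBufferUnlockBaseAddress around it.
        size_t width  = CVPixelBufferGetWidth(pixelBuffer);
        size_t height = CVPixelBufferGetHeight(pixelBuffer);
        NSLog(@"Received a %zux%zu frame", width, height);
    }

Because the delegate queue in initCapture is the main queue, anything expensive in this callback will stall the UI; a dedicated serial queue is the usual choice once you do real per-frame processing.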
jianpx