
Create a UIImage from a CMSampleBuffer

This is not the same as the countless questions about converting a CMSampleBuffer to a UIImage. I'm simply wondering why I can't convert it like this:

CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage * imageFromCoreImageLibrary = [CIImage imageWithCVPixelBuffer: pixelBuffer];
UIImage * imageForUI = [UIImage imageWithCIImage: imageFromCoreImageLibrary];

It seems much simpler because it would work for YCbCr color spaces, as well as RGBA and others. Is there something wrong with that code?
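For context, the usual sticking point with this shortcut: a UIImage initialized straight from a CIImage carries no CGImage backing until something renders it. A minimal sketch of the symptom (the function name is illustrative):

import UIKit
import CoreImage
import CoreMedia

// A UIImage built from a CIImage merely wraps it; nothing has been rendered yet.
func inspectConversion(_ sampleBuffer: CMSampleBuffer) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let image = UIImage(ciImage: ciImage)
    print(image.cgImage as Any) // nil: APIs that expect a CGImage will balk
}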

17
mrplants

For Swift 3 and iOS 10 AVCapturePhotoOutput. Includes:

import UIKit
import CoreData
import CoreMotion
import AVFoundation

Create a UIView for the preview and link it to the main class:

  @IBOutlet var preview: UIView!

Create this to set up the camera session (kCVPixelFormatType_32BGRA is important!!):

  lazy var cameraSession: AVCaptureSession = {
    let s = AVCaptureSession()
    s.sessionPreset = AVCaptureSessionPresetHigh
    return s
  }()

  lazy var previewLayer: AVCaptureVideoPreviewLayer = {
    let previewl:AVCaptureVideoPreviewLayer =  AVCaptureVideoPreviewLayer(session: self.cameraSession)
    previewl.frame = self.preview.bounds
    return previewl
  }()

  func setupCameraSession() {
    let captureDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo) as AVCaptureDevice

    do {
      let deviceInput = try AVCaptureDeviceInput(device: captureDevice)

      cameraSession.beginConfiguration()

      if (cameraSession.canAddInput(deviceInput) == true) {
        cameraSession.addInput(deviceInput)
      }

      let dataOutput = AVCaptureVideoDataOutput()
      dataOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString) : NSNumber(value: kCVPixelFormatType_32BGRA as UInt32)]
      dataOutput.alwaysDiscardsLateVideoFrames = true

      if (cameraSession.canAddOutput(dataOutput) == true) {
        cameraSession.addOutput(dataOutput)
      }

      cameraSession.commitConfiguration()

      let queue = DispatchQueue(label: "fr.popigny.videoQueue", attributes: [])
      dataOutput.setSampleBufferDelegate(self, queue: queue)

    }
    catch let error as NSError {
      NSLog("\(error), \(error.localizedDescription)")
    }
  }
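For setSampleBufferDelegate(self, queue: queue) to compile, the enclosing class must declare conformance to the delegate protocol. A sketch, with an illustrative class name:

class CameraViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    // the preview outlet, cameraSession, previewLayer and the methods above live here
}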

In viewWillAppear:

  override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    setupCameraSession()
  }

In viewDidAppear:

  override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    preview.layer.addSublayer(previewLayer)
    cameraSession.startRunning()
  }
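Not in the original answer, but the natural counterpart under the same lifecycle: stop the session when the view goes away:

  override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    cameraSession.stopRunning()
  }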

Create the function that captures the output:

  func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {

    // Here you collect each frame and process it
    let ts:CMTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    // mycapturedimage is assumed to be a UIImage? property on the enclosing class
    self.mycapturedimage = imageFromSampleBuffer(sampleBuffer: sampleBuffer)
  }

Here is the code that converts a kCVPixelFormatType_32BGRA CMSampleBuffer to a UIImage. The key is the bitmapInfo, which must correspond to 32BGRA: 32-bit, little-endian, with premultiplied-first alpha info:

  func imageFromSampleBuffer(sampleBuffer : CMSampleBuffer) -> UIImage
  {
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    let  imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer!, CVPixelBufferLockFlags.readOnly);


    // Get the base address of the pixel buffer
    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer!);

    // Get the number of bytes per row for the pixel buffer
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer!);
    // Get the pixel buffer width and height
    let width = CVPixelBufferGetWidth(imageBuffer!);
    let height = CVPixelBufferGetHeight(imageBuffer!);

    // Create a device-dependent RGB color space
    let colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    var bitmapInfo: UInt32 = CGBitmapInfo.byteOrder32Little.rawValue
    bitmapInfo |= CGImageAlphaInfo.premultipliedFirst.rawValue & CGBitmapInfo.alphaInfoMask.rawValue
    //let bitmapInfo: UInt32 = CGBitmapInfo.alphaInfoMask.rawValue
    let context = CGContext.init(data: baseAddress, width: width, height: height, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: bitmapInfo)
    // Create a Quartz image from the pixel data in the bitmap graphics context
    let quartzImage = context?.makeImage();
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer!, CVPixelBufferLockFlags.readOnly);

    // Create an image object from the Quartz image
    let image = UIImage.init(cgImage: quartzImage!);

    return (image);
  }
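A usage sketch: a variant of the captureOutput above that displays each frame. Frames arrive on the background videoQueue, so hop to the main queue before touching UIKit; imageView is a hypothetical outlet:

  func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    let image = imageFromSampleBuffer(sampleBuffer: sampleBuffer)
    DispatchQueue.main.async {
      self.imageView.image = image // imageView is assumed to exist on the class
    }
  }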
20
Popigny

For a JPEG image:

Swift 4:

let buff: CMSampleBuffer ...            // Assuming you have a CMSampleBuffer
if let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: buff, previewPhotoSampleBuffer: nil) {
    let image = UIImage(data: imageData) //  Here you have UIImage
}
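Note that jpegPhotoDataRepresentation(forJPEGSampleBuffer:previewPhotoSampleBuffer:) was deprecated in iOS 11; with the newer AVCapturePhotoCaptureDelegate API the sample buffer disappears entirely. A sketch of the replacement path:

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    guard let data = photo.fileDataRepresentation() else { return }
    let image = UIImage(data: data)
    // image is the finished photo, ready for display
}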
25

Use the following code to convert an image from a pixel buffer. Option 1:

CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef myImage = [context
                         createCGImage:ciImage
                         fromRect:CGRectMake(0, 0,
                                             CVPixelBufferGetWidth(pixelBuffer),
                                             CVPixelBufferGetHeight(pixelBuffer))];

UIImage *uiImage = [UIImage imageWithCGImage:myImage];
CGImageRelease(myImage); // createCGImage: returns a +1 reference, so release it

Option 2:

int w = CVPixelBufferGetWidth(pixelBuffer);
int h = CVPixelBufferGetHeight(pixelBuffer);
int r = CVPixelBufferGetBytesPerRow(pixelBuffer);
int bytesPerPixel = r/w;

// The base address is only valid while the buffer is locked
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
unsigned char *buffer = CVPixelBufferGetBaseAddress(pixelBuffer);
// Note: the per-pixel copy below assumes r == w * bytesPerPixel (no row
// padding) and that the destination context shares that layout

UIGraphicsBeginImageContext(CGSizeMake(w, h));

CGContextRef c = UIGraphicsGetCurrentContext();

unsigned char* data = CGBitmapContextGetData(c);
if (data != NULL) {
    int maxY = h;
    for(int y = 0; y<maxY; y++) {
        for(int x = 0; x<w; x++) {
            int offset = bytesPerPixel*((w*y)+x);
            data[offset] = buffer[offset];     // R
            data[offset+1] = buffer[offset+1]; // G
            data[offset+2] = buffer[offset+2]; // B
            data[offset+3] = buffer[offset+3]; // A
        }
    }
}
CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

UIImage *img = UIGraphicsGetImageFromCurrentImageContext();

UIGraphicsEndImageContext();
13

I wrote a simple extension for use with Swift 4.x/3.x to produce a UIImage from a CMSampleBuffer.

It also handles scaling and orientation, though you can just accept the default values if they work for you.

import UIKit
import AVFoundation

extension CMSampleBuffer {
    func image(orientation: UIImageOrientation = .up, 
               scale: CGFloat = 1.0) -> UIImage? {
        if let buffer = CMSampleBufferGetImageBuffer(self) {
            let ciImage = CIImage(cvPixelBuffer: buffer)

            return UIImage(ciImage: ciImage, 
                           scale: scale,
                           orientation: orientation)
        }

        return nil
    }
}
  1. If the image buffer can be fetched from the sample buffer, it continues; otherwise nil is returned
  2. The buffer is used to initialize a CIImage
  3. A UIImage initialized with the ciImage is returned, along with the scale and orientation values; if none are specified, the defaults of up and 1.0 respectively are used
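A usage sketch; the orientation value and the imageView outlet are assumptions for illustration:

// Given a CMSampleBuffer (e.g. from a capture delegate callback):
if let image = sampleBuffer.image(orientation: .right, scale: UIScreen.main.scale) {
    DispatchQueue.main.async {
        self.imageView.image = image // update UIKit on the main queue
    }
}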
11
CodeBender

This comes up a lot in connection with the iOS 10 AVCapturePhotoOutput class. Suppose the user wants to snap a photo, you call capturePhoto(with:delegate:), and your settings include a request for a preview image. That's a splendidly efficient way to get a preview image, but how are you going to display it in your interface? The preview image arrives as a CMSampleBuffer in your implementation of the delegate method:

func capture(_ output: AVCapturePhotoOutput, 
    didFinishProcessingPhotoSampleBuffer buff: CMSampleBuffer?, 
    previewPhotoSampleBuffer: CMSampleBuffer?, 
    resolvedSettings: AVCaptureResolvedPhotoSettings, 
    bracketSettings: AVCaptureBracketedStillImageSettings?, 
    error: Error?) {

You need to convert the CMSampleBuffer, previewPhotoSampleBuffer, to a UIImage. How are you going to do that? Like this:

if let prev = previewPhotoSampleBuffer {
    if let buff = CMSampleBufferGetImageBuffer(prev) {
        let cim = CIImage(cvPixelBuffer: buff)
        let im = UIImage(ciImage: cim)
        // and now you have a UIImage! do something with it ...
    }
}
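Note that a UIImage created this way merely wraps the CIImage without rendering it: it displays fine in a UIImageView, but its cgImage property is nil, so anything that needs real bitmap data (writing out a JPEG or PNG, for example) has to render through a CIContext first.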
3
matt

TO ALL: don't use methods like this:

    private let context = CIContext()

    private func imageFromSampleBuffer2(_ sampleBuffer: CMSampleBuffer) -> UIImage? {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        let ciImage = CIImage(cvPixelBuffer: imageBuffer)
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }

They eat far more CPU and take more time to convert, since every frame has to be rendered through Core Image instead of wrapping the existing BGRA bytes directly.

Use the solution from https://stackoverflow.com/a/40193359/7767664 instead.

Don't forget to set the following settings on your AVCaptureVideoDataOutput:

    videoOutput = AVCaptureVideoDataOutput()

    videoOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as String) : NSNumber(value: kCVPixelFormatType_32BGRA as UInt32)]
    //videoOutput.alwaysDiscardsLateVideoFrames = true

    videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "MyQueue"))

The conversion method:

    func imageFromSampleBuffer(_ sampleBuffer : CMSampleBuffer) -> UIImage {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer!, CVPixelBufferLockFlags.readOnly)

        // Get the base address of the pixel buffer
        let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer!)

        // Get the number of bytes per row for the pixel buffer
        let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer!)
        // Get the pixel buffer width and height
        let width = CVPixelBufferGetWidth(imageBuffer!)
        let height = CVPixelBufferGetHeight(imageBuffer!)

        // Create a device-dependent RGB color space
        let colorSpace = CGColorSpaceCreateDeviceRGB()

        // Create a bitmap graphics context with the sample buffer data
        var bitmapInfo: UInt32 = CGBitmapInfo.byteOrder32Little.rawValue
        bitmapInfo |= CGImageAlphaInfo.premultipliedFirst.rawValue & CGBitmapInfo.alphaInfoMask.rawValue
        let context = CGContext(data: baseAddress, width: width, height: height, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: bitmapInfo)
        // Create a Quartz image from the pixel data in the bitmap graphics context
        let quartzImage = context?.makeImage()
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer!, CVPixelBufferLockFlags.readOnly)

        // Create an image object from the Quartz image
        let image = UIImage(cgImage: quartzImage!)

        return image
    }
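For reference, the Swift 4 version of the delegate callback drops the implicitly unwrapped optionals; a sketch of calling the method above from the modern signature (previewImageView is a hypothetical outlet):

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        let image = imageFromSampleBuffer(sampleBuffer)
        DispatchQueue.main.async { [weak self] in
            self?.previewImageView.image = image
        }
    }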
2
user924