Using vImage_Scale with kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange

Posted: 2021-04-10 18:59:41

Question:

I'm receiving a CMSampleBuffer from the iPhone's front camera. It is currently 1920x1080, and I want to scale it down to 1280x720. I want to use the vImageScale function, but I can't get it working correctly. The camera's pixel format is kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, so I tried the following, but it outputs a weird green image that isn't correct:

private var scaleBuffer: vImage_Buffer = {
  var scaleBuffer: vImage_Buffer = vImage_Buffer()
  let newHeight = 720
  let newWidth = 1280
  scaleBuffer.data = UnsafeMutableRawPointer.allocate(byteCount: Int(newWidth * newHeight * 4), alignment: MemoryLayout<UInt>.size)
  scaleBuffer.width = vImagePixelCount(newWidth)
  scaleBuffer.height = vImagePixelCount(newHeight)
  scaleBuffer.rowBytes = Int(newWidth * 4)
  return scaleBuffer
}()

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

  guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
    return
  }

  CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))

  // create vImage_Buffer out of CVImageBuffer
  var inBuff: vImage_Buffer = vImage_Buffer()
  inBuff.width = UInt(CVPixelBufferGetWidth(imageBuffer))
  inBuff.height = UInt(CVPixelBufferGetHeight(imageBuffer))
  inBuff.rowBytes = CVPixelBufferGetBytesPerRow(imageBuffer)
  inBuff.data = CVPixelBufferGetBaseAddress(imageBuffer)

  // perform scale
  var err = vImageScale_CbCr8(&inBuff, &scaleBuffer, nil, 0)
  if err != kvImageNoError {
      print("Can't scale a buffer")
      return
  }
  CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))

  var newBuffer: CVPixelBuffer?
  let attributes : [NSObject:AnyObject] = [
    kCVPixelBufferCGImageCompatibilityKey : true as AnyObject,
    kCVPixelBufferCGBitmapContextCompatibilityKey : true as AnyObject
  ]

  let status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault,
                                                  Int(scaleBuffer.width), Int(scaleBuffer.height),
                                                  kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, scaleBuffer.data,
                                                  Int(scaleBuffer.width) * 4,
                                                  nil, nil,
                                                  attributes as CFDictionary?, &newBuffer)

  guard status == kCVReturnSuccess,
        let b = newBuffer else {
    return
  }

  // Do something with the buffer to output it
}

What's going wrong here? Looking at this answer here, it looks like I need to scale the "Y" and "UV" planes separately. How can I do that in Swift, and then combine them back into a single CVPixelBuffer?


Answer 1:

The imageBuffer returned by CMSampleBufferGetImageBuffer actually contains two discrete planes: a luminance plane and a chrominance plane (note that for 4:2:0, the chrominance plane is half the width and half the height of the luminance plane). This is discussed in this sample code project.

This can get you most of the way there. I don't have experience with Core Video's CVPixelBufferCreateWithBytes, but this code will create the scaled YpCbCr buffers for you and convert them to an interleaved ARGB buffer:

// Wrap the source luma plane (plane 0) in a vImage_Buffer
let lumaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0)
let lumaWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
let lumaHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
let lumaRowBytes = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)

var sourceLumaBuffer = vImage_Buffer(data: lumaBaseAddress,
                                     height: vImagePixelCount(lumaHeight),
                                     width: vImagePixelCount(lumaWidth),
                                     rowBytes: lumaRowBytes)

// Wrap the source chroma plane (plane 1, interleaved CbCr) in a vImage_Buffer
let chromaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1)
let chromaWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 1)
let chromaHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 1)
let chromaRowBytes = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1)

var sourceChromaBuffer = vImage_Buffer(data: chromaBaseAddress,
                                       height: vImagePixelCount(chromaHeight),
                                       width: vImagePixelCount(chromaWidth),
                                       rowBytes: chromaRowBytes)

// Allocate destination buffers at the target size. This example scales to a
// quarter of the source dimensions; size these to 1280x720 (and 640x360 for
// chroma) to match the question's target.
var destLumaBuffer = try! vImage_Buffer(size: CGSize(width: Int(sourceLumaBuffer.width / 4),
                                                     height: Int(sourceLumaBuffer.height / 4)),
                                        bitsPerPixel: 8)

var destChromaBuffer = try! vImage_Buffer(size: CGSize(width: Int(sourceChromaBuffer.width / 4),
                                                       height: Int(sourceChromaBuffer.height / 4)),
                                          bitsPerPixel: 8 * 2)

// Scale each plane independently
vImageScale_CbCr8(&sourceChromaBuffer, &destChromaBuffer, nil, 0)
vImageScale_Planar8(&sourceLumaBuffer, &destLumaBuffer, nil, 0)

var argbBuffer = try! vImage_Buffer(size: destLumaBuffer.size,
                                    bitsPerPixel: 8 * 4)

// Interleave the scaled planes into ARGB. infoYpCbCrToARGB describes the
// YpCbCr-to-RGB conversion and must be generated beforehand (see the sketch below).
vImageConvert_420Yp8_CbCr8ToARGB8888(&destLumaBuffer,
                                     &destChromaBuffer,
                                     &argbBuffer,
                                     &infoYpCbCrToARGB,
                                     nil,
                                     255,
                                     vImage_Flags(kvImagePrintDiagnosticsToConsole))

// Free the manually allocated buffers once you've consumed the ARGB result
destLumaBuffer.free()
destChromaBuffer.free()
argbBuffer.free()
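
The code above references an infoYpCbCrToARGB conversion descriptor that the snippet never defines; it has to be generated once before the convert call. Here is a minimal sketch, assuming 8-bit video-range input and a BT.601 matrix (1080p camera output is often BT.709, in which case substitute kvImage_YpCbCrToARGBMatrix_ITU_R_709_2):

import Accelerate

// One-time setup: describes how video-range 4:2:0 YpCbCr maps to ARGB8888.
var infoYpCbCrToARGB = vImage_YpCbCrToARGB()

// 8-bit video range: luma spans 16...235 and chroma 16...240, biased around 128.
var pixelRange = vImage_YpCbCrPixelRange(Yp_bias: 16,
                                         CbCr_bias: 128,
                                         YpRangeMax: 235,
                                         CbCrRangeMax: 240,
                                         YpMax: 235,
                                         YpMin: 16,
                                         CbCrMax: 240,
                                         CbCrMin: 16)

// The BT.601 matrix is an assumption here; check the sample buffer's
// attachments to pick the right matrix for your source.
vImageConvert_YpCbCrToARGB_GenerateConversion(kvImage_YpCbCrToARGBMatrix_ITU_R_601_4,
                                              &pixelRange,
                                              &infoYpCbCrToARGB,
                                              kvImage420Yp8_CbCr8,
                                              kvImageARGB8888,
                                              vImage_Flags(kvImageNoFlags))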

【讨论】:
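As for the second half of the question, combining the scaled planes back into a single CVPixelBuffer: one option is to create a fresh biplanar pixel buffer with CVPixelBufferCreate and copy each scaled plane into it row by row, since the new buffer's rows may be padded. This is only a sketch under those assumptions, and makeBiPlanarPixelBuffer is a hypothetical helper name:

import Accelerate
import CoreVideo
import Foundation

// Hypothetical helper: builds a 420YpCbCr8BiPlanarVideoRange pixel buffer from
// scaled luma and chroma vImage_Buffers like the ones produced above.
func makeBiPlanarPixelBuffer(luma: vImage_Buffer, chroma: vImage_Buffer) -> CVPixelBuffer? {
    var pixelBuffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     Int(luma.width), Int(luma.height),
                                     kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
                                     nil, &pixelBuffer)
    guard status == kCVReturnSuccess, let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    // Copy plane 0 (Yp) and plane 1 (CbCr) row by row, because the
    // destination's bytes-per-row may include padding.
    for (planeIndex, source) in [luma, chroma].enumerated() {
        guard let destination = CVPixelBufferGetBaseAddressOfPlane(buffer, planeIndex) else {
            return nil
        }
        let destinationRowBytes = CVPixelBufferGetBytesPerRowOfPlane(buffer, planeIndex)
        let bytesPerRow = min(source.rowBytes, destinationRowBytes)
        for row in 0 ..< Int(source.height) {
            memcpy(destination + row * destinationRowBytes,
                   source.data.advanced(by: row * source.rowBytes),
                   bytesPerRow)
        }
    }
    return buffer
}

Note that CVPixelBufferCreate with a nil attributes dictionary is the bare minimum; pass IOSurface or compatibility attributes if whatever consumes the output needs them.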
