UIImage from CMSampleBufferRef conversion, resulting UIImage not rendering properly


【Posted】2015-05-01 14:41:49

【Question】

I am working with AV Foundation, trying to save a particular output CMSampleBufferRef as a UIImage in some variable. I am using the Manatee Works sample code, which sets kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange for kCVPixelBufferPixelFormatTypeKey:

NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];

NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
[captureOutput setVideoSettings:videoSettings];

But when I save the image, the output is just nil, or whatever the ImageView's background happens to be. I also tried not setting the output settings at all and just using the defaults, but to no avail; the image still does not render. I also tried setting kCVPixelFormatType_32BGRA, but then Manatee stops detecting barcodes.

I am using the context setup from the sample code provided on Apple's developer website:

// Create a bitmap graphics context with the sample buffer data
CGContextRef context = CGBitmapContextCreate(NULL,
                                             CVPixelBufferGetWidth(imageBuffer),
                                             CVPixelBufferGetHeight(imageBuffer),
                                             8,
                                             0,
                                             CGColorSpaceCreateDeviceRGB(),
                                             kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
// Create a Quartz image from the pixel data in the bitmap graphics context
CGImageRef quartzImage = CGBitmapContextCreateImage(context);

Can someone help me figure out what is going wrong here? It should be simple, but I don't have much experience with the AVFoundation framework. Is this some color-space problem, since the context uses CGColorSpaceCreateDeviceRGB()?

I can provide more information if needed. I searched Stack Overflow and there are many entries on this topic, but none of them solved my problem.

【Comments】:

【Answer 1】:

Is there a reason you are passing 0 as bytesPerRow to CGBitmapContextCreate? Also, you are passing NULL as the buffer instead of the base address of the CMSampleBufferRef's pixel buffer.

Where sampleBuffer is your CMSampleBufferRef, creating the bitmap context should look roughly like this:

CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 
CVPixelBufferLockBaseAddress(imageBuffer,0);
uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

CGContextRef context = CGBitmapContextCreate(baseAddress,
                                             CVPixelBufferGetWidth(imageBuffer),
                                             CVPixelBufferGetHeight(imageBuffer),
                                             8,
                                             CVPixelBufferGetBytesPerRow(imageBuffer),
                                             colorSpace,
                                             kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
// Create a Quartz image from the pixel data in the bitmap graphics context
CGImageRef quartzImage = CGBitmapContextCreateImage(context);

CGColorSpaceRelease(colorSpace); 
CVPixelBufferUnlockBaseAddress(imageBuffer,0);
CGContextRelease(context);
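Note that this context assumes baseAddress points at RGB-compatible pixels, which holds for kCVPixelFormatType_32BGRA but not for the bi-planar YCbCr format the questioner needs for barcode detection. As a rough illustration only (plain C, not AVFoundation code; the function name and fixed-point coefficients are my own), converting a video-range BT.601 NV12 frame to BGRA by hand would look something like this, with the plane pointers and strides coming from CVPixelBufferGetBaseAddressOfPlane and CVPixelBufferGetBytesPerRowOfPlane:

```c
#include <stdint.h>
#include <stddef.h>

static uint8_t clamp8(int v) { return v < 0 ? 0 : (v > 255 ? 255 : (uint8_t)v); }

/* Convert one bi-planar video-range NV12 frame (the layout delivered by
 * kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) to BGRA using the
 * BT.601 video-range matrix, in x1024 fixed point. */
void nv12_to_bgra(const uint8_t *yPlane, size_t yStride,
                  const uint8_t *cbcrPlane, size_t cbcrStride,
                  uint8_t *bgra, size_t bgraStride,
                  size_t width, size_t height)
{
    for (size_t r = 0; r < height; r++) {
        const uint8_t *yRow = yPlane + r * yStride;
        const uint8_t *cRow = cbcrPlane + (r / 2) * cbcrStride; /* chroma plane is half-height */
        uint8_t *out = bgra + r * bgraStride;
        for (size_t c = 0; c < width; c++) {
            int y  = yRow[c] - 16;                /* video range: Y in [16, 235] */
            int cb = cRow[(c / 2) * 2]     - 128; /* interleaved CbCr, half-width */
            int cr = cRow[(c / 2) * 2 + 1] - 128;
            int rV = (1192 * y + 1634 * cr) >> 10;
            int gV = (1192 * y - 401 * cb - 832 * cr) >> 10;
            int bV = (1192 * y + 2066 * cb) >> 10;
            out[4 * c + 0] = clamp8(bV); /* B */
            out[4 * c + 1] = clamp8(gV); /* G */
            out[4 * c + 2] = clamp8(rV); /* R */
            out[4 * c + 3] = 255;        /* A */
        }
    }
}
```

In practice Core Image or Accelerate's vImage converters do this far faster; the sketch just shows why an RGB bitmap context cannot read the YCbCr planes directly.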

【Comments】:

This source code works when the video settings are initialized with "kCVPixelFormatType_32BGRA". Is there a way to perform the UIImage conversion with the video setting "kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange"? I am actually getting the raw camera output in YUV NV12 format. Now I want to convert this raw data to a UIImage and process it further. Any suggestions would be helpful. Thanks.

【Answer 2】:

Here is how I did it in the past. The code is written in Swift, but it works. Note the orientation parameter on the last line; it depends on your video settings.

extension UIImage {
    /**
    Creates a new UIImage from the video frame sample buffer passed.
    @param sampleBuffer the sample buffer to be converted into a UIImage.
    */
    convenience init?(sampleBuffer: CMSampleBufferRef) {
        // Get a CMSampleBuffer's Core Video image buffer for the media data
        let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0)

        // Get the base address of the pixel buffer
        let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)

        // Get the number of bytes per row for the pixel buffer
        let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
        // Get the pixel buffer width and height
        let width = CVPixelBufferGetWidth(imageBuffer)
        let height = CVPixelBufferGetHeight(imageBuffer)

        // Create a device-dependent RGB color space
        let colorSpace = CGColorSpaceCreateDeviceRGB()

        // Create a bitmap graphics context with the sample buffer data
        let bitmap = CGBitmapInfo(CGBitmapInfo.ByteOrder32Little.rawValue | CGImageAlphaInfo.PremultipliedFirst.rawValue)
        let context = CGBitmapContextCreate(baseAddress, width, height, 8,
            bytesPerRow, colorSpace, bitmap)
        // Create a Quartz image from the pixel data in the bitmap graphics context
        let quartzImage = CGBitmapContextCreateImage(context)
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0)

        // Create an image object from the Quartz image
        self.init(CGImage: quartzImage, scale: 1, orientation: UIImageOrientation.LeftMirrored)
    }
}

【Comments】:

Isn't this code the same as mine, apart from the orientation setting at the end?

【Answer 3】:

I use this a lot:

UIImage *image = [UIImage imageWithData:[self imageToBuffer:sampleBuffer]];

- (NSData *) imageToBuffer:(CMSampleBufferRef)source {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(source);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
    void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);

    NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    return data;
}
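One caveat worth noting about the snippet above: CVPixelBufferGetBytesPerRow is often larger than width * 4, because Core Video pads each row for alignment, so the NSData produced here includes that padding. A small plain-C sketch (the helper name is my own, not from the answer) of copying the pixels into a tightly packed buffer instead, row by row:

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Copy a possibly row-padded pixel buffer into a tightly packed one.
 * src has bytesPerRow bytes per row (>= width * bytesPerPixel);
 * dst receives exactly width * bytesPerPixel bytes per row. */
void copy_tightly(const uint8_t *src, size_t bytesPerRow,
                  uint8_t *dst, size_t width, size_t height,
                  size_t bytesPerPixel)
{
    size_t tightRow = width * bytesPerPixel;
    for (size_t r = 0; r < height; r++)
        memcpy(dst + r * tightRow, src + r * bytesPerRow, tightRow);
}
```

Also note that the resulting bytes are raw pixels, not PNG or JPEG data, so passing them straight to [UIImage imageWithData:] only works when the data is actually an encoded image.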

【Comments】:
