Common Knowledge Series: CMSampleBufferRef

Posted hbblzjy


//
//  ViewController.m
//  Demo
//
//  Created by on 2021/9/28.
//

#import "ViewController.h"

#import <CoreMedia/CoreMedia.h>

@interface ViewController ()

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view.
    
    /**
     A CMSampleBufferRef is a reference to a CMSampleBuffer, a CF object
     containing zero or more compressed (or uncompressed) samples of a
     particular media type (audio, video, muxed, etc.).
     
     Where CMSampleBufferRef typically shows up:
     
     Audio/video data captured from the camera or microphone:
     - (void)captureOutput:(AVCaptureOutput *)output
     didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
     fromConnection:(AVCaptureConnection *)connection;
     
     Screen sharing with ReplayKit, which streams audio/video samples:
     - (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer
     withType:(RPSampleBufferType)sampleBufferType;
     
     Reading samples out of a video file with AVAssetReaderOutput:
     - (nullable CMSampleBufferRef)copyNextSampleBuffer
     CF_RETURNS_RETAINED;
     
     Audio sample buffers delivered by ARKit's ARSessionObserver:
     - (void)session:(ARSession *)session
     didOutputAudioSampleBuffer:(CMSampleBufferRef)audioSampleBuffer;
     
     A recent WebRTC audio/video project involved handling and using
     exactly this kind of data.
     
     Two common conversions follow below, for reference:
     CMSampleBufferRef to UIImage;
     CMSampleBufferRef to NSData.
     
     //Get the Core Video image buffer backing a sample buffer
     //CVImageBufferRef: base type for all Core Video image buffers
     CVImageBufferRef imgBuffer = CMSampleBufferGetImageBuffer(buffer);
     
     CVPixelBufferRef: based on the image buffer type. The pixel
     buffer implements the memory storage for an image buffer.
     
     CVPixelBufferRef is a type built on top of CVImageBufferRef: a
     pixel-based image buffer that belongs to the Core Video framework.
     
     CVPixelBufferRef is a C object, not an Objective-C object, and not
     a class.
     
     CVPixelBufferRef is typedef'd from CVBufferRef, and CVBufferRef is
     essentially an opaque struct pointer, which is why it is only
     manipulated through the functions shown below.
     
     Because CVPixelBufferRef is a C object, it comes with a set of C
     functions, for example:
     CVPixelBufferRelease()
     CVPixelBufferCreate()
     CVPixelBufferLockBaseAddress()
     CVPixelBufferUnlockBaseAddress()
     CVPixelBufferGetWidth()
     CVPixelBufferGetHeight()
     and so on.
     
     As a C object it is not managed by ARC, so you must manage the
     reference count and the object's lifetime manually. Use
     CVPixelBufferRetain and CVPixelBufferRelease to increment and
     decrement the count; they are equivalent to CFRetain and CFRelease,
     so CFGetRetainCount can be used to inspect the current count.
     
     CVPixelBufferRef is the key intermediate data type that ties the
     iOS capture/processing/encoding pipeline together; understanding it
     helps you write fast, reliable video-processing code.
     
     It is also worth reading the GPUImage source code (beauty effects,
     filters, watermarks, etc.), which contains a lot of
     CMSampleBufferRef, CVImageBufferRef, and CVPixelBufferRef handling
     and is a good way to learn how they are used in practice.
     
     A recent project needed to adapt to the device orientation during
     screen sharing, i.e. to read each video frame's orientation, which
     can be done with the following call (note that CMGetAttachment
     returns NULL when the attachment is missing, so guard it in real
     code):
     CGImagePropertyOrientation orientation = (CGImagePropertyOrientation)((__bridge NSNumber *)CMGetAttachment(sampleBuffer, (__bridge CFStringRef)RPVideoSampleOrientationKey, NULL)).unsignedIntValue;
     
     The orientation enum (values match the TIFF/EXIF orientation tags):
     typedef CF_CLOSED_ENUM(uint32_t, CGImagePropertyOrientation) {
         kCGImagePropertyOrientationUp = 1,        // 0th row at top,    0th column on left   - default orientation
         kCGImagePropertyOrientationUpMirrored,    // 0th row at top,    0th column on right  - horizontal flip
         kCGImagePropertyOrientationDown,          // 0th row at bottom, 0th column on right  - 180 deg rotation
         kCGImagePropertyOrientationDownMirrored,  // 0th row at bottom, 0th column on left   - vertical flip
         kCGImagePropertyOrientationLeftMirrored,  // 0th row on left,   0th column at top
         kCGImagePropertyOrientationRight,         // 0th row on right,  0th column at top    - 90 deg CW
         kCGImagePropertyOrientationRightMirrored, // 0th row on right,  0th column at bottom
         kCGImagePropertyOrientationLeft           // 0th row on left,   0th column at bottom - 90 deg CCW
     };
     
     */
    
    
}
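
// As a sketch of the camera use case listed above: a hypothetical
// AVCaptureVideoDataOutputSampleBufferDelegate callback that feeds each
// captured frame through the conversion helper below. This assumes
// AVFoundation is imported, the view controller is set as the output's
// sample buffer delegate, and the output is configured for
// kCVPixelFormatType_32BGRA; the capture-session setup is omitted.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    UIImage *frame = [self bufferToImage:sampleBuffer];
    if (frame != nil) {
        // ... hand the frame to the UI or an encoder (hop to the main
        // queue before touching UIKit) ...
    }
}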

#pragma mark - Converting CMSampleBufferRef to UIImage
- (UIImage *)bufferToImage:(CMSampleBufferRef)buffer {
    
    //Get the Core Video image buffer backing the sample buffer
    //CVImageBufferRef: base type for all Core Video image buffers
    CVImageBufferRef imgBuffer = CMSampleBufferGetImageBuffer(buffer);
    if (imgBuffer == NULL) {
        //Not a video sample buffer (e.g. audio), or no image data
        return nil;
    }
    
    //Lock the base address before reading the pixel memory
    //First parameter: the CVPixelBufferRef
    /**
     CVPixelBufferRef: based on the image buffer type. The pixel
     buffer implements the memory storage for an image buffer.
     (CVPixelBufferRef is built on top of CVImageBufferRef.)
     */
    //Second parameter: CVPixelBufferLockFlags
    /**
     If you are not going to modify the data while you hold the
     lock, you should set this flag to avoid potentially
     invalidating any existing caches of the buffer contents.
     This flag should be passed both to the lock and unlock
     functions. Non-symmetrical usage of this flag will result in
     undefined behavior.
     We only read here, so pass kCVPixelBufferLock_ReadOnly.
     */
    CVReturn result = CVPixelBufferLockBaseAddress(imgBuffer, kCVPixelBufferLock_ReadOnly);
    if (result != kCVReturnSuccess) {
        NSLog(@"failed to lock base address, result:%d", result);
        return nil;
    }
    
    //Base address of the pixel data
    void *baseAddr = CVPixelBufferGetBaseAddress(imgBuffer);
    
    //Bytes per row (may include padding beyond width * 4)
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imgBuffer);
    
    //Width and height in pixels
    size_t width = CVPixelBufferGetWidth(imgBuffer);
    size_t height = CVPixelBufferGetHeight(imgBuffer);
    NSLog(@"width:%zu, height:%zu", width, height);
    
    //Create an RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    
    //Create a bitmap context over the pixel memory. This layout
    //(little-endian, premultiplied alpha first) matches 32BGRA, so the
    //capture output must be configured for kCVPixelFormatType_32BGRA.
    CGContextRef context = CGBitmapContextCreate(baseAddr, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    
    //Snapshot the context into a CGImage
    CGImageRef cgImg = CGBitmapContextCreateImage(context);
    
    //Unlock when done, with the same flags used to lock (they must match)
    CVPixelBufferUnlockBaseAddress(imgBuffer, kCVPixelBufferLock_ReadOnly);
    
    //Core Graphics objects are not released by ARC; release them manually
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    
    //Create the UIImage
    //UIImage *img = [UIImage imageWithCGImage:cgImg];
    UIImage *img = [UIImage imageWithCGImage:cgImg scale:1.0f orientation:UIImageOrientationUp];
    
    //Release the CGImage
    CGImageRelease(cgImg);
    
    return img;
}
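
#pragma mark - Reading the video orientation attachment (sketch)
// A sketch of the orientation lookup discussed in the notes above: the
// same CMGetAttachment call, but guarding against a missing attachment
// (CMGetAttachment returns NULL when the key is absent). Assumes
// ReplayKit is imported for RPVideoSampleOrientationKey (iOS 11+) and
// that CGImagePropertyOrientation is available via UIKit/ImageIO.
- (CGImagePropertyOrientation)orientationOfBuffer:(CMSampleBufferRef)sampleBuffer {
    CFTypeRef value = CMGetAttachment(sampleBuffer,
                                      (__bridge CFStringRef)RPVideoSampleOrientationKey,
                                      NULL);
    if (value == NULL) {
        //No orientation attachment on this frame; assume the default
        return kCGImagePropertyOrientationUp;
    }
    return (CGImagePropertyOrientation)((__bridge NSNumber *)value).unsignedIntValue;
}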

#pragma mark - Converting CMSampleBufferRef to NSData
- (NSData *)bufferToData:(CMSampleBufferRef)buffer {
    
    CVImageBufferRef imgBuffer = CMSampleBufferGetImageBuffer(buffer);
    if (imgBuffer == NULL) {
        return nil;
    }
    
    //Read-only access, so pass kCVPixelBufferLock_ReadOnly to both calls
    CVPixelBufferLockBaseAddress(imgBuffer, kCVPixelBufferLock_ReadOnly);
    
    //Note: bytesPerRow can exceed width * bytesPerPixel because of row
    //padding, so this copies each padded row as-is
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imgBuffer);
    
    size_t height = CVPixelBufferGetHeight(imgBuffer);
    
    void *baseAddr = CVPixelBufferGetBaseAddress(imgBuffer);
    
    NSData *data = [NSData dataWithBytes:baseAddr length:bytesPerRow * height];
    
    CVPixelBufferUnlockBaseAddress(imgBuffer, kCVPixelBufferLock_ReadOnly);
    
    return data;
}
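
#pragma mark - CVPixelBufferRef lifecycle (sketch)
// A minimal sketch of the manual reference counting described in the
// notes above: CVPixelBufferCreate hands back a +1 reference that ARC
// does not manage, so every create/retain must be balanced by a release.
// The 640x480 size and 32BGRA format are arbitrary example values.
- (void)pixelBufferLifecycleDemo {
    CVPixelBufferRef pixelBuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          640, 480,
                                          kCVPixelFormatType_32BGRA,
                                          NULL,
                                          &pixelBuffer);
    if (status != kCVReturnSuccess || pixelBuffer == NULL) {
        return;
    }
    
    //Lock before touching the memory; pass 0 (not read-only) because we write
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    memset(base, 0, bytesPerRow * height);  //fill every row with zeros
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    
    //Equivalent to CFRelease; balances the +1 from CVPixelBufferCreate
    CVPixelBufferRelease(pixelBuffer);
}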


@end
