Custom iOS Photo Capture with AVCaptureVideoDataOutput

Posted by 善斋书屋


  • The problem

    The boss looked at the photo capture built earlier and asked, "Where is that sound coming from?"

    "It's the system's, built in. The stock camera makes that sound too."

    "Any way to get rid of it? It's pretty grating."

    "Let me give it a try."

  • The approach

    The road ahead is long and winding; I went searching through Baidu and the SDK.

    Settled on AVCaptureVideoDataOutput: convert the CMSampleBufferRef to a UIImage in its delegate method. Grabbing a frame this way, unlike a regular photo capture, plays no shutter sound.

  • The code

    • Session setup is not covered here; a minimal sketch of the omitted pieces appears after the lazy loaders below.
    • For the preview layer setup, see the previous post [ios拍照定制之AVCapturePhotoOutput] and the one before it [iOS写在定制相机之前].
    • Get the camera, wrap it in a device input and add it to the session, then create the videoOutput and add it to the session as well.
    AVCaptureDevice *device = [self cameraDevice];
    if (!device) {
        NSLog(@"取得后置摄像头出问题");
        return;;
    }
    
    NSError *error = nil;
    self.videoInput = [[AVCaptureDeviceInput alloc] initWithDevice:device error:&error];
    if (error) {
        NSLog(@"Failed to create the device input: %@", error);
        return;
    }
    
    // Add the device input to the session
    if ([self.captureSession canAddInput:self.videoInput]) {
        [self.captureSession addInput:self.videoInput];
    }
    
    // Deliver sample buffers to the delegate on a serial queue
    [self.videoOutput setSampleBufferDelegate:self queue:self.videoQueue];
    if ([self.captureSession canAddOutput:self.videoOutput]) {
        [self.captureSession addOutput:self.videoOutput];
    }
    
    // Lazy-loaded accessors
    - (AVCaptureVideoDataOutput *)videoOutput {
        if (!_videoOutput) {
            _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
            // Drop frames that arrive while the delegate is still busy
            _videoOutput.alwaysDiscardsLateVideoFrames = YES;
            // BGRA output keeps the CGBitmapContext conversion below simple
            _videoOutput.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                                     forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        }
        return _videoOutput;
    }
    
    - (dispatch_queue_t)videoQueue {
        if (!_videoQueue) {
            _videoQueue = dispatch_queue_create("queue", DISPATCH_QUEUE_SERIAL);
        }
        return _videoQueue;
    }
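    
    • The session itself and the cameraDevice helper referenced above are not shown in the original post. A minimal sketch of what they might look like follows; the discovery-session lookup and the photo preset are my assumptions, not necessarily the original implementation.
    - (AVCaptureSession *)captureSession {
        if (!_captureSession) {
            _captureSession = [[AVCaptureSession alloc] init];
            // Assumption: a photo-quality preset; use whatever preset the app actually needs
            if ([_captureSession canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
                _captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
            }
        }
        return _captureSession;
    }
    
    - (AVCaptureDevice *)cameraDevice {
        // Assumption: look up the built-in back wide-angle camera via the discovery session API (iOS 10+)
        AVCaptureDeviceDiscoverySession *discovery =
            [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                                   mediaType:AVMediaTypeVideo
                                                                    position:AVCaptureDevicePositionBack];
        return discovery.devices.firstObject;
    }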
    
    • The AVCaptureVideoDataOutputSampleBufferDelegate callback
    - (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
        @autoreleasepool {
            if (connection == [self.videoOutput connectionWithMediaType:AVMediaTypeVideo]) { // the video connection
                @synchronized (self) {
                    UIImage *image = [self bufferToImage:sampleBuffer rect:self.scanView.scanRect];
                    self.uploadImg = image;
                }
            }
        }
    }
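    • Every frame that arrives updates self.uploadImg, so "taking a photo" is simply reading the latest frame when the shutter button is tapped, and no system shutter sound ever plays. A hypothetical button handler (takePhoto: and handleCapturedImage: are illustrative names, not from the original) might look like:
    - (void)takePhoto:(UIButton *)sender {
        UIImage *photo = nil;
        @synchronized (self) {
            // Read the frame most recently stored by the delegate callback
            photo = self.uploadImg;
        }
        if (photo) {
            // Hypothetical helper: hand the image off to the upload / business logic
            [self handleCapturedImage:photo];
        }
    }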
    • Convert the CMSampleBufferRef to a UIImage. This version has been adjusted to crop out a portion of the full frame; configure it as needed. The region-mapping math below is specific to this app, so adjust it for your own crop rect.
    - (UIImage *)bufferToImage:(CMSampleBufferRef)sampleBuffer rect:(CGRect)rect {
        // Get the CMSampleBuffer's Core Video image buffer for the media data
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        // Lock the base address of the pixel buffer
        CVPixelBufferLockBaseAddress(imageBuffer, 0);
    
        // Get the base address of the pixel buffer
        void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    
        // Get the number of bytes per row for the pixel buffer
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        // Get the pixel buffer width and height
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
    
        // Create a device-dependent RGB color space
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    
        // Create a bitmap graphics context with the sample buffer data
        CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                     bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
        // Create a Quartz image from the pixel data in the bitmap graphics context
        CGImageRef quartzImage = CGBitmapContextCreateImage(context);
        // Unlock the pixel buffer
        CVPixelBufferUnlockBaseAddress(imageBuffer,0);
    
        // Free up the context and color space
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        
        // Map the on-screen scan rect into the pixel buffer's coordinate space
        CGRect dRect;
        CGSize msize = UIScreen.mainScreen.bounds.size;
        msize.height = msize.height - 150; // app-specific offset for this layout
        CGFloat x = width * rect.origin.x / msize.width;
        CGFloat y = height * rect.origin.y / msize.height;
        CGFloat w = width * rect.size.width / msize.width;
        CGFloat h = height * rect.size.height / msize.height;
        dRect = CGRectMake(x, y, w, h);
        
        CGImageRef partRef = CGImageCreateWithImageInRect(quartzImage, dRect);
        
        // Create an image object from the Quartz image
        UIImage *image = [UIImage imageWithCGImage:partRef];
    
        // Release the Quartz image
        CGImageRelease(partRef);
        CGImageRelease(quartzImage);
        
        return image;
    }
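    • One caveat not covered in the original: the video data output delivers buffers in the sensor's native landscape orientation by default, so the cropped UIImage may come out rotated relative to the preview. If that matters, one option (a sketch, not from the original) is to request portrait buffers from the connection after adding the output:
    AVCaptureConnection *videoConnection = [self.videoOutput connectionWithMediaType:AVMediaTypeVideo];
    if (videoConnection.isVideoOrientationSupported) {
        // Ask for portrait-oriented sample buffers so the crop matches the on-screen preview
        videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }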
    
  • The image is in hand; that's a wrap. What to do with it is the business logic's job.
