How to use AVCapturePhotoOutput

Posted: 2016-10-18 14:33:39

Question:

I have been working on a custom camera, and I recently upgraded to Xcode 8 beta and Swift 3. I originally had this:

var stillImageOutput: AVCaptureStillImageOutput?

However, I am now getting the warning:

'AVCaptureStillImageOutput' was deprecated in iOS 10.0: Use AVCapturePhotoOutput instead

Since this is fairly new, I have not seen much information on it. Here is my current code:

var captureSession: AVCaptureSession?
var stillImageOutput: AVCaptureStillImageOutput?
var previewLayer: AVCaptureVideoPreviewLayer?

func clickPicture() {

    if let videoConnection = stillImageOutput?.connection(withMediaType: AVMediaTypeVideo) {

        videoConnection.videoOrientation = .portrait
        stillImageOutput?.captureStillImageAsynchronously(from: videoConnection, completionHandler: { (sampleBuffer, error) -> Void in

            if sampleBuffer != nil {

                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                let dataProvider = CGDataProvider(data: imageData!)
                let cgImageRef = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)

                let image = UIImage(cgImage: cgImageRef!, scale: 1, orientation: .right)
            }
        })
    }
}


I have tried looking at AVCapturePhotoCaptureDelegate, but I am not quite sure how to use it. Does anyone know how to use this? Thanks.

Question comments:

You need to watch the WWDC 2016 session 511 video.
Oh! OK, I'll watch the video and post an answer if I can. Thanks!
Checking out the docs may also help.

Answer 1:

Updated for Swift 4. Hi, it's really easy to use AVCapturePhotoOutput.

You need an AVCapturePhotoCaptureDelegate, which returns the CMSampleBuffer.

You can also get a preview image if you tell the AVCapturePhotoSettings the previewFormat.

    class CameraCaptureOutput: NSObject, AVCapturePhotoCaptureDelegate {

        let cameraOutput = AVCapturePhotoOutput()

        func capturePhoto() {
            let settings = AVCapturePhotoSettings()
            let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
            let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                                 kCVPixelBufferWidthKey as String: 160,
                                 kCVPixelBufferHeightKey as String: 160]
            settings.previewPhotoFormat = previewFormat
            self.cameraOutput.capturePhoto(with: settings, delegate: self)
        }

        func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
            if let error = error {
                print(error.localizedDescription)
            }

            if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
                print("image: \(UIImage(data: dataImage)?.size)") // Your Image
            }
        }
    }

For more information, please visit https://developer.apple.com/reference/AVFoundation/AVCapturePhotoOutput

Note: You must add the AVCapturePhotoOutput to the AVCaptureSession before taking the picture. So something like: session.addOutput(output), and then: output.capturePhoto(with: settings, delegate: self). Thanks @BigHeadCreations.
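To make that ordering concrete, here is a minimal sketch of the session wiring (my own illustration, not part of the original answer; configureSession is a hypothetical name and error handling is omitted). Getting this order wrong is the usual cause of the "No active and enabled video connection" error discussed in the comments below:

    let session = AVCaptureSession()
    let output = AVCapturePhotoOutput()

    func configureSession() {
        session.sessionPreset = .photo   // the photo preset, as suggested in the comments below

        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input),
              session.canAddOutput(output) else { return }

        session.addInput(input)
        session.addOutput(output)        // add the output *before* calling capturePhoto(with:delegate:)
        session.startRunning()
    }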

Comments:

Gives the error: "[AVCapturePhotoOutput capturePhotoWithSettings:delegate:] No active and enabled video connection". Could you please provide a full example for iOS 10 / Swift 3?
@TuomasLaatikainen you probably need to set the capture session preset to AVCaptureSessionPresetPhoto.
I've watched the video, browsed the entire web, rewritten the code, switched iPhones, and cannot resolve the "No active and enabled video connection" exception. The Apple docs are typically vague and missing details. Help! Is there a working project to share??
@TuomasLaatikainen did you find out what your problem was? Having the same problem.
@TuomasLaatikainen you must add the AVCapturePhotoOutput to the AVCaptureSession before taking the picture. So something like: session.addOutput(output), and then: output.capturePhoto(with: settings, delegate: self).

Answer 2:

I found this project on GitHub that helped me understand the initialization of the device and the capture session.

AVCapturePhotoOutput_test by inoue0426

Comments:

Answer 3:

Here is my full implementation:

import UIKit
import AVFoundation

class ViewController: UIViewController, AVCapturePhotoCaptureDelegate {

    var captureSession: AVCaptureSession!
    var cameraOutput: AVCapturePhotoOutput!
    var previewLayer: AVCaptureVideoPreviewLayer!

    @IBOutlet weak var capturedImage: UIImageView!
    @IBOutlet weak var previewView: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()
        captureSession = AVCaptureSession()
        captureSession.sessionPreset = AVCaptureSessionPresetPhoto
        cameraOutput = AVCapturePhotoOutput()

        let device = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo)

        if let input = try? AVCaptureDeviceInput(device: device) {
            if captureSession.canAddInput(input) {
                captureSession.addInput(input)
                if captureSession.canAddOutput(cameraOutput) {
                    captureSession.addOutput(cameraOutput)
                    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                    previewLayer.frame = previewView.bounds
                    previewView.layer.addSublayer(previewLayer)
                    captureSession.startRunning()
                }
            } else {
                print("issue here : captureSession.canAddInput")
            }
        } else {
            print("some problem here")
        }
    }

    // Take picture button
    @IBAction func didPressTakePhoto(_ sender: UIButton) {
        let settings = AVCapturePhotoSettings()
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat = [
            kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
            kCVPixelBufferWidthKey as String: 160,
            kCVPixelBufferHeightKey as String: 160
        ]
        settings.previewPhotoFormat = previewFormat
        cameraOutput.capturePhoto(with: settings, delegate: self)
    }

    // Callback from taking a picture
    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

        if let error = error {
            print("error occurred : \(error.localizedDescription)")
        }

        if  let sampleBuffer = photoSampleBuffer,
            let previewBuffer = previewPhotoSampleBuffer,
            let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
            print(UIImage(data: dataImage)?.size as Any)

            let dataProvider = CGDataProvider(data: dataImage as CFData)
            let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
            let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: UIImageOrientation.right)

            self.capturedImage.image = image
        } else {
            print("some error here")
        }
    }

    // You can use this method anywhere you need to know the camera permission state
    func askPermission() {
        print("here")
        let cameraPermissionStatus = AVCaptureDevice.authorizationStatus(forMediaType: AVMediaTypeVideo)

        switch cameraPermissionStatus {
        case .authorized:
            print("Already Authorized")
        case .denied:
            print("denied")

            let alert = UIAlertController(title: "Sorry :(", message: "But could you please grant permission for camera within device settings", preferredStyle: .alert)
            let action = UIAlertAction(title: "Ok", style: .cancel, handler: nil)
            alert.addAction(action)
            present(alert, animated: true, completion: nil)

        case .restricted:
            print("restricted")
        default:
            AVCaptureDevice.requestAccess(forMediaType: AVMediaTypeVideo, completionHandler: {
                [weak self] (granted: Bool) -> Void in

                if granted == true {
                    // User granted
                    print("User granted")
                    DispatchQueue.main.async {
                        // Do smth that you need in main thread
                    }
                } else {
                    // User Rejected
                    print("User Rejected")
                    DispatchQueue.main.async {
                        let alert = UIAlertController(title: "WHY?", message: "Camera it is the main feature of our application", preferredStyle: .alert)
                        let action = UIAlertAction(title: "Ok", style: .cancel, handler: nil)
                        alert.addAction(action)
                        self?.present(alert, animated: true, completion: nil)
                    }
                }
            })
        }
    }
}


Comments:

How did you set the flashMode?
Working on iOS 10.0.2. To turn the flash on: settings.flashMode = .on.
Why UIImageOrientation.right? That's the wrong orientation on iPad.
Works like a charm :)
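Picking up the flash question from the comments above, a minimal sketch (my own illustration, not from the thread, assuming a configured AVCapturePhotoOutput named cameraOutput as in the answers here and Swift 4-era API names): flashMode is set per capture on the AVCapturePhotoSettings, and it is worth checking the output's supported modes first:

    func capturePhotoWithFlash() {
        let settings = AVCapturePhotoSettings()
        // Only request flash if this output/device actually supports it
        if cameraOutput.supportedFlashModes.contains(.on) {
            settings.flashMode = .on
        }
        cameraOutput.capturePhoto(with: settings, delegate: self)
    }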

Answer 4:

The capture delegate function has been changed to photoOutput. Here is the updated function for Swift 4.

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
    if let error = error {
        print(error.localizedDescription)
    }

    if let sampleBuffer = photoSampleBuffer, let previewBuffer = previewPhotoSampleBuffer, let dataImage = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: sampleBuffer, previewPhotoSampleBuffer: previewBuffer) {
        print("image: \(String(describing: UIImage(data: dataImage)?.size))") // Your Image
    }
}

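One thing none of the answers show is how to use the small preview buffer that the 160x160 previewPhotoFormat requests. As a hedged sketch (my own addition, assuming the Swift 4-era CoreMedia/CoreImage APIs; previewImage is a hypothetical helper name), you can turn previewPhotoSampleBuffer into a UIImage for a thumbnail:

    func previewImage(from previewPhotoSampleBuffer: CMSampleBuffer?) -> UIImage? {
        guard let buffer = previewPhotoSampleBuffer,
              let pixelBuffer = CMSampleBufferGetImageBuffer(buffer) else { return nil }
        // Wrap the pixel buffer in a CIImage and render it out as a CGImage/UIImage
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let context = CIContext()
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }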
Comments:

Answer 5:

In iOS 11, "photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?)" is deprecated.

Use the following method instead:

func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    let imageData = photo.fileDataRepresentation()
    if let data = imageData, let img = UIImage(data: data) {
        print(img)
    }
}

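As a follow-up (my own sketch, not part of the answer): once you have the fileDataRepresentation() data, a common next step is saving it to the photo library via the Photos framework, assuming the app has photo library add permission (NSPhotoLibraryAddUsageDescription in Info.plist); the save(photoData:) helper name is hypothetical:

    import Photos

    func save(photoData: Data) {
        PHPhotoLibrary.shared().performChanges({
            // Create a new asset straight from the JPEG/HEIF data
            let request = PHAssetCreationRequest.forAsset()
            request.addResource(with: .photo, data: photoData, options: nil)
        }, completionHandler: { success, error in
            if let error = error {
                print("Could not save photo: \(error.localizedDescription)")
            }
        })
    }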
Comments:

Answer 6:

I took @Aleksey Timoshchenko's excellent answer and updated it to Swift 4.x.

Note that for my use case I allow the user to take multiple photos, which is why I save them in an images array.

Note that you need to hook up the takePhoto @IBAction via your storyboard or in code. In my case, I use a storyboard.

As of iOS 11, the AVCapturePhotoOutput.jpegPhotoDataRepresentation used in @Aleksey Timoshchenko's answer is deprecated.

Swift 4.x:

import UIKit
import AVFoundation

class CameraVC: UIViewController {

    @IBOutlet weak var cameraView: UIView!

    var images = [UIImage]()

    var captureSession: AVCaptureSession!
    var cameraOutput: AVCapturePhotoOutput!
    var previewLayer: AVCaptureVideoPreviewLayer!

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        startCamera()
    }

    func startCamera() {
        captureSession = AVCaptureSession()
        captureSession.sessionPreset = AVCaptureSession.Preset.photo
        cameraOutput = AVCapturePhotoOutput()

        if let device = AVCaptureDevice.default(for: .video),
           let input = try? AVCaptureDeviceInput(device: device) {
            if captureSession.canAddInput(input) {
                captureSession.addInput(input)
                if captureSession.canAddOutput(cameraOutput) {
                    captureSession.addOutput(cameraOutput)
                    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                    previewLayer.frame = cameraView.bounds
                    cameraView.layer.addSublayer(previewLayer)
                    captureSession.startRunning()
                }
            } else {
                print("issue here : captureSession.canAddInput")
            }
        } else {
            print("some problem here")
        }
    }

    @IBAction func takePhoto(_ sender: UITapGestureRecognizer) {
        let settings = AVCapturePhotoSettings()
        let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
        let previewFormat = [
            kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
            kCVPixelBufferWidthKey as String: 160,
            kCVPixelBufferHeightKey as String: 160
        ]
        settings.previewPhotoFormat = previewFormat
        cameraOutput.capturePhoto(with: settings, delegate: self)
    }
}

extension CameraVC: AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {

        if let error = error {
            print("error occurred : \(error.localizedDescription)")
        }

        if let dataImage = photo.fileDataRepresentation() {
            print(UIImage(data: dataImage)?.size as Any)

            let dataProvider = CGDataProvider(data: dataImage as CFData)
            let cgImageRef: CGImage! = CGImage(jpegDataProviderSource: dataProvider!, decode: nil, shouldInterpolate: true, intent: .defaultIntent)
            let image = UIImage(cgImage: cgImageRef, scale: 1.0, orientation: UIImage.Orientation.right)

            /**
               save image in array / do whatever you want to do with the image here
            */
            self.images.append(image)

        } else {
            print("some error here")
        }
    }
}

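A possible refinement, prompted by the earlier comment about UIImageOrientation.right being wrong on iPad (my own sketch, not from the answer; updatePhotoOrientation is a hypothetical helper you would call at the top of takePhoto): instead of hard-coding the orientation, set the photo output's video connection orientation from the current device orientation before capturing:

    func updatePhotoOrientation() {
        guard let connection = cameraOutput.connection(with: .video) else { return }
        // Map the current device orientation onto the capture connection
        switch UIDevice.current.orientation {
        case .landscapeLeft:       connection.videoOrientation = .landscapeRight
        case .landscapeRight:      connection.videoOrientation = .landscapeLeft
        case .portraitUpsideDown:  connection.videoOrientation = .portraitUpsideDown
        default:                   connection.videoOrientation = .portrait
        }
    }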
Comments:

This is the best answer. It focuses on the core aspects to make it work!!!
Great answer. But note that fileDataRepresentation() requires iOS 11.
