What is the best way to record a video with augmented reality?
What is the best way to record a video with augmented reality? (Adding text and image logos to frames coming from the iPhone/iPad camera.)
Previously I tried to figure out how to draw text into a CIImage (How to draw text into CIImage?) and how to convert a CIImage back to a CMSampleBuffer (CIImage back to CMSampleBuffer).
I got almost everything working; I only ran into problems recording the video from the new CMSampleBuffer through an AVAssetWriterInput.
But this solution isn't good overall anyway: it eats a lot of CPU while converting the CIImage to a CVPixelBuffer (ciContext.render(ciImage!, to: aBuffer)).
So I'd like to stop here and find some other way to record video with augmented reality (for example, dynamically adding (drawing) text into the frames while encoding the video into an mp4 file).
Here is what I tried and no longer want to use...
// convert original CMSampleBuffer to CIImage,
// combine multiple `CIImage`s into one (adding augmented reality -
// text or some additional images)
let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
let ciimage : CIImage = CIImage(cvPixelBuffer: pixelBuffer)
var outputImage: CIImage?
let images : Array<CIImage> = [ciimage, ciimageSec!] // add all your CIImages that you'd like to combine
for image in images {
    outputImage = outputImage == nil ? image : image.composited(over: outputImage!)
}
// allocate this class variable once
if pixelBufferNew == nil {
    CVPixelBufferCreate(kCFAllocatorSystemDefault, CVPixelBufferGetWidth(pixelBuffer), CVPixelBufferGetHeight(pixelBuffer), kCVPixelFormatType_32BGRA, nil, &pixelBufferNew)
}
// convert CIImage to CVPixelBuffer
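// note: a CIContext is expensive to create; if this code runs once per frame,
// re-creating the context here also contributes to the CPU cost (creating one
// context and reusing it is generally recommended)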
let ciContext = CIContext(options: nil)
if let aBuffer = pixelBufferNew {
    ciContext.render(outputImage!, to: aBuffer) // >>> IT EATS A LOT OF <<< CPU
}
// convert new CVPixelBuffer to new CMSampleBuffer
var sampleTime = CMSampleTimingInfo()
sampleTime.duration = CMSampleBufferGetDuration(sampleBuffer)
sampleTime.presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
sampleTime.decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer)
var videoInfo: CMVideoFormatDescription? = nil
CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBufferNew!, &videoInfo)
var oBuf: CMSampleBuffer?
CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBufferNew!, true, nil, nil, videoInfo!, &sampleTime, &oBuf)
/*
try to append the new CMSampleBuffer to a file (.mp4) using
AVAssetWriter & AVAssetWriterInput... (I ran into errors here; the original buffer
coming from func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) works fine)
*/
Is there a better solution?
Answer
Now I will answer my own question.
The best way is to use an Objective-C++ class (.mm), where we can use OpenCV to easily and quickly convert a CMSampleBuffer to a cv::Mat and back to a CMSampleBuffer after processing.
Objective-C++ functions can be called from Swift with no trouble, as sketched below.
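The answer stops at that description, so the following is only a minimal sketch of the idea; the class name FrameProcessor, the method drawText:onSampleBuffer:, and the use of cv::putText for the overlay are illustrative assumptions, not the author's actual implementation.

// FrameProcessor.h - plain Objective-C interface, visible to Swift via the bridging header
#import <Foundation/Foundation.h>
#import <CoreMedia/CoreMedia.h>

@interface FrameProcessor : NSObject
// draws the given text over the camera frame, modifying its pixels in place
- (void)drawText:(NSString *)text onSampleBuffer:(CMSampleBufferRef)sampleBuffer;
@end

// FrameProcessor.mm - Objective-C++ implementation, so OpenCV (C++) can be used directly
#import "FrameProcessor.h"
#import <opencv2/opencv.hpp>

@implementation FrameProcessor

- (void)drawText:(NSString *)text onSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) { return; }

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // wrap the BGRA pixel data in a cv::Mat header - no copy is made,
    // the Mat points directly at the camera frame's memory
    void *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    cv::Mat frame((int)CVPixelBufferGetHeight(pixelBuffer),
                  (int)CVPixelBufferGetWidth(pixelBuffer),
                  CV_8UC4,
                  base,
                  CVPixelBufferGetBytesPerRow(pixelBuffer));

    // draw the augmented-reality overlay straight into the frame
    cv::putText(frame,
                std::string([text UTF8String]),
                cv::Point(40, 80),              // position in pixels
                cv::FONT_HERSHEY_SIMPLEX,
                2.0,                            // font scale
                cv::Scalar(255, 255, 255, 255), // white, opaque
                3);                             // thickness

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}

@end

With FrameProcessor.h exposed through the project's bridging header, the Swift captureOutput(_:didOutput:from:) callback can hand each CMSampleBuffer to this class; since the pixels are modified in place, the original buffer (with its original timing information) can then be appended to the AVAssetWriterInput without building a new CMSampleBuffer.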