Can AVCaptureVideoDataOutput and AVCaptureMovieFileOutput be used at the same time?
【Posted】: 2011-06-24 01:33:51
【Question】: I want to record video and grab frames at the same time with my code. I am using AVCaptureVideoDataOutput to grab frames and AVCaptureMovieFileOutput to record the video. Each works fine on its own, but when used at the same time they fail with error code -12780. I have searched for this problem but found no answer. Has anyone had the same experience, or can anyone explain it? It has been bothering me for a while.
Thanks.
【Comments】:
"Using an AVCaptureMovieFileOutput you can capture video directly to a file. However, this class has no displayable data and cannot be used simultaneously with an AVCaptureVideoDataOutput." Found here: link .. just to clarify the actual cause of the problem.
【Answer 1】: Here is a Swift version of Tommy's answer.

// Set up the capture session
// Add the inputs
// Add the outputs

let outputSettings: [String: Any] = [
    AVVideoWidthKey: 640,
    AVVideoHeightKey: 480,
    AVVideoCodecKey: AVVideoCodecType.h264
]
let assetWriterInput = AVAssetWriterInput(mediaType: .video, outputSettings: outputSettings)
let pixelBufferAdaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: assetWriterInput,
    sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
let assetWriter = try! AVAssetWriter(url: URLFromSomwhere, fileType: .mp4) // check the error properly in real code
assetWriter.add(assetWriterInput)
assetWriterInput.expectsMediaDataInRealTime = true

assetWriter.startWriting()
assetWriter.startSession(atSourceTime: .zero)
captureSession.startRunning()

// Must be an instance property: a local variable would reset to 0 on every
// callback and stamp every frame with the same time.
var frameNumber: Int64 = 0

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    // a very dense way to keep track of the time at which this frame
    // occurs relative to the output stream, but it's just an example!
    if assetWriterInput.isReadyForMoreMediaData {
        pixelBufferAdaptor.append(imageBuffer,
                                  withPresentationTime: CMTimeMake(value: frameNumber, timescale: 25))
        frameNumber += 1
    }
}

// ... and, to stop, ensuring the output file is finished properly ...
captureSession.stopRunning()
assetWriter.finishWriting {} // pass a real completion handler in real code
But I can't promise it's 100% accurate, since I'm new to Swift.
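One hedged refinement of the timestamping above: instead of a hand-maintained frame counter at an assumed 25 fps, each sample buffer's own presentation timestamp can be used, which keeps the movie's timing true to capture even when frames are dropped. The didStartSession flag here is an illustrative instance property, not part of the original answer:

// Sketch: stamp frames with the capture clock instead of a frame counter.
// Assumes assetWriter, assetWriterInput, pixelBufferAdaptor and a Bool
// property didStartSession (initially false) live on the delegate object,
// and that startSession(atSourceTime:) has NOT already been called with .zero.
func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    if !didStartSession {
        assetWriter.startSession(atSourceTime: pts) // timeline starts at the first frame
        didStartSession = true
    }
    if assetWriterInput.isReadyForMoreMediaData {
        pixelBufferAdaptor.append(imageBuffer, withPresentationTime: pts)
    }
}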
【Comments】:
Awesome, this works, thanks. But it needs to be updated to current syntax.
【Answer 2】: I can't answer the specific question posed, but I have successfully recorded video and grabbed frames at the same time using:
- AVCaptureSession and AVCaptureVideoDataOutput to route frames into my own code
- AVAssetWriter, AVAssetWriterInput and AVAssetWriterInputPixelBufferAdaptor to write the frames out to an H.264-encoded movie file
That's without having investigated audio. I end up getting CMSampleBuffers out of the capture session and then pushing them into the pixel buffer adaptor.
EDIT: so my code looks more or less like the following, and you should have no trouble skimming over it and ignoring the scoping issues:
/* to ensure I'm given incoming CMSampleBuffers */
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
captureSession.sessionPreset = AVCaptureSessionPreset640x480; // your preferred preset
AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:nil];
[captureSession addInput:deviceInput];
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                   forKey:(id)kCVPixelBufferPixelFormatTypeKey];
[output setSampleBufferDelegate:self queue:dispatch_queue_create("videoCaptureQueue", NULL)];
[captureSession addOutput:output];
/* to prepare for output; I'll output 640x480 in H.264, via an asset writer */
NSDictionary *outputSettings =
[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:640], AVVideoWidthKey,
[NSNumber numberWithInt:480], AVVideoHeightKey,
AVVideoCodecH264, AVVideoCodecKey,
nil];
AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput
assetWriterInputWithMediaType:AVMediaTypeVideo
outputSettings:outputSettings];
/* I'm going to push pixel buffers to it, so will need an
AVAssetWriterInputPixelBufferAdaptor, to expect the same 32BGRA input as I've
asked the AVCaptureVideoDataOutput to supply */
AVAssetWriterInputPixelBufferAdaptor *pixelBufferAdaptor =
[[AVAssetWriterInputPixelBufferAdaptor alloc]
initWithAssetWriterInput:assetWriterInput
sourcePixelBufferAttributes:
[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
kCVPixelBufferPixelFormatTypeKey,
nil]];
/* that's going to go somewhere, I imagine you've got the URL for that sorted,
so create a suitable asset writer; we'll put our H.264 within the normal
MPEG4 container */
NSError *writerError = nil;
AVAssetWriter *assetWriter = [[AVAssetWriter alloc]
    initWithURL:URLFromSomwhere
    fileType:AVFileTypeMPEG4
    error:&writerError];
// you need to check writerError here; this example is too lazy to
[assetWriter addInput:assetWriterInput];
/* we need to warn the input to expect real time data incoming, so that it tries
to avoid being unavailable at inopportune moments */
assetWriterInput.expectsMediaDataInRealTime = YES;
... eventually ...
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
[captureSession startRunning];
... elsewhere ...
- (void)captureOutput:(AVCaptureOutput *)captureOutput
  didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
         fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // a very dense way to keep track of the time at which this frame
    // occurs relative to the output stream, but it's just an example!
    static int64_t frameNumber = 0;

    if (assetWriterInput.readyForMoreMediaData)
    {
        [pixelBufferAdaptor appendPixelBuffer:imageBuffer
                         withPresentationTime:CMTimeMake(frameNumber, 25)];
        frameNumber++;
    }
}
... and, to stop, ensuring the output file is finished properly ...
[captureSession stopRunning];
[assetWriter finishWriting];
【Comments】:
Would you mind posting some sample code showing how to do this? Your real-life karma would increase tenfold!!! :D
Oh, there's karma? Then I've added some very basic sample code!
Thanks for the code, I got it working with images. How about adding sound to the video, any clue?
To set the orientation of the recorded output media, see: developer.apple.com/library/ios/#qa/qa1744/_index.html#//…
Were you able to integrate sound?? @ImranRaheem
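On the sound question above, a hedged Swift sketch of one way audio might be wired into the same asset writer; the AAC settings and queue label are illustrative, and the variable names mirror the answers:

import AVFoundation

// Add an audio input to the same AVAssetWriter used for video.
let audioSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,  // illustrative codec choice
    AVNumberOfChannelsKey: 1,
    AVSampleRateKey: 44_100
]
let audioWriterInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioSettings)
audioWriterInput.expectsMediaDataInRealTime = true
assetWriter.add(audioWriterInput)

// Feed it from an AVCaptureAudioDataOutput on the same capture session.
if let mic = AVCaptureDevice.default(for: .audio),
   let micInput = try? AVCaptureDeviceInput(device: mic),
   captureSession.canAddInput(micInput) {
    captureSession.addInput(micInput)
}
let audioOutput = AVCaptureAudioDataOutput()
audioOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "audio.queue"))
if captureSession.canAddOutput(audioOutput) { captureSession.addOutput(audioOutput) }

// Then, in the audio delegate callback, append the buffers directly:
// if output is AVCaptureAudioDataOutput, audioWriterInput.isReadyForMoreMediaData {
//     audioWriterInput.append(sampleBuffer)
// }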