Recording video with iOS broadcasting extension

Posted: 2017-08-09 07:13:43

Question: I'm trying to adapt the iOS broadcast-extension feature to record video instead of live-streaming it.
This seems possible, since the pixel buffers are delivered to the processSampleBuffer:withType: method.
I then wrote the code below, but it fails when appending the buffers.
I'm new to both AVAssetWriter and App Extension programming, so I can't figure out what is going wrong here.
Am I doing something that isn't allowed in an extension, or is my usage of AVAssetWriter wrong?
Any ideas are appreciated. Thanks!
//
//  SampleHandler.m
//  The main class of the App Extension
//
#import "SampleHandler.h"
#import "VideoExporter.h"
#import <AVFoundation/AVFoundation.h>

@implementation SampleHandler {
    VideoExporter *exporter;
    NSDate *startDate;
}

- (void)broadcastStartedWithSetupInfo:(NSDictionary<NSString *,NSObject *> *)setupInfo {
    // User has requested to start the broadcast. Setup info from the UI extension can be supplied but is optional.
    [self setupWriter];
}

- (void)setupWriter {
    NSFileManager *fm = [NSFileManager defaultManager];

    // Set the video path inside the shared container
    NSURL *containerURL = [fm containerURLForSecurityApplicationGroupIdentifier:@"group.com.mycompany"];
    NSURL *libraryURL = [containerURL URLByAppendingPathComponent:@"Library" isDirectory:YES];
    NSURL *cachesURL = [libraryURL URLByAppendingPathComponent:@"Caches" isDirectory:YES];
    NSURL *outVideoURL = [cachesURL URLByAppendingPathComponent:@"output.mov"];

    if ([fm fileExistsAtPath:[outVideoURL path]]) {
        [fm removeItemAtPath:[outVideoURL path] error:nil];
    }

    exporter = [[VideoExporter alloc] initWithOutputURL:outVideoURL size:CGSizeMake(1280, 720) frameRate:30];
    exporter.delegate = self;
    [exporter beginExport];
    startDate = [NSDate date];
}

- (void)broadcastPaused {
    // User has requested to pause the broadcast. Samples will stop being delivered.
}

- (void)broadcastResumed {
    // User has requested to resume the broadcast. Sample delivery will resume.
}

- (void)broadcastFinished {
    // User has requested to finish the broadcast.
    NSLog(@"User requested finish writing");
    [exporter finishWriting];
}

- (void)processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType {
    switch (sampleBufferType) {
        case RPSampleBufferTypeVideo:
            // Handle video sample buffer
            [exporter addCMSampleBuffer:sampleBuffer];
            break;
        case RPSampleBufferTypeAudioApp:
            // Handle audio sample buffer for app audio
            break;
        case RPSampleBufferTypeAudioMic:
            // Handle audio sample buffer for mic audio
            break;
        default:
            break;
    }
}

@end
//
//  VideoExporter.m
//  Helper class to write the video
//
#import "VideoExporter.h"

@implementation VideoExporter {
    // Declared here for completeness; in the original project these may
    // live in VideoExporter.h, which is not shown.
    AVAssetWriter *writer;
    AVAssetWriterInput *writerInput;
    AVAssetWriterInputPixelBufferAdaptor *adaptor;
    int64_t frameCount;
}

@synthesize width, height;
@synthesize framesPerSecond;
@synthesize outputURL;
@synthesize delegate;

- (id)initWithOutputURL:(NSURL *)aURL size:(CGSize)size frameRate:(uint64_t)fps {
    if ((self = [super init])) {
        width = (int)round(size.width);
        height = (int)round(size.height);
        framesPerSecond = fps;
        outputURL = aURL;
    }
    return self;
}

- (void)beginExport {
    NSError *error = nil;
    writer = [[AVAssetWriter alloc] initWithURL:outputURL
                                       fileType:AVFileTypeQuickTimeMovie
                                          error:&error];
    NSAssert(writer != nil, @"Writer should not be nil");

    NSDictionary *outSettings = @{ AVVideoCodecKey  : AVVideoCodecTypeH264,
                                   AVVideoWidthKey  : @(width),
                                   AVVideoHeightKey : @(height) };
    writerInput = [[AVAssetWriterInput alloc] initWithMediaType:AVMediaTypeVideo outputSettings:outSettings];

    NSDictionary *pixelAttributes = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey :
                                           @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) };
    adaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput:writerInput
                                                         sourcePixelBufferAttributes:pixelAttributes];
    [writer addInput:writerInput];

    BOOL started = [writer startWriting];
    NSAssert(started, @"Should start writing!");
    [writer startSessionAtSourceTime:kCMTimeZero];
}

- (void)addCMSampleBuffer:(CMSampleBufferRef)buf {
    if (writer.status == AVAssetWriterStatusUnknown) {
        NSLog(@"Writer status unknown!!");
    }
    [self appendCMSampleBuffer:buf];
}

- (void)finishWriting {
    [writerInput markAsFinished];

    dispatch_semaphore_t semaphore = dispatch_semaphore_create(0);
    int64_t longDuration = 1000;
    CMTime cmTime = CMTimeMake(longDuration, 1);
    [writer endSessionAtSourceTime:cmTime];
    [writer finishWritingWithCompletionHandler:^{
        // Call delegate method here
        dispatch_semaphore_signal(semaphore);
    }];
    dispatch_semaphore_wait(semaphore, DISPATCH_TIME_FOREVER);
}

#pragma mark - Private -

- (void)appendCMSampleBuffer:(CMSampleBufferRef)bufferRef {
    if (![writerInput isReadyForMoreMediaData]) {
        NSLog(@"WriterInput not ready! status = %ld, error=%@", (long)writer.status, writer.error);
        return;
    }
    // Extract the pixel buffer; a CMSampleBufferRef cannot simply be cast
    // to CVPixelBufferRef as the original code did.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(bufferRef);
    BOOL success = [adaptor appendPixelBuffer:pixelBuffer
                         withPresentationTime:CMTimeMake(frameCount++, (int32_t)framesPerSecond)];
    if (success == NO) {
        NSLog(@"Append buffer failed! status = %ld, error=%@", (long)writer.status, writer.error); // Always gets here
    }
}

@end
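One more thing worth noting about the timing logic above: ReplayKit sample buffers arrive with their own presentation timestamps, and stamping frames with a synthetic frameCount/fps clock can cause drift or out-of-order appends even when the writer itself is healthy. A minimal sketch of timestamp handling that derives times from the buffers themselves (it assumes the same writer/writerInput/adaptor setup as -beginExport above; sessionStarted is a hypothetical BOOL ivar added for this example):

```objective-c
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// Sketch: append using each buffer's own PTS instead of a frame counter.
- (void)appendUsingBufferPTS:(CMSampleBufferRef)buf {
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(buf);

    // Start the session at the first buffer's PTS rather than kCMTimeZero,
    // so the movie's timeline matches the capture timeline.
    if (!sessionStarted) {
        [writer startSessionAtSourceTime:pts];
        sessionStarted = YES;
    }

    if (![writerInput isReadyForMoreMediaData]) {
        return; // drop the frame rather than block ReplayKit's callback
    }

    // A CMSampleBufferRef is not a CVPixelBufferRef; extract the image buffer.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buf);
    if (pixelBuffer) {
        [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:pts];
    }
}
```

With this approach, -finishWriting should also end the session at the last appended PTS instead of a hard-coded CMTimeMake(1000, 1).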
Comments:

I tested this on iOS 12, and it crashes inside [writer startWriting]. I don't think encoding video is allowed inside an extension.
Answer 1:

I tested this behavior on iOS 13.7 (17H35).
It seems that AVAssetWriter requires a foreground entitlement, which app extensions do not have. (source)
The startWriting method returns false, and assetWriter.error equals:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed"
UserInfo={
    NSLocalizedFailureReason=An unknown error occurred (-17508),
    NSLocalizedDescription=The operation could not be completed,
    NSUnderlyingError=0x282a80120 {Error Domain=NSOSStatusErrorDomain Code=-17508 "(null)"}
}
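Given that diagnosis, the extension can at least fail gracefully on systems where the writer cannot start: check the result of startWriting and end the broadcast with the writer's error instead of appending frames into a failed writer. A hedged sketch (beginExportOrFail is a hypothetical variant of -beginExport above; finishBroadcastWithError: is RPBroadcastSampleHandler's documented way to stop a broadcast from inside the extension):

```objective-c
// Sketch: surface the AVAssetWriter startup failure to ReplayKit
// instead of silently appending into a failed writer.
- (void)beginExportOrFail {
    NSError *error = nil;
    AVAssetWriter *w = [[AVAssetWriter alloc] initWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];
    // ...create and add writerInput / adaptor here, as in -beginExport...

    if (w == nil || ![w startWriting]) {
        NSError *reason = w.error ?: error;
        NSLog(@"AVAssetWriter unavailable in this extension: %@", reason);
        // In the SampleHandler, stop the broadcast and report why:
        // [self finishBroadcastWithError:reason];
        return;
    }
    writer = w;
    [writer startSessionAtSourceTime:kCMTimeZero];
}
```

On systems where AVAssetWriter genuinely cannot run in the extension, the usual alternative is to hand the sample buffers to the host app (for example via the App Group container already set up in -setupWriter) and do the encoding there.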
Answer 2:

On iOS 14, using AVAssetWriter from a BroadcastUpload extension works fine: Example project with Package