Composing multiple videos causes hang
Posted: 2015-08-05 18:20:13

I'm working on an app that stitches together multiple video clips recorded by the user. The clips are recorded with the camera and overlaid with another video, and the recorded clips are then combined into one long clip. The length of each clip is determined by the overlaid video file.
I'm using AVAssetExportSession with exportAsynchronouslyWithCompletionHandler. The strange thing is that this works for some clips and not for others. The real problem is that the exporter doesn't report any error or failure: it just shows zero progress and never calls the completion handler.

I don't even know where to start looking for the problem. Here is the function I use to stitch the clips together:
- (void) setupAndStitchVideos:(NSMutableArray*)videoData
{
    // Filepath to where the final generated video is stored
    NSURL * exportUrl = nil;

    // Contains information about a single asset/track
    NSDictionary * assetOptions = nil;
    AVURLAsset * currVideoAsset = nil;
    AVURLAsset * currAudioAsset = nil;
    AVAssetTrack * currVideoTrack = nil;
    AVAssetTrack * currAudioTrack = nil;

    // Contains all tracks and time ranges used to build the final composition
    NSMutableArray * allVideoTracks = nil;
    NSMutableArray * allVideoRanges = nil;
    NSMutableArray * allAudioTracks = nil;
    NSMutableArray * allAudioRanges = nil;
    AVMutableCompositionTrack * videoTracks = nil;
    AVMutableCompositionTrack * audioTracks = nil;

    // Misc time values used when calculating a clip's start time and total length
    float animationLength = 0.0f;
    float clipLength = 0.0f;
    float startTime = 0.0f;
    CMTime clipStart = kCMTimeZero;
    CMTime clipDuration = kCMTimeZero;
    CMTimeRange currRange = kCMTimeRangeZero;

    // The final composition to be generated and exported
    AVMutableComposition * finalComposition = nil;

    // Cancel any already active exports
    if (m_activeExport)
    {
        [m_activeExport cancelExport];
        m_activeExport = nil;
    }

    // Initialize and setup all composition related member variables
    allVideoTracks = [[NSMutableArray alloc] init];
    allAudioTracks = [[NSMutableArray alloc] init];
    allVideoRanges = [[NSMutableArray alloc] init];
    allAudioRanges = [[NSMutableArray alloc] init];
    exportUrl = [NSURL fileURLWithPath:[MobveoAnimation getMergeDestination]];
    finalComposition = [AVMutableComposition composition];
    videoTracks = [finalComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    audioTracks = [finalComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    assetOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    animationLength = m_animation.videoDuration;

    // Define all of the audio and video tracks that will be used in the composition
    for (NSDictionary * currData in videoData)
    {
        currVideoAsset = [AVURLAsset URLAssetWithURL:[currData objectForKey:KEY_STITCH_VIDEO_URL] options:assetOptions];
        currAudioAsset = [AVURLAsset URLAssetWithURL:[currData objectForKey:KEY_STITCH_AUDIO_URL] options:assetOptions];
        currVideoTrack = [[currVideoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

        NSArray *audioTracks = [currAudioAsset tracksWithMediaType:AVMediaTypeAudio];
        if ( audioTracks != nil && audioTracks.count > 0 )
        {
            currAudioTrack = audioTracks[0];
        }
        else
        {
            currAudioTrack = nil;
        }

        clipLength = animationLength * [(NSNumber*)[currData objectForKey:KEY_STITCH_LENGTH_PERCENTAGE] floatValue];
        clipStart = CMTimeMakeWithSeconds(startTime, currVideoAsset.duration.timescale);
        clipDuration = CMTimeMakeWithSeconds(clipLength, currVideoAsset.duration.timescale);

        NSLog(@"Clip length: %.2f", clipLength);
        NSLog(@"Clip Start: %lld", clipStart.value);
        NSLog(@"Clip duration: %lld", clipDuration.value);

        currRange = CMTimeRangeMake(clipStart, clipDuration);

        [allVideoTracks addObject:currVideoTrack];
        if ( currAudioTrack != nil )
        {
            [allAudioTracks addObject:currAudioTrack];
            [allAudioRanges addObject:[NSValue valueWithCMTimeRange:currRange]];
        }
        [allVideoRanges addObject:[NSValue valueWithCMTimeRange:currRange]];

        startTime += clipLength;
    }

    [videoTracks insertTimeRanges:allVideoRanges ofTracks:allVideoTracks atTime:kCMTimeZero error:nil];
    if ( allAudioTracks.count > 0 )
    {
        [audioTracks insertTimeRanges:allAudioRanges ofTracks:allAudioTracks atTime:kCMTimeZero error:nil];
    }
    for ( int i = 0; i < allVideoTracks.count - allAudioTracks.count; ++i )
    {
        CMTimeRange curRange = [allVideoRanges[i] CMTimeRangeValue];
        [audioTracks insertEmptyTimeRange:curRange];
    }

    // Delete any previous exported video files that may already exist
    [[NSFileManager defaultManager] removeItemAtURL:exportUrl error:nil];

    // Begin the composition generation and export process!
    m_activeExport = [[AVAssetExportSession alloc] initWithAsset:finalComposition presetName:AVAssetExportPreset1280x720];
    [m_activeExport setOutputFileType:AVFileTypeQuickTimeMovie];
    [m_activeExport setOutputURL:exportUrl];

    NSLog(@"Exporting async");
    [m_activeExport exportAsynchronouslyWithCompletionHandler:^(void)
    {
        NSLog(@"Export complete");

        // Cancel the update timer
        [m_updateTimer invalidate];
        m_updateTimer = nil;

        // Dismiss the displayed dialog
        [m_displayedDialog hide:TRUE];
        m_displayedDialog = nil;

        // Re-enable touch events
        [[UIApplication sharedApplication] endIgnoringInteractionEvents];

        // Report the success/failure result
        switch (m_activeExport.status)
        {
            case AVAssetExportSessionStatusFailed:
                [self performSelectorOnMainThread:@selector(videoStitchingFailed:) withObject:m_activeExport.error waitUntilDone:FALSE];
                break;

            case AVAssetExportSessionStatusCompleted:
                [self performSelectorOnMainThread:@selector(videoStitchingComplete:) withObject:m_activeExport.outputURL waitUntilDone:FALSE];
                break;
        }

        // Clear our reference to the completed export
        m_activeExport = nil;
    }];
}
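For context, the "zero progress" mentioned above is what the update timer referenced in the completion handler reports while the session is stuck. Below is a minimal polling sketch; the -onExportTimerTick: callback name and the timer wiring are assumptions rather than part of the original code, but AVAssetExportSession does expose progress and status for exactly this kind of monitoring:

// Hypothetical timer callback used to watch the export while it runs.
- (void) onExportTimerTick:(NSTimer*)timer
{
    if (m_activeExport == nil)
    {
        return;
    }

    // `progress` runs 0.0 - 1.0; a session hanging as described above simply
    // sits at 0.0 and never reaches Completed, Failed, or Cancelled.
    NSLog(@"Export progress: %.2f (status %ld)",
          m_activeExport.progress, (long)m_activeExport.status);
}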
Edit:
Thanks to Josh in the comments, I noticed that I wasn't using the error parameters. In the cases that now fail, I get the extremely useful "The operation could not be completed" error when inserting the time ranges of the video tracks:
NSError *videoError = nil;
[videoTracks insertTimeRanges:allVideoRanges ofTracks:allVideoTracks atTime:kCMTimeZero error:&videoError];
if ( videoError != nil )
{
    NSLog(@"Error adding video track: %@", videoError);
}
Output:
Error adding video track: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x17426dd00 NSUnderlyingError=0x174040cc0 "The operation couldn’t be completed. (OSStatus error -12780.)", NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed
It's worth noting that nowhere in the codebase is URLWithString used in place of fileURLWithPath, so that isn't the problem.
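Since the batched insert only surfaces a single opaque -11800/-12780, one way to narrow down which clip is being rejected (a debugging sketch only, not part of the original code) would be to fall back to inserting each range individually with insertTimeRange:ofTrack:atTime:error: and logging the requested range against the source track's own duration:

// Debugging sketch: insert each clip one at a time so the failing range can be identified.
CMTime cursor = kCMTimeZero;
for (NSUInteger i = 0; i < allVideoTracks.count; ++i)
{
    AVAssetTrack * track = allVideoTracks[i];
    CMTimeRange range = [allVideoRanges[i] CMTimeRangeValue];

    NSError * insertError = nil;
    BOOL ok = [videoTracks insertTimeRange:range
                                   ofTrack:track
                                    atTime:cursor
                                     error:&insertError];
    if (!ok)
    {
        // Compare what was requested with what the source track can actually provide.
        NSLog(@"Clip %lu failed: wanted %.2fs starting at %.2fs, source track is only %.2fs long (%@)",
              (unsigned long)i,
              CMTimeGetSeconds(range.duration),
              CMTimeGetSeconds(range.start),
              CMTimeGetSeconds(track.timeRange.duration),
              insertError);
        break;
    }
    cursor = CMTimeAdd(cursor, range.duration);
}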
【Comments】:
@JoshCaswell Thanks for pointing that out. I'm not the original author of this code, and I normally don't ignore the error parameters of these kinds of functions. You can see the error I'm actually getting now in my edit.
Are you trying to live-stream the export session?
@ChrisHaze No. The way the app works is that the user records a series of clips, and once they're done, all of the clips are combined into one long video.
I didn't think so, but I just wanted to make sure. I know what the problem is; I'll try to give some precise details in my answer.
枚举来看,在您初始化组合成员变量之后,看起来好像您正在阻塞调用线程。尽管允许访问每个 AVAssetTrack 实例,但键的值并不总是立即可用并同步运行..
请尝试使用AVSynchronousKeyValueLoading
协议注册更改通知。 Apple 的documentation 应该可以帮助您解决问题并让您顺利上路!
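As a rough illustration of that suggestion (the key names, the videoURL variable, and the hop back to the main queue are assumptions, not the poster's actual code), each asset can be asked to load the properties the composition needs before any tracks are pulled out of it; this is essentially the loadValuesAsynchronouslyForKeys call mentioned in the discussion below:

// Sketch: load the "tracks" and "duration" keys asynchronously before building
// the composition, so tracksWithMediaType:/duration never have to block the caller.
NSArray * keys = @[@"tracks", @"duration"];
AVURLAsset * asset = [AVURLAsset URLAssetWithURL:videoURL options:assetOptions];

[asset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    NSError * loadError = nil;
    if ([asset statusOfValueForKey:@"tracks" error:&loadError] != AVKeyValueStatusLoaded)
    {
        NSLog(@"Asset failed to load tracks: %@", loadError);
        return;
    }

    // Safe to call tracksWithMediaType: and read duration from here on;
    // hop back to the main queue before touching UI or building the composition.
    dispatch_async(dispatch_get_main_queue(), ^{
        // ... build the AVMutableComposition with this asset ...
    });
}];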
Here are a few Apple recommendations for AVFoundation that I've put together:

Hope this solves the problem! Good luck, and let me know if you have any other questions or issues.
【Discussion】:
I don't follow: what makes you think this is a threading issue?
I'm not entirely sure what was going wrong before, but it turns out I was mistaken about which clips were failing and which were working (I shouldn't have left it to the client to give me a complete and accurate QA report). I ended up adding a call to loadValuesAsynchronouslyForKeys and it magically fixed it.
Apple's AVFoundation SDK isn't the most straightforward framework, and honestly I can't tell you exactly why you were hitting the problem either. I ran into similar issues while building a video-sharing concept and had to turn to the 2011 WWDC videos, where they mention some of the blocking issues. Glad it worked! - Do I get the bounty?
I just want to put the fix through the wringer today, and once I'm sure it's the right fix I'll award the bounty.
I've put together mind maps of the entire AVFoundation framework. If you're familiar with XMind (or TheBrain) and would like to take a look at them (hopefully they shed some light on these issues), let me know and I'll email them to you.