Make movie file with picture Array and song file, using AVAsset
Posted: 2011-08-28 23:59:37

【Question】: I am trying to make a movie file from a picture array and an audio file. To build the movie from the picture array I used zoul's great post here. That part works perfectly: I get my movie with my pictures. However, when I try to add an audio track I run into a lot of problems. To make things clear, here is my code:
When I call this method, the picture array and the song file are both ready:
-(void) writeImagesToMovieAtPath:(NSString *) path withSize:(CGSize) size
{
    NSString *documentsDirectoryPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSArray *dirContents = [[NSFileManager defaultManager] directoryContentsAtPath:documentsDirectoryPath];
    for (NSString *tString in dirContents) {
        if ([tString isEqualToString:@"essai.mp4"]) {
            [[NSFileManager defaultManager] removeItemAtPath:[NSString stringWithFormat:@"%@/%@", documentsDirectoryPath, tString] error:nil];
        }
    }

    NSLog(@"Write Started");

    NSError *error = nil;

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                  [NSURL fileURLWithPath:path] fileType:AVFileTypeMPEG4
                                  error:&error];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                   nil];

    AudioChannelLayout channelLayout;
    memset(&channelLayout, 0, sizeof(AudioChannelLayout));
    channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo;
    NSDictionary *audioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
                                   [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                                   [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
                                   [NSNumber numberWithInt:192000], AVEncoderBitRateKey,
                                   [NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)], AVChannelLayoutKey,
                                   nil];

    AVAssetWriterInput *videoWriterInput = [[AVAssetWriterInput
                                             assetWriterInputWithMediaType:AVMediaTypeVideo
                                             outputSettings:videoSettings] retain];

    AVAssetWriterInput *audioWriterInput = [[AVAssetWriterInput
                                             assetWriterInputWithMediaType:AVMediaTypeAudio
                                             outputSettings:audioSettings] retain];
    NSURL *fileURL = [[NSBundle mainBundle] URLForResource:@"Big_Voice_1" withExtension:@"caf"];
    NSLog(@"%@", fileURL);

    AVAsset *asset = [[AVURLAsset URLAssetWithURL:fileURL
                                          options:nil] retain];

    AVAssetReader *audioReader = [[AVAssetReader assetReaderWithAsset:asset error:&error] retain];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                     sourcePixelBufferAttributes:nil];

    AVAssetTrack *audioTrack = [asset.tracks objectAtIndex:0];
    AVAssetReaderOutput *readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:nil];
    [audioReader addOutput:readerOutput];

    NSParameterAssert(videoWriterInput);
    NSParameterAssert(audioWriterInput);
    NSParameterAssert([videoWriter canAddInput:audioWriterInput]);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);

    audioWriterInput.expectsMediaDataInRealTime = NO;
    videoWriterInput.expectsMediaDataInRealTime = YES;

    [videoWriter addInput:audioWriterInput];
    [videoWriter addInput:videoWriterInput];

    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];
    //Video encoding
    CVPixelBufferRef buffer = NULL;

    //convert uiimage to CGImage.
    int frameCount = 0;
    for (int i = 0; i < 20; i++) {
        buffer = [self pixelBufferFromCGImage:[[m_PictArray objectAtIndex:i] CGImage] andSize:size];
        BOOL append_ok = NO;
        int j = 0;
        while (!append_ok && j < 30) {
            if (adaptor.assetWriterInput.readyForMoreMediaData) {
                printf("appending %d attemp %d\n", frameCount, j);
                CMTime frameTime = CMTimeMake(frameCount, (int32_t)10);
                //CVPixelBufferPoolCreatePixelBuffer (kCFAllocatorDefault, adaptor.pixelBufferPool, &buffer);
                append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
                CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
                NSParameterAssert(bufferPool != NULL);
                [NSThread sleepForTimeInterval:0.05];
            } else {
                printf("adaptor not ready %d, %d\n", frameCount, j);
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }
        if (!append_ok) {
            printf("error appending image %d times %d\n", frameCount, j);
        }
        frameCount++;
    }

    //Finish the session:
    [videoWriterInput markAsFinished];
    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    CVPixelBufferRef buffer = NULL;

    //Write all picture array in movie file.
    int frameCount = 0;
    for (int i = 0; i < [m_PictArray count]; i++) {
        buffer = [self pixelBufferFromCGImage:[[m_PictArray objectAtIndex:i] CGImage] andSize:size];
        BOOL append_ok = NO;
        int j = 0;
        while (!append_ok && j < 30) {
            if (adaptor.assetWriterInput.readyForMoreMediaData) {
                printf("appending %d attemp %d\n", frameCount, j);
                CMTime frameTime = CMTimeMake(frameCount, (int32_t)10);
                append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
                CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
                NSParameterAssert(bufferPool != NULL);
                [NSThread sleepForTimeInterval:0.05];
            } else {
                printf("adaptor not ready %d, %d\n", frameCount, j);
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }
        if (!append_ok) {
            printf("error appending image %d times %d\n", frameCount, j);
        }
        frameCount++;
    }

    //Finish writing picture:
    [videoWriterInput markAsFinished];
Once the pictures have been written into the movie file, I want to copy the audio into the same file, and I do this:
    [audioReader startReading];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    dispatch_queue_t mediaInputQueue = dispatch_queue_create("mediaInputQueue", NULL);

    [audioWriterInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^{

        NSLog(@"Request");
        NSLog(@"Asset Writer ready :%d", audioWriterInput.readyForMoreMediaData);
        while (audioWriterInput.readyForMoreMediaData) {
            NSLog(@"Ready");
            CMSampleBufferRef nextBuffer = [readerOutput copyNextSampleBuffer];
            if (nextBuffer) {
                NSLog(@"NextBuffer");
                [audioWriterInput appendSampleBuffer:nextBuffer];
            }
        }
    }];

    [audioWriterInput markAsFinished];
    [videoWriter finishWriting];
}
But the readyForMoreMediaData status of the audio AVAssetWriterInput is always NO.
My question: how can I add audio to a video file using AVFoundation?
Please, can somebody help me if I have forgotten something or if something is wrong?
Thank you very much.
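(Editor's note: for reference, the commonly documented pattern for requestMediaDataWhenReadyOnQueue: keeps pulling sample buffers inside the block and only calls markAsFinished / finishWriting once copyNextSampleBuffer returns NULL, rather than right after scheduling the block. A minimal sketch, reusing the readerOutput, audioWriterInput and videoWriter names from the code above:)

    [audioWriterInput requestMediaDataWhenReadyOnQueue:mediaInputQueue usingBlock:^{
        while (audioWriterInput.readyForMoreMediaData) {
            CMSampleBufferRef nextBuffer = [readerOutput copyNextSampleBuffer];
            if (nextBuffer) {
                [audioWriterInput appendSampleBuffer:nextBuffer];
                CFRelease(nextBuffer); //copyNextSampleBuffer follows the Create/Copy rule
            } else {
                //The reader is drained: finish the audio input and the writer here,
                //not synchronously after scheduling the block.
                [audioWriterInput markAsFinished];
                [videoWriter finishWriting];
                break;
            }
        }
    }];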
【Comments】:
Hi, do you have the sample code for this?

【Answer 1】: I finally found out how to make the movie with a picture array and an audio file. So if you want to do the same thing, here is my code (watch out for the memory management):
First, make a movie file with your picture array, using zoul's post here:
-(void) writeImagesToMovieAtPath:(NSString *) path withSize:(CGSize) size
{
    NSString *documentsDirectoryPath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
    NSArray *dirContents = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:documentsDirectoryPath error:nil];
    for (NSString *tString in dirContents) {
        if ([tString isEqualToString:@"essai.mp4"]) {
            [[NSFileManager defaultManager] removeItemAtPath:[NSString stringWithFormat:@"%@/%@", documentsDirectoryPath, tString] error:nil];
        }
    }

    NSLog(@"Write Started");

    NSError *error = nil;

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                  [NSURL fileURLWithPath:path] fileType:AVFileTypeMPEG4
                                  error:&error];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                   nil];

    AVAssetWriterInput *videoWriterInput = [[AVAssetWriterInput
                                             assetWriterInputWithMediaType:AVMediaTypeVideo
                                             outputSettings:videoSettings] retain];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                     sourcePixelBufferAttributes:nil];

    NSParameterAssert(videoWriterInput);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);

    videoWriterInput.expectsMediaDataInRealTime = YES;
    [videoWriter addInput:videoWriterInput];

    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    //Video encoding
    CVPixelBufferRef buffer = NULL;

    //convert uiimage to CGImage.
    int frameCount = 0;
    for (int i = 0; i < [m_PictArray count]; i++) {
        buffer = [self pixelBufferFromCGImage:[[m_PictArray objectAtIndex:i] CGImage] andSize:size];
        BOOL append_ok = NO;
        int j = 0;
        while (!append_ok && j < 30) {
            if (adaptor.assetWriterInput.readyForMoreMediaData) {
                printf("appending %d attemp %d\n", frameCount, j);
                CMTime frameTime = CMTimeMake(frameCount, (int32_t)10);
                append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
                CVPixelBufferPoolRef bufferPool = adaptor.pixelBufferPool;
                NSParameterAssert(bufferPool != NULL);
                [NSThread sleepForTimeInterval:0.05];
            } else {
                printf("adaptor not ready %d, %d\n", frameCount, j);
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }
        if (!append_ok) {
            printf("error appending image %d times %d\n", frameCount, j);
        }
        frameCount++;
        CVBufferRelease(buffer);
    }

    [videoWriterInput markAsFinished];
    [videoWriter finishWriting];

    [videoWriterInput release];
    [videoWriter release];

    [m_PictArray removeAllObjects];

    NSLog(@"Write Ended");
}
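(Editor's note: the method above relies on the pixelBufferFromCGImage:andSize: helper from zoul's post, which is not shown here. A sketch of the usual implementation, assuming pre-ARC code and 32ARGB pixel buffers; adapt the pixel format and bitmap options to your images:)

- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image andSize:(CGSize)size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height,
                                          kCVPixelFormatType_32ARGB, (CFDictionaryRef)options, &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    //Draw the CGImage into the pixel buffer.
    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8,
                                                 CVPixelBufferGetBytesPerRow(pxbuffer),
                                                 rgbColorSpace, kCGImageAlphaNoneSkipFirst);
    CGContextDrawImage(context, CGRectMake(0, 0, size.width, size.height), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer; //caller releases with CVBufferRelease()
}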
After that, you have to put the movie file and the audio file together. To do this, follow my code:
-(void)CompileFilesToMakeMovie
{
    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    NSString* audio_inputFileName = @"deformed.caf";
    NSString* audio_inputFilePath = [Utilities documentsPath:audio_inputFileName];
    NSURL*    audio_inputFileUrl = [NSURL fileURLWithPath:audio_inputFilePath];

    NSString* video_inputFileName = @"essai.mp4";
    NSString* video_inputFilePath = [Utilities documentsPath:video_inputFileName];
    NSURL*    video_inputFileUrl = [NSURL fileURLWithPath:video_inputFilePath];

    NSString* outputFileName = @"outputFile.mov";
    NSString* outputFilePath = [Utilities documentsPath:outputFileName];
    NSURL*    outputFileUrl = [NSURL fileURLWithPath:outputFilePath];

    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];

    CMTime nextClipStartTime = kCMTimeZero;

    AVURLAsset* videoAsset = [[AVURLAsset alloc] initWithURL:video_inputFileUrl options:nil];
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    //nextClipStartTime = CMTimeAdd(nextClipStartTime, a_timeRange.duration);

    AVURLAsset* audioAsset = [[AVURLAsset alloc] initWithURL:audio_inputFileUrl options:nil];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    _assetExport.outputFileType = @"com.apple.quicktime-movie";
    _assetExport.outputURL = outputFileUrl;

    [_assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void ) {
         [self saveVideoToAlbum:outputFilePath];
     }];
}
Sorry if there are some leaks, I am still optimizing the memory management.
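(Editor's note: saveVideoToAlbum: is not shown in this answer. A minimal sketch of what it might look like, using the UIKit save-to-album helpers; the callback method name is the one UIKit documents, and the callback must have exactly this signature:)

-(void) saveVideoToAlbum:(NSString *)path
{
    if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(path)) {
        UISaveVideoAtPathToSavedPhotosAlbum(path, self,
                                            @selector(video:didFinishSavingWithError:contextInfo:), nil);
    }
}

//Callback invoked by UISaveVideoAtPathToSavedPhotosAlbum when the save completes.
-(void) video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    NSLog(@"Saved to camera roll: %@ (error: %@)", videoPath, error);
}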
【Discussion】:
@Rhythmic Fistman: I couldn't write the audio track together with the video track using AVAssetWriter. Is that a bad thing?
No, but I would love to know what went wrong with your original code.
I don't know. The sound was never added!
Hi everyone. I have a problem on an iPhone 3G (iOS 4.2.1): in the device log I read this error: "com.apple.mediaserverd[82] AVChannelLayoutKey should point to an NSData instance containing an AudioChannelLayout. Yours points to an NSNumber."
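(Editor's note: as that log message says, AVChannelLayoutKey expects an NSData wrapping an AudioChannelLayout struct, not an NSNumber. A short sketch of the wrapping, which is what the question's audio settings already do:)

    AudioChannelLayout channelLayout;
    memset(&channelLayout, 0, sizeof(AudioChannelLayout));
    channelLayout.mChannelLayoutTag = kAudioChannelLayoutTag_Stereo; //or kAudioChannelLayoutTag_Mono for a single channel
    NSData *channelLayoutData = [NSData dataWithBytes:&channelLayout length:sizeof(AudioChannelLayout)];
    //...then pass channelLayoutData (not an NSNumber) for AVChannelLayoutKey in the audio output settings.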
Thank you, my first problem is now solved. I will edit my post in 10 minutes; if you have any solution to add some audio track to the video file…