Problem using UIImage and a caf file to create a video file
Posted: 2011-08-04 09:07:53

Question: I have read every post I could find on the internet about this feature, and I have managed to create the video file, but there are still 3 problems that nobody seems to have mentioned.
My 3 problems are:

1. The video does not play correctly in some players: in QuickTime (Windows) only a single frame is shown and then the screen goes white, and the video cannot be played on YouTube at all.

2. For some reason, some of the images come out badly distorted:
http://lh3.googleusercontent.com/-Jyz-L1k3MEk/TjpfSfKf8LI/AAAAAAAADBs/D1GYuEqI-Oo/h301/1.JPG (as a new user I am not allowed to embed images in the post, so this is only a link).

3. Some images end up with the wrong orientation; even though I transform the context according to the image orientation, it still does not work.

Can someone help me with this? Thank you very much!
Here is my code:

1: This function creates the video from a UIImage. I use a single image and one audio file (caf), and I want the image to be displayed while that audio plays.
- (void)writeImageAndAudioAsMovie:(UIImage*)image andAudio:(NSString *)audioFilePath duration:(int)duration
{
    NSLog(@"start make movie: length:%d", duration);
    NSError *error = nil;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:ImageVideoPath]
                                                           fileType:AVFileTypeMPEG4
                                                              error:&error];
    NSParameterAssert(videoWriter);
    if ([[NSFileManager defaultManager] fileExistsAtPath:ImageVideoPath])
        [[NSFileManager defaultManager] removeItemAtPath:ImageVideoPath error:nil];

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:image.size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:image.size.height], AVVideoHeightKey,
                                   nil];
    AVAssetWriterInput *writerInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                          outputSettings:videoSettings] retain];
    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                     sourcePixelBufferAttributes:nil];
    NSParameterAssert(writerInput);
    NSParameterAssert([videoWriter canAddInput:writerInput]);
    writerInput.expectsMediaDataInRealTime = YES;
    [videoWriter setShouldOptimizeForNetworkUse:YES];
    [videoWriter addInput:writerInput];

    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    //Write samples:
    CVPixelBufferRef buffer = [self pixelBufferFromCGImage:image.CGImage];
    [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];

    //Finish the session:
    [videoWriter endSessionAtSourceTime:CMTimeMake(duration, 1)];
    [writerInput markAsFinished];
    [videoWriter finishWriting];

    CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
    [videoWriter release];
    [writerInput release];

    [self addAudioToFileAtPath:ImageVideoPath andAudioPath:audioFilePath];
}
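A hedged aside, not part of the original code: since only one pixel buffer is appended, at time zero, some players may treat the track as a single-frame clip. A variation sometimes used is to append the same buffer a second time just before [videoWriter endSessionAtSourceTime:...], so the video track covers the full duration. A minimal sketch, reusing the buffer, adaptor, writerInput and duration variables from the method above:

//Hedged sketch: append the same still frame again near the end of the session,
//so the video track spans the whole duration instead of a single sample at time zero.
//This would go right before [videoWriter endSessionAtSourceTime:...] in the method above.
CMTime lastFrameTime = CMTimeMake(duration - 1, 1);
while (!writerInput.isReadyForMoreMediaData) {
    //crude polling; a production app would use requestMediaDataWhenReadyOnQueue:usingBlock:
    [NSThread sleepForTimeInterval:0.05];
}
[adaptor appendPixelBuffer:buffer withPresentationTime:lastFrameTime];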
2. Create the CVPixelBufferRef for the video:
- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)cgimage
{
    float width = CGImageGetWidth(cgimage);
    float height = CGImageGetHeight(cgimage);
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                          kCVPixelFormatType_32ARGB,
                                          (CFDictionaryRef)options, &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, width, height, 8, 4 * width,
                                                 rgbColorSpace, kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgimage);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer;
}
3. Put the video and audio together:
- (void)addAudioToFileAtPath:(NSString *)vidoPath andAudioPath:(NSString *)audioPath
{
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    NSURL *audio_inputFileUrl = [NSURL fileURLWithPath:audioPath];
    NSURL *video_inputFileUrl = [NSURL fileURLWithPath:vidoPath];
    NSString *outputFilePath = FinalVideoPath;
    NSURL *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];

    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audio_inputFileUrl options:nil];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange
                                     ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                      atTime:kCMTimeZero
                                       error:nil];

    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:video_inputFileUrl options:nil];
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange
                                     ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                      atTime:kCMTimeZero
                                       error:nil];

    //nextClipStartTime = CMTimeAdd(nextClipStartTime, a_timeRange.duration);
    [audioAsset release]; audioAsset = nil;
    [videoAsset release]; videoAsset = nil;

    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    _assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    _assetExport.outputURL = outputFileUrl;
    [_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
        switch (_assetExport.status) {
            case AVAssetExportSessionStatusCompleted:
                //export complete
                NSLog(@"Export Complete");
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export Failed");
                NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                //export error (see exportSession.error)
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export Cancelled");
                NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                //export cancelled
                break;
        }
    }];
}
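A hedged aside, not part of the original code: both insertTimeRange: calls index straight into tracksWithMediaType: and pass error:nil, so a missing track or a failed insert goes unnoticed. A minimal guard for the audio side could look like this (the video side would be analogous):

//Hypothetical guard, not in the original code: check that the asset actually
//contains an audio track and surface the insertion error instead of passing nil.
NSArray *audioTracks = [audioAsset tracksWithMediaType:AVMediaTypeAudio];
if ([audioTracks count] == 0) {
    NSLog(@"No audio track found at %@", audioPath);
    return;
}
NSError *insertError = nil;
BOOL inserted = [b_compositionAudioTrack insertTimeRange:audio_timeRange
                                                 ofTrack:[audioTracks objectAtIndex:0]
                                                  atTime:kCMTimeZero
                                                   error:&insertError];
if (!inserted) {
    NSLog(@"Audio insert failed: %@", [insertError localizedDescription]);
}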
Comments on the question:

I can help you with the orientation problem; see my answer in this post: ***.com/questions/11414351/…

Answer 1:

I had this problem. Here is your checklist if you want it fixed:
1) The video must not have an alpha channel, so your pixelBufferFromCGImage should look like this:
static OSType pixelFormatType = kCVPixelFormatType_32ARGB;

- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
{
    CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             @YES, kCVPixelBufferCGImageCompatibilityKey,
                             @YES, kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          frameSize.width,
                                          frameSize.height,
                                          pixelFormatType,
                                          (__bridge CFDictionaryRef)options,
                                          &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGImageAlphaNoneSkipFirst & kCGBitmapAlphaInfoMask;
    //NSUInteger bytesPerRow = 4 * frameSize.width;
    NSUInteger bitsPerComponent = 8;
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pxbuffer);
    CGContextRef context = CGBitmapContextCreate(pxdata,
                                                 frameSize.width,
                                                 frameSize.height,
                                                 bitsPerComponent,
                                                 bytesPerRow,
                                                 rgbColorSpace,
                                                 bitmapInfo);
    CGContextDrawImage(context, CGRectMake(0, 0, frameSize.width, frameSize.height), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer;
}
2) Make sure you test on a real device. The Simulator tends to distort the video; I had exactly the same problem when I produced the video in the Simulator.

3) Make sure you create the AVAssetWriterInputPixelBufferAdaptor like this:
NSMutableDictionary *attributes = [[NSMutableDictionary alloc] init];
[attributes setObject:[NSNumber numberWithUnsignedInt:pixelFormatType] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:imageSize.width] forKey:(NSString*)kCVPixelBufferWidthKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:imageSize.height] forKey:(NSString*)kCVPixelBufferHeightKey];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
sourcePixelBufferAttributes:attributes];
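A possible follow-up, not part of the answer itself: once the adaptor is created with explicit sourcePixelBufferAttributes, its pixelBufferPool can hand out buffers that already match what the writer expects, instead of allocating them manually with CVPixelBufferCreate. The pool only becomes non-NULL after [videoWriter startWriting] has been called. A minimal sketch:

//Hedged sketch: pull a pixel buffer from the adaptor's pool (valid only after
//startWriting) rather than allocating one by hand.
CVPixelBufferRef pxbuffer = NULL;
CVReturn status = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                     adaptor.pixelBufferPool,
                                                     &pxbuffer);
if (status != kCVReturnSuccess || pxbuffer == NULL) {
    NSLog(@"Could not create a pixel buffer from the adaptor's pool");
}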
I had a few other problems as well, but not video distortion or orientation issues. Unless you request the thumbnail image directly from the asset, you need to rotate the image into the correct orientation yourself.
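To make that last point concrete, here is a hedged sketch of such a rotation step; normalizedImageFromImage: is a hypothetical helper, not something from the answer. It redraws the UIImage so its imageOrientation becomes UIImageOrientationUp before image.CGImage is handed to pixelBufferFromCGImage:, since a bare CGImageRef carries no orientation information:

//Hypothetical helper: redraw the image upright so that image.CGImage matches
//what the user saw, regardless of the original imageOrientation.
- (UIImage *)normalizedImageFromImage:(UIImage *)image
{
    if (image.imageOrientation == UIImageOrientationUp) {
        return image;
    }
    UIGraphicsBeginImageContextWithOptions(image.size, YES, image.scale);
    //drawInRect: honours imageOrientation, so the resulting bitmap is upright.
    [image drawInRect:CGRectMake(0, 0, image.size.width, image.size.height)];
    UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return normalized;
}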
Comments:
You are a god. I only had the problem when the image was in portrait mode; landscape worked fine. I dug into it and found that the actual problem was in the alpha settings. Thanks.