A First Look at the AVFoundation Framework
Posted by 蜗牛的脚印
A few words first
I started writing this summary at the end of February, just back at work after Chinese New Year. A new year has begun; what will it look like? I'll probably only be able to answer that at the end of it. After lazing around at home for so long, it's time to get back to the material we need to study, because there really is a lot left to do.
This post summarizes chapter five of the book. Over the last couple of days I skimmed the later chapters to get a rough idea of what they cover, so I'll start here with chapter five. Chapter four's topic, video playback, was already covered earlier, so I won't repeat it.
Part One: AVPlayerViewController
Chapter five opens with AVPlayerViewController, a controller we haven't really discussed before. Apple's API surface for AVPlayerViewController is fairly small, so let's walk through its header file and how it is used. Below is the header, with a comment explaining each property:
File:       AVPlayerViewController.h
Framework:  AVKit
Copyright © 2014-2017 Apple Inc. All rights reserved.

// Import the framework headers
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

NS_ASSUME_NONNULL_BEGIN

// AVPlayerViewController has a delegate; the protocol methods are listed further down
@protocol AVPlayerViewControllerDelegate;

/*!
    @class      AVPlayerViewController
    @abstract   AVPlayerViewController is a subclass of UIViewController that can be used to display the visual content of an AVPlayer object and the standard playback controls.
*/
// Available since iOS 8.0
API_AVAILABLE(ios(8.0))
@interface AVPlayerViewController : UIViewController

// The AVPlayer that supplies the media content
/*!
    @property   player
    @abstract   The player from which to source the media content for the view controller.
*/
@property (nonatomic, strong, nullable) AVPlayer *player;

// Whether the standard playback controls are shown on top of the content
/*!
    @property   showsPlaybackControls
    @abstract   Whether or not the receiver shows playback controls. Default is YES.
    @discussion Clients can set this property to NO when they don't want to have any playback controls on top of the visual content (e.g. for a game splash screen).
*/
@property (nonatomic) BOOL showsPlaybackControls;

// videoGravity literally means "video gravity": it determines how the video is stretched or scaled within the bounds of its hosting layer.
// The discussion below lists the options; to summarize the three values:
// AVLayerVideoGravityResizeAspect      scales the video within the layer's bounds while preserving the original aspect ratio (default)
// AVLayerVideoGravityResizeAspectFill  preserves the aspect ratio and scales the video to fill the whole playback area
// AVLayerVideoGravityResize            stretches the content to match the layer's bounds; rarely used, because it distorts the image
/*!
    @property   videoGravity
    @abstract   A string defining how the video is displayed within an AVPlayerLayer bounds rect.
    @discussion Options are AVLayerVideoGravityResizeAspect, AVLayerVideoGravityResizeAspectFill and AVLayerVideoGravityResize. AVLayerVideoGravityResizeAspect is default. See <AVFoundation/AVAnimation.h> for a description of these options.
*/
@property (nonatomic, copy) NSString *videoGravity;

// A Boolean indicating whether the video is ready to be displayed
/*!
    @property   readyForDisplay
    @abstract   Boolean indicating that the first video frame has been made ready for display for the current item of the associated AVPlayer.
*/
@property (nonatomic, readonly, getter = isReadyForDisplay) BOOL readyForDisplay;

// The size and position of the video image
/*!
    @property   videoBounds
    @abstract   The current size and position of the video image as displayed within the receiver's view's bounds.
*/
@property (nonatomic, readonly) CGRect videoBounds;

// Custom views can be added on top of this UIView
/*!
    @property   contentOverlayView
    @abstract   Use the content overlay view to add additional custom views between the video content and the controls.
*/
@property (nonatomic, readonly, nullable) UIView *contentOverlayView;

// Whether Picture in Picture playback is allowed (a PiP demo follows below); available since iOS 9.0
/*!
    @property   allowsPictureInPicturePlayback
    @abstract   Whether or not the receiver allows Picture in Picture playback. Default is YES.
*/
@property (nonatomic) BOOL allowsPictureInPicturePlayback API_AVAILABLE(ios(9.0));

// Available since iOS 10.0
/*!
    @property   updatesNowPlayingInfoCenter
    @abstract   Whether or not the now playing info center should be updated. Default is YES.
*/
@property (nonatomic) BOOL updatesNowPlayingInfoCenter API_AVAILABLE(ios(10.0));

// Per the abstract: whether the player automatically enters full screen when the play button is tapped. Default is NO.
/*!
    @property   entersFullScreenWhenPlaybackBegins
    @abstract   Whether or not the receiver automatically enters full screen when the play button is tapped. Default is NO.
    @discussion If YES, the receiver will show a user interface tailored to this behavior.
*/
@property (nonatomic) BOOL entersFullScreenWhenPlaybackBegins API_AVAILABLE(ios(11.0));

// Likewise: whether the player automatically exits full screen when playback ends
/*!
    @property   exitsFullScreenWhenPlaybackEnds
    @abstract   Whether or not the receiver automatically exits full screen when playback ends. Default is NO.
    @discussion If multiple player items have been enqueued, the receiver exits fullscreen once no more items are remaining in the queue.
*/
@property (nonatomic) BOOL exitsFullScreenWhenPlaybackEnds API_AVAILABLE(ios(11.0));

// The AVPlayerViewControllerDelegate
/*!
    @property   delegate
    @abstract   The receiver's delegate.
*/
@property (nonatomic, weak, nullable) id <AVPlayerViewControllerDelegate> delegate API_AVAILABLE(ios(9.0));

@end

// Below are the AVPlayerViewControllerDelegate methods; all of them are optional
/*!
    @protocol   AVPlayerViewControllerDelegate
    @abstract   A protocol for delegates of AVPlayerViewController.
*/
@protocol AVPlayerViewControllerDelegate <NSObject>
@optional

// Picture in Picture playback is about to start
/*!
    @method     playerViewControllerWillStartPictureInPicture:
    @param      playerViewController The player view controller.
    @abstract   Delegate can implement this method to be notified when Picture in Picture will start.
*/
- (void)playerViewControllerWillStartPictureInPicture:(AVPlayerViewController *)playerViewController;

// Picture in Picture playback has started
/*!
    @method     playerViewControllerDidStartPictureInPicture:
    @param      playerViewController The player view controller.
    @abstract   Delegate can implement this method to be notified when Picture in Picture did start.
*/
- (void)playerViewControllerDidStartPictureInPicture:(AVPlayerViewController *)playerViewController;

// As the abstract makes clear, this method is called when Picture in Picture playback fails to start
/*!
    @method     playerViewController:failedToStartPictureInPictureWithError:
    @param      playerViewController The player view controller.
    @param      error An error describing why it failed.
    @abstract   Delegate can implement this method to be notified when Picture in Picture failed to start.
*/
- (void)playerViewController:(AVPlayerViewController *)playerViewController failedToStartPictureInPictureWithError:(NSError *)error;

// Picture in Picture playback is about to stop
/*!
    @method     playerViewControllerWillStopPictureInPicture:
    @param      playerViewController The player view controller.
    @abstract   Delegate can implement this method to be notified when Picture in Picture will stop.
*/
- (void)playerViewControllerWillStopPictureInPicture:(AVPlayerViewController *)playerViewController;

// Picture in Picture playback has stopped
/*!
    @method     playerViewControllerDidStopPictureInPicture:
    @param      playerViewController The player view controller.
    @abstract   Delegate can implement this method to be notified when Picture in Picture did stop.
*/
- (void)playerViewControllerDidStopPictureInPicture:(AVPlayerViewController *)playerViewController;

// Return NO to keep the presenting player view controller from being dismissed automatically when Picture in Picture starts
/*!
    @method     playerViewControllerShouldAutomaticallyDismissAtPictureInPictureStart:
    @param      playerViewController The player view controller.
    @abstract   Delegate can implement this method and return NO to prevent player view controller from automatically being dismissed when Picture in Picture starts.
*/
- (BOOL)playerViewControllerShouldAutomaticallyDismissAtPictureInPictureStart:(AVPlayerViewController *)playerViewController;

// Restore the user interface when Picture in Picture playback stops
/*!
    @method     playerViewController:restoreUserInterfaceForPictureInPictureStopWithCompletionHandler:
    @param      playerViewController The player view controller.
    @param      completionHandler The completion handler the delegate needs to call after restore.
    @abstract   Delegate can implement this method to restore the user interface before Picture in Picture stops.
*/
- (void)playerViewController:(AVPlayerViewController *)playerViewController restoreUserInterfaceForPictureInPictureStopWithCompletionHandler:(void (^)(BOOL restored))completionHandler;

@end

NS_ASSUME_NONNULL_END
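That covers the header. As a quick illustration of how the controller is typically used, here is a minimal sketch, assuming a file named video.mp4 bundled with the app and a presenting UIViewController that adopts AVPlayerViewControllerDelegate (both the file name and the presenting controller are assumptions for illustration, not taken from the book's Demo):

#import <AVKit/AVKit.h>
#import <AVFoundation/AVFoundation.h>

// Minimal sketch: present an AVPlayerViewController for a bundled file.
// "video.mp4" is an assumed file name; self is assumed to be a UIViewController
// that conforms to AVPlayerViewControllerDelegate.
- (void)playLocalVideo {
    NSURL *fileUrl = [[NSBundle mainBundle] URLForResource:@"video" withExtension:@"mp4"];
    AVPlayerViewController *playerViewController = [[AVPlayerViewController alloc] init];
    playerViewController.player = [AVPlayer playerWithURL:fileUrl];
    playerViewController.videoGravity = AVLayerVideoGravityResizeAspect; // keep the original aspect ratio
    playerViewController.allowsPictureInPicturePlayback = YES;           // PiP also needs a supported device and a properly configured audio session
    playerViewController.delegate = self;                                // receive the Picture in Picture callbacks listed above
    [self presentViewController:playerViewController animated:YES completion:^{
        [playerViewController.player play];
    }];
}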
Just two points left
Besides the above, only two points from chapters four and five of the book remain to be summarized. Part of the remaining material concerns macOS, so I'll skip it here and cover the two points we still need to understand:
1. CMTime, also a small supplement to chapter four
Let's take a quick look at basic arithmetic with CMTime:
#pragma mark --
#pragma mark -- Simple CMTime usage
- (void)CMTimeCalculate {
    CMTime timeO = CMTimeMake(1, 10);
    CMTime timeT = CMTimeMake(1, 5);

    // Addition
    CMTime timeA = CMTimeAdd(timeO, timeT);
    CMTimeShow(timeA);

    // Subtraction
    CMTime timeS = CMTimeSubtract(timeO, timeT);
    CMTimeShow(timeS);

    // Multiplication by an integer
    CMTime timeB = CMTimeMake(1, 10);
    CMTime timeI = CMTimeMultiply(timeB, 5);
    CMTimeShow(timeI);

    // Multiplication by a Float64
    CMTime timeF = CMTimeMultiplyByFloat64(timeB, 5.6);
    CMTimeShow(timeF);

    // Comparison: 1 means greater, 0 means equal, -1 means less
    int32_t timeC = CMTimeCompare(timeO, timeT);
    NSLog(@"Comparison result: %d", timeC);   // -1

    // Absolute value
    CMTime timeAB = CMTimeAbsoluteValue(timeS);
    CMTimeShow(timeAB);
}
CMTimeRange also belongs to the CMTime family. Here is its definition from the iOS headers:
/*!
    @typedef    CMTimeRange
    @abstract   A time range represented as two CMTime structures.
*/
typedef struct {
    CMTime start;      /*! @field start    The start time of the time range. */
    CMTime duration;   /*! @field duration The duration of the time range. */
} CMTimeRange;
From this definition we can see what it is made of. We've already used it briefly in the Demo; see the code there for the details.
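As a quick illustration (the values here are arbitrary and not taken from the Demo), a CMTimeRange can be built and inspected like this:

// Build a range that starts at 1 second and lasts 5 seconds (a timescale of 600 is a common choice for video)
CMTimeRange range = CMTimeRangeMake(CMTimeMake(600, 600), CMTimeMake(3000, 600));
CMTimeRangeShow(range);                                                  // logs the start and duration

// Check whether a given time falls inside the range
Boolean inside = CMTimeRangeContainsTime(range, CMTimeMake(1200, 600));  // the 2-second mark -> true
NSLog(@"Contains the 2-second mark? %d", inside);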
One more thing about CMTime is worth noting: the conversion to seconds, Float64 CMTimeGetSeconds(CMTime time). This function converts a CMTime value into a Float64, i.e. the time expressed in seconds.
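For example, with arbitrary values:

CMTime duration = CMTimeMake(9000, 600);       // 15 seconds expressed as 9000/600
Float64 seconds = CMTimeGetSeconds(duration);  // 15.0
NSLog(@"Duration in seconds: %f", seconds);

// And the reverse direction: build a CMTime from seconds with a preferred timescale
CMTime fromSeconds = CMTimeMakeWithSeconds(15.0, 600);
CMTimeShow(fromSeconds);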
2. AVAssetExportSession
Let's look at the AVAssetExportSession code from our Demo. Here we only use it to compress a video:
#pragma mark --
#pragma mark -- Video compression
- (void)compressVideoWithFileUrl:(NSURL *)fileUrl {
    /*
     Note that you cannot export to a path where a file already exists. You can either delete
     the previous file when the user taps start, or, as here, name each output file with a
     timestamp. The same caveat applies to AVAssetWriter later on.
     */
    // Name the compressed file after the current date and time
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    [formatter setDateFormat:@"yyyy-MM-dd-HH:mm:ss"];

    // Path of the compressed file
    self.videoPath = [NSString stringWithFormat:@"%@/%@.mov", NSTemporaryDirectory(), [formatter stringFromDate:[NSDate date]]];

    // First create an AVAsset from the file URL you passed in
    AVAsset *asset = [AVAsset assetWithURL:fileUrl];

    /*
     Create the AVAssetExportSession from the asset.
     The second parameter is the export preset; medium quality (AVAssetExportPresetMediumQuality) is common:
         AVF_EXPORT NSString *const AVAssetExportPresetLowQuality        NS_AVAILABLE_IOS(4_0);
         AVF_EXPORT NSString *const AVAssetExportPresetMediumQuality     NS_AVAILABLE_IOS(4_0);
         AVF_EXPORT NSString *const AVAssetExportPresetHighestQuality    NS_AVAILABLE_IOS(4_0);
     */
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetMediumQuality];

    // Optimize the output for network use, so the file is laid out for progressive playback
    exportSession.shouldOptimizeForNetworkUse = YES;

    // The output file URL
    exportSession.outputURL = [NSURL fileURLWithPath:self.videoPath];

    /*
     The output file format:
         AVFileTypeMPEG4          .mp4
         AVFileTypeQuickTimeMovie .mov
     @abstract    A UTI for the MPEG-4 file format.
     @discussion  The value of this UTI is @"public.mpeg-4". Files are identified with the .mp4 extension.
     In other words, outputFileType is a UTI string: AVFileTypeMPEG4 can also be written as @"public.mpeg-4", and so on.
     */
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    NSLog(@"presetName used for compression: %@", exportSession.presetName);

    // Export (compress) asynchronously
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        /*
         exportSession.status is an enum:
         typedef NS_ENUM(NSInteger, AVAssetExportSessionStatus) {
             AVAssetExportSessionStatusUnknown,
             AVAssetExportSessionStatusWaiting,
             AVAssetExportSessionStatusExporting,
             AVAssetExportSessionStatusCompleted,
             AVAssetExportSessionStatusFailed,
             AVAssetExportSessionStatusCancelled
         };
         */
        AVAssetExportSessionStatus exportStatus = exportSession.status;
        switch (exportStatus) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Compression failed");
                break;
            case AVAssetExportSessionStatusCompleted: {
                // Size after compression. You can also use exportSession's progress property to monitor the export as it runs.
                NSData *data = [NSData dataWithContentsOfFile:self.videoPath];
                float dataSize = (float)data.length / 1024 / 1024;
                NSLog(@"Compressed video size: %f MB", dataSize);
            }
                break;
            default:
                break;
        }
    }];
}
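Since the export runs asynchronously, it is often useful to surface its progress. Below is a minimal sketch (not part of the book's Demo) that polls exportSession.progress with the block-based NSTimer API available since iOS 10; the method name is hypothetical, and the timer should be scheduled on the main thread:

// Hypothetical helper: poll the export progress twice a second until the session finishes
- (void)monitorProgressOfExportSession:(AVAssetExportSession *)exportSession {
    [NSTimer scheduledTimerWithTimeInterval:0.5 repeats:YES block:^(NSTimer * _Nonnull timer) {
        // progress runs from 0.0 to 1.0 while the session is exporting
        NSLog(@"Export progress: %.0f%%", exportSession.progress * 100);
        if (exportSession.status != AVAssetExportSessionStatusWaiting &&
            exportSession.status != AVAssetExportSessionStatusExporting) {
            // Stop polling once the export has completed, failed, or been cancelled
            [timer invalidate];
        }
    }];
}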
To sum up:
With these two additional points, the rough content of the book's first five chapters is now covered. If there is one lesson to take from them, it is this: read the API headers! The doc comments are all in English, and some colleagues skip them because their English isn't great, but reading basic English documentation is a fundamental skill, and you can learn a great deal just by studying the API.