Latency Control for ijkplayer Live Stream Playback: A Summary
Posted by 灰色飘零
This article discusses the causes of latency when playing live streams with ijkplayer, and how to address them.
Causes
1. Network jitter
a) On the publisher side, when the network degrades, the buffer queue grows; once the network recovers, the buffered data gets pushed out. Publishers presumably use various control strategies here.
b) Network jitter on the relay path from the CDN origin node to the edge nodes.
c) On the player side, when the network degrades, no data can be read; once the network recovers, the player reads back the data that accumulated in the meantime (how many seconds does the CDN server cache?), so the buffer queue grows.
Setting player.shouldShowHudView = YES; lets you watch the audio/video buffer sizes in real time. The videoq and audioq inside the VideoState (is) hold packets that have not yet been decoded.
2. (to be added)
Solutions
As a low-latency baseline for comparison, play the live stream with the cutv web player, or with ffplay -fflags nobuffer -i <stream URL>.
So what can be done about the player-side buffer growing because of network jitter?
1. Faster-than-realtime playback
ijkplayer does provide a playback-rate interface, but it requires Android 6.0 or later. In my tests the results were not good, so I did not use it; that said, speed-up playback gives the better user experience.
Speed-up playback also requires the decoder to keep up; alternatively, you can decode only I-frames.
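As a rough sanity check on why speed-up playback reacts slowly (a sketch under my own naming, not ijkplayer code): playing at rate r drains (r - 1) seconds of backlog per wall-clock second, so the catch-up time is backlog / (r - 1).

```c
#include <assert.h>

/* Hypothetical helper: wall-clock seconds needed to drain backlog_ms
 * milliseconds of buffered data when playing at `rate` (> 1.0).
 * Returns -1.0 if the rate can never catch up. */
static double catch_up_seconds(double backlog_ms, double rate)
{
    if (rate <= 1.0)
        return -1.0;
    return (backlog_ms / 1000.0) / (rate - 1.0);
}
```

For example, a 3000 ms backlog at 1.25x takes 12 s of wall-clock time to drain, which is one reason dropping packets (option 2 below) reacts much faster.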
2. Dropping data
Should we drop packets before decoding, or frames after decoding?
To keep things simple, I drop packets before decoding, with the following strategy:
a) When there are both audio and video streams, or only an audio stream: once audioq reaches a certain duration, drop some packets from its front. Since the default sync mode is AV_SYNC_AUDIO_MASTER, the video will catch up.
b) When there is only a video stream: once videoq reaches a certain duration, drop some packets from its front.
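Both branches need the queue's cached duration in milliseconds. Stripped down to the arithmetic (the helper name is mine; the actual patch below reads first_pkt/last_pkt and the stream time_base):

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical helper: cached duration in milliseconds between the first
 * and last packet pts of a queue, for a stream time_base of num/den.
 * Returns -1 when the time_base is invalid. */
static int64_t cached_duration_ms(int64_t first_pts, int64_t last_pts,
                                  int num, int den)
{
    if (num <= 0 || den <= 0)
        return -1;
    /* (pts delta) in time_base units, scaled to milliseconds */
    return (last_pts - first_pts) * 1000 * num / den;
}
```

With the common 90 kHz video time_base (1/90000), a pts span of 270000 corresponds to 3000 ms of cached data.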
The code below is based on ijkplayer 0.5.1.
Add a maximum cached duration to VideoState in ff_ffplay_def.h:
typedef struct VideoState {
...
// Added by ljsdaya
// For low latency in live (realtime) playback: keep the videoq/audioq duration < max_cached_duration
// realtime is forced to 0; max_cached_duration == 0 means on-demand playback
int max_cached_duration;
} VideoState;
Add the queue-control functions to ff_ffplay.c:
static void drop_queue_until_pts(PacketQueue *q, int64_t drop_to_pts) {
MyAVPacketList *pkt1 = NULL;
int del_nb_packets = 0;
for (;;) {
pkt1 = q->first_pkt;
if (!pkt1) {
break;
}
// Does video need a keyframe here? Without the keyframe check, the picture corrupts (artifacts).
// But with it, the whole queue may get flushed, which can also corrupt the picture.
// So set the GOP size on the publisher side; with max_cached_duration > 2 * GOP, a full flush can mostly be avoided.
// Alternatively, before calling control_queue_duration, check whether the incoming video pkt is a keyframe; then even a full flush will not corrupt the picture.
if ((pkt1->pkt.flags & AV_PKT_FLAG_KEY) && pkt1->pkt.pts >= drop_to_pts) {
// if (pkt1->pkt.pts >= drop_to_pts) {
break;
}
q->first_pkt = pkt1->next;
if (!q->first_pkt)
q->last_pkt = NULL;
q->nb_packets--;
++del_nb_packets;
q->size -= pkt1->pkt.size + sizeof(*pkt1);
if (pkt1->pkt.duration > 0)
q->duration -= pkt1->pkt.duration;
av_free_packet(&pkt1->pkt);
#ifdef FFP_MERGE
av_free(pkt1);
#else
pkt1->next = q->recycle_pkt;
q->recycle_pkt = pkt1;
#endif
}
av_log(NULL, AV_LOG_INFO, "233 del_nb_packets = %d.\n", del_nb_packets);
}
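Stripped of the FFmpeg queue bookkeeping (size, duration, recycle list), the keyframe-aware drop loop boils down to the sketch below; `Pkt` is a hypothetical stand-in for MyAVPacketList:

```c
#include <assert.h>
#include <stdint.h>

/* Minimal stand-in for MyAVPacketList: a pts plus a keyframe flag. */
typedef struct Pkt {
    int64_t pts;
    int key;             /* 1 if AV_PKT_FLAG_KEY would be set */
    struct Pkt *next;
} Pkt;

/* Drop packets from the head of the list until the first keyframe with
 * pts >= drop_to_pts, mirroring drop_queue_until_pts. Returns the
 * number of packets dropped. */
static int drop_until_pts(Pkt **head, int64_t drop_to_pts)
{
    int dropped = 0;
    while (*head) {
        Pkt *p = *head;
        if (p->key && p->pts >= drop_to_pts)
            break;       /* keep from here: decodable and late enough */
        *head = p->next;
        dropped++;
    }
    return dropped;
}
```

Stopping only at a keyframe is what keeps the post-drop picture decodable; dropping up to an arbitrary pts would leave the decoder without a reference frame.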
static void control_video_queue_duration(FFPlayer *ffp, VideoState *is) {
int time_base_valid = 0;
int64_t cached_duration = -1;
int nb_packets = 0;
int64_t duration = 0;
int64_t drop_to_pts = 0;
//Lock
SDL_LockMutex(is->videoq.mutex);
time_base_valid = is->video_st->time_base.den > 0 && is->video_st->time_base.num > 0;
nb_packets = is->videoq.nb_packets;
// TOFIX: if time_base_valid is false, compute the duration from nb_packets and the frame rate
// Why not use videoq.duration? Because I have seen videoq.duration stay at 0 the whole time, and likewise for audioq
if (time_base_valid) {
if (is->videoq.first_pkt && is->videoq.last_pkt) {
duration = is->videoq.last_pkt->pkt.pts - is->videoq.first_pkt->pkt.pts;
cached_duration = duration * av_q2d(is->video_st->time_base) * 1000;
}
}
if (cached_duration > is->max_cached_duration) {
// drop
av_log(NULL, AV_LOG_INFO, "233 video cached_duration = %lld, nb_packets = %d.\n", cached_duration, nb_packets);
drop_to_pts = is->videoq.last_pkt->pkt.pts - (duration / 2); // Drop half here; adjust to taste based on the configured max_cached_duration
drop_queue_until_pts(&is->videoq, drop_to_pts);
}
//Unlock
SDL_UnlockMutex(is->videoq.mutex);
}
static void control_audio_queue_duration(FFPlayer *ffp, VideoState *is) {
int time_base_valid = 0;
int64_t cached_duration = -1;
int nb_packets = 0;
int64_t duration = 0;
int64_t drop_to_pts = 0;
//Lock
SDL_LockMutex(is->audioq.mutex);
time_base_valid = is->audio_st->time_base.den > 0 && is->audio_st->time_base.num > 0;
nb_packets = is->audioq.nb_packets;
// TOFIX: if time_base_valid is false, compute the duration from nb_packets and the sample rate
if (time_base_valid) {
if (is->audioq.first_pkt && is->audioq.last_pkt) {
duration = is->audioq.last_pkt->pkt.pts - is->audioq.first_pkt->pkt.pts;
cached_duration = duration * av_q2d(is->audio_st->time_base) * 1000;
}
}
if (cached_duration > is->max_cached_duration) {
// drop
av_log(NULL, AV_LOG_INFO, "233 audio cached_duration = %lld, nb_packets = %d.\n", cached_duration, nb_packets);
drop_to_pts = is->audioq.last_pkt->pkt.pts - (duration / 2);
drop_queue_until_pts(&is->audioq, drop_to_pts);
}
//Unlock
SDL_UnlockMutex(is->audioq.mutex);
}
static void control_queue_duration(FFPlayer *ffp, VideoState *is) {
if (is->max_cached_duration <= 0) {
return;
}
if (is->audio_st) {
return control_audio_queue_duration(ffp, is);
}
if (is->video_st) {
return control_video_queue_duration(ffp, is);
}
}
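Note that the dispatch above only ever trims one queue: audio wins whenever an audio stream exists, because under the default AV_SYNC_AUDIO_MASTER the video clock chases the audio clock and videoq drains on its own. The decision can be sketched as (hypothetical names, not ijkplayer code):

```c
#include <assert.h>

/* Which queue control_queue_duration would trim. Audio has priority:
 * with AV_SYNC_AUDIO_MASTER, dropping audio makes video catch up. */
enum DropTarget { DROP_NONE, DROP_AUDIO, DROP_VIDEO };

static enum DropTarget pick_drop_target(int max_cached_duration,
                                        int has_audio, int has_video)
{
    if (max_cached_duration <= 0)
        return DROP_NONE;          /* feature disabled (on-demand) */
    if (has_audio)
        return DROP_AUDIO;
    if (has_video)
        return DROP_VIDEO;
    return DROP_NONE;
}
```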
In the read_thread loop in ff_ffplay.c, after every av_read_frame, check whether the cached queues have reached the maximum duration. The original realtime flag has to be set to 0 here:
...
// Set the original realtime flag to 0, and read max_cached_duration from the externally supplied player options
// is->realtime = is_realtime(ic);
is->realtime = 0;
AVDictionaryEntry *e = av_dict_get(ffp->player_opts, "max_cached_duration", NULL, 0);
if (e) {
int max_cached_duration = atoi(e->value);
if (max_cached_duration <= 0) {
is->max_cached_duration = 0;
} else {
is->max_cached_duration = max_cached_duration;
}
} else {
is->max_cached_duration = 0;
}
if (true || ffp->show_status)
av_dump_format(ic, 0, is->filename, 0);
...
...
// Check after every packet read
// TODO: optimize so this does not run on every packet
if (is->max_cached_duration > 0) {
control_queue_duration(ffp, is);
}
if (pkt->stream_index == is->audio_stream && pkt_in_play_range) {
packet_queue_put(&is->audioq, pkt);
} else if (pkt->stream_index == is->video_stream && pkt_in_play_range
&& !(is->video_st && (is->video_st->disposition & AV_DISPOSITION_ATTACHED_PIC))) {
packet_queue_put(&is->videoq, pkt);
#ifdef FFP_MERGE
} else if (pkt->stream_index == is->subtitle_stream && pkt_in_play_range) {
packet_queue_put(&is->subtitleq, pkt);
#endif
} else {
av_packet_unref(pkt);
}
...
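The option parsing above clamps missing or non-positive values of max_cached_duration to 0, i.e. on-demand mode with the feature disabled. As a standalone sketch (hypothetical helper name):

```c
#include <assert.h>
#include <stdlib.h>

/* Parse a max_cached_duration option string (milliseconds).
 * NULL, non-numeric, or non-positive values disable the feature (0). */
static int parse_max_cached_duration(const char *value)
{
    int v;
    if (!value)
        return 0;
    v = atoi(value);   /* atoi returns 0 on non-numeric input */
    return v > 0 ? v : 0;
}
```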
...
Sample usage on iOS:
IJKFFOptions *options = [IJKFFOptions optionsByDefault];
// Set param
[options setFormatOptionIntValue:1024 * 16 forKey:@"probesize"];
[options setFormatOptionIntValue:50000 forKey:@"analyzeduration"];
[options setPlayerOptionIntValue:0 forKey:@"videotoolbox"];
[options setCodecOptionIntValue:IJK_AVDISCARD_DEFAULT forKey:@"skip_loop_filter"];
[options setCodecOptionIntValue:IJK_AVDISCARD_DEFAULT forKey:@"skip_frame"];
if (_isLive) {
// Param for living
[options setPlayerOptionIntValue:3000 forKey:@"max_cached_duration"]; // Maximum cached duration is 3 s; adjust to your needs
[options setPlayerOptionIntValue:1 forKey:@"infbuf"]; // Unlimited read-ahead
[options setPlayerOptionIntValue:0 forKey:@"packet-buffering"]; // Disable the player's packet buffering
} else {
// Param for playback
[options setPlayerOptionIntValue:0 forKey:@"max_cached_duration"];
[options setPlayerOptionIntValue:0 forKey:@"infbuf"];
[options setPlayerOptionIntValue:1 forKey:@"packet-buffering"];
}
Sample usage on Android:
ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_FORMAT, "probesize", 1024 * 16);
ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_FORMAT, "analyzeduration", 50000);
ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_CODEC, "skip_loop_filter", 0);
ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_CODEC, "skip_frame", 0);
if (mIsLive) {
// Param for living
ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "max_cached_duration", 3000);
ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "infbuf", 1);
ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "packet-buffering", 0);
} else {
// Param for playback
ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "max_cached_duration", 0);
ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "infbuf", 0);
ijkMediaPlayer.setOption(IjkMediaPlayer.OPT_CATEGORY_PLAYER, "packet-buffering", 1);
}