Unable to do low-level decoding of video on Android 4.2 without using media extractor

Posted: 2013-05-24 07:54:36

Question:

I want to decode video frames without using an extractor. As a small test, I still create a MediaExtractor, but instead of using extractor.readSampleData() to copy the bitstream into the input buffer, I use an FFmpeg parser in JNI: it copies one encoded video frame into a byte array, which I then put into the codec's input buffer and queue.

But when I call decoder.dequeueOutputBuffer(info, 10000), it keeps returning MediaCodec.INFO_TRY_AGAIN_LATER. If I use extractor.readSampleData() instead, it works fine.
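One thing worth checking at this point (an assumption on my part, not something stated in the question): av_read_frame() on an MP4 returns H.264 packets in AVCC framing, where every NAL unit is prefixed by its length, while an Android "video/avc" decoder typically expects Annex-B framing, where NAL units begin with a 00 00 00 01 (or 00 00 01) start code. A quick diagnostic sketch for telling the two apart:

```java
// Heuristic check of H.264 packet framing. MP4 demuxers (including FFmpeg's
// av_read_frame on .mp4 input) normally deliver AVCC, i.e. length-prefixed
// NAL units, whereas Annex-B packets begin with a start code. This is a
// diagnostic sketch, not part of the original code.
class NalFraming {
    static boolean looksAnnexB(byte[] pkt) {
        if (pkt == null || pkt.length < 4) {
            return false;
        }
        boolean threeByte = pkt[0] == 0 && pkt[1] == 0 && pkt[2] == 1;
        boolean fourByte = pkt[0] == 0 && pkt[1] == 0 && pkt[2] == 0 && pkt[3] == 1;
        return threeByte || fourByte;
    }
}
```

Logging this for the first few packets coming out of the JNI call should show whether the decoder is being fed length-prefixed data it cannot parse.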

Java side:

import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;

import android.app.Activity;
import android.media.MediaCodec;
import android.media.MediaCodec.BufferInfo;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.os.Bundle;
import android.os.Environment;
import android.util.Log;
import android.view.Surface;
import android.view.SurfaceHolder;
import android.view.SurfaceView;

public class VideoBrowser extends Activity implements SurfaceHolder.Callback {
    private static final String SAMPLE = Environment.getExternalStorageDirectory() + "/obama.mp4";
    private PlayerThread mPlayer = null;
    private static native int AVinitializecntxt(String url, int[] arr);
    private native int AVREADVIDEO(byte[] array);
    public int FLAG = 0;
    public int jk = 0;
    File f1;
    FileOutputStream f;

    static {
        Log.i("ABCD", "BEFORE");
        System.loadLibrary("ffmpeg");
        System.loadLibrary("ffmpeg-test-jni");
        Log.i("ABCD", "Success");
    }

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        SurfaceView sv = new SurfaceView(this);
        sv.getHolder().addCallback(this);
        setContentView(sv);
        int val;
        int[] array = new int[6];
        int END_OF_FILE = 0;
        int aud_stream = 0;
        int vid_stream = 0;
        String urlString = "/mnt/sdcard/obama.mp4";
        f1 = new File("/mnt/sdcard/t.h264");
        try {
            f = new FileOutputStream(f1);
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
        // This is where I call the function to initialize FFmpeg inside JNI
        val = AVinitializecntxt(urlString, array);
        FLAG = val;
        Log.i("ABCD", "FLAG : " + FLAG + val);
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
        if (mPlayer == null) {
            mPlayer = new PlayerThread(holder.getSurface());
            mPlayer.start();
        }
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        if (mPlayer != null) {
            mPlayer.interrupt();
        }
    }

    private class PlayerThread extends Thread {
        private MediaExtractor extractor;
        private MediaCodec decoder;
        private Surface surface;
        // private VideoPlayer VideoPlayerAPIInterfaceClass = new VideoPlayer();

        public PlayerThread(Surface surface) {
            this.surface = surface;
        }

        @Override
        public void run() {
            if (FLAG != 1) {
                return;
            }
            extractor = new MediaExtractor();
            try {
                extractor.setDataSource(SAMPLE);
            } catch (IOException e) {
                e.printStackTrace();
                return;
            }
            for (int i = 0; i < extractor.getTrackCount(); i++) {
                MediaFormat format = extractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                if (mime.startsWith("video/")) {
                    extractor.selectTrack(i);
                    decoder = MediaCodec.createDecoderByType("video/avc");
                    // Log.i("ABCD", "MIME : " + mime);
                    decoder.configure(format, surface, null, 0);
                    break;
                }
            }

            if (decoder == null) {
                Log.e("DecodeActivity", "Can't find video info!");
                return;
            }

            decoder.start();

            ByteBuffer[] inputBuffers = decoder.getInputBuffers();
            ByteBuffer[] outputBuffers = decoder.getOutputBuffers();
            BufferInfo info = new BufferInfo();
            boolean isEOS = false;
            long startMs = System.currentTimeMillis();
            int outIndex1 = -1;

            // Probe loop: with the FFmpeg-fed input this never exits, because
            // dequeueOutputBuffer keeps returning INFO_TRY_AGAIN_LATER (-1)
            while (outIndex1 < 0) {
                outIndex1 = decoder.dequeueOutputBuffer(info, 10000);
                Log.i("ABCD", "etgeuieoy");
            }

            while (!Thread.interrupted()) {
                if (!isEOS) {
                    int inIndex = decoder.dequeueInputBuffer(10000);
                    if (inIndex >= 0) {
                        ByteBuffer buffer = inputBuffers[inIndex];
                        // int sampleSize = extractor.readSampleData(buffer, 0);
                        byte[] bytes = new byte[buffer.capacity()];
                        // JNI call that memcpys one encoded packet into bytes
                        int sampleSize = AVREADVIDEO(bytes);
                        if (sampleSize < 0) {
                            // We shouldn't stop the playback at this point, just pass
                            // the EOS flag to the decoder; we will get it again from
                            // dequeueOutputBuffer
                            decoder.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                            isEOS = true;
                        } else {
                            buffer.clear();
                            buffer.put(bytes, 0, sampleSize);
                            decoder.queueInputBuffer(inIndex, 0, sampleSize, 0, 0);
                            // extractor.advance(); // leftover from the extractor path; no samples are read from it here
                        }
                    }
                }

                int outIndex = decoder.dequeueOutputBuffer(info, 10000);
                switch (outIndex) {
                    case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                        Log.d("DecodeActivity", "INFO_OUTPUT_BUFFERS_CHANGED");
                        outputBuffers = decoder.getOutputBuffers();
                        break;
                    case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                        Log.d("DecodeActivity", "New format " + decoder.getOutputFormat());
                        break;
                    case MediaCodec.INFO_TRY_AGAIN_LATER:
                        Log.d("DecodeActivity", "dequeueOutputBuffer timed out!");
                        break;
                    default:
                        ByteBuffer buffer = outputBuffers[outIndex];
                        Log.v("DecodeActivity", "We can't use this buffer but render it due to the API limit, " + buffer);
                        // We use a very simple clock to keep the video FPS, or the
                        // playback will be too fast
                        while (info.presentationTimeUs / 1000 > System.currentTimeMillis() - startMs) {
                            try {
                                sleep(10);
                            } catch (InterruptedException e) {
                                e.printStackTrace();
                                break;
                            }
                        }
                        // Log.i("ABCD", "RELEASING OUTPUT BUFFER");
                        decoder.releaseOutputBuffer(outIndex, true);
                        break;
                }

                // All decoded frames have been rendered, we can stop playing now
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    Log.d("DecodeActivity", "OutputBuffer BUFFER_FLAG_END_OF_STREAM");
                    break;
                }
            }

            decoder.stop();
            decoder.release();
            extractor.release();
        }
    }
}

JNI side:

JNIEXPORT jint JNICALL
Java_com_alldigital_videoplayer_VideoBrowser_AVREADVIDEO(JNIEnv *pEnv,
        jobject pObj, jbyteArray array) {

    AV_ctxt *avctxt = &aud_vid_ctxt;
    jbyte *buf = (*pEnv)->GetByteArrayElements(pEnv, array, NULL);
    if (buf == NULL) {
        LOGERR(10, "AVVIDEOREAD", "Bytes null");
    }

    AVPacket *packet = av_malloc(sizeof(AVPacket));
    av_init_packet(packet);
    int avread_res = av_read_frame(avctxt->gFormatCtx, packet);
    int size = packet->size;
    if (avread_res >= 0) {
        if (packet->stream_index == avctxt->gVideoStreamIndex) {
            // Note: for non-video packets the copy is skipped, yet their
            // size is still returned to Java below
            if (NULL == memcpy(buf, (char *) packet->data, packet->size)) {
                LOGERR(10, "AV_VIDEO_DECODE", "memcpy for video buffer failed");
            }
        }
        (*pEnv)->ReleaseByteArrayElements(pEnv, array, buf, 0);
        av_free_packet(packet);
        packet = NULL;
        return size;
    }

    // av_read_frame failed (end of file or error): release and signal EOS
    (*pEnv)->ReleaseByteArrayElements(pEnv, array, buf, 0);
    av_free_packet(packet);
    return -1;
}

Even though I copy each frame's encoded data via FFmpeg without calling the extractor, I still hit this output-buffer timeout. Why?
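A likely explanation (my reading, not confirmed in the thread): packets coming straight from av_read_frame on an MP4 keep their AVCC length prefixes, the decoder cannot parse them, so no output ever becomes ready and dequeueOutputBuffer keeps timing out. A minimal sketch of rewriting the length prefixes to Annex-B start codes before queueing, assuming 4-byte prefixes (the actual prefix size is recorded in the file's avcC box):

```java
import java.io.ByteArrayOutputStream;

// Convert one AVCC (length-prefixed) H.264 packet, as av_read_frame
// produces for MP4 input, into Annex-B framing with 00 00 00 01 start
// codes. Assumes 4-byte NAL length prefixes; a sketch, not a drop-in fix.
class AvccToAnnexB {
    static byte[] convert(byte[] avcc) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        int pos = 0;
        while (pos + 4 <= avcc.length) {
            // read the 4-byte big-endian NAL length
            int nalLen = ((avcc[pos] & 0xFF) << 24)
                    | ((avcc[pos + 1] & 0xFF) << 16)
                    | ((avcc[pos + 2] & 0xFF) << 8)
                    | (avcc[pos + 3] & 0xFF);
            pos += 4;
            if (nalLen < 0 || pos + nalLen > avcc.length) {
                break; // malformed packet; stop rather than overrun
            }
            // emit a start code followed by the NAL payload
            out.write(0);
            out.write(0);
            out.write(0);
            out.write(1);
            out.write(avcc, pos, nalLen);
            pos += nalLen;
        }
        return out.toByteArray();
    }
}
```

Each converted packet would replace `bytes` before the `buffer.put(...)` call in the Java loop above. SPS/PPS headers must still reach the decoder, either in-band before the first frame or via the format's "csd-0"/"csd-1" buffers.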

Comments:

Hi, I tried setting "csd-0" on the format before configure, but it still didn't work. sourcey.com/ffmpeg-avpacket-to-opencv-mat-converter

Answer 1:
try {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    FileInputStream fis = new FileInputStream(new File("ur file path"));

    byte[] buf = new byte[1024];
    int n;
    while (-1 != (n = fis.read(buf))) {
        baos.write(buf, 0, n);
    }

    byte[] videoBytes = baos.toByteArray();

    // use this videoBytes, which is the raw byte content of the original video
} catch (Exception e) {
    e.printStackTrace();
}
