Android camera2 output to ImageReader format YUV_420_888 still slow

Posted: 2017-07-30 00:22:47

I am trying to run android camera2 in a background service and then process the frames in the ImageReader.OnImageAvailableListener callback. I already use the suggested raw format YUV_420_888 to get maximum fps, but I only get about 7 fps at 640x480 resolution. This is even slower than what I got with the old Camera interface (I wanted to upgrade to Camera2 for higher fps) or with OpenCV's JavaCameraView (which I can't use because I need to run the processing in a background service).

Below is my service class. What am I missing?

My phone is a Redmi Note 3 running Android 5.0.2.

public class Camera2ServiceYUV extends Service {
    protected static final String TAG = "VideoProcessing";
    protected static final int CAMERACHOICE = CameraCharacteristics.LENS_FACING_BACK;
    protected CameraDevice cameraDevice;
    protected CameraCaptureSession captureSession;
    protected ImageReader imageReader;

    // A semaphore to prevent the app from exiting before closing the camera.
    private Semaphore mCameraOpenCloseLock = new Semaphore(1);


    public static final String RESULT_RECEIVER = "resultReceiver";
    private static final int JPEG_COMPRESSION = 90;

    public static final int RESULT_OK = 0;
    public static final int RESULT_DEVICE_NO_CAMERA= 1;
    public static final int RESULT_GET_CAMERA_FAILED = 2;
    public static final int RESULT_ALREADY_RUNNING = 3;
    public static final int RESULT_NOT_RUNNING = 4;

    private static final String START_SERVICE_COMMAND = "startServiceCommands";
    private static final int COMMAND_NONE = -1;
    private static final int COMMAND_START = 0;
    private static final int COMMAND_STOP = 1;

    private boolean mRunning = false;
    public Camera2ServiceYUV() {
    }

    public static void startToStart(Context context, ResultReceiver resultReceiver) {
        Intent intent = new Intent(context, Camera2ServiceYUV.class);
        intent.putExtra(START_SERVICE_COMMAND, COMMAND_START);
        intent.putExtra(RESULT_RECEIVER, resultReceiver);
        context.startService(intent);
    }

    public static void startToStop(Context context, ResultReceiver resultReceiver) {
        Intent intent = new Intent(context, Camera2ServiceYUV.class);
        intent.putExtra(START_SERVICE_COMMAND, COMMAND_STOP);
        intent.putExtra(RESULT_RECEIVER, resultReceiver);
        context.startService(intent);
    }

    // SERVICE INTERFACE
    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        switch (intent.getIntExtra(START_SERVICE_COMMAND, COMMAND_NONE)) {
            case COMMAND_START:
                startCamera(intent);
                break;
            case COMMAND_STOP:
                stopCamera(intent);
                break;
            default:
                throw new UnsupportedOperationException("Cannot start the camera service with an illegal command.");
        }

        return START_STICKY;
    }

    @Override
    public void onDestroy() {
        try {
            captureSession.abortCaptures();
        } catch (CameraAccessException e) {
            Log.e(TAG, e.getMessage());
        }
        captureSession.close();
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }


    // CAMERA2 INTERFACE
    /**
     * 1. The android CameraManager class is used to manage all the camera devices in our android device
     * Each camera device has a range of properties and settings that describe the device.
     * It can be obtained through the camera characteristics.
     */
    public void startCamera(Intent intent) {

        final ResultReceiver resultReceiver = intent.getParcelableExtra(RESULT_RECEIVER);

        if (mRunning) {
            resultReceiver.send(RESULT_ALREADY_RUNNING, null);
            return;
        }
        mRunning = true;

        CameraManager manager = (CameraManager) getSystemService(CAMERA_SERVICE);
        try {
            if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
                throw new RuntimeException("Time out waiting to lock camera opening.");
            }
            String pickedCamera = getCamera(manager);
            Log.e(TAG, "Picked camera = " + pickedCamera);
            manager.openCamera(pickedCamera, cameraStateCallback, null);
            CameraCharacteristics characteristics = manager.getCameraCharacteristics(pickedCamera);
            Size[] yuvSizes = null;
            if (characteristics != null) {
                yuvSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.YUV_420_888);
            }
            int width = 640;
            int height = 480;
//            if (yuvSizes != null && 0 < yuvSizes.length) {
//                width = yuvSizes[yuvSizes.length - 1].getWidth();
//                height = yuvSizes[yuvSizes.length - 1].getHeight();
//            }
//            for (Size s : yuvSizes) {
//                Log.e(TAG, "Size = " + s.toString());
//            }

            // DEBUG
            StreamConfigurationMap map = characteristics.get(
                    CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
            if (map == null) {
                return;
            }
            Log.e(TAG, "Width = " + width + ", Height = " + height);
            Log.e(TAG, "Output stall duration = " + map.getOutputStallDuration(ImageFormat.YUV_420_888, new Size(width, height)));
            Log.e(TAG, "Min output frame duration = " + map.getOutputMinFrameDuration(ImageFormat.YUV_420_888, new Size(width, height)));

//            Size[] sizeList = map.getInputSizes(ImageFormat.YUV_420_888);
//            for (Size s : sizeList) {
//                Log.e(TAG, "Size = " + s.toString());
//            }

            imageReader = ImageReader.newInstance(width, height, ImageFormat.YUV_420_888, 2 /* images buffered */);
            imageReader.setOnImageAvailableListener(onImageAvailableListener, null);
            Log.i(TAG, "imageReader created");
        } catch (CameraAccessException e) {
            Log.e(TAG, e.getMessage());
            resultReceiver.send(RESULT_DEVICE_NO_CAMERA, null);
        } catch (InterruptedException e) {
            resultReceiver.send(RESULT_GET_CAMERA_FAILED, null);
            throw new RuntimeException("Interrupted while trying to lock camera opening.", e);
        } catch (SecurityException se) {
            resultReceiver.send(RESULT_GET_CAMERA_FAILED, null);
            throw new RuntimeException("Security permission exception while trying to open the camera.", se);
        }

        resultReceiver.send(RESULT_OK, null);
    }

    // We can pick the camera being used, i.e. rear camera in this case.
    private String getCamera(CameraManager manager) {
        try {
            for (String cameraId : manager.getCameraIdList()) {
                CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
                int cOrientation = characteristics.get(CameraCharacteristics.LENS_FACING);
                if (cOrientation == CAMERACHOICE) {
                    return cameraId;
                }
            }
        } catch (CameraAccessException e) {
            e.printStackTrace();
        }
        return null;
    }


    /**
     * 1.1 Callbacks when the camera changes its state - opened, disconnected, or error.
     */
    protected CameraDevice.StateCallback cameraStateCallback = new CameraDevice.StateCallback() {
        @Override
        public void onOpened(@NonNull CameraDevice camera) {
            Log.i(TAG, "CameraDevice.StateCallback onOpened");
            mCameraOpenCloseLock.release();
            cameraDevice = camera;
            createCaptureSession();
        }

        @Override
        public void onDisconnected(@NonNull CameraDevice camera) {
            Log.w(TAG, "CameraDevice.StateCallback onDisconnected");
            mCameraOpenCloseLock.release();
            camera.close();
            cameraDevice = null;
        }

        @Override
        public void onError(@NonNull CameraDevice camera, int error) {
            Log.e(TAG, "CameraDevice.StateCallback onError " + error);
            mCameraOpenCloseLock.release();
            camera.close();
            cameraDevice = null;
        }
    };


    /**
     * 2. To capture or stream images from a camera device, the application must first create
     * a camera capture captureSession.
     * The camera capture needs a surface to output what has been captured, in this case
     * we use ImageReader in order to access the frame data.
     */
    public void createCaptureSession() {
        try {
            cameraDevice.createCaptureSession(Arrays.asList(imageReader.getSurface()), sessionStateCallback, null);
        } catch (CameraAccessException e) {
            Log.e(TAG, e.getMessage());
        }
    }

    protected CameraCaptureSession.StateCallback sessionStateCallback = new CameraCaptureSession.StateCallback() {
        @Override
        public void onConfigured(@NonNull CameraCaptureSession session) {
            Log.i(TAG, "CameraCaptureSession.StateCallback onConfigured");

            // The camera is already closed
            if (null == cameraDevice) {
                return;
            }

            // When the captureSession is ready, we start to grab frames.
            Camera2ServiceYUV.this.captureSession = session;

            try {
                session.setRepeatingRequest(createCaptureRequest(), null, null);
            } catch (CameraAccessException e) {
                Log.e(TAG, e.getMessage());
            }
        }

        @Override
        public void onConfigureFailed(@NonNull CameraCaptureSession session) {
            Log.e(TAG, "CameraCaptureSession.StateCallback onConfigureFailed");
        }
    };

    /**
     * 3. The application then needs to construct a CaptureRequest, which defines all the capture parameters
     *    needed by a camera device to capture a single image.
     */
    private CaptureRequest createCaptureRequest() {
        try {
            /**
             * Check other templates for further details.
             * TEMPLATE_MANUAL = 6
             * TEMPLATE_PREVIEW = 1
             * TEMPLATE_RECORD = 3
             * TEMPLATE_STILL_CAPTURE = 2
             * TEMPLATE_VIDEO_SNAPSHOT = 4
             * TEMPLATE_ZERO_SHUTTER_LAG = 5
             *
             * TODO: can set camera features like auto focus, auto flash here
             * captureRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE,CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
             */
            CaptureRequest.Builder captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_RECORD);
//            captureRequestBuilder.set(CaptureRequest.EDGE_MODE,
//                    CaptureRequest.EDGE_MODE_OFF);
//            captureRequestBuilder.set(
//                    CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE,
//                    CaptureRequest.LENS_OPTICAL_STABILIZATION_MODE_ON);
//            captureRequestBuilder.set(
//                    CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE,
//                    CaptureRequest.COLOR_CORRECTION_ABERRATION_MODE_OFF);
//            captureRequestBuilder.set(CaptureRequest.NOISE_REDUCTION_MODE,
//                    CaptureRequest.NOISE_REDUCTION_MODE_OFF);
//            captureRequestBuilder.set(CaptureRequest.CONTROL_AF_TRIGGER,
//                    CaptureRequest.CONTROL_AF_TRIGGER_CANCEL);
//
//            captureRequestBuilder.set(CaptureRequest.CONTROL_AE_LOCK, true);
//            captureRequestBuilder.set(CaptureRequest.CONTROL_AWB_LOCK, true);

            captureRequestBuilder.addTarget(imageReader.getSurface());
            return captureRequestBuilder.build();
        } catch (CameraAccessException e) {
            Log.e(TAG, e.getMessage());
            return null;
        }
    }


    /**
     * ImageReader provides a surface for the camera to output what has been captured.
     * Upon the image available, call processImage() to process the image as desired.
     */
    private long frameTime = 0;
    private ImageReader.OnImageAvailableListener onImageAvailableListener = new ImageReader.OnImageAvailableListener() {
        @Override
        public void onImageAvailable(ImageReader reader) {
            Log.i(TAG, "called ImageReader.OnImageAvailable");
            Image img = reader.acquireLatestImage();
            if (img != null) {
                if (frameTime != 0) {
                    Log.e(TAG, "fps = " + (float) (1000.0 / (SystemClock.elapsedRealtime() - frameTime)) + " fps");
                }
                frameTime = SystemClock.elapsedRealtime();
                img.close();
            }
        }
    };

    private void processImage(Image image) {
        Mat outputImage = imageToMat(image);
        Bitmap bmp = Bitmap.createBitmap(outputImage.cols(), outputImage.rows(), Bitmap.Config.ARGB_8888);
        Utils.bitmapToMat(bmp, outputImage);
        Point mid = new Point(0, 0);
        Point inEnd = new Point(outputImage.cols(), outputImage.rows());
        Imgproc.line(outputImage, mid, inEnd, new Scalar(255, 0, 0), 2, Core.LINE_AA, 0);
        Utils.matToBitmap(outputImage, bmp);

        Intent broadcast = new Intent();
        broadcast.setAction("your_load_photo_action");
        broadcast.putExtra("BitmapImage", bmp);
        sendBroadcast(broadcast);
    }

    private Mat imageToMat(Image image) {
        ByteBuffer buffer;
        int rowStride;
        int pixelStride;
        int width = image.getWidth();
        int height = image.getHeight();
        int offset = 0;

        Image.Plane[] planes = image.getPlanes();
        byte[] data = new byte[image.getWidth() * image.getHeight() * ImageFormat.getBitsPerPixel(ImageFormat.YUV_420_888) / 8];
        byte[] rowData = new byte[planes[0].getRowStride()];

        for (int i = 0; i < planes.length; i++) {
            buffer = planes[i].getBuffer();
            rowStride = planes[i].getRowStride();
            pixelStride = planes[i].getPixelStride();
            int w = (i == 0) ? width : width / 2;
            int h = (i == 0) ? height : height / 2;
            for (int row = 0; row < h; row++) {
                int bytesPerPixel = ImageFormat.getBitsPerPixel(ImageFormat.YUV_420_888) / 8;
                if (pixelStride == bytesPerPixel) {
                    int length = w * bytesPerPixel;
                    buffer.get(data, offset, length);

                    // Advance the buffer by the remainder of the row stride, unless on the last row.
                    // Otherwise, this will throw an IllegalArgumentException because the buffer
                    // doesn't include the final padding.
                    if (h - row != 1) {
                        buffer.position(buffer.position() + rowStride - length);
                    }
                    offset += length;
                } else {
                    // On the last row only read the width of the image minus the pixel stride
                    // plus one. Otherwise, this will throw a BufferUnderflowException because the
                    // buffer doesn't include the final padding.
                    if (h - row == 1) {
                        buffer.get(rowData, 0, width - pixelStride + 1);
                    } else {
                        buffer.get(rowData, 0, rowStride);
                    }

                    for (int col = 0; col < w; col++) {
                        data[offset++] = rowData[col * pixelStride];
                    }
                }
            }
        }

        // Finally, create the Mat.
        Mat mat = new Mat(height + height / 2, width, CV_8UC1);
        mat.put(0, 0, data);

        return mat;
    }


    private void stopCamera(Intent intent) {
        ResultReceiver resultReceiver = intent.getParcelableExtra(RESULT_RECEIVER);

        if (!mRunning) {
            resultReceiver.send(RESULT_NOT_RUNNING, null);
            return;
        }

        closeCamera();

        resultReceiver.send(RESULT_OK, null);

        mRunning = false;
        Log.d(TAG, "Service is finished.");
    }

    /**
     * Closes the current {@link CameraDevice}.
     */
    private void closeCamera() {
        try {
            mCameraOpenCloseLock.acquire();
            if (null != captureSession) {
                captureSession.close();
                captureSession = null;
            }
            if (null != cameraDevice) {
                cameraDevice.close();
                cameraDevice = null;
            }
            if (null != imageReader) {
                imageReader.close();
                imageReader = null;
            }
        } catch (InterruptedException e) {
            throw new RuntimeException("Interrupted while trying to lock camera closing.", e);
        } finally {
            mCameraOpenCloseLock.release();
        }
    }
}
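For reference, the stride handling in imageToMat can be exercised off-device with plain ByteBuffers. This is a minimal sketch, not part of any Android API (the class and method names are made up for illustration); it copies one padded plane into a tightly packed array the same way the `pixelStride == bytesPerPixel` branch above does, including the quirk that the last row carries no trailing padding in the buffer:

```java
import java.nio.ByteBuffer;
import java.util.Arrays;

public class PlaneCopy {
    // Copy a (width x height) plane with row padding (rowStride >= width)
    // into a tightly packed byte[]. Mirrors the pixelStride == 1 branch of
    // imageToMat: skip the padding after every row except the last, because
    // the source buffer ends right after the final pixel.
    static byte[] copyPlane(ByteBuffer buffer, int width, int height, int rowStride) {
        byte[] out = new byte[width * height];
        int offset = 0;
        for (int row = 0; row < height; row++) {
            buffer.get(out, offset, width);
            if (row != height - 1) {
                buffer.position(buffer.position() + rowStride - width);
            }
            offset += width;
        }
        return out;
    }

    public static void main(String[] args) {
        int width = 4, height = 3, rowStride = 6; // 2 padding bytes per row
        // Buffer laid out row by row; the last row omits the padding,
        // just like Image.Plane buffers do.
        byte[] padded = {
                1, 2, 3, 4, 0, 0,
                5, 6, 7, 8, 0, 0,
                9, 10, 11, 12
        };
        byte[] tight = copyPlane(ByteBuffer.wrap(padded), width, height, rowStride);
        System.out.println(Arrays.toString(tight));
        // → [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]
    }
}
```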


Answer 1:

I ran into this problem recently when I tried to upgrade my AR app from the camera1 to the camera2 API. I used a mid-range device for testing (Meizu S6), with an Exynos 7872 CPU and a Mali-G71 GPU. What I wanted to achieve was a steady 30 fps AR experience. But through the migration I found it surprisingly tricky to get a decent preview frame rate with the Camera2 API.

I configured my capture request with TEMPLATE_PREVIEW:

mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);

Then I added two surfaces: one for preview, a SurfaceTexture of size 1280x720, and a second one, an ImageReader of size 1280x720, for image processing.

mImageReader = ImageReader.newInstance(
    mVideoSize.getWidth(),
    mVideoSize.getHeight(),
    ImageFormat.YUV_420_888,
    2);

List<Surface> surfaces = new ArrayList<>();
Surface previewSurface = new Surface(mSurfaceTexture);
surfaces.add(previewSurface);
mPreviewBuilder.addTarget(previewSurface);

Surface frameCaptureSurface = mImageReader.getSurface();
surfaces.add(frameCaptureSurface);
mPreviewBuilder.addTarget(frameCaptureSurface);

mPreviewBuilder.set(CaptureRequest.CONTROL_AF_MODE,
                    CameraMetadata.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
mPreviewSession.setRepeatingRequest(mPreviewBuilder.build(), captureCallback, mBackgroundHandler);

Everything works fine: my TextureView gets updated and the frame callback is invoked as well. Except that the frame rate is about 10 fps, and I haven't even done any image processing yet.

I tried many Camera2 API settings, including SENSOR_FRAME_DURATION and different ImageFormat and size combinations, but none of them improved the frame rate. However, if I simply remove the ImageReader from the output surfaces, the preview easily reaches 30 fps!

So I guess the problem is that adding an ImageReader as a Camera2 output surface drastically lowers the preview frame rate, at least in my case. So what is the solution?

My solution is glReadPixels.

I know glReadPixels is considered evil, because it copies bytes from the GPU to main memory and also forces OpenGL to flush pending draw commands, so for performance's sake we'd better avoid it. But surprisingly, glReadPixels is actually quite fast and delivers a much better frame rate than the ImageReader's YUV_420_888 output.

To reduce the memory overhead, I also make an extra draw call into a smaller framebuffer dedicated to feature detection (such as 360x640, instead of the 720p used for preview).
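As a rough illustration of why the smaller framebuffer helps, here is a back-of-the-envelope comparison (plain Java, nothing Android-specific; the class name is made up) of how many bytes a single RGBA8888 glReadPixels call has to copy at the two sizes mentioned above:

```java
public class ReadbackCost {
    // Bytes copied by one RGBA8888 glReadPixels call for a given framebuffer size.
    static long bytesPerFrame(int width, int height) {
        return (long) width * height * 4; // 4 bytes per RGBA pixel
    }

    public static void main(String[] args) {
        long preview = bytesPerFrame(1280, 720); // full 720p preview framebuffer
        long small = bytesPerFrame(360, 640);    // smaller feature-detection framebuffer

        System.out.println("1280x720 readback per frame: " + preview + " bytes"); // 3686400
        System.out.println("360x640 readback per frame:  " + small + " bytes");   // 921600
        // The smaller target cuts the copied bandwidth per frame by the same factor:
        System.out.println("ratio = " + (double) preview / small); // 4.0
    }
}
```

At 30 fps that is roughly 105 MB/s versus 26 MB/s of CPU-visible traffic, which is why the downscaled pass is worth the extra draw call.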


Answer 2:

This is based on the OpenCV library's camera2 implementation. I had the same problem, then I noticed this code in OpenCV's JavaCamera2View: you need to set up the CaptureRequest.Builder this way:

CaptureRequest.Builder captureBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
captureBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);

For me it changed the fps from 10 to around 28-30. It worked for me with two target surfaces: one for the preview TextureView, the second one for the ImageReader:

Surface readerSurface = imageReader.getSurface();
Surface surface = new Surface(surfaceTexture);
captureBuilder.addTarget(surface);
captureBuilder.addTarget(readerSurface);


Answer 3:

Can't comment (not enough rep), but I ran into the same problem on a Redmi 6.

If I preview the camera output with a TextureView I get about 30 fps, but replacing it with an ImageReader drops it to 8-9 fps. In both cases all camera configurations are identical.

Interestingly, when trying CameraXBasic it showed the same problem: slow updates from the camera. But android-Camera2Basic (which uses a TextureView) ran without any issues.

Update 1: tested with the preview size lowered from 1280x720 to 640x480 and, as expected, saw better performance.
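Picking a lower-resolution stream like this can be sketched in plain Java so it runs off-device (the `int[]{width, height}` pairs stand in for the `Size[]` array that `StreamConfigurationMap.getOutputSizes()` would return on Android; `chooseClosest` is a hypothetical helper name, not a framework API):

```java
import java.util.Arrays;
import java.util.Comparator;

public class SizeChooser {
    // Pick the supported size whose pixel count is closest to the target area,
    // e.g. to trade resolution for frame rate as described above.
    static int[] chooseClosest(int[][] supported, int targetW, int targetH) {
        long targetArea = (long) targetW * targetH;
        return Arrays.stream(supported)
                .min(Comparator.comparingLong(s -> Math.abs((long) s[0] * s[1] - targetArea)))
                .orElseThrow(() -> new IllegalArgumentException("no supported sizes"));
    }

    public static void main(String[] args) {
        int[][] supported = { {1920, 1080}, {1280, 720}, {640, 480}, {320, 240} };
        int[] best = chooseClosest(supported, 640, 480);
        System.out.println(best[0] + "x" + best[1]); // prints "640x480"
    }
}
```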


Answer 4:

Here is what I found after a bit of tweaking: the problem lies in the ImageReader's maxImages parameter. Changing it from 2 to values between 3 and 56 improved the fps a lot. My theory is that while an Image from ImageReader.OnImageAvailableListener is being processed, or hasn't been released yet, the ImageReader surface tends to block camera2 from writing new frames into its buffers. In other words, the camera wants to fill a buffer but there aren't enough free ones, so by increasing the ImageReader's maximum buffer count we give camera2 room to store its images.
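The effect described above can be illustrated with a toy simulation (plain Java, nothing Android-specific; the tick counts and release rate are made-up numbers): a producer that wants to emit one frame per tick must drop frames whenever every buffer is still held by a slow consumer, and a deeper buffer pool absorbs more of the consumer's latency before drops begin.

```java
public class BufferPoolSim {
    // Simulate a camera producing one frame per tick into a pool of maxImages
    // buffers, with a consumer that releases one held buffer every 3 ticks.
    static int droppedFrames(int maxImages, int ticks) {
        int inUse = 0;      // buffers currently held (queued or being processed)
        int dropped = 0;
        for (int t = 0; t < ticks; t++) {
            if (t % 3 == 2 && inUse > 0) {
                inUse--;    // consumer finishes with one buffer
            }
            if (inUse < maxImages) {
                inUse++;    // camera writes the new frame into a free buffer
            } else {
                dropped++;  // no free buffer: the frame is dropped (fps suffers)
            }
        }
        return dropped;
    }

    public static void main(String[] args) {
        System.out.println("maxImages=2: dropped " + droppedFrames(2, 30)); // 18
        System.out.println("maxImages=8: dropped " + droppedFrames(8, 30)); // 12
    }
}
```

Note the caveat the simulation also shows: once the pool is saturated, the steady-state drop rate is set by the consumer's speed, so a bigger maxImages mainly buys headroom; the callback still has to keep up (and to call Image.close() promptly) for a sustained frame-rate win.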

