Android Camera will not work. startPreview fails


Posted: 2011-12-18 01:08:56

I am getting these errors from LogCat:

10-30 00:31:51.494: D/CameraHal(1205): CameraHal setOverlay/1/00000000/00000000
10-30 00:31:51.494: E/CameraHal(1205): Trying to set overlay, but overlay is null!, line:3472
10-30 00:31:51.494: W/CameraService(1205): Overlay create failed - retrying
...
10-30 00:31:52.526: E/CameraService(1205): Overlay Creation Failed!
...
10-30 00:31:52.588: E/androidRuntime(5040): FATAL EXCEPTION: main
10-30 00:31:52.588: E/AndroidRuntime(5040): java.lang.RuntimeException: startPreview failed
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.hardware.Camera.startPreview(Native Method)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at com.matthewmitchell.nightcam.CameraSurfaceView.surfaceCreated(CameraSurfaceView.java:47)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.view.SurfaceView.updateWindow(SurfaceView.java:544)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.view.SurfaceView.dispatchDraw(SurfaceView.java:341)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.view.ViewGroup.drawChild(ViewGroup.java:1638)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.view.ViewGroup.dispatchDraw(ViewGroup.java:1367)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.view.View.draw(View.java:6743)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.widget.FrameLayout.draw(FrameLayout.java:352)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.view.ViewGroup.drawChild(ViewGroup.java:1640)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.view.ViewGroup.dispatchDraw(ViewGroup.java:1367)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.view.ViewGroup.drawChild(ViewGroup.java:1638)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.view.ViewGroup.dispatchDraw(ViewGroup.java:1367)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.view.View.draw(View.java:6743)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.widget.FrameLayout.draw(FrameLayout.java:352)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at com.android.internal.policy.impl.PhoneWindow$DecorView.draw(PhoneWindow.java:1876)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.view.ViewRoot.draw(ViewRoot.java:1407)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.view.ViewRoot.performTraversals(ViewRoot.java:1163)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.view.ViewRoot.handleMessage(ViewRoot.java:1727)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.os.Handler.dispatchMessage(Handler.java:99)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.os.Looper.loop(Looper.java:123)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at android.app.ActivityThread.main(ActivityThread.java:4627)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at java.lang.reflect.Method.invokeNative(Native Method)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at java.lang.reflect.Method.invoke(Method.java:521)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:868)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:626)
10-30 00:31:52.588: E/AndroidRuntime(5040):     at dalvik.system.NativeStart.main(Native Method)

Here is the Activity class:

public class NightCamActivity extends Activity {
    private GLSurfaceView mGLView;
    CameraSurfaceView surface_view;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Create a GLSurfaceView instance and set it
        // as the ContentView for this Activity
        Debug.out("Welcome");
        surface_view = new CameraSurfaceView(this);
        mGLView = new MySurfaceView(this);
        setContentView(mGLView);
        addContentView(surface_view, new LayoutParams(LayoutParams.FILL_PARENT, LayoutParams.FILL_PARENT));
    }

    @Override
    protected void onPause() {
        super.onPause();
        // The following call pauses the rendering thread.
        // If your OpenGL application is memory intensive,
        // you should consider de-allocating objects that
        // consume significant memory here.
        mGLView.onPause();
    }

    @Override
    protected void onResume() {
        super.onResume();
        // The following call resumes a paused rendering thread.
        // If you de-allocated graphic objects for onPause()
        // this is a good place to re-allocate them.
        mGLView.onResume();
    }
}

The MySurfaceView class:

class MySurfaceView extends GLSurfaceView {

    public MySurfaceView(NightCamActivity context) {
        super(context);
        // Create an OpenGL ES 2.0 context.
        Debug.out("Mysurfaceview welcome");
        setEGLContextClientVersion(2);
        // Set the Renderer for drawing on the GLSurfaceView
        MyRenderer renderer = new MyRenderer();
        renderer.takeContext(context);
        context.surface_view.renderer = renderer;
        setRenderer(renderer);
    }
}

The CameraSurfaceView class:

public class CameraSurfaceView extends SurfaceView implements SurfaceHolder.Callback, PreviewCallback {

    private Camera camera;
    Camera.Size use_size;
    MyRenderer renderer;

    public CameraSurfaceView(Context context) {
        super(context);
        SurfaceHolder holder = getHolder();
        holder.addCallback(this);
        Debug.out("Init CSV");
        camera = Camera.open();
    }

    public void surfaceCreated(SurfaceHolder holder) {
        Debug.out("SC");
        try {
            camera.setPreviewDisplay(holder);
        } catch (IOException e) {
            Debug.out("Could not set preview display for camera.");
        }
        camera.setPreviewCallback(this);
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        // Surface will be destroyed when we return, so stop the preview.
        // Because the CameraDevice object is not a shared resource, it's very
        // important to release it when the activity is paused.
        try {
            if (camera != null) {
                camera.stopPreview();
                camera.release();
            }
        } catch (Exception e) {
            Debug.out("Camera release failure.");
        }
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
        Camera.Parameters parameters = camera.getParameters();
        List<Camera.Size> supportedPreviewSizes = parameters.getSupportedPreviewSizes();
        Camera.Size optimalPreviewSize = getOptimalPreviewSize(supportedPreviewSizes, w, h);
        if (optimalPreviewSize != null) {
            parameters.setPreviewSize(optimalPreviewSize.width, optimalPreviewSize.height);
            camera.setParameters(parameters);
            camera.startPreview();
        }
    }

    static Camera.Size getOptimalPreviewSize(List<Camera.Size> sizes, int w, int h) {
        final double ASPECT_TOLERANCE = 0.1;
        final double MAX_DOWNSIZE = 1.5;

        double targetRatio = (double) w / h;
        if (sizes == null) return null;

        Camera.Size optimalSize = null;
        double minDiff = Double.MAX_VALUE;

        int targetHeight = h;

        // Try to find an size match aspect ratio and size
        for (Camera.Size size : sizes) {
            double ratio = (double) size.width / size.height;
            double downsize = (double) size.width / w;
            if (downsize > MAX_DOWNSIZE) {
                //if the preview is a lot larger than our display surface ignore it
                //reason - on some phones there is not enough heap available to show the larger preview sizes
                continue;
            }
            if (Math.abs(ratio - targetRatio) > ASPECT_TOLERANCE) continue;
            if (Math.abs(size.height - targetHeight) < minDiff) {
                optimalSize = size;
                minDiff = Math.abs(size.height - targetHeight);
            }
        }

        // Cannot find the one match the aspect ratio, ignore the requirement
        //keep the max_downsize requirement
        if (optimalSize == null) {
            minDiff = Double.MAX_VALUE;
            for (Camera.Size size : sizes) {
                double downsize = (double) size.width / w;
                if (downsize > MAX_DOWNSIZE) {
                    continue;
                }
                if (Math.abs(size.height - targetHeight) < minDiff) {
                    optimalSize = size;
                    minDiff = Math.abs(size.height - targetHeight);
                }
            }
        }
        //everything else failed, just take the closest match
        if (optimalSize == null) {
            minDiff = Double.MAX_VALUE;
            for (Camera.Size size : sizes) {
                if (Math.abs(size.height - targetHeight) < minDiff) {
                    optimalSize = size;
                    minDiff = Math.abs(size.height - targetHeight);
                }
            }
        }

        return optimalSize;
    }

    public void onPreviewFrame(byte[] data, Camera arg1) {
        Debug.out("PREVIEW FRAME:");
        byte[] pixels = new byte[use_size.width * use_size.height * 3];
        decodeYUV420SP(pixels, data, use_size.width, use_size.height);
        renderer.bindCameraTexture(pixels, use_size.width, use_size.height);
    }

    void decodeYUV420SP(byte[] rgb, byte[] yuv420sp, int width, int height) {

        final int frameSize = width * height;

        for (int j = 0, yp = 0; j < height; j++) {
            int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
            for (int i = 0; i < width; i++, yp++) {
                int y = (0xff & ((int) yuv420sp[yp])) - 16;
                if (y < 0) {
                    y = 0;
                }
                if ((i & 1) == 0) {
                    v = (0xff & yuv420sp[uvp++]) - 128;
                    u = (0xff & yuv420sp[uvp++]) - 128;
                }

                int y1192 = 1192 * y;
                int r = (y1192 + 1634 * v);
                int g = (y1192 - 833 * v - 400 * u);
                int b = (y1192 + 2066 * u);

                if (r < 0)
                    r = 0;
                else if (r > 262143)
                    r = 262143;
                if (g < 0)
                    g = 0;
                else if (g > 262143)
                    g = 262143;
                if (b < 0)
                    b = 0;
                else if (b > 262143)
                    b = 262143;

                rgb[yp*3] = (byte) (b << 6);
                rgb[yp*3 + 1] = (byte) (b >> 2);
                rgb[yp*3 + 2] = (byte) (b >> 10);
            }
        }
    }
}
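Note, as an aside, that the last three assignments in decodeYUV420SP above store the blue value into all three output bytes, so r and g are computed but never written. A standalone, plain-Java sketch of that routine with the channels written separately (no Android dependencies; class and method names here are illustrative, not from the question):

```java
// Converts one NV21 (YUV420SP) frame to packed RGB888. Plain-Java sketch of
// the routine in the question, with R, G and B each written to its own byte.
public class Yuv {
    static void decodeYUV420SP(byte[] rgb, byte[] yuv420sp, int width, int height) {
        final int frameSize = width * height;
        for (int j = 0, yp = 0; j < height; j++) {
            int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
            for (int i = 0; i < width; i++, yp++) {
                int y = (0xff & yuv420sp[yp]) - 16;
                if (y < 0) y = 0;
                if ((i & 1) == 0) {                 // one V,U pair per two pixels
                    v = (0xff & yuv420sp[uvp++]) - 128;
                    u = (0xff & yuv420sp[uvp++]) - 128;
                }
                int y1192 = 1192 * y;
                int r = y1192 + 1634 * v;
                int g = y1192 - 833 * v - 400 * u;
                int b = y1192 + 2066 * u;
                // clamp to the 18-bit intermediate range, then scale to 8 bits
                r = Math.max(0, Math.min(262143, r));
                g = Math.max(0, Math.min(262143, g));
                b = Math.max(0, Math.min(262143, b));
                rgb[yp * 3]     = (byte) (r >> 10);  // original stored b here
                rgb[yp * 3 + 1] = (byte) (g >> 10);  // ... and here
                rgb[yp * 3 + 2] = (byte) (b >> 10);
            }
        }
    }
}
```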


And finally, the MyRenderer class:

public class MyRenderer implements GLSurfaceView.Renderer {
    private FloatBuffer vertices;
    private FloatBuffer texcoords;
    private int mProgram;
    private int maPositionHandle;
    private int gvTexCoordHandle;
    private int gvSamplerHandle;
    private static Context context;
    int[] camera_texture;

    public void onSurfaceCreated(GL10 unused, EGLConfig config) {
        initShapes();
        GLES20.glClearColor(0.0f, 1.0f, 0.2f, 1.0f);
        Debug.out("Hello init.");
        //Shaders
        int vertexShader = 0;
        int fragmentShader = 0;
        try {
            vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, readFile("vertex.vsh"));
            fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, readFile("fragment.fsh"));
        } catch (IOException e) {
            Debug.out("The shaders could not be found.");
            e.printStackTrace();
        }
        mProgram = GLES20.glCreateProgram();             // create empty OpenGL Program
        GLES20.glAttachShader(mProgram, vertexShader);   // add the vertex shader to program
        GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
        GLES20.glLinkProgram(mProgram);                  // creates OpenGL program executables
        // get handles
        maPositionHandle = GLES20.glGetAttribLocation(mProgram, "vPosition");
        gvTexCoordHandle = GLES20.glGetAttribLocation(mProgram, "a_texCoord");
        gvSamplerHandle = GLES20.glGetAttribLocation(mProgram, "s_texture");
        GLES20.glPixelStorei(GLES20.GL_UNPACK_ALIGNMENT, 1);
        camera_texture = null;
    }

    private void initShapes() {
        float triangleCoords[] = {
            // X, Y, Z
            -1.0f, -1.0f, 0.0f,
             1.0f, -1.0f, 0.0f,
            -1.0f,  1.0f, 0.0f,
             1.0f,  1.0f, 0.0f,
        };
        float texcoordf[] = {
            // X, Y
            -1.0f, -1.0f,
             1.0f, -1.0f,
            -1.0f,  1.0f,
             1.0f,  1.0f,
        };

        // initialize vertex Buffer for vertices
        ByteBuffer vbb = ByteBuffer.allocateDirect(triangleCoords.length * 4);
        vbb.order(ByteOrder.nativeOrder()); // use the device hardware's native byte order
        vertices = vbb.asFloatBuffer();  // create a floating point buffer from the ByteBuffer
        vertices.put(triangleCoords);    // add the coordinates to the FloatBuffer
        vertices.position(0);            // set the buffer to read the first coordinate
        // initialize vertex Buffer for texcoords
        vbb = ByteBuffer.allocateDirect(texcoordf.length * 4);
        vbb.order(ByteOrder.nativeOrder()); // use the device hardware's native byte order
        texcoords = vbb.asFloatBuffer(); // create a floating point buffer from the ByteBuffer
        texcoords.put(texcoordf);        // add the coordinates to the FloatBuffer
        texcoords.position(0);           // set the buffer to read the first coordinate
    }

    private static String readFile(String path) throws IOException {
        AssetManager assetManager = context.getAssets();
        InputStream stream = assetManager.open(path);
        try {
            return new Scanner(stream).useDelimiter("\\A").next();
        } finally {
            stream.close();
        }
    }

    private int loadShader(int type, String shaderCode) {
        // create a vertex shader type (GLES20.GL_VERTEX_SHADER)
        // or a fragment shader type (GLES20.GL_FRAGMENT_SHADER)
        int shader = GLES20.glCreateShader(type);
        // add the source code to the shader and compile it
        GLES20.glShaderSource(shader, shaderCode);
        GLES20.glCompileShader(shader);
        return shader;
    }

    public void onDrawFrame(GL10 unused) {
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
        if (camera_texture == null) {
            return;
        }
        // Add program to OpenGL environment
        GLES20.glUseProgram(mProgram);
        // Prepare the triangle data
        GLES20.glVertexAttribPointer(maPositionHandle, 3, GLES20.GL_FLOAT, false, 0, vertices);
        GLES20.glVertexAttribPointer(gvTexCoordHandle, 2, GLES20.GL_FLOAT, false, 0, texcoords);
        GLES20.glEnableVertexAttribArray(maPositionHandle);
        GLES20.glEnableVertexAttribArray(gvTexCoordHandle);
        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, camera_texture[0]);
        GLES20.glUniform1i(gvSamplerHandle, 0);
        // Draw the triangle
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
        GLES20.glDisableVertexAttribArray(maPositionHandle);
        GLES20.glDisableVertexAttribArray(gvTexCoordHandle);
    }

    public void onSurfaceChanged(GL10 unused, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
    }

    public void takeContext(Context ocontext) {
        Debug.out("Take context");
        context = ocontext;
    }

    void bindCameraTexture(byte[] data, int w, int h) {
        byte[] pixels = new byte[256*256*3];
        for (int x = 0; x < 256; x++) {
            for (int y = 0; x < 256; x++) {
                pixels[x*256+y] = data[x*w+y];
            }
        }
        if (camera_texture == null) {
            camera_texture = new int[1];
        } else {
            GLES20.glDeleteTextures(1, camera_texture, 0);
        }
        GLES20.glGenTextures(1, camera_texture, 0);
        int tex = camera_texture[0];
        GLES20.glBindTexture(GL10.GL_TEXTURE_2D, tex);
        GLES20.glTexImage2D(GL10.GL_TEXTURE_2D, 0, GL10.GL_RGB, 256, 256, 0, GL10.GL_RGB, GL10.GL_UNSIGNED_BYTE, ByteBuffer.wrap(pixels));
        GLES20.glTexParameterf(GL10.GL_TEXTURE_2D, GL10.GL_TEXTURE_MIN_FILTER, GL10.GL_LINEAR);
    }
}

Comments:

- Shouldn't camera.setPreviewCallback(this) be called before camera.startPreview()?
- Unfortunately, that makes no difference. :(
- Hmm, I saw this: developer.android.com/resources/samples/ApiDemos/src/com/… They call camera.startPreview() in surfaceChanged(). Also, most of the camera initialization above the try block in surfaceCreated could go into the constructor. And why synchronized?
- I found an example that used synchronized. I don't understand it, so I'll remove it. I rearranged things as suggested, but it still fails...

Answer 1:

I took your code and got the same error you did. While debugging, though, it looked to me as if the preview may be failing because the width and height dimensions are wrong; and it is not simply a matter of swapping them, because I think orientation plays a part as well.

Anyway, I replaced your CameraSurfaceView with my own (see below), and I believe it works now. No exception, but the screen is completely bright green (I suspect that is because I don't have your vertex.vsh or fragment.fsh files).

package ***.test;

import java.io.IOException;
import java.util.List;

import android.content.Context;
import android.hardware.Camera;
import android.hardware.Camera.Size;
import android.util.AttributeSet;
import android.util.Log;
import android.view.Display;
import android.view.Surface;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.View;
import android.view.ViewGroup;
import android.view.WindowManager;

public class CameraSurfaceView extends ViewGroup implements SurfaceHolder.Callback
{
    private Size mPreviewSize;
    private List<Size> mSupportedPreviewSizes;
    private Context mContext;
    private SurfaceView mSurfaceView;
    private SurfaceHolder mHolder;
    private final String TAG = "CameraSurfaceView";
    private Camera mCamera;
    private List<String> mSupportedFlashModes;

    public CameraSurfaceView(Context context)
    {
        super(context);
        mContext = context;
        mCamera = Camera.open();
        setCamera(mCamera);

        mSurfaceView = new SurfaceView(context);
        addView(mSurfaceView, 0);
        mHolder = mSurfaceView.getHolder();
        mHolder.addCallback(this);
        mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        mHolder.setKeepScreenOn(true);
    }

    public CameraSurfaceView(Context context, AttributeSet attrs)
    {
        super(context, attrs);
        mContext = context;
    }

    public void setSupportedPreviewSizes(List<Size> supportedPreviewSizes)
    {
        mSupportedPreviewSizes = supportedPreviewSizes;
    }

    public Size getPreviewSize()
    {
        return mPreviewSize;
    }

    public void setCamera(Camera camera)
    {
        mCamera = camera;
        if (mCamera != null)
        {
            mSupportedPreviewSizes = mCamera.getParameters().getSupportedPreviewSizes();
            mSupportedFlashModes = mCamera.getParameters().getSupportedFlashModes();
            // Set the camera to Auto Flash mode.
            if (mSupportedFlashModes.contains(Camera.Parameters.FLASH_MODE_AUTO))
            {
                Camera.Parameters parameters = mCamera.getParameters();
                parameters.setFlashMode(Camera.Parameters.FLASH_MODE_AUTO);
                mCamera.setParameters(parameters);
            }
        }
        requestLayout();
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder)
    {
        // Surface will be destroyed when we return, so stop the preview.
        if (mCamera != null)
        {
            mCamera.stopPreview();
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height)
    {
        // Now that the size is known, set up the camera parameters and begin
        // the preview.
        if (mCamera != null)
        {
            Camera.Parameters parameters = mCamera.getParameters();
            Size previewSize = getPreviewSize();
            parameters.setPreviewSize(previewSize.width, previewSize.height);

            mCamera.setParameters(parameters);
            mCamera.startPreview();
        }
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder)
    {
        // The Surface has been created, acquire the camera and tell it where
        // to draw.
        try
        {
            if (mCamera != null)
            {
                mCamera.setPreviewDisplay(holder);
            }
        }
        catch (IOException exception)
        {
            Log.e(TAG, "IOException caused by setPreviewDisplay()", exception);
        }
    }

    @Override
    protected void onMeasure(int widthMeasureSpec, int heightMeasureSpec)
    {
        final int width = resolveSize(getSuggestedMinimumWidth(), widthMeasureSpec);
        final int height = resolveSize(getSuggestedMinimumHeight(), heightMeasureSpec);
        setMeasuredDimension(width, height);

        if (mSupportedPreviewSizes != null)
        {
            mPreviewSize = getOptimalPreviewSize(mSupportedPreviewSizes, width, height);
        }
    }

    @Override
    protected void onLayout(boolean changed, int left, int top, int right, int bottom)
    {
        if (changed)
        {
            final View cameraView = getChildAt(0);

            final int width = right - left;
            final int height = bottom - top;

            int previewWidth = width;
            int previewHeight = height;
            if (mPreviewSize != null)
            {
                Display display = ((WindowManager)mContext.getSystemService(Context.WINDOW_SERVICE)).getDefaultDisplay();

                switch (display.getRotation())
                {
                    case Surface.ROTATION_0:
                        previewWidth = mPreviewSize.height;
                        previewHeight = mPreviewSize.width;
                        mCamera.setDisplayOrientation(90);
                        break;
                    case Surface.ROTATION_90:
                        previewWidth = mPreviewSize.width;
                        previewHeight = mPreviewSize.height;
                        break;
                    case Surface.ROTATION_180:
                        previewWidth = mPreviewSize.height;
                        previewHeight = mPreviewSize.width;
                        break;
                    case Surface.ROTATION_270:
                        previewWidth = mPreviewSize.width;
                        previewHeight = mPreviewSize.height;
                        mCamera.setDisplayOrientation(180);
                        break;
                }
            }

            final int scaledChildHeight = previewHeight * width / previewWidth;

            cameraView.layout(0, height - scaledChildHeight, width, height);
        }
    }

    private Size getOptimalPreviewSize(List<Size> sizes, int width, int height)
    {
        Size optimalSize = null;

        final double ASPECT_TOLERANCE = 0.1;
        double targetRatio = (double) height / width;

        // Try to find a size match which suits the whole screen minus the menu on the left.
        for (Size size : sizes)
        {
            if (size.height != width) continue;
            double ratio = (double) size.width / size.height;
            if (ratio <= targetRatio + ASPECT_TOLERANCE && ratio >= targetRatio - ASPECT_TOLERANCE)
            {
                optimalSize = size;
            }
        }

        // If we cannot find the one that matches the aspect ratio, ignore the requirement.
        if (optimalSize == null)
        {
            // TODO : Backup in case we don't get a size.
        }

        return optimalSize;
    }

    public void previewCamera()
    {
        try
        {
            mCamera.setPreviewDisplay(mHolder);
            mCamera.startPreview();
        }
        catch (Exception e)
        {
            Log.d(TAG, "Cannot start preview.", e);
        }
    }

    /*public void onPreviewFrame(byte[] data, Camera arg1) {
        Log.d("CameraSurfaceView", "PREVIEW FRAME:");
        byte[] pixels = new byte[use_size.width * use_size.height * 3];
        decodeYUV420SP(pixels, data, use_size.width, use_size.height);
        renderer.bindCameraTexture(pixels, use_size.width, use_size.height);
    }*/

    void decodeYUV420SP(byte[] rgb, byte[] yuv420sp, int width, int height)
    {
        final int frameSize = width * height;

        for (int j = 0, yp = 0; j < height; j++) {
            int uvp = frameSize + (j >> 1) * width, u = 0, v = 0;
            for (int i = 0; i < width; i++, yp++) {
                int y = (0xff & ((int) yuv420sp[yp])) - 16;
                if (y < 0) {
                    y = 0;
                }
                if ((i & 1) == 0) {
                    v = (0xff & yuv420sp[uvp++]) - 128;
                    u = (0xff & yuv420sp[uvp++]) - 128;
                }

                int y1192 = 1192 * y;
                int r = (y1192 + 1634 * v);
                int g = (y1192 - 833 * v - 400 * u);
                int b = (y1192 + 2066 * u);

                if (r < 0)
                    r = 0;
                else if (r > 262143)
                    r = 262143;
                if (g < 0)
                    g = 0;
                else if (g > 262143)
                    g = 262143;
                if (b < 0)
                    b = 0;
                else if (b > 262143)
                    b = 262143;

                rgb[yp*3] = (byte) (b << 6);
                rgb[yp*3 + 1] = (byte) (b >> 2);
                rgb[yp*3 + 2] = (byte) (b >> 10);
            }
        }
    }
}

You will notice that I commented out your onPreviewFrame() method just to get it running, along with the line context.surface_view.renderer = renderer.

I am not familiar with the OpenGL library, but perhaps this is enough to get you going again.

Comments:

- Thanks for your answer, but getOptimalPreviewSize returns a null pointer. I'll take another look at it...
- I mixed your answer with Andrei's getOptimalPreviewSize and it no longer crashes! I'll try to get the OpenGL working. :-) If it all comes together, can I split the bounty?... Well, apparently I can only give it to one answer. Yours helped me get it working first, so I'll give it to you, but thanks to Andrei and Mehul as well.
- Thanks! This helped me solve the problem in my own, different code. The issue was that I was missing the line "mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);" -- even though the API docs for that method say "This is ignored, this value is set automatically when needed", adding it made my app work. Great documentation!

Answer 2:

I ran into the same problem, and removing or playing with the preview size settings did not work for me. I fixed it with the following line of code:

mHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

Now it works fine for me!

Comments:

- "This constant was deprecated in API level 11. This value is ignored, this value is set automatically when needed."

Answer 3:

Check your logcat for something like this before the exception:

11-02 09:25:44.305: ERROR/QualcommCameraHardware(56): failed to construct master heap for pmem pool /dev/pmem_adsp
11-02 09:25:44.305: ERROR/QualcommCameraHardware(56): initPreview X: could not initialize preview heap.
11-02 09:25:44.305: ERROR/QualcommCameraHardware(56): startPreview X initPreview failed.  Not starting preview.

This happened on my phone, in landscape, when I tried to set the preview size to 1024x768. The getSupportedPreviewSizes() method claimed that size was supported, but the OS apparently could not allocate enough memory for such a large preview. Setting a smaller size did work.
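A back-of-the-envelope way to see why a large preview can exhaust memory: the NV21 preview buffer alone is width x height x 3/2 bytes, and the question's onPreviewFrame allocates another width x height x 3 bytes per frame on top of that. A small plain-Java helper (illustrative only; the class and method names are not from any Android API):

```java
// Rough per-frame heap cost of a camera preview: the NV21 buffer the camera
// fills plus the RGB888 copy allocated in onPreviewFrame. Illustrative helper.
public class PreviewHeap {
    static long previewHeapBytes(int width, int height) {
        long nv21 = (long) width * height * 3 / 2; // NV21: 12 bits per pixel
        long rgb  = (long) width * height * 3;     // RGB888: 24 bits per pixel
        return nv21 + rgb;
    }
}
```

For 1024x768 that comes to roughly 3.4 MB per frame before any OpenGL copies, which is significant against the small per-process heaps of devices from that era.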

Also, try the code below. You should call startPreview() from the surfaceChanged() method, when the dimensions of the SurfaceView are known. Calling it from surfaceCreated() is too early.

Use this to determine the best preview size for a SurfaceView of the given dimensions (adapted from the Google CameraPreview sample):

static Size getOptimalPreviewSize(List<Size> sizes, int w, int h) {
    final double ASPECT_TOLERANCE = 0.1;
    final double MAX_DOWNSIZE = 1.5;

    double targetRatio = (double) w / h;
    if (sizes == null) return null;

    Size optimalSize = null;
    double minDiff = Double.MAX_VALUE;

    int targetHeight = h;

    // Try to find an size match aspect ratio and size
    for (Size size : sizes) {
        double ratio = (double) size.width / size.height;
        double downsize = (double) size.width / w;
        if (downsize > MAX_DOWNSIZE) {
            //if the preview is a lot larger than our display surface ignore it
            //reason - on some phones there is not enough heap available to show the larger preview sizes
            continue;
        }
        if (Math.abs(ratio - targetRatio) > ASPECT_TOLERANCE) continue;
        if (Math.abs(size.height - targetHeight) < minDiff) {
            optimalSize = size;
            minDiff = Math.abs(size.height - targetHeight);
        }
    }

    // Cannot find the one match the aspect ratio, ignore the requirement
    //keep the max_downsize requirement
    if (optimalSize == null) {
        minDiff = Double.MAX_VALUE;
        for (Size size : sizes) {
            double downsize = (double) size.width / w;
            if (downsize > MAX_DOWNSIZE) {
                continue;
            }
            if (Math.abs(size.height - targetHeight) < minDiff) {
                optimalSize = size;
                minDiff = Math.abs(size.height - targetHeight);
            }
        }
    }
    //everything else failed, just take the closest match
    if (optimalSize == null) {
        minDiff = Double.MAX_VALUE;
        for (Size size : sizes) {
            if (Math.abs(size.height - targetHeight) < minDiff) {
                optimalSize = size;
                minDiff = Math.abs(size.height - targetHeight);
            }
        }
    }

    return optimalSize;
}
You can call it from your surfaceChanged() method like this:

public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
    Camera.Parameters parameters = camera.getParameters();
    List<Size> supportedPreviewSizes = parameters.getSupportedPreviewSizes();
    Size optimalPreviewSize = getOptimalPreviewSize(supportedPreviewSizes, w, h);
    if (optimalPreviewSize != null) {
        parameters.setPreviewSize(optimalPreviewSize.width, optimalPreviewSize.height);
        camera.setParameters(parameters);
        camera.startPreview();
    }
}

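Because the three-pass selection above is pure arithmetic, it can be sanity-checked off-device by swapping Camera.Size for a trivial stand-in. In the sketch below, the Dim class is hypothetical; only the algorithm itself follows the answer:

```java
// Plain-Java harness for the preview-size selection algorithm above.
// Dim stands in for Camera.Size so the logic runs without the Android framework.
public class SizePicker {
    static class Dim {
        final int width, height;
        Dim(int w, int h) { width = w; height = h; }
    }

    static Dim getOptimalPreviewSize(java.util.List<Dim> sizes, int w, int h) {
        final double ASPECT_TOLERANCE = 0.1;
        final double MAX_DOWNSIZE = 1.5;
        double targetRatio = (double) w / h;
        if (sizes == null) return null;
        Dim optimalSize = null;
        double minDiff = Double.MAX_VALUE;
        int targetHeight = h;
        // Pass 1: match aspect ratio, reject previews much larger than the surface.
        for (Dim size : sizes) {
            double ratio = (double) size.width / size.height;
            if ((double) size.width / w > MAX_DOWNSIZE) continue;
            if (Math.abs(ratio - targetRatio) > ASPECT_TOLERANCE) continue;
            if (Math.abs(size.height - targetHeight) < minDiff) {
                optimalSize = size;
                minDiff = Math.abs(size.height - targetHeight);
            }
        }
        // Pass 2: drop the aspect requirement, keep the downsize cap.
        if (optimalSize == null) {
            minDiff = Double.MAX_VALUE;
            for (Dim size : sizes) {
                if ((double) size.width / w > MAX_DOWNSIZE) continue;
                if (Math.abs(size.height - targetHeight) < minDiff) {
                    optimalSize = size;
                    minDiff = Math.abs(size.height - targetHeight);
                }
            }
        }
        // Pass 3: closest height, no constraints at all.
        if (optimalSize == null) {
            minDiff = Double.MAX_VALUE;
            for (Dim size : sizes) {
                if (Math.abs(size.height - targetHeight) < minDiff) {
                    optimalSize = size;
                    minDiff = Math.abs(size.height - targetHeight);
                }
            }
        }
        return optimalSize;
    }
}
```

For an 800x480 surface with sizes 1920x1080, 1280x720 and 640x480, pass 1 rejects everything (the larger sizes exceed MAX_DOWNSIZE and 640x480 misses the target ratio), and pass 2 then picks 640x480.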
Comments:

- Thanks for your answer, but it still doesn't work. I'll update my full code.
- This worked for me. Thanks! I didn't need it in API 17, but I did in 11.

Answer 4:

Try setting the Surface's type in initCamera():

private void initCamera() {
    mCamSV = (SurfaceView)findViewById(R.id.surface_camera);
    mCamSH = mCamSV.getHolder();
    mCamSH.addCallback(this);
    mCamSH.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS); // <- the important line
}


Edit 1

Here I am copying all the files that work for me with the Android 2.2 SDK.

The Activity

package com.stack.camera;

import java.io.IOException;

import android.app.Activity;
import android.hardware.Camera;
import android.os.Bundle;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.WindowManager;
import android.widget.FrameLayout;

public class CameraStackActivity extends Activity implements SurfaceHolder.Callback {
    private Camera mCam;
    private SurfaceView mCamSV;
    private SurfaceHolder mCamSH;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        setContentView(R.layout.main);
        initCamera();
    }

    @Override
    public void onDestroy() {
        stopCamera();
    }

    public void surfaceChanged(SurfaceHolder holder, int format, int width,
            int height) {
        startCamera(holder, width, height);
    }

    public void surfaceCreated(SurfaceHolder holder) {
        mCam = Camera.open();
        try {
            mCam.setPreviewDisplay(holder);
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
    }

    private void initCamera() {
        mCamSV = (SurfaceView)findViewById(R.id.surface_camera);
        mCamSH = mCamSV.getHolder();
        mCamSH.addCallback(this);
        mCamSH.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
    }

    private void startCamera(SurfaceHolder sh, int width, int height) {
        Camera.Parameters p = mCam.getParameters();
        // Camera.Size s = p.getSupportedPreviewSizes().get(0);
        p.setPreviewSize(width, height);

        mCam.setParameters(p);

        try {
            mCam.setPreviewDisplay(sh);
        } catch (Exception e) {
        }

        mCam.startPreview();
    }

    private void stopCamera() {
        mCamSH.removeCallback(this);

        mCam.stopPreview();
        mCam.release();
    }
}

The layout

<?xml version="1.0" encoding="utf-8"?>
<FrameLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:orientation="vertical" android:layout_
    android:layout_>
    <SurfaceView android:id="@+id/surface_camera"
        android:layout_ android:layout_ />
</FrameLayout>

The manifest file

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
      package="com.stack.camera"
      android:versionCode="1"
      android:versionName="1.0">

<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />

    <application android:icon="@drawable/icon" android:label="@string/app_name">
        <activity android:name="CameraStackActivity"
                  android:label="@string/app_name">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>

    </application>
</manifest>

Comments:

- Thanks for your answer, but it seems to work now; I just need to get the OpenGL part together.
- For the Galaxy S on Android 2.3.3, the line setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS) is required.
- @NgocDao The comment above should read: for the Galaxy S on Android 2.3.3, the setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS) line is required.

Answer 5:

Not really related to the OP's question, but...

I was seeing a problem that gave me java.io.IOException: setPreviewDisplay failed.

If you do both video and photos, there are two functions, camera.unlock() and camera.reconnect(). You must call camera.unlock() before recording video, and camera.reconnect() before taking a photo.
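The ordering contract can be sketched with a plain-Java mock. MockCamera below is hypothetical and only models the ordering; the real methods are android.hardware.Camera.unlock() and Camera.reconnect():

```java
// Mock of the camera lock hand-off: unlock() before handing the camera to
// MediaRecorder, reconnect() before using it for still captures again.
// Hypothetical class; it does not touch any real camera hardware.
public class MockCamera {
    private boolean lockedByApp = true;

    void unlock()    { lockedByApp = false; } // give the camera to MediaRecorder
    void reconnect() { lockedByApp = true;  } // take it back for takePicture()

    void startRecording() {
        if (lockedByApp)
            throw new IllegalStateException("call unlock() before recording video");
    }

    void takePicture() {
        if (!lockedByApp)
            throw new IllegalStateException("call reconnect() before taking a photo");
    }
}
```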

Comments:

Answer 6:

Simple solution: in your CameraSurfaceView class, add this line:

holder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);

right after the following lines:

SurfaceHolder holder = getHolder();
holder.addCallback(this);

Comments:

- This is deprecated now and is set automatically when needed.

Answer 7:

I'm a bit late to the game, but I was running Cordova and had two plugins trying to register the camera at the same time.

Not sure if this will be useful to anyone other than me.

Comments:
