How to use onSensorChanged sensor data in combination with OpenGL

Posted 2011-02-22 07:02:41

Question:

(Edit: I added the best working approach to the augmented reality framework; it now also takes the gyroscope into account, which makes it even more stable: DroidAR framework)

I wrote a TestSuite to figure out how to calculate the rotation angles from the data you get in SensorEventListener.onSensorChanged(). I really hope you will complete my solution to help people who have the same problem as I did. Here is the code; I think you will understand it after reading it.

Feel free to change it. The main idea is to implement several methods that send the orientation angles to the OpenGL view or any other target that needs them.

Methods 1 to 4 are working; they send the rotation matrix directly to the OpenGL view.

Method 6 works now too, but I have no explanation why the rotation has to be done in y x z order..

All the other methods are not working or are buggy, and I hope someone knows how to get them working. I think the best method would be method 5 if it worked, because it would be the easiest to understand, but I'm not sure how efficient it is. The complete code isn't optimized, so I recommend not using it as-is in your projects.

Here it is:

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

import javax.microedition.khronos.egl.EGL10;
import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import static javax.microedition.khronos.opengles.GL10.GL_FLOAT;
import static javax.microedition.khronos.opengles.GL10.GL_LINES;
import static javax.microedition.khronos.opengles.GL10.GL_UNSIGNED_BYTE;

import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.opengl.GLSurfaceView;
import android.opengl.GLSurfaceView.Renderer;
import android.os.Bundle;
import android.util.Log;
import android.view.WindowManager;

/**
 * This class provides a basic demonstration of how to use the
 * {@link android.hardware.SensorManager SensorManager} API to draw a 3D
 * compass.
 */
public class SensorToOpenGlTests extends Activity implements Renderer,
  SensorEventListener {

 private static final boolean TRY_TRANSPOSED_VERSION = false;

 /*
  * MODUS overview:
  * 
  * 1 - unbuffered data directly transferred from the rotation matrix to the
  * modelview matrix
  * 
  * 2 - buffered version of 1 where both acceleration and magnetometer are
  * buffered
  * 
  * 3 - buffered version of 1 where only magnetometer is buffered
  * 
  * 4 - buffered version of 1 where only acceleration is buffered
  * 
  * 5 - uses the orientation sensor and sets the angles for rotating the
  * camera with glRotatef()
  * 
  * 6 - uses the rotation matrix to calculate the angles
  * 
  * 7 to 12 - every possible way the rotationMatrix could be constructed
  * in SensorManager.getRotationMatrix (see
  * http://www.songho.ca/opengl/gl_anglestoaxes.html#anglestoaxes for all
  * possibilities)
  */

 private static int MODUS = 2;

 private GLSurfaceView openglView;
 private FloatBuffer vertexBuffer;
 private ByteBuffer indexBuffer;
 private FloatBuffer colorBuffer;

 private SensorManager mSensorManager;
 private float[] rotationMatrix = new float[16];
 private float[] accelGData = new float[3];
 private float[] bufferedAccelGData = new float[3];
 private float[] magnetData = new float[3];
 private float[] bufferedMagnetData = new float[3];
 private float[] orientationData = new float[3];

 // private float[] mI = new float[16];

 private float[] resultingAngles = new float[3];

 private int mCount;

 final static float rad2deg = (float) (180.0f / Math.PI);

 private boolean landscape;

 public SensorToOpenGlTests() {
 }

 /** Called when the activity is first created. */
 @Override
 public void onCreate(Bundle savedInstanceState) {
  super.onCreate(savedInstanceState);

  mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
  openglView = new GLSurfaceView(this);
  openglView.setRenderer(this);
  setContentView(openglView);
 }

 @Override
 protected void onResume() {
  // Ideally a game should implement onResume() and onPause()
  // to take appropriate action when the activity loses focus
  super.onResume();
  openglView.onResume();

  if (((WindowManager) getSystemService(WINDOW_SERVICE))
    .getDefaultDisplay().getOrientation() == 1) {
   landscape = true;
  } else {
   landscape = false;
  }

  mSensorManager.registerListener(this, mSensorManager
    .getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
    SensorManager.SENSOR_DELAY_GAME);
  mSensorManager.registerListener(this, mSensorManager
    .getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
    SensorManager.SENSOR_DELAY_GAME);
  mSensorManager.registerListener(this, mSensorManager
    .getDefaultSensor(Sensor.TYPE_ORIENTATION),
    SensorManager.SENSOR_DELAY_GAME);
 }

 @Override
 protected void onPause() {
  // Ideally a game should implement onResume() and onPause()
  // to take appropriate action when the activity loses focus
  super.onPause();
  openglView.onPause();
  mSensorManager.unregisterListener(this);
 }

 public int[] getConfigSpec() {
  // We want a depth buffer, don't care about the
  // details of the color buffer.
  int[] configSpec = { EGL10.EGL_DEPTH_SIZE, 16, EGL10.EGL_NONE };
  return configSpec;
 }

 public void onDrawFrame(GL10 gl) {

  // clear screen and color buffer:
  gl.glClear(GL10.GL_COLOR_BUFFER_BIT | GL10.GL_DEPTH_BUFFER_BIT);
  // set target matrix to modelview matrix:
  gl.glMatrixMode(GL10.GL_MODELVIEW);
  // init modelview matrix:
  gl.glLoadIdentity();
  // move camera away a little bit:

  if ((MODUS == 1) || (MODUS == 2) || (MODUS == 3) || (MODUS == 4)) {

   if (landscape) {
    // in landscape mode first remap the rotationMatrix before using
    // it with glMultMatrixf:
    float[] result = new float[16];
    SensorManager.remapCoordinateSystem(rotationMatrix,
      SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X,
      result);
    gl.glMultMatrixf(result, 0);
   } else {
    gl.glMultMatrixf(rotationMatrix, 0);
   }
  } else {
   //in all other modes do the rotation by hand
   //the order y x z is important!
   gl.glRotatef(resultingAngles[2], 0, 1, 0);
   gl.glRotatef(resultingAngles[1], 1, 0, 0);
   gl.glRotatef(resultingAngles[0], 0, 0, 1);
  }

  //move the axis to simulate augmented behaviour:
  gl.glTranslatef(0, 2, 0);

  // draw the 3 axis on the screen:
  gl.glVertexPointer(3, GL_FLOAT, 0, vertexBuffer);
  gl.glColorPointer(4, GL_FLOAT, 0, colorBuffer);
  gl.glDrawElements(GL_LINES, 6, GL_UNSIGNED_BYTE, indexBuffer);
 }

 public void onSurfaceChanged(GL10 gl, int width, int height) {
  gl.glViewport(0, 0, width, height);
  float r = (float) width / height;
  gl.glMatrixMode(GL10.GL_PROJECTION);
  gl.glLoadIdentity();
  gl.glFrustumf(-r, r, -1, 1, 1, 10);
 }

 public void onSurfaceCreated(GL10 gl, EGLConfig config) {
  gl.glDisable(GL10.GL_DITHER);
  gl.glClearColor(1, 1, 1, 1);
  gl.glEnable(GL10.GL_CULL_FACE);
  gl.glShadeModel(GL10.GL_SMOOTH);
  gl.glEnable(GL10.GL_DEPTH_TEST);

  gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
  gl.glEnableClientState(GL10.GL_COLOR_ARRAY);

  // load the 3 axis and their colors:
  float vertices[] = { 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1 };
  float colors[] = { 0, 0, 0, 0, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0, 1, 1 };
  byte indices[] = { 0, 1, 0, 2, 0, 3 };

  ByteBuffer vbb;
  vbb = ByteBuffer.allocateDirect(vertices.length * 4);
  vbb.order(ByteOrder.nativeOrder());
  vertexBuffer = vbb.asFloatBuffer();
  vertexBuffer.put(vertices);
  vertexBuffer.position(0);

  vbb = ByteBuffer.allocateDirect(colors.length * 4);
  vbb.order(ByteOrder.nativeOrder());
  colorBuffer = vbb.asFloatBuffer();
  colorBuffer.put(colors);
  colorBuffer.position(0);

  indexBuffer = ByteBuffer.allocateDirect(indices.length);
  indexBuffer.put(indices);
  indexBuffer.position(0);
 }

 public void onAccuracyChanged(Sensor sensor, int accuracy) {
 }

 public void onSensorChanged(SensorEvent event) {

  // load the new values:
  loadNewSensorData(event);

  if (MODUS == 1) {
   SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
     magnetData);
  }

  if (MODUS == 2) {
   rootMeanSquareBuffer(bufferedAccelGData, accelGData);
   rootMeanSquareBuffer(bufferedMagnetData, magnetData);
   SensorManager.getRotationMatrix(rotationMatrix, null,
     bufferedAccelGData, bufferedMagnetData);
  }

  if (MODUS == 3) {
   rootMeanSquareBuffer(bufferedMagnetData, magnetData);
   SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
     bufferedMagnetData);
  }

  if (MODUS == 4) {
   rootMeanSquareBuffer(bufferedAccelGData, accelGData);
   SensorManager.getRotationMatrix(rotationMatrix, null,
     bufferedAccelGData, magnetData);
  }

  if (MODUS == 5) {
   // this mode uses the sensor data received from the orientation
   // sensor
   resultingAngles = orientationData.clone();
   if ((-90 > resultingAngles[1]) || (resultingAngles[1] > 90)) {
    resultingAngles[1] = orientationData[0];
    resultingAngles[2] = orientationData[1];
    resultingAngles[0] = orientationData[2];
   }
  }

  if (MODUS == 6) {
   SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
     magnetData);
   final float[] anglesInRadians = new float[3];
   SensorManager.getOrientation(rotationMatrix, anglesInRadians);
   // TODO check for landscape mode
   resultingAngles[0] = anglesInRadians[0] * rad2deg;
   resultingAngles[1] = anglesInRadians[1] * rad2deg;
   resultingAngles[2] = anglesInRadians[2] * -rad2deg;
  }

  if (MODUS == 7) {
   SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
     magnetData);

   rotationMatrix = transpose(rotationMatrix);
   /*
    * this assumes that the rotation matrices are multiplied in x y z
    * order Rx*Ry*Rz
    */

   resultingAngles[2] = (float) (Math.asin(rotationMatrix[2]));
   final float cosB = (float) Math.cos(resultingAngles[2]);
   resultingAngles[2] = resultingAngles[2] * rad2deg;
   resultingAngles[0] = -(float) (Math.acos(rotationMatrix[0] / cosB))
     * rad2deg;
   resultingAngles[1] = (float) (Math.acos(rotationMatrix[10] / cosB))
     * rad2deg;
  }

  if (MODUS == 8) {
   SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
     magnetData);
   rotationMatrix = transpose(rotationMatrix);
   /*
    * this assumes that the rotation matrices are multiplied in z y x
    */

   resultingAngles[2] = (float) (Math.asin(-rotationMatrix[8]));
   final float cosB = (float) Math.cos(resultingAngles[2]);
   resultingAngles[2] = resultingAngles[2] * rad2deg;
   resultingAngles[1] = (float) (Math.acos(rotationMatrix[9] / cosB))
     * rad2deg;
   resultingAngles[0] = (float) (Math.asin(rotationMatrix[4] / cosB))
     * rad2deg;
  }

  if (MODUS == 9) {
   SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
     magnetData);
   rotationMatrix = transpose(rotationMatrix);
   /*
    * this assumes that the rotation matrices are multiplied in z x y
    * 
    * note z axis looks good at this one
    */

   resultingAngles[1] = (float) (Math.asin(rotationMatrix[9]));
   final float minusCosA = -(float) Math.cos(resultingAngles[1]);
   resultingAngles[1] = resultingAngles[1] * rad2deg;
   resultingAngles[2] = (float) (Math.asin(rotationMatrix[8]
     / minusCosA))
     * rad2deg;
   resultingAngles[0] = (float) (Math.asin(rotationMatrix[1]
     / minusCosA))
     * rad2deg;
  }

  if (MODUS == 10) {
   SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
     magnetData);
   rotationMatrix = transpose(rotationMatrix);
   /*
    * this assumes that the rotation matrices are multiplied in y x z
    */

   resultingAngles[1] = (float) (Math.asin(-rotationMatrix[6]));
   final float cosA = (float) Math.cos(resultingAngles[1]);
   resultingAngles[1] = resultingAngles[1] * rad2deg;
   resultingAngles[2] = (float) (Math.asin(rotationMatrix[2] / cosA))
     * rad2deg;
   resultingAngles[0] = (float) (Math.acos(rotationMatrix[5] / cosA))
     * rad2deg;
  }

  if (MODUS == 11) {
   SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
     magnetData);
   rotationMatrix = transpose(rotationMatrix);
   /*
    * this assumes that the rotation matrices are multiplied in y z x
    */

   resultingAngles[0] = (float) (Math.asin(rotationMatrix[4]));
   final float cosC = (float) Math.cos(resultingAngles[0]);
   resultingAngles[0] = resultingAngles[0] * rad2deg;
   resultingAngles[2] = (float) (Math.acos(rotationMatrix[0] / cosC))
     * rad2deg;
   resultingAngles[1] = (float) (Math.acos(rotationMatrix[5] / cosC))
     * rad2deg;
  }

  if (MODUS == 12) {
   SensorManager.getRotationMatrix(rotationMatrix, null, accelGData,
     magnetData);
   rotationMatrix = transpose(rotationMatrix);
   /*
    * this assumes that the rotation matrices are multiplied in x z y
    */

   resultingAngles[0] = (float) (Math.asin(-rotationMatrix[1]));
   final float cosC = (float) Math.cos(resultingAngles[0]);
   resultingAngles[0] = resultingAngles[0] * rad2deg;
   resultingAngles[2] = (float) (Math.acos(rotationMatrix[0] / cosC))
     * rad2deg;
   resultingAngles[1] = (float) (Math.acos(rotationMatrix[5] / cosC))
     * rad2deg;
  }
  logOutput();
 }

 /**
  * transposes the matrix because it has to be transposed (inverted, but for a
  * rotation matrix that is the same thing) to be used with OpenGL
  * 
  * @param source
  * @return
  */
 private float[] transpose(float[] source) {
  final float[] result = source.clone();
  if (TRY_TRANSPOSED_VERSION) {
   result[1] = source[4];
   result[2] = source[8];
   result[4] = source[1];
   result[6] = source[9];
   result[8] = source[2];
   result[9] = source[6];
  }
  // the other values in the matrix are not relevant for rotations
  return result;
 }

 private void rootMeanSquareBuffer(float[] target, float[] values) {

  final float amplification = 200.0f;
  float buffer = 20.0f;

  target[0] += amplification;
  target[1] += amplification;
  target[2] += amplification;
  values[0] += amplification;
  values[1] += amplification;
  values[2] += amplification;

  target[0] = (float) (Math
    .sqrt((target[0] * target[0] * buffer + values[0] * values[0])
      / (1 + buffer)));
  target[1] = (float) (Math
    .sqrt((target[1] * target[1] * buffer + values[1] * values[1])
      / (1 + buffer)));
  target[2] = (float) (Math
    .sqrt((target[2] * target[2] * buffer + values[2] * values[2])
      / (1 + buffer)));

  target[0] -= amplification;
  target[1] -= amplification;
  target[2] -= amplification;
  values[0] -= amplification;
  values[1] -= amplification;
  values[2] -= amplification;
 }

 private void loadNewSensorData(SensorEvent event) {
  final int type = event.sensor.getType();
  if (type == Sensor.TYPE_ACCELEROMETER) {
   accelGData = event.values.clone();
  }
  if (type == Sensor.TYPE_MAGNETIC_FIELD) {
   magnetData = event.values.clone();
  }
  if (type == Sensor.TYPE_ORIENTATION) {
   orientationData = event.values.clone();
  }
 }

 private void logOutput() {
  if (mCount++ > 30) {
   mCount = 0;
   Log.d("Compass", "yaw0: " + (int) (resultingAngles[0])
     + "  pitch1: " + (int) (resultingAngles[1]) + "  roll2: "
     + (int) (resultingAngles[2]));
  }
 }
}

Question comments:

Answer 1:

I haven't been able to test the code yet (but I will, it looks interesting). One thing that caught my attention is that you don't seem to be filtering the sensor data in any way.

Sensor readings are inherently very noisy, especially those from the magnetic sensor. I would suggest you implement some low-pass filtering.

See my previous answer for further reading.
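
A minimal sketch of what such low-pass filtering could look like, assuming it lives in the activity and is applied to each raw vector inside onSensorChanged() before calling SensorManager.getRotationMatrix(); the smoothing factor ALPHA and the helper name lowPass are made up for illustration, not taken from the original code:

 // Hypothetical low-pass filter: blends each new reading into the previously
 // filtered value. Smaller ALPHA = smoother output but more lag.
 private static final float ALPHA = 0.15f;

 private float[] lowPass(float[] input, float[] output) {
  if (output == null) {
   return input.clone();
  }
  for (int i = 0; i < input.length; i++) {
   output[i] = output[i] + ALPHA * (input[i] - output[i]);
  }
  return output;
 }

 // usage in onSensorChanged(), e.g. for the magnetometer:
 // bufferedMagnetData = lowPass(event.values.clone(), bufferedMagnetData);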

Comments:

Answer 2:

It would be easier to test and debug method 5 using GLU's lookAt function: http://www.opengl.org/sdk/docs/man2/xhtml/gluLookAt.xml
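
For illustration, a sketch of how GLU.gluLookAt could be called from onDrawFrame() while debugging; the way the look-at point is derived from the angles here is only an assumed axis mapping, not a verified conversion of the sensor values:

 // inside onDrawFrame(GL10 gl), instead of the glRotatef calls:
 float azimuthRad = (float) Math.toRadians(resultingAngles[0]);
 float pitchRad = (float) Math.toRadians(resultingAngles[1]);

 // hypothetical look-at direction built from azimuth and pitch
 float lookX = (float) (Math.cos(pitchRad) * Math.sin(azimuthRad));
 float lookY = (float) (Math.cos(pitchRad) * Math.cos(azimuthRad));
 float lookZ = (float) Math.sin(pitchRad);

 GLU.gluLookAt(gl, 0f, 0f, 0f,   // eye at the origin
   lookX, lookY, lookZ,          // look-at point derived from the angles
   0f, 0f, 1f);                  // "up" vector, here the world z axis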

Also, as villoren suggested, it is a good idea to filter the sensor data, although not filtering won't really cause errors as long as you move the device slowly. If you want to try it, a simple filter looks like this:

newValue = oldValue * 0.9 + sensorValue * 0.1;
oldValue = newValue;
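
Applied to the three-component accelerometer and magnetometer arrays from the question, that smoothing might look like the following; the 0.9/0.1 weights are just the example values from above:

 // exponential smoothing of the raw sensor vectors, per component
 for (int i = 0; i < 3; i++) {
  bufferedAccelGData[i] = bufferedAccelGData[i] * 0.9f + accelGData[i] * 0.1f;
  bufferedMagnetData[i] = bufferedMagnetData[i] * 0.9f + magnetData[i] * 0.1f;
 }
 SensorManager.getRotationMatrix(rotationMatrix, null,
   bufferedAccelGData, bufferedMagnetData);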

Comments:

Answer 3:

After analyzing the code above: in method 5 you are assigning the orientation data as follows,

resultingAngles[1] = orientationData[0]; // orientation z axis to y axis
resultingAngles[2] = orientationData[1]; // orientation x axis to z axis 
resultingAngles[0] = orientationData[2]; // orientation y axis to x axis

so you are effectively applying the rotations in y z x order. Try changing the order..

I think that may be the problem there.. please check and let me know..

For the event values, see the documentation: http://developer.android.com/guide/topics/sensors/sensors_position.html

Thanks for your hard work..
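
Following the axis conventions in that documentation (the orientation values are azimuth around z, pitch around x, roll around y), one possible direct mapping for method 5 would be the sketch below; this is untested and the signs may need flipping depending on the screen orientation:

 // hypothetical direct mapping of TYPE_ORIENTATION values to glRotatef
 gl.glRotatef(orientationData[1], 1, 0, 0); // pitch   -> rotation around x
 gl.glRotatef(orientationData[2], 0, 1, 0); // roll    -> rotation around y
 gl.glRotatef(orientationData[0], 0, 0, 1); // azimuth -> rotation around z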

Comments:

It would be helpful if you could explain why it did not work correctly before and what exactly you changed. Thanks for your contribution..

Answer 4:

Note that if you keep getting wrong readings, you may need to calibrate the compass by moving it in a figure-8 motion with your wrist.

It's hard to explain in words; watch this video: http://www.youtube.com/watch?v=sP3d00Hr14o

Comments:

I would recommend shaking the phone around like crazy instead (careful not to let go of it ;); the figure-8 movement usually doesn't work on my G1. Also, I think the compass only recalibrates itself when it gets outlier values, so slowly moving the device in a figure 8 won't help.

Answer 5:

You can use and-engine to work with the sensors in OpenGL; just have a look at the example: https://github.com/nicolasgramlich/AndEngineExamples/tree/GLES2/src/org/andengine/examples/app/cityradar

Comments:

Answer 6:

Check out the Sensor fusion demo app, which uses different sensors (gyroscope, rotation vector, accelerometer + compass, etc.) and renders the output of the onSensorChanged events as a colored cube that rotates according to your phone's orientation.

The results of those events are stored as quaternions and rotation matrices and used for OpenGL in this class.
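
A rough sketch of the same idea with the rotation-vector sensor (available since API 9): SensorManager.getRotationMatrixFromVector() can fill a 16-element matrix that is directly usable by OpenGL ES 1.x, much like methods 1 to 4 in the question; the field name here is only illustrative:

 // rotation-vector sensor -> 4x4 rotation matrix for OpenGL
 private final float[] rotationVectorMatrix = new float[16];

 public void onSensorChanged(SensorEvent event) {
  if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
   SensorManager.getRotationMatrixFromVector(rotationVectorMatrix,
     event.values);
  }
 }

 // later, in onDrawFrame():
 // gl.glMultMatrixf(rotationVectorMatrix, 0);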

Comments:
