
java – Android getOrientation() method returns bad results

I am creating a 3D compass application.

I am using the getOrientation method to obtain the orientation (almost the same implementation as here). It works fine when I put the phone on a table, but when the top of the phone points toward the sky (the Z axis in the picture is at zero; the sphere is the Earth), getOrientation starts giving very bad results: it reports Z-axis values ranging between 0 and 180 degrees for only a few real degrees of movement. Is there any way to suppress this behaviour? I created a little video describing the problem (sorry for the poor quality). Thanks in advance.
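For context, the usual call pattern the question refers to (a minimal sketch of the standard SensorManager API, with illustrative variable names rather than the OP's actual code) looks roughly like this:

//Minimal sketch (illustrative names, not the OP's code): combine accelerometer and
//magnetometer readings into a rotation matrix, then extract azimuth/pitch/roll.
private final float[] rotationMatrix = new float[9];
private final float[] orientationRad = new float[3];

private void computeOrientation(float[] gravity, float[] geomagnetic) {
    if (SensorManager.getRotationMatrix(rotationMatrix, null, gravity, geomagnetic)) {
        SensorManager.getOrientation(rotationMatrix, orientationRad);
        float azimuthDeg = (float) Math.toDegrees(orientationRad[0]);
        float pitchDeg   = (float) Math.toDegrees(orientationRad[1]);
        float rollDeg    = (float) Math.toDegrees(orientationRad[2]);
        // ... feed the angles to the 3D rendering ...
    }
}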

Solved:

When you rotate the model, there is a difference between:

gl.glRotatef(_angleY, 0f, 1f, 0f); //ROLL
gl.glRotatef(_angleX, 1f, 0f, 0f); //ELEVATION
gl.glRotatef(_angleZ, 0f, 0f, 1f); //AZIMUTH

and

gl.glRotatef(_angleX, 1f, 0f, 0f); //ELEVATION
gl.glRotatef(_angleY, 0f, 1f, 0f); //ROLL
gl.glRotatef(_angleZ, 0f, 0f, 1f); //AZIMUTH

Solution

Well, I can see at least one problem with this approach of yours.

I assume that you are combining the 3D vector corresponding to your magnetometer with an averaging low-pass filter in order to smooth the data. While that approach works fine for sensor values that do not wrap around, such as the raw data coming from the accelerometer, it does not work for the angle values obtained from the magnetometer. Why, one might ask?

Because these angle variables (azimuth, pitch, roll) have an upper and a lower bound, which means that any value above 180 degrees (say 181 degrees) wraps around to 181 - 360 = -179 degrees, and any value below -180 degrees wraps around in the other direction. So when one of these angle variables gets close to one of those thresholds (180 or -180), it will tend to oscillate between values near the two extremes. If you blindly apply a low-pass filter to those values, you get either a smooth decrease from 180 towards -180 degrees, or a smooth increase from -180 towards 180 degrees. Either way, the result looks exactly like your video above... As long as an averaging buffer is applied directly to the raw angle data coming out of getOrientation(...), this problem will appear (and it should appear not only when the phone is upright, but also whenever the azimuth wraps around, which you may want to test for as well...).
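To make this concrete, here is a small hypothetical illustration (not the OP's code) of how a plain average misbehaves right at the ±180 degree boundary:

//Hypothetical illustration: two raw azimuth samples that are only 2 degrees apart
//"around the wrap" average to a heading that points the opposite way.
float a = 179f;                      //first raw sample, in degrees
float b = -179f;                     //next sample, 2 degrees further clockwise
float naiveAverage = (a + b) / 2f;   //= 0 degrees, i.e. 180 degrees off

//A wrap-aware average first maps the difference into [-180,180[:
float diff = b - a;                  //= -358
if (diff >= 180f) diff -= 360f;
if (diff < -180f) diff += 360f;      //-> +2
float wrapAwareAverage = a + diff / 2f;   //= 180 degrees, as expected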

You said you tested this with a buffer size of 1. In theory, if there is no averaging at all, this problem should not appear, although in some implementations of circular buffers I have seen in the past, this might still mean averaging with at least one past value rather than with none. If that is your case, then we have found the root cause of your bug.

Unfortunately, there are not many elegant workarounds while still keeping a standard averaging filter. What I usually do in this situation is switch to another type of low-pass filter, one that does not need any buffer depth to operate: a simple IIR filter of order 1:

diff = x[n] - y[n-1]

y[n] - y[n-1] = alpha * (x[n] - y[n-1]) = alpha * diff

...where y is the filtered angle, x is the raw angle, and alpha < 1 acts as a time constant: alpha = 1 corresponds to the no-filter case, and the cutoff frequency of the low-pass filter decreases as alpha gets closer to zero. A sharp eye will have noticed by now that this corresponds to a simple proportional controller.

Such a filter makes it possible to compensate for the wrap-around of the angle value, because we can add or subtract 360 to diff so that abs(diff) <= 180, which in turn guarantees that the filtered angle value always increases/decreases in the optimal direction to reach its "setpoint".

An example function, called periodically to compute the filtered angle value y for a given raw angle value x, could look like this:

private float restrictAngle(float tmpAngle){
    while(tmpAngle>=180) tmpAngle-=360;
    while(tmpAngle<-180) tmpAngle+=360;
    return tmpAngle;
}

//x is a raw angle value from getOrientation(...)
//y is the current filtered angle value
private float calculateFilteredAngle(float x,float y){ 
    final float alpha = 0.1f;
    float diff = x-y;

    //here,we ensure that abs(diff)<=180
    diff = restrictAngle(diff);

    y += alpha*diff;
    //ensure that y stays within [-180,180[ bounds
    y = restrictAngle(y);

    return y;
}

The function calculateFilteredAngle(float x, float y) can then be called periodically, like this (example for the azimuth angle coming from the getOrientation(...) function):

filteredAzimuth = calculateFilteredAngle(azimuth,filteredAzimuth);

Using this technique, the filter does not misbehave the way the averaging filter mentioned by the OP does.

Since I could not load the .apk uploaded by the OP, I decided to implement my own test project to check whether the corrections work. Here is the entire code (it does not use an .xml for the main layout, so I did not include one). Simply copy it into a test project and see whether it works on your particular device (it works on an HTC Desire with Android v2.1):

File 1: Compass3DActivity.java:

package com.epichorns.compass3D;

import android.app.Activity;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;
import android.view.ViewGroup;
import android.widget.LinearLayout;
import android.widget.TextView;

public class Compass3DActivity extends Activity {
    //Textviews for showing angle data
    TextView mTextView_azimuth;
    TextView mTextView_pitch;
    TextView mTextView_roll;

    TextView mTextView_filtered_azimuth;
    TextView mTextView_filtered_pitch;
    TextView mTextView_filtered_roll;


    float mAngle0_azimuth=0;
    float mAngle1_pitch=0;
    float mAngle2_roll=0;

    float mAngle0_filtered_azimuth=0;
    float mAngle1_filtered_pitch=0;
    float mAngle2_filtered_roll=0;

    private Compass3DView mCompassView;

    private SensorManager sensorManager;
    //sensor calculation values
    float[] mGravity = null;
    float[] mGeomagnetic = null;
    float Rmat[] = new float[9];
    float Imat[] = new float[9];
    float orientation[] = new float[3];
    SensorEventListener mAccelerometerListener = new SensorEventListener(){
        public void onAccuracyChanged(Sensor sensor,int accuracy) {}

        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER){
                mGravity = event.values.clone();
                processSensorData();
            }
        }   
    };
    SensorEventListener mMagnetometerListener = new SensorEventListener(){
        public void onAccuracyChanged(Sensor sensor,int accuracy) {}

        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD){
                mGeomagnetic = event.values.clone();
                processSensorData();                
                update();
            }
        }   
    };

    private float restrictAngle(float tmpAngle){
        while(tmpAngle>=180) tmpAngle-=360;
        while(tmpAngle<-180) tmpAngle+=360;
        return tmpAngle;
    }

    //x is a raw angle value from getOrientation(...)
    //y is the current filtered angle value
    private float calculateFilteredAngle(float x,float y){ 
        final float alpha = 0.3f;
        float diff = x-y;

        //here,we ensure that abs(diff)<=180
        diff = restrictAngle(diff);

        y += alpha*diff;
        //ensure that y stays within [-180,180[ bounds
        y = restrictAngle(y);

        return y;
    }



    public void processSensorData(){
        if (mGravity != null && mGeomagnetic != null) { 
            boolean success = SensorManager.getRotationMatrix(Rmat,Imat,mGravity,mGeomagnetic);
            if (success) {              
                SensorManager.getOrientation(Rmat, orientation);
                mAngle0_azimuth = (float)Math.toDegrees((double)orientation[0]); // orientation contains: azimuth, pitch and roll
                mAngle1_pitch = (float)Math.toDegrees((double)orientation[1]); //pitch
                mAngle2_roll = -(float)Math.toDegrees((double)orientation[2]); //roll
                mAngle0_filtered_azimuth = calculateFilteredAngle(mAngle0_azimuth,mAngle0_filtered_azimuth);
                mAngle1_filtered_pitch = calculateFilteredAngle(mAngle1_pitch,mAngle1_filtered_pitch);
                mAngle2_filtered_roll = calculateFilteredAngle(mAngle2_roll,mAngle2_filtered_roll);    
            }           
            mGravity=null; //oblige full new refresh
            mGeomagnetic=null; //oblige full new refresh
        }
    }

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);     
        LinearLayout ll = new LinearLayout(this);       
        LinearLayout.LayoutParams llParams = new LinearLayout.LayoutParams(LinearLayout.LayoutParams.FILL_PARENT,LinearLayout.LayoutParams.FILL_PARENT);      
        ll.setLayoutParams(llParams);      
        ll.setOrientation(LinearLayout.VERTICAL);      
        ViewGroup.LayoutParams txtParams = new ViewGroup.LayoutParams(ViewGroup.LayoutParams.WRAP_CONTENT,ViewGroup.LayoutParams.WRAP_CONTENT);        
        mTextView_azimuth = new TextView(this);
        mTextView_azimuth.setLayoutParams(txtParams);
        mTextView_pitch = new TextView(this);
        mTextView_pitch.setLayoutParams(txtParams);
        mTextView_roll = new TextView(this);
        mTextView_roll.setLayoutParams(txtParams);      
        mTextView_filtered_azimuth = new TextView(this);
        mTextView_filtered_azimuth.setLayoutParams(txtParams);
        mTextView_filtered_pitch = new TextView(this);
        mTextView_filtered_pitch.setLayoutParams(txtParams);
        mTextView_filtered_roll = new TextView(this);
        mTextView_filtered_roll.setLayoutParams(txtParams);

        mCompassView = new Compass3DView(this);        
        ViewGroup.LayoutParams compassparams = new ViewGroup.LayoutParams(200,200);
        mCompassView.setLayoutParams(compassparams);

        ll.addView(mCompassView);
        ll.addView(mTextView_azimuth);
        ll.addView(mTextView_pitch);
        ll.addView(mTextView_roll);
        ll.addView(mTextView_filtered_azimuth);
        ll.addView(mTextView_filtered_pitch);
        ll.addView(mTextView_filtered_roll);

        setContentView(ll);

        sensorManager = (SensorManager) this.getSystemService(Context.SENSOR_SERVICE);
        sensorManager.registerListener(mAccelerometerListener,sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),SensorManager.SENSOR_DELAY_UI); 
        sensorManager.registerListener(mMagnetometerListener,sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),SensorManager.SENSOR_DELAY_UI);
        update();       
    }


    @Override
    public void onDestroy(){
        super.onDestroy();
        sensorManager.unregisterListener(mAccelerometerListener);
        sensorManager.unregisterListener(mMagnetometerListener);
    }


    private void update(){
        mCompassView.changeAngles(mAngle1_filtered_pitch,mAngle2_filtered_roll,mAngle0_filtered_azimuth);

        mTextView_azimuth.setText("Azimuth: "+String.valueOf(mAngle0_azimuth));
        mTextView_pitch.setText("Pitch: "+String.valueOf(mAngle1_pitch));
        mTextView_roll.setText("Roll: "+String.valueOf(mAngle2_roll));

        mTextView_filtered_azimuth.setText("Azimuth: "+String.valueOf(mAngle0_filtered_azimuth));
        mTextView_filtered_pitch.setText("Pitch: "+String.valueOf(mAngle1_filtered_pitch));
        mTextView_filtered_roll.setText("Roll: "+String.valueOf(mAngle2_filtered_roll));

    }
}

File 2: Compass3DView.java:

package com.epichorns.compass3D;

import android.content.Context;
import android.opengl.GLSurfaceView;

public class Compass3DView extends GLSurfaceView {
    private Compass3DRenderer mRenderer;

    public Compass3DView(Context context) {
        super(context);
        mRenderer = new Compass3DRenderer(context);
        setRenderer(mRenderer);
    }

    public void changeAngles(float angle0,float angle1,float angle2){
        mRenderer.setAngleX(angle0);
        mRenderer.setAngleY(angle1);
        mRenderer.setAngleZ(angle2);
    }

}

File 3: Compass3DRenderer.java:

package com.epichorns.compass3D;


import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.nio.ShortBuffer;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

import android.content.Context;
import android.opengl.GLSurfaceView;


public class Compass3DRenderer implements GLSurfaceView.Renderer {
    Context mContext;

    // a raw buffer to hold indices
    ShortBuffer _indexBuffer;    
    // raw buffers to hold the vertices
    FloatBuffer _vertexBuffer0;
    FloatBuffer _vertexBuffer1;
    FloatBuffer _vertexBuffer2;
    FloatBuffer _vertexBuffer3;
    FloatBuffer _vertexBuffer4;
    FloatBuffer _vertexBuffer5;
    int _numVertices = 3; //standard triangle vertices = 3

    FloatBuffer _textureBuffer0123;



    //private FloatBuffer _light0Position;
    //private FloatBuffer _light0Ambient;
    //NOTE: some repeated values were stripped when this page was scraped; the
    //4-component light/material vectors below are a plausible reconstruction.
    float _light0Position[] = new float[]{10.0f, 10.0f, 10.0f, 0.0f};
    float _light0Ambient[] = new float[]{0.05f, 0.05f, 0.05f, 1.0f};
    float _light0Diffuse[] = new float[]{0.5f, 0.5f, 0.5f, 1.0f};
    float _light0Specular[] = new float[]{0.7f, 0.7f, 0.7f, 1.0f};
    float _matAmbient[] = new float[] { 0.6f, 0.6f, 0.6f, 1.0f };
    float _matDiffuse[] = new float[] { 0.6f, 0.6f, 0.6f, 1.0f };




    private float _angleX=0f;
    private float _angleY=0f;
    private float _angleZ=0f;


    Compass3DRenderer(Context context){
        super();
        mContext = context;
    }

    public void setAngleX(float angle) {
        _angleX = angle;
    }

    public void setAngleY(float angle) {
        _angleY = angle;
    }

    public void setAngleZ(float angle) {
        _angleZ = angle;
    }

    FloatBuffer InitFloatBuffer(float[] src){
        ByteBuffer bb = ByteBuffer.allocateDirect(4*src.length);
        bb.order(ByteOrder.nativeOrder());
        FloatBuffer inBuf = bb.asFloatBuffer();
        inBuf.put(src);
        return inBuf;
    }

    ShortBuffer InitShortBuffer(short[] src){
        ByteBuffer bb = ByteBuffer.allocateDirect(2*src.length);
        bb.order(ByteOrder.nativeOrder());
        ShortBuffer inBuf = bb.asShortBuffer();
        inBuf.put(src);
        return inBuf;
    }

    //Init data for our rendered pyramid
    private void initTriangles() {

        //Side faces triangles
        //NOTE: the exact vertex values were mangled when this page was scraped
        //(repeated numbers were stripped); the arrays below are a plausible
        //reconstruction of a small square-based pyramid (base at y = -0.5,
        //apex at (0, 0.5, 0)), consistent with the surviving fragments.
        float[] coords = {
            -0.25f, -0.5f,  0.25f,
             0.25f, -0.5f,  0.25f,
             0f,     0.5f,  0f
        };

        float[] coords1 = {
             0.25f, -0.5f,  0.25f,
             0.25f, -0.5f, -0.25f,
             0f,     0.5f,  0f
        };

        float[] coords2 = {
             0.25f, -0.5f, -0.25f,
            -0.25f, -0.5f, -0.25f,
             0f,     0.5f,  0f
        };

        float[] coords3 = {
            -0.25f, -0.5f, -0.25f,
            -0.25f, -0.5f,  0.25f,
             0f,     0.5f,  0f
        };

        //Base triangles
        float[] coords4 = {
            -0.25f, -0.5f,  0.25f,
             0.25f, -0.5f, -0.25f,
             0.25f, -0.5f,  0.25f
        };

        float[] coords5 = {
            -0.25f, -0.5f,  0.25f,
            -0.25f, -0.5f, -0.25f,
             0.25f, -0.5f, -0.25f
        };


        float[] textures0123 = {
                // Mapping coordinates for the vertices (UV mapping CW)
                0.0f, 0.0f, // bottom left                    
                1.0f, 0.0f, // bottom right
                0.5f, 1.0f, // top ctr              
        };


        _vertexBuffer0 = InitFloatBuffer(coords);
        _vertexBuffer0.position(0);

        _vertexBuffer1 = InitFloatBuffer(coords1);
        _vertexBuffer1.position(0);    

        _vertexBuffer2 = InitFloatBuffer(coords2);
        _vertexBuffer2.position(0);

        _vertexBuffer3 = InitFloatBuffer(coords3);
        _vertexBuffer3.position(0);

        _vertexBuffer4 = InitFloatBuffer(coords4);
        _vertexBuffer4.position(0);

        _vertexBuffer5 = InitFloatBuffer(coords5);
        _vertexBuffer5.position(0);

        _textureBuffer0123 = InitFloatBuffer(textures0123);
        _textureBuffer0123.position(0);

        short[] indices = {0,1,2};
        _indexBuffer = InitShortBuffer(indices);        
        _indexBuffer.position(0);

    }


    public void onSurfaceCreated(GL10 gl,EGLConfig config) {

        gl.glEnable(GL10.GL_CULL_FACE); // enable the differentiation of which side may be visible 
        gl.glShadeModel(GL10.GL_SMOOTH);

        gl.glFrontFace(GL10.GL_CCW); // which is the front? the one which is drawn counter clockwise
        gl.glCullFace(GL10.GL_BACK); // which one should NOT be drawn

        initTriangles();

        gl.glEnableClientState(GL10.GL_VERTEX_ARRAY);
        gl.glEnableClientState(GL10.GL_TEXTURE_COORD_ARRAY);
    }

    public void onDrawFrame(GL10 gl) {

        gl.glPushMatrix();

        gl.glClearColor(0f, 0f, 0f, 1.0f); //clipping backdrop color
        // clear the color buffer to show the ClearColor we called above...
        gl.glClear(GL10.GL_COLOR_BUFFER_BIT);

        // set rotation       
        gl.glRotatef(_angleY, 0f, 1f, 0f); //ROLL
        gl.glRotatef(_angleX, 1f, 0f, 0f); //ELEVATION
        gl.glRotatef(_angleZ, 0f, 0f, 1f); //AZIMUTH

        //Draw our pyramid
        //NOTE: the color and pointer arguments were mangled when this page was
        //scraped; the RGBA values below are a plausible reconstruction (one
        //color per face), and the glVertexPointer/glDrawElements calls follow
        //the standard (size, type, stride, buffer) signature.

        //4 side faces
        gl.glColor4f(0.5f, 0f, 0f, 0.5f);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer0);
        gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);

        gl.glColor4f(0.5f, 0.5f, 0f, 0.5f);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer1);
        gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);

        gl.glColor4f(0f, 0.5f, 0f, 0.5f);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer2);
        gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);

        gl.glColor4f(0f, 0.5f, 0.5f, 0.5f);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer3);
        gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);

        //Base face
        gl.glColor4f(0f, 0f, 0.5f, 0.5f);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer4);
        gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);
        gl.glVertexPointer(3, GL10.GL_FLOAT, 0, _vertexBuffer5);
        gl.glDrawElements(GL10.GL_TRIANGLES, _numVertices, GL10.GL_UNSIGNED_SHORT, _indexBuffer);

        gl.glPopMatrix();
    }

    public void onSurfaceChanged(GL10 gl, int w, int h) {
        gl.glViewport(0, 0, w, h);
    }



}

Please note that this code does not compensate for a tablet's default landscape orientation, so it should only be expected to work correctly on a phone (I did not have a tablet at hand to test any correction code).
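If such compensation were needed, one common approach (a sketch based on the standard SensorManager.remapCoordinateSystem() API; the axis choices below are illustrative and not something tested by the answer's author) would be to remap the rotation matrix according to the display rotation before calling getOrientation():

//Sketch: remap the rotation matrix for the current display rotation before
//extracting the orientation angles. Assumes Rmat was filled by getRotationMatrix()
//inside the activity above, and that android.view.Surface is imported.
float[] remappedRmat = new float[9];
int rotation = getWindowManager().getDefaultDisplay().getRotation();
switch (rotation) {
    case Surface.ROTATION_90:
        SensorManager.remapCoordinateSystem(Rmat, SensorManager.AXIS_Y,
                SensorManager.AXIS_MINUS_X, remappedRmat);
        break;
    case Surface.ROTATION_270:
        SensorManager.remapCoordinateSystem(Rmat, SensorManager.AXIS_MINUS_Y,
                SensorManager.AXIS_X, remappedRmat);
        break;
    default:
        System.arraycopy(Rmat, 0, remappedRmat, 0, 9);
        break;
}
SensorManager.getOrientation(remappedRmat, orientation);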
