Adr*_*Eb 10 | android, sensor, orientation, augmented-reality, compass-geolocation
I am trying to build a simple augmented reality app, so I started working with the sensor data.
According to this thread (Android compass example) and the example at http://www.codingforandroid.com/2011/01/using-orientation-sensors-simple.html, computing the orientation from Sensor.TYPE_ACCELEROMETER and Sensor.TYPE_MAGNETIC_FIELD doesn't really fit.
So I don't get "good" values. The azimuth values make no sense at all: if I just tilt the phone upwards, the value changes drastically. Even when I simply rotate the phone, the values do not reflect the phone's orientation.
Does anybody have an idea how to improve the quality of the values, based on the given example?
Tíb*_*íbó 20
In what orientation are you using this sample app? As the code is written, the only supported orientations are portrait or flat on a table, depending on the device. What do you mean by "good"?
It is normal that the values are not "good" when you rotate the device: the device coordinate system is defined for portrait use, or flat on a table (the Y axis runs vertically up the screen, the Z axis points out of the screen from its center, and the X axis is perpendicular to Y, pointing right along the screen). Because of this, rotating the device does not rotate the device coordinate system with it, so you have to remap it yourself.
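To illustrate the idea, here is a rough sketch of how such a remap could be chosen from the current display rotation, so the readings follow the screen instead of the device body. The per-rotation axis pairs and the helper itself are my assumption for illustration, not part of the original answer (uses android.view.Surface and android.hardware.SensorManager):

// Hypothetical helper: pick remap axes based on the current display rotation.
// The axis pairs below are an illustrative assumption, not from the original answer.
void remapForDisplay(int displayRotation, float[] inR, float[] outR) {
    switch (displayRotation) {
        case Surface.ROTATION_90:
            SensorManager.remapCoordinateSystem(inR,
                    SensorManager.AXIS_Y, SensorManager.AXIS_MINUS_X, outR);
            break;
        case Surface.ROTATION_180:
            SensorManager.remapCoordinateSystem(inR,
                    SensorManager.AXIS_MINUS_X, SensorManager.AXIS_MINUS_Y, outR);
            break;
        case Surface.ROTATION_270:
            SensorManager.remapCoordinateSystem(inR,
                    SensorManager.AXIS_MINUS_Y, SensorManager.AXIS_X, outR);
            break;
        default: // Surface.ROTATION_0, natural orientation: identity remap
            SensorManager.remapCoordinateSystem(inR,
                    SensorManager.AXIS_X, SensorManager.AXIS_Y, outR);
    }
}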
If you simply want the heading of the device held in portrait orientation, here is a piece of code that works for me:
@Override
public void onSensorChanged(SensorEvent event)
{
    // It is good practice to check that we received the proper sensor event
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR)
    {
        // Convert the rotation-vector to a 4x4 matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrix,
                event.values);
        SensorManager.remapCoordinateSystem(mRotationMatrix,
                SensorManager.AXIS_X, SensorManager.AXIS_Z,
                mRotationMatrix);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);

        // Optionally convert the result from radians to degrees
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);

        tv.setText(" Yaw: " + orientationVals[0] + "\n Pitch: "
                + orientationVals[1] + "\n Roll (not used): "
                + orientationVals[2]);
    }
}
You will get the heading (or azimuth) in:
orientationVals[0]
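For completeness, a minimal sketch of the surrounding wiring this callback assumes (field names taken from the snippet above; the activity setup and registration are my assumption, not part of the original answer):

// Minimal sketch of the assumed wiring around the snippet above.
public class CompassActivity extends Activity implements SensorEventListener {

    private SensorManager mSensorManager;
    private final float[] mRotationMatrix = new float[16]; // 4x4 matrix, as the comment above expects
    private final float[] orientationVals = new float[3];  // azimuth, pitch, roll
    private TextView tv;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        tv = new TextView(this);
        setContentView(tv);
        mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Register for the fused rotation-vector sensor used in the snippet above
        Sensor rotationVector = mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);
        mSensorManager.registerListener(this, rotationVector, SensorManager.SENSOR_DELAY_UI);
    }

    @Override
    protected void onPause() {
        super.onPause();
        mSensorManager.unregisterListener(this);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { /* not needed here */ }

    // onSensorChanged(...) as shown above
}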
小智 14
Tíbó's answer is good, but if you log the roll value you will get irregular numbers. (Roll matters for AR browsers.)
This is because of
SensorManager.remapCoordinateSystem(mRotationMatrix,
        SensorManager.AXIS_X, SensorManager.AXIS_Z,
        mRotationMatrix);
You have to use different matrices for the input and output of the remap. The following code works for me and gives correct roll values:
@Override
public void onSensorChanged(SensorEvent event)
{
    // It is good practice to check that we received the proper sensor event
    if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR)
    {
        // Convert the rotation-vector to a 4x4 matrix.
        SensorManager.getRotationMatrixFromVector(mRotationMatrixFromVector, event.values);
        SensorManager.remapCoordinateSystem(mRotationMatrixFromVector,
                SensorManager.AXIS_X, SensorManager.AXIS_Z,
                mRotationMatrix);
        SensorManager.getOrientation(mRotationMatrix, orientationVals);

        // Optionally convert the result from radians to degrees
        orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
        orientationVals[1] = (float) Math.toDegrees(orientationVals[1]);
        orientationVals[2] = (float) Math.toDegrees(orientationVals[2]);

        tv.setText(" Yaw: " + orientationVals[0] + "\n Pitch: "
                + orientationVals[1] + "\n Roll (not used): "
                + orientationVals[2]);
    }
}
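The key difference from the first snippet is that the raw rotation matrix and the remapped matrix are kept in separate arrays. A minimal sketch of the two field declarations this snippet assumes (the names come from the code above; the sizes are my assumption, matching the 4x4 matrix mentioned in the comment):

// Assumed field declarations for the snippet above (not part of the original answer):
private final float[] mRotationMatrixFromVector = new float[16]; // raw 4x4 matrix from the rotation vector
private final float[] mRotationMatrix = new float[16];           // remapped 4x4 matrix
private final float[] orientationVals = new float[3];            // azimuth, pitch, roll in radians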
Probably late to the party. Anyway, here is how I got the azimuth:
private final int sensorType = Sensor.TYPE_ROTATION_VECTOR;
float[] rotMat = new float[9];
float[] vals = new float[3];

// Fields and helper assumed by the snippet below (declarations added for completeness)
float azimuth, pitch, roll;
boolean sensorHasChanged = false;

// deg() is assumed to be a simple radians-to-degrees helper
private static float deg(float rad) {
    return (float) Math.toDegrees(rad);
}

@Override
public void onSensorChanged(SensorEvent event) {
    sensorHasChanged = false;
    if (event.sensor.getType() == sensorType) {
        SensorManager.getRotationMatrixFromVector(rotMat,
                event.values);
        SensorManager.remapCoordinateSystem(rotMat,
                SensorManager.AXIS_X, SensorManager.AXIS_Y,
                rotMat);
        SensorManager.getOrientation(rotMat, vals);
        azimuth = deg(vals[0]); // in degrees [-180, +180]
        pitch = deg(vals[1]);
        roll = deg(vals[2]);
        sensorHasChanged = true;
    }
}
Hope it helps
Have you tried the combined (sensor fusion) type Sensor.TYPE_ROTATION_VECTOR? It may give better results: go to https://developer.android.com/reference/android/hardware/SensorEvent.html and search for "rotation_vector".