How can I navigate a Google Glass GDK Immersion application using only voice commands?

Dra*_*ace 11 android google-glass google-gdk

How do I write voice triggers to navigate Google Glass Cards?

This is how I see it happening:

1) "Ok Glass, Start My Program"

2) Application begins and shows the first card

3) User can say "Next Card" to move to the next card 
(somewhat the equivalent of swiping forward when in the timeline)

4) User can say "Previous Card" to go back 

The cards I need to display are simple text and images, and I'm wondering whether I can set up some kind of listener that listens for voice commands while a card is being shown.


I have already researched matching Glass voice commands against the nearest match from a given list, but I was not able to get that code running, even though I do have all the libraries.

Side note: it is important that the user can still see the card while using voice commands. Also, his hands are busy, so tapping and swiping are not an option.

Any ideas on how to control the timeline within my Immersion application using voice control only are greatly appreciated!

I am also tracking https://code.google.com/p/google-glass-api/issues/detail?id=273.


My ongoing research keeps leading me back to the Google Glass Developer pages and Google's suggested way of listening for gestures: https://developers.google.com/glass/develop/gdk/input/touch#detecting_gestures_with_a_gesture_detector

How can we trigger these gestures using voice commands?
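For reference, the gesture detection suggested on that page looks roughly like the sketch below (based on the GDK touchpad GestureDetector; the class and field names are only illustrative). This is the touch-based input I am trying to replace with voice commands:

    import android.app.Activity;
    import android.content.Context;
    import android.os.Bundle;
    import android.view.MotionEvent;
    import com.google.android.glass.touchpad.Gesture;
    import com.google.android.glass.touchpad.GestureDetector;

    public class GestureCardActivity extends Activity {

        private GestureDetector mGestureDetector;

        @Override
        protected void onCreate(Bundle bundle) {
            super.onCreate(bundle);
            mGestureDetector = createGestureDetector(this);
        }

        private GestureDetector createGestureDetector(Context context) {
            GestureDetector gestureDetector = new GestureDetector(context);
            gestureDetector.setBaseListener(new GestureDetector.BaseListener() {
                @Override
                public boolean onGesture(Gesture gesture) {
                    if (gesture == Gesture.SWIPE_RIGHT) {
                        // swipe in one direction, e.g. show the next card
                        return true;
                    } else if (gesture == Gesture.SWIPE_LEFT) {
                        // swipe in the other direction, e.g. show the previous card
                        return true;
                    }
                    return false;
                }
            });
            return gestureDetector;
        }

        // Forward touchpad events to the gesture detector.
        @Override
        public boolean onGenericMotionEvent(MotionEvent event) {
            return mGestureDetector != null && mGestureDetector.onMotionEvent(event);
        }
    }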


Android has just beta-released the wearable devices upgrade for Android, http://developer.android.com/wear/notifications/remote-input.html. Is there a way we could use this to answer my question? It still feels like we are one step away, since we can call on the service, but we cannot have it "sleep" and "wake up" as a background service while we talk.

Dra*_*ace 2

I am writing out the whole code in detail, since it took me such a long time to get this working. Maybe it will save someone else some valuable time.

This code is an implementation of Google's contextual voice commands, as described by Google Developers here: Contextual voice commands

ContextualMenuActivity.java

    package com.drace.contextualvoicecommands;

    import android.app.Activity;
    import android.os.Bundle;
    import android.view.Menu;
    import android.view.MenuItem;
    import com.drace.contextualvoicecommands.R;
    import com.google.android.glass.view.WindowUtils;

    public class ContextualMenuActivity extends Activity {

        @Override
        protected void onCreate(Bundle bundle) {
            super.onCreate(bundle);

            // Requests a voice menu on this activity. As for any other
            // window feature, be sure to request this before
            // setContentView() is called.
            getWindow().requestFeature(WindowUtils.FEATURE_VOICE_COMMANDS);
            setContentView(R.layout.activity_main);
        }

        @Override
        public boolean onCreatePanelMenu(int featureId, Menu menu) {
            if (featureId == WindowUtils.FEATURE_VOICE_COMMANDS) {
                getMenuInflater().inflate(R.menu.main, menu);
                return true;
            }
            // Pass through to super to set up the touch menu.
            return super.onCreatePanelMenu(featureId, menu);
        }

        @Override
        public boolean onCreateOptionsMenu(Menu menu) {
            getMenuInflater().inflate(R.menu.main, menu);
            return true;
        }

        @Override
        public boolean onMenuItemSelected(int featureId, MenuItem item) {
            if (featureId == WindowUtils.FEATURE_VOICE_COMMANDS) {
                switch (item.getItemId()) {
                    case R.id.dogs_menu_item:
                        // handle top-level dogs menu item
                        break;
                    case R.id.cats_menu_item:
                        // handle top-level cats menu item
                        break;
                    case R.id.lab_menu_item:
                        // handle second-level labrador menu item
                        break;
                    case R.id.golden_menu_item:
                        // handle second-level golden menu item
                        break;
                    case R.id.calico_menu_item:
                        // handle second-level calico menu item
                        break;
                    case R.id.cheshire_menu_item:
                        // handle second-level cheshire menu item
                        break;
                    default:
                        return true;
                }
                return true;
            }
            // Good practice to pass through to super if not handled.
            return super.onMenuItemSelected(featureId, item);
        }
    }
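The activity above inflates R.menu.main, but the menu resource itself is not included in the answer. Based on the item ids handled in onMenuItemSelected() and the strings defined in strings.xml below, res/menu/main.xml presumably looks something like this (a reconstruction, not taken from the original answer):

    <menu xmlns:android="http://schemas.android.com/apk/res/android">
        <item
            android:id="@+id/dogs_menu_item"
            android:title="@string/show_me_dogs">
            <menu>
                <item
                    android:id="@+id/lab_menu_item"
                    android:title="@string/labrador" />
                <item
                    android:id="@+id/golden_menu_item"
                    android:title="@string/golden" />
            </menu>
        </item>
        <item
            android:id="@+id/cats_menu_item"
            android:title="@string/show_me_cats">
            <menu>
                <item
                    android:id="@+id/cheshire_menu_item"
                    android:title="@string/cheshire" />
                <item
                    android:id="@+id/calico_menu_item"
                    android:title="@string/calico" />
            </menu>
        </item>
    </menu>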

activity_main.xml (layout)

    <?xml version="1.0" encoding="utf-8"?>
    <RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
        xmlns:tools="http://schemas.android.com/tools"
        android:layout_width="match_parent"
        android:layout_height="match_parent" >

        <TextView
            android:id="@+id/coming_soon"
            android:layout_alignParentTop="true"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="@string/voice_command_test"
            android:textSize="22sp"
            android:layout_marginRight="40px"
            android:layout_marginTop="30px"
            android:layout_marginLeft="210px" />

    </RelativeLayout>

strings.xml

    <resources>
        <string name="app_name">Contextual voice commands</string>
        <string name="voice_start_command">Voice commands</string>
        <string name="voice_command_test">Say "Okay, Glass"</string>
        <string name="show_me_dogs">Dogs</string>
        <string name="labrador">labrador</string>
        <string name="golden">golden</string>
        <string name="show_me_cats">Cats</string>
        <string name="cheshire">cheshire</string>
        <string name="calico">calico</string>
    </resources>

AndroidManifest.xml

    <manifest xmlns:android="http://schemas.android.com/apk/res/android"
        package="com.drace.contextualvoicecommands"
        android:versionCode="1"
        android:versionName="1.0" >

        <uses-sdk
            android:minSdkVersion="19"
            android:targetSdkVersion="19" />

        <uses-permission android:name="com.google.android.glass.permission.DEVELOPMENT" />

        <application
            android:allowBackup="true"
            android:icon="@drawable/ic_launcher"
            android:label="@string/app_name" >

            <activity
                android:name="com.drace.contextualvoicecommands.ContextualMenuActivity"
                android:label="@string/app_name" >
                <intent-filter>
                    <action android:name="com.google.android.glass.action.VOICE_TRIGGER" />
                </intent-filter>

                <meta-data
                    android:name="com.google.android.glass.VoiceTrigger"
                    android:resource="@xml/voice_trigger_start" />
            </activity>

        </application>
    </manifest>
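The manifest points the com.google.android.glass.VoiceTrigger meta-data at @xml/voice_trigger_start, which is also not shown in the answer. Assuming the voice_start_command string above is used as the launch phrase, res/xml/voice_trigger_start.xml would be roughly:

    <?xml version="1.0" encoding="utf-8"?>
    <trigger keyword="@string/voice_start_command" />

A keyword-based trigger like this (one that is not in the approved system command list) is why the manifest declares the com.google.android.glass.permission.DEVELOPMENT permission.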

It has been tested and works great under Google Glass XE22!
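The same pattern can be extended to the "Next Card" / "Previous Card" flow from the question: expose the two phrases as contextual voice menu items and move an index through your cards when they are selected. Below is a hypothetical sketch; R.menu.navigation, R.id.next_card, R.id.previous_card, CARD_COUNT and showCard() are made-up names and not part of the answer above:

    package com.drace.contextualvoicecommands;

    import android.app.Activity;
    import android.os.Bundle;
    import android.view.Menu;
    import android.view.MenuItem;
    import com.google.android.glass.view.WindowUtils;

    public class VoiceNavigationActivity extends Activity {

        private static final int CARD_COUNT = 5; // however many cards you show
        private int mCardIndex = 0;

        @Override
        protected void onCreate(Bundle bundle) {
            super.onCreate(bundle);
            // Request the voice menu before any content is set, as above.
            getWindow().requestFeature(WindowUtils.FEATURE_VOICE_COMMANDS);
            showCard(mCardIndex);
        }

        @Override
        public boolean onCreatePanelMenu(int featureId, Menu menu) {
            if (featureId == WindowUtils.FEATURE_VOICE_COMMANDS) {
                // R.menu.navigation would hold "next card" and "previous card" items.
                getMenuInflater().inflate(R.menu.navigation, menu);
                return true;
            }
            return super.onCreatePanelMenu(featureId, menu);
        }

        @Override
        public boolean onMenuItemSelected(int featureId, MenuItem item) {
            if (featureId == WindowUtils.FEATURE_VOICE_COMMANDS) {
                switch (item.getItemId()) {
                    case R.id.next_card: // user said "next card"
                        mCardIndex = Math.min(mCardIndex + 1, CARD_COUNT - 1);
                        showCard(mCardIndex);
                        return true;
                    case R.id.previous_card: // user said "previous card"
                        mCardIndex = Math.max(mCardIndex - 1, 0);
                        showCard(mCardIndex);
                        return true;
                }
            }
            return super.onMenuItemSelected(featureId, item);
        }

        private void showCard(int index) {
            // Render the text/image card for this index, e.g. with setContentView()
            // or whatever view the Immersion already uses to display its cards.
        }
    }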