
Permission Denial Error - SpeechRecognizer as a continuous service? (android.permission.INTERACT_ACROSS_USERS_FULL)

EDIT: I have changed the service code to a started Service rather than an IntentService, as shown in the updated StreamService.java. Now I am getting the Permission Denial error described in the LogCat messages after StreamService.java.

EDIT:

As mentioned on the Android Developer site, the SpeechRecognizer API can only be used with the application context. Is there any workaround with which I can get it working?

I have implemented a MainActivity class that holds all the UI components. The class is as follows.

Code - MainActivity.java

package com.example.speechsensorservice;

import android.app.Activity;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.os.Bundle;
import android.util.Log;
import android.view.Menu;
import android.view.View;
import android.widget.ImageButton;
import android.widget.TextView;
import android.widget.Toast;

public class MainActivity extends Activity {


    private static final String TAG = "SpeechSensor";

    private boolean headsetConnected = false;

    public TextView txtText;

    private BroadcastReceiver mReceiver;
    private ImageButton btnSpeak;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        txtText = (TextView) findViewById(R.id.txtText);
        btnSpeak = (ImageButton) findViewById(R.id.btnSpeak);

        btnSpeak.setOnClickListener(new View.OnClickListener() {

            @Override
            public void onClick(View v) {
                Intent intent = new Intent(getApplicationContext(),StreamService.class);
                startService(intent);
            }
        });
    }

    @Override
    protected void onResume() {
        super.onResume();

        IntentFilter sIF = new IntentFilter();
        sIF.addAction(Intent.ACTION_HEADSET_PLUG);
        sIF.addAction("com.example.speechsensorservice.TEXT");
        mReceiver = new BroadcastReceiver() {

                @Override
            public void onReceive(Context arg0, Intent arg1) {
                // TODO Auto-generated method stub
                String act = arg1.getAction();
                Log.d(TAG, "Received Action = " + act);
                if ( Intent.ACTION_HEADSET_PLUG.equals(act) ) {
                    if ( arg1.hasExtra("state")) {
                        if ( !headsetConnected && arg1.getIntExtra("state", 0) == 1 ) {
                            headsetConnected = true;
                            txtText.setText("Headset Plugged in");
                            startNoiseProcessService();
                        }
                    }
                }
                else if ( act.equals("com.example.speechsensorservice.TEXT") ){
                    if ( arg1.hasExtra("Identity")) {
                        String s = arg1.getStringExtra("Identity");
                        if ( s.equals("NA") ) {
                            Toast t = Toast.makeText(getApplicationContext(), 
                                    "Your Device doesnot support Speech to Text", 
                                    Toast.LENGTH_SHORT);
                            t.show();
                        }
                        else txtText.setText(s);
                    }
                }
            }

        };  

        this.registerReceiver(mReceiver, sIF);      
    }

    @Override
    protected void onPause() {
        super.onPause();
        this.unregisterReceiver(this.mReceiver);
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
       getMenuInflater().inflate(R.menu.main, menu);
        return true;
    }

    public void startNoiseProcessService() {
        Intent intent = new Intent(this,StreamService.class);
        startService(intent);
    }


}

I have implemented another class that starts the speech recognition as a background task by extending the IntentService class (now changed to a started Service, as mentioned in the edit above). The implementation is as follows.

Code - StreamService.java

package com.example.speechsensorservice;

import java.util.ArrayList;

import android.app.Service;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;
import android.content.IntentFilter;
import android.os.Bundle;
import android.os.IBinder;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;
import android.util.Log;

public class StreamService extends Service {
     private static final String TAG = "SpeechSensor";
     private static final String ACTION = "com.example.speechsensorservice.TEXT";
    private SpeechRecognizer sr;

    private BroadcastReceiver sReceiver;

    private boolean headsetConnected = true;

    String text;


    @Override
    public IBinder onBind(Intent arg0) {
        // TODO Auto-generated method stub
        return null;
    }

    @Override
    public void onCreate() {
        Log.d(TAG, "onCreate() StreamService Method");
        super.onCreate();
        sReceiver = new BroadcastReceiver() {
            public void onReceive(Context arg0, Intent arg1) {
                // TODO Auto-generated method stub
                if ( Intent.ACTION_HEADSET_PLUG.equals(arg1.getAction()) ) {
                    if ( arg1.hasExtra("state")) {
                            if ( headsetConnected && arg1.getIntExtra("state", 0) == 0 ) {
                                headsetConnected = false;
                                stopStreaming(); 
                            } 
                    }
                }
            }

        };  
        this.registerReceiver(sReceiver, new IntentFilter(Intent.ACTION_HEADSET_PLUG)); 
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        Log.d(TAG,"Inside onStartCommand()");
    //  Runnable r = new Runnable() {
    //      public void run() {
                startStreaming();
    //      }
    //  };

    //  Thread t = new Thread(r);
    //  t.start();

        return Service.START_STICKY;

    }

    @Override
    public  void onDestroy() {
        Log.d(TAG, "onDestroy() StreamService Method");
        super.onDestroy();
        this.unregisterReceiver(this.sReceiver);
    }


     public void startStreaming() {
         Log.d(TAG, "Inside startStreaming()");
            Intent intent;
            text = "";
            if ( !SpeechRecognizer.isRecognitionAvailable(this) ) {
                Log.d(TAG, "Not Applicable with your device");
                text = "NA";
                intent = new Intent(ACTION);
                intent.putExtra("Identity", text);
                sendBroadcast(intent);
            }
            else {
                Log.d(TAG, "started taking input");
                sr = SpeechRecognizer.createSpeechRecognizer(this.getApplicationContext());

                intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);

                //intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "hi-IN");
                intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL, RecognizerIntent.LANGUAGE_MODEL_FREE_FORM); // EXTRA_LANGUAGE_MODEL expects a model constant; a locale such as "en-US" belongs in EXTRA_LANGUAGE
             //   intent.putExtra(RecognizerIntent.EXTRA_MAX_RESULTS, 3);

                sr.setRecognitionListener( new mylistener());
                sr.startListening(intent);
            }

     }

     public void stopStreaming() {
            if ( sr == null ) return;
            Log.d(TAG, "stopped taking input");
            sr.cancel();
            sr.destroy();
            sr = null;
            this.stopSelf();
     }

     public boolean isStreaming() {
            // TODO Auto-generated method stub
            Log.d(TAG,"isStreaming : YES");
            if ( sr != null ) return true;
            return false;
     }

     class mylistener implements RecognitionListener {

            @Override
            public void onBeginningOfSpeech() {
                // TODO Auto-generated method stub
                Log.d(TAG, "onBeginningOfSpeech");
            }

            @Override
            public void onBufferReceived(byte[] arg0) {
                // TODO Auto-generated method stub

            }

            @Override
            public void onEndOfSpeech() {
                // TODO Auto-generated method stub
                Log.d(TAG, "onEndOfSpeech");
            }

            @Override
            public void onError(int arg0) {
                // TODO Auto-generated method stub

            }

            @Override
            public void onEvent(int arg0, Bundle arg1) {
                // TODO Auto-generated method stub

            }

            @Override
            public void onPartialResults(Bundle arg0) {
                // TODO Auto-generated method stub

            }

            @Override
            public void onReadyForSpeech(Bundle arg0) {
                // TODO Auto-generated method stub
                Log.d(TAG, "onReadyForSpeech");
            }

            @Override
            public void onResults(Bundle arg0) {
                // TODO Auto-generated method stub


                Log.d(TAG, "Got Results");
                ArrayList<String> al = arg0.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
                text = al.get(0);
                for ( int i =0 ; i < al.size(); i++ ) {
                    Log.d(TAG,"result=" + al.get(i));
                }
                Intent intent = new Intent(ACTION);
                intent.putExtra("Identifier", text);
                sendBroadcast(intent);
              //  startStreaming();

            }

            @Override
            public void onRmsChanged(float arg0) {
                // TODO Auto-generated method stub

            }

        }

}

I am getting the error here: java.lang.RuntimeException: SpeechRecognizer should be used only from the application's main thread

The code flow is like this:

ImageButton -> onClick() -> fires the service Intent for StreamService.class -> onCreate() -> onHandleIntent() -> calls startStreaming() -> error occurs
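For reference, a minimal sketch of one way around that RuntimeException, assuming the only requirement is that the SpeechRecognizer calls are made on the main thread: post the recognizer work to a Handler bound to the main Looper. The class name RecognizerStarter and its wiring into StreamService are hypothetical; Handler, Looper, SpeechRecognizer and RecognizerIntent are standard framework classes.

import android.content.Context;
import android.content.Intent;
import android.os.Handler;
import android.os.Looper;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;

// Illustrative helper: marshals SpeechRecognizer calls onto the main thread,
// so it can be invoked from a worker thread such as IntentService.onHandleIntent().
public class RecognizerStarter {

    private final Handler mainHandler = new Handler(Looper.getMainLooper());
    private SpeechRecognizer sr;

    public void startListening(final Context appContext, final RecognitionListener listener) {
        mainHandler.post(new Runnable() {
            @Override
            public void run() {
                // Created and started on the main thread, as the API requires.
                sr = SpeechRecognizer.createSpeechRecognizer(appContext);
                sr.setRecognitionListener(listener);
                Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
                intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                        RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
                sr.startListening(intent);
            }
        });
    }
}

With the started-Service approach from the edit above, onStartCommand() already runs on the main thread, so this indirection is only needed when the call originates on a worker thread.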

LogCat messages:

12-13 17:03:24.822   794  7381 E DatabaseUtils: Writing exception to parcel
12-13 17:03:24.822   794  7381 E DatabaseUtils: java.lang.SecurityException: Permission Denial: get/set setting for user asks to run as user -2 but is calling from user 0; this requires android.permission.INTERACT_ACROSS_USERS_FULL
12-13 17:03:24.822   794  7381 E DatabaseUtils:     at com.android.server.am.ActivityManagerService.handleIncomingUser(ActivityManagerService.java:12754)
12-13 17:03:24.822   794  7381 E DatabaseUtils:     at android.app.ActivityManager.handleIncomingUser(ActivityManager.java:1998)
12-13 17:03:24.822   794  7381 E DatabaseUtils:     at com.android.providers.settings.SettingsProvider.call(SettingsProvider.java:574)
12-13 17:03:24.822   794  7381 E DatabaseUtils:     at android.content.ContentProvider$Transport.call(ContentProvider.java:256)
12-13 17:03:24.822   794  7381 E DatabaseUtils:     at android.content.ContentProviderNative.onTransact(ContentProviderNative.java:256)
12-13 17:03:24.822   794  7381 E DatabaseUtils:     at android.os.Binder.execTransact(Binder.java:351)
12-13 17:03:24.822   794  7381 E DatabaseUtils:     at dalvik.system.NativeStart.run(Native Method)

Sometimes that particular error is actually misleading and is caused by other runtime issues.

One such example is documented here: a NullPointerException was thrown and ultimately got reported as this same error, even though it had nothing to do with cross-user permissions.

In my particular case, ProGuard was stripping out a method that I needed, which caused a NullPointerException to be thrown. The stack trace looked like this:

Permission Denial: get/set setting for user asks to run as user -2 but is calling from user 0; this requires android.permission.INTERACT_ACROSS_USERS_FULL
java.lang.NullPointerException
 at java.lang.Enum$1.create(Enum.java:43)
 at java.lang.Enum$1.create(Enum.java:35)
 at libcore.util.BasicLruCache.get(BasicLruCache.java:54)
 at java.lang.Enum.getSharedConstants(Enum.java:209)
 at java.lang.Enum.valueOf(Enum.java:189)
 at com.my.app.package.b.c.a(Unknown Source)
 at com.my.app.package.b.a.onCreate(Unknown Source)
 at android.support.v4.app.FragmentManagerImpl.moveToState(Unknown Source)
 at android.support.v4.app.FragmentManagerImpl.moveToState(Unknown Source)
 at android.support.v4.app.BackStackRecord.run(Unknown Source)
 at android.support.v4.app.FragmentManagerImpl.execPendingActions(Unknown Source)
 at android.support.v4.app.FragmentManagerImpl$1.run(Unknown Source)
 at android.os.Handler.handleCallback(Handler.java:730)
 at android.os.Handler.dispatchMessage(Handler.java:92)
 at android.os.Looper.loop(Looper.java:137)
 at android.app.ActivityThread.main(ActivityThread.java:5455)
 at java.lang.reflect.Method.invokeNative(Native Method)
 at java.lang.reflect.Method.invoke(Method.java:525)
 at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1187)
 at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1003)
 at dalvik.system.NativeStart.main(Native Method)

I have no idea why Android turns a NullPointerException into an android.permission.INTERACT_ACROSS_USERS_FULL error, but the obvious solution was to adjust the ProGuard configuration so that the method is not stripped.

The method in question was the 'valueOf' method on an enum. It turns out there is some interesting reflection going on under the hood (covered in the link above), but for me the fix was to add the following to the ProGuard configuration:

-keepclassmembers enum * {
    public static **[] values();
    public static ** valueOf(java.lang.String);
}

Nice question with a self-explanatory description. The first line in your LogCat gives you the solution: the current thread does not have permission to perform the task for that user, so just add the following permission to your manifest and see whether it works.

<uses-permission android:name="android.permission.INTERACT_ACROSS_USERS_FULL" />

Let me know if I have understood the question correctly.
