
android-speech's People

Contributors

fredyk, gotev, kristiyanp, yuripourre


android-speech's Issues

Languages support

Can it be used to recognize different languages? For example, I speak in one language, take the recognized text, translate it to another language with the Google Translate API, and build something similar to the Google Translate app (speech-to-speech translation). I am using the following algorithm:

  1. Record sound from the microphone
  2. Encode it to FLAC format
  3. Send it to Google for recognition via the speech API
  4. When you receive the answer, translate it to the desired language with the Google Translate API
  5. After translation, convert the translated text to speech (you could use the Google speech API for that)
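If you build this on top of this library, steps 1–3 collapse into startListening() (the system recognizer handles recording, encoding and recognition) and step 5 is say(). Below is a rough sketch in Java; translateWithGoogleApi() is a hypothetical helper you would implement against the Google Translate API, the locales are placeholders, and the SpeechProgressView import path is assumed:

    import java.util.List;
    import java.util.Locale;
    import net.gotev.speech.GoogleVoiceTypingDisabledException;
    import net.gotev.speech.Speech;
    import net.gotev.speech.SpeechDelegate;
    import net.gotev.speech.SpeechRecognitionNotAvailable;
    import net.gotev.speech.ui.SpeechProgressView; // import path assumed

    public class SpeechToSpeechTranslator implements SpeechDelegate {

        private final SpeechProgressView progressView;

        public SpeechToSpeechTranslator(SpeechProgressView progressView) {
            this.progressView = progressView;
        }

        public void start() {
            try {
                Speech.getInstance().setLocale(new Locale("en")); // language being spoken
                Speech.getInstance().startListening(progressView, this);
            } catch (SpeechRecognitionNotAvailable exc) {
                // no speech recognition service on this device
            } catch (GoogleVoiceTypingDisabledException exc) {
                // ask the user to enable Google voice typing
            }
        }

        @Override public void onStartOfSpeech() { }
        @Override public void onSpeechRmsChanged(float value) { }
        @Override public void onSpeechPartialResults(List<String> results) { }

        @Override
        public void onSpeechResult(String recognized) {
            // step 4: translate the recognized text (hypothetical helper, not part of the library)
            String translated = translateWithGoogleApi(recognized, "en", "es");
            // step 5: speak the translation in the target language
            Speech.getInstance().setLocale(new Locale("es"));
            Speech.getInstance().say(translated);
        }

        private String translateWithGoogleApi(String text, String from, String to) {
            return text; // placeholder: call the Google Cloud Translation API here
        }
    }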

Is there a way to know when the TextToSpeech engine successfully started, without the logger delegate?

Hi, I am not sure whether the loggerDelegate will work in production mode. I am using this delegate to know when the engine has started, so that I can set the locale there. Example:

    Speech.init(this, packageName)
    Logger.setLogLevel(Logger.LogLevel.DEBUG)
    Logger.setLoggerDelegate(object : LoggerDelegate {
        override fun error(tag: String?, message: String?) {
        }

        override fun error(tag: String?, message: String?, exception: Throwable?) {
        }

        override fun debug(tag: String?, message: String?) {
        }

        override fun info(tag: String?, message: String?) {
            // Fragile: relies on matching the exact log message text
            if (message == "TextToSpeech engine successfully started") {
                Speech.getInstance().setLocale(Locale("es", "MX"))
            }
        }
    })
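A less fragile alternative, sketched here in Java with the platform TextToSpeech API rather than a documented feature of this library: TextToSpeech exposes an explicit init callback, so a throwaway probe instance can signal that an engine is available before you configure the locale. Note this only tells you the engine can start; it does not guarantee the library's own instance has finished binding.

    import android.content.Context;
    import android.speech.tts.TextToSpeech;
    import java.util.Locale;
    import net.gotev.speech.Speech;

    public class TtsReadyProbe {

        private TextToSpeech probe;

        // Sketch: use the platform init callback instead of matching a log message.
        public void configureLocaleWhenTtsReady(Context context) {
            probe = new TextToSpeech(context, status -> {
                if (status == TextToSpeech.SUCCESS) {
                    // An engine is available: configure the library's locale now.
                    Speech.getInstance().setLocale(new Locale("es", "MX"));
                }
                probe.shutdown(); // release the throwaway probe instance
            });
        }
    }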

Question

Can I import these files into an Android Studio Project?

Is there a way to read large amounts of text?

Thanks for this code.

I have tried reading a single piece of text and it works, but I am interested in reading a lot of text. I have chunked the content so that each piece doesn't exceed the maximum number of characters allowed, but reading doesn't work.

This is my code:

        Speech speaker=Speech.getInstance();
        String[] textParts = mText.split(SEPARATOR);

        for (String s : textParts) {
            speaker.say(s,this);
        }

And I have this error:

W/TextToSpeech: Synthesis request paramter streamType containst value with invalid type. Should be an Integer or a Long

Is there a way to read large amounts of text?
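One way to read the chunks strictly one after another is to start the next chunk only when the previous one reports completion, instead of calling say() in a loop. This sketch assumes the library exposes a TextToSpeechCallback with onStart()/onCompleted()/onError() for say(); verify the exact interface and package in the version you use:

    import java.util.ArrayDeque;
    import java.util.Deque;
    import net.gotev.speech.Speech;
    import net.gotev.speech.TextToSpeechCallback; // assumed interface/package, verify in your version

    public final class ChunkedReader {

        private final Deque<String> chunks = new ArrayDeque<>();

        public void read(String text, String separator) {
            for (String part : text.split(separator)) {
                chunks.add(part);
            }
            speakNext();
        }

        private void speakNext() {
            String next = chunks.poll();
            if (next == null) return; // nothing left to read
            Speech.getInstance().say(next, new TextToSpeechCallback() {
                @Override public void onStart() { }
                @Override public void onCompleted() { speakNext(); } // chain the next chunk
                @Override public void onError() { speakNext(); }     // skip a failed chunk and continue
            });
        }
    }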

Not allowed to bind to service Intent

When Google voice typing is not enabled, the speech library throws this exception:

java.lang.SecurityException: Not allowed to bind to service Intent { act=android.speech.RecognitionService cmp=com.mediatek.voicecommand/.service.VoiceWakeupRecognitionService }

It would be easier to understand if the library caught it and rethrew it as a GoogleVoiceTypingDisabledException.
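Until the library wraps it, a workaround sketch is to handle the SecurityException at the call site together with the exceptions the library already declares. Whether the exception surfaces synchronously at this call site can vary by device; promptUserToEnableGoogleVoiceTyping() is a hypothetical helper in your own app, and progressView/delegate are your own objects:

    private void startListeningSafely(SpeechProgressView progressView, SpeechDelegate delegate) {
        try {
            Speech.getInstance().startListening(progressView, delegate);
        } catch (SpeechRecognitionNotAvailable exc) {
            // no speech recognition service installed at all
        } catch (GoogleVoiceTypingDisabledException exc) {
            promptUserToEnableGoogleVoiceTyping();
        } catch (SecurityException exc) {
            // binding to the recognition service was refused (e.g. voice typing disabled)
            promptUserToEnableGoogleVoiceTyping();
        }
    }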

SpeechRecognitionException: 3 - Audio recording error

I got an error where speech recognition just decided to finish by itself. The error doesn't even show up in the log unless I add Logger.setLogLevel(Logger.LogLevel.DEBUG);.

Logcat with 'speech' regex:

10-06 23:24:06.517 9162-9162/skrip.si.findthissong D/Speech: DelayedOperation - created delayed operation with tag: delayStopListening
10-06 23:24:06.566 9162-9162/skrip.si.findthissong I/TextToSpeech: Sucessfully bound to com.google.android.tts
10-06 23:24:06.567 9162-9162/skrip.si.findthissong W/TextToSpeech: setLanguage failed: not bound to TTS engine
10-06 23:24:06.567 9162-9162/skrip.si.findthissong D/Speech: DelayedOperation - created delayed operation with tag: delayStopListening
10-06 23:24:06.572 9162-9162/skrip.si.findthissong I/speech: speech recognition is now active
10-06 23:24:06.576 9162-9215/skrip.si.findthissong D/FA: Logging event (FE): screen_view(_vs), Bundle[{firebase_event_origin(_o)=auto, firebase_previous_class(_pc)=MainActivity, firebase_previous_id(_pi)=-7656231050442859303, firebase_screen_class(_sc)=SpeechSearchDialog, firebase_screen_id(_si)=-7656231050442859302}]
10-06 23:24:06.650 9162-9162/skrip.si.findthissong I/TextToSpeech: Connected to ComponentInfo{com.google.android.tts/com.google.android.tts.service.GoogleTTSService}
10-06 23:24:06.660 9162-9242/skrip.si.findthissong I/TextToSpeech: Set up connection to ComponentInfo{com.google.android.tts/com.google.android.tts.service.GoogleTTSService}
10-06 23:24:06.688 9162-9162/skrip.si.findthissong I/Speech: Speech - TextToSpeech engine successfully started
10-06 23:24:06.763 9162-9162/skrip.si.findthissong D/speech: rms is now: -2.12
10-06 23:24:06.821 9162-9162/skrip.si.findthissong D/speech: rms is now: -2.12
10-06 23:24:06.822 9162-9162/skrip.si.findthissong E/Speech: Speech - Speech recognition error
                                                             net.gotev.speech.SpeechRecognitionException: 3 - Audio recording error
                                                                 at net.gotev.speech.Speech$2.onError(Speech.java:180)
                                                                 at android.speech.SpeechRecognizer$InternalListener$1.handleMessage(SpeechRecognizer.java)
                                                                 at android.os.Handler.dispatchMessage(Handler.java)
                                                                 at android.os.Looper.loop(Looper.java)
                                                                 at android.app.ActivityThread.main(ActivityThread.java)
                                                                 at java.lang.reflect.Method.invoke(Native Method)
                                                                 at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java)
                                                                 at com.android.internal.os.ZygoteInit.main(ZygoteInit.java)
10-06 23:24:06.822 9162-9162/skrip.si.findthissong I/speech: result: 
10-06 23:24:06.831 9162-9162/skrip.si.findthissong D/Speech: DelayedOperation - created delayed operation with tag: delayStopListening
10-06 23:24:06.862 9162-9215/skrip.si.findthissong D/FA: Logging event (FE): screen_view(_vs), Bundle[{firebase_event_origin(_o)=auto, firebase_previous_class(_pc)=SpeechSearchDialog, firebase_previous_id(_pi)=-7656231050442859302, firebase_screen_class(_sc)=MainActivity, firebase_screen_id(_si)=-7656231050442859303}]

SpeechSearchDialog activity:
public class SpeechSearchDialog extends AppCompatActivity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        supportRequestWindowFeature(Window.FEATURE_NO_TITLE);
        setContentView(R.layout.activity_speech_search_dialog);

        final SpeechProgressView progressView = findViewById(R.id.progress);
        final TextView resultText = findViewById(R.id.text_result);

        Logger.setLogLevel(Logger.LogLevel.DEBUG);

        try {
            // https://gotev.github.io/android-speech/
            Speech.init(this, getPackageName());
//            Speech.getInstance().setLocale(new Locale("id"));
            Speech.getInstance().setStopListeningAfterInactivity(6000);
            Speech.getInstance().startListening(progressView, new SpeechDelegate() {
                @Override
                public void onStartOfSpeech() {
                    Log.i("speech", "speech recognition is now active");
                }

                @Override
                public void onSpeechRmsChanged(float v) {
                    Log.d("speech", "rms is now: " + v);
                }

                @Override
                public void onSpeechPartialResults(List<String> results) {
                    StringBuilder str = new StringBuilder();
                    for (String res : results) {
                        str.append(res).append(" ");
                    }
                    Log.i("speech", "partial result: " + str.toString().trim());

                    resultText.setText(str.toString().trim());
                }

                @Override
                public void onSpeechResult(String result) {
                    Log.i("speech", "result: " + result);

                    Intent returnIntent = new Intent();
                    returnIntent.putExtra(StaticVariablesHelper.ARG_PATTERN, result);
                    setResult(Activity.RESULT_OK, returnIntent);

                    finish();
                }
            });
        } catch (SpeechRecognitionNotAvailable exc) {
            Log.e("Speech", "Speech recognition is not available on this device!");
        } catch (GoogleVoiceTypingDisabledException exc) {
            Log.e("Speech", "Google voice typing must be enabled!");
        } catch (Exception e) {
            Log.e("Speech", e+"");
        }
    }

    @Override
    protected void onDestroy() {
        super.onDestroy();
        Speech.getInstance().shutdown();
    }
}

I already granted the RECORD_AUDIO permission.
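One thing worth double-checking on API 23+ is that RECORD_AUDIO was granted at runtime, not only declared in the manifest (ERROR_AUDIO can also occur when another app is holding the microphone). A minimal runtime check before startListening(), using the standard AndroidX permission helpers rather than anything specific to this library:

    // Inside the Activity, before Speech.getInstance().startListening(...).
    // Requires androidx.core (ContextCompat / ActivityCompat) and android.Manifest.
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(
                this, new String[] { Manifest.permission.RECORD_AUDIO }, 1);
        return; // wait for onRequestPermissionsResult() before starting to listen
    }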

Issue in Offline mode

It doesn't recognize properly in offline mode.

Is there any separate method for offline?
PS: Speech.getInstance().setPreferOffline(true); is not working.
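For reference, the platform flag this setting presumably maps to is RecognizerIntent.EXTRA_PREFER_OFFLINE (API 23+), and offline recognition also requires the offline language pack to be downloaded in the device's Google voice input settings. A sketch driving the platform SpeechRecognizer directly, outside this library:

    // Platform SpeechRecognizer with the offline preference flag (API 23+).
    // A RecognitionListener must be set to receive results; omitted here for brevity.
    Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
    intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
            RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
    intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, "en-US");
    intent.putExtra(RecognizerIntent.EXTRA_PREFER_OFFLINE, true);

    SpeechRecognizer recognizer = SpeechRecognizer.createSpeechRecognizer(context);
    // recognizer.setRecognitionListener(...);
    recognizer.startListening(intent);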

Speech Recognition uses only English

The library ignores setLocale for recognition; it applies only to the text-to-speech engine. In the code you apply it to the speech recognition engine too, but it doesn't take effect. What is the problem? I call setLocale before invoking the startListening method, but it only recognizes speech in English.
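For reference, a sketch of the call order that is expected to apply the locale to recognition: set it after init and before startListening(). progressView and delegate are your own objects, and the recognizer may still fall back to the device's Google voice-input language if the requested language pack is not installed:

    Speech.init(this, getPackageName());
    Speech.getInstance().setLocale(new Locale("de", "DE")); // target recognition language
    try {
        Speech.getInstance().startListening(progressView, delegate);
    } catch (SpeechRecognitionNotAvailable exc) {
        // no speech recognition service on this device
    } catch (GoogleVoiceTypingDisabledException exc) {
        // ask the user to enable Google voice typing
    }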

Arabic text to speech

Is it possible to read Arabic text with text to speech?
I've tested with Locale mLocale = new Locale("ar"); but it is not working.
Please help, thanks in advance.
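Arabic output depends on the installed TTS engine actually shipping an Arabic voice. A quick availability check with the platform TextToSpeech API (outside this library); if data is missing, it has to be installed from the engine's settings:

    import android.content.Context;
    import android.speech.tts.TextToSpeech;
    import java.util.Locale;

    public class ArabicTtsCheck {

        private TextToSpeech probe;

        // Check whether the installed TTS engine ships an Arabic voice at all.
        public void checkArabicSupport(Context context) {
            probe = new TextToSpeech(context, status -> {
                if (status == TextToSpeech.SUCCESS) {
                    int result = probe.isLanguageAvailable(new Locale("ar"));
                    if (result == TextToSpeech.LANG_MISSING_DATA
                            || result == TextToSpeech.LANG_NOT_SUPPORTED) {
                        // Arabic voice data is not available in the current engine
                    }
                }
                probe.shutdown(); // release the probe instance
            });
        }
    }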

how to get same result as demo app

I followed the code in the demo app, but the voice that came out was not the same as in the demo app. How do I configure it to get the same result?

onStartOfSpeech is called too early

Hi,
I used the onStartOfSpeech() method of the SpeechDelegate to show the SpeechProgressView, but the progress view is shown before the recognizer is ready. This is because the speechRecognizer.startListening(intent) call is an asynchronous operation. Instead, the RecognitionListener callbacks onReadyForSpeech()/onBeginningOfSpeech() would have to be used. The documentation for onReadyForSpeech() says:

Called when the endpointer is ready for the user to start speaking.

So a solution would be to call onStartOfSpeech() from there, or instead introduce another callback in SpeechDelegate, such as onBeginningOfSpeech().
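For reference, these are the relevant callbacks on the platform RecognitionListener that the SpeechRecognizer drives; a sketch showing where a progress view could be shown instead (recognizer is an android.speech.SpeechRecognizer):

    // Platform RecognitionListener callbacks relevant to this issue: showing the
    // progress view in onReadyForSpeech()/onBeginningOfSpeech() avoids displaying it
    // before the recognizer is actually ready.
    recognizer.setRecognitionListener(new RecognitionListener() {
        @Override public void onReadyForSpeech(Bundle params) {
            // endpointer ready: now it makes sense to show the progress view
        }
        @Override public void onBeginningOfSpeech() {
            // the user has actually started speaking
        }
        @Override public void onRmsChanged(float rmsdB) { }
        @Override public void onBufferReceived(byte[] buffer) { }
        @Override public void onEndOfSpeech() { }
        @Override public void onError(int error) { }
        @Override public void onResults(Bundle results) { }
        @Override public void onPartialResults(Bundle partialResults) { }
        @Override public void onEvent(int eventType, Bundle params) { }
    });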

Anyway, thanks for the work you put into this; it's a nice sample of how to use the Android speech features.

Manifest merger failed with multiple error

Hi,
I am getting the above error while adding the library in Gradle. On syncing, it fails with "Manifest merger failed with multiple errors".

Regards,
Junaid Ahmed

Letter 'E' is pronounced as 'A'

The output I hear for the following line of code is 'A':
Speech.getInstance().say("E")

Let me know if you need anything else from my side.

Muting the underlying API

Would it be possible to mute the sounds which the underlying API makes?
E.g. when we start listening for voice input, this is the flow that the API goes through:

  1. Play beep, start listening
  2. Speech input
  3. Convert to text
  4. End of input
  5. Play beep, stop listening

I do not like the sounds in steps 1 and 5. I've searched a bit and found this https://stackoverflow.com/a/37882934/5273299 and this https://android.stackexchange.com/a/129713. Would it be possible to integrate such a solution into this library, and then toggle the sound on/off based on a boolean?
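The approach in those links boils down to muting the stream that plays the recognizer beep around the listening session. A sketch with the standard AudioManager API (ADJUST_MUTE requires API 23+; which stream actually carries the beep varies by device, often STREAM_MUSIC or STREAM_NOTIFICATION; progressView and delegate are your own objects):

    AudioManager audio = (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);

    // Silence the stream that plays the start beep before listening begins.
    audio.adjustStreamVolume(AudioManager.STREAM_MUSIC, AudioManager.ADJUST_MUTE, 0);
    try {
        Speech.getInstance().startListening(progressView, delegate);
    } catch (SpeechRecognitionNotAvailable exc) {
        audio.adjustStreamVolume(AudioManager.STREAM_MUSIC, AudioManager.ADJUST_UNMUTE, 0);
    } catch (GoogleVoiceTypingDisabledException exc) {
        audio.adjustStreamVolume(AudioManager.STREAM_MUSIC, AudioManager.ADJUST_UNMUTE, 0);
    }

    // Later, e.g. from the delegate's onSpeechResult(), restore the stream:
    // audio.adjustStreamVolume(AudioManager.STREAM_MUSIC, AudioManager.ADJUST_UNMUTE, 0);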

Great library, thanks!

Minimum API version

Please tell me the minimum Android API version for your library.
For example, on API 10, Speech.init() gives java.lang.NoClassDefFoundError: net.gotev.speech.TtsProgressListener.

I've run into trouble with the production version of my app for a user on Android 2.3.3 (API 10).

Using different source

Hi!

More than an issue, this is a question: is it possible to use a different audio source, like an external microphone?

Thanks!

Mute Tune

How can I silence the start and end sounds (tone or beep)?

java.lang.IllegalArgumentException: Service not registered

java.lang.IllegalArgumentException: Service not registered: android.speech.SpeechRecognizer$Connection@367945c4
       at android.app.LoadedApk.forgetServiceDispatcher(LoadedApk.java:1094)
       at android.app.ContextImpl.unbindService(ContextImpl.java:2000)
       at android.content.ContextWrapper.unbindService(ContextWrapper.java:558)
       at android.speech.SpeechRecognizer.destroy(SpeechRecognizer.java:408)
       at net.gotev.speech.Speech.initSpeechRecognizer(Speech.java:262)
       at net.gotev.speech.Speech.returnPartialResultsAndRecreateSpeechRecognizer(Speech.java:471)
       at net.gotev.speech.Speech.stopListening(Speech.java:442)
       at net.igenius.crystal.view.activity.AdvisorActivity.stopSpeech(AdvisorActivity.java:228)

Changing Voice

I'm trying to change the default voice to a male voice using the setVoice method, but it doesn't seem to work:

Speech.getInstance().setVoice(new Voice("en-us-x-sfg#male_1-local", Locale.ENGLISH, Voice.QUALITY_HIGH, Voice.LATENCY_VERY_LOW, false, a));

MainActivity cannot be cast to net.gotev.speech.SpeechDelegate

I am trying to implement voice recognition in a Fragment, and I am getting the error below:
2021-12-06 16:40:22.470 28156-28156/com.sample.coin E/InputEventReceiver: Exception dispatching input event.
2021-12-06 16:40:22.470 28156-28156/com.sample.coin E/MessageQueue-JNI: Exception in MessageQueue callback: handleReceiveCallback
2021-12-06 16:40:22.471 28156-28156/com.sample.coin E/MessageQueue-JNI: java.lang.ClassCastException: com.sample.coin.customer.ui.main.MainActivity cannot be cast to net.gotev.speech.SpeechDelegate
at com.sample.coin.customer.ui.fragment.home.HomeFragment.onRecordAudioPermissionGranted(HomeFragment.java:225)
at com.sample.coin.customer.ui.fragment.home.HomeFragment.VoiceSearch(HomeFragment.java:180)
at com.sample.coin.customer.ui.fragment.home.HomeFragment.access$100(HomeFragment.java:44)
at com.sample.coin.customer.ui.fragment.home.HomeFragment$2.onTouch(HomeFragment.java:132)
at android.view.View.dispatchTouchEvent(View.java:14442)
at android.view.ViewGroup.dispatchTransformedTouchEvent(ViewGroup.java:3170)
at android.view.ViewGroup.dispatchTouchEvent(ViewGroup.java:2832)
at android.view.ViewGroup.dispatchTransformedTouchEvent(ViewGroup.java:3170)
at android.view.ViewGroup.dispatchTouchEvent(ViewGroup.java:2832)
at android.view.ViewGroup.dispatchTransformedTouchEvent(ViewGroup.java:3170)
at android.view.ViewGroup.dispatchTouchEvent(ViewGroup.java:2832)
at android.view.ViewGroup.dispatchTransformedTouchEvent(ViewGroup.java:3170)
at android.view.ViewGroup.dispatchTouchEvent(ViewGroup.java:2832)
at android.view.ViewGroup.dispatchTransformedTouchEvent(ViewGroup.java:3170)
at android.view.ViewGroup.dispatchTouchEvent(ViewGroup.java:2832)
at android.view.ViewGroup.dispatchTransformedTouchEvent(ViewGroup.java:3170)
at android.view.ViewGroup.dispatchTouchEvent(ViewGroup.java:2832)
at android.view.ViewGroup.dispatchTransformedTouchEvent(ViewGroup.java:3170)
at android.view.ViewGroup.dispatchTouchEvent(ViewGroup.java:2832)
at android.view.ViewGroup.dispatchTransformedTouchEvent(ViewGroup.java:3170)
at android.view.ViewGroup.dispatchTouchEvent(ViewGroup.java:2832)
at android.view.ViewGroup.dispatchTransformedTouchEvent(ViewGroup.java:3170)
at android.view.ViewGroup.dispatchTouchEvent(ViewGroup.java:2832)
at android.view.ViewGroup.dispatchTransformedTouchEvent(ViewGroup.java:3170)
at android.view.ViewGroup.dispatchTouchEvent(ViewGroup.java:2832)
at android.view.ViewGroup.dispatchTransformedTouchEvent(ViewGroup.java:3170)
at android.view.ViewGroup.dispatchTouchEvent(ViewGroup.java:2832)
at android.view.ViewGroup.dispatchTransformedTouchEvent(ViewGroup.java:3170)
at android.view.ViewGroup.dispatchTouchEvent(ViewGroup.java:2832)
at com.android.internal.policy.DecorView.superDispatchTouchEvent(DecorView.java:575)
at com.android.internal.policy.PhoneWindow.superDispatchTouchEvent(PhoneWindow.java:1945)
at android.app.Activity.dispatchTouchEvent(Activity.java:4232)
at androidx.appcompat.view.WindowCallbackWrapper.dispatchTouchEvent(WindowCallbackWrapper.java:69)
at com.android.internal.policy.DecorView.dispatchTouchEvent(DecorView.java:528)
at android.view.View.dispatchPointerEvent(View.java:14724)
at android.view.ViewRootImpl$ViewPostImeInputStage.processPointerEvent(ViewRootImpl.java:6516)
at android.view.ViewRootImpl$ViewPostImeInputStage.onProcess(ViewRootImpl.java:6299)
at android.view.ViewRootImpl$InputStage.deliver(ViewRootImpl.java:5760)
at android.view.ViewRootImpl$InputStage.onDeliverToNext(ViewRootImpl.java:5827)
at android.view.ViewRootImpl$InputStage.forward(ViewRootImpl.java:5788)
at android.view.ViewRootImpl$AsyncInputStage.forward(ViewRootImpl.java:5950)
at android.view.ViewRootImpl$InputStage.apply(ViewRootImpl.java:5796)
at android.view.ViewRootImpl$AsyncInputStage.apply(ViewRootImpl.java:6007)
at android.view.ViewRootImpl$InputStage.deliver(ViewRootImpl.java:5764)
at android.view.ViewRootImpl$InputStage.onDeliverToNext(ViewRootImpl.java:5827)
at android.view.ViewRootImpl$InputStage.forward(ViewRootImpl.java:5788)
at android.view.ViewRootImpl$InputStage.apply(ViewRootImpl.java:5796)
at android.view.ViewRootImpl$InputStage.deliver(ViewRootImpl.java:5764)
at android.view.ViewRootImpl.deliverInputEvent(ViewRootImpl.java:8619)
at android.view.ViewRootImpl.doProcessInputEvents(ViewRootImpl.java:8570)
2021-12-06 16:40:22.471 28156-28156/com.sample.coin E/MessageQueue-JNI: at android.view.ViewRootImpl.enqueueInputEvent(ViewRootImpl.java:8522)
at android.view.ViewRootImpl$WindowInputEventReceiver.onInputEvent(ViewRootImpl.java:8759)
at android.view.InputEventReceiver.dispatchInputEvent(InputEventReceiver.java:238)
at android.os.MessageQueue.nativePollOnce(Native Method)
at android.os.MessageQueue.next(MessageQueue.java:339)
at android.os.Looper.loop(Looper.java:208)
at android.app.ActivityThread.main(ActivityThread.java:8185)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:626)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1015)
2021-12-06 16:40:22.471 28156-28156/com.sample.coin D/AndroidRuntime: Shutting down VM
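The ClassCastException comes from the app code (HomeFragment.onRecordAudioPermissionGranted, line 225), which presumably casts the host Activity to SpeechDelegate. A sketch that avoids the cast by letting the Fragment implement the delegate itself and pass this to startListening() (progressView is assumed to be bound from the Fragment's layout, and the SpeechProgressView import path is assumed):

    import androidx.fragment.app.Fragment;
    import java.util.List;
    import net.gotev.speech.GoogleVoiceTypingDisabledException;
    import net.gotev.speech.Speech;
    import net.gotev.speech.SpeechDelegate;
    import net.gotev.speech.SpeechRecognitionNotAvailable;
    import net.gotev.speech.ui.SpeechProgressView; // import path assumed

    public class HomeFragment extends Fragment implements SpeechDelegate {

        private SpeechProgressView progressView; // bind this in onCreateView()

        private void onRecordAudioPermissionGranted() {
            try {
                // Pass the Fragment itself as the delegate: no cast of the Activity needed
                Speech.getInstance().startListening(progressView, this);
            } catch (SpeechRecognitionNotAvailable exc) {
                // no speech recognition service on this device
            } catch (GoogleVoiceTypingDisabledException exc) {
                // ask the user to enable Google voice typing
            }
        }

        @Override public void onStartOfSpeech() { }
        @Override public void onSpeechRmsChanged(float value) { }
        @Override public void onSpeechPartialResults(List<String> results) { }
        @Override public void onSpeechResult(String result) { /* handle the result */ }
    }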

Choose the audio stream

It would be nice to be able to choose the audio stream for the Speech instance. This would enable usage of the TTS system on bluetooth devices.
Something like:

Speech.getInstance().setAudioStream(AudioManager.STREAM_VOICE_CALL);

I'll submit a PR with this enhancement.

Text to speech not working when device screen is off

Thanks for this good lib.

I am developing an app in which I want to play text to speech from a foreground service.
It works fine when the device screen is ON, but it sometimes does not work when the device screen is OFF.
Can you please suggest a possible solution and the cause of this issue?

Thanks

Add the ability to use network voice.

Could you add the ability to use the network for the voice? The current offline voice sounds really machine-like. I know this can be done by adding this to the TTS params:

map.put(TextToSpeech.Engine.KEY_FEATURE_NETWORK_SYNTHESIS, Boolean.TRUE.toString());

However, when I checked, the library had no option that allowed me to add my own params.
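For reference, this is how that feature is passed to the platform TextToSpeech directly, using the deprecated speak(String, int, HashMap) overload that accepts per-request params (tts is an initialized TextToSpeech instance); since API 21 the recommended route is selecting a network Voice instead:

    // Deprecated-but-working platform overload that accepts feature params per request.
    HashMap<String, String> params = new HashMap<>();
    params.put(TextToSpeech.Engine.KEY_FEATURE_NETWORK_SYNTHESIS, Boolean.TRUE.toString());
    tts.speak("Hello from the network voice", TextToSpeech.QUEUE_FLUSH, params);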

Implement on speech view

Add the possibility to set a view in which to draw an animation when the user speaks during speech recognition

API or URL on Private connection

Hi

I have a private network connection at my company, and some URLs are blocked. I really need to know which API endpoints or URLs this library uses, so I can unblock them on my private connection.

Best Regards


Pronunciation of words is inconsistent

I have a problem with sequential use of the .say method. For example, I wrote the following statements:

Speech.getInstance().say("First Word”);
Speech.getInstance().say("Second Word”);
Speech.getInstance().say("Third Word”);

From this code I will hear only the last phrase, "Third Word"; the others are skipped. How can I make it speak all phrases sequentially, so that the next statement is executed only after the previous one has finished speaking?

To solve my problem I tried to use ADD_QUEUE mode, but it doesn't work the way I want: all the phrases are simply pushed into the queue, without my code waiting for the end of speaking.
