just-ai / aimybox-android-assistant
Embeddable custom voice assistant for Android applications
Home Page: https://aimybox.com
License: Apache License 2.0
I'm new to Android, so this is a newbie question. Along with voice, how can I add text input so users can type too? What should I change in your example/demo code? I want to have both voice and text. I'm using the Rasa NLU stack.
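A minimal sketch of wiring a text field so typed queries reach the same dialog API the voice path uses. Note the method name `sendRequest` is an assumption here: check the `Aimybox` class of your SDK version for the exact call that submits a text query.

```kotlin
// Sketch: route typed queries through the assistant. `aimybox.sendRequest(...)`
// is a hypothetical method name -- verify it against your SDK version's
// Aimybox class before relying on it.
import android.widget.Button
import android.widget.EditText
import com.justai.aimybox.Aimybox

fun wireTextInput(aimybox: Aimybox, input: EditText, sendButton: Button) {
    sendButton.setOnClickListener {
        val query = input.text.toString().trim()
        if (query.isNotEmpty()) {
            aimybox.sendRequest(query) // hypothetical: submit text instead of speech
            input.text.clear()
        }
    }
}
```

With this in place, the same Rasa backend answers both typed and spoken queries, since both go through the configured dialog API.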
Hi. When I click the button link from the Dialogflow API, it dismisses itself instead of opening the link in the browser. How do I solve that?
Hello. I am creating a calorie-counting application with the Edamam API. I created a ChangeViewSkill class. In suspend fun onRequest I accept the request, send it to Edamam, and get a response (food calories). Eventually I want to show the user the calories in the voice assistant widget. How can I do that?
I don't use some of the dialog API libraries.
It looks like AimyboxFragmentImpl does nothing special here, so it would be better to make it part of the components module.
Currently some required components take their versions from Config.kt. This requires one more module just to start using the Aimybox SDK. Is it possible to simplify build.gradle to get rid of the additional module?
It's expected that recognition and synthesis stop once the user taps Home or switches the screen off, as should any other action that interrupts the conversation between the assistant and the user.
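One way to get this behaviour is a lifecycle observer on the hosting Activity, a sketch assuming the `standby()` method on `Aimybox` (if your SDK version names it differently, e.g. separate cancel/stop calls, adapt accordingly):

```kotlin
// Sketch: put the assistant into standby whenever the hosting Activity
// leaves the foreground (Home press, screen off, navigation away).
import androidx.lifecycle.DefaultLifecycleObserver
import androidx.lifecycle.LifecycleOwner
import com.justai.aimybox.Aimybox

class AimyboxLifecycleObserver(private val aimybox: Aimybox) : DefaultLifecycleObserver {
    override fun onPause(owner: LifecycleOwner) {
        // All the interrupting actions above funnel through onPause
        aimybox.standby()
    }
}

// In the Activity: lifecycle.addObserver(AimyboxLifecycleObserver(aimybox))
```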
Hi,
I'm looking for Arabic speech recognition.
GooglePlatformTextToSpeech(context, Locale.ENGLISH) does not support Arabic text-to-speech.
Thanks
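A sketch of requesting Arabic from the platform speech components by passing an Arabic `Locale` instead of `Locale.ENGLISH`. Whether Arabic is actually available depends on the Google speech services installed on the device; the SDK only forwards the locale (the `GooglePlatformSpeechToText` constructor shape is assumed to mirror the TTS one).

```kotlin
// Sketch: Arabic locale for the platform TTS/STT components.
import java.util.Locale
import com.justai.aimybox.speechkit.google.platform.GooglePlatformSpeechToText
import com.justai.aimybox.speechkit.google.platform.GooglePlatformTextToSpeech

val arabic = Locale("ar")  // or Locale("ar", "SA") for a regional variant

val textToSpeech = GooglePlatformTextToSpeech(context, arabic)
val speechToText = GooglePlatformSpeechToText(context, arabic)
```

If the device reports the locale as unsupported, installing or updating the Google Speech Services / Speech Recognition & Synthesis app is usually the prerequisite.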
The demo app works fine when I push it to my Android device. However, when I test it on an Android Studio emulator, the speech recognition does not work: it cancels as soon as I click the microphone button.
app is a more convenient name for any runnable application, I guess.
Please add support for image and button replies.
Hi,
I'm working on a new project where I need a custom voice assistant in my Android app. I tried this example project and everything works perfectly. However, if I go from MainActivity to another activity and try to load the assistant there, TTS, STT, and the wake word don't work: I press the floating button and the assistant does its animation, but that's all. My question is: how can I implement something like the example project but extend the assistant to all activities?
Thank you very much in advance.
Hi,
When I run the demo app on my device, I send my first query ("what time is it"), but there is no reply!
It gives me the error below (in Android Studio):
com.justai.aimybox.core.ApiRequestTimeoutException: Request timeout: AimyboxRequest(query=what time is it, apiKey=Ldf0j7WZi3KwNah2aNeXVIACz0lb9qMH, unitId=8d456677-94d9-4a9a-afd4-fecb599b4545, data={}). Server didn't respond within 10000 ms.
at com.justai.aimybox.api.DialogApi$send$3.invokeSuspend(DialogApi.kt:69)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
at kotlinx.coroutines.ResumeModeKt.resumeUninterceptedWithExceptionMode(ResumeMode.kt:56)
at kotlinx.coroutines.TimeoutCoroutine.afterCompletionInternal(Timeout.kt:98)
at kotlinx.coroutines.JobSupport.completeStateFinalization(JobSupport.kt:310)
at kotlinx.coroutines.JobSupport.tryFinalizeFinishingState(JobSupport.kt:236)
at kotlinx.coroutines.JobSupport.tryMakeCompletingSlowPath(JobSupport.kt:849)
at kotlinx.coroutines.JobSupport.tryMakeCompleting(JobSupport.kt:811)
at kotlinx.coroutines.JobSupport.makeCompletingOnce$kotlinx_coroutines_core(JobSupport.kt:787)
at kotlinx.coroutines.AbstractCoroutine.resumeWith(AbstractCoroutine.kt:111)
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46)
at kotlinx.coroutines.DispatchedTask.run(Dispatched.kt:334)
at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:594)
at kotlinx.coroutines.scheduling.CoroutineScheduler.access$runSafely(CoroutineScheduler.kt:60)
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:740)
D/OkHttp: <-- HTTP FAILED: java.net.SocketTimeoutException: failed to connect to api.aimybox.com/63.34.12.130 (port 443) from /10.0.8.1 (port 40158) after 10000ms
In our project we updated Firebase; the dependency com.google.firebase:firebase-perf-ktx pulls in com.google.protobuf:protobuf-javalite:3.14.0. This version conflicts with com.google.protobuf:protobuf-lite:3.0.1, which is pulled in by the com.justai.aimybox:yandex-speechkit library.
Duplicate class com.google.protobuf.AbstractMessageLite found in modules jetified-protobuf-javalite-3.14.0 (com.google.protobuf:protobuf-javalite:3.14.0) and jetified-protobuf-lite-3.0.1 (com.google.protobuf:protobuf-lite:3.0.1)
We cannot give up the new Firebase version. To resolve the dependency conflict, com.justai.aimybox:yandex-speechkit needs to support upgrading io.grpc:grpc-protobuf-lite to version 1.35.0.
When that version is plugged in, a crash occurs:
2021-02-20 10:25:05.750 10628-10839/ru.alfabank.mobile.android.feature W/System.err: java.lang.ExceptionInInitializerError
2021-02-20 10:25:05.750 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at yandex.cloud.ai.stt.v2.SttServiceOuterClass$StreamingRecognitionRequest.getDefaultInstance(SttServiceOuterClass.java:467)
2021-02-20 10:25:05.751 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at yandex.cloud.ai.stt.v2.SttServiceGrpc.getStreamingRecognizeMethod(SttServiceGrpc.java:50)
2021-02-20 10:25:05.751 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at yandex.cloud.ai.stt.v2.SttServiceGrpc$SttServiceStub.streamingRecognize(SttServiceGrpc.java:130)
2021-02-20 10:25:05.751 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at com.justai.aimybox.speechkit.yandex.cloud.YandexRecognitionApi.openStream$yandex_speechkit_release(YandexRecognitionApi.kt:53)
2021-02-20 10:25:05.752 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at com.justai.aimybox.speechkit.yandex.cloud.YandexRecognitionApi$openStream$1.invokeSuspend(Unknown Source:12)
2021-02-20 10:25:05.752 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33)
2021-02-20 10:25:05.752 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:56)
2021-02-20 10:25:05.752 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:571)
2021-02-20 10:25:05.753 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:738)
2021-02-20 10:25:05.753 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:678)
2021-02-20 10:25:05.754 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:665)
2021-02-20 10:25:05.755 10628-10839/ru.alfabank.mobile.android.feature W/System.err: Caused by: java.lang.RuntimeException: Unable to get message info for yandex.cloud.ai.stt.v2.SttServiceOuterClass$StreamingRecognitionRequest
2021-02-20 10:25:05.755 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at com.google.protobuf.GeneratedMessageInfoFactory.messageInfoFor(GeneratedMessageInfoFactory.java:62)
2021-02-20 10:25:05.755 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at com.google.protobuf.ManifestSchemaFactory$CompositeMessageInfoFactory.messageInfoFor(ManifestSchemaFactory.java:143)
2021-02-20 10:25:05.755 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at com.google.protobuf.ManifestSchemaFactory.createSchema(ManifestSchemaFactory.java:55)
2021-02-20 10:25:05.756 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at com.google.protobuf.Protobuf.schemaFor(Protobuf.java:90)
2021-02-20 10:25:05.756 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at com.google.protobuf.Protobuf.schemaFor(Protobuf.java:104)
2021-02-20 10:25:05.756 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at com.google.protobuf.GeneratedMessageLite.makeImmutable(GeneratedMessageLite.java:175)
2021-02-20 10:25:05.756 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at yandex.cloud.ai.stt.v2.SttServiceOuterClass$StreamingRecognitionRequest.<clinit>(SttServiceOuterClass.java:463)
2021-02-20 10:25:05.756 10628-10839/ru.alfabank.mobile.android.feature W/System.err: ... 11 more
2021-02-20 10:25:05.757 10628-10839/ru.alfabank.mobile.android.feature W/System.err: Caused by: java.lang.UnsupportedOperationException
2021-02-20 10:25:05.757 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at yandex.cloud.ai.stt.v2.SttServiceOuterClass$StreamingRecognitionRequest.dynamicMethod(SttServiceOuterClass.java:455)
2021-02-20 10:25:05.758 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at com.google.protobuf.GeneratedMessageLite.dynamicMethod(GeneratedMessageLite.java:256)
2021-02-20 10:25:05.758 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at com.google.protobuf.GeneratedMessageLite.buildMessageInfo(GeneratedMessageLite.java:284)
2021-02-20 10:25:05.758 10628-10839/ru.alfabank.mobile.android.feature W/System.err: at com.google.protobuf.GeneratedMessageInfoFactory.messageInfoFor(GeneratedMessageInfoFactory.java:60)
2021-02-20 10:25:05.758 10628-10839/ru.alfabank.mobile.android.feature W/System.err: ... 17 more
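A common app-side workaround for duplicate protobuf classes, while waiting for the library to upgrade its grpc-protobuf-lite dependency, is to exclude the older artifact from the speechkit dependency. A sketch in the Gradle Kotlin DSL follows; the version strings are assumptions to match to your project, and runtime binary compatibility between the remaining javalite runtime and the speechkit's generated code is not guaranteed (the crash above suggests it may not hold).

```kotlin
// build.gradle.kts sketch: keep a single protobuf runtime on the classpath.
dependencies {
    implementation("com.justai.aimybox:yandex-speechkit:0.11.0") {
        // Drop the old protobuf-lite pulled in transitively...
        exclude(group = "com.google.protobuf", module = "protobuf-lite")
    }
    // ...and rely on the javalite runtime that Firebase already brings in.
    implementation("com.google.protobuf:protobuf-javalite:3.14.0")
}
```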
Hi,
I want to run the demo, but on both a device and an emulator (in Android Studio), clicking the mic button doesn't work!
When I click the mic button, it cancels quickly and I can't send any speech.
Thank you for your help!
Below is the trace Android Studio gives me:
I/Aimybox(Aimybox-Components): [main] STANDBY
I/Aimybox(STT): [DefaultDispatcher-worker-4] Begin recognition
I/Aimybox(Aimybox-Components): [main] LISTENING
E/Aimybox(STT): [DefaultDispatcher-worker-3] Failed to get recognition result
com.justai.aimybox.speechkit.google.platform.GooglePlatformSpeechToTextException: Exception [3]: Audio recording error.
at com.justai.aimybox.speechkit.google.platform.GooglePlatformSpeechToText$createRecognitionListener$1.onError(GooglePlatformSpeechToText.kt:77)
at android.speech.SpeechRecognizer$InternalListener$1.handleMessage(SpeechRecognizer.java:450)
at android.os.Handler.dispatchMessage(Handler.java:107)
at android.os.Looper.loop(Looper.java:214)
at android.app.ActivityThread.main(ActivityThread.java:7356)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:492)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:930)
E/Aimybox(Aimybox-Components): [main]
com.justai.aimybox.speechkit.google.platform.GooglePlatformSpeechToTextException: Exception [3]: Audio recording error.
at com.justai.aimybox.speechkit.google.platform.GooglePlatformSpeechToText$createRecognitionListener$1.onError(GooglePlatformSpeechToText.kt:77)
at android.speech.SpeechRecognizer$InternalListener$1.handleMessage(SpeechRecognizer.java:450)
at android.os.Handler.dispatchMessage(Handler.java:107)
at android.os.Looper.loop(Looper.java:214)
at android.app.ActivityThread.main(ActivityThread.java:7356)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:492)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:930)
I/Aimybox(Aimybox-Components): [main] STANDBY
Aimybox startRecognition could be invoked from anywhere in the application: not only by clicking the mic button, but also through the voice trigger event or any other trigger (camera, sensors, etc.). Thus AimyboxFragment should appear once a recognition event is fired.
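For illustration, here is a sketch of starting recognition from a non-UI trigger. `startRecognition()` is a real Aimybox method; the shake-detection wiring and threshold are illustrative assumptions only.

```kotlin
// Sketch: a sensor-based trigger that starts recognition, so the assistant
// fragment must react to the recognition state change, not to a button tap.
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import com.justai.aimybox.Aimybox

class ShakeTrigger(private val aimybox: Aimybox) : SensorEventListener {
    override fun onSensorChanged(event: SensorEvent) {
        if (isShake(event)) {
            aimybox.startRecognition() // same entry point the mic button uses
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit

    private fun isShake(event: SensorEvent): Boolean {
        val x = event.values[0]
        val y = event.values[1]
        val z = event.values[2]
        return x * x + y * y + z * z > SHAKE_THRESHOLD
    }

    companion object {
        private const val SHAKE_THRESHOLD = 300f // illustrative value
    }
}
```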
Make it possible to configure any labels of the Aimybox fragment through strings.xml, for example, as well as styles.
It would be great to eliminate the Kodein dependency from the assistant demo, since any other project that wants to embed Aimybox may already have a different DI solution.
Hi,
I made a chatbot in English and integrated it with Aimybox, and it works.
Now I want to use the same chatbot in French (maybe using a translator API? but how do I do that?)
Thanks for the help!
I am using the core library ('com.justai.aimybox:core:0.11.0') for core functionality in my app, with the Google Cloud speechkit for the TTS and STT APIs.
The problem is that speech recognition often does not work. Sometimes it works and sometimes it doesn't, and even when it starts recognizing speech, it stops after recognizing a few words. Please let me know what the issue is and how to resolve it.
I am posting the code that I am using.
Below is the code for initializing Aimybox:

```kotlin
// [START] Initializing the Aimybox assistant at the application level
// to get a singleton object throughout the app
val credentials = GoogleCloudCredentials.fromAsset(applicationContext, "credentials.json")
GoogleCloudTextToSpeech.Config(Gender.FEMALE) // note: this Config is created but never passed anywhere
val textToSpeech = GoogleCloudTextToSpeech(applicationContext, credentials, Locale.ENGLISH)
val speechToText = GoogleCloudSpeechToText(credentials, Locale.ENGLISH)
// val dialogApi = AimyboxDialogApi("your Aimybox project API key", unitId)
val unitId = UUID.randomUUID().toString()
val dialogApi = DummyDialogApi()
aimybox = Aimybox(Config.create(speechToText, textToSpeech, dialogApi))
```
And here is the code that starts speech recognition:

```kotlin
aimybox.startRecognition()
```
Please provide the solution quickly.
Move components to public dependencies instead of a project dependency, to get rid of the additional module.
I'm currently working on modifying this demo app to make it a Tasker plugin. I can now send the messages the app hears to Tasker, but I don't see any way to get a message back into the app and update the message in the UI fragment.
I'm not very familiar with Kotlin, so any help is appreciated!
Some modules (like Snowboy VT and Google Cloud) require access to external storage. We need to find a way to check/request the mic and storage permissions before the voice trigger starts.
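A sketch of such a pre-start check using the standard Android runtime-permission APIs; the request code and the follow-up handling in onRequestPermissionsResult are left to the caller.

```kotlin
// Sketch: verify mic + storage permissions before starting the voice trigger.
import android.Manifest
import android.app.Activity
import android.content.pm.PackageManager
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

private const val PERMISSIONS_REQUEST_CODE = 42 // arbitrary

fun ensureVoiceTriggerPermissions(activity: Activity): Boolean {
    val required = arrayOf(
        Manifest.permission.RECORD_AUDIO,
        Manifest.permission.WRITE_EXTERNAL_STORAGE
    )
    val missing = required.filter {
        ContextCompat.checkSelfPermission(activity, it) != PackageManager.PERMISSION_GRANTED
    }
    if (missing.isEmpty()) return true
    ActivityCompat.requestPermissions(activity, missing.toTypedArray(), PERMISSIONS_REQUEST_CODE)
    return false // start the voice trigger from onRequestPermissionsResult instead
}
```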
We need a way to add custom widgets that can be rendered by the Aimybox fragment. It would be great to have a single place where the user can register their map of Reply -> AssistantWidget (or something like that) during Aimybox initialisation.
Is it possible to move MainViewModel to components? It looks like there is nothing special in this class.
It would be great to have an android-things module that is ready to be built for the Raspberry Pi 3 B.
Otherwise an exception is thrown:
android.view.InflateException: Binary XML file line #2: Binary XML file line #2: You must supply a layout_width attribute.
I followed the instructions for building the demo app to a phone (Samsung Galaxy S10e) and was unable to get it to run. The following is the error message:
The application could not be installed: INSTALL_PARSE_FAILED_MANIFEST_MALFORMED
List of apks:
[0] '/Users/daniyalmohammed/AndroidStudioProjects/aimybox-android-assistant/app/build/intermediates/apk/debug/app-debug.apk'
Installation failed due to: 'Failed to commit install session 472356883 with command package install-commit 472356883. Error: INSTALL_PARSE_FAILED_MANIFEST_MALFORMED: Failed parse during installPackageLI: /data/app/vmdl472356883.tmp/base.apk (at Binary XML file line #63): leakcanary.internal.activity.LeakLauncherActivity: Targeting S+ (version 31 and above) requires that an explicit value for android:exported be defined when intent filters are present'
Retry
Failed to launch an application on all devices
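The manifest error above comes from the bundled LeakCanary debug dependency: its LeakLauncherActivity has intent filters but no explicit android:exported value, which targeting Android 12 (API 31) rejects at install time. One likely fix, sketched here as an assumption about the demo's build files, is bumping LeakCanary to a release whose manifest declares android:exported (2.7 and later do):

```kotlin
// build.gradle.kts sketch -- the exact version to use is an assumption;
// any LeakCanary 2.7+ release declares android:exported in its manifest.
dependencies {
    debugImplementation("com.squareup.leakcanary:leakcanary-android:2.10")
}
```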