
dialogflow-android-client's Issues

Dependencies out of Date

compile 'com.android.support:appcompat-v7:23.2.1'
compile 'com.google.code.gson:gson:2.3.1'
compile 'commons-io:commons-io:2.4'

These are all out of date. Here are the most recent ones.
compile 'com.android.support:appcompat-v7:25.3.0'
compile 'com.google.code.gson:gson:2.8.0'
compile 'commons-io:commons-io:2.5'

How to create a new agent?

I want to create a new agent from the client. What should I do? Can you write an example for me? Thank you very much.

Language default when not in "SupportedLanguages"

Hi,

I'm using the Android SDK, building a simple test app to decide whether I'm going to use your service or another :-)

When I try this:
final AIConfiguration.SupportedLanguages lang = AIConfiguration.SupportedLanguages.fromLanguageTag("nl");

The "lang" is always English. How come it doesn't support Dutch when your docs say it does (https://docs.api.ai/docs/languages)? Did I miss something?

Thank you !
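The fallback the reporter is seeing can be reproduced in plain Java. The sketch below is illustrative, not the SDK's actual SupportedLanguages enum: if a tag is missing from the enum that ships with a given SDK version, a fromLanguageTag-style lookup silently returns the English default instead of failing, so comparing the result against the requested tag is a quick way to detect an unsupported language.

```java
import java.util.Locale;

// Illustrative stand-in for the SDK's SupportedLanguages enum (assumption:
// older SDK builds simply did not include Dutch, so "nl" fell through to the
// English default).
enum Lang {
    ENGLISH("en"), GERMAN("de"), DUTCH("nl");

    final String tag;
    Lang(String tag) { this.tag = tag; }

    // Mirrors the fromLanguageTag pattern: unknown tags fall back silently.
    static Lang fromLanguageTag(String tag) {
        for (Lang l : values()) {
            if (l.tag.equals(tag.toLowerCase(Locale.ROOT))) {
                return l;
            }
        }
        return ENGLISH; // silent default: the behaviour reported above
    }
}
```

If fromLanguageTag returns English for a non-English tag, that tag is not in your SDK version's enum, and updating the SDK is the likely fix.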

App Crashes after listening command ( FATAL EXCEPTION: AsyncTask )

After implementing the Android SDK, the app crashes after it listens to my request.

Here is the monitor log, please help.
FATAL EXCEPTION: AsyncTask #3
Process: com.aiapp.user.homeai, PID: 15096
java.lang.RuntimeException: An error occurred while executing doInBackground()
at android.os.AsyncTask$3.done(AsyncTask.java:309)
at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:354)
at java.util.concurrent.FutureTask.setException(FutureTask.java:223)
at java.util.concurrent.FutureTask.run(FutureTask.java:242)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)
Caused by: java.lang.NumberFormatException: Invalid int: "simple_response"
at java.lang.Integer.invalidInt(Integer.java:138)
at java.lang.Integer.parse(Integer.java:410)
at java.lang.Integer.parseInt(Integer.java:367)
at java.lang.Integer.parseInt(Integer.java:334)
at com.google.gson.JsonPrimitive.getAsInt(JsonPrimitive.java:260)
at ai.api.GsonFactory$ResponseItemAdapter.deserialize(GsonFactory.java:78)
at ai.api.GsonFactory$ResponseItemAdapter.deserialize(GsonFactory.java:71)
at com.google.gson.internal.bind.TreeTypeAdapter.read(TreeTypeAdapter.java:69)
at com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper.read(TypeAdapterRuntimeTypeWrapper.java:41)
at com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter.read(CollectionTypeAdapterFactory.java:82)
at com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter.read(CollectionTypeAdapterFactory.java:61)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:129)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:220)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:129)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:220)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:129)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:220)
at com.google.gson.Gson.fromJson(Gson.java:887)
at com.google.gson.Gson.fromJson(Gson.java:852)
at com.google.gson.Gson.fromJson(Gson.java:801)
at com.google.gson.Gson.fromJson(Gson.java:773)
at ai.api.AIDataService.request(AIDataService.java:193)
at ai.api.AIDataService.request(AIDataService.java:148)
at ai.api.services.GoogleRecognitionServiceImpl$2.doInBackground(GoogleRecognitionServiceImpl.java:166)
at ai.api.services.GoogleRecognitionServiceImpl$2.doInBackground(GoogleRecognitionServiceImpl.java:158)
at android.os.AsyncTask$2.call(AsyncTask.java:295)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)

Caused by: java.lang.NumberFormatException: For input string: "simple_response"
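Reading the trace, the crash comes from GsonFactory$ResponseItemAdapter calling getAsInt() on the message "type" field, while the server is sending a string type such as "simple_response". The sketch below is a hedged illustration of a tolerant parse, with a hypothetical name-to-code mapping; the practical fix is updating the ai.api SDK, whose newer GsonFactory handles the string types.

```java
// Illustrative only: accept both the legacy numeric "type" values and the
// newer string names, instead of crashing on Integer.parseInt.
class MessageType {
    static int parse(String rawType) {
        try {
            return Integer.parseInt(rawType); // legacy numeric types: "0", "1", ...
        } catch (NumberFormatException e) {
            // Hypothetical mapping for the sketch; not the SDK's real table.
            return "simple_response".equals(rawType) ? 0 : -1;
        }
    }
}
```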

When I send a message to the API.AI server, the client crashes with the error below:

java.lang.RuntimeException: An error occurred while executing doInBackground()
at android.os.AsyncTask$3.done(AsyncTask.java:325)
at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:354)
at java.util.concurrent.FutureTask.setException(FutureTask.java:223)
at java.util.concurrent.FutureTask.run(FutureTask.java:242)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:243)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
at java.lang.Thread.run(Thread.java:761)
Caused by: java.lang.NumberFormatException: For input string: "simple_response"
at java.lang.Integer.parseInt(Integer.java:521)
at java.lang.Integer.parseInt(Integer.java:556)
at com.google.gson.JsonPrimitive.getAsInt(JsonPrimitive.java:260)
at ai.api.GsonFactory$ResponseItemAdapter.deserialize(GsonFactory.java:78)
at ai.api.GsonFactory$ResponseItemAdapter.deserialize(GsonFactory.java:71)
at com.google.gson.internal.bind.TreeTypeAdapter.read(TreeTypeAdapter.java:69)
at com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper.read(TypeAdapterRuntimeTypeWrapper.java:41)
at com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter.read(CollectionTypeAdapterFactory.java:82)
at com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter.read(CollectionTypeAdapterFactory.java:61)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:129)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:220)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:129)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:220)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:129)
at com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:220)
at com.google.gson.Gson.fromJson(Gson.java:887)
at com.google.gson.Gson.fromJson(Gson.java:852)
at com.google.gson.Gson.fromJson(Gson.java:801)
at com.google.gson.Gson.fromJson(Gson.java:773)
at ai.api.AIDataService.request(AIDataService.java:193)
at ai.api.AIDataService.request(AIDataService.java:148)
at ai.api.AIDataService.request(AIDataService.java:124)
at ninhv.vl.vlchat.views.chat.ChatActivity$1.doInBackground(ChatActivity.java:90)
at ninhv.vl.vlchat.views.chat.ChatActivity$1.doInBackground(ChatActivity.java:85)
at android.os.AsyncTask$2.call(AsyncTask.java:305)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:243) 
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133) 
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607) 
at java.lang.Thread.run(Thread.java:761) 

And here is the code in my doInBackground method:

@Override
protected AIResponse doInBackground(AIRequest... requests) {
    AIRequest request = requests[0];
    try {
        Log.e("REQUEST", request.toString());
        return aiDataService.request(request);
    } catch (AIServiceException e) {
        Log.e("REQUEST", "request failed", e); // log instead of silently swallowing the exception
    }
    return null;
}

Get only the exact response

Hello!

I may have overlooked this, but how can I just get things like "weather.search" or the content of "speech"?

Many thanks
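In the ai.api client these values are exposed as getters on the parsed result: the action name via getResult().getAction() and the speech text via getResult().getFulfillment().getSpeech(). The classes below are minimal stand-ins that only mirror that shape so the access pattern is clear; they are not the SDK's actual ai.api.model classes.

```java
// Stand-ins mirroring the shape of the SDK's response model.
class Fulfillment {
    private final String speech;
    Fulfillment(String speech) { this.speech = speech; }
    String getSpeech() { return speech; }
}

class Result {
    private final String action;
    private final Fulfillment fulfillment;
    Result(String action, Fulfillment fulfillment) {
        this.action = action;
        this.fulfillment = fulfillment;
    }
    String getAction() { return action; }           // e.g. "weather.search"
    Fulfillment getFulfillment() { return fulfillment; }
}
```

With the real SDK the equivalent calls are aiResponse.getResult().getAction() and aiResponse.getResult().getFulfillment().getSpeech().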

hello

What is the minimum supported version?

ai.api.AIServiceException: Can't connect to the api.ai service.

Hi, I'm facing an issue:
When I use a speech recognition API such as Bing Speech, Nuance, or IBM, everything works fine.
But when I use the Google Cloud Speech API (demo: https://github.com/GoogleCloudPlatform/android-docs-samples/tree/master/speech/Speech), which uses gRPC, the issue in the title occurs.
The log is below:

ai.api.AIServiceException: Can't connect to the api.ai service.
at ai.api.AIDataService.doTextRequest(AIDataService.java:389)
at ai.api.AIDataService.request(AIDataService.java:147)
at ai.api.AIDataService.request(AIDataService.java:117)
at com.amyrobotics.amya_one.main.HardWakeMutualVoiceService$6.doInBackground(HardWakeMutualVoiceService.java:511)
at com.amyrobotics.amya_one.main.HardWakeMutualVoiceService$6.doInBackground(HardWakeMutualVoiceService.java:506)
at android.os.AsyncTask$2.call(AsyncTask.java:292)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
at java.lang.Thread.run(Thread.java:818)
Caused by: javax.net.ssl.SSLHandshakeException: Handshake failed
at com.android.org.conscrypt.OpenSSLSocketImpl.startHandshake(OpenSSLSocketImpl.java:390)
at com.android.okhttp.Connection.upgradeToTls(Connection.java:201)
at com.android.okhttp.Connection.connect(Connection.java:155)
at com.android.okhttp.internal.http.HttpEngine.connect(HttpEngine.java:276)
at com.android.okhttp.internal.http.HttpEngine.sendRequest(HttpEngine.java:211)
at com.android.okhttp.internal.http.HttpURLConnectionImpl.execute(HttpURLConnectionImpl.java:382)
at com.android.okhttp.internal.http.HttpURLConnectionImpl.connect(HttpURLConnectionImpl.java:106)
at com.android.okhttp.internal.http.DelegatingHttpsURLConnection.connect(DelegatingHttpsURLConnection.java:89)
at com.android.okhttp.internal.http.HttpsURLConnectionImpl.connect(HttpsURLConnectionImpl.java:25)
at ai.api.AIDataService.doTextRequest(AIDataService.java:368)
... 10 more
Caused by: javax.net.ssl.SSLProtocolException: SSL handshake aborted: ssl=0xaf458600: Failure in SSL library, usually a protocol error
error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure (external/openssl/ssl/s23_clnt.c:770 0xac139001:0x00000000)
at com.android.org.conscrypt.NativeCrypto.SSL_do_handshake(Native Method)
at com.android.org.conscrypt.OpenSSLSocketImpl.startHandshake(OpenSSLSocketImpl.java:318)
... 19 more
I/System.out: error message is Can't connect to the api.ai service.

The error occurs in the code below:

final AIRequest aiRequest = new AIRequest(text);
new AsyncTask<AIRequest, Void, AIResponse>() {
    @Override
    protected AIResponse doInBackground(AIRequest... requests) {

I suspect the error is related to port 443, which the Google Cloud Speech setup uses for "setAccessToken".
But when I changed the Google Speech port to 442, it didn't work, and Google Speech itself errored.
So I don't know how to solve this; please help.
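A note on the trace, offered as an assumption rather than a confirmed diagnosis: an "sslv3 alert handshake failure" usually means the client offered a protocol set the server rejects (old Android versions default to TLSv1.0 or SSLv3), not that the port is wrong. Listing which protocols the runtime actually enables is a quick sanity check before experimenting with ports:

```java
import javax.net.ssl.SSLContext;

// Diagnostic sketch: confirm whether TLSv1.2 is among the protocols the
// runtime offers by default.
class TlsCheck {
    static String[] enabledProtocols() throws Exception {
        return SSLContext.getDefault().getDefaultSSLParameters().getProtocols();
    }
}
```

On older Android devices, updating the platform's security provider via Google Play services (ProviderInstaller.installIfNeeded) is a common way to enable TLSv1.2 when it is missing from the defaults.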

NetworkOnMainThreadException

Hey!

When trying to do what the tutorial says in my own application, I get a NetworkOnMainThreadException. How can I fix it? I already tried working with a Thread, but that didn't work.
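NetworkOnMainThreadException means the blocking aiDataService.request(...) call ran on the UI thread. The fix is to run the request on a background thread and deliver the result back; the plain-Java sketch below uses an executor with a stand-in task for the network call (on Android, AsyncTask's doInBackground or a Handler posting back to the main looper plays the same role).

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Minimal sketch: submit the blocking work to a background executor instead of
// running it on the calling (UI) thread.
class BackgroundRequest {
    static String run() throws Exception {
        ExecutorService io = Executors.newSingleThreadExecutor();
        try {
            // Stand-in for aiDataService.request(request).
            Future<String> response = io.submit(() -> "response");
            // On Android, deliver this via onPostExecute rather than blocking.
            return response.get();
        } finally {
            io.shutdown();
        }
    }
}
```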

Bluetooth

Recognition doesn't work with a Bluetooth headset.

Crash doing voice recognition

11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: FATAL EXCEPTION: AsyncTask #3
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: Process: com.getscarlett.android, PID: 8094
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: java.lang.RuntimeException: An error occured while executing doInBackground()
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at android.os.AsyncTask$3.done(AsyncTask.java:304)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:355)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.FutureTask.setException(FutureTask.java:222)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.FutureTask.run(FutureTask.java:242)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.lang.Thread.run(Thread.java:818)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: Caused by: java.lang.NullPointerException: lock == null
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.io.Reader.<init>(Reader.java:64)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.io.InputStreamReader.<init>(InputStreamReader.java:120)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at org.apache.commons.io.IOUtils.copy(IOUtils.java:1906)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at org.apache.commons.io.IOUtils.toString(IOUtils.java:778)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at ai.api.AIDataService.doTextRequest(AIDataService.java:319)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at ai.api.AIDataService.doTextRequest(AIDataService.java:282)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at ai.api.AIDataService.request(AIDataService.java:107)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at ai.api.services.GoogleRecognitionServiceImpl$1.doInBackground(GoogleRecognitionServiceImpl.java:146)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at ai.api.services.GoogleRecognitionServiceImpl$1.doInBackground(GoogleRecognitionServiceImpl.java:138)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at android.os.AsyncTask$2.call(AsyncTask.java:292)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.FutureTask.run(FutureTask.java:237)
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231) 
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112) 
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587) 
11-24 15:17:47.267 8094-8877/com.getscarlett.android E/AndroidRuntime: at java.lang.Thread.run(Thread.java:818) 
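A hedged reading of this trace: "NullPointerException: lock == null" thrown from java.io.Reader's constructor means a null InputStream reached InputStreamReader, most plausibly because HttpURLConnection.getErrorStream() returned null while AIDataService tried to read an error body. A null guard before reading avoids the crash; the sketch below shows the pattern (the real fix would belong inside the SDK's doTextRequest).

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

// Guard against a null stream before handing it to InputStreamReader.
class SafeRead {
    static String read(InputStream in) throws IOException {
        if (in == null) return ""; // no body available; avoid the NPE
        StringBuilder sb = new StringBuilder();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = r.readLine()) != null) sb.append(line);
        }
        return sb.toString();
    }
}
```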

RecognitionEngine Speaktoit is not working

The application works fine when the RecognitionEngine is set to System (AIConfiguration.RecognitionEngine.System).
When I change the RecognitionEngine from "System" to "Speaktoit" (AIConfiguration.RecognitionEngine.Speaktoit), the application stops working.
The following exception occurs:

07-06 18:01:39.339 21358-22502/ai.api.sample W/System.err: java.io.IOException: Stream closed
07-06 18:01:39.339 21358-22502/ai.api.sample W/System.err: at java.io.BufferedInputStream.getInIfOpen(BufferedInputStream.java:151)
07-06 18:01:39.339 21358-22502/ai.api.sample W/System.err: at java.io.BufferedInputStream.read1(BufferedInputStream.java:273)
07-06 18:01:39.339 21358-22502/ai.api.sample W/System.err: at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
07-06 18:01:39.339 21358-22502/ai.api.sample W/System.err: at sun.nio.cs.StreamDecoder.readBytes(StreamDecoder.java:287)
07-06 18:01:39.339 21358-22502/ai.api.sample W/System.err: at sun.nio.cs.StreamDecoder.implRead(StreamDecoder.java:350)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at sun.nio.cs.StreamDecoder.read(StreamDecoder.java:179)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at java.io.InputStreamReader.read(InputStreamReader.java:184)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at java.io.Reader.read(Reader.java:140)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.util.IOUtils.copy(IOUtils.java:140)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.util.IOUtils.copy(IOUtils.java:131)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.util.IOUtils.readAll(IOUtils.java:126)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.util.IOUtils.readAll(IOUtils.java:102)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.util.IOUtils.readAll(IOUtils.java:114)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.http.HttpClient.getErrorString(HttpClient.java:158)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.AIDataService.doSoundRequest(AIDataService.java:799)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.AIDataService.doSoundRequest(AIDataService.java:743)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.AIDataService.voiceRequest(AIDataService.java:288)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.AIDataService.voiceRequest(AIDataService.java:253)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.services.SpeaktoitRecognitionServiceImpl$RecognizeTask.doInBackground(SpeaktoitRecognitionServiceImpl.java:380)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at ai.api.services.SpeaktoitRecognitionServiceImpl$RecognizeTask.doInBackground(SpeaktoitRecognitionServiceImpl.java:362)
07-06 18:01:39.340 21358-22502/ai.api.sample W/System.err: at android.os.AsyncTask$2.call(AsyncTask.java:305)
07-06 18:01:39.341 21358-22502/ai.api.sample W/System.err: at java.util.concurrent.FutureTask.run(FutureTask.java:237)
07-06 18:01:39.341 21358-22502/ai.api.sample W/System.err: at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:243)
07-06 18:01:39.341 21358-22502/ai.api.sample W/System.err: at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
07-06 18:01:39.341 21358-22502/ai.api.sample W/System.err: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
07-06 18:01:39.341 21358-22502/ai.api.sample W/System.err: at java.lang.Thread.run(Thread.java:761)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: 2017-07-06 18:01:39,341 ERROR An exception occurred processing Appender Console java.lang.NullPointerException: Attempt to get length of null array
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.util.ReflectionUtil.getCurrentStackTrace(ReflectionUtil.java:274)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.impl.ThrowableProxy.<init>(ThrowableProxy.java:116)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.impl.Log4jLogEvent.getThrownProxy(Log4jLogEvent.java:323)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.pattern.ExtendedThrowablePatternConverter.format(ExtendedThrowablePatternConverter.java:64)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.pattern.PatternFormatter.format(PatternFormatter.java:36)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.layout.PatternLayout.toSerializable(PatternLayout.java:196)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.layout.PatternLayout.toSerializable(PatternLayout.java:55)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.layout.AbstractStringLayout.toByteArray(AbstractStringLayout.java:71)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.append(AbstractOutputStreamAppender.java:108)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:99)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:430)
07-06 18:01:39.346 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:409)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:367)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.core.Logger.logMessage(Logger.java:112)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.spi.AbstractLogger.logMessage(AbstractLogger.java:727)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.spi.AbstractLogger.logIfEnabled(AbstractLogger.java:716)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at org.apache.logging.log4j.spi.AbstractLogger.error(AbstractLogger.java:354)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at ai.api.AIDataService.doSoundRequest(AIDataService.java:813)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at ai.api.AIDataService.doSoundRequest(AIDataService.java:743)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at ai.api.AIDataService.voiceRequest(AIDataService.java:288)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at ai.api.AIDataService.voiceRequest(AIDataService.java:253)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at ai.api.services.SpeaktoitRecognitionServiceImpl$RecognizeTask.doInBackground(SpeaktoitRecognitionServiceImpl.java:380)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at ai.api.services.SpeaktoitRecognitionServiceImpl$RecognizeTask.doInBackground(SpeaktoitRecognitionServiceImpl.java:362)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at android.os.AsyncTask$2.call(AsyncTask.java:305)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at java.util.concurrent.FutureTask.run(FutureTask.java:237)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:243)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1133)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:607)
07-06 18:01:39.347 21358-22502/ai.api.sample I/System.out: at java.lang.Thread.run(Thread.java:761)
07-06 18:01:39.349 21358-22502/ai.api.sample E/TEST: doInBackground e*****ai.api.AIServiceException: Can't make request to the API.AI service. Please, check connection settings and API.AI keys.
07-06 18:01:39.391 21358-21358/ai.api.sample D/ai.api.sample.AIButtonSampleActivity: onCancelled
07-06 18:01:39.391 21358-21358/ai.api.sample D/ai.api.sample.AIButtonSampleActivity: onError*

Kindly help me to solve this issue.

I am setting the session id in the AIRequest object, but the response contains a different session id

AIRequest request = new AIRequest();
request.setQuery(message);
request.setSessionId(MyData.getInstance(MyApplication.getInstance()).getGatewayId() + "_" +
        MyData.getInstance(MyApplication.getInstance()).getUsername());
return aiService.textRequest(request);

When I get the AIResponse object, it gives me a different session id.
How can I set my custom session id for a particular user?

Trim empty parameters

Parameters in the response JSON can look like:

"parameters": {
    "date": "2015-03-19",
    "date-time": "",
    "time": "",
    "text": "feed it",
    "priority": "",
    "remind": "remind"
    },

So it is hard to check for existing parameters with

result.getParameters().containsKey("key")

Parameters with empty values should be trimmed.
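Until the service trims them, a client-side workaround is to filter the empty values before checking containsKey(). The sketch below uses a String map to stay self-contained; the SDK's getParameters() actually returns JsonElement values, so the same filter there would test for empty JSON strings instead.

```java
import java.util.HashMap;
import java.util.Map;

// Drop parameters whose value is null or the empty string, so containsKey()
// only reports parameters that actually carry a value.
class Params {
    static Map<String, String> trimEmpty(Map<String, String> params) {
        Map<String, String> out = new HashMap<>();
        for (Map.Entry<String, String> e : params.entrySet()) {
            if (e.getValue() != null && !e.getValue().isEmpty()) {
                out.put(e.getKey(), e.getValue());
            }
        }
        return out;
    }
}
```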

Hard to detect final speech recognition result when using GoogleRecognitionServiceImpl

This is an enhancement proposal, I hope I'm not missing anything.

When you use the GoogleRecognitionServiceImpl as the AIService, there's no easy way to notify the user of their final speech output when it's submitted to API.AI.

You can use the PartialResultsListener with GoogleRecognitionServiceImpl, but you get something like 2 callbacks to PartialResultsListener.onPartialResults(List<String>) after receiving the AIListener.onListeningFinished() callback.

Here's the logcat output from my phone to illustrate the issue; refer to the last 3-4 lines:
[logcat screenshot: PartialResultsListener output]

The really hacky, nasty and potentially fragile way to detect the final input is to just wait for 2 callbacks after AIListener.onListeningFinished().

I'd like to modify the GoogleRecognitionServiceImpl.InternalRecognitionListener.onResults(Bundle) handling by adding something like a QuerySubmitListener (which prevents breaking the PartialResultsListener contract). The QuerySubmitListener would notify developers of the query or queries and the confidences (right about here: https://github.com/api-ai/apiai-android-client/blob/master/ailib/src/main/java/ai/api/services/GoogleRecognitionServiceImpl.java#L392).

Happy to create a PR if this sounds OK.

Insufficient Permissions

With only the standard two permissions, INTERNET and RECORD_AUDIO, it throws an Insufficient Permissions error. You must also add the ACCESS_NETWORK_STATE permission for it to work.
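For reference, the manifest would then declare all three (these are the standard Android permission names):

```xml
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
```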

App crashes on startup with NoClassDefFound error

Hi everyone!
I'm currently trying to implement API.AI by following the provided tutorial. When I try to get the app to run on the emulator, it crashes on startup with the following error.
[screenshot of the crash log]

I noticed the error points to line 35 of my code, but I'm simply following the tutorial and not sure what the issue is (the token was erased from the code below for confidentiality).
[screenshot of the code]

I also tried to use the provided sample app, but the same error occurred.
I would really appreciate it if someone could provide me with some insights!
Thank you so much in advance!

Only English speech recognition

I tried to follow the presented tutorial, so I currently have a very simple AI application. I changed the supported language to German, but the app still only recognizes English words. This is my configuration:

final AIConfiguration config = new AIConfiguration("---", AIConfiguration.SupportedLanguages.German, AIConfiguration.RecognitionEngine.System);

My dependencies:

compile 'com.android.support:appcompat-v7:25.0.1'
testCompile 'junit:junit:4.12'
compile 'ai.api:sdk:2.0.0@aar'
compile 'ai.api:libai:1.2.1'
compile 'com.android.support:appcompat-v7:25.0.1'
compile 'com.google.code.gson:gson:2.8.0'
compile 'commons-io:commons-io:20030203.000550'

Sample App cannot run

Hello. I'm having trouble with the Android Studio documentation.

**Now return to the MainActivity.java file. Add three import statements to access our widgets:

import android.view.View;
import android.widget.Button;
import android.widget.TextView;

Create two private members in MainActivity for the widgets:

private Button listenButton;
private TextView resultTextView;
At the end of the OnCreate method, add these lines to initialize the widgets:

listenButton = (Button) findViewById(R.id.listenButton);
resultTextView = (TextView) findViewById(R.id.resultTextView);**


I'm sorry, but can you give the example of where these go? My app closes immediately.

Getting Following exception: Caused by: java.lang.AbstractMethodError: abstract method "java.util.TimeZone ai.api.AIServiceContext.getTimeZone()"

03-01 12:28:58.999 6355-9154/? E/AndroidRuntime: FATAL EXCEPTION: AsyncTask #2
Process: com.hexa.helloaction, PID: 6355
java.lang.RuntimeException: An error occurred while executing doInBackground()
at android.os.AsyncTask$3.done(AsyncTask.java:309)
at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:354)
at java.util.concurrent.FutureTask.setException(FutureTask.java:223)
at java.util.concurrent.FutureTask.run(FutureTask.java:242)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588)
at java.lang.Thread.run(Thread.java:818)
Caused by: java.lang.AbstractMethodError: abstract method "java.util.TimeZone ai.api.AIServiceContext.getTimeZone()"
at ai.api.AIDataService.getTimeZone(AIDataService.java:962)
at ai.api.AIDataService.request(AIDataService.java:172)
at ai.api.AIDataService.request(AIDataService.java:148)
at ai.api.services.GoogleRecognitionServiceImpl$2.doInBackground(GoogleRecognitionServiceImpl.java:166)
at ai.api.services.GoogleRecognitionServiceImpl$2.doInBackground(GoogleRecognitionServiceImpl.java:158)
at android.os.AsyncTask$2.call(AsyncTask.java:295)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:234) 
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1113) 
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:588) 
at java.lang.Thread.run(Thread.java:818) 

Error: java.lang.NoClassDefFoundError: org.apache.commons.io.Charsets

I did add the commons-io:commons-io:20030203.000550 dependency and I'm still getting these errors...
Here's the error log:

java.lang.RuntimeException: An error occured while executing doInBackground()
at android.os.AsyncTask$3.done(AsyncTask.java:300)
at java.util.concurrent.FutureTask.finishCompletion(FutureTask.java:355)
at java.util.concurrent.FutureTask.setException(FutureTask.java:222)
at java.util.concurrent.FutureTask.run(FutureTask.java:242)
at android.os.AsyncTask$SerialExecutor$1.run(AsyncTask.java:231)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1112)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:587)
at java.lang.Thread.run(Thread.java:841)
Caused by: java.lang.NoClassDefFoundError: org.apache.commons.io.Charsets
at ai.api.AIDataService.doTextRequest(AIDataService.java:342)
at ai.api.AIDataService.request(AIDataService.java:125)
at ai.api.services.GoogleRecognitionServiceImpl$1.doInBackground(GoogleRecognitionServiceImpl.java:147)
at ai.api.services.GoogleRecognitionServiceImpl$1.doInBackground(GoogleRecognitionServiceImpl.java:139)
at android.os.AsyncTask$2.call(AsyncTask.java:288)
at java.util.concurrent.FutureTask.run(FutureTask.java:237)

Application crashing after receiving speech input

I followed the instructions and tried to run it on my Nexus 6P. After receiving the speech input, the application crashes without showing any error message. Please help.

Here is my MainActivity code.

package poc.apiai.com.firstaiapiapp;

import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;

import ai.api.AIConfiguration;
import ai.api.AIListener;
import ai.api.AIService;
import ai.api.model.AIError;
import ai.api.model.AIResponse;
import ai.api.model.Result;
import com.google.gson.JsonElement;
import java.util.Map;

import android.view.View;
import android.widget.Button;
import android.widget.TextView;

public class MainActivity extends AppCompatActivity implements AIListener {

    private Button listenButton;
    private TextView resultTextView;
    private AIService aiService;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        listenButton = (Button) findViewById(R.id.listenButton);
        resultTextView = (TextView) findViewById(R.id.resultTextView);

        final AIConfiguration config = new AIConfiguration("abc",
                AIConfiguration.SupportedLanguages.English,
                AIConfiguration.RecognitionEngine.System);

        aiService = AIService.getService(this, config);
        aiService.setListener(this);
    }

    public void listenButtonOnClick(final View view) {
        aiService.startListening();
    }

    @Override
    public void onResult(AIResponse response) {
        Result result = response.getResult();

        // Get parameters
        String parameterString = "";
        if (result.getParameters() != null && !result.getParameters().isEmpty()) {
            for (final Map.Entry<String, JsonElement> entry : result.getParameters().entrySet()) {
                parameterString += "(" + entry.getKey() + ", " + entry.getValue() + ") ";
            }
        }

        // Show results in TextView.
        resultTextView.setText("Query:" + result.getResolvedQuery() +
                "\nAction: " + result.getAction() +
                "\nParameters: " + parameterString);
    }

    @Override
    public void onError(AIError error) {
        resultTextView.setText(error.toString());
    }

    @Override
    public void onAudioLevel(float level) {
    }

    @Override
    public void onListeningStarted() {
    }

    @Override
    public void onListeningCanceled() {
    }

    @Override
    public void onListeningFinished() {
    }
}

Looking forward to your reply.

Thanks

ProGuard

The readme lacks guidance on ProGuard configuration.
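As a hedged starting point (the rule targets below are guesses from the SDK's package layout, not from any official guidance), rules along these lines are typically what such a library needs; verify against a release build of your own app:

```
# Keep the SDK's Gson-mapped model classes; Gson populates their fields
# reflectively, so obfuscating or stripping them breaks (de)serialization.
-keep class ai.api.model.** { *; }

# The SDK references log4j classes that in turn touch JMX types absent on
# Android; silencing the warnings is safe if the app otherwise runs fine.
-dontwarn org.apache.logging.log4j.**
-dontwarn java.lang.management.**
```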

Google speech recognition language not set up correctly for English

The supported languages are defined in

ailib/src/main/java/ai/api/AIConfiguration.java

    /**
     * Currently supported languages
     */
    public enum SupportedLanguages {
        English("en"),

    ...

    public static SupportedLanguages fromLanguageTag(final String languageTag) {
        switch (languageTag) {
            case "en":
                return English;

The language string does not correctly set up the Google recognizer, taking into account country accent, i.e. en-US vs en-GB.

The Android specification for EXTRA_LANGUAGE
http://developer.android.com/reference/android/speech/RecognizerIntent.html#EXTRA_LANGUAGE
shows that it should be an optional IETF language tag (as defined by BCP 47), for example "en-US".

I suggest adding additional enums for at least "en-US" and "en-GB".

I've tested replacing "en" with "en_GB" in the following code,
i.e. final String language = "en_GB";

ailib/src/main/java/ai/api/services/GoogleRecognitionServiceImpl.java

    private Intent createRecognitionIntent() {
        final Intent sttIntent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        sttIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);

        final String language = config.getLanguage().replace('-', '_');

        sttIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE, language);
        sttIntent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_PREFERENCE, language);

and I get much better speech recognition for my UK accent with this change.
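The two tag forms in play can be sketched with the stdlib Locale class; toRecognizerLanguage mirrors the replace('-', '_') line in the snippet above, and toBcp47 is a hypothetical helper going the other way, back to the BCP 47 form ("en-GB") that EXTRA_LANGUAGE documents:

```java
import java.util.Locale;

// Sketch of the two tag forms. toRecognizerLanguage mirrors the
// replace('-', '_') conversion in the SDK snippet above; toBcp47 normalizes
// an underscore tag back to canonical BCP 47 via java.util.Locale.
public class LanguageTagDemo {

    // Underscore form, as passed along to the recognizer intent above.
    static String toRecognizerLanguage(String tag) {
        return tag.replace('-', '_');
    }

    // Canonical hyphenated BCP 47 form.
    static String toBcp47(String tag) {
        return Locale.forLanguageTag(tag.replace('_', '-')).toLanguageTag();
    }

    public static void main(String[] args) {
        System.out.println(toRecognizerLanguage("en-GB")); // en_GB
        System.out.println(toBcp47("en_GB"));              // en-GB
    }
}
```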

How to get all the text responses?

In the Android SDK I'm getting the response like this:

final Result result = response.getResult();
final String speech = result.getFulfillment().getSpeech();
Log.i(TAG, "Speech: " + speech);

This gives only one response. How can I get all the text responses?
Please help me out.
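getSpeech() returns only the primary line; the full set lives on the fulfillment's messages list. The SDK model types (in ai.api.model) are mimicked below with plain stand-in classes so the iteration pattern is runnable here; verify the real accessor names against your SDK version before relying on them:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Stand-ins mimicking the shape of the ai.api.model types: the top-level
// speech string is the primary response only, while the messages list
// carries every configured text response.
public class AllResponsesDemo {

    static class Message {
        final String speech;
        Message(String speech) { this.speech = speech; }
    }

    static class Fulfillment {
        final String speech;          // primary response only
        final List<Message> messages; // every configured text response
        Fulfillment(String speech, List<Message> messages) {
            this.speech = speech;
            this.messages = messages;
        }
    }

    // Collect the speech of every message, not just the primary one.
    static List<String> allSpeech(Fulfillment f) {
        List<String> out = new ArrayList<>();
        for (Message m : f.messages) {
            out.add(m.speech);
        }
        return out;
    }

    public static void main(String[] args) {
        Fulfillment f = new Fulfillment("Hi!",
                Arrays.asList(new Message("Hi!"), new Message("Hello there!")));
        System.out.println(allSpeech(f)); // [Hi!, Hello there!]
    }
}
```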

Class loading warnings due to Log4J and JMX

I'm seeing these warnings on application startup

Rejecting re-init on previously-failed class java.lang.Class<org.apache.logging.log4j.core.lookup.JmxRuntimeInputArgumentsLookup>: java.lang.NoClassDefFoundError: Failed resolution of: Ljava/lang/management/ManagementFactory;

This is likely because the Java VM that Android runs lacks those classes. Is there any way we can exclude them from being loaded or referenced? The app still runs fine, though.

1 second delay seems to be too low

This commit 334e165 introduced:

Made google recognizer cancellation after 1 sec from last partial results for google app starts with v5.9.26.

That is a really short amount of time if you want to input a whole sentence; you have to speak really fast.
I think it should at least be configurable by the SDK user.
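A configurable version of that cancellation could look like the watchdog sketch below. The class and its names are hypothetical (the SDK does not expose this today): each partial result re-arms a timer, and recognition is cancelled only after a caller-chosen stretch of silence.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.ScheduledFuture;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch of a caller-configurable silence timeout: every partial
// result re-arms the timer, and onTimeout fires only after timeoutMs of
// silence, replacing the hard-coded 1-second cancellation.
public class SilenceWatchdog {

    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();
    private final long timeoutMs;
    private final Runnable onTimeout;
    private ScheduledFuture<?> pending;

    public SilenceWatchdog(long timeoutMs, Runnable onTimeout) {
        this.timeoutMs = timeoutMs;
        this.onTimeout = onTimeout;
    }

    // Call from the recognizer's partial-results callback: cancel the old
    // deadline and schedule a fresh one.
    public synchronized void onPartialResult() {
        if (pending != null) {
            pending.cancel(false);
        }
        pending = scheduler.schedule(onTimeout, timeoutMs, TimeUnit.MILLISECONDS);
    }

    // Call when recognition ends for any reason.
    public synchronized void shutdown() {
        if (pending != null) {
            pending.cancel(false);
        }
        scheduler.shutdownNow();
    }
}
```

With timeoutMs exposed through the SDK configuration, the current 1-second default could stay while apps expecting longer sentences raise it.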

Extra field in request JSON

I need to send some additional data using a webhook (e.g. the current logged-in username). Is there any way I could add an extra field to the request JSON?

Request JSON:

{
  "extra_field": "abc",
  "lang": "en",
  "status": {
    "errorType": "success",
    "code": 200
  },
  "timestamp": "2017-02-09T16:06:01.908Z",
  "sessionId": "1486656220806",
  "result": {
    "parameters": {
      "city": "Rome",
      "name": "Ana"
    },
    "contexts": [],
    "resolvedQuery": "my name is Ana and I live in Rome",
    "source": "agent",
    "score": 1.0,
    "speech": "",
    "fulfillment": {
      "messages": [
        {
          "speech": "Hi Ana! Nice to meet you!",
          "type": 0
        }
      ],
      "speech": "Hi Ana! Nice to meet you!"
    },
    "actionIncomplete": false,
    "action": "greetings",
    "metadata": {
      "intentId": "9f41ef7c-82fa-42a7-9a30-49a93e2c14d0",
      "webhookForSlotFillingUsed": "false",
      "intentName": "greetings",
      "webhookUsed": "true"
    }
  },
  "id": "ab30d214-f4bb-4cdd-ae36-31caac7a6693",
  "originalRequest": {
    "source": "google",
    "data": {
      "inputs": [
        {
          "raw_inputs": [
            {
              "query": "my name is Ana and I live in Rome",
              "input_type": 2
            }
          ],
          "intent": "assistant.intent.action.TEXT",
          "arguments": [
            {
              "text_value": "my name is Ana and I live in Rome",
              "raw_text": "my name is Ana and I live in Rome",
              "name": "text"
            }
          ]
        }
      ],
      "user": {
        "user_id": "PuQndWs1OMjUYwVJMYqwJv0/KT8satJHAUQGiGPDQ7A="
      },
      "conversation": {
        "conversation_id": "1486656220806",
        "type": 2,
        "conversation_token": "[]"
      }
    }
  }
}
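Rather than a new top-level field, the usual route is attaching an input context whose parameters carry the extra data; the webhook then reads it back from result.contexts[..].parameters. In the Android SDK this is roughly AIContext plus RequestExtras (names from memory, so verify against your SDK version). The sketch below only shows the JSON fragment such a context contributes; the context name "user-info" and key "username" are made up for illustration:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Builds the JSON fragment an input context with parameters contributes to
// the request. The webhook reads it back from result.contexts; "user-info"
// and "username" are illustrative names only.
public class ExtraDataContextDemo {

    static String contextJson(String name, Map<String, String> parameters) {
        StringBuilder sb = new StringBuilder();
        sb.append("{\"name\":\"").append(name).append("\",\"parameters\":{");
        boolean first = true;
        for (Map.Entry<String, String> e : parameters.entrySet()) {
            if (!first) {
                sb.append(',');
            }
            sb.append('"').append(e.getKey()).append("\":\"")
              .append(e.getValue()).append('"');
            first = false;
        }
        return sb.append("}}").toString();
    }

    public static void main(String[] args) {
        Map<String, String> params = new LinkedHashMap<>();
        params.put("username", "abc");
        System.out.println(contextJson("user-info", params));
        // {"name":"user-info","parameters":{"username":"abc"}}
    }
}
```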

Annotation processor dependency error

I'm running 'com.android.tools.build:gradle:2.4.0-alpha7' on AS 2.4 Preview 7 and getting the following error:

Error:Execution failed for task ':zapImoveisApp:javaPreCompileZapDebug'.

Annotation processors must be explicitly declared now. The following dependencies on the compile classpath are found to contain annotation processor. Please add them to the annotationProcessor configuration.
- log4j-core-2.2.jar
Alternatively, set android.defaultConfig.javaCompileOptions.annotationProcessorOptions.includeCompileClasspath = true to continue with previous behavior. Note that this option is deprecated and will be removed in the future.
See https://developer.android.com/r/tools/annotation-processor-error-message.html for more details.

As you can see in the documentation at the provided URL, it's possible to disable this check using the following Gradle config, but it should only be used as a temporary solution because it may be removed in the future.

javaCompileOptions { annotationProcessorOptions { includeCompileClasspath false } }

So we need an alternative solution that fits the new annotation processor dependency configuration.
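A hedged alternative, assuming log4j-core is only being misdetected as an annotation processor and is not actually needed on your compile classpath, is excluding it from the SDK dependency. The dependency notation below is illustrative; adjust it to whatever coordinates your build actually uses, and keep the exclusion only if the app still builds and runs:

```groovy
dependencies {
    // Hypothetical coordinates; substitute your actual SDK dependency.
    compile('ai.api:sdk:2.0.7@aar') {
        transitive = true
        exclude group: 'org.apache.logging.log4j', module: 'log4j-core'
    }
}
```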

Settings Activity not opening

Actual Behaviour
Nothing happens when we click the Settings option in the options menu in AIServiceSampleActivity and AIButtonSampleActivity.
Expected Behaviour
The Settings activity should open when we click the Settings item in the options menu.
I would like to resolve this issue!
