
openai-java's Introduction

Maven Central

⚠️ Notice: This project is no longer maintained and has been archived as of June 6th, 2024. Thank you to everyone who has contributed and supported this project. While the repository will remain available in its current state, no further updates or support will be provided. Please feel free to fork and modify the code as needed.

⚠️ OpenAI has deprecated all Engine-based APIs. See Deprecated Endpoints below for more info.

OpenAI-Java

Java libraries for using OpenAI's GPT APIs. Supports GPT-3, ChatGPT, and GPT-4.

Includes the following artifacts:

  • api : request/response POJOs for the GPT APIs.
  • client : a basic Retrofit client for the GPT endpoints; includes the api module.
  • service : a basic service class that creates and calls the client. This is the easiest way to get started.

as well as an example project using the service.

Supported APIs

Deprecated by OpenAI

Importing

Gradle

implementation 'com.theokanning.openai-gpt3-java:<api|client|service>:<version>'

Maven

<dependency>
    <groupId>com.theokanning.openai-gpt3-java</groupId>
    <artifactId>{api|client|service}</artifactId>
    <version>version</version>
</dependency>

Usage

Data classes only

If you want to make your own client, just import the POJOs from the api module. Your client will need to use snake case to work with the OpenAI API.
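Most JSON libraries can do this mapping for you (in Jackson, for example, by configuring a snake-case naming strategy on your ObjectMapper); the conversion itself is mechanical, as this standalone sketch shows (the class and method names here are illustrative, not part of this library):

```java
import java.util.regex.Pattern;

// Illustrative only: shows the camelCase -> snake_case mapping the OpenAI API
// expects for request fields. With Jackson you would instead configure a
// snake-case naming strategy on the ObjectMapper rather than convert by hand.
public class SnakeCaseDemo {
    private static final Pattern BOUNDARY = Pattern.compile("([a-z0-9])([A-Z])");

    static String toSnakeCase(String camelCase) {
        // Insert an underscore at each lower-to-upper boundary, then lowercase
        return BOUNDARY.matcher(camelCase).replaceAll("$1_$2").toLowerCase();
    }

    public static void main(String[] args) {
        System.out.println(toSnakeCase("maxTokens"));        // max_tokens
        System.out.println(toSnakeCase("frequencyPenalty")); // frequency_penalty
    }
}
```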

Retrofit client

If you're using retrofit, you can import the client module and use the OpenAiApi.
You'll have to add your auth token as a header (see AuthenticationInterceptor) and set your converter factory to use snake case and only include non-null fields.

OpenAiService

If you're looking for the fastest solution, import the service module and use OpenAiService.

⚠️ The OpenAiService in the client module is deprecated; please switch to the new version in the service module.

OpenAiService service = new OpenAiService("your_token");
CompletionRequest completionRequest = CompletionRequest.builder()
        .prompt("Somebody once told me the world is gonna roll me")
        .model("babbage-002")
        .echo(true)
        .build();
service.createCompletion(completionRequest).getChoices().forEach(System.out::println);

Customizing OpenAiService

If you need to customize OpenAiService, create your own Retrofit client and pass it in to the constructor. For example, do the following to add request logging (after adding the logging gradle dependency):

ObjectMapper mapper = defaultObjectMapper();
OkHttpClient client = defaultClient(token, timeout)
        .newBuilder()
        .addInterceptor(new HttpLoggingInterceptor())
        .build();
Retrofit retrofit = defaultRetrofit(client, mapper);

OpenAiApi api = retrofit.create(OpenAiApi.class);
OpenAiService service = new OpenAiService(api);

Adding a Proxy

To use a proxy, modify the OkHttp client as shown below:

ObjectMapper mapper = defaultObjectMapper();
Proxy proxy = new Proxy(Proxy.Type.HTTP, new InetSocketAddress(host, port));
OkHttpClient client = defaultClient(token, timeout)
        .newBuilder()
        .proxy(proxy)
        .build();
Retrofit retrofit = defaultRetrofit(client, mapper);
OpenAiApi api = retrofit.create(OpenAiApi.class);
OpenAiService service = new OpenAiService(api);

Functions

You can create functions and define their executors using the ChatFunction class, along with custom classes of your own that define the functions' available parameters. The FunctionExecutor helper then makes processing function calls straightforward.

First we declare our function parameters:

public class Weather {
    @JsonPropertyDescription("City and state, for example: León, Guanajuato")
    public String location;
    @JsonPropertyDescription("The temperature unit, can be 'celsius' or 'fahrenheit'")
    @JsonProperty(required = true)
    public WeatherUnit unit;
}
public enum WeatherUnit {
    CELSIUS, FAHRENHEIT;
}
public static class WeatherResponse {
    public String location;
    public WeatherUnit unit;
    public int temperature;
    public String description;
    
    // constructor
}

Next, we declare the function itself and associate it with an executor. In this example we fake a response from some API:

ChatFunction.builder()
        .name("get_weather")
        .description("Get the current weather of a location")
        .executor(Weather.class, w -> new WeatherResponse(w.location, w.unit, new Random().nextInt(50), "sunny"))
        .build()

Then, we use the FunctionExecutor from the 'service' module to execute the call and convert the result into a message that is ready for the conversation:

List<ChatFunction> functionList = // list with functions
FunctionExecutor functionExecutor = new FunctionExecutor(functionList);

List<ChatMessage> messages = new ArrayList<>();
ChatMessage userMessage = new ChatMessage(ChatMessageRole.USER.value(), "Tell me the weather in Barcelona.");
messages.add(userMessage);
ChatCompletionRequest chatCompletionRequest = ChatCompletionRequest
        .builder()
        .model("gpt-3.5-turbo-0613")
        .messages(messages)
        .functions(functionExecutor.getFunctions())
        .functionCall(new ChatCompletionRequestFunctionCall("auto"))
        .maxTokens(256)
        .build();

ChatMessage responseMessage = service.createChatCompletion(chatCompletionRequest).getChoices().get(0).getMessage();
ChatFunctionCall functionCall = responseMessage.getFunctionCall(); // might be null, but in this case it is certainly a call to our 'get_weather' function.

ChatMessage functionResponseMessage = functionExecutor.executeAndConvertToMessageHandlingExceptions(functionCall);
messages.add(functionResponseMessage);

Note: The FunctionExecutor class is part of the 'service' module.

You can also create your own function executor. The return object of ChatFunctionCall.getArguments() is a JsonNode for simplicity and should be able to help you with that.
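A hand-rolled executor boils down to looking up the function name from the ChatFunctionCall and applying a handler to its parsed arguments. This standalone sketch shows the dispatch pattern (FunctionRegistry and its method names are invented for illustration, and a plain Map stands in for the JsonNode arguments):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Illustrative sketch of a hand-rolled function dispatcher: map each function
// name to a handler that receives the (already parsed) argument payload.
public class FunctionRegistry {
    private final Map<String, Function<Map<String, Object>, Object>> handlers = new HashMap<>();

    public void register(String name, Function<Map<String, Object>, Object> handler) {
        handlers.put(name, handler);
    }

    public Object execute(String name, Map<String, Object> arguments) {
        Function<Map<String, Object>, Object> handler = handlers.get(name);
        if (handler == null) {
            throw new IllegalArgumentException("Unknown function: " + name);
        }
        return handler.apply(arguments);
    }

    public static void main(String[] args) {
        FunctionRegistry registry = new FunctionRegistry();
        registry.register("get_weather", a -> a.get("location") + ": sunny");
        System.out.println(registry.execute("get_weather", Map.of("location", "Barcelona")));
    }
}
```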

For a more in-depth look, refer to the conversational example OpenAiApiFunctionsExample.java, or, for functions combined with streaming, OpenAiApiFunctionsWithStreamExample.java.

Streaming thread shutdown

If you want to shut down your process immediately after streaming responses, call OpenAiService.shutdownExecutor().
This is not necessary for non-streaming calls.

Running the example project

All the example project requires is your OpenAI API token:

export OPENAI_TOKEN="sk-XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"

You can try all the capabilities of this project using:

./gradlew runExampleOne

And you can also try the new capability of using functions:

./gradlew runExampleTwo

Or functions with 'stream' mode enabled:

./gradlew runExampleThree

FAQ

Does this support GPT-4?

Yes! GPT-4 uses the ChatCompletion API, and you can see the latest model options here.
GPT-4 is currently in a limited beta (as of 4/1/23), so make sure you have access before trying to use it.

Does this support functions?

Absolutely! It is very easy to use your own functions without worrying about doing the dirty work. As mentioned above, you can refer to the OpenAiApiFunctionsExample.java or OpenAiApiFunctionsWithStreamExample.java projects for examples.

Why am I getting connection timeouts?

Make sure that OpenAI is available in your country.

Why doesn't OpenAiService support x configuration option?

Many projects use OpenAiService, and in order to support them best I've kept it extremely simple.
You can create your own OpenAiApi instance to customize headers, timeouts, base URLs, etc.
If you want features like retry logic and async calls, you'll have to make an OpenAiApi instance and call it directly instead of using OpenAiService.
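For example, a minimal retry wrapper around direct API calls might look like this sketch (the Retry class and withRetries helper are hypothetical names, not part of this library):

```java
import java.util.concurrent.Callable;

// Illustrative retry helper (hypothetical, not part of openai-java): retries a
// call with exponential backoff, as you might wrap direct OpenAiApi calls.
public class Retry {
    static <T> T withRetries(Callable<T> call, int maxAttempts, long baseDelayMs) throws Exception {
        Exception last = null;
        for (int attempt = 0; attempt < maxAttempts; attempt++) {
            try {
                return call.call();
            } catch (Exception e) {
                last = e;
                Thread.sleep(baseDelayMs << attempt); // 1x, 2x, 4x, ... the base delay
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        int[] attempts = {0};
        // Simulated flaky call: fails twice, then succeeds
        String result = withRetries(() -> {
            if (++attempts[0] < 3) throw new RuntimeException("HTTP 429");
            return "ok";
        }, 5, 10);
        System.out.println(result + " after " + attempts[0] + " attempts"); // ok after 3 attempts
    }
}
```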

Deprecated Endpoints

OpenAI has deprecated engine-based endpoints in favor of model-based endpoints. For example, instead of using v1/engines/{engine_id}/completions, switch to v1/completions and specify the model in the CompletionRequest. The code includes upgrade instructions for all deprecated endpoints.
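Roughly, the migration looks like this (request bodies abbreviated; the paths are those named above):

```
POST /v1/engines/{engine_id}/completions   (deprecated: model chosen by URL path)
{ "prompt": "Say this is a test" }

POST /v1/completions                       (current: model specified in the body)
{ "model": "text-davinci-003", "prompt": "Say this is a test" }
```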

I won't remove the old endpoints from this library until OpenAI shuts them down.

License

Published under the MIT License

openai-java's People

Contributors

aaronuu, bartsoj, bjorkman, bowenliang123, costescuandrei, csoltenborn, danielfariati, dehidehidehi, drakeet, eduardo-t, fchebbo, gianseb, grogdunn, haumacher, hawkan, izeye, mrblw, mzhu-ai, nahakiole, remyohajinwa, ronilbhattarai, runningcode, sliman4, teerapap, tgolob, theokanning, tox-p, vacuityv, venthe, yeikel


openai-java's Issues

java.net.SocketTimeoutException read timed out

Randomly I receive "java.net.SocketTimeoutException read timed out". Maybe ChatGPT's server is full and delays sending the response. After 10 seconds with no answer, a timeout exception appears. Could this timeout be increased? This is my code:

String token = System.getenv("OPENAI_TOKEN");
String chatResponse = "";
OpenAiService service = new OpenAiService(token);
Engine davinci = service.getEngine("text-davinci-003");
ArrayList<CompletionChoice> storyArray = new ArrayList<>();
System.out.println("\nBrewing up a story...");
CompletionRequest completionRequest = CompletionRequest.builder()
        .prompt(message)
        .temperature(0.6) //0.9
        .maxTokens(450)
        .topP(1.0) //1.0
        .frequencyPenalty(0.4) //0.0
        .presencePenalty(0.1) //0.6
        .build();
service.createCompletion("text-davinci-003", completionRequest).getChoices().forEach(storyArray::add);
chatResponse = storyArray.get(0).toString();

retrofit2 HTTP 401 Exception


console print:

Exception in thread "main" retrofit2.adapter.rxjava2.HttpException: HTTP 401 Unauthorized
at retrofit2.adapter.rxjava2.BodyObservable$BodyObserver.onNext(BodyObservable.java:57)
at retrofit2.adapter.rxjava2.BodyObservable$BodyObserver.onNext(BodyObservable.java:38)
at retrofit2.adapter.rxjava2.CallExecuteObservable.subscribeActual(CallExecuteObservable.java:48)
at io.reactivex.Observable.subscribe(Observable.java:12284)
at retrofit2.adapter.rxjava2.BodyObservable.subscribeActual(BodyObservable.java:35)
at io.reactivex.Observable.subscribe(Observable.java:12284)
at io.reactivex.internal.operators.observable.ObservableSingleSingle.subscribeActual(ObservableSingleSingle.java:35)
at io.reactivex.Single.subscribe(Single.java:3666)
at io.reactivex.Single.blockingGet(Single.java:2869)
at com.theokanning.openai.OpenAiService.createCompletion(OpenAiService.java:91)

My token is fresh; I don't know why I'm getting this error.

Add usage object with the used tokens to CompletionResult

In the docs, you have this request example:

{
  "model": "text-davinci-002",
  "prompt": "Say this is a test",
  "max_tokens": 6,
  "temperature": 0,
  "top_p": 1,
  "n": 1,
  "stream": false,
  "logprobs": null,
  "stop": "\n"
}

And this response example:

{
  "id": "cmpl-uqkvlQyYK7bGYrRHQ0eXlWi7",
  "object": "text_completion",
  "created": 1589478378,
  "model": "text-davinci-002",
  "choices": [
    {
      "text": "\n\nThis is a test",
      "index": 0,
      "logprobs": null,
      "finish_reason": "length"
    }
  ],
  "usage": {
    "prompt_tokens": 5,
    "completion_tokens": 6,
    "total_tokens": 11
  }
}

I think it would be a good option to add the usage object (currently not available in the CompletionResult object):

package com.theokanning.openai.completion;
import lombok.Data;
import java.util.List;
@Data
public class CompletionResult {
    String id;
    String object;
    long created;
    String model;
    List<CompletionChoice> choices;
}

No error handling at all

There are a variety of errors that may be returned from the openai API, but none of them are handled by this library. I pulled the source code and could not find any use of keywords such as "error" or "invalid_request_error" so it seems this was missed. There are a variety of reasons errors could happen such as running out of tokens, rate limiting, system overloaded, etc. Instead, the library just crashes with no indication of what the error is.
The library needs a way for clients to know why the request failed so that appropriate action can be taken. For instance, if it's a rate limit error, the client code could pause for a minute, increase the sleep time between requests, and then retry.

Example response from openai:

{
  "error": {
    "message": "Incorrect API key provided: xxxxxxxx****************************************xxxx. You can find your API key at https://beta.openai.com.",
    "type": "invalid_request_error",
    "param": null,
    "code": "invalid_api_key"
  }
}

How openai-java behaves:

Exception in thread "main" retrofit2.adapter.rxjava2.HttpException: HTTP 401 
	at retrofit2.adapter.rxjava2.BodyObservable$BodyObserver.onNext(BodyObservable.java:57)
	at retrofit2.adapter.rxjava2.BodyObservable$BodyObserver.onNext(BodyObservable.java:38)
	at retrofit2.adapter.rxjava2.CallExecuteObservable.subscribeActual(CallExecuteObservable.java:48)
	at io.reactivex.Observable.subscribe(Observable.java:10151)
	at retrofit2.adapter.rxjava2.BodyObservable.subscribeActual(BodyObservable.java:35)
	at io.reactivex.Observable.subscribe(Observable.java:10151)
	at io.reactivex.internal.operators.observable.ObservableSingleSingle.subscribeActual(ObservableSingleSingle.java:35)
	at io.reactivex.Single.subscribe(Single.java:2517)
	at io.reactivex.Single.blockingGet(Single.java:2001)
	at com.theokanning.openai.OpenAiService.createCompletion(OpenAiService.java:91)

java.net.SocketTimeoutException: timeout

The error occurs especially with complex sentences in languages other than English, even when the meaning is the same. May I ask how to solve this?

Various security vulnerabilities due to outdated `jackson-databind:2.10.1`

Provides transitive vulnerable dependency maven:com.fasterxml.jackson.core:jackson-databind:2.10.1
CVE-2020-25649 7.5 Improper Restriction of XML External Entity Reference vulnerability pending CVSS allocation
CVE-2021-20190 8.1 Deserialization of Untrusted Data vulnerability pending CVSS allocation
CVE-2020-10650 8.1 Deserialization of Untrusted Data vulnerability with high severity found
Cxced0c06c-935c 5.9 Uncontrolled Resource Consumption vulnerability pending CVSS allocation
CVE-2020-36518 7.5 Out-of-bounds Write vulnerability pending CVSS allocation
CVE-2022-42003 7.5 Deserialization of Untrusted Data vulnerability pending CVSS allocation
CVE-2022-42004 7.5 Deserialization of Untrusted Data vulnerability pending CVSS allocation


Dependency tree:

[INFO] |  \- com.squareup.retrofit2:converter-jackson:jar:2.9.0:runtime
[INFO] |     \- com.fasterxml.jackson.core:jackson-databind:jar:2.10.1:runtime
[INFO] |        \- com.fasterxml.jackson.core:jackson-core:jar:2.10.1:runtime

Simple Gradle 8.0 deprecation fix

gradle --warning-mode all build

> Configure project :example
The JavaApplication.setMainClassName(String) method has been deprecated. This is scheduled to be removed in Gradle 8.0. Use #getMainClass().set(...) instead. See https://docs.gradle.org/7.4.2/dsl/org.gradle.api.plugins.JavaApplication.html#org.gradle.api.plugins.JavaApplication:mainClass for more details.
        at build_aj5rf6yanecnhsnx159poi4y1$_run_closure1.doCall(openai-java/example/build.gradle:5)
        (Run with --stacktrace to get the full stack trace of this deprecation warning.)

Change: example/build.gradle

Use .set() instead of 'mainClassName =':

mainClass.set('example.OpenAiApiExample')

API build.gradle file

api/build.gradle: compileOnly 'org.projectlombok:lombok:1.18.12'

  • What went wrong:
    Execution failed for task ':api:compileJava'.

java.lang.IllegalAccessError: class lombok.javac.apt.LombokProcessor (in unnamed module @0x241019f1) cannot access class com.sun.tools.javac.processing.JavacProcessingEnvironment (in module jdk.compiler) because module jdk.compiler does not export com.sun.tools.javac.processing to unnamed module @0x241019f1

Workaround: Changed to 1.18.20

Truncated response text from CompletionChoice

The following program is supposed to return a long text, but truncates the response pretty early using the 0.8.0 version of the client.

package org.example;

import com.theokanning.openai.OpenAiService;
import com.theokanning.openai.completion.CompletionChoice;
import com.theokanning.openai.completion.CompletionRequest;

import java.util.stream.Collectors;

public class Main {
    public static void main(String[] args) {
        var m = new Main();
        final String res = m.process();
        System.out.println(res);
    }

    public String process() {
        final var service = new OpenAiService("api-key");
        final var completionRequest = CompletionRequest
                .builder()
                .model("text-davinci-002")
                .prompt("Create a text that is longer than 1024 characters")
                .echo(false)
                .build();

        return service.createCompletion(completionRequest)
                .getChoices()
                .stream()
                .map(CompletionChoice::getText)
                .collect(Collectors.joining(" "));

    }
}

Result is:

Lorem ipsum dolor sit amet, consectet

Crash while creating completion

Here's my code

String token = "MY_KEYS";
OpenAiService service = new OpenAiService(token);

System.out.println("\nCreating completion...");
CompletionRequest completionRequest = CompletionRequest.builder()
        .model("ada")
        .prompt("Somebody once told me the world is gonna roll me")
        .echo(true)
        .user("testing")
        .build();

btn.setOnClickListener(v -> {
    CompletionResult r = service.createCompletion(completionRequest); // Error arises here
});

can't find class OpenAiService

When I try to load the library (example or not) the line service = new Open... is unresolved as the class doesn't exist

Problem with stream attribute for CompletionRequest.

CompletionRequest completionRequest = CompletionRequest.builder()
        .prompt(dto.getQuestion())
        .model(OPENAI_COMPLETION_MODEL)
        .maxTokens(2048)
        .n(1)
        .stream(true)
        .echo(false)
        .build();

I get the error: Resolved [java.lang.RuntimeException: com.fasterxml.jackson.core.JsonParseException: Unrecognized token 'data': was expecting (JSON String, Number, Array, Object or token 'null', 'true' or 'false')

Can't create completion giving null error HTTP 429

Already checked the HTTP 429 issue, but the error persists.

Code:

OpenAiService service = new OpenAiService("token here");
CompletionRequest completionRequest = CompletionRequest.builder()
                .prompt(scanner.next())
                .model("text-davinci-003")
                .echo(true)
                .temperature(0.7)
                .maxTokens(96)
                .topP(1.0)
                .frequencyPenalty(0.0)
                .presencePenalty(0.3)
                .build();
CompletionResult result = service.createCompletion(completionRequest); // <---- null here

Read timed out

I'm getting a "Read timed out" error on a few completion calls. It happens quite quickly. I'd like to increase the timeout. Where do I do this?

Still experiencing "Unrecognized token 'data': was expecting..." when using stream(true)...code works fine when stream(false) is set.

This issue was set to "Closed" (issue #52), but I am still having the same issue despite trying the recommended fix. I've been pulling my hair out with this one for three days. Please help?

Here is my code that gives the error in the logcat, "Unrecognized token 'data': was expecting (JSON String, Number, Array, Object or token 'null', 'true' or 'false')":

From another method:
        final Duration ai_timeout = Duration.ofSeconds(75);
        service = new OpenAiService("MY_API_KEY_GOES_HERE", ai_timeout);
        String userInput = binding.fragmentMainUsersEditTextToAi.getText().toString();
        new OpenAITask().execute(userInput);


String Username;
private class OpenAITask extends AsyncTask<String, Void, String> {
    @Override
    protected String doInBackground(String... params) {

        SetThreadPolicy();

        //TODO: Set username to "Username" variable to the users username:
        Username = "Username goes here.";

        String question = params[0];
        String response = "";

        CompletionRequest request = CompletionRequest.builder()
                .prompt(question)
                .model("text-davinci-003")
                .maxTokens(220)
                .topP(0.1)
                .stream(Boolean.TRUE) //TODO: Figure out why setting .stream(true) causes the error: Unrecognized token 'data': was expecting (JSON String, Number, Array, Object or token 'null', 'true' or 'false')
                .echo(false)
                .frequencyPenalty(0.0)
                .user(Username)
                .build();

        List<String> responses = null;

        try {
            responses = service.createCompletion(request)
                    .getChoices()
                    .stream()
                    .map(CompletionChoice::getText)
                    .collect(Collectors.toList());

            System.out.println(responses);

            if (responses != null && responses.size() > 0) {
                responses.stream().forEach(System.out::println);
                response = responses.get(0);
            } else {
                System.out.println("Response is null or size=0");
            }
        } catch (Exception e) {
            System.out.println(e.getMessage());
        }
        return response;
    }


    @Override
    protected void onPostExecute(String response) {
        if (response != null && !response.isEmpty()) {
            Log.d("Debug", "response: " + response);

            messagesAdapter.appendMessage(new com.wiscoapps.askai.Message(response.trim(), false)); //Note: ".trim()" will remove any leading or trailing whitespace or line breaks from the response before appending the text.

            storeMessagesAndRecyclerViewPositionToSharedPreferences();

            messagesAdapter.notifyItemInserted(messagesAdapter.getItemCount() - 1);

            // Smooth scroll to the last item in the RecyclerView:
            retrieveMessagesFromSharedPreferences();
            LinearLayoutManager layoutManager = ((LinearLayoutManager) binding.messagesRecyclerView.getLayoutManager());
            if (layoutManager != null && layoutManager.findLastCompletelyVisibleItemPosition() < messagesAdapter.getItemCount() - 1) {
                binding.messagesRecyclerView.requestLayout();
                if (binding.messagesRecyclerView.getLayoutManager() instanceof LinearLayoutManager) {
                    LinearLayoutManager linearLayoutManager = (LinearLayoutManager) binding.messagesRecyclerView.getLayoutManager();
                    linearLayoutManager.smoothScrollToPosition(binding.messagesRecyclerView, null, messagesAdapter.getItemCount() - 1);
                }
            }

            //TODO: HANDLE EXCEPTIONS/ALERT USER...like if the user hasn't input an API...an invalid API is input...no internet connection...etc.
        }
    }

Continue completion response

Is there a way to continue a completion response if the reason is length?
I get a

 java.net.SocketTimeoutException: timeout
	at io.reactivex.internal.util.ExceptionHelper.wrapOrThrow(ExceptionHelper.java:45)
	at io.reactivex.internal.observers.BlockingMultiObserver.blockingGet(BlockingMultiObserver.java:90)
	at io.reactivex.Single.blockingGet(Single.java:2002)
	at com.theokanning.openai.OpenAiService.createCompletion(OpenAiService.java:91)

when I set maxTokens>500

Override base url and header with gateway data

So in my company we use an api gateway (Gravitee) to provide a central point for all apis (internal and external).
We use our own api key internally and let the gateway transform the headers.

Question 1: How can we change the base url to our gateway?
Question 2: How can we use different headers for all requests?

All help is greatly appreciated :)

Conflicting package names when importing from Maven

When only using Maven (not Gradle), api and client both use com.theokanning.openai as the package name, so Maven cannot build: the error is "module [X] reads package com.theokanning.openai from both client and api". Or is there no way to use this without Gradle?

java.lang.NoClassDefFoundError: retrofit2/converter/jackson/JacksonConverterFactory

I am facing the issue below:
Exception in thread "main" java.lang.NoClassDefFoundError: retrofit2/converter/jackson/JacksonConverterFactory
at com.example.journaling.OpenAiApi.main(OpenAiApi.java:8)
Caused by: java.lang.ClassNotFoundException: retrofit2.converter.jackson.JacksonConverterFactory
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 1 more

My Gradle build looks like this:

dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar'])
    implementation 'androidx.appcompat:appcompat:1.0.2'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test.ext:junit:1.1.0'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.1.1'
    implementation 'com.theokanning.openai-gpt3-java:client:0.8.1'
}

Does somebody know how to fix this issue? I would really appreciate it.

timeout on text-davinci-003

With text-davinci-003, the response takes longer than before and times out.

Caused by: java.net.SocketTimeoutException: timeout
at okhttp3.internal.http2.Http2Stream$StreamTimeout.newTimeoutException(Http2Stream.kt:675)
at okhttp3.internal.http2.Http2Stream$StreamTimeout.exitAndThrowIfTimedOut(Http2Stream.kt:684)
at okhttp3.internal.http2.Http2Stream.takeHeaders(Http2Stream.kt:143)
at okhttp3.internal.http2.Http2ExchangeCodec.readResponseHeaders(Http2ExchangeCodec.kt:96)
at okhttp3.internal.connection.Exchange.readResponseHeaders(Exchange.kt:106)
at okhttp3.internal.http.CallServerInterceptor.intercept(CallServerInterceptor.kt:79)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
at okhttp3.internal.connection.ConnectInterceptor.intercept(ConnectInterceptor.kt:34)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
at okhttp3.internal.cache.CacheInterceptor.intercept(CacheInterceptor.kt:95)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
at okhttp3.internal.http.BridgeInterceptor.intercept(BridgeInterceptor.kt:83)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.kt:76)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
at com.theokanning.openai.AuthenticationInterceptor.intercept(AuthenticationInterceptor.java:26)
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109)
at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201)
at okhttp3.internal.connection.RealCall.execute(RealCall.kt:154)
at retrofit2.OkHttpCall.execute(OkHttpCall.java:204)
at retrofit2.adapter.rxjava2.CallExecuteObservable.subscribeActual(CallExecuteObservable.java:46)
at io.reactivex.Observable.subscribe(Observable.java:12284)
at retrofit2.adapter.rxjava2.BodyObservable.subscribeActual(BodyObservable.java:35)
at io.reactivex.Observable.subscribe(Observable.java:12284)
at io.reactivex.internal.operators.observable.ObservableSingleSingle.subscribeActual(ObservableSingleSingle.java:35)
at io.reactivex.Single.subscribe(Single.java:3666)
at io.reactivex.Single.blockingGet(Single.java:2869)
... 110 more

retrofit2 HTTP 400 Exception

package net.Amogh;

import com.theokanning.openai.OpenAiService;
import com.theokanning.openai.completion.CompletionRequest;

public class Main {

    public static void main(String[] args){
        OpenAiService service = new OpenAiService("token here");
        CompletionRequest completionRequest = CompletionRequest.builder()
                .prompt("Hi")
                .echo(true)
                .build();
        service.createCompletion(completionRequest).getChoices().forEach(System.out::println);
    }
}

I have removed the token here, but I am sure it is correct. It produces

Exception in thread "main" retrofit2.adapter.rxjava2.HttpException: HTTP 400 
	at retrofit2.adapter.rxjava2.BodyObservable$BodyObserver.onNext(BodyObservable.java:57)
	at retrofit2.adapter.rxjava2.BodyObservable$BodyObserver.onNext(BodyObservable.java:38)
	at retrofit2.adapter.rxjava2.CallExecuteObservable.subscribeActual(CallExecuteObservable.java:48)
	at io.reactivex.Observable.subscribe(Observable.java:10151)
	at retrofit2.adapter.rxjava2.BodyObservable.subscribeActual(BodyObservable.java:35)
	at io.reactivex.Observable.subscribe(Observable.java:10151)
	at io.reactivex.internal.operators.observable.ObservableSingleSingle.subscribeActual(ObservableSingleSingle.java:35)
	at io.reactivex.Single.subscribe(Single.java:2517)
	at io.reactivex.Single.blockingGet(Single.java:2001)
	at com.theokanning.openai.OpenAiService.createCompletion(OpenAiService.java:91)
	at net.Amogh.Main.main(Main.java:14)

every single time it's run.

retrofit2 429 exception

Facing the below error
java.util.concurrent.CompletionException: retrofit2.adapter.rxjava2.HttpException: HTTP 429
Details:

  • I have 7 threads creating a new instance of the OpenAiService.
  • I have quite a few requests made on each thread.
  • What is the cause of this exception? I know I have not exceeded my rate limit.

Query with ".echo(false)" has empty response (when completing lists)

Hi all,

I see strange behaviour: when I ask GPT-3 to continue a list AND set "echo" to false, it returns empty text.
When I do the same in the playground (unchecking "Inject start text" there, which I assume is equivalent to echo=false), it works fine.

For example I give this prompt to GPT-3:

Title above list is Fruits
Write a couple of next list items:

- Apple
- Orange
- Grape

Code:

CompletionRequest completionRequest = CompletionRequest.builder()
  .model("text-davinci-003")
  .prompt(prompt__see__above)
  .maxTokens(tokensLimit)
  .temperature(0.7)
  .echo(false)
  .build();
CompletionResult result = service.createCompletion(completionRequest);

And see a single CompletionChoice:

text: "",
index: 0,
logprobs: null,
finish_reason: "stop"

Do you have any idea for a workaround for this?
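
One plausible explanation: the playground's "Inject start text" appends something like "\n- " to the prompt before sending, so the model has a fresh bullet to continue; without it, a prompt ending in a complete list item may stop immediately. A hedged workaround is to append that start text yourself (this mimics the playground setting and is an assumption, not a documented API behaviour):

```java
// Sketch: mimic the playground's "Inject start text" by ensuring the
// prompt ends with an open list bullet for the model to complete.
public class PromptUtil {
    public static String withListCue(String prompt) {
        // Append "\n- " unless the prompt already ends with an open bullet.
        if (prompt.endsWith("- ")) {
            return prompt;
        }
        return prompt + "\n- ";
    }
}
```

If this is the cause, the completion will then start after the injected "- ", so prepend it back when displaying the result.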

Provide statically typed model selection

It would be nice to have a mechanism for selecting models in a statically typed fashion rather than using hard-coded Strings.

From the onboarding example, rather than:

CompletionRequest completionRequest = CompletionRequest.builder()
        .prompt("Somebody once told me the world is gonna roll me")
        .model("ada")
        .echo(true)
        .build();

it would be nice to be able to do something like:

CompletionRequest completionRequest = CompletionRequest.builder()
        .prompt("Somebody once told me the world is gonna roll me")
        .model(Model.ada)
        .echo(true)
        .build();

With the current setup, an invalid model name fails at runtime with a 404:

Exception in thread "main" retrofit2.adapter.rxjava2.HttpException: HTTP 404 
	at retrofit2.adapter.rxjava2.BodyObservable$BodyObserver.onNext(BodyObservable.java:57)
	at retrofit2.adapter.rxjava2.BodyObservable$BodyObserver.onNext(BodyObservable.java:38)
	at retrofit2.adapter.rxjava2.CallExecuteObservable.subscribeActual(CallExecuteObservable.java:48)
	at io.reactivex.Observable.subscribe(Observable.java:10151)
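
A minimal sketch of what such an enum could look like (the constant names and model list here are illustrative; a real implementation would need to track OpenAI's published model IDs):

```java
// Hypothetical enum mapping typed constants to the string IDs the API expects.
public enum Model {
    ADA("ada"),
    BABBAGE("babbage"),
    CURIE("curie"),
    DAVINCI("davinci"),
    TEXT_DAVINCI_003("text-davinci-003");

    private final String id;

    Model(String id) {
        this.id = id;
    }

    /** The string identifier to pass in the request's "model" field. */
    public String getId() {
        return id;
    }
}
```

The builder could then accept a Model and call getId() internally, turning the runtime 404 for a mistyped model name into a compile-time error.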

Image Generation

Add the API files for Create image, Create image edit, and Create image variation.

Continue the conversation

Can we continue the conversation without including the previous response in the prompt? Is there a way to maintain context, like a conversation identifier?
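
As far as I know, the completion endpoint is stateless and offers no conversation identifier, so context is usually maintained client-side by re-sending the accumulated exchange as part of each prompt. A minimal sketch of that bookkeeping (the class and the "User:"/"AI:" framing are illustrative, not part of this library):

```java
// Sketch: client-side conversation memory. Each turn is appended, and the
// full transcript is re-sent as the prompt for the next request.
public class Conversation {
    private final StringBuilder transcript = new StringBuilder();

    public void addUserTurn(String text) {
        transcript.append("User: ").append(text).append('\n');
    }

    public void addAiTurn(String text) {
        transcript.append("AI: ").append(text).append('\n');
    }

    /** Prompt carrying the whole history plus a cue for the next AI reply. */
    public String nextPrompt() {
        return transcript + "AI:";
    }
}
```

Note that the transcript grows with every turn, so a real implementation would eventually need to truncate or summarize old turns to stay within the model's token limit.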

Illegal reflective access

I can't use this library in my software because I get this error:

WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by retrofit2.Platform (file:/home/admin/test.jar) to constructor java.lang.invoke.MethodHandles$Lookup(java.lang.Class,int)
WARNING: Please consider reporting this to the maintainers of retrofit2.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release

I use the client:
OpenAiService service = new OpenAiService(token, API_SOCKET_TIMEOUT);

Any solution?

Thanks

Thread-safety

Hi there,

I was playing around with the library and started wondering whether it is thread-safe. Unfortunately, I haven't had time to take a closer look myself, and I didn't find any existing issues about this, so I thought I would just ask.
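
Until thread-safety is confirmed, one defensive pattern is to give each thread its own instance via ThreadLocal. A generic sketch (the wrapper class is hypothetical; the factory lambda stands in for however the service is constructed, e.g. () -> new OpenAiService(token)):

```java
import java.util.function.Supplier;

// Sketch: one instance per thread, created lazily on first use. If the
// service turns out to be thread-safe, a single shared instance is simpler.
public class PerThread<T> {
    private final ThreadLocal<T> local;

    public PerThread(Supplier<T> factory) {
        this.local = ThreadLocal.withInitial(factory);
    }

    public T get() {
        return local.get();
    }
}
```

The trade-off is one HTTP client (and connection pool) per thread, so this is a workaround rather than a long-term design.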

Previously working code now causes a .builder() issue

While refreshing some of the code, I ran into a new issue: code that previously worked no longer compiles.

The method builder() is undefined for the type CompletionRequest

CompletionRequest completionRequest = CompletionRequest.builder()
        .prompt(prompt)
        .temperature(0.9)
        .maxTokens(150)
        .topP(1.0)
        .frequencyPenalty(0.0)
        .presencePenalty(0.5)
        .stop(stops)
        .echo(false)
        .build();
