
chatbot-ui's Issues

Table not displaying correctly

Hello! I asked ChatGPT to make a table, but it doesn't render the table; it just displays the raw data (see the attached screenshot). Thank you!

Requests

Great app! Here are a few possible improvements...

Streaming API responses -- currently you have to wait for the whole completion to finish. I think the stream: true API parameter does this (see the sketch below).
Stop API generation -- if the response streams, a button to cut the bot off mid-generation, like the original ChatGPT has.
Flash of pure white screen on page load and reload -- quite painful in dark mode.
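
A minimal sketch of the streaming idea in TypeScript, assuming OpenAI's public chat completions endpoint; the function and variable names below are illustrative, not chatbot-ui's actual code:

// Sketch only: stream: true makes the chat completions endpoint return
// server-sent events, and an AbortController gives a "Stop generating"
// button something to call.
export async function streamCompletion(
  apiKey: string,
  messages: { role: string; content: string }[],
  onChunk: (chunk: string) => void,
  controller: AbortController,
): Promise<void> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model: "gpt-3.5-turbo", messages, stream: true }),
    signal: controller.signal, // a "Stop generating" button calls controller.abort()
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    // Each chunk holds one or more "data: {...}" SSE lines with token deltas;
    // a real implementation would parse them before appending to the UI.
    onChunk(decoder.decode(value));
  }
}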

Publish Docker Image

Summary

Add a CI step to build and publish the Docker image for this repo to the GitHub package registry.

Details

Use GitHub Actions to define a build/publish job that runs on each commit to the main branch. The CI step should build the Docker image and publish it to the repo's GitHub package registry, tagged with both the short commit hash and the latest tag.

This would allow users to pull an already-built Docker image from the registry and make it easier to use and integrate new changes in hosted Docker environments.

activate gpt-4

My API key gives me access to the gpt-4 model. After un-commenting

// GPT_4 = "gpt-4"

and

// [OpenAIModel.GPT_4]: "GPT-4"

I could verify that everything works.

Now, I guess you kept it commented out to avoid being flooded with issues from people without gpt-4 access reporting that GPT-4 produces errors.

I wrote the following API route that returns the list of supported models available for a given API key. I'm not familiar with Next.js, though, and I see a problem with hitting OpenAI again and again just to check the available models; some caching should be used. Anyway, here's the endpoint code in case part of it is useful (maybe with getServerSideProps?)

// api/models.ts
import { OpenAIModel } from '@/types';

export const config = {
  runtime: "edge"
};

type Model = {
  id: string;
};

type ModelsData = {
  data: Model[];
};

const handler = async (req: Request): Promise<Response> => {
  try {
    // The key in the request body is optional; if it's missing or the body
    // isn't valid JSON, fall back to the server-side OPENAI_API_KEY below.
    let key: string | null = null;
    try {
      key = (await req.json())?.key;
    } catch (e) {}

    const res = await fetch("https://api.openai.com/v1/models", {
      headers: {
        Authorization: `Bearer ${key ? key : process.env.OPENAI_API_KEY}`
      },
    });

    if (res.status !== 200) {
      throw new Error(`OpenAI API returned an error ${res.status} : ${await res.text()}`);
    }

    const { data } = await res.json() as ModelsData;
    // Keep only the models that both the key can access and the UI supports.
    const availableModelIds = data.map((model: Model) => model.id);
    const supportedModels = availableModelIds.filter((id) =>
      (Object.values(OpenAIModel) as string[]).includes(id),
    );
    return new Response(
      JSON.stringify(supportedModels),
      {
        status: 200,
        headers: {
          'content-type': 'application/json',
        },
      }
    );
  } catch (error) {
    console.error(error);
    return new Response("Error", { status: 500 });
  }
};

export default handler;
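
For reference, a hypothetical client-side call to this route could look like the following sketch; the function name is illustrative:

// Sketch only: ask the server which of the UI's models the given key can use.
async function fetchSupportedModels(key?: string): Promise<string[]> {
  const res = await fetch("/api/models", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ key }), // omit the key to fall back to the server's OPENAI_API_KEY
  });
  return res.json();
}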

Suggestion: Decouple auto scrolling when user manually scrolls during reply streaming.

Currently, when GPT writes a reply that extends past the bottom of the page, the interface automatically scrolls down to keep the streaming text in view. However, it would be more user-friendly if auto-scrolling were disabled whenever the user scrolls manually, similar to how the official ChatGPT works.

Sometimes we want to start reading the top of the reply before it has finished streaming.
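
A minimal sketch of one way to do this, written as a React hook around the scroll container; all names here are illustrative rather than chatbot-ui's actual component code:

import { useEffect, useRef, useState } from "react";

// Sketch only: stop following the stream once the user scrolls away from the
// bottom, and resume when they scroll back down.
export function useAutoScroll(streamingText: string) {
  const chatRef = useRef<HTMLDivElement>(null);
  const [autoScroll, setAutoScroll] = useState(true);

  // Attach to onScroll: auto-scroll stays on only while the user is near the bottom.
  const handleScroll = () => {
    const el = chatRef.current;
    if (!el) return;
    setAutoScroll(el.scrollHeight - el.scrollTop - el.clientHeight < 40);
  };

  // Follow the streaming text only while auto-scroll is enabled.
  useEffect(() => {
    if (autoScroll) {
      chatRef.current?.scrollTo({ top: chatRef.current.scrollHeight });
    }
  }, [streamingText, autoScroll]);

  return { chatRef, handleScroll };
}

// Usage: <div ref={chatRef} onScroll={handleScroll}>...</div>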

Error "Sorry, there was an error."

I'm trying to get things running locally. I've cloned the repo, installed the dependencies, and provided my API key, but I keep receiving the message: Sorry, there was an error.

I've tried creating a new API key, but it doesn't fix the issue. I can see on OpenAI's site that my key's "last used" date updates to today, so the request does reach OpenAI.

I've also searched for an error log file but couldn't find one. Any help would be appreciated, thanks! (Great work, by the way!)


UI responsive problem

I noticed that the UI doesn't work properly when I switch to mobile mode. When I click the conversation button, there is no response.

Suggestion: Saveable System prompts

Currently, you have a default system prompt that resets with every new chat.

Having the ability to store and reuse our own customized system prompts within the interface would be an extremely valuable addition. It would let a user reuse prompts based on the specific behaviour they want from GPT in a given thread, instead of re-typing the system prompt each time.

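A minimal sketch of what saveable prompts could look like, assuming localStorage persistence; the type and storage key are illustrative:

// Sketch only: persist named system prompts so they can be reused across chats.
interface SavedPrompt {
  name: string;
  content: string;
}

const STORAGE_KEY = "savedSystemPrompts";

export function loadPrompts(): SavedPrompt[] {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as SavedPrompt[]) : [];
}

export function savePrompt(prompt: SavedPrompt): void {
  // Overwrite any existing prompt with the same name.
  const others = loadPrompts().filter((p) => p.name !== prompt.name);
  localStorage.setItem(STORAGE_KEY, JSON.stringify([...others, prompt]));
}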

Support for deleting or editing messages

I believe the biggest strength of the API is being able to edit the whole conversation, similar to the OpenAI playground. It would be great to have edit/delete features.
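
A minimal sketch of the helpers such a feature needs; the Message shape is illustrative, not chatbot-ui's actual type:

// Sketch only: edit a message and drop everything after it so later replies
// can be regenerated, or delete a single message outright.
type Message = { role: "user" | "assistant" | "system"; content: string };

export function editMessage(messages: Message[], index: number, content: string): Message[] {
  return [...messages.slice(0, index), { ...messages[index], content }];
}

export function deleteMessage(messages: Message[], index: number): Message[] {
  return messages.filter((_, i) => i !== index);
}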

UI design

Hi, is the UI intended to replicate the original ChatGPT? Right now it isn't quite the same, some things hurt my eyes, and there are a few layout issues. If you intend to replicate it 100%, I'm happy to contribute on that end and fix the mismatches and layout issues.

Any way to set proxy?

I have a networking problem and it seems that I have to set a proxy. Message:

[TypeError: fetch failed] {
  cause: [ConnectTimeoutError: Connect Timeout Error] {
    name: 'ConnectTimeoutError',
    code: 'UND_ERR_CONNECT_TIMEOUT',
    message: 'Connect Timeout Error'
  }
}
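
One possible workaround, sketched under the assumption that the failing fetch runs in the Node.js runtime (where fetch is backed by undici); HTTPS_PROXY is an assumed environment variable, and whether this helps the edge runtime used by the API routes is untested:

import { ProxyAgent, setGlobalDispatcher } from "undici";

// Sketch only: route Node's fetch through an HTTP(S) proxy.
const proxyUrl = process.env.HTTPS_PROXY; // e.g. "http://127.0.0.1:7890"
if (proxyUrl) {
  setGlobalDispatcher(new ProxyAgent(proxyUrl));
}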

Getting error - Class extends value undefined is not a constructor or null

After setting up the env file, running "npm run dev" gives:

error - Class extends value undefined is not a constructor or null

This might be caused by a React Class Component being rendered in a Server Component, React Class Components only works in Client Components. Read more: https://nextjs.org/docs/messages/class-component-in-server-component
at ../../node_modules/.pnpm/[email protected]/node_modules/undici/lib/fetch/file.js (evalmachine.:5724:19)
at __require (evalmachine.:14:50)
at ../../node_modules/.pnpm/[email protected]/node_modules/undici/lib/fetch/formdata.js (evalmachine.:5881:49)
at __require (evalmachine.:14:50)
at ../../node_modules/.pnpm/[email protected]/node_modules/undici/lib/fetch/body.js (evalmachine.:6094:35)
at __require (evalmachine.:14:50)
at ../../node_modules/.pnpm/[email protected]/node_modules/undici/lib/fetch/response.js (evalmachine.:6510:49)
at __require (evalmachine.:14:50)
at (evalmachine.:11635:30)
at requireFn (file://D:\software\projects\html\chatbot-ui\node_modules\next\dist\compiled\edge-runtime\index.js:1:7079) {
name: 'TypeError'
}

Feature Request: scrape URLs to get context from them

Sample:
While chatting with the bot, the user can add a URL for an article or some code on GitHub; the chatbot could detect that there is a URL, scrape it, extract the contents, and use them as context for the chat.
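
A minimal sketch of the idea, with illustrative names and a deliberately crude HTML-to-text step:

// Sketch only: detect a URL in the user's message, fetch the page, strip tags,
// and return text that can be prepended to the prompt as context.
const URL_RE = /https?:\/\/\S+/g;

export async function extractUrlContext(message: string): Promise<string> {
  const urls = message.match(URL_RE) ?? [];
  const pages = await Promise.all(
    urls.map(async (url) => {
      const html = await (await fetch(url)).text();
      // A real implementation would use a readability/markdown extractor instead.
      return html.replace(/<script[\s\S]*?<\/script>/gi, "").replace(/<[^>]+>/g, " ");
    }),
  );
  return pages.join("\n\n").slice(0, 8000); // stay within the model's context budget
}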

Does not support users who type in Japanese

Bug

Japanese has both kanji and hiragana, and the Enter key is used to convert hiragana into kanji.
In the current implementation, handleSend is executed as soon as the conversion is confirmed, which sends the message.

Solution

When Enter is pressed to confirm an IME conversion, the message should not be sent.
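
A minimal sketch of that check, assuming a React key handler; handleSend is the existing send function mentioned above:

import type { KeyboardEvent } from "react";

// Sketch only: isComposing marks the Enter press that merely confirms an IME
// conversion, so it must not trigger a send.
const handleKeyDown = (e: KeyboardEvent<HTMLTextAreaElement>, handleSend: () => void) => {
  if (e.key === "Enter" && !e.shiftKey && !e.nativeEvent.isComposing) {
    e.preventDefault();
    handleSend();
  }
};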


Dialog Context gets stuck

After chatting back and forth a few times, the dialog context seems to freeze and the responses to a new query stay the same.

It seems there is an issue with the dialogue state? (GPT-3.5, the default.)

markdown rendering?

How can I make markdown code and math render like it does in the real ChatGPT? Thanks.
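
One common approach, which would likely also fix the table-rendering report above, is react-markdown with the GFM and math plugins. A minimal sketch, assuming those packages are added as dependencies:

import ReactMarkdown from "react-markdown";
import remarkGfm from "remark-gfm";
import remarkMath from "remark-math";
import rehypeKatex from "rehype-katex";

// Sketch only: render an assistant message as Markdown with tables and math.
// katex/dist/katex.min.css also needs to be loaded (e.g. in _app) for math to render.
export function MessageBody({ content }: { content: string }) {
  return (
    <ReactMarkdown remarkPlugins={[remarkGfm, remarkMath]} rehypePlugins={[rehypeKatex]}>
      {content}
    </ReactMarkdown>
  );
}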

error - Class extends value undefined is not a constructor or null

error - Class extends value undefined is not a constructor or null

This might be caused by a React Class Component being rendered in a Server Component, React Class Components only works in Client Components. Read more: https://nextjs.org/docs/messages/class-component-in-server-component
at ../../node_modules/.pnpm/[email protected]/node_modules/undici/lib/fetch/file.js (evalmachine.:5724:19)
at __require (evalmachine.:14:50)
at ../../node_modules/.pnpm/[email protected]/node_modules/undici/lib/fetch/formdata.js (evalmachine.:5881:49)
at __require (evalmachine.:14:50)
at ../../node_modules/.pnpm/[email protected]/node_modules/undici/lib/fetch/body.js (evalmachine.:6094:35)
at __require (evalmachine.:14:50)
at ../../node_modules/.pnpm/[email protected]/node_modules/undici/lib/fetch/response.js (evalmachine.:6510:49)
at __require (evalmachine.:14:50)
at (evalmachine.:11635:30)
at requireFn (file:///Users/bobjames/code/chatgpt/chatbot-ui/node_modules/next/dist/compiled/edge-runtime/index.js:1:7079) {
name: 'TypeError'
}

Feature Request: Allow to reply to messages to add context to the chat

Sample:
When chatting in a long thread, we sometimes need clarification or have a question about an older message in the same thread. Currently, we must copy the whole content, scroll down, ask the chat again, and paste the copied text.

With a reply feature, we could simply click "reply" and ask another question further down the chat, with the replied-to text automatically included as part of the context.
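
A minimal sketch of how the replied-to text could be folded into the prompt; the Message shape and wording are illustrative:

// Sketch only: build the new user message so the quoted text travels with it.
type Message = { role: "user" | "assistant" | "system"; content: string };

export function buildReply(question: string, repliedTo: Message): Message {
  return {
    role: "user",
    content: `Regarding this earlier message:\n"""${repliedTo.content}"""\n\n${question}`,
  };
}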

Feature request: Tree-like message history

Hello, I recently built a tree-like message history feature (link) and I would like to develop it further to make it more similar to a real ChatGPT conversation. However, implementing this feature would require refactoring the current structure of the conversation. I'm wondering if this is worth doing?

Error handling

When an error occurs, an error message should be displayed along with a retry button.
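
A minimal sketch of the idea as a React hook; callChatApi and the Message shape are hypothetical placeholders:

import { useState } from "react";

// Sketch only: keep the last failed request around so a "Retry" button can resend it.
type Message = { role: string; content: string };

export function useRetryableSend(callChatApi: (m: Message) => Promise<void>) {
  const [error, setError] = useState<string | null>(null);
  const [lastMessage, setLastMessage] = useState<Message | null>(null);

  const send = async (message: Message) => {
    setError(null);
    setLastMessage(message);
    try {
      await callChatApi(message);
    } catch {
      setError("Sorry, there was an error. Click retry to resend the last message.");
    }
  };

  const retry = () => lastMessage && send(lastMessage);
  return { send, retry, error }; // render {error} next to a button that calls retry()
}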

error - Class extends value undefined is not a constructor or null

This might be caused by a React Class Component being rendered in a Server Component, React Class Components only works in Client Components. Read more: https://nextjs.org/docs/messages/class-component-in-server-component
at ../../node_modules/.pnpm/[email protected]/node_modules/undici/lib/fetch/file.js (evalmachine.:5724:19)
at __require (evalmachine.:14:50)
at ../../node_modules/.pnpm/[email protected]/node_modules/undici/lib/fetch/formdata.js (evalmachine.:5881:49)
at __require (evalmachine.:14:50)
at ../../node_modules/.pnpm/[email protected]/node_modules/undici/lib/fetch/body.js (evalmachine.:6094:35)
at __require (evalmachine.:14:50)
at ../../node_modules/.pnpm/[email protected]/node_modules/undici/lib/fetch/response.js (evalmachine.:6510:49)
at __require (evalmachine.:14:50)
at (evalmachine.:11635:30)
at requireFn (file:///Users/bobjames/code/chatgpt/chatbot-ui/node_modules/next/dist/compiled/edge-runtime/index.js:1:7079) {
name: 'TypeError'
}

Feature request: Speech2Text (Whisper) as chat input

Use the OpenAI Whisper API to transcribe the user's voice into the chat input field (OS-native transcription is inaccurate). There could be a passphrase (like "over out" or whatever) that functions as the [Enter] command and sends the query.

I could try to develop this feature if it's deemed valuable
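
A minimal sketch of the transcription call, using OpenAI's public audio transcription endpoint; the recording and Blob handling around it are illustrative:

// Sketch only: send recorded audio to Whisper and return the transcribed text
// so it can be placed into the chat input field.
export async function transcribe(audio: Blob, apiKey: string): Promise<string> {
  const form = new FormData();
  form.append("file", audio, "speech.webm");
  form.append("model", "whisper-1");

  const res = await fetch("https://api.openai.com/v1/audio/transcriptions", {
    method: "POST",
    headers: { Authorization: `Bearer ${apiKey}` },
    body: form,
  });
  const { text } = (await res.json()) as { text: string };
  return text;
}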

Improve the new conversation behaviour.

We should refine the logic for creating new conversations to align it more closely with the ChatGPT website's behavior. Specifically, when users click "New Chat," they should be taken to an empty chat page, and the conversation should only appear in the sidebar after they send their first message.
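
A minimal sketch of deferring creation until the first message; the Conversation and Message shapes are illustrative, not chatbot-ui's actual state:

// Sketch only: "New Chat" shows an empty page (draft === null); the sidebar
// entry is only created when the first message is sent.
type Message = { role: string; content: string };
type Conversation = { id: string; name: string; messages: Message[] };

export function addFirstMessage(
  conversations: Conversation[],
  draft: Conversation | null,
  message: Message,
): { conversations: Conversation[]; current: Conversation } {
  const base = draft ?? {
    id: crypto.randomUUID(),
    name: message.content.slice(0, 30), // name the chat after its first message
    messages: [],
  };
  const current = { ...base, messages: [...base.messages, message] };
  const others = conversations.filter((c) => c.id !== current.id);
  return { conversations: [...others, current], current };
}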

I'm sorry but the issue you and alanjustalan closed...

Sorry, there really are Python beginners like me 😅

10 hours ago you told alanjustalan:

"Ah I see! Get rid of the < and >. Sorry for the confusion there. That should fix it!"

I get the same error, but I don't understand. Do you mean there is a special character in my OpenAI API key?


Could not parse ~/Dropbox/github/package.json. Ignoring it.

I'm getting this:

$ npm run dev

> [email protected] dev
> next dev

ready - started server on 0.0.0.0:3000, url: http://localhost:3000
info  - Loaded env from /Users/jay/Dropbox/github/chatbot-ui/.env.local
[Browserslist] Could not parse /Users/jay/Dropbox/github/package.json. Ignoring it.
[Browserslist] Could not parse /Users/jay/Dropbox/github/package.json. Ignoring it.
event - compiled client and server successfully in 1762 ms (173 modules)
wait  - compiling...
event - compiled successfully in 178 ms (139 modules)

Keep focus on the chat input box after sending a message

First of all, I'd like to thank you for creating and maintaining this wonderful project. I have been using it and find it very helpful.

However, I noticed that the chat input box loses focus after sending a message. This requires users to click on the input box again with the mouse before they can type another message, which can be a bit inconvenient.

I would like to request a feature that keeps the focus on the chat input box after sending a message. This would allow users to continue typing and sending messages more efficiently, without having to manually click on the input box each time.
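
A minimal sketch of the behaviour, assuming a React ref on the textarea; names are illustrative:

import { useRef } from "react";

// Sketch only: keep a ref to the chat textarea and refocus it once a message
// has been sent, so the user can keep typing without reaching for the mouse.
export function useChatInputFocus() {
  const inputRef = useRef<HTMLTextAreaElement>(null);

  const refocus = () => inputRef.current?.focus();

  return { inputRef, refocus }; // <textarea ref={inputRef} />; call refocus() after sending
}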

Best regards
