This project provides some basic front-ends that enable web-interface-based interaction with LLM APIs. In addition, the interface design implements the prerequisites for an abstraction layer in front of an LLM account, so that multiple users can share the same account without having access to the account itself. We added some functionality with Aleph Alpha, such as summarization, document processing, and question answering, to the showcase. It is not meant to be beautiful code that can be transferred to production; rather, it is meant to show something on the screen other than code.
- About
- Showcases
- Getting Started
- Deployment or usage
- Configuration
- TODO
- Licensing Overview
- Authors
- Acknowledgments
- Notes
So far, if you have an Aleph Alpha account, for example, and would like to share it with many people in your organization, you would have to share the account credentials with everyone, which would grant everyone too many rights. The playground, however, is only accessible through those extensive credentials. Therefore, this project provides a front-end to create an abstraction, and further provides some basic wrappers for LLM use cases.
The front-end allows you to use the plain playground just like in the respective account, to configure models and try out prompt engineering. There is also a functionality for uploading a PDF file, of which the selected page will be summarized. Finally, there is a basic chat functionality to chat with the world knowledge of the LLM.
The first use case is a dupe of the Aleph Alpha Playground. As described above, sharing an account would otherwise mean sharing credentials that grant too many rights, so this front-end creates the necessary abstraction.
Configuration of the chat prompt:
This chat prompt is a very simple one and intended to be used to chat with the world knowledge of the foundation model.
```
### Instruction: You are a chatbot and you answer questions.
### Input:{{chatmessage}}
### Response:
```
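The `{{placeholder}}` slots in such a template are filled with the user's input before the prompt is sent to the completion API. A minimal sketch of that substitution step in Python (the app itself is written in R; `render_prompt` is a hypothetical helper for illustration, not project code):

```python
def render_prompt(template: str, **values: str) -> str:
    """Fill {{placeholder}} slots in a prompt template (naive substitution)."""
    for key, value in values.items():
        template = template.replace("{{" + key + "}}", value)
    return template

chat_template = (
    "### Instruction: You are a chatbot and you answer questions.\n"
    "### Input:{{chatmessage}}\n"
    "### Response:"
)

prompt = render_prompt(chat_template, chatmessage="What is the capital of France?")
print(prompt)
```

The same mechanism applies to all prompts below; only the placeholder names (`{{document}}`, `{{question}}`, and so on) change.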
The second use case showcases a simple summary of input text, derived from https://towardsdatascience.com/summarize-podcast-transcripts-and-long-texts-better-with-nlp-and-ai-e04c89d3b2cb.
You can find the summarization and chat prompt in the "prompts" folder.
Configuration of the summarization prompt:
This summarization prompt is a very simple one.
```
### Instruction:
Generate concise summary as continuous text from the given text. If the given text does not have real sentences, say "TEXT_CONTAINS_NO_REAL_SENTENCES".
### Input:
Text: {{document}}
### Response:
```
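Because the prompt instructs the model to answer with the literal sentinel "TEXT_CONTAINS_NO_REAL_SENTENCES" for degenerate input, the calling code needs to check for it. A hypothetical sketch of that check (not the project's R code; the user-facing message is an assumption):

```python
SENTINEL = "TEXT_CONTAINS_NO_REAL_SENTENCES"

def handle_summary(completion: str) -> str:
    """Map the model's sentinel answer to a user-facing message."""
    if SENTINEL in completion:
        return "The selected page contains no real sentences to summarize."
    return completion.strip()

print(handle_summary(" A concise summary of the page. "))
print(handle_summary(SENTINEL))
```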
Summarization prompt for a guided summary:
```
### Instruction:
Answer the question using the given text.
### Input:
Question: {{question}}
Text: {{document}}
### Response:
```
The summarization comes with a keyword extraction (prompt taken from Aleph Alpha Playground).
```
Identify matching keywords for each text.
###
Text: The "Whiskey War" is an ongoing conflict between Denmark and Canada over ownership of Hans Island. The dispute began in 1973, when Denmark and Canada reached an agreement on Greenland's borders. However, no settlement regarding Hans Island could be reached by the time the treaty was signed. Since then both countries have used peaceful means - such as planting their national flag or burying liquor - to draw attention to the disagreement.
Keywords: Conflict, Whiskey War, Denmark, Canada, Treaty, Flag, Liquor
###
Text: NASA launched the Discovery program to explore the solar system. It comprises a series of expeditions that have continued from the program's launch in the 1990s to the present day. In the course of the 16 expeditions launched so far, the Moon, Mars, Mercury and Venus, among others, have been explored. Unlike other space programs, the Discovery program places particular emphasis on cost efficiency, true to the motto: "faster, better, cheaper".
Keywords: Space program, NASA, Expedition, Cost efficiency, Moon, Mars, Mercury, Venus
###
Text: {{document}}
Keywords:
```
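The "###" lines delimit the few-shot examples; when calling the completion endpoint, "###" is typically also passed as a stop sequence so the model does not invent further examples. A hedged sketch of assembling such a few-shot prompt and parsing the comma-separated keyword line (function names are illustrative, not the project's code):

```python
def build_keyword_prompt(examples, document):
    """Assemble a few-shot keyword-extraction prompt, '###'-separated."""
    parts = ["Identify matching keywords for each text."]
    for text, keywords in examples:
        parts.append("###\nText: " + text + "\nKeywords: " + ", ".join(keywords))
    parts.append("###\nText: " + document + "\nKeywords:")
    return "\n".join(parts)

def parse_keywords(completion: str):
    """Split the model's comma-separated keyword line into a clean list."""
    return [k.strip() for k in completion.split(",") if k.strip()]

prompt = build_keyword_prompt(
    [("The Whiskey War is a conflict over Hans Island.", ["Conflict", "Whiskey War"])],
    "NASA launched the Discovery program.",
)
print(parse_keywords(" Space program, NASA, Expedition "))
```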
The third use case showcases question answering (with natural language generation), in which a given input document can be queried. The output displays the n most suitable text chunks and a machine-generated answer in natural language.
Configuration of the embedding function:
```
### Instruction: Answer the given question by the provided document.
### Input: {{string}}
### Response: {{query}} The facts are as follows:
```
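Question answering of this kind typically embeds both the document chunks and the query, then ranks chunks by cosine similarity before the top n are displayed and passed into the prompt. A minimal, dependency-free sketch of that ranking step (the vectors here are toy values; in the real app the embeddings would come from the Aleph Alpha embedding endpoint, and the function names are illustrative):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def top_n_chunks(query_vec, chunk_vecs, chunks, n=2):
    """Return the n chunks whose embeddings are most similar to the query."""
    scored = sorted(
        zip(chunks, chunk_vecs),
        key=lambda cv: cosine_similarity(query_vec, cv[1]),
        reverse=True,
    )
    return [chunk for chunk, _ in scored[:n]]

chunks = ["chunk about contracts", "chunk about invoices", "chunk about weather"]
vectors = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]]
print(top_n_chunks([1.0, 0.05], vectors, chunks, n=2))
```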
The fourth use case showcases document processing, in which a given input document can be queried for specific entities.
```
### Instruction: Please extract relevant information from OCR-read document ("{{namedentity1}}","{{namedentity2}}","{{namedentity3}}").
If several values are present format them as a list.
If value cannot be extracted use "values":"NotAvail".
### Input: {{document}}
### Output:
```
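The prompt asks for values formatted as a list and for "NotAvail" when a field cannot be extracted, so the model's completion has to be parsed defensively. A hedged sketch of such a parser (the expected JSON-like output shape is an assumption based on the prompt wording, not a documented contract):

```python
import json

def parse_extraction(completion: str, entities):
    """Parse a JSON-like completion; fall back to 'NotAvail' per entity."""
    try:
        data = json.loads(completion)
    except json.JSONDecodeError:
        data = None
    if not isinstance(data, dict):
        # Model did not return a usable object: mark everything unavailable.
        return {entity: "NotAvail" for entity in entities}
    return {entity: data.get(entity, "NotAvail") for entity in entities}

print(parse_extraction('{"invoice_number": "4711", "amount": ["12.50", "3.99"]}',
                       ["invoice_number", "amount", "due_date"]))
```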
For changing the color scheme of the front-end, two files need to be touched; both are located in the "www" folder.
style.R in lines 3 and 4 to change the sidebar background color and text color:
```r
# Color configuration beside CSS elements; mix colors here: https://cssgradient.io/
config_button = "color: #fff; background-color: #06498c; border-color: #06498c"
config_primary = "#06498c"
config_sidebar_text_color = "#fff"
```
style.css in line 135 to change the background color of the main interaction field:
```css
body {
  background: linear-gradient(90deg, rgba(6,73,140,1) 28%, rgba(21,146,227,1) 63%, rgba(0,224,255,1) 100%);
}
```
These instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on how to deploy the project on a live system.
What things you need to run the software:
- Install Docker or Rancher in your environment
- Aleph Alpha Account or Token (see image description below)
In the following we describe how you can deploy this application with a Docker image.
Pros | Cons |
---|---|
Time-efficient set-up | No visual configuration possible |
No messing around with the console | No other configuration of inputs possible |
Step 0: Open your console
Step 1: Download the Docker image (=application)
```shell
docker pull schiggy89/llm-playground:<<VERSION NUMBER>>
```
Step 2: Start the downloaded Docker image (=application)
```shell
docker run -p 3838:3838 --rm schiggy89/llm-playground:<<VERSION NUMBER>>
```
This step might take a while until the application has started.
Step 3: Start a browser and enter "localhost:3838" to open the application
Step 4: Enter your token on the left. You do not need a USER ID (it is only relevant for other enterprise purposes).
In the following we describe how you can deploy this application by building your own Docker image, with some configuration of the inputs.
Pros | Cons |
---|---|
Visual configuration possible | Not a time-efficient set-up |
Other configuration of inputs possible | Messing around with the console |
Step 0: Open your console and cd to the filepath where you want to save the application
Step 1: Download the repository (=application)
```shell
git clone https://github.com/LilianDK/llm-playground.git
```
Step 2: Configure whatever you need to and build the Docker image
For normal Docker image builds:
```shell
docker build -t YOURTAG/PROJECTNAME:<<VERSION NUMBER>> --push .
```
For a multiarch build, that is, for multiple operating systems, you need to run the following instruction the first time:
```shell
docker buildx create --name multiarch --driver docker-container --use
```
Then, for all subsequent builds, only:
```shell
docker buildx build --no-cache --platform=linux/amd64,linux/arm64 -t YOURTAG/PROJECTNAME:<<VERSION NUMBER>> --push --builder multiarch .
```
(this takes approximately 5 minutes or more)
Step 3: Start the Docker image (=application)
```shell
docker run -p 3838:3838 --rm schiggy89/llm-playground:<<VERSION NUMBER>>
```
This step might take a while until the application has started.
Step 4: Start a browser and enter "localhost:3838" to open the application
Optional: if you want to share your image like in the first deployment option:
```shell
docker push YOURTAG/PROJECTNAME
```
Step 5: Enter your token on the left. You do not need a USER ID (it is only relevant for other enterprise purposes).
In the following we describe how you can "deploy" this application through R Studio.
Pros | Cons |
---|---|
Visual configuration possible | Not a time-efficient set-up |
Other configuration of inputs possible | Messing around with the console |
Step 0: Open your console and cd to the filepath where you want to save the application
Step 1: Download the repository (=application)
```shell
git clone https://github.com/LilianDK/llm-playground.git
```
Step 2: Open the aa_playground.Rproj
Step 3: Run app.R
Step 4: Enter your token on the left. You do not need a USER ID (it is only relevant for other enterprise purposes).
- Download prompt report multi-file
- Database for token tracking
- https://github.com/momper14/alephAlphaClient
- Audio recording
- Hate blocker
- Improve the summarization prompt (rejected; to be integrated into a new application for large text corpus summarization)
- Include other LLM APIs, maybe Cohere
- Adding chatbot functionality
- https://github.com/daattali/shinyscreenshot/
- Test framework for document processing
- Tests
- Prompt Catalogue
- Websocket (for the audio transcription display on the front-end)
Name | Version | Licence |
---|---|---|
R | 4.3.1 | GPL-2/GPL-3 |
shinyproxy | 3.0.2 | Apache-2.0 (see: https://github.com/LilianDK/shinyproxy_generative_ai_meadow) |
Aleph Alpha Client | None | None (scheduled for integration) |
audio | 0.1.11 | MIT (scheduled for integration) |
OpenAI Whisper | 0.2.1-1 | MIT |
renv | 1.0.2 | MIT |
Glue | 1.6.2 | MIT |
Stringr | 1.5.0 | MIT |
gridlayout | 0.2.1 | MIT |
DT | 0.29 | GPL-3 |
pdftools | 3.3.3 | MIT |
rpy2 | 3.5.14 | GPLv2+ |
thematic | 0.1.3 | MIT |
Rmarkdown | 3.3.3 | GPL-3 |
markdown | 3.4.4 | BSD License |
Shiny | 2.24 | GPL-3 |
ShinyWidgets | 0.8.0 | GPL-3 |
Bootswatch | 5.3.1 | MIT |
shinycssloaders | 1.0.0 | MIT |
bslib | 0.5.1 | MIT |
TheOpenAIR | 0.1.0 | MIT |
Reticulate | 1.31 | Apache-2.0 |
Python | 3.11 | PSF-2 |
Aleph Alpha Client | 3.4.1 | MIT |
Cohere Client | None | MIT (scheduled for integration) |
Jinja2 | 3.1.2 | BSD License (BSD-3-Clause) |
NumPy | 1.25 | NumPy licence |
Docker | 23.0.3 | Apache-2.0 |
Docker Compose | 2.21.0 | Apache-2.0 |
Name | Version | Licence |
---|---|---|
reactlog | 1.1.1 | GPL-3 |
- @LilianDK - Idea & Initial work
- @mfmezger - Supporting with the nasty Python and Docker stuff
- @momper14 - Supporting with the Docker compilation stuff and writing of the Aleph Alpha Client in R
- Bootswatch inspired me to make it at least a little beautiful
- renv::snapshot(type = "all")