
# Gradio UI

This is a utility UI project for chatting with your TGI (Text Generation Inference) LLM.

## Configuration

Copy the `gradio_ui.env.sample` file into a new file called `gradio_ui.env` and fill in the necessary environment variables.

```bash
cp gradio_ui.env.sample gradio_ui.env
```

The environment variables are as follows:

```env
TGI_SERVICE_URL= # URL of your running TGI service
HF_API_TOKEN= # your Hugging Face API token
PROMPT_FILE_PATH= # path to a prompt file
```
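For reference, here is a minimal sketch of how the UI might consume these variables. It assumes a plain `requests` call against TGI's `/generate` endpoint; the `generate` helper is an illustrative name, and the project's actual client code lives in `src`:

```python
import os

import requests

# Read the variables defined in gradio_ui.env.
TGI_SERVICE_URL = os.environ["TGI_SERVICE_URL"]
HF_API_TOKEN = os.environ.get("HF_API_TOKEN", "")
PROMPT_FILE_PATH = os.environ["PROMPT_FILE_PATH"]

# Load the prompt that will be prepended to every chat message.
with open(PROMPT_FILE_PATH) as f:
    system_prompt = f.read()


def generate(user_message: str) -> str:
    """Send one prompt to TGI's /generate endpoint and return the completion."""
    response = requests.post(
        f"{TGI_SERVICE_URL}/generate",
        headers={"Authorization": f"Bearer {HF_API_TOKEN}"},
        json={
            "inputs": f"{system_prompt}\n{user_message}",
            "parameters": {"max_new_tokens": 256},
        },
        timeout=60,
    )
    response.raise_for_status()
    # TGI's non-streaming endpoint returns {"generated_text": "..."}.
    return response.json()["generated_text"]
```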

## Running

Simply run:

```bash
make run
```

The UI will run on port 3001 on your localhost. You can change this port in the project's configuration.
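For context, a stripped-down Gradio chat app along these lines would bind that port. This is a sketch, not the project's actual `src` code; `respond` is an illustrative name, and `generate` is the hypothetical TGI client from the configuration section above:

```python
import gradio as gr


def respond(message, history):
    # Forward the user's message to TGI and return the completion.
    return generate(message)


demo = gr.ChatInterface(respond)
# server_port matches the port the UI is served on.
demo.launch(server_name="0.0.0.0", server_port=3001)
```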

Congratulations! You have successfully set up the Gradio UI for your TGI LLM.

Now you can go to http://localhost:3001 and chat with your LLM instance.