User-friendly WebUI for LLMs (Formerly Ollama WebUI)
Updated Jul 16, 2024 - Svelte
LLMX: the easiest third-party local LLM UI for the web!
CasaOS + Ollama + Open WebUI = Belullama
Fully-featured, beautiful web interface for vLLM - built with NextJS.
Notebook `.ipynb` files provided in the repository can be used to set up an LLM UI on online notebook platforms
An empirical comparative study on the energy efficiency of LLMs (ChatGPT, Bard, Llama 2) and their associated user interfaces.
User-friendly WebUI for LLMs based on Open WebUI. It is used by the Kompetenzwerkstatt Digital Humanities (KDH) at the Humboldt-Universität zu Berlin