A static single-page frontend for https://github.com/ChaoticByte/transcript_api
A frontend for large language models like 🐨 Koala or 🦙 Vicuna running on CPU with llama.cpp, using the API server provided by the llama-cpp-python library. NOTE: I had to discontinue this project because maintaining it takes more time than I can or want to invest. Feel free to fork :)
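As a rough illustration of the kind of backend this frontend talked to, here is a minimal sketch of querying a locally running llama-cpp-python API server (which exposes an OpenAI-compatible HTTP API). The model path, host, port, and sampling parameters below are assumptions for the example, not settings taken from this project.

```python
# Minimal sketch: query a local llama-cpp-python API server.
# Assumes the server was started separately, for example:
#   python3 -m llama_cpp.server --model ./models/vicuna-7b.gguf
# Host, port, model path, and sampling values are assumptions.

import json
import urllib.request

def complete(prompt: str, max_tokens: int = 64) -> str:
    """Send a completion request to the local llama-cpp-python server."""
    payload = json.dumps({
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": 0.7,
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:8000/v1/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read().decode("utf-8"))
    # OpenAI-style response: the first choice carries the generated text.
    return body["choices"][0]["text"]

if __name__ == "__main__":
    print(complete("Q: What is llama.cpp? A:"))
```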