Version 4.3 Stable (2023-05-30)
- Redesigned the chat history
- Renamed the profile names of vicuna-v0 and vicuna-v1.1
- Updated the README
Version 4.2 Stable (2023-05-30)
- Bumped llama-cpp-python[server] from 0.1.54 to 0.1.56
- Changed the profile format
- Added support for Vicuna v1.1
- Added support for Manticore Chat
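The release notes do not show the new profile format itself. As a purely hypothetical sketch, a Vicuna v1.1 profile would need to encode roughly the prompt conventions that model family expects; the field names and helper below are illustrative, not the project's actual schema.

```python
# Hypothetical illustration only: these field names are not the project's real
# profile schema. They capture the prompt conventions Vicuna v1.1 models expect.
vicuna_v1_1_profile = {
    "name": "vicuna-v1.1",
    "system_prompt": (
        "A chat between a curious user and an artificial intelligence assistant. "
        "The assistant gives helpful, detailed, and polite answers to the user's questions."
    ),
    "user_prefix": "USER: ",
    "assistant_prefix": "ASSISTANT: ",
    "stop_sequences": ["USER:"],
}


def build_prompt(profile: dict, history: list[tuple[str, str]], user_message: str) -> str:
    """Assemble a single completion prompt from the chat history (illustrative)."""
    parts = [profile["system_prompt"]]
    for user_turn, assistant_turn in history:
        parts.append(profile["user_prefix"] + user_turn)
        parts.append(profile["assistant_prefix"] + assistant_turn)
    parts.append(profile["user_prefix"] + user_message)
    parts.append(profile["assistant_prefix"].rstrip())
    return "\n".join(parts)


print(build_prompt(vicuna_v1_1_profile, [], "Hello!"))
```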
Version 4.1 Stable (2023-05-25)
- Bugfix: preserve whitespace in messages (#10)
- Minor code improvements
Version 4.0 Stable (2023-05-24)
BREAKING! Bumped llama-cpp-python[server] from 0.1.50 to 0.1.54.
This (again) requires re-quantized models. The new format is ggml v3. See https://github.com/ggerganov/llama.cpp/pull/1508
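As a quick sanity check after upgrading, here is a minimal sketch of loading a re-quantized ggml v3 file through the bumped binding; the model path and prompt are placeholders, not something shipped with this release. Files in the older ggml formats are rejected at load time.

```python
# Minimal sketch: with llama-cpp-python >= 0.1.54 the underlying llama.cpp
# only accepts models quantized in the new ggml v3 format.
from llama_cpp import Llama

# Placeholder path: point this at your own re-quantized model file.
llm = Llama(model_path="./models/vicuna-7b.ggmlv3.q4_0.bin")
result = llm("USER: Say hello in one sentence. ASSISTANT:", max_tokens=32)
print(result["choices"][0]["text"])
```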
Version 3.1 Stable (2023-05-18)
- Added a toggle button for the sidebar
- Implemented a responsive design (fixes #4)
- Made further minor improvements to the frontend
Version 3.0 Stable (2023-05-18)
Changed the frontend to support other LLMs and added support for Vicuna.
Version 2.1 Stable (2023-05-18)
Added more parameters to the sidebar: top_k, repeat_penalty, presence_penalty, frequency_penalty.
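These map onto sampling options of the llama-cpp-python[server] backend, which exposes an OpenAI-style completion endpoint. A rough sketch of forwarding them follows; this is not the project's actual frontend code, and the host, port, prompt, and values are assumptions.

```python
# Rough sketch, not the app's real request code: forward the sidebar sampling
# parameters to the llama-cpp-python[server] completion endpoint.
import requests

payload = {
    "prompt": "### Human: Tell me a short joke.\n### Assistant:",
    "max_tokens": 128,
    "temperature": 0.7,
    "top_k": 40,               # sample only from the 40 most likely tokens
    "repeat_penalty": 1.1,     # penalize recently generated tokens
    "presence_penalty": 0.0,   # flat penalty for tokens already in the text
    "frequency_penalty": 0.0,  # penalty scaled by how often a token appeared
}
response = requests.post("http://localhost:8000/v1/completions", json=payload)
print(response.json()["choices"][0]["text"])
```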
Version 2.0 Stable (2023-05-18)
(Breaking) Bumped llama-cpp-python[server] from 0.1.48 to 0.1.50.
You may have to re-quantize your models, see https://github.com/ggerganov/llama.cpp/pull/1405
Version 1.1 Stable (2023-05-18)
Version 1.0 Stable (2023-05-11)