74e59a9fd4  2023-05-30 23:12:32 +02:00
  Fixed a small typo in the README

2a2241ce08  2023-05-30 23:10:36 +02:00
  Redesigned the chat history, renamed the profile names of vicuna-v0 and vicuna-v1.1, and updated the screenshot

f4abe93735  2023-05-30 20:54:27 +02:00
  Added a profile for Manticore Chat

faed129586  2023-05-30 19:13:21 +02:00
  Added a profile for Vicuna v1.1

abb8054892  2023-05-30 18:57:19 +02:00
  Added an empty profile

de194bead6  2023-05-30 18:53:24 +02:00
  Improved the vicuna-v0 profile

2a46750ee9  2023-05-30 12:10:54 +02:00
  Updated README

ae0058bdee  2023-05-30 10:55:31 +02:00
  Improved profiles by adding a 'separator' field to the profile format, improved the vicuna-v0 profile, removed the default profile from the frontend-server CLI, and updated the README

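A 'separator' field suggests each profile defines the text placed between chat turns when the prompt is assembled for the model. A minimal sketch of how such a field could be consumed — the field names below are hypothetical illustrations, not the repository's actual profile schema:

```typescript
// Hypothetical sketch: using a profile's 'separator' to build a prompt from
// chat history. Field names are illustrative, not the real profile format.
interface Profile {
  systemPrompt: string; // instructions prepended to every conversation
  userPrefix: string;   // marker before a user turn, e.g. "### Human:"
  botPrefix: string;    // marker before an assistant turn
  separator: string;    // text inserted between turns, e.g. "\n" or "</s>"
}

function buildPrompt(profile: Profile, history: { user: boolean; text: string }[]): string {
  const turns = history.map(
    (m) => `${m.user ? profile.userPrefix : profile.botPrefix} ${m.text}`
  );
  // The separator joins the turns so the model sees clear turn boundaries;
  // ending with the bot prefix cues the model to produce the next reply.
  return [profile.systemPrompt, ...turns, profile.botPrefix].join(profile.separator);
}
```
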
bd44e45801  2023-05-30 10:20:58 +02:00
  Merge pull request #12 from ChaoticByte/dependabot/pip/llama-cpp-python-server--0.1.56

  Bump llama-cpp-python[server] from 0.1.54 to 0.1.56

5cfa6a7b0a  2023-05-30 07:53:12 +00:00  (dependabot[bot])
  Bump llama-cpp-python[server] from 0.1.54 to 0.1.56

  Bumps [llama-cpp-python[server]](https://github.com/abetlen/llama-cpp-python) from 0.1.54 to 0.1.56.
  - [Release notes](https://github.com/abetlen/llama-cpp-python/releases)
  - [Changelog](https://github.com/abetlen/llama-cpp-python/blob/main/CHANGELOG.md)
  - [Commits](https://github.com/abetlen/llama-cpp-python/compare/v0.1.54...v0.1.56)

  ---
  updated-dependencies:
  - dependency-name: llama-cpp-python[server]
    dependency-type: direct:production
    update-type: version-update:semver-patch
  ...

  Signed-off-by: dependabot[bot] <support@github.com>

8c29a31598  2023-05-25 21:26:00 +02:00
  Updated section about memory/disk requirements in the README

345d0cfc5c  2023-05-25 21:18:29 +02:00
  Moved import of create_app in api-server.py to the top

ea2f59f94e  2023-05-25 19:55:57 +02:00
  Preserve whitespace in messages by using pre-wrap (fixes #10)

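"pre-wrap" refers to the CSS white-space property: it keeps the user's newlines and runs of spaces intact while still soft-wrapping long lines. A minimal sketch of applying it from script — the .message class name is an assumption, not taken from the repository:

```typescript
// Minimal sketch (assumed ".message" class): preserve whitespace in rendered
// chat messages. Equivalent to the CSS rule `.message { white-space: pre-wrap; }`.
document.querySelectorAll<HTMLElement>(".message").forEach((el) => {
  // pre-wrap keeps newlines and repeated spaces, but still wraps long lines
  el.style.whiteSpace = "pre-wrap";
});
```
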
060d522f6c  2023-05-24 21:06:08 +02:00
  Merge pull request #7 from ChaoticByte/dependabot/pip/llama-cpp-python-server--0.1.54

  Bump llama-cpp-python[server] from 0.1.50 to 0.1.54 (requires re-quantized models using ggml v3)

1718520de9  2023-05-24 18:01:24 +00:00  (dependabot[bot])
  Bump llama-cpp-python[server] from 0.1.50 to 0.1.54

  Bumps [llama-cpp-python[server]](https://github.com/abetlen/llama-cpp-python) from 0.1.50 to 0.1.54.
  - [Release notes](https://github.com/abetlen/llama-cpp-python/releases)
  - [Commits](https://github.com/abetlen/llama-cpp-python/compare/v0.1.50...v0.1.54)

  ---
  updated-dependencies:
  - dependency-name: llama-cpp-python[server]
    dependency-type: direct:production
    update-type: version-update:semver-patch
  ...

  Signed-off-by: dependabot[bot] <support@github.com>

22ba6239c7  2023-05-19 00:19:27 +02:00
  Added a toggle button for the sidebar, implemented a responsive design (fixes #4), and made further minor improvements to the frontend

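A sidebar toggle usually amounts to flipping a CSS class from a button's click handler; a hypothetical sketch, where the element ids and the "collapsed" class are assumed names rather than the repository's actual markup:

```typescript
// Hypothetical sketch: toggle the sidebar by flipping a CSS class.
// "#sidebar", "#sidebar-toggle", and "collapsed" are assumed names.
const sidebar = document.querySelector<HTMLElement>("#sidebar");
const toggleButton = document.querySelector<HTMLButtonElement>("#sidebar-toggle");

toggleButton?.addEventListener("click", () => {
  // A CSS rule such as `#sidebar.collapsed { display: none; }` (or a
  // slide-out transform inside a media query) does the visual work.
  sidebar?.classList.toggle("collapsed");
});
```
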
63706a3c64  2023-05-18 16:39:20 +02:00
  Updated the README, including the screenshot

43fbe364fb  2023-05-18 16:18:24 +02:00
  Added a profile file for the Vicuna model (#5)

7590c31f89  2023-05-18 15:54:41 +02:00
  Made the frontend more flexible to support models other than just Koala

c3fda61b21  2023-05-18 15:34:34 +02:00
  Made the frontend more flexible to support models other than just Koala

02a142012b  2023-05-18 12:01:30 +02:00
  Updated the screenshot in the README

718f483a75  2023-05-18 11:54:57 +02:00
  Added more parameters to the sidebar: top_k, repeat_penalty, presence_penalty, frequency_penalty (fixes #3)

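These four are standard llama-cpp-python sampling parameters. A sketch of how a frontend might forward them with a completion request — llama-cpp-python[server] exposes an OpenAI-compatible /v1/completions endpoint, but the exact call path and payload shape used by this frontend are assumptions here:

```typescript
// Hypothetical sketch: send the sidebar's sampling parameters along with a
// completion request. Endpoint and payload shape are assumed, not the
// repository's actual API contract.
interface SamplingSettings {
  top_k: number;             // sample only from the k most likely tokens
  repeat_penalty: number;    // penalize recently generated tokens
  presence_penalty: number;  // penalize tokens that have appeared at all
  frequency_penalty: number; // penalize tokens by how often they appeared
}

async function complete(prompt: string, settings: SamplingSettings): Promise<string> {
  const response = await fetch("/v1/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt, ...settings }),
  });
  const data = await response.json();
  return data.choices[0].text; // OpenAI-style completion response
}
```
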
924721863d  2023-05-18 11:16:30 +02:00
  Merge pull request #2 from ChaoticByte/dependabot/pip/llama-cpp-python-server--0.1.50

  Bump llama-cpp-python[server] from 0.1.48 to 0.1.50

19b6162b57  2023-05-18 10:32:31 +02:00
  Minor style improvements and fixes

0a11d89a73  2023-05-15 15:04:03 +00:00  (dependabot[bot])
  Bump llama-cpp-python[server] from 0.1.48 to 0.1.50

  Bumps [llama-cpp-python[server]](https://github.com/abetlen/llama-cpp-python) from 0.1.48 to 0.1.50.
  - [Release notes](https://github.com/abetlen/llama-cpp-python/releases)
  - [Commits](https://github.com/abetlen/llama-cpp-python/compare/v0.1.48...v0.1.50)

  ---
  updated-dependencies:
  - dependency-name: llama-cpp-python[server]
    dependency-type: direct:production
    update-type: version-update:semver-patch
  ...

  Signed-off-by: dependabot[bot] <support@github.com>

3ef9099407  2023-05-11 09:01:35 +02:00
  Pinned all pip dependencies in requirements.txt and added a Dependabot configuration

7154fc276d  2023-05-11 08:35:56 +02:00
  Fixed #1 by changing the import statement, using a factory function, and pinning llama-cpp-python to 0.1.48

b1da9fb0e9  2023-04-30 20:43:36 +02:00
  Added the version of Python to the Dependencies section in the README

a03f51f921  2023-04-30 19:36:46 +02:00
  Documented CLI arguments and dependencies in the README

bfb8b6baf2  2023-04-30 19:34:11 +02:00
  Auto-resize the input field after a message is sent

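Auto-resizing is typically done by resetting a textarea's height and then matching it to its scroll height. A sketch under the assumption that the input is a textarea; the "#chat-input" id is made up:

```typescript
// Hypothetical sketch: size a textarea to fit its content.
// "#chat-input" is an assumed id, not taken from the repository.
const input = document.querySelector<HTMLTextAreaElement>("#chat-input");

function autoResize(el: HTMLTextAreaElement): void {
  el.style.height = "auto";                 // reset first so the field can shrink
  el.style.height = `${el.scrollHeight}px`; // then grow to fit the content
}

if (input) {
  input.addEventListener("input", () => autoResize(input));
  // After a message is sent the field is cleared, so running autoResize(input)
  // again collapses it back to a single line.
}
```
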
cbdccf3b92  2023-04-30 12:49:13 +02:00
  Added screenshot to README

aca00bd214  2023-04-30 12:41:11 +02:00
  Added a button to reset the settings; also disable the sidebar inputs/buttons while waiting for a reply

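Locking the controls while a request is in flight is one `disabled` assignment per element; a hypothetical sketch with assumed selector names:

```typescript
// Hypothetical sketch: lock the sidebar controls while a reply is pending.
// The "#sidebar" container id is assumed.
function setSidebarEnabled(enabled: boolean): void {
  document
    .querySelectorAll<HTMLInputElement | HTMLButtonElement>("#sidebar input, #sidebar button")
    .forEach((el) => {
      el.disabled = !enabled; // disabled controls also get the browser's greyed-out styling
    });
}

// Usage around a request:
//   setSidebarEnabled(false);
//   await complete(prompt, settings); // see the earlier completion sketch
//   setSidebarEnabled(true);
```
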
1bd108b4cc  2023-04-30 12:25:04 +02:00
  Automatically scroll to bottom when a new message is added

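The standard trick is to set the container's scrollTop to its scrollHeight right after appending a message; a sketch with an assumed container id:

```typescript
// Hypothetical sketch: keep the newest message in view.
// "#chat-history" is an assumed id, not taken from the repository.
function scrollToBottom(): void {
  const history = document.querySelector<HTMLElement>("#chat-history");
  if (history) {
    // Setting scrollTop to scrollHeight pins the view to the bottom;
    // call this right after appending a new message element.
    history.scrollTop = history.scrollHeight;
  }
}
```
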
cd6036eff4  2023-04-30 12:16:48 +02:00
  Initial release

9e3f7b8a93  2023-04-30 12:02:27 +02:00
  Initial commit