ChaoticByte / Eucalyptus-Chat
This repository has been archived on 2025-09-28. You can view files and clone it, but you cannot make any changes to its state, such as pushing and creating new issues, pull requests, or comments.
Eucalyptus-Chat / requirements.txt (3 lines, 62 B) at commit 0a11d89a73
Bump llama-cpp-python[server] from 0.1.48 to 0.1.50

Bumps [llama-cpp-python[server]](https://github.com/abetlen/llama-cpp-python) from 0.1.48 to 0.1.50.
- [Release notes](https://github.com/abetlen/llama-cpp-python/releases)
- [Commits](https://github.com/abetlen/llama-cpp-python/compare/v0.1.48...v0.1.50)

updated-dependencies:
- dependency-name: llama-cpp-python[server]
  dependency-type: direct:production
  update-type: version-update:semver-patch

Signed-off-by: dependabot[bot] <support@github.com>
2023-05-15 15:04:03 +00:00
llama-cpp-python[server]==0.1.50
Pinned all pip dependencies in requirements.txt and added dependabot configuration
2023-05-11 09:01:35 +02:00
uvicorn==0.22.0
sanic==23.3.0
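The `llama-cpp-python[server]` extra pinned above ships an OpenAI-compatible HTTP server (typically started with `python -m llama_cpp.server` and listening on localhost:8000). As a minimal sketch of how a client might talk to it, the snippet below builds a JSON body for the `/v1/completions` endpoint; the URL, prompt, and parameter values are illustrative assumptions, not taken from this repository.

```python
import json

# Assumed default address of the llama-cpp-python server; adjust to your setup.
API_URL = "http://localhost:8000/v1/completions"

def build_completion_request(prompt: str, max_tokens: int = 64) -> str:
    """Serialize a completion request body for the OpenAI-compatible endpoint.

    Only common parameters are shown; the server accepts more
    (temperature, stop sequences, streaming, etc.).
    """
    payload = {
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": 0.7,  # illustrative value
    }
    return json.dumps(payload)

body = build_completion_request("Hello from Eucalyptus-Chat!")
print(body)
```

The body could then be POSTed to `API_URL` with any HTTP client (e.g. `urllib.request` or `requests`) using a `Content-Type: application/json` header.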