Bump llama-cpp-python[server] from 0.1.50 to 0.1.54 #7
Reference: ChaoticByte/Eucalyptus-Chat#7
Bumps llama-cpp-python[server] from 0.1.50 to 0.1.54.
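In a pinned requirements file, the bump amounts to a one-line version change (the file name `requirements.txt` is an assumption; the actual file in this repo may differ):

```diff
-llama-cpp-python[server]==0.1.50
+llama-cpp-python[server]==0.1.54
```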
Commits
- e5d596e Bump version
- c41b1eb Update llama.cpp
- aa3d7a6 Merge pull request #263 from abetlen/dependabot/pip/mkdocs-material-9.1.14
- 2240b94 Bump mkdocs-material from 9.1.12 to 9.1.14
- 01c79e7 Merge pull request #258 from Pipboyguy/main
- c3e80b1 Merge pull request #262 from abetlen/dependabot/pip/httpx-0.24.1
- 8e41d72 Bump httpx from 0.24.0 to 0.24.1
- e6639e6 Change docker build dynamic param to image instead of cuda version
- 4f7a6da Merge pull request #248 from localagi/main
- 0adb9ec Use model_name and index in response

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.

Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- @dependabot rebase will rebase this PR
- @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
- @dependabot merge will merge this PR after your CI passes on it
- @dependabot squash and merge will squash and merge this PR after your CI passes on it
- @dependabot cancel merge will cancel a previously requested merge and block automerging
- @dependabot reopen will reopen this PR if it is closed
- @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

llama.cpp changed their file format again, so existing models have to be requantized. See https://github.com/ggerganov/llama.cpp/pull/1508
-> new models are in the ggml3 format
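Since the format change breaks previously quantized models, it can help to check a model file's header before loading it. A minimal sketch, assuming the "ggjt" magic and file version 3 that llama.cpp used for the ggml3 format around PR #1508 (these constants are assumptions, not taken from this repo):

```python
import struct

# Assumed header constants from llama.cpp's model loader (circa PR #1508):
# the magic b"ggjt" read as a little-endian uint32, followed by a uint32
# file version, where version 3 corresponds to the ggml3 format.
LLAMA_FILE_MAGIC_GGJT = 0x67676A74  # b"ggjt"
LLAMA_FILE_VERSION_GGJT_V3 = 3

def is_ggml3_model(path: str) -> bool:
    """Return True if the file starts with a ggjt version-3 (ggml3) header."""
    with open(path, "rb") as f:
        header = f.read(8)
    if len(header) < 8:
        return False
    magic, version = struct.unpack("<II", header)
    return magic == LLAMA_FILE_MAGIC_GGJT and version == LLAMA_FILE_VERSION_GGJT_V3
```

Running this over a model directory before starting the server would flag files that still need requantizing.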