Bump llama-cpp-python[server] from 0.1.50 to 0.1.54 #7

Merged
dependabot[bot] merged 1 commit from dependabot/pip/llama-cpp-python-server--0.1.54 into main 2023-05-24 19:06:08 +00:00
dependabot[bot] commented 2023-05-24 18:01:25 +00:00 (Migrated from github.com)

Bumps llama-cpp-python[server] from 0.1.50 to 0.1.54.

Commits
  • e5d596e Bump version
  • c41b1eb Update llama.cpp
  • aa3d7a6 Merge pull request #263 from abetlen/dependabot/pip/mkdocs-material-9.1.14
  • 2240b94 Bump mkdocs-material from 9.1.12 to 9.1.14
  • 01c79e7 Merge pull request #258 from Pipboyguy/main
  • c3e80b1 Merge pull request #262 from abetlen/dependabot/pip/httpx-0.24.1
  • 8e41d72 Bump httpx from 0.24.0 to 0.24.1
  • e6639e6 Change docker build dynamic param to image instead of cuda version
  • 4f7a6da Merge pull request #248 from localagi/main
  • 0adb9ec Use model_name and index in response
  • Additional commits viewable in the compare view (https://github.com/abetlen/llama-cpp-python/compare/v0.1.50...v0.1.54)

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
ChaoticByte commented 2023-05-24 18:59:57 +00:00 (Migrated from github.com)

llama.cpp changed its model file format again; existing models have to be requantized. See https://github.com/ggerganov/llama.cpp/pull/1508
-> new models use the ggml v3 format
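A quick way to tell whether a local model file predates the new format is to read its header. This is a sketch, not part of the PR: the magic values and version layout below are assumptions based on the llama.cpp source around this release, and may differ in later versions.

```python
import struct

# Known llama.cpp model-file magics (assumption: taken from the llama.cpp
# source around v0.1.54-era releases; later formats such as GGUF differ).
MAGICS = {
    0x67676D6C: "ggml",  # oldest, unversioned format
    0x67676D66: "ggmf",  # versioned format
    0x67676A74: "ggjt",  # versioned, mmap-able format
}


def ggml_file_version(path):
    """Return (format_name, version) for a llama.cpp model file.

    Unversioned 'ggml' files report version 0; the other formats store a
    little-endian uint32 version immediately after the 4-byte magic.
    """
    with open(path, "rb") as f:
        (magic,) = struct.unpack("<I", f.read(4))
        name = MAGICS.get(magic)
        if name is None:
            raise ValueError(f"not a recognized ggml model file (magic {magic:#x})")
        if name == "ggml":
            return name, 0
        (version,) = struct.unpack("<I", f.read(4))
        return name, version
```

Under these assumptions, a file reporting `("ggjt", 3)` is in the post-#1508 format, while anything older would need to be requantized.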

This repository is archived.