Bump llama-cpp-python[server] from 0.1.54 to 0.1.55 #11

Closed
dependabot[bot] wants to merge 1 commit from dependabot/pip/llama-cpp-python-server--0.1.55 into main
dependabot[bot] commented 2023-05-29 15:04:07 +00:00 (Migrated from github.com)

Bumps llama-cpp-python[server] from 0.1.54 to 0.1.55.
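For reference, the change amounts to updating the version pin for the package. A minimal sketch of what the pinned entry would look like after this PR; the exact file name and location in this repository (e.g. requirements.txt vs. pyproject.toml) are assumptions, since the diff itself is not shown here:

```
# requirements.txt (hypothetical location; the actual pinned file in this repo may differ)
llama-cpp-python[server]==0.1.55   # bumped from 0.1.54 by this PR
```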

Commits
  • 6075e17 Bump version
  • 2adf6f3 Merge pull request #265 from dmahurin/fix-from-bytes-byteorder
  • 34ad71f Merge pull request #274 from dmahurin/fix-missing-antiprompt
  • d78453c Merge pull request #264 from dmahurin/fix-min-keep
  • 4c1b7f7 Bugfix for logits_processor and stopping_criteria
  • 0fa2ec4 low_level_api_chat_cpp.py: Fix missing antiprompt output in chat.
  • 433a2e3 Add extra logits_processor and stopping_criteria
  • 30bf8ec Update llama.cpp
  • f74b90e Fix streaming hang on last token when cache is on.
  • 5be8354 Added tokenizer
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
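Update PRs like this one are generated from a Dependabot configuration that is not part of this PR. A minimal sketch of a typical `.github/dependabot.yml` for a pip project follows; the directory and schedule are assumptions, not taken from this repository:

```yaml
# .github/dependabot.yml -- illustrative only; this repo's actual config is not shown in the PR
version: 2
updates:
  - package-ecosystem: "pip"   # watch pip manifests such as requirements.txt
    directory: "/"             # location of the manifest (assumed)
    schedule:
      interval: "daily"        # update cadence (assumed)
```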
dependabot[bot] commented 2023-05-30 07:53:15 +00:00 (Migrated from github.com)

Superseded by #12.
