v4.0 (commit 060d522f6c)

    Version 4.0 Stable

    ChaoticByte released this 2023-05-24 19:06:08 +00:00

    BREAKING! Bumped llama-cpp-python[server] from 0.1.50 to 0.1.54

    This again requires re-quantized models: the new file format is GGML v3. See https://github.com/ggerganov/llama.cpp/pull/1508
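
    Since older GGML files will no longer load, models have to be converted and quantized again with the matching llama.cpp tooling. A minimal sketch of that workflow (the model path `models/7B` and the `q4_0` quantization type are placeholders; adjust both to your setup):

    ```shell
    # Pin the server package to the version this release targets
    pip install "llama-cpp-python[server]==0.1.54"

    # Re-convert the original (e.g. HF/PyTorch) weights to GGML,
    # then quantize with a llama.cpp build that produces GGML v3 files.
    # Paths and quantization type below are examples, not fixed values.
    python convert.py models/7B/
    ./quantize models/7B/ggml-model-f16.bin models/7B/ggml-model-q4_0.bin q4_0
    ```

    Quantized files produced by pre-v3 llama.cpp builds cannot simply be renamed; they must be regenerated from the original weights.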