AttributeError: module 'llama_cpp.server' has no attribute 'app' #1

Closed
opened 2023-05-09 22:55:03 +00:00 by xysmalobia · 3 comments
xysmalobia commented 2023-05-09 22:55:03 +00:00 (Migrated from github.com)

This looks great, but I seem to have an issue with importing the llama-cpp api server.

File "~/Eucalyptus-Chat/api-server.py", line 22, in
uvicorn.run(server.app, host=args.host, port=args.port)
AttributeError: module 'llama_cpp.server' has no attribute 'app'

I've tried referring to app.py in llama-cpp-python/llama_cpp/server directly, and then the API server starts but throws a module error. Not sure if this is a bug or something strange on my system.

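For context, a minimal sketch of the calling pattern the traceback points at (the actual api-server.py isn't shown here, so the argument parsing around it is assumed):

```python
# Minimal sketch of the failing pattern (assumed from the traceback above,
# not the actual Eucalyptus-Chat api-server.py): the script imports
# llama_cpp.server and expects it to expose a ready-made ASGI app.
import argparse

import uvicorn
from llama_cpp import server

parser = argparse.ArgumentParser()
parser.add_argument("--host", default="0.0.0.0")
parser.add_argument("--port", type=int, default=8080)
args = parser.parse_args()

# Raises AttributeError on newer llama-cpp-python releases, because the
# package no longer exposes a module-level `app` object at this path.
uvicorn.run(server.app, host=args.host, port=args.port)
```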
ChaoticByte commented 2023-05-11 06:32:44 +00:00 (Migrated from github.com)

I was able to reproduce the error. It seems the developer of [llama-cpp-python](https://github.com/abetlen/llama-cpp-python) made an [API change](https://github.com/abetlen/llama-cpp-python/commit/9eafc4c49aa4d1dbd3cf58c73c753382a821800f) without raising the major version of the pip package.

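For anyone hitting this, a rough sketch of the newer factory-style API (names are taken from the llama-cpp-python server module as I understand it; treat them as assumptions and check your installed version):

```python
# Hedged sketch: on newer llama-cpp-python versions the FastAPI app is built
# by a factory function instead of being importable as `llama_cpp.server.app`.
import uvicorn
from llama_cpp.server.app import create_app, Settings  # assumed module layout

# Settings mirrors the Llama constructor options; `model` is the path to the
# GGML model file (placeholder path below).
app = create_app(settings=Settings(model="./models/ggml-model-q4_0.bin"))

uvicorn.run(app, host="0.0.0.0", port=8000)
```

Until a fix lands here, pinning llama-cpp-python to the last known-working version in requirements.txt is a quick workaround against the breaking change.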
ChaoticByte commented 2023-05-11 06:36:52 +00:00 (Migrated from github.com)

Should be fixed now :)

xysmalobia commented 2023-05-11 21:46:58 +00:00 (Migrated from github.com)

working perfectly now, thank you!
