[Script request] llama.cpp (standalone or as part of openwebui installer) #2403
Replies: 3 comments 1 reply
- Is this really active? We only crawl releases, and the last release was six months ago.
- Having llama.cpp-vulkan would be a huge benefit with GPU passthrough and the Vulkan drivers. People have recently been getting good performance on local LLMs with old, cheap used cards. I think it would be a great addition; Ollama has no Vulkan support.
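For reference, a rough sketch of what the build step could look like (assuming the Vulkan SDK/driver is already installed in the container; the flag name is per the current upstream CMake build, so verify against the llama.cpp docs):

```bash
# Build llama.cpp with the Vulkan backend enabled.
# GGML_VULKAN is the current upstream CMake option -- double-check
# against the repo's build docs before scripting this.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build -DGGML_VULKAN=ON
cmake --build build --config Release -j"$(nproc)"

# Run with all layers offloaded to the GPU (model path is a placeholder).
./build/bin/llama-cli -m /models/model.gguf -ngl 99 -p "Hello"
```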
- llama.cpp has just released a web GUI that makes it much more accessible to general or new users, especially those who are GPU poor or, these days, RAM poor. I think it would be a nice option alongside the existing Open WebUI and ComfyUI choices.
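The GUI ships with llama-server, which also serves the model itself; a minimal launch sketch (model path, host, and port are placeholders):

```bash
# llama-server hosts the built-in web UI and an OpenAI-compatible
# API on the same port.
./build/bin/llama-server -m /models/model.gguf --host 0.0.0.0 --port 8080
# The GUI is then reachable in a browser at http://<container-ip>:8080/
```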
Application Name
llama.cpp
Website
https://github.com/ggerganov/llama.cpp
Description
llama.cpp is the C/C++ LLM inference engine that Ollama itself is built on; it is faster and more efficient on many hardware setups, and its built-in server exposes an OpenAI-compatible API. It would be awesome to be able to deploy this as part of the openwebui installer, like we can for ollama now, or to install it in a separate LXC using a new script.
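Since llama-server speaks the OpenAI-compatible API, Open WebUI could point at it directly; a quick smoke-test sketch (host, port, and model name are placeholders):

```bash
# Query the chat completions endpoint exposed by llama-server.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "local",
        "messages": [{"role": "user", "content": "Say hello."}]
      }'
```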
Due Diligence