Since this repo hasn’t been maintained in over 6 months and I couldn’t get in touch with the original author (@abetlen) via issues or socials, I’ve started a maintained fork:
👉 https://github.com/inference-sh/llama-cpp-python
Much respect to @abetlen — their work made Python integration with llama.cpp accessible and clean.
The fork includes:
• Updates to the latest llama.cpp (as of yesterday)
• Fixed bindings, memory layout, and runtime compatibility
• Plans to track upstream closely and add support for new models
• Upcoming improvements for ROCm, cross-platform support, and stability
Contributions welcome. Also looking for co-maintainers with Python/C++ experience and familiarity with repo lifecycle management.
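If anyone wants to kick the tires, installing straight from git (e.g. `pip install git+https://github.com/inference-sh/llama-cpp-python`) should work provided you have a C/C++ toolchain and CMake available, since the package builds llama.cpp from source. Below is a minimal smoke-test sketch, assuming the fork keeps the upstream `llama_cpp` module and `Llama` API as a drop-in replacement; the model path is just a placeholder for whatever GGUF file you have locally.

```python
# Minimal smoke test, assuming the fork is a drop-in replacement for
# upstream llama-cpp-python (same module and class names).
from llama_cpp import Llama

# Placeholder path -- point this at any GGUF model you have on disk.
llm = Llama(model_path="./models/your-model.gguf", n_ctx=2048)

# Run a short completion to confirm the bindings load and inference runs.
out = llm("Q: What is the capital of France? A:", max_tokens=16)
print(out["choices"][0]["text"])
```

If that runs end to end without crashing, the bindings and memory layout fixes are doing their job on your platform; issue reports with OS, compiler, and GPU backend details are especially helpful.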