
🚀 MAINTAINED FORK: inference-sh/llama-cpp-python – Active, Up-to-date, Contributors Welcome #2033

Open
@okaris

Description

Since this repo hasn’t been maintained in over six months and I couldn’t reach the original author (@abetlen) via issues or social media, I’ve started a maintained fork:
👉 https://github.com/inference-sh/llama-cpp-python

Much respect to @abetlen — their work made Python integration with llama.cpp accessible and clean.

The fork includes:
• Updates to the latest llama.cpp (as of yesterday)
• Fixed bindings, memory layout, and runtime compatibility
• Plans to track upstream closely and add support for new models
• Upcoming improvements for ROCm, cross-platform support, and stability

Contributions are welcome. I’m also looking for co-maintainers with Python/C++ experience and familiarity with repository lifecycle management.
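For anyone who wants to try the fork, a typical source install might look like the sketch below (assuming the fork URL above and a local C/C++ toolchain, since llama.cpp is compiled during installation; the CMake flag is an illustrative option, not a requirement):

```shell
# Install the fork straight from Git; llama.cpp is built from source,
# so a C/C++ compiler and CMake must be available on the system.
pip install git+https://github.com/inference-sh/llama-cpp-python

# Optional: pass backend-specific build flags through CMAKE_ARGS,
# e.g. to enable a GPU backend supported by llama.cpp.
# CMAKE_ARGS="-DGGML_CUDA=on" pip install git+https://github.com/inference-sh/llama-cpp-python
```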
