AI Copilot for Vim/NeoVim
Updated Feb 28, 2025 - Python
A high-performance API server that exposes OpenAI-compatible endpoints for MLX models. Built in Python on the FastAPI framework, it offers an efficient, scalable, and user-friendly way to run MLX-based vision and language models locally behind an OpenAI-compatible interface.
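Because the server speaks the OpenAI Chat Completions format, any OpenAI-style client can talk to it. A minimal sketch of building such a request body follows; the host, port, and model name are illustrative assumptions, not taken from this page.

```python
import json

def build_chat_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-style /v1/chat/completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "temperature": 0.7,
    }

# Example payload for a locally served MLX model (model name is hypothetical).
payload = build_chat_request(
    "mlx-community/Llama-3.2-3B-Instruct-4bit",
    "Explain MLX in one sentence.",
)
print(json.dumps(payload, indent=2))
# POST this JSON to the server's /v1/chat/completions endpoint
# (e.g. http://localhost:8080 -- the port is an assumption).
```

Since the wire format matches OpenAI's, existing SDKs can be pointed at the local server by overriding the base URL.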
Build an Autonomous Web3 AI Trading Agent (BASE + Uniswap V4 example)
Add MLX support to Pydantic AI through LM Studio or mlx-lm, and run MLX-compatible Hugging Face models on Apple silicon.
Various LLM resources and experiments
A comprehensive toolkit for end-to-end continued pre-training, fine-tuning, monitoring, testing, and publishing of language models with MLX-LM.
LLM inference on Apple Silicon Macs using the Apple MLX framework.
An MLX inference service compatible with the OpenAI API, built on MLX-LM and MLX-VLM.