vllm-mlx
OpenAI- and Anthropic-compatible server for Apple Silicon. Run LLMs and vision-language models (Llama, Qwen-VL, LLaVA) with continuous batching, MCP tool calling, and multimodal support. Native MLX backend, 400+ tok/s. Works with Claude Code.
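Because the server exposes an OpenAI-compatible API, any standard OpenAI client or plain HTTP request can target it by pointing at the local endpoint. A minimal sketch of the request body for a chat completion (the base URL and model id below are illustrative assumptions, not confirmed by this listing):

```python
import json

# Hypothetical local address where the vllm-mlx server is listening.
BASE_URL = "http://localhost:8000/v1"

# Body that would be POSTed to f"{BASE_URL}/chat/completions".
# The model id is an example placeholder; use whatever model you loaded.
payload = {
    "model": "mlx-community/Llama-3.2-3B-Instruct-4bit",
    "messages": [
        {"role": "user", "content": "Hello from Apple Silicon!"}
    ],
    "stream": False,
}

body = json.dumps(payload)
print(body)
```

The same payload shape works with the official `openai` Python client by passing `base_url=BASE_URL` when constructing the client, since the server mimics the OpenAI wire format.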
Package details
anthropic, apple silicon, audio processing, claude code, computer vision, image understanding, inference, llm, machine learning, macos, mllm, mlx, multimodal ai, speech to text, stt, text to speech, tts, video understanding, vision language model, vllm