
vllm-mlx

OpenAI- and Anthropic-compatible server for Apple Silicon. Run LLMs and vision-language models (Llama, Qwen-VL, LLaVA) with continuous batching, MCP tool calling, and multimodal support. Native MLX backend, 400+ tok/s. Works with Claude Code.


About this MCP

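Because the server exposes an OpenAI-compatible endpoint, any standard OpenAI-style client can talk to it. A minimal sketch of building such a request with only the Python standard library, assuming the server listens on `http://localhost:8000` with the conventional `/v1/chat/completions` path; the port and the model name are placeholders, not values documented by this listing:

```python
import json
import urllib.request

def chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for a local server.

    The base_url, path, and model name below are assumptions for
    illustration; check the vllm-mlx docs for the actual values.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Hypothetical local endpoint and model identifier:
req = chat_request(
    "http://localhost:8000",
    "mlx-community/Llama-3.2-3B-Instruct-4bit",
    "Hello",
)
# Send with urllib.request.urlopen(req) once the server is running.
```

The same request shape works from Claude Code or any tool that speaks the OpenAI chat API, by pointing its base URL at the local server.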

Package details

Author: registry
License: Unknown
Last updated:
Tags: anthropic, apple silicon, audio processing, claude code, computer vision, image understanding, inference, llm, machine learning, macos, mllm, mlx, multimodal ai, speech to text, stt, text to speech, tts, video understanding, vision language model, vllm