AI & ML
Ollama — Run local LLMs via an Ollama server.
Install: mcpizy install ollama
Run directly: npx -y ollama-mcp
Client configuration:
{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": ["-y", "ollama-mcp"]
    }
  }
}
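The JSON snippet above is merged into the MCP client's existing configuration rather than replacing it, so previously registered servers keep working. A minimal Python sketch of that merge (the `filesystem` entry is a hypothetical pre-existing server, and the config filename varies by client, e.g. Claude Desktop uses `claude_desktop_config.json`):

```python
import json

# Hypothetical existing client config with one server already registered.
config = {
    "mcpServers": {
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem"],
        }
    }
}

# The Ollama entry from the listing above.
ollama_entry = {"ollama": {"command": "npx", "args": ["-y", "ollama-mcp"]}}

# Merge into "mcpServers" without clobbering servers that are already there.
config["mcpServers"].update(ollama_entry)

print(json.dumps(config, indent=2))
```

After the merge, the config contains both the original `filesystem` server and the new `ollama` one; writing the result back to the client's config file is left out of the sketch.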
Other servers in this category:
- GPT models, DALL-E, and Whisper via the OpenAI API
- Models, datasets, and spaces on Hugging Face
- Run ML models in the cloud via API
- Image generation with Stable Diffusion