In depth
Function calling lets an LLM transform 'I need to get the weather' into a structured call: `{ name: 'get_weather', args: { location: 'Paris' } }`. The model is trained to choose between free-text output and structured function invocation based on context. When it picks a function, the host executes it and feeds the result back to the model.
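That loop can be sketched in a few lines. The shapes below are simplified stand-ins, not any real provider SDK (field names differ between OpenAI, Anthropic, and others), and `get_weather` is a hypothetical stub:

```typescript
// Hypothetical shapes -- real provider SDKs use different field names.
type ToolCall = { name: string; args: Record<string, unknown> };
type ModelReply =
  | { kind: "text"; text: string }
  | { kind: "tool_call"; call: ToolCall };

// A stand-in for a model response: here the model chose a structured call
// over free-text output.
const reply: ModelReply = {
  kind: "tool_call",
  call: { name: "get_weather", args: { location: "Paris" } },
};

// Host-side tool registry; get_weather is a stub for illustration.
const tools: Record<string, (args: Record<string, unknown>) => string> = {
  get_weather: (args) => `Sunny in ${args.location}`,
};

function handle(reply: ModelReply): string {
  if (reply.kind === "tool_call") {
    // Execute the tool; in a real loop this result is appended to the
    // conversation and sent back to the model for a final answer.
    return tools[reply.call.name](reply.call.args);
  }
  return reply.text;
}

console.log(handle(reply)); // "Sunny in Paris"
```

The key point is that the decision (text vs. tool call) belongs to the model, while execution belongs entirely to the host.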
OpenAI introduced function calling in June 2023 as a way to bridge chat models and real-world actions. Anthropic followed with 'tool use', which has similar semantics. Gemini, Mistral, and nearly every modern frontier model now support it. The API shapes differ slightly, but the concept is universal.
Function calling operates at the model-API level — the LLM emits tool calls and the client executes them. MCP sits one layer up: it standardizes WHERE those tools come from. An MCP tool IS a function the LLM can call; the MCP server provides the implementation.
Think of function calling as the 'language' and MCP as the 'library'. They work together: Claude emits function calls in Anthropic's format, and the host translates them into MCP `tools/call` requests routed to the right server.
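That translation step is mostly re-enveloping: the tool name and arguments pass through unchanged, while the wrapper changes from a provider-specific content block to a JSON-RPC 2.0 `tools/call` request. The sketch below assumes a simplified Anthropic-style `tool_use` block; the MCP request shape (`method: "tools/call"` with `name` and `arguments` params) follows the MCP specification:

```typescript
// Simplified Anthropic-style tool_use content block (illustrative fields).
type AnthropicToolUse = {
  type: "tool_use";
  id: string;
  name: string;
  input: Record<string, unknown>;
};

// MCP tools/call request: JSON-RPC 2.0, method "tools/call".
type McpToolsCall = {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
};

let nextId = 1;

// The host's translation step: same tool name and arguments, new envelope.
function toMcpRequest(block: AnthropicToolUse): McpToolsCall {
  return {
    jsonrpc: "2.0",
    id: nextId++,
    method: "tools/call",
    params: { name: block.name, arguments: block.input },
  };
}

const block: AnthropicToolUse = {
  type: "tool_use",
  id: "toolu_01",
  name: "get_weather",
  input: { location: "Paris" },
};

console.log(JSON.stringify(toMcpRequest(block), null, 2));
```

Because the payload is structurally identical, a host can support any model provider on one side and any MCP server on the other with only this thin adapter in between.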