— Concepts
Function Calling
An LLM feature that lets the model invoke your code/APIs in a structured way.
Also known as: Tool use · Tool calling · Function invocation
What is Function Calling?
Function calling (or "tool use" in Anthropic's vocabulary) is the mechanism by which an LLM decides to invoke a function you have defined, such as get_weather(city) or send_email(to, body), and returns a structured JSON payload of arguments instead of free text. You execute the function, send the result back, and the model continues the conversation. It is the core primitive behind modern AI agents, voice assistants, and workflow automations built on Claude, GPT, or Gemini: function calling is what turns an LLM from a chatbot into something that can DO things.
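The loop described above can be sketched in plain Python. This is a minimal, provider-agnostic simulation: the tool schema follows the JSON Schema style used by the major APIs, but `get_weather`, its stubbed return values, and the exact payload shape (`{"name": ..., "arguments": ...}`) are illustrative assumptions, not any vendor's wire format.

```python
import json

# A hypothetical tool the model may call.
def get_weather(city: str) -> dict:
    # Stub: a real implementation would call a weather API.
    return {"city": city, "temp_c": 21, "conditions": "sunny"}

# Tool definitions advertised to the model (JSON Schema style,
# similar in spirit to the OpenAI and Anthropic tool formats).
TOOLS = {
    "get_weather": {
        "description": "Get current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
        "fn": get_weather,
    }
}

def handle_tool_call(model_output: str) -> str:
    """Dispatch a model's tool call: parse the structured JSON payload,
    run the matching function, and return the result as JSON text that
    would be fed back to the model for its next turn."""
    call = json.loads(model_output)
    tool = TOOLS[call["name"]]
    result = tool["fn"](**call["arguments"])  # execute your code
    return json.dumps(result)

# Simulated model turn: instead of free text, the model emits arguments
# matching the schema it was shown.
tool_result = handle_tool_call(
    '{"name": "get_weather", "arguments": {"city": "Paris"}}'
)
print(tool_result)  # {"city": "Paris", "temp_c": 21, "conditions": "sunny"}
```

In a real integration the model output and the returned result travel over the provider's API; the dispatch step in the middle, parse, execute, serialize, is the part you own.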
— Related
Terms connected to Function Calling
Tool Use: An AI's ability to call external tools (APIs, code, search) instead of just generating text.
AI Agent: An AI system that decides its own next action and takes multi-step actions autonomously.
Structured Output: Forcing an LLM to return JSON / typed data that matches a schema you define.
MCP (Model Context Protocol): Anthropic's open standard for connecting AI assistants to external tools and data.
Agentic AI: AI systems that plan, decide, and act on multi-step goals with minimal human oversight.
From definitions to deployed projects.
Knowing what a term means is step one. ONROL's AI Generalist track gets you shipping projects that use it.
