What Is Function Calling?
LLM Function Calling
Function calling lets LLMs invoke external functions or APIs based on conversation context. Instead of just generating text, the model can decide to call a weather API, search a database, or execute code — then use the results to formulate its response.
How Function Calling Works
You define available functions with names, descriptions, and parameter schemas. When the user asks "What's the weather in Tokyo?", the model returns a structured function call: {"name": "get_weather", "arguments": {"city": "Tokyo"}}. Your code executes the function and feeds the result back to the model.
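The execute-and-feed-back step above can be sketched in a few lines. This is a minimal illustration, not any provider's SDK: `get_weather` is a hypothetical stand-in for a real API, and the `model_call` dict mimics the structured output a model would return.

```python
import json

# Hypothetical tool implementation; a real version would call a weather API.
def get_weather(city):
    return {"city": city, "temp_c": 18, "conditions": "clear"}

# Registry mapping tool names (as the model sees them) to Python callables.
TOOLS = {"get_weather": get_weather}

def execute_tool_call(call):
    """Dispatch a structured tool call of the form the model returns."""
    func = TOOLS[call["name"]]
    result = func(**call["arguments"])
    # Serialize the result so it can be appended to the conversation
    # as a tool message for the model's next turn.
    return json.dumps(result)

# Simulated model output for "What's the weather in Tokyo?"
model_call = {"name": "get_weather", "arguments": {"city": "Tokyo"}}
print(execute_tool_call(model_call))
```

In a real application this runs in a loop: send the conversation plus tool schemas to the model, execute any tool calls it returns, append the results, and repeat until the model replies with plain text.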
Key Concepts
- Tool Schema — JSON schema defining available functions — name, description, and typed parameters
- Tool Choice — Whether the model must call a tool, can choose to, or should never call one — configurable per request
- Parallel Tool Calls — Some models can invoke multiple functions simultaneously for efficiency
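As a concrete example of the first concept, here is what a tool schema typically looks like. The field layout below follows the JSON-Schema-based style most providers accept (the nesting here mirrors OpenAI's `tools` format; other APIs arrange the same information slightly differently):

```python
# A tool schema: name, description, and typed parameters.
# The model uses the descriptions to decide when and how to call the tool.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "City name, e.g. 'Tokyo'",
                },
            },
            "required": ["city"],
        },
    },
}
```

Tool choice is then usually a separate request parameter alongside the schema list, e.g. forcing a specific tool, allowing the model to decide, or disabling tools entirely.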
Function Calling Educators
@leilagharani
Excel. Power Query. Copilot. ChatGPT. Power BI. PowerPoint. You use them every day to automate Excel and your work - s...
@openai
OpenAI’s mission is to ensure that artificial general intelligence benefits all of humanity.
@academind
There's always something to learn! We've been creating courses and tutorials on tech-related topics since 2016! We teach develop...
Frequently Asked Questions
Is function calling the same as MCP?
Function calling is the LLM capability. MCP is a protocol for organizing and serving tools to AI clients. MCP uses function calling under the hood, but adds discovery, authentication, and standardization.
Which LLMs support function calling?
GPT-4o, Claude 3+, Gemini, and most major LLMs support function calling. The exact implementation and reliability vary by model.
Want a structured learning path?
Plan a Function Calling Lesson →