1
Developer Defines Tool
Define a Java `Function` or `@Tool` bean with a clear description and an input/output schema. This exposes internal capabilities (such as a database or an external API) to the LLM.
↓
2
App Sends Prompt + Tool Schema
The `ChatClient` sends the user's natural-language prompt along with the function's contract (a JSON Schema) to the LLM.
↓
3
LLM Returns JSON Call
The LLM, recognizing the need for external data, returns a tool-call request (JSON specifying the function name and arguments) rather than the final answer.
↓
4
Spring AI Executes Function
The framework intercepts the JSON, executes the corresponding Java method, retrieves real-time data, and sends the result back to the LLM.
↓
5
LLM Generates Final Answer
With the external data provided, the LLM generates a complete, accurate, and contextually rich response for the user.
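The five steps above can be sketched end to end. The snippet below is a framework-free simulation of what Spring AI automates when a tool is registered: the tool registry, the `ToolCall` record, and the hard-coded weather lookup are all hypothetical stand-ins, and in reality the tool-call request arrives from the model as JSON rather than a pre-parsed object.

```java
import java.util.Map;
import java.util.function.Function;

public class ToolCallFlow {

    // Step 1: an internal capability, registered under a name the LLM can use.
    // (A stand-in for a real @Tool bean calling a weather service.)
    static final Map<String, Function<String, String>> TOOLS = Map.of(
            "currentTemperature", city -> "18°C in " + city
    );

    // Step 3: the LLM answers with a tool-call request, not a final answer.
    // Shown here pre-parsed; the model actually emits JSON like
    // {"name":"currentTemperature","arguments":{"city":"Oslo"}}.
    record ToolCall(String name, String argument) {}

    // Step 4: the framework intercepts the request and runs the Java method.
    static String execute(ToolCall call) {
        return TOOLS.get(call.name()).apply(call.argument());
    }

    public static void main(String[] args) {
        ToolCall fromLlm = new ToolCall("currentTemperature", "Oslo"); // step 3
        String toolResult = execute(fromLlm);                          // step 4
        // Step 5: this result is sent back to the LLM, which then writes
        // the final user-facing answer.
        System.out.println("Tool result for the LLM: " + toolResult);
    }
}
```

In current Spring AI releases the equivalent wiring is a method annotated with `@Tool(description = ...)` on a bean, passed to the fluent API via `ChatClient.prompt(...).tools(...).call().content()`; the framework then performs steps 2 through 4 transparently.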