Tool Calling
Imagine that you'd like to extend your AI Agent with some new capabilities. For example, out of the box, an LLM doesn't know what time it is. If you ask it anyway, you're not going to get a very good answer.
If you want to teach the LLM to tell the time, you need to give it a tool.
Basic Example
final timeTool = Tool(
  name: 'get_time',
  description: 'Get current time in a location',
  inputSchema: JsonSchema.create({
    'type': 'object',
    'properties': {
      'location': {'type': 'string'},
    },
    'required': ['location'],
  }),
  onCall: (args) async {
    // NOTE: insert actual timezone magic here
    return {
      'time': '3:42 PM',
      'location': args['location'],
    };
  },
);
final agent = Agent('openai', tools: [timeTool]);
final result = await agent.send("What time is it in New York?");
print(result.output); // "It's currently 3:42 PM in New York"
Multiple Tools
final agent = Agent('anthropic', tools: [
  weatherTool,
  temperatureConverterTool,
  currentDateTimeTool,
]);

// Single request, multiple tool calls
final result = await agent.send(
  "What's the weather in Paris? Convert to Fahrenheit and tell me the time.",
);
// Agent calls all three tools automatically
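The three tools referenced above aren't defined in this snippet. Here's one possible sketch of temperatureConverterTool, following the same Tool pattern as the basic example; the name, schema, and conversion logic are illustrative, not something the package provides:

final temperatureConverterTool = Tool(
  name: 'celsius_to_fahrenheit',
  description: 'Convert a temperature in Celsius to Fahrenheit',
  inputSchema: JsonSchema.create({
    'type': 'object',
    'properties': {
      'celsius': {'type': 'number'},
    },
    'required': ['celsius'],
  }),
  onCall: (args) async {
    // Plain arithmetic; no external service needed for this one
    final celsius = args['celsius'] as num;
    return {'fahrenheit': celsius * 9 / 5 + 32};
  },
);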
Streaming Tool Calls
await for (final chunk in agent.sendStream(
  "Weather in NYC and Boston?",
)) {
  stdout.write(chunk.output); // Streams response as tools execute
}
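If you also want the complete response text once the stream finishes, one option is to collect the chunks yourself; this only relies on the sendStream and chunk.output API shown above:

final buffer = StringBuffer();
await for (final chunk in agent.sendStream(
  "Weather in NYC and Boston?",
)) {
  stdout.write(chunk.output); // stream to the console as before
  buffer.write(chunk.output); // and keep a copy of the full text
}
final fullResponse = buffer.toString();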
Error Handling
If you throw an error in a tool, it will be caught and returned to the LLM as an error message so it can handle it gracefully (ideally).
final tool = Tool(
  name: 'divide',
  description: 'Divide numbers',
  inputSchema: JsonSchema.create({
    'type': 'object',
    'properties': {
      'a': {'type': 'number'},
      'b': {'type': 'number'},
    },
    'required': ['a', 'b'],
  }),
  onCall: (args) async {
    if (args['b'] == 0) throw Exception('Cannot divide by zero');
    return {'result': args['a'] / args['b']};
  },
);
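With that in place, a request that triggers the error doesn't crash the agent: the exception message is passed back to the LLM, which can then explain the problem. A hypothetical exchange (the exact reply will vary by model):

final agent = Agent('openai', tools: [tool]);
final result = await agent.send('What is 5 divided by 0?');
print(result.output); // e.g. "Division by zero isn't defined, so there's no answer."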
Automatic Schema Generation
Use json_serializable and soti_schema for type-safe tools:
@SotiSchema()
@JsonSerializable()
class WeatherInput {
  WeatherInput({required this.city});

  /// The city to get weather for
  final String city;

  factory WeatherInput.fromJson(Map<String, dynamic> json) =>
      _$WeatherInputFromJson(json);

  @jsonSchema
  static Map<String, dynamic> get schemaMap => _$WeatherInputSchemaMap;
}
// Use the generated schema
final weatherTool = Tool(
  name: 'get_weather',
  description: 'Get weather for a city',
  inputSchema: JsonSchema.create(WeatherInput.schemaMap),
  onCall: (args) async {
    final input = WeatherInput.fromJson(args);
    return {'temp': 72, 'city': input.city};
  },
);
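The _$WeatherInputFromJson and _$WeatherInputSchemaMap members are produced by code generation, so run build_runner (typically dart run build_runner build) before using the class. Once generated, the tool plugs into an agent like any other; a quick usage sketch (the model name is just an example):

final agent = Agent('openai', tools: [weatherTool]);
final result = await agent.send('What is the weather in Chicago?');
print(result.output);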
Next Steps
- Typed Output - Structured responses with tools
- Multi-turn Chat - Conversations with tool memory