Documentation Index
Fetch the complete documentation index at: https://docs.dartantic.ai/llms.txt
Use this file to discover all available pages before exploring further.
Migrating from 2.x to 3.0.0
Thinking API Change
Thinking output is now exposed directly on results: use result.thinking for non-streaming calls and chunk.thinking when streaming. Thinking is also stored as ThinkingPart in the consolidated message for history.
// Non-streaming - use result.thinking
final result = await agent.send(prompt);
if (result.thinking != null) {
  print('[[${result.thinking}]]');
}
print(result.output);

// Streaming - use chunk.thinking for real-time display
await for (final chunk in agent.sendStream(prompt)) {
  if (chunk.thinking != null) stdout.write(chunk.thinking);
  stdout.write(chunk.output);
  history.addAll(chunk.messages);
}
Schema API Change
The JsonSchema class has been replaced with Schema:
// OLD
import 'package:json_schema/json_schema.dart';
final schema = JsonSchema.create({
  'type': 'object',
  'properties': {
    'city': {'type': 'string'},
  },
  'required': ['city'],
});

// Or with builder methods:
final schema = JsonSchema.object({
  'location': JsonSchema.string(description: 'The location'),
  'count': JsonSchema.integer(description: 'The count'),
}, required: ['location']);
// NEW
import 'package:dartantic_ai/dartantic_ai.dart'; // Schema is re-exported
final schema = Schema.fromMap({
  'type': 'object',
  'properties': {
    'city': {'type': 'string'},
  },
  'required': ['city'],
});

// Or with builder methods (using S shorthand):
final schema = S.object(properties: {
  'location': S.string(description: 'The location'),
  'count': S.integer(description: 'The count'),
}, required: ['location']);
Key changes:
- JsonSchema.create({...}) becomes Schema.fromMap({...})
- JsonSchema.object({...}) becomes S.object(properties: {...})
- JsonSchema.string(...) becomes S.string(...)
- JsonSchema.integer(...) becomes S.integer(...)
- No need to import the json_schema package; Schema is re-exported from dartantic_ai
Migrating from 1.x to 2.0.0
Exposing dartantic_interface directly from dartantic_ai
It’s no longer necessary to manually include the dartantic_interface package.
// OLD - had to import both packages
import 'package:dartantic_ai/dartantic_ai.dart';
import 'package:dartantic_interface/dartantic_interface.dart';
// NEW - one import does it all
import 'package:dartantic_ai/dartantic_ai.dart';
Provider Factory Registry
Provider lookup has moved from the Providers class to Agent static methods, and providers are now created via factory functions rather than cached instances.
// OLD
final provider = Providers.get('openai');
final allProviders = Providers.all;
Providers.providerMap['custom'] = MyProvider();
final provider2 = Providers.openai;
// NEW
final provider = Agent.getProvider('openai');
final allProviders = Agent.allProviders;
Agent.providerFactories['custom'] = MyProvider.new;
final provider2 = OpenAIProvider();
Moved OpenAI-compat providers to example (except OpenRouter)
The following intrinsic providers have been moved out of dartantic and into the openai_compat.dart example:
- google-openai
- together
- ollama-openai
The openrouter OpenAI-compatible provider remains intrinsic.
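Because OpenRouter stays intrinsic, existing OpenRouter-based code keeps working with no extra setup. A minimal sketch; omitting the model name is assumed to pick the provider's default, as with Agent('openai') elsewhere in this guide:

```dart
import 'package:dartantic_ai/dartantic_ai.dart';

Future<void> main() async {
  // openrouter is still a built-in provider; no openai_compat.dart needed.
  // No model name given, so the provider's default model is assumed.
  final agent = Agent('openrouter');
  final result = await agent.send('Hello');
  print(result.output);
}
```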
Simplified Thinking API
Extended thinking (chain-of-thought reasoning) is now a first-class feature in
Dartantic with a simplified, unified API across all providers that support
thinking:
// OLD
final agent = Agent(
  'openai-responses:gpt-5',
  chatModelOptions: OpenAIResponsesChatModelOptions(
    reasoningSummary: OpenAIReasoningSummary.detailed,
  ),
);
final thinking = result.metadata['thinking'] as String?;
// NEW
final agent = Agent('openai-responses:gpt-5', enableThinking: true);
final thinking = result.thinking;
Provider-specific fine-tuning options remain for advanced use cases:
- GoogleChatModelOptions.thinkingBudgetTokens
- AnthropicChatOptions.thinkingBudgetTokens
- OpenAIResponsesChatModelOptions.reasoningSummary
Removed ProviderCaps
The ProviderCaps type was removed from the provider implementation and moved
to a helper function in the tests.
// OLD
final visionProviders = Providers.allWith({ProviderCaps.chatVision});
// NEW
// use Provider.listModels() and choose via ModelInfo instead
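A sketch of the suggested replacement. Only listModels() and ModelInfo are named above, so the fields inspected here (such as name) are illustrative assumptions:

```dart
import 'package:dartantic_ai/dartantic_ai.dart';

Future<void> main() async {
  final provider = Agent.getProvider('openai');

  // List the provider's models and choose based on their metadata.
  // The exact ModelInfo fields are assumptions for illustration.
  final models = await provider.listModels();
  for (final info in models) {
    print(info.name);
  }
}
```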
Migrating from 0.9.7 to 1.0.0
This release went through a major refactoring to support many more providers for
both chat and embeddings, as well as to support more features. The API is
different in a lot of tiny ways, but the core concepts remain largely the same.
This is a list of changes (some of them breaking) to help you migrate. And as
long as this list is, I’m sure I’ve missed some. If you find something that’s
not here, please let me know!
Dynamic => Static Provider factories
Provider access has moved to Agent static methods:
// OLD (0.9.7)
final provider = OpenAiProvider();
final providerFactory = Agent.providers['google'];
final providerFactoryByAlias = Agent.providers['gemini'];
// NEW (1.0.0)
final provider1 = Agent.createProvider('openai');
final provider2 = Agent.createProvider('google');
final provider3 = Agent.createProvider('gemini');
If you’d like to extend the list of providers dynamically at runtime, you can
use the providerFactories map on the Agent class:
Agent.providerFactories['my-provider'] = MyProvider.new;
Agent.runXxx => Agent.sendXxx
The Agent.runXxx methods have been renamed for consistency with chat models
and the new Chat class:
// OLD
final result = await agent.run('Hello');
final typedResult = await agent.runFor<T>('Hello', outputSchema: schema);
await for (final chunk in agent.runStream('Hello')) {...}
// NEW
final result = await agent.send('Hello');
final typedResult = await agent.sendFor<T>('Hello', outputSchema: schema);
await for (final chunk in agent.sendStream('Hello')) {...}
Also, when you’re sending a prompt to the agent, instead of passing a list of
messages via the messages parameter, you can pass it via the history parameter:
// OLD
final result = await agent.run('Hello', messages: messages);
// NEW
final result = await agent.send('Hello', history: history);
The subtle difference is that history is the list of previous messages, while the prompt (plus optional attachments) forms the new message. Love it or don't, but it made sense to me at the time…
Agent.provider => Agent.forProvider
The Agent.provider constructor has been renamed to Agent.forProvider for
clarity:
// OLD
final agent = Agent.provider(OpenAiProvider());
// NEW
final agent = Agent.forProvider(Agent.createProvider('anthropic'));
Message => ChatMessage
The Message type has been renamed to ChatMessage for consistency with chat
models:
// OLD
var messages = <Message>[];
final response = await agent.run('Hello', messages: messages);
messages = response.messages.toList();
// NEW
var history = <ChatMessage>[];
final response = await agent.send('Hello', history: history);
history.addAll(response.messages);
toSchema => JsonSchema.create
The toSchema method has been dropped in favor of the built-in
JsonSchema.create method for simplicity:
// OLD
final schema = <String, dynamic>{
  'type': 'object',
  'properties': {
    'town': {'type': 'string'},
    'country': {'type': 'string'},
  },
  'required': ['town', 'country'],
}.toSchema();

// NEW
final schema = JsonSchema.create({
  'type': 'object',
  'properties': {
    'town': {'type': 'string', 'description': 'Name of the town'},
    'country': {'type': 'string', 'description': 'Name of the country'},
  },
  'required': ['town', 'country'],
});
systemPrompt + Message.system() => ChatMessage.system()
The systemPrompt parameter has been removed from Agent and model constructors.
It was confusing to have both a system prompt and a system message, so I’ve
simplified the implementation to use just an optional ChatMessage.system()
instead. In practice, you’ll want to keep the system message in the history
anyway, so think of this as a “pit of success” thing:
// OLD
final agent = Agent(
  'openai',
  systemPrompt: 'You are a helpful assistant.',
);
final result = await agent.send('Hello');

// NEW
final agent = Agent('openai');
final result = await agent.send(
  'Hello',
  history: [
    const ChatMessage.system('You are a helpful assistant.'),
  ],
);
Agent chat and streaming
The agent now streams new messages as they’re created along with the output:
final agent = Agent('openai');
final history = <ChatMessage>[];
await for (final chunk in agent.sendStream('Hello', history: history)) {
  // collect text and messages as they're created
  print(chunk.output);
  history.addAll(chunk.messages);
}
If you’d prefer not to collect and track the message history manually, you can
use the Chat class to collect messages for you:
final chat = Chat(Agent('openai'));
await for (final chunk in chat.sendStream('Hello')) {
  print(chunk.output);
}
// chat.history is a list of ChatMessage objects
DataPart.file(File) => DataPart.fromFile(XFile)
The DataPart.file constructor has been replaced with DataPart.fromFile to support cross-platform file handling (e.g. the web):
// OLD
import 'dart:io';
final part = await DataPart.file(File('bio.txt'));
// NEW
import 'package:cross_file/cross_file.dart';
final file = XFile.fromData(
  await File('bio.txt').readAsBytes(),
  path: 'bio.txt',
);
final part = await DataPart.fromFile(file);
The model string format has been enhanced to support chat, embeddings, and other model names using a custom relative-URI syntax. This was important to be able to specify the chat and embeddings models separately:
// OLD
Agent('openai');
Agent('openai:gpt-4o');
Agent('openai/gpt-4o');
// NEW - all of the above still work plus:
Agent('openai?chat=gpt-4o&embeddings=text-embedding-3-large');
Agent.embedXxx
The agent gets new Agent.embedXxx methods for creating embeddings for
documents and queries:
final agent = Agent('openai');
final embedding = await agent.embedQuery('Hello world');
final results = await agent.embedDocuments(['Text 1', 'Text 2']);
final similarity = EmbeddingsModel.cosineSimilarity(e1, e2);
Also, the cosineSimilarity method has been moved to the EmbeddingsModel.
Automatic Retry
The agent now supports automatic retry for rate limits and failures:
final agent = Agent('openai');
final result = await agent.send('Hello!'); // Automatically retries on 429
Agent<TOutput>(outputSchema) => sendForXxx<TOutput>(outputSchema)
Instead of putting the output schema on the Agent class, you now pass it to the sendForXxx method:
// OLD
final agent = Agent<Map<String, dynamic>>('openai', outputSchema: ...);
final result = await agent.send('Hello');
// NEW
final agent = Agent('openai');
final result = await agent.sendFor<Map<String, dynamic>>('Hello', outputSchema: ...);
This allows you to be more flexible from message to message.
AgentResponse to ChatResult<MyType>
The AgentResponse type has been renamed to ChatResult.
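The rename is mostly mechanical. A minimal sketch, assuming a plain-text send yields a ChatResult with the output and messages fields shown elsewhere in this guide:

```dart
import 'package:dartantic_ai/dartantic_ai.dart';

Future<void> main() async {
  final agent = Agent('openai');

  // Previously typed as AgentResponse; now a ChatResult.
  final result = await agent.send('Hello');
  print(result.output);   // the model's text output
  print(result.messages); // consolidated messages for history
}
```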
DotPrompt Support Removed
The dependency on the dotprompt_dart package has been removed from dartantic_ai. However, you can still use the DotPrompt class to parse .prompt files:
import 'package:dotprompt_dart/dotprompt_dart.dart';
final dotPrompt = DotPrompt(...);
final prompt = dotPrompt.render();
final agent = Agent(dotPrompt.frontMatter.model!);
await agent.send(prompt);
The Agent.sendForXxx methods support typed output even while the agent is calling tools:
final provider = Agent.createProvider('openai');
// Note: ProviderCaps removed - capabilities are now model-specific

// tools
final agent = Agent.forProvider(
  provider,
  tools: [currentDateTimeTool, temperatureTool, recipeLookupTool],
);

// typed output
final result = await agent.sendFor<TimeAndTemperature>(
  'What is the time and temperature in Portland, OR?',
  outputSchema: TimeAndTemperature.schema,
  outputFromJson: TimeAndTemperature.fromJson,
);

// magic!
print('time: ${result.output.time}');
print('temperature: ${result.output.temperature}');
Unfortunately, not all providers support this feature. You can check the
provider’s capabilities to see if it does.
ChatMessage Part Helpers
The ChatMessage class has been enhanced with helpers for extracting specific
types of parts from a list:
final message = ChatMessage.system('You are a helpful assistant.');
final text = message.text; // "You are a helpful assistant."
final toolCalls = message.toolCalls; // []
final toolResults = message.toolResults; // []
Usage Tracking
The agent now supports usage tracking:
final result = await agent.send('Hello');
print('Tokens used: ${result.usage.totalTokens}');
Logging
The agent now supports logging:
Agent.loggingOptions = const LoggingOptions(level: LogLevel.ALL);
Migrating from 0.9.6 to 0.9.7
Breaking Change: DataPart.file → DataPart.stream
The DataPart.file constructor has been removed and replaced with
DataPart.stream for file and image attachments. This change was made to enable
compatibility with web and WASM platforms, where direct file access is not
available.
What you need to do
- Replace all usages of DataPart.file(File(...)) with await DataPart.stream(file.openRead(), name: file.path).
- Always provide a name argument for best results (especially for web/WASM).
Before (0.9.6 and earlier)
final file = File('bio.txt');
final part = await DataPart.file(file);
After (0.9.7 and later)
final file = File('bio.txt');
final part = await DataPart.stream(file.openRead(), name: file.path);
Why?
- This approach works on all Dart platforms, including web and WASM, by using
streams instead of direct file APIs.
- It also makes it easier to support file uploads from sources other than the
local filesystem.