A full-featured command-line interface for the Dartantic framework. Think of it as curl for AI: scriptable, composable, and ready for shell pipelines.

Installation

cd samples/dartantic_cli
dart pub get

Quick Start

# Simple question
dartantic -p "What is 2+2?"

# Use a specific provider
dartantic -a anthropic -p "Explain recursion in one sentence"

# Pipe input
echo "Summarize this: Dart is a client-optimized language" | dartantic

# Attach files
dartantic -p "Describe this image: @photo.jpg"

Commands

| Command        | Description                          |
| -------------- | ------------------------------------ |
| chat           | Send a chat prompt (default)         |
| generate       | Generate media (images, PDFs, CSVs)  |
| embed create   | Create embeddings from documents     |
| embed search   | Semantic search over embeddings      |
| models         | List available models for a provider |

Global Options

Global options must be placed before the command name:
dartantic [global-options] [command] [command-options]

| Option           | Short | Type   | Default                     | Description                          |
| ---------------- | ----- | ------ | --------------------------- | ------------------------------------ |
| --agent          | -a    | string | google                      | Agent name or model string           |
| --settings       | -s    | string | ~/.dartantic/settings.yaml  | Path to settings file                |
| --cwd            | -d    | string | current directory           | Working directory for relative paths |
| --output-dir     | -o    | string | cwd                         | Output directory for generated files |
| --verbose        | -v    | flag   | false                       | Show token usage stats               |
| --no-thinking    |       | flag   | false                       | Disable extended thinking            |
| --no-server-tool |       | multi  |                             | Disable server-side tools by name    |
| --no-color       |       | flag   | false                       | Disable colored output               |
| --version        |       | flag   |                             | Show CLI version                     |
| --help           | -h    | flag   |                             | Show help text                       |

Important: Global options must come before the command:
# Correct
dartantic -a openai -p "Hello"
dartantic -v chat -p "Hello"

# Incorrect - causes errors
dartantic chat -a openai -p "Hello"

Chat Command

Send a prompt to an AI agent and receive a streaming response. This is the default command.

Options

| Option          | Short | Type   | Description                      |
| --------------- | ----- | ------ | -------------------------------- |
| --prompt        | -p    | string | Prompt text or @filename         |
| --output-schema |       | string | JSON Schema for structured output |
| --temperature   | -t    | float  | Model temperature (0.0-1.0)      |

Examples

# Basic chat (google is the default provider)
dartantic -p "What is the capital of France?"

# Different providers
dartantic -a openai -p "Hello from GPT-4o"
dartantic -a anthropic -p "Hello from Claude"
dartantic -a ollama -p "Hello from local Llama"

# Model strings for specific models
dartantic -a "google:gemini-2.5-flash" -p "Quick response please"
dartantic -a "openai?chat=gpt-4o&embeddings=text-embedding-3-small" -p "Hi"

# File attachments
dartantic -p "Summarize this: @document.txt"
dartantic -p "Compare these: @file1.txt and @file2.txt"
dartantic -p "What's in this image? @screenshot.png"

# Structured JSON output
dartantic -p "List 3 colors" --output-schema '{"type":"array","items":{"type":"string"}}'

# Control temperature
dartantic -t 0.9 -p "Write a creative story opening"

# Verbose mode (shows token usage)
dartantic -v -p "Hello"

# From stdin
echo "Your prompt" | dartantic
cat prompt.txt | dartantic

Generate Command

Generate media content (images, PDFs, CSVs) from a prompt.

Options

| Option   | Short | Type   | Required | Description                        |
| -------- | ----- | ------ | -------- | ---------------------------------- |
| --prompt | -p    | string | No       | Prompt text or @filename           |
| --mime   |       | multi  | Yes      | MIME type to generate (repeatable) |

Examples

# Generate an image
dartantic generate --mime image/png -p "A minimalist robot logo"

# Generate to a specific directory
dartantic generate --mime image/png -p "A sunset" -o ./images/

# Generate a PDF
dartantic generate -a openai-responses --mime application/pdf \
  -p "Create a one-page report about AI trends"

# Generate CSV data
dartantic generate -a openai-responses --mime text/csv \
  -p "Sample user data with name, email, age columns"

# Generate multiple formats
dartantic generate -p @prompt.txt --mime image/jpeg --mime image/png

Embeddings Commands

embed create

Create embeddings from text files.
dartantic embed create [options] <files...>

| Option          | Type | Default | Description              |
| --------------- | ---- | ------- | ------------------------ |
| --chunk-size    | int  | 512     | Chunk size in characters |
| --chunk-overlap | int  | 100     | Overlap between chunks   |

# Create embeddings
dartantic embed create doc1.txt doc2.txt > embeddings.json

# Custom chunk settings
dartantic embed create --chunk-size 256 --chunk-overlap 50 *.txt > small-chunks.json
Output format:
{
  "model": "google",
  "created": "2024-12-14T15:30:00Z",
  "chunk_size": 512,
  "chunk_overlap": 100,
  "documents": [
    {
      "file": "doc.txt",
      "chunks": [
        {
          "text": "chunk content",
          "vector": [0.1, 0.2, ...],
          "offset": 0
        }
      ]
    }
  ]
}
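
The `--chunk-size` and `--chunk-overlap` options describe a sliding character window. A minimal Python sketch of that windowing (illustrative only; `chunk_text` is a hypothetical name, not the CLI's implementation):

```python
def chunk_text(text: str, chunk_size: int = 512, chunk_overlap: int = 100):
    """Split text into overlapping character windows.

    Each chunk starts chunk_size - chunk_overlap characters after the
    previous one, so consecutive chunks share chunk_overlap characters.
    """
    if chunk_overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk size")
    step = chunk_size - chunk_overlap
    chunks = []
    for offset in range(0, len(text), step):
        chunks.append({"text": text[offset:offset + chunk_size], "offset": offset})
        if offset + chunk_size >= len(text):
            break  # last window already reaches the end of the text
    return chunks

# Example: 10 characters, size 6, overlap 2 -> chunks at offsets 0 and 4
print([c["offset"] for c in chunk_text("abcdefghij", 6, 2)])  # [0, 4]
```

The `offset` values in the output JSON above correspond to these window start positions.
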

embed search

Semantic search over embeddings.
dartantic embed search -q <query> <embeddings.json>

| Option  | Short | Type   | Required | Description  |
| ------- | ----- | ------ | -------- | ------------ |
| --query | -q    | string | Yes      | Search query |

# Semantic search
dartantic embed search -q "machine learning concepts" embeddings.json

# Search with scores
dartantic -v embed search -q "neural networks" embeddings.json

# Search directory of JSON files
dartantic embed search -q "API usage" ./embeddings/
Output format:
{
  "query": "search term",
  "results": [
    {
      "file": "doc.txt",
      "text": "matching text...",
      "offset": 125,
      "similarity": 0.87
    }
  ]
}
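Semantic search of this kind typically ranks stored vectors by cosine similarity against the query's embedding. A small Python sketch over the JSON format shown above (an assumption for illustration; the CLI's actual scoring may differ):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def search(embeddings_doc, query_vector, top_k=5):
    """Rank every stored chunk against the query vector, best first."""
    results = []
    for doc in embeddings_doc["documents"]:
        for chunk in doc["chunks"]:
            results.append({
                "file": doc["file"],
                "text": chunk["text"],
                "offset": chunk["offset"],
                "similarity": cosine_similarity(query_vector, chunk["vector"]),
            })
    results.sort(key=lambda r: r["similarity"], reverse=True)
    return results[:top_k]
```

The query string itself must first be embedded with the same model that produced `embeddings.json`, which is why the file records its `model`.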

Models Command

List available models for a provider.
# Default provider (google)
dartantic models

# Specific provider
dartantic -a openai models
dartantic -a anthropic models
Output:
Provider: Google (google)

Chat Models:
  gemini-2.5-flash
  gemini-1.5-pro

Embeddings Models:
  text-embedding-004

Media Models:
  imagen-3

Settings File

Create ~/.dartantic/settings.yaml to define custom agents and defaults.

Schema

Root level:

| Field         | Type    | Default | Description                        |
| ------------- | ------- | ------- | ---------------------------------- |
| default_agent | string  | google  | Default agent if none specified    |
| thinking      | boolean | false   | Enable extended thinking globally  |
| server_tools  | boolean | true    | Enable server-side tools           |
| chunk_size    | int     | 512     | Default chunk size for embeddings  |
| chunk_overlap | int     | 100     | Default overlap between chunks     |
| agents        | map     |         | Agent configurations               |

Agent configuration (agents.<name>):

| Field         | Type    | Required | Description                        |
| ------------- | ------- | -------- | ---------------------------------- |
| model         | string  | Yes      | Model string (e.g., openai:gpt-4o) |
| system        | string  | No       | System prompt                      |
| thinking      | boolean | No       | Enable extended thinking           |
| server_tools  | boolean | No       | Enable server-side tools           |
| output_schema | object  | No       | JSON Schema for structured output  |
| api_key_name  | string  | No       | Environment variable for API key   |
| base_url      | string  | No       | Override provider’s base URL       |
| headers       | map     | No       | Custom HTTP headers                |
| mcp_servers   | list    | No       | MCP server configurations          |

MCP server configuration:

| Field             | Type   | Use    | Description           |
| ----------------- | ------ | ------ | --------------------- |
| name              | string | Both   | Server identifier     |
| url               | string | Remote | HTTP server URL       |
| headers           | map    | Remote | Custom HTTP headers   |
| command           | string | Local  | Command to execute    |
| args              | list   | Local  | Command arguments     |
| environment       | map    | Local  | Environment variables |
| working_directory | string | Local  | Working directory     |

Full Example

# Global defaults
default_agent: coder
thinking: true
server_tools: true
chunk_size: 512
chunk_overlap: 100

agents:
  # Simple agent with just model
  default:
    model: google

  # Coding assistant
  coder:
    model: anthropic:claude-sonnet-4-20250514
    system: |
      You are an expert software engineer.
      Write clean, well-documented code.

  # Fast responses
  quick:
    model: google:gemini-2.5-flash
    thinking: false

  # Entity extraction with structured output
  extractor:
    model: openai:gpt-4o
    output_schema:
      type: object
      properties:
        entities:
          type: array
          items:
            type: object
            properties:
              name: { type: string }
              type: { type: string }
      required: [entities]

  # Research agent with MCP tools (remote)
  research:
    model: anthropic:claude-sonnet-4-20250514
    mcp_servers:
      - name: context7
        url: https://mcp.context7.com/mcp
        headers:
          CONTEXT7_API_KEY: "${CONTEXT7_API_KEY}"

  # Agent with local MCP server
  filesystem:
    model: google
    mcp_servers:
      - name: filesystem
        command: npx
        args: ["-y", "@anthropic/mcp-server-filesystem", "/tmp"]

  # Custom provider endpoint
  custom:
    model: openai:gpt-4o-mini
    base_url: https://api.custom-provider.com/v1
    headers:
      X-Custom-Header: value

Then use them:
dartantic -a coder -p "Write a binary search in Rust"
dartantic -a extractor -p "John Smith works at Acme Corp"
dartantic -a research -p "Find documentation about hooks"

Environment Variable Substitution

Use ${VAR_NAME} syntax for environment variables:
agents:
  custom:
    model: openai
    headers:
      Authorization: "Bearer ${MY_API_KEY}"
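A sketch of how ${VAR_NAME} substitution can work (illustrative only; how the CLI treats undefined variables is not specified here — this version substitutes an empty string):

```python
import os
import re

# Matches ${VAR_NAME} where VAR_NAME is a valid environment variable name.
_VAR = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}")

def substitute_env(value: str) -> str:
    """Replace ${VAR_NAME} placeholders with values from the environment."""
    return _VAR.sub(lambda m: os.environ.get(m.group(1), ""), value)

os.environ["MY_API_KEY"] = "secret"
print(substitute_env("Bearer ${MY_API_KEY}"))  # Bearer secret
```
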

Model String Formats

| Format         | Example                                              | Description                  |
| -------------- | ---------------------------------------------------- | ---------------------------- |
| Provider only  | openai                                               | Uses provider’s default model |
| Provider:model | openai:gpt-4o                                        | Legacy colon notation        |
| Provider/model | openai/gpt-4o                                        | Slash notation               |
| URI params     | openai?chat=gpt-4o&embeddings=text-embedding-3-small | Multiple models              |
| Agent name     | coder                                                | Lookup in settings file      |
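A rough sketch of how these formats could be parsed (a hypothetical helper for illustration; the actual parsing lives in the Dartantic framework):

```python
from urllib.parse import parse_qs

def parse_model_string(s: str) -> dict:
    """Split a model string into provider plus per-capability models."""
    if "?" in s:  # URI params: provider?chat=...&embeddings=...
        provider, _, query = s.partition("?")
        params = {k: v[0] for k, v in parse_qs(query).items()}
        return {"provider": provider, **params}
    for sep in (":", "/"):  # colon (legacy) or slash notation
        if sep in s:
            provider, _, model = s.partition(sep)
            return {"provider": provider, "chat": model}
    # Bare name: a provider's default model, or an agent name from settings.
    return {"provider": s}
```
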

DotPrompt Templates

Use .prompt files for reusable templates with variable substitution:
# math.prompt
---
model: google
input:
  default:
    operation: add
---
Calculate: What is 5 {{operation}} 3?

# Uses default (add)
dartantic -p @math.prompt

# Override variable
dartantic -p @math.prompt operation=multiply

Template Features

  • YAML frontmatter with --- delimiters
  • model: field overrides the agent’s model
  • input: section defines variable defaults
  • {{variable}} placeholders use Mustache syntax
  • Variables passed via CLI: name=value

Exit Codes

| Code | Name               | Meaning                              |
| ---- | ------------------ | ------------------------------------ |
| 0    | success            | Command executed successfully        |
| 1    | generalError       | General execution error              |
| 2    | invalidArguments   | Invalid arguments or missing options |
| 3    | configurationError | Settings file error                  |
| 4    | apiError           | Provider API error                   |
| 5    | networkError       | Network connectivity error           |

Environment Variables

# Set default agent
export DARTANTIC_AGENT=anthropic

# Enable debug logging
export DARTANTIC_LOG_LEVEL=FINE

# API keys (standard pattern)
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export GEMINI_API_KEY="..."

Agent Resolution Order

  1. CLI --agent flag (highest priority)
  2. Environment variable DARTANTIC_AGENT
  3. Settings file default_agent field
  4. Direct provider name (google, openai, etc.)
  5. ‘google’ (final default)
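
That precedence can be sketched as (illustrative; `resolve_agent` is a hypothetical helper, not the CLI's code):

```python
import os

def resolve_agent(cli_agent=None, settings=None):
    """Pick the agent name using the precedence list above.

    The resolved name is then looked up in the settings file's agents
    map, or treated as a direct provider name (google, openai, ...).
    """
    if cli_agent:                                   # 1. --agent flag
        return cli_agent
    env_agent = os.environ.get("DARTANTIC_AGENT")
    if env_agent:                                   # 2. DARTANTIC_AGENT env var
        return env_agent
    if settings and settings.get("default_agent"):  # 3. settings file default_agent
        return settings["default_agent"]
    return "google"                                 # 5. final default
```
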