Overview
Build powerful agent capabilities with TypeScript-based tools that integrate seamlessly with Compozy's workflow engine
What Are Tools?
Tools are the building blocks that extend your AI agents' capabilities beyond text generation. They're TypeScript functions that agents can call during workflow execution, running in a secure, sandboxed Bun runtime environment.
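At its simplest, a tool is an exported async TypeScript function with typed input and output. The sketch below is purely illustrative; the greeter name and file path are placeholders, not a required convention.

// tools/greeter.ts -- minimal illustrative tool (name and path are placeholders)
interface Input {
  name: string;
}

interface Output {
  greeting: string;
}

export async function greeter(input: Input): Promise<Output> {
  return { greeting: `Hello, ${input.name}!` };
}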
Why Use Tools?
When building AI-powered applications, you often need agents to perform specific actions like fetching data from APIs, processing files, or interacting with databases. Tools provide a structured, type-safe interface for these operations while maintaining security and performance.
- Extend Agent Capabilities
- Type-Safe Execution
- Secure Runtime
- Performance Optimized
How Tools Work
1. Agent Request: During workflow execution, an agent decides to use a tool based on the task requirements.
2. Input Validation: The tool's input is validated against its JSON Schema definition.
3. Isolated Execution: The tool runs in a sandboxed Bun process with the configured permissions.
4. Output Processing: Results are validated and returned to the agent for further processing.
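The sketch below illustrates steps 2 and 4 conceptually: schema validation brackets the actual tool call. The wordCount tool, its schemas, and the use of the Ajv library are illustrative assumptions; Compozy's runtime performs these steps internally.

// Illustrative only: how input/output validation brackets a tool call.
import Ajv from "ajv";

const ajv = new Ajv();

// Hypothetical tool with its input/output schemas
const inputSchema = { type: "object", properties: { text: { type: "string" } }, required: ["text"] };
const outputSchema = { type: "object", properties: { words: { type: "integer" } }, required: ["words"] };

async function wordCount(input: { text: string }): Promise<{ words: number }> {
  return { words: input.text.split(/\s+/).filter(Boolean).length };
}

async function execute(rawInput: unknown): Promise<unknown> {
  // Step 2: validate the agent-supplied input
  const validInput = ajv.compile(inputSchema);
  if (!validInput(rawInput)) {
    throw new Error(`Invalid input: ${ajv.errorsText(validInput.errors)}`);
  }

  // Step 3 (simplified): Compozy runs the tool in a sandboxed Bun process;
  // here it is called directly to keep the sketch self-contained
  const output = await wordCount(rawInput as { text: string });

  // Step 4: validate the result before returning it to the agent
  const validOutput = ajv.compile(outputSchema);
  if (!validOutput(output)) {
    throw new Error(`Invalid output: ${ajv.errorsText(validOutput.errors)}`);
  }
  return output;
}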
Tool Inheritance
Compozy supports hierarchical tool inheritance so you can define reusable tools once and make them available across workflows and agents.
1. Project Tools: Define shared tools at the project level under `tools:` in `compozy.yaml`. These are available to all workflows and agents by default.
2. Workflow Overrides: Workflows can declare their own `tools:`. When a tool ID matches a project tool, the workflow version overrides it.
3. Agent Overrides: If an agent declares any tools, inheritance is disabled for that agent and only the agent's tools are advertised to the LLM.
4. Deterministic Order: When tools are inherited, the final set is sorted alphabetically by tool ID to keep the LLM context stable; see the merge sketch after the configuration examples below.
# Project-level tools in compozy.yaml
tools:
  - id: code-analyzer
    description: Analyzes code quality and patterns
  - id: data-processor
    description: Processes and transforms data

# Workflow-level override
tools:
  - id: code-analyzer # Overrides project definition with workflow-specific config
    description: Analyzer tuned for this workflow

# Agent-level tools (disables inheritance for this agent)
agents:
  - id: reviewer
    tools:
      - code-analyzer
      - read_file
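These rules boil down to a small merge routine. The sketch below is conceptual rather than Compozy's actual implementation, and the ToolConfig shape is assumed for illustration.

// Conceptual sketch of tool inheritance (not Compozy's implementation).
interface ToolConfig {
  id: string;
  description?: string;
}

function resolveTools(
  projectTools: ToolConfig[],
  workflowTools: ToolConfig[],
  agentTools: ToolConfig[]
): ToolConfig[] {
  // Agent-level tools disable inheritance entirely; only they are advertised
  if (agentTools.length > 0) {
    return agentTools;
  }

  // Workflow tools override project tools with the same ID
  const merged = new Map<string, ToolConfig>();
  for (const tool of projectTools) merged.set(tool.id, tool);
  for (const tool of workflowTools) merged.set(tool.id, tool);

  // Deterministic order: sort alphabetically by tool ID for a stable LLM context
  return [...merged.values()].sort((a, b) => a.id.localeCompare(b.id));
}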
Core Features
Schema-First Design
Every tool defines its interface using JSON Schema, providing:
- Type Safety: Catch errors before runtime with validated inputs/outputs
- Self-Documentation: Clear parameter descriptions for agents and developers
- LLM Integration: Seamless function calling with structured data
input:
  type: object
  properties:
    query:
      type: string
      description: "Search query to execute"
    limit:
      type: integer
      default: 10
      minimum: 1
      maximum: 100
  required: [query]
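In TypeScript terms, a tool implementing this schema would accept an input shaped roughly like the following (the interface name is illustrative):

// Illustrative TypeScript shape matching the schema above
interface SearchInput {
  query: string;   // required: the search query to execute
  limit?: number;  // optional integer, 1-100, defaults to 10
}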
Secure Execution Environment
Tools run in isolated Bun processes with granular security controls:
- Permission System: Control file, network, and environment access
- Resource Limits: Set memory, CPU, and timeout constraints
- Environment Isolation: Secure credential management with filtered variables
- Output Validation: Prevent memory exhaustion with size limits
runtime:
  permissions:
    - --allow-read=./data
    - --allow-net=api.example.com
  limits:
    memory: "256MB"
    timeout: "30s"
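A tool running under the permissions above could read files beneath ./data and call api.example.com, but nothing else. The sketch below assumes Bun's fetch and Bun.file APIs; the endpoint, file names, and tool shape are placeholders.

// tools/fetch-report.ts -- illustrative tool that stays within the
// permissions configured above: read access to ./data, network access
// to api.example.com only.
interface Input {
  reportId: string;
}

interface Output {
  localNotes: string;
  remoteStatus: number;
}

export async function fetchReport(input: Input): Promise<Output> {
  // Allowed: reading under ./data (--allow-read=./data)
  const localNotes = await Bun.file(`./data/${input.reportId}.txt`).text();

  // Allowed: requests to api.example.com (--allow-net=api.example.com)
  const response = await fetch(`https://api.example.com/reports/${input.reportId}`);

  return { localNotes, remoteStatus: response.status };
}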
Performance Optimization
Built-in optimizations ensure tools run efficiently:
- Worker Caching: Pre-compiled templates for faster startup
- Buffer Pooling: Efficient memory management for I/O operations
- Parallel Execution: Run multiple tools concurrently when needed
- Connection Pooling: Reuse HTTP connections across requests
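These optimizations happen inside the runtime; from a tool author's perspective, parallel execution simply means independent tool calls can proceed concurrently. As a rough illustration (the two tool functions are placeholders):

// Illustrative only: independent tool calls can run concurrently.
async function fetchWeather(city: string): Promise<string> {
  return `Sunny in ${city}`;
}

async function fetchNews(topic: string): Promise<string> {
  return `Top headlines about ${topic}`;
}

async function runInParallel(): Promise<void> {
  // Both calls start immediately; total latency is the slower of the two,
  // not the sum of both.
  const [weather, news] = await Promise.all([
    fetchWeather("Berlin"),
    fetchNews("AI agents"),
  ]);
  console.log(weather, news);
}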
Basic Tool Example
// tools/data-processor.ts
// Note: `evaluate`, `transform`, and `aggregate` are helper functions
// assumed to be defined elsewhere in the project.
interface Input {
  data: unknown[];
  operation: 'filter' | 'map' | 'reduce';
  expression: string;
}

interface Output {
  result: unknown;
  processed: number;
  duration: number;
}

export async function dataProcessor(input: Input): Promise<Output> {
  const start = Date.now();

  // Validate and process data
  if (!Array.isArray(input.data)) {
    throw new Error('Input data must be an array');
  }

  let result: unknown;
  switch (input.operation) {
    case 'filter':
      result = input.data.filter(item => evaluate(item, input.expression));
      break;
    case 'map':
      result = input.data.map(item => transform(item, input.expression));
      break;
    case 'reduce':
      result = input.data.reduce(
        (acc, item) => aggregate(acc, item, input.expression),
        {}
      );
      break;
  }

  return {
    result,
    processed: input.data.length,
    duration: Date.now() - start
  };
}
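The example assumes evaluate, transform, and aggregate helpers that apply the expression string to each item. One possible, purely hypothetical implementation compiles the expression with the Function constructor; since tools already run in a sandboxed process this is workable for a sketch, but expressions should still be treated as trusted input.

// Hypothetical helpers for the example above; not part of Compozy itself.
// Each compiles the expression string into a function and applies it.
function evaluate(item: unknown, expression: string): boolean {
  // e.g. expression = "item.score > 0.5"
  const fn = new Function('item', `return (${expression});`);
  return Boolean(fn(item));
}

function transform(item: unknown, expression: string): unknown {
  // e.g. expression = "({ ...item, flagged: true })"
  const fn = new Function('item', `return (${expression});`);
  return fn(item);
}

function aggregate(acc: unknown, item: unknown, expression: string): unknown {
  // e.g. expression = "({ ...acc, count: ((acc.count ?? 0) + 1) })"
  const fn = new Function('acc', 'item', `return (${expression});`);
  return fn(acc, item);
}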