Structured Outputs
Schema-driven structured outputs with provider-native enforcement
Overview
Structured outputs ensure agents return data in predictable, machine-readable formats. This is essential for workflow automation, API integrations, and reliable data processing. Compozy uses your actions' output JSON Schemas to request provider-native structured outputs when supported (e.g., OpenAI's response_format) and always validates responses against the schema.
If you're new to structured outputs, start with our Quick Start Guide to understand basic workflow concepts, then explore Agent Configuration for foundational agent setup patterns.
Enabling Structured Outputs
Structured outputs are enabled by defining an output schema on an action. When present, Compozy:
- Requests native structured outputs from providers that support them (e.g., an OpenAI-compatible response_format carrying the JSON Schema); see the sketch after the example below.
- Falls back to prompt guidance and post‑validation when native support isn’t available.
actions:
  - id: analyze-data
    prompt: "Analyze the data and return findings"
    output:
      type: object
      properties:
        findings:
          type: array
          items:
            type: string
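For providers with native support, Compozy translates the action's output schema into the provider's structured-output request. The fragment below is a rough sketch of what an OpenAI-style response_format derived from the schema above could look like; the exact payload Compozy constructs, and the schema name used here, are illustrative assumptions rather than guaranteed behavior.

{
  "response_format": {
    "type": "json_schema",
    "json_schema": {
      "name": "analyze_data_output",
      "schema": {
        "type": "object",
        "properties": {
          "findings": {
            "type": "array",
            "items": { "type": "string" }
          }
        }
      }
    }
  }
}

When native support isn't available, the same schema is instead injected as prompt guidance and the response is validated after the fact.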
Structured Output Behavior
When structured outputs are in effect, the agent is constrained to produce schema-compatible responses:
- Pure JSON Response: Agent must respond with valid JSON only
- No Markdown: No markdown formatting or explanatory text outside JSON
- Direct Parsing: Response is directly parseable by downstream workflow tasks
- Validation Errors: A runtime error occurs if the response isn't valid JSON or fails schema validation
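For example, given the analyze-data action above, only a bare JSON object is accepted; the values below are illustrative:

# Accepted: pure JSON matching the schema ✅
{"findings": ["Revenue grew quarter over quarter", "Churn is concentrated in the free tier"]}

# Rejected: explanatory text wrapped around the JSON ❌
Here are my findings: {"findings": ["Revenue grew quarter over quarter"]}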
Output Schemas
Output schemas define the expected structure of agent responses using JSON Schema (Draft 7). Schemas provide validation, type safety, and clear contracts for agent interactions within workflow orchestration.
Basic Schema Definition
Define schemas directly within action definitions for simple validation:
actions:
  - id: classify-text
    prompt: "Classify the sentiment of the text"
    output:
      type: object
      properties:
        sentiment:
          type: string
          enum: ["positive", "negative", "neutral"]
          description: "Overall sentiment"
        confidence:
          type: number
          minimum: 0
          maximum: 1
          description: "Confidence score"
        keywords:
          type: array
          items:
            type: string
          description: "Key words indicating sentiment"
      required: ["sentiment", "confidence"]
This basic approach works well for simple schemas. For complex validation patterns, consider using reusable schemas to promote consistency across your workflow definitions.
Reusable Schemas
Define reusable schemas within your workflow to promote consistency and reduce duplication across multiple agents and actions. Reference schemas directly by ID (no $ref needed).
Defining Schemas
Add schema definitions to your workflow configuration under the schemas property:
# workflow.yaml
schemas:
  - id: user_profile
    type: object
    properties:
      email:
        type: string
        format: email
      name:
        type: string
        minLength: 2
      age:
        type: integer
        minimum: 0
        maximum: 150
    required: [email, name]
  - id: api_response
    type: object
    properties:
      status:
        type: string
        enum: [success, error, pending]
      data:
        type: object
      message:
        type: string
    required: [status]
  - id: classification_result
    type: object
    properties:
      category:
        type: string
      confidence:
        type: number
        minimum: 0
        maximum: 1
      metadata:
        type: object
    required: [category, confidence]
Referencing Schemas
Reference defined schemas in your agent actions by their ID (local workflow scope):
agents:
  - id: user-processor
    actions:
      - id: validate-user
        prompt: "Validate the user data structure"
        output: user_profile
      - id: classify-content
        prompt: "Classify the content type"
        output: classification_result
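At runtime, the validate-user action is then expected to return JSON that satisfies user_profile, for example (illustrative values):

{"email": "jane@example.com", "name": "Jane Doe", "age": 34}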
Schema Benefits
Reusable schemas provide several advantages for workflow development:
- Consistency: Ensure uniform data structures across agents and actions
- Maintainability: Update schema definitions in one place for project-wide changes
- Validation: Leverage JSON Schema's powerful validation capabilities
- Documentation: Schema descriptions serve as inline documentation for data structures
- Integration: Seamless integration with YAML template system and workflow configuration
Practical Examples
The following real-world examples demonstrate structured outputs in common scenarios. These patterns work well with parallel task execution and collection tasks for processing multiple data items efficiently.
Extract structured data from unstructured text using precise schemas:
schemas:
  - id: invoice_data
    type: object
    properties:
      invoice_number:
        type: string
        description: "Unique invoice identifier"
      date:
        type: string
        format: date
        description: "Invoice issue date in YYYY-MM-DD format"
      due_date:
        type: string
        format: date
      vendor:
        type: object
        properties:
          name:
            type: string
          address:
            type: string
          tax_id:
            type: string
      line_items:
        type: array
        items:
          type: object
          properties:
            description:
              type: string
            quantity:
              type: number
            unit_price:
              type: number
            total:
              type: number
      subtotal:
        type: number
      tax:
        type: number
      total:
        type: number
    required: ["invoice_number", "date", "vendor", "line_items", "total"]
agents:
  - id: invoice-processor
    actions:
      - id: extract-invoice
        prompt: |
          Extract invoice information from the text:
          {{.input.text}}
          Return all monetary values as numbers without currency symbols.
          Return dates in ISO 8601 format.
        # structured outputs are derived from the schema
        output: invoice_data
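A successful extraction returns JSON shaped by invoice_data. The values below are illustrative only:

{
  "invoice_number": "INV-2024-0042",
  "date": "2024-03-01",
  "due_date": "2024-03-31",
  "vendor": {"name": "Acme Office Supplies", "address": "12 Example Street", "tax_id": "00-0000000"},
  "line_items": [
    {"description": "A4 paper, 5 reams", "quantity": 2, "unit_price": 24.50, "total": 49.00},
    {"description": "Toner cartridge", "quantity": 1, "unit_price": 89.00, "total": 89.00}
  ],
  "subtotal": 138.00,
  "tax": 12.42,
  "total": 150.42
}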
This pattern works well for document processing workflows that need to extract and validate financial data.
Schema Validation Process
Understanding the validation lifecycle helps with debugging and building robust AI-powered workflows. Compozy validates schemas at multiple stages to ensure data integrity throughout the execution pipeline.
- Pre-execution: the action's output schema is resolved from the workflow configuration and checked before the agent runs
- Agent Processing: the schema drives the provider-native structured output request, or prompt guidance when native support isn't available
- Post-execution: the agent's response is parsed as JSON and validated against the schema
- Error Handling: responses that fail parsing or schema validation surface as runtime errors the workflow can handle
Validation Error Examples
Common validation scenarios and their error messages:
# Schema definition
output:
  type: object
  properties:
    score:
      type: number
      minimum: 0
      maximum: 100
  required: [score]
# Valid response ✅
{"score": 85}
# Invalid responses ❌
{"score": "high"} # Error: expected number, got string
{"rating": 85} # Error: missing required property 'score'
{"score": 150} # Error: value exceeds maximum of 100
For advanced validation patterns and custom error handling, explore YAML Template Advanced Patterns and Error Handling in Workflows.