Agents

LLM Integration

How agents integrate with various LLM providers through a unified interface

Overview

Compozy agents integrate with multiple LLM providers through a unified interface, allowing seamless switching between models and providers. This abstraction enables you to choose the best model for each task while maintaining consistent configuration patterns.

Supported Providers

OpenAI

Access GPT-4, GPT-3.5, and other OpenAI models with full function calling support

Anthropic

Claude 3 family (Opus, Sonnet, Haiku) with advanced reasoning capabilities

Google

Gemini Pro, Gemini Ultra, and PaLM models for multimodal applications

Groq

Fast inference with Llama, Mixtral, and other open models at low latency

Ollama

Run any Ollama-supported model locally for privacy-first applications

DeepSeek

DeepSeek AI models with OpenAI-compatible API for advanced reasoning

xAI

Grok models with OpenAI-compatible API for cutting-edge AI capabilities

Custom

Connect to any OpenAI-compatible API endpoint for specialized models
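As a sketch of the custom-endpoint case: since the Ollama example later on this page configures a local endpoint via api_url, a self-hosted OpenAI-compatible server can plausibly be wired up the same way. The endpoint URL, model name, and environment variable below are illustrative placeholders, not a documented recipe:

```yaml
# Sketch only: a self-hosted server that speaks the OpenAI wire format.
# The URL, model string, and env var are assumptions for illustration.
config:
  provider: openai                      # OpenAI-compatible API shape
  model: my-custom-model                # any model string your endpoint serves
  api_url: "http://localhost:8000/v1"   # base URL of your server
  api_key: "{{ .env.CUSTOM_API_KEY }}"
```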

Provider Configuration

Basic Configuration

Each agent requires a provider configuration that specifies which LLM to use. The model property accepts any valid model string for the chosen provider; it is not limited to a predefined list:

config:
  provider: openai
  model: gpt-4-turbo-preview
  api_key: "{{ .env.OPENAI_API_KEY }}"
  params:
    temperature: 0.7
    max_tokens: 2000

Provider-Specific Examples

config:
  provider: openai
  model: gpt-4-turbo-preview
  api_key: "{{ .env.OPENAI_API_KEY }}"
  params:
    temperature: 0.7
    max_tokens: 4000
    top_p: 0.9
    frequency_penalty: 0.1
    presence_penalty: 0.1
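The same shape applies to other providers. The configurations below reuse the model strings and the api_url field shown in the global configuration example later on this page; note that parameter support varies by provider:

```yaml
# Anthropic
config:
  provider: anthropic
  model: claude-4-opus
  api_key: "{{ .env.ANTHROPIC_API_KEY }}"
  params:
    temperature: 0.7
    max_tokens: 4000
---
# Ollama (local model; no API key required)
config:
  provider: ollama
  model: llama2:13b
  api_url: "http://localhost:11434"
```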

Model Parameters

Common Parameters

Most providers support these standard parameters. For the complete and most up-to-date parameter schema, see the Provider Schema Documentation.

| Parameter | Type | Description | Range |
|-----------|------|-------------|-------|
| `temperature` | float | Controls randomness in generation | 0.0 to 2.0 |
| `max_tokens` | int | Maximum tokens to generate | Model-specific |
| `top_p` | float | Nucleus sampling threshold | 0.0 to 1.0 |
| `top_k` | int | Top-k sampling parameter | 1 to 100 |
| `frequency_penalty` | float | Reduces repetitive tokens | -2.0 to 2.0 |
| `presence_penalty` | float | Encourages new topics | -2.0 to 2.0 |
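As a rough illustration of how these parameters combine in practice, here are two presets. The values are illustrative starting points, not recommendations:

```yaml
# Deterministic preset: low temperature for focused tasks (e.g. code review)
params:
  temperature: 0.2
  top_p: 0.9
  frequency_penalty: 0.1
---
# Creative preset: higher temperature and presence_penalty encourage
# varied, exploratory output
params:
  temperature: 1.0
  top_p: 0.95
  presence_penalty: 0.5
```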

Provider-Specific Parameters

Global Model Configuration

Define reusable model configurations in your project's compozy.yaml:

models:
  - provider: openai
    model: gpt-4-turbo-preview
    api_key: "{{ .env.OPENAI_API_KEY }}"
    params:
      temperature: 0.7

  - provider: anthropic
    model: claude-4-opus
    api_key: "{{ .env.ANTHROPIC_API_KEY }}"

  - provider: ollama
    model: llama2:13b
    api_url: "http://localhost:11434"

Reference global models in agents:

agents:
  - id: code-reviewer
    config:
      $ref: global::models.#(provider=="anthropic")
    instructions: "Review code for quality..."
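Assuming the `#(provider=="...")` selector generalizes across providers (an assumption extrapolated from the example above), a second agent can pick a different global model the same way:

```yaml
agents:
  - id: summarizer
    config:
      $ref: global::models.#(provider=="openai")
    instructions: "Summarize long documents..."
```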

Next Steps