# Configuration
Configure Cogitator with YAML files, environment variables, and provider settings.
## Configuration File

Create a `cogitator.yml` in your project root:
```yaml
llm:
  defaultProvider: ollama
  providers:
    ollama:
      type: ollama
      host: http://localhost:11434
      model: llama3.2
    openai:
      type: openai
      apiKey: ${OPENAI_API_KEY}
      model: gpt-4o
    anthropic:
      type: anthropic
      apiKey: ${ANTHROPIC_API_KEY}
      model: claude-sonnet-4-5-20250929

memory:
  adapter: redis
  redis:
    url: redis://localhost:6379
  embedding:
    provider: ollama
    model: nomic-embed-text

logging:
  level: info
  format: pretty
```

Load it with `@cogitator-ai/config`:
```typescript
import { Cogitator } from '@cogitator-ai/core';
import { loadConfig } from '@cogitator-ai/config';

const config = await loadConfig();
const cogitator = new Cogitator(config);
```

`loadConfig()` automatically:

- Reads `cogitator.yml` from the current working directory
- Expands `${ENV_VAR}` references from environment variables
- Validates the config against the expected schema
- Merges with sensible defaults
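The `${ENV_VAR}` expansion step can be pictured as a simple string substitution over the raw YAML text. The sketch below is illustrative only, not the library's actual implementation; `expandEnvVars` is a hypothetical helper:

```typescript
// Expand ${VAR} references in a config string from an environment map.
// Illustrative sketch only; the real loader also parses, validates, and merges.
function expandEnvVars(
  input: string,
  env: Record<string, string | undefined>,
): string {
  return input.replace(/\$\{([A-Z0-9_]+)\}/g, (_match, name: string) => {
    const value = env[name];
    if (value === undefined) {
      // Failing fast here surfaces missing secrets before any provider call.
      throw new Error(`Missing environment variable: ${name}`);
    }
    return value;
  });
}
```

For example, `expandEnvVars('apiKey: ${OPENAI_API_KEY}', { OPENAI_API_KEY: 'sk-test' })` yields `'apiKey: sk-test'`, while an unset variable raises an error instead of silently producing an empty key.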
## Environment Variables

Store secrets in `.env`:
```bash
# LLM Providers
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=AIza...
AZURE_OPENAI_API_KEY=...
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com

# Infrastructure
REDIS_URL=redis://localhost:6379
DATABASE_URL=postgresql://user:pass@localhost:5432/cogitator

# Observability
LANGFUSE_PUBLIC_KEY=pk-...
LANGFUSE_SECRET_KEY=sk-...
```

## Programmatic Configuration
You can skip the YAML file and configure directly:

```typescript
const cogitator = new Cogitator({
  llm: {
    defaultProvider: 'openai',
    providers: {
      openai: {
        type: 'openai',
        apiKey: process.env.OPENAI_API_KEY!,
        model: 'gpt-4o',
      },
    },
  },
  memory: {
    adapter: 'postgres',
    connectionString: process.env.DATABASE_URL!,
    embedding: {
      provider: 'openai',
      model: 'text-embedding-3-small',
    },
  },
});
```

## Provider Configuration
### Ollama (Local)
```yaml
providers:
  ollama:
    type: ollama
    host: http://localhost:11434
    model: llama3.2
```

### OpenAI
```yaml
providers:
  openai:
    type: openai
    apiKey: ${OPENAI_API_KEY}
    model: gpt-4o
    organization: org-... # optional
```

### Anthropic
```yaml
providers:
  anthropic:
    type: anthropic
    apiKey: ${ANTHROPIC_API_KEY}
    model: claude-sonnet-4-5-20250929
```

### Google Gemini
```yaml
providers:
  google:
    type: google
    apiKey: ${GOOGLE_API_KEY}
    model: gemini-2.0-flash
```

### Azure OpenAI
```yaml
providers:
  azure:
    type: azure
    apiKey: ${AZURE_OPENAI_API_KEY}
    endpoint: ${AZURE_OPENAI_ENDPOINT}
    deployment: gpt-4o
    apiVersion: '2024-02-15-preview'
```

### AWS Bedrock
```yaml
providers:
  bedrock:
    type: bedrock
    region: us-east-1
    model: anthropic.claude-3-sonnet-20240229-v1:0
    # Uses AWS SDK default credential chain
```

## Config Merging
Configs are merged with this priority (highest wins):

1. Programmatic options passed to `new Cogitator({...})`
2. Environment variables (`${ENV_VAR}` expansions in YAML)
3. `cogitator.yml` file values
4. Built-in defaults
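This precedence amounts to a layered deep merge, with each higher-priority source overriding only the keys it sets. The sketch below is a hedged illustration of the idea, not the library's implementation; `deepMerge` and `resolveConfig` are hypothetical helpers, and env-var expansion is assumed to have already happened inside the file layer:

```typescript
type ConfigObject = { [key: string]: unknown };

// Recursively merge `override` onto `base`; values in `override` win,
// and nested plain objects are merged rather than replaced wholesale.
function deepMerge(base: ConfigObject, override: ConfigObject): ConfigObject {
  const result: ConfigObject = { ...base };
  for (const [key, value] of Object.entries(override)) {
    const existing = result[key];
    const bothObjects =
      value !== null && typeof value === 'object' && !Array.isArray(value) &&
      existing !== null && typeof existing === 'object' && !Array.isArray(existing);
    result[key] = bothObjects
      ? deepMerge(existing as ConfigObject, value as ConfigObject)
      : value;
  }
  return result;
}

// Apply layers from lowest to highest priority:
// built-in defaults -> cogitator.yml values -> programmatic options.
function resolveConfig(
  defaults: ConfigObject,
  fileConfig: ConfigObject,
  programmatic: ConfigObject,
): ConfigObject {
  return deepMerge(deepMerge(defaults, fileConfig), programmatic);
}
```

For instance, a `logging.level` set in `cogitator.yml` overrides the built-in default, while a `logging.format` passed to the constructor overrides both, without either source clobbering the other's keys.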