5-Minute Path
# Zero-install run
export OPENAI_API_KEY=sk-xxx
cat <<'EOF' | npx cogn@2.2.16 core run --stdin --args "hello" --pretty
Return a valid v2.2 envelope (meta + data). Put your answer in data.result.
EOF
# Provider/model flags must come after the command:
# ... core run --stdin --provider minimax --model MiniMax-M2.1 ...

Killer Use Case: PR Review Gate (CI)
Run a module on the PR diff, route by meta.risk, and block merges on high risk.
Contract: always v2.2 envelope (ok/meta/data|error)
Routing: low/medium/high risk as a first-class field
Stability: provider differences downgrade safely, with reasons in meta.policy
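The gate logic above can be sketched in a few lines of shell. The envelope below is a hand-written sample (its field values are illustrative, not real module output), and `jq` is assumed to be available for extracting `meta.risk`:

```shell
# Hand-written sample of a v2.2 envelope a review module might return.
envelope='{"ok":true,"meta":{"risk":"high","policy":[]},"data":{"result":"possible secret in diff"}}'

# Route by the first-class meta.risk field (requires jq).
risk=$(printf '%s' "$envelope" | jq -r '.meta.risk')
case "$risk" in
  high) gate=block ;;
  *)    gate=allow ;;
esac
echo "PR gate: $gate (risk=$risk)"
```

In a real pipeline the envelope would come from running the module on the PR diff, and a `block` decision would map to a non-zero exit code so the merge is stopped.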
Core Features
Strong Type Contracts
Bidirectional JSON Schema validation of both inputs and outputs ensures reliable structured output.
Control/Data Separation
The meta control plane carries routing decisions; the data plane carries business logic and audit information.
Module Tiers
exec / decision / exploration - three tiers with different constraints for different scenarios.
Explainable Output
Confidence and rationale are mandatory in the output: every call carries a confidence score and its reasoning.
Multi-LLM Support
OpenAI / Anthropic / Gemini / MiniMax / DeepSeek / Qwen - one module, run anywhere.
MCP Integration
Native support for Claude Desktop, Cursor, and other AI tools for seamless integration.
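Claude Desktop registers MCP servers in its JSON config file. The entry below is only a sketch: whether cogn exposes an MCP server subcommand, and what that subcommand is called, is an assumption not confirmed by this page.

```json
{
  "mcpServers": {
    "cogn": {
      "command": "npx",
      "args": ["cogn@2.2.16", "mcp"]
    }
  }
}
```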