Define AI workflows as YAML.
Deploy like infrastructure.
Single binary. Postgres only. Bring your own keys. 7 connectors, infinite plugins.
```yaml
name: fetch-and-summarize
description: Fetch data from an API and summarize it
inputs:
  url:
    type: string
    description: URL to fetch
steps:
  - name: fetch-data
    action: http/request
    params:
      method: GET
      url: "{{ inputs.url }}"
  - name: summarize
    action: ai/completion
    credential: my-openai-key
    params:
      model: gpt-4o
      prompt: "Summarize: {{ steps['fetch-data'].output.body }}"
    output_schema:
      type: object
      properties:
        summary:
          type: string
        key_points:
          type: array
```

Why Mantle?
IaC Lifecycle
validate, plan, apply, run. The same workflow you use for Terraform, applied to workflow automation. Pin executions to immutable versions. Diff before you deploy.
Single Binary
One Go binary. One Postgres database. No message queues, no worker fleets, no cluster topology. Deploy anywhere containers run.
BYOK
Your API keys live in your database, encrypted with your encryption key. Mantle never proxies through a hosted service. OpenAI, Anthropic, Bedrock, Azure, or self-hosted.
AI Tool Use
Multi-turn function calling out of the box. The LLM requests tools, Mantle executes them via connectors, feeds results back. Crash recovery included.
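As an illustration only, a tool-use step might look like the sketch below in workflow YAML. The `tools` parameter and the `slack/send-message` action name are assumptions, not Mantle's documented schema — only `ai/completion`, `http/request`, and the connector list on this page come from the source.

```yaml
steps:
  - name: research-and-post
    action: ai/completion
    credential: my-openai-key
    params:
      model: gpt-4o
      prompt: "Fetch today's status page and post a summary to #ops"
      # Hypothetical: connector actions exposed to the model as tools.
      # Mantle would execute each requested tool and feed the result
      # back until the model produces a final answer.
      tools:
        - http/request
        - slack/send-message   # assumed action name
```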
How It Works
Write
Define your workflow as a YAML file
Validate
Check structure offline, run in CI
Apply
Store an immutable version in Postgres
Run
Execute with checkpoint-and-resume
```
$ mantle validate examples/fetch-and-summarize.yaml
fetch-and-summarize.yaml: valid

$ mantle plan examples/fetch-and-summarize.yaml
+ fetch-and-summarize (new workflow, version 1)

$ mantle apply examples/fetch-and-summarize.yaml
Applied fetch-and-summarize version 1

$ mantle run fetch-and-summarize --input url=https://api.example.com/data
Running fetch-and-summarize (version 1)...
Execution a1b2c3d4: completed
  fetch-data: completed (0.8s)
  summarize: completed (2.1s)
```

Built-in Connectors
7 connectors with 9 actions ship with the binary — plus a plugin system for anything else.
HTTP
REST APIs, webhooks, any HTTP endpoint
AI (OpenAI)
Chat completions, structured output, tool use
AI (Bedrock)
AWS Bedrock models with region routing
Slack
Send messages, read channel history
Email
Send via SMTP, plaintext and HTML
Postgres
Parameterized SQL against external databases
S3
Put, get, list objects (S3-compatible)
+ Plugins
Any executable. JSON in, JSON out. Python, Go, Bash.
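The plugin contract ("JSON in, JSON out") can be sketched in a few lines of Python. This is a sketch, not the documented envelope: the field names (`text`, `word_count`) are illustrative, and the harness at the bottom only simulates how a runner might invoke the executable.

```python
import io
import json

def run_plugin(stdin, stdout):
    """Mantle-style plugin body: read JSON params from stdin,
    write a JSON result to stdout. Field names are illustrative."""
    params = json.load(stdin)
    words = params.get("text", "").split()
    json.dump({"word_count": len(words)}, stdout)

# Simulate what a runner would do: feed params on stdin, read stdout.
out = io.StringIO()
run_plugin(io.StringIO('{"text": "hello from mantle"}'), out)
print(out.getvalue())  # → {"word_count": 3}
```

Because the interface is just stdin/stdout, the same shape works as a Go binary or a Bash script — anything the host can exec.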
How Mantle Compares
| | Mantle | Temporal | n8n / Zapier | LangChain | Prefect / Airflow |
|---|---|---|---|---|---|
| Primary use case | AI workflow automation | Distributed transactions | SaaS integration | LLM app development | Data pipelines |
| Workflow format | YAML + CEL | Go / Java SDK | Visual canvas | Python code | Python code |
| Deployment | Single binary + Postgres | Multi-service cluster | Self-hosted or cloud | Library in your app | Python platform |
| AI/LLM support | First-class (built-in) | Build it yourself | Partial (AI nodes) | First-class (library) | Build it yourself |
| Checkpointing | Built-in | Built-in | Partial | None | Built-in |
| Secrets management | Built-in, encrypted | External | Built-in | External | External |
| Version control | Git-native (IaC) | Code in repos | Database-stored | Code in repos | Code in repos |
| Target user | Platform engineers | Backend engineers | Non-technical users | Python developers | Data engineers |
| Operational complexity | Low | High | Medium | N/A (library) | Medium-High |
Mantle is early (v0.1.0). Temporal, Airflow, and LangChain have years of production hardening and larger ecosystems. Choose the tool that fits your use case and team.
Built for Production
Enterprise features ship in the same binary. No upgrade tiers, no feature gates.
Multi-tenancy & RBAC
Teams, users, roles (admin / team_owner / operator). API key authentication with hashed storage.
OIDC / SSO
Authentication middleware accepts both API keys and OIDC tokens, detecting the token type per request. Integrate with your existing identity provider.
Encrypted Credentials
AES-256-GCM encryption at rest. Cloud secret backends: AWS Secrets Manager, GCP Secret Manager, Azure Key Vault.
Audit Trail
Every state-changing operation emits an immutable audit event to Postgres. Query with `mantle audit`.
Prometheus Metrics
Workflow execution counts, step durations, connector latencies, active executions gauge. Scrape /metrics in server mode.
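A minimal Prometheus scrape job for a Mantle server might look like this — the job name and port are assumptions (only the `/metrics` path is stated above), so adjust them to your deployment:

```yaml
scrape_configs:
  - job_name: mantle          # assumed job name
    metrics_path: /metrics
    static_configs:
      - targets: ["mantle:8080"]   # assumed host:port
```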
Helm Chart
Production-ready Kubernetes deployment. PDB, migration job, startup probes, security contexts, ServiceMonitor.
Rate Limiting
Protect external APIs and control execution throughput.
Data Retention
Configure retention policies for execution history and audit events.
Multi-arch Images
amd64 + arm64 Docker images published to GHCR. govulncheck, gosec, and Trivy scanning in CI.
Get Running in 5 Minutes
From zero to running workflow in five commands.
```
$ go install github.com/dvflw/mantle/cmd/mantle@latest
$ docker compose up -d
$ mantle init
$ mantle apply examples/hello-world.yaml
# Applied hello-world version 1
$ mantle run hello-world
# Running hello-world (version 1)...
# Execution a1b2c3d4: completed
# fetch: completed (1.0s)
```

17 example workflows included. HTTP, AI, Slack, Postgres, S3, parallel execution, cron triggers, webhooks, and multi-turn tool use.