Version: 1.0.1

Workflows

Workflows let you chain multiple jobs into a directed acyclic graph (DAG). Each Task step runs a scheduled job, and the engine coordinates execution order, data passing, conditional branching, and failure handling automatically.

What Is a Workflow?

A workflow is a named, versionable pipeline made up of:

| Concept | Description |
| --- | --- |
| Step | A node in the DAG — either a Job (Task), a Condition, or a Merge point |
| Edge | A directed connection from one step to another |
| Run | A single execution instance of the workflow |
| Data Mapping | A rule that passes an output field from one step as an input to another |

Core Concepts

Node Types

Every step in a workflow is one of three node types:

| Type | Description |
| --- | --- |
| Task | Dispatches a scheduled job. This is the primary building block. |
| Condition | Evaluates an expression and routes execution through a true or false port. Does not run a job. |
| Merge | Waits for all incoming branches to complete before allowing downstream steps to proceed. Does not run a job. |

Condition and Merge are virtual nodes — they control flow without dispatching any job.

Edges and Ports

Edges connect nodes and define execution order. They can carry optional source ports for conditional routing:

Step A ──────────────────► Step B          (unconditional)

Condition ──── true ─────► Step B
    └─────── false ─────► Step C           (conditional)

Condition node outputs are named ports: true and false. Connect downstream steps to the appropriate port in the Workflow Builder.

Runs

Each time a workflow is triggered (manually or by cron), a WorkflowRun is created. It tracks:

  • Overall status (Pending / Running / Completed / Failed / PartiallyCompleted / Cancelled)
  • Start/end time, total duration
  • Individual step statuses (one JobOccurrence per Task step)
  • Trigger reason

Workflow Settings

When creating or editing a workflow, the following settings are available:

| Setting | Description |
| --- | --- |
| Name | Display name (required) |
| Description | Optional human-readable description |
| Tags | Comma-separated labels for filtering |
| Active | Whether the workflow can be triggered |
| Cron Expression | 6-field cron for automatic scheduling (e.g. 0 0 9 * * * = daily at 9:00 AM). Leave empty for manual-only. |
| Failure Strategy | How the engine handles step failures (see below) |
| Max Step Retries | Number of times a failed step is retried before it is marked as failed |
| Timeout (seconds) | Maximum total duration. The run is cancelled if exceeded. null = no timeout. |
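Put together, a workflow definition might look like the following JSON. This is an illustrative sketch only — the field names shown here are assumptions, not the exact API schema:

```json
{
  "name": "Nightly Invoice Pipeline",
  "description": "Extract prices and send invoices",
  "tags": "billing,nightly",
  "active": true,
  "cronExpression": "0 0 9 * * *",
  "failureStrategy": "StopOnFirstFailure",
  "maxStepRetries": 2,
  "timeoutSeconds": 3600
}
```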

Failure Strategies

| Strategy | Behavior |
| --- | --- |
| Stop on First Failure | The entire workflow stops when any step fails. All pending steps are skipped. |
| Continue on Failure | Independent parallel branches keep running. Only steps that depend on the failed step are skipped. |
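The difference between the two strategies can be sketched as follows. This is illustrative pseudologic, not the engine's actual source — the function name and DAG representation are assumptions:

```python
# Sketch: which pending steps get skipped after a step fails,
# under each failure strategy. The DAG is a dict mapping each
# step to its direct downstream (dependent) steps.

def steps_to_skip(dag, pending_steps, failed_step, strategy):
    """Return the set of pending steps that will be marked Skipped."""
    if strategy == "StopOnFirstFailure":
        # Everything that has not run yet is skipped.
        return set(pending_steps)
    # ContinueOnFailure: only transitive dependents of the failed step.
    skipped, frontier = set(), [failed_step]
    while frontier:
        for child in dag.get(frontier.pop(), []):
            if child not in skipped:
                skipped.add(child)
                frontier.append(child)
    return skipped & set(pending_steps)

dag = {"A": ["B", "C"], "B": ["D"], "C": []}
# B failed; C and D have not run yet. Only D depends on B.
print(steps_to_skip(dag, ["C", "D"], "B", "ContinueOnFailure"))  # → {'D'}
```

Under Stop on First Failure the same call would skip both C and D, even though C is on an independent branch.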

Step Settings

Each Task step has the following configuration:

| Setting | Description |
| --- | --- |
| Step Name | Label shown in the DAG builder and run history |
| Job | Which scheduled job this step executes |
| Delay (seconds) | Wait this many seconds after dependencies complete before dispatching |
| Job Data Override | A static JSON object that replaces the job's default data for this run. Disabled when Data Mappings are active. |
| Data Mappings | Dynamic rules to forward output fields from parent steps into this step's job data (see Data Mappings) |

Condition steps additionally have a Condition Expression field (see Condition Nodes).


Data Mappings

Data Mappings allow you to pass output fields from a completed upstream step into the job data of a downstream step, making workflows truly data-driven.

Format

Internally, mappings are stored as a JSON dictionary on the step:

{
"sourceStepId:sourcePath": "targetPath"
}
| Part | Example | Description |
| --- | --- | --- |
| sourceStepId | 019d13f6-e0d5-7286-be33-945cdb1c83f7 | ID of the upstream step whose result to read from |
| sourcePath | complexProp.title | Dot-separated JSON path into the upstream result |
| targetPath | subject | Key to set in this step's job data |

Path lookup is case-insensitive: complexProp.title matches both complexProp.title and ComplexProp.Title in the upstream result.

Schema-Assisted Selection

In the Workflow Builder, when a job is selected the Input Schema and Output Schema panels appear — generated from the job's C# types. You can pick source and target fields from the dropdowns, with nested object properties shown as dotted paths.

You can also type a custom path directly into the search box and press Enter.

Wildcard Mapping

Select "* (entire result)" as the source path to pass the entire JSON result object of the upstream step as-is.

Interaction with Job Data Override

| Condition | Behavior |
| --- | --- |
| No mappings configured | Job Data Override is editable and used as-is |
| ≥ 1 mapping configured | Job Data Override is disabled and cleared. Mappings take full control of job data. |

Example

Step 1 (ExtractPrices) → result: { "price": 99, "item": { "name": "Widget" } }
Step 2 (SendInvoice) → job data: { "amount": null, "title": null }

Mapping on Step 2:
step1Id:price → amount
step1Id:item.name → title

Result job data for Step 2:
{ "amount": 99, "title": "Widget" }
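The mapping resolution above can be sketched in a few lines. This is an illustrative model, not the engine's actual implementation — in particular, the use of "*" as the internal wildcard path is an assumption:

```python
# Sketch: resolving data mappings of the form
#   {"sourceStepId:sourcePath": "targetPath"}
# against the JSON results of completed upstream steps.

def lookup(result, path):
    """Case-insensitive dot-path lookup into a nested dict."""
    node = result
    for part in path.split("."):
        by_lower = {k.lower(): v for k, v in node.items()}
        node = by_lower[part.lower()]
    return node

def apply_mappings(mappings, results):
    """results: stepId -> JSON result of that completed step."""
    job_data = {}
    for source, target in mappings.items():
        step_id, path = source.split(":", 1)
        if path == "*":  # wildcard: forward the whole result as-is (assumed syntax)
            job_data[target] = results[step_id]
        else:
            job_data[target] = lookup(results[step_id], path)
    return job_data

results = {"step1": {"price": 99, "item": {"name": "Widget"}}}
mappings = {"step1:price": "amount", "step1:item.name": "title"}
print(apply_mappings(mappings, results))  # → {'amount': 99, 'title': 'Widget'}
```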

Condition Nodes

A Condition node evaluates an expression against the results and statuses of its parent steps, then routes execution through either the true or false port.

Expression Syntax

[stepId:](@status|$.field) operator value

Expressions support:

  • && (AND — higher precedence) and || (OR)
  • @status — checks a step's WorkflowStepStatus
  • $.field — checks a field in a step's JSON result
  • Optional stepId: prefix to target a specific parent step; otherwise all parents are checked

Operators

| Operator | Applicable to |
| --- | --- |
| ==, != | Status and string fields |
| >, <, >=, <= | Numeric fields |

Examples

# All parents completed
@status == 'Completed'

# A specific parent completed
019d...83f7:@status == 'Completed'

# Result field check
$.price > 100

# Combined
@status == 'Completed' && $.price > 50

# OR branch
$.status == 'approved' || $.status == 'auto-approved'

# Mixed with step prefix
019d...83f7:$.price > 100 && @status != 'Skipped'

If the expression cannot be parsed or evaluated, it defaults to true (the step executes).
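The semantics above (clause grammar, && binding tighter than ||, unprefixed clauses checking all parents, and the default-true fallback) can be modeled with a small evaluator. This is an illustrative sketch, not the engine's parser:

```python
import operator
import re

OPS = {"==": operator.eq, "!=": operator.ne, ">=": operator.ge,
       "<=": operator.le, ">": operator.gt, "<": operator.lt}

# [stepId:](@status|$.field) operator value
CLAUSE = re.compile(
    r"^(?:(?P<step>[\w-]+):)?(?P<ref>@status|\$\.[\w.]+)\s*"
    r"(?P<op>==|!=|>=|<=|>|<)\s*(?P<val>.+)$")

def eval_clause(clause, steps):
    m = CLAUSE.match(clause.strip())
    if not m:
        raise ValueError(clause)
    val = m["val"].strip().strip("'\"")
    # stepId: prefix targets one parent; otherwise all parents are checked.
    targets = [steps[m["step"]]] if m["step"] else list(steps.values())
    results = []
    for step in targets:
        if m["ref"] == "@status":
            left = step["status"]
        else:
            left = step["result"]
            for part in m["ref"][2:].split("."):
                left = left[part]
        right = float(val) if isinstance(left, (int, float)) else val
        results.append(OPS[m["op"]](left, right))
    return all(results)

def evaluate(expr, steps):
    try:
        # '&&' binds tighter than '||': split on '||' first.
        return any(all(eval_clause(c, steps) for c in group.split("&&"))
                   for group in expr.split("||"))
    except Exception:
        return True  # unparseable expressions default to true

steps = {"s1": {"status": "Completed", "result": {"price": 120}}}
print(evaluate("@status == 'Completed' && $.price > 100", steps))  # → True
```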


Execution Engine

The Workflow Engine runs as a background service inside the Milvaion API, polling at a configurable interval (WorkflowEngine:PollingIntervalSeconds).

Step Lifecycle

Pending → (dependencies satisfied) → Delayed (if delay > 0)
→ Running (dispatched to worker)
→ Completed | Failed | Skipped | Cancelled

Run Status Transitions

| Status | Meaning |
| --- | --- |
| Pending | Run created, engine has not processed it yet |
| Running | At least one step is active |
| Completed | All steps completed successfully |
| PartiallyCompleted | Some steps failed or were skipped, but at least one succeeded |
| Failed | All steps either failed or were never reached — no successful completions |
| Cancelled | Run was cancelled manually or due to a timeout |
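The terminal statuses in the table can be summarized as a simple rule over the final step statuses. This is inferred logic for illustration, not the engine's source:

```python
# Sketch: deriving the terminal run status from final step statuses
# (Cancelled/timeout handling omitted for brevity).

def derive_run_status(step_statuses):
    succeeded = [s for s in step_statuses if s == "Completed"]
    if len(succeeded) == len(step_statuses):
        return "Completed"
    if succeeded:           # some failed/skipped, some succeeded
        return "PartiallyCompleted"
    return "Failed"         # no successful completions

print(derive_run_status(["Completed", "Completed"]))          # → Completed
print(derive_run_status(["Completed", "Failed", "Skipped"]))  # → PartiallyCompleted
print(derive_run_status(["Failed", "Skipped"]))               # → Failed
```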

Zombie Detection

Task steps that are Running and have not received a heartbeat within their ZombieTimeoutMinutes threshold are detected and marked as zombie/failed, preventing runs from getting stuck indefinitely.
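The check amounts to comparing each Running step's last heartbeat against its timeout threshold. A minimal sketch, assuming an in-memory step representation:

```python
from datetime import datetime, timedelta, timezone

# Sketch: flag Running steps whose last heartbeat is older than
# their ZombieTimeoutMinutes threshold (illustrative only).

def find_zombies(steps, now, zombie_timeout_minutes):
    cutoff = now - timedelta(minutes=zombie_timeout_minutes)
    return [s["id"] for s in steps
            if s["status"] == "Running" and s["last_heartbeat"] < cutoff]

now = datetime(2025, 1, 1, 12, 0, tzinfo=timezone.utc)
steps = [
    {"id": "a", "status": "Running",
     "last_heartbeat": now - timedelta(minutes=45)},   # stale → zombie
    {"id": "b", "status": "Running",
     "last_heartbeat": now - timedelta(minutes=2)},    # fresh → healthy
]
print(find_zombies(steps, now, zombie_timeout_minutes=30))  # → ['a']
```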


Triggering Workflows

Manual Trigger

Workflows can be triggered from the Milvaion Portal (the Trigger button on the Workflows list or detail page) or via the API:

POST /api/workflows/{workflowId}/trigger

Cron Schedule

Set a Cron Expression on the workflow. The engine automatically triggers the workflow when the cron schedule fires:

0 0 9 * * *      → Every day at 09:00 UTC
0 */30 * * * *   → Every 30 minutes
0 0 0 * * MON    → Every Monday at midnight

Uses the 6-field cron format (seconds included). See Core Concepts — Cron Expressions for full reference.
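As a quick sanity check, a 6-field expression splits into the fields below, seconds first. This helper is illustrative only; the engine handles the actual parsing and next-fire computation:

```python
# Sketch: name the fields of a 6-field cron expression (seconds first).

FIELDS = ["second", "minute", "hour", "day-of-month", "month", "day-of-week"]

def split_cron(expr):
    parts = expr.split()
    if len(parts) != len(FIELDS):
        raise ValueError(f"expected 6 fields, got {len(parts)}")
    return dict(zip(FIELDS, parts))

print(split_cron("0 0 9 * * *"))
# → {'second': '0', 'minute': '0', 'hour': '9',
#    'day-of-month': '*', 'month': '*', 'day-of-week': '*'}
```

A common pitfall is pasting a standard 5-field cron (minutes first), which this check rejects.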


Building a Workflow (Step-by-Step)

1. Open the Builder

Navigate to Workflows → click New Workflow or open an existing one and click Edit / Builder.

2. Configure Workflow Settings

Fill in the name, optional cron expression, and choose a failure strategy in Settings.

3. Add Nodes

Click Add Node ▼ in the toolbar to add:

  • Task Step — select the job to run
  • Condition — write a branching expression
  • Merge — join parallel branches

4. Connect Nodes

Drag from the bottom handle of one node to the top handle of another to create an edge. For Condition nodes, drag from the true (right) or false (bottom) port.

5. Configure Each Step

Click a node to open the Step Config Panel:

  1. Set a Step Name
  2. For Task nodes: select the Job
  3. Optionally set a Delay
  4. Add Data Mappings to forward upstream results

6. Save

Click Save in the toolbar. The workflow version is incremented and the definition is persisted.


Versioning

Every save creates a new version. Older versions are stored as snapshots in Workflow.Versions. This allows you to:

  • See when the DAG definition changed
  • Compare what a historical run was executing against

Active runs always execute against the workflow version that was current when the run was triggered.


Limitations

| Limitation | Detail |
| --- | --- |
| No cycles | The DAG must be acyclic. Circular dependencies are not detected at save time but will cause runs to stall. |
| Delay on virtual nodes | Delay is only supported on Task nodes. Condition and Merge nodes execute instantly when their dependencies are satisfied. |
| Parallel scaling | All runs for a workflow execute concurrently — there is no built-in workflow-level concurrency limit. |
| Result size | Step results are stored in PostgreSQL as plain JSON strings. Very large results (> a few MB) may impact performance. |

  • Implementing Jobs — Write jobs that return typed results for use in Data Mappings
  • Core Concepts — Job, Occurrence, Worker fundamentals
  • Reliability — Retry, timeout, and zombie detection details
  • Configuration — WorkflowEngine section for polling interval and enable/disable