YAML Schema Reference

Runsight uses three YAML file types. This page is the exhaustive field reference for all three. For a guided walkthrough of workflow files, see YAML Schema.

Workflow files live in custom/workflows/ and define the full execution graph. The root model is RunsightWorkflowFile.

| Field | Type | Default | Required | Description |
| --- | --- | --- | --- | --- |
| `version` | `str` | `"1.0"` | no | Schema version |
| `enabled` | `bool` | `false` | no | Whether the workflow is active |
| `config` | `Dict[str, Any]` | `{}` | no | Arbitrary workflow configuration |
| `interface` | `WorkflowInterfaceDef` | none | no | Public input/output contract for callable sub-workflows |
| `tools` | `List[str]` | `[]` | no | Tool IDs available to souls in this workflow. Duplicates are rejected. |
| `souls` | `Dict[str, SoulDef]` | `{}` | no | Inline soul definitions. The dict key must match the soul's `id` field. |
| `blocks` | `Dict[str, BlockDef]` | `{}` | no | Block definitions keyed by block ID. Uses a discriminated union on `type`. |
| `workflow` | `WorkflowDef` | | yes | Graph metadata (name, entry point, transitions) |
| `limits` | `WorkflowLimitsDef` | none | no | Workflow-level budget constraints |
| `eval` | `EvalSectionDef` | none | no | Embedded test cases for offline evaluation |
| Field | Type | Default | Required | Description |
| --- | --- | --- | --- | --- |
| `name` | `str` | | yes | Workflow name |
| `entry` | `str` | | yes | Block ID to start execution |
| `transitions` | `List[TransitionDef]` | `[]` | no | Simple A-to-B transitions |
| `conditional_transitions` | `List[ConditionalTransitionDef]` | `[]` | no | Multi-path transitions based on output |
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| `from` | `str` | yes | Source block ID |
| `to` | `str` or `null` | no | Target block ID, or `null` for terminal |
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| `from` | `str` | yes | Source block ID |
| `default` | `str` or `null` | no | Fallback target if no key matches |
| (extra keys) | `str` | no | Decision key mapped to target block ID |
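For illustration, a conditional transition combining the fields above might look like the sketch below. The block IDs and decision keys (`triage`, `approve`, `reject`, `publish`, `revise`) are hypothetical:

```yaml
workflow:
  name: Review Pipeline
  entry: triage
  conditional_transitions:
    - from: triage          # source block ID
      approve: publish      # extra key: decision key mapped to a target block ID
      reject: revise
      default: null         # fallback if no key matches (null terminates here)
```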

Public callable contract for sub-workflows.

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| `inputs` | `List[WorkflowInterfaceInputDef]` | `[]` | Input parameters. Names must be unique. |
| `outputs` | `List[WorkflowInterfaceOutputDef]` | `[]` | Output parameters. Names must be unique. |
| Field | Type | Default | Required | Description |
| --- | --- | --- | --- | --- |
| `name` | `str` | | yes | Input parameter name (must be unique) |
| `target` | `str` | | yes | Dot-notation path to child state key |
| `type` | `str` | none | no | Type hint |
| `required` | `bool` | `true` | no | Whether input must be provided |
| `default` | `Any` | none | no | Default value if not provided |
| `description` | `str` | none | no | Human-readable description |
| Field | Type | Default | Required | Description |
| --- | --- | --- | --- | --- |
| `name` | `str` | | yes | Output parameter name (must be unique) |
| `source` | `str` | | yes | Dot-notation path to child result |
| `type` | `str` | none | no | Type hint |
| `description` | `str` | none | no | Human-readable description |
| Field | Type | Default | Constraints | Description |
| --- | --- | --- | --- | --- |
| `max_duration_seconds` | `int` | none | 1–86400 | Maximum wall-clock time |
| `cost_cap_usd` | `float` | none | >= 0.0 | Maximum cost in USD |
| `token_cap` | `int` | none | >= 1 | Maximum total tokens |
| `on_exceed` | `str` | `"fail"` | `"warn"` or `"fail"` | Action when limit is exceeded |
| `warn_at_pct` | `float` | `0.8` | 0.0–1.0 | Threshold percentage to trigger a warning |
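As an illustration, a workflow-level `limits` section combining these fields might look like this (the specific values are arbitrary):

```yaml
limits:
  max_duration_seconds: 3600   # 1-86400
  cost_cap_usd: 5.0            # >= 0.0
  token_cap: 200000            # >= 1
  on_exceed: warn              # "warn" or "fail"
  warn_at_pct: 0.75            # warn once 75% of any budget is consumed
```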

Same as WorkflowLimitsDef but without warn_at_pct. Applied per block via the limits field.

| Field | Type | Default | Constraints | Description |
| --- | --- | --- | --- | --- |
| `max_duration_seconds` | `int` | none | 1–86400 | Maximum wall-clock time |
| `cost_cap_usd` | `float` | none | >= 0.0 | Maximum cost in USD |
| `token_cap` | `int` | none | >= 1 | Maximum total tokens |
| `on_exceed` | `str` | `"fail"` | `"warn"` or `"fail"` | Action when limit is exceeded |

The blocks dict uses a discriminated union on the type field. Each block type extends BaseBlockDef and adds its own fields. The following sections document block types with non-trivial additional fields.

Calls a child workflow as a sub-workflow (hierarchical state machine). The parent block binds values into the child’s declared interface and reads results back out after the child completes.

| Field | Type | Default | Required | Description |
| --- | --- | --- | --- | --- |
| `type` | `"workflow"` | | yes | Discriminator |
| `workflow_ref` | `str` | | yes | Slug or path of the child workflow to call |
| `inputs` | `Dict[str, str]` | none | no | Maps child interface input names to parent dotted paths (e.g. `topic: shared_memory.topic`) |
| `outputs` | `Dict[str, str]` | none | no | Maps parent dotted paths to child interface output names (e.g. `shared_memory.summary: summary`) |
| `max_depth` | `int` | none (runtime default 10) | no | Maximum HSM recursion depth |
| `on_error` | `"raise"` or `"catch"` | `"raise"` | no | `"catch"` swallows child failure and returns an error exit handle instead of propagating |

All inherited BaseBlockDef fields (stateful, routes, depends, error_route, retry_config, exits, exit_conditions, timeout_seconds, limits, etc.) are also available.

Validation rules:

  • inputs keys must be plain interface names (no dots). They reference the child workflow’s interface.inputs[].name.
  • outputs values must be plain interface names (no dots). They reference the child workflow’s interface.outputs[].name.

The parent workflow defines a workflow block that calls a child. The child declares its callable contract via the top-level interface section.

Parent workflow — calls the child and wires data in and out:

custom/workflows/research_pipeline.yaml

```yaml
version: "1.0"
enabled: true
blocks:
  run_analysis:
    type: workflow
    workflow_ref: analysis_subworkflow
    inputs:
      topic: shared_memory.topic
      depth: shared_memory.analysis_depth
    outputs:
      shared_memory.summary: summary
      shared_memory.citations: sources
    max_depth: 5
    on_error: catch
    timeout_seconds: 600
workflow:
  name: Research Pipeline
  entry: run_analysis
  transitions:
    - from: run_analysis
      to: null
```

Child workflow — declares the interface contract the parent binds to:

custom/workflows/analysis_subworkflow.yaml

```yaml
version: "1.0"
enabled: true
interface:
  inputs:
    - name: topic
      target: shared_memory.topic
      type: string
      required: true
      description: The research topic to analyze
    - name: depth
      target: shared_memory.depth
      type: integer
      required: false
      default: 3
      description: How many layers deep to research
  outputs:
    - name: summary
      source: shared_memory.final_summary
      type: string
      description: Completed analysis summary
    - name: sources
      source: shared_memory.collected_sources
      type: list
      description: List of cited sources
souls:
  analyst:
    id: analyst
    role: Research Analyst
    system_prompt: "Analyze the topic in shared_memory.topic."
blocks:
  analyze:
    type: soul
    soul_ref: analyst
    task: "Research the topic and write a summary."
workflow:
  name: Analysis Sub-Workflow
  entry: analyze
  transitions:
    - from: analyze
      to: null
```

In the parent, inputs keys (topic, depth) match the child’s interface.inputs[].name values. The parent values (shared_memory.topic, shared_memory.analysis_depth) are dotted paths into the parent’s own state.

In the parent, outputs values (summary, sources) match the child’s interface.outputs[].name values. The parent keys (shared_memory.summary, shared_memory.citations) are dotted paths where results are written in the parent’s state.

| Field | Type | Default | Constraints | Description |
| --- | --- | --- | --- | --- |
| `threshold` | `float` | none | 0.0–1.0 | Pass rate threshold for the suite |
| `cases` | `List[EvalCaseDef]` | | min 1 item, unique IDs | Test case definitions |
| Field | Type | Default | Description |
| --- | --- | --- | --- |
| `id` | `str` | | Unique test case ID (strict string) |
| `description` | `str` | none | Human-readable description |
| `inputs` | `Dict[str, Any]` | none | Input data for the workflow |
| `fixtures` | `Dict[str, str]` | none | Block ID to mock output (skips LLM calls) |
| `expected` | `Dict[str, List[Dict[str, Any]]]` | none | Block ID to list of assertion configs |
| Field | Type | Default | Description |
| --- | --- | --- | --- |
| `eval_key` | `str` | | Dot-notation path into block's own result |
| `operator` | `str` | | Comparison operator |
| `value` | `Any` | none | Comparison value (none for unary operators like `is_empty`, `exists`) |
| Field | Type | Default | Description |
| --- | --- | --- | --- |
| `combinator` | `str` | `"and"` | `"and"` or `"or"` |
| `conditions` | `List[ConditionDef]` | | List of conditions to combine |
| Field | Type | Default | Description |
| --- | --- | --- | --- |
| `case_id` | `str` | | Unique case identifier |
| `condition_group` | `ConditionGroupDef` | none | Conditions for this case (none when `default: true`) |
| `default` | `bool` | `false` | Whether this is the default/fallback case |
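Putting the eval tables together, a minimal embedded `eval` section might look like the sketch below. The block ID `analyze`, the fixture text, and the `contains` operator name are illustrative assumptions, not documented values:

```yaml
eval:
  threshold: 0.9
  cases:
    - id: basic_case
      description: Happy-path run with a mocked block
      inputs:
        topic: quantum computing
      fixtures:
        analyze: "Mocked analysis output"   # block ID -> mock output, skips LLM calls
      expected:
        analyze:                            # block ID -> list of assertion configs
          - eval_key: output                # dot-notation path into the block's result
            operator: contains              # hypothetical operator name
            value: quantum
```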

Shorthand route definition that compiles into output conditions and transitions.

| Field | Type | Default | Description |
| --- | --- | --- | --- |
| `case` | `str` | | Case identifier (YAML alias for `case_id`) |
| `when` | `ConditionGroupDef` | none | Conditions for this route |
| `goto` | `str` | | Target block ID |
| `default` | `bool` | `false` | Whether this is the default route |

Exactly one route must have default: true. Route case values must be unique.
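A sketch of the `routes` shorthand under these rules. The block IDs, the `gt` operator name, and the assumption that a default route still carries a `goto` target are all illustrative:

```yaml
routes:
  - case: high_score
    when:
      combinator: and
      conditions:
        - eval_key: score
          operator: gt        # hypothetical operator name
          value: 0.8
    goto: publish
  - case: fallback
    default: true             # exactly one route must have default: true
    goto: review
```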

| Field | Type | Description |
| --- | --- | --- |
| `from` | `str` | Dot-notation reference to upstream block output (e.g. `"step_id.output_field"`) |
| Field | Type | Description |
| --- | --- | --- |
| `id` | `str` | Unique exit port ID |
| `label` | `str` | Human-readable label |

Extends ExitDef with per-exit soul and task instruction.

| Field | Type | Description |
| --- | --- | --- |
| `id` | `str` | Unique exit port ID |
| `label` | `str` | Human-readable label |
| `soul_ref` | `str` | Soul ID for this branch |
| `task` | `str` | Task instruction for this branch's soul |
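One way these per-exit fields might be used on a block's `exits` list. The exit IDs, soul IDs, and tasks are illustrative:

```yaml
exits:
  - id: approved
    label: Approved
    soul_ref: publisher        # soul handling this branch
    task: "Publish the approved draft."
  - id: rejected
    label: Rejected
    soul_ref: editor
    task: "Summarize why the draft was rejected."
```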
| Field | Type | Default | Description |
| --- | --- | --- | --- |
| `contains` | `str` | none | Substring match against output |
| `regex` | `str` | none | Regex pattern match against output |
| `exit_handle` | `str` | | Exit handle value to set on match |
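For illustration, exit conditions matching a block's output by substring or regex might be written as follows (the handles and patterns are hypothetical):

```yaml
exit_conditions:
  - contains: "APPROVED"       # substring match against output
    exit_handle: approved
  - regex: "reject(ed)?"       # regex pattern match against output
    exit_handle: rejected
```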
| Field | Type | Default | Constraints | Description |
| --- | --- | --- | --- | --- |
| `max_attempts` | `int` | `3` | 1–20 | Maximum retry attempts |
| `backoff` | `str` | `"fixed"` | `"fixed"` or `"exponential"` | Backoff strategy |
| `backoff_base_seconds` | `float` | `1.0` | 0.1–60.0 | Base delay between retries |
| `non_retryable_errors` | `List[str]` | none | | Error types that should not be retried |
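A sketch of a per-block `retry_config` using these fields (the error type name is a hypothetical placeholder):

```yaml
retry_config:
  max_attempts: 5              # 1-20
  backoff: exponential         # "fixed" or "exponential"
  backoff_base_seconds: 2.0    # 0.1-60.0
  non_retryable_errors:
    - ValidationError          # hypothetical error type; never retried
```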

Soul files live in custom/souls/ as standalone YAML files (one soul per file). The filename stem becomes the soul key. Soul files are flat — no wrapper object, just the fields directly.

| Field | Type | Default | Required | Description |
| --- | --- | --- | --- | --- |
| `id` | `str` | | yes | Unique identifier for the soul |
| `role` | `str` | | yes | The role of the agent (e.g. "Senior Researcher") |
| `system_prompt` | `str` | | yes | System instructions defining the agent's behavior |
| `tools` | `List[str]` | none | no | Tool name references available to this soul |
| `required_tool_calls` | `List[str]` | none | no | Tool function names that must be called before completion |
| `max_tool_iterations` | `int` | `5` | no | Maximum tool-use iterations per execution |
| `model_name` | `str` | none | no | Model override (uses runner default if not set) |
| `provider` | `str` | none | no | Provider override for the selected model |
| `temperature` | `float` | none | no | Sampling temperature override |
| `max_tokens` | `int` | none | no | Output token limit override |
| `avatar_color` | `str` | none | no | UI color hint for displaying the soul |
| `modified_at` | `str` | none | no | Timestamp of last modification |
custom/souls/researcher.yaml

```yaml
id: researcher_1
role: Senior Researcher
system_prompt: >
  You are a senior researcher. Analyze the given topic thoroughly
  and produce a structured research report with citations.
tools: null
max_tool_iterations: 5
model_name: gpt-4.1-mini
provider: openai
temperature: 0.7
max_tokens: null
avatar_color: primary
modified_at: null
```

Souls can also be defined inline in a workflow file under the souls: section. When inline, the dict key must match the soul’s id field:

workflow with inline soul

```yaml
souls:
  my_analyst:
    id: my_analyst
    role: Analyst
    system_prompt: "Analyze the data."
```

If a soul file and an inline soul share the same key, the inline definition takes precedence and a warning is logged.


Custom tool files live in custom/tools/ as standalone YAML files (one tool per file). The filename stem becomes the tool ID. Reserved builtin tool IDs (http, file_io, delegate) cannot be used.

| Field | Type | Default | Required | Description |
| --- | --- | --- | --- | --- |
| `version` | `str` | | yes | Schema version (e.g. `"1.0"`) |
| `type` | `str` | | yes | Must be `"custom"` |
| `executor` | `str` | | yes | `"python"` or `"request"` |
| `name` | `str` | | yes | Human-readable tool name |
| `description` | `str` | | yes | Description of what the tool does |
| `parameters` | `Dict` | | yes | JSON Schema object describing the tool's input parameters |
| `code` | `str` | none | conditional | Python source code with `def main(args)`. Required for `executor: python` unless `code_file` is set. |
| `code_file` | `str` | none | conditional | Path to external Python file (relative to tool YAML). Mutually exclusive with `code`. |
| `request` | `Dict` | none | conditional | HTTP request configuration. Required for `executor: request`. |
| `timeout_seconds` | `int` | none | no | Request timeout in seconds. Only valid for `executor: request`. Must be a positive integer. |
| Field | Type | Default | Required | Description |
| --- | --- | --- | --- | --- |
| `method` | `str` | `"GET"` | yes | HTTP method |
| `url` | `str` | | yes | Request URL. Supports `${ENV_VAR}` substitution. |
| `headers` | `Dict[str, str]` | `{}` | no | Request headers |
| `body_template` | `str` | none | no | Request body template with `{{ param }}` substitution |
| `response_path` | `str` | none | no | JSONPath to extract from response |
custom/tools/slack_payload_builder.yaml

```yaml
version: "1.0"
type: custom
executor: python
name: Slack Payload Builder
description: Build a JSON payload string for the Slack incoming webhook.
parameters:
  type: object
  properties:
    text:
      type: string
      description: Message text to encode for Slack.
  required:
    - text
code: |
  import json

  def main(args):
      text = str(args.get("text", ""))
      return {"payload_json": json.dumps({"text": text})}
```
custom/tools/slack_webhook.yaml

```yaml
version: "1.0"
type: custom
executor: request
name: Slack Webhook
description: Send a message to the configured Slack incoming webhook.
parameters:
  type: object
  properties:
    payload_json:
      type: string
      description: Complete JSON payload to send to Slack.
  required:
    - payload_json
request:
  method: POST
  url: "${SLACK_WEBHOOK_URL}"
  headers:
    Content-type: application/json
  body_template: "{{ payload_json }}"
  timeout_seconds: 10
```

A JSON schema is auto-generated from the Pydantic models for Monaco editor autocomplete:

```shell
python packages/core/scripts/generate_schema.py          # generate to disk
python packages/core/scripts/generate_schema.py --check  # CI mode: exit 1 if out of sync
```

The generated schema lives at packages/core/runsight-workflow-schema.json.