This documentation is provided with the HEAT environment and is relevant for this HEAT instance only.

filter-payload (Processing Node)

The filter-payload node checks an upstream payload against allowed file types and forwards the binary data only when it matches.

Use this node as a gate before type-specific processors so downstream nodes only receive known payload formats.


Configuration Schema

Property        Type           Required  Description
dataSourceName  string         Yes       Data source used when writing the filtered binary output.
allowedTypes    array<string>  Yes       Allowed logical type keys (for example: json, csv, txt, xml, html, dis, protobuf, bin).

How it works

At runtime, the processor:

  1. Fetches the latest full input payload (fetch_latest_full_input_data).
  2. Reads config values dataSourceName and allowedTypes.
  3. Runs type detection (payload_filter_utils.orchestrator.guess) on the input bytes with allowedTypes.
  4. If detection matches, writes a binary output artifact with extension set to the detected type.
  5. Uses a deterministic outputIdentifier to avoid re-processing the same input/type/size combination.

Runtime behavior and fallbacks

  • If no input is available, the node fails.
  • If input does not match any configured allowed type, the node fails.
  • If the same input was already filtered previously (same detected type, source path, and size), processing is skipped.
  • On successful match, output is written as binary via submit_binary_output.

Validation behavior:

  • dataSourceName must be a string.
  • allowedTypes must be an array.
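The two documented checks amount to simple type assertions; this minimal sketch assumes raising an error is the failure mode:

```python
def validate_config(config):
    # dataSourceName must be a string.
    if not isinstance(config.get("dataSourceName"), str):
        raise ValueError("dataSourceName must be a string")
    # allowedTypes must be an array.
    if not isinstance(config.get("allowedTypes"), list):
        raise ValueError("allowedTypes must be an array")
```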

Example configuration

{ "dataSourceName": "default-store", "allowedTypes": ["json", "csv"] }

A configuration permitting every type key supported by the detector mapping:

{ "dataSourceName": "default-store", "allowedTypes": ["json", "csv", "txt", "xml", "html", "dis", "protobuf", "bin"] }

Integration in a session template

  1. Place filter-payload after a node that outputs raw/binary payloads.
  2. Configure allowedTypes for the formats you want to permit.
  3. Connect downstream parser/transform nodes that expect those formats.
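A session-template fragment following the steps above might look like the following. The surrounding template structure (nodes, id, type, next) is hypothetical, since the template schema is not documented on this page; only the filter-payload config block mirrors the documented schema:

```json
{
  "nodes": [
    { "id": "ingest",  "type": "raw-input",      "next": "filter" },
    { "id": "filter",  "type": "filter-payload", "next": "parse",
      "config": { "dataSourceName": "default-store", "allowedTypes": ["json", "csv"] } },
    { "id": "parse",   "type": "json-parser" }
  ]
}
```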