Transfer Tool (Canonical v10)

The transfer tool moves data in bulk from a source to a target within a single pipeline task.

Canonical reminders:

  • Use task.spec.policy.rules for retry/fail.
  • Keep large intermediate data reference-first when possible.
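The first reminder can be sketched as a minimal rules block. This is a sketch, not the authoritative schema: the `do: retry` action and its `max` field are assumptions shown for illustration only (the basic example on this page uses only `fail` and `break`).

```yaml
# Minimal sketch of task.spec.policy.rules.
# `do: retry` and `max` are assumed fields, shown for illustration.
spec:
  policy:
    rules:
      - when: "{{ outcome.status == 'error' }}"
        then: { do: retry, max: 3 }
      - else:
        then: { do: break }
```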

Basic usage (HTTP → Postgres)

- step: transfer_posts
  tool:
    - xfer:
        kind: transfer
        source:
          tool: http
          url: "{{ workload.api_url }}/posts"
          method: GET
        target:
          tool: postgres
          auth:
            source: credential
            key: pg_demo
            service: postgres
          table: public.posts
        mapping:
          post_id: id
          user_id: userId
          title: title
          body: body
  spec:
    policy:
      rules:
        - when: "{{ outcome.status == 'error' }}"
          then: { do: fail }
        - else:
          then: { do: break }

Common fields

| Field | Meaning |
| --- | --- |
| `source.tool` | `http` \| `postgres` \| `snowflake` (implementation-defined) |
| `source.url` | HTTP URL (for HTTP sources) |
| `source.query` | SQL query (for DB sources) |
| `source.auth` | Auth config for DB sources |
| `target.tool` | `postgres` \| `snowflake` (implementation-defined) |
| `target.table` | Destination table (or `target.query` for custom writes) |
| `target.auth` | Auth config for target |
| `mapping` | Target column → source field mapping |
| `chunk_size` | Rows per chunk (optional) |
| `mode` | `append` \| `overwrite` \| `upsert` (implementation-defined) |
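Putting the DB-source fields together, a Postgres → Snowflake transfer might look like the sketch below. The credential keys (`pg_demo`, `sf_demo`), the table names, and the chosen `chunk_size` and `mode` values are assumptions for illustration; check your deployment's tool reference for the values it actually accepts.

```yaml
- step: transfer_orders
  tool:
    - xfer:
        kind: transfer
        source:
          tool: postgres
          auth:
            source: credential
            key: pg_demo          # assumed credential key
            service: postgres
          query: "SELECT id, total FROM public.orders"
        target:
          tool: snowflake
          auth:
            source: credential
            key: sf_demo          # assumed credential key
            service: snowflake
          table: ANALYTICS.PUBLIC.ORDERS
        mapping:
          order_id: id            # target column → source field
          order_total: total
        chunk_size: 5000          # optional; rows per chunk
        mode: upsert              # implementation-defined
```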

See also

  • Snowflake tool: documentation/docs/reference/tools/snowflake.md
  • Postgres tool: documentation/docs/reference/tools/postgres.md