
Copy files across endpoints

connect copy [flags] source_files... destination

Copies one or more files to a destination without touching the original sources (the examples below use the cp shorthand). Sources and the destination can be local paths, SFTP URLs, S3/GCS buckets, Azure containers, or aliases.

Arguments

  • source_files... — one or more paths, globs (*.zip), or regex patterns (regex:^.*\.csv$).
  • destination — a single endpoint or alias. Append /folder/ to drop files into a subdirectory.

Flags

  • -r, --recursive — traverse subdirectories when patterns match folders.
  • --flat — with --recursive, collapse the folder structure at the destination.
  • -p, --parallel — number of concurrent workers (default 1).
  • --timeout — network timeout in seconds (default 30).
  • --key, --src-key, --dst-key — SSH private keys for SFTP endpoints (one key for both sides, or one per side).
  • --batch, --progress, --compact, --stats-interval, --no-color, --quiet — control logging and progress output.
  • --archive <dir> — move each successfully copied source into a local archive directory.
  • --failover-dst <uri>, --failover-sticky — retry against a secondary destination when the primary fails.
  • --sftp-max-conn, --sftp-idle-timeout, --sftp-pool-disable — tune the SFTP connection pool.

Credentials

  • Embed credentials directly in URIs where appropriate (sftp://user:pass@host/).
  • Prefer aliases (connect alias add) to store secrets securely and reuse them (@prod-sftp/incoming/).
  • S3 credentials can be inline (s3://access:secret:region@bucket/path) or resolved via the AWS credential chain.
  • GCS accepts service-account JSON key paths (gs:///var/keys/key.json@bucket/data/) or Application Default Credentials (ADC).
  • Azure URIs support account keys, full connection strings, or SAS URLs.

Notes

  • When multiple source arguments are provided, Connect expands each one independently before copying.
  • Recursive copies preserve directory layouts unless --flat is set.
  • Failover destinations are only attempted when the primary copy exits with an error.
  • Progress output downgrades automatically when stdout is not a TTY; use --progress to force an interactive bar.
Examples
# Copy a single file to local disk
$ connect cp report.csv /local/path/
# Copy two files to local disk
$ connect cp report.csv summary.xlsx /local/path/
# Copy all CSV files in a folder to local disk
$ connect cp /data/2024/*.csv /local/path/
# Recursive copy of a folder to local disk
$ connect cp -r /data/2024/ /local/path/
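# Speed up large copies with parallel workers and a longer timeout
# (illustrative values; -p and --timeout as documented above)
$ connect cp -r -p 4 --timeout 60 /data/2024/ /local/path/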
# Copy a single file to SFTP using inline credentials
$ connect cp report.csv sftp://user:pass@host/incoming/
# Copy a single file to SFTP using SSH key authentication
$ connect cp --key ~/.ssh/id_rsa report.csv sftp://user@host/incoming/
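# Copy between two SFTP endpoints with a separate key per side
# (hosts and key paths are placeholders)
$ connect cp --src-key ~/.ssh/src_key --dst-key ~/.ssh/dst_key sftp://user@src-host/outgoing/report.csv sftp://user@dst-host/incoming/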
# Copy files matching a regex pattern to SFTP using an alias
$ connect cp regex:^.*\.(csv|xlsx)$ @sales-sftp/uploads/
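# Tune the SFTP connection pool for many small files
# (illustrative values; --sftp-idle-timeout is assumed to take seconds)
$ connect cp -p 8 --sftp-max-conn 8 --sftp-idle-timeout 120 /spool/*.xml sftp://user@host/drop/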
# Copy all .log files from a folder to a GCS bucket using a regex pattern and an alias
$ connect cp /var/logs/regex:^.*\.log$ @gcs-logs-bucket/2024/
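# Copy a single file to GCS using a service-account JSON key path (bucket name is a placeholder)
$ connect cp metrics.csv gs:///var/keys/key.json@analytics-bucket/data/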
# Copy a single file to SFTP using alias credentials
$ connect cp report.csv @finance-sftp/incoming/
# Copy with inline S3 credentials
$ connect cp invoice.pdf s3://AKIA...:SECRET:us-east-1@archive-bucket/2024/
# Recursive copy with flattened layout
$ connect cp -r --flat /data/jobs/2024/* @analytics-staging/drop/
# Fail over to a fallback target when the primary destination fails
$ connect cp app.log sftp://ops@10.0.0.5/logs/ --failover-dst s3://backup/logs/
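# Keep routing remaining files to the fallback after the first failover
# (assumed behavior of --failover-sticky)
$ connect cp *.log sftp://ops@10.0.0.5/logs/ --failover-dst s3://backup/logs/ --failover-sticky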
# Archive sources once a transfer succeeds
$ connect cp /exports/*.gz @warehouse/raw/ --archive /var/archive
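# Run non-interactively (e.g. from cron): batch mode, no progress output
# (--batch and --quiet are assumed to be boolean flags)
$ connect cp /exports/*.gz @warehouse/raw/ --batch --quiet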