CDS Worker


A pipeline is structured in sequential stages containing one or more concurrent jobs. A job is executed by a worker.

The worker provides some useful commands that can be used in a step, such as `worker upload`, `worker download` and `worker cache`.

On Windows, these commands can be accessed with the `worker.exe [cmd]` syntax.
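For example, a script step in a pipeline job could use these commands to pass files between jobs and runs. This is a hedged sketch: the file paths, cache key, and tag value below are illustrative, not part of any real project.

```shell
# Upload a build artifact from the current job's workspace,
# grouped under a tag ({{.cds.version}} is a CDS variable)
worker upload --tag={{.cds.version}} ./dist/app.tar.gz

# In a later job of the same workflow, retrieve artifacts
# that were uploaded under that tag
worker download --tag={{.cds.version}}

# Save and restore a directory between runs under a named key
worker cache push deps-cache ./node_modules
worker cache pull deps-cache
```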

worker [flags]


      --api string                   URL of CDS API
      --basedir string               This directory (default: the TMPDIR OS environment variable) will contain the worker working directory and temporary files
      --booked-workflow-job-id int   Booked Workflow job id
      --config string                base64 encoded json configuration
      --graylog-extra-key string     Ex: --graylog-extra-key=xxxx-yyyy
      --graylog-extra-value string   Ex: --graylog-extra-value=xxxx-yyyy
      --graylog-host string          Ex: --graylog-host=xxxx-yyyy
      --graylog-port string          Ex: --graylog-port=12202
      --graylog-protocol string      Ex: --graylog-protocol=xxxx-yyyy
      --hatchery-name string         Name of the Hatchery spawning the worker
      --insecure                     (SSL) This option explicitly allows the worker to perform "insecure" SSL connections and transfers.
      --log-level string             Log Level: debug, info, notice, warning, critical (default "notice")
      --model string                 Model of worker
      --name string                  Name of worker
      --token string                 CDS Token
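In practice a worker is usually spawned by a hatchery, which fills in these flags itself. For reference, a manual invocation combining some of the flags above might look like the following; the URL, token, and paths are placeholder values.

```shell
# Start a worker attached to a CDS API (placeholder values)
worker --api=https://cds-api.example.com \
       --token=xxxx-yyyy \
       --name=my-worker \
       --basedir=/tmp/cds \
       --log-level=debug
```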