Pipelines Utility Functions

JFrog Pipelines Documentation

The Utility Functions are built-in shell functions that can be used in steps to interact with the runtime environment.

Most utility functions are available in both Bash (Linux) and PowerShell (Windows) OS runtimes.

bump_semver

Description

Increments the provided semver version with the given action.

Usage

bump_semver <semver string> <action>

  • semver string is the semver version to be incremented

  • action is the type of increment to be applied

The valid actions are:

  • major: Increment the major version. The minor and patch versions are reset to 0.

  • minor: Increment the minor version. The patch version is reset to 0.

  • patch: Increment the patch version.

  • alpha: Increment or add an alpha pre-release tag. For example, v1.1.1 becomes v1.1.1-alpha and v1.1.1-alpha becomes v1.1.1-alpha.1. Any other pre-release tags will be removed.

  • beta: Increment or add a beta pre-release tag. For example, v1.1.1 becomes v1.1.1-beta and v1.1.1-beta becomes v1.1.1-beta.1. Any other pre-release tags will be removed.

  • rc: Increment or add an rc pre-release tag. For example, v1.1.1 becomes v1.1.1-rc and v1.1.1-rc becomes v1.1.1-rc.1. Any other pre-release tags will be removed.

  • final: Remove any pre-release tags, leaving major.minor.patch.
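
For example, a minimal sketch that captures the incremented version in a variable (the version strings are illustrative, and this assumes the function writes its result to stdout):

NEXT_VERSION=$(bump_semver "1.2.3" minor)      # 1.3.0
RC_VERSION=$(bump_semver "$NEXT_VERSION" rc)   # 1.3.0-rc
echo "Next version: $NEXT_VERSION, release candidate: $RC_VERSION"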

replace_envs

Description

Replaces variables in a file with values from your current shell environment. This is useful for creating config files from templates, for example.

If the file contains placeholders that are not defined in the environment, they will become empty strings (“”). The original file is overwritten with the modified file.

Usage

replace_envs <filename1> <filename2> ... <filenameN>

where your files have placeholders in the format $ENVIRONMENT_VARIABLE_NAME or ${ENVIRONMENT_VARIABLE_NAME}.
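
For example, a minimal sketch that renders a config file from a template using the current environment (the file name, placeholder, and URL are illustrative):

echo 'server_url=$ARTIFACTORY_URL' > app.conf
export ARTIFACTORY_URL="https://mycompany.jfrog.io/artifactory"
replace_envs app.conf
cat app.conf   # server_url=https://mycompany.jfrog.io/artifactory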

retry_command

Description

Executes any command up to three times if it returns a non-zero error code. This is useful when you need to execute a command that can be flaky as a result of network hiccups, for example.

Usage

Bash

retry_command <shell command>

PowerShell

retry_command <shell command>

  • shell command is the command to be retried
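
For example, a sketch that retries a potentially flaky download (the URL is illustrative):

retry_command curl -sSfO https://releases.example.com/tools/cli.tar.gz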

get_uuid

Description

Prints a UUID to stdout. Uses /proc/sys/kernel/random/uuid if available and falls back to uuidgen if not. The function calls exit 1 if neither of these is available.

Usage

Bash

get_uuid
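
For example, a sketch that uses the generated UUID to create a uniquely named temporary workspace (the directory name is illustrative):

build_id=$(get_uuid)
mkdir -p "/tmp/workspace-${build_id}"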

save_artifact_info

Description

Saves metadata about an artifact. When saved, this metadata is used to enable signed pipelines for the artifacts.

Usage

Bash

save_artifact_info <artifact type> <file path> [--build-name <build name> --build-number <build number> --release-bundle-name <name> --release-bundle-version <version> --project-key <project key>]

PowerShell

save_artifact_info <artifact type> <file path> [-build-name <build name> -build-number <build number> -release-bundle-name <name> -release-bundle-version <version> -project-key <project key>]

  • artifact type : This is the type of artifact. Either file, buildInfo, or releaseBundle.

  • file path : This is the path to the metadata file to be saved.

Options (Bash / PowerShell):

  • --build-name / -build-name: The name of the build. Required when the artifact type is buildInfo.

  • --build-number / -build-number: The number of the build. Required when the artifact type is buildInfo.

  • --release-bundle-name / -release-bundle-name: The name of the Release Bundle. Required when the artifact type is releaseBundle.

  • --release-bundle-version / -release-bundle-version: The version of the Release Bundle. Required when the artifact type is releaseBundle.

  • --project-key / -project-key: (optional) Defaults to the environment's project_key. Can be specified to save info for a different project.
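
For example, a sketch that saves build-info metadata for signed pipelines (the file path and build name are illustrative; $run_number is assumed to be the standard Pipelines run number variable):

save_artifact_info buildInfo ./build-info.json --build-name my-app-build --build-number "$run_number"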

validate_artifact

Description

Validates the signature of an artifact. Requires signed pipelines to be enabled.

Usage

Bash

validate_artifact <artifact type> <file path> [--build-name <build name> --build-number <build number> --project-key <project key>]

PowerShell

validate_artifact <artifact type> <file path> [-build-name <build name> -build-number <build number> -project-key <project key>]

  • artifact type: This is the type of artifact. Supports buildInfo.

  • file path: This is the path to the metadata file to be validated.

Options (Bash / PowerShell):

  • --build-name / -build-name: The name of the build. Required when the artifact type is buildInfo.

  • --build-number / -build-number: The number of the build. Required when the artifact type is buildInfo.

  • --project-key / -project-key: (optional) Defaults to the environment's project_key. Can be specified to validate info for a different project.
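
For example, a sketch that validates previously saved build-info metadata (the file path, build name, and build number are illustrative):

validate_artifact buildInfo ./build-info.json --build-name my-app-build --build-number 42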

configure_jfrog_cli

Description

Configures the JFrog CLI (version 1 or 2 specified as jfrogCliVersion in the pipeline configuration) with the provided credentials, handling the different formats for different minor versions. Artifactory integrations listed in the integrations section of the step will be automatically configured, but this may be useful for resources or if the credentials are provided to the step in another way. When using v2, a non-default Xray URL may also be specified using the environment variable JFROG_XRAY_URL.

Usage

Bash

configure_jfrog_cli --artifactory-url <url> [--xray-url <url> --user USER --apikey <key> --access-token <token> --server-name <name>]

PowerShell

configure_jfrog_cli -artifactory-url <url> [-xray-url <url> -user USER -apikey <key> -access-token <token> -server-name <name>]

  • artifactory-url : Required. The Artifactory URL.

  • xray-url : Optional. The Xray URL. Only used with CLI v2.

  • user: The user. Required when an API key is provided.

  • apikey : An API key. Requires --user and may not be used with --access-token.

  • access-token : An access token. May not be used with --apikey or --user.

  • server-name : Defaults to default. Can be specified to configure the CLI with that name.
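
For example, a sketch that configures the CLI from credentials supplied as environment variables (the variable names and the server name my_rt are illustrative):

configure_jfrog_cli --artifactory-url "$MY_ARTIFACTORY_URL" --access-token "$MY_ACCESS_TOKEN" --server-name my_rt
jfrog rt ping --server-id my_rt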

check_xray_available

Description

With JFrog CLI v2 (version 2 specified as jfrogCliVersion in the pipeline configuration), checks that Xray is available for the specified CLI configuration, or the default configuration if none is specified.

Usage

Bash

check_xray_available [--server-name <name>]

PowerShell

check_xray_available [-server-name <name>]

  • server-name : Optional. Specifies a JFrog CLI configuration to check.

cleanup_jfrog_cli

Description

Removes configuration for the JFrog CLI (v1), handling the different formats for different minor versions. Artifactory integrations listed in the integrations section of the step will be automatically removed at the end of the step, but this may be useful to remove the credentials earlier or when using configure_jfrog_cli.

Usage

Bash

cleanup_jfrog_cli [--server-name <name>]

PowerShell

cleanup_jfrog_cli [-server-name <name>]

  • server-name : Defaults to default. Can be specified to remove that configuration.

set_trigger_payload

Description

Adds one or more key=value pairs as custom step or pipeline variables to a JSON payload that can be used to call the pipelines trigger API.

Multiple key=value pairs can be given in a single command or can be split across multiple commands.

If a key is added twice, the original value will be replaced.

Usage

Bash

set_trigger_payload [stepVariables|pipelineVariables] <key value pairs>

PowerShell

Not currently supported in PowerShell

  • stepVariables: These key value pairs will be present in the environment of the triggered step

  • pipelineVariables: These key value pairs will be present in the environment of every step in the run that is triggered.

get_trigger_payload

Description

Prints on stdout a payload that can be used with the pipelines trigger API.

Use set_trigger_payload to add additional values to the payload.

Usage

Bash

get_trigger_payload

PowerShell

Not currently supported in PowerShell
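
For example, a Bash sketch that builds a payload and posts it to the Pipelines trigger API (the variable values, token, and endpoint variables are illustrative placeholders):

set_trigger_payload stepVariables targetEnv=staging
set_trigger_payload pipelineVariables releaseTag=1.4.0
curl -XPOST -H "Content-Type: application/json" \
     -H "Authorization: Bearer $PIPELINES_API_TOKEN" \
     -d "$(get_trigger_payload)" \
     "$PIPELINES_TRIGGER_API_URL"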

end_step

Description

Stops execution of the onStart or onExecute section and immediately continues to the onSuccess, onFailure, or onComplete section, setting the specified status. If "success" is provided, the onSuccess and onComplete sections will be executed; for "failure," the onFailure and onComplete sections will be executed; and for "skipped," only the onComplete section will be executed. The end_step utility function may only be called in the onStart and onExecute sections, and the only statuses supported are success, failure, and skipped.

Usage

Bash

end_step [success|failure|skipped]

PowerShell

end_step [success|failure|skipped]
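
For example, a sketch that skips the rest of the step when there is nothing to deploy (the DEPLOY_TARGET variable is illustrative):

if [ -z "$DEPLOY_TARGET" ]; then
  echo "No deploy target configured, skipping"
  end_step skipped
fi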

update_run_description

Description

Provides a dynamic description for a run, which is shown in the UI. This is useful for providing more context about a run. Every step in a pipeline can include a run description.

Usage

Bash

update_run_description "description"

PowerShell

update_run_description "description"

Full YAML Example

name: UpdateRunDescription
steps:
        - name: update_1
          type: Bash
          execution:
                onStart:
                        - msg="Run description updated from step $step_id"
                        - update_run_description "$msg"
        - name: update_2
          type: Bash
          configuration:
                inputSteps:
                        - name: update_1
          execution:
                onStart:
                        - msg="Run description updated from step $step_id"
                        - update_run_description "$msg"

set_run_name

Description

Provides a unique, dynamic name for a run, which is shown in the UI. This is useful for providing more context about a run. Unlike the run description, the run name is unique for a run.

In the UI, the run name appears in the Run View, as well as in the Pipeline of Pipelines view and the Active Board view.

Usage

Bash

set_run_name "1.0.1"

PowerShell

set_run_name "1.0.1"

Full YAML Example

name: UpdateRunDescription
steps:
        - name: update_1
          type: Bash
          execution:
                onStart:
                        - set_run_name "1.0.1"
        - name: update_2
          type: Bash
          configuration:
                inputSteps:
                        - name: update_1
          execution:
                onStart:
                        - msg="Run description updated from step $step_id"
                        - update_run_description "$msg"

Source Control

Test Reports

save_tests

Description

Copies test reports given as input to later be parsed and uploaded (if file storage is available).

Usage

save_tests <file or directory>

  • file or directory specifies either a filename for the test report file, or a directory name for a directory of test report files
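
For example, a sketch that saves JUnit-style reports produced by a Maven build (the report directory is illustrative):

save_tests target/surefire-reports/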

Encryption

encrypt_string

Description

Uses the provided public key to encrypt the specified string.

Usage

Bash

encrypt_string --key <path> <source string>

PowerShell

encrypt_string -key <path> <source string>

  • key is the fully qualified path of the public key file

  • source string is the string to be encrypted

decrypt_string

Description

Uses the provided private key to decrypt the specified string.

This is typically used to decrypt information that was encrypted using encrypt_string with the corresponding public key. It helps you avoid building your own encrypt-decrypt system.

Usage

Bash

decrypt_string --key <path> <encrypted string>

PowerShell

decrypt_string -key <path> <encrypted string>

  • key is the fully qualified path of the private key file

  • encrypted string is the string to be decrypted
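
For example, a sketch that round-trips a string through the two functions (the key paths and string are illustrative; this assumes both functions write their result to stdout):

secret=$(encrypt_string --key /keys/public.pem "db-password-123")
decrypt_string --key /keys/private.pem "$secret"   # prints db-password-123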

encrypt_file

Description

Uses the provided public key to encrypt the specified file to a new file.

Usage

Bash

encrypt_file --key <path> [--output <filename>] <source filename>

PowerShell

encrypt_file -key <path> [-output <filename>] <source filename>

  • key is the fully qualified path of the public key file

  • output is the name of the resulting encrypted file. Defaults to “encrypted”

  • source filename is the file to be encrypted

decrypt_file

Description

Uses the provided private key to decrypt the specified file to a new file.

This is typically used to decrypt information that was encrypted using encrypt_file with the corresponding public key. It helps you avoid building your own encrypt-decrypt system.

Usage

Bash

decrypt_file --key <path> [--output <filename>] <source filename>

PowerShell

decrypt_file -key <path> [-output <filename>] <source filename>

  • key is the fully qualified path of the private key file

  • output is the name of the resulting decrypted file. Defaults to “decrypted”

  • source filename is the file to be decrypted
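
For example, a sketch that encrypts a settings file and later decrypts it (the key paths and file names are illustrative):

encrypt_file --key /keys/public.pem --output settings.enc settings.xml
decrypt_file --key /keys/private.pem --output settings.xml settings.enc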

Notifications

send_notification

Description

Utilizes notification integration to send custom messages at any time during the build to any recipient.

For more information, see Sending Notifications from Pipelines.

Usage

send_notification <integration> [options]

The options can be specified as part of the command, or defined as environment variables before the command is issued.

The command line arguments take priority over the environment variables.

AirBrake

Creates an AirBrake deployment through an Airbrake Integration. Not supported in PowerShell.

Options (Bash):

  • --project-id: the project ID to send the notification for

  • --environment: the environment value to use when posting the deployment

  • --username: used when posting the AirBrake deployment

  • --email: the email to be used when posting the AirBrake deployment

  • --repository: the repository to use when posting the AirBrake deployment

  • --revision: the deployment revision

  • --version: the version to use when posting the AirBrake deployment

  • --type: currently only type "deploy" is supported

  • --description: description of the deployment

  • --payload: path to a valid JSON file that contains a payload to use to POST the AirBrake deployment

Jira

Creates a Jira issue (also known as a ticket).

Options (Bash / PowerShell):

  • --project-id / -project-id: the Project Key of the project to associate the new issue with. The project key is the short string that begins all issue numbers for the project (e.g., "EXAMPLE-1234")

  • --type / -type: the issue type for the new issue (e.g., "Bug", "Task", etc.). This string must be one of the recognized Jira issue types

  • --summary / -summary: a string for the new issue's Summary field (its title)

  • --description / -description: (optional) a string for the new issue's Description field

  • --attach-file / -attach-file: (optional) a path to a file that you'd like to attach to the issue
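
For example, a sketch that opens a task through a Jira integration named myJira (the integration name, project key, and issue text are illustrative; $pipeline_name is assumed to be the standard Pipelines environment variable):

send_notification myJira --project-id EXAMPLE --type "Task" --summary "Promote build 42 to production" --description "Created automatically by pipeline $pipeline_name"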

NewRelic

Creates a NewRelic deployment through a NewRelic Integration. Not supported in PowerShell.

Options (Bash):

  • --type: the type of object to be posted. At the moment, only "deployment" is supported

  • --description: description of the deployment

  • --username: the user recording the deployment. Defaults to "JFrog Pipelines"

  • --changelog: the changelog value to use in the deployment

  • --revision: the deployment revision (required)

  • --appId: the ID of the app being deployed. If not provided, --appName must be present

  • --appName: the name of the app being deployed. If not provided, --appId must be present

  • --payload: path to a valid JSON file that contains a payload to use to POST the NewRelic deployment

PagerDuty Events

Sends an event through a PagerDuty Events Integration.

Options (Bash / PowerShell):

  • --text / -text: The main text to display in the event on PagerDuty.

Slack

Sends a message on Slack through a Slack Integration.

Options (Bash / PowerShell):

  • --payload / -payload: (optional) A path to a valid JSON file to act as the payload of the message. If a payload is provided, all other parameters are ignored. This payload is directly sent to Slack, so please view the Slack API documentation for information on how the payload should be formatted.

  • --username / -username: (optional) shows in the heading of the Slack message

  • --pretext / -pretext: (optional) a string that becomes the first part of the Slack message. Defaults to the current date/time

  • --text / -text: (optional) the main text to display in the message

  • --color / -color: (optional) a hex string that changes the color of the status bar to the left of the Slack message

  • --recipient / -recipient: (optional) the target of the message. Should start with "@" or "#" for a user or channel, respectively

  • --icon-url / -icon-url: (optional) the URL of the icon to show next to the message
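
For example, a sketch that posts a status message through a Slack integration named mySlack (the integration name, channel, and text are illustrative; $step_name is assumed to be the standard Pipelines environment variable):

send_notification mySlack --text "Step $step_name finished successfully" --recipient "#builds" --color "#00ff00"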

smtpCreds (email)

Sends an email through an SMTP Credentials Integration.

Options (Bash / PowerShell):

  • --recipients / -recipients: one or more email addresses

  • --subject / -subject: (optional) add a message to the subject. Does not replace the default subject

  • --body / -body: (optional) specify some text to add to the body of the email. Does not replace the existing body information

  • --status / -status: (optional) can be set to a valid status string. By default it will be set based on the section of the script the command is executed in

  • --attachments / -attachments: (optional) a list of files to attach to the email. The combined total of all files cannot exceed 5MB

  • --attach-logs / -attach-logs: (optional) 'true' or 'false'. Defaults to false. All available logs for the step will be attached to the email. Note that it can only attach logs that have already been created, so using this option in the onStart section, for example, would not have very detailed logs

  • --show-failing-commands / -show-failing-commands: (optional) 'true' or 'false'. Defaults to false. The existing logs for the step will be parsed. Any failed command that is detected will be added to the body of the email, along with up to 100 preceding lines (if printed from the same command)
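
For example, a sketch that emails the step logs through an SMTP Credentials integration named mySmtp (the integration name and address are illustrative):

send_notification mySmtp --recipients "devs@example.com" --subject "Nightly build" --attach-logs true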

Environment Options

All of the above options can also be included as environment variables instead of arguments. The command line argument will have priority over the environment. Here is the full list of ENVs:

  • NOTIFY_USERNAME (--username/-username)

  • NOTIFY_PASSWORD (--password/-password)

  • NOTIFY_RECIPIENT (--recipient/-recipient)

  • NOTIFY_PRETEXT (--pretext/-pretext)

  • NOTIFY_TEXT (--text/-text)

  • NOTIFY_COLOR (--color/-color)

  • NOTIFY_ICON_URL (--icon-url/-icon-url)

  • NOTIFY_PAYLOAD (--payload/-payload)

  • NOTIFY_TYPE (--type/-type)

  • NOTIFY_PROJECT_ID (--project-id/-project-id)

  • NOTIFY_ENVIRONMENT (--environment/-environment)

  • NOTIFY_REVISION (--revision/-revision)

  • NOTIFY_SUMMARY (--summary/-summary)

  • NOTIFY_ATTACH_FILE (--attach-file/-attach-file)

  • NOTIFY_REPOSITORY (--repository/-repository)

  • NOTIFY_EMAIL (--email/-email)

  • NOTIFY_STATUS (--status/-status)

  • NOTIFY_VERSION (--version/-version)

  • NOTIFY_CHANGELOG (--changelog/-changelog)

  • NOTIFY_DESCRIPTION (--description/-description)

  • NOTIFY_ATTACHMENTS (--attachments/-attachments)

  • NOTIFY_ATTACH_LOGS (--attach-logs/-attach-logs)

  • NOTIFY_SHOW_FAILING_COMMANDS (--show-failing-commands/-show-failing-commands)

  • NOTIFY_SUBJECT (--subject/-subject)

  • NOTIFY_BODY (--body/-body)
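
For example, a sketch that supplies Slack options through the environment instead of command-line arguments (the integration name mySlack is illustrative):

export NOTIFY_TEXT="Deployment to staging completed"
export NOTIFY_RECIPIENT="#builds"
send_notification mySlack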

JSON

set_payload

Description

Sets an optional JSON payload (string or file) for an OutgoingWebhook resource. When the OutgoingWebhook is specified in a step's outputResources, the payload is sent when the step is complete.

Usage

Bash

set_payload <resource> <payload> [--file]

PowerShell

set_payload <resource> <payload> [-file]

  • resource is the name of an OutgoingWebhook resource.

  • payload is a JSON string or file to attach to the resource that will be sent as part of the outgoing webhook. A file can be specified as a path relative to the current directory, absolute path, or path relative to the step workspace directory.

  • file option specifies that the payload parameter is a file. If not specified, payload will be processed as a string.
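
For example, a sketch that attaches a small JSON payload from a file to an OutgoingWebhook resource named myWebhook (the resource name and payload content are illustrative; $run_number is assumed to be the standard Pipelines run number variable):

echo "{\"runNumber\": \"$run_number\", \"status\": \"passing\"}" > payload.json
set_payload myWebhook payload.json --file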

read_json

Description

Extracts the value of a JSON property from the specified file.

This simplifies handling of a JSON file to read specific property values that are required for your workflow.

Not supported in PowerShell

In PowerShell, ConvertFrom-Json is suggested as an alternative.

Usage

read_json <path to file> <field name>

  • path to file is the fully qualified path of the JSON file

  • field name is the field for which you want to read the value. Use dot notation and [n] for arrays.
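
For example, a sketch that reads values from a package.json file (the file and field names are illustrative):

version=$(read_json ./package.json version)
first_keyword=$(read_json ./package.json "keywords[0]")
echo "Publishing version $version ($first_keyword)"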

Resources

replicate_resource

Description

This command takes an input resource and creates an exact copy. This helps you to transfer metadata from one step to the next.

Usage

Bash

replicate_resource <from_resource> <to_resource> [--options]

PowerShell

replicate_resource <from_resource> <to_resource> [-options]

  • from_resource is the name of the inputResources resource that you're copying from.

  • to_resource is the name of the outputResources resource that will receive the replicated data from the from_resource. Any pre-existing files or key-value pairs in the to_resource will be replaced.

  • match-settings option should be set when you want the replication to adhere to any branch/tag settings in the to_resource. For example, if your from_resource gitRepo can trigger on both commits and pull requests, but you only want to update your to_resource on commits, you can replicate with --match-settings, and the to_resource will only be updated when the from_resource had a commit.
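
For example, a sketch that copies the state of an input gitRepo resource to an output resource of the same type, honoring the output resource's branch settings (the resource names are illustrative):

replicate_resource myGitRepo myReplicatedRepo --match-settings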

write_output

Description

Adds data to an output resource in the form of key/value pairs that will become properties of the resource.

Usage

Bash

write_output <resource> <key value pair>... [--overwrite]

PowerShell

write_output <resource> <key value pair>... [-overwrite]

  • resource is the resource to update

  • key value pair is a single string with a key and a value, separated by an “=”. Multiple of these strings can be supplied as input. A value with spaces should be surrounded by quotes.

Options (Bash / PowerShell):

  • --overwrite / -overwrite: If supplied, all key value pairs will be replaced.
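
For example, a sketch that writes three properties to an output resource named myImage (the resource name and values are illustrative and match the environment-variable output shown below):

write_output myImage master=master sha=d6cd1e2bd19e03a81132a23b2025920577f84e37 description="hello world"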

The newly attached properties can be accessed as environment variables of the form res_{Resource Name}_{Key Name}.

For example, the properties created above can be accessed as these environment variables:

$ printenv res_myImage_master
master
$ printenv res_myImage_sha
d6cd1e2bd19e03a81132a23b2025920577f84e37
$ printenv res_myImage_description
"hello world"

Caching

Caching helps you speed up execution of your steps by preserving and restoring packages and dependencies between runs of a step. In this way, you can reduce build times by avoiding repeating the installation or loading of large dependencies.

add_cache_files

Description

Copies files given as input to later be uploaded if file storage is available.

For more information about using this utility function, see Caching Step Runtimes.

Usage

add_cache_files <file or directory> <name>

  • file or directory is a file or directory to store in the cache

  • name is a name to give the stored file or directory (without spaces)

restore_cache_files

Description

Copies stored cache (if file storage is available) to the specified location. No error will occur if nothing is available for <name> in the cache.

Use this utility function to restore files that were stored with add_cache_files in a previous run of the step.

Usage

restore_cache_files <name> <path>

  • name is the name the file or directory to be restored was given when cached.

  • path is a path at which to place the file or directory.
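
For example, a sketch that caches a dependency directory in one run and restores it in a later run of the same step (the directory and cache name are illustrative):

# after installing dependencies
add_cache_files node_modules npm_deps

# in a later run, before installing dependencies
restore_cache_files npm_deps node_modules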

Run State Management

add_run_variables

Description

Allows you to add environment variables that will be available in the following steps of the run.

If the following variables are set, they will be used:

  • JFROG_CLI_BUILD_NAME: If set, the pipeline uses this value instead of the default pipeline name for the build info collected.

  • JFROG_CLI_BUILD_NUMBER: If set, the pipeline uses this value instead of the default run number for the build info collected.

  • USE_LOCAL_JFROG_CLI: If set as true, the local JFrog CLI on the host or in the image (depending on runtime configuration) is used instead of the version packaged with JFrog Pipelines. This is not recommended and native steps may not be able to run with the local JFrog CLI version.

Usage

add_run_variables <key value pair>...

  • key value pair is a single string with a key and a value, separated by an “=”. Multiple of these strings can be supplied as input. Each value will be exported as an environment variable at the time this command is used and automatically in any later steps within the run.
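
For example, a sketch that shares a computed version and target environment with later steps in the run (the variable names are illustrative; $run_number is assumed to be the standard Pipelines run number variable):

add_run_variables appVersion=1.4.$run_number deployEnv=staging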

export_run_variables

Description

Sources the file containing the run variables. This will be done automatically, but may also be used to “reset” the environment variables in the current step.

Usage

export_run_variables

add_run_files

Description

Copies files given as input into the run state for use in later steps in the run, if file storage is available.

Use this utility function when you want to make use of files generated in one step in a later step of the same run, which runs only after the first step finishes.

Usage

add_run_files <file or directory> <name>

  • file or directory is a file or directory to store in the run state

  • name is a name to give the stored file or directory (without spaces). This cannot be run.env.

restore_run_files

Description

Copies files stored in the run state (if file storage is available) to the specified location. No error will occur if nothing is available for <name> in the run state.

Use this utility function when you want to make use of files generated in one step in a later step of the same run, which runs only after the first step finishes.

Usage

restore_run_files <name> <path>

  • name is the name the files to be restored were given when added to the run state.

  • path is a path at which to place the file or files.
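
For example, a sketch where one step stores a built archive in the run state and a later step in the same run restores it (the file path and state name are illustrative):

# in the first step
add_run_files ./dist/app.tar.gz app_bundle

# in a later step of the same run
restore_run_files app_bundle ./dist/app.tar.gz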

Affinity Group State Management

add_affinity_group_files

Description

Copies files or directories given as input into the affinity group workspace for use in later steps in the affinity group. Files or directories may be specified using wildcards (*) or as multiple input parameters preceding the name under which the files will be stored.

Usage

add_affinity_group_files <file or directory> <name>

  • file or directory is a file or directory to store in the affinity group workspace.

  • name is a name to give the stored file or directory (without spaces).

restore_affinity_group_files

Description

Copies files stored in the affinity group workspace by add_affinity_group_files to the specified location. The restore behavior differs from the other state restore functions to better handle wildcard patterns with a variable number of matching files or directories. With restore_affinity_group_files, the relative path specified in the add_affinity_group_files call is preserved (excluding any traversal to parent directories), and the path specified should be the base for that relative path. No error will occur if no files were saved for <name>.

Usage

restore_affinity_group_files <name> <path>

  • name is the name the files to be restored were given when storing the files with add_affinity_group_files.

  • path is a path at which to place the file or files.
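
For example, a sketch that saves test results by wildcard in one step and restores them in a later step of the same affinity group (the paths and name are illustrative; $step_tmp_dir is assumed to be the standard Pipelines temporary directory variable). The relative reports/ path is preserved under the target directory:

# in the first step
add_affinity_group_files reports/*.xml test_reports

# in a later step of the same affinity group
restore_affinity_group_files test_reports "$step_tmp_dir/restored"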

Pipeline State Management

add_pipeline_variables

Description

Allows you to add environment variables that will be available in the following steps of the run and in future runs. These variables may be overridden by another variable with the same key added to the current run.

If the following variables are set, they will be used:

  • JFROG_CLI_BUILD_NAME: If set, the pipeline uses this value instead of the default pipeline name for the build info collected.

  • JFROG_CLI_BUILD_NUMBER: If set, the pipeline uses this value instead of the default run number for the build info collected.

  • USE_LOCAL_JFROG_CLI: If set as true, the local JFrog CLI on the host or in the image (depending on runtime configuration) is used instead of the version packaged with JFrog Pipelines. This is not recommended and native steps may not be able to run with the local JFrog CLI version.

Usage

add_pipeline_variables <key value pair>...

  • key value pair is a single string with a key and a value, separated by an “=”. Multiple of these strings can be supplied as input. Each value will be exported as an environment variable at the time this command is used and automatically in any steps that start after this run is complete.

export_pipeline_variables

Description

Sources the file containing the pipeline variables. This will be done automatically, but may also be used to “reset” the environment variables in the current step.

Usage

export_pipeline_variables

add_pipeline_files

Description

Copies files given as input into the pipeline state for use in later steps in the run and future runs, if file storage is available.

Use this utility function when you want to make use of files generated in one step in later steps of the run or in future runs.

Usage

add_pipeline_files <file or directory> <name>

  • file or directory is a file or directory to store in the pipeline state.

  • name is a name to give the stored file or directory (without spaces). This cannot be pipeline.env.

restore_pipeline_files

Description

Copies files stored in the pipeline state (if file storage is available) to the specified location. No error will occur if nothing is available for <name> in the pipeline state.

Use this utility function when you want to make use of files generated in one step in later steps of the run or in future runs.

Usage

restore_pipeline_files <name> <path>

  • name is the name the file to be restored was given when added to the pipeline state.

  • path is a path at which to place the file or files.

Step Properties

find_resource_variable

Description

Retrieves the value of the named property of a resource.

Usage

find_resource_variable <resourceName> <propertyName>

  • resourceName is the name of the resource.

  • propertyName is the name of the resource property whose value to retrieve.
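
For example, a sketch that reads back a property written earlier with write_output (the resource name myImage and property sha are illustrative, matching the write_output example earlier):

image_sha=$(find_resource_variable myImage sha)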

get_integration_name

Description

Retrieves the name of the first integration found of the type specified. Available to extension steps to get the name of the first input integration of a particular type.

Usage

get_integration_name --type <integration type>

get_resource_name

Description

Retrieves the name of the first resource found of the type specified in inputResources or outputResources. Available to extension steps to get the name of the first input or output resource of a particular type.

Usage

get_resource_name --type <resource type> --operation <IN | OUT> --syntax-version <semver>

  • resource type is the name of a Pipelines Resource type

  • IN | OUT selects whether the resource is named in inputResources or outputResources

  • semver is the semantic version number of the resource's syntax version

get_resource_names

Description

Retrieves an array of names of resources of the type specified in inputResources or outputResources. Available to extension steps to get the names of input or output resources of a particular type.

Usage

get_resource_names --type <resource type> --operation <IN | OUT> --syntax-version <semver>

  • resource type is the name of a Pipelines Resource type

  • IN | OUT selects whether the resource is named in inputResources or outputResources

  • semver is the semantic version number of the resource's syntax version

  • In PowerShell, a native PowerShell array is returned. In Bash, a JSON array is returned that can be handled with jq.
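
For example, a Bash sketch that iterates over all input GitRepo resources using jq (the resource type and syntax version are illustrative):

for name in $(get_resource_names --type GitRepo --operation IN --syntax-version 1.0.0 | jq -r '.[]'); do
  echo "Input GitRepo resource: $name"
done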

get_affinity_group_step_names

Description

Retrieves a JSON array of names of steps of the type specified in the current affinity group.

Usage

get_affinity_group_step_names [--type <step type>] [--syntax-version <semver>] [--namespace <namespace>]

  • type is used to specify the type of the steps to be found. If not specified, steps of all types will be returned.

  • syntax-version is used to specify the syntax version of the steps to be found. If not specified, steps of all syntax versions will be returned.

  • namespace is used with extension steps to specify the namespace of the steps to be found.

find_step_configuration_value

Description

Retrieves the value of the configuration property for the currently executing step. If the property is a collection, the first value will be returned. Available to extension steps to get the value of a configuration.

Usage

find_step_configuration_value <propertyName>

  • propertyName is the name of the step's configuration property whose value to retrieve
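
For example, a sketch that reads a configuration property of the currently executing step (the property name timeoutSeconds is illustrative):

step_timeout=$(find_step_configuration_value timeoutSeconds)
echo "This step times out after $step_timeout seconds"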