Gets the definition of the specified pipeline. You can call GetPipelineDefinition to retrieve the pipeline definition that you provided using PutPipelineDefinition.
See also: AWS API Documentation
See ‘aws help’ for descriptions of global parameters.
get-pipeline-definition
--pipeline-id <value>
[--pipeline-version <value>]
[--cli-input-json | --cli-input-yaml]
[--generate-cli-skeleton <value>]
[--cli-auto-prompt <value>]
--pipeline-id
(string)
The ID of the pipeline.
--pipeline-version
(string)
The version of the pipeline definition to retrieve. Set this parameter to latest (default) to use the last definition saved to the pipeline, or active to use the last definition that was activated.
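For example, to retrieve the definition that was most recently activated rather than the most recently saved one (reusing the example pipeline ID from the example below):

aws datapipeline get-pipeline-definition --pipeline-id df-00627471SOVYZEXAMPLE --pipeline-version active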
--cli-input-json | --cli-input-yaml
(string)
Reads arguments from the JSON string provided. The JSON string follows the format provided by --generate-cli-skeleton. If other arguments are provided on the command line, those values will override the JSON-provided values. It is not possible to pass arbitrary binary values using a JSON-provided value as the string will be taken literally. This may not be specified along with --cli-input-yaml.
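For example, the same call can be made by passing the arguments as a JSON string instead of individual command-line options. The key name below follows the skeleton produced by --generate-cli-skeleton for this command:

aws datapipeline get-pipeline-definition --cli-input-json '{"pipelineId": "df-00627471SOVYZEXAMPLE"}'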
--generate-cli-skeleton
(string)
Prints a JSON skeleton to standard output without sending an API request. If provided with no value or the value input, prints a sample input JSON that can be used as an argument for --cli-input-json. Similarly, if provided yaml-input it will print a sample input YAML that can be used with --cli-input-yaml. If provided with the value output, it validates the command inputs and returns a sample output JSON for that command.
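For example, to print an input skeleton that can be filled in and passed back with --cli-input-json:

aws datapipeline get-pipeline-definition --generate-cli-skeleton

The skeleton should look similar to the following (the field names are inferred from the GetPipelineDefinition API request shape):

{
    "pipelineId": "",
    "version": ""
}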
--cli-auto-prompt
(boolean)
Automatically prompt for CLI input parameters.
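For example, running the command with this flag prompts interactively for the required --pipeline-id (and any optional parameters) instead of failing when they are omitted:

aws datapipeline get-pipeline-definition --cli-auto-prompt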
See ‘aws help’ for descriptions of global parameters.
To get a pipeline definition
This example gets the pipeline definition for the specified pipeline:
aws datapipeline get-pipeline-definition --pipeline-id df-00627471SOVYZEXAMPLE
The following is example output:
{
    "parameters": [
        {
            "type": "AWS::S3::ObjectKey",
            "id": "myS3OutputLoc",
            "description": "S3 output folder"
        },
        {
            "default": "s3://us-east-1.elasticmapreduce.samples/pig-apache-logs/data",
            "type": "AWS::S3::ObjectKey",
            "id": "myS3InputLoc",
            "description": "S3 input folder"
        },
        {
            "default": "grep -rc \"GET\" ${INPUT1_STAGING_DIR}/* > ${OUTPUT1_STAGING_DIR}/output.txt",
            "type": "String",
            "id": "myShellCmd",
            "description": "Shell command to run"
        }
    ],
    "objects": [
        {
            "type": "Ec2Resource",
            "terminateAfter": "20 Minutes",
            "instanceType": "t1.micro",
            "id": "EC2ResourceObj",
            "name": "EC2ResourceObj"
        },
        {
            "name": "Default",
            "failureAndRerunMode": "CASCADE",
            "resourceRole": "DataPipelineDefaultResourceRole",
            "schedule": {
                "ref": "DefaultSchedule"
            },
            "role": "DataPipelineDefaultRole",
            "scheduleType": "cron",
            "id": "Default"
        },
        {
            "directoryPath": "#{myS3OutputLoc}/#{format(@scheduledStartTime, 'YYYY-MM-dd-HH-mm-ss')}",
            "type": "S3DataNode",
            "id": "S3OutputLocation",
            "name": "S3OutputLocation"
        },
        {
            "directoryPath": "#{myS3InputLoc}",
            "type": "S3DataNode",
            "id": "S3InputLocation",
            "name": "S3InputLocation"
        },
        {
            "startAt": "FIRST_ACTIVATION_DATE_TIME",
            "name": "Every 15 minutes",
            "period": "15 minutes",
            "occurrences": "4",
            "type": "Schedule",
            "id": "DefaultSchedule"
        },
        {
            "name": "ShellCommandActivityObj",
            "command": "#{myShellCmd}",
            "output": {
                "ref": "S3OutputLocation"
            },
            "input": {
                "ref": "S3InputLocation"
            },
            "stage": "true",
            "type": "ShellCommandActivity",
            "id": "ShellCommandActivityObj",
            "runsOn": {
                "ref": "EC2ResourceObj"
            }
        }
    ],
    "values": {
        "myS3OutputLoc": "s3://my-s3-bucket/",
        "myS3InputLoc": "s3://us-east-1.elasticmapreduce.samples/pig-apache-logs/data",
        "myShellCmd": "grep -rc \"GET\" ${INPUT1_STAGING_DIR}/* > ${OUTPUT1_STAGING_DIR}/output.txt"
    }
}
The output of this command is the pipeline definition, which is documented in Pipeline Definition File Syntax in the AWS Data Pipeline Developer Guide.
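Because the retrieved definition uses the same parameters, objects, and values structure that put-pipeline-definition accepts, a common pattern is to save it to a file, edit it, and upload it again. A sketch, assuming the hypothetical file name my-pipeline.json:

aws datapipeline get-pipeline-definition --pipeline-id df-00627471SOVYZEXAMPLE > my-pipeline.json
aws datapipeline put-pipeline-definition --pipeline-id df-00627471SOVYZEXAMPLE --pipeline-definition file://my-pipeline.json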