Gets the properties of a batch inference job including name, Amazon Resource Name (ARN), status, input and output configurations, and the ARN of the solution version used to generate the recommendations.
See also: AWS API Documentation
See ‘aws help’ for descriptions of global parameters.
Synopsis

describe-batch-inference-job
--batch-inference-job-arn <value>
[--cli-input-json | --cli-input-yaml]
[--generate-cli-skeleton <value>]
Options

--batch-inference-job-arn (string)

The ARN of the batch inference job to describe.
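For example, a minimal invocation might look like the following (the Region, account ID, and job name in the ARN are placeholders):

aws personalize describe-batch-inference-job \
    --batch-inference-job-arn arn:aws:personalize:us-west-2:123456789012:batch-inference-job/MyBatchJob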
--cli-input-json | --cli-input-yaml (string)

Reads arguments from the JSON string provided. The JSON string follows the format provided by --generate-cli-skeleton. If other arguments are provided on the command line, those values will override the JSON-provided values. It is not possible to pass arbitrary binary values using a JSON-provided value as the string will be taken literally. This may not be specified along with --cli-input-yaml.
--generate-cli-skeleton (string)

Prints a JSON skeleton to standard output without sending an API request. If provided with no value or the value input, prints a sample input JSON that can be used as an argument for --cli-input-json. Similarly, if provided yaml-input it will print a sample input YAML that can be used with --cli-input-yaml. If provided with the value output, it validates the command inputs and returns a sample output JSON for that command.
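A sketch of the skeleton workflow, assuming a scratch file named job.json:

aws personalize describe-batch-inference-job --generate-cli-skeleton input > job.json
# Edit job.json to fill in batchInferenceJobArn, then replay it:
aws personalize describe-batch-inference-job --cli-input-json file://job.json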
See ‘aws help’ for descriptions of global parameters.
Output

batchInferenceJob -> (structure)

Information on the specified batch inference job.
jobName -> (string)
The name of the batch inference job.
batchInferenceJobArn -> (string)
The Amazon Resource Name (ARN) of the batch inference job.
filterArn -> (string)
The ARN of the filter used on the batch inference job.
failureReason -> (string)
If the batch inference job failed, the reason for the failure.
solutionVersionArn -> (string)
The Amazon Resource Name (ARN) of the solution version from which the batch inference job was created.
numResults -> (integer)
The number of recommendations generated by the batch inference job. This number includes the error messages generated for failed input records.
jobInput -> (structure)
The Amazon S3 path that leads to the input data used to generate the batch inference job.
s3DataSource -> (structure)
The URI of the Amazon S3 location that contains your input data. The Amazon S3 bucket must be in the same region as the API endpoint you are calling.
path -> (string)
The file path of the Amazon S3 bucket.
kmsKeyArn -> (string)
The Amazon Resource Name (ARN) of the AWS Key Management Service (AWS KMS) key that Amazon Personalize uses to encrypt or decrypt the input and output files of a batch inference job.
jobOutput -> (structure)
The Amazon S3 bucket that contains the output data generated by the batch inference job.
s3DataDestination -> (structure)
Information on the Amazon S3 bucket in which the batch inference job’s output is stored.
path -> (string)
The file path of the Amazon S3 bucket.
kmsKeyArn -> (string)
The Amazon Resource Name (ARN) of the AWS Key Management Service (AWS KMS) key that Amazon Personalize uses to encrypt or decrypt the input and output files of a batch inference job.
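Once the job's status is ACTIVE, one way to retrieve the results is to read this path and copy the output locally. A sketch, assuming the path is a readable S3 prefix and JOB_ARN holds the job's ARN:

OUTPUT_PATH=$(aws personalize describe-batch-inference-job \
    --batch-inference-job-arn "$JOB_ARN" \
    --query 'batchInferenceJob.jobOutput.s3DataDestination.path' \
    --output text)
aws s3 cp "$OUTPUT_PATH" ./results/ --recursive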
batchInferenceJobConfig -> (structure)
A string to string map of the configuration details of a batch inference job.
itemExplorationConfig -> (map)

A string to string map specifying the exploration configuration hyperparameters, including explorationWeight and explorationItemAgeCutOff, that you want to use to configure the amount of item exploration Amazon Personalize uses when recommending items. See the User-Personalization recipe.

key -> (string)

value -> (string)
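To inspect just this map, a query such as the following could be used (JOB_ARN is a placeholder; both hyperparameters come back as string values, for example "0.3"):

aws personalize describe-batch-inference-job \
    --batch-inference-job-arn "$JOB_ARN" \
    --query 'batchInferenceJob.batchInferenceJobConfig.itemExplorationConfig'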
roleArn -> (string)
The ARN of the AWS Identity and Access Management (IAM) role that requested the batch inference job.
status -> (string)
The status of the batch inference job. The status is one of the following values:
PENDING
IN PROGRESS
ACTIVE
CREATE FAILED
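A minimal polling sketch that waits for a terminal status (JOB_ARN and the 60-second interval are arbitrary choices):

while true; do
    status=$(aws personalize describe-batch-inference-job \
        --batch-inference-job-arn "$JOB_ARN" \
        --query 'batchInferenceJob.status' \
        --output text)
    echo "Status: $status"
    case "$status" in
        ACTIVE|"CREATE FAILED") break ;;
    esac
    sleep 60
done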
creationDateTime -> (timestamp)
The time at which the batch inference job was created.
lastUpdatedDateTime -> (timestamp)
The time at which the batch inference job was last updated.