class Aws::Personalize::Types::CreateBatchInferenceJobRequest

@note When making an API call, you may pass CreateBatchInferenceJobRequest
  data as a hash:

    {
      job_name: "Name", # required
      solution_version_arn: "Arn", # required
      filter_arn: "Arn",
      num_results: 1,
      job_input: { # required
        s3_data_source: { # required
          path: "S3Location", # required
          kms_key_arn: "KmsKeyArn",
        },
      },
      job_output: { # required
        s3_data_destination: { # required
          path: "S3Location", # required
          kms_key_arn: "KmsKeyArn",
        },
      },
      role_arn: "RoleArn", # required
      batch_inference_job_config: {
        item_exploration_config: {
          "ParameterName" => "ParameterValue",
        },
      },
    }
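The hash above can be built in plain Ruby and passed to `Aws::Personalize::Client#create_batch_inference_job`. A minimal sketch, assuming the `aws-sdk-personalize` gem; the ARNs, bucket names, and paths below are placeholder values, not real resources:

```ruby
# Placeholder ARNs and S3 paths -- substitute your own resources.
params = {
  job_name: "my-batch-job",
  solution_version_arn: "arn:aws:personalize:us-east-1:123456789012:solution/my-solution/abc123",
  num_results: 25,
  job_input: {
    s3_data_source: { path: "s3://my-bucket/input/users.json" },
  },
  job_output: {
    s3_data_destination: { path: "s3://my-bucket/output/" },
  },
  role_arn: "arn:aws:iam::123456789012:role/PersonalizeS3Role",
}

# Quick sanity check that the required top-level keys are present
# before making the API call.
required = %i[job_name solution_version_arn job_input job_output role_arn]
missing  = required.reject { |k| params.key?(k) }
raise ArgumentError, "missing: #{missing.join(', ')}" unless missing.empty?

# With credentials configured, the call itself would be:
#   client = Aws::Personalize::Client.new
#   resp = client.create_batch_inference_job(params)
#   resp.batch_inference_job_arn
```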

@!attribute [rw] job_name

The name of the batch inference job to create.
@return [String]

@!attribute [rw] solution_version_arn

The Amazon Resource Name (ARN) of the solution version that will be
used to generate the batch inference recommendations.
@return [String]

@!attribute [rw] filter_arn

The ARN of the filter to apply to the batch inference job. For more
information on using filters, see [Filtering Batch
Recommendations][1].

[1]: https://docs.aws.amazon.com/personalize/latest/dg/filter-batch.html
@return [String]

@!attribute [rw] num_results

The number of recommendations to retrieve.
@return [Integer]

@!attribute [rw] job_input

The Amazon S3 path that leads to the input file to base your
recommendations on. The input material must be in JSON format.
@return [Types::BatchInferenceJobInput]
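The input file uses JSON Lines: one JSON object per line, e.g. `{"userId": "1"}` for a user-personalization solution. A sketch of generating such a file in Ruby (the file name and user IDs are illustrative):

```ruby
require "json"

# Illustrative user IDs -- in practice these come from your own data.
user_ids = %w[1 2 3]

# One JSON object per line, as Amazon Personalize batch input expects.
lines = user_ids.map { |id| JSON.generate("userId" => id) }
File.write("users.json", lines.join("\n"))
```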

@!attribute [rw] job_output

The path to the Amazon S3 bucket where the job's output will be
stored.
@return [Types::BatchInferenceJobOutput]

@!attribute [rw] role_arn

The ARN of the Amazon Identity and Access Management role that has
permissions to read and write to your input and output Amazon S3
buckets respectively.
@return [String]

@!attribute [rw] batch_inference_job_config

The configuration details of a batch inference job.
@return [Types::BatchInferenceJobConfig]
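For solutions trained with the User-Personalization recipe, `item_exploration_config` is a string-to-string map accepting parameters such as `explorationWeight` and `explorationItemAgeCutOff`. A sketch with illustrative values:

```ruby
# Illustrative values: explorationWeight must be between 0.0 and 1.0,
# explorationItemAgeCutOff is a maximum item age in days.
batch_inference_job_config = {
  item_exploration_config: {
    "explorationWeight"        => "0.3",
    "explorationItemAgeCutOff" => "30",
  },
}
```

Both values are passed as strings, matching the `"ParameterName" => "ParameterValue"` shape shown in the request hash above.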

@see https://docs.aws.amazon.com/goto/WebAPI/personalize-2018-05-22/CreateBatchInferenceJobRequest AWS API Documentation

Constants

SENSITIVE