NAME
    Paws::MachineLearning - Perl Interface to AWS Amazon Machine Learning

SYNOPSIS
      use Paws;

      my $obj = Paws->service('MachineLearning')->new;
      my $res = $obj->Method(
        Arg1 => $val1,
        Arg2 => [ 'V1', 'V2' ],
        # if Arg3 is an object, the HashRef will be used as arguments to the constructor
        # of the arguments type
        Arg3 => { Att1 => 'Val1' },
        # if Arg4 is an array of objects, the HashRefs will be passed as arguments to
        # the constructor of the arguments type
        Arg4 => [ { Att1 => 'Val1' }, { Att1 => 'Val2' } ],
      );

DESCRIPTION
    Definition of the public APIs exposed by Amazon Machine Learning.

METHODS
  CreateBatchPrediction(BatchPredictionDataSourceId => Str, BatchPredictionId => Str, MLModelId => Str, OutputUri => Str, [BatchPredictionName => Str])
    Each argument is described in detail in: Paws::MachineLearning::CreateBatchPrediction

    Returns: a Paws::MachineLearning::CreateBatchPredictionOutput instance

    Generates predictions for a group of observations. The observations to process exist in one or more data files referenced by a "DataSource". This operation creates a new "BatchPrediction", and uses an "MLModel" and the data files referenced by the "DataSource" as information sources.

    "CreateBatchPrediction" is an asynchronous operation. In response to "CreateBatchPrediction", Amazon Machine Learning (Amazon ML) immediately returns and sets the "BatchPrediction" status to "PENDING". After the "BatchPrediction" completes, Amazon ML sets the status to "COMPLETED". You can poll for status updates by using the GetBatchPrediction operation and checking the "Status" parameter of the result. After the "COMPLETED" status appears, the results are available in the location specified by the "OutputUri" parameter.

  CreateDataSourceFromRDS(DataSourceId => Str, RDSData => Paws::MachineLearning::RDSDataSpec, RoleARN => Str, [ComputeStatistics => Bool, DataSourceName => Str])
    Each argument is described in detail in: Paws::MachineLearning::CreateDataSourceFromRDS

    Returns: a Paws::MachineLearning::CreateDataSourceFromRDSOutput instance

    Creates a "DataSource" object from an Amazon Relational Database Service (Amazon RDS) database. A "DataSource" references data that can be used to perform CreateMLModel, CreateEvaluation, or CreateBatchPrediction operations.

    "CreateDataSourceFromRDS" is an asynchronous operation. In response to "CreateDataSourceFromRDS", Amazon Machine Learning (Amazon ML) immediately returns and sets the "DataSource" status to "PENDING". After the "DataSource" is created and ready for use, Amazon ML sets the "Status" parameter to "COMPLETED". A "DataSource" in "COMPLETED" or "PENDING" status can only be used to perform CreateMLModel, CreateEvaluation, or CreateBatchPrediction operations. If Amazon ML cannot accept the input source, it sets the "Status" parameter to "FAILED" and includes an error message in the "Message" attribute of the GetDataSource operation response.
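    Both operations above follow the same asynchronous pattern: submit the request, then poll the corresponding Get* operation until the status reaches "COMPLETED" (or "FAILED"). The following is a minimal sketch of that pattern for a batch prediction; the IDs, the S3 output location, the region, and the polling interval are illustrative placeholders rather than values defined by this module.

      use Paws;

      my $ml = Paws->service('MachineLearning', region => 'us-east-1');

      # Submit the request; Amazon ML returns immediately with the
      # BatchPrediction in PENDING status.
      $ml->CreateBatchPrediction(
        BatchPredictionId           => 'bp-example-01',    # hypothetical ID
        BatchPredictionDataSourceId => 'ds-example-01',    # hypothetical ID
        MLModelId                   => 'ml-example-01',    # hypothetical ID
        OutputUri                   => 's3://example-bucket/batch-output/',
      );

      # Poll GetBatchPrediction and check the Status attribute of the result.
      my $status;
      do {
        sleep 30;
        $status = $ml->GetBatchPrediction(
          BatchPredictionId => 'bp-example-01',
        )->Status;
      } until $status eq 'COMPLETED' || $status eq 'FAILED';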
  CreateDataSourceFromRedshift(DataSourceId => Str, DataSpec => Paws::MachineLearning::RedshiftDataSpec, RoleARN => Str, [ComputeStatistics => Bool, DataSourceName => Str])
    Each argument is described in detail in: Paws::MachineLearning::CreateDataSourceFromRedshift

    Returns: a Paws::MachineLearning::CreateDataSourceFromRedshiftOutput instance

    Creates a "DataSource" from Amazon Redshift. A "DataSource" references data that can be used to perform either CreateMLModel, CreateEvaluation, or CreateBatchPrediction operations.

    "CreateDataSourceFromRedshift" is an asynchronous operation. In response to "CreateDataSourceFromRedshift", Amazon Machine Learning (Amazon ML) immediately returns and sets the "DataSource" status to "PENDING". After the "DataSource" is created and ready for use, Amazon ML sets the "Status" parameter to "COMPLETED". A "DataSource" in "COMPLETED" or "PENDING" status can only be used to perform CreateMLModel, CreateEvaluation, or CreateBatchPrediction operations. If Amazon ML cannot accept the input source, it sets the "Status" parameter to "FAILED" and includes an error message in the "Message" attribute of the GetDataSource operation response.

    The observations should exist in the database hosted on an Amazon Redshift cluster and should be specified by a "SelectSqlQuery". Amazon ML executes an Unload command in Amazon Redshift to transfer the result set of "SelectSqlQuery" to "S3StagingLocation".

    After the "DataSource" is created, it's ready for use in evaluations and batch predictions. If you plan to use the "DataSource" to train an "MLModel", the "DataSource" requires another item: a recipe. A recipe describes the observation variables that participate in training an "MLModel", and how each input variable will be used in training. Will the variable be included or excluded from training? Will the variable be manipulated, for example, combined with another variable or split apart into word combinations? The recipe provides answers to these questions. For more information, see the Amazon Machine Learning Developer Guide.

  CreateDataSourceFromS3(DataSourceId => Str, DataSpec => Paws::MachineLearning::S3DataSpec, [ComputeStatistics => Bool, DataSourceName => Str])
    Each argument is described in detail in: Paws::MachineLearning::CreateDataSourceFromS3

    Returns: a Paws::MachineLearning::CreateDataSourceFromS3Output instance

    Creates a "DataSource" object. A "DataSource" references data that can be used to perform CreateMLModel, CreateEvaluation, or CreateBatchPrediction operations.

    "CreateDataSourceFromS3" is an asynchronous operation. In response to "CreateDataSourceFromS3", Amazon Machine Learning (Amazon ML) immediately returns and sets the "DataSource" status to "PENDING". After the "DataSource" is created and ready for use, Amazon ML sets the "Status" parameter to "COMPLETED". A "DataSource" in "COMPLETED" or "PENDING" status can only be used to perform CreateMLModel, CreateEvaluation, or CreateBatchPrediction operations. If Amazon ML cannot accept the input source, it sets the "Status" parameter to "FAILED" and includes an error message in the "Message" attribute of the GetDataSource operation response.

    The observation data used in a "DataSource" should be ready to use; that is, it should have a consistent structure, and missing data values should be kept to a minimum. The observation data must reside in one or more CSV files in an Amazon Simple Storage Service (Amazon S3) bucket, along with a schema that describes the data items by name and type. The same schema must be used for all of the data files referenced by the "DataSource".

    After the "DataSource" has been created, it's ready to use in evaluations and batch predictions. If you plan to use the "DataSource" to train an "MLModel", the "DataSource" requires another item: a recipe. A recipe describes the observation variables that participate in training an "MLModel", and how each input variable will be used in training. Will the variable be included or excluded from training? Will the variable be manipulated, for example, combined with another variable, or split apart into word combinations? The recipe provides answers to these questions. For more information, see the Amazon Machine Learning Developer Guide.
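    As a hedged sketch of the S3 case, the call below creates a "DataSource" from a CSV file and its schema; the bucket, file names, and IDs are placeholders, and the HashRef given as "DataSpec" is passed to the Paws::MachineLearning::S3DataSpec constructor as described in the SYNOPSIS. "ComputeStatistics" is enabled because a "DataSource" used to train an "MLModel" must have computed statistics.

      my $res = $ml->CreateDataSourceFromS3(
        DataSourceId      => 'ds-s3-example-01',     # hypothetical ID
        DataSourceName    => 'example observations',
        ComputeStatistics => 1,                      # required later by CreateMLModel
        DataSpec          => {
          DataLocationS3       => 's3://example-bucket/observations.csv',
          DataSchemaLocationS3 => 's3://example-bucket/observations.csv.schema',
        },
      );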
  CreateEvaluation(EvaluationDataSourceId => Str, EvaluationId => Str, MLModelId => Str, [EvaluationName => Str])
    Each argument is described in detail in: Paws::MachineLearning::CreateEvaluation

    Returns: a Paws::MachineLearning::CreateEvaluationOutput instance

    Creates a new "Evaluation" of an "MLModel". An "MLModel" is evaluated on a set of observations associated with a "DataSource". Like a "DataSource" for an "MLModel", the "DataSource" for an "Evaluation" contains values for the Target Variable. The "Evaluation" compares the predicted result for each observation to the actual outcome and provides a summary so that you know how effectively the "MLModel" performs on the test data. The evaluation generates a relevant performance metric, such as BinaryAUC, RegressionRMSE, or MulticlassAvgFScore, based on the corresponding "MLModelType": "BINARY", "REGRESSION", or "MULTICLASS".

    "CreateEvaluation" is an asynchronous operation. In response to "CreateEvaluation", Amazon Machine Learning (Amazon ML) immediately returns and sets the evaluation status to "PENDING". After the "Evaluation" is created and ready for use, Amazon ML sets the status to "COMPLETED". You can use the GetEvaluation operation to check the progress of the evaluation during the creation operation.

  CreateMLModel(MLModelId => Str, MLModelType => Str, TrainingDataSourceId => Str, [MLModelName => Str, Parameters => Paws::MachineLearning::TrainingParameters, Recipe => Str, RecipeUri => Str])
    Each argument is described in detail in: Paws::MachineLearning::CreateMLModel

    Returns: a Paws::MachineLearning::CreateMLModelOutput instance

    Creates a new "MLModel" using the data files and the recipe as information sources.

    An "MLModel" is nearly immutable. Users can only update the "MLModelName" and the "ScoreThreshold" of an "MLModel" without creating a new "MLModel".

    "CreateMLModel" is an asynchronous operation. In response to "CreateMLModel", Amazon Machine Learning (Amazon ML) immediately returns and sets the "MLModel" status to "PENDING". After the "MLModel" is created and ready for use, Amazon ML sets the status to "COMPLETED". You can use the GetMLModel operation to check the progress of the "MLModel" during the creation operation.

    CreateMLModel requires a "DataSource" with computed statistics, which can be created by setting "ComputeStatistics" to "true" in the CreateDataSourceFromRDS, CreateDataSourceFromS3, or CreateDataSourceFromRedshift operations.

  CreateRealtimeEndpoint(MLModelId => Str)
    Each argument is described in detail in: Paws::MachineLearning::CreateRealtimeEndpoint

    Returns: a Paws::MachineLearning::CreateRealtimeEndpointOutput instance

    Creates a real-time endpoint for the "MLModel". The endpoint contains the URI of the "MLModel"; that is, the location to send real-time prediction requests for the specified "MLModel".

  DeleteBatchPrediction(BatchPredictionId => Str)
    Each argument is described in detail in: Paws::MachineLearning::DeleteBatchPrediction

    Returns: a Paws::MachineLearning::DeleteBatchPredictionOutput instance

    Assigns the DELETED status to a "BatchPrediction", rendering it unusable.

    After using the "DeleteBatchPrediction" operation, you can use the GetBatchPrediction operation to verify that the status of the "BatchPrediction" changed to DELETED.

    The result of the "DeleteBatchPrediction" operation is irreversible.
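    The CreateMLModel and CreateRealtimeEndpoint operations described above can be combined as in the sketch below, which trains a binary model from a statistics-enabled "DataSource" and then exposes it through a real-time endpoint. All IDs and the recipe location are hypothetical placeholders, and the recipe arguments are optional.

      # Train a binary classification model from the DataSource created earlier.
      $ml->CreateMLModel(
        MLModelId            => 'ml-example-01',                   # hypothetical ID
        MLModelName          => 'example binary model',
        MLModelType          => 'BINARY',
        TrainingDataSourceId => 'ds-s3-example-01',                # must have computed statistics
        RecipeUri            => 's3://example-bucket/recipe.json', # optional custom recipe
      );

      # Once the MLModel reaches COMPLETED, create an endpoint for real-time predictions.
      $ml->CreateRealtimeEndpoint(MLModelId => 'ml-example-01');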
  DeleteDataSource(DataSourceId => Str)
    Each argument is described in detail in: Paws::MachineLearning::DeleteDataSource

    Returns: a Paws::MachineLearning::DeleteDataSourceOutput instance

    Assigns the DELETED status to a "DataSource", rendering it unusable.

    After using the "DeleteDataSource" operation, you can use the GetDataSource operation to verify that the status of the "DataSource" changed to DELETED.

    The results of the "DeleteDataSource" operation are irreversible.

  DeleteEvaluation(EvaluationId => Str)
    Each argument is described in detail in: Paws::MachineLearning::DeleteEvaluation

    Returns: a Paws::MachineLearning::DeleteEvaluationOutput instance

    Assigns the "DELETED" status to an "Evaluation", rendering it unusable.

    After invoking the "DeleteEvaluation" operation, you can use the GetEvaluation operation to verify that the status of the "Evaluation" changed to "DELETED".

    The results of the "DeleteEvaluation" operation are irreversible.

  DeleteMLModel(MLModelId => Str)
    Each argument is described in detail in: Paws::MachineLearning::DeleteMLModel

    Returns: a Paws::MachineLearning::DeleteMLModelOutput instance

    Assigns the DELETED status to an "MLModel", rendering it unusable.

    After using the "DeleteMLModel" operation, you can use the GetMLModel operation to verify that the status of the "MLModel" changed to DELETED.

    The result of the "DeleteMLModel" operation is irreversible.

  DeleteRealtimeEndpoint(MLModelId => Str)
    Each argument is described in detail in: Paws::MachineLearning::DeleteRealtimeEndpoint

    Returns: a Paws::MachineLearning::DeleteRealtimeEndpointOutput instance

    Deletes a real-time endpoint of an "MLModel".

  DescribeBatchPredictions([EQ => Str, FilterVariable => Str, GE => Str, GT => Str, LE => Str, Limit => Int, LT => Str, NE => Str, NextToken => Str, Prefix => Str, SortOrder => Str])
    Each argument is described in detail in: Paws::MachineLearning::DescribeBatchPredictions

    Returns: a Paws::MachineLearning::DescribeBatchPredictionsOutput instance

    Returns a list of "BatchPrediction" operations that match the search criteria in the request.

  DescribeDataSources([EQ => Str, FilterVariable => Str, GE => Str, GT => Str, LE => Str, Limit => Int, LT => Str, NE => Str, NextToken => Str, Prefix => Str, SortOrder => Str])
    Each argument is described in detail in: Paws::MachineLearning::DescribeDataSources

    Returns: a Paws::MachineLearning::DescribeDataSourcesOutput instance

    Returns a list of "DataSource" objects that match the search criteria in the request.

  DescribeEvaluations([EQ => Str, FilterVariable => Str, GE => Str, GT => Str, LE => Str, Limit => Int, LT => Str, NE => Str, NextToken => Str, Prefix => Str, SortOrder => Str])
    Each argument is described in detail in: Paws::MachineLearning::DescribeEvaluations

    Returns: a Paws::MachineLearning::DescribeEvaluationsOutput instance

    Returns a list of "Evaluation" objects that match the search criteria in the request.

  DescribeMLModels([EQ => Str, FilterVariable => Str, GE => Str, GT => Str, LE => Str, Limit => Int, LT => Str, NE => Str, NextToken => Str, Prefix => Str, SortOrder => Str])
    Each argument is described in detail in: Paws::MachineLearning::DescribeMLModels

    Returns: a Paws::MachineLearning::DescribeMLModelsOutput instance

    Returns a list of "MLModel" objects that match the search criteria in the request.
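    The four Describe* operations share the same filter, sort, and pagination arguments. The loop below is a minimal sketch of paging through completed batch predictions; the filter and sort values shown ("Status", "COMPLETED", "dsc") and the page size are illustrative, so see Paws::MachineLearning::DescribeBatchPredictions for the accepted values.

      # Page through completed batch predictions, 100 per request.
      my $token;
      do {
        my $page = $ml->DescribeBatchPredictions(
          FilterVariable => 'Status',
          EQ             => 'COMPLETED',
          SortOrder      => 'dsc',
          Limit          => 100,
          (defined $token ? (NextToken => $token) : ()),
        );
        print $_->BatchPredictionId, "\n" for @{ $page->Results };
        $token = $page->NextToken;
      } while defined $token;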
  GetBatchPrediction(BatchPredictionId => Str)
    Each argument is described in detail in: Paws::MachineLearning::GetBatchPrediction

    Returns: a Paws::MachineLearning::GetBatchPredictionOutput instance

    Returns a "BatchPrediction" that includes detailed metadata, status, and data file information for a "BatchPrediction" request.

  GetDataSource(DataSourceId => Str, [Verbose => Bool])
    Each argument is described in detail in: Paws::MachineLearning::GetDataSource

    Returns: a Paws::MachineLearning::GetDataSourceOutput instance

    Returns a "DataSource" that includes metadata and data file information, as well as the current status of the "DataSource".

    "GetDataSource" provides results in normal or verbose format. The verbose format adds the schema description and the list of files pointed to by the "DataSource" to the normal format.

  GetEvaluation(EvaluationId => Str)
    Each argument is described in detail in: Paws::MachineLearning::GetEvaluation

    Returns: a Paws::MachineLearning::GetEvaluationOutput instance

    Returns an "Evaluation" that includes metadata as well as the current status of the "Evaluation".

  GetMLModel(MLModelId => Str, [Verbose => Bool])
    Each argument is described in detail in: Paws::MachineLearning::GetMLModel

    Returns: a Paws::MachineLearning::GetMLModelOutput instance

    Returns an "MLModel" that includes detailed metadata and data source information, as well as the current status of the "MLModel".

    "GetMLModel" provides results in normal or verbose format.

  Predict(MLModelId => Str, PredictEndpoint => Str, Record => Paws::MachineLearning::Record)
    Each argument is described in detail in: Paws::MachineLearning::Predict

    Returns: a Paws::MachineLearning::PredictOutput instance

    Generates a prediction for the observation using the specified "MLModel".

    Not all response parameters will be populated; the parameters that are returned depend on the type of the requested model.

  UpdateBatchPrediction(BatchPredictionId => Str, BatchPredictionName => Str)
    Each argument is described in detail in: Paws::MachineLearning::UpdateBatchPrediction

    Returns: a Paws::MachineLearning::UpdateBatchPredictionOutput instance

    Updates the "BatchPredictionName" of a "BatchPrediction". You can use the GetBatchPrediction operation to view the contents of the updated data element.

  UpdateDataSource(DataSourceId => Str, DataSourceName => Str)
    Each argument is described in detail in: Paws::MachineLearning::UpdateDataSource

    Returns: a Paws::MachineLearning::UpdateDataSourceOutput instance

    Updates the "DataSourceName" of a "DataSource". You can use the GetDataSource operation to view the contents of the updated data element.

  UpdateEvaluation(EvaluationId => Str, EvaluationName => Str)
    Each argument is described in detail in: Paws::MachineLearning::UpdateEvaluation

    Returns: a Paws::MachineLearning::UpdateEvaluationOutput instance

    Updates the "EvaluationName" of an "Evaluation". You can use the GetEvaluation operation to view the contents of the updated data element.

  UpdateMLModel(MLModelId => Str, [MLModelName => Str, ScoreThreshold => Num])
    Each argument is described in detail in: Paws::MachineLearning::UpdateMLModel

    Returns: a Paws::MachineLearning::UpdateMLModelOutput instance

    Updates the "MLModelName" and the "ScoreThreshold" of an "MLModel". You can use the GetMLModel operation to view the contents of the updated data element.
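    To tie the Get*, Predict, and Update* operations together, here is a hedged sketch of a real-time prediction. It assumes a "COMPLETED" model that already has a real-time endpoint (see CreateRealtimeEndpoint above) whose URL is exposed through the model's endpoint information; the model ID and record keys are placeholders, and the record keys must match the attribute names in the data schema used to train the model.

      my $model = $ml->GetMLModel(MLModelId => 'ml-example-01');

      my $prediction = $ml->Predict(
        MLModelId       => 'ml-example-01',
        PredictEndpoint => $model->EndpointInfo->EndpointUrl,
        Record          => {                 # keys follow the training data schema
          age            => '42',
          favorite_genre => 'documentary',
        },
      )->Prediction;

      print $prediction->PredictedLabel, "\n";   # PredictedValue for REGRESSION models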
SEE ALSO
    This service class forms part of Paws.

BUGS and CONTRIBUTIONS
    The source code is located here: https://github.com/pplu/aws-sdk-perl

    Please report bugs to: https://github.com/pplu/aws-sdk-perl/issues