pyabsa.framework.prediction_class.predictor_template

Module Contents

Classes

InferenceModel

class pyabsa.framework.prediction_class.predictor_template.InferenceModel(checkpoint: str | object = None, config=None, **kwargs)[source]
task_code[source]
to(device=None)[source]

Sets the device on which the model will perform inference.

Parameters:

device – the device to use for inference

cpu()[source]

Sets the device to CPU for performing inference.

cuda(device='cuda:0')[source]

Sets the device to CUDA for performing inference.

Parameters:

device – the CUDA device to use for inference
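
For example, a minimal device-placement sketch. The subclass name MyTaskPredictor and the checkpoint path are hypothetical placeholders; only to(), cpu(), and cuda() come from this class:

    import torch

    # Hypothetical concrete subclass of InferenceModel and checkpoint path,
    # used here only to illustrate the device helpers.
    model = MyTaskPredictor(checkpoint="path/to/checkpoint")

    # Prefer the GPU when one is available, otherwise stay on the CPU.
    if torch.cuda.is_available():
        model.cuda(device="cuda:0")  # 'cuda:0' is the documented default
    else:
        model.cpu()

    # to() also accepts an explicit device string.
    model.to("cuda:0" if torch.cuda.is_available() else "cpu")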

__post_init__(**kwargs)[source]

Initializes the InferenceModel instance after its properties have been set.

abstract batch_predict(**kwargs)[source]

Predict from a file of sentences.

Parameters:

target_file – the file path of the sentences to be predicted
print_result – whether to print the result
save_result – whether to save the result
ignore_error – whether to ignore errors during prediction
kwargs – additional keyword arguments
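
A hedged usage sketch for file-based inference; model is assumed to be an instance of a concrete subclass, and the file path is a placeholder. The keyword names are the ones documented above:

    # Predict every sentence in a text file (one sentence per line is assumed).
    results = model.batch_predict(
        target_file="path/to/sentences.txt",  # placeholder input file
        print_result=True,    # echo each prediction
        save_result=True,     # write the predictions to disk
        ignore_error=True,    # skip inputs that fail instead of raising
    )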

abstract predict(**kwargs)[source]

Predict from a sentence or a list of sentences.

Parameters:

text – the sentence or list of sentences to be predicted
print_result – whether to print the result
ignore_error – whether to ignore errors during prediction
kwargs – additional keyword arguments
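
A hedged usage sketch, again assuming model is a concrete subclass instance; the example sentences are arbitrary:

    # A single sentence ...
    result = model.predict(
        text="The battery life is great but the screen is dim.",
        print_result=True,
        ignore_error=True,
    )

    # ... or a list of sentences.
    results = model.predict(
        text=[
            "The battery life is great but the screen is dim.",
            "Service was slow, the food was fine.",
        ],
        print_result=False,
    )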

abstract _run_prediction(**kwargs)[source]

This method should be implemented by each concrete subclass to run predictions using the trained model.

Parameters:

kwargs – additional keyword arguments

Returns:

predicted labels or other prediction outputs
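
A minimal sketch of the contract a concrete subclass is expected to fulfil. The class name and all method bodies are illustrative placeholders, not pyabsa's actual task implementations:

    from pyabsa.framework.prediction_class.predictor_template import InferenceModel


    class MyTaskPredictor(InferenceModel):
        """Hypothetical subclass; real tasks provide their own implementations."""

        def predict(self, text=None, print_result=True, ignore_error=True, **kwargs):
            # Delegate single-sentence or list inference to the shared runner.
            return self._run_prediction(
                text=text, print_result=print_result, ignore_error=ignore_error, **kwargs
            )

        def batch_predict(self, target_file=None, print_result=True,
                          save_result=False, ignore_error=True, **kwargs):
            # Read the input file and reuse the same runner for all lines.
            with open(target_file, encoding="utf-8") as f:
                lines = [line.strip() for line in f if line.strip()]
            return self._run_prediction(
                text=lines, print_result=print_result,
                save_result=save_result, ignore_error=ignore_error, **kwargs
            )

        def _run_prediction(self, **kwargs):
            # Placeholder: a real implementation tokenizes the input, runs the
            # trained model forward, and returns predicted labels or results.
            raise NotImplementedError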

destroy()[source]

Deletes the model from memory and empties the CUDA cache.
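
A short cleanup sketch, assuming model is a live subclass instance:

    import torch

    # Release the model and free cached CUDA memory once inference is finished.
    model.destroy()

    # Optional sanity check; only meaningful when a CUDA device is present.
    if torch.cuda.is_available():
        print(f"allocated after destroy: {torch.cuda.memory_allocated()} bytes")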