Gaussian Process Regression (GPR).
Notes
Available conditional attributes:
(Conditional attributes enabled by default are suffixed with +.)
Methods
| Method | Description |
|---|---|
| clone() | Create a full copy of the classifier. |
| compute_gradient_log_marginal_likelihood() | Compute the gradient of the log marginal likelihood. |
| compute_gradient_log_marginal_likelihood_logscale() | Compute the gradient of the log marginal likelihood when the hyperparameters are in log scale. |
| compute_log_marginal_likelihood() | Compute the log marginal likelihood using self.train_fv and self.targets. |
| generate(ds) | Yield processing results. |
| get_postproc() | Returns the post-processing node, or None. |
| get_sensitivity_analyzer([flavor]) | Returns a sensitivity analyzer for GPR. |
| get_space() | Query the processing space name of this node. |
| is_trained([dataset]) | Whether the classifier was already trained. |
| predict(obj, data, *args, **kwargs) | |
| repredict(obj, data, *args, **kwargs) | |
| reset() | |
| retrain(dataset, **kwargs) | Helper that avoids retraining when the data has not actually changed; useful if only some aspects of the classifier were changed since its previous training. |
| set_hyperparameters(hyperparameter) | Set hyperparameters’ values. |
| set_postproc(node) | Assign a post-processing node; set to None to disable post-processing. |
| set_space(name) | Set the processing space name of this node. |
| summary() | Provide a summary of the classifier. |
| train(ds) | The default implementation calls _pretrain(), _train(), and finally _posttrain(). |
| untrain() | Revert changes in the state of this node caused by previous training. |
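For orientation, the sketch below strings together the core lifecycle methods from the table (train, summary, predict, untrain). It assumes an already constructed GPR instance `clf` and a PyMVPA dataset `ds` with samples and regression targets; a construction sketch follows the Parameters section below.

```python
# Hedged sketch: `clf` is an existing GPR instance and `ds` a PyMVPA dataset
# (see the construction sketch after the Parameters section).

clf.train(ds)                          # fit the GP to the training dataset
print(clf.summary())                   # human-readable summary of the trained classifier
predictions = clf.predict(ds.samples)  # predict targets for an array of samples
clf.untrain()                          # revert the node to its untrained state
```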
Initialize a GPR regression analysis.
Parameters:

  kernel : Kernel
  sigma_noise :
  lm :
  retrainable :
  enable_ca : None or list of str
  disable_ca : None or list of str
  auto_train : bool
  force_train : bool
  space : str, optional
  postproc : Node instance, optional
  descr : str
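A minimal construction sketch under stated assumptions: the import paths (mvpa2.clfs.gpr.GPR, mvpa2.kernels.np.SquaredExponentialKernel, mvpa2.datasets.dataset_wizard) follow PyMVPA 2.x conventions and should be checked against the installed version, and the sigma_noise and lm values are arbitrary illustrations rather than recommended defaults. The resulting `clf` and `ds` can then be used with the train/predict cycle shown after the Methods table above.

```python
import numpy as np

# Assumed PyMVPA 2.x import paths -- verify against your installation.
from mvpa2.clfs.gpr import GPR
from mvpa2.kernels.np import SquaredExponentialKernel
from mvpa2.datasets import dataset_wizard

# Toy 1-D regression problem: a noisy sine wave.
x = np.linspace(0, 1, 40)[:, np.newaxis]
y = np.sin(6 * x[:, 0]) + 0.1 * np.random.randn(40)
ds = dataset_wizard(samples=x, targets=y)

# GPR with a squared-exponential kernel; sigma_noise and lm are
# illustrative values only, not recommended defaults.
clf = GPR(kernel=SquaredExponentialKernel(),
          sigma_noise=0.1,
          lm=1e-4)
```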
Compute the gradient of the log marginal likelihood. This version uses the more compact formula given in the Rasmussen and Williams book (Gaussian Processes for Machine Learning).
Compute the gradient of the log marginal likelihood when the hyperparameters are in log scale. This version uses the same compact formula from Rasmussen and Williams.
Compute log marginal likelihood using self.train_fv and self.targets.
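To make explicit what the likelihood-related methods above compute, here is a NumPy sketch of the log marginal likelihood and its gradient, following the compact formulas in Rasmussen and Williams (Eqs. 2.30 and 5.9). It assumes the common parameterization Ky = K + sigma_noise² I and is illustrative only; it is not the PyMVPA implementation, which may differ in details (e.g. how lm enters).

```python
import numpy as np

def log_marginal_likelihood(K, y, sigma_noise):
    # log p(y|X) = -0.5 * y^T Ky^-1 y - 0.5 * log|Ky| - (n/2) * log(2*pi),
    # with Ky = K + sigma_noise**2 * I  (Rasmussen & Williams, Eq. 2.30).
    n = len(y)
    L = np.linalg.cholesky(K + sigma_noise ** 2 * np.eye(n))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))   # alpha = Ky^-1 y
    return (-0.5 * y.dot(alpha)
            - np.log(np.diag(L)).sum()                    # equals 0.5 * log|Ky|
            - 0.5 * n * np.log(2 * np.pi))

def lml_gradient(K, y, sigma_noise, dK_dtheta):
    # d/dtheta log p(y|X) = 0.5 * tr((alpha alpha^T - Ky^-1) dK/dtheta),
    # with alpha = Ky^-1 y  (Rasmussen & Williams, Eq. 5.9).
    n = len(y)
    Ky_inv = np.linalg.inv(K + sigma_noise ** 2 * np.eye(n))
    alpha = Ky_inv.dot(y)
    return 0.5 * np.trace((np.outer(alpha, alpha) - Ky_inv).dot(dK_dtheta))
```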
Returns a sensitivity analyzer for GPR.
Parameters:

  flavor : str
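A hedged usage sketch for the analyzer, following the usual PyMVPA measure protocol in which the returned analyzer is called on a dataset; `clf` and `ds` are the trained GPR instance and dataset from the earlier sketches, and flavor is left at its default since the admissible values are not listed here.

```python
# Note: depending on the flavor and the chosen kernel, a linear kernel
# may be required; this is an assumption to check against the library.
sens_ana = clf.get_sensitivity_analyzer()  # flavor left at its default
sens = sens_ana(ds)                        # per-feature sensitivities, returned as a dataset
print(sens.samples.shape)
```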
Set hyperparameters’ values.
Note that ‘hyperparameter’ is a sequence, so the order of its values matters: the first value must be sigma_noise, and the remaining values are the kernel’s hyperparameters, in exactly the order the kernel expects them.
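A short sketch of this ordering convention, assuming `clf` from the earlier sketches; the number and meaning of the trailing values are kernel-specific, and the values shown are arbitrary.

```python
import numpy as np

# First entry: sigma_noise.  Remaining entries: the kernel's hyperparameters,
# in exactly the order the kernel expects them (kernel-specific).
hyperparameter = np.asarray([0.1,   # sigma_noise
                             2.0])  # e.g. a single kernel length scale (assumed)
clf.set_hyperparameters(hyperparameter)
```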