Medial Code Documentation
xgboost.sklearn.XGBRFRegressor Class Reference

Public Member Functions
None | __init__(self, *, float learning_rate=1.0, float subsample=0.8, float colsample_bynode=0.8, float reg_lambda=1e-5, **Any kwargs)
Dict[str, Any] | get_xgb_params(self)
int | get_num_boosting_rounds(self)
"XGBRFRegressor" | fit(self, ArrayLike X, ArrayLike y, *, Optional[ArrayLike] sample_weight=None, Optional[ArrayLike] base_margin=None, Optional[Sequence[Tuple[ArrayLike, ArrayLike]]] eval_set=None, Optional[Union[str, Sequence[str], Metric]] eval_metric=None, Optional[int] early_stopping_rounds=None, Optional[Union[bool, int]] verbose=True, Optional[Union[Booster, str, XGBModel]] xgb_model=None, Optional[Sequence[ArrayLike]] sample_weight_eval_set=None, Optional[Sequence[ArrayLike]] base_margin_eval_set=None, Optional[ArrayLike] feature_weights=None, Optional[Sequence[TrainingCallback]] callbacks=None)
Public Member Functions inherited from xgboost.sklearn.XGBModel
bool | __sklearn_is_fitted__(self)
Booster | get_booster(self)
"XGBModel" | set_params(self, **Any params)
Dict[str, Any] | get_params(self, bool deep=True)
None | save_model(self, Union[str, os.PathLike] fname)
None | load_model(self, ModelIn fname)
ArrayLike | predict(self, ArrayLike X, bool output_margin=False, bool validate_features=True, Optional[ArrayLike] base_margin=None, Optional[Tuple[int, int]] iteration_range=None)
np.ndarray | apply(self, ArrayLike X, Optional[Tuple[int, int]] iteration_range=None)
Dict[str, Dict[str, List[float]]] | evals_result(self)
int | n_features_in_(self)
np.ndarray | feature_names_in_(self)
float | best_score(self)
int | best_iteration(self)
np.ndarray | feature_importances_(self)
np.ndarray | coef_(self)
np.ndarray | intercept_(self)
Data Fields
early_stopping_rounds
callbacks
Data Fields inherited from xgboost.sklearn.XGBModel
n_estimators
objective
max_depth
max_leaves
max_bin
grow_policy
learning_rate
verbosity
booster
tree_method
gamma
min_child_weight
max_delta_step
subsample
sampling_method
colsample_bytree
colsample_bylevel
colsample_bynode
reg_alpha
reg_lambda
scale_pos_weight
base_score
missing
num_parallel_tree
random_state
n_jobs
monotone_constraints
interaction_constraints
importance_type
device
validate_parameters
enable_categorical
feature_types
max_cat_to_onehot
max_cat_threshold
multi_strategy
eval_metric
early_stopping_rounds
callbacks
kwargs
n_classes_
evals_result_
Additional Inherited Members
Static Public Attributes inherited from xgboost.sklearn.XGBModel
str | extra_parameters
Protected Member Functions inherited from xgboost.sklearn.XGBModel
Dict[str, bool] | _more_tags(self)
str | _get_type(self)
None | _load_model_attributes(self, dict config)
Tuple[Optional[Union[Booster, str, "XGBModel"]], Optional[Metric], Dict[str, Any], Optional[int], Optional[Sequence[TrainingCallback]]] | _configure_fit(self, Optional[Union[Booster, "XGBModel", str]] booster, Optional[Union[Callable, str, Sequence[str]]] eval_metric, Dict[str, Any] params, Optional[int] early_stopping_rounds, Optional[Sequence[TrainingCallback]] callbacks)
DMatrix | _create_dmatrix(self, Optional[DMatrix] ref, **Any kwargs)
None | _set_evaluation_result(self, TrainingCallback.EvalsLog evals_result)
bool | _can_use_inplace_predict(self)
Tuple[int, int] | _get_iteration_range(self, Optional[Tuple[int, int]] iteration_range)
Protected Attributes inherited from xgboost.sklearn.XGBModel
_Booster | |
None xgboost.sklearn.XGBRFRegressor.__init__(
    self,
    *,
    float learning_rate = 1.0,
    float subsample = 0.8,
    float colsample_bynode = 0.8,
    float reg_lambda = 1e-5,
    **Any kwargs
)
Reimplemented from xgboost.sklearn.XGBRegressor.
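Below is a minimal usage sketch (not part of the upstream documentation); the synthetic data, seed, and ``n_estimators`` value are illustrative assumptions. Note that the keyword-only defaults above (``learning_rate=1.0``, ``subsample=0.8``, ``colsample_bynode=0.8``) are what configure the estimator as a random forest rather than a boosted ensemble.

>>> import numpy as np
>>> from xgboost import XGBRFRegressor
>>> rng = np.random.default_rng(0)
>>> X = rng.normal(size=(500, 10))            # 500 samples, 10 features
>>> y = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=500)
>>> model = XGBRFRegressor(n_estimators=100, random_state=0)
>>> _ = model.fit(X, y)                       # fit() returns self
>>> preds = model.predict(X[:5])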
"XGBRFRegressor" xgboost.sklearn.XGBRFRegressor.fit | ( | self, | |
ArrayLike | X, | ||
ArrayLike | y, | ||
*Optional[ArrayLike] | sample_weight = None , |
||
Optional[ArrayLike] | base_margin = None , |
||
Optional[Sequence[Tuple[ArrayLike, ArrayLike]]] | eval_set = None , |
||
Optional[Union[str, Sequence[str], Metric]] | eval_metric = None , |
||
Optional[int] | early_stopping_rounds = None , |
||
Optional[Union[bool, int]] | verbose = True , |
||
Optional[Union[Booster, str, XGBModel]] | xgb_model = None , |
||
Optional[Sequence[ArrayLike]] | sample_weight_eval_set = None , |
||
Optional[Sequence[ArrayLike]] | base_margin_eval_set = None , |
||
Optional[ArrayLike] | feature_weights = None , |
||
Optional[Sequence[TrainingCallback]] | callbacks = None |
||
) |
Fit gradient boosting model.

Note that calling ``fit()`` multiple times will cause the model object to be re-fit from scratch. To resume training from a previous checkpoint, explicitly pass the ``xgb_model`` argument.

Parameters
----------
X :
    Feature matrix. See :ref:`py-data` for a list of supported types.

    When ``tree_method`` is set to ``hist``, a :py:class:`QuantileDMatrix` is used internally instead of a :py:class:`DMatrix` to conserve memory. However, this has performance implications when the device of the input data does not match the device used by the algorithm. For instance, if the input is a numpy array on CPU but ``cuda`` is used for training, the data is first processed on CPU and then transferred to GPU.
y :
    Labels.
sample_weight :
    Instance weights.
base_margin :
    Global bias for each instance.
eval_set :
    A list of (X, y) tuple pairs to use as validation sets, for which metrics will be computed. Validation metrics help track the performance of the model.
eval_metric : str, list of str, or callable, optional
    .. deprecated:: 1.6.0
        Use `eval_metric` in :py:meth:`__init__` or :py:meth:`set_params` instead.
early_stopping_rounds : int
    .. deprecated:: 1.6.0
        Use `early_stopping_rounds` in :py:meth:`__init__` or :py:meth:`set_params` instead.
verbose :
    If `verbose` is True and an evaluation set is used, the evaluation metric measured on the validation set is printed to stdout at each boosting stage. If `verbose` is an integer, the evaluation metric is printed at every `verbose` boosting stages. The last boosting stage / the boosting stage found by using `early_stopping_rounds` is also printed.
xgb_model :
    File name of a stored XGBoost model or a 'Booster' instance. The XGBoost model is loaded before training, which allows training continuation.
sample_weight_eval_set :
    A list of the form [L_1, L_2, ..., L_n], where each L_i is an array-like object storing instance weights for the i-th validation set.
base_margin_eval_set :
    A list of the form [M_1, M_2, ..., M_n], where each M_i is an array-like object storing the base margin for the i-th validation set.
feature_weights :
    Weight for each feature, defining the probability of each feature being selected when column sampling is used. All values must be greater than 0; otherwise a `ValueError` is raised.
callbacks :
    .. deprecated:: 1.6.0
        Use `callbacks` in :py:meth:`__init__` or :py:meth:`set_params` instead.
Reimplemented from xgboost.sklearn.XGBModel.
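A hedged sketch of ``fit()`` with a validation set; the arrays, the 80/20 split, and the ``rmse`` metric are illustrative assumptions, not prescribed values. Per the deprecation notes above, the metric is passed to the constructor rather than to ``fit()``.

>>> import numpy as np
>>> from xgboost import XGBRFRegressor
>>> rng = np.random.default_rng(42)
>>> X = rng.normal(size=(1000, 5))
>>> y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) + rng.normal(scale=0.2, size=1000)
>>> X_train, X_valid = X[:800], X[800:]
>>> y_train, y_valid = y[:800], y[800:]
>>> model = XGBRFRegressor(n_estimators=200, eval_metric="rmse")
>>> _ = model.fit(X_train, y_train, eval_set=[(X_valid, y_valid)], verbose=False)
>>> result = model.evals_result()             # e.g. {'validation_0': {'rmse': [...]}}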
int xgboost.sklearn.XGBRFRegressor.get_num_boosting_rounds | ( | self | ) |
Gets the number of xgboost boosting rounds.
Reimplemented from xgboost.sklearn.XGBModel.
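Since the random-forest wrapper grows all ``n_estimators`` trees within a single boosting round (via ``num_parallel_tree``), this is expected to return 1; a quick sketch:

>>> from xgboost import XGBRFRegressor
>>> XGBRFRegressor(n_estimators=100).get_num_boosting_rounds()
1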
Dict[str, Any] xgboost.sklearn.XGBRFRegressor.get_xgb_params | ( | self | ) |
Get xgboost specific parameters.
Reimplemented from xgboost.sklearn.XGBModel.
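A sketch of inspecting the parameters forwarded to the native booster; the exact keys shown reflect the ``n_estimators`` to ``num_parallel_tree`` mapping described above and should be treated as an assumption about this version's behavior.

>>> from xgboost import XGBRFRegressor
>>> params = XGBRFRegressor(n_estimators=50, max_depth=6).get_xgb_params()
>>> params["num_parallel_tree"]
50
>>> params["learning_rate"]
1.0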