dpdt.DPDTreeClassifier#

- class dpdt.DPDTreeClassifier(max_depth=3, max_nb_trees=1000, cart_nodes_list=(3,), random_state=42)[source]#
Bases: ClassifierMixin, BaseEstimator

Dynamic Programming Decision Tree (DPDTree) classifier.
- Parameters:
- max_depth : int, default=3
The maximum depth of the tree.
- max_nb_trees : int, default=1000
The maximum number of trees.
- cart_nodes_list : list of int, default=(3,)
List containing the number of leaf nodes for the CART trees at each depth.
- random_state : int, default=42
Fixes the randomness of the classifier. Randomness occurs in the calls to CART.
- Attributes:
- X_ : ndarray of shape (n_samples, n_features)
The input passed during fit().
- y_ : ndarray of shape (n_samples,)
The labels passed during fit().
- classes_ : ndarray of shape (n_classes,)
The classes seen at fit().
- n_features_in_ : int
Number of features seen during fit.
- feature_names_in_ : ndarray of shape (n_features_in_,)
Names of features seen during fit. Defined only when X has feature names that are all strings.
- mdp : list of list of State
The Markov Decision Process represented as a list of lists of states, where each inner list contains the states at a specific depth.
- zetas : array-like
Array of zeta values used in the computation.
- trees : dict
A dictionary representing the tree policies. The keys are tuples of (state observation, depth), and the values are the optimal tree for each zeta value.
- init_o : array-like
The initial observation of the MDP.
Methods

- expand_node_(node[, depth]): Expand a node of the MDP (node expansion, then action creation and transition).
- fit(X, y[, feature_costs, infinite_memory]): Fit the DPDTree classifier.
- get_metadata_routing(): Get metadata routing of this object.
- get_params([deep]): Get parameters for this estimator.
- get_pareto_front(X, y): Compute the decision path length / test accuracy Pareto front of DPDTrees.
- predict(X): Predict class for X.
- score(X, y[, sample_weight]): Return the mean accuracy on the given test data and labels.
- set_fit_request(*[, feature_costs, ...]): Request metadata passed to the fit method.
- set_params(**params): Set the parameters of this estimator.
- set_score_request(*[, sample_weight]): Request metadata passed to the score method.

Examples
>>> from dpdt import DPDTreeClassifier
>>> from sklearn import datasets
>>>
>>> X, y = datasets.load_breast_cancer(return_X_y=True)
>>>
>>> clf = DPDTreeClassifier(max_depth=3, random_state=42)
>>> clf.fit(X, y)
>>> print(clf.score(X, y))
- expand_node_(node, depth=0)[source]#
Node expansion:
For each node at the current depth, computes the unique classes and their counts for the samples in the node (node.nz).
Computes the best possible reward (rstar) and the action (astar) leading to the next state.
If further expansion is possible (i.e., the depth budget allows it and there are at least two classes), initializes a DecisionTreeClassifier to determine the splits.
Fits the classifier on the samples in the node and identifies the candidate splits (features and thresholds).

Action creation and transition:
For each split, creates an Action and determines the left and right child nodes induced by the split.
Creates the left and right nodes as new states, and adds the corresponding transitions to the action.
If an action has valid transitions, adds it to the current node.
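The split-enumeration step above can be sketched with scikit-learn's DecisionTreeClassifier: a fitted CART exposes its internal splits via tree_.feature and tree_.threshold. The helper below (candidate_splits is a hypothetical name, and the State/Action bookkeeping is deliberately simplified away) is an illustration of the idea, not the library's implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def candidate_splits(X, y, nz, max_leaf_nodes=3):
    """Sketch of the split-enumeration step of expand_node_.

    Fits a small CART on the samples selected by the boolean mask
    ``nz`` and returns the (feature, threshold) pairs of its internal
    nodes, i.e. the candidate actions for this MDP state.
    """
    classes, counts = np.unique(y[nz], return_counts=True)
    if len(classes) < 2:  # pure node: no further expansion possible
        return classes, counts, []
    cart = DecisionTreeClassifier(max_leaf_nodes=max_leaf_nodes,
                                  random_state=0)
    cart.fit(X[nz], y[nz])
    tree = cart.tree_
    # Internal nodes have a left child; leaves (children_left == -1)
    # are skipped.
    splits = [
        (int(tree.feature[i]), float(tree.threshold[i]))
        for i in range(tree.node_count)
        if tree.children_left[i] != -1
    ]
    return classes, counts, splits

X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
nz = np.ones(len(y), dtype=bool)  # root state: all samples selected
_, _, splits = candidate_splits(X, y, nz)
# Each (feature, threshold) pair induces left/right child states.
```

Each returned split would become one Action, with the mask `nz` partitioned by the threshold to form the child states.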
- fit(X, y, feature_costs=None, infinite_memory=False)[source]#
Fit the DPDTree classifier.
Creates a root state whose observation concatenates the minimum and maximum values of self.X_ with slight offsets.
Initializes a terminal_state as an array of zeros whose length is twice the number of features in self.X_.
Initializes the root state with all samples (nz as an array of True values).
- Parameters:
- X : array-like of shape (n_samples, n_features)
The training input samples.
- y : array-like of shape (n_samples,)
The target values.
- feature_costs : list of float, optional, default=None
List containing the feature costs.
- infinite_memory : bool, optional, default=False
If True, forces garbage collection runs. This allows running CART with an unbounded number of leaf nodes, at roughly a 500x runtime cost.
- Returns:
- self : object
Fitted estimator.
- get_metadata_routing()#
Get metadata routing of this object.
Please check the User Guide on how the routing mechanism works.
- Returns:
- routing : MetadataRequest
A MetadataRequest encapsulating routing information.
- get_params(deep=True)#
Get parameters for this estimator.
- Parameters:
- deep : bool, default=True
If True, will return the parameters for this estimator and contained subobjects that are estimators.
- Returns:
- params : dict
Parameter names mapped to their values.
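get_params follows the standard scikit-learn BaseEstimator contract, so its behavior can be illustrated with any estimator; the sketch below uses a Pipeline (as a stand-in, not DPDTreeClassifier itself) to show the effect of deep.

```python
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

pipe = Pipeline([("scale", StandardScaler()),
                 ("tree", DecisionTreeClassifier(max_depth=3))])

# deep=False: only the Pipeline's own parameters (steps, memory, ...).
shallow = pipe.get_params(deep=False)
# deep=True: also the nested step parameters, keyed as
# <component>__<parameter>.
deep = pipe.get_params(deep=True)
```

For a bare DPDTreeClassifier, deep=True and deep=False coincide unless it is nested inside another estimator.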
- get_pareto_front(X, y)[source]#
Compute the decision path length / test accuracy Pareto front of DPDTrees.
- Parameters:
- X : array-like of shape (n_samples, n_features)
The input samples.
- y : array-like of shape (n_samples,)
The target values.
- Returns:
- scores : array-like
The test accuracies of the trees, one per tree.
- decision_path_length : array-like
The average number of decision nodes traversed in each tree, one per tree.
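The notion of Pareto front here: a tree is on the front if no other tree is both at least as accurate and at least as cheap (shorter average decision path), with a strict improvement on one criterion. A minimal numpy sketch with made-up (accuracy, length) pairs; pareto_front is a hypothetical helper, not part of the dpdt API.

```python
import numpy as np

def pareto_front(scores, lengths):
    """Indices of trees not dominated by any other tree.

    Tree j dominates tree i if it is at least as accurate and at
    least as short, and strictly better on one of the two criteria.
    """
    scores, lengths = np.asarray(scores), np.asarray(lengths)
    keep = []
    for i in range(len(scores)):
        dominated = np.any(
            (scores >= scores[i]) & (lengths <= lengths[i])
            & ((scores > scores[i]) | (lengths < lengths[i]))
        )
        if not dominated:
            keep.append(i)
    return keep

# Hypothetical (accuracy, average path length) pairs for four trees.
acc = [0.90, 0.95, 0.93, 0.95]
length = [1.5, 3.0, 2.0, 3.5]
front = pareto_front(acc, length)  # tree 3 is dominated by tree 1
```

This front lets you trade prediction cost (path length) against accuracy when choosing which tree policy to deploy.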
- predict(X)[source]#
Predict class for X.
- Parameters:
- X : array-like of shape (n_samples, n_features)
The input samples.
- Returns:
- y_pred : ndarray of shape (n_samples,)
The predicted classes.
- score(X, y, sample_weight=None)#
Return the mean accuracy on the given test data and labels.
In multi-label classification, this is the subset accuracy, which is a harsh metric since it requires that each label set be correctly predicted for each sample.
- Parameters:
- X : array-like of shape (n_samples, n_features)
Test samples.
- y : array-like of shape (n_samples,) or (n_samples, n_outputs)
True labels for X.
- sample_weight : array-like of shape (n_samples,), default=None
Sample weights.
- Returns:
- score : float
Mean accuracy of self.predict(X) w.r.t. y.
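The returned value is simply the (optionally weighted) fraction of exactly matching predictions; a minimal sketch of the equivalent computation, with mean_accuracy as a hypothetical helper:

```python
import numpy as np

def mean_accuracy(y_true, y_pred, sample_weight=None):
    """Weighted fraction of exact matches, as score() computes it."""
    correct = (np.asarray(y_true) == np.asarray(y_pred)).astype(float)
    return float(np.average(correct, weights=sample_weight))

acc = mean_accuracy([0, 1, 1, 0], [0, 1, 0, 0])           # 3 of 4 correct
weighted = mean_accuracy([0, 1, 1, 0], [0, 1, 0, 0],
                         sample_weight=[1, 1, 0, 1])       # error zeroed out
```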
- set_fit_request(*, feature_costs: bool | None | str = '$UNCHANGED$', infinite_memory: bool | None | str = '$UNCHANGED$') -> DPDTreeClassifier#
Request metadata passed to the fit method.
Note that this method is only relevant if enable_metadata_routing=True (see sklearn.set_config()). Please see the User Guide on how the routing mechanism works.
The options for each parameter are:
- True: metadata is requested, and passed to fit if provided. The request is ignored if metadata is not provided.
- False: metadata is not requested and the meta-estimator will not pass it to fit.
- None: metadata is not requested, and the meta-estimator will raise an error if the user provides it.
- str: metadata should be passed to the meta-estimator with this given alias instead of the original name.
The default (sklearn.utils.metadata_routing.UNCHANGED) retains the existing request. This allows you to change the request for some parameters and not others.
Added in version 1.3.
Note: this method is only relevant if this estimator is used as a sub-estimator of a meta-estimator, e.g. used inside a Pipeline. Otherwise it has no effect.
- Parameters:
- feature_costs : str, True, False, or None, default=sklearn.utils.metadata_routing.UNCHANGED
Metadata routing for the feature_costs parameter in fit.
- infinite_memory : str, True, False, or None, default=sklearn.utils.metadata_routing.UNCHANGED
Metadata routing for the infinite_memory parameter in fit.
- Returns:
- self : object
The updated object.
- set_params(**params)#
Set the parameters of this estimator.
The method works on simple estimators as well as on nested objects (such as Pipeline). The latter have parameters of the form <component>__<parameter> so that it is possible to update each component of a nested object.
- Parameters:
- **params : dict
Estimator parameters.
- Returns:
- self : estimator instance
Estimator instance.
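The nested <component>__<parameter> syntax can be illustrated with a Pipeline (used here as a stand-in; DPDTreeClassifier would be the nested step in practice):

```python
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

pipe = Pipeline([("scale", StandardScaler()),
                 ("tree", DecisionTreeClassifier(max_depth=3))])

# <component>__<parameter>: update a nested step's hyperparameter.
pipe.set_params(tree__max_depth=5)
depth = pipe.get_params()["tree__max_depth"]
```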
- set_score_request(*, sample_weight: bool | None | str = '$UNCHANGED$') -> DPDTreeClassifier#
Request metadata passed to the score method.
Note that this method is only relevant if enable_metadata_routing=True (see sklearn.set_config()). Please see the User Guide on how the routing mechanism works.
The options for each parameter are:
- True: metadata is requested, and passed to score if provided. The request is ignored if metadata is not provided.
- False: metadata is not requested and the meta-estimator will not pass it to score.
- None: metadata is not requested, and the meta-estimator will raise an error if the user provides it.
- str: metadata should be passed to the meta-estimator with this given alias instead of the original name.
The default (sklearn.utils.metadata_routing.UNCHANGED) retains the existing request. This allows you to change the request for some parameters and not others.
Added in version 1.3.
Note: this method is only relevant if this estimator is used as a sub-estimator of a meta-estimator, e.g. used inside a Pipeline. Otherwise it has no effect.
- Parameters:
- sample_weight : str, True, False, or None, default=sklearn.utils.metadata_routing.UNCHANGED
Metadata routing for the sample_weight parameter in score.
- Returns:
- self : object
The updated object.