Monday, July 22, 2019
Decision Tree Important Hyperparameters

criterion: The optimization criterion used to measure the quality of a split, either Gini impurity or entropy (information gain)
max_depth: Build trees at most d levels deep; the depth d is counted as the number of levels from the root node down
min_samples_split: The minimum number of samples required to split an internal node
min_samples_leaf: The minimum number of samples required to be at a leaf node
max_features: The number of features to consider when looking for the best split
min_impurity_decrease: A node will be split if this split induces a decrease of the impurity greater than or equal to this value
class_weight: Weights associated with classes in the form {class_label: weight}. A short usage sketch covering all of these parameters follows below.
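
The following is a minimal sketch, assuming scikit-learn's DecisionTreeClassifier and the Iris dataset (neither is named in the post), of how these hyperparameters are passed in. The small gini/entropy helpers only illustrate the two impurity criteria, and every parameter value shown is an arbitrary example, not a recommendation from the post.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

def gini(p):
    # Gini impurity of a node with class-probability vector p: 1 - sum(p_k^2)
    p = np.asarray(p, dtype=float)
    return 1.0 - np.sum(p ** 2)

def entropy(p):
    # Entropy of a node with class-probability vector p: -sum(p_k * log2(p_k))
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(gini([0.5, 0.5]), entropy([0.5, 0.5]))   # 0.5, 1.0 -> maximally impure two-class node
print(gini([1.0, 0.0]), entropy([1.0, 0.0]))   # 0.0, 0.0 -> pure node

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

clf = DecisionTreeClassifier(
    criterion="gini",            # or "entropy" for information gain
    max_depth=4,                 # grow at most 4 levels below the root
    min_samples_split=10,        # a node needs >= 10 samples to be split
    min_samples_leaf=5,          # every leaf must keep >= 5 samples
    max_features="sqrt",         # consider sqrt(n_features) candidates per split
    min_impurity_decrease=0.01,  # split only if impurity drops by >= 0.01
    class_weight="balanced",     # or an explicit dict such as {0: 1.0, 1: 2.0, 2: 1.0}
    random_state=42,             # illustrative value, not from the post
)
clf.fit(X_train, y_train)
print("Test accuracy:", clf.score(X_test, y_test))

In practice these settings are tuned together (for example with GridSearchCV), since max_depth, min_samples_split, min_samples_leaf and min_impurity_decrease all control how strongly the tree is restrained to avoid overfitting.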