Right now, I am using fitctree and pruning the tree to level one, but because the 'SplitCriterion' is 'gdi' and the 'PruneCriterion' is 'error', I cannot find the best decision stump that minimizes the total weighted error: pruning only removes splits from the Gini-grown tree, it never reconsiders which single split is optimal. I know there is a 'MaxNumSplits' option for growing a decision tree, but that option was added in 2015 and I am using the 2014 version of MATLAB. So I need to build a decision stump that minimizes the error with respect to the given weights, and I would prefer to use fitctree rather than writing it from scratch. Any suggestions?
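For reference, the behavior I am after is an exhaustive search over (feature, threshold, polarity) that minimizes the weighted 0/1 error. A minimal from-scratch sketch of what I am trying to avoid writing (weightedStump is a hypothetical helper, not a toolbox function; it assumes X is n-by-d, Y is n-by-1 with labels in {-1,+1}, and Pt sums to one) would be:

```matlab
function [bestFeat, bestThresh, bestPol, bestErr] = weightedStump(X, Y, Pt)
% Hypothetical helper: exhaustive search for the axis-aligned stump
% that minimizes the weighted 0/1 error.
[~, d] = size(X);
bestErr = Inf; bestFeat = 1; bestThresh = 0; bestPol = 1;
for j = 1:d
    for thr = unique(X(:, j))'        % candidate thresholds for feature j
        for pol = [-1, 1]             % which side of the cut predicts +1
            pred = pol * sign(X(:, j) - thr);
            pred(pred == 0) = pol;    % points on the boundary go to one side
            err = sum(Pt(pred ~= Y)); % weighted misclassification error
            if err < bestErr
                bestErr = err; bestFeat = j; bestThresh = thr; bestPol = pol;
            end
        end
    end
end
end
```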

Here is an example that explains my problem more clearly. The correct result should look like this:

I have a full tree like the following, and I am pruning it to level (max-1).

Since it is not a decision stump yet, I prune it one more level, but the result is not the same as the correct one.

And this is the code I am using:

```matlab
MinLeafSize = 1;
MinParentSize = 2;
NumVariablesToSample = 'all';
ScoreTransform = 'none';
PruneCriterion = 'error';
SplitCriterion = 'gdi';
Weights = Pt;  % train the weak learner with weights Pt

tree = fitctree(X, Y, 'MinLeafSize', MinLeafSize, 'MinParentSize', MinParentSize, ...
    'NumVariablesToSample', NumVariablesToSample, 'PruneCriterion', PruneCriterion, ...
    'SplitCriterion', SplitCriterion, 'Weights', Weights, 'ScoreTransform', ScoreTransform);

% Prune the tree down to a decision stump
prune_tree = prune(tree, 'Level', max(tree.PruneList) - 1);

% If the pruned tree still has more than one decision node (more than
% three nodes in total), prune one more level to get a single split
if length(prune_tree.NodeSize) > 3
    prune_tree = prune(tree, 'Level', max(tree.PruneList));
end
```
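As a sanity check, I verify that the pruned tree really is a stump, relying on the fact that a stump has exactly three nodes (one root split plus two leaves):

```matlab
view(prune_tree);                       % text display of the tree structure
isStump = (prune_tree.NumNodes == 3);   % true for a single-split tree
```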

P.S. I would like to implement AdaBoostM1 using decision stumps as the weak learner. I know there is a built-in function 'fitensemble' for building an ensemble classifier with the AdaBoostM1 algorithm, but I would like to implement it myself.
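The surrounding boosting loop I have in mind is roughly the following (a minimal sketch for the binary case with labels in {-1,+1}; trainStump and predictStump are hypothetical placeholders for the weighted weak learner, and T is my choice of round count):

```matlab
T  = 50;                     % number of boosting rounds (assumption)
n  = size(X, 1);
Pt = ones(n, 1) / n;         % start from uniform weights
alpha  = zeros(T, 1);
stumps = cell(T, 1);
for t = 1:T
    stumps{t} = trainStump(X, Y, Pt);          % weak learner on weighted data
    pred = predictStump(stumps{t}, X);
    err  = sum(Pt(pred ~= Y));                 % weighted training error
    if err <= 0 || err >= 0.5, break; end      % degenerate round, stop
    alpha(t) = 0.5 * log((1 - err) / err);     % vote of this stump
    Pt = Pt .* exp(-alpha(t) * (Y .* pred));   % up-weight misclassified points
    Pt = Pt / sum(Pt);                         % renormalize
end
% Final classifier: sign of the alpha-weighted vote of the stumps.
```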

## Best Answer