In class we talked briefly about the number of trees, and you mentioned experimenting with it, possibly starting at 10 trees and increasing incrementally until the model stops making misclassifications. Would it be worth starting with a large number of trees, say 100 (since after feature selection that may not be too computationally expensive), and then pruning incrementally? Also, if you start at 100 trees and the model stops misclassifying after tree 15, does it benefit any further from training those additional 85 learners?
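One way to explore this experimentally is to record the ensemble's training error after each boosting round and see where it first hits zero. Below is a minimal pure-Python sketch of AdaBoost with decision stumps on a small hypothetical 1-D dataset (the data, the 20-round budget, and all function names are illustrative, not from class):

```python
import math

# Hypothetical 1-D toy dataset: labels in {-1, +1}
X = [0.5, 1.5, 2.5, 3.5, 4.5, 5.5, 6.5, 7.5, 8.5, 9.5]
y = [+1, +1, +1, -1, -1, -1, -1, +1, +1, +1]

def best_stump(X, y, w):
    """Find the threshold/polarity stump minimizing weighted error."""
    best = None
    for thresh in [x - 0.5 for x in X] + [max(X) + 0.5]:
        for polarity in (+1, -1):
            preds = [polarity if x < thresh else -polarity for x in X]
            err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
            if best is None or err < best[0]:
                best = (err, thresh, polarity)
    return best

def adaboost(X, y, n_rounds):
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []          # list of (alpha, thresh, polarity)
    train_errors = []      # training error after each round
    for _ in range(n_rounds):
        err, thresh, pol = best_stump(X, y, w)
        err = max(err, 1e-12)              # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, thresh, pol))
        # Reweight: upweight the misclassified points
        preds = [pol if x < thresh else -pol for x in X]
        w = [wi * math.exp(-alpha * yi * p)
             for wi, yi, p in zip(w, y, preds)]
        z = sum(w)
        w = [wi / z for wi in w]
        # Training error of the ensemble built so far
        def F(x):
            return sum(a * (p if x < th else -p) for a, th, p in ensemble)
        mistakes = sum(1 for x, yi in zip(X, y)
                       if (1 if F(x) >= 0 else -1) != yi)
        train_errors.append(mistakes / n)
    return ensemble, train_errors

ensemble, errors = adaboost(X, y, 20)
first_zero = next((t + 1 for t, e in enumerate(errors) if e == 0.0), None)
print("training error per round:", errors)
print("first round reaching zero training error:", first_zero)
```

Plotting or printing `train_errors` this way makes the question concrete: once the curve flattens at zero, any benefit from further rounds would have to show up on held-out data rather than on the training set.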

Created by butcheer
