November 29, 2012
Filed under: Uncategorized

3 Comments

  1. Sébastien Derivaux on Thu, 29th Nov 2012 7:45 pm

    I’m a big fan of decision trees and linear regression. If that’s not enough, I use a hand-made mix of genetic algorithms and ensemble classifiers.

    I still don’t get SVM, especially how you find the kernel to use. I love the idea of avoiding overfitting, but what’s the point if you then introduce a soft margin and a space transformation that destroy that nice feature? When I see the number of illustrations that use 4 points as support vectors, I think few people understand the basics of SVM (on Wikipedia, the French version uses 4 points, the German probably 4; only the English one is correct with 3 points).
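(Editor's aside: the soft-margin point above can be seen directly in code. This is a minimal sketch using Python and scikit-learn — an assumption, since no code or library appears in the thread — on synthetic data. The C parameter controls the soft margin: a small C tolerates many margin violations, so far more points end up as support vectors than the 3–4 shown in textbook illustrations.)

```python
# Soft-margin SVM sketch (scikit-learn assumed; synthetic data).
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Two overlapping classes in 2D.
X, y = make_classification(n_samples=200, n_features=2,
                           n_informative=2, n_redundant=0,
                           random_state=0)

# Small C = very soft margin; large C = closer to a hard margin.
clf_soft = SVC(kernel="rbf", C=0.01).fit(X, y)
clf_hard = SVC(kernel="rbf", C=100.0).fit(X, y)

# n_support_ counts support vectors per class; softening the margin
# typically recruits many more of the training points as support vectors.
print("soft margin SVs:", clf_soft.n_support_.sum())
print("hard margin SVs:", clf_hard.n_support_.sum())
```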

  2. Jeff on Fri, 30th Nov 2012 8:04 pm

    Sandro,

    My favorite “hammer” for classification and regression problems is gradient boosted regression trees. Seamless handling of missing values; mixed-type, potentially correlated predictors; high accuracy; variable importance measures; and partial dependence plots to understand the average marginal effects of the inputs make this my first go-to algorithm in the toolbox.

    Generally, I also like ensembles of different classifier types (hybrid ensembles).

  3. Allan Engelhardt on Fri, 30th Nov 2012 10:26 pm

    Another vote for gradient boosted tree ensembles as the first call for many problems, for the reasons @Jeff mentioned. (I most often use the gbm package in R.)

    After that, mixed ensembles. Often using the caret package, which provides a reasonably consistent interface to >140 different models with all the bagging, cross-validation/bootstrapping, and parallel computing already taken care of.

    Sounds like Jeff and I should set up an echo chamber somewhere.

    Great post, Sandro
