17  Linear Additive Tree

The Linear Additive Tree (LINAD) alternates between fitting a linear model and performing splits, each step fitting the gradient of the loss. The linear coefficients along each root-to-leaf path are summed, yielding a decision tree with a linear model in each terminal node. The result is a single, fully interpretable, and highly accurate model that rivals ensembles of traditional decision trees (CART), such as random forest and gradient boosting.
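
To make this concrete, here is a minimal conceptual sketch of a depth-1 linear additive tree in base R. It is not the rtemis implementation and omits, among other things, the learning rate and leaf selection; predict_lat1 is a hypothetical helper for this sketch only.

# Conceptual sketch only: root fits a linear model; each child fits a
# linear model to the residuals (the gradient of squared error, up to a
# constant) of the cases routed to it. A prediction sums the linear
# contributions along the root-to-leaf path.
set.seed(1999)
x <- runif(300, -2, 2)
y <- x + x^2 + rnorm(300, sd = .3)

root <- lm(y ~ x)                 # root linear model
r <- residuals(root)              # working response for the children
left <- x < 0                     # a single split
fit_l <- lm(r[left] ~ x[left])    # leaf model for x < 0
fit_r <- lm(r[!left] ~ x[!left])  # leaf model for x >= 0

predict_lat1 <- function(xnew) {
  p <- coef(root)[1] + coef(root)[2] * xnew         # root contribution
  cf <- if (xnew < 0) coef(fit_l) else coef(fit_r)  # leaf model on the path
  unname(p + cf[1] + cf[2] * xnew)                  # sum along the path
}
predict_lat1(1.5)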


At the same time, a random forest or gradient boosting ensemble trained with LINAD base learners can outperform CART ensembles while using considerably fewer trees (a new vignette on this is upcoming).

17.1 Synthetic Data

First, we create some simple synthetic data:

x <- rnormmat(500, 12, return.df = TRUE, seed = 1999)  # 500 cases, 12 standard normal features
y <- x[, 3] + x[, 5]^2 + x[, 9] + rnorm(500)           # two linear effects, one quadratic, plus noise
dat <- data.frame(x, y)
res <- resample(dat)                                   # stratified subsampling for train/test splits
05-20-25 07:25:13 Input contains more than one columns; will stratify on last :resample
.:Resampling Parameters
    n.resamples: 10 
      resampler: strat.sub 
   stratify.var: y 
        train.p: 0.75 
   strat.n.bins: 4 
05-20-25 07:25:13 Created 10 stratified subsamples :resample

dat_train <- dat[res$Subsample_1, ]
dat_test <- dat[-res$Subsample_1, ]
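
To confirm the split, we can check the dimensions of the two sets (12 features plus the outcome), which should match the Input Summary printed by each learner below:

dim(dat_train)  # 374 x 13
dim(dat_test)   # 126 x 13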

17.2 Models

We are going to train elastic net (GLMNET), CART, random forest (RF), gradient boosting (GBM), and Linear Additive Tree (LINAD) models.

GLMNET is tuned for alpha and lambda.
CART is tuned by cost-complexity pruning.
GBM is tuned for the number of trees.
LINAD is tuned for the number of leaf nodes.
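
As the CART call below illustrates with prune.cp, passing a vector of values for a tunable parameter triggers grid search with internal resampling. A hypothetical explicit call for GLMNET might look like the following (the default call below already tunes alpha and lambda over this grid):

mod <- s_GLMNET(dat_train, dat_test,
                alpha = seq(0, 1, by = .2))  # vector input triggers grid search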

17.2.1 GLMNET

mod_glmnet <- s_GLMNET(dat_train, dat_test)
05-20-25 07:25:13 Hello, egenn :s_GLMNET

.:Regression Input Summary
Training features: 374 x 12 
 Training outcome: 374 x 1 
 Testing features: 126 x 12 
  Testing outcome: 126 x 1 

05-20-25 07:25:13 Running grid search... :gridSearchLearn
.:Resampling Parameters
    n.resamples: 5 
      resampler: kfold 
   stratify.var: y 
   strat.n.bins: 4 
05-20-25 07:25:13 Created 5 independent folds :resample
.:Search parameters
    grid.params:  
                 alpha: 0, 0.2, 0.4, 0.6, 0.8, 1... 
   fixed.params:  
                             .gs: TRUE 
                 which.cv.lambda: lambda.1se 
05-20-25 07:25:13 Tuning Elastic Net by exhaustive grid search. :gridSearchLearn
05-20-25 07:25:13 5 inner resamples; 30 models total; running on 9 workers (aarch64-apple-darwin20) :gridSearchLearn
05-20-25 07:25:14 Extracting best lambda from GLMNET models... :gridSearchLearn
.:Best parameters to minimize MSE
   best.tune:  
              lambda: 0.464712037007993 
               alpha: 0.8 
05-20-25 07:25:14 Completed in 0.01 minutes (Real: 0.81; User: 0.17; System: 0.17) :gridSearchLearn

.:Parameters
    alpha: 0.8 
   lambda: 0.464712037007993 

05-20-25 07:25:14 Training elastic net model... :s_GLMNET

.:GLMNET Regression Training Summary
    MSE = 2.85
   RMSE = 1.69
    MAE = 1.29
      r = 0.68 (p = 1.8e-52)
   R sq = 0.40

.:GLMNET Regression Testing Summary
    MSE = 2.63
   RMSE = 1.62
    MAE = 1.23
      r = 0.67 (p = 5.8e-18)
   R sq = 0.40
05-20-25 07:25:14 Completed in 0.01 minutes (Real: 0.88; User: 0.22; System: 0.17) :s_GLMNET

mod_glmnet$plotVarImp()

As expected, GLMNET captures only the linear features.
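
We can see why: the quadratic feature's linear correlation with the outcome is near zero, so any linear model will assign it little weight.

cor(dat_train[[5]], dat_train$y)    # feature 5 enters the outcome as x^2; near-zero linear correlation
cor(dat_train[[5]]^2, dat_train$y)  # its square is what correlates with y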

17.2.2 CART

mod_cart <- s_CART(dat_train, dat_test,
                   cp = 0,                          # grow a full tree, ...
                   prune.cp = c(0, .001, .01, .1))  # ...then tune cost-complexity pruning
05-20-25 07:25:14 Hello, egenn :s_CART

.:Regression Input Summary
Training features: 374 x 12 
 Training outcome: 374 x 1 
 Testing features: 126 x 12 
  Testing outcome: 126 x 1 

05-20-25 07:25:14 Running grid search... :gridSearchLearn
.:Resampling Parameters
    n.resamples: 5 
      resampler: kfold 
   stratify.var: y 
   strat.n.bins: 4 
05-20-25 07:25:14 Created 5 independent folds :resample
.:Search parameters
    grid.params:  
                  maxdepth: 20 
                  minsplit: 2 
                 minbucket: 1 
                        cp: 0 
                  prune.cp: 0, 0.001, 0.01, 0.1 
   fixed.params:  
                         method: anova 
                          model: TRUE 
                     maxcompete: 0 
                   maxsurrogate: 0 
                   usesurrogate: 2 
                 surrogatestyle: 0 
                           xval: 0 
                           cost: 1, 1, 1, 1, 1, 1... 
                            ifw: TRUE 
                       ifw.type: 2 
                       upsample: FALSE 
                     downsample: FALSE 
                  resample.seed: NULL 
05-20-25 07:25:14 Tuning Classification and Regression Trees by exhaustive grid search. :gridSearchLearn
05-20-25 07:25:14 5 inner resamples; 20 models total; running on 9 workers (aarch64-apple-darwin20) :gridSearchLearn
.:Best parameters to minimize MSE
   best.tune:  
               maxdepth: 20 
               minsplit: 2 
              minbucket: 1 
                     cp: 0 
               prune.cp: 0.01 
05-20-25 07:25:14 Completed in 4.9e-03 minutes (Real: 0.29; User: 0.14; System: 0.09) :gridSearchLearn

05-20-25 07:25:14 Training CART... :s_CART

.:CART Regression Training Summary
    MSE = 1.26
   RMSE = 1.12
    MAE = 0.89
      r = 0.86 (p = 2.9e-109)
   R sq = 0.73

.:CART Regression Testing Summary
    MSE = 2.26
   RMSE = 1.50
    MAE = 1.18
      r = 0.70 (p = 4.2e-20)
   R sq = 0.48
05-20-25 07:25:14 Completed in 0.01 minutes (Real: 0.31; User: 0.15; System: 0.10) :s_CART

mod_cart$plotVarImp()

dplot3_cart(mod_cart)
05-20-25 07:25:14 Object is rtemis rpart model :dplot3_cart

17.2.3 RF

mod_rf <- s_Ranger(dat_train, dat_test,
                   mtry = 12)  # consider all 12 features at each split
05-20-25 07:25:14 Hello, egenn :s_Ranger

.:Regression Input Summary
Training features: 374 x 12 
 Training outcome: 374 x 1 
 Testing features: 126 x 12 
  Testing outcome: 126 x 1 

.:Parameters
   n.trees: 1000 
      mtry: 12 

05-20-25 07:25:14 Training Random Forest (ranger) Regression with 1000 trees... :s_Ranger

.:Ranger Regression Training Summary
    MSE = 0.23
   RMSE = 0.48
    MAE = 0.37
      r = 0.98 (p = 2.4e-275)
   R sq = 0.95

.:Ranger Regression Testing Summary
    MSE = 1.45
   RMSE = 1.21
    MAE = 0.98
      r = 0.84 (p = 2.6e-34)
   R sq = 0.67
05-20-25 07:25:14 Completed in 2.6e-03 minutes (Real: 0.16; User: 0.87; System: 0.02) :s_Ranger

mod_rf$plotVarImp()

17.2.4 GBM

mod_gbm <- s_GBM(dat_train, dat_test)  # number of trees is tuned by internal resampling
05-20-25 07:25:14 Hello, egenn :s_GBM

.:Regression Input Summary
Training features: 374 x 12 
 Training outcome: 374 x 1 
 Testing features: 126 x 12 
  Testing outcome: 126 x 1 
05-20-25 07:25:14 Distribution set to gaussian :s_GBM

05-20-25 07:25:14 Running Gradient Boosting Regression with a gaussian loss function :s_GBM

05-20-25 07:25:14 Running grid search... :gridSearchLearn
.:Resampling Parameters
    n.resamples: 5 
      resampler: kfold 
   stratify.var: y 
   strat.n.bins: 4 
05-20-25 07:25:14 Created 5 independent folds :resample
.:Search parameters
    grid.params:  
                 interaction.depth: 2 
                         shrinkage: 0.01 
                      bag.fraction: 0.9 
                    n.minobsinnode: 5 
   fixed.params:  
                           n.trees: 2000 
                         max.trees: 5000 
                 gbm.select.smooth: FALSE 
                       n.new.trees: 500 
                         min.trees: 50 
                    failsafe.trees: 500 
                               ifw: TRUE 
                          ifw.type: 2 
                          upsample: FALSE 
                        downsample: FALSE 
                     resample.seed: NULL 
                            relInf: FALSE 
                   plot.tune.error: FALSE 
                               .gs: TRUE 
05-20-25 07:25:14 Tuning Gradient Boosting Machine by exhaustive grid search. :gridSearchLearn
05-20-25 07:25:14 5 inner resamples; 5 models total; running on 9 workers (aarch64-apple-darwin20) :gridSearchLearn
05-20-25 07:25:15 Running grid line #1 of 5... :...future.FUN
05-20-25 07:25:15 Hello, egenn :s_GBM

.:Regression Input Summary
Training features: 298 x 12 
 Training outcome: 298 x 1 
 Testing features: 76 x 12 
  Testing outcome: 76 x 1 
05-20-25 07:25:15 Distribution set to gaussian :s_GBM

05-20-25 07:25:15 Running Gradient Boosting Regression with a gaussian loss function :s_GBM

.:Parameters
             n.trees: 2000 
   interaction.depth: 2 
           shrinkage: 0.01 
        bag.fraction: 0.9 
      n.minobsinnode: 5 
             weights: NULL 

.:GBM Regression Training Summary
    MSE = 0.47
   RMSE = 0.69
    MAE = 0.54
      r = 0.95 (p = 2.2e-150)
   R sq = 0.90
05-20-25 07:25:15 Using predict for Regression with type = link :s_GBM

.:GBM Regression Testing Summary
    MSE = 1.48
   RMSE = 1.22
    MAE = 0.93
      r = 0.85 (p = 2.6e-22)
   R sq = 0.72
05-20-25 07:25:15 Completed in 2.3e-03 minutes (Real: 0.14; User: 0.13; System: 0.01) :s_GBM
05-20-25 07:25:15 Running grid line #2 of 5... :...future.FUN
05-20-25 07:25:15 Hello, egenn :s_GBM

.:Regression Input Summary
Training features: 301 x 12 
 Training outcome: 301 x 1 
 Testing features: 73 x 12 
  Testing outcome: 73 x 1 
05-20-25 07:25:15 Distribution set to gaussian :s_GBM

05-20-25 07:25:15 Running Gradient Boosting Regression with a gaussian loss function :s_GBM

.:Parameters
             n.trees: 2000 
   interaction.depth: 2 
           shrinkage: 0.01 
        bag.fraction: 0.9 
      n.minobsinnode: 5 
             weights: NULL 

.:GBM Regression Training Summary
    MSE = 0.62
   RMSE = 0.79
    MAE = 0.61
      r = 0.94 (p = 4.1e-141)
   R sq = 0.87
05-20-25 07:25:15 Using predict for Regression with type = link :s_GBM

.:GBM Regression Testing Summary
    MSE = 1.31
   RMSE = 1.14
    MAE = 0.91
      r = 0.85 (p = 1.2e-21)
   R sq = 0.69
05-20-25 07:25:15 Completed in 2.2e-03 minutes (Real: 0.13; User: 0.12; System: 0.01) :s_GBM
05-20-25 07:25:15 Running grid line #3 of 5... :...future.FUN
05-20-25 07:25:15 Hello, egenn :s_GBM

.:Regression Input Summary
Training features: 298 x 12 
 Training outcome: 298 x 1 
 Testing features: 76 x 12 
  Testing outcome: 76 x 1 
05-20-25 07:25:15 Distribution set to gaussian :s_GBM

05-20-25 07:25:15 Running Gradient Boosting Regression with a gaussian loss function :s_GBM

.:Parameters
             n.trees: 2000 
   interaction.depth: 2 
           shrinkage: 0.01 
        bag.fraction: 0.9 
      n.minobsinnode: 5 
             weights: NULL 

.:GBM Regression Training Summary
    MSE = 0.49
   RMSE = 0.70
    MAE = 0.54
      r = 0.95 (p = 1.6e-149)
   R sq = 0.89
05-20-25 07:25:15 Using predict for Regression with type = link :s_GBM

.:GBM Regression Testing Summary
    MSE = 1.38
   RMSE = 1.17
    MAE = 0.92
      r = 0.87 (p = 1.8e-24)
   R sq = 0.74
05-20-25 07:25:15 Completed in 2.2e-03 minutes (Real: 0.13; User: 0.13; System: 0.01) :s_GBM
05-20-25 07:25:15 Running grid line #4 of 5... :...future.FUN
05-20-25 07:25:15 Hello, egenn :s_GBM

.:Regression Input Summary
Training features: 301 x 12 
 Training outcome: 301 x 1 
 Testing features: 73 x 12 
  Testing outcome: 73 x 1 
05-20-25 07:25:15 Distribution set to gaussian :s_GBM

05-20-25 07:25:15 Running Gradient Boosting Regression with a gaussian loss function :s_GBM

.:Parameters
             n.trees: 2000 
   interaction.depth: 2 
           shrinkage: 0.01 
        bag.fraction: 0.9 
      n.minobsinnode: 5 
             weights: NULL 

.:GBM Regression Training Summary
    MSE = 0.66
   RMSE = 0.81
    MAE = 0.64
      r = 0.94 (p = 1.2e-138)
   R sq = 0.87
05-20-25 07:25:15 Using predict for Regression with type = link :s_GBM

.:GBM Regression Testing Summary
    MSE = 1.01
   RMSE = 1.00
    MAE = 0.78
      r = 0.86 (p = 1.1e-22)
   R sq = 0.74
05-20-25 07:25:15 Completed in 2.2e-03 minutes (Real: 0.13; User: 0.12; System: 0.01) :s_GBM
05-20-25 07:25:15 Running grid line #5 of 5... :...future.FUN
05-20-25 07:25:15 Hello, egenn :s_GBM

.:Regression Input Summary
Training features: 298 x 12 
 Training outcome: 298 x 1 
 Testing features: 76 x 12 
  Testing outcome: 76 x 1 
05-20-25 07:25:15 Distribution set to gaussian :s_GBM

05-20-25 07:25:15 Running Gradient Boosting Regression with a gaussian loss function :s_GBM

.:Parameters
             n.trees: 2000 
   interaction.depth: 2 
           shrinkage: 0.01 
        bag.fraction: 0.9 
      n.minobsinnode: 5 
             weights: NULL 

.:GBM Regression Training Summary
    MSE = 0.71
   RMSE = 0.84
    MAE = 0.66
      r = 0.93 (p = 2.3e-130)
   R sq = 0.85
05-20-25 07:25:15 Using predict for Regression with type = link :s_GBM

.:GBM Regression Testing Summary
    MSE = 1.07
   RMSE = 1.04
    MAE = 0.80
      r = 0.89 (p = 1.3e-26)
   R sq = 0.78
05-20-25 07:25:15 Completed in 2.2e-03 minutes (Real: 0.13; User: 0.12; System: 0.01) :s_GBM
.:Best parameters to minimize MSE
   best.tune:  
                        n.trees: 990 
              interaction.depth: 2 
                      shrinkage: 0.01 
                   bag.fraction: 0.9 
                 n.minobsinnode: 5 
05-20-25 07:25:15 Completed in 0.01 minutes (Real: 0.31; User: 0.11; System: 0.05) :gridSearchLearn

.:Parameters
             n.trees: 990 
   interaction.depth: 2 
           shrinkage: 0.01 
        bag.fraction: 0.9 
      n.minobsinnode: 5 
             weights: NULL 
05-20-25 07:25:15 Training GBM on full training set... :s_GBM

.:GBM Regression Training Summary
    MSE = 0.62
   RMSE = 0.79
    MAE = 0.61
      r = 0.94 (p = 2.8e-170)
   R sq = 0.87
05-20-25 07:25:15 Calculating relative influence of variables... :s_GBM
05-20-25 07:25:15 Using predict for Regression with type = link :s_GBM

.:GBM Regression Testing Summary
    MSE = 1.27
   RMSE = 1.13
    MAE = 0.90
      r = 0.86 (p = 1.1e-37)
   R sq = 0.71
05-20-25 07:25:15 Completed in 0.01 minutes (Real: 0.43; User: 0.21; System: 0.06) :s_GBM

mod_gbm$plotVarImp()

17.2.5 LINAD

mod_linad <- s_LINAD(dat_train, dat_test)  # number of leaves is tuned by internal resampling
05-20-25 07:25:15 Hello, egenn :s_LINAD

.:Regression Input Summary
Training features: 374 x 12 
 Training outcome: 374 x 1 
 Testing features: 126 x 12 
  Testing outcome: 126 x 1 

.:Parameters
         max.leaves: 20 
      learning.rate: 0.5 
              gamma: 0.5 
           lin.type: glmnet 
              nvmax: 3 
              alpha: 1 
             lambda: 0.05 
   minobsinnode.lin: 10 
      part.minsplit: 2 
     part.minbucket: 1 
            part.cp: 0 
05-20-25 07:25:15 Training first Linear Model... :linadleaves
05-20-25 07:25:15 Working on node id #1... :linadleaves
05-20-25 07:25:15 Working on node id #2... :linadleaves
05-20-25 07:25:15 Working on node id #3... :linadleaves
05-20-25 07:25:15 Working on node id #4... :linadleaves
05-20-25 07:25:15 Working on node id #5... :linadleaves
05-20-25 07:25:15 Working on node id #10... :linadleaves
05-20-25 07:25:15 Working on node id #11... :linadleaves
05-20-25 07:25:15 Working on node id #20... :linadleaves
05-20-25 07:25:15 Working on node id #21... :linadleaves
05-20-25 07:25:15 Working on node id #8... :linadleaves
05-20-25 07:25:15 Working on node id #9... :linadleaves
05-20-25 07:25:15 Working on node id #22... :linadleaves
05-20-25 07:25:15 Working on node id #23... :linadleaves
05-20-25 07:25:15 Working on node id #40... :linadleaves
05-20-25 07:25:15 Working on node id #41... :linadleaves
05-20-25 07:25:15 Working on node id #16... :linadleaves
05-20-25 07:25:15 Working on node id #17... :linadleaves
05-20-25 07:25:15 Working on node id #44... :linadleaves
05-20-25 07:25:15 Working on node id #45... :linadleaves
05-20-25 07:25:15 Working on node id #42... :linadleaves
05-20-25 07:25:15 Working on node id #43... :linadleaves
05-20-25 07:25:15 Working on node id #80... :linadleaves
05-20-25 07:25:15 Working on node id #81... :linadleaves
05-20-25 07:25:15 Working on node id #32... :linadleaves
05-20-25 07:25:15 Working on node id #33... :linadleaves
05-20-25 07:25:15 Working on node id #84... :linadleaves
05-20-25 07:25:15 Working on node id #85... :linadleaves
05-20-25 07:25:15 Working on node id #160... :linadleaves
05-20-25 07:25:15 Working on node id #161... :linadleaves
05-20-25 07:25:15 Working on node id #64... :linadleaves
05-20-25 07:25:15 Working on node id #65... :linadleaves
05-20-25 07:25:15 Working on node id #168... :linadleaves
05-20-25 07:25:15 Working on node id #169... :linadleaves
05-20-25 07:25:15 Working on node id #320... :linadleaves
05-20-25 07:25:15 Working on node id #321... :linadleaves
05-20-25 07:25:15 Working on node id #128... :linadleaves
05-20-25 07:25:15 Working on node id #129... :linadleaves
05-20-25 07:25:16 Selected 10 leaves of 20 total :selectleaves
05-20-25 07:25:15 Training first Linear Model... :linadleaves
05-20-25 07:25:15 Working on node id #1... :linadleaves
05-20-25 07:25:15 Working on node id #2... :linadleaves
05-20-25 07:25:15 Working on node id #3... :linadleaves
05-20-25 07:25:15 Working on node id #4... :linadleaves
05-20-25 07:25:15 Working on node id #5... :linadleaves
05-20-25 07:25:15 Working on node id #10... :linadleaves
05-20-25 07:25:15 Working on node id #11... :linadleaves
05-20-25 07:25:15 Working on node id #20... :linadleaves
05-20-25 07:25:15 Working on node id #21... :linadleaves
05-20-25 07:25:15 Working on node id #42... :linadleaves
05-20-25 07:25:15 Working on node id #43... :linadleaves
05-20-25 07:25:15 Working on node id #8... :linadleaves
05-20-25 07:25:15 Working on node id #9... :linadleaves
05-20-25 07:25:15 Working on node id #40... :linadleaves
05-20-25 07:25:15 Working on node id #41... :linadleaves
05-20-25 07:25:15 Working on node id #16... :linadleaves
05-20-25 07:25:15 Working on node id #17... :linadleaves
05-20-25 07:25:15 Working on node id #86... :linadleaves
05-20-25 07:25:15 Working on node id #87... :linadleaves
05-20-25 07:25:15 Working on node id #84... :linadleaves
05-20-25 07:25:15 Working on node id #85... :linadleaves
05-20-25 07:25:15 Working on node id #80... :linadleaves
05-20-25 07:25:15 Working on node id #81... :linadleaves
05-20-25 07:25:15 Working on node id #32... :linadleaves
05-20-25 07:25:15 Working on node id #33... :linadleaves
05-20-25 07:25:15 Working on node id #172... :linadleaves
05-20-25 07:25:15 Working on node id #173... :linadleaves
05-20-25 07:25:15 Working on node id #168... :linadleaves
05-20-25 07:25:15 Working on node id #169... :linadleaves
05-20-25 07:25:15 Working on node id #160... :linadleaves
05-20-25 07:25:15 Working on node id #161... :linadleaves
05-20-25 07:25:15 Working on node id #336... :linadleaves
05-20-25 07:25:15 Working on node id #337... :linadleaves
05-20-25 07:25:15 Working on node id #64... :linadleaves
05-20-25 07:25:15 Working on node id #65... :linadleaves
05-20-25 07:25:15 Working on node id #344... :linadleaves
05-20-25 07:25:15 Working on node id #345... :linadleaves
05-20-25 07:25:16 Selected 3 leaves of 20 total :selectleaves
05-20-25 07:25:15 Training first Linear Model... :linadleaves
05-20-25 07:25:15 Working on node id #1... :linadleaves
05-20-25 07:25:15 Working on node id #2... :linadleaves
05-20-25 07:25:15 Working on node id #3... :linadleaves
05-20-25 07:25:15 Working on node id #4... :linadleaves
05-20-25 07:25:15 Working on node id #5... :linadleaves
05-20-25 07:25:15 Working on node id #10... :linadleaves
05-20-25 07:25:15 Working on node id #11... :linadleaves
05-20-25 07:25:15 Working on node id #20... :linadleaves
05-20-25 07:25:15 Working on node id #21... :linadleaves
05-20-25 07:25:15 Working on node id #22... :linadleaves
05-20-25 07:25:15 Working on node id #23... :linadleaves
05-20-25 07:25:15 Working on node id #8... :linadleaves
05-20-25 07:25:15 Working on node id #9... :linadleaves
05-20-25 07:25:15 Working on node id #16... :linadleaves
05-20-25 07:25:15 Working on node id #17... :linadleaves
05-20-25 07:25:15 Working on node id #40... :linadleaves
05-20-25 07:25:15 Working on node id #41... :linadleaves
05-20-25 07:25:15 Working on node id #32... :linadleaves
05-20-25 07:25:15 Working on node id #33... :linadleaves
05-20-25 07:25:15 Working on node id #42... :linadleaves
05-20-25 07:25:15 Working on node id #43... :linadleaves
05-20-25 07:25:15 Working on node id #80... :linadleaves
05-20-25 07:25:15 Working on node id #81... :linadleaves
05-20-25 07:25:15 Working on node id #64... :linadleaves
05-20-25 07:25:15 Working on node id #65... :linadleaves
05-20-25 07:25:15 Working on node id #160... :linadleaves
05-20-25 07:25:15 Working on node id #161... :linadleaves
05-20-25 07:25:15 Working on node id #84... :linadleaves
05-20-25 07:25:15 Working on node id #85... :linadleaves
05-20-25 07:25:15 Working on node id #128... :linadleaves
05-20-25 07:25:15 Working on node id #129... :linadleaves
05-20-25 07:25:15 Working on node id #168... :linadleaves
05-20-25 07:25:15 Working on node id #169... :linadleaves
05-20-25 07:25:15 Working on node id #320... :linadleaves
05-20-25 07:25:15 Working on node id #321... :linadleaves
05-20-25 07:25:15 Working on node id #256... :linadleaves
05-20-25 07:25:15 Working on node id #257... :linadleaves
05-20-25 07:25:16 Selected 10 leaves of 20 total :selectleaves
05-20-25 07:25:15 Training first Linear Model... :linadleaves
05-20-25 07:25:15 Working on node id #1... :linadleaves
05-20-25 07:25:15 Working on node id #2... :linadleaves
05-20-25 07:25:15 Working on node id #3... :linadleaves
05-20-25 07:25:15 Working on node id #4... :linadleaves
05-20-25 07:25:15 Working on node id #5... :linadleaves
05-20-25 07:25:15 Working on node id #10... :linadleaves
05-20-25 07:25:15 Working on node id #11... :linadleaves
05-20-25 07:25:15 Working on node id #20... :linadleaves
05-20-25 07:25:15 Working on node id #21... :linadleaves
05-20-25 07:25:15 Working on node id #42... :linadleaves
05-20-25 07:25:15 Working on node id #43... :linadleaves
05-20-25 07:25:15 Working on node id #8... :linadleaves
05-20-25 07:25:15 Working on node id #9... :linadleaves
05-20-25 07:25:15 Working on node id #40... :linadleaves
05-20-25 07:25:15 Working on node id #41... :linadleaves
05-20-25 07:25:15 Working on node id #16... :linadleaves
05-20-25 07:25:15 Working on node id #17... :linadleaves
05-20-25 07:25:15 Working on node id #86... :linadleaves
05-20-25 07:25:15 Working on node id #87... :linadleaves
05-20-25 07:25:15 Working on node id #80... :linadleaves
05-20-25 07:25:15 Working on node id #81... :linadleaves
05-20-25 07:25:15 Working on node id #84... :linadleaves
05-20-25 07:25:15 Working on node id #85... :linadleaves
05-20-25 07:25:15 Working on node id #172... :linadleaves
05-20-25 07:25:15 Working on node id #173... :linadleaves
05-20-25 07:25:15 Working on node id #32... :linadleaves
05-20-25 07:25:15 Working on node id #33... :linadleaves
05-20-25 07:25:15 Working on node id #160... :linadleaves
05-20-25 07:25:15 Working on node id #161... :linadleaves
05-20-25 07:25:15 Working on node id #168... :linadleaves
05-20-25 07:25:15 Working on node id #169... :linadleaves
05-20-25 07:25:15 Working on node id #344... :linadleaves
05-20-25 07:25:15 Working on node id #345... :linadleaves
05-20-25 07:25:15 Working on node id #64... :linadleaves
05-20-25 07:25:15 Working on node id #65... :linadleaves
05-20-25 07:25:15 Working on node id #320... :linadleaves
05-20-25 07:25:15 Working on node id #321... :linadleaves
05-20-25 07:25:16 Selected 11 leaves of 20 total :selectleaves
05-20-25 07:25:15 Training first Linear Model... :linadleaves
05-20-25 07:25:15 Working on node id #1... :linadleaves
05-20-25 07:25:15 Working on node id #2... :linadleaves
05-20-25 07:25:15 Working on node id #3... :linadleaves
05-20-25 07:25:15 Working on node id #6... :linadleaves
05-20-25 07:25:15 Working on node id #7... :linadleaves
05-20-25 07:25:15 Working on node id #12... :linadleaves
05-20-25 07:25:15 Working on node id #13... :linadleaves
05-20-25 07:25:15 Working on node id #26... :linadleaves
05-20-25 07:25:15 Working on node id #27... :linadleaves
05-20-25 07:25:15 Working on node id #52... :linadleaves
05-20-25 07:25:15 Working on node id #53... :linadleaves
05-20-25 07:25:15 Working on node id #4... :linadleaves
05-20-25 07:25:15 Working on node id #5... :linadleaves
05-20-25 07:25:15 Working on node id #8... :linadleaves
05-20-25 07:25:15 Working on node id #9... :linadleaves
05-20-25 07:25:15 Working on node id #16... :linadleaves
05-20-25 07:25:15 Working on node id #17... :linadleaves
05-20-25 07:25:15 Working on node id #32... :linadleaves
05-20-25 07:25:15 Working on node id #33... :linadleaves
05-20-25 07:25:15 Working on node id #64... :linadleaves
05-20-25 07:25:15 Working on node id #65... :linadleaves
05-20-25 07:25:15 Working on node id #128... :linadleaves
05-20-25 07:25:15 Working on node id #129... :linadleaves
05-20-25 07:25:15 Working on node id #256... :linadleaves
05-20-25 07:25:15 Working on node id #257... :linadleaves
05-20-25 07:25:15 Working on node id #512... :linadleaves
05-20-25 07:25:15 Working on node id #513... :linadleaves
05-20-25 07:25:15 Working on node id #1024... :linadleaves
05-20-25 07:25:15 Working on node id #1025... :linadleaves
05-20-25 07:25:15 Working on node id #2048... :linadleaves
05-20-25 07:25:15 Working on node id #2049... :linadleaves
05-20-25 07:25:15 Working on node id #4096... :linadleaves
05-20-25 07:25:15 Working on node id #4097... :linadleaves
05-20-25 07:25:15 Working on node id #8192... :linadleaves
05-20-25 07:25:15 Working on node id #8193... :linadleaves
05-20-25 07:25:15 Working on node id #16384... :linadleaves
05-20-25 07:25:15 Working on node id #16385... :linadleaves
05-20-25 07:25:16 Selected 4 leaves of 20 total :selectleaves
05-20-25 07:25:16 Training LINAD on full training set... :s_LINAD
05-20-25 07:25:16 Training Linear Additive Tree Regression (max leaves = 8)... :linadleaves
05-20-25 07:25:16 Training first Linear Model... :linadleaves
05-20-25 07:25:16 Working on node id #1... :linadleaves
05-20-25 07:25:16 Working on node id #2... :linadleaves
05-20-25 07:25:16 Working on node id #3... :linadleaves
05-20-25 07:25:16 Working on node id #6... :linadleaves
05-20-25 07:25:16 Working on node id #7... :linadleaves
05-20-25 07:25:16 Working on node id #12... :linadleaves
05-20-25 07:25:16 Working on node id #13... :linadleaves
05-20-25 07:25:16 Working on node id #26... :linadleaves
05-20-25 07:25:16 Working on node id #27... :linadleaves
05-20-25 07:25:16 Working on node id #52... :linadleaves
05-20-25 07:25:16 Working on node id #53... :linadleaves
05-20-25 07:25:16 Working on node id #4... :linadleaves
05-20-25 07:25:16 Working on node id #5... :linadleaves
05-20-25 07:25:16 Reached 8 leaves. :linadleaves

.:Regression Training Summary
    MSE = 0.82
   RMSE = 0.91
    MAE = 0.72
      r = 0.91 (p = 1.9e-146)
   R sq = 0.83

.:Regression Testing Summary
    MSE = 1.24
   RMSE = 1.11
    MAE = 0.90
      r = 0.85 (p = 4.4e-37)
   R sq = 0.72
05-20-25 07:25:16 Completed in 0.02 minutes (Real: 1.06; User: 0.29; System: 0.10) :s_LINAD
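
Collecting the test-set MSEs from the testing summaries above makes the comparison explicit: the single LINAD is competitive with the tree ensembles on this dataset.

# Test-set MSE, copied from the testing summaries above
c(GLMNET = 2.63, CART = 2.26, RF = 1.45, GBM = 1.27, LINAD = 1.24)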

17.2.5.1 Plot LINAD

We can plot a LINAD using dplot3_linad, which outputs an interactive decision tree.

Hovering over a node shows the linear coefficients at that node.

Notice how the sign and magnitude of the coefficient for the nonlinear feature change from one leaf node to another: LINAD has partitioned the case space into subgroups that are well described by separate linear models.

dplot3_linad(mod_linad)