R version 3 Examples


Contents

Chaos and Cluster
Uplift - Package
Neuralnet - Package
    compute
    gwplot
    confidence.interval
    neuralnet
randomForest – Package
    classCenter
    combine
    getTree
    grow
    importance
    imports85
    margin
    MDSplot
    na.roughfix
    outlier
    partialPlot
    plot.randomForest
    predict.randomForest
    randomForest
    varUsed
    partialPlot
    varImpPlot
    rfImpute
    rfcv
ESGtoolkit – Package
    esgcortest
    esgdiscountfactor
    esgfwdrates
    esgmartingaletest
    esgmccv
    esgmcprices
    esgplotbands
    esgplotshocks
    esgplotts
    simdiff
    simshocks
BayesBridge – Package (Bridge Regression)
    bridge.EM
    bridge.reg
    diabetes
    trace.beta
FNN - Package
    get.knn
    KL.dist
    KL.divergence
    knn
    knn.cv
    knn.dist
    knn.index
    knn.reg
    ownn
LogicReg - Package
    cumhaz
    eval.logreg
    frame.logreg
    Library(LogicReg)
CDLasso - Package
    cv.logit.reg
    Logit.reg
    plot.cv.logit.reg
    Print cv.logit.reg
    Print logit
    Summary logit
Support Vector Machines Major
    AusCredit
    diabetes
    hinge
Support Vector Machines
    wsvm
    wsvm.boost
    wsvm.predict
Rknn
    confusion
    eta
    fitted
kknn
    contr.dummy
    Glass
    Ionosphere
    Kknn

R version 3.0.2 (2013-09-25) -- "Frisbee Sailing"
Copyright (C) 2013 The R Foundation for Statistical Computing
Platform: x86_64-w64-mingw32/x64 (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

[Workspace loaded from ~/.RData]

The data sets used herein are either part of the packages invoked by library(package_name), or simulated with the given code.
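Each section below assumes its package is installed and attached. A minimal setup sketch (the package names are taken from the sections that follow; install only the ones you intend to run):

# Install once from CRAN; these are the packages exercised in the sections below
install.packages(c("uplift", "neuralnet", "randomForest", "ESGtoolkit"))
# Attach for the current session
library(uplift)
library(neuralnet)
library(randomForest)
library(ESGtoolkit)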


Chaos and Cluster

#--------------------------------------------------
# R version of Vincent Granville's Perl program
# chaos and cluster
#
# Recoded in R: J. Strickland - 2014-11-24
#--------------------------------------------------

#--------------- distance - function -----------
distance <- function(x, y, p, q) {
  dist <- abs(x-p) + abs(y-q)
  if (dist < 0.04 && dist > 0) {
    weight <- 1
  } else {
    weight <- 0.05 * exp(-20 * dist)
  }
  return(weight)
}

#------------- MAIN PROGRAM ------------------
n <- 100      # dark matter
m <- 500      # visible matter
niter <- 200

fixed_x <- 0; fixed_y <- 0
for (k in 1:n) {
  fixed_x[k] <- 2 * runif(1) - 0.5
  fixed_y[k] <- 2 * runif(1) - 0.5
}

init_x <- 0; init_y <- 0
moving_x <- 0; moving_y <- 0
rebirth <- 0; d2init <- 0; d2last <- 0
for (k in 1:m) {
  init_x[k] <- runif(1)
  init_y[k] <- runif(1)
  moving_x[k] <- init_x[k]
  moving_y[k] <- init_y[k]
  rebirth[k] <- 0
  d2init[k] <- 0
  d2last[k] <- 0
}

tmp_x <- 0; tmp_y <- 0
delta <- 0
to_R <- 0

for (iteration in 1:niter) {
  for (k in 1:m) {
    x <- moving_x[k]
    y <- moving_y[k]
    new_x <- 0
    new_y <- 0
    sum_weight <- 0
    for (l in 1:m) {
      p <- moving_x[l]
      q <- moving_y[l]
      weight <- distance(x, y, p, q)
      if (k == l) { weight <- 0 }
      new_x <- new_x + weight * p
      new_y <- new_y + weight * q
      sum_weight <- sum_weight + weight
    }
    for (l in 1:n) {
      p <- fixed_x[l]
      q <- fixed_y[l]
      weight <- distance(x, y, p, q)
      new_x <- new_x + weight * p
      new_y <- new_y + weight * q
      sum_weight <- sum_weight + weight
    }
    new_x <- new_x / sum_weight
    new_y <- new_y / sum_weight
    new_x <- new_x + 0.10 * (runif(1) - 0.50)
    new_y <- new_y + 0.10 * (runif(1) - 0.50)
    tmp_x[k] <- new_x
    tmp_y[k] <- new_y
    if (runif(1) < 0.1/(1 + iteration)) {
      tmp_x[k] <- runif(1)
      tmp_y[k] <- runif(1)
      rebirth[k] <- 1
    }
  }
  delta[iteration] <- 0
  for (k in 1:m) {
    delta[iteration] <- delta[iteration] + abs(moving_x[k] - tmp_x[k]) +
      abs(moving_y[k] - tmp_y[k])
    d2init[k] <- abs(moving_x[k] - init_x[k]) + abs(moving_y[k] - init_y[k])
    d2last[k] <- abs(moving_x[k] - tmp_x[k]) + abs(moving_y[k] - tmp_y[k])
    moving_x[k] <- tmp_x[k]
    moving_y[k] <- tmp_y[k]
  }
  delta[iteration] <- delta[iteration]/m
  iter_val <- rep(iteration, m)
  to_R_tmp <- cbind(iter_val, moving_x, moving_y, rebirth, d2init, d2last)
  to_R <- rbind(to_R, to_R_tmp)
} # for(iteration...

to_R <- as.data.frame(to_R[2:dim(to_R)[1], ])
colnames(to_R) <- NULL
rownames(to_R) <- NULL
names(to_R) <- c('iter', 'x', 'y', 'new', 'd2init', 'd2last')
save(to_R, file="chaoscluster.RData")
#load("chaoscluster.RData")

#----------------------------------------
# Plot
#----------------------------------------
vv <- to_R
iter <- to_R$iter
for (n in 1:200) {
  x <- vv$x[iter == n]
  y <- vv$y[iter == n]
  z <- vv$new[iter == n]
  plot(x, y, xlim=c(0,1), ylim=c(0,1), pch=20, col=1+z, xlab="", ylab="",
       axes=FALSE, main=paste(n))
}
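Both the update step and the rebirth test draw from runif(), so every run of the program above traces a different history. A minimal sketch for making a run repeatable (the seed value is arbitrary and not part of the original program):

set.seed(2014)   # arbitrary seed, used here only for reproducibility
# ...then run the program above; to_R and chaoscluster.RData will now be
# identical across runs.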


Restarting R session...

setwd("~/R/Work")
rfile2 <- read.delim("~/R/Work/rfile2.txt")
View(rfile2)
vv <- rfile2; summary(vv)
      iter              x                  y                  new
 Min.   :  0.00   Min.   :-0.2126   Min.   :-0.07772   Min.   :0.0000
 1st Qu.: 49.75   1st Qu.: 0.2314   1st Qu.: 0.29020   1st Qu.:0.0000
 Median : 99.50   Median : 0.5704   Median : 0.60222   Median :0.0000
 Mean   : 99.50   Mean   : 0.4775   Mean   : 0.51662   Mean   :0.3781
 3rd Qu.:149.25   3rd Qu.: 0.6894   3rd Qu.: 0.72682   3rd Qu.:1.0000
 Max.   :199.00   Max.   : 1.0564   Max.   : 1.05825   Max.   :1.0000
     d2init           d2last
 Min.   :0.0000   Min.   :0.00000
 1st Qu.:0.1828   1st Qu.:0.03551
 Median :0.3149   Median :0.05148
 Mean   :0.3860   Mean   :0.05444
 3rd Qu.:0.5265   3rd Qu.:0.06840
 Max.   :1.6528   Max.   :1.62335

iter <- vv$iter;
for (n in 0:199) {
  x <- vv$x[iter == n];
  y <- vv$y[iter == n];
  z <- vv$new[iter == n];
  u <- vv$d2init[iter == n];
  v <- vv$d2last[iter == n];
  plot(x, y, xlim=c(0,1), ylim=c(0,1), pch=20+z, cex=3*u, col=rgb(z/2,0,u/2),
       xlab="", ylab="", axes=TRUE);
  Sys.sleep(0.05); # sleep 0.05 second between each iteration
}

plot(x, y, xlim=c(0,1), ylim=c(0,1), pch=20, cex=5, col=rgb(z,0,0),
     xlab="", ylab="", axes=TRUE);

iter <- vv$iter;
for (n in 0:199) {
  x <- vv$x[iter == n];
  y <- vv$y[iter == n];
  z <- vv$new[iter == n];
  u <- vv$d2init[iter == n];
  v <- vv$d2last[iter == n];
  plot(x, y, xlim=c(0,1), ylim=c(0,1), pch=20, cex=5, col=rgb(z,0,0),
       xlab="", ylab="", axes=TRUE);
  Sys.sleep(0.05); # sleep 0.05 second between each iteration
}


Uplift - Package

> library(uplift)
> ### Simulate data
> set.seed(12345)
> dd <- sim_pte(n = 1000, p = 5, rho = 0, sigma = sqrt(2), beta.den = 4)
> dd$treat <- ifelse(dd$treat == 1, 1, 0) # required coding
> # for upliftRF
> ### Fit upliftRF model
> fit1 <- upliftRF(y ~ X1 + X2 + X3 + X4 + X5 + trt(treat), data = dd, mtry = 3, ntree = 50, split_method = "KL", minsplit = 100, verbose = TRUE)
uplift: status messages enabled; set "verbose" to false to disable
upliftRF: starting. Mon Nov 24 20:33:08 2014
10 out of 50 trees so far...
20 out of 50 trees so far...
30 out of 50 trees so far...
40 out of 50 trees so far...
> ### Fitted values on train data
> pred <- predict(fit1, dd)
> ### Compute uplift predictions
> uplift_pred <- pred[, 1] - pred[, 2]
> ### Put together data, predictions and add some dummy
> ### factors for illustration only
> dd2 <- data.frame(dd, uplift_pred, F1 = gl(2, 50, labels = c("A", "B")), F2 = gl(4, 25, labels = c("a", "b", "c", "d")))
> ### Profile data based on fitted model
> modelProfile(uplift_pred ~ X1 + X2 + X3 + F1 + F2, data = dd2, groups = 10, group_label = "D", digits_numeric = 2, digits_factor = 4, exclude_na = FALSE, LaTex = FALSE)
Group                   1       2       3       4       5       6       7       8
n                     102      98     100     100     100     100     100     100
uplift_pred Avg.   0.3292  0.2292  0.1537  0.0701  0.0110 -0.0536 -0.1174 -0.1935
X1          Avg.   0.8527  0.6420  0.3270  0.2959  0.1373  0.0014 -0.2662 -0.5927
X2          Avg.  -0.6372 -0.4831 -0.1386 -0.1330 -0.1548  0.2872  0.0672  0.0555
X3          Avg.   0.8339  0.5234  0.3197  0.1135 -0.1029 -0.0383 -0.3387 -0.3249
F1 A        Pctn.   43.14   48.98   52.00   54.00   50.00   48.00   51.00   52.00
   B        Pctn.   56.86   51.02   48.00   46.00   50.00   52.00   49.00   48.00
F2 a        Pctn.   24.51   24.49   21.00   26.00   26.00   24.00   34.00   21.00
   b        Pctn.   18.63   24.49   31.00   28.00   24.00   24.00   17.00   31.00
   c        Pctn.   27.45   25.51   27.00   22.00   25.00   27.00   22.00   20.00
   d        Pctn.   29.41   25.51   21.00   24.00   25.00   25.00   27.00   28.00
                        9      10     All
n                     100     100    1000
uplift_pred Avg.  -0.2734 -0.3871 -0.0230
X1          Avg.  -0.6762 -0.7797 -0.0054
X2          Avg.   0.3455  0.9568  0.0162
X3          Avg.  -0.4995 -0.7476 -0.0255
F1 A        Pctn.   51.00   50.00   50.00
   B        Pctn.   49.00   50.00   50.00
F2 a        Pctn.   20.00   29.00   25.00
   b        Pctn.   31.00   21.00   25.00
   c        Pctn.   29.00   25.00   25.00
   d        Pctn.   20.00   25.00   25.00
>
> varImportance(fit1, plotit = TRUE, normalize = TRUE)
  var   rel.imp
1  X1 28.436811
2  X3 25.067272
3  X2 24.001705
4  X4 21.240306
5  X5  1.253906
> dd_new <- sim_pte(n = 1000, p = 20, rho = 0, sigma = sqrt(2), beta.den = 4)
> dd_new$treat <- ifelse(dd_new$treat == 1, 1, 0)
>
> pred <- predict(fit1, dd_new)
> perf <- performance(pred[, 1], pred[, 2], dd_new$y, dd_new$treat, direction = 1)
> plot(perf[, 8] ~ perf[, 1], type ="l", xlab = "Decile", ylab = "uplift")
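The performance object above already holds the decile-level uplift, and the uplift package also ships a Qini summary that works from the same object. A minimal sketch, assuming the qini() function provided by the package version used here:

# Qini measure and curve from the same performance object (assumes uplift::qini)
qini(perf, direction = 1, plotit = TRUE)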

Neuralnet - Package

> library(neuralnet)
> AND <- c(rep(0,7),1)


> OR <- c(0,rep(1,7)) > binary.data <- data.frame(expand.grid(c(0,1), c(0,1), c(0,1)), AND, OR) > print(net <- neuralnet(AND+OR~Var1+Var2+Var3, binary.data, hidden=0, + rep=10, err.fct="ce", linear.output=FALSE)) Call: neuralnet(formula = AND + OR ~ Var1 + Var2 + Var3, data = binary.data, hidden = 0, rep = 10, err.fct = "ce", linear.output = FALSE) 10 repetitions were calculated. Error Reached Threshold Steps 10 0.07802098697 0.009613117711 214 1 0.07881299490 0.007719322902 252 4 0.07893010816 0.009090211230 234 3 0.08925890228 0.009360464818 234 5 0.08988221662 0.008571790734 214 2 0.09206391915 0.009250658588 208 8 0.09488379395 0.009517761057 207 7 0.09602741081 0.008967205495 219 6 0.09834742386 0.009193440375 201 9 0.10127610011 0.009876758710 210 > XOR <- c(0,1,1,0) > xor.data <- data.frame(expand.grid(c(0,1), c(0,1)), XOR) > print(net.xor <- neuralnet(XOR~Var1+Var2, xor.data, hidden=2, rep=5)) Call: neuralnet(formula = XOR ~ Var1 + Var2, data = xor.data, hidden = 2, rep = 5) 5 repetitions were calculated. Error Reached Threshold Steps 2 0.0002236366235 0.009696711619 131 3 0.0003617230205 0.008672975143 111 4 0.0005973089288 0.009747756656 122 1 0.2511316591172 0.007189065192 86 5 0.5000911293088 0.003483698761 23 > plot(net.xor, rep="best") > data(infert, package="datasets") > print(net.infert <- neuralnet(case~parity+induced+spontaneous, infert, + err.fct="ce", linear.output=FALSE, likelihood=TRUE)) Call: neuralnet(formula = case ~ parity + induced + spontaneous, data = infert, err.fct = "ce", linear.output = FALSE, likelihood = TRUE) 1 repetition was calculated. Error AIC BIC Reached Threshold Steps 1 129.2479977 270.4959955 291.576568 0.008781550507 1027

> gwplot(net.infert, selected.covariate="parity")
> gwplot(net.infert, selected.covariate="induced")


> gwplot(net.infert, selected.covariate="spontaneous")
> confidence.interval(net.infert)


$lower.ci $lower.ci[[1]] $lower.ci[[1]][[1]] [,1] [1,] 0.2312473800 [2,] -0.7454511041 [3,] -5.6716954641 [4,] -8.7473892090 $lower.ci[[1]][[2]] [,1] [1,] -0.7090350343 [2,] -6.4128063680 $upper.ci $upper.ci[[1]] $upper.ci[[1]][[1]] [,1] [1,] 2.8558540991 [2,] 4.5306810804 [3,] 0.7543410482 [4,] 0.9521395376 $upper.ci[[1]][[2]] [,1] [1,] 4.2563254315 [2,] -0.8108033267 $nic [1] 135.6746801

compute

> Var1 <- runif(50, 0, 100)
> sqrt.data <- data.frame(Var1, Sqrt=sqrt(Var1))
> print(net.sqrt <- neuralnet(Sqrt~Var1, sqrt.data, hidden=10,
+ threshold=0.01))
Call: neuralnet(formula = Sqrt ~ Var1, data = sqrt.data, hidden = 10, threshold = 0.01)


1 repetition was calculated. Error Reached Threshold Steps 1 0.0005444428632 0.009761098167 10106 > compute(net.sqrt, (1:10)^2)$net.result [,1] [1,] 0.9755329585 [2,] 1.9997586089 [3,] 2.9892482584 [4,] 4.0066401562 [5,] 4.9965270486 [6,] 6.0038153847 [7,] 6.9986646780 [8,] 7.9952143503 [9,] 9.0068085556 [10,] 9.9908895730 > Var1 <- rpois(100,0.5) > Var2 <- rbinom(100,2,0.6) > Var3 <- rbinom(100,1,0.5) > SUM <- as.integer(abs(Var1+Var2+Var3+(rnorm(100)))) > sum.data <- data.frame(Var1+Var2+Var3, SUM) > print(net.sum <- neuralnet(SUM~Var1+Var2+Var3, sum.data, hidden=1, + act.fct="tanh")) Call: neuralnet(formula = SUM ~ Var1 + Var2 + Var3, data = sum.data, hidden = 1, act.fct = "tanh") 1 repetition was calculated. Error Reached Threshold Steps 1 41.95083744 0.009657036729 38715 > prediction(net.sum) Data Error: 37.20436647; $rep1 Var1 Var2 Var3 SUM 1 0 0 0 0.3897391588 2 1 0 0 0.8609539971 3 0 1 0 0.8616593767 4 1 1 0 1.4795984331 5 2 1 0 2.2727061065 6 0 2 0 1.4805149536 7 1 2 0 2.2738684555 8 2 2 0 3.2648984685 9 0 0 1 0.8959745214 10 1 0 1 1.5241401870 11 0 1 1 1.5250712536 12 1 1 1 2.3303263128 13 2 1 1 3.3342912451 14 3 1 1 4.5440073237 15 0 2 1 2.3315054333 16 1 2 1 3.3357389327 17 2 2 1 4.5457191775 $data Var1 Var2 Var3 SUM 1 0 0 0 0.250000000 2 1 0 0 0.500000000 3 0 1 0 1.066666667 4 1 1 0 1.166666667 5 2 1 0 2.500000000 6 0 2 0 1.636363636 7 1 2 0 2.125000000 8 2 2 0 3.000000000 9 0 0 1 0.250000000 10 1 0 1 1.857142857 11 0 1 1 1.692307692 12 1 1 1 2.000000000 13 2 1 1 3.666666667 14 3 1 1 4.000000000 15 0 2 1 2.285714286


16 1 2 1 3.000000000 17 2 2 1 6.000000000 > Var1 <- runif(50, 0, 100) > sqrt.data <- data.frame(Var1, Sqrt=sqrt(Var1)) > print(net.sqrt <- neuralnet(Sqrt~Var1, sqrt.data, hidden=10, + threshold=0.01)) Call: neuralnet(formula = Sqrt ~ Var1, data = sqrt.data, hidden = 10, threshold = 0.01) 1 repetition was calculated. Error Reached Threshold Steps 1 0.001249507018 0.00993410697 3314 > compute(net.sqrt, (1:10)^2)$net.result [,1] [1,] 1.064706663 [2,] 2.002441059 [3,] 2.999868835 [4,] 4.001998099 [5,] 4.995255665 [6,] 6.006914827 [7,] 7.000937872 [8,] 7.989731307 [9,] 9.012385135 [10,] 9.986413428
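The compute() results above can be checked directly against the true square roots; a minimal sketch of that comparison (test.grid is a name introduced here for illustration, not part of the transcript):

test.grid <- (1:10)^2
pred <- compute(net.sqrt, test.grid)$net.result
# absolute error of the fitted network at each test point
abs(pred - sqrt(test.grid))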


gwplot

> data(infert, package="datasets")
> print(net.infert <- neuralnet(case~parity+induced+spontaneous,
+ infert, err.fct="ce", linear.output=FALSE))
Call: neuralnet(formula = case ~ parity + induced + spontaneous, data = infert, err.fct = "ce", linear.output = FALSE)
1 repetition was calculated.
        Error Reached Threshold Steps
1 129.2481898    0.00984615166  1167

confidence.interval

> confidence.interval(net.infert)
$lower.ci
$lower.ci[[1]]
$lower.ci[[1]][[1]]
              [,1]
[1,]  0.2344504548
[2,] -0.7615748667
[3,] -5.6852355556
[4,] -8.7681764114

$lower.ci[[1]][[2]]
             [,1]
[1,] -0.732291581
[2,] -6.449838839

$upper.ci
$upper.ci[[1]]
$upper.ci[[1]][[1]]
             [,1]
[1,] 2.8566941207
[2,] 4.5389845032
[3,] 0.7766057817
[4,] 0.9867359939

$upper.ci[[1]][[2]]
              [,1]
[1,]  4.2907826714
[2,] -0.7856493926

$nic
[1] 135.6867145


neuralnet

> AND <- c(rep(0,7),1)
> OR <- c(0,rep(1,7))
> binary.data <- data.frame(expand.grid(c(0,1), c(0,1), c(0,1)), AND, OR)
> print(net <- neuralnet(AND+OR~Var1+Var2+Var3, binary.data, hidden=0,
+ rep=10, err.fct="ce", linear.output=FALSE))
Call: neuralnet(formula = AND + OR ~ Var1 + Var2 + Var3, data = binary.data, hidden = 0, rep = 10, err.fct = "ce", linear.output = FALSE)
10 repetitions were calculated.
           Error Reached Threshold Steps
1  0.07252067539   0.007377296831   232
6  0.07271556646   0.007456594069   240
5  0.07901433519   0.007435613156   222
2  0.08230204015   0.008188579493   214
3  0.08479086054   0.009141743287   235
7  0.08497250349   0.009255382175   224
8  0.08756526572   0.009644800053   234
9  0.09541228605   0.009784532219   232
10 0.09770473515   0.009798468770   202
4  0.09943562915   0.009710846708   218
> data(infert, package="datasets")
> print(net.infert <- neuralnet(case~parity+induced+spontaneous, infert,
+ err.fct="ce", linear.output=FALSE, likelihood=TRUE))
Call: neuralnet(formula = case ~ parity + induced + spontaneous, data = infert, err.fct = "ce", linear.output = FALSE, likelihood = TRUE)
1 repetition was calculated.
        Error         AIC         BIC Reached Threshold Steps
1 129.2482901 270.4965802 291.5771527    0.009920571423  1200
> XOR <- c(0,1,1,0)
> xor.data <- data.frame(expand.grid(c(0,1), c(0,1)), XOR)
> print(net.xor <- neuralnet( XOR~Var1+Var2, xor.data, hidden=2, rep=5))
Call: neuralnet(formula = XOR ~ Var1 + Var2, data = xor.data, hidden = 2, rep = 5)
5 repetitions were calculated.
            Error Reached Threshold Steps
4 0.0003242063787   0.009480068047   109
2 0.0004724774632   0.008996410250   103
3 0.0007796948635   0.009830825671    48
5 0.2596369670859   0.006835598652    72
1 0.5022560600435   0.009248338334    16
> plot(net.xor, rep="best")

> Var1 <- rpois(100,0.5) > Var2 <- rbinom(100,2,0.6) > Var3 <- rbinom(100,1,0.5) > SUM <- as.integer(abs(Var1+Var2+Var3+(rnorm(100)))) > sum.data <- data.frame(Var1+Var2+Var3, SUM) > print(net.sum <- neuralnet( SUM~Var1+Var2+Var3, sum.data, hidden=1, + act.fct="tanh")) Call: neuralnet(formula = SUM ~ Var1 + Var2 + Var3, data = sum.data, hidden = 1, act.fct = "tanh") 1 repetition was calculated. Error Reached Threshold Steps 1 47.06302463 0.009898245083 10539 > main <- glm(SUM~Var1+Var2+Var3, sum.data, family=poisson()) > full <- glm(SUM~Var1*Var2*Var3, sum.data, family=poisson()) > prediction(net.sum, list.glm=list(main=main, full=full)) Data Error: 39.14255952; $rep1 Var1 Var2 Var3 SUM 1 0 0 0 0.2130806092 2 1 0 0 0.6712515540 3 0 1 0 0.7763467180 4 1 1 0 1.4859320658 5 2 1 0 2.4506886346 6 3 1 0 3.6633944592 7 0 2 0 1.6432365986 8 1 2 0 2.6558891467 9 2 2 0 3.9078565476 10 3 2 0 5.3074490112 11 0 0 1 0.5489039243 12 2 0 1 2.0189591271 13 0 1 1 1.3002897500 14 1 1 1 2.2045743246 15 0 2 1 2.3989747891 16 1 2 1 3.6010710262 $data Var1 Var2 Var3 SUM 1 0 0 0 0.500000000


2 1 0 0 0.500000000 3 0 1 0 0.687500000 4 1 1 0 1.400000000 5 2 1 0 3.000000000 6 3 1 0 5.000000000 7 0 2 0 1.444444444 8 1 2 0 2.666666667 9 2 2 0 4.000000000 10 3 2 0 5.000000000 11 0 0 1 0.200000000 12 2 0 1 3.000000000 13 0 1 1 1.733333333 14 1 1 1 1.428571429 15 0 2 1 2.555555556 16 1 2 1 3.600000000 $glm.main Var1 Var2 Var3 SUM 1 0 0 0 0.4377753113 2 1 0 0 0.6955573633 3 0 1 0 0.8287925421 4 1 1 0 1.3168233576 5 2 1 0 2.0922289559 6 3 1 0 3.3242287043 7 0 2 0 1.5690630789 8 1 2 0 2.4929989193 9 2 2 0 3.9609902846 10 3 2 0 6.2934018596 11 0 0 1 0.6791105552 12 2 0 1 1.7143672218 13 0 1 1 1.2856863986 14 1 1 1 2.0427571367 15 0 2 1 2.4340506902 16 1 2 1 3.8673306520 $glm.full Var1 Var2 Var3 SUM 1 0 0 0 0.3107704068 2 1 0 0 0.7495434164 3 0 1 0 0.7097095400 4 1 1 0 1.3601600029 5 2 1 0 2.6067498451 6 3 1 0 4.9958422102 7 0 2 0 1.6207708974 8 1 2 0 2.4682162408 9 2 2 0 3.7587616000 10 3 2 0 5.7240887292 11 0 0 1 0.6670525522 12 2 0 1 1.9448875015 13 0 1 1 1.3256762918 14 1 1 1 1.9025562492 15 0 2 1 2.6346014638 16 1 2 1 3.1779550006

randomForest – Package

classCenter

> data(iris)
> iris.rf <- randomForest(iris[,-5], iris[,5], prox=TRUE)
> iris.p <- classCenter(iris[,-5], iris[,5], iris.rf$prox)
> plot(iris[,3], iris[,4], pch=21, xlab=names(iris)[3], ylab=names(iris)[4],
+ bg=c("red", "blue", "green")[as.numeric(factor(iris$Species))],
+ main="Iris Data with Prototypes")
> points(iris.p[,3], iris.p[,4], pch=21, cex=2, bg=c("red", "blue", "green"))


combine

> rf1 <- randomForest(Species ~ ., iris, ntree=50, norm.votes=FALSE)
> rf2 <- randomForest(Species ~ ., iris, ntree=50, norm.votes=FALSE)
> rf3 <- randomForest(Species ~ ., iris, ntree=50, norm.votes=FALSE)
> rf.all <- combine(rf1, rf2, rf3)
> print(rf.all)
Call: randomForest(formula = Species ~ ., data = iris, ntree = 50, norm.votes = FALSE)
               Type of random forest: classification
                     Number of trees: 150
No. of variables tried at each split: 2
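combine() is also what makes it convenient to grow a forest in pieces, for example on several cores, and then merge the pieces. A minimal sketch using the parallel package (the use of mclapply and mc.cores = 3 is a choice made here for illustration; mclapply forking is not available on Windows):

library(parallel)
# Grow three 50-tree forests in parallel, then merge them with combine()
rf.list <- mclapply(1:3, function(i) {
  randomForest(Species ~ ., iris, ntree = 50, norm.votes = FALSE)
}, mc.cores = 3)
rf.par <- do.call(combine, rf.list)
print(rf.par)   # a single 150-tree forest, as in the example above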

getTree

> data(iris)
> ## Look at the third trees in the forest.
> getTree(randomForest(iris[,-5], iris[,5], ntree=10), 3, labelVar=TRUE)
   left daughter right daughter    split var split point status prediction
1              2              3  Petal.Width        0.75      1       <NA>
2              0              0         <NA>        0.00     -1     setosa
3              4              5  Petal.Width        1.65      1       <NA>
4              6              7  Sepal.Width        2.65      1       <NA>
5              8              9  Sepal.Width        3.15      1       <NA>
6             10             11 Petal.Length        4.70      1       <NA>
7              0              0         <NA>        0.00     -1 versicolor
8              0              0         <NA>        0.00     -1  virginica
9             12             13 Petal.Length        4.95      1       <NA>
10             0              0         <NA>        0.00     -1 versicolor
11             0              0         <NA>        0.00     -1  virginica
12             0              0         <NA>        0.00     -1 versicolor
13             0              0         <NA>        0.00     -1  virginica


grow

> data(iris)
> iris.rf <- randomForest(Species ~ ., iris, ntree=50, norm.votes=FALSE)
> iris.rf <- grow(iris.rf, 50)
> print(iris.rf)
Call: randomForest(formula = Species ~ ., data = iris, ntree = 50, norm.votes = FALSE)
               Type of random forest: classification
                     Number of trees: 100
No. of variables tried at each split: 2

importance

> set.seed(4543)
> data(mtcars)
> mtcars.rf <- randomForest(mpg ~ ., data=mtcars, ntree=1000,
+ keep.forest=FALSE, importance=TRUE)
> importance(mtcars.rf)
          %IncMSE IncNodePurity
cyl  17.061578919  179.77046781
disp 19.020929325  238.25232388
hp   18.195447663  208.00131828
drat  6.677706668   57.73285116
wt   17.858615937  254.21778641
qsec  5.849070524   30.00726819
vs    5.467722377   30.71093409
am    3.933136954   14.62663852
gear  5.129810569   18.25656968
carb  8.332737587   24.93136540
> importance(mtcars.rf, type=1)
          %IncMSE
cyl  17.061578919
disp 19.020929325
hp   18.195447663
drat  6.677706668
wt   17.858615937
qsec  5.849070524
vs    5.467722377
am    3.933136954
gear  5.129810569
carb  8.332737587
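The importance matrix is ordinary numeric output, so ranking the predictors is a one-liner; a minimal sketch (the sorting step is added here for illustration, not part of the original transcript):

imp <- importance(mtcars.rf, type=1)
# predictors sorted by permutation importance (%IncMSE), largest first
imp[order(imp[, 1], decreasing = TRUE), , drop = FALSE]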

imports85

> data(imports85)
> imp85 <- imports85[,-2] # Too many NAs in normalizedLosses.
> imp85 <- imp85[complete.cases(imp85), ]
> ## Drop empty levels for factors.
> imp85[] <- lapply(imp85, function(x) if (is.factor(x)) x[, drop=TRUE] else x)
> stopifnot(require(randomForest))
> price.rf <- randomForest(price ~ ., imp85, do.trace=10, ntree=100)
     |      Out-of-bag   |
Tree |      MSE  %Var(y) |
  10 | 5.709e+06    8.77 |
  20 | 4.514e+06    6.93 |
  30 |  4.19e+06    6.44 |
  40 | 4.051e+06    6.22 |
  50 | 3.818e+06    5.87 |
  60 | 3.708e+06    5.70 |
  70 | 3.827e+06    5.88 |
  80 | 3.787e+06    5.82 |
  90 | 3.836e+06    5.89 |
 100 | 3.808e+06    5.85 |
> print(price.rf)
Call: randomForest(formula = price ~ ., data = imp85, do.trace = 10, ntree = 100)
               Type of random forest: regression
                     Number of trees: 100
No. of variables tried at each split: 8

          Mean of squared residuals: 3807719.449
                    % Var explained: 94.15
> numDoors.rf <- randomForest(numOfDoors ~ ., imp85, do.trace=10, ntree=100)
ntree      OOB      1      2
   10:  15.62% 10.81% 22.22%
   20:  13.47% 10.71% 17.28%
   30:  10.88%  8.04% 14.81%
   40:  10.88%  8.04% 14.81%
   50:  12.44%  8.93% 17.28%
   60:  11.92%  8.93% 16.05%
   70:  11.92%  8.93% 16.05%
   80:  12.44%  8.93% 17.28%
   90:  11.40%  8.93% 14.81%
  100:  11.92%  8.93% 16.05%
> print(numDoors.rf)
Call: randomForest(formula = numOfDoors ~ ., data = imp85, do.trace = 10, ntree = 100)
               Type of random forest: classification
                     Number of trees: 100
No. of variables tried at each split: 4

        OOB estimate of error rate: 11.92%
Confusion matrix:
     four two   class.error
four  102  10 0.08928571429
two    13  68 0.16049382716

margin

> set.seed(1)
> data(iris)
> iris.rf <- randomForest(Species ~ ., iris, keep.forest=FALSE)
> plot(margin(iris.rf))
Loading required package: RColorBrewer
Warning message:
package ‘RColorBrewer’ was built under R version 3.1.0

MDSplot

> set.seed(1)
> data(iris)
> iris.rf <- randomForest(Species ~ ., iris, proximity=TRUE,
+ keep.forest=FALSE)
> MDSplot(iris.rf, iris$Species)
> ## Using different symbols for the classes:
> MDSplot(iris.rf, iris$Species, palette=rep(1, 3), pch=as.numeric(iris$Species))

na.roughfix

> data(iris)
> iris.na <- iris
> set.seed(111)
> ## artificially drop some data values.
> for (i in 1:4) iris.na[sample(150, sample(20)), i] <- NA
> iris.roughfix <- na.roughfix(iris.na)
> iris.narf <- randomForest(Species ~ ., iris.na, na.action=na.roughfix)
> print(iris.narf)
Call: randomForest(formula = Species ~ ., data = iris.na, na.action = na.roughfix)
               Type of random forest: classification
                     Number of trees: 500
No. of variables tried at each split: 2

        OOB estimate of error rate: 4.67%
Confusion matrix:
           setosa versicolor virginica class.error
setosa         50          0         0        0.00
versicolor      0         46         4        0.08
virginica       0          3        47        0.06

outlier

> set.seed(1)
> iris.rf <- randomForest(iris[,-5], iris[,5], proximity=TRUE)
> plot(outlier(iris.rf), type="h",
+ col=c("red", "green", "blue")[as.numeric(iris$Species)])

partialPlot

> data(iris)
> set.seed(1)
> iris.rf <- randomForest(iris[,-5], iris[,5], proximity=TRUE)
> plot(outlier(iris.rf), type="h",
+ col=c("red", "green", "blue")[as.numeric(iris$Species)])
>
>
> set.seed(543)
> iris.rf <- randomForest(Species~., iris)
> partialPlot(iris.rf, iris, Petal.Width, "versicolor")
> ## Looping over variables ranked by importance:
> data(airquality)
> airquality <- na.omit(airquality)
> set.seed(131)
> ozone.rf <- randomForest(Ozone ~ ., airquality, importance=TRUE)
> imp <- importance(ozone.rf)
> impvar <- rownames(imp)[order(imp[, 1], decreasing=TRUE)]
> op <- par(mfrow=c(2, 3))
> for (i in seq_along(impvar)) {
+ partialPlot(ozone.rf, airquality, impvar[i], xlab=impvar[i],
+ main=paste("Partial Dependence on", impvar[i]),
+ ylim=c(30, 70))
+ }
> par(op)


plot.randomForest

> data(mtcars)
> plot(randomForest(mpg ~ ., mtcars, keep.forest=FALSE, ntree=100), log="y")
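The predict.randomForest transcript that follows indexes iris with an ind vector and reuses an iris.rf fit, but the lines that create them were not captured. A minimal sketch of a typical train/test setup so the calls below have something to refer to (the seed and the 80/20 split probabilities are assumptions, not from the transcript):

set.seed(17)
ind <- sample(2, nrow(iris), replace = TRUE, prob = c(0.8, 0.2))  # 1 = train, 2 = test
iris.rf <- randomForest(Species ~ ., data = iris[ind == 1, ])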


predict.randomForest > ## Proximities. > predict(iris.rf, iris[ind == 2,], proximity=TRUE) $predicted 18 25 40 44 48 55 60 setosa setosa setosa setosa setosa versicolor versicolor 65 71 78 82 92 97 99 versicolor virginica virginica versicolor versicolor versicolor versicolor 100 103 105 110 113 114 115 versicolor virginica virginica virginica virginica virginica virginica 118 123 127 128 129 134 139 virginica virginica virginica virginica virginica versicolor virginica 141 142 virginica virginica Levels: setosa versicolor virginica $proximity 18 25 40 44 48 55 60 65 71 78 82 92 97 18 1.000 1.000 1.000 1.000 0.998 0.000 0.010 0.000 0.000 0.000 0.000 0.000 0.000 25 1.000 1.000 1.000 1.000 0.998 0.000 0.010 0.000 0.000 0.000 0.000 0.000 0.000 40 1.000 1.000 1.000 1.000 0.998 0.000 0.010 0.000 0.000 0.000 0.000 0.000 0.000 44 1.000 1.000 1.000 1.000 0.998 0.000 0.010 0.000 0.000 0.000 0.000 0.000 0.000 48 0.998 0.998 0.998 0.998 1.000 0.000 0.010 0.000 0.000 0.000 0.000 0.000 0.000 55 0.000 0.000 0.000 0.000 0.000 1.000 0.640 0.764 0.272 0.146 0.696 0.812 0.766 60 0.010 0.010 0.010 0.010 0.010 0.640 1.000 0.788 0.214 0.048 0.780 0.756 0.788 65 0.000 0.000 0.000 0.000 0.000 0.764 0.788 1.000 0.298 0.086 0.902 0.894 0.998 71 0.000 0.000 0.000 0.000 0.000 0.272 0.214 0.298 1.000 0.282 0.262 0.318 0.300 78 0.000 0.000 0.000 0.000 0.000 0.146 0.048 0.086 0.282 1.000 0.064 0.100 0.088 82 0.000 0.000 0.000 0.000 0.000 0.696 0.780 0.902 0.262 0.064 1.000 0.804 0.902 92 0.000 0.000 0.000 0.000 0.000 0.812 0.756 0.894 0.318 0.100 0.804 1.000 0.896 97 0.000 0.000 0.000 0.000 0.000 0.766 0.788 0.998 0.300 0.088 0.902 0.896 1.000 99 0.014 0.014 0.014 0.014 0.014 0.584 0.898 0.772 0.198 0.036 0.852 0.682 0.772 100 0.000 0.000 0.000 0.000 0.000 0.770 0.794 0.990 0.292 0.088 0.910 0.888 0.992 103 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.422 0.558 0.000 0.000 0.000 105 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.428 0.562 0.000 0.000 0.000 110 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.414 0.530 0.000 0.000 0.000 113 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.426 0.566 0.000 0.000 0.000 114 0.000 0.000 0.000 0.000 0.000 0.006 0.010 0.010 0.510 0.432 0.014 0.006 0.012 115 0.000 0.000 0.000 0.000 0.000 0.002 0.002 0.002 0.500 0.434 0.002 0.002 0.002 118 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.414 0.530 0.000 0.000 0.000 123 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.414 0.540 0.000 0.000 0.000 127 0.000 0.000 0.000 0.000 0.000 0.282 0.208 0.278 0.938 0.294 0.254 0.300 0.280 128 0.000 0.000 0.000 0.000 0.000 0.056 0.040 0.062 0.690 0.368 0.050 0.076 0.064


129 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.430 0.548 0.000 0.000 0.000 134 0.000 0.000 0.000 0.000 0.000 0.338 0.194 0.226 0.002 0.478 0.182 0.252 0.226 139 0.000 0.000 0.000 0.000 0.000 0.272 0.214 0.298 0.994 0.282 0.262 0.320 0.300 141 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.426 0.564 0.000 0.000 0.000 142 0.000 0.000 0.000 0.000 0.000 0.002 0.002 0.002 0.430 0.576 0.002 0.002 0.002 99 100 103 105 110 113 114 115 118 123 127 128 129 18 0.014 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 25 0.014 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 40 0.014 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 44 0.014 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 48 0.014 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 55 0.584 0.770 0.000 0.000 0.000 0.000 0.006 0.002 0.000 0.000 0.282 0.056 0.000 60 0.898 0.794 0.000 0.000 0.000 0.000 0.010 0.002 0.000 0.000 0.208 0.040 0.000 65 0.772 0.990 0.000 0.000 0.000 0.000 0.010 0.002 0.000 0.000 0.278 0.062 0.000 71 0.198 0.292 0.422 0.428 0.414 0.426 0.510 0.500 0.414 0.414 0.938 0.690 0.430 78 0.036 0.088 0.558 0.562 0.530 0.566 0.432 0.434 0.530 0.540 0.294 0.368 0.548 82 0.852 0.910 0.000 0.000 0.000 0.000 0.014 0.002 0.000 0.000 0.254 0.050 0.000 92 0.682 0.888 0.000 0.000 0.000 0.000 0.006 0.002 0.000 0.000 0.300 0.076 0.000 97 0.772 0.992 0.000 0.000 0.000 0.000 0.012 0.002 0.000 0.000 0.280 0.064 0.000 99 1.000 0.780 0.000 0.000 0.000 0.000 0.012 0.002 0.000 0.000 0.192 0.038 0.000 100 0.780 1.000 0.000 0.000 0.000 0.000 0.012 0.002 0.000 0.000 0.282 0.056 0.000 103 0.000 0.000 1.000 0.976 0.962 0.980 0.676 0.722 0.962 0.970 0.442 0.570 0.952 105 0.000 0.000 0.976 1.000 0.942 0.992 0.690 0.736 0.942 0.950 0.450 0.584 0.976 110 0.000 0.000 0.962 0.942 1.000 0.944 0.656 0.700 1.000 0.964 0.424 0.550 0.918 113 0.000 0.000 0.980 0.992 0.944 1.000 0.684 0.730 0.944 0.952 0.448 0.580 0.968 114 0.012 0.012 0.676 0.690 0.656 0.684 1.000 0.896 0.656 0.672 0.494 0.682 0.700 115 0.002 0.002 0.722 0.736 0.700 0.730 0.896 1.000 0.700 0.722 0.486 0.690 0.752 118 0.000 0.000 0.962 0.942 1.000 0.944 0.656 0.700 1.000 0.964 0.424 0.550 0.918 123 0.000 0.000 0.970 0.950 0.964 0.952 0.672 0.722 0.964 1.000 0.438 0.562 0.954 127 0.192 0.282 0.442 0.450 0.424 0.448 0.494 0.486 0.424 0.438 1.000 0.662 0.460 128 0.038 0.056 0.570 0.584 0.550 0.580 0.682 0.690 0.550 0.562 0.662 1.000 0.588 129 0.000 0.000 0.952 0.976 0.918 0.968 0.700 0.752 0.918 0.954 0.460 0.588 1.000 134 0.154 0.226 0.234 0.242 0.228 0.240 0.130 0.166 0.228 0.244 0.004 0.044 0.256 139 0.198 0.292 0.424 0.430 0.414 0.428 0.512 0.502 0.414 0.416 0.942 0.696 0.432 141 0.000 0.000 0.978 0.990 0.946 0.998 0.682 0.728 0.946 0.950 0.448 0.578 0.966 142 0.002 0.002 0.922 0.916 0.888 0.922 0.706 0.760 0.888 0.894 0.454 0.602 0.892 134 139 141 142 18 0.000 0.000 0.000 0.000 25 0.000 0.000 0.000 0.000 40 0.000 0.000 0.000 0.000 44 0.000 0.000 0.000 0.000 48 0.000 0.000 0.000 0.000 55 0.338 0.272 0.000 0.002 60 0.194 0.214 0.000 0.002 65 0.226 0.298 0.000 0.002 71 0.002 0.994 0.426 0.430 78 0.478 0.282 0.564 0.576 82 0.182 0.262 0.000 0.002 92 0.252 0.320 0.000 0.002 97 0.226 0.300 0.000 0.002 99 0.154 0.198 0.000 0.002 100 0.226 0.292 0.000 0.002 103 0.234 0.424 0.978 0.922 105 0.242 0.430 0.990 0.916 110 0.228 0.414 0.946 0.888 113 0.240 0.428 0.998 0.922 114 0.130 0.512 0.682 0.706 115 0.166 0.502 0.728 0.760 118 0.228 0.414 0.946 0.888 123 0.244 0.416 0.950 0.894 
127 0.004 0.942 0.448 0.454 128 0.044 0.696 0.578 0.602 129 0.256 0.432 0.966 0.892 134 1.000 0.002 0.238 0.246 139 0.002 1.000 0.428 0.432 141 0.238 0.428 1.000 0.922 142 0.246 0.432 0.922 1.000

## Nodes matrix. > str(attr(predict(iris.rf, iris[ind == 2,], nodes=TRUE), "nodes"))


int [1:30, 1:500] 10 10 10 10 10 13 9 13 13 15 ... - attr(*, "dimnames")=List of 2 ..$ : chr [1:30] "18" "25" "40" "44" ... ..$ : chr [1:500] "1" "2" "3" "4" ...

randomForest

> ## Classification:
> ##data(iris)
> set.seed(71)
> iris.rf <- randomForest(Species ~ ., data=iris, importance=TRUE,
+ proximity=TRUE)
> print(iris.rf)
Call: randomForest(formula = Species ~ ., data = iris, importance = TRUE, proximity = TRUE)
               Type of random forest: classification
                     Number of trees: 500
No. of variables tried at each split: 2

        OOB estimate of error rate: 5.33%
Confusion matrix:
           setosa versicolor virginica class.error
setosa         50          0         0        0.00
versicolor      0         46         4        0.08
virginica       0          4        46        0.08

> ## Look at variable importance: > round(importance(iris.rf), 2) setosa versicolor virginica MeanDecreaseAccuracy MeanDecreaseGini Sepal.Length 6.04 7.85 7.93 11.51 8.77 Sepal.Width 4.40 1.03 5.44 5.40 2.19 Petal.Length 21.76 31.33 29.64 32.94 42.54 Petal.Width 22.84 32.67 31.68 34.50 45.77 > ## Do MDS on 1 - proximity: > iris.mds <- cmdscale(1 - iris.rf$proximity, eig=TRUE) > op <- par(pty="s") > pairs(cbind(iris[,1:4], iris.mds$points), cex=0.6, gap=0, + col=c("red", "green", "blue")[as.numeric(iris$Species)], + main="Iris Data: Predictors and MDS of Proximity Based on RandomForest") > par(op) > print(iris.mds$GOF) [1] 0.7282699565 0.7903363321


varUsed

> ## The `unsupervised' case:
> set.seed(17)
> iris.urf <- randomForest(iris[, -5])
> MDSplot(iris.urf, iris$Species)

> ## stratified sampling: draw 20, 30, and 20 of the species to grow each tree. > (iris.rf2 <- randomForest(iris[1:4], iris$Species, + sampsize=c(20, 30, 20))) Call: randomForest(x = iris[1:4], y = iris$Species, sampsize = c(20, 30, 20)) Type of random forest: classification Number of trees: 500 No. of variables tried at each split: 2 OOB estimate of error rate: 5.33%


Confusion matrix: setosa versicolor virginica class.error setosa 50 0 0 0.00 versicolor 0 47 3 0.06 virginica 0 5 45 0.10

partialPlot

> ## Regression:
> ## data(airquality)
> set.seed(131)
> ozone.rf <- randomForest(Ozone ~ ., data=airquality, mtry=3,
+ importance=TRUE, na.action=na.omit)
> print(ozone.rf)
Call: randomForest(formula = Ozone ~ ., data = airquality, mtry = 3, importance = TRUE, na.action = na.omit)
               Type of random forest: regression
                     Number of trees: 500
No. of variables tried at each split: 3

          Mean of squared residuals: 293.5400613
                    % Var explained: 73.25

> ## Show "importance" of variables: higher value mean more important: > round(importance(ozone.rf), 2) Error in importance(ozone.rf) : object 'ozone.rf' not found > ## "x" can be a matrix instead of a data frame: > set.seed(17) > x <- matrix(runif(5e2), 100) > y <- gl(2, 50) > (myrf <- randomForest(x, y)) Call: randomForest(x = x, y = y) Type of random forest: classification Number of trees: 500 No. of variables tried at each split: 2 OOB estimate of error rate: 45% Confusion matrix: 1 2 class.error 1 30 20 0.4 2 25 25 0.5 > (predict(myrf, x)) 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57 58 59 60 61 62 63 1 1 1 1 1 1 1 1 2 2 2 2 2 2 2 2 2 2 2 2 2 64 65 66 67 68 69 70 71 72 73 74 75 76 77 78 79 80 81 82 83 84 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 85 86 87 88 89 90 91 92 93 94 95 96 97 98 99 100 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 Levels: 1 2

> ## "complicated" formula: > (swiss.rf <- randomForest(sqrt(Fertility) ~ . - Catholic + I(Catholic < 50), + data=swiss)) Call: randomForest(formula = sqrt(Fertility) ~ . - Catholic + I(Catholic < 50), data = swiss) Type of random forest: regression Number of trees: 500 No. of variables tried at each split: 1


Mean of squared residuals: 0.3207372 % Var explained: 45.54 > (predict(swiss.rf, swiss)) Courtelary Delemont Franches-Mnt Moutier Neuveville Porrentruy 8.544219 8.977287 9.118769 8.781467 8.522875 8.924009 Broye Glane Gruyere Sarine Veveyse Aigle 8.888588 9.140677 8.952968 8.875341 9.114092 7.936236 Aubonne Avenches Cossonay Echallens Grandson Lausanne 8.328221 8.229574 8.023034 8.274981 8.434141 7.742152 La Vallee Lavaux Morges Moudon Nyone Orbe 7.598992 8.180066 7.987572 8.354154 7.860249 7.873372 Oron Payerne Paysd'enhaut Rolle Vevey Yverdon 8.483563 8.551495 8.410619 8.035697 7.846781 8.363983 Conthey Entremont Herens Martigwy Monthey St Maurice 8.844938 8.657802 8.776533 8.570825 8.892461 8.477949 Sierre Sion Boudry La Chauxdfnd Le Locle Neuchatel 8.919503 8.664517 8.277097 8.033753 8.170280 7.760338 Val de Ruz ValdeTravers V. De Geneve Rive Droite Rive Gauche 8.553581 8.152936 6.781376 7.500274 7.194497

> ## Grow no more than 4 nodes per tree: > (treesize(randomForest(Species ~ ., data=iris, maxnodes=4, ntree=30))) [1] 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4 4

> data(iris) > iris.rf <- randomForest(Species ~ ., iris) > hist(treesize(iris.rf))

varImpPlot

> set.seed(4543)
> data(mtcars)
> mtcars.rf <- randomForest(mpg ~ ., data=mtcars, ntree=1000, keep.forest=FALSE,
+ importance=TRUE)
> varImpPlot(mtcars.rf)


> data(fgl, package="MASS") > fgl.res <- tuneRF(fgl[,-10], fgl[,10], stepFactor=1.5) mtry = 3 OOB error = 24.3% Searching left ... mtry = 2 OOB error = 22.43% 0.07692308 0.05 Searching right ... mtry = 4 OOB error = 21.5% 0.04166667 0.05
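tuneRF reports the out-of-bag error at each mtry it tries and returns that table, so the winning value can be fed back into a final fit. A minimal sketch, assuming the returned matrix fgl.res holds mtry in column 1 and the OOB error in column 2 (the refit step is added here for illustration):

best.mtry <- fgl.res[which.min(fgl.res[, 2]), 1]   # column 1 = mtry, column 2 = OOB error
fgl.rf <- randomForest(type ~ ., data = fgl, mtry = best.mtry)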

rfImpute > data(iris) > iris.na <- iris > set.seed(111) > ## artificially drop some data values. > for (i in 1:4) iris.na[sample(150, sample(20)), i] <- NA

> set.seed(222) > iris.imputed <- rfImpute(Species ~ ., iris.na) ntree OOB 1 2 3 300: 4.67% 0.00% 8.00% 6.00% ntree OOB 1 2 3 300: 6.00% 0.00% 8.00% 10.00% ntree OOB 1 2 3 300: 5.33% 0.00% 8.00% 8.00% ntree OOB 1 2 3 300: 5.33% 0.00% 8.00% 8.00% ntree OOB 1 2 3 300: 5.33% 0.00% 8.00% 8.00% > set.seed(333) > iris.rf <- randomForest(Species ~ ., iris.imputed) > print(iris.rf) Call: randomForest(formula = Species ~ ., data = iris.imputed) Type of random forest: classification Number of trees: 500 No. of variables tried at each split: 2 OOB estimate of error rate: 5.33% Confusion matrix: setosa versicolor virginica class.error setosa 50 0 0 0.00 versicolor 0 46 4 0.08 virginica 0 4 46 0.08
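The loop in the previous block blanks out a random handful of values in each of the four measurement columns before rfImpute() fills them back in. A quick check of how many values ended up missing per column (an informal check, not part of the package example):

# number of artificially introduced NAs per measurement column
colSums(is.na(iris.na[, 1:4]))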

rfcv > set.seed(647) > myiris <- cbind(iris[1:4], matrix(runif(96 * nrow(iris)), nrow(iris), 96)) > result <- rfcv(myiris, iris$Species, cv.fold=3) > with(result, plot(n.var, error.cv, log="x", type="o", lwd=2)) > ## The following can take a while to run, so if you really want to try > ## it, copy and paste the code into R. > ## Not run: > result <- replicate(5, rfcv(myiris, iris$Species), simplify=FALSE)
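The commented-out replicate() call produces five cross-validation runs; a sketch of how they can be summarised, assuming result holds that list (each element containing the n.var and error.cv components used above):

## Not run (continuation sketch):
error.cv <- sapply(result, "[[", "error.cv")     # one column of CV errors per run
matplot(result[[1]]$n.var, cbind(rowMeans(error.cv), error.cv), type = "l",
        lwd = c(2, rep(1, ncol(error.cv))), col = 1, lty = 1, log = "x",
        xlab = "Number of variables", ylab = "CV Error")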

> ## Do MDS on 1 - proximity: > iris.mds <- cmdscale(1 - iris.rf$proximity, eig=TRUE) > op <- par(pty="s") > pairs(cbind(iris[,1:4], iris.mds$points), cex=0.6, gap=0, + col=c("red", "green", "blue")[as.numeric(iris$Species)], + main="Iris Data: Predictors and MDS of Proximity Based on RandomForest") > par(op) > print(iris.mds$GOF) [1] 0.7282700 0.7903363
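The cmdscale() call above needs the proximity matrix, which is only stored when the forest is grown with proximity = TRUE; the iris.rf fits shown earlier in this section do not set it. A minimal sketch of a fit that supplies it:

set.seed(71)
iris.rf <- randomForest(Species ~ ., data = iris, importance = TRUE, proximity = TRUE)
iris.mds <- cmdscale(1 - iris.rf$proximity, eig = TRUE)   # as in the pairs() plot above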

ESGtoolkit – Package esgcortest > nb <- 500 > s0.par1 <- simshocks(n = nb, horizon = 3, frequency = "semi", family = 1, par = 0.2) > s0.par2 <- simshocks(n = nb, horizon = 3, frequency = "semi", family = 1, par = 0.8) > (test1 <- esgcortest(s0.par1)) $cor.estimate Time Series: Start = c(0, 2) End = c(3, 1) Frequency = 2 [1] 0.2031796 0.2341318 0.2845056 0.1923496 0.2130655 0.1985990 $conf.int Time Series: Start = c(0, 2) End = c(3, 1) Frequency = 2 Series 1 Series 2

0.5 0.1175840 0.2857785 1.0 0.1495109 0.3153480 1.5 0.2018509 0.3631365 2.0 0.1064546 0.2753951 2.5 0.1277620 0.2952398 3.0 0.1128742 0.2813892 > (test2 <- esgcortest(s0.par2)) $cor.estimate Time Series: Start = c(0, 2) End = c(3, 1) Frequency = 2 [1] 0.7998014 0.8196032 0.8176367 0.8006462 0.7889443 0.7826464 $conf.int Time Series: Start = c(0, 2) End = c(3, 1) Frequency = 2 Series 1 Series 2 0.5 0.7658218 0.8293271 1.0 0.7885897 0.8464576 1.5 0.7863249 0.8447589 2.0 0.7667913 0.8300590 2.5 0.7533744 0.8199110 3.0 0.7461657 0.8144413 > par(mfrow=c(2, 1)) > esgplotbands(test1) > esgplotbands(test2)
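esgcortest estimates, at each simulation date, the correlation between the two dependent shock series produced by simshocks, together with a confidence interval; with a Gaussian copula (family = 1) the estimates should sit near the copula parameter (0.2 and 0.8 here). A rough cross-check, assuming the two series are the first two components of the returned list, as in the simshocks examples later in this document:

# per-date sample correlation across the 500 scenarios
sapply(1:nrow(s0.par1[[1]]), function(i) cor(s0.par1[[1]][i, ], s0.par1[[2]][i, ]))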

esgdiscountfactor > kappa <- 1.5

> V0 <- theta <- 0.04 > sigma_v <- 0.2 > theta1 <- kappa*theta > theta2 <- kappa > theta3 <- sigma_v > # OU > r <- simdiff(n = 10, horizon = 5, + frequency = "quart", + model = "OU", + x0 = V0, theta1 = theta1, theta2 = theta2, theta3 = theta3) > # Stochastic discount factors > esgdiscountfactor(r, 1) Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 Series 8 0 Q1 0.9900498 0.9900498 0.9900498 0.9900498 0.9900498 0.9900498 0.9900498 0.9900498 0 Q2 0.9585213 0.9820921 0.9512259 1.0080435 0.9981573 0.9474584 0.9746185 0.9839290 0 Q3 0.9343516 0.9491405 0.9298053 1.0171075 0.9804818 0.9210995 0.9400720 0.9490736 0 Q4 0.9342289 0.9574059 0.9363011 1.0084251 0.9552877 0.9633258 0.9355897 0.9229860 1 Q1 0.9564297 0.9715226 0.9389838 0.9922630 0.9212256 1.0015752 0.8994374 0.9142785 1 Q2 0.9744057 0.9637093 0.9055900 0.9613621 0.9045373 1.0368300 0.8403102 0.8789775 1 Q3 1.0041063 0.9477295 0.8899058 0.9480560 0.9495880 1.0772694 0.8019214 0.8423079 1 Q4 1.0195252 0.9634454 0.8926692 0.9530831 1.0149821 1.1114718 0.7828309 0.8094147 2 Q1 1.0330301 0.9670540 0.8800107 0.9399917 1.0679675 1.1038985 0.7761629 0.7703206 2 Q2 0.9846293 0.9613489 0.8855304 0.9192459 1.1141234 1.0907494 0.7629199 0.7681539 2 Q3 0.9856368 0.9818615 0.8718380 0.9612554 1.1677342 1.1114258 0.7407670 0.7827415 2 Q4 0.9951932 0.9616375 0.8615750 0.9492283 1.2091508 1.1046635 0.7004747 0.7913444 3 Q1 0.9895261 0.9468500 0.8676494 0.9343685 1.2336104 1.1234329 0.6663429 0.7860875 3 Q2 0.9838310 0.8958484 0.8574252 0.9073358 1.2025963 1.1368358 0.6549611 0.7947692 3 Q3 0.9762262 0.8721990 0.8274858 0.8818353 1.1251060 1.1259175 0.6611830 0.7734815 3 Q4 0.9450651 0.8445017 0.8343756 0.8566480 1.0783279 1.1296217 0.6678333 0.7639239 4 Q1 0.8949994 0.8163462 0.8263110 0.8293844 1.0407422 1.1474239 0.6749318 0.7808922 4 Q2 0.8472326 0.8140173 0.8220125 0.7988357 0.9743193 1.1749271 0.6699907 0.8050646 4 Q3 0.7995016 0.8300664 0.8184760 0.8001168 0.9439699 1.2299234 0.6583495 0.8180570 4 Q4 0.7655070 0.8266676 0.8129897 0.8007583 0.9154293 1.2843160 0.6655180 0.8324319 5 Q1 0.7422127 0.8230753 0.7961540 0.8064495 0.8775292 1.3344008 0.6713288 0.8700435 Series 9 Series 10 0 Q1 0.9900498 0.9900498 0 Q2 0.9507125 1.0004132 0 Q3 0.9483009 1.0322529 0 Q4 0.9535587 1.0429080 1 Q1 0.9140566 1.0352893 1 Q2 0.9270676 1.0112939 1 Q3 0.9157518 0.9969056 1 Q4 0.9371509 1.0116552 2 Q1 0.9525812 1.0121459 2 Q2 0.9441043 0.9985326 2 Q3 0.9524981 1.0000481 2 Q4 0.9424220 1.0222210 3 Q1 0.9471365 1.0258360 3 Q2 0.9526780 1.0272076 3 Q3 0.9130178 1.0611427 3 Q4 0.8674892 1.0871529 4 Q1 0.8219673 1.1392481 4 Q2 0.7911688 1.1376359 4 Q3 0.7586939 1.1197590 4 Q4 0.7373618 1.1112753 5 Q1 0.7295672 1.0946980
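The first reported value, 0.9900498 = exp(-0.04/4), suggests that esgdiscountfactor accumulates the simulated short rate over quarterly steps, roughly exp(-∫ r dt) with a left Riemann sum; the exact quadrature convention is an assumption here. A rough cross-check under that assumption, reusing r from the simdiff call above:

dt <- 1/4                                  # quarterly step (assumed convention)
head(exp(-dt * apply(r, 2, cumsum)))       # compare with the first rows printed above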

esgfwdrates > # Yield to maturities > txZC <- c(0.01422,0.01309,0.01380,0.01549,0.01747,0.01940,0.02104,0.02236,0.02348,0.02446,0.02535,0.02614,0.02679,0.02727,0.02760,0.02779,0.02787,0.02786,0.02776,0.02762,0.02745,0.02727,0.02707,0.02686,0.02663,0.02640,0.02618,0.02597,0.02578,0.02563) > # Observed maturities > u <- 1:30 > ## Not run: > par(mfrow=c(2,2)) > fwdNS <- esgfwdrates(in.maturities = u, in.zerorates = txZC, n = 10, horizon = 20, out.frequency = "semi-annual", method = "NS") |=============================================================================| 100% > matplot(time(fwdNS), fwdNS, type = 'l')

> fwdSV <- esgfwdrates(in.maturities = u, in.zerorates = txZC, n = 10, horizon = 20, out.frequency = "semi-annual", method = "SV") |=============================================================================| 100% > matplot(time(fwdSV), fwdSV, type = 'l') > fwdSW <- esgfwdrates(in.maturities = u, in.zerorates = txZC, n = 10, horizon = 20, out.frequency = "semi-annual", method = "SW") > matplot(time(fwdSW), fwdSW, type = 'l') > fwdHCSPL <- esgfwdrates(in.maturities = u, in.zerorates = txZC, n = 10, horizon = 20, out.frequency = "semi-annual", method = "HCSPL") > matplot(time(fwdHCSPL), fwdHCSPL, type = 'l') > ## End(Not run)

esgmartingaletest > r0 <- 0.03 > S0 <- 100 > set.seed(10) > eps0 <- simshocks(n = 100, horizon = 3, frequency = "quart") > sim.GBM <- simdiff(n = 100, horizon = 3, frequency = "quart", model = "GBM", x0 = S0, theta1 = r0, theta2 = 0.1, eps = eps0) > mc.test <- esgmartingaletest(r = r0, X = sim.GBM, p0 = S0, alpha = 0.05) martingale '1=1' one Sample t-test alternative hypothesis: true mean of the martingale difference is not equal to 0 df = 99 t p-value 0 Q2 0.16260712 0.8711592 0 Q3 0.93680787 0.3511371 0 Q4 0.00693801 0.9944783 1 Q1 -0.03363110 0.9732390 1 Q2 0.36911463 0.7128306 1 Q3 0.39753877 0.6918261

1 Q4 0.54344405 0.5880457 2 Q1 0.65522732 0.5138411 2 Q2 0.43944877 0.6612941 2 Q3 -0.04001888 0.9681587 2 Q4 0.23087871 0.8178855 3 Q1 0.22593981 0.8217140 95 percent confidence intervals for the mean : c.i lower bound c.i upper bound 0 Q1 0.0000000 0.000000 0 Q2 -0.8545272 1.007087 0 Q3 -0.7160522 1.996934 0 Q4 -1.6409989 1.652515 1 Q1 -2.1252718 2.054429 1 Q2 -1.9295486 2.811505 1 Q3 -2.1256054 3.190737 1 Q4 -2.0745964 3.639627 2 Q1 -2.0959981 4.162764 2 Q2 -2.7355864 4.292001 2 Q3 -3.7768998 3.627562 2 Q4 -3.3390474 4.218416 3 Q1 -3.5825466 4.503266 > esgplotbands(mc.test)
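The t-statistics above test whether the discounted price process drifts away from its starting value; equivalently, the Monte Carlo prices of the simulated GBM paths should stay close to S0 = 100 at every date. A quick sketch of that check, reusing r0 and sim.GBM from above:

# discounted Monte Carlo prices; under the martingale property these hover around 100
esgmcprices(r0, sim.GBM)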

esgmccv > r <- 0.03 > set.seed(1) > eps0 <- simshocks(n = 100, horizon = 5, frequency = "quart") > sim.GBM <- simdiff(n = 100, horizon = 5, frequency = "quart", model = "GBM", x0 = 100, theta1 = 0.03, theta2 = 0.1, eps = eps0) > # monte carlo prices > esgmcprices(r, sim.GBM) Qtr1 Qtr2 Qtr3 Qtr4 0 100.00000 99.53110 100.60082 100.93049 1 100.59473 100.54263 100.43265 99.54667

2 99.65114 99.48301 99.93821 99.53699 3 99.57003 99.02298 98.51563 99.06144 4 98.79179 99.16115 99.26806 98.34561 5 98.83940 > # convergence to a specific price > (esgmccv(r, sim.GBM, 2)) $avg.price [1] 98.61672 99.70465 102.38875 102.35228 102.99831 103.11158 101.07416 102.02474 [9] 101.51025 103.70944 101.82441 101.40041 101.78513 101.37563 103.08515 103.62226 [17] 102.52535 102.17227 102.12122 102.09750 101.66643 101.06019 101.82777 101.96038 [25] 101.76604 101.34487 101.54785 101.28690 101.89039 101.67798 101.84589 101.05548 [33] 100.55827 100.38232 99.93127 99.88172 99.41883 99.81843 99.77110 99.50776 [41] 99.82195 100.26655 100.31622 100.68961 100.47896 100.50890 101.09210 100.64240 [49] 100.66369 100.52656 100.57303 100.79586 100.29109 100.40843 100.28539 99.91204 [57] 99.89632 99.57344 99.96093 99.76212 99.87486 99.82533 99.57743 99.98065 [65] 99.91345 99.82677 99.72080 99.33497 99.05300 98.92512 99.36084 98.93416 [73] 99.02106 99.11064 99.27940 99.15891 98.94379 98.95864 98.81980 98.92155 [81] 98.82669 98.94608 98.69415 98.71769 98.69517 98.77689 99.04688 99.20671 [89] 98.97071 99.24700 99.68083 100.12464 100.12351 100.17292 100.06231 100.09495 [97] 99.93027 99.85317 99.65114 $conf.int lower bound upper bound [1,] 25.76682 171.4666 [2,] 84.71256 114.6967 [3,] 90.79361 113.9839 [4,] 94.51589 110.1887 [5,] 96.84598 109.1506 [6,] 98.15431 108.0689 [7,] 94.71628 107.4320 [8,] 96.13361 107.9159 [9,] 96.21184 106.8086 [10,] 96.90545 110.5134 [11,] 94.41782 109.2310 [12,] 94.59299 108.2078 [13,] 95.48101 108.0893 [14,] 95.48331 107.2679 [15,] 96.50641 109.6639 [16,] 97.37150 109.8730 [17,] 96.22006 108.8306 [18,] 96.18705 108.1575 [19,] 96.46348 107.7790 [20,] 96.73385 107.4612 [21,] 96.48974 106.8431 [22,] 95.96964 106.1507 [23,] 96.71346 106.9421 [24,] 97.05855 106.8622 [25,] 97.04944 106.4826 [26,] 96.73317 105.9566 [27,] 97.09239 106.0033 [28,] 96.96184 105.6120 [29,] 97.53971 106.2411 [30,] 97.45375 105.9022 [31,] 97.74699 105.9448 [32,] 96.77353 105.3374 [33,] 96.28758 104.8290 [34,] 96.22467 104.5400 [35,] 95.79244 104.0701 [36,] 95.85903 103.9044 [37,] 95.39624 103.4414 [38,] 95.82111 103.8157 [39,] 95.87710 103.6651 [40,] 95.67581 103.3397 [41,] 96.03170 103.6122 [42,] 96.46089 104.0722 [43,] 96.59927 104.0332 [44,] 96.98142 104.3978 [45,] 96.82981 104.1281 [46,] 96.93987 104.0779 [47,] 97.40839 104.7758 [48,] 96.92499 104.3598 [49,] 97.02306 104.3043

[50,] 96.94920 104.1039 [51,] 97.06561 104.0804 [52,] 97.32791 104.2638 [53,] 96.74202 103.8402 [54,] 96.91806 103.8988 [55,] 96.85047 103.7203 [56,] 96.45735 103.3667 [57,] 96.50289 103.2898 [58,] 96.17719 102.9697 [59,] 96.53410 103.3878 [60,] 96.36972 103.1545 [61,] 96.53115 103.2186 [62,] 96.53470 103.1160 [63,] 96.30197 102.8529 [64,] 96.65784 103.3034 [65,] 96.63960 103.1873 [66,] 96.59844 103.0551 [67,] 96.53415 102.9074 [68,] 96.10267 102.5673 [69,] 95.81875 102.2873 [70,] 95.72737 102.1229 [71,] 96.09106 102.6306 [72,] 95.59995 102.2684 [73,] 95.72843 102.3137 [74,] 95.85804 102.3632 [75,] 96.05303 102.5058 [76,] 95.96637 102.3515 [77,] 95.76412 102.1235 [78,] 95.81998 102.0973 [79,] 95.70895 101.9307 [80,] 95.84327 101.9998 [81,] 95.78090 101.8725 [82,] 95.92841 101.9638 [83,] 95.67133 101.7170 [84,] 95.73081 101.7046 [85,] 95.74339 101.6469 [86,] 95.85520 101.6986 [87,] 96.10963 101.9841 [88,] 96.28579 102.1276 [89,] 96.04505 101.8964 [90,] 96.30250 102.1915 [91,] 96.64411 102.7176 [92,] 96.99452 103.2548 [93,] 97.02731 103.2197 [94,] 97.10835 103.2375 [95,] 97.02230 103.1023 [96,] 97.08614 103.1038 [97,] 96.93484 102.9257 [98,] 96.88458 102.8218 [99,] 96.68555 102.6167
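The returned avg.price and conf.int components trace how the Monte Carlo estimate of the price at maturity 2 settles down as more scenarios are added; plotting them makes the convergence easier to read. A short sketch using the structure printed above:

conv <- esgmccv(r, sim.GBM, 2)
plot(conv$avg.price, type = "l", ylim = range(conv$conf.int),
     xlab = "number of simulations", ylab = "estimated price",
     main = "Convergence of the Monte Carlo price")
matlines(conv$conf.int, lty = 2, col = "red")   # running confidence band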

esgmcprices > # GBM > r <- 0.03 > eps0 <- simshocks(n = 100, horizon = 5, frequency = "quart") > sim.GBM <- simdiff(n = 100, horizon = 5, frequency = "quart", model = "GBM", x0 = 100, theta1 = 0.03, theta2 = 0.1, eps = eps0) > # monte carlo prices > esgmcprices(r, sim.GBM) Qtr1 Qtr2 Qtr3 Qtr4 0 100.00000 100.04555 99.84426 101.03072 1 100.81132 101.27135 100.81412 101.43834 2 102.58389 102.52777 102.04297 102.52823 3 102.26105 102.72191 103.32985 103.76487 4 103.42262 102.62676 102.41243 102.05891 5 101.83595 > # monte carlo price for a given maturity > esgmcprices(r, sim.GBM, 2) Qtr1 2 102.5839

esgplotbands > # Times series > kappa <- 1.5 > V0 <- theta <- 0.04 > sigma <- 0.2 > theta1 <- kappa*theta > theta2 <- kappa > theta3 <- sigma > x <- simdiff(n = 100, horizon = 5, frequency = "quart", model = "OU", x0 = V0, theta1 = theta1, theta2 = theta2, theta3 = theta3) > par(mfrow=c(2,1)) > esgplotbands(x, xlab = "time", ylab = "values") > matplot(time(x), x, type = 'l', xlab = "time", ylab = "series values") > # Martingale test > r0 <- 0.03 > S0 <- 100 > sigma0 <- 0.1 > nbScenarios <- 100 > horizon0 <- 10 > eps0 <- simshocks(n = nbScenarios, horizon = horizon0, frequency = "quart", method = "anti") > sim.GBM <- simdiff(n = nbScenarios, horizon = horizon0, frequency = "quart", model = "GBM", x0 = S0, theta1 = r0, theta2 = sigma0, eps = eps0) > mc.test <- esgmartingaletest(r = r0, X = sim.GBM, p0 = S0, alpha = 0.05) martingale '1=1' one Sample t-test alternative hypothesis: true mean of the martingale difference is not equal to 0 df = 99 t p-value 0 Q2 0.044011138 0.9649842 0 Q3 0.004345470 0.9965416 0 Q4 0.048943498 0.9610629 1 Q1 -0.029556887 0.9764800 1 Q2 0.060769696 0.9516651 1 Q3 0.030757126 0.9755252 1 Q4 0.084090469 0.9331543 2 Q1 0.021734099 0.9827038 2 Q2 -0.088019266 0.9300392 2 Q3 -0.055105433 0.9561655 2 Q4 -0.002956246 0.9976472 3 Q1 0.017345777 0.9861957 3 Q2 0.054917700 0.9563147 3 Q3 0.073718124 0.9413834 3 Q4 0.046946970 0.9626501 4 Q1 0.050241311 0.9600313 4 Q2 0.017304978 0.9862281 4 Q3 -0.047620281 0.9621148 4 Q4 -0.163154151 0.8707297 5 Q1 -0.183235421 0.8549880 5 Q2 -0.247774817 0.8048218 5 Q3 -0.346472699 0.7297228 5 Q4 -0.382221853 0.7031161 6 Q1 -0.366308659 0.7149164 6 Q2 -0.476963068 0.6344396 6 Q3 -0.528499510 0.5983355 6 Q4 -0.512542238 0.6094132 7 Q1 -0.402942949 0.6878590 7 Q2 -0.469338964 0.6398594 7 Q3 -0.310047594 0.7571765 7 Q4 -0.208720778 0.8350951 8 Q1 -0.140245125 0.8887513 8 Q2 -0.267145090 0.7899132 8 Q3 -0.344859266 0.7309317 8 Q4 -0.504212986 0.6152320 9 Q1 -0.415779591 0.6784712 9 Q2 -0.434435239 0.6649178 9 Q3 -0.405487877 0.6859939 9 Q4 -0.494223380 0.6222433 10 Q1 -0.351800673 0.7257355

95 percent confidence intervals for the mean : c.i lower bound c.i upper bound 0 Q1 0.000000 0.000000 0 Q2 -1.065891 1.114248 0 Q3 -1.419285 1.425515 0 Q4 -1.790623 1.881194 1 Q1 -1.968149 1.910374 1 Q2 -2.299495 2.444796 1 Q3 -2.480283 2.558387 1 Q4 -2.709452 2.949267 2 Q1 -2.850362 2.913497 2 Q2 -2.955820 2.704720 2 Q3 -3.139539 2.969870 2 Q4 -3.336194 3.326268 3 Q1 -3.472472 3.533719 3 Q2 -3.633040 3.839870 3 Q3 -3.762850 4.053235 3 Q4 -3.903881 4.093091 4 Q1 -4.012340 4.220807 4 Q2 -4.153751 4.226841 4 Q3 -4.331238 4.128215 4 Q4 -4.444451 3.769085 5 Q1 -4.569278 3.796708 5 Q2 -4.659592 3.625064 5 Q3 -4.748789 3.336910 5 Q4 -4.859788 3.289904 6 Q1 -4.986543 3.432326 6 Q2 -5.098523 3.122392 6 Q3 -5.203164 3.014401 6 Q4 -5.315399 3.133077 7 Q1 -5.420024 3.590268 7 Q2 -5.543714 3.422808 7 Q3 -5.631437 4.109369 7 Q4 -5.719494 4.630747 8 Q1 -5.778788 5.015821 8 Q2 -5.978299 4.559537 8 Q3 -6.127190 4.312721 8 Q4 -6.283928 3.737392 9 Q1 -6.323559 4.132550 9 Q2 -6.460459 4.139620 9 Q3 -6.502237 4.295623 9 Q4 -6.612171 3.975118 10 Q1 -6.625620 4.630004 > esgplotbands(mc.test) > # Correlation test > nb <- 500 > s0.par1 <- simshocks(n = nb, horizon = 3, frequency = "semi", family = 1, par = 0.2) > s0.par2 <- simshocks(n = nb, horizon = 3, frequency = "semi", family = 1, par = 0.8) > (test1 <- esgcortest(s0.par1)) $cor.estimate Time Series: Start = c(0, 2) End = c(3, 1) Frequency = 2 [1] 0.1052549 0.2471224 0.1710157 0.1962254 0.1792183 0.2162558 $conf.int Time Series: Start = c(0, 2) End = c(3, 1) Frequency = 2 Series 1 Series 2 0.5 0.01772798 0.1911810 1.0 0.16296328 0.3277114 1.5 0.08459374 0.2548839 2.0 0.11043511 0.2791133 2.5 0.09298912 0.2627791 3.0 0.13105044 0.2982898 > (test2 <- esgcortest(s0.par2))

$cor.estimate Time Series: Start = c(0, 2) End = c(3, 1) Frequency = 2 [1] 0.7933522 0.7945073 0.8248697 0.7753461 0.8162014 0.7930803 $conf.int Time Series: Start = c(0, 2) End = c(3, 1) Frequency = 2 Series 1 Series 2 0.5 0.7584248 0.8237359 1.0 0.7597490 0.8247377 1.5 0.7946595 0.8510044 2.0 0.7378203 0.8080939 2.5 0.7846723 0.8435187 3.0 0.7581132 0.8235000 > par(mfrow=c(2, 1)) > esgplotbands(test1) > esgplotbands(test2)

esgplotshocks > # Number of risk factors > d <- 2 > # Number of possible combinations of the risk factors

> dd <- d*(d-1)/2 > # Family : Gaussian copula > fam1 <- rep(1,dd) > # Correlation coefficients between the risk factors (d*(d-1)/2) > par0.1 <- 0.1 > par0.2 <- -0.9 > # Family : Rotated Clayton (180 degrees) > fam2 <- 13 > par0.3 <- 2 > # Family : Rotated Clayton (90 degrees) > fam3 <- 23 > par0.4 <- -2 > # number of simulations > nb <- 500 > # Simulation of shocks for the d risk factors > s0.par1 <- simshocks(n = nb, horizon = 4, family = fam1, par = par0.1) > s0.par2 <- simshocks(n = nb, horizon = 4, family = fam1, par = par0.2) > s0.par3 <- simshocks(n = nb, horizon = 4, family = fam2, par = par0.3) > s0.par4 <- simshocks(n = nb, horizon = 4, family = fam3, par = par0.4) > ## Not run: > esgplotshocks(s0.par1, s0.par2) > esgplotshocks(s0.par2, s0.par3) > esgplotshocks(s0.par2, s0.par4) > esgplotshocks(s0.par1, s0.par4) > ## End(Not run)

esgplotts > kappa <- 1.5 > V0 <- theta <- 0.04 > sigma <- 0.2 > theta1 <- kappa*theta > theta2 <- kappa > theta3 <- sigma > x <- simdiff(n = 10, horizon = 5, frequency = "quart", model = "OU", x0 = V0, theta1 = theta1, theta2 = theta2, theta3 = theta3) > esgplotts(x)

simdiff > kappa <- 1.5 > V0 <- theta <- 0.04 > sigma_v <- 0.2 > theta1 <- kappa*theta > theta2 <- kappa > theta3 <- sigma_v > # OU > sim.OU <- simdiff(n = 10, horizon = 5, frequency = "quart", model = "OU", x0 = V0, theta1 = theta1, theta2 = theta2, theta3 = theta3) > head(sim.OU) Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 [1,] 0.040000000 0.04000000 0.04000000 0.04000000 0.04000000 0.04000000 0.04000000 [2,] -0.043793411 0.13545510 0.12061134 0.03128363 0.01231459 -0.07031748 0.10308774 [3,] 0.077083612 0.30748651 0.11622503 0.06632089 0.14065047 0.06397038 0.05518265 [4,] -0.015818401 0.11970038 0.06797801 0.17824899 0.16886970 0.10152937 0.13679321 [5,] -0.001608485 0.22095977 -0.05221429 0.14572684 0.07964959 0.08342765 0.11269795 [6,] 0.131557535 0.07459917 -0.07091586 0.08569221 0.08051147 -0.01917560 0.09571364 Series 8 Series 9 Series 10 [1,] 0.04000000 0.04000000 0.04000000 [2,] 0.05897903 -0.06095583 0.05606124 [3,] -0.00779166 -0.08804009 0.06548068 [4,] 0.06414455 0.03412537 -0.01172195 [5,] 0.02212753 -0.04280621 0.04682079 [6,] 0.09581279 -0.08017211 0.16699108 > par(mfrow=c(2,1)) > esgplotbands(sim.OU, xlab = "time", ylab = "values", main = "with esgplotbands") > matplot(time(sim.OU), sim.OU, type = 'l', main = "with matplot") > # OU with simulated shocks (check the dimensions) > eps0 <- simshocks(n = 50, horizon = 5, frequency = "quart", method = "anti")

> sim.OU <- simdiff(n = 50, horizon = 5, frequency = "quart", model = "OU", x0 = V0, theta1 = theta1, theta2 = theta2, theta3 = theta3,eps = eps0) > par(mfrow=c(2,1)) > esgplotbands(sim.OU, xlab = "time", ylab = "values", main = "with esgplotbands") > matplot(time(sim.OU), sim.OU, type = 'l', main = "with matplot") > # a different plot > esgplotts(sim.OU) > # CIR > sim.CIR <- simdiff(n = 50, horizon = 5, frequency = "quart", model = "CIR", x0 = V0, theta1 = theta1, theta2 = theta2, theta3 = 0.05) > esgplotbands(sim.CIR, xlab = "time", ylab = "values", main = "with esgplotbands") > matplot(time(sim.CIR), sim.CIR, type = 'l', main = "with matplot") > # GBM > eps0 <- simshocks(n = 100, horizon = 5, frequency = "quart") > sim.GBM <- simdiff(n = 100, horizon = 5, frequency = "quart", model = "GBM", x0 = 100, theta1 = 0.03, theta2 = 0.1, eps = eps0) > esgplotbands(sim.GBM, xlab = "time", ylab = "values", main = "with esgplotbands") > matplot(time(sim.GBM), sim.GBM, type = 'l', main = "with matplot") > eps0 <- simshocks(n = 100, horizon = 5, frequency = "quart") > sim.GBM <- simdiff(n = 100, horizon = 5, frequency = "quart", model = "GBM", x0 = 100, theta1 = 0.03, theta2 = 0.1, eps = eps0) > esgplotbands(sim.GBM, xlab = "time", ylab = "values", main = "with esgplotbands") > matplot(time(sim.GBM), sim.GBM, type = 'l', main = "with matplot")
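Throughout these calls the parameters are passed as theta1 = kappa*theta, theta2 = kappa and theta3 = sigma, consistent with mean-reverting dynamics of the form dX = (theta1 - theta2*X) dt plus a theta3 diffusion term, so the implied long-run level is theta1/theta2 = theta. A quick sanity check on the simulated OU paths (an informal check, not part of the package examples):

theta1 / theta2        # implied long-run mean, 0.04
rowMeans(sim.OU)       # cross-sectional average of the 50 paths at each date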

simshocks > # Number of risk factors > d <- 6 > # Number of possible combinations of the risk factors > dd <- d*(d-1)/2 > # Family : Gaussian copula for all > fam1 <- rep(1,dd) > # Correlation coefficients between the risk factors (d*(d-1)/2) > par1 <- c(0.2,0.69,0.73,0.22,-0.09,0.51,0.32,0.01,0.82,0.01,-0.2,-0.32,-0.19,-0.17,-0.06) > # Simulation of shocks for the 6 risk factors > simshocks(n = 10, horizon = 5, family = fam1, par = par1) [[1]] Time Series: Start = 1 End = 5 Frequency = 1 Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 1 1.178098 0.3862083 2.0303940 2.20406905 -2.0542165 -2.07007944 -1.03295844 2 0.769871 0.2393833 0.9193507 -0.06522101 -1.5412210 1.64744385 0.23357636 3 -0.504466 1.0731605 0.7441552 -0.86216807 1.7248287 0.69267098 -0.01153005 4 -0.933156 1.3560908 -1.4283321 1.19379175 -1.0957169 1.24792988 0.16116554 5 -2.202509 -1.5803403 -1.5991151 0.89242646 -0.3400662 0.08822614 0.50536465 Series 8 Series 9 Series 10 1 -1.6805350 -1.5512808 0.3857301 2 -0.1589514 0.7533202 -1.3283823 3 -1.3907799 1.2653415 0.2142083 4 -1.4852766 0.3211369 -1.0710613 5 -1.4041627 1.0387447 0.9343686 [[2]] Time Series: Start = 1

End = 5 Frequency = 1 Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 1 0.6226539 0.5249207 0.1518922 0.4717459 -0.9473828 -1.0881947 -0.5501138 2 1.1070679 1.8301609 -0.3112848 -0.7555483 -0.1091203 0.9701413 0.2131802 3 -0.4761082 1.1281059 -0.8734716 -0.7399489 -0.4080763 0.6400219 0.4609663 4 -0.7199894 -0.2411203 -0.8971023 -1.8432686 0.5770099 0.9274918 -1.0698480 5 0.7094487 -1.2932800 0.9357272 -0.4347026 1.9285234 0.6628400 -1.3640588 Series 8 Series 9 Series 10 1 -0.7063507 -0.7228447 -0.3625273 2 0.5595327 0.1106446 2.1738936 3 -1.6325803 0.6447560 -1.1826202 4 -1.5322840 0.7425462 0.6945482 5 0.3667316 -0.4066807 0.7430339 [[3]] Time Series: Start = 1 End = 5 Frequency = 1 Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 1 0.65059046 1.2236181 1.07046762 1.883198794 -0.2417453 -1.6315244 -1.50821127 2 -0.57435084 0.9080547 0.21379748 0.215749034 -0.7375088 0.1944677 0.33468248 3 -0.77355525 0.2064375 -0.42542782 -1.059227518 0.7731093 0.7398273 0.09000003 4 -2.49494969 0.7166071 -1.50759374 0.004379487 -0.5295103 1.0954703 0.08534187 5 -0.08818362 -1.1580251 0.06272141 0.153511610 0.3675026 -0.2351841 0.17357065 Series 8 Series 9 Series 10 1 -2.3175555 -1.2749135 -1.1939503 2 -0.1080984 0.9126097 0.4234531 3 -0.5427477 1.9812405 -0.5503593 4 -1.3956060 1.7758916 -1.2006880 5 -1.2201892 1.5909292 -0.1184360 [[4]] Time Series: Start = 1 End = 5 Frequency = 1 Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 1 1.70353788 0.9201644 1.48929432 0.6153544 -2.4544038 -0.8368234 -0.5980719 2 -0.44288859 0.2952432 0.08669404 0.1186515 -1.8288993 0.8416315 0.3688733 3 0.02121711 0.8330503 1.42164605 -1.7205683 0.8366774 -0.5442425 0.8001248 4 -1.32355744 0.9073752 -0.38281185 1.8843159 -0.4366564 0.6119233 -0.5979324 5 -0.32672514 -1.9452360 -0.73437838 0.7051748 -0.7163417 -0.3178667 0.3953845 Series 8 Series 9 Series 10 1 -1.0800944 -0.3915560 -0.2357763 2 1.8687836 1.4322394 0.2138535 3 -1.1684290 1.8647011 -0.1087581 4 -1.2407877 0.5737682 -0.4374636 5 -0.5856278 1.2819189 1.2611642 [[5]] Time Series: Start = 1 End = 5 Frequency = 1 Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 1 -0.03189909 0.01159951 1.2731676 0.56616264 -0.4445116 -0.8716448 -1.0542301 2 -0.46755104 0.58415642 -0.3165276 -1.08161396 0.5487348 0.5152906 0.1101045 3 0.16588741 0.18558376 0.8689375 0.08841241 -0.1360036 0.5251817 -0.3370111 4 0.05713470 -1.11195145 -0.4018324 -1.23763389 0.2847936 0.2379047 -0.7395104 5 -1.50050762 1.25745205 -0.7065954 0.09596990 -0.2065615 0.1047851 0.8527808 Series 8 Series 9 Series 10 1 -0.7607053 0.3841907 1.1504830 2 -1.5578621 0.2712789 -0.7893227 3 -0.1561023 0.7992609 0.7968665 4 -0.4863006 -0.5464912 0.2550082 5 -1.1692485 -0.8169700 -0.5754750 [[6]] Time Series: Start = 1

End = 5 Frequency = 1 Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 1 0.90708254 -0.7505323 -1.3737295 -0.34987608 -1.3676920 -1.82576406 -0.6848315 2 0.71688496 1.1923114 0.3267709 0.07048922 0.2439475 0.28263036 0.5999548 3 0.02139674 1.0911339 -1.8428655 0.29835793 -0.8077240 0.07578613 -0.6800668 4 0.38948997 -0.4752186 -0.2818108 -1.60306740 0.3353151 0.12490201 -1.7513817 5 1.28576086 -1.0942419 1.6323907 -0.39813551 2.3186730 0.63218450 -1.7769502 Series 8 Series 9 Series 10 1 -0.09815743 -0.4855522 -0.4375239 2 0.75382513 -0.3570197 3.1202256 3 -2.12352681 -0.2582054 -0.9878015 4 -1.20856063 1.0752415 1.2103918 5 1.15675829 -1.2144819 0.4627515 > # Simulation of shocks for the 6 risk factors > # on a quarterly basis > simshocks(n = 10, frequency = "quarterly", horizon = 2, family = fam1, par = par1) [[1]] Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 0 Q2 0.3707125 -1.46567622 -0.3740442 0.005674489 0.2029973 0.8446815 0.1780092 0 Q3 -0.6074199 -2.04023345 1.0827160 -0.579374664 1.0950635 -0.3720673 -0.4266689 0 Q4 0.9675780 0.07081262 -0.3634863 -0.263612036 -0.2337026 -1.1136236 -0.4514623 1 Q1 2.2354975 1.90455959 0.4683793 0.886005732 -0.4183864 0.3172967 -1.8088411 1 Q2 -1.2560424 0.46753272 -0.7651337 0.785296316 0.3512369 0.8463251 -0.2202432 1 Q3 0.8001233 0.83729309 -0.1366156 1.220617705 -0.3096884 -0.6187528 -1.3348863 1 Q4 0.6633467 1.63657277 0.9534616 0.350087689 0.4725178 -0.8154827 -0.2525277 2 Q1 -0.9084515 -0.43734786 -0.2823401 0.432727804 2.0261029 -0.3208165 1.9093105 Series 8 Series 9 Series 10 0 Q2 0.2682995 0.5478924 -0.6865844 0 Q3 -0.1926350 1.0946728 0.7426243 0 Q4 -1.1019848 -0.5705964 1.0514377 1 Q1 -1.1252055 0.9454039 0.7272249 1 Q2 -0.4737479 1.0641921 -1.2996786 1 Q3 -1.3979540 0.1235702 -0.9089614 1 Q4 -1.2549659 0.4041325 0.4319071 2 Q1 -1.0298439 -0.4065825 -1.6445076 [[2]] Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 0 Q2 -1.2411217 0.47684211 0.2361236 -0.3815699 2.4237282 -1.7696027 -1.7358941 0 Q3 -0.2442417 -1.05099209 -0.8225860 0.1010987 0.7879114 -1.3130948 -1.0711521 0 Q4 -1.8222436 -0.06696778 -1.9894417 0.7845378 0.3092068 -0.6575963 -1.5419132 1 Q1 -1.9079218 1.10050042 1.8345137 1.7733066 -0.2067677 -0.7060003 -0.6331599 1 Q2 -0.6873160 -1.24875998 -1.0597546 -0.1097815 -2.3529222 0.5479628 0.8805344 1 Q3 2.1740452 0.60123791 -0.9381796 0.3062243 -2.3424236 0.1295735 0.2307762 1 Q4 -0.6094359 0.80769895 0.1060721 -1.6749819 0.5191316 -1.3539067 1.6718601 2 Q1 -2.0551509 1.16383763 0.5165494 -1.2999414 1.0012434 -0.5979916 1.0665036 Series 8 Series 9 Series 10 0 Q2 0.4601483 0.76233300 -0.1983229 0 Q3 -0.6824554 1.05877785 1.9535292 0 Q4 0.6052659 1.25881548 -0.5716332 1 Q1 0.5990046 -0.01170955 0.5402820 1 Q2 -0.3970265 -0.13241080 0.1809079 1 Q3 -1.1060801 -0.45967846 1.3409886 1 Q4 1.0671516 1.37173694 1.8328730 2 Q1 -0.4899142 1.07615029 -1.8365339 [[3]] Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 0 Q2 0.1686872 -1.1959615 0.30765257 -0.5822569 1.18788403 -1.1724713 -0.15706049 0 Q3 -0.5339157 -2.5127451 0.91073815 -0.3799381 1.76685398 -1.3397472 -0.04979071 0 Q4 0.9492431 1.3575323 -1.04324865 -0.7805905 0.18958412 -1.0570924 -0.85549086 1 Q1 0.2685471 2.6593787 1.37995431 0.8587814 0.05188178 0.8562368 -1.00104856 1 Q2 -1.6002551 0.5361285 -0.04897626 -0.7611823 -0.02001627 0.5463974 0.73766817 1 Q3 1.6341164 1.2244603 0.59949968 0.9306350 -1.11435826 -0.3391398 -1.37690906 1 Q4 0.1103731 1.3264513 0.38429550 -1.0846517 0.07737981 -0.2505370 0.28701508 2 Q1 
-0.5587944 0.1718592 1.24277951 0.3922049 1.99051028 -0.2937047 1.56382321 Series 8 Series 9 Series 10 0 Q2 0.99216403 1.09703479 -0.5403357 0 Q3 -0.04782962 0.83576082 1.0041127 0 Q4 -0.44316423 0.08476564 0.3623836

1 Q1 0.41661896 0.24363334 0.2697199 1 Q2 0.22968790 0.09550919 -1.1057878 1 Q3 -1.54821235 0.37189690 -0.3335587 1 Q4 -2.24887872 0.61614157 1.6106861 2 Q1 -0.46188638 -0.32701239 -1.5426635 [[4]] Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 0 Q2 0.1000824 -1.0107819 -0.58584175 0.7518741 1.1403820 0.6761242 -0.49644847 0 Q3 -0.5290636 -1.5991029 0.69403681 0.2704958 1.2495723 -0.8263510 0.44201688 0 Q4 0.9110910 -1.2852840 -0.19836937 0.6673089 0.3158852 0.3781016 -0.02125682 1 Q1 1.5821576 0.9735488 0.89433215 0.2140047 -0.7348323 -1.8196735 -0.97825452 1 Q2 -1.1891418 -0.1418665 -0.74530665 0.7668057 0.1716984 0.2630768 -0.83334852 1 Q3 1.8081017 -0.1627619 0.06326862 0.5733518 -1.6838349 -0.7583934 -0.22342957 1 Q4 -0.3894231 2.1005344 1.04023185 0.1945104 0.5536902 -2.1240061 0.47384906 2 Q1 -1.4818369 1.4169261 -0.08997950 0.7186952 2.0816555 0.7631309 1.32859654 Series 8 Series 9 Series 10 0 Q2 0.4999214 -0.350716860 -0.6494155 0 Q3 0.1221515 1.826389671 1.4092865 0 Q4 -1.5270604 0.793854147 0.9994492 1 Q1 -1.6591539 -0.423280892 0.2020916 1 Q2 0.5593154 -0.003771758 -1.6202740 1 Q3 -0.9313373 -0.412907171 -0.6206657 1 Q4 -2.1001920 1.512250620 1.2032988 2 Q1 -2.1342258 0.288959934 -2.1744327 [[5]] Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 0 Q2 0.3365480 1.0380035 0.2231828 0.8544201 -1.19539664 -1.7265921 -0.1120450 0 Q3 -1.5324399 -1.4062535 0.6113946 -1.9691695 -0.50021618 1.2702376 -1.5098431 0 Q4 2.2268757 -1.0152949 1.5377640 0.1982105 0.05440309 -1.4715420 1.6245995 1 Q1 0.5539573 1.6729968 0.5321313 0.2203970 -0.61500332 -0.2220320 -0.3218990 1 Q2 -1.1621067 -1.3521862 0.4933287 -0.5724821 1.74076061 -0.2271923 -1.0331118 1 Q3 -0.3888972 -0.1614138 -0.4110169 0.3211276 -2.12308037 0.2018823 -1.1466760 1 Q4 2.0229633 -0.3544308 -0.9788290 1.2832903 0.77092180 -0.1073244 -1.2284627 2 Q1 -0.7752845 0.5101412 0.1061588 -1.9206763 -1.16812460 0.3953294 -0.0665465 Series 8 Series 9 Series 10 0 Q2 -0.2037821 0.75360559 -0.9656520 0 Q3 -0.7750083 -0.11972601 0.5436254 0 Q4 -1.1577865 -0.15389751 0.4736938 1 Q1 -0.6583197 -0.62308839 -0.5399112 1 Q2 0.6456551 0.37622358 -1.2579604 1 Q3 0.4232720 1.12394938 0.1724073 1 Q4 0.6718091 -1.40146606 2.0783285 2 Q1 -0.8170303 0.06715462 0.9912030 [[6]] Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 0 Q2 -0.9406206 1.1309040 0.03287867 -0.2372255 3.01993891 -1.8301062 -2.625429088 0 Q3 -0.5359237 -0.2175443 -0.38328367 -0.7133620 0.11020589 -0.9970608 -0.598467484 0 Q4 -2.5721440 -0.3506012 -2.21609069 1.8357921 0.57495074 0.4753882 -1.298999136 1 Q1 -2.6916652 0.1000267 0.64334190 2.2653482 -1.22399255 -1.3761022 0.205307283 1 Q2 -0.2672816 -1.2940225 -0.30356605 -0.7890213 -1.45823314 1.8946243 0.308148915 1 Q3 0.4787267 0.6161345 -0.45929871 0.2841917 -0.01709609 -0.6524816 -0.827077885 1 Q4 -0.9143835 0.5127401 -0.01090857 -1.2702697 0.35285874 -1.7827736 1.426636039 2 Q1 -1.5408818 1.5698178 0.11592144 -1.5272209 0.66021909 -1.1575239 -0.002190201 Series 8 Series 9 Series 10 0 Q2 0.06394807 0.48820464 0.2403208 0 Q3 -0.92664740 1.87075187 1.1972372 0 Q4 0.99500765 0.52129043 -1.5840371 1 Q1 1.10276128 0.56262130 1.0023219 1 Q2 -1.34336650 0.05510734 0.6998302 1 Q3 -0.39644277 -0.76509368 1.5884533 1 Q4 1.62879825 1.34372039 1.2489639 2 Q1 -0.41603336 1.69368222 -1.6577506 > # Simulation of shocks for the 6 risk factors simulation > # on a quarterly basis, with antithetic variates and moment matching. 
> s0 <- simshocks(n = 10, method = "hyb", horizon = 4,family = fam1, par = par1) > s0[[2]] Time Series: Start = 1

End = 4 Frequency = 1 Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 1 0.2985361 0.7224007 0.6852255 1.42020126 -1.0221684 -0.2985361 -0.7224007 2 -0.4041283 0.8424827 -1.4691851 -0.74143309 -0.5854442 0.4041283 -0.8424827 3 1.2236892 -0.2609391 0.2121676 -0.65792940 0.4036703 -1.2236892 0.2609391 4 -1.1180971 -1.3039444 0.5717919 -0.02083878 1.2039423 1.1180971 1.3039444 Series 8 Series 9 Series 10 1 -0.6852255 -1.42020126 1.0221684 2 1.4691851 0.74143309 0.5854442 3 -0.2121676 0.65792940 -0.4036703 4 -0.5717919 0.02083878 -1.2039423 > colMeans(s0[[1]]) Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 -2.428613e-17 -3.816392e-17 -2.602085e-17 -6.938894e-18 0.000000e+00 2.428613e-17 Series 7 Series 8 Series 9 Series 10 3.816392e-17 2.602085e-17 6.938894e-18 0.000000e+00 > colMeans(s0[[5]]) Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 -4.857226e-17 -3.339343e-17 4.163336e-17 2.775558e-17 -5.551115e-17 4.857226e-17 Series 7 Series 8 Series 9 Series 10 3.339343e-17 -4.163336e-17 -2.775558e-17 5.551115e-17 > apply(s0[[3]], 2, sd) Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 Series 8 1 1 1 1 1 1 1 1 Series 9 Series 10 1 1 > apply(s0[[4]], 2, sd) Series 1 Series 2 Series 3 Series 4 Series 5 Series 6 Series 7 Series 8 1 1 1 1 1 1 1 1 Series 9 Series 10 1 1
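With method = "hyb" the shocks are antithetic and moment-matched, which is what the printout shows: the column means are numerically zero, the standard deviations are exactly 1, and in s0[[2]] Series 6-10 are the mirror images of Series 1-5. A quick check of that pairing on the component printed above:

# antithetic pairing: the second half of the scenarios is the negative of the first half
max(abs(s0[[2]][, 1:5] + s0[[2]][, 6:10]))   # should be numerically zero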

BayesBridge – Package (Bridge Regression) > library(BayesBridge) Warning message: package ‘BayesBridge’ was built under R version 3.1.2

bridge.EM > # Load the diabetes data... > data(diabetes, package="BayesBridge"); > cov.name = colnames(diabetes$x); > y = diabetes$y; > X = diabetes$x; > # Center the data. > y = y - mean(y); > mX = colMeans(X); > for(i in 1:442){ + X[i,] = X[i,] - mX; + } > # Expectation maximization. > bridge.EM(y, X, 0.5, 1.0, 1e8, 1e-9, 30, use.cg=TRUE); Using conjugate gradient method. age sex bmi map tc ldl hdl -9.784325 -239.774890 519.864326 324.296617 -788.022455 473.628865 98.900975 tch ltg glu 176.127557 749.828649 67.542089
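For a rough reference point, the bridge EM estimates can be compared with the unpenalised least-squares coefficients on the same centred data (a sketch, not part of the package example):

# ordinary least squares on the centred y and X, no intercept
round(coef(lm(y ~ X - 1)), 3)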

bridge.reg > # Load the diabetes data... > data(diabetes, package="BayesBridge"); > cov.name = colnames(diabetes$x); > y = diabetes$y; > X = diabetes$x; > # Center the data. > y = y - mean(y); > mX = colMeans(X); > for(i in 1:442){ + X[i,] = X[i,] - mX;

+ } > # Run the bridge regression when sig2 and tau are unknown. > gb = bridge.reg(y, X, nsamp=10000, alpha=0.5, sig2.shape=0.0, sig2.scale=0.0, nu.shape=2.0, nu.rate=2.0); [1] "Variable extras only for Package testing." Bridge Regression (mix. of triangles): known alpha=0.5 Burn-in: 500, Num. Samples: 10000 Burn-in complete: 0.023 sec. for 500 iterations. Expect approx. 0.46 sec. for 10000 samples. Sampling complete: 0.65 sec. for 10000 iterations.
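The object gb holds the posterior draws from the bridge regression sampler; its component names are not shown above, so a safe first step is simply to inspect it (the gb$beta name below is an assumption):

str(gb, max.level = 1)   # see which components the sampler returns
# if the coefficient draws are exposed as gb$beta with one row per posterior sample,
# colMeans(gb$beta) would give the posterior-mean bridge estimates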

diabetes > ## Draw random variates from an exponentially tilted stable distribution > ## with given alpha, V0, and h = 1 > alpha <- .2 > N = 200 > V0 <- rgamma(N, 1) > rETS <- retstable.ld(N, alpha, V0) > ## Distribution plot the random variates -- log-scaled > hist(log(rETS), prob=TRUE) > lines(density(log(rETS)), col=2) > rug (log(rETS))

trace.beta > # Load the diabetes data... > data(diabetes, package="BayesBridge"); > cov.name = colnames(diabetes$x); > y = diabetes$y; > X = diabetes$x; > # Center the data. > y = y - mean(y); > mX = colMeans(X); > for(i in 1:442){

+ X[i,] = X[i,] - mX; + } > # Expectation maximization. > out = trace.beta(y, X);

FNN - Package > library(FNN) Warning message: package ‘FNN’ was built under R version 3.0.3

get.knn > data<- query<- cbind(1:10, 1:10) > get.knn(data, k=5) $nn.index [,1] [,2] [,3] [,4] [,5] [1,] 2 3 4 5 6 [2,] 1 3 4 5 6 [3,] 2 4 1 5 6 [4,] 3 5 2 6 1 [5,] 4 6 3 7 2 [6,] 7 5 8 4 9 [7,] 8 6 9 5 10 [8,] 9 7 10 6 5 [9,] 10 8 7 6 5 [10,] 9 8 7 6 5 $nn.dist [,1] [,2] [,3] [,4] [,5] [1,] 1.414214 2.828427 4.242641 5.656854 7.071068 [2,] 1.414214 1.414214 2.828427 4.242641 5.656854 [3,] 1.414214 1.414214 2.828427 2.828427 4.242641

[4,] 1.414214 1.414214 2.828427 2.828427 4.242641 [5,] 1.414214 1.414214 2.828427 2.828427 4.242641 [6,] 1.414214 1.414214 2.828427 2.828427 4.242641 [7,] 1.414214 1.414214 2.828427 2.828427 4.242641 [8,] 1.414214 1.414214 2.828427 2.828427 4.242641 [9,] 1.414214 1.414214 2.828427 4.242641 5.656854 [10,] 1.414214 2.828427 4.242641 5.656854 7.071068 > get.knnx(data, query, k=5) $nn.index [,1] [,2] [,3] [,4] [,5] [1,] 1 2 3 4 5 [2,] 2 1 3 4 5 [3,] 3 2 4 1 5 [4,] 4 3 5 2 6 [5,] 5 4 6 3 7 [6,] 6 7 5 8 4 [7,] 7 8 6 9 5 [8,] 8 9 7 10 6 [9,] 9 10 8 7 6 [10,] 10 9 8 7 6 $nn.dist [,1] [,2] [,3] [,4] [,5] [1,] 0 1.414214 2.828427 4.242641 5.656854 [2,] 0 1.414214 1.414214 2.828427 4.242641 [3,] 0 1.414214 1.414214 2.828427 2.828427 [4,] 0 1.414214 1.414214 2.828427 2.828427 [5,] 0 1.414214 1.414214 2.828427 2.828427 [6,] 0 1.414214 1.414214 2.828427 2.828427 [7,] 0 1.414214 1.414214 2.828427 2.828427 [8,] 0 1.414214 1.414214 2.828427 2.828427 [9,] 0 1.414214 1.414214 2.828427 4.242641 [10,] 0 1.414214 2.828427 4.242641 5.656854 > get.knnx(data, query, k=5, algo="kd_tree") $nn.index [,1] [,2] [,3] [,4] [,5] [1,] 1 2 3 4 5 [2,] 2 1 3 4 5 [3,] 3 2 4 1 5 [4,] 4 3 5 2 6 [5,] 5 4 6 3 7 [6,] 6 7 5 8 4 [7,] 7 8 6 9 5 [8,] 8 9 7 10 6 [9,] 9 10 8 7 6 [10,] 10 9 8 7 6 $nn.dist [,1] [,2] [,3] [,4] [,5] [1,] 0 1.414214 2.828427 4.242641 5.656854 [2,] 0 1.414214 1.414214 2.828427 4.242641 [3,] 0 1.414214 1.414214 2.828427 2.828427 [4,] 0 1.414214 1.414214 2.828427 2.828427 [5,] 0 1.414214 1.414214 2.828427 2.828427 [6,] 0 1.414214 1.414214 2.828427 2.828427 [7,] 0 1.414214 1.414214 2.828427 2.828427 [8,] 0 1.414214 1.414214 2.828427 2.828427 [9,] 0 1.414214 1.414214 2.828427 4.242641 [10,] 0 1.414214 2.828427 4.242641 5.656854
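get.knn() searches for neighbours of each point within the data itself, while get.knnx() looks up neighbours of arbitrary query points in the data; with query identical to data, as above, the nearest neighbour of every query is itself at distance 0. A minimal sketch with query points that are not in the data:

new.q <- cbind(c(2.5, 7.2), c(2.5, 7.2))   # two hypothetical query points
get.knnx(data, new.q, k = 2)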

KL.dist > set.seed(1000) > X<- rexp(10000, rate=0.2) > Y<- rexp(10000, rate=0.4) > KL.dist(X, Y, k=5) [1] 0.4843205 0.5201644 0.4883402 0.4861477 0.4919426 > KLx.dist(X, Y, k=5) [1] 0.4843205 0.5201644 0.4883402 0.4861477 0.4919426

> #theoretical distance = (0.2-0.4)^2/(0.2*0.4) = 0.5

KL.divergence > set.seed(1000) > X<- rexp(10000, rate=0.2) > Y<- rexp(10000, rate=0.4) > KL.divergence(X, Y, k=5) [1] 0.2962696 0.3173042 0.3070079 0.3034722 0.3021469 > #theoretical divergence = log(0.2/0.4)+(0.4/0.2)-1 = 1-log(2) = 0.307
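The closed form quoted in the comment follows from KL(Exp(a) || Exp(b)) = log(a/b) + b/a - 1; with a = 0.2 and b = 0.4 this equals 1 - log(2). A one-line check:

a <- 0.2; b <- 0.4
log(a / b) + b / a - 1    # 0.3068528, close to the k-NN estimates above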

knn > data(iris3) > train <- rbind(iris3[1:25,,1], iris3[1:25,,2], iris3[1:25,,3]) > test <- rbind(iris3[26:50,,1], iris3[26:50,,2], iris3[26:50,,3]) > cl <- factor(c(rep("s",25), rep("c",25), rep("v",25))) > knn(train, test, cl, k = 3, prob=TRUE) [1] s s s s s s s s s s s s s s s s s s s s s s s s s c c v c c c c c v c c c c c c c [42] c c c c c c c c c v c c v v v v v c v v v v c v v v v v v v v v v v attr(,"prob") [1] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [9] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [17] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [25] 1.0000000 1.0000000 1.0000000 0.6666667 1.0000000 1.0000000 1.0000000 1.0000000 [33] 1.0000000 0.6666667 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [41] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [49] 1.0000000 1.0000000 1.0000000 0.6666667 0.6666667 1.0000000 1.0000000 1.0000000 [57] 1.0000000 1.0000000 0.6666667 1.0000000 1.0000000 1.0000000 1.0000000 0.6666667 [65] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 0.6666667 [73] 1.0000000 1.0000000 0.6666667 attr(,"nn.index") [,1] [,2] [,3] [1,] 10 2 13 [2,] 24 8 18 [3,] 1 18 8 [4,] 1 18 8 [5,] 4 12 10 [6,] 10 4 13 [7,] 21 18 11 [8,] 20 11 17 [9,] 16 17 15 [10,] 10 2 13 [11,] 2 3 10 [12,] 11 1 21 [13,] 5 1 8 [14,] 9 14 4 [15,] 8 1 18 [16,] 18 1 5 [17,] 9 14 13 [18,] 4 3 7 [19,] 24 22 18 [20,] 6 22 20 [21,] 2 13 3 [22,] 20 22 5 [23,] 4 3 7 [24,] 11 20 22 [25,] 8 1 18 [26,] 41 34 50 [27,] 34 28 30 [28,] 28 61 74 [29,] 39 37 31 [30,] 45 40 43 [31,] 45 29 35 [32,] 45 29 43 [33,] 43 45 47 [34,] 52 74 48 [35,] 42 31 37 [36,] 32 46 27 [37,] 28 41 34

[38,] 44 48 38 [39,] 37 42 31 [40,] 29 45 35 [41,] 31 43 29 [42,] 39 49 27 [43,] 43 45 47 [44,] 33 36 35 [45,] 31 43 45 [46,] 37 31 43 [47,] 37 31 43 [48,] 50 47 37 [49,] 33 36 40 [50,] 43 37 31 [51,] 53 58 71 [52,] 74 39 48 [53,] 46 74 32 [54,] 55 54 62 [55,] 53 58 63 [56,] 58 53 56 [57,] 68 56 60 [58,] 55 54 62 [59,] 48 74 39 [60,] 54 62 67 [61,] 56 53 58 [62,] 66 51 75 [63,] 67 54 62 [64,] 46 74 39 [65,] 63 71 75 [66,] 71 63 55 [67,] 63 61 66 [68,] 52 64 72 [69,] 71 75 55 [70,] 71 75 55 [71,] 63 66 61 [72,] 74 62 48 [73,] 61 62 67 [74,] 66 61 75 [75,] 52 46 72 attr(,"nn.dist") [,1] [,2] [,3] [1,] 0.2000000 0.2236068 0.3000000 [2,] 0.2000000 0.2236068 0.2645751 [3,] 0.1414214 0.1732051 0.2236068 [4,] 0.1414214 0.1732051 0.2236068 [5,] 0.1732051 0.2236068 0.2645751 [6,] 0.1732051 0.2236068 0.2449490 [7,] 0.2828427 0.3464102 0.3605551 [8,] 0.3741657 0.4582576 0.4582576 [9,] 0.3605551 0.3872983 0.4123106 [10,] 0.1000000 0.1414214 0.2000000 [11,] 0.3000000 0.3162278 0.3464102 [12,] 0.3000000 0.4123106 0.4242641 [13,] 0.1414214 0.2449490 0.2645751 [14,] 0.1414214 0.2449490 0.3000000 [15,] 0.1000000 0.1414214 0.1732051 [16,] 0.1414214 0.1732051 0.1732051 [17,] 0.6244998 0.7810250 0.7937254 [18,] 0.3000000 0.3000000 0.3162278 [19,] 0.2645751 0.3162278 0.3741657 [20,] 0.3741657 0.4123106 0.4123106 [21,] 0.1414214 0.2000000 0.2645751 [22,] 0.1414214 0.2449490 0.3000000 [23,] 0.1414214 0.1414214 0.2236068 [24,] 0.1000000 0.2449490 0.2828427 [25,] 0.1414214 0.2236068 0.2449490 [26,] 0.1414214 0.2449490 0.2645751 [27,] 0.3162278 0.3464102 0.3741657 [28,] 0.3162278 0.4242641 0.5196152 [29,] 0.2449490 0.3316625 0.3741657 [30,] 0.4358899 0.4472136 0.6164414 [31,] 0.1732051 0.3000000 0.5291503 [32,] 0.2645751 0.4358899 0.5830952

[33,] 0.2828427 0.3000000 0.3464102 [34,] 0.3605551 0.4123106 0.4242641 [35,] 0.2000000 0.4123106 0.5830952 [36,] 0.3741657 0.4242641 0.4582576 [37,] 0.2828427 0.3162278 0.3162278 [38,] 0.2645751 0.5744563 0.5916080 [39,] 0.3741657 0.4472136 0.4582576 [40,] 0.2000000 0.2449490 0.3872983 [41,] 0.3162278 0.4795832 0.5099020 [42,] 0.1414214 0.3000000 0.3872983 [43,] 0.2449490 0.2645751 0.3741657 [44,] 0.1414214 0.3605551 0.8485281 [45,] 0.3316625 0.3741657 0.4123106 [46,] 0.3605551 0.3741657 0.3872983 [47,] 0.3000000 0.3162278 0.3872983 [48,] 0.2000000 0.3316625 0.3872983 [49,] 0.3872983 0.7211103 0.9000000 [50,] 0.3316625 0.3605551 0.4000000 [51,] 0.3872983 0.4358899 0.6557439 [52,] 0.1732051 0.4358899 0.4472136 [53,] 0.3000000 0.3605551 0.4582576 [54,] 0.3162278 0.3316625 0.3741657 [55,] 0.5196152 0.5567764 0.7071068 [56,] 0.2645751 0.4582576 0.6082763 [57,] 0.4123106 0.8831761 0.9327379 [58,] 0.3000000 0.4242641 0.4358899 [59,] 0.3605551 0.3741657 0.4690416 [60,] 0.5385165 0.6633250 0.7000000 [61,] 0.5477226 0.6633250 0.6782330 [62,] 0.3872983 0.4242641 0.5196152 [63,] 0.1414214 0.2449490 0.4582576 [64,] 0.2236068 0.4358899 0.4358899 [65,] 0.1732051 0.3741657 0.4123106 [66,] 0.2645751 0.3464102 0.3605551 [67,] 0.4690416 0.5099020 0.5477226 [68,] 0.0000000 0.2645751 0.3162278 [69,] 0.2236068 0.3162278 0.3872983 [70,] 0.3000000 0.4000000 0.4795832 [71,] 0.3741657 0.3741657 0.4242641 [72,] 0.2449490 0.3741657 0.4123106 [73,] 0.2236068 0.3464102 0.3605551 [74,] 0.3000000 0.5567764 0.6244998 [75,] 0.3316625 0.3605551 0.4582576 Levels: c s v > attributes(.Last.value) $levels [1] "c" "s" "v" $class [1] "factor" $prob [1] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [9] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [17] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [25] 1.0000000 1.0000000 1.0000000 0.6666667 1.0000000 1.0000000 1.0000000 1.0000000 [33] 1.0000000 0.6666667 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [41] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [49] 1.0000000 1.0000000 1.0000000 0.6666667 0.6666667 1.0000000 1.0000000 1.0000000 [57] 1.0000000 1.0000000 0.6666667 1.0000000 1.0000000 1.0000000 1.0000000 0.6666667 [65] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 0.6666667 [73] 1.0000000 1.0000000 0.6666667 $nn.index [,1] [,2] [,3] [1,] 10 2 13 [2,] 24 8 18 [3,] 1 18 8 [4,] 1 18 8 [5,] 4 12 10 [6,] 10 4 13 [7,] 21 18 11

[8,] 20 11 17 [9,] 16 17 15 [10,] 10 2 13 [11,] 2 3 10 [12,] 11 1 21 [13,] 5 1 8 [14,] 9 14 4 [15,] 8 1 18 [16,] 18 1 5 [17,] 9 14 13 [18,] 4 3 7 [19,] 24 22 18 [20,] 6 22 20 [21,] 2 13 3 [22,] 20 22 5 [23,] 4 3 7 [24,] 11 20 22 [25,] 8 1 18 [26,] 41 34 50 [27,] 34 28 30 [28,] 28 61 74 [29,] 39 37 31 [30,] 45 40 43 [31,] 45 29 35 [32,] 45 29 43 [33,] 43 45 47 [34,] 52 74 48 [35,] 42 31 37 [36,] 32 46 27 [37,] 28 41 34 [38,] 44 48 38 [39,] 37 42 31 [40,] 29 45 35 [41,] 31 43 29 [42,] 39 49 27 [43,] 43 45 47 [44,] 33 36 35 [45,] 31 43 45 [46,] 37 31 43 [47,] 37 31 43 [48,] 50 47 37 [49,] 33 36 40 [50,] 43 37 31 [51,] 53 58 71 [52,] 74 39 48 [53,] 46 74 32 [54,] 55 54 62 [55,] 53 58 63 [56,] 58 53 56 [57,] 68 56 60 [58,] 55 54 62 [59,] 48 74 39 [60,] 54 62 67 [61,] 56 53 58 [62,] 66 51 75 [63,] 67 54 62 [64,] 46 74 39 [65,] 63 71 75 [66,] 71 63 55 [67,] 63 61 66 [68,] 52 64 72 [69,] 71 75 55 [70,] 71 75 55 [71,] 63 66 61 [72,] 74 62 48 [73,] 61 62 67 [74,] 66 61 75 [75,] 52 46 72 $nn.dist [,1] [,2] [,3] [1,] 0.2000000 0.2236068 0.3000000

[2,] 0.2000000 0.2236068 0.2645751 [3,] 0.1414214 0.1732051 0.2236068 [4,] 0.1414214 0.1732051 0.2236068 [5,] 0.1732051 0.2236068 0.2645751 [6,] 0.1732051 0.2236068 0.2449490 [7,] 0.2828427 0.3464102 0.3605551 [8,] 0.3741657 0.4582576 0.4582576 [9,] 0.3605551 0.3872983 0.4123106 [10,] 0.1000000 0.1414214 0.2000000 [11,] 0.3000000 0.3162278 0.3464102 [12,] 0.3000000 0.4123106 0.4242641 [13,] 0.1414214 0.2449490 0.2645751 [14,] 0.1414214 0.2449490 0.3000000 [15,] 0.1000000 0.1414214 0.1732051 [16,] 0.1414214 0.1732051 0.1732051 [17,] 0.6244998 0.7810250 0.7937254 [18,] 0.3000000 0.3000000 0.3162278 [19,] 0.2645751 0.3162278 0.3741657 [20,] 0.3741657 0.4123106 0.4123106 [21,] 0.1414214 0.2000000 0.2645751 [22,] 0.1414214 0.2449490 0.3000000 [23,] 0.1414214 0.1414214 0.2236068 [24,] 0.1000000 0.2449490 0.2828427 [25,] 0.1414214 0.2236068 0.2449490 [26,] 0.1414214 0.2449490 0.2645751 [27,] 0.3162278 0.3464102 0.3741657 [28,] 0.3162278 0.4242641 0.5196152 [29,] 0.2449490 0.3316625 0.3741657 [30,] 0.4358899 0.4472136 0.6164414 [31,] 0.1732051 0.3000000 0.5291503 [32,] 0.2645751 0.4358899 0.5830952 [33,] 0.2828427 0.3000000 0.3464102 [34,] 0.3605551 0.4123106 0.4242641 [35,] 0.2000000 0.4123106 0.5830952 [36,] 0.3741657 0.4242641 0.4582576 [37,] 0.2828427 0.3162278 0.3162278 [38,] 0.2645751 0.5744563 0.5916080 [39,] 0.3741657 0.4472136 0.4582576 [40,] 0.2000000 0.2449490 0.3872983 [41,] 0.3162278 0.4795832 0.5099020 [42,] 0.1414214 0.3000000 0.3872983 [43,] 0.2449490 0.2645751 0.3741657 [44,] 0.1414214 0.3605551 0.8485281 [45,] 0.3316625 0.3741657 0.4123106 [46,] 0.3605551 0.3741657 0.3872983 [47,] 0.3000000 0.3162278 0.3872983 [48,] 0.2000000 0.3316625 0.3872983 [49,] 0.3872983 0.7211103 0.9000000 [50,] 0.3316625 0.3605551 0.4000000 [51,] 0.3872983 0.4358899 0.6557439 [52,] 0.1732051 0.4358899 0.4472136 [53,] 0.3000000 0.3605551 0.4582576 [54,] 0.3162278 0.3316625 0.3741657 [55,] 0.5196152 0.5567764 0.7071068 [56,] 0.2645751 0.4582576 0.6082763 [57,] 0.4123106 0.8831761 0.9327379 [58,] 0.3000000 0.4242641 0.4358899 [59,] 0.3605551 0.3741657 0.4690416 [60,] 0.5385165 0.6633250 0.7000000 [61,] 0.5477226 0.6633250 0.6782330 [62,] 0.3872983 0.4242641 0.5196152 [63,] 0.1414214 0.2449490 0.4582576 [64,] 0.2236068 0.4358899 0.4358899 [65,] 0.1732051 0.3741657 0.4123106 [66,] 0.2645751 0.3464102 0.3605551 [67,] 0.4690416 0.5099020 0.5477226 [68,] 0.0000000 0.2645751 0.3162278 [69,] 0.2236068 0.3162278 0.3872983 [70,] 0.3000000 0.4000000 0.4795832 [71,] 0.3741657 0.3741657 0.4242641 [72,] 0.2449490 0.3741657 0.4123106 [73,] 0.2236068 0.3464102 0.3605551

[74,] 0.3000000 0.5567764 0.6244998 [75,] 0.3316625 0.3605551 0.4582576
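Because the test rows follow the same 25/25/25 class layout as the training rows, cl also gives the true labels of the test set, so the fit above can be summarised with a confusion table (a short sketch):

pred <- knn(train, test, cl, k = 3)
table(predicted = pred, true = cl)   # rows: predicted class, columns: true class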

knn.cv > data(iris3) > train <- rbind(iris3[,,1], iris3[,,2], iris3[,,3]) > cl <- factor(c(rep("s",50), rep("c",50), rep("v",50))) > knn.cv(train, cl, k = 3, prob = TRUE) [1] s s s s s s s s s s s s s s s s s s s s s s s s s s s s s s s s s s s s s s s s s [42] s s s s s s s s s c c c c c c c c c c c c c c c c c c c c v c v c c c c c c c c c [83] c v c c c c c c c c c c c c c c c c v v v v v v c v v v v v v v v v v v v c v v v [124] v v v v v v v v v v c v v v v v v v v v v v v v v v v attr(,"prob") [1] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [9] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [17] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [25] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [33] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [41] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [49] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [57] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [65] 1.0000000 1.0000000 1.0000000 1.0000000 0.6666667 1.0000000 1.0000000 1.0000000 [73] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 0.6666667 1.0000000 1.0000000 [81] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [89] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [97] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [105] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 0.6666667 1.0000000 [113] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [121] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [129] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 0.6666667 0.6666667 1.0000000 [137] 1.0000000 1.0000000 0.6666667 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [145] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 attr(,"nn.index") [,1] [,2] [,3] [1,] 18 5 40 [2,] 35 46 13 [3,] 48 4 7 [4,] 48 30 31 [5,] 38 1 18 [6,] 19 11 49 [7,] 48 3 12 [8,] 40 50 1 [9,] 39 4 43 [10,] 35 2 31 [11,] 49 28 37 [12,] 30 8 27 [13,] 2 10 46 [14,] 39 43 9 [15,] 34 17 16 [16,] 34 15 6 [17,] 11 49 34 [18,] 1 41 5 [19,] 6 11 49 [20,] 22 47 49 [21,] 32 28 29 [22,] 20 47 18 [23,] 7 3 38 [24,] 27 44 40 [25,] 12 30 27 [26,] 35 10 2 [27,] 24 44 8 [28,] 29 1 40 [29,] 28 40 1 [30,] 31 4 12 [31,] 30 35 10 [32,] 21 28 29 [33,] 34 47 20 [34,] 33 16 17 [35,] 10 2 31 [36,] 50 2 3

[37,] 11 32 29 [38,] 5 1 41 [39,] 9 43 14 [40,] 8 1 28 [41,] 18 1 5 [42,] 9 39 46 [43,] 39 48 4 [44,] 27 24 22 [45,] 47 6 22 [46,] 2 13 35 [47,] 20 22 49 [48,] 4 3 43 [49,] 11 28 20 [50,] 8 40 36 [51,] 53 87 66 [52,] 57 76 66 [53,] 51 87 78 [54,] 90 81 70 [55,] 59 76 77 [56,] 67 91 97 [57,] 52 86 92 [58,] 94 99 61 [59,] 76 55 66 [60,] 90 95 54 [61,] 94 58 82 [62,] 97 79 96 [63,] 93 70 68 [64,] 92 74 79 [65,] 83 80 89 [66,] 76 59 87 [67,] 85 56 97 [68,] 93 83 100 [69,] 88 73 120 [70,] 81 90 82 [71,] 139 128 150 [72,] 98 83 93 [73,] 134 124 147 [74,] 64 92 79 [75,] 98 76 59 [76,] 66 59 75 [77,] 59 87 53 [78,] 53 87 148 [79,] 92 64 62 [80,] 82 81 70 [81,] 82 70 90 [82,] 81 70 80 [83,] 93 100 68 [84,] 134 102 143 [85,] 67 56 97 [86,] 57 71 52 [87,] 53 66 59 [88,] 69 73 63 [89,] 96 97 100 [90,] 54 70 81 [91,] 95 56 97 [92,] 64 79 74 [93,] 83 68 100 [94,] 58 61 99 [95,] 100 97 91 [96,] 97 89 100 [97,] 96 100 89 [98,] 75 72 92 [99,] 58 94 61 [100,] 97 95 89 [101,] 137 145 105 [102,] 143 114 122 [103,] 126 121 144 [104,] 117 138 129 [105,] 133 129 141 [106,] 123 108 136 [107,] 85 60 91 [108,] 131 126 106

[109,] 129 104 117 [110,] 144 121 145 [111,] 148 116 78 [112,] 148 129 147 [113,] 140 141 121 [114,] 143 102 122 [115,] 122 102 143 [116,] 149 111 146 [117,] 138 104 148 [118,] 132 106 110 [119,] 123 106 136 [120,] 73 84 69 [121,] 144 141 125 [122,] 143 102 114 [123,] 106 119 108 [124,] 127 147 128 [125,] 121 144 141 [126,] 130 103 108 [127,] 124 128 139 [128,] 139 127 150 [129,] 133 105 104 [130,] 126 131 103 [131,] 108 103 126 [132,] 118 106 136 [133,] 129 105 104 [134,] 84 73 124 [135,] 104 84 134 [136,] 131 106 103 [137,] 149 116 101 [138,] 117 104 148 [139,] 128 71 127 [140,] 113 146 142 [141,] 145 121 113 [142,] 146 140 113 [143,] 143 114 122 [144,] 121 125 145 [145,] 141 121 144 [146,] 142 148 140 [147,] 124 112 127 [148,] 111 112 117 [149,] 137 116 111 [150,] 128 139 102 attr(,"nn.dist") [,1] [,2] [,3] [1,] 0.1000000 0.1414214 0.1414214 [2,] 0.1414214 0.1414214 0.1414214 [3,] 0.1414214 0.2449490 0.2645751 [4,] 0.1414214 0.1732051 0.2236068 [5,] 0.1414214 0.1414214 0.1732051 [6,] 0.3316625 0.3464102 0.3605551 [7,] 0.2236068 0.2645751 0.3000000 [8,] 0.1000000 0.1414214 0.1732051 [9,] 0.1414214 0.3000000 0.3162278 [10,] 0.1000000 0.1732051 0.1732051 [11,] 0.1000000 0.2828427 0.3000000 [12,] 0.2236068 0.2236068 0.2828427 [13,] 0.1414214 0.1732051 0.2000000 [14,] 0.2449490 0.3162278 0.3464102 [15,] 0.4123106 0.4690416 0.5477226 [16,] 0.3605551 0.5477226 0.6164414 [17,] 0.3464102 0.3605551 0.3872983 [18,] 0.1000000 0.1414214 0.1732051 [19,] 0.3316625 0.3872983 0.4690416 [20,] 0.1414214 0.1414214 0.2449490 [21,] 0.2828427 0.3000000 0.3605551 [22,] 0.1414214 0.2449490 0.2449490 [23,] 0.4582576 0.5099020 0.5099020 [24,] 0.2000000 0.2645751 0.3741657 [25,] 0.3000000 0.3741657 0.4123106 [26,] 0.1732051 0.2000000 0.2236068 [27,] 0.2000000 0.2236068 0.2236068 [28,] 0.1414214 0.1414214 0.1414214

[29,] 0.1414214 0.1414214 0.1414214 [30,] 0.1414214 0.1732051 0.2236068 [31,] 0.1414214 0.1414214 0.1732051 [32,] 0.2828427 0.3000000 0.3000000 [33,] 0.3464102 0.3464102 0.3741657 [34,] 0.3464102 0.3605551 0.3872983 [35,] 0.1000000 0.1414214 0.1414214 [36,] 0.2236068 0.3000000 0.3162278 [37,] 0.3000000 0.3162278 0.3316625 [38,] 0.1414214 0.2449490 0.2645751 [39,] 0.1414214 0.2000000 0.2449490 [40,] 0.1000000 0.1414214 0.1414214 [41,] 0.1414214 0.1732051 0.1732051 [42,] 0.6244998 0.7141428 0.7681146 [43,] 0.2000000 0.2236068 0.3000000 [44,] 0.2236068 0.2645751 0.3162278 [45,] 0.3605551 0.3741657 0.4123106 [46,] 0.1414214 0.2000000 0.2000000 [47,] 0.1414214 0.2449490 0.2449490 [48,] 0.1414214 0.1414214 0.2236068 [49,] 0.1000000 0.2236068 0.2449490 [50,] 0.1414214 0.1732051 0.2236068 [51,] 0.2645751 0.3316625 0.4358899 [52,] 0.2645751 0.3162278 0.3464102 [53,] 0.2645751 0.2828427 0.3162278 [54,] 0.2000000 0.3000000 0.3162278 [55,] 0.2449490 0.3162278 0.3741657 [56,] 0.3000000 0.3162278 0.3162278 [57,] 0.2645751 0.3741657 0.4242641 [58,] 0.1414214 0.3872983 0.4582576 [59,] 0.2449490 0.2449490 0.3162278 [60,] 0.3872983 0.5099020 0.5196152 [61,] 0.3605551 0.4582576 0.6708204 [62,] 0.3000000 0.3316625 0.3605551 [63,] 0.4898979 0.5196152 0.5477226 [64,] 0.1414214 0.2236068 0.2449490 [65,] 0.4242641 0.4472136 0.5099020 [66,] 0.1414214 0.3162278 0.3162278 [67,] 0.2000000 0.3000000 0.3872983 [68,] 0.2449490 0.2828427 0.3316625 [69,] 0.2645751 0.5099020 0.5385165 [70,] 0.1732051 0.2449490 0.2645751 [71,] 0.2236068 0.3000000 0.3605551 [72,] 0.3316625 0.3464102 0.3741657 [73,] 0.3605551 0.3605551 0.4123106 [74,] 0.2236068 0.3000000 0.3872983 [75,] 0.2000000 0.2645751 0.3605551 [76,] 0.1414214 0.2449490 0.2645751 [77,] 0.3162278 0.3464102 0.3464102 [78,] 0.3162278 0.3741657 0.4123106 [79,] 0.2000000 0.2449490 0.3316625 [80,] 0.3464102 0.4242641 0.4358899 [81,] 0.1414214 0.1732051 0.3000000 [82,] 0.1414214 0.2645751 0.3464102 [83,] 0.1414214 0.2645751 0.2828427 [84,] 0.3316625 0.3605551 0.3605551 [85,] 0.2000000 0.4123106 0.4795832 [86,] 0.3741657 0.4242641 0.4582576 [87,] 0.2828427 0.3162278 0.3162278 [88,] 0.2645751 0.5744563 0.5916080 [89,] 0.1732051 0.1732051 0.2236068 [90,] 0.2000000 0.2449490 0.3000000 [91,] 0.2645751 0.3162278 0.4242641 [92,] 0.1414214 0.2000000 0.3000000 [93,] 0.1414214 0.2449490 0.2645751 [94,] 0.1414214 0.3605551 0.3872983 [95,] 0.1732051 0.2236068 0.2645751 [96,] 0.1414214 0.1732051 0.2449490 [97,] 0.1414214 0.1414214 0.1732051 [98,] 0.2000000 0.3316625 0.3464102 [99,] 0.3872983 0.3872983 0.7211103 [100,] 0.1414214 0.1732051 0.2236068


[101,] 0.4242641 0.5000000 0.5099020 [102,] 0.0000000 0.2645751 0.3162278 [103,] 0.3872983 0.4000000 0.4123106 [104,] 0.2449490 0.2449490 0.3316625 [105,] 0.3000000 0.3162278 0.3605551 [106,] 0.2645751 0.5291503 0.5477226 [107,] 0.7348469 0.7615773 0.7937254 [108,] 0.2645751 0.4358899 0.5291503 [109,] 0.5567764 0.6000000 0.6164414 [110,] 0.6324555 0.6708204 0.7071068 [111,] 0.2236068 0.3741657 0.4242641 [112,] 0.3464102 0.3741657 0.3741657 [113,] 0.1732051 0.3464102 0.3605551 [114,] 0.2645751 0.2645751 0.3316625 [115,] 0.4898979 0.5099020 0.5099020 [116,] 0.3000000 0.3741657 0.3741657 [117,] 0.1414214 0.2449490 0.3605551 [118,] 0.4123106 0.8185353 0.8602325 [119,] 0.4123106 0.5477226 0.8944272 [120,] 0.4358899 0.5196152 0.5385165 [121,] 0.2236068 0.2645751 0.3000000 [122,] 0.3162278 0.3162278 0.3316625 [123,] 0.2645751 0.4123106 0.6082763 [124,] 0.1732051 0.2449490 0.3605551 [125,] 0.3000000 0.3162278 0.3741657 [126,] 0.3464102 0.3872983 0.4358899 [127,] 0.1732051 0.2449490 0.2828427 [128,] 0.1414214 0.2449490 0.2828427 [129,] 0.1000000 0.3162278 0.3316625 [130,] 0.3464102 0.5099020 0.5196152 [131,] 0.2645751 0.4582576 0.4690416 [132,] 0.4123106 0.8831761 0.9273618 [133,] 0.1000000 0.3000000 0.4242641 [134,] 0.3316625 0.3605551 0.3741657 [135,] 0.5385165 0.5567764 0.5830952 [136,] 0.5385165 0.5477226 0.6633250 [137,] 0.2449490 0.3872983 0.4242641 [138,] 0.1414214 0.2449490 0.3872983 [139,] 0.1414214 0.2236068 0.2828427 [140,] 0.1732051 0.3605551 0.3605551 [141,] 0.2449490 0.2645751 0.3464102 [142,] 0.2449490 0.3605551 0.4690416 [143,] 0.0000000 0.2645751 0.3162278 [144,] 0.2236068 0.3162278 0.3162278 [145,] 0.2449490 0.3000000 0.3162278 [146,] 0.2449490 0.3605551 0.3605551 [147,] 0.2449490 0.3741657 0.3872983 [148,] 0.2236068 0.3464102 0.3605551 [149,] 0.2449490 0.3000000 0.5567764 [150,] 0.2828427 0.3162278 0.3316625 Levels: c s v > attributes(.Last.value) $levels [1] "c" "s" "v" $class [1] "factor" $prob [1] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [9] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [17] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [25] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [33] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [41] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [49] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [57] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [65] 1.0000000 1.0000000 1.0000000 1.0000000 0.6666667 1.0000000 1.0000000 1.0000000 [73] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 0.6666667 1.0000000 1.0000000 [81] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [89] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [97] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000


[105] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 0.6666667 1.0000000 [113] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [121] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [129] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 0.6666667 0.6666667 1.0000000 [137] 1.0000000 1.0000000 0.6666667 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 [145] 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 $nn.index [,1] [,2] [,3] [1,] 18 5 40 [2,] 35 46 13 [3,] 48 4 7 [4,] 48 30 31 [5,] 38 1 18 [6,] 19 11 49 [7,] 48 3 12 [8,] 40 50 1 [9,] 39 4 43 [10,] 35 2 31 [11,] 49 28 37 [12,] 30 8 27 [13,] 2 10 46 [14,] 39 43 9 [15,] 34 17 16 [16,] 34 15 6 [17,] 11 49 34 [18,] 1 41 5 [19,] 6 11 49 [20,] 22 47 49 [21,] 32 28 29 [22,] 20 47 18 [23,] 7 3 38 [24,] 27 44 40 [25,] 12 30 27 [26,] 35 10 2 [27,] 24 44 8 [28,] 29 1 40 [29,] 28 40 1 [30,] 31 4 12 [31,] 30 35 10 [32,] 21 28 29 [33,] 34 47 20 [34,] 33 16 17 [35,] 10 2 31 [36,] 50 2 3 [37,] 11 32 29 [38,] 5 1 41 [39,] 9 43 14 [40,] 8 1 28 [41,] 18 1 5 [42,] 9 39 46 [43,] 39 48 4 [44,] 27 24 22 [45,] 47 6 22 [46,] 2 13 35 [47,] 20 22 49 [48,] 4 3 43 [49,] 11 28 20 [50,] 8 40 36 [51,] 53 87 66 [52,] 57 76 66 [53,] 51 87 78 [54,] 90 81 70 [55,] 59 76 77 [56,] 67 91 97 [57,] 52 86 92 [58,] 94 99 61 [59,] 76 55 66 [60,] 90 95 54 [61,] 94 58 82 [62,] 97 79 96 [63,] 93 70 68


[64,] 92 74 79 [65,] 83 80 89 [66,] 76 59 87 [67,] 85 56 97 [68,] 93 83 100 [69,] 88 73 120 [70,] 81 90 82 [71,] 139 128 150 [72,] 98 83 93 [73,] 134 124 147 [74,] 64 92 79 [75,] 98 76 59 [76,] 66 59 75 [77,] 59 87 53 [78,] 53 87 148 [79,] 92 64 62 [80,] 82 81 70 [81,] 82 70 90 [82,] 81 70 80 [83,] 93 100 68 [84,] 134 102 143 [85,] 67 56 97 [86,] 57 71 52 [87,] 53 66 59 [88,] 69 73 63 [89,] 96 97 100 [90,] 54 70 81 [91,] 95 56 97 [92,] 64 79 74 [93,] 83 68 100 [94,] 58 61 99 [95,] 100 97 91 [96,] 97 89 100 [97,] 96 100 89 [98,] 75 72 92 [99,] 58 94 61 [100,] 97 95 89 [101,] 137 145 105 [102,] 143 114 122 [103,] 126 121 144 [104,] 117 138 129 [105,] 133 129 141 [106,] 123 108 136 [107,] 85 60 91 [108,] 131 126 106 [109,] 129 104 117 [110,] 144 121 145 [111,] 148 116 78 [112,] 148 129 147 [113,] 140 141 121 [114,] 143 102 122 [115,] 122 102 143 [116,] 149 111 146 [117,] 138 104 148 [118,] 132 106 110 [119,] 123 106 136 [120,] 73 84 69 [121,] 144 141 125 [122,] 143 102 114 [123,] 106 119 108 [124,] 127 147 128 [125,] 121 144 141 [126,] 130 103 108 [127,] 124 128 139 [128,] 139 127 150 [129,] 133 105 104 [130,] 126 131 103 [131,] 108 103 126 [132,] 118 106 136 [133,] 129 105 104 [134,] 84 73 124 [135,] 104 84 134


[136,] 131 106 103 [137,] 149 116 101 [138,] 117 104 148 [139,] 128 71 127 [140,] 113 146 142 [141,] 145 121 113 [142,] 146 140 113 [143,] 143 114 122 [144,] 121 125 145 [145,] 141 121 144 [146,] 142 148 140 [147,] 124 112 127 [148,] 111 112 117 [149,] 137 116 111 [150,] 128 139 102 $nn.dist [,1] [,2] [,3] [1,] 0.1000000 0.1414214 0.1414214 [2,] 0.1414214 0.1414214 0.1414214 [3,] 0.1414214 0.2449490 0.2645751 [4,] 0.1414214 0.1732051 0.2236068 [5,] 0.1414214 0.1414214 0.1732051 [6,] 0.3316625 0.3464102 0.3605551 [7,] 0.2236068 0.2645751 0.3000000 [8,] 0.1000000 0.1414214 0.1732051 [9,] 0.1414214 0.3000000 0.3162278 [10,] 0.1000000 0.1732051 0.1732051 [11,] 0.1000000 0.2828427 0.3000000 [12,] 0.2236068 0.2236068 0.2828427 [13,] 0.1414214 0.1732051 0.2000000 [14,] 0.2449490 0.3162278 0.3464102 [15,] 0.4123106 0.4690416 0.5477226 [16,] 0.3605551 0.5477226 0.6164414 [17,] 0.3464102 0.3605551 0.3872983 [18,] 0.1000000 0.1414214 0.1732051 [19,] 0.3316625 0.3872983 0.4690416 [20,] 0.1414214 0.1414214 0.2449490 [21,] 0.2828427 0.3000000 0.3605551 [22,] 0.1414214 0.2449490 0.2449490 [23,] 0.4582576 0.5099020 0.5099020 [24,] 0.2000000 0.2645751 0.3741657 [25,] 0.3000000 0.3741657 0.4123106 [26,] 0.1732051 0.2000000 0.2236068 [27,] 0.2000000 0.2236068 0.2236068 [28,] 0.1414214 0.1414214 0.1414214 [29,] 0.1414214 0.1414214 0.1414214 [30,] 0.1414214 0.1732051 0.2236068 [31,] 0.1414214 0.1414214 0.1732051 [32,] 0.2828427 0.3000000 0.3000000 [33,] 0.3464102 0.3464102 0.3741657 [34,] 0.3464102 0.3605551 0.3872983 [35,] 0.1000000 0.1414214 0.1414214 [36,] 0.2236068 0.3000000 0.3162278 [37,] 0.3000000 0.3162278 0.3316625 [38,] 0.1414214 0.2449490 0.2645751 [39,] 0.1414214 0.2000000 0.2449490 [40,] 0.1000000 0.1414214 0.1414214 [41,] 0.1414214 0.1732051 0.1732051 [42,] 0.6244998 0.7141428 0.7681146 [43,] 0.2000000 0.2236068 0.3000000 [44,] 0.2236068 0.2645751 0.3162278 [45,] 0.3605551 0.3741657 0.4123106 [46,] 0.1414214 0.2000000 0.2000000 [47,] 0.1414214 0.2449490 0.2449490 [48,] 0.1414214 0.1414214 0.2236068 [49,] 0.1000000 0.2236068 0.2449490 [50,] 0.1414214 0.1732051 0.2236068 [51,] 0.2645751 0.3316625 0.4358899 [52,] 0.2645751 0.3162278 0.3464102 [53,] 0.2645751 0.2828427 0.3162278 [54,] 0.2000000 0.3000000 0.3162278


[55,] 0.2449490 0.3162278 0.3741657 [56,] 0.3000000 0.3162278 0.3162278 [57,] 0.2645751 0.3741657 0.4242641 [58,] 0.1414214 0.3872983 0.4582576 [59,] 0.2449490 0.2449490 0.3162278 [60,] 0.3872983 0.5099020 0.5196152 [61,] 0.3605551 0.4582576 0.6708204 [62,] 0.3000000 0.3316625 0.3605551 [63,] 0.4898979 0.5196152 0.5477226 [64,] 0.1414214 0.2236068 0.2449490 [65,] 0.4242641 0.4472136 0.5099020 [66,] 0.1414214 0.3162278 0.3162278 [67,] 0.2000000 0.3000000 0.3872983 [68,] 0.2449490 0.2828427 0.3316625 [69,] 0.2645751 0.5099020 0.5385165 [70,] 0.1732051 0.2449490 0.2645751 [71,] 0.2236068 0.3000000 0.3605551 [72,] 0.3316625 0.3464102 0.3741657 [73,] 0.3605551 0.3605551 0.4123106 [74,] 0.2236068 0.3000000 0.3872983 [75,] 0.2000000 0.2645751 0.3605551 [76,] 0.1414214 0.2449490 0.2645751 [77,] 0.3162278 0.3464102 0.3464102 [78,] 0.3162278 0.3741657 0.4123106 [79,] 0.2000000 0.2449490 0.3316625 [80,] 0.3464102 0.4242641 0.4358899 [81,] 0.1414214 0.1732051 0.3000000 [82,] 0.1414214 0.2645751 0.3464102 [83,] 0.1414214 0.2645751 0.2828427 [84,] 0.3316625 0.3605551 0.3605551 [85,] 0.2000000 0.4123106 0.4795832 [86,] 0.3741657 0.4242641 0.4582576 [87,] 0.2828427 0.3162278 0.3162278 [88,] 0.2645751 0.5744563 0.5916080 [89,] 0.1732051 0.1732051 0.2236068 [90,] 0.2000000 0.2449490 0.3000000 [91,] 0.2645751 0.3162278 0.4242641 [92,] 0.1414214 0.2000000 0.3000000 [93,] 0.1414214 0.2449490 0.2645751 [94,] 0.1414214 0.3605551 0.3872983 [95,] 0.1732051 0.2236068 0.2645751 [96,] 0.1414214 0.1732051 0.2449490 [97,] 0.1414214 0.1414214 0.1732051 [98,] 0.2000000 0.3316625 0.3464102 [99,] 0.3872983 0.3872983 0.7211103 [100,] 0.1414214 0.1732051 0.2236068 [101,] 0.4242641 0.5000000 0.5099020 [102,] 0.0000000 0.2645751 0.3162278 [103,] 0.3872983 0.4000000 0.4123106 [104,] 0.2449490 0.2449490 0.3316625 [105,] 0.3000000 0.3162278 0.3605551 [106,] 0.2645751 0.5291503 0.5477226 [107,] 0.7348469 0.7615773 0.7937254 [108,] 0.2645751 0.4358899 0.5291503 [109,] 0.5567764 0.6000000 0.6164414 [110,] 0.6324555 0.6708204 0.7071068 [111,] 0.2236068 0.3741657 0.4242641 [112,] 0.3464102 0.3741657 0.3741657 [113,] 0.1732051 0.3464102 0.3605551 [114,] 0.2645751 0.2645751 0.3316625 [115,] 0.4898979 0.5099020 0.5099020 [116,] 0.3000000 0.3741657 0.3741657 [117,] 0.1414214 0.2449490 0.3605551 [118,] 0.4123106 0.8185353 0.8602325 [119,] 0.4123106 0.5477226 0.8944272 [120,] 0.4358899 0.5196152 0.5385165 [121,] 0.2236068 0.2645751 0.3000000 [122,] 0.3162278 0.3162278 0.3316625 [123,] 0.2645751 0.4123106 0.6082763 [124,] 0.1732051 0.2449490 0.3605551 [125,] 0.3000000 0.3162278 0.3741657 [126,] 0.3464102 0.3872983 0.4358899


[127,] 0.1732051 0.2449490 0.2828427 [128,] 0.1414214 0.2449490 0.2828427 [129,] 0.1000000 0.3162278 0.3316625 [130,] 0.3464102 0.5099020 0.5196152 [131,] 0.2645751 0.4582576 0.4690416 [132,] 0.4123106 0.8831761 0.9273618 [133,] 0.1000000 0.3000000 0.4242641 [134,] 0.3316625 0.3605551 0.3741657 [135,] 0.5385165 0.5567764 0.5830952 [136,] 0.5385165 0.5477226 0.6633250 [137,] 0.2449490 0.3872983 0.4242641 [138,] 0.1414214 0.2449490 0.3872983 [139,] 0.1414214 0.2236068 0.2828427 [140,] 0.1732051 0.3605551 0.3605551 [141,] 0.2449490 0.2645751 0.3464102 [142,] 0.2449490 0.3605551 0.4690416 [143,] 0.0000000 0.2645751 0.3162278 [144,] 0.2236068 0.3162278 0.3162278 [145,] 0.2449490 0.3000000 0.3162278 [146,] 0.2449490 0.3605551 0.3605551 [147,] 0.2449490 0.3741657 0.3872983 [148,] 0.2236068 0.3464102 0.3605551 [149,] 0.2449490 0.3000000 0.5567764 [150,] 0.2828427 0.3162278 0.3316625

knn.dist
> if(require(mvtnorm))
+ {
+   sigma<- function(v, r, p)
+   {
+     V<- matrix(r^2, ncol=p, nrow=p)
+     diag(V)<- 1
+     V*v
+   }
+   X<- rmvnorm(1000, mean=rep(0, 20), sigma(1, .5, 20))
+   print(system.time(knn.dist(X)) )
+   print(system.time(knn.dist(X, algorithm = "kd_tree")))
+ }
   user  system elapsed 
   0.17    0.00    0.17 
   user  system elapsed 
   0.14    0.00    0.17 
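
A minimal sketch (not part of the transcript above, purely illustrative): knn.dist() also accepts an explicit k, and the value it returns is an n x k matrix in which row i holds the sorted Euclidean distances from point i to its k nearest neighbours.

library(FNN)
set.seed(42)                             # any seed; the data are only illustrative
X <- matrix(rnorm(200 * 5), ncol = 5)    # 200 points in 5 dimensions
D <- knn.dist(X, k = 3)                  # distances to the 3 nearest neighbours
dim(D)                                   # 200 x 3
head(D)                                  # row i: sorted distances for point i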

knn.index
> data<- query<- cbind(1:10, 1:10)
> knn.index(data, k=5)
      [,1] [,2] [,3] [,4] [,5]
 [1,]    2    3    4    5    6
 [2,]    1    3    4    5    6
 [3,]    2    4    1    5    6
 [4,]    3    5    2    6    1
 [5,]    4    6    3    7    2
 [6,]    7    5    8    4    9
 [7,]    8    6    9    5   10
 [8,]    9    7   10    6    5
 [9,]   10    8    7    6    5
[10,]    9    8    7    6    5
> knnx.index(data, query, k=5)
      [,1] [,2] [,3] [,4] [,5]
 [1,]    1    2    3    4    5
 [2,]    2    1    3    4    5
 [3,]    3    2    4    1    5
 [4,]    4    3    5    2    6
 [5,]    5    4    6    3    7
 [6,]    6    7    5    8    4
 [7,]    7    8    6    9    5
 [8,]    8    9    7   10    6
 [9,]    9   10    8    7    6
[10,]   10    9    8    7    6
> knnx.index(data, query, k=5, algo="kd_tree")
      [,1] [,2] [,3] [,4] [,5]
 [1,]    1    2    3    4    5
 [2,]    2    1    3    4    5
 [3,]    3    2    4    1    5
 [4,]    4    3    5    2    6
 [5,]    5    4    6    3    7
 [6,]    6    7    5    8    4
 [7,]    7    8    6    9    5
 [8,]    8    9    7   10    6
 [9,]    9   10    8    7    6
[10,]   10    9    8    7    6
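
A minimal sketch (a companion call, not shown in the transcript): knnx.dist() returns the distances that pair with the indices produced by knnx.index(), so the two calls together tell you which points are the nearest neighbours of each query point and how far away they sit.

library(FNN)
data <- query <- cbind(1:10, 1:10)
idx <- knnx.index(data, query, k = 5)    # neighbour indices, as above
dst <- knnx.dist(data, query, k = 5)     # matching Euclidean distances
idx[1, ]                                 # the 5 nearest points to the first query point
dst[1, ]                                 # how far away they are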

knn.reg
> if(require(chemometrics)){
+   data(PAC);
+   pac.knn<- knn.reg(PAC$X, y=PAC$y, k=3);
+   plot(PAC$y, pac.knn$pred, xlab="y", ylab=expression(hat(y)))
+ }
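
A minimal sketch on simulated data (the PAC data from chemometrics are not needed for this): when test is left at its default NULL, knn.reg() is documented to return leave-one-out predictions in $pred, which can be compared directly with the observed response.

library(FNN)
set.seed(1)                                   # illustrative toy regression
x <- matrix(runif(200), ncol = 2)             # 100 observations, 2 predictors
y <- sin(2 * pi * x[, 1]) + 0.1 * rnorm(100)
fit <- knn.reg(x, y = y, k = 3)               # leave-one-out 3-NN regression
mean((y - fit$pred)^2)                        # rough estimate of prediction error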

ownn
> data(iris3)
> train <- rbind(iris3[1:25,,1], iris3[1:25,,2], iris3[1:25,,3])
> test <- rbind(iris3[26:50,,1], iris3[26:50,,2], iris3[26:50,,3])
> cl <- factor(c(rep("s",25), rep("c",25), rep("v",25)))
> testcl <- factor(c(rep("s",25), rep("c",25), rep("v",25)))
> out <- ownn(train, test, cl, testcl)
> out
$k
[1] 5
$knnpred
 [1] s s s s s s s s s s s s s s s s s s s s s s s s s c c v c c c c c v c c c c c c c
[42] c c c c c c c c c v c c v v v v v c v v v v c v v v v v v v v v v v
Levels: c s v
$ownnpred
 [1] s s s s s s s s s s s s s s s s s s s s s s s s s c c v c c c c c v c c c c c c c
[42] c c c c c c c c c v c c v v v v v c v v v v c v v v v v v v v v v v
Levels: c s v
$bnnpred
 [1] s s s s s s s s s s s s s s s s s s s s s s s s s c c c c c c c c v c c c c c c c
[42] c c c c c c c c c v c c v v v v v c v v v v c v v v v v v v v v v v
Levels: c s v
$accuracy
      knn      ownn       bnn 
0.9200000 0.9200000 0.9333333 
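
A minimal sketch (building directly on the objects created above): the predicted labels returned by ownn() can be cross-tabulated against the true test labels to see where the three classifiers differ.

table(testcl, out$knnpred)     # ordinary k-nearest-neighbour predictions
table(testcl, out$ownnpred)    # optimally weighted nearest neighbours
table(testcl, out$bnnpred)     # bagged nearest neighbours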

LogicReg - Package
> library(LogicReg)
Loading required package: survival
Loading required package: splines
Attaching package: ‘survival’
The following object is masked from ‘package:robustbase’:
    heart
Warning messages:
1: package ‘LogicReg’ was built under R version 3.1.2 
2: package ‘survival’ was built under R version 3.1.0 

cumhaz
> data(logreg.testdat)
> #
> # this is not survival data, but it shows the functionality
> yy <- cumhaz(exp(logreg.testdat[,1]), logreg.testdat[, 2])
> # then we would use
> # logreg(resp=yy, cens=logreg.testdat[,2], type=5, ...
> # instead of
> # logreg(resp=logreg.testdat[,1], cens=logreg.testdat[,2], type=4, ...
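
A minimal sketch (continuing the toy example above): the cumulative hazard transform is monotone in the original "survival times", which a quick plot of the transformed response against the untransformed one makes easy to check.

plot(exp(logreg.testdat[, 1]), yy,
     xlab = "pseudo survival time", ylab = "cumulative hazard transform")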

eval.logreg
> data(logreg.savefit1)
> # myanneal <- logreg.anneal.control(start = -1, end = -4, iter = 25000, update = 1000)
> # logreg.savefit1 <- logreg(resp = logreg.testdat[,1], bin=logreg.testdat[, 2:21],
> # type = 2, select = 1, ntrees = 2, anneal.control = myanneal)
> tree1 <- eval.logreg(logreg.savefit1$model$trees[[1]], logreg.savefit1$binary)
> tree2 <- eval.logreg(logreg.savefit1$model$trees[[2]], logreg.savefit1$binary)
> alltrees <- eval.logreg(logreg.savefit1$model, logreg.savefit1$binary)
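
A minimal sketch (an assumption about how one might inspect the result; eval.logreg() evaluates the fitted logic trees as 0/1 values on the binary predictors): a cross-tabulation shows how often the two trees fire together, and str() reveals the shape of the full-model evaluation.

table(tree1, tree2)    # joint 0/1 distribution of the two evaluated trees
str(alltrees)          # evaluation of the complete model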

frame.logreg
> data(logreg.savefit1,logreg.savefit2,logreg.savefit6)
> #
> # fit a single model
> # myanneal <- logreg.anneal.control(start = -1, end = -4, iter = 25000, update = 1000)
> # logreg.savefit1 <- logreg(resp = logreg.testdat[,1], bin=logreg.testdat[, 2:21],
> # type = 2, select = 1, ntrees = 2, anneal.control = myanneal)
> frame1 <- frame.logreg(logreg.savefit1)
> #
> # a complete sequence
> # myanneal2 <- logreg.anneal.control(start = -1, end = -4, iter = 25000, update = 0)
> # logreg.savefit2 <- logreg(select = 2, ntrees = c(1,2), nleaves =c(1,7),
> # oldfit = logreg.savefit1, anneal.control = myanneal2)
> frame2 <- frame.logreg(logreg.savefit2)
> #
> # a greedy sequence
> # logreg.savefit6 <- logreg(select = 6, ntrees = 2, nleaves =c(1,12), oldfit = logreg.savefit1)
> frame6 <- frame.logreg(logreg.savefit6, msz = 3:5) # restrict the size
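
A minimal sketch (an assumption about the returned object; frame.logreg() is documented to return a data frame with the fitted logic trees evaluated on the data): a quick look at its dimensions and first rows shows what downstream modelling functions would see.

dim(frame1)      # one row per observation
head(frame1)     # evaluated trees for the single fitted model
dim(frame6)      # with msz = 3:5 only models of those sizes are retained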

logreg
> data(logreg.savefit1,logreg.savefit2,logreg.savefit3,logreg.savefit4, logreg.savefit5,logreg.savefit6,logreg.savefit7,logreg.testdat)
> myanneal <- logreg.anneal.control(start = -1, end = -4, iter = 500, update = 100)
> # in practice we would use 25000 iterations or far more - the use of 500 is only
> # to have the examples run fast
> ## Not run: myanneal <- logreg.anneal.control(start = -1, end = -4, iter = 25000, update = 500)
> fit1 <- logreg(resp = logreg.testdat[,1], bin=logreg.testdat[, 2:21], type = 2, select = 1, ntrees = 2, anneal.control = myanneal)
log-temp current score best score acc / rej /sing current parameters
  -1.000   1.4941   1.4941    0(  0)    0    0    2.886  0.000  0.000
  -1.600   1.0442   1.0402   60(  3)   37    0    1.690  0.233  2.139
  -2.200   1.0391   1.0323   52(  1)   47    0    2.057 -0.299  2.130
  -2.800   1.0271   1.0271   13(  2)   85    0    1.474  0.508  2.134
  -3.400   1.0250   1.0250    4(  2)   94    0    1.460  0.533  2.119
  -4.000   1.0236   1.0236    1(  3)   96    0    1.469  0.538  2.109
> # the best score should be in the 0.95-1.10 range
> plot(fit1)
> # you'll probably see X1-X4 as well as a few noise predictors
> # use logreg.savefit1 for the results with 25000 iterations
> plot(logreg.savefit1)
> print(logreg.savefit1)
score 0.966
1.98 -1.3 * (((X14 or (not X5)) and ((not X1) and (not X2))) and (((not X3) or X1) or ((not X20) and (not X2))))
    +2.15 * (((not X4) or ((not X13) and (not X11))) and (not X3))
> z <- predict(logreg.savefit1)
> plot(z, logreg.testdat[,1]-z, xlab="fitted values", ylab="residuals")
> # there are some streaks, thanks to the very discrete predictions
> #
> # a bit less output
> myanneal2 <- logreg.anneal.control(start = -1, end = -4, iter = 500, update = 0)
> # in practice we would use 25000 iterations or more - the use of 500 is only
> # to have the examples run fast
> ## Not run: myanneal2 <- logreg.anneal.control(start = -1, end = -4, iter = 25000, update = 0)
> #


> # fit multiple models > fit2 <- logreg(resp = logreg.testdat[,1], bin=logreg.testdat[, 2:21], type = 2, select = 2, ntrees = c(1,2), nleaves =c(1,7), anneal.control = myanneal2) The number of trees in these models is 1 The model size is 1 The best model with 1 trees of size 1 has a score of 1.1500 The model size is 2 The best model with 1 trees of size 2 has a score of 1.0481 The model size is 3 The best model with 1 trees of size 3 has a score of 1.1500 The model size is 4 The best model with 1 trees of size 4 has a score of 1.0481 The model size is 5 The best model with 1 trees of size 5 has a score of 1.0481 The model size is 6 The best model with 1 trees of size 6 has a score of 1.0457 The model size is 7 The best model with 1 trees of size 7 has a score of 1.1500 The number of trees in these models is 2 The size for this model is smaller than the number of trees you requested. ( 1 versus 2) To save CPU time, we will skip this run. On to the next model... The model size is 2 The best model with 2 trees of size 2 has a score of 1.1168 The model size is 3 The best model with 2 trees of size 3 has a score of 1.0333 The model size is 4 The best model with 2 trees of size 4 has a score of 1.0335 The model size is 5 The best model with 2 trees of size 5 has a score of 1.1138 The model size is 6


The best model with 2 trees of size 6 has a score of 0.9823 The model size is 7 The best model with 2 trees of size 7 has a score of 1.1168 > # equivalent > fit2 <- logreg(select = 2, ntrees = c(1,2), nleaves =c(1,7), oldfit = fit1, anneal.control = myanneal2) The number of trees in these models is 1 The model size is 1 The best model with 1 trees of size 1 has a score of 1.1500 The model size is 2 The best model with 1 trees of size 2 has a score of 1.0481 The model size is 3 The best model with 1 trees of size 3 has a score of 1.0481 The model size is 4 The best model with 1 trees of size 4 has a score of 1.1500 The model size is 5 The best model with 1 trees of size 5 has a score of 1.0457 The model size is 6 The best model with 1 trees of size 6 has a score of 1.0481 The model size is 7 The best model with 1 trees of size 7 has a score of 1.0481 The number of trees in these models is 2 The size for this model is smaller than the number of trees you requested. ( 1 versus 2) To save CPU time, we will skip this run. On to the next model... The model size is 2 The best model with 2 trees of size 2 has a score of 1.1168 The model size is 3 The best model with 2 trees of size 3 has a score of 1.0333 The model size is 4 The best model with 2 trees of size 4 has a score of 1.0429 The model size is 5 The best model with 2 trees of size 5 has a score of 1.0482 The model size is 6 The best model with 2 trees of size 6 has a score of 1.0484 The model size is 7 The best model with 2 trees of size 7 has a score of 1.1162 > plot(fit2) > # use logreg.savefit2 for the results with 25000 iterations > plot(logreg.savefit2) > print(logreg.savefit2) 1 trees with 1 leaves: score is 1.15 3.79 -1.91 * X3 1 trees with 2 leaves: score is 1.048 1.87 +2.13 * ((not X3) and (not X4)) 1 trees with 3 leaves: score is 1.046 1.85 +2.14 * ((not X3) and ((not X13) or (not X4))) 1 trees with 4 leaves: score is 1.042 1.86 +2.14 * ((((not X13) and (not X11)) or (not X4)) and (not X3)) 1 trees with 5 leaves: score is 1.042 1.86 +2.14 * ((not X3) and ((not X4) or ((not X13) and (not X11)))) 1 trees with 6 leaves: score is 1.042 4 -2.14 * (X3 or (X4 and (X13 or (not X19)))) 1 trees with 7 leaves: score is 1.04


1.84 +2.15 * ((((not X4) or (not X13)) and (not X3)) or (((not X6) and X14) and ((not X1) and (not X12)))) 2 trees with 2 leaves: score is 1.117 1.98 +1.89 * (not X3) -0.904 * X4 2 trees with 3 leaves: score is 1.033 1.58 +0.401 * X1 +2.13 * ((not X3) and (not X4)) 2 trees with 4 leaves: score is 0.988 4.12 -1.11 * ((not X1) and (not X2)) -2.12 * (X3 or X4) 2 trees with 5 leaves: score is 0.982 0.77 +1.22 * ((X2 or X1) or X20) +2.12 * ((not X4) and (not X3)) 2 trees with 6 leaves: score is 0.979 0.764 +2.13 * ((not X3) and (not X4)) +1.23 * ((X2 or X1) or (X20 and X3)) 2 trees with 7 leaves: score is 0.978 1.99 +2.13 * ((not X3) and (not X4)) -1.29 * (((not X7) or ((not X5) or (not X15))) and ((not X1) and (not X2))) > # After an initial steep decline, the scores only get slightly better > # for models with more than four leaves and two trees. > #
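
A minimal sketch (the scores are copied from the printed logreg.savefit2 output above, for the two-tree models): plotting score against model size makes the initial steep decline and the subsequent plateau easy to see.

leaves <- 2:7
scores <- c(1.117, 1.033, 0.988, 0.982, 0.979, 0.978)   # "2 trees with k leaves" scores above
plot(leaves, scores, type = "b",
     xlab = "number of leaves (2 trees)", ylab = "score")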


> # cross validation > fit3 <- logreg(resp = logreg.testdat[,1], bin=logreg.testdat[, 2:21], type = 2, select = 3, ntrees = c(1,2), nleaves=c(1,7), anneal.control = myanneal2) The number of trees in these models is 1 The model size is 1 training-now training-ave test-now test-ave Step 1 of 10 [ 1 trees; 1 leaves] CV score: 1.1441 1.1441 1.2272 1.2272 Step 2 of 10 [ 1 trees; 1 leaves] CV score: 1.1573 1.1507 1.1083 1.1677 Step 3 of 10 [ 1 trees; 1 leaves] CV score: 1.1242 1.1419 1.3885 1.2413 Step 4 of 10 [ 1 trees; 1 leaves] CV score: 1.1353 1.1402 1.3023 1.2566 Step 5 of 10 [ 1 trees; 1 leaves] CV score: 1.1685 1.1459 0.9897 1.2032 Step 6 of 10 [ 1 trees; 1 leaves] CV score: 1.1639 1.1489 1.0397 1.1759 Step 7 of 10 [ 1 trees; 1 leaves] CV score: 1.1417 1.1479 1.2476 1.1862 Step 8 of 10 [ 1 trees; 1 leaves] CV score: 1.1386 1.1467 1.2750 1.1973 Step 9 of 10 [ 1 trees; 1 leaves] CV score: 1.1644 1.1487 1.0333 1.1791


Step 10 of 10 [ 1 trees; 1 leaves] CV score: 1.1626 1.1501 1.0534 1.1665 The model size is 2 training-now training-ave test-now test-ave Step 1 of 10 [ 1 trees; 2 leaves] CV score: 1.1441 1.1441 1.2272 1.2272 Step 2 of 10 [ 1 trees; 2 leaves] CV score: 1.0464 1.0953 1.0859 1.1565 Step 3 of 10 [ 1 trees; 2 leaves] CV score: 1.1242 1.1049 1.3885 1.2339 Step 4 of 10 [ 1 trees; 2 leaves] CV score: 1.0315 1.0866 1.2124 1.2285 Step 5 of 10 [ 1 trees; 2 leaves] CV score: 1.1685 1.1029 0.9897 1.1808 Step 6 of 10 [ 1 trees; 2 leaves] CV score: 1.0609 1.0959 0.9453 1.1415 Step 7 of 10 [ 1 trees; 2 leaves] CV score: 1.0432 1.0884 1.1136 1.1375 Step 8 of 10 [ 1 trees; 2 leaves] CV score: 1.0437 1.0828 1.1089 1.1340 Step 9 of 10 [ 1 trees; 2 leaves] CV score: 1.0649 1.0808 0.9012 1.1081 Step 10 of 10 [ 1 trees; 2 leaves] CV score: 1.0527 1.0780 1.0291 1.1002 The model size is 3 training-now training-ave test-now test-ave Step 1 of 10 [ 1 trees; 3 leaves] CV score: 1.0478 1.0478 1.0735 1.0735 Step 2 of 10 [ 1 trees; 3 leaves] CV score: 1.0439 1.0458 1.0847 1.0791 Step 3 of 10 [ 1 trees; 3 leaves] CV score: 1.0290 1.0402 1.2317 1.1300 Step 4 of 10 [ 1 trees; 3 leaves] CV score: 1.0287 1.0373 1.2129 1.1507 Step 5 of 10 [ 1 trees; 3 leaves] CV score: 1.0611 1.0421 0.9419 1.1089 Step 6 of 10 [ 1 trees; 3 leaves] CV score: 1.0609 1.0452 0.9453 1.0817 Step 7 of 10 [ 1 trees; 3 leaves] CV score: 1.0432 1.0449 1.1136 1.0862 Step 8 of 10 [ 1 trees; 3 leaves] CV score: 1.1386 1.0566 1.2750 1.1098 Step 9 of 10 [ 1 trees; 3 leaves] CV score: 1.0624 1.0573 0.9001 1.0865 Step 10 of 10 [ 1 trees; 3 leaves] CV score: 1.1626 1.0678 1.0534 1.0832 The model size is 4 training-now training-ave test-now test-ave Step 1 of 10 [ 1 trees; 4 leaves] CV score: 1.0478 1.0478 1.0735 1.0735 Step 2 of 10 [ 1 trees; 4 leaves] CV score: 1.0464 1.0471 1.0859 1.0797 Step 3 of 10 [ 1 trees; 4 leaves] CV score: 1.1242 1.0728 1.3885 1.1827 Step 4 of 10 [ 1 trees; 4 leaves] CV score: 1.1353 1.0884 1.3023 1.2126 Step 5 of 10 [ 1 trees; 4 leaves] CV score: 1.0611 1.0830 0.9419 1.1584 Step 6 of 10 [ 1 trees; 4 leaves] CV score: 1.0609 1.0793 0.9453 1.1229 Step 7 of 10 [ 1 trees; 4 leaves] CV score: 1.0432 1.0741 1.1136 1.1216 Step 8 of 10 [ 1 trees; 4 leaves] CV score: 1.0437 1.0703 1.1089 1.1200 Step 9 of 10 [ 1 trees; 4 leaves] CV score: 1.1644 1.0808 1.0333 1.1104


Step 10 of 10 [ 1 trees; 4 leaves] CV score: 1.0527 1.0780 1.0291 1.1022 The model size is 5 training-now training-ave test-now test-ave Step 1 of 10 [ 1 trees; 5 leaves] CV score: 1.0437 1.0437 1.0866 1.0866 Step 2 of 10 [ 1 trees; 5 leaves] CV score: 1.0464 1.0450 1.0859 1.0863 Step 3 of 10 [ 1 trees; 5 leaves] CV score: 1.0290 1.0397 1.2317 1.1348 Step 4 of 10 [ 1 trees; 5 leaves] CV score: 1.0263 1.0363 1.2132 1.1544 Step 5 of 10 [ 1 trees; 5 leaves] CV score: 1.1662 1.0623 0.9891 1.1213 Step 6 of 10 [ 1 trees; 5 leaves] CV score: 1.1639 1.0792 1.0397 1.1077 Step 7 of 10 [ 1 trees; 5 leaves] CV score: 1.0432 1.0741 1.1136 1.1086 Step 8 of 10 [ 1 trees; 5 leaves] CV score: 1.0437 1.0703 1.1089 1.1086 Step 9 of 10 [ 1 trees; 5 leaves] CV score: 1.1644 1.0807 1.0621 1.1034 Step 10 of 10 [ 1 trees; 5 leaves] CV score: 1.0527 1.0779 1.0291 1.0960 The model size is 6 training-now training-ave test-now test-ave Step 1 of 10 [ 1 trees; 6 leaves] CV score: 1.0478 1.0478 1.0735 1.0735 Step 2 of 10 [ 1 trees; 6 leaves] CV score: 1.1573 1.1026 1.1083 1.0909 Step 3 of 10 [ 1 trees; 6 leaves] CV score: 1.0240 1.0764 1.2305 1.1374 Step 4 of 10 [ 1 trees; 6 leaves] CV score: 1.0315 1.0651 1.2124 1.1562 Step 5 of 10 [ 1 trees; 6 leaves] CV score: 1.0611 1.0643 0.9419 1.1133 Step 6 of 10 [ 1 trees; 6 leaves] CV score: 1.1639 1.0809 1.0397 1.1011 Step 7 of 10 [ 1 trees; 6 leaves] CV score: 1.0432 1.0755 1.1136 1.1028 Step 8 of 10 [ 1 trees; 6 leaves] CV score: 1.1386 1.0834 1.2750 1.1244 Step 9 of 10 [ 1 trees; 6 leaves] CV score: 1.1644 1.0924 1.0333 1.1142 Step 10 of 10 [ 1 trees; 6 leaves] CV score: 1.0527 1.0884 1.0291 1.1057 The model size is 7 training-now training-ave test-now test-ave Step 1 of 10 [ 1 trees; 7 leaves] CV score: 1.0413 1.0413 1.0723 1.0723 Step 2 of 10 [ 1 trees; 7 leaves] CV score: 1.0451 1.0432 1.1644 1.1183 Step 3 of 10 [ 1 trees; 7 leaves] CV score: 1.0290 1.0384 1.2317 1.1561 Step 4 of 10 [ 1 trees; 7 leaves] CV score: 1.0315 1.0367 1.2124 1.1702 Step 5 of 10 [ 1 trees; 7 leaves] CV score: 1.0611 1.0416 0.9419 1.1245 Step 6 of 10 [ 1 trees; 7 leaves] CV score: 1.1639 1.0620 1.0397 1.1104 Step 7 of 10 [ 1 trees; 7 leaves] CV score: 1.1417 1.0734 1.2476 1.1300 Step 8 of 10 [ 1 trees; 7 leaves] CV score: 1.1386 1.0815 1.2750 1.1481 Step 9 of 10 [ 1 trees; 7 leaves] CV score: 1.0624 1.0794 0.9001 1.1206


Step 10 of 10 [ 1 trees; 7 leaves] CV score: 1.0527 1.0767 1.0291 1.1114 The number of trees in these models is 2 The size for this model is smaller than the number of trees you requested. ( 1 versus 2) To save CPU time, we will skip this run. On to the next model... The model size is 2 training-now training-ave test-now test-ave Step 1 of 10 [ 2 trees; 2 leaves] CV score: 1.1106 1.1106 1.2087 1.2087 Step 2 of 10 [ 2 trees; 2 leaves] CV score: 1.1204 1.1155 1.1216 1.1652 Step 3 of 10 [ 2 trees; 2 leaves] CV score: 1.0958 1.1089 1.3334 1.2212 Step 4 of 10 [ 2 trees; 2 leaves] CV score: 1.0984 1.1063 1.3125 1.2440 Step 5 of 10 [ 2 trees; 2 leaves] CV score: 1.1345 1.1119 0.9763 1.1905 Step 6 of 10 [ 2 trees; 2 leaves] CV score: 1.1326 1.1154 0.9967 1.1582 Step 7 of 10 [ 2 trees; 2 leaves] CV score: 1.1096 1.1146 1.2171 1.1666 Step 8 of 10 [ 2 trees; 2 leaves] CV score: 1.1111 1.1141 1.2078 1.1718 Step 9 of 10 [ 2 trees; 2 leaves] CV score: 1.1314 1.1160 1.0078 1.1535 Step 10 of 10 [ 2 trees; 2 leaves] CV score: 1.1247 1.1169 1.0817 1.1464 The model size is 3 training-now training-ave test-now test-ave Step 1 of 10 [ 2 trees; 3 leaves] CV score: 1.0445 1.0445 1.1026 1.1026 Step 2 of 10 [ 2 trees; 3 leaves] CV score: 1.0333 1.0389 1.0674 1.0850 Step 3 of 10 [ 2 trees; 3 leaves] CV score: 1.0103 1.0294 1.2605 1.1435 Step 4 of 10 [ 2 trees; 3 leaves] CV score: 1.1219 1.0525 1.2816 1.1780 Step 5 of 10 [ 2 trees; 3 leaves] CV score: 1.0612 1.0542 0.9656 1.1355 Step 6 of 10 [ 2 trees; 3 leaves] CV score: 1.0456 1.0528 0.9461 1.1039 Step 7 of 10 [ 2 trees; 3 leaves] CV score: 1.0387 1.0508 1.0982 1.1031 Step 8 of 10 [ 2 trees; 3 leaves] CV score: 1.0310 1.0483 1.0877 1.1012 Step 9 of 10 [ 2 trees; 3 leaves] CV score: 1.0453 1.0480 0.9524 1.0847 Step 10 of 10 [ 2 trees; 3 leaves] CV score: 1.0406 1.0472 1.0010 1.0763 The model size is 4 training-now training-ave test-now test-ave Step 1 of 10 [ 2 trees; 4 leaves] CV score: 1.0211 1.0211 1.1298 1.1298 Step 2 of 10 [ 2 trees; 4 leaves] CV score: 1.0449 1.0330 1.1066 1.1182 Step 3 of 10 [ 2 trees; 4 leaves] CV score: 0.9700 1.0120 1.1768 1.1378 Step 4 of 10 [ 2 trees; 4 leaves] CV score: 1.0190 1.0137 1.1928 1.1515 Step 5 of 10 [ 2 trees; 4 leaves] CV score: 1.0435 1.0197 1.0136 1.1239


Step 6 of 10 [ 2 trees; 4 leaves] CV score: 0.9963 1.0158 0.9462 1.0943 Step 7 of 10 [ 2 trees; 4 leaves] CV score: 1.0273 1.0174 1.1067 1.0961 Step 8 of 10 [ 2 trees; 4 leaves] CV score: 1.0954 1.0272 1.2332 1.1132 Step 9 of 10 [ 2 trees; 4 leaves] CV score: 1.1314 1.0388 1.0078 1.1015 Step 10 of 10 [ 2 trees; 4 leaves] CV score: 1.0340 1.0383 1.0076 1.0921 The model size is 5 training-now training-ave test-now test-ave Step 1 of 10 [ 2 trees; 5 leaves] CV score: 1.0546 1.0546 1.1528 1.1528 Step 2 of 10 [ 2 trees; 5 leaves] CV score: 1.0469 1.0507 1.0967 1.1248 Step 3 of 10 [ 2 trees; 5 leaves] CV score: 0.9632 1.0216 1.1750 1.1415 Step 4 of 10 [ 2 trees; 5 leaves] CV score: 1.0121 1.0192 1.1907 1.1538 Step 5 of 10 [ 2 trees; 5 leaves] CV score: 1.0439 1.0241 1.0292 1.1289 Step 6 of 10 [ 2 trees; 5 leaves] CV score: 1.1084 1.0382 1.0159 1.1101 Step 7 of 10 [ 2 trees; 5 leaves] CV score: 1.0268 1.0366 1.0927 1.1076 Step 8 of 10 [ 2 trees; 5 leaves] CV score: 1.0370 1.0366 1.1555 1.1136 Step 9 of 10 [ 2 trees; 5 leaves] CV score: 1.0111 1.0338 0.9275 1.0929 Step 10 of 10 [ 2 trees; 5 leaves] CV score: 1.0531 1.0357 1.0478 1.0884 The model size is 6 training-now training-ave test-now test-ave Step 1 of 10 [ 2 trees; 6 leaves] CV score: 1.0248 1.0248 1.0830 1.0830 Step 2 of 10 [ 2 trees; 6 leaves] CV score: 1.0786 1.0517 0.9973 1.0402 Step 3 of 10 [ 2 trees; 6 leaves] CV score: 0.9700 1.0245 1.1768 1.0857 Step 4 of 10 [ 2 trees; 6 leaves] CV score: 1.0984 1.0429 1.3125 1.1424 Step 5 of 10 [ 2 trees; 6 leaves] CV score: 1.0408 1.0425 0.9474 1.1034 Step 6 of 10 [ 2 trees; 6 leaves] CV score: 1.0616 1.0457 0.9591 1.0793 Step 7 of 10 [ 2 trees; 6 leaves] CV score: 1.0443 1.0455 1.1248 1.0858 Step 8 of 10 [ 2 trees; 6 leaves] CV score: 1.0291 1.0435 1.1537 1.0943 Step 9 of 10 [ 2 trees; 6 leaves] CV score: 1.0584 1.0451 1.0985 1.0948 Step 10 of 10 [ 2 trees; 6 leaves] CV score: 0.9916 1.0398 0.9816 1.0835 The model size is 7 training-now training-ave test-now test-ave Step 1 of 10 [ 2 trees; 7 leaves] CV score: 1.0085 1.0085 1.1602 1.1602 Step 2 of 10 [ 2 trees; 7 leaves] CV score: 1.0430 1.0257 1.0993 1.1298 Step 3 of 10 [ 2 trees; 7 leaves] CV score: 1.0770 1.0428 1.3477 1.2024 Step 4 of 10 [ 2 trees; 7 leaves] CV score: 1.0295 1.0395 1.1420 1.1873 Step 5 of 10 [ 2 trees; 7 leaves] CV score: 1.1345 1.0585 0.9763 1.1451


Step 6 of 10 [ 2 trees; 7 leaves] CV score: 0.9921 1.0474 0.9471 1.1121 Step 7 of 10 [ 2 trees; 7 leaves] CV score: 1.0295 1.0449 1.1006 1.1105 Step 8 of 10 [ 2 trees; 7 leaves] CV score: 1.0586 1.0466 1.1192 1.1116 Step 9 of 10 [ 2 trees; 7 leaves] CV score: 1.0643 1.0486 0.9132 1.0895 Step 10 of 10 [ 2 trees; 7 leaves] CV score: 1.0499 1.0487 1.0531 1.0859 > # equivalent > fit3 <- logreg(select = 3, oldfit = fit2) The number of trees in these models is 1 The model size is 1 training-now training-ave test-now test-ave Step 1 of 10 [ 1 trees; 1 leaves] CV score: 1.1435 1.1435 1.2330 1.2330 Step 2 of 10 [ 1 trees; 1 leaves] CV score: 1.1339 1.1387 1.3158 1.2744 Step 3 of 10 [ 1 trees; 1 leaves] CV score: 1.1583 1.1453 1.0990 1.2159 Step 4 of 10 [ 1 trees; 1 leaves] CV score: 1.1596 1.1489 1.0831 1.1827 Step 5 of 10 [ 1 trees; 1 leaves] CV score: 1.1696 1.1530 0.9802 1.1422 Step 6 of 10 [ 1 trees; 1 leaves] CV score: 1.1667 1.1553 1.0095 1.1201 Step 7 of 10 [ 1 trees; 1 leaves] CV score: 1.1494 1.1544 1.1808 1.1288 Step 8 of 10 [ 1 trees; 1 leaves] CV score: 1.1313 1.1515 1.3352 1.1546 Step 9 of 10 [ 1 trees; 1 leaves] CV score: 1.1394 1.1502 1.2671 1.1671 Step 10 of 10 [ 1 trees; 1 leaves] CV score: 1.1480 1.1500 1.1965 1.1700 The model size is 2 training-now training-ave test-now test-ave Step 1 of 10 [ 1 trees; 2 leaves] CV score: 1.1435 1.1435 1.2330 1.2330 Step 2 of 10 [ 1 trees; 2 leaves] CV score: 1.1339 1.1387 1.3158 1.2744 Step 3 of 10 [ 1 trees; 2 leaves] CV score: 1.1583 1.1453 1.0990 1.2159 Step 4 of 10 [ 1 trees; 2 leaves] CV score: 1.1596 1.1489 1.0831 1.1827 Step 5 of 10 [ 1 trees; 2 leaves] CV score: 1.0585 1.1308 0.9710 1.1404 Step 6 of 10 [ 1 trees; 2 leaves] CV score: 1.1667 1.1368 1.0095 1.1186 Step 7 of 10 [ 1 trees; 2 leaves] CV score: 1.0428 1.1234 1.1171 1.1183 Step 8 of 10 [ 1 trees; 2 leaves] CV score: 1.0300 1.1117 1.2235 1.1315 Step 9 of 10 [ 1 trees; 2 leaves] CV score: 1.1394 1.1148 1.2671 1.1466 Step 10 of 10 [ 1 trees; 2 leaves] CV score: 1.1480 1.1181 1.1965 1.1516 The model size is 3 training-now training-ave test-now test-ave Step 1 of 10 [ 1 trees; 3 leaves] CV score: 1.0447 1.0447 1.1018 1.1018 Step 2 of 10 [ 1 trees; 3 leaves] CV score: 1.0333 1.0390 1.2005 1.1511 Step 3 of 10 [ 1 trees; 3 leaves] CV score: 1.0662 1.0481 0.8883 1.0635


Step 4 of 10 [ 1 trees; 3 leaves] CV score: 1.1596 1.0760 1.0831 1.0684 Step 5 of 10 [ 1 trees; 3 leaves] CV score: 1.0585 1.0725 0.9710 1.0489 Step 6 of 10 [ 1 trees; 3 leaves] CV score: 1.0684 1.0718 0.8616 1.0177 Step 7 of 10 [ 1 trees; 3 leaves] CV score: 1.0403 1.0673 1.1159 1.0317 Step 8 of 10 [ 1 trees; 3 leaves] CV score: 1.0300 1.0626 1.2235 1.0557 Step 9 of 10 [ 1 trees; 3 leaves] CV score: 1.0315 1.0592 1.2115 1.0730 Step 10 of 10 [ 1 trees; 3 leaves] CV score: 1.0496 1.0582 1.0581 1.0715 The model size is 4 training-now training-ave test-now test-ave Step 1 of 10 [ 1 trees; 4 leaves] CV score: 1.0447 1.0447 1.1018 1.1018 Step 2 of 10 [ 1 trees; 4 leaves] CV score: 1.0333 1.0390 1.2005 1.1511 Step 3 of 10 [ 1 trees; 4 leaves] CV score: 1.0662 1.0481 0.8883 1.0635 Step 4 of 10 [ 1 trees; 4 leaves] CV score: 1.0552 1.0499 1.0029 1.0484 Step 5 of 10 [ 1 trees; 4 leaves] CV score: 1.0562 1.0511 0.9681 1.0323 Step 6 of 10 [ 1 trees; 4 leaves] CV score: 1.0679 1.0539 0.9561 1.0196 Step 7 of 10 [ 1 trees; 4 leaves] CV score: 1.0403 1.0520 1.1159 1.0334 Step 8 of 10 [ 1 trees; 4 leaves] CV score: 1.0300 1.0492 1.2235 1.0571 Step 9 of 10 [ 1 trees; 4 leaves] CV score: 1.0315 1.0473 1.2115 1.0743 Step 10 of 10 [ 1 trees; 4 leaves] CV score: 1.1480 1.0573 1.1965 1.0865 The model size is 5 training-now training-ave test-now test-ave Step 1 of 10 [ 1 trees; 5 leaves] CV score: 1.0422 1.0422 1.1000 1.1000 Step 2 of 10 [ 1 trees; 5 leaves] CV score: 1.0333 1.0378 1.2005 1.1502 Step 3 of 10 [ 1 trees; 5 leaves] CV score: 1.0612 1.0456 0.9154 1.0720 Step 4 of 10 [ 1 trees; 5 leaves] CV score: 1.0552 1.0480 1.0029 1.0547 Step 5 of 10 [ 1 trees; 5 leaves] CV score: 1.0585 1.0501 0.9710 1.0380 Step 6 of 10 [ 1 trees; 5 leaves] CV score: 1.0684 1.0532 0.8616 1.0086 Step 7 of 10 [ 1 trees; 5 leaves] CV score: 1.1494 1.0669 1.1808 1.0332 Step 8 of 10 [ 1 trees; 5 leaves] CV score: 1.0300 1.0623 1.2235 1.0570 Step 9 of 10 [ 1 trees; 5 leaves] CV score: 1.1360 1.0705 1.2672 1.0803 Step 10 of 10 [ 1 trees; 5 leaves] CV score: 1.1425 1.0777 1.1987 1.0921 The model size is 6 training-now training-ave test-now test-ave Step 1 of 10 [ 1 trees; 6 leaves] CV score: 1.1435 1.1435 1.2330 1.2330 Step 2 of 10 [ 1 trees; 6 leaves] CV score: 1.0305 1.0870 1.2017 1.2173 Step 3 of 10 [ 1 trees; 6 leaves] CV score: 1.0596 1.0779 0.8895 1.1081


Step 4 of 10 [ 1 trees; 6 leaves] CV score: 1.1596 1.0983 1.0831 1.1018 Step 5 of 10 [ 1 trees; 6 leaves] CV score: 1.0585 1.0903 0.9710 1.0757 Step 6 of 10 [ 1 trees; 6 leaves] CV score: 1.0658 1.0863 0.8619 1.0400 Step 7 of 10 [ 1 trees; 6 leaves] CV score: 1.0362 1.0791 1.1163 1.0509 Step 8 of 10 [ 1 trees; 6 leaves] CV score: 1.1313 1.0856 1.3352 1.0865 Step 9 of 10 [ 1 trees; 6 leaves] CV score: 1.1394 1.0916 1.2671 1.1065 Step 10 of 10 [ 1 trees; 6 leaves] CV score: 1.1480 1.0972 1.1965 1.1155 The model size is 7 training-now training-ave test-now test-ave Step 1 of 10 [ 1 trees; 7 leaves] CV score: 1.0447 1.0447 1.1018 1.1018 Step 2 of 10 [ 1 trees; 7 leaves] CV score: 1.0333 1.0390 1.2005 1.1511 Step 3 of 10 [ 1 trees; 7 leaves] CV score: 1.0662 1.0481 0.8883 1.0635 Step 4 of 10 [ 1 trees; 7 leaves] CV score: 1.0552 1.0499 1.0029 1.0484 Step 5 of 10 [ 1 trees; 7 leaves] CV score: 1.0562 1.0511 0.9681 1.0323 Step 6 of 10 [ 1 trees; 7 leaves] CV score: 1.0650 1.0534 0.9563 1.0197 Step 7 of 10 [ 1 trees; 7 leaves] CV score: 1.0428 1.0519 1.1171 1.0336 Step 8 of 10 [ 1 trees; 7 leaves] CV score: 1.1313 1.0618 1.3352 1.0713 Step 9 of 10 [ 1 trees; 7 leaves] CV score: 1.0315 1.0585 1.2115 1.0869 Step 10 of 10 [ 1 trees; 7 leaves] CV score: 1.0467 1.0573 1.0612 1.0843 The number of trees in these models is 2 The size for this model is smaller than the number of trees you requested. ( 1 versus 2) To save CPU time, we will skip this run. On to the next model... The model size is 2 training-now training-ave test-now test-ave Step 1 of 10 [ 2 trees; 2 leaves] CV score: 1.1102 1.1102 1.2122 1.2122 Step 2 of 10 [ 2 trees; 2 leaves] CV score: 1.0998 1.1050 1.3032 1.2577 Step 3 of 10 [ 2 trees; 2 leaves] CV score: 1.1324 1.1141 1.0046 1.1733 Step 4 of 10 [ 2 trees; 2 leaves] CV score: 1.1252 1.1169 1.0725 1.1481 Step 5 of 10 [ 2 trees; 2 leaves] CV score: 1.1325 1.1200 1.0003 1.1186 Step 6 of 10 [ 2 trees; 2 leaves] CV score: 1.1363 1.1227 0.9552 1.0913 Step 7 of 10 [ 2 trees; 2 leaves] CV score: 1.1161 1.1218 1.1596 1.1011 Step 8 of 10 [ 2 trees; 2 leaves] CV score: 1.0986 1.1189 1.3098 1.1272 Step 9 of 10 [ 2 trees; 2 leaves] CV score: 1.1016 1.1170 1.2859 1.1448 Step 10 of 10 [ 2 trees; 2 leaves] CV score: 1.1160 1.1169 1.1637 1.1467 The model size is 3


training-now training-ave test-now test-ave Step 1 of 10 [ 2 trees; 3 leaves] CV score: 1.0350 1.0350 1.0569 1.0569 Step 2 of 10 [ 2 trees; 3 leaves] CV score: 1.0169 1.0260 1.2110 1.1340 Step 3 of 10 [ 2 trees; 3 leaves] CV score: 1.1277 1.0599 1.0328 1.1002 Step 4 of 10 [ 2 trees; 3 leaves] CV score: 1.0402 1.0550 1.0017 1.0756 Step 5 of 10 [ 2 trees; 3 leaves] CV score: 1.1325 1.0705 1.0003 1.0605 Step 6 of 10 [ 2 trees; 3 leaves] CV score: 1.0486 1.0668 0.9200 1.0371 Step 7 of 10 [ 2 trees; 3 leaves] CV score: 1.0289 1.0614 1.1063 1.0470 Step 8 of 10 [ 2 trees; 3 leaves] CV score: 1.0167 1.0558 1.2099 1.0674 Step 9 of 10 [ 2 trees; 3 leaves] CV score: 1.0163 1.0514 1.2115 1.0834 Step 10 of 10 [ 2 trees; 3 leaves] CV score: 1.1074 1.0570 1.2354 1.0986 The model size is 4 training-now training-ave test-now test-ave Step 1 of 10 [ 2 trees; 4 leaves] CV score: 1.1102 1.1102 1.2122 1.2122 Step 2 of 10 [ 2 trees; 4 leaves] CV score: 1.0161 1.0632 1.2198 1.2160 Step 3 of 10 [ 2 trees; 4 leaves] CV score: 1.0590 1.0618 0.8906 1.1075 Step 4 of 10 [ 2 trees; 4 leaves] CV score: 0.9951 1.0451 0.9596 1.0706 Step 5 of 10 [ 2 trees; 4 leaves] CV score: 1.0430 1.0447 0.9606 1.0486 Step 6 of 10 [ 2 trees; 4 leaves] CV score: 1.0466 1.0450 0.9290 1.0286 Step 7 of 10 [ 2 trees; 4 leaves] CV score: 1.0250 1.0421 1.0888 1.0372 Step 8 of 10 [ 2 trees; 4 leaves] CV score: 1.0154 1.0388 1.2218 1.0603 Step 9 of 10 [ 2 trees; 4 leaves] CV score: 1.0163 1.0363 1.2327 1.0795 Step 10 of 10 [ 2 trees; 4 leaves] CV score: 1.1290 1.0456 1.2121 1.0927 The model size is 5 training-now training-ave test-now test-ave Step 1 of 10 [ 2 trees; 5 leaves] CV score: 1.1032 1.1032 1.2172 1.2172 Step 2 of 10 [ 2 trees; 5 leaves] CV score: 1.0096 1.0564 1.2132 1.2152 Step 3 of 10 [ 2 trees; 5 leaves] CV score: 1.0616 1.0581 0.8922 1.1075 Step 4 of 10 [ 2 trees; 5 leaves] CV score: 1.1164 1.0727 1.0958 1.1046 Step 5 of 10 [ 2 trees; 5 leaves] CV score: 1.0454 1.0672 1.0031 1.0843 Step 6 of 10 [ 2 trees; 5 leaves] CV score: 1.0442 1.0634 0.9317 1.0589 Step 7 of 10 [ 2 trees; 5 leaves] CV score: 0.9878 1.0526 1.0277 1.0544 Step 8 of 10 [ 2 trees; 5 leaves] CV score: 0.9754 1.0429 1.1375 1.0648 Step 9 of 10 [ 2 trees; 5 leaves] CV score: 1.0163 1.0400 1.2115 1.0811 Step 10 of 10 [ 2 trees; 5 leaves] CV score: 1.0316 1.0392 1.0304 1.0760 The model size is 6


training-now training-ave test-now test-ave Step 1 of 10 [ 2 trees; 6 leaves] CV score: 0.9804 0.9804 1.0341 1.0341 Step 2 of 10 [ 2 trees; 6 leaves] CV score: 0.9752 0.9778 1.1627 1.0984 Step 3 of 10 [ 2 trees; 6 leaves] CV score: 1.0675 1.0077 1.0389 1.0785 Step 4 of 10 [ 2 trees; 6 leaves] CV score: 1.0396 1.0157 0.9963 1.0580 Step 5 of 10 [ 2 trees; 6 leaves] CV score: 1.0435 1.0212 0.9707 1.0405 Step 6 of 10 [ 2 trees; 6 leaves] CV score: 1.1363 1.0404 0.9552 1.0263 Step 7 of 10 [ 2 trees; 6 leaves] CV score: 0.9878 1.0329 1.0277 1.0265 Step 8 of 10 [ 2 trees; 6 leaves] CV score: 1.0115 1.0302 1.2104 1.0495 Step 9 of 10 [ 2 trees; 6 leaves] CV score: 1.0229 1.0294 1.2071 1.0670 Step 10 of 10 [ 2 trees; 6 leaves] CV score: 1.0321 1.0297 1.0453 1.0648 The model size is 7 training-now training-ave test-now test-ave Step 1 of 10 [ 2 trees; 7 leaves] CV score: 1.0300 1.0300 1.1133 1.1133 Step 2 of 10 [ 2 trees; 7 leaves] CV score: 1.1027 1.0664 1.3097 1.2115 Step 3 of 10 [ 2 trees; 7 leaves] CV score: 0.9942 1.0423 0.9755 1.1329 Step 4 of 10 [ 2 trees; 7 leaves] CV score: 1.0563 1.0458 1.0127 1.1028 Step 5 of 10 [ 2 trees; 7 leaves] CV score: 1.0375 1.0441 0.9599 1.0742 Step 6 of 10 [ 2 trees; 7 leaves] CV score: 1.0671 1.0480 0.9696 1.0568 Step 7 of 10 [ 2 trees; 7 leaves] CV score: 1.1026 1.0558 1.1618 1.0718 Step 8 of 10 [ 2 trees; 7 leaves] CV score: 0.9695 1.0450 1.1293 1.0790 Step 9 of 10 [ 2 trees; 7 leaves] CV score: 1.0209 1.0423 1.2183 1.0945 Step 10 of 10 [ 2 trees; 7 leaves] CV score: 1.0494 1.0430 1.0766 1.0927 > plot(fit3) > # use logreg.savefit3 for the results with 25000 iterations > plot(logreg.savefit3) > # 4 leaves, 2 trees should top
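
A minimal sketch (values copied from the final "test-ave" column of the second cross-validation run above; with only 500 annealing iterations the ranking is noisy, which is why the saved 25000-iteration fit is used for the plot): comparing the average cross-validated test scores of the two-tree models by size.

leaves <- 2:7
test.ave <- c(1.1467, 1.0986, 1.0927, 1.0760, 1.0648, 1.0927)
plot(leaves, test.ave, type = "b",
     xlab = "number of leaves (2 trees)", ylab = "cross-validated test score")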


> # null model test > fit4 <- logreg(resp = logreg.testdat[,1], bin=logreg.testdat[, 2:21], type = 2, select = 4, ntrees = 2, anneal.control = myanneal2) The model of size 0 has score 1.4941 The best model has score 1.0316 Permutation number 1 out of 25 has score 1.4720 Permutation number 2 out of 25 has score 1.4694 Permutation number 3 out of 25 has score 1.4574 Permutation number 4 out of 25 has score 1.4686 Permutation number 5 out of 25 has score 1.4548 Permutation number 6 out of 25 has score 1.4683 Permutation number 7 out of 25 has score 1.4756 Permutation number 8 out of 25 has score 1.4670 Permutation number 9 out of 25 has score 1.4676 Permutation number 10 out of 25 has score 1.4718 Permutation number 11 out of 25 has score 1.4744 Permutation number 12 out of 25 has score 1.4681 Permutation number 13 out of 25 has score 1.4724 Permutation number 14 out of 25 has score 1.4722 Permutation number 15 out of 25 has score 1.4610 Permutation number 16 out of 25 has score 1.4630 Permutation number 17 out of 25 has score 1.4519 Permutation number 18 out of 25 has score 1.4728 Permutation number 19 out of 25 has score 1.4778 Permutation number 20 out of 25 has score 1.4655 Permutation number 21 out of 25 has score 1.4594 Permutation number 22 out of 25 has score 1.4541 Permutation number 23 out of 25 has score 1.4580 Permutation number 24 out of 25 has score 1.4569 Permutation number 25 out of 25 has score 1.4673 > # equivalent > fit4 <- logreg(select = 4, anneal.control = myanneal2, oldfit = fit1) The model of size 0 has score 1.4941 The best model has score 1.0333 Permutation number 1 out of 25 has score 1.4673 Permutation number 2 out of 25 has score 1.4694 Permutation number 3 out of 25 has score 1.4654 Permutation number 4 out of 25 has score 1.4550 Permutation number 5 out of 25 has score 1.4728 Permutation number 6 out of 25 has score 1.4649 Permutation number 7 out of 25 has score 1.4573 Permutation number 8 out of 25 has score 1.4679 Permutation number 9 out of 25 has score 1.4563


Permutation number 10 out of 25 has score 1.4664 Permutation number 11 out of 25 has score 1.4565 Permutation number 12 out of 25 has score 1.4577 Permutation number 13 out of 25 has score 1.4686 Permutation number 14 out of 25 has score 1.4692 Permutation number 15 out of 25 has score 1.4687 Permutation number 16 out of 25 has score 1.4578 Permutation number 17 out of 25 has score 1.4550 Permutation number 18 out of 25 has score 1.4560 Permutation number 19 out of 25 has score 1.4531 Permutation number 20 out of 25 has score 1.4741 Permutation number 21 out of 25 has score 1.4614 Permutation number 22 out of 25 has score 1.4642 Permutation number 23 out of 25 has score 1.4728 Permutation number 24 out of 25 has score 1.4721 Permutation number 25 out of 25 has score 1.4709 > plot(fit4) > # use logreg.savefit4 for the results with 25000 iterations > plot(logreg.savefit4)
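
A minimal sketch (the permutation scores are copied from the second fit4 run above): because lower scores are better, the null-model permutation test can be summarised as the proportion of permuted data sets that score at least as well as the model fitted on the original data.

perm.scores <- c(1.4673, 1.4694, 1.4654, 1.4550, 1.4728, 1.4649, 1.4573,
                 1.4679, 1.4563, 1.4664, 1.4565, 1.4577, 1.4686, 1.4692,
                 1.4687, 1.4578, 1.4550, 1.4560, 1.4531, 1.4741, 1.4614,
                 1.4642, 1.4728, 1.4721, 1.4709)
best.score <- 1.0333                 # best model score reported above
mean(perm.scores <= best.score)      # proportion of permutations beating the real fit: 0 here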


> # A histogram of the 25 scores obtained from the permutation test. Also shown > # are the scores for the best scoring model with one logic tree, and the null > # model (no tree). Since the permutation scores are not even close to the score > # of the best model with one tree (fit on the original data), there is overwhelming > # evidence against the null hypothesis that there was no signal in the data. > fit5 <- logreg(resp = logreg.testdat[,1], bin=logreg.testdat[, 2:21], type = 2, select = 5, ntrees = c(1,2), nleaves=c(1,7), anneal.control = myanneal2, nrep = 10, oldfit = fit2) The model of size 0 has score 1.4941 The number of trees in these models is 1 The model size is 1 Permutation number 1 out of 10 has score 1.133 model size with 1 tree(s) Permutation number 2 out of 10 has score 1.144 model size with 1 tree(s) Permutation number 3 out of 10 has score 1.132 model size with 1 tree(s) Permutation number 4 out of 10 has score 1.140 model size with 1 tree(s) Permutation number 5 out of 10 has score 1.138 model size with 1 tree(s) Permutation number 6 out of 10 has score 1.137 model size with 1 tree(s) Permutation number 7 out of 10 has score 1.142 model size with 1 tree(s) Permutation number 8 out of 10 has score 1.135 model size with 1 tree(s) Permutation number 9 out of 10 has score 1.142 model size with 1 tree(s) Permutation number 10 out of 10 has score 1.141 model size with 1 tree(s) The model size is 2 Permutation number 1 out of 10 has score 1.044 model size with 1 tree(s) Permutation number 2 out of 10 has score 1.046 model size with 1 tree(s) Permutation number 3 out of 10 has score 1.079 model size with 1 tree(s) Permutation number 4 out of 10 has score 1.097 model size with 1 tree(s)


Permutation number 5 out of 10 has score 1.126 model size with 1 tree(s) Permutation number 6 out of 10 has score 1.038 model size with 1 tree(s) Permutation number 7 out of 10 has score 1.040 model size with 1 tree(s) Permutation number 8 out of 10 has score 1.039 model size with 1 tree(s) Permutation number 9 out of 10 has score 1.040 model size with 1 tree(s) Permutation number 10 out of 10 has score 1.032 model size with 1 tree(s) The model size is 3 Permutation number 1 out of 10 has score 1.043 model size with 1 tree(s) Permutation number 2 out of 10 has score 1.039 model size with 1 tree(s) Permutation number 3 out of 10 has score 1.037 model size with 1 tree(s) Permutation number 4 out of 10 has score 1.054 model size with 1 tree(s) Permutation number 5 out of 10 has score 1.041 model size with 1 tree(s) Permutation number 6 out of 10 has score 1.036 model size with 1 tree(s) Permutation number 7 out of 10 has score 1.025 model size with 1 tree(s) Permutation number 8 out of 10 has score 1.026 model size with 1 tree(s) Permutation number 9 out of 10 has score 1.044 model size with 1 tree(s) Permutation number 10 out of 10 has score 1.091 model size with 1 tree(s) The model size is 4 Permutation number 1 out of 10 has score 1.139 model size with 1 tree(s) Permutation number 2 out of 10 has score 1.137 model size with 1 tree(s) Permutation number 3 out of 10 has score 1.137 model size with 1 tree(s) Permutation number 4 out of 10 has score 1.137 model size with 1 tree(s) Permutation number 5 out of 10 has score 1.140 model size with 1 tree(s) Permutation number 6 out of 10 has score 1.142 model size with 1 tree(s) Permutation number 7 out of 10 has score 1.142 model size with 1 tree(s) Permutation number 8 out of 10 has score 1.133 model size with 1 tree(s) Permutation number 9 out of 10 has score 1.145 model size with 1 tree(s) Permutation number 10 out of 10 has score 1.132 model size with 1 tree(s) The model size is 5 Permutation number 1 out of 10 has score 1.048 model size with 1 tree(s) Permutation number 2 out of 10 has score 1.046 model size with 1 tree(s) Permutation number 3 out of 10 has score 1.052 model size with 1 tree(s) Permutation number 4 out of 10 has score 1.025 model size with 1 tree(s) Permutation number 5 out of 10 has score 1.041 model size with 1 tree(s) Permutation number 6 out of 10 has score 1.095 model size with 1 tree(s) Permutation number 7 out of 10 has score 1.035 model size with 1 tree(s)


Permutation number 8 out of 10 has score 1.055 model size with 1 tree(s) Permutation number 9 out of 10 has score 1.099 model size with 1 tree(s) Permutation number 10 out of 10 has score 1.036 model size with 1 tree(s) The model size is 6 Permutation number 1 out of 10 has score 1.121 model size with 1 tree(s) Permutation number 2 out of 10 has score 1.038 model size with 1 tree(s) Permutation number 3 out of 10 has score 1.040 model size with 1 tree(s) Permutation number 4 out of 10 has score 1.041 model size with 1 tree(s) Permutation number 5 out of 10 has score 1.044 model size with 1 tree(s) Permutation number 6 out of 10 has score 1.090 model size with 1 tree(s) Permutation number 7 out of 10 has score 1.036 model size with 1 tree(s) Permutation number 8 out of 10 has score 1.038 model size with 1 tree(s) Permutation number 9 out of 10 has score 1.036 model size with 1 tree(s) Permutation number 10 out of 10 has score 1.037 model size with 1 tree(s) The model size is 7 Permutation number 1 out of 10 has score 1.036 model size with 1 tree(s) Permutation number 2 out of 10 has score 1.042 model size with 1 tree(s) Permutation number 3 out of 10 has score 1.034 model size with 1 tree(s) Permutation number 4 out of 10 has score 1.038 model size with 1 tree(s) Permutation number 5 out of 10 has score 1.041 model size with 1 tree(s) Permutation number 6 out of 10 has score 1.043 model size with 1 tree(s) Permutation number 7 out of 10 has score 1.048 model size with 1 tree(s) Permutation number 8 out of 10 has score 1.078 model size with 1 tree(s) Permutation number 9 out of 10 has score 1.036 model size with 1 tree(s) Permutation number 10 out of 10 has score 1.047 model size with 1 tree(s) The number of trees in these models is 2 The size for this model is smaller than the number of trees you requested. ( 1 versus 2) To save CPU time, we will skip this run. On to the next model... The model size is 2 Permutation number 1 out of 10 has score 1.035 model size with 2 tree(s) Permutation number 2 out of 10 has score 1.042 model size with 2 tree(s) Permutation number 3 out of 10 has score 1.039 model size with 2 tree(s) Permutation number 4 out of 10 has score 1.042 model size with 2 tree(s) Permutation number 5 out of 10 has score 1.035 model size with 2 tree(s) Permutation number 6 out of 10 has score 1.044 model size with 2 tree(s)


Permutation number 7 out of 10 has score 1.048 model size with 2 tree(s) Permutation number 8 out of 10 has score 1.044 model size with 2 tree(s) Permutation number 9 out of 10 has score 1.048 model size with 2 tree(s) Permutation number 10 out of 10 has score 1.040 model size with 2 tree(s) The model size is 3 Permutation number 1 out of 10 has score 1.030 model size with 2 tree(s) Permutation number 2 out of 10 has score 1.031 model size with 2 tree(s) Permutation number 3 out of 10 has score 1.031 model size with 2 tree(s) Permutation number 4 out of 10 has score 1.041 model size with 2 tree(s) Permutation number 5 out of 10 has score 1.027 model size with 2 tree(s) Permutation number 6 out of 10 has score 1.033 model size with 2 tree(s) Permutation number 7 out of 10 has score 1.089 model size with 2 tree(s) Permutation number 8 out of 10 has score 1.088 model size with 2 tree(s) Permutation number 9 out of 10 has score 1.091 model size with 2 tree(s) Permutation number 10 out of 10 has score 1.105 model size with 2 tree(s) The model size is 4 Permutation number 1 out of 10 has score 1.036 model size with 2 tree(s) Permutation number 2 out of 10 has score 1.044 model size with 2 tree(s) Permutation number 3 out of 10 has score 1.038 model size with 2 tree(s) Permutation number 4 out of 10 has score 1.044 model size with 2 tree(s) Permutation number 5 out of 10 has score 1.041 model size with 2 tree(s) Permutation number 6 out of 10 has score 1.041 model size with 2 tree(s) Permutation number 7 out of 10 has score 1.037 model size with 2 tree(s) Permutation number 8 out of 10 has score 1.027 model size with 2 tree(s) Permutation number 9 out of 10 has score 1.041 model size with 2 tree(s) Permutation number 10 out of 10 has score 1.037 model size with 2 tree(s) The model size is 5 Permutation number 1 out of 10 has score 1.088 model size with 2 tree(s) Permutation number 2 out of 10 has score 1.047 model size with 2 tree(s) Permutation number 3 out of 10 has score 1.041 model size with 2 tree(s) Permutation number 4 out of 10 has score 1.047 model size with 2 tree(s) Permutation number 5 out of 10 has score 1.114 model size with 2 tree(s) Permutation number 6 out of 10 has score 1.054 model size with 2 tree(s) Permutation number 7 out of 10 has score 1.044 model size with 2 tree(s) Permutation number 8 out of 10 has score 1.040 model size with 2 tree(s) Permutation number 9 out of 10 has score 1.040 model size with 2 tree(s)


Permutation number 10 out of 10 has score 1.038 model size with 2 tree(s) The model size is 6 Permutation number 1 out of 10 has score 1.030 model size with 2 tree(s) Permutation number 2 out of 10 has score 1.060 model size with 2 tree(s) Permutation number 3 out of 10 has score 1.035 model size with 2 tree(s) Permutation number 4 out of 10 has score 1.041 model size with 2 tree(s) Permutation number 5 out of 10 has score 1.042 model size with 2 tree(s) Permutation number 6 out of 10 has score 1.046 model size with 2 tree(s) Permutation number 7 out of 10 has score 1.044 model size with 2 tree(s) Permutation number 8 out of 10 has score 1.049 model size with 2 tree(s) Permutation number 9 out of 10 has score 1.033 model size with 2 tree(s) Permutation number 10 out of 10 has score 1.049 model size with 2 tree(s) The model size is 7 Permutation number 1 out of 10 has score 1.099 model size with 2 tree(s) Permutation number 2 out of 10 has score 1.070 model size with 2 tree(s) Permutation number 3 out of 10 has score 1.061 model size with 2 tree(s) Permutation number 4 out of 10 has score 1.116 model size with 2 tree(s) Permutation number 5 out of 10 has score 1.075 model size with 2 tree(s) Permutation number 6 out of 10 has score 1.080 model size with 2 tree(s) Permutation number 7 out of 10 has score 1.083 model size with 2 tree(s) Permutation number 8 out of 10 has score 1.140 model size with 2 tree(s) Permutation number 9 out of 10 has score 1.100 model size with 2 tree(s) Permutation number 10 out of 10 has score 1.136 model size with 2 tree(s) > # equivalent > fit5 <- logreg(select = 5, nrep = 10, oldfit = fit2) The model of size 0 has score 1.4941 The number of trees in these models is 1 The model size is 1 Permutation number 1 out of 10 has score 1.136 model size with 1 tree(s) Permutation number 2 out of 10 has score 1.120 model size with 1 tree(s) Permutation number 3 out of 10 has score 1.128 model size with 1 tree(s) Permutation number 4 out of 10 has score 1.142 model size with 1 tree(s) Permutation number 5 out of 10 has score 1.138 model size with 1 tree(s) Permutation number 6 out of 10 has score 1.137 model size with 1 tree(s) Permutation number 7 out of 10 has score 1.140 model size with 1 tree(s) Permutation number 8 out of 10 has score 1.141 model size with 1 tree(s) Permutation number 9 out of 10 has score 1.138 model size with 1 tree(s)


Permutation number 10 out of 10 has score 1.132 model size with 1 tree(s) The model size is 2 Permutation number 1 out of 10 has score 1.041 model size with 1 tree(s) Permutation number 2 out of 10 has score 1.040 model size with 1 tree(s) Permutation number 3 out of 10 has score 1.041 model size with 1 tree(s) Permutation number 4 out of 10 has score 1.030 model size with 1 tree(s) Permutation number 5 out of 10 has score 1.041 model size with 1 tree(s) Permutation number 6 out of 10 has score 1.049 model size with 1 tree(s) Permutation number 7 out of 10 has score 1.045 model size with 1 tree(s) Permutation number 8 out of 10 has score 1.047 model size with 1 tree(s) Permutation number 9 out of 10 has score 1.130 model size with 1 tree(s) Permutation number 10 out of 10 has score 1.043 model size with 1 tree(s) The model size is 3 Permutation number 1 out of 10 has score 1.042 model size with 1 tree(s) Permutation number 2 out of 10 has score 1.035 model size with 1 tree(s) Permutation number 3 out of 10 has score 1.041 model size with 1 tree(s) Permutation number 4 out of 10 has score 1.039 model size with 1 tree(s) Permutation number 5 out of 10 has score 1.039 model size with 1 tree(s) Permutation number 6 out of 10 has score 1.048 model size with 1 tree(s) Permutation number 7 out of 10 has score 1.035 model size with 1 tree(s) Permutation number 8 out of 10 has score 1.037 model size with 1 tree(s) Permutation number 9 out of 10 has score 1.042 model size with 1 tree(s) Permutation number 10 out of 10 has score 1.053 model size with 1 tree(s) The model size is 4 Permutation number 1 out of 10 has score 1.135 model size with 1 tree(s) Permutation number 2 out of 10 has score 1.138 model size with 1 tree(s) Permutation number 3 out of 10 has score 1.143 model size with 1 tree(s) Permutation number 4 out of 10 has score 1.134 model size with 1 tree(s) Permutation number 5 out of 10 has score 1.120 model size with 1 tree(s) Permutation number 6 out of 10 has score 1.140 model size with 1 tree(s) Permutation number 7 out of 10 has score 1.137 model size with 1 tree(s) Permutation number 8 out of 10 has score 1.142 model size with 1 tree(s) Permutation number 9 out of 10 has score 1.130 model size with 1 tree(s) Permutation number 10 out of 10 has score 1.143 model size with 1 tree(s) The model size is 5 Permutation number 1 out of 10 has score 1.023 model size with 1 tree(s)


Permutation number 2 out of 10 has score 1.050 model size with 1 tree(s) Permutation number 3 out of 10 has score 1.083 model size with 1 tree(s) Permutation number 4 out of 10 has score 1.041 model size with 1 tree(s) Permutation number 5 out of 10 has score 1.062 model size with 1 tree(s) Permutation number 6 out of 10 has score 1.042 model size with 1 tree(s) Permutation number 7 out of 10 has score 1.048 model size with 1 tree(s) Permutation number 8 out of 10 has score 1.040 model size with 1 tree(s) Permutation number 9 out of 10 has score 1.051 model size with 1 tree(s) Permutation number 10 out of 10 has score 1.082 model size with 1 tree(s) The model size is 6 Permutation number 1 out of 10 has score 1.032 model size with 1 tree(s) Permutation number 2 out of 10 has score 1.094 model size with 1 tree(s) Permutation number 3 out of 10 has score 1.037 model size with 1 tree(s) Permutation number 4 out of 10 has score 1.049 model size with 1 tree(s) Permutation number 5 out of 10 has score 1.068 model size with 1 tree(s) Permutation number 6 out of 10 has score 1.029 model size with 1 tree(s) Permutation number 7 out of 10 has score 1.041 model size with 1 tree(s) Permutation number 8 out of 10 has score 1.035 model size with 1 tree(s) Permutation number 9 out of 10 has score 1.047 model size with 1 tree(s) Permutation number 10 out of 10 has score 1.043 model size with 1 tree(s) The model size is 7 Permutation number 1 out of 10 has score 1.043 model size with 1 tree(s) Permutation number 2 out of 10 has score 1.045 model size with 1 tree(s) Permutation number 3 out of 10 has score 1.042 model size with 1 tree(s) Permutation number 4 out of 10 has score 1.041 model size with 1 tree(s) Permutation number 5 out of 10 has score 1.048 model size with 1 tree(s) Permutation number 6 out of 10 has score 1.029 model size with 1 tree(s) Permutation number 7 out of 10 has score 1.117 model size with 1 tree(s) Permutation number 8 out of 10 has score 1.042 model size with 1 tree(s) Permutation number 9 out of 10 has score 1.046 model size with 1 tree(s) Permutation number 10 out of 10 has score 1.040 model size with 1 tree(s) The number of trees in these models is 2 The size for this model is smaller than the number of trees you requested. ( 1 versus 2) To save CPU time, we will skip this run. On to the next model... The model size is 2


Permutation number 1 out of 10 has score 1.041 model size with 2 tree(s) Permutation number 2 out of 10 has score 1.039 model size with 2 tree(s) Permutation number 3 out of 10 has score 1.040 model size with 2 tree(s) Permutation number 4 out of 10 has score 1.027 model size with 2 tree(s) Permutation number 5 out of 10 has score 1.044 model size with 2 tree(s) Permutation number 6 out of 10 has score 1.045 model size with 2 tree(s) Permutation number 7 out of 10 has score 1.038 model size with 2 tree(s) Permutation number 8 out of 10 has score 1.048 model size with 2 tree(s) Permutation number 9 out of 10 has score 1.042 model size with 2 tree(s) Permutation number 10 out of 10 has score 1.048 model size with 2 tree(s) The model size is 3 Permutation number 1 out of 10 has score 1.048 model size with 2 tree(s) Permutation number 2 out of 10 has score 1.049 model size with 2 tree(s) Permutation number 3 out of 10 has score 1.033 model size with 2 tree(s) Permutation number 4 out of 10 has score 1.018 model size with 2 tree(s) Permutation number 5 out of 10 has score 1.035 model size with 2 tree(s) Permutation number 6 out of 10 has score 1.045 model size with 2 tree(s) Permutation number 7 out of 10 has score 1.111 model size with 2 tree(s) Permutation number 8 out of 10 has score 1.048 model size with 2 tree(s) Permutation number 9 out of 10 has score 1.110 model size with 2 tree(s) Permutation number 10 out of 10 has score 1.042 model size with 2 tree(s) The model size is 4 Permutation number 1 out of 10 has score 1.047 model size with 2 tree(s) Permutation number 2 out of 10 has score 1.107 model size with 2 tree(s) Permutation number 3 out of 10 has score 1.071 model size with 2 tree(s) Permutation number 4 out of 10 has score 1.040 model size with 2 tree(s) Permutation number 5 out of 10 has score 1.039 model size with 2 tree(s) Permutation number 6 out of 10 has score 1.036 model size with 2 tree(s) Permutation number 7 out of 10 has score 1.037 model size with 2 tree(s) Permutation number 8 out of 10 has score 1.045 model size with 2 tree(s) Permutation number 9 out of 10 has score 1.030 model size with 2 tree(s) Permutation number 10 out of 10 has score 1.044 model size with 2 tree(s) The model size is 5 Permutation number 1 out of 10 has score 1.083 model size with 2 tree(s) Permutation number 2 out of 10 has score 1.036 model size with 2 tree(s) Permutation number 3 out of 10 has score 1.048 model size with 2 tree(s)


Permutation number 4 out of 10 has score 1.106 model size with 2 tree(s) Permutation number 5 out of 10 has score 1.048 model size with 2 tree(s) Permutation number 6 out of 10 has score 1.038 model size with 2 tree(s) Permutation number 7 out of 10 has score 1.053 model size with 2 tree(s) Permutation number 8 out of 10 has score 1.045 model size with 2 tree(s) Permutation number 9 out of 10 has score 1.042 model size with 2 tree(s) Permutation number 10 out of 10 has score 1.115 model size with 2 tree(s) The model size is 6 Permutation number 1 out of 10 has score 1.048 model size with 2 tree(s) Permutation number 2 out of 10 has score 1.040 model size with 2 tree(s) Permutation number 3 out of 10 has score 1.037 model size with 2 tree(s) Permutation number 4 out of 10 has score 1.045 model size with 2 tree(s) Permutation number 5 out of 10 has score 1.083 model size with 2 tree(s) Permutation number 6 out of 10 has score 1.040 model size with 2 tree(s) Permutation number 7 out of 10 has score 1.032 model size with 2 tree(s) Permutation number 8 out of 10 has score 1.044 model size with 2 tree(s) Permutation number 9 out of 10 has score 1.041 model size with 2 tree(s) Permutation number 10 out of 10 has score 1.040 model size with 2 tree(s) The model size is 7 Permutation number 1 out of 10 has score 1.082 model size with 2 tree(s) Permutation number 2 out of 10 has score 1.084 model size with 2 tree(s) Permutation number 3 out of 10 has score 1.081 model size with 2 tree(s) Permutation number 4 out of 10 has score 1.085 model size with 2 tree(s) Permutation number 5 out of 10 has score 1.126 model size with 2 tree(s) Permutation number 6 out of 10 has score 1.100 model size with 2 tree(s) Permutation number 7 out of 10 has score 1.138 model size with 2 tree(s) Permutation number 8 out of 10 has score 1.067 model size with 2 tree(s) Permutation number 9 out of 10 has score 1.069 model size with 2 tree(s) Permutation number 10 out of 10 has score 1.083 model size with 2 tree(s) > plot(fit5) > # use logreg.savefit5 for the results with 25000 iterations and 25 permutations > plot(logreg.savefit5) > # The permutation scores improve until we condition on a model with two trees and > # four leaves, and then do not change very much anymore. This indicates that the > # best model has indeed four leaves. > #
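The comments above conclude that the permutation scores stop improving once the null model is conditioned on two trees and four leaves. A minimal sketch of how the longer run referenced in those comments (25000 annealing iterations, 25 permutations) could be reproduced; the object names fit5.long and myanneal2 are assumptions here (myanneal2 is re-created to match the control object used earlier in this document), while the individual arguments all appear elsewhere in this section:

myanneal2 <- logreg.anneal.control(start = -1, end = -4, iter = 25000, update = 0)
fit5.long <- logreg(select = 5, nrep = 25, oldfit = fit2, anneal.control = myanneal2)
plot(fit5.long)   # permutation score distributions by conditioning model size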


> # greedy selection
> fit6 <- logreg(select = 6, ntrees = 2, nleaves =c(1,12), oldfit = fit1)
Model 0 has a score of 1.4941
Model 1 has a score of 1.1500
Model 2 has a score of 1.0481
Model 3 has a score of 1.0333
Model 4 has a score of 0.9885
Model 5 has a score of 0.9823
Model 6 has a score of 0.9792
Model 7 has a score of 0.9783
Model 8 has a score of 0.9733
No further improvement possible
> plot(fit6)
empty tree
empty tree
empty tree
empty tree
> # use logreg.savefit6 for the results with 25000 iterations
> plot(logreg.savefit6)
empty tree
empty tree
empty tree
empty tree
> #
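Read side by side with the multi-model run printed later (print(logreg.savefit2)), the greedy scores above track the separately annealed two-tree models closely. A small base-R comparison, with the scores copied from the transcripts in this section (illustrative values only, not a new computation):

greedy   <- c(1.0481, 1.0333, 0.9885, 0.9823, 0.9792, 0.9783)  # greedy path, 2 trees, 2-7 leaves
annealed <- c(1.117,  1.033,  0.988,  0.982,  0.979,  0.978)   # same sizes from the select = 2 run
round(greedy - annealed, 3)  # essentially zero except at two leaves, where the greedy path does better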

> # Monte Carlo Logic Regression > fit7 <- logreg(select = 7, oldfit = fit1, mc.control= logreg.mc.control(nburn=500, niter=2500, hyperpars=log(2))) iter(10k) current scr best score acc / rej /sing current parameters 0.000 0.7470 0.7470 0( 0) 0 0 2.886 0.000 0.000 0.010 0.7470 0.7470 27( 0) 73 0 2.886 0.000 0.000 0.020 0.7470 0.7470 10( 0) 90 0 2.886 0.000 0.000 0.030 0.7470 0.7470 9( 0) 91 0 2.886 0.000 0.000 0.040 0.7470 0.7470 4( 0) 96 0 2.886 0.000 0.000 0.050 0.7470 0.7470 14( 0) 86 0 2.886 0.000 0.000 0.060 0.7470 0.7470 26( 0) 74 0 2.886 0.000 0.000 0.070 16.8890 0.7470 17( 0) 83 0 3.029 -0.170 0.000


0.080 5.8169 0.7470 43( 1) 56 0 3.005 -0.419 0.000 0.090 11.1186 0.7470 44( 0) 56 0 3.222 -0.384 0.000 0.100 0.7470 0.7470 36( 0) 64 0 2.886 0.000 0.000 0.110 0.7470 0.7470 19( 0) 81 0 2.886 0.000 0.000 0.120 5.8226 0.7470 24( 0) 76 0 2.953 -0.103 0.000 0.130 0.7470 0.7470 29( 1) 70 0 2.886 0.000 0.000 0.140 11.1209 0.7470 35( 0) 65 0 2.863 0.119 0.000 0.150 0.7470 0.7470 36( 0) 64 0 2.886 0.000 0.000 0.160 0.7470 0.7470 17( 0) 83 0 2.886 0.000 0.000 0.170 0.7470 0.7470 15( 0) 85 0 2.886 0.000 0.000 0.180 0.7470 0.7470 34( 0) 66 0 2.886 0.000 0.000 0.190 0.7470 0.7470 14( 2) 84 0 2.886 0.000 0.000 0.200 5.8226 0.7470 23( 0) 77 0 2.862 0.120 0.000 0.210 5.8194 0.7470 20( 1) 79 0 2.995 -0.305 0.000 0.220 0.7470 0.7470 8( 0) 92 0 2.886 0.000 0.000 0.230 0.7470 0.7470 6( 0) 94 0 2.886 0.000 0.000 0.240 5.8228 0.7470 16( 2) 82 0 2.791 0.105 0.000 0.250 0.7470 0.7470 19( 0) 81 0 2.886 0.000 0.000 0.260 11.1207 0.7470 26( 2) 72 0 2.846 0.131 0.000 0.270 0.7470 0.7470 17( 2) 81 0 2.886 0.000 0.000 0.280 5.8204 0.7470 18( 1) 81 0 2.933 -0.370 0.000 0.290 16.8889 0.7470 21( 0) 79 0 2.942 -0.134 0.000 0.300 16.8901 0.7470 33( 3) 64 0 2.952 -0.030 -0.104 > # we need many more iterations for reasonable results > ## Not run: logreg.savefit7 <- logreg(select = 7, oldfit = fit1, mc.control=logreg.mc.control(nburn=1000, niter=100000, hyperpars=log(2))) > ## End(Not run) > # > plot(fit7) > # use logreg.savefit7 for the results with 25000 iterations > plot(logreg.savefit7)


> data(logreg.savefit1,logreg.savefit2,logreg.savefit3,logreg.savefit4, + logreg.savefit5,logreg.savefit6) > # > # fit a single model > # myanneal <- logreg.anneal.control(start = -1, end = -4, iter = 25000, update = 1000) > # logreg.savefit1 <- logreg(resp = logreg.testdat[,1], bin=logreg.testdat[, 2:21], > # type = 2, select = 1, ntrees = 2, anneal.control = myanneal) > # the best score should be in the 0.96-0.98 range > print(logreg.savefit1) score 0.966 1.98 -1.3 * (((X14 or (not X5)) and ((not X1) and (not X2))) and (((not X3) or X1) or ((not X20) and (not X2)))) +2.15 * (((not X4) or ((not X13) and (not X11))) and (not X3)) > # > # fit multiple models > # myanneal2 <- logreg.anneal.control(start = -1, end = -4, iter = 25000, update = 0) > # logreg.savefit2 <- logreg(select = 2, ntrees = c(1,2), nleaves =c(1,7), > # oldfit = logreg.savefit1, anneal.control = myanneal2) > print(logreg.savefit2) 1 trees with 1 leaves: score is 1.15 3.79 -1.91 * X3 1 trees with 2 leaves: score is 1.048 1.87 +2.13 * ((not X3) and (not X4)) 1 trees with 3 leaves: score is 1.046 1.85 +2.14 * ((not X3) and ((not X13) or (not X4))) 1 trees with 4 leaves: score is 1.042 1.86 +2.14 * ((((not X13) and (not X11)) or (not X4)) and (not X3)) 1 trees with 5 leaves: score is 1.042 1.86 +2.14 * ((not X3) and ((not X4) or ((not X13) and (not X11)))) 1 trees with 6 leaves: score is 1.042 4 -2.14 * (X3 or (X4 and (X13 or (not X19)))) 1 trees with 7 leaves: score is 1.04 1.84 +2.15 * ((((not X4) or (not X13)) and (not X3)) or (((not X6) and X14) and ((not X1) and (not X12)))) 2 trees with 2 leaves: score is 1.117 1.98 +1.89 * (not X3) -0.904 * X4 2 trees with 3 leaves: score is 1.033


1.58 +0.401 * X1 +2.13 * ((not X3) and (not X4))
2 trees with 4 leaves: score is 0.988
4.12 -1.11 * ((not X1) and (not X2)) -2.12 * (X3 or X4)
2 trees with 5 leaves: score is 0.982
0.77 +1.22 * ((X2 or X1) or X20) +2.12 * ((not X4) and (not X3))
2 trees with 6 leaves: score is 0.979
0.764 +2.13 * ((not X3) and (not X4)) +1.23 * ((X2 or X1) or (X20 and X3))
2 trees with 7 leaves: score is 0.978
1.99 +2.13 * ((not X3) and (not X4)) -1.29 * (((not X7) or ((not X5) or (not X15))) and ((not X1) and (not X2)))
> # After an initial steep decline, the scores only get slightly better
> # for models with more than four leaves and two trees.
> #
> # cross validation
> # logreg.savefit3 <- logreg(select = 3, oldfit = logreg.savefit2)
> print(logreg.savefit3)
    ntree nleaf train.ave    train.sd  cv/test cv/test.sd
10      1     1 1.1498841 0.016778885 1.167571 0.15626639
20      1     2 1.0478746 0.011226685 1.069748 0.10832190
30      1     3 1.0449851 0.010941456 1.086064 0.08148609
40      1     4 1.0415285 0.011277524 1.073501 0.11002902
50      1     5 1.0407865 0.010747466 1.103262 0.10778810
60      1     6 1.0375888 0.016061698 1.081281 0.08087984
70      1     7 1.0389121 0.012111442 1.083661 0.08286943
80      2     2 1.0742266 0.029866724 1.113556 0.14902853
90      2     3 1.0331042 0.009432286 1.069458 0.09146249
100     2     4 0.9880183 0.009248304 1.028298 0.08375898
110     2     5 0.9819231 0.008823874 1.034018 0.09038718
120     2     6 0.9780617 0.009782010 1.045832 0.09437465
130     2     7 0.9752644 0.008960632 1.059286 0.10264426
> # 4 leaves, 2 trees should give the best test set score
> #
> # null model test
> # logreg.savefit4 <- logreg(select = 4, anneal.control = myanneal2, oldfit = logreg.savefit1)
> print(logreg.savefit4)
Null Score 1.494 ; best score 0.969
Summary 25 Randomized scores
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
  1.398   1.411   1.423   1.421   1.428   1.444
0 randomized scores ( 0 %) are better than the best score
> # A summary of the permutation test
> #
> # Permutation tests
> # logreg.savefit5 <- logreg(select = 5, oldfit = logreg.savefit2)
> print(logreg.savefit5)
25 randomizations
   trees leaves     null     start      best rand:    min 1st Qu median   mean 3rd Qu
1      1      1 1.494099 1.1500238 0.9707544       1.1080 1.1140 1.1190 1.1180 1.1210
2      1      2 1.494099 1.0480632 0.9707544       0.9896 1.0140 1.0190 1.0170 1.0230
3      1      3 1.494099 1.0456557 0.9707544       0.9987 1.0120 1.0160 1.0150 1.0200
4      1      4 1.494099 1.0420860 0.9707544       1.0040 1.0080 1.0120 1.0120 1.0160
5      1      5 1.494099 1.0420860 0.9707544       0.9996 1.0080 1.0110 1.0110 1.0150
6      1      6 1.494099 1.0420861 0.9707544       0.9867 1.0140 1.0180 1.0160 1.0200
7      1      7 1.494099 1.0400850 0.9707544       1.0040 1.0120 1.0160 1.0170 1.0220
8      2      2 1.494099 1.1168472 0.9707544       1.0040 1.0090 1.0180 1.0170 1.0220
9      2      3 1.494099 1.0333284 0.9707544       0.9783 0.9998 1.0030 1.0030 1.0090
10     2      4 1.494099 0.9884828 0.9707544       0.9542 0.9667 0.9720 0.9709 0.9755
11     2      5 1.494099 0.9823242 0.9707544       0.9489 0.9643 0.9677 0.9679 0.9714
12     2      6 1.494099 0.9791870 0.9707544       0.9524 0.9630 0.9674 0.9650 0.9686
13     2      7 1.494099 0.9778322 0.9707544       0.9539 0.9670 0.9702 0.9713 0.9779
      max % < best
1  1.1310        0
2  1.0260        0
3  1.0250        0
4  1.0230        0
5  1.0280        0
6  1.0290        0
7  1.0390        0
8  1.0320        0
9  1.0150        0
10 0.9845       40
11 0.9795       68
12 0.9757       84
13 0.9894       56
> # A table summarizing the permutation tests
> #
> # a greedy sequence
> # logreg.savefit6 <- logreg(select = 6, ntrees = 2, nleaves =c(1,12), oldfit = logreg.savefit1)
> print(logreg.savefit6)
2 trees with 0 leaves: score is 1.494
2.89 +0 * 1
2 trees with 1 leaves: score is 1.15
3.79 -1.91 * X3
2 trees with 2 leaves: score is 1.048
4 -2.13 * (X3 or X4)
2 trees with 3 leaves: score is 1.033
3.71 -2.13 * (X3 or X4) +0.401 * X1
2 trees with 4 leaves: score is 0.988
3 -2.12 * (X3 or X4) +1.11 * (X1 or X2)
2 trees with 5 leaves: score is 0.982
2.89 -2.12 * (X3 or X4) +1.22 * ((X1 or X20) or X2)
2 trees with 6 leaves: score is 0.981
3.99 -2.14 * (X3 or (X4 and X13)) +0 * ((X1 or X20) or X2)
2 trees with 7 leaves: score is 0.98
+0 * (X3 or (X4 and X13)) +0 * ((X1 or X20) or (X2 and (not X4)))
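The cross-validation table from print(logreg.savefit3) above can also be summarized directly. A small base-R sketch that re-enters the printed test-set scores (values copied from that output, so purely illustrative) and picks out the best model size:

cvtab <- data.frame(
  ntree = c(rep(1, 7), rep(2, 6)),
  nleaf = c(1:7, 2:7),
  test  = c(1.167571, 1.069748, 1.086064, 1.073501, 1.103262, 1.081281, 1.083661,
            1.113556, 1.069458, 1.028298, 1.034018, 1.045832, 1.059286))
cvtab[which.min(cvtab$test), ]   # 2 trees, 4 leaves, matching the comment above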

CDLasso - Package

cv.logit.reg
> set.seed(1001)
> n=250;p=50
> beta=c(1,1,1,1,1,rep(0,p-5))
> x=matrix(rnorm(n*p),p,n)
> xb = t(x) %*% beta
> logity=exp(xb)/(1+exp(xb))
> y=rbinom(n=length(logity),prob=logity,size=1)
> rownames(x)<-1:nrow(x)
> colnames(x)<-1:ncol(x)
> lam.vec = (0:15)*2
> #K-fold cross validation
> cv <- cv.logit.reg(x,y,5,lam.vec)
> plot(cv)
> cv
Call:
NULL
Selected Coefficient Estimates:
      Lambda # Selected  CV Error
 [1,]      0         50 11.450441
 [2,]      2         40 10.519529
 [3,]      4         24  9.994576
 [4,]      6         21  8.975189
 [5,]      8         13  8.679890
 [6,]     10         10  8.055626
 [7,]     12          8  8.075773
 [8,]     14          8  7.833066
 [9,]     16          6  7.727260
[10,]     18          5  7.674470
[11,]     20          5  7.881611
[12,]     22          5  8.342223
[13,]     24          5  9.011712
[14,]     26          5 10.136755
[15,]     28          5 11.407816
[16,]     30          4 11.352547
Optimal Lambda:
[1] 18
> #Lasso penalized logistic regression using optimal lambda
> out<-logit.reg(x,y,cv$lam.opt)
> #Re-estimate parameters without penalization
> out2<-logit.reg(x[out$selected,],y,0)
> out2
Call:
logit.reg.default(x[out$selected, ], y, 0)
# of cases= 250
# of predictors= 5
Lambda used: 0
Intercept:
[1] 0.2839435
Selected Coefficient Estimates:
     Predictor Estimate
[1,] "1"       "1.15732601065077"
[2,] "2"       "1.26630644479296"
[3,] "3"       "1.11480734082738"
[4,] "4"       "0.9729558224485"
[5,] "5"       "0.943690965448634"
Number of Active Variables:
[1] 5
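The simulated signal sits entirely in the first five predictors (beta is 1 for variables 1-5 and 0 elsewhere), so a natural follow-up check is whether the cross-validated lasso kept exactly those variables. A short sketch reusing the objects from the example above:

out$selected                              # indices kept at lambda = cv$lam.opt
which(beta != 0)                          # true signal variables, 1:5 by construction
all(which(beta != 0) %in% out$selected)   # TRUE if the lasso recovered all of them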

Logit.reg
> set.seed(1001)
> n=500;p=5000
> beta=c(1,1,1,1,1,rep(0,p-5))
> x=matrix(rnorm(n*p),p,n)
> xb = t(x) %*% beta
> logity=exp(xb)/(1+exp(xb))
> y=rbinom(n=length(logity),prob=logity,size=1)
> rownames(x)<-1:nrow(x)
> colnames(x)<-1:ncol(x)
> #Lasso penalized logistic regression using optimal lambda
> out<-logit.reg(x,y,50)
> print(out)
Call:
logit.reg.default(x, y, 50)
# of cases= 500
# of predictors= 5000
Lambda used: 50
Intercept:
[1] -0.0747257
Selected Coefficient Estimates:
     Predictor Estimate
[1,] "1"       "0.221213554460732"
[2,] "2"       "0.120013566825793"
[3,] "3"       "0.156750646213643"
[4,] "4"       "0.25585278305989"
[5,] "5"       "0.0625220813776887"
Number of Active Variables:
[1] 5
> #Re-estimate parameters without penalization
> out2<-logit.reg(x[out$selected,],y,0)
> out2
Call:
logit.reg.default(x[out$selected, ], y, 0)
# of cases= 500
# of predictors= 5
Lambda used: 0
Intercept:
[1] -0.1178632
Selected Coefficient Estimates:
     Predictor Estimate
[1,] "1"       "1.01692117273978"
[2,] "2"       "0.962995570602722"
[3,] "3"       "0.997716989934031"
[4,] "4"       "1.05780471771787"
[5,] "5"       "0.970629663127746"
Number of Active Variables:
[1] 5
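With the penalty removed on the selected predictors, the refitted coefficients should land near the true value of 1 used to simulate the data. A minimal check reusing out2 from the example above (the deviation summary is only illustrative):

round(out2$estimate, 3)        # the refitted coefficients, printed above as roughly 0.96-1.06
max(abs(out2$estimate - 1))    # largest deviation from the true coefficient value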

plot.cv.logit.reg
> set.seed(101)
> n=250;p=50
> beta=c(1,1,1,1,1,rep(0,p-5))
> x=matrix(rnorm(n*p),p,n)
> xb = t(x) %*% beta
> logity=exp(xb)/(1+exp(xb))
> y=rbinom(n=length(logity),prob=logity,size=1)
> rownames(x)<-1:nrow(x)
> colnames(x)<-1:ncol(x)
> lam.vec = (0:15)*2
> #K-fold cross validation
> cv <- cv.logit.reg(x,y,5,lam.vec)
> plot(cv)
> #Lasso penalized logistic regression using optimal lambda
> out<-logit.reg(x,y,cv$lam.opt)
> #Re-estimate parameters without penalization
> out2<-logit.reg(x[out$selected,],y,0)
> out2$estimate
[1] 1.0291390 1.0705682 0.9169029 0.7744803 1.3287788

Print cv.logit.reg
> set.seed(101)
> n=250;p=50
> beta=c(1,1,1,1,1,rep(0,p-5))
> x=matrix(rnorm(n*p),p,n)
> xb = t(x) %*% beta
> logity=exp(xb)/(1+exp(xb))
> y=rbinom(n=length(logity),prob=logity,size=1)
> rownames(x)<-1:nrow(x)
> colnames(x)<-1:ncol(x)
> lam.vec = (0:15)*2
> #K-fold cross validation
> cv <- cv.logit.reg(x,y,5,lam.vec)
> plot(cv)
> #Lasso penalized logistic regression using optimal lambda
> out<-logit.reg(x,y,cv$lam.opt)
> #Re-estimate parameters without penalization
> out2<-logit.reg(x[out$selected,],y,0)
> out2$estimate
[1] 1.0291390 1.0705682 0.9169029 0.7744803 1.3287788


Print logit
> set.seed(1001)
> n=500;p=5000
> beta=c(1,1,1,1,1,rep(0,p-5))
> x=matrix(rnorm(n*p),p,n)
> xb = t(x) %*% beta
> logity=exp(xb)/(1+exp(xb))
> y=rbinom(n=length(logity),prob=logity,size=1)
> rownames(x)<-1:nrow(x)
> colnames(x)<-1:ncol(x)
> #Lasso penalized logistic regression using optimal lambda
> out<-logit.reg(x,y,50)
> print(out)
Call:
logit.reg.default(x, y, 50)
# of cases= 500
# of predictors= 5000
Lambda used: 50
Intercept:
[1] -0.0747257
Selected Coefficient Estimates:
     Predictor Estimate
[1,] "1"       "0.221213554460732"
[2,] "2"       "0.120013566825793"
[3,] "3"       "0.156750646213643"
[4,] "4"       "0.25585278305989"
[5,] "5"       "0.0625220813776887"
Number of Active Variables:
[1] 5
> #Re-estimate parameters without penalization
> out2<-logit.reg(x[out$selected,],y,0)
> print(out2)
Call:
logit.reg.default(x[out$selected, ], y, 0)
# of cases= 500
# of predictors= 5
Lambda used: 0
Intercept:
[1] -0.1178632
Selected Coefficient Estimates:
     Predictor Estimate
[1,] "1"       "1.01692117273978"
[2,] "2"       "0.962995570602722"
[3,] "3"       "0.997716989934031"
[4,] "4"       "1.05780471771787"
[5,] "5"       "0.970629663127746"
Number of Active Variables:
[1] 5
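The two print-outs make the effect of the penalty visible: at lambda = 50 the five active coefficients are shrunk to roughly 0.06-0.26, while the unpenalized refit on the same predictors returns values close to 1. A small sketch of that comparison, with the coefficients copied from the output above (illustrative values, not a new fit):

penalized <- c(0.2212, 0.1200, 0.1568, 0.2559, 0.0625)   # lambda = 50
refit     <- c(1.0169, 0.9630, 0.9977, 1.0578, 0.9706)   # lambda = 0 on the selected set
round(refit / penalized, 1)   # shrinkage factors of roughly 4 to 16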

Summary logit > set.seed(1001) > n=500;p=5000 > beta=c(1,1,1,1,1,rep(0,p-5)) > x=matrix(rnorm(n*p),p,n) > xb = t(x) %*% beta > logity=exp(xb)/(1+exp(xb)) > y=rbinom(n=length(logity),prob=logity,size=1) > rownames(x)<-1:nrow(x) > colnames(x)<-1:ncol(x) > #Lasso penalized logistic regression > out<-logit.reg(x,y,lambda=50) > #Re-estimate parameters without penalization > out2<-logit.reg(x[out$selected,],y,lambda=0) > summary(out2) Call: logit.reg.default(x[out$selected, ], y, lambda = 0) Feature Matrix 1 2 3 4 5 6 7 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1 2.1886481 -1.1332633 -1.4018876 2.0277878 -0.1913008 0.4938323 -1.0545270 2 -0.1775473 1.4348936 -0.1610051 -0.6165189 0.1134662 -0.7886298 1.1671738 3 -0.1852753 -1.3438215 -0.2807431 -0.5884601 -0.9445003 -0.1883520 0.3333745 4 -2.5065362 -1.7316531 -0.2020096 -0.2680269 -0.5878644 1.5999591 -1.6991701 5 -0.5573113 0.1788759 -0.6887195 -1.9097983 -0.1540557 -1.3809020 1.0176145 8 9 10 11 12 13 14 1.0000000 1.0000000 1.0000000 1.00000000 1.00000000 1.0000000 1.0000000 1 -0.5956101 0.3022874 -0.1395331 -1.00456433 0.02924301 -0.2235469 0.7919949 2 0.4672057 -1.4123290 0.4351011 0.28854274 0.87119094 -0.5697564 0.2714511 3 -0.2269314 0.9918948 -0.4589502 0.83011938 -0.44240759 -0.2562603 -0.5791881 4 -1.9611353 -0.6582565 2.1408353 2.84542640 0.78796577 2.5135564 0.9641051 5 1.0141709 -0.3357966 -1.0089073 -0.07445839 -0.70164392 -0.1670262 -1.0834697 15 16 17 18 19 20 21 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.000000000 1.0000000 1 -0.2281614 -0.9923248 -1.5057445 0.9811057 -1.1961491 0.451306149 0.9598894 2 0.7753241 -2.1768987 -0.1788786 0.6699037 -1.5903890 0.001463375 0.7169977 3 0.6759196 -0.9502087 0.1732486 -0.2480184 -0.8079756 0.412850884 -0.1412054 4 -0.6672365 -1.2341979 0.3290130 -1.0492409 0.3416278 -1.331714377 -1.2306583 5 1.7293245 1.3869813 -1.8159265 0.9400324 1.9805572 0.448790500 -2.0431082 22 23 24 25 26 27 28 1.0000000 1.0000000 1.0000000 1.00000000 1.0000000 1.00000000 1.00000000 1 1.0277875 -0.7934305 -2.7332163 -0.88326631 -0.5990152 -0.41471046 1.34050664


2 0.8493569 1.1858257 -0.5680936 -0.73980649 0.0873281 -0.09957900 -0.07933151 3 -1.5366808 -1.0890565 1.2326307 0.85437182 -0.2400236 -0.54147610 -0.14398520 4 0.7263908 0.4835308 1.7454733 -0.23077342 -0.4516527 -0.02491917 -1.60568433 5 0.9670694 1.0518593 -1.5274894 -0.04433918 0.8949903 -0.72630581 -1.53581008 29 30 31 32 33 34 35 1.00000000 1.0000000 1.00000000 1.0000000 1.0000000 1.0000000 1.0000000 1 0.10157386 0.3117812 1.79464604 0.9628600 0.1927410 1.2187437 -1.0158155 2 -0.52977567 0.2448182 -0.19991843 -2.8621196 -1.0274273 0.4790830 -0.6727288 3 -0.09716749 2.2864756 0.04937121 1.5513981 -0.1952839 0.4430114 0.9858167 4 1.42822236 -1.7809573 -0.41804442 -1.1220523 -0.4020653 0.0915248 0.5386396 5 0.33361127 -0.9521308 0.10016366 -0.5620327 1.4654214 -1.2005502 -1.0806384 36 37 38 39 40 41 42 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.00000000 1.0000000 1 -2.3175229 0.4746947 0.1905805 -0.0466606 0.8113780 -0.81657218 0.1948698 2 -0.0293778 0.9721639 -0.2574502 1.5039103 0.6982668 0.87273557 -0.3572670 3 0.3923544 0.7314031 0.8590726 -1.0680720 -1.2788559 -0.15859806 0.3262276 4 1.2603621 -1.3907238 -1.5108647 1.5851235 -0.8408157 0.05872832 -0.2226027 5 -1.2376998 -0.2582330 0.4861579 0.1658477 -1.9872330 -0.45318602 0.6661751 43 44 45 46 47 48 49 1.0000000 1.0000000 1.0000000 1.00000000 1.0000000 1.0000000 1.0000000 1 0.2836622 0.8425147 1.0904838 0.62787141 0.5718562 -0.9762856 -0.5915519 2 0.2271370 -0.6113201 1.3729424 -1.14323747 1.3916715 -0.0146763 -0.8060340 3 1.9080790 0.1006568 -0.9711873 0.52294168 1.5675208 0.2320183 -0.9826636 4 2.3149525 -0.6526179 0.3965547 1.84717160 0.2932240 0.1836301 0.5697308 5 1.1621097 -0.0547845 -1.8216296 -0.07788346 -0.6365994 -0.4784084 0.1759051 50 51 52 53 54 55 56 1.00000000 1.00000000 1.00000000 1.0000000 1.0000000 1.0000000 1.0000000 1 -0.09976201 1.34772074 1.07502818 -2.2890199 -1.8542977 -0.7055145 -1.5821775 2 0.22112748 -0.31214972 -0.09233828 -0.3799467 0.8121507 0.1303898 -1.5176412 3 0.92800676 1.06491032 -0.58913143 -1.6074810 2.1602828 -1.2120227 -0.3260883 4 0.28770004 -0.07604237 2.89163705 -0.3400451 2.0467331 -1.2640488 1.4743208 5 0.10319684 -2.00528025 -0.93705484 -1.1822712 -0.5873383 0.7598651 0.0762635 57 58 59 60 61 62 63 1.0000000 1.0000000 1.0000000 1.0000000 1.00000000 1.0000000 1.0000000 1 0.3404679 0.6764394 0.2485070 -1.8137020 -0.01544402 -0.1157167 0.6662752 2 0.9446543 -1.1539400 -1.7589425 -0.2439220 0.66786440 0.3627940 1.2536072 3 -0.9922738 -0.5483265 -2.2155876 1.2471609 -0.93213247 0.6896979 1.2029225 4 -1.7224341 1.5302444 0.8882101 0.7582973 -0.64806008 0.9737811 -0.1602221 5 -0.4925319 0.2959661 0.8683791 -1.9116109 0.32647880 0.1582702 -1.3548843 64 65 66 67 68 69 70 1.0000000 1.0000000 1.0000000 1.000000000 1.0000000 1.0000000 1.00000000 1 -0.2197278 0.8676309 -0.2997762 0.401293530 -1.1070129 1.9536910 -2.22867316 2 -0.1588339 1.1294699 -0.6323518 -0.837684827 1.5804616 -1.6257677 -0.01869368 3 1.1740968 -1.1506328 0.5948968 -0.957749753 -0.5420766 0.6857555 -0.25541142 4 -0.4375074 -1.7384684 -1.1294999 -0.009186138 1.0196207 -1.1711982 -0.71200052 5 0.4979537 -0.5365587 1.0012003 1.403121141 0.1660524 0.5092384 0.61460669 71 72 73 74 75 76 77 1.0000000 1.0000000 1.0000000 1.0000000 1.00000000 1.0000000 1.00000000 1 -0.5222518 1.7445615 -1.4837281 0.5143849 -0.33747714 -1.9635681 1.05703164 2 0.5592453 0.7334180 -0.1383849 -2.1618571 -0.55356940 -3.0362373 -0.07086379 3 -0.9064194 -0.5770376 -1.6639898 0.2059035 -1.37498074 0.7269375 0.40274591 4 -0.6188247 0.5495947 1.0189480 
-0.2910829 -0.80170883 0.7948645 1.08307144 5 0.2947891 -0.9124392 0.3460858 -0.7117492 -0.07369018 -0.3504257 0.38163990 78 79 80 81 82 83 84 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1 2.0315515 0.5021479 -2.0908236 1.2975432 -0.2580212 -0.3742382 -0.2626724 2 1.5277998 -0.4643356 -0.9938653 1.9255100 -0.7041729 -0.5200268 0.4384314 3 1.0300526 -0.5444556 -0.1478054 0.7508343 -2.3468830 -0.2806971 -1.4339745 4 0.2382365 -0.5562326 0.4712419 0.9719640 -1.9142655 -0.1946861 0.4317763 5 0.5208822 1.0322986 -0.4541506 -0.2758341 0.6281302 1.3491716 0.1687222 85 86 87 88 89 90 91 92 1.0000000 1.0000000 1.000000 1.0000000 1.0000000 1.0000000 1.0000000 1.000000 1 -1.3713042 1.1572872 1.198143 -1.0298962 0.8887373 1.4797736 -1.0653147 1.555879 2 -2.5641497 0.4982677 1.002033 -0.6898487 0.8127764 0.1374533 0.1176579 1.651978 3 0.8354693 -0.9754555 -2.039953 -0.6533773 1.3838549 0.1121223 -0.3639806 2.043982 4 0.4161671 -0.7348224 -1.029971 0.2445372 0.2682317 -2.0993799 0.2566321 2.753559 5 0.3983242 0.9632408 -1.122013 -0.1354732 1.1756321 0.2880737 0.1322640 1.374145 93 94 95 96 97 98 99 1.0000000 1.00000000 1.0000000 1.0000000 1.000000000 1.0000000 1.00000000 1 1.8862962 1.13286977 -0.2962953 -0.1687795 0.939040407 -0.8773813 -0.66079640 2 -1.4308373 -0.56414455 -1.2020247 -0.3522175 -0.538835907 0.1741538 -0.09404032 3 1.0045076 -0.03431546 -0.7247622 0.4954870 1.676490995 -0.3581000 -0.07305842


4 -0.9382625 0.21581810 0.1722979 -0.4306655 2.248417899 -1.6478933 -0.22321870 5 -0.2832520 0.77740979 -0.5486358 0.7758144 -0.007303744 -0.3897968 0.47103247 100 101 102 103 104 105 106 1.0000000 1.000000000 1.00000000 1.0000000 1.0000000 1.00000000 1.0000000 1 0.6186061 -1.082424304 0.08381956 0.1832327 0.3699656 0.63766603 0.8240629 2 -1.6199557 1.665528034 -0.22161674 -1.4591005 -1.4168411 -0.95266274 -0.3880403 3 0.9124673 -0.615777589 2.05905916 -0.8788548 -2.1539265 -0.90453155 0.5023161 4 0.3003132 0.372280669 1.76045901 -0.2899962 1.4639172 -0.39214892 -0.5463547 5 -0.2638520 0.003996547 -0.37034028 2.3147229 0.9854366 -0.01414582 0.1848820 107 108 109 110 111 112 113 1.0000000 1.0000000 1.0000000 1.0000000 1.000000000 1.0000000 1.0000000 1 1.4108010 0.9646332 -1.0694763 1.6413792 -0.103336273 1.4343970 1.0388021 2 -1.1362150 -0.7236522 -1.7146834 0.6492446 -0.004120146 -1.1099611 1.0096444 3 -0.6435294 1.2511961 -0.5753267 1.2158407 -0.290955669 0.6629755 -0.6890916 4 -0.3636335 0.2443038 -1.0136564 0.7161890 -0.306821594 0.2855838 -0.1222304 5 -0.4130765 -0.8293343 -2.2424899 0.2550023 0.056404789 -0.1437829 1.5649671 114 115 116 117 118 119 120 1.0000000 1.0000000 1.00000000 1.00000000 1.00000000 1.00000000 1.0000000 1 1.6807232 0.3328013 0.23663316 -1.08348753 1.59507339 -1.15535678 -0.4395146 2 2.1514905 0.4447055 -1.66766313 0.69360146 0.86534187 0.30374960 -0.2469642 3 1.2479086 1.0958996 -1.53734038 2.85324995 0.46602873 -0.09942697 -1.0001110 4 1.1458881 -0.9136729 -0.79346876 1.79028634 0.94402170 -0.30051023 0.2332362 5 0.1575811 1.3162501 -0.08813937 0.07397781 -0.02588139 -1.32035129 0.9429689 121 122 123 124 125 126 127 1.00000000 1.0000000 1.0000000 1.0000000 1.00000000 1.00000000 1.0000000 1 1.62985948 -0.7592722 -0.5901612 0.3660502 0.09980172 -0.57937196 0.4951885 2 1.39360295 -0.3025907 -0.2185828 -0.9659062 -0.30942759 0.16272269 -0.4046404 3 -0.92364368 -0.3265559 0.3668535 -1.8375386 -1.35161200 -0.51684724 0.5701294 4 -1.96714517 2.2259590 -0.8602066 0.4014197 0.88859132 1.84879080 1.9557716 5 0.04602281 -0.2305141 1.1233725 1.5348516 -1.40501409 -0.01872522 0.5630001 128 129 130 131 132 133 134 1.00000000 1.0000000 1.00000000 1.00000000 1.00000000 1.0000000 1.0000000 1 0.81537297 0.5945900 -0.85995118 0.37078897 0.15537590 -1.2980838 1.9826411 2 1.44607897 0.1078095 0.05677284 -1.68350125 0.09275618 0.4157624 -0.2051616 3 0.17638733 -0.7299632 -0.07195348 1.48091496 0.18698675 0.1935774 0.3433624 4 0.07213378 -0.3433461 0.21193800 0.01164136 0.06147760 -1.5649764 0.3515845 5 0.90173178 1.7572962 -0.02932350 -1.11710200 -0.32903960 0.4367509 0.7457591 135 136 137 138 139 140 141 1.00000000 1.00000000 1.0000000 1.00000000 1.0000000 1.0000000 1.00000000 1 2.14745324 -2.00192683 -0.3170248 0.04598804 1.5541089 0.1643717 0.38461666 2 0.96419936 -0.21915172 -1.0245491 0.10645983 -1.3693656 -0.0339983 -0.01617504 3 -0.78428568 0.09477539 2.4175543 -0.86649592 0.4013543 -2.2433299 -0.15167945 4 0.19125231 -0.28399672 1.0347965 -0.03183364 -0.3165422 -0.9834351 1.63924964 5 0.09862053 0.28814464 -0.9814861 0.49857245 1.0459918 0.9624009 0.22652304 142 143 144 145 146 147 148 1.0000000 1.00000000 1.00000000 1.0000000 1.0000000 1.00000000 1.0000000 1 -1.1642364 -0.23769085 1.45561639 -1.7903271 0.2520780 0.92948792 -0.2664532 2 1.0463257 -1.74835598 -1.59855232 0.1651787 -0.7945271 -0.08498383 -1.1455714 3 -1.6763032 -0.07493931 -0.01248907 -0.3373747 1.3640168 1.29658174 -0.3306268 4 -0.3965847 -1.81311980 -1.91512493 0.8489780 1.0442493 0.60662644 0.9508477 5 
-0.5432124 1.22577770 -0.50506596 1.1787595 0.4299749 -1.15422398 -0.3439607 149 150 151 152 153 154 155 1.00000000 1.000000000 1.0000000 1.00000000 1.0000000 1.0000000000 1.0000000 1 -0.20478492 -1.003162693 -0.1453734 2.62884881 -0.8349089 -0.2199319132 -0.5141339 2 -0.07202235 0.187901992 -0.4122625 -0.04856038 -0.1362687 -0.2249042676 0.1240285 3 0.81334075 -0.005342945 2.5822861 -1.52846662 0.8884643 0.0003969738 -0.2135135 4 1.50622929 1.410837843 -0.4856130 -0.35079905 0.3288863 -1.2803890077 -0.2992408 5 0.32707249 1.045576036 -3.0873006 -1.12682595 -0.7559895 -0.5996779612 -0.7739015 156 157 158 159 160 161 162 1.0000000 1.00000000 1.00000000 1.0000000 1.0000000 1.00000000 1.00000000 1 0.5880426 -0.34082904 -0.66718518 0.3911553 0.1328230 0.77546474 -0.29511421 2 0.7118394 1.91842228 1.44705582 -1.4273438 1.4704502 -0.78242007 0.08795516 3 -0.2164140 -0.08030912 -2.00961244 0.6822696 -0.3439833 2.32638629 0.17770776 4 0.4057503 -1.10632297 0.06234508 -0.5024990 2.6088607 -0.05248331 0.14022642 5 -0.1395355 -1.67396018 -0.49790375 -1.0283308 0.3134522 -0.71377512 -0.39480218 163 164 165 166 167 168 169 1.0000000 1.00000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1 -1.4878421 0.04318221 -0.4775317 -0.9639623 -0.5669458 -2.0273592 -0.7895037 2 1.7533461 -0.15339466 -1.1326930 2.0902095 -0.7401726 -0.6797188 -1.0902092 3 -0.9011211 -0.50179567 -0.1964893 -0.4439146 0.1110368 -1.0697423 1.1496127 4 -0.1404761 -0.69179792 -0.8692367 -1.6848414 -0.1627840 -1.2738547 -0.1063869 5 -2.0784046 0.36907387 0.7769248 0.9730708 -1.4389000 -0.2185176 -0.4014177


170 171 172 173 174 175 176 1.0000000 1.00000000 1.00000000 1.0000000 1.0000000 1.00000000 1.0000000 1 0.5232902 -0.26183032 0.03012089 1.3147780 -1.0346418 -0.31442482 -0.1075772 2 -0.1019979 0.91510921 -1.40279452 -0.3576219 -0.3758292 1.02071816 -1.3410519 3 0.6715115 -0.05437414 0.19443337 0.9744432 -0.5566237 1.31060587 -0.4052890 4 1.0425859 -1.00737195 -0.99259945 -0.7959393 -0.5601286 -0.02474158 -1.3118470 5 0.5292470 -0.89087491 -0.93809814 -1.3470765 -0.2531226 1.34937338 -0.6559822 177 178 179 180 181 182 183 1.00000000 1.00000000 1.0000000 1.0000000 1.0000000 1.0000000 1.00000000 1 0.86168435 -0.56386603 -0.5660130 0.8136484 -0.8045929 -0.3409986 0.06976405 2 0.07010274 1.78381728 0.3488385 0.7372332 -0.2198499 -1.3097367 -0.27440766 3 -1.08168475 -0.11943795 1.4947578 1.1513295 -0.5849447 1.1280201 0.69546880 4 -0.91221974 0.08501994 -1.5057155 0.1395889 -0.3255160 1.7555690 0.81562254 5 0.17573484 0.82205858 -0.7951811 0.6888790 -0.8557401 -2.1948495 1.67909572 184 185 186 187 188 189 190 1.000000000 1.000000000 1.0000000 1.00000000 1.0000000 1.0000000 1.0000000 1 -0.005086993 2.797977501 -0.2111448 -0.30096920 -0.4937576 0.1943888 -0.7590282 2 1.252449847 0.004925558 -0.4318888 -0.80156448 -0.7335021 0.1441953 -1.4925223 3 -0.263617058 2.339611198 -0.3790386 -0.08697631 -0.4763975 -0.1237950 1.1609235 4 0.722253110 1.188749617 -0.8368176 0.69636448 -0.7977079 -0.7977067 1.6780523 5 -0.920210984 0.372298630 1.2908092 -1.39610116 0.1100181 -1.2303513 0.4242632 191 192 193 194 195 196 197 1.00000000 1.0000000 1.0000000 1.00000000 1.0000000 1.0000000 1.00000000 1 -1.67561918 -0.9413342 1.1704780 -2.11808937 -0.5231623 -0.4912513 1.13811560 2 0.09176958 1.6337354 -0.9396990 0.03254279 -1.6183888 1.0276853 -0.01033889 3 -0.23377599 0.1231631 0.1625467 1.30231450 1.6025696 0.7394884 -1.13129818 4 -0.17479266 -0.6113186 0.7140931 -1.01135671 0.3757071 -0.1634156 -0.43582953 5 1.19506594 -1.4572326 1.9733925 -0.56663040 -0.5797650 0.9600317 -0.53054865 198 199 200 201 202 203 204 1.0000000 1.000000000 1.0000000 1.0000000000 1.0000000 1.0000000 1.0000000 1 0.1328057 0.803399039 0.6790493 -0.5890688547 0.7969090 0.9796074 0.1050442 2 0.2375965 0.007372541 0.9863016 0.0005509194 -1.1079494 -0.3014000 0.8762181 3 0.5449287 0.059043938 -0.5374890 -0.7376053070 -2.6846125 -0.8835438 -1.0059134 4 0.7385509 -0.252764568 -0.9782597 0.2739933865 0.7435212 0.4482427 1.3455066 5 -1.0240642 1.612034108 -0.4804989 0.1890822556 1.6718665 1.1176437 0.3820081 205 206 207 208 209 210 211 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000000 1.0000000 1.0000000 1 0.5344252 -1.9303151 0.5331271 1.2938770 -0.0444300243 -2.0584380 0.5084373 2 0.4890642 -0.1205238 0.2492818 -0.7751776 -0.0007995737 -0.9393183 -0.4745533 3 0.9900105 -0.6530163 -1.2514102 0.3229311 -0.3718448846 -0.1870468 -0.1536307 4 -0.9174400 0.1351641 -0.3364941 -0.7358746 0.6994425973 0.8453812 -0.7576433 5 -0.5842006 -0.2674267 -0.1169927 -0.2561451 0.5832503856 -1.9828627 -0.4957983 212 213 214 215 216 217 218 1.0000000 1.00000000 1.0000000 1.00000000 1.0000000 1.0000000 1.0000000 1 1.7414023 -0.52977684 0.3095581 1.69298386 -0.4657225 1.1403492 3.2672561 2 1.5662959 0.01873546 0.3705630 -0.71218557 1.1692738 -1.3509856 -1.0416909 3 2.3571946 0.69379705 0.8358563 0.49539405 -0.5646044 0.9765626 0.4593982 4 2.2493017 0.83688315 -0.3309610 -0.98150609 -0.1723943 1.0138400 -0.1161382 5 0.5946049 1.52846455 0.5733220 0.08555863 0.0560736 -0.7344955 -1.3057852 219 220 221 222 223 224 225 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 
1.0000000 1.00000000 1 0.2121855 -0.4563168 -0.4192651 0.4540444 0.4439176 -1.6397929 0.08158113 2 -2.3276483 0.1227931 -0.5577135 -3.5735837 0.3601446 -0.8079688 -0.62766687 3 0.1717630 1.1493989 -1.3330973 0.2163733 1.2493277 -0.8794153 0.33912177 4 1.0934766 1.1661670 -1.0554778 -1.0329243 0.4210862 -1.3417793 0.22946605 5 1.1133436 0.2952341 2.0529327 0.7064472 -0.3995357 -2.0543232 1.27306038 226 227 228 229 230 231 232 1.0000000 1.0000000 1.00000000 1.0000000 1.0000000 1.0000000 1.0000000 1 -0.9835541 0.0688873 -1.88091956 1.2450726 0.1051735 -0.1209057 -0.7376199 2 -0.1136750 0.6639982 0.05468455 0.3071807 2.0298849 1.9177334 -0.3193543 3 0.6065667 0.6628722 -0.53422596 1.9145082 0.7684324 0.1221215 -1.5186722 4 1.7380905 2.1168002 0.90398453 1.0136374 -1.0130154 -0.9133010 0.9413329 5 0.2428703 0.1672055 0.60629871 -0.3980439 -1.8518364 -1.0288395 0.8718929 233 234 235 236 237 238 239 1.0000000 1.00000000 1.0000000 1.0000000 1.00000000 1.0000000 1.0000000 1 -0.3278760 -1.41529268 -0.5129525 -1.2901975 0.16275628 0.4949751 0.8789050 2 -0.4663579 0.06598198 1.2620112 -0.5309584 -0.37465552 0.1294128 1.0396163 3 -0.2167506 -0.24458478 -0.3232046 -0.7111269 -0.43692751 -1.1744945 -0.8303812 4 0.5404743 -1.01001956 0.4166886 -0.9126701 -0.07000983 -0.7911482 -0.2405564 5 -1.4971883 1.67524164 -0.9322465 0.7713370 0.53528651 0.8889716 -0.6852068 240 241 242 243 244 245 246 1.0000000 1.00000000 1.00000000 1.00000000 1.00000000 1.000000000 1.0000000


1 -0.7367430 0.05745638 -0.08461781 -0.25457487 -2.09658771 0.002308514 -0.8240352 2 -1.6569093 0.54752975 1.47394058 -0.80644326 -1.34896025 0.276958184 -0.3348785 3 1.1216191 -1.29009429 -0.89122908 2.10804086 -0.08517836 0.838097340 0.6406506 4 0.2569577 -0.79601367 1.28131948 0.07962391 3.01389418 -0.425877107 1.4404617 5 0.7463291 0.08591328 1.35446240 0.88928736 -1.79341562 0.670658646 -0.5586247 247 248 249 250 251 252 253 1.0000000 1.00000000 1.0000000 1.00000000 1.0000000 1.000000000 1.0000000 1 -0.3607787 -0.59067416 -0.4131925 0.02733743 1.5799369 -2.043302712 0.7235415 2 -0.6142057 0.28253852 0.3026074 2.67126928 0.4962027 0.002292139 2.8982690 3 0.5937740 -0.07804301 0.7783439 0.68885512 0.1580571 -0.777296940 0.1963339 4 -0.7985260 -1.36189092 0.4445142 -0.30703077 1.7674661 1.405111410 0.1485543 5 -0.5738920 -0.18214797 1.5649138 0.47625577 1.8327128 -1.107464016 -0.2909753 254 255 256 257 258 259 260 1.00000000 1.0000000 1.0000000 1.0000000 1.0000000 1.00000000 1.0000000 1 0.02440963 -0.2677166 -0.9428428 0.5964229 -1.2711492 -0.02462971 0.5889840 2 -1.02939339 -0.4092178 0.7976340 0.7523060 0.6647210 -0.46000323 -0.6073575 3 -0.88145444 -0.3909805 3.2526405 -0.2050529 1.1541133 -0.80971003 0.5303417 4 -0.46113727 0.1863513 0.1723155 -1.0156551 1.4416000 -0.65194882 2.0020320 5 1.01096509 1.1606123 1.0105360 0.6204313 -0.9972706 0.03987831 -1.1196746 261 262 263 264 265 266 267 1.00000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1 1.35955775 -0.9977722 1.2276432 -1.0443578 0.8070172 0.9605732 -0.8826334 2 -0.07871479 -1.1688880 -0.3805614 -0.4875364 0.5172444 0.1261607 -0.1417836 3 2.30107800 -0.1132917 1.2159653 -0.3008833 -0.7284348 -0.6281637 -0.3620332 4 0.12679271 1.5904562 0.4884776 0.7974576 0.5187828 1.7711640 -1.0698874 5 0.54721552 -0.8304161 -0.4274913 -0.4390044 1.0312408 -0.7609024 0.1661813 268 269 270 271 272 273 274 1.0000000 1.00000000 1.0000000 1.0000000 1.0000000 1.0000000 1.000000000 1 1.5254298 0.88290334 0.5779918 0.5671370 0.1867728 0.1206773 2.598831178 2 0.1506544 -0.09305015 1.0491683 0.7402389 -1.5754275 -0.7671361 0.931630174 3 1.1253740 -0.39029015 1.1271142 2.3146791 0.6537049 -0.3290942 -0.072730770 4 -1.1091208 -1.86615021 0.4850674 0.6954946 -0.5273724 0.2659355 0.298257095 5 -0.2177767 -1.87859998 1.4311560 -0.7156478 -1.9611928 0.2032043 0.005844197 275 276 277 278 279 280 281 1.00000000 1.00000000 1.0000000 1.0000000 1.0000000 1.00000000 1.00000000 1 -0.39112565 -2.41432548 -0.1757958 -1.1183933 1.7809384 0.93869708 -0.08438154 2 -0.08775444 -2.49529350 -1.6552147 -1.0609572 -0.8236621 0.77291509 -1.99798889 3 -0.36859624 -1.01945986 -1.0876131 2.1282162 -1.5464628 0.09329068 0.42799231 4 0.96355041 0.06334193 0.6194831 0.6315921 1.0891134 -0.65834251 -0.17535693 5 -0.59372605 -0.21005404 -0.1210537 -1.0035628 -1.3574926 0.74893872 -1.03180946 282 283 284 285 286 287 288 1.0000000 1.00000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1 0.0943378 1.86181337 0.0363574 -1.2376460 0.3445790 -0.9443058 -1.3344974 2 0.3033585 0.53800532 0.5194920 0.6265624 -0.4026889 0.3397587 0.2554698 3 0.1054141 0.65014489 -0.9116364 0.3983257 0.1733646 0.7785341 1.8393793 4 -1.1608857 0.06512922 0.3799691 1.2411048 -0.7439985 0.7961190 -0.2169524 5 -0.9817526 -0.50338448 0.3599685 -1.3179951 -0.7558181 -0.6614452 -0.6896000 289 290 291 292 293 294 295 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1 -0.8329547 0.5037223 -0.2547832 1.4767737 -0.9931044 -0.1725214 0.3809682 2 0.5043197 0.8041024 1.0317658 
0.7868419 -0.2780960 0.6220685 -1.4021950 3 0.5362352 0.2756583 0.8392944 0.6044582 0.2907563 0.9136459 -0.5607797 4 -0.1540980 -0.1868830 1.0697200 0.3689517 0.3522051 -0.6298061 0.2271043 5 0.7669751 -2.1178815 -0.8740526 -0.2558389 -0.7273423 -1.7133960 1.3220909 296 297 298 299 300 301 302 1.0000000 1.0000000 1.0000000 1.00000000 1.00000000 1.0000000 1.0000000 1 0.3236282 1.9021399 1.0005533 0.67826666 -1.02726324 -1.3724715 1.8277252 2 0.1128403 -0.4932183 1.1654730 0.33831735 1.14832447 1.3192469 -0.9170896 3 0.4683932 0.6555671 1.7891247 -0.83185061 -0.25032678 -0.8169768 -0.4162121 4 0.4851135 -0.6622612 0.9428348 -0.04792862 -0.90821792 1.2014451 0.6541433 5 1.1784190 1.6133088 -1.6345752 -0.35733130 0.05069157 -1.1128425 -2.1658415 303 304 305 306 307 308 309 1.0000000 1.00000000 1.0000000 1.0000000 1.0000000 1.0000000 1.00000000 1 0.6177074 0.49414916 -0.3567127 -0.1152090 0.1614967 -0.7044635 -0.05009166 2 0.8765570 -1.09385861 -0.6991117 0.3681276 -2.0701220 1.7807780 0.66667450 3 -0.2239547 -0.65760969 0.9879422 -1.6519056 -0.2610471 1.5281703 0.35987787 4 -2.0697252 0.04493212 0.5959319 -0.3955748 -0.6850634 -1.2245622 1.81585779 5 0.1666463 -1.90862085 -0.2407299 0.2919761 2.2016503 -1.3346293 1.78222196 310 311 312 313 314 315 316 1.0000000 1.00000000 1.0000000 1.00000000 1.00000000 1.0000000 1.00000000 1 0.4387302 0.24948623 -0.5616739 1.95360085 2.60834143 0.4556093 0.23842561 2 1.1932137 -1.26131613 0.8402250 0.03584165 -0.56004120 2.0540546 -0.29531795


3 0.6022037 0.37185315 1.1062304 -0.96130435 -0.78669055 0.5323808 -0.07005699 4 0.2446509 0.50705737 1.0956212 0.46124292 -0.07697133 -0.7265572 1.42579310 5 0.5999425 -0.07936217 -0.6463917 0.02967285 -0.27831942 -0.5040736 -0.62122326 317 318 319 320 321 322 323 1.00000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.00000000 1 0.95686938 -1.0024565 -1.7340440 -0.7049020 0.7255702 0.2005879 -1.28072339 2 -0.53872746 -0.4557599 -0.4712980 0.1768343 -1.7032223 -0.8097628 0.88535817 3 -0.33749399 0.6270776 1.0227323 1.2153194 -0.1069992 -1.0714270 -0.03568942 4 0.09820451 2.0654618 0.5984573 -1.2277405 -0.6550790 -0.1995508 -1.51560895 5 0.88210343 0.1559920 0.3715548 -0.7666727 -1.7457010 0.3967554 2.49425079 324 325 326 327 328 329 330 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1 0.3552903 2.2880656 0.2487913 1.1852171 0.5233101 -3.3102488 1.1640411 2 -0.8977845 -0.7730960 0.7809305 0.3682311 1.0931186 -0.5794716 -1.1653713 3 0.4370974 -1.9584061 -1.2735377 0.8810527 3.5012541 -1.6266467 -0.6794984 4 -0.3568243 -0.2263709 0.4054991 2.3093900 0.8525433 0.5555666 0.8676016 5 -0.5441084 -0.5950961 -0.8493187 0.9652477 -0.3279137 -0.5690431 -0.2824204 331 332 333 334 335 336 337 1.00000000 1.0000000 1.00000000 1.0000000 1.0000000 1.00000000 1.0000000 1 -2.21101182 1.3342796 -1.00324536 0.7377567 0.2219444 1.54697343 0.9134127 2 0.07737866 0.1050737 0.84016593 0.4747627 1.9943113 -0.02024821 -1.2348743 3 0.26884967 1.6563320 -0.20672255 2.3751742 0.3277060 -0.19479056 -0.4952833 4 -1.13887014 0.5522875 0.20171078 -1.0984608 -0.6498617 2.02610335 0.3194991 5 1.28929784 2.2756786 0.05908185 0.9821856 1.2175036 1.20781628 1.3981914 338 339 340 341 342 343 344 1.0000000 1.0000000 1.0000000 1.00000000 1.0000000 1.0000000 1.0000000 1 -0.6257361 -0.3953239 0.9972057 -0.64674782 1.5229441 -2.0038205 0.2892424 2 1.3274133 0.4763863 -0.5989348 0.03314412 0.9910686 -0.2603533 0.7317125 3 -0.3167875 0.3042487 -0.8558193 0.53112720 0.8225047 0.1246458 -1.0133020 4 -0.6643686 -1.3884664 1.7791494 -1.28068008 -0.6333174 -2.4082495 0.6456703 5 -0.4177684 1.7644685 -0.2189833 -1.68629060 -0.6824219 -0.5708813 0.8792577 345 346 347 348 349 350 351 1.0000000 1.0000000 1.00000000 1.0000000 1.0000000 1.0000000 1.0000000 1 -0.7535313 0.3258325 -1.43235557 0.6655144 1.0636848 1.4323584 0.1734350 2 -0.1809822 0.9654000 0.09957250 0.9100963 0.5420698 0.9588137 1.2053499 3 0.8814843 0.5375279 -0.06037252 0.7209258 -0.1778270 -0.7555856 1.1051091 4 -0.8482110 -0.4091995 0.42410258 -0.5736138 -0.3728648 1.8642427 -0.9303882 5 -0.5902799 0.9927559 -0.74779287 -0.4797469 -0.4470180 -0.4823366 0.4811400 352 353 354 355 356 357 358 1.0000000 1.0000000 1.000000000 1.00000000 1.00000000 1.00000000 1.0000000 1 -0.1728660 0.9799939 3.240776780 0.06435327 0.89793931 -0.20777176 0.6707725 2 0.8595023 1.5077423 0.009326722 -0.44704381 -0.01121397 -0.90855354 0.4878114 3 -0.6555442 2.3773643 -0.898424361 -0.73499897 0.37444598 -0.81880238 0.5544030 4 -0.1023420 2.7229614 -0.017499996 0.18886682 -0.63852256 0.58874507 -0.5525669 5 0.8028606 2.7151558 0.678815918 1.99990745 0.70132925 -0.07488614 0.2424884 359 360 361 362 363 364 365 1.00000000 1.00000000 1.0000000 1.00000000 1.00000000 1.0000000 1.0000000 1 0.72511061 0.47184008 0.3113488 0.99673094 -0.55889209 -0.3842967 1.3123639 2 -0.63483249 -0.04875279 1.1809511 0.28287186 1.01665811 1.3536554 -0.1866527 3 0.48858501 -1.11457314 -0.1276310 -0.62793185 -0.25360849 -2.5719120 1.6773291 4 -1.26051892 -0.56359333 -0.2567799 -0.08298543 1.06377807 
1.2168952 -0.2868632 5 0.01537798 0.17893913 -2.1234301 -0.77996400 -0.09248242 -0.6208540 1.1204253 366 367 368 369 370 371 372 1.00000000 1.0000000000 1.0000000 1.0000000 1.00000000 1.0000000 1.00000000 1 -0.01190458 -1.7581093260 1.3476249 0.1486723 0.27343359 0.4097844 -1.70317639 2 -0.08852426 0.0003859805 1.4643746 0.7743577 0.79737507 -0.1679910 -0.90962288 3 0.26524974 -0.8366041892 0.1296096 1.1599681 0.74536771 -0.8271625 0.03888022 4 -0.10616229 -0.4836511549 0.3839436 -0.1910454 0.02825223 1.6947705 1.15972191 5 -0.39012858 0.2700265330 0.1980823 -1.1524568 0.35747200 -0.4244229 0.78560071 373 374 375 376 377 378 379 1.00000000 1.0000000 1.00000000 1.0000000 1.0000000 1.0000000 1.00000000 1 -0.17966692 0.2561954 -1.81009789 -0.3998194 -2.4032347 0.6523230 0.08369122 2 -0.04531949 1.3787954 -0.02347316 0.1574758 -0.4786563 -2.0781093 0.87267722 3 0.98383694 0.9966025 0.99031962 0.6572354 -0.9900413 0.8838331 -0.84593436 4 0.16044785 -0.1211230 0.24854895 -1.3313431 0.7088192 -0.5866414 0.11529730 5 -1.36921237 -1.3422657 -0.39793307 -0.9113387 0.1922842 1.1244756 0.86095028 380 381 382 383 384 385 386 1.00000000 1.0000000 1.0000000 1.0000000 1.00000000 1.00000000 1.0000000 1 -0.69560060 -0.4410434 0.3691473 -1.0393867 1.22326836 -0.03456048 1.5437347 2 0.05518093 -0.0914868 -0.8143344 -0.2561898 0.09198921 -0.07453501 0.6154864 3 0.88860696 -1.9475489 -0.8641773 0.6886270 0.19902058 0.65227567 0.5260094 4 1.61432048 -0.3712656 -1.5961072 -1.9935575 -0.39439907 -0.04674199 0.2097691


5 -0.55922379 -0.5764279 1.4692445 -0.2176020 -0.76925819 -0.04352025 -1.2036455 387 388 389 390 391 392 393 1.0000000 1.00000000 1.0000000 1.0000000 1.00000000 1.0000000 1.00000000 1 -0.3882896 -0.18498599 -0.3747211 0.2061492 0.02416381 -1.2366040 -0.84194588 2 -0.9262589 -1.07081875 1.7447354 0.9018060 -0.71040837 -1.1100510 -1.01478434 3 -1.6532498 0.01878238 0.1558888 0.2545858 -0.44169607 -2.3584679 0.06861627 4 0.2638379 -0.80643836 -1.3606690 1.0915168 0.53193240 -1.5085744 -0.17472293 5 -0.3786240 0.23875510 -0.5830374 0.8374061 0.01765808 0.0143424 -0.64380553 394 395 396 397 398 399 400 1.000000 1.000000000 1.0000000 1.0000000 1.0000000 1.00000000 1.0000000 1 2.464848 0.000737908 -0.1175398 -0.3423681 0.1699277 0.82844958 0.5295049 2 2.005497 -0.377415792 0.4229616 -0.5947170 -1.5739776 0.04036582 0.5063668 3 -1.445133 1.099809563 0.5535953 1.0113702 1.7272155 -0.71375046 0.2677175 4 1.102828 -0.504792358 1.6532804 0.1594778 0.3577033 -2.14298889 1.7110258 5 1.366400 -0.709173256 2.0746053 0.2299432 0.1936633 1.90347646 -0.5010857 401 402 403 404 405 406 407 1.0000000 1.0000000 1.000000000 1.00000000 1.0000000 1.0000000 1.00000000 1 -1.6609923 0.1858032 0.550479548 -1.56867311 -0.1897323 1.7138267 0.80444947 2 -0.2813254 0.5395579 -0.011708545 -2.88901424 -1.4597174 0.4460774 2.02144592 3 0.2495909 -2.3487624 0.310682433 -0.04982029 0.5164575 -2.3930413 -0.92231894 4 -0.4751118 -0.2331807 0.007598201 0.39898054 0.3133619 -1.8071013 0.06799438 5 -0.7646822 0.7089667 -1.099150572 -0.96748245 0.4642012 1.5549967 0.19570314 408 409 410 411 412 413 414 1.00000000 1.00000000 1.0000000 1.00000000 1.0000000 1.000000000 1.00000000 1 -0.22278955 -0.03085235 -0.3485254 0.05467405 -1.2721448 0.002013178 -1.77572550 2 -1.52436706 0.44331044 0.2194109 0.91156261 1.0347952 1.817998492 1.94192687 3 0.05217064 -0.76605829 -0.6101660 -0.99822585 -0.3735325 0.539413499 -0.06171838 4 1.09974538 0.60874431 -0.2054786 -1.21402673 1.5580311 0.898546541 -1.26042605 5 0.32202269 0.68010540 0.1821606 0.09064076 2.1736219 0.531430820 -1.06264080 415 416 417 418 419 420 421 1.0000000 1.0000000 1.00000000 1.0000000 1.0000000 1.00000000 1.0000000 1 -0.7793885 -1.5477768 1.13888067 -1.3897813 -0.6978736 -0.80416936 -0.1829722 2 0.3668881 -0.6656142 -0.99280238 0.2386836 2.9072885 -1.84123051 0.8414542 3 -1.3436261 1.7405314 -0.65993293 0.3610709 2.1967538 0.18633855 0.2118850 4 -1.0027214 -0.4933566 -0.07681359 1.5582368 1.4614577 0.35036813 -0.5290850 5 0.8120311 0.4774977 -0.98675016 0.3341003 1.0587333 0.06775581 -0.6651105 422 423 424 425 426 427 428 1.0000000 1.0000000 1.00000000 1.0000000 1.00000000 1.0000000 1.0000000000 1 0.9339500 0.1019875 0.01148664 -0.9353483 0.03750893 -0.4688225 -0.8717416851 2 1.7506632 0.7393089 -0.90942464 -0.8457268 0.50792537 0.2024786 0.5138965298 3 -0.3076635 -0.3724209 -0.65979684 0.2772660 0.84527459 1.6373718 0.3458810045 4 0.9878948 0.2570523 -0.09651198 0.4099987 -0.03043954 -0.2485347 1.5358486163 5 -0.2439029 1.6832841 -1.58541064 2.3915388 1.79527658 -1.4215811 0.0002305145 429 430 431 432 433 434 435 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.00000000 1.0000000 1 2.0625757 0.2422262 0.0766168 0.6534240 -1.6282518 -0.55349847 0.3664385 2 0.2748155 1.2114801 -0.2398943 -0.9175045 -0.2893360 -1.08580404 1.3933840 3 1.7709543 -0.3494823 0.2889602 -0.4841398 -0.6906336 -1.30532060 -0.6420917 4 -0.2851361 -0.1348167 0.1160926 -1.3799022 -0.6291067 0.03887172 -1.1118592 5 -1.6937317 0.8394078 0.6550928 0.6524163 2.1604679 -1.25795957 0.1580777 436 437 438 439 
440 441 442 1.00000000 1.0000000 1.0000000 1.000000000 1.0000000 1.0000000 1.00000000 1 -0.89002840 -0.1440361 -0.3468535 -0.004622377 -0.1528096 0.7111030 0.93078184 2 -0.03004185 1.0080778 -1.1380630 0.172984140 0.6945090 0.2673744 0.01310438 3 -0.13272424 0.8987173 -0.2437348 0.710808184 0.4689928 0.6649725 2.18080995 4 0.75303821 -0.3878887 1.5725550 -0.773013436 -1.3949595 1.0903380 -0.47720550 5 -0.98505576 -0.4147628 -0.1010292 -0.289301242 -0.8254093 0.6662234 -0.61363098 443 444 445 446 447 448 449 1.0000000 1.00000000 1.0000000 1.0000000 1.00000000 1.0000000 1.00000000 1 1.0084133 -0.52197895 -0.6049951 -0.1422680 -0.45600362 1.4629264 1.00004588 2 -0.2700785 -2.09544042 -0.8182270 -0.2775023 -0.05258879 -0.1537917 0.65988440 3 1.2351972 -0.06135575 0.8693026 1.1543383 0.71887858 -1.1701727 0.16980870 4 0.6123941 0.32239783 -0.4099192 0.2914569 -0.77808809 0.3597232 -0.84468080 5 -0.9921897 -0.19672841 -0.2195317 0.7639903 0.53153428 0.6275608 0.02474119 450 451 452 453 454 455 456 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1.00000000 1 -0.3451167 2.4096734 0.7933723 -0.1302319 -0.3698132 -0.2739024 0.40503425 2 0.2678024 -0.9495402 -0.3765648 -1.3328678 0.6198182 -0.5394724 -1.10186957 3 -0.8119076 2.0015792 0.7045084 0.2955312 2.6915308 -0.2955881 0.23884744 4 1.3661514 0.4971088 -0.1915965 0.4213322 -1.6151946 -2.2495883 -2.44683319 5 0.6599264 1.5824897 -0.6043879 0.6929907 1.6013965 1.2043283 0.02852315 457 458 459 460 461 462 463


1.0000000 1.000000000 1.00000000 1.00000000 1.0000000 1.0000000 1.0000000 1 0.1994705 1.359181902 1.90135117 1.75886929 1.8132863 -0.3156519 -0.3132906 2 1.2977075 -1.333966611 -0.04531453 -0.79789347 -0.1285395 -0.4488904 -0.4389919 3 -0.3252533 1.238826228 -1.32282134 -0.51675889 -0.3367459 0.1379439 0.5363462 4 0.3298876 -0.289350016 0.17225410 0.40865024 0.5091357 -1.1371522 -1.3234681 5 0.5911202 0.007688384 -0.41567875 0.04470611 -0.7609242 1.1018944 0.3225437 464 465 466 467 468 469 470 1.0000000 1.00000000 1.0000000 1.00000000 1.0000000 1.0000000 1.0000000 1 0.9809173 -0.40812309 0.8526007 -0.06986756 -0.7230349 -0.2109699 0.7581415 2 0.3748222 -0.14915009 -0.4591745 -0.89038976 -1.1776050 0.7399976 -0.2136073 3 -1.6014763 0.09333074 0.1139405 -0.86241408 1.1298383 0.3091013 -0.6048797 4 0.3124086 0.54079499 1.7907751 1.39017962 -1.0632901 1.4942926 -0.6943239 5 0.3860840 -0.39674000 -2.0541863 -0.23629777 -0.5962336 0.2568876 -0.3846076 471 472 473 474 475 476 477 1.00000000 1.00000000 1.00000000 1.0000000 1.0000000 1.0000000 1.0000000 1 -0.77561468 1.25542961 0.72471010 -1.8825793 -1.0116648 0.1095553 -0.9071015 2 0.08792774 -1.77893299 -0.06335952 0.6922247 -0.1085646 -2.1659343 1.9109328 3 -2.49713000 0.53979295 1.02991758 -1.8869511 -0.2473373 -0.4389924 0.7136666 4 0.38712466 0.32712859 -0.63670508 -0.2506381 0.6064031 1.5883840 -1.6538785 5 0.06334777 -0.02450092 0.63728633 -0.1583401 0.9249020 1.6304328 0.1992633 478 479 480 481 482 483 484 1.00000000 1.00000000 1.0000000 1.0000000 1.0000000 1.0000000 1.0000000 1 0.07452834 0.12276664 1.4894603 0.9564629 -0.7075722 0.2543192 0.7797579 2 -0.97132111 -1.15042241 -1.3099146 -1.8105894 -0.8092980 0.3038179 -1.3514053 3 -0.04883016 -0.01877485 -1.0606832 0.8840766 1.1931179 -2.2674064 0.3853893 4 -1.20191778 1.42601524 -1.3946801 -0.3888997 -0.1221272 -0.7441060 2.4444255 5 0.47343118 0.60507703 -0.3759899 0.5835463 -0.2205055 0.4300905 -0.8999032 485 486 487 488 489 490 491 1.00000000 1.0000000 1.0000000 1.00000000 1.00000000 1.0000000 1.0000000 1 -1.35508513 0.1328014 -1.6372657 -0.07620272 1.09385385 0.6890960 0.1398038 2 0.07522034 0.4383724 0.7911652 1.66200204 -1.04233950 -0.4018132 -2.6566877 3 0.02596486 0.3284399 0.4320510 -0.34027345 0.03440143 0.3717535 1.8344451 4 -1.71216134 1.1314707 1.7362752 0.78244921 -0.23547796 1.0162120 -0.2669798 5 -0.40480770 -0.7617670 1.2377211 -0.89815564 1.95380629 -1.1491425 0.7565479 492 493 494 495 496 497 498 1.000000000 1.00000000 1.0000000 1.0000000 1.0000000 1.00000000 1.00000000 1 0.605309105 0.04184275 0.7177084 0.7597767 1.0494704 -0.61179770 0.07716959 2 0.444787667 0.51396584 -0.3334008 -0.8019824 -0.9248011 1.26728245 0.54415551 3 -1.652420980 0.54065550 -2.3555064 -1.6568966 0.7683351 -0.03034877 -0.62321721 4 -0.239224664 0.47627567 -0.9108628 -0.7373150 -0.6042316 1.16399014 0.50232662 5 -0.006206765 -1.27352150 0.8564506 0.4431444 -1.0487915 0.65100931 -0.23952872 499 500 1.0000000 1.0000000 1 2.7290610 1.8460822 2 -0.5320819 1.8650536 3 1.2790054 0.1239435 4 1.7296413 1.1501367 5 -0.4937812 -1.0055870 Outcome for Cases [1] 0 1 0 0 0 1 1 0 0 1 1 1 1 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 1 1 0 1 0 0 0 0 1 [42] 1 1 0 1 1 1 1 0 1 1 1 0 1 0 0 0 1 0 0 0 1 1 0 0 1 1 1 1 0 0 1 0 0 0 0 1 1 0 1 1 0 [83] 0 0 0 0 0 0 1 1 0 1 0 1 0 1 1 1 0 1 1 1 0 0 0 0 0 1 0 1 0 1 1 1 0 0 1 1 1 1 1 1 1 [124] 0 0 0 1 1 1 0 1 1 0 1 1 0 0 1 1 0 1 0 0 0 0 0 1 0 1 1 0 0 1 0 0 1 0 0 0 1 1 0 0 0 [165] 0 1 0 0 0 1 0 0 1 0 1 1 0 1 0 1 0 0 1 1 1 0 0 0 0 0 0 0 1 0 1 1 0 1 0 0 1 0 1 1 1 [206] 0 1 0 1 0 0 1 1 
1 1 0 1 1 1 1 0 0 1 0 1 1 1 0 1 0 0 1 0 0 0 0 1 0 0 0 1 1 1 0 1 0 [247] 0 0 1 1 1 0 1 0 0 1 1 1 0 1 1 0 1 0 1 0 0 1 0 1 1 0 0 1 0 0 0 0 1 1 1 0 1 1 1 0 1 [288] 0 1 0 1 1 0 0 1 1 1 1 0 0 1 0 0 0 0 0 0 0 1 1 0 0 1 0 1 1 0 0 1 0 0 0 1 1 0 0 1 1 [329] 0 1 0 1 0 1 1 1 1 1 1 0 0 1 0 1 0 1 0 1 1 0 0 1 1 1 1 0 0 1 1 0 1 1 0 1 1 0 0 1 0 [370] 1 1 0 0 0 0 0 0 1 1 1 0 0 0 1 0 1 0 1 1 1 0 0 0 1 0 1 1 1 1 1 0 0 0 0 0 1 1 0 1 1 [411] 0 1 1 0 0 1 0 1 1 0 0 1 1 0 1 1 1 1 1 1 1 0 0 0 0 0 1 1 1 0 1 1 1 1 0 1 0 1 1 0 1 [452] 0 1 1 0 0 1 0 0 1 1 0 0 1 0 0 0 0 1 0 0 1 1 0 1 1 0 0 1 0 0 1 0 1 0 1 0 1 1 1 0 0 [493] 0 0 0 0 1 1 1 1 Residuals [1] -1.914867e-01 9.472195e-01 -5.412572e-02 -2.020047e-01 -1.282128e-01 [6] 5.525413e-01 6.325799e-01 -1.693611e-01 -2.309175e-01 2.715741e-01 [11] 5.194153e-02 3.865334e-01 2.063301e-01 4.159891e-01 1.146750e-01 [16] -1.581807e-02 -4.465725e-02 -7.465011e-01 -1.997020e-01 -4.455450e-01 [21] -1.327285e-01 -8.720398e-01 -6.599569e-01 -1.357742e-01 -2.380800e-01

[26] -3.795521e-01 -1.293117e-01 -1.030367e-01 -7.708126e-01 -4.770503e-01 [31] 2.281300e-01 -1.111053e-01 5.272310e-01 2.776028e-01 -2.151326e-01 [36] 8.786347e-01 -5.766470e-01 -3.914573e-01 -8.864559e-01 -6.212021e-02 [41] 6.556122e-01 3.839350e-01 2.815490e-03 -3.792150e-01 5.011025e-01 [46] 1.394581e-01 4.478528e-02 7.619547e-01 -1.541195e-01 2.101509e-01 [51] 5.030699e-01 7.959211e-02 -2.671173e-03 7.385914e-02 -7.456616e-02 [56] -1.323807e-01 -1.041436e-01 3.062560e-01 -1.205633e-01 -1.185563e-01 [61] -3.123719e-01 1.207182e-01 1.850492e-01 -6.676736e-01 -1.603479e-01 [66] 6.595122e-01 5.299231e-01 2.734347e-01 4.395836e-01 -5.658937e-02 [71] -2.004930e-01 1.850502e-01 -1.185343e-01 -7.799157e-02 -3.607009e-02 [76] -2.161354e-02 5.696058e-02 5.461142e-03 -4.541318e-01 9.641171e-01 [81] 1.029826e-02 -8.041002e-03 -4.561957e-01 -3.158366e-01 -8.938026e-02 [86] -6.733243e-01 -1.044849e-01 -8.673133e-02 1.244211e-02 5.768701e-01 [91] -2.589940e-01 8.774224e-05 -5.392303e-01 1.916165e-01 -6.598353e-02 [96] 4.593048e-01 1.259063e-02 9.651477e-01 -3.247084e-01 5.192019e-01 [101] 4.579495e-01 3.519318e-02 -4.320465e-01 -3.208157e-01 -1.521508e-01 [106] -6.104365e-01 -2.305838e-01 2.956691e-01 -1.254755e-03 1.219239e-02 [111] -3.128097e-01 2.505067e-01 6.830715e-02 1.882418e-03 -8.862827e-01 [116] -1.904178e-02 1.391789e-02 2.240185e-02 9.369625e-01 6.544089e-01 [121] 5.190184e-01 3.489407e-01 5.943653e-01 -3.555503e-01 -1.103993e-01 [126] -7.050535e-01 3.991942e-02 3.800066e-02 2.306221e-01 -3.070906e-01 [131] 7.224101e-01 4.845846e-01 -1.114598e-01 4.152297e-02 7.516125e-02 [136] -9.188114e-02 -7.553032e-01 5.945576e-01 2.271071e-01 -8.884611e-02 [141] 1.130230e-01 -5.149334e-02 -5.488047e-02 -6.264750e-02 -4.815506e-01 [146] -9.051853e-01 1.735744e-01 -2.405029e-01 8.892953e-02 1.758276e-01 [151] -1.684543e-01 -3.819973e-01 6.451021e-01 -7.626796e-02 -1.416085e-01 [156] 2.238316e-01 -1.835884e-01 -1.388128e-01 -1.252554e-01 1.545940e-02 [161] 1.839213e-01 -4.035143e-01 -4.709097e-02 -2.505142e-01 -1.134806e-01 [166] 5.904721e-01 -5.389913e-02 -4.231395e-03 -2.098601e-01 6.897181e-02 [171] -1.843062e-01 -3.899110e-02 5.750458e-01 -5.090816e-02 4.162563e-02 [176] 9.810616e-01 -2.596705e-01 1.424338e-01 -2.260488e-01 3.278307e-02 [181] -5.184136e-02 -2.944653e-01 5.339027e-02 3.338394e-01 1.247580e-03 [186] -3.188736e-01 -1.299857e-01 -7.319084e-02 -1.253327e-01 -7.346023e-01 [191] -2.705894e-01 -1.915372e-01 4.740443e-02 -7.169632e-02 6.845542e-01 [196] 1.336571e-01 -2.544249e-01 3.596383e-01 -8.871998e-01 -3.740278e-01 [201] 7.268959e-01 -3.444242e-01 2.200254e-01 1.647630e-01 4.141569e-01 [206] -4.903113e-02 7.414872e-01 -4.369807e-01 3.162233e-01 -1.296197e-02 [211] -1.833565e-01 2.097011e-04 8.147426e-02 1.688046e-01 3.876969e-01 [216] -4.609027e-01 2.545210e-01 2.193259e-01 4.341110e-01 9.945789e-02 [221] -1.772335e-01 -3.596659e-02 1.208416e-01 -1.053894e-03 2.355189e-01 [226] 1.897016e-01 2.522351e-02 -2.756336e-01 1.729136e-02 -4.604623e-01 [231] -4.410456e-01 7.002891e-01 -1.193981e-01 -2.350604e-01 -4.475086e-01 [236] -5.378626e-02 5.753141e-01 -3.462147e-01 -5.072571e-01 -4.140154e-01 [241] 8.289143e-01 4.759067e-02 1.304171e-01 -1.009616e-01 2.336447e-01 [246] -5.847658e-01 -1.317553e-01 -1.051069e-01 7.453069e-02 3.532743e-02 [251] 3.101186e-03 -7.191356e-02 2.990004e-02 -1.868964e-01 -5.372745e-01 [256] 1.630563e-02 3.690304e-01 2.812983e-01 -1.145911e-01 1.890331e-01 [261] 1.551994e-02 -1.831993e-01 1.111506e-01 -1.776807e-01 1.167199e-01 [266] -8.158858e-01 -7.701541e-02 
2.113763e-01 -2.941274e-02 1.091089e-02 [271] 2.868768e-02 -3.717281e-02 -3.580836e-01 2.482648e-02 -3.717051e-01 [276] -2.171827e-03 -8.032537e-02 -3.871027e-01 6.917808e-01 1.538585e-01 [281] 9.472355e-01 -1.411791e-01 7.429543e-02 4.350805e-01 5.846999e-01 [286] -1.819938e-01 4.437602e-01 -4.273711e-01 3.458799e-01 -3.079745e-01 [291] 1.496715e-01 5.281712e-02 -1.917156e-01 -2.475020e-01 5.291576e-01 [296] 7.986421e-02 5.411811e-02 3.850231e-02 -4.182872e-01 -2.282979e-01 [301] 7.042263e-01 -2.752992e-01 -2.897415e-01 -4.189157e-02 -5.568909e-01 [306] -1.592672e-01 -3.110362e-01 -4.537667e-01 1.117367e-02 5.120799e-02 [311] -4.382034e-01 -8.526254e-01 1.883287e-01 -7.024176e-01 1.684235e-01 [316] 3.372568e-01 -7.230735e-01 -7.999104e-01 5.795720e-01 -1.832140e-01 [321] -2.890694e-02 -1.695749e-01 4.465783e-01 7.484951e-01 -2.130600e-01 [326] -3.145581e-01 3.332027e-03 3.896500e-03 -3.576684e-03 5.226575e-01 [331] -1.216595e-01 3.061501e-03 -4.342475e-01 3.730157e-02 5.470067e-02 [336] 1.038446e-02 3.051764e-01 7.109966e-01 3.807527e-01 -7.567642e-01 [341] -3.897067e-02 1.331295e-01 -4.570641e-03 1.968108e-01 -1.612248e-01 [346] 9.883785e-02 -1.399127e-01 2.530931e-01 3.822278e-01 -9.531151e-01 [351] -8.587175e-01 3.656284e-01 3.649235e-05 5.065536e-02 2.839991e-01 [356] -7.619429e-01 -1.868019e-01 2.247816e-01 6.948327e-01 -2.280237e-01 [361] 7.547340e-01 5.752243e-01 -7.456033e-01 7.477260e-01 2.946815e-02 [366] -3.913734e-01 -4.789989e-02 3.259473e-02 -6.492286e-01 1.141949e-01 [371] 3.334582e-01 -3.323126e-01 -3.724086e-01 -7.376522e-01 -2.466675e-01 [376] -1.181634e-01 -4.419005e-02 5.257271e-01 2.847030e-01 2.176661e-01 [381] -2.792805e-02 -1.609534e-01 -4.502369e-02 4.379727e-01 -5.828090e-01

[386] 1.647760e-01 -4.138398e-02 8.743241e-01 6.611572e-01 3.986264e-02 [391] -3.456358e-01 -1.693428e-03 -6.341449e-02 4.628012e-03 -3.530681e-01 [396] 1.115366e-02 4.104240e-01 3.038916e-01 5.909604e-01 7.593539e-02 [401] -4.420717e-02 -2.122794e-01 -4.211140e-01 -6.293539e-03 -3.967116e-01 [406] 6.759625e-01 1.204147e-01 -4.293633e-01 3.063112e-01 7.130642e-01 [411] -2.015548e-01 4.879212e-02 2.559575e-02 -7.728205e-02 -1.024597e-01 [416] 6.580593e-01 -1.661414e-01 2.627867e-01 1.184235e-03 -1.104221e-01 [421] -3.804926e-01 4.657173e-02 9.691458e-02 -3.622243e-02 2.408253e-01 [426] 4.912079e-02 6.007831e-01 1.883959e-01 1.124889e-01 1.654020e-01 [431] 3.151850e-01 -1.615985e-01 -2.125185e-01 -1.464956e-02 -4.833256e-01 [436] -2.068759e-01 3.121232e-01 5.607779e-01 5.854767e-01 -1.957158e-01 [441] 3.468229e-02 1.282277e-01 1.729803e-01 9.294130e-01 -2.140935e-01 [446] 1.581861e-01 -4.446945e-01 2.604056e-01 3.026947e-01 -7.435847e-01 [451] 4.164247e-03 -5.596878e-01 5.301294e-01 6.697349e-02 -8.154847e-02 [456] -4.354849e-02 1.264328e-01 -7.144505e-01 -5.574855e-01 2.968201e-01 [461] 2.561004e-01 -2.959080e-01 -1.960794e-01 4.138684e-01 -4.021685e-01 [466] -5.795918e-01 -3.394786e-01 -7.153289e-02 7.456315e-02 -2.203092e-01 [471] -5.507718e-02 4.239638e-01 1.778883e-01 -2.491274e-02 4.896947e-01 [476] 3.246426e-01 -4.889702e-01 -1.372835e-01 2.736567e-01 -5.934831e-02 [481] -5.370023e-01 6.834302e-01 -9.988027e-02 1.868737e-01 -2.655596e-02 [486] 2.271442e-01 -9.204245e-01 2.647255e-01 1.581062e-01 3.713533e-01 [491] -4.373559e-01 -2.725528e-01 -5.564445e-01 -1.005235e-01 -1.071245e-01 [496] -3.032688e-01 9.003089e-02 4.596611e-01 8.397288e-03 1.946036e-02 Value of Objective Function [1] -217.81 # of cases= 500 # of predictors= 5 Lambda used: 0 Intercept: [1] -0.1178632 Esimtated Coefficients: [1] 1.0169212 0.9629956 0.9977170 1.0578047 0.9706297 Number of Active Variables: [1] 5 Selected Variables with Nonzero Coefficients: [1] "1" "2" "3" "4" "5"

Support Vector Machines Majorization
> library(SVMMaj)
Loading required package: kernlab

Attaching package: ‘SVMMaj’

The following object is masked _by_ ‘.GlobalEnv’:

    diabetes

Warning messages:
1: package ‘SVMMaj’ was built under R version 3.1.2
2: package ‘kernlab’ was built under R version 3.1.0
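The masking message above means an object called diabetes already exists in the workspace and shadows the dataset shipped with SVMMaj. A minimal sketch of two standard ways to get back to the package's copy (either line is enough; these calls are base R and are not part of the package example):

> rm(diabetes)                          # drop the workspace copy so the package dataset is visible again
> data(diabetes, package = "SVMMaj")    # or reload the package's copy explicitly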

AusCredit
> attach(AusCredit)
The following objects are masked _by_ .GlobalEnv:

    X, y

> summary(X)

       X1                  X2                X3                X4
 Min.   :0.0005183   Min.   :0.01076   Min.   :0.01308   Min.   :0.008551
 1st Qu.:0.2225089   1st Qu.:0.28420   1st Qu.:0.24783   1st Qu.:0.246711
 Median :0.4392563   Median :0.51402   Median :0.49552   Median :0.506923
 Mean   :0.4582808   Mean   :0.51362   Mean   :0.50029   Mean   :0.503837
 3rd Qu.:0.6828933   3rd Qu.:0.74856   3rd Qu.:0.75115   3rd Qu.:0.738891
 Max.   :0.9937938   Max.   :0.99782   Max.   :0.99598   Max.   :0.994787
       X5
 Min.   :0.002538
 1st Qu.:0.219328
 Median :0.489479
 Mean   :0.494275
 3rd Qu.:0.756727
 Max.   :0.997443
> summary(y)
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
   2.12   10.77   14.77   14.48   18.29   26.74
> detach(AusCredit)

diabetes
> attach(diabetes)
The following objects are masked _by_ .GlobalEnv:

    x, y

> summary(X)
       X1                  X2                X3                X4
 Min.   :0.0005183   Min.   :0.01076   Min.   :0.01308   Min.   :0.008551
 1st Qu.:0.2225089   1st Qu.:0.28420   1st Qu.:0.24783   1st Qu.:0.246711
 Median :0.4392563   Median :0.51402   Median :0.49552   Median :0.506923
 Mean   :0.4582808   Mean   :0.51362   Mean   :0.50029   Mean   :0.503837
 3rd Qu.:0.6828933   3rd Qu.:0.74856   3rd Qu.:0.75115   3rd Qu.:0.738891
 Max.   :0.9937938   Max.   :0.99782   Max.   :0.99598   Max.   :0.994787
       X5
 Min.   :0.002538
 1st Qu.:0.219328
 Median :0.489479
 Mean   :0.494275
 3rd Qu.:0.756727
 Max.   :0.997443
> summary(y)
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
   2.12   10.77   14.77   14.48   18.29   26.74
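Note that summary(X) and summary(y) resolve to the workspace objects that the attach() messages report as masking the attached ones, which is why the AusCredit and diabetes blocks above print identical numbers. A minimal sketch of inspecting a dataset's own components and fitting a model instead, assuming each dataset is a list with components X and y (as the attach() messages indicate) and that svmmaj() is the package's fitting function used with its defaults:

> summary(AusCredit$X)                     # summarize the dataset's own predictors
> fit <- svmmaj(AusCredit$X, AusCredit$y)  # assumed call: fit an SVM by majorization with default settings
> fit                                      # print the fitted object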

hinge
> hingefunction <- getHinge()
> ## plot hinge function value and, if specified, the majorization function at z
> plot(hingefunction, z = 3)
> ## generate loss function value
> loss <- hingefunction(q = -10:10, y = 1)$loss
> loss
 [1] 121 100  81  64  49  36  25  16   9   4   1   0   0   0   0   0   0   0   0   0
[21]   0
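The printed losses agree with the quadratic hinge, max(0, 1 - q*y)^2: for q = -10 and y = 1 the loss is (1 + 10)^2 = 121, and the loss reaches 0 once q*y >= 1. A quick check of that reading, reusing the hingefunction object created above (the comparison should return TRUE if the default quadratic hinge was used):

> q <- -10:10
> all.equal(pmax(0, 1 - q * 1)^2, hingefunction(q = q, y = 1)$loss)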

Support Vector Machines

wsvm
> library(wSVM)
Loading required package: MASS
Loading required package: quadprog
Warning messages:
1: package ‘wSVM’ was built under R version 3.1.2
2: package ‘quadprog’ was built under R version 3.1.0
> # generate a simulation data set using mixture example(page 17, Friedman et al. 2000)
> svm.data <- simul.wsvm(set.seeds = 123)
> X <- svm.data$X
> Y <- svm.data$Y
> new.X <- svm.data$new.X
> new.Y <- svm.data$new.Y
> # run Weighted K-means clustering SVM with boosting algorithm
> model <- wsvm(X, Y, c.n = rep(1/ length(Y), length(Y)))
> # predict the model and compute an error rate.
> pred <- wsvm.predict(X, Y, new.X, new.Y, model)
> Error.rate(pred$predicted.Y, Y)
[1] 2.365
> # add boost algorithm
> boo <- wsvm.boost(X, Y, new.X, new.Y, c.n = rep(1/ length(Y), length(Y)),
+     B = 50, kernel.type = list(type = "rbf", par = 0.5), C = 4,
+     eps = 1e-10, plotting = TRUE)
> boo
$error.rate
 [1] 0.344 0.234 0.231 0.269 0.234 0.231 0.259 0.234 0.231 0.252 0.232 0.231 0.245
[14] 0.232 0.231 0.241 0.231 0.231 0.239 0.231 0.231 0.239 0.231 0.231 0.239 0.231
[27] 0.231 0.237 0.231 0.231 0.237 0.231 0.231 0.237 0.231 0.231 0.237 0.231 0.231

[40] 0.236 0.231 0.231 0.235 0.231 0.231 0.235 0.231 0.231 0.235 0.231 $predicted.model $predicted.model$predicted.values [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [1,] -16.27017 -39.08044 0.944282 -1.789679 -36.89359 21.17785 -11.34715 -40.0761 [,9] [,10] [,11] [,12] [,13] [,14] [,15] [,16] [1,] -28.99057 -5.516065 -21.77547 10.80725 -43.71525 -30.27151 -50.15125 -47.84813 [,17] [,18] [,19] [,20] [,21] [,22] [,23] [,24] [1,] -8.801083 28.84544 -17.01454 -22.59571 10.20134 -40.56534 -36.93977 -24.75309 [,25] [,26] [,27] [,28] [,29] [,30] [,31] [,32] [1,] -36.83225 -49.27991 -34.96316 -2.207464 -3.229874 -37.81128 -40.12909 -35.76373 [,33] [,34] [,35] [,36] [,37] [,38] [,39] [,40] [1,] -21.92024 -45.98178 30.5596 -10.69508 -32.76042 3.999418 -19.59183 15.18728 [,41] [,42] [,43] [,44] [,45] [,46] [,47] [,48] [1,] -24.63925 -5.731251 -14.10618 0.7742819 -16.62675 3.58908 -1.128301 -14.94297 [,49] [,50] [,51] [,52] [,53] [,54] [,55] [,56] [1,] 10.74508 -38.03607 36.39655 4.741127 -10.71515 -21.491 -22.7896 -9.612561 [,57] [,58] [,59] [,60] [,61] [,62] [,63] [,64] [1,] -38.41606 -38.59449 -2.183832 -15.02233 -35.18771 27.26006 -14.48672 18.8643 [,65] [,66] [,67] [,68] [,69] [,70] [,71] [,72] [1,] -39.41475 5.502769 -10.25952 -5.411607 -45.34746 -10.84983 -33.7641 -23.24468 [,73] [,74] [,75] [,76] [,77] [,78] [,79] [,80] [1,] -41.28699 37.75664 10.44359 22.71841 -30.97524 -39.62491 -8.725992 -42.44628 [,81] [,82] [,83] [,84] [,85] [,86] [,87] [,88] [1,] -16.446 -32.84528 10.89038 -49.85043 -47.57536 -8.22942 -7.5087 -32.98926 [,89] [,90] [,91] [,92] [,93] [,94] [,95] [,96] [1,] -0.2450541 -32.00351 -39.03697 -40.77949 -44.51566 -38.68644 -30.41145 -42.50654 [,97] [,98] [,99] [,100] [,101] [,102] [,103] [,104] [1,] -45.62597 19.22893 18.79446 -18.14004 -14.38503 -18.45792 -4.441998 -31.82107 [,105] [,106] [,107] [,108] [,109] [,110] [,111] [,112] [1,] 14.39378 -26.44394 -34.62355 -11.74433 7.984242 -30.88617 -39.52105 -3.691885 [,113] [,114] [,115] [,116] [,117] [,118] [,119] [,120] [1,] 31.38765 -16.81146 -26.67792 -37.91711 -16.68856 -40.03532 -4.355588 -13.78785 [,121] [,122] [,123] [,124] [,125] [,126] [,127] [,128] [1,] -24.03291 13.46545 -16.85368 29.41382 -22.37301 1.482483 -33.98964 19.59878 [,129] [,130] [,131] [,132] [,133] [,134] [,135] [,136] [1,] -43.33995 -42.3705 -30.57115 -42.75856 -30.21792 -42.12152 -5.648007 -12.00956 [,137] [,138] [,139] [,140] [,141] [,142] [,143] [,144] [1,] -26.54443 -34.8331 -2.831847 -5.435084 -33.01808 -6.259952 22.39395 -2.856443 [,145] [,146] [,147] [,148] [,149] [,150] [,151] [,152] [1,] -37.53853 -2.491137 -6.839391 34.59299 23.50693 -35.06248 -23.07989 -9.042504 [,153] [,154] [,155] [,156] [,157] [,158] [,159] [,160] [1,] -3.870491 13.2322 -24.5943 -36.0302 -6.097423 2.75249 -12.26365 -4.151632 [,161] [,162] [,163] [,164] [,165] [,166] [,167] [,168] [1,] -32.74158 -29.53172 -38.31893 -45.12027 -4.458359 -15.61552 -42.93104 21.78892 [,169] [,170] [,171] [,172] [,173] [,174] [,175] [,176] [1,] -15.55209 32.74605 -38.74027 -27.36557 -8.686813 -42.08723 -34.50029 0.5652003 [,177] [,178] [,179] [,180] [,181] [,182] [,183] [,184] [1,] -5.033727 -5.593206 -35.73856 -14.34853 -14.72405 -13.6193 -41.74765 21.64528 [,185] [,186] [,187] [,188] [,189] [,190] [,191] [,192] [1,] -6.737482 -5.32799 -16.222 10.3222 -15.01938 -28.11268 18.97572 -37.09884 [,193] [,194] [,195] [,196] [,197] [,198] [,199] [,200] [1,] -0.2348997 -10.97634 -32.82672 -2.659103 -2.431912 -40.61066 -32.37985 -2.06211 [,201] [,202] [,203] [,204] [,205] [,206] [,207] [,208] [1,] -6.845711 
-32.61925 -20.38674 -32.01772 -3.5023 18.80173 -37.22341 22.90405 [,209] [,210] [,211] [,212] [,213] [,214] [,215] [,216] [1,] -20.72926 -3.741356 4.668957 26.62816 -15.9397 -34.0794 34.72886 -28.741 [,217] [,218] [,219] [,220] [,221] [,222] [,223] [,224] [1,] -43.28544 4.504605 -8.297369 -13.13231 5.048992 -34.24613 -40.25946 -21.26734 [,225] [,226] [,227] [,228] [,229] [,230] [,231] [,232] [1,] 30.92008 -41.63936 9.878261 -6.309147 -35.98381 1.349449 -49.65622 12.25409 [,233] [,234] [,235] [,236] [,237] [,238] [,239] [,240] [1,] 3.337962 28.91054 -15.93327 -36.39339 -15.57161 -21.66869 -45.62205 10.10291 [,241] [,242] [,243] [,244] [,245] [,246] [,247] [,248] [1,] -20.2071 -30.43712 30.36746 -34.14014 -15.11688 -31.28756 5.531549 -28.08313 [,249] [,250] [,251] [,252] [,253] [,254] [,255] [,256] [1,] 12.62885 -50.21185 4.871637 18.90673 -19.76252 -13.65875 -21.81564 -31.24732 [,257] [,258] [,259] [,260] [,261] [,262] [,263] [,264] [1,] -9.155318 -8.3661 6.384394 -36.58165 -4.182963 24.19121 -39.87982 -33.75436 [,265] [,266] [,267] [,268] [,269] [,270] [,271] [,272] [1,] -36.21607 -0.4441701 -27.12935 -14.53268 23.09981 -14.49133 -39.56742 29.63706

[,273] [,274] [,275] [,276] [,277] [,278] [,279] [,280] [1,] 22.5989 -30.6448 -7.887724 -9.412909 -28.69099 -6.489172 35.41012 -39.45576 [,281] [,282] [,283] [,284] [,285] [,286] [,287] [,288] [1,] -46.62317 -13.61384 -25.93487 -12.97862 17.20281 -7.313286 31.99024 10.77867 [,289] [,290] [,291] [,292] [,293] [,294] [,295] [,296] [1,] -5.169043 -34.64455 14.91441 -45.29851 24.69178 11.60096 -47.23122 4.466546 [,297] [,298] [,299] [,300] [,301] [,302] [,303] [,304] [1,] -13.90783 -49.55747 30.52947 -22.86797 -22.39481 29.0996 -42.53411 -31.87113 [,305] [,306] [,307] [,308] [,309] [,310] [,311] [,312] [1,] -30.5347 -12.16539 -39.55768 32.25857 -12.56304 8.865682 6.093298 5.800194 [,313] [,314] [,315] [,316] [,317] [,318] [,319] [,320] [1,] -47.81803 36.12897 -44.87239 9.765551 -21.90507 26.53808 -15.10975 -50.1588 [,321] [,322] [,323] [,324] [,325] [,326] [,327] [,328] [1,] -23.13055 31.73821 -43.09306 -24.62054 -43.00199 22.75016 -15.18862 18.13613 [,329] [,330] [,331] [,332] [,333] [,334] [,335] [,336] [1,] -15.319 -37.62433 -38.46493 -10.12078 1.358513 33.20828 -29.35786 -9.902488 [,337] [,338] [,339] [,340] [,341] [,342] [,343] [,344] [1,] -15.57518 -15.66524 -28.93112 -7.414679 -3.345479 36.29042 -16.02426 20.47514 [,345] [,346] [,347] [,348] [,349] [,350] [,351] [,352] [1,] -12.01237 -42.79668 -30.23974 -48.12329 16.62334 -34.98338 17.73377 -40.11143 [,353] [,354] [,355] [,356] [,357] [,358] [,359] [,360] [1,] -40.39592 -47.06595 -35.76414 -33.32677 -4.947772 -6.304314 -28.61886 30.40981 [,361] [,362] [,363] [,364] [,365] [,366] [,367] [,368] [1,] -3.301689 -40.81905 -22.90512 -20.91246 -31.14049 9.460041 -40.74746 38.76421 [,369] [,370] [,371] [,372] [,373] [,374] [,375] [,376] [1,] 4.95824 -0.6292399 -1.02733 -1.239114 -49.46741 -6.082582 -21.63832 8.03131 [,377] [,378] [,379] [,380] [,381] [,382] [,383] [,384] [1,] -8.890935 8.433609 -24.94095 -32.39953 -20.89348 -38.71237 -15.89478 -15.30998 [,385] [,386] [,387] [,388] [,389] [,390] [,391] [,392] [1,] -27.61351 -25.1533 -21.98088 -12.8123 20.95872 -10.86229 -23.34211 29.08745 [,393] [,394] [,395] [,396] [,397] [,398] [,399] [,400] [1,] -23.00144 -8.952241 -21.04443 -16.12636 -3.889586 -48.44275 -34.70085 -26.48346 [,401] [,402] [,403] [,404] [,405] [,406] [,407] [,408] [1,] -38.74656 -49.89017 -14.3927 -30.30111 -40.78396 12.45615 -46.8209 36.36159 [,409] [,410] [,411] [,412] [,413] [,414] [,415] [,416] [,417] [1,] -48.04399 -32.02221 19.76371 -27.1932 22.70623 1.250056 5.519071 3.21088 10.12557 [,418] [,419] [,420] [,421] [,422] [,423] [,424] [,425] [1,] -11.32429 -7.106856 30.11756 8.153624 -28.50993 0.2817419 -33.97924 3.947631 [,426] [,427] [,428] [,429] [,430] [,431] [,432] [,433] [1,] -48.26046 -26.58674 -31.08463 29.72648 9.702576 -15.49638 -2.06291 -18.12061 [,434] [,435] [,436] [,437] [,438] [,439] [,440] [,441] [1,] -14.31856 13.52658 -36.22982 -49.62361 1.325244 37.0753 22.72696 29.34957 [,442] [,443] [,444] [,445] [,446] [,447] [,448] [,449] [1,] -10.28714 -12.32955 -48.73796 -42.86785 -45.69099 -41.86658 23.28913 -33.73909 [,450] [,451] [,452] [,453] [,454] [,455] [,456] [,457] [1,] 4.075928 -20.41009 -10.8528 -8.185417 35.7306 29.46309 -44.56059 -16.40437 [,458] [,459] [,460] [,461] [,462] [,463] [,464] [,465] [1,] -50.15013 -43.55205 9.859921 -2.191472 18.35806 -7.335841 -14.70479 5.816835 [,466] [,467] [,468] [,469] [,470] [,471] [,472] [,473] [1,] 21.0722 12.42268 -17.87657 -15.20007 -43.50467 24.18484 -40.92547 2.864531 [,474] [,475] [,476] [,477] [,478] [,479] [,480] [,481] [1,] -17.97937 -22.60705 -1.578097 
-28.70601 37.49023 -15.22753 -46.39693 17.27119 [,482] [,483] [,484] [,485] [,486] [,487] [,488] [,489] [1,] -37.94433 -35.56422 -34.39671 -2.82519 -49.52225 -43.55196 -38.09036 1.759418 [,490] [,491] [,492] [,493] [,494] [,495] [,496] [,497] [,498] [1,] 24.53 -2.516211 19.73235 -14.87989 -41.22648 38.41699 -24.93667 12.0291 16.8544 [,499] [,500] [,501] [,502] [,503] [,504] [,505] [,506] [1,] -31.43683 -44.18605 10.49844 -3.653047 36.3897 -45.06542 -34.77028 27.4498 [,507] [,508] [,509] [,510] [,511] [,512] [,513] [,514] [,515] [1,] 13.32931 14.79214 19.94005 18.18704 27.52435 36.35196 4.198445 22.90299 33.18769 [,516] [,517] [,518] [,519] [,520] [,521] [,522] [,523] [,524] [1,] 17.17727 37.14816 12.13658 9.727201 25.20227 18.53947 -1.867508 31.19432 8.533474 [,525] [,526] [,527] [,528] [,529] [,530] [,531] [,532] [,533] [1,] 12.43035 4.896552 23.15998 22.752 20.48538 11.70733 21.00109 13.78528 18.44805 [,534] [,535] [,536] [,537] [,538] [,539] [,540] [,541] [,542] [1,] 15.50344 18.8001 11.9329 25.73658 19.59609 -3.179318 20.51091 0.1112887 -13.82125 [,543] [,544] [,545] [,546] [,547] [,548] [,549] [,550] [,551] [1,] 7.181408 38.34286 36.43024 21.9822 -35.90049 7.894236 36.17372 14.71083 25.9818 [,552] [,553] [,554] [,555] [,556] [,557] [,558] [,559] [,560] [1,] 37.96735 3.593774 17.46515 16.78559 12.18323 9.077706 7.27024 14.91881 -3.014877 [,561] [,562] [,563] [,564] [,565] [,566] [,567] [,568] [,569] [1,] 4.821116 21.36031 38.29708 28.75401 17.79705 6.217101 4.045773 20.365 12.29597

[,570] [,571] [,572] [,573] [,574] [,575] [,576] [,577] [,578] [1,] 35.60256 8.784281 22.37672 13.78731 26.18812 23.5776 16.96164 30.23636 -0.6930769 [,579] [,580] [,581] [,582] [,583] [,584] [,585] [,586] [1,] 17.93865 23.26609 38.59238 19.79008 18.71282 -12.08734 24.52556 -26.64903 [,587] [,588] [,589] [,590] [,591] [,592] [,593] [,594] [,595] [1,] 36.76701 19.70671 21.65721 12.3788 6.52507 -39.34199 23.43075 26.71606 -19.97812 [,596] [,597] [,598] [,599] [,600] [,601] [,602] [,603] [1,] 22.12468 17.93311 31.70402 19.79186 -23.41259 29.52507 26.67675 34.91353 [,604] [,605] [,606] [,607] [,608] [,609] [,610] [,611] [,612] [1,] -4.143804 14.1254 1.858325 -18.74358 15.5934 -9.852869 21.69862 12.09698 19.57911 [,613] [,614] [,615] [,616] [,617] [,618] [,619] [,620] [,621] [1,] 24.02243 22.63789 -2.730113 32.90228 12.69182 25.94243 23.29236 -40.7069 22.47083 [,622] [,623] [,624] [,625] [,626] [,627] [,628] [,629] [1,] -25.83527 37.05957 3.702439 -45.05636 33.16777 36.72567 3.638794 24.96429 [,630] [,631] [,632] [,633] [,634] [,635] [,636] [,637] [1,] -27.41948 36.21322 11.91989 37.88044 -2.273708 -21.66729 39.26591 22.51497 [,638] [,639] [,640] [,641] [,642] [,643] [,644] [,645] [,646] [1,] 28.84065 26.6654 36.25212 -0.6032254 16.66168 15.5364 18.90488 -8.187162 8.059558 [,647] [,648] [,649] [,650] [,651] [,652] [,653] [,654] [,655] [1,] 26.65652 11.93187 13.35339 19.23556 31.65853 21.19456 33.88522 -14.09402 35.86381 [,656] [,657] [,658] [,659] [,660] [,661] [,662] [,663] [,664] [1,] 17.25389 24.00386 11.21003 13.08404 -11.76037 22.67059 14.68572 -3.26139 34.00052 [,665] [,666] [,667] [,668] [,669] [,670] [,671] [,672] [1,] 38.70603 -39.98927 23.52405 22.36081 -7.43897 30.36959 0.03302824 13.76879 [,673] [,674] [,675] [,676] [,677] [,678] [,679] [,680] [,681] [1,] -21.08371 29.55286 35.82308 19.27472 4.704699 37.22923 9.393078 21.20459 31.71435 [,682] [,683] [,684] [,685] [,686] [,687] [,688] [,689] [,690] [1,] 28.2905 13.63596 17.02169 23.7448 14.52213 14.43843 13.177 38.68759 27.92376 [,691] [,692] [,693] [,694] [,695] [,696] [,697] [,698] [,699] [1,] -19.52662 16.57603 15.12958 13.7219 38.42782 -6.009268 -10.3505 21.05822 20.80028 [,700] [,701] [,702] [,703] [,704] [,705] [,706] [,707] [,708] [1,] 21.42977 21.7976 39.00684 15.92463 14.01444 19.89942 7.605778 5.368272 25.24063 [,709] [,710] [,711] [,712] [,713] [,714] [,715] [,716] [,717] [1,] -18.04013 31.12 20.77967 9.813607 10.2344 -11.78832 27.70098 27.18736 -37.46273 [,718] [,719] [,720] [,721] [,722] [,723] [,724] [,725] [1,] -26.57351 22.47264 6.794077 36.68309 2.360507 -8.153156 -9.065619 6.080867 [,726] [,727] [,728] [,729] [,730] [,731] [,732] [,733] [,734] [1,] 37.12582 16.11685 9.450515 36.60776 12.60635 20.93796 37.30389 -2.394863 22.0336 [,735] [,736] [,737] [,738] [,739] [,740] [,741] [,742] [,743] [1,] 10.45094 -42.62139 7.195446 23.00916 30.05726 17.99331 21.5049 22.34627 19.57441 [,744] [,745] [,746] [,747] [,748] [,749] [,750] [,751] [,752] [1,] 14.58497 13.70328 12.91423 34.81896 9.394748 31.57066 23.5416 23.3473 32.66427 [,753] [,754] [,755] [,756] [,757] [,758] [,759] [,760] [,761] [1,] 14.43614 3.505382 7.452814 18.08976 13.49042 23.59752 4.70188 36.72075 17.4993 [,762] [,763] [,764] [,765] [,766] [,767] [,768] [,769] [,770] [1,] 28.19604 16.8927 32.06938 10.41121 22.94331 17.25074 38.09022 -10.74629 29.88373 [,771] [,772] [,773] [,774] [,775] [,776] [,777] [,778] [,779] [1,] 23.90167 22.99974 19.72697 3.557221 15.61226 35.20224 19.56244 38.75702 17.6951 [,780] [,781] [,782] [,783] [,784] [,785] [,786] [,787] [1,] 
13.67582 37.04425 11.98628 -1.813733 -21.88131 2.751618 32.66003 36.1949 [,788] [,789] [,790] [,791] [,792] [,793] [,794] [,795] [,796] [1,] -40.39526 29.16073 23.06909 24.49586 15.19197 28.53426 38.7201 17.6364 33.05532 [,797] [,798] [,799] [,800] [,801] [,802] [,803] [,804] [1,] 16.48638 39.14081 20.07146 30.08067 21.84087 18.99062 -14.71014 -13.86334 [,805] [,806] [,807] [,808] [,809] [,810] [,811] [,812] [,813] [1,] 31.79787 23.08725 19.39451 23.52447 24.68634 -11.1324 20.05727 1.508482 27.49303 [,814] [,815] [,816] [,817] [,818] [,819] [,820] [,821] [1,] -15.39953 26.58987 16.92799 36.69849 -2.147935 6.874402 17.81895 -0.1526351 [,822] [,823] [,824] [,825] [,826] [,827] [,828] [,829] [1,] -4.663268 32.60587 15.48922 35.31802 17.13118 26.8183 -19.99068 -22.25762 [,830] [,831] [,832] [,833] [,834] [,835] [,836] [,837] [,838] [1,] -17.39798 -47.00745 37.21426 29.68683 29.07221 11.36659 11.7108 37.2217 -5.464489 [,839] [,840] [,841] [,842] [,843] [,844] [,845] [,846] [1,] 38.37217 -26.36088 22.30125 31.62431 -2.007138 6.575634 11.31978 39.15182 [,847] [,848] [,849] [,850] [,851] [,852] [,853] [,854] [1,] -19.33146 5.250051 37.47162 9.785576 -4.93801 -15.61497 22.57307 29.41549 [,855] [,856] [,857] [,858] [,859] [,860] [,861] [,862] [1,] -0.8397536 32.96376 23.38999 -25.74405 -20.60372 20.91718 34.86803 -31.61521 [,863] [,864] [,865] [,866] [,867] [,868] [,869] [,870] [,871] [1,] 27.60242 38.82283 1.482911 38.24677 22.19427 24.3146 -3.026044 18.813 -11.35892 [,872] [,873] [,874] [,875] [,876] [,877] [,878] [,879] [1,] -35.39758 34.1562 -4.310235 5.532996 25.90941 -42.62459 -33.21811 9.278782

[,880] [,881] [,882] [,883] [,884] [,885] [,886] [,887] [,888] [1,] -10.286 14.25465 23.70501 -8.903758 18.96384 35.38695 22.5964 24.10753 35.66059 [,889] [,890] [,891] [,892] [,893] [,894] [,895] [,896] [,897] [1,] 31.62639 36.73778 11.83581 24.50317 30.38097 10.76887 25.93239 21.77899 -17.83531 [,898] [,899] [,900] [,901] [,902] [,903] [,904] [,905] [,906] [1,] 30.05708 -42.8873 17.23801 22.06864 15.42775 26.24829 16.43025 20.23261 14.20674 [,907] [,908] [,909] [,910] [,911] [,912] [,913] [,914] [1,] 16.90132 24.57694 -10.91517 -22.15209 -0.2050415 15.33902 -2.554708 -6.493068 [,915] [,916] [,917] [,918] [,919] [,920] [,921] [,922] [,923] [1,] -9.393511 22.91488 17.38079 13.79285 32.42075 27.01597 4.93411 20.54395 22.43964 [,924] [,925] [,926] [,927] [,928] [,929] [,930] [,931] [1,] 24.24687 -22.51979 9.403874 21.74966 18.90234 -4.985459 26.74282 28.60688 [,932] [,933] [,934] [,935] [,936] [,937] [,938] [,939] [,940] [1,] 21.21157 36.97883 16.48643 21.60612 19.95054 9.352821 24.33682 -4.45221 -9.667934 [,941] [,942] [,943] [,944] [,945] [,946] [,947] [,948] [1,] 22.40643 29.60657 13.97122 38.31708 -25.62001 9.307399 -10.53841 -14.14341 [,949] [,950] [,951] [,952] [,953] [,954] [,955] [,956] [,957] [1,] 34.66362 21.88622 12.53602 19.73038 23.90455 23.08086 4.047239 23.6262 28.44745 [,958] [,959] [,960] [,961] [,962] [,963] [,964] [,965] [,966] [1,] 23.24147 34.89483 37.64118 12.63404 38.72637 -4.969913 19.69083 23.69448 14.4555 [,967] [,968] [,969] [,970] [,971] [,972] [,973] [,974] [,975] [1,] 5.105148 19.46503 33.11206 20.2006 23.83841 -33.56899 14.90093 38.17178 36.9415 [,976] [,977] [,978] [,979] [,980] [,981] [,982] [,983] [1,] -19.74697 23.82049 -16.36104 -22.23173 23.14513 28.89228 35.55705 23.57657 [,984] [,985] [,986] [,987] [,988] [,989] [,990] [,991] [,992] [1,] -10.32926 21.64584 35.7729 15.21957 5.671875 38.10004 37.48496 -30.42777 37.99629 [,993] [,994] [,995] [,996] [,997] [,998] [,999] [,1000] [1,] 14.88153 28.44732 13.13881 18.67816 24.23694 30.08263 38.23451 11.40975 $predicted.model$predicted.Y [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10] [,11] [,12] [,13] [,14] [,15] [1,] -1 -1 1 -1 -1 1 -1 -1 -1 -1 -1 1 -1 -1 -1 [,16] [,17] [,18] [,19] [,20] [,21] [,22] [,23] [,24] [,25] [,26] [,27] [,28] [1,] -1 -1 1 -1 -1 1 -1 -1 -1 -1 -1 -1 -1 [,29] [,30] [,31] [,32] [,33] [,34] [,35] [,36] [,37] [,38] [,39] [,40] [,41] [1,] -1 -1 -1 -1 -1 -1 1 -1 -1 1 -1 1 -1 [,42] [,43] [,44] [,45] [,46] [,47] [,48] [,49] [,50] [,51] [,52] [,53] [,54] [1,] -1 -1 1 -1 1 -1 -1 1 -1 1 1 -1 -1 [,55] [,56] [,57] [,58] [,59] [,60] [,61] [,62] [,63] [,64] [,65] [,66] [,67] [1,] -1 -1 -1 -1 -1 -1 -1 1 -1 1 -1 1 -1 [,68] [,69] [,70] [,71] [,72] [,73] [,74] [,75] [,76] [,77] [,78] [,79] [,80] [1,] -1 -1 -1 -1 -1 -1 1 1 1 -1 -1 -1 -1 [,81] [,82] [,83] [,84] [,85] [,86] [,87] [,88] [,89] [,90] [,91] [,92] [,93] [1,] -1 -1 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 [,94] [,95] [,96] [,97] [,98] [,99] [,100] [,101] [,102] [,103] [,104] [,105] [1,] -1 -1 -1 -1 1 1 -1 -1 -1 -1 -1 1 [,106] [,107] [,108] [,109] [,110] [,111] [,112] [,113] [,114] [,115] [,116] [1,] -1 -1 -1 1 -1 -1 -1 1 -1 -1 -1 [,117] [,118] [,119] [,120] [,121] [,122] [,123] [,124] [,125] [,126] [,127] [1,] -1 -1 -1 -1 -1 1 -1 1 -1 1 -1 [,128] [,129] [,130] [,131] [,132] [,133] [,134] [,135] [,136] [,137] [,138] [1,] 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 [,139] [,140] [,141] [,142] [,143] [,144] [,145] [,146] [,147] [,148] [,149] [1,] -1 -1 -1 -1 1 -1 -1 -1 -1 1 1 [,150] [,151] [,152] [,153] [,154] [,155] [,156] [,157] [,158] [,159] [,160] [1,] -1 -1 -1 -1 
1 -1 -1 -1 1 -1 -1 [,161] [,162] [,163] [,164] [,165] [,166] [,167] [,168] [,169] [,170] [,171] [1,] -1 -1 -1 -1 -1 -1 -1 1 -1 1 -1 [,172] [,173] [,174] [,175] [,176] [,177] [,178] [,179] [,180] [,181] [,182] [1,] -1 -1 -1 -1 1 -1 -1 -1 -1 -1 -1 [,183] [,184] [,185] [,186] [,187] [,188] [,189] [,190] [,191] [,192] [,193] [1,] -1 1 -1 -1 -1 1 -1 -1 1 -1 -1 [,194] [,195] [,196] [,197] [,198] [,199] [,200] [,201] [,202] [,203] [,204] [1,] -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 [,205] [,206] [,207] [,208] [,209] [,210] [,211] [,212] [,213] [,214] [,215] [1,] -1 1 -1 1 -1 -1 1 1 -1 -1 1 [,216] [,217] [,218] [,219] [,220] [,221] [,222] [,223] [,224] [,225] [,226] [1,] -1 -1 1 -1 -1 1 -1 -1 -1 1 -1 [,227] [,228] [,229] [,230] [,231] [,232] [,233] [,234] [,235] [,236] [,237] [1,] 1 -1 -1 1 -1 1 1 1 -1 -1 -1 [,238] [,239] [,240] [,241] [,242] [,243] [,244] [,245] [,246] [,247] [,248] [1,] -1 -1 1 -1 -1 1 -1 -1 -1 1 -1

[,249] [,250] [,251] [,252] [,253] [,254] [,255] [,256] [,257] [,258] [,259] [1,] 1 -1 1 1 -1 -1 -1 -1 -1 -1 1 [,260] [,261] [,262] [,263] [,264] [,265] [,266] [,267] [,268] [,269] [,270] [1,] -1 -1 1 -1 -1 -1 -1 -1 -1 1 -1 [,271] [,272] [,273] [,274] [,275] [,276] [,277] [,278] [,279] [,280] [,281] [1,] -1 1 1 -1 -1 -1 -1 -1 1 -1 -1 [,282] [,283] [,284] [,285] [,286] [,287] [,288] [,289] [,290] [,291] [,292] [1,] -1 -1 -1 1 -1 1 1 -1 -1 1 -1 [,293] [,294] [,295] [,296] [,297] [,298] [,299] [,300] [,301] [,302] [,303] [1,] 1 1 -1 1 -1 -1 1 -1 -1 1 -1 [,304] [,305] [,306] [,307] [,308] [,309] [,310] [,311] [,312] [,313] [,314] [1,] -1 -1 -1 -1 1 -1 1 1 1 -1 1 [,315] [,316] [,317] [,318] [,319] [,320] [,321] [,322] [,323] [,324] [,325] [1,] -1 1 -1 1 -1 -1 -1 1 -1 -1 -1 [,326] [,327] [,328] [,329] [,330] [,331] [,332] [,333] [,334] [,335] [,336] [1,] 1 -1 1 -1 -1 -1 -1 1 1 -1 -1 [,337] [,338] [,339] [,340] [,341] [,342] [,343] [,344] [,345] [,346] [,347] [1,] -1 -1 -1 -1 -1 1 -1 1 -1 -1 -1 [,348] [,349] [,350] [,351] [,352] [,353] [,354] [,355] [,356] [,357] [,358] [1,] -1 1 -1 1 -1 -1 -1 -1 -1 -1 -1 [,359] [,360] [,361] [,362] [,363] [,364] [,365] [,366] [,367] [,368] [,369] [1,] -1 1 -1 -1 -1 -1 -1 1 -1 1 1 [,370] [,371] [,372] [,373] [,374] [,375] [,376] [,377] [,378] [,379] [,380] [1,] -1 -1 -1 -1 -1 -1 1 -1 1 -1 -1 [,381] [,382] [,383] [,384] [,385] [,386] [,387] [,388] [,389] [,390] [,391] [1,] -1 -1 -1 -1 -1 -1 -1 -1 1 -1 -1 [,392] [,393] [,394] [,395] [,396] [,397] [,398] [,399] [,400] [,401] [,402] [1,] 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 [,403] [,404] [,405] [,406] [,407] [,408] [,409] [,410] [,411] [,412] [,413] [1,] -1 -1 -1 1 -1 1 -1 -1 1 -1 1 [,414] [,415] [,416] [,417] [,418] [,419] [,420] [,421] [,422] [,423] [,424] [1,] 1 1 1 1 -1 -1 1 1 -1 1 -1 [,425] [,426] [,427] [,428] [,429] [,430] [,431] [,432] [,433] [,434] [,435] [1,] 1 -1 -1 -1 1 1 -1 -1 -1 -1 1 [,436] [,437] [,438] [,439] [,440] [,441] [,442] [,443] [,444] [,445] [,446] [1,] -1 -1 1 1 1 1 -1 -1 -1 -1 -1 [,447] [,448] [,449] [,450] [,451] [,452] [,453] [,454] [,455] [,456] [,457] [1,] -1 1 -1 1 -1 -1 -1 1 1 -1 -1 [,458] [,459] [,460] [,461] [,462] [,463] [,464] [,465] [,466] [,467] [,468] [1,] -1 -1 1 -1 1 -1 -1 1 1 1 -1 [,469] [,470] [,471] [,472] [,473] [,474] [,475] [,476] [,477] [,478] [,479] [1,] -1 -1 1 -1 1 -1 -1 -1 -1 1 -1 [,480] [,481] [,482] [,483] [,484] [,485] [,486] [,487] [,488] [,489] [,490] [1,] -1 1 -1 -1 -1 -1 -1 -1 -1 1 1 [,491] [,492] [,493] [,494] [,495] [,496] [,497] [,498] [,499] [,500] [,501] [1,] -1 1 -1 -1 1 -1 1 1 -1 -1 1 [,502] [,503] [,504] [,505] [,506] [,507] [,508] [,509] [,510] [,511] [,512] [1,] -1 1 -1 -1 1 1 1 1 1 1 1 [,513] [,514] [,515] [,516] [,517] [,518] [,519] [,520] [,521] [,522] [,523] [1,] 1 1 1 1 1 1 1 1 1 -1 1 [,524] [,525] [,526] [,527] [,528] [,529] [,530] [,531] [,532] [,533] [,534] [1,] 1 1 1 1 1 1 1 1 1 1 1 [,535] [,536] [,537] [,538] [,539] [,540] [,541] [,542] [,543] [,544] [,545] [1,] 1 1 1 1 -1 1 1 -1 1 1 1 [,546] [,547] [,548] [,549] [,550] [,551] [,552] [,553] [,554] [,555] [,556] [1,] 1 -1 1 1 1 1 1 1 1 1 1 [,557] [,558] [,559] [,560] [,561] [,562] [,563] [,564] [,565] [,566] [,567] [1,] 1 1 1 -1 1 1 1 1 1 1 1 [,568] [,569] [,570] [,571] [,572] [,573] [,574] [,575] [,576] [,577] [,578] [1,] 1 1 1 1 1 1 1 1 1 1 -1 [,579] [,580] [,581] [,582] [,583] [,584] [,585] [,586] [,587] [,588] [,589] [1,] 1 1 1 1 1 -1 1 -1 1 1 1 [,590] [,591] [,592] [,593] [,594] [,595] [,596] [,597] [,598] [,599] [,600] [1,] 1 1 -1 1 1 -1 1 1 1 1 -1 [,601] [,602] [,603] [,604] [,605] 
[,606] [,607] [,608] [,609] [,610] [,611] [1,] 1 1 1 -1 1 1 -1 1 -1 1 1 [,612] [,613] [,614] [,615] [,616] [,617] [,618] [,619] [,620] [,621] [,622] [1,] 1 1 1 -1 1 1 1 1 -1 1 -1 [,623] [,624] [,625] [,626] [,627] [,628] [,629] [,630] [,631] [,632] [,633] [1,] 1 1 -1 1 1 1 1 -1 1 1 1 [,634] [,635] [,636] [,637] [,638] [,639] [,640] [,641] [,642] [,643] [,644] [1,] -1 -1 1 1 1 1 1 -1 1 1 1

[,645] [,646] [,647] [,648] [,649] [,650] [,651] [,652] [,653] [,654] [,655] [1,] -1 1 1 1 1 1 1 1 1 -1 1 [,656] [,657] [,658] [,659] [,660] [,661] [,662] [,663] [,664] [,665] [,666] [1,] 1 1 1 1 -1 1 1 -1 1 1 -1 [,667] [,668] [,669] [,670] [,671] [,672] [,673] [,674] [,675] [,676] [,677] [1,] 1 1 -1 1 1 1 -1 1 1 1 1 [,678] [,679] [,680] [,681] [,682] [,683] [,684] [,685] [,686] [,687] [,688] [1,] 1 1 1 1 1 1 1 1 1 1 1 [,689] [,690] [,691] [,692] [,693] [,694] [,695] [,696] [,697] [,698] [,699] [1,] 1 1 -1 1 1 1 1 -1 -1 1 1 [,700] [,701] [,702] [,703] [,704] [,705] [,706] [,707] [,708] [,709] [,710] [1,] 1 1 1 1 1 1 1 1 1 -1 1 [,711] [,712] [,713] [,714] [,715] [,716] [,717] [,718] [,719] [,720] [,721] [1,] 1 1 1 -1 1 1 -1 -1 1 1 1 [,722] [,723] [,724] [,725] [,726] [,727] [,728] [,729] [,730] [,731] [,732] [1,] 1 -1 -1 1 1 1 1 1 1 1 1 [,733] [,734] [,735] [,736] [,737] [,738] [,739] [,740] [,741] [,742] [,743] [1,] -1 1 1 -1 1 1 1 1 1 1 1 [,744] [,745] [,746] [,747] [,748] [,749] [,750] [,751] [,752] [,753] [,754] [1,] 1 1 1 1 1 1 1 1 1 1 1 [,755] [,756] [,757] [,758] [,759] [,760] [,761] [,762] [,763] [,764] [,765] [1,] 1 1 1 1 1 1 1 1 1 1 1 [,766] [,767] [,768] [,769] [,770] [,771] [,772] [,773] [,774] [,775] [,776] [1,] 1 1 1 -1 1 1 1 1 1 1 1 [,777] [,778] [,779] [,780] [,781] [,782] [,783] [,784] [,785] [,786] [,787] [1,] 1 1 1 1 1 1 -1 -1 1 1 1 [,788] [,789] [,790] [,791] [,792] [,793] [,794] [,795] [,796] [,797] [,798] [1,] -1 1 1 1 1 1 1 1 1 1 1 [,799] [,800] [,801] [,802] [,803] [,804] [,805] [,806] [,807] [,808] [,809] [1,] 1 1 1 1 -1 -1 1 1 1 1 1 [,810] [,811] [,812] [,813] [,814] [,815] [,816] [,817] [,818] [,819] [,820] [1,] -1 1 1 1 -1 1 1 1 -1 1 1 [,821] [,822] [,823] [,824] [,825] [,826] [,827] [,828] [,829] [,830] [,831] [1,] -1 -1 1 1 1 1 1 -1 -1 -1 -1 [,832] [,833] [,834] [,835] [,836] [,837] [,838] [,839] [,840] [,841] [,842] [1,] 1 1 1 1 1 1 -1 1 -1 1 1 [,843] [,844] [,845] [,846] [,847] [,848] [,849] [,850] [,851] [,852] [,853] [1,] -1 1 1 1 -1 1 1 1 -1 -1 1 [,854] [,855] [,856] [,857] [,858] [,859] [,860] [,861] [,862] [,863] [,864] [1,] 1 -1 1 1 -1 -1 1 1 -1 1 1 [,865] [,866] [,867] [,868] [,869] [,870] [,871] [,872] [,873] [,874] [,875] [1,] 1 1 1 1 -1 1 -1 -1 1 -1 1 [,876] [,877] [,878] [,879] [,880] [,881] [,882] [,883] [,884] [,885] [,886] [1,] 1 -1 -1 1 -1 1 1 -1 1 1 1 [,887] [,888] [,889] [,890] [,891] [,892] [,893] [,894] [,895] [,896] [,897] [1,] 1 1 1 1 1 1 1 1 1 1 -1 [,898] [,899] [,900] [,901] [,902] [,903] [,904] [,905] [,906] [,907] [,908] [1,] 1 -1 1 1 1 1 1 1 1 1 1 [,909] [,910] [,911] [,912] [,913] [,914] [,915] [,916] [,917] [,918] [,919] [1,] -1 -1 -1 1 -1 -1 -1 1 1 1 1 [,920] [,921] [,922] [,923] [,924] [,925] [,926] [,927] [,928] [,929] [,930] [1,] 1 1 1 1 1 -1 1 1 1 -1 1 [,931] [,932] [,933] [,934] [,935] [,936] [,937] [,938] [,939] [,940] [,941] [1,] 1 1 1 1 1 1 1 1 -1 -1 1 [,942] [,943] [,944] [,945] [,946] [,947] [,948] [,949] [,950] [,951] [,952] [1,] 1 1 1 -1 1 -1 -1 1 1 1 1 [,953] [,954] [,955] [,956] [,957] [,958] [,959] [,960] [,961] [,962] [,963] [1,] 1 1 1 1 1 1 1 1 1 1 -1 [,964] [,965] [,966] [,967] [,968] [,969] [,970] [,971] [,972] [,973] [,974] [1,] 1 1 1 1 1 1 1 1 -1 1 1 [,975] [,976] [,977] [,978] [,979] [,980] [,981] [,982] [,983] [,984] [,985] [1,] 1 -1 1 -1 -1 1 1 1 1 -1 1 [,986] [,987] [,988] [,989] [,990] [,991] [,992] [,993] [,994] [,995] [,996] [1,] 1 1 1 1 1 -1 1 1 1 1 1 [,997] [,998] [,999] [,1000] [1,] 1 1 1 1 $predicted.model$error.rate [1] 0.234
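The error.rate component at the top of the boo output tracks the error over the B = 50 boosting iterations: it falls from 0.344 and then settles around 0.231-0.236. A quick way to look at that path with base R, reusing the boo object from the call above:

> plot(boo$error.rate, type = "b",
+      xlab = "boosting iteration", ylab = "error rate",
+      main = "wsvm.boost error rate by iteration")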

wsvm.boost
> # generate a simulation data set using mixture example(page 17, Friedman et al. 2000)
> svm.data <- simul.wsvm(set.seeds = 123)
> X <- svm.data$X
> Y <- svm.data$Y
> new.X <- svm.data$new.X
> new.Y <- svm.data$new.Y
> # run Weighted K-means clustering SVM with boosting algorithm
> model <- wsvm(X, Y, c.n = rep(1/ length(Y), length(Y)))
> # predict the model and compute an error rate.
> pred <- wsvm.predict(X, Y, new.X, new.Y, model)
> Error.rate(pred$predicted.Y, Y)
[1] 2.365
> # add boost algorithm
> boo <- wsvm.boost(X, Y, new.X, new.Y, c.n = rep(1 / length(Y), length(Y)),
+     B = 50, kernel.type = list(type = "rbf", par = 0.5), C = 4,
+     eps = 1e-10, plotting = TRUE)
> boo
$error.rate
 [1] 0.344 0.234 0.231 0.269 0.234 0.231 0.259 0.234 0.231 0.252 0.232 0.231 0.245
[14] 0.232 0.231 0.241 0.231 0.231 0.239 0.231 0.231 0.239 0.231 0.231 0.239 0.231
[27] 0.231 0.237 0.231 0.231 0.237 0.231 0.231 0.237 0.231 0.231 0.237 0.231 0.231
[40] 0.236 0.231 0.231 0.235 0.231 0.231 0.235 0.231 0.231 0.235 0.231

$predicted.model
$predicted.model$predicted.values
[,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [1,] -16.27017 -39.08044 0.944282 -1.789679 -36.89359 21.17785 -11.34715 -40.0761 [,9] [,10] [,11] [,12] [,13] [,14] [,15] [,16] [1,] -28.99057 -5.516065 -21.77547 10.80725 -43.71525 -30.27151 -50.15125 -47.84813 [,17] [,18] [,19] [,20] [,21] [,22] [,23] [,24]

[1,] -8.801083 28.84544 -17.01454 -22.59571 10.20134 -40.56534 -36.93977 -24.75309 [,25] [,26] [,27] [,28] [,29] [,30] [,31] [,32] [1,] -36.83225 -49.27991 -34.96316 -2.207464 -3.229874 -37.81128 -40.12909 -35.76373 [,33] [,34] [,35] [,36] [,37] [,38] [,39] [,40] [1,] -21.92024 -45.98178 30.5596 -10.69508 -32.76042 3.999418 -19.59183 15.18728 [,41] [,42] [,43] [,44] [,45] [,46] [,47] [,48] [1,] -24.63925 -5.731251 -14.10618 0.7742819 -16.62675 3.58908 -1.128301 -14.94297 [,49] [,50] [,51] [,52] [,53] [,54] [,55] [,56] [1,] 10.74508 -38.03607 36.39655 4.741127 -10.71515 -21.491 -22.7896 -9.612561 [,57] [,58] [,59] [,60] [,61] [,62] [,63] [,64] [1,] -38.41606 -38.59449 -2.183832 -15.02233 -35.18771 27.26006 -14.48672 18.8643 [,65] [,66] [,67] [,68] [,69] [,70] [,71] [,72] [1,] -39.41475 5.502769 -10.25952 -5.411607 -45.34746 -10.84983 -33.7641 -23.24468 [,73] [,74] [,75] [,76] [,77] [,78] [,79] [,80] [1,] -41.28699 37.75664 10.44359 22.71841 -30.97524 -39.62491 -8.725992 -42.44628 [,81] [,82] [,83] [,84] [,85] [,86] [,87] [,88] [1,] -16.446 -32.84528 10.89038 -49.85043 -47.57536 -8.22942 -7.5087 -32.98926 [,89] [,90] [,91] [,92] [,93] [,94] [,95] [,96] [1,] -0.2450541 -32.00351 -39.03697 -40.77949 -44.51566 -38.68644 -30.41145 -42.50654 [,97] [,98] [,99] [,100] [,101] [,102] [,103] [,104] [1,] -45.62597 19.22893 18.79446 -18.14004 -14.38503 -18.45792 -4.441998 -31.82107 [,105] [,106] [,107] [,108] [,109] [,110] [,111] [,112] [1,] 14.39378 -26.44394 -34.62355 -11.74433 7.984242 -30.88617 -39.52105 -3.691885 [,113] [,114] [,115] [,116] [,117] [,118] [,119] [,120] [1,] 31.38765 -16.81146 -26.67792 -37.91711 -16.68856 -40.03532 -4.355588 -13.78785 [,121] [,122] [,123] [,124] [,125] [,126] [,127] [,128] [1,] -24.03291 13.46545 -16.85368 29.41382 -22.37301 1.482483 -33.98964 19.59878 [,129] [,130] [,131] [,132] [,133] [,134] [,135] [,136] [1,] -43.33995 -42.3705 -30.57115 -42.75856 -30.21792 -42.12152 -5.648007 -12.00956 [,137] [,138] [,139] [,140] [,141] [,142] [,143] [,144] [1,] -26.54443 -34.8331 -2.831847 -5.435084 -33.01808 -6.259952 22.39395 -2.856443 [,145] [,146] [,147] [,148] [,149] [,150] [,151] [,152] [1,] -37.53853 -2.491137 -6.839391 34.59299 23.50693 -35.06248 -23.07989 -9.042504 [,153] [,154] [,155] [,156] [,157] [,158] [,159] [,160] [1,] -3.870491 13.2322 -24.5943 -36.0302 -6.097423 2.75249 -12.26365 -4.151632 [,161] [,162] [,163] [,164] [,165] [,166] [,167] [,168] [1,] -32.74158 -29.53172 -38.31893 -45.12027 -4.458359 -15.61552 -42.93104 21.78892 [,169] [,170] [,171] [,172] [,173] [,174] [,175] [,176] [1,] -15.55209 32.74605 -38.74027 -27.36557 -8.686813 -42.08723 -34.50029 0.5652003 [,177] [,178] [,179] [,180] [,181] [,182] [,183] [,184] [1,] -5.033727 -5.593206 -35.73856 -14.34853 -14.72405 -13.6193 -41.74765 21.64528 [,185] [,186] [,187] [,188] [,189] [,190] [,191] [,192] [1,] -6.737482 -5.32799 -16.222 10.3222 -15.01938 -28.11268 18.97572 -37.09884 [,193] [,194] [,195] [,196] [,197] [,198] [,199] [,200] [1,] -0.2348997 -10.97634 -32.82672 -2.659103 -2.431912 -40.61066 -32.37985 -2.06211 [,201] [,202] [,203] [,204] [,205] [,206] [,207] [,208] [1,] -6.845711 -32.61925 -20.38674 -32.01772 -3.5023 18.80173 -37.22341 22.90405 [,209] [,210] [,211] [,212] [,213] [,214] [,215] [,216] [1,] -20.72926 -3.741356 4.668957 26.62816 -15.9397 -34.0794 34.72886 -28.741 [,217] [,218] [,219] [,220] [,221] [,222] [,223] [,224] [1,] -43.28544 4.504605 -8.297369 -13.13231 5.048992 -34.24613 -40.25946 -21.26734 [,225] [,226] [,227] [,228] [,229] [,230] [,231] [,232] [1,] 30.92008 -41.63936 9.878261 
-6.309147 -35.98381 1.349449 -49.65622 12.25409 [,233] [,234] [,235] [,236] [,237] [,238] [,239] [,240] [1,] 3.337962 28.91054 -15.93327 -36.39339 -15.57161 -21.66869 -45.62205 10.10291 [,241] [,242] [,243] [,244] [,245] [,246] [,247] [,248] [1,] -20.2071 -30.43712 30.36746 -34.14014 -15.11688 -31.28756 5.531549 -28.08313 [,249] [,250] [,251] [,252] [,253] [,254] [,255] [,256] [1,] 12.62885 -50.21185 4.871637 18.90673 -19.76252 -13.65875 -21.81564 -31.24732 [,257] [,258] [,259] [,260] [,261] [,262] [,263] [,264] [1,] -9.155318 -8.3661 6.384394 -36.58165 -4.182963 24.19121 -39.87982 -33.75436 [,265] [,266] [,267] [,268] [,269] [,270] [,271] [,272] [1,] -36.21607 -0.4441701 -27.12935 -14.53268 23.09981 -14.49133 -39.56742 29.63706 [,273] [,274] [,275] [,276] [,277] [,278] [,279] [,280] [1,] 22.5989 -30.6448 -7.887724 -9.412909 -28.69099 -6.489172 35.41012 -39.45576 [,281] [,282] [,283] [,284] [,285] [,286] [,287] [,288] [1,] -46.62317 -13.61384 -25.93487 -12.97862 17.20281 -7.313286 31.99024 10.77867 [,289] [,290] [,291] [,292] [,293] [,294] [,295] [,296] [1,] -5.169043 -34.64455 14.91441 -45.29851 24.69178 11.60096 -47.23122 4.466546 [,297] [,298] [,299] [,300] [,301] [,302] [,303] [,304] [1,] -13.90783 -49.55747 30.52947 -22.86797 -22.39481 29.0996 -42.53411 -31.87113 [,305] [,306] [,307] [,308] [,309] [,310] [,311] [,312]

[1,] -30.5347 -12.16539 -39.55768 32.25857 -12.56304 8.865682 6.093298 5.800194 [,313] [,314] [,315] [,316] [,317] [,318] [,319] [,320] [1,] -47.81803 36.12897 -44.87239 9.765551 -21.90507 26.53808 -15.10975 -50.1588 [,321] [,322] [,323] [,324] [,325] [,326] [,327] [,328] [1,] -23.13055 31.73821 -43.09306 -24.62054 -43.00199 22.75016 -15.18862 18.13613 [,329] [,330] [,331] [,332] [,333] [,334] [,335] [,336] [1,] -15.319 -37.62433 -38.46493 -10.12078 1.358513 33.20828 -29.35786 -9.902488 [,337] [,338] [,339] [,340] [,341] [,342] [,343] [,344] [1,] -15.57518 -15.66524 -28.93112 -7.414679 -3.345479 36.29042 -16.02426 20.47514 [,345] [,346] [,347] [,348] [,349] [,350] [,351] [,352] [1,] -12.01237 -42.79668 -30.23974 -48.12329 16.62334 -34.98338 17.73377 -40.11143 [,353] [,354] [,355] [,356] [,357] [,358] [,359] [,360] [1,] -40.39592 -47.06595 -35.76414 -33.32677 -4.947772 -6.304314 -28.61886 30.40981 [,361] [,362] [,363] [,364] [,365] [,366] [,367] [,368] [1,] -3.301689 -40.81905 -22.90512 -20.91246 -31.14049 9.460041 -40.74746 38.76421 [,369] [,370] [,371] [,372] [,373] [,374] [,375] [,376] [1,] 4.95824 -0.6292399 -1.02733 -1.239114 -49.46741 -6.082582 -21.63832 8.03131 [,377] [,378] [,379] [,380] [,381] [,382] [,383] [,384] [1,] -8.890935 8.433609 -24.94095 -32.39953 -20.89348 -38.71237 -15.89478 -15.30998 [,385] [,386] [,387] [,388] [,389] [,390] [,391] [,392] [1,] -27.61351 -25.1533 -21.98088 -12.8123 20.95872 -10.86229 -23.34211 29.08745 [,393] [,394] [,395] [,396] [,397] [,398] [,399] [,400] [1,] -23.00144 -8.952241 -21.04443 -16.12636 -3.889586 -48.44275 -34.70085 -26.48346 [,401] [,402] [,403] [,404] [,405] [,406] [,407] [,408] [1,] -38.74656 -49.89017 -14.3927 -30.30111 -40.78396 12.45615 -46.8209 36.36159 [,409] [,410] [,411] [,412] [,413] [,414] [,415] [,416] [,417] [1,] -48.04399 -32.02221 19.76371 -27.1932 22.70623 1.250056 5.519071 3.21088 10.12557 [,418] [,419] [,420] [,421] [,422] [,423] [,424] [,425] [1,] -11.32429 -7.106856 30.11756 8.153624 -28.50993 0.2817419 -33.97924 3.947631 [,426] [,427] [,428] [,429] [,430] [,431] [,432] [,433] [1,] -48.26046 -26.58674 -31.08463 29.72648 9.702576 -15.49638 -2.06291 -18.12061 [,434] [,435] [,436] [,437] [,438] [,439] [,440] [,441] [1,] -14.31856 13.52658 -36.22982 -49.62361 1.325244 37.0753 22.72696 29.34957 [,442] [,443] [,444] [,445] [,446] [,447] [,448] [,449] [1,] -10.28714 -12.32955 -48.73796 -42.86785 -45.69099 -41.86658 23.28913 -33.73909 [,450] [,451] [,452] [,453] [,454] [,455] [,456] [,457] [1,] 4.075928 -20.41009 -10.8528 -8.185417 35.7306 29.46309 -44.56059 -16.40437 [,458] [,459] [,460] [,461] [,462] [,463] [,464] [,465] [1,] -50.15013 -43.55205 9.859921 -2.191472 18.35806 -7.335841 -14.70479 5.816835 [,466] [,467] [,468] [,469] [,470] [,471] [,472] [,473] [1,] 21.0722 12.42268 -17.87657 -15.20007 -43.50467 24.18484 -40.92547 2.864531 [,474] [,475] [,476] [,477] [,478] [,479] [,480] [,481] [1,] -17.97937 -22.60705 -1.578097 -28.70601 37.49023 -15.22753 -46.39693 17.27119 [,482] [,483] [,484] [,485] [,486] [,487] [,488] [,489] [1,] -37.94433 -35.56422 -34.39671 -2.82519 -49.52225 -43.55196 -38.09036 1.759418 [,490] [,491] [,492] [,493] [,494] [,495] [,496] [,497] [,498] [1,] 24.53 -2.516211 19.73235 -14.87989 -41.22648 38.41699 -24.93667 12.0291 16.8544 [,499] [,500] [,501] [,502] [,503] [,504] [,505] [,506] [1,] -31.43683 -44.18605 10.49844 -3.653047 36.3897 -45.06542 -34.77028 27.4498 [,507] [,508] [,509] [,510] [,511] [,512] [,513] [,514] [,515] [1,] 13.32931 14.79214 19.94005 18.18704 27.52435 36.35196 4.198445 22.90299 
33.18769 [,516] [,517] [,518] [,519] [,520] [,521] [,522] [,523] [,524] [1,] 17.17727 37.14816 12.13658 9.727201 25.20227 18.53947 -1.867508 31.19432 8.533474 [,525] [,526] [,527] [,528] [,529] [,530] [,531] [,532] [,533] [1,] 12.43035 4.896552 23.15998 22.752 20.48538 11.70733 21.00109 13.78528 18.44805 [,534] [,535] [,536] [,537] [,538] [,539] [,540] [,541] [,542] [1,] 15.50344 18.8001 11.9329 25.73658 19.59609 -3.179318 20.51091 0.1112887 -13.82125 [,543] [,544] [,545] [,546] [,547] [,548] [,549] [,550] [,551] [1,] 7.181408 38.34286 36.43024 21.9822 -35.90049 7.894236 36.17372 14.71083 25.9818 [,552] [,553] [,554] [,555] [,556] [,557] [,558] [,559] [,560] [1,] 37.96735 3.593774 17.46515 16.78559 12.18323 9.077706 7.27024 14.91881 -3.014877 [,561] [,562] [,563] [,564] [,565] [,566] [,567] [,568] [,569] [1,] 4.821116 21.36031 38.29708 28.75401 17.79705 6.217101 4.045773 20.365 12.29597 [,570] [,571] [,572] [,573] [,574] [,575] [,576] [,577] [,578] [1,] 35.60256 8.784281 22.37672 13.78731 26.18812 23.5776 16.96164 30.23636 -0.6930769 [,579] [,580] [,581] [,582] [,583] [,584] [,585] [,586] [1,] 17.93865 23.26609 38.59238 19.79008 18.71282 -12.08734 24.52556 -26.64903 [,587] [,588] [,589] [,590] [,591] [,592] [,593] [,594] [,595] [1,] 36.76701 19.70671 21.65721 12.3788 6.52507 -39.34199 23.43075 26.71606 -19.97812 [,596] [,597] [,598] [,599] [,600] [,601] [,602] [,603] [1,] 22.12468 17.93311 31.70402 19.79186 -23.41259 29.52507 26.67675 34.91353 [,604] [,605] [,606] [,607] [,608] [,609] [,610] [,611] [,612]

[1,] -4.143804 14.1254 1.858325 -18.74358 15.5934 -9.852869 21.69862 12.09698 19.57911 [,613] [,614] [,615] [,616] [,617] [,618] [,619] [,620] [,621] [1,] 24.02243 22.63789 -2.730113 32.90228 12.69182 25.94243 23.29236 -40.7069 22.47083 [,622] [,623] [,624] [,625] [,626] [,627] [,628] [,629] [1,] -25.83527 37.05957 3.702439 -45.05636 33.16777 36.72567 3.638794 24.96429 [,630] [,631] [,632] [,633] [,634] [,635] [,636] [,637] [1,] -27.41948 36.21322 11.91989 37.88044 -2.273708 -21.66729 39.26591 22.51497 [,638] [,639] [,640] [,641] [,642] [,643] [,644] [,645] [,646] [1,] 28.84065 26.6654 36.25212 -0.6032254 16.66168 15.5364 18.90488 -8.187162 8.059558 [,647] [,648] [,649] [,650] [,651] [,652] [,653] [,654] [,655] [1,] 26.65652 11.93187 13.35339 19.23556 31.65853 21.19456 33.88522 -14.09402 35.86381 [,656] [,657] [,658] [,659] [,660] [,661] [,662] [,663] [,664] [1,] 17.25389 24.00386 11.21003 13.08404 -11.76037 22.67059 14.68572 -3.26139 34.00052 [,665] [,666] [,667] [,668] [,669] [,670] [,671] [,672] [1,] 38.70603 -39.98927 23.52405 22.36081 -7.43897 30.36959 0.03302824 13.76879 [,673] [,674] [,675] [,676] [,677] [,678] [,679] [,680] [,681] [1,] -21.08371 29.55286 35.82308 19.27472 4.704699 37.22923 9.393078 21.20459 31.71435 [,682] [,683] [,684] [,685] [,686] [,687] [,688] [,689] [,690] [1,] 28.2905 13.63596 17.02169 23.7448 14.52213 14.43843 13.177 38.68759 27.92376 [,691] [,692] [,693] [,694] [,695] [,696] [,697] [,698] [,699] [1,] -19.52662 16.57603 15.12958 13.7219 38.42782 -6.009268 -10.3505 21.05822 20.80028 [,700] [,701] [,702] [,703] [,704] [,705] [,706] [,707] [,708] [1,] 21.42977 21.7976 39.00684 15.92463 14.01444 19.89942 7.605778 5.368272 25.24063 [,709] [,710] [,711] [,712] [,713] [,714] [,715] [,716] [,717] [1,] -18.04013 31.12 20.77967 9.813607 10.2344 -11.78832 27.70098 27.18736 -37.46273 [,718] [,719] [,720] [,721] [,722] [,723] [,724] [,725] [1,] -26.57351 22.47264 6.794077 36.68309 2.360507 -8.153156 -9.065619 6.080867 [,726] [,727] [,728] [,729] [,730] [,731] [,732] [,733] [,734] [1,] 37.12582 16.11685 9.450515 36.60776 12.60635 20.93796 37.30389 -2.394863 22.0336 [,735] [,736] [,737] [,738] [,739] [,740] [,741] [,742] [,743] [1,] 10.45094 -42.62139 7.195446 23.00916 30.05726 17.99331 21.5049 22.34627 19.57441 [,744] [,745] [,746] [,747] [,748] [,749] [,750] [,751] [,752] [1,] 14.58497 13.70328 12.91423 34.81896 9.394748 31.57066 23.5416 23.3473 32.66427 [,753] [,754] [,755] [,756] [,757] [,758] [,759] [,760] [,761] [1,] 14.43614 3.505382 7.452814 18.08976 13.49042 23.59752 4.70188 36.72075 17.4993 [,762] [,763] [,764] [,765] [,766] [,767] [,768] [,769] [,770] [1,] 28.19604 16.8927 32.06938 10.41121 22.94331 17.25074 38.09022 -10.74629 29.88373 [,771] [,772] [,773] [,774] [,775] [,776] [,777] [,778] [,779] [1,] 23.90167 22.99974 19.72697 3.557221 15.61226 35.20224 19.56244 38.75702 17.6951 [,780] [,781] [,782] [,783] [,784] [,785] [,786] [,787] [1,] 13.67582 37.04425 11.98628 -1.813733 -21.88131 2.751618 32.66003 36.1949 [,788] [,789] [,790] [,791] [,792] [,793] [,794] [,795] [,796] [1,] -40.39526 29.16073 23.06909 24.49586 15.19197 28.53426 38.7201 17.6364 33.05532 [,797] [,798] [,799] [,800] [,801] [,802] [,803] [,804] [1,] 16.48638 39.14081 20.07146 30.08067 21.84087 18.99062 -14.71014 -13.86334 [,805] [,806] [,807] [,808] [,809] [,810] [,811] [,812] [,813] [1,] 31.79787 23.08725 19.39451 23.52447 24.68634 -11.1324 20.05727 1.508482 27.49303 [,814] [,815] [,816] [,817] [,818] [,819] [,820] [,821] [1,] -15.39953 26.58987 16.92799 36.69849 -2.147935 6.874402 17.81895 
-0.1526351 [,822] [,823] [,824] [,825] [,826] [,827] [,828] [,829] [1,] -4.663268 32.60587 15.48922 35.31802 17.13118 26.8183 -19.99068 -22.25762 [,830] [,831] [,832] [,833] [,834] [,835] [,836] [,837] [,838] [1,] -17.39798 -47.00745 37.21426 29.68683 29.07221 11.36659 11.7108 37.2217 -5.464489 [,839] [,840] [,841] [,842] [,843] [,844] [,845] [,846] [1,] 38.37217 -26.36088 22.30125 31.62431 -2.007138 6.575634 11.31978 39.15182 [,847] [,848] [,849] [,850] [,851] [,852] [,853] [,854] [1,] -19.33146 5.250051 37.47162 9.785576 -4.93801 -15.61497 22.57307 29.41549 [,855] [,856] [,857] [,858] [,859] [,860] [,861] [,862] [1,] -0.8397536 32.96376 23.38999 -25.74405 -20.60372 20.91718 34.86803 -31.61521 [,863] [,864] [,865] [,866] [,867] [,868] [,869] [,870] [,871] [1,] 27.60242 38.82283 1.482911 38.24677 22.19427 24.3146 -3.026044 18.813 -11.35892 [,872] [,873] [,874] [,875] [,876] [,877] [,878] [,879] [1,] -35.39758 34.1562 -4.310235 5.532996 25.90941 -42.62459 -33.21811 9.278782 [,880] [,881] [,882] [,883] [,884] [,885] [,886] [,887] [,888] [1,] -10.286 14.25465 23.70501 -8.903758 18.96384 35.38695 22.5964 24.10753 35.66059 [,889] [,890] [,891] [,892] [,893] [,894] [,895] [,896] [,897] [1,] 31.62639 36.73778 11.83581 24.50317 30.38097 10.76887 25.93239 21.77899 -17.83531 [,898] [,899] [,900] [,901] [,902] [,903] [,904] [,905] [,906] [1,] 30.05708 -42.8873 17.23801 22.06864 15.42775 26.24829 16.43025 20.23261 14.20674 [,907] [,908] [,909] [,910] [,911] [,912] [,913] [,914] [1,] 16.90132 24.57694 -10.91517 -22.15209 -0.2050415 15.33902 -2.554708 -6.493068 [,915] [,916] [,917] [,918] [,919] [,920] [,921] [,922] [,923]

[1,] -9.393511 22.91488 17.38079 13.79285 32.42075 27.01597 4.93411 20.54395 22.43964 [,924] [,925] [,926] [,927] [,928] [,929] [,930] [,931] [1,] 24.24687 -22.51979 9.403874 21.74966 18.90234 -4.985459 26.74282 28.60688 [,932] [,933] [,934] [,935] [,936] [,937] [,938] [,939] [,940] [1,] 21.21157 36.97883 16.48643 21.60612 19.95054 9.352821 24.33682 -4.45221 -9.667934 [,941] [,942] [,943] [,944] [,945] [,946] [,947] [,948] [1,] 22.40643 29.60657 13.97122 38.31708 -25.62001 9.307399 -10.53841 -14.14341 [,949] [,950] [,951] [,952] [,953] [,954] [,955] [,956] [,957] [1,] 34.66362 21.88622 12.53602 19.73038 23.90455 23.08086 4.047239 23.6262 28.44745 [,958] [,959] [,960] [,961] [,962] [,963] [,964] [,965] [,966] [1,] 23.24147 34.89483 37.64118 12.63404 38.72637 -4.969913 19.69083 23.69448 14.4555 [,967] [,968] [,969] [,970] [,971] [,972] [,973] [,974] [,975] [1,] 5.105148 19.46503 33.11206 20.2006 23.83841 -33.56899 14.90093 38.17178 36.9415 [,976] [,977] [,978] [,979] [,980] [,981] [,982] [,983] [1,] -19.74697 23.82049 -16.36104 -22.23173 23.14513 28.89228 35.55705 23.57657 [,984] [,985] [,986] [,987] [,988] [,989] [,990] [,991] [,992] [1,] -10.32926 21.64584 35.7729 15.21957 5.671875 38.10004 37.48496 -30.42777 37.99629 [,993] [,994] [,995] [,996] [,997] [,998] [,999] [,1000] [1,] 14.88153 28.44732 13.13881 18.67816 24.23694 30.08263 38.23451 11.40975 $predicted.model$predicted.Y [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10] [,11] [,12] [,13] [,14] [,15] [1,] -1 -1 1 -1 -1 1 -1 -1 -1 -1 -1 1 -1 -1 -1 [,16] [,17] [,18] [,19] [,20] [,21] [,22] [,23] [,24] [,25] [,26] [,27] [,28] [1,] -1 -1 1 -1 -1 1 -1 -1 -1 -1 -1 -1 -1 [,29] [,30] [,31] [,32] [,33] [,34] [,35] [,36] [,37] [,38] [,39] [,40] [,41] [1,] -1 -1 -1 -1 -1 -1 1 -1 -1 1 -1 1 -1 [,42] [,43] [,44] [,45] [,46] [,47] [,48] [,49] [,50] [,51] [,52] [,53] [,54] [1,] -1 -1 1 -1 1 -1 -1 1 -1 1 1 -1 -1 [,55] [,56] [,57] [,58] [,59] [,60] [,61] [,62] [,63] [,64] [,65] [,66] [,67] [1,] -1 -1 -1 -1 -1 -1 -1 1 -1 1 -1 1 -1 [,68] [,69] [,70] [,71] [,72] [,73] [,74] [,75] [,76] [,77] [,78] [,79] [,80] [1,] -1 -1 -1 -1 -1 -1 1 1 1 -1 -1 -1 -1 [,81] [,82] [,83] [,84] [,85] [,86] [,87] [,88] [,89] [,90] [,91] [,92] [,93] [1,] -1 -1 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 [,94] [,95] [,96] [,97] [,98] [,99] [,100] [,101] [,102] [,103] [,104] [,105] [1,] -1 -1 -1 -1 1 1 -1 -1 -1 -1 -1 1 [,106] [,107] [,108] [,109] [,110] [,111] [,112] [,113] [,114] [,115] [,116] [1,] -1 -1 -1 1 -1 -1 -1 1 -1 -1 -1 [,117] [,118] [,119] [,120] [,121] [,122] [,123] [,124] [,125] [,126] [,127] [1,] -1 -1 -1 -1 -1 1 -1 1 -1 1 -1 [,128] [,129] [,130] [,131] [,132] [,133] [,134] [,135] [,136] [,137] [,138] [1,] 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 [,139] [,140] [,141] [,142] [,143] [,144] [,145] [,146] [,147] [,148] [,149] [1,] -1 -1 -1 -1 1 -1 -1 -1 -1 1 1 [,150] [,151] [,152] [,153] [,154] [,155] [,156] [,157] [,158] [,159] [,160] [1,] -1 -1 -1 -1 1 -1 -1 -1 1 -1 -1 [,161] [,162] [,163] [,164] [,165] [,166] [,167] [,168] [,169] [,170] [,171] [1,] -1 -1 -1 -1 -1 -1 -1 1 -1 1 -1 [,172] [,173] [,174] [,175] [,176] [,177] [,178] [,179] [,180] [,181] [,182] [1,] -1 -1 -1 -1 1 -1 -1 -1 -1 -1 -1 [,183] [,184] [,185] [,186] [,187] [,188] [,189] [,190] [,191] [,192] [,193] [1,] -1 1 -1 -1 -1 1 -1 -1 1 -1 -1 [,194] [,195] [,196] [,197] [,198] [,199] [,200] [,201] [,202] [,203] [,204] [1,] -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 [,205] [,206] [,207] [,208] [,209] [,210] [,211] [,212] [,213] [,214] [,215] [1,] -1 1 -1 1 -1 -1 1 1 -1 -1 1 [,216] [,217] [,218] [,219] [,220] [,221] [,222] [,223] [,224] 
[,225] [,226] [1,] -1 -1 1 -1 -1 1 -1 -1 -1 1 -1 [,227] [,228] [,229] [,230] [,231] [,232] [,233] [,234] [,235] [,236] [,237] [1,] 1 -1 -1 1 -1 1 1 1 -1 -1 -1 [,238] [,239] [,240] [,241] [,242] [,243] [,244] [,245] [,246] [,247] [,248] [1,] -1 -1 1 -1 -1 1 -1 -1 -1 1 -1 [,249] [,250] [,251] [,252] [,253] [,254] [,255] [,256] [,257] [,258] [,259] [1,] 1 -1 1 1 -1 -1 -1 -1 -1 -1 1 [,260] [,261] [,262] [,263] [,264] [,265] [,266] [,267] [,268] [,269] [,270] [1,] -1 -1 1 -1 -1 -1 -1 -1 -1 1 -1 [,271] [,272] [,273] [,274] [,275] [,276] [,277] [,278] [,279] [,280] [,281] [1,] -1 1 1 -1 -1 -1 -1 -1 1 -1 -1 [,282] [,283] [,284] [,285] [,286] [,287] [,288] [,289] [,290] [,291] [,292] [1,] -1 -1 -1 1 -1 1 1 -1 -1 1 -1 [,293] [,294] [,295] [,296] [,297] [,298] [,299] [,300] [,301] [,302] [,303]

Page 151: R version 3 Examples

[1,] 1 1 -1 1 -1 -1 1 -1 -1 1 -1 [,304] [,305] [,306] [,307] [,308] [,309] [,310] [,311] [,312] [,313] [,314] [1,] -1 -1 -1 -1 1 -1 1 1 1 -1 1 [,315] [,316] [,317] [,318] [,319] [,320] [,321] [,322] [,323] [,324] [,325] [1,] -1 1 -1 1 -1 -1 -1 1 -1 -1 -1 [,326] [,327] [,328] [,329] [,330] [,331] [,332] [,333] [,334] [,335] [,336] [1,] 1 -1 1 -1 -1 -1 -1 1 1 -1 -1 [,337] [,338] [,339] [,340] [,341] [,342] [,343] [,344] [,345] [,346] [,347] [1,] -1 -1 -1 -1 -1 1 -1 1 -1 -1 -1 [,348] [,349] [,350] [,351] [,352] [,353] [,354] [,355] [,356] [,357] [,358] [1,] -1 1 -1 1 -1 -1 -1 -1 -1 -1 -1 [,359] [,360] [,361] [,362] [,363] [,364] [,365] [,366] [,367] [,368] [,369] [1,] -1 1 -1 -1 -1 -1 -1 1 -1 1 1 [,370] [,371] [,372] [,373] [,374] [,375] [,376] [,377] [,378] [,379] [,380] [1,] -1 -1 -1 -1 -1 -1 1 -1 1 -1 -1 [,381] [,382] [,383] [,384] [,385] [,386] [,387] [,388] [,389] [,390] [,391] [1,] -1 -1 -1 -1 -1 -1 -1 -1 1 -1 -1 [,392] [,393] [,394] [,395] [,396] [,397] [,398] [,399] [,400] [,401] [,402] [1,] 1 -1 -1 -1 -1 -1 -1 -1 -1 -1 -1 [,403] [,404] [,405] [,406] [,407] [,408] [,409] [,410] [,411] [,412] [,413] [1,] -1 -1 -1 1 -1 1 -1 -1 1 -1 1 [,414] [,415] [,416] [,417] [,418] [,419] [,420] [,421] [,422] [,423] [,424] [1,] 1 1 1 1 -1 -1 1 1 -1 1 -1 [,425] [,426] [,427] [,428] [,429] [,430] [,431] [,432] [,433] [,434] [,435] [1,] 1 -1 -1 -1 1 1 -1 -1 -1 -1 1 [,436] [,437] [,438] [,439] [,440] [,441] [,442] [,443] [,444] [,445] [,446] [1,] -1 -1 1 1 1 1 -1 -1 -1 -1 -1 [,447] [,448] [,449] [,450] [,451] [,452] [,453] [,454] [,455] [,456] [,457] [1,] -1 1 -1 1 -1 -1 -1 1 1 -1 -1 [,458] [,459] [,460] [,461] [,462] [,463] [,464] [,465] [,466] [,467] [,468] [1,] -1 -1 1 -1 1 -1 -1 1 1 1 -1 [,469] [,470] [,471] [,472] [,473] [,474] [,475] [,476] [,477] [,478] [,479] [1,] -1 -1 1 -1 1 -1 -1 -1 -1 1 -1 [,480] [,481] [,482] [,483] [,484] [,485] [,486] [,487] [,488] [,489] [,490] [1,] -1 1 -1 -1 -1 -1 -1 -1 -1 1 1 [,491] [,492] [,493] [,494] [,495] [,496] [,497] [,498] [,499] [,500] [,501] [1,] -1 1 -1 -1 1 -1 1 1 -1 -1 1 [,502] [,503] [,504] [,505] [,506] [,507] [,508] [,509] [,510] [,511] [,512] [1,] -1 1 -1 -1 1 1 1 1 1 1 1 [,513] [,514] [,515] [,516] [,517] [,518] [,519] [,520] [,521] [,522] [,523] [1,] 1 1 1 1 1 1 1 1 1 -1 1 [,524] [,525] [,526] [,527] [,528] [,529] [,530] [,531] [,532] [,533] [,534] [1,] 1 1 1 1 1 1 1 1 1 1 1 [,535] [,536] [,537] [,538] [,539] [,540] [,541] [,542] [,543] [,544] [,545] [1,] 1 1 1 1 -1 1 1 -1 1 1 1 [,546] [,547] [,548] [,549] [,550] [,551] [,552] [,553] [,554] [,555] [,556] [1,] 1 -1 1 1 1 1 1 1 1 1 1 [,557] [,558] [,559] [,560] [,561] [,562] [,563] [,564] [,565] [,566] [,567] [1,] 1 1 1 -1 1 1 1 1 1 1 1 [,568] [,569] [,570] [,571] [,572] [,573] [,574] [,575] [,576] [,577] [,578] [1,] 1 1 1 1 1 1 1 1 1 1 -1 [,579] [,580] [,581] [,582] [,583] [,584] [,585] [,586] [,587] [,588] [,589] [1,] 1 1 1 1 1 -1 1 -1 1 1 1 [,590] [,591] [,592] [,593] [,594] [,595] [,596] [,597] [,598] [,599] [,600] [1,] 1 1 -1 1 1 -1 1 1 1 1 -1 [,601] [,602] [,603] [,604] [,605] [,606] [,607] [,608] [,609] [,610] [,611] [1,] 1 1 1 -1 1 1 -1 1 -1 1 1 [,612] [,613] [,614] [,615] [,616] [,617] [,618] [,619] [,620] [,621] [,622] [1,] 1 1 1 -1 1 1 1 1 -1 1 -1 [,623] [,624] [,625] [,626] [,627] [,628] [,629] [,630] [,631] [,632] [,633] [1,] 1 1 -1 1 1 1 1 -1 1 1 1 [,634] [,635] [,636] [,637] [,638] [,639] [,640] [,641] [,642] [,643] [,644] [1,] -1 -1 1 1 1 1 1 -1 1 1 1 [,645] [,646] [,647] [,648] [,649] [,650] [,651] [,652] [,653] [,654] [,655] [1,] -1 1 1 1 1 1 1 1 1 -1 1 [,656] [,657] [,658] [,659] 
[,660] [,661] [,662] [,663] [,664] [,665] [,666] [1,] 1 1 1 1 -1 1 1 -1 1 1 -1 [,667] [,668] [,669] [,670] [,671] [,672] [,673] [,674] [,675] [,676] [,677] [1,] 1 1 -1 1 1 1 -1 1 1 1 1 [,678] [,679] [,680] [,681] [,682] [,683] [,684] [,685] [,686] [,687] [,688] [1,] 1 1 1 1 1 1 1 1 1 1 1 [,689] [,690] [,691] [,692] [,693] [,694] [,695] [,696] [,697] [,698] [,699]

Page 152: R version 3 Examples

[1,] 1 1 -1 1 1 1 1 -1 -1 1 1 [,700] [,701] [,702] [,703] [,704] [,705] [,706] [,707] [,708] [,709] [,710] [1,] 1 1 1 1 1 1 1 1 1 -1 1 [,711] [,712] [,713] [,714] [,715] [,716] [,717] [,718] [,719] [,720] [,721] [1,] 1 1 1 -1 1 1 -1 -1 1 1 1 [,722] [,723] [,724] [,725] [,726] [,727] [,728] [,729] [,730] [,731] [,732] [1,] 1 -1 -1 1 1 1 1 1 1 1 1 [,733] [,734] [,735] [,736] [,737] [,738] [,739] [,740] [,741] [,742] [,743] [1,] -1 1 1 -1 1 1 1 1 1 1 1 [,744] [,745] [,746] [,747] [,748] [,749] [,750] [,751] [,752] [,753] [,754] [1,] 1 1 1 1 1 1 1 1 1 1 1 [,755] [,756] [,757] [,758] [,759] [,760] [,761] [,762] [,763] [,764] [,765] [1,] 1 1 1 1 1 1 1 1 1 1 1 [,766] [,767] [,768] [,769] [,770] [,771] [,772] [,773] [,774] [,775] [,776] [1,] 1 1 1 -1 1 1 1 1 1 1 1 [,777] [,778] [,779] [,780] [,781] [,782] [,783] [,784] [,785] [,786] [,787] [1,] 1 1 1 1 1 1 -1 -1 1 1 1 [,788] [,789] [,790] [,791] [,792] [,793] [,794] [,795] [,796] [,797] [,798] [1,] -1 1 1 1 1 1 1 1 1 1 1 [,799] [,800] [,801] [,802] [,803] [,804] [,805] [,806] [,807] [,808] [,809] [1,] 1 1 1 1 -1 -1 1 1 1 1 1 [,810] [,811] [,812] [,813] [,814] [,815] [,816] [,817] [,818] [,819] [,820] [1,] -1 1 1 1 -1 1 1 1 -1 1 1 [,821] [,822] [,823] [,824] [,825] [,826] [,827] [,828] [,829] [,830] [,831] [1,] -1 -1 1 1 1 1 1 -1 -1 -1 -1 [,832] [,833] [,834] [,835] [,836] [,837] [,838] [,839] [,840] [,841] [,842] [1,] 1 1 1 1 1 1 -1 1 -1 1 1 [,843] [,844] [,845] [,846] [,847] [,848] [,849] [,850] [,851] [,852] [,853] [1,] -1 1 1 1 -1 1 1 1 -1 -1 1 [,854] [,855] [,856] [,857] [,858] [,859] [,860] [,861] [,862] [,863] [,864] [1,] 1 -1 1 1 -1 -1 1 1 -1 1 1 [,865] [,866] [,867] [,868] [,869] [,870] [,871] [,872] [,873] [,874] [,875] [1,] 1 1 1 1 -1 1 -1 -1 1 -1 1 [,876] [,877] [,878] [,879] [,880] [,881] [,882] [,883] [,884] [,885] [,886] [1,] 1 -1 -1 1 -1 1 1 -1 1 1 1 [,887] [,888] [,889] [,890] [,891] [,892] [,893] [,894] [,895] [,896] [,897] [1,] 1 1 1 1 1 1 1 1 1 1 -1 [,898] [,899] [,900] [,901] [,902] [,903] [,904] [,905] [,906] [,907] [,908] [1,] 1 -1 1 1 1 1 1 1 1 1 1 [,909] [,910] [,911] [,912] [,913] [,914] [,915] [,916] [,917] [,918] [,919] [1,] -1 -1 -1 1 -1 -1 -1 1 1 1 1 [,920] [,921] [,922] [,923] [,924] [,925] [,926] [,927] [,928] [,929] [,930] [1,] 1 1 1 1 1 -1 1 1 1 -1 1 [,931] [,932] [,933] [,934] [,935] [,936] [,937] [,938] [,939] [,940] [,941] [1,] 1 1 1 1 1 1 1 1 -1 -1 1 [,942] [,943] [,944] [,945] [,946] [,947] [,948] [,949] [,950] [,951] [,952] [1,] 1 1 1 -1 1 -1 -1 1 1 1 1 [,953] [,954] [,955] [,956] [,957] [,958] [,959] [,960] [,961] [,962] [,963] [1,] 1 1 1 1 1 1 1 1 1 1 -1 [,964] [,965] [,966] [,967] [,968] [,969] [,970] [,971] [,972] [,973] [,974] [1,] 1 1 1 1 1 1 1 1 -1 1 1 [,975] [,976] [,977] [,978] [,979] [,980] [,981] [,982] [,983] [,984] [,985] [1,] 1 -1 1 -1 -1 1 1 1 1 -1 1 [,986] [,987] [,988] [,989] [,990] [,991] [,992] [,993] [,994] [,995] [,996] [1,] 1 1 1 1 1 -1 1 1 1 1 1 [,997] [,998] [,999] [,1000] [1,] 1 1 1 1 $predicted.model$error.rate [1] 0.234

wsvm.predict
> # generate a simulation data set using mixture example (page 17, Friedman et al. 2000)
> svm.data <- simul.wsvm(set.seeds = 12)
> X <- svm.data$X
> Y <- svm.data$Y
> new.X <- svm.data$new.X
> new.Y <- svm.data$new.Y
> # run Weighted K-means clustering SVM with boosting algorithm
> model <- wsvm(X, Y, c.n = rep(1/length(Y), length(Y)))
> # predict the model and compute an error rate.
> pred <- wsvm.predict(X, Y, new.X, new.Y, model)
> Error.rate(pred$predicted.Y, Y)
[1] 2.21
> # add boost algorithm
> boo <- wsvm.boost(X, Y, new.X, new.Y, c.n = rep(1/length(Y), length(Y)),
+                   B = 50, kernel.type = list(type = "rbf", par = 0.5), C = 4,
+                   eps = 1e-10, plotting = TRUE)
> boo
$error.rate
 [1] 0.379 0.251 0.250 0.287 0.252 0.250 0.271 0.252 0.250 0.265 0.251 0.250 0.260
[14] 0.251 0.250 0.259 0.251 0.250 0.255 0.250 0.250 0.256 0.250 0.250 0.254 0.250
[27] 0.250 0.253 0.250 0.250 0.251 0.250 0.250 0.251 0.250 0.250 0.251 0.250 0.250
[40] 0.251 0.250 0.250 0.252 0.250 0.250 0.252 0.250 0.250 0.252 0.250

$predicted.model
$predicted.model$predicted.values
         [,1]     [,2]     [,3]     [,4]    [,5]     [,6]     [,7]      [,8]     [,9]
[1,] 16.37813 4.480619 -20.0869 22.13312 -10.805 29.88408 -44.6435 -38.00885 24.25422
[ ... predicted values for columns 10 through 1000 omitted ... ]

$predicted.model$predicted.Y
     [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10] [,11] [,12] [,13] [,14] [,15]
[1,]    1    1   -1    1   -1    1   -1   -1    1    -1    -1    -1    -1    -1     1
[ ... predicted class labels (-1/1) for columns 16 through 1000 omitted ... ]

$predicted.model$error.rate
[1] 0.258
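The boosted fit stores the test error after each of the 50 boosting rounds in boo$error.rate and the final model's error in boo$predicted.model$error.rate; the hard class labels in predicted.Y appear to be the signs of the decision values in predicted.values (compare the first few columns above). A minimal sketch for inspecting the boosting trajectory, using only the boo object returned above and base R graphics:

# Sketch only: plot the per-iteration test error of the boosted wSVM above.
plot(boo$error.rate, type = "b",
     xlab = "boosting iteration", ylab = "test error rate")
abline(h = min(boo$error.rate), lty = 2)   # best error reached during boosting
which.min(boo$error.rate)                  # iteration at which it occurs

# Check that the printed class labels match the signs of the decision values.
all(sign(boo$predicted.model$predicted.values) ==
      boo$predicted.model$predicted.Y)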

Rknn

confusion
> obs <- rep(0:1, each = 5)
> pre <- c(obs[3:10], obs[1:2])
> confusion(obs, pre)
    classified as->
      0 1
    0 3 2
    1 2 3
> confusion2acc(confusion(obs, pre))
[1] 0.6
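For comparison, the confusion matrix and accuracy above can be reproduced with base R alone; the sketch below is not part of rknn and uses only table():

# Base-R sketch reproducing the rknn example above.
obs <- rep(0:1, each = 5)
pre <- c(obs[3:10], obs[1:2])
tab <- table(observed = obs, classified = pre)
tab                          # same 2 x 2 counts as confusion(obs, pre)
sum(diag(tab)) / sum(tab)    # overall accuracy, 0.6, matching confusion2acc()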

eta
> eta(1000, 32, 100)
[1] 7.345061e-18

fitted

kknn

contr.dummy
> contr.metric(5)
  [,1]
1   -2
2   -1
3    0
4    1
5    2

> contr.ordinal(5)
  [,1] [,2] [,3] [,4]
1  0.5  0.5  0.5  0.5
2 -0.5  0.5  0.5  0.5
3 -0.5 -0.5  0.5  0.5
4 -0.5 -0.5 -0.5  0.5
5 -0.5 -0.5 -0.5 -0.5
> contr.dummy(5)
  1 2 3 4 5
1 1 0 0 0 0
2 0 1 0 0 0
3 0 0 1 0 0
4 0 0 0 1 0
5 0 0 0 0 1

Glass
> data(glass)
> str(glass)
'data.frame':   214 obs. of  11 variables:
 $ Id  : int 1 2 3 4 5 6 7 8 9 10 ...
 $ RI  : num 1.52 1.52 1.52 1.52 1.52 ...
 $ Na  : num 13.6 13.9 13.5 13.2 13.3 ...
 $ Mg  : num 4.49 3.6 3.55 3.69 3.62 3.61 3.6 3.61 3.58 3.6 ...
 $ Al  : num 1.1 1.36 1.54 1.29 1.24 1.62 1.14 1.05 1.37 1.36 ...
 $ Si  : num 71.8 72.7 73 72.6 73.1 ...
 $ K   : num 0.06 0.48 0.39 0.57 0.55 0.64 0.58 0.57 0.56 0.57 ...
 $ Ca  : num 8.75 7.83 7.78 8.22 8.07 8.07 8.17 8.24 8.3 8.4 ...
 $ Ba  : num 0 0 0 0 0 0 0 0 0 0 ...
 $ Fe  : num 0 0 0 0 0 0.26 0 0 0 0.11 ...
 $ Type: Factor w/ 6 levels "1","2","3","5",..: 1 1 1 1 1 1 1 1 1 1 ...
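The glass data can be plugged into the same kknn routines used elsewhere in this section; the following is a sketch only (the Id and Type columns come from the str() output above, and the train.kknn arguments mirror the ionosphere example later on):

# Sketch only: tune k for predicting glass Type, dropping the Id column.
library(kknn)
data(glass)
fit.glass <- train.kknn(Type ~ . - Id, data = glass, kmax = 15,
                        kernel = c("triangular", "rectangular", "optimal"))
fit.glass   # prints the best kernel and best k found by the cross-validation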

Ionosphere
> data(ionosphere)
> summary(ionosphere)
       V1              V2           V3                V4                 V5
 Min.   :0.0000   Min.   :0   Min.   :-1.0000   Min.   :-1.00000   Min.   :-1.0000
 1st Qu.:1.0000   1st Qu.:0   1st Qu.: 0.4721   1st Qu.:-0.06474   1st Qu.: 0.4127
 Median :1.0000   Median :0   Median : 0.8711   Median : 0.01631   Median : 0.8092
 Mean   :0.8917   Mean   :0   Mean   : 0.6413   Mean   : 0.04437   Mean   : 0.6011
 3rd Qu.:1.0000   3rd Qu.:0   3rd Qu.: 1.0000   3rd Qu.: 0.19418   3rd Qu.: 1.0000
 Max.   :1.0000   Max.   :0   Max.   : 1.0000   Max.   : 1.00000   Max.   : 1.0000
[ ... summaries for V6 through V33 omitted ... ]
      V34           class
 Min.   :-1.00000   b:126
 1st Qu.:-0.16535   g:225
 Median : 0.00000
 Mean   : 0.01448
 3rd Qu.: 0.17166
 Max.   : 1.00000

Kknn
> data(iris)
> m <- dim(iris)[1]
> val <- sample(1:m, size = round(m/3), replace = FALSE,
+               prob = rep(1/m, m))
> iris.learn <- iris[-val,]
> iris.valid <- iris[val,]
> iris.kknn <- kknn(Species ~ ., iris.learn, iris.valid, distance = 1,
+                   kernel = "triangular")
> summary(iris.kknn)

Call:
kknn(formula = Species ~ ., train = iris.learn, test = iris.valid,
     distance = 1, kernel = "triangular")

Response: "nominal"
          fit prob.setosa prob.versicolor prob.virginica
1   virginica           0     0.000000000     1.00000000
2   virginica           0     0.000000000     1.00000000
3   virginica           0     0.000000000     1.00000000
4  versicolor           0     1.000000000     0.00000000
5  versicolor           0     1.000000000     0.00000000
6      setosa           1     0.000000000     0.00000000
[ ... fitted classes and class probabilities for validation rows 7 through 50 omitted ... ]
> fit <- fitted(iris.kknn)
> table(iris.valid$Species, fit)
            fit
             setosa versicolor virginica
  setosa         18          0         0
  versicolor      0         15         1
  virginica       0          3        13
> pcol <- as.character(as.numeric(iris.valid$Species))
> pairs(iris.valid[1:4], pch = pcol, col = c("green3", "red")
+       [(iris.valid$Species != fit)+1])
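Because sample() is called without a fixed seed, the validation split (and therefore the exact table) changes from run to run. A small base-R sketch for turning the confusion table above into an overall accuracy:

# Base-R sketch: overall accuracy from the confusion table above.
tab <- table(iris.valid$Species, fit)
sum(diag(tab)) / sum(tab)   # (18 + 15 + 13) / 50 = 0.92 for the split shown here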

> data(ionosphere)

> ionosphere.learn <- ionosphere[1:200,]
> ionosphere.valid <- ionosphere[-c(1:200),]
> fit.kknn <- kknn(class ~ ., ionosphere.learn, ionosphere.valid)
> table(ionosphere.valid$class, fit.kknn$fit)

      b   g
  b  19   8
  g   2 122
> (fit.train1 <- train.kknn(class ~ ., ionosphere.learn, kmax = 15,
+     kernel = c("triangular", "rectangular", "epanechnikov", "optimal"), distance = 1))

Call:
train.kknn(formula = class ~ ., data = ionosphere.learn, kmax = 15,
    distance = 1, kernel = c("triangular", "rectangular", "epanechnikov", "optimal"))

Type of response variable: nominal
Minimal misclassification: 0.12
Best kernel: rectangular
Best k: 2
> table(predict(fit.train1, ionosphere.valid), ionosphere.valid$class)

      b   g
  b  25   4
  g   2 120
> (fit.train2 <- train.kknn(class ~ ., ionosphere.learn, kmax = 15,
+     kernel = c("triangular", "rectangular", "epanechnikov", "optimal"), distance = 2))

Call:
train.kknn(formula = class ~ ., data = ionosphere.learn, kmax = 15,
    distance = 2, kernel = c("triangular", "rectangular", "epanechnikov", "optimal"))

Type of response variable: nominal
Minimal misclassification: 0.12
Best kernel: rectangular
Best k: 2
> table(predict(fit.train2, ionosphere.valid), ionosphere.valid$class)

      b   g
  b  20   5
  g   7 119
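The two tuned fits can also be compared directly on the held-out block with the same predict() calls used above; in kknn the distance argument is the Minkowski exponent, so distance = 1 is the absolute (Manhattan) metric and distance = 2 the Euclidean one. A base-R sketch:

# Sketch: validation error of the two tuned fits above.
err1 <- mean(predict(fit.train1, ionosphere.valid) != ionosphere.valid$class)
err2 <- mean(predict(fit.train2, ionosphere.valid) != ionosphere.valid$class)
c(distance.1 = err1, distance.2 = err2)
# From the confusion tables above these come to roughly 6/151 and 12/151,
# so the distance = 1 fit generalises better on this particular split.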