Combining Regression Trees and Radial Basis Function Networks
paper by: M. Orr, J. Hallam, K. Takezawa, A. Murray, S. Ninomiya, M. Oide, T. Leonard
presentation by: Vladimir Vacić


TRANSCRIPT

Page 1: Combining Regression Trees and Radial Basis Function Networks

Combining Regression Trees and Radial Basis Function Networkspaper by: M. Orr, J. Hallam, K. Takezawa,

A. Murray, S. Ninomiya, M. Oide, T. Leonard

presentation by: Vladimir Vacić

Page 2: Combining Regression Trees and Radial Basis Function Networks

Contents:

- Regression Trees
- Radial Basis Function Neural Networks
- Combining RTs and RBF NNs
- Method
- Experimental Results
- Conclusion

Page 3: Combining Regression Trees and Radial Basis Function Networks

Regression Trees

[Figure: a binary split at threshold b on input dimension k; samples with x_ik < b form the left subset S_L and the rest form the right subset S_R.]
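Read as equations, the split in the figure partitions the samples by comparing their k-th coordinate against the threshold b:

```latex
S_L = \{\, x_i : x_{ik} < b \,\}, \qquad
S_R = \{\, x_i : x_{ik} \ge b \,\}
```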

Page 4: Combining Regression Trees and Radial Basis Function Networks

Radial Basis Function Neural Networks
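The slide itself is a network diagram; for reference, a Gaussian RBF network maps an input x to a weighted sum of radial units (a standard formulation with centers c_j, radii r_j, and weights w_j; the Gaussian choice of basis function is an assumption, not stated in the transcript):

```latex
f(x) = \sum_{j=1}^{m} w_j \,\phi_j(x), \qquad
\phi_j(x) = \exp\!\left(-\frac{\lVert x - c_j \rVert^2}{r_j^{\,2}}\right)
```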

Page 5: Combining Regression Trees and Radial Basis Function Networks

Combining RTs and RBF NNs

- the RT generates candidate units for the RBF NN
- the RT specifies RBF centers and radii
- the RT influences the order in which candidate units are evaluated

Page 6: Combining Regression Trees and Radial Basis Function Networks

Method

Generating the regression tree:

- recursively cut along the k input dimensions
- determine an output value for each node
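A minimal sketch of such a tree builder, assuming an axis-aligned least-squares split at each node (the function name, the max_depth/min_size stopping rule, and the dict node layout are illustrative choices, not taken from the paper):

```python
import numpy as np

def build_tree(X, y, depth=0, max_depth=3, min_size=5):
    """Recursively split (X, y), cutting along each of the k input
    dimensions at the threshold b that minimizes the summed squared
    error of the two resulting subsets S_L and S_R.  Each node keeps
    its bounding hyper-rectangle and its mean output value."""
    node = {
        "lower": X.min(axis=0),     # hyper-rectangle corners of this node
        "upper": X.max(axis=0),
        "output": float(y.mean()),  # the node's output value
        "left": None, "right": None,
    }
    if depth >= max_depth or len(y) < 2 * min_size:
        return node
    best = None  # (sse, dimension, threshold)
    for k in range(X.shape[1]):
        for b in np.unique(X[:, k])[1:]:   # candidate cut points
            left = X[:, k] < b
            if left.sum() < min_size or (~left).sum() < min_size:
                continue
            sse = ((y[left] - y[left].mean()) ** 2).sum() + \
                  ((y[~left] - y[~left].mean()) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, k, b)
    if best is None:
        return node
    _, k, b = best
    left = X[:, k] < b
    node["split"] = (k, b)
    node["left"] = build_tree(X[left], y[left], depth + 1, max_depth, min_size)
    node["right"] = build_tree(X[~left], y[~left], depth + 1, max_depth, min_size)
    return node
```

Each node records its bounding hyper-rectangle and mean output, which is exactly the information the next slide converts into an RBF unit.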

Page 7: Combining Regression Trees and Radial Basis Function Networks

Method

Transforming tree nodes into RBFs:

- center
- radius
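The defining formulas were on the slide image and are not in the transcript; a plausible reconstruction, assuming each tree node is the hyper-rectangle with lower corner l_j and upper corner u_j, and α is the radius scale parameter from slide 13:

```latex
c_j = \tfrac{1}{2}\,(l_j + u_j), \qquad
r_j = \alpha \,(u_j - l_j)
```

That is, each candidate unit sits at the center of its node's hyper-rectangle, with per-dimension radii proportional to the rectangle's side lengths.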

Page 8: Combining Regression Trees and Radial Basis Function Networks

Method

Selecting RBF units from the set of candidates:

- necessary because no pruning of the regression tree has been performed so far
- an overly complex RBF network risks over-fitting
- a complex RBF network is computationally expensive

Page 9: Combining Regression Trees and Radial Basis Function Networks

Method

Selecting RBF units from the set of candidates:

- standard selection methods are forward selection, backward elimination, a combination of the two, full combinatorial search…
- the problem with forward selection is that one choice may block subsequent, more informative choices

Page 10: Combining Regression Trees and Radial Basis Function Networks

Method

Using the tree to guide selection:

- put the root node into the list of active nodes
- for each active node, consider the effect of adding one or both of its children and of keeping or removing the node itself
- choose the combination which improves performance the most and update the active list
- repeat until no change improves performance
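A compact sketch of this loop, reusing the dict nodes from the earlier tree sketch; `evaluate(nodes)` is a hypothetical callback, assumed to fit the RBF weights for the given units and return a model-selection score such as BIC (lower is better):

```python
def tree_guided_selection(root, evaluate):
    """Keep a list of active nodes; for each one, try adding one or both
    of its children while keeping or dropping the node itself.  Commit
    any change that improves the score, and repeat until none does."""
    active = [root]
    best_score = evaluate(active)
    improved = True
    while improved:
        improved = False
        for node in list(active):
            children = [c for c in (node["left"], node["right"]) if c is not None]
            if not children:
                continue
            # candidate moves: add one child or both of them
            child_sets = [[c] for c in children]
            if len(children) == 2:
                child_sets.append(children)
            for keep_parent in (True, False):
                for added in child_sets:
                    trial = [n for n in active if keep_parent or n is not node]
                    trial += [c for c in added if not any(c is n for n in trial)]
                    score = evaluate(trial)
                    if score < best_score:
                        best_score, active, improved = score, trial, True
    return active, best_score
```

Because newly added children become active nodes themselves, the search descends the tree only where doing so pays off, which is how the tree orders the evaluation of candidates.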

Page 11: Combining Regression Trees and Radial Basis Function Networks

Method

Calculating the weights:

least squares minimization
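With the selected units fixed, the weights have a closed form. Let H be the design matrix with entries H_ij = φ_j(x_i); minimizing ‖y − Hw‖² gives the usual normal-equations solution (the paper may add a regularization term; this is the plain, unregularized form):

```latex
\hat{w} = \arg\min_{w}\ \lVert y - Hw \rVert^2
        = (H^{\top} H)^{-1} H^{\top} y
```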

Page 12: Combining Regression Trees and Radial Basis Function Networks

Method

Model selection criterion:

Bayesian information criterion (BIC)

BIC imposes a penalty for model complexity and hence leads to smaller networks
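For Gaussian-error regression with n training samples, residual sum of squares SSE, and m selected units, the textbook form of the criterion is (the paper may use a generalized variant):

```latex
\mathrm{BIC} = n \ln\!\left(\frac{\mathrm{SSE}}{n}\right) + m \ln n
```

The m ln n term grows with every added unit; that is the complexity penalty the slide refers to, and it is what pushes the selection toward smaller networks.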

Page 13: Combining Regression Trees and Radial Basis Function Networks

Method

Note that so far we have had two free parameters:

- p (controls the resulting network size)
- α (controls the ratio of the RBF radii to the corresponding hyper-rectangle size)

Page 14: Combining Regression Trees and Radial Basis Function Networks

Experimental Results

the authors report that the best experimentally determined p and α on the training set do not always yield the best performance on the test set

instead, they suggest keeping a set of the best values of p and α from training and then finding the best combination on the test set

Page 15: Combining Regression Trees and Radial Basis Function Networks

Experimental Results

2D sine wave problem

simulated circuit problem

Page 16: Combining Regression Trees and Radial Basis Function Networks

Experimental Results

Comparison with other learning methods:

linear least squares regression k-nearest neighbor ensembles of multilayer perceptrons multilayer perceptrons trained using MCMC multivariate adaptive regression splines

(MARS) with bagging

Page 17: Combining Regression Trees and Radial Basis Function Networks

Experimental Results

Datasets:

DELVE dataset (non-linear, high noise, 8- and 32-dimensional examples), generated from simulated robotic arms

soybean classification into three classes (good, fair, poor) from digital images

Page 18: Combining Regression Trees and Radial Basis Function Networks

Experimental Results

DELVE, 8-dimensional examples:

Page 19: Combining Regression Trees and Radial Basis Function Networks

Experimental Results

DELVE, 32-dimensional examples:

Page 20: Combining Regression Trees and Radial Basis Function Networks

Conclusion

- an improvement on, and analysis of, previous work by Kubat
- combining RTs and RBF NNs as a technique is competitive with some leading modern methods

Page 21: Combining Regression Trees and Radial Basis Function Networks

Combining Regression Trees and Radial Basis Function Networks
paper by: M. Orr, J. Hallam, K. Takezawa, A. Murray, S. Ninomiya, M. Oide, T. Leonard

presentation by: Vladimir Vacić