Parameter Estimation for Semiparametric Models with CMARS and Its Applications
DESCRIPTION
AACIMP 2010 Summer School lecture by Gerhard-Wilhelm Weber. "Applied Mathematics" stream. "Modern Operational Research and Its Mathematical Methods with a Focus on Financial Mathematics" course. Part 11. More info at http://summerschool.ssa.org.ua
TRANSCRIPT
PARAMETER ESTIMATION FOR SEMIPARAMETRIC MODELS
WITH CMARS AND ITS APPLICATIONS
Fatma YERLIKAYA-ÖZKURT
Institute of Applied Mathematics, METU, Ankara, Turkey
Gerhard-Wilhelm WEBER
Institute of Applied Mathematics, METU, Ankara, Turkey
Faculty of Economics, Business and Law, University of Siegen, Germany
Center for Research on Optimization and Control, University of Aveiro, Portugal
Universiti Teknologi Malaysia, Skudai, Malaysia
Pakize TAYLAN
Department of Mathematics, Dicle University, Diyarbakır, Turkey
5th International Summer School
Achievements and Applications of Contemporary Informatics,
Mathematics and Physics
National University of Technology of the Ukraine
Kiev, Ukraine, August 3-15, 2010
• Introduction
• Estimation for Generalized Linear Model (GLM)
• Generalized Partial Linear Model (GPLM)
• Estimation for GPLM
– Least-Squares Estimation with Tikhonov Regularization
– CMARS Method
• Penalized Residual Sum of Squares (PRSS) for GLM with MARS
• Tikhonov Regularization for GLM with MARS
• An Alternative Solution for Tikhonov Regularization Problem with CQP
• Solution Methods
• Application
• Conclusion
Outline
The class of Generalized Linear Models (GLMs) has gained popularity as a statistical modeling tool.
This popularity is due to:
• the flexibility of GLMs in addressing a variety of statistical problems,
• the availability of software (Stata, SAS, S-PLUS, R) to fit the models.
The class of GLMs extends traditional linear models by allowing:
• the mean of a dependent variable to depend on a linear predictor through a nonlinear link function,
• the probability distribution of the response to be any member of an exponential family of distributions.
Many widely used statistical models belong to GLM:
o linear models with normal errors,
o logistic and probit models for binary data,
o log-linear models for multinomial data.
Introduction
Many other useful statistical models, such as those with
• Poisson, binomial,
• Gamma or normal distributions,
can be formulated as GLMs by selecting an appropriate link function
and response probability distribution.
A GLM looks as follows:

G(\mu_i) = \eta_i = H(x_i) = x_i^T \beta, \quad \mu_i = E(Y_i) \quad (i = 1, 2, ..., n),

where
• \mu_i : expected value of the response variable Y_i,
• G : smooth monotonic link function,
• x_i : observed values of the explanatory variables for the i-th case,
• \beta : vector of unknown parameters.
Introduction
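As a small numerical illustration of these ingredients (not part of the lecture slides), the sketch below evaluates the mean response of a GLM with a logit link; the values of \beta and x_i are invented for the example.

```python
import numpy as np

# Sketch: one GLM "case" with a logit link, G(mu) = log(mu / (1 - mu)).
# beta and x_i are illustrative values, not taken from the lecture.
def inverse_logit(eta):
    """Inverse link G^{-1}: maps the linear predictor to the mean."""
    return 1.0 / (1.0 + np.exp(-eta))

beta = np.array([0.5, -1.0])   # unknown parameters (here: assumed known)
x_i = np.array([1.0, 2.0])     # explanatory variables for the i-th case
eta_i = x_i @ beta             # linear predictor eta_i = x_i^T beta
mu_i = inverse_logit(eta_i)    # mu_i = E(Y_i) = G^{-1}(eta_i)
print(mu_i)
```

Any smooth monotonic G (identity, log, probit, ...) can replace the logit here; only the inverse-link function changes.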
Introduction
• Assumptions: Y_i (i = 1, 2, ..., n) are independent and can have any distribution from the exponential family, with density

Y_i \sim f_{Y_i}(y_i, \theta_i, \phi) = \exp\left( \frac{y_i \theta_i - b(\theta_i)}{a_i(\phi)} + c_i(y_i, \phi) \right) \quad (i = 1, 2, ..., n).

• Here, a_i, b, c_i are arbitrary "scale" functions, and \theta_i is called a natural parameter.
• General expressions for the mean and variance of the dependent variable Y_i:

E(Y_i) = b'(\theta_i),
Var(Y_i) = V(\mu_i) \, a_i(\phi),
V(\mu_i) := b''(\theta_i), \quad a_i(\phi) := \phi / w_i.
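To make the mean and variance formulas concrete, here is a quick numerical check (not from the slides) for the Poisson case, where \theta = \log\mu, b(\theta) = e^\theta and a(\phi) = 1, so that E(Y) = b'(\theta) = \mu and Var(Y) = b''(\theta)\,a(\phi) = \mu:

```python
import math

# Sketch: Poisson as an exponential-family member with theta = log(mu),
# b(theta) = exp(theta), a(phi) = 1.  We verify E(Y) = b'(theta) = mu and
# Var(Y) = b''(theta) * a(phi) = mu numerically via finite differences.
mu = 3.0
theta = math.log(mu)
b = lambda th: math.exp(th)

eps = 1e-5
b_prime = (b(theta + eps) - b(theta - eps)) / (2 * eps)               # ~ b'(theta)
b_second = (b(theta + eps) - 2 * b(theta) + b(theta - eps)) / eps**2  # ~ b''(theta)
print(b_prime, b_second)
```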
Estimation for GLM
• Estimation and inference for GLMs are based on the theory of maximum likelihood estimation and on a least-squares approach. The log-likelihood is

l(\beta) := \sum_{i=1}^{n} \left( \frac{y_i \theta_i - b(\theta_i)}{a_i(\phi)} + c_i(y_i, \phi) \right).

• The dependence of the right-hand side on \beta is solely through the dependence of the \theta_i on \beta.
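As an illustrative sketch of this log-likelihood (Poisson response, so a_i(\phi) = 1 and c_i(y, \phi) = -\log y!; all data values are invented for the example):

```python
import math

# Sketch: the GLM log-likelihood l(beta) for a Poisson response, where
# theta_i = x_i^T beta, b(theta) = exp(theta), a_i(phi) = 1 and
# c_i(y, phi) = -log(y!).  Data below are purely illustrative.
def poisson_loglik(beta, xs, ys):
    l = 0.0
    for x, y in zip(xs, ys):
        theta = sum(bj * xj for bj, xj in zip(beta, x))  # theta_i = x_i^T beta
        l += y * theta - math.exp(theta) - math.lgamma(y + 1)
    return l

print(poisson_loglik([0.0], [[1.0]], [1]))
```

Maximizing this function over \beta (e.g., by iteratively reweighted least squares) gives the maximum likelihood estimate.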
Generalized Partial Linear Models (GPLMs)
• Particular semiparametric models are the Generalized Partial Linear Models (GPLMs):
they extend the GLMs in that the usual parametric terms are augmented by a single nonparametric component:

E(Y \mid X, T) = G(X^T \beta + \gamma(T)).

• \beta is a vector of parameters, and
\gamma(\cdot) is a smooth function,
which we try to estimate by CMARS.
• Assumption: X is an m-dimensional random vector which represents (typically discrete) covariates, and
T is a q-dimensional random vector of continuous covariates,
which comes from a decomposition of the explanatory variables.
Other interpretations of T: role of the environment,
expert opinions,
Wiener processes, etc.
Estimation for GPLM
• There are different kinds of estimation methods for GPLMs.
• Generally, the estimation methods for the model

E(Y \mid X, T) = G(X^T \beta + \gamma(T))

are based on kernel methods and test procedures for the correct specification of this model.
• Now, we will concentrate on special types of GPLM estimation based on
- the newly developed data mining method CMARS, and
- least-squares estimation with Tikhonov regularization.
Least-Squares Estimation with Tikhonov Regularization
• The general model

E(Y \mid X, T) = G(X^T \beta + \gamma(T))

can be considered as a semiparametric generalized linear model and can be written as follows:

\eta(X, T) := H(X, T) = \sum_{j=1}^{m} X_j \beta_j + \gamma(T),

with observation values (y_i, x_i, t_i) (i = 1, 2, ..., n),

\mu_i = G(\eta_i) \quad and \quad \eta_i = H(x_i, t_i) = x_i^T \beta + \gamma(t_i),

where \gamma(\cdot) is a smooth function.
• For the estimation of the parametric part, we apply linear least squares with Tikhonov regularization.
The Least-Squares Estimation with Tikhonov Regularization
The process is as follows:
Firstly, we apply linear least squares on the given data to find a vector \beta^{preproc}:

y \approx X \beta^{preproc},   (*)

where \beta^{preproc} = (\beta_0, \beta_1, \beta_2, ..., \beta_m)^T. Equivalently, the model form is

y = \beta_0 + \sum_{j=1}^{m} X_j \beta_j + \varepsilon.

The method of least squares is used for estimating the regression coefficients \beta_0, \beta_1, ..., \beta_m in this model, by minimizing the residual sum of squares (RSS).
• Tikhonov regularization gives an approximate solution to equation (*) by minimizing the quadratic functional

\min_{\beta^{preproc}} \; \|y - X\beta^{preproc}\|_2^2 + \lambda \|L\beta^{preproc}\|_2^2,   (**)

where \lambda > 0 is a regularization parameter that trades off the first and the second part.
The terms y and \beta^{preproc} represent the response vector and the unknown coefficients;
the latter are obtained by solving the Tikhonov regularization problem (**).
• Generally, Tikhonov regularization involves higher-order regularization terms,
which can be handled using the generalized singular value decomposition (GSVD).
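For L = I (the identity), problem (**) is ordinary ridge regression with the closed-form solution \beta = (X^T X + \lambda I)^{-1} X^T y. A small self-contained sketch with synthetic data (all values invented for illustration):

```python
import numpy as np

# Sketch of the Tikhonov-regularized least-squares step (**) with L = I:
# minimize ||y - X b||^2 + lam * ||b||^2  =>  b = (X^T X + lam I)^{-1} X^T y.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))         # synthetic design matrix
beta_true = np.array([1.0, -2.0, 0.5])   # "true" coefficients (invented)
y = X @ beta_true + 0.1 * rng.standard_normal(50)
lam = 0.1                                # regularization parameter
beta_preproc = np.linalg.solve(X.T @ X + lam * np.eye(3), X.T @ y)
print(beta_preproc)
```

The solution satisfies the normal equations (X^T X + \lambda I)\beta = X^T y, i.e., the gradient of the quadratic functional vanishes at \beta^{preproc}.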
The Least-Squares Estimation with Tikhonov Regularization
• After getting the regression coefficients, we subtract the linear least-squares model (without intercept) from the corresponding responses:

\tilde{y} = y - \sum_{j=1}^{m} X_j \hat{\beta}_j.

• Doing this at the input data, the resulting values \tilde{y} are our new responses.
• Then, based on these new data, we find the knots for the nonparametric part with MARS.
• Again consider the model

\eta_i = H(x_i, t_i) = x_i^T \beta + \gamma(t_i),

where \gamma(\cdot) is a smooth function which we try to estimate by CMARS,
an alternative technique to the well-known data mining tool multivariate adaptive regression splines (MARS).
The Least-Squares Estimation with Tikhonov Regularization
CMARS Method
• What is MARS?
• Multivariate adaptive regression splines (MARS) was developed in 1991 by Jerome Friedman.
• MARS builds flexible models by introducing piecewise linear regressions.
• MARS uses expansions in piecewise linear basis functions of the form

c^+(x, \tau) = [+(x - \tau)]_+, \quad c^-(x, \tau) = [-(x - \tau)]_+, \quad where \; [q]_+ := \max\{0, q\}.

These are the basic elements in the regression with MARS.
• Set of basis functions:

\wp_j := \{x_{1,j}, x_{2,j}, ..., x_{N,j}\} \quad (j = 1, 2, ..., p).
(Figure: the pair of truncated linear basis functions c^+(x, \tau) = [+(x - \tau)]_+ and c^-(x, \tau) = [-(x - \tau)]_+, plotted against x.)
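A direct transcription of these basis functions (a sketch, with illustrative knot values):

```python
# Sketch: the MARS truncated linear ("hinge") functions
# c+(x, tau) = [+(x - tau)]_+ and c-(x, tau) = [-(x - tau)]_+,
# where [q]_+ = max{0, q}.
def pos(q):
    return max(0.0, q)

def c_plus(x, tau):
    return pos(x - tau)

def c_minus(x, tau):
    return pos(-(x - tau))

# A knot at tau = 1: the two pieces partition the axis around the knot.
print(c_plus(2.0, 1.0), c_minus(2.0, 1.0), c_minus(0.0, 1.0))
```

Note that c^+(x, \tau) - c^-(x, \tau) = x - \tau, so the pair can reproduce a full linear term while each piece remains one-sided.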
CMARS Method
• Thus, we can represent \gamma(t) by a linear combination which is successively built up by basis functions and the intercept \theta_0, such that

\eta_i = H(x_i, t_i) = x_i^T \beta + \theta_0 + \sum_{m=1}^{M} \theta_m \psi_m(t_i).

Here, \psi_m (m = 1, 2, ..., M) are basis functions from \wp or products of two or more such functions; interaction basis functions are created by multiplying an existing basis function with a truncated linear function involving a new variable. The \theta_m are the unknown coefficients for the m-th basis function (m = 1, 2, ..., M) or for the constant 1 (m = 0).
• Provided the observations represented by the data \tilde{t}_i (i = 1, 2, ..., N), the basis functions take the form

\psi_m(t) := \prod_{j=1}^{K_m} \left[ s_j^m \cdot (t_{\kappa_j^m} - \tau_j^m) \right]_+,

where K_m is the number of truncated linear functions multiplied in the m-th basis function, t_{\kappa_j^m} is the input variable corresponding to the j-th truncated linear function, \tau_j^m is the corresponding knot value, and s_j^m \in \{\pm 1\}.
CMARS Method
The MARS algorithm for estimating the model function consists of two sub-algorithms:
I. Forward stepwise algorithm:
• Search for the basis functions.
• At each step, the split that minimizes some "lack of fit" criterion among all the possible splits on each basis function is chosen.
• The process stops when a user-specified value M_max is reached. At the end of this process we have a large model.
• This model typically overfits the data, so a backward deletion procedure is applied.
II. Backward stepwise algorithm:
• Prevents over-fitting by decreasing the complexity of the model without degrading the fit to the data.
• Proposition: We do not employ the backward stepwise algorithm to estimate the function.
In its place, we propose to use penalty terms in addition to the least-squares estimation, in order to control the lack of fit from the viewpoint of the complexity of the estimation.
The Penalized Residual Sum of Squares for GLM with MARS
• Let us consider the equation

\eta_i = H(x_i, t_i) = x_i^T \beta + \gamma(t_i),

where \theta = (\theta_0, \theta_1, ..., \theta_{M_{max}})^T and M \le M_{max}. Let us use the penalized residual sum of squares with M_max basis functions having been accumulated in the forward stepwise algorithm. For the GLM with MARS, PRSS has the following form:

PRSS := \sum_{i=1}^{N} \left( y_i - x_i^T \beta - \theta^T \psi(\tilde{t}_i) \right)^2 + \sum_{m=1}^{M_{max}} \phi_m \sum_{|\alpha|=1}^{2} \sum_{\substack{r<s \\ r,s \in V(m)}} \int \theta_m^2 \left[ D_{r,s}^{\alpha} \psi_m(t^m) \right]^2 dt^m,

where

\psi(\tilde{t}_i) := (1, \psi_1(\tilde{t}_i), ..., \psi_{M_{max}}(\tilde{t}_i))^T,
D_{r,s}^{\alpha} \psi_m(t^m) := \frac{\partial^{|\alpha|} \psi_m}{\partial^{\alpha_1} t_r^m \, \partial^{\alpha_2} t_s^m}(t^m),
V(m) := \{ \kappa_j^m \mid j = 1, 2, ..., K_m \},
t^m := (t_{m_1}, t_{m_2}, ..., t_{m_{K_m}})^T,
\alpha = (\alpha_1, \alpha_2)^T, \; |\alpha| := \alpha_1 + \alpha_2, \; \alpha_1, \alpha_2 \in \{0, 1\}.
The Penalized Residual Sum of Squares for GLM with MARS
• Our optimization problem is based on the tradeoff between accuracy, i.e., a small sum of error squares, and not too high a complexity.
• This tradeoff is established through the penalty parameters \phi_m.
• Let us approximate PRSS by discretizing the high-dimensional integrals: for each m, we select grid points

\hat{t}^m_{(l_1^m, ..., l_{K_m}^m)} := (\hat{t}^{m,1}_{l_1^m}, \hat{t}^{m,2}_{l_2^m}, ..., \hat{t}^{m,K_m}_{l_{K_m}^m})^T, \quad l_j^m \in \{0, 1, ..., N+1\} \quad (j = 1, 2, ..., K_m),

and write

d_i := (1, \psi_1(\tilde{t}_i), ..., \psi_M(\tilde{t}_i), \psi_{M+1}(\tilde{t}_i), ..., \psi_{M_{max}}(\tilde{t}_i))^T.

Then, PRSS becomes

PRSS \approx \sum_{i=1}^{N} \left( y_i - x_i^T \beta - \theta^T d_i \right)^2 + \sum_{m=1}^{M_{max}} \phi_m \sum_{|\alpha|=1}^{2} \sum_{\substack{r<s \\ r,s \in V(m)}} \sum_{i=1}^{(N+1)^{K_m}} \theta_m^2 \left[ D_{r,s}^{\alpha} \psi_m(\hat{t}_i^m) \right]^2 \Delta \hat{t}_i^m,

where the points \hat{t}_i^m (i = 1, 2, ..., (N+1)^{K_m}) enumerate the grid points above and \Delta \hat{t}_i^m denotes the corresponding volume element.
Tikhonov Regularization for GLM with MARS
• For a short representation, we can rewrite the approximate relation as

\sum_{|\alpha|=1}^{2} \sum_{\substack{r<s \\ r,s \in V(m)}} \left[ D_{r,s}^{\alpha} \psi_m(\hat{t}_i^m) \right]^2 \Delta \hat{t}_i^m \approx L_{im}^2.

• We can write PRSS as

PRSS \approx \| y - X^* \beta^* \|_2^2 + \sum_{m=1}^{M_{max}} \phi_m \sum_{i=1}^{(N+1)^{K_m}} L_{im}^2 \theta_m^2,

where X^* = (X, \psi(d)) is a block matrix constructed by the (N \times p) matrix X and the (N \times (M_{max}+1)) matrix \psi(d) := (\psi(d_1), ..., \psi(d_N))^T, and \beta^* = (\beta^T, \theta^T)^T is a vector constructed by the \beta and \theta vectors.
• Then, we deal with the linear system of equations y = X^* \beta^*, approximately.
Tikhonov Regularization for GLM with MARS
• We approach our problem PRSS as a Tikhonov regularization problem by using the same penalty parameter \lambda for each derivative term (\phi_m =: \lambda, m = 1, 2, ..., M_max):

PRSS \approx \| y - X^* \beta^* \|_2^2 + \lambda \| L^* \beta^* \|_2^2.

Here, the high-dimensional matrix L^* is of blockwise structure, with entries built from the values L_{im}, i.e., from first- and second-order derivatives of the basis functions \psi_m.
• We can easily note that our Tikhonov regularization problem has multiple objective functions, combined through a linear combination of \| y - X^* \beta^* \|_2^2 and \| L^* \beta^* \|_2^2. We select the solution such that it minimizes both the first objective function \| y - X^* \beta^* \|_2^2 and the second objective \| L^* \beta^* \|_2^2 in the sense of a compromise (tradeoff) solution.
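Numerically, the combined problem min \|y - X^*\beta^*\|^2 + \lambda \|L^*\beta^*\|^2 can be solved as one stacked least-squares system; a sketch with small invented stand-ins for X^*, L^* and y:

```python
import numpy as np

# Sketch: solve min ||y - Xs b||^2 + lam * ||Ls b||^2 via the stacked system
# [ Xs ; sqrt(lam) * Ls ] b ~ [ y ; 0 ].  Xs, Ls, y are illustrative stand-ins
# for X*, L*, y; they are NOT the matrices of the lecture.
rng = np.random.default_rng(1)
Xs = rng.standard_normal((20, 4))   # stand-in for X*
Ls = np.eye(4)                      # stand-in for L*
y = rng.standard_normal(20)
lam = 0.5

A = np.vstack([Xs, np.sqrt(lam) * Ls])
b = np.concatenate([y, np.zeros(4)])
beta_star, *_ = np.linalg.lstsq(A, b, rcond=None)
print(beta_star)
```

The stacked formulation is numerically preferable to forming the normal equations (X^{*T}X^* + \lambda L^{*T}L^*) explicitly, though both give the same minimizer.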
An Alternative Solution for Tikhonov Regularization Problem with CQP
• We can solve the Tikhonov regularization problem for MARS by continuous optimization techniques, especially conic quadratic programming (CQP). In general, a CQP has the form

\min_x \; c^T x, \quad subject to \; \| D_i x - d_i \|_2 \le p_i^T x - q_i \; (i = 1, 2, ..., k).

• We formulate PRSS as a CQP problem:

\min_{t, \beta^*} \; t, \quad subject to \; \| X^* \beta^* - y \|_2 \le t, \quad \| L^* \beta^* \|_2 \le \sqrt{\tilde{M}},

where \sqrt{\tilde{M}} is a prescribed bound. This fits the general form with

x = (t, \beta^{*T})^T, \; c = (1, 0, ..., 0)^T, \; D_1 = (0_N, X^*), \; d_1 = y, \; p_1 = (1, 0, ..., 0)^T, \; q_1 = 0,
D_2 = (0_{M_{max}+1}, L^*), \; d_2 = 0_{M_{max}+1}, \; p_2 = 0, \; q_2 = -\sqrt{\tilde{M}}.
• We first reformulate this as a primal problem:

\min_{t, \beta^*} \; t,
such that \; \chi := X^* \beta^* - y, \quad \eta := L^* \beta^*,
(\chi^T, t)^T \in L^{N+1}, \quad (\eta^T, \sqrt{\tilde{M}})^T \in L^{M_{max}+2},

with the ice-cream (or second-order, or Lorentz) cones

L^{N+1} := \{ x = (x_1, x_2, ..., x_{N+1})^T \in \mathbb{R}^{N+1} \mid x_{N+1} \ge \sqrt{x_1^2 + ... + x_N^2} \} \quad (N \ge 1).
An Alternative Solution for Tikhonov Regularization Problem with CQP
• The corresponding dual problem is

\max \; (y^T, 0)(\omega_1^T, \sigma_1)^T + (0_{M_{max}+1}^T, -\sqrt{\tilde{M}})(\omega_2^T, \sigma_2)^T,
such that \; (X^*)^T \omega_1 + (L^*)^T \omega_2 = 0, \quad \sigma_1 = 1,
(\omega_1^T, \sigma_1)^T \in L^{N+1}, \quad (\omega_2^T, \sigma_2)^T \in L^{M_{max}+2}.
An Alternative Solution for Tikhonov Regularization Problem with CQP
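The CQP above couples the residual norm with the norm bound \sqrt{\tilde{M}}. As an illustrative (non-conic) sketch of the same tradeoff, one can recover the norm-constrained solution by tuning a ridge parameter \lambda until \|\beta(\lambda)\|_2 meets the bound r; the data and r below are invented, and in practice a conic solver such as MOSEK handles the CQP directly:

```python
import numpy as np

# Sketch: min ||Xs b - y||_2  s.t.  ||b||_2 <= r, solved by bisection on the
# ridge parameter lam so that ||b(lam)||_2 = r when the bound is active.
# (This only illustrates the CQP <-> Tikhonov correspondence; all data are
# invented stand-ins for X*, y and sqrt(M~).)
rng = np.random.default_rng(2)
Xs = rng.standard_normal((30, 3))
y = Xs @ np.array([2.0, -3.0, 1.0]) + 0.1 * rng.standard_normal(30)
r = 0.5                              # stand-in for the bound sqrt(M~)

def ridge(lam):
    return np.linalg.solve(Xs.T @ Xs + lam * np.eye(3), Xs.T @ y)

lo, hi = 0.0, 1e6
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if np.linalg.norm(ridge(mid)) > r:
        lo = mid                     # bound still violated: regularize harder
    else:
        hi = mid
b_cqp = ridge(hi)
print(np.linalg.norm(b_cqp))
```

Since \|\beta(\lambda)\|_2 decreases monotonically in \lambda, the bisection converges to the \lambda at which the constraint becomes active, mirroring the compromise between the two objectives discussed above.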
Solution Methods
• CQPs belong to the well-structured convex problems.
• Interior point algorithms:
– exploit the structure of the problem,
– yield better complexity bounds,
– exhibit much better practical performance.
Application
• GPLMs with CMARS and the parameter estimation for them have been presented and investigated in detail. Now, a numerical example for this study will be given.
• Two data sets are used in our applications:
• Concrete Compressive Strength Data Set
• Concrete Slump Test Data Set
• The data sets are obtained from the UCI Machine Learning Repository (http://archive.ics.uci.edu/ml/).
Application
• Salford Systems MARS is used for the MARS application; for CMARS, a code is written in MATLAB, and the MOSEK software is used to solve the CQP problem arising in CMARS. For Tikhonov regularization, the Regularization Toolbox in MATLAB is used.
• All test data sets are also compared according to performance measures such as Root Mean Square Error (RMSE), the correlation coefficient (r), R², and adjusted R².
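The performance measures named above can be computed as follows (a sketch; the vectors y, ŷ and the predictor count p are invented example values, not results from the lecture's data sets):

```python
import numpy as np

# Sketch: RMSE, correlation coefficient r, R^2 and adjusted R^2 for a
# fitted model; y, yhat and the predictor count p are purely illustrative.
y = np.array([1.0, 2.0, 3.0, 4.0, 5.0])       # observed responses
yhat = np.array([1.1, 1.9, 3.2, 3.8, 5.1])    # fitted values
n, p = len(y), 2

rmse = np.sqrt(np.mean((y - yhat) ** 2))
r = np.corrcoef(y, yhat)[0, 1]
ss_res = np.sum((y - yhat) ** 2)
ss_tot = np.sum((y - np.mean(y)) ** 2)
r2 = 1.0 - ss_res / ss_tot
adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)
print(rmse, r, r2, adj_r2)
```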
• To compare the performances of Tikhonov regularization, CMARS and GPLM models,
let us look at the performance measure values for both data sets.
(Table: performance measure values of the Tikhonov regularization, CMARS and GPLM models for both data sets, ordered from worse to better.)
Evaluation of the models based on performance values:
• CMARS performs better than Tikhonov regularization with respect to all the measures for both data sets.
• On the other hand, GLM with CMARS (GPLM) performs better than both Tikhonov regularization and CMARS with respect to all the measures for both data sets.
Application
Outlook
• Important new class of GPLMs: having written the Tikhonov regularization task for GLM using MARS as a CQP problem, we will call the resulting method CGLMARS:

E(Y \mid X, T) = G(X^T \beta + \gamma(T)),
\underbrace{\eta(X, T)}_{GPLM} = \underbrace{X^T \beta}_{LM} + \underbrace{\gamma(T)}_{MARS}.
References
[1] Aster, A., Borchers, B., and Thurber, C., Parameter Estimation and Inverse Problems, Academic
Press, 2004.
[2] Craven, P., and Wahba, G., Smoothing noisy data with spline functions, Numer. Math. 31
(1979), 377-403.
[3] De Boor, C., Practical Guide to Splines, Springer Verlag, 2001.
[4] Dongarra, J.J., Bunch, J.R., Moler, C.B., and Stewart, G.W., Linpack User’s Guide, Philadelphia,
SIAM, 1979.
[5] Friedman, J.H., Multivariate adaptive regression splines, (1991), The Annals of Statistics
19, 1, 1-141.
[6] Green, P.J., and Yandell, B.S., Semi-Parametric Generalized Linear Models, Lecture Notes in
Statistics, 32 (1985).
[7] Hastie, T.J., and Tibshirani, R.J., Generalized Additive Models, New York, Chapman and Hall, 1990.
[8] Kincaid, D., and Cheney, W., Numerical Analysis: Mathematics of Scientific computing, Pacific
Grove, 2002.
[9] Müller, M., Estimation and testing in generalized partial linear models – a comparative study,
Statistics and Computing 11 (2001), 299-309.
[10] Nelder, J.A., and Wedderburn, R.W.M., Generalized linear models, Journal of the Royal Statistical
Society A 135 (1972), 370-384.
[11] Nemirovski, A., Lectures on modern convex optimization, Israel Institute of Technology
http://iew3.technion.ac.il/Labs/Opt/opt/LN/Final.pdf.
References
[12] Nesterov, Y.E , and Nemirovskii, A.S., Interior Point Methods in Convex Programming,
SIAM, 1993.
[13] Ortega, J.M., and Rheinboldt, W.C., Iterative Solution of Nonlinear Equations in Several
Variables, Academic Press, New York, 1970.
[14] Renegar, J., Mathematical View of Interior Point Methods in Convex Programming, SIAM,
2000.
[15] Scheid, F., Numerical Analysis, McGraw-Hill Book Company, New York, 1968.
[16] Taylan, P., Weber, G.-W., and Beck, A., New approaches to regression by generalized
additive and continuous optimization for modern applications in finance, science and
technology, Optimization 56, 5-6 (2007), pp. 1-24.
[17] Taylan, P., Weber, G.-W., and Liu, L., On foundations of parameter estimation for
generalized partial linear models with B-splines and continuous optimization, in the
proceedings of PCO 2010, 3rd Global Conference on Power Control and Optimization,
February 2-4, 2010, Gold Coast, Queensland, Australia.
[18] Weber, G.-W., Akteke-Öztürk, B., İşcanoğlu, A., Özöğür, S., and Taylan, P., Data Mining:
Clustering, Classification and Regression, four lectures given at the Graduate Summer
School on New Advances in Statistics, Middle East Technical University, Ankara, Turkey,
August 11-24, 2007 (http://www.statsummer.com/).
[19] Wood, S.N., Generalized Additive Models, An Introduction with R, New York, Chapman
and Hall, 2006.
Thank you very much for your attention!