Predictive Distribution of Regression Vector and Residual Sum of Squares for Normal Multiple Regression Model

This article was downloaded by: [McGill University Library] On: 17 November 2014, At: 06:26. Publisher: Taylor & Francis. Informa Ltd, Registered in England and Wales, Registered Number: 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

Communications in Statistics - Theory and Methods. Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/lsta20

Published online: 15 Feb 2007.

To cite this article: Shahjahan Khan (2005) Predictive Distribution of Regression Vector and Residual Sum of Squares for Normal Multiple Regression Model, Communications in Statistics - Theory and Methods, 33:10, 2423-2441, DOI: 10.1081/STA-200031471

To link to this article: http://dx.doi.org/10.1081/STA-200031471

Taylor & Francis makes every effort to ensure the accuracy of all the information (the "Content") contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content. This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions


Predictive Distribution of Regression Vector and Residual Sum of Squares for Normal Multiple Regression Model

Shahjahan Khan*

Department of Mathematics and Computing, University of Southern Queensland, Toowoomba, Queensland, Australia

ABSTRACT

This article proposes predictive inference for the multiple regression model with independent normal errors. The distributions of the sample regression vector (SRV) and the residual sum of squares (RSS) for the model are derived by using invariant differentials. The predictive distributions of the future regression vector (FRV) and the future residual sum of squares (FRSS) for the future regression model are also obtained. Conditional on the realized responses, the FRV is found to follow a multivariate Student t distribution, and the FRSS follows a scaled beta distribution. The new results are applied to market return and accounting rate data to illustrate their application.

*Correspondence: Shahjahan Khan, Department of Mathematics and Computing, University of Southern Queensland, Toowoomba, Queensland, Australia; E-mail: [email protected].


DOI: 10.1081/STA-200031471    0361-0926 (Print); 1532-415X (Online)

Copyright © 2004 by Marcel Dekker, Inc. www.dekker.com


COMMUNICATIONS IN STATISTICS: Theory and Methods, Vol. 33, No. 10, pp. 2423–2441, 2004


Key Words: Multiple regression model; Regression vector and residual sum of squares; Noninformative prior; Future regression model; Predictive inference; Future regression vector; Multivariate normal; Student t; Beta and gamma distributions.

AMS 1991 Subject Classification: Primary 62H10; Secondary 62J12.

1. INTRODUCTION

Predictive inference is the oldest form of statistical inference used in real life. In general, predictive inference is directed toward inference involving the observables, rather than the parameters. The predictive method was the most popular statistical tool before interest shifted to inference on the parameters of models. Predictive inference uses the realized responses from the performed experiment to make inference about the behavior of the unobserved responses of a future experiment (cf. Aitchison and Dunsmore, 1975, p. 1). The outcomes of the two experiments are connected through the same structure of the model and indexed by the same set of parameters. For details on predictive inference methods and their wide range of applications, readers may refer to Aitchison and Dunsmore (1975) and Geisser (1993). The prediction distribution for a set of future responses of the model, conditional on the realized responses from the same model, has been derived by many authors, including Aitchison and Sculthorpe (1965), Fraser and Haq (1969), and Haq and Khan (1990). The prediction distribution of a set of future responses from the model has been used by Guttman (1970), Haq and Rinco (1973), and Khan (1992) to derive the β-expectation tolerance region. Many other kinds of applications of prediction distributions are available in the literature (e.g., Geisser, 1993).

As in almost every other branch of statistics, there have been many studies in the area of predictive inference, mainly for the independent normal error model. The pioneering work in this area includes Fraser and Guttman (1956), Aitchison (1964), Aitchison and Sculthorpe (1965), Fraser and Haq (1969), and Guttman (1970). Aitchison and Dunsmore (1975) provided an excellent account of the theory and application of prediction methods. Fraser and Haq (1969) obtained the prediction distribution for the multivariate normal model by using the structural distribution, instead of the Bayes posterior distribution. Haq (1982) used the structural relations, rather than the structural


distribution, to derive the prediction distribution. Geisser (1993) discussed the Bayesian approach to predictive inference and included a wide range of real-life applications of the method, including model selection, discordancy, perturbation analysis, classification, regulation, screening, and interim analysis. Predictive inference for the linear model has been dealt with by Lieberman and Miller (1963), Bishop (1976), and Ng (2000). Haq and Rinco (1976) derived the β-expectation tolerance region for a generalized linear model with multivariate normal errors, using the prediction distribution obtained by the structural approach. Unlike the earlier normal theory-based studies, Khan (1992), Khan and Haq (1994), Fang and Anderson (1990), Khan (1996), and Ng (2000) provide predictive analyses of linear models with multivariate Student t errors and with spherical errors.

In this article, we consider the widely used multiple regression model for the realized responses, as well as for the unobserved future responses. The two sets of errors are assumed to follow independent normal distributions. However, they are connected to one another through the common regression and scale parameters. Here, we pursue the predictive approach to derive the distribution of the regression vector and the RSS of the future responses, conditional on the set of realized responses. This is a new approach that proposes predictive inference for the regression parameters of the multiple regression model based on the future responses. The proposed predictive inference of the regression parameters depends on the realized responses, but not through the prediction distribution of the future responses. First, the joint distribution of the sample regression vector (SRV) and the RSS of the errors is derived from the joint distribution of the two error vectors by using invariant differentials (cf. Fraser, 1968, p. 30). Then the distributions of the SRV and the RSS of the realized responses are derived by using appropriate transformations. The SRV is found to be a multivariate normal vector, and the RSS statistic turns out to be a scaled gamma variable. These two statistics are independently distributed. Finally, the distributions of the same statistics of the future regression model, that is, the future regression vector (FRV) and the RSS of the future responses, conditional on the realized responses, are obtained by using the noninformative prior distribution for the parameters.

The predictive distribution of the FRV follows a multivariate Student t distribution, and that of the RSS of the future regression follows a scaled beta distribution. Unlike the SRV and RSS of the realized regression model, the same statistics for the future regression model, conditional on the realized responses, are dependent, and hence their joint density cannot be factored into the marginal distributions.


On many occasions, researchers may need to predict the value of a parameter, rather than the response itself. In particular, if the interest is in predictive inference on the regression parameter, the rate of change in the response variable per unit change in the explanatory variable, we need to find the prediction distribution of the future slope vector. Given a set of realized responses and an appropriate prior distribution for the underlying parameters, this can be obtained by defining the joint distribution of the parameters and the FRV based on the unobserved future responses. In this article, we assume the noninformative prior distribution for the parameters of the model under consideration. Ng (2000) used an improper prior for the derivation of the prediction distribution.

In the next section, we discuss the multiple regression model with normal errors. Some preliminaries are provided in Sec. 3. Distributions of the SRV and the RSS of the realized model are obtained in Sec. 4. The multiple regression model for the future responses is introduced in Sec. 5. The predictive distributions of the regression vector and the RSS of the future regression model, conditional on the realized responses, are derived in Sec. 6. An illustrative example based on stock market data is provided in Sec. 7. Some concluding remarks are included in Sec. 8.

2. THE MULTIPLE REGRESSION MODEL

Consider the commonly used linear regression equation

y = βx + σe,   (2.1)

where y is the response variable, β is the vector of regression parameters taking values in the p-dimensional real space R^p, x is the vector of p regressors, σ is the scale parameter taking values in the positive half of the real line R⁺, and e is the error variable associated with the response y. Assume the error component, e, is normally distributed with mean 0 and variance 1, so the variance of y is σ². Now, consider a set of n > p independent responses, y = (y₁, y₂, …, y_n), from the above regression model, which can be expressed as

y = βX + σe,   (2.2)

where the n-dimensional row vector y is the vector of the response variable; X is the p × n matrix of the values of the p regressors; e is the 1 × n row vector of the error components associated with the


response vector y; and the regression vector β and scale parameter σ are the same as defined in (2.1). The error vector then follows a multivariate normal distribution with mean 0, a vector of n zeros, and variance–covariance matrix I_n. Therefore, the joint density function of the vector of errors becomes

f(e) = [2π]^{−n/2} exp{−½ ee′}.   (2.3)

Consequently, the response vector follows a multivariate normal distribution with mean vector βX and variance–covariance matrix σ²I_n. Thus, the joint density function of the response vector becomes

f(y; β, σ²) = [2πσ²]^{−n/2} exp{−(1/2σ²)(y − βX)(y − βX)′}.   (2.4)

In this article, we call the above multiple regression model the realized model of the responses from the performed experiment. The above joint density becomes the likelihood function of β and σ² when treated as a function of the parameters, rather than of the sample responses. The maximum likelihood estimators (MLEs) of the parameters, and the likelihood ratio test of any hypothesis regarding the regression parameters, can be derived from the likelihood function. It is well known that the MLE of the regression parameter of this model is the same as the ordinary least squares (OLS) estimator, and hence is best linear unbiased. However, in this article, we are interested in the distributions of the SRV and the RSS for the realized responses from the above multiple regression model, as well as those of the FRV and the future residual sum of squares (FRSS) of the unobserved future responses from the future regression model defined in Sec. 5.
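To make the row-vector convention of (2.2) concrete, the sketch below (not from the paper; all dimensions, parameter values, and data are invented for illustration) simulates one realized sample and computes the OLS estimate of β, which is the sample regression vector formalized in Sec. 3, together with the residual sum of squares:

```python
import numpy as np

rng = np.random.default_rng(0)

# Row-vector convention of the paper: y is 1 x n, X is p x n, beta is 1 x p.
p, n = 3, 50
X = rng.normal(size=(p, n))
beta = np.array([1.0, -2.0, 0.5])   # hypothetical true regression vector
sigma = 1.5                          # hypothetical true scale parameter

e = rng.normal(size=n)               # standard normal errors
y = beta @ X + sigma * e             # model (2.2)

# Sample regression vector b(y) = y X'(XX')^{-1}, the OLS/ML estimator of beta
b_y = y @ X.T @ np.linalg.inv(X @ X.T)

# Residual sum of squares s^2(y) = [y - b(y)X][y - b(y)X]'
resid = y - b_y @ X
s2_y = resid @ resid

print(b_y)   # close to beta for moderately large n
print(s2_y)
```

With n = 50 observations, b_y typically lands within a few tenths of the true β, reflecting the covariance σ²(XX′)⁻¹ derived in Sec. 4.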

3. SOME PRELIMINARIES

Some useful notations are introduced in this section to facilitate the derivation of the results in forthcoming sections. First, we denote the SRV of e on X by b(e) and the RSS of the error vector by s²(e). Then we have

b(e) = eX′(XX′)⁻¹ and s²(e) = [e − b(e)X][e − b(e)X]′.   (3.1)

Let s(e) be the positive square root of the RSS based on the error regression, s²(e), and let d(e) = s⁻¹(e)[e − b(e)X] be the "standardized" residual vector of the error regression.


Now we can write the error vector, e, as a function of b(e) and s(e) in the following way:

e = b(e)X + s(e)d(e), and hence ee′ = b(e)XX′b′(e) + s²(e),   (3.2)

because d(e)d′(e) = 1, d(e) being a unit vector, and Xd′(e) = 0, because X and d(e) are orthogonal.

From (3.2) and (2.2), the following relations (cf. Fraser, 1968, p. 127) can easily be established:

b(e) = σ⁻¹{b(y) − β}, and s²(e) = σ⁻²s²(y),   (3.3)

where

b(y) = yX′(XX′)⁻¹ and s²(y) = [y − b(y)X][y − b(y)X]′   (3.4)

are the sample regression vector of y on X and the RSS of the regression based on the realized responses, respectively. It may be mentioned here that s²(e) and s²(y) have the same structure, because the definitions of s²(e) in (3.3) and of s²(y) in (3.4) ensure the same format for the two residual statistics of the errors and the realized responses, respectively. Haq (1982) called the relations in (3.3) the structural relations. It can easily be shown that d(e) = s⁻¹(y)[y − b(y)X] = d(y). From the above results, the density of the error vector in (2.3) can be written as a function of b(e) and s(e) as follows:

f(e) = c × exp{−½[b(e)XX′b′(e) + s²(e)]},   (3.5)

where c is an appropriate normalizing constant. In Sec. 5, we define the analogous FRV and FRSS for the future regression model.
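The decomposition (3.2) and the structural relations (3.3) can be checked numerically. The small sketch below (not from the paper; all values invented) confirms that d(e) is a unit vector orthogonal to the rows of X, that the structural relations hold, and that d(y) = d(e):

```python
import numpy as np

rng = np.random.default_rng(1)
p, n = 2, 12
X = rng.normal(size=(p, n))
beta = np.array([0.7, -1.2])   # hypothetical parameters
sigma = 2.0
e = rng.normal(size=n)
y = beta @ X + sigma * e

XXt_inv = np.linalg.inv(X @ X.T)

def b_of(v):
    # regression vector of v on X: v X'(XX')^{-1}
    return v @ X.T @ XXt_inv

def s2_of(v):
    # residual sum of squares of v on X
    r = v - b_of(v) @ X
    return r @ r

b_e, s_e = b_of(e), np.sqrt(s2_of(e))
d_e = (e - b_of(e) @ X) / s_e          # standardized residual vector d(e)

# Decomposition (3.2): e = b(e)X + s(e)d(e), with d(e)d(e)' = 1 and X d(e)' = 0
assert np.allclose(e, b_e @ X + s_e * d_e)
assert np.isclose(d_e @ d_e, 1.0)
assert np.allclose(X @ d_e, 0.0)

# Structural relations (3.3): b(e) = (b(y) - beta)/sigma, s^2(e) = s^2(y)/sigma^2
assert np.allclose(b_e, (b_of(y) - beta) / sigma)
assert np.isclose(s2_of(e), s2_of(y) / sigma**2)

# And d(y) = d(e)
d_y = (y - b_of(y) @ X) / np.sqrt(s2_of(y))
assert np.allclose(d_y, d_e)
print("all structural relations verified")
```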

4. DISTRIBUTION OF SRV AND RSS

From the probability density of e in (2.3) and the relation (3.2), the joint probability density of b(e) and s²(e), conditional on d(e), is obtained by using the invariant differentials (see Eaton, 1983, pp. 194–206, or Fraser, 1968, p. 30) as follows:

f(b(e), s²(e) | d(e)) = K₁(d)[s²(e)]^{(n−p−2)/2} exp{−½[b(e)XX′b′(e) + s²(e)]},   (4.1)


where K₁(d) is the normalizing constant. It can be shown that the above density does not depend on d(e) (cf. Fraser, 1978, p. 113) and can be written as the product of two densities in the following way:

f(b(e), s²(e)) = f₁(b(e)) × f₂(s²(e)),   (4.2)

where

f₁(b(e)) = K₁₁ exp{−½ b(e)XX′b′(e)} and f₂(s²(e)) = K₁₂[s²(e)]^{(n−p−2)/2} exp{−½ s²(e)},   (4.3)

in which

K₁₁⁻¹ = [2π]^{p/2}|XX′|^{−1/2} and K₁₂⁻¹ = [2]^{(n−p)/2} Γ((n − p)/2)   (4.4)

are the respective normalizing constants. Clearly, the joint distribution factors, and hence the marginal distributions of b(e) and s²(e) are independent of one another. Therefore, the SRV based on the error regression follows a multivariate normal distribution with mean 0 and covariance matrix [XX′]⁻¹; that is, b(e) ~ N_p(0, [XX′]⁻¹). The RSS of the error regression, s²(e), follows a gamma distribution with shape parameter (n − p)/2 and scale 2.

To find the distributions of the SRV of the response regression, b(y), and the RSS of the response regression, s²(y), we use the following relations of the multiple regression model:

b(e) = σ⁻¹[b(y) − β] and s²(e) = σ⁻²s²(y).   (4.5)

So the associated differentials can be expressed as

db(e) = σ^{−p} db(y) and ds²(e) = σ⁻² ds²(y).   (4.6)

Therefore, the density function of b(y) is written as

f(b(y)) = [2πσ²]^{−p/2}|XX′|^{1/2} exp{−(1/2σ²)(b(y) − β)XX′(b(y) − β)′},   (4.7)

and that of s²(y) is given by

f(s²(y)) = {[2]^{(n−p)/2} Γ((n − p)/2)}⁻¹ [s²(y)]^{(n−p−2)/2} [σ²]^{−(n−p)/2} exp{−(1/2σ²) s²(y)}.   (4.8)


Thus, the SRV of the realized response regression follows a multivariate normal distribution with mean vector β and covariance matrix σ²(XX′)⁻¹, that is, b(y) ~ N_p(β, σ²(XX′)⁻¹), and the RSS of the response regression, s²(y), is distributed as a scaled gamma variable with shape parameter (n − p)/2. The SRV and the RSS are independently distributed; this holds for both the error regression and the response regression. However, the parameters of the distributions of the statistics of the error regression differ from those of the response regression.
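These two sampling distributions are easy to confirm by simulation. The sketch below (not from the paper; dimensions and parameter values are hypothetical) repeatedly generates responses from model (2.2) and compares the empirical moments of b(y) and s²(y) with N_p(β, σ²(XX′)⁻¹) and the scaled gamma (chi-square) law, under which s²(y)/σ² has mean n − p:

```python
import numpy as np

rng = np.random.default_rng(2)
p, n, sigma = 2, 20, 1.3
X = rng.normal(size=(p, n))
beta = np.array([0.5, 1.5])          # hypothetical true parameters
XXt_inv = np.linalg.inv(X @ X.T)

reps = 20000
B = np.empty((reps, p))
S2 = np.empty(reps)
for i in range(reps):
    y = beta @ X + sigma * rng.normal(size=n)
    b = y @ X.T @ XXt_inv            # SRV b(y)
    r = y - b @ X
    B[i], S2[i] = b, r @ r           # RSS s^2(y)

# b(y) ~ N_p(beta, sigma^2 (XX')^{-1}): compare empirical mean and covariance
print(B.mean(axis=0), beta)
print(np.cov(B.T))
print(sigma**2 * XXt_inv)

# s^2(y)/sigma^2 is gamma with shape (n-p)/2 and scale 2, i.e. chi^2_{n-p}
print((S2 / sigma**2).mean(), n - p)
```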

5. REGRESSION MODEL FOR FUTURE RESPONSES

In this section, we introduce the idea of a predictive model for future responses, and use both the realized sample and the unobserved future sample to derive the distributions of the FRV and the FRSS. First, consider a set of n_f ≥ p future unobserved responses, y_f = (y_f1, y_f2, …, y_fn_f), from the multiple regression model as given in (2.1), with the same regression and scale parameters as defined in Sec. 2. Such a set of future responses can be expressed as

y_f = βX_f + σe_f,   (5.1)

where X_f is the p × n_f matrix of the values of the regressors that generate the future response vector y_f, and e_f is the n_f-dimensional row vector of future error terms. The future responses are assumed to be generated by the same data-generating process as the realized responses and involve the same regression and scale parameters. Thus, the responses of the realized sample and the unobserved future responses are related through the same indexing parameters, β and σ². We assume the noninformative prior distribution for these parameters. Our objective here is to find the distributions of the FRV and the RSS of the future regression model, conditional on the realized responses.

Following the same process as in Sec. 2, we define the following statistics based on the future regression model:

b_f(e_f) = e_f X_f′(X_f X_f′)⁻¹, s_f²(e_f) = [e_f − b_f(e_f)X_f][e_f − b_f(e_f)X_f]′,   (5.2)

in which b_f(e_f) is the FRV and s_f²(e_f) is the RSS of the future errors of the future model, respectively. Then we can write the future error vector, e_f,


in the following way:

e_f = b_f(e_f)X_f + s_f(e_f)d_f(e_f),   (5.3)

where s_f(e_f) is the positive square root of s_f²(e_f), and hence we get

e_f e_f′ = b_f(e_f)X_f X_f′ b_f′(e_f) + s_f²(e_f),   (5.4)

because X_f and d_f(e_f) are orthogonal and d_f(e_f) is a unit vector. Moreover, the following relations can easily be observed:

b_f(e_f) = σ⁻¹{b_f(y_f) − β}, and s_f²(e_f) = σ⁻²s_f²(y_f),   (5.5)

where

b_f(y_f) = y_f X_f′(X_f X_f′)⁻¹ and s_f²(y_f) = [y_f − b_f(y_f)X_f][y_f − b_f(y_f)X_f]′,   (5.6)

in which b_f(y_f) is the FRV of the future responses and s_f²(y_f) is the RSS of the future responses, respectively. Note that the future response vector, independent of the realized responses, follows an n_f-dimensional multivariate normal distribution; that is, y_f ~ N_{n_f}(βX_f, σ²I_{n_f}). Following the same argument as in Sec. 2, the density function of the future error vector is given by

f(e_f) = [2π]^{−n_f/2} exp{−½ e_f e_f′},   (5.7)

and hence, by using the invariant differentials as in Sec. 4, we get the joint distribution of b_f(e_f) and s_f²(e_f) as follows:

f(b_f(e_f), s_f²(e_f)) = K₂ × [s_f²(e_f)]^{(n_f−p−2)/2} exp{−½[b_f(e_f)X_f X_f′ b_f′(e_f) + s_f²(e_f)]},   (5.8)

where K₂ is the normalizing constant. The unconditional marginal distributions of the FRV and FRSS of the error regression for the future model can be obtained from the joint density in (5.8). Because the future sample is independent of the realized sample, the joint density function of the combined error vector, that is, the errors associated with the realized and future responses (e, e_f), can be expressed as

f(e, e_f) = [2π]^{−(n+n_f)/2} exp{−½[ee′ + e_f e_f′]}.   (5.9)

Haq and Khan (1990) used this density function to derive the prediction distribution of future responses, conditional on the realized responses. Figure 1 provides the graph of the prediction distribution for the


Figure 1. Graph of prediction distribution of the future regression parameter.


accounting rates of stocks for the data used by Barlev and Levy (1979). Here, we use this density function to derive the prediction distributions of the FRV and the future sum of squared errors.

6. PREDICTIVE DISTRIBUTIONS OF FRV AND FRSS

In this section, we derive the predictive distributions of the FRV and the RSS for the future multiple regression model, conditional on the realized responses. Because both the realized and future regression models involve the same parameters, the joint distribution of the responses contains the same regression and scale parameters. In the absence of any knowledge about the parameters, we consider the noninformative prior distribution for the parameters as follows:

π(β) ∝ constant, and π(σ²) ∝ σ⁻².   (6.1)

This prior distribution is used to derive the predictive distributions of b_f(y_f) and s_f²(y_f) from the joint distribution of β, σ², b_f(y_f), and s_f²(y_f). Justification for the use of such a noninformative prior is given by Geisser (1993, pp. 60, 192), Box and Tiao (1992, p. 21), Press (1989, p. 132), and Meng (1994), among others. It is worth noting that no prior distribution is required in the structural approach (cf. Fraser, 1978), because the structural distribution, similar to the Bayes posterior distribution, can be obtained from the structural relation of the model without involving any prior distribution. Fraser and Haq (1969) showed that, under the noninformative prior, the Bayes posterior density is the same as the structural density.

6.1. Distribution of the FRV

In this subsection, we derive the prediction distribution of the FRV, conditional on the realized responses. The joint density function of the error statistics b(e), s²(e), b_f(e_f), and s_f²(e_f), for given d(·), is derived from the joint density in (5.9) by applying the properties of invariant differentials, as follows:

p(b(e), s²(e), b_f(e_f), s_f²(e_f) | d(·)) = C₁₁(·)[s²(e)]^{(n−p−2)/2} exp{−½ g₁(b, X)} × C₂₁(·)[s_f²(e_f)]^{(n_f−p−2)/2} exp{−½ g₂(b_f, X_f)},   (6.2)


where g₁(b, X) = b(e)XX′b′(e), g₂(b_f, X_f) = b_f(e_f)X_f X_f′ b_f′(e_f), and C₁₁ and C₂₁ are the normalizing constants. Because the above density does not depend on d(·), the conditioning in (6.2) can be disregarded, as we need to find the joint density of b(y), s²(y), b_f(y_f), and s_f²(y_f) from the above joint density. The structural relation of the model yields

b(e) = σ⁻¹[b(y) − β] and s²(e) = σ⁻²s²(y).   (6.3)

The joint distribution of b(y), s²(y), b_f(e_f), and s_f²(e_f) is then obtained by using the Jacobian of the transformation,

J([b(e), s²(e)] → [b(y), s²(y)]) = [σ²]^{−(p+2)/2},   (6.4)

as follows:

p(b(y), s²(y), b_f(e_f), s_f²(e_f)) = C₂ × [s²]^{(n−p−2)/2}[s_f²(e_f)]^{(n_f−p−2)/2}[σ²]^{−n/2} exp{−(1/2σ²)[χ₁(β, b) + s²] − ½[χ₂(b_f(e_f)) + s_f²(e_f)]},   (6.5)

where χ₁(β, b) = (b − β)XX′(b − β)′, χ₂(b_f(e_f)) = b_f(e_f)X_f X_f′ b_f′(e_f), b = b(y), and s² = s²(y). The normalizing constant C₂ can be obtained by integrating the right-hand side of the above function over the appropriate domains of the underlying variables. Because we are interested in the distributions of b_f(y_f) and s_f²(y_f), the FRV and RSS of the future regression, respectively, conditional on the realized responses, we do not pursue the matter any further in this article.

To derive the joint distribution of b(y), s²(y), b_f(y_f), and s_f²(y_f) from the above joint density, note that from the structure of the future regression equation we have

b_f(e_f) = σ⁻¹[b_f(y_f) − β] and s_f²(e_f) = σ⁻²s_f²(y_f),   (6.6)

where

b_f(y_f) = y_f X_f′(X_f X_f′)⁻¹, s_f²(y_f) = [y_f − b_f(y_f)X_f][y_f − b_f(y_f)X_f]′.   (6.7)

Therefore, the Jacobian of the transformation is found to be

J([b_f(e_f), s_f²(e_f)] → [b_f(y_f), s_f²(y_f)]) = [σ²]^{−(p+2)/2}.   (6.8)


Now, the joint density of b(y), s²(y), b_f(y_f), and s_f²(y_f) is obtained as

p(b, s², b_f, s_f²) = C₃(·) × [s²]^{(n−p−2)/2}[s_f²(y_f)]^{(n_f−p−2)/2}[σ²]^{−(n+n_f)/2} × exp{−(1/2σ²)[(b − β)XX′(b − β)′ + s² + (b_f − β)X_f X_f′(b_f − β)′ + s_f²]},   (6.9)

where b_f = b_f(y_f) and s_f² = s_f²(y_f) for notational convenience. From the noninformative prior distribution of the parameters of the model and the density in (6.9), we find the following joint density of β, σ², b_f(y_f), and s_f²(y_f):

p(β, σ², b_f, s_f²) = C₃(·) × [s²]^{(n−p−2)/2}[s_f²(y_f)]^{(n_f−p−2)/2}[σ²]^{−(n+n_f+2)/2} × exp{−(1/2σ²)[(b − β)XX′(b − β)′ + s² + (b_f − β)X_f X_f′(b_f − β)′ + s_f²]}.   (6.10)

A similar result can be obtained by using the structural distribution approach. In fact, the final results of this article are similar to those obtained by the structural distribution approach. Interested readers may refer to Fraser and Haq (1969) for details.

To evaluate the normalizing constant C_3(\sigma) in the previous density, we go through the following steps. Let

I_{\sigma^2} = \int_{\sigma^2} p(\beta, \sigma^2, b_f, s^2_f) \, d\sigma^2
    = [s^2_f]^{\frac{n_f-p-2}{2}} \int_{\sigma^2} [\sigma^2]^{-\frac{n+n_f+2}{2}} e^{-\frac{Q}{2\sigma^2}} \, d\sigma^2,        (6.11)

where

Q = (\beta - b)XX'(\beta - b)' + s^2 + (b_f - \beta)X_f X'_f (b_f - \beta)' + s^2_f.        (6.12)

Therefore,

I_{\sigma^2} = [2]^{\frac{n+n_f}{2}} \, \Gamma\!\Big( \frac{n+n_f+2}{2} \Big) \, [s^2_f]^{\frac{n_f-p-2}{2}} \, [Q]^{-\frac{n+n_f+2}{2}}.        (6.13)


To facilitate the further integrations, the terms involving the regression vector \beta in Q can be expressed as follows:

(\beta - b)XX'(\beta - b)' + (b_f - \beta)X_f X'_f (b_f - \beta)'
    = (\beta - FA^{-1})A(\beta - FA^{-1})' + (b_f - b)H^{-1}(b_f - b)',        (6.14)

where

F = bXX' + b_f X_f X'_f, \quad A = XX' + X_f X'_f, \quad \text{and} \quad H = [XX']^{-1} + [X_f X'_f]^{-1}.        (6.15)
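The decomposition (6.14) is the standard completing-the-square identity for two quadratic forms in \beta. As a quick numerical spot check (ours, not part of the paper), the following sketch evaluates both sides for random conformable matrices, in the paper's row-vector convention (y is 1 x n, \beta is 1 x p, X is p x n); all dimensions and values are arbitrary placeholders.

```python
# Numerical spot check (ours) of the completing-the-square identity
# (6.14)-(6.15), in the paper's row-vector convention.
import numpy as np

rng = np.random.default_rng(0)
p, n, nf = 3, 10, 8
X = rng.normal(size=(p, n))
Xf = rng.normal(size=(p, nf))
beta = rng.normal(size=(1, p))
b = rng.normal(size=(1, p))     # stands in for b(y)
bf = rng.normal(size=(1, p))    # stands in for b_f(y_f)

XX, XfXf = X @ X.T, Xf @ Xf.T
A = XX + XfXf                   # A = XX' + X_f X_f'
F = b @ XX + bf @ XfXf          # F = b XX' + b_f X_f X_f'
H = np.linalg.inv(XX) + np.linalg.inv(XfXf)

lhs = (beta - b) @ XX @ (beta - b).T + (bf - beta) @ XfXf @ (bf - beta).T
mu = F @ np.linalg.inv(A)
rhs = (beta - mu) @ A @ (beta - mu).T + (bf - b) @ np.linalg.inv(H) @ (bf - b).T
assert np.allclose(lhs, rhs)    # the two quadratic forms agree
```

The identity holds for any \beta, b, b_f because the \beta-free remainder of the square-completion collapses to (b_f - b)H^{-1}(b_f - b)' with H as in (6.15).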

Then, let

I_{\sigma^2 \beta} = \int_{\beta} I_{\sigma^2} \, d\beta
    = [2]^{\frac{n+n_f}{2}} \, \Gamma\!\Big( \frac{n+n_f+2}{2} \Big) \, [s^2_f]^{\frac{n_f-p-2}{2}}
      \int_{\beta} \big[ (b_f - b)H^{-1}(b_f - b)' + s^2 + s^2_f + g(\beta, A) \big]^{-\frac{n+n_f+2}{2}} d\beta
    = [2]^{\frac{n+n_f}{2}} \, \pi^{\frac{p}{2}} \, \frac{\Gamma\!\big( \frac{n+n_f-p+2}{2} \big)}{|A|^{\frac{1}{2}}} \, [s^2_f]^{\frac{n_f-p-2}{2}}
      \big[ (b_f - b)H^{-1}(b_f - b)' + s^2 + s^2_f \big]^{-\frac{n+n_f-p+2}{2}},        (6.16)

where

g(\beta, A) = (\beta - FA^{-1})A(\beta - FA^{-1})'.        (6.17)

In the same way, let

I_{\sigma^2 \beta b_f} = \int_{b_f} I_{\sigma^2 \beta} \, db_f
    = [2]^{\frac{n+n_f}{2}} \, \pi^{\frac{p}{2}} \, \frac{\Gamma\!\big( \frac{n+n_f-p+2}{2} \big)}{|A|^{\frac{1}{2}}} \, [s^2_f]^{\frac{n_f-p-2}{2}}
      \int_{b_f} \big[ (b_f - b)H^{-1}(b_f - b)' + s^2 + s^2_f \big]^{-\frac{n+n_f-p+2}{2}} db_f
    = [2]^{\frac{n+n_f}{2}} \, \pi^{p} \, \Gamma\!\Big( \frac{n+n_f-2p+2}{2} \Big) \, \frac{|H|^{\frac{1}{2}}}{|A|^{\frac{1}{2}}} \, [s^2_f]^{\frac{n_f-p-2}{2}} \, [s^2 + s^2_f]^{-\frac{n+n_f-2p+2}{2}}.        (6.18)


Finally, let

I_{\sigma^2 \beta b_f s^2_f} = \int_{s^2_f} I_{\sigma^2 \beta b_f} \, ds^2_f
    = [2]^{\frac{n+n_f}{2}} \, \pi^{p} \, \Gamma\!\Big( \frac{n+n_f-2p+2}{2} \Big) \, \frac{|H|^{\frac{1}{2}}}{|A|^{\frac{1}{2}}}
      \int_{s^2_f} [s^2_f]^{\frac{n_f-p-2}{2}} \, [s^2 + s^2_f]^{-\frac{n+n_f-2p+2}{2}} \, ds^2_f
    = [2]^{\frac{n+n_f}{2}} \, \pi^{p} \, \Gamma\!\Big( \frac{n-p+2}{2} \Big) \, \Gamma\!\Big( \frac{n_f-p}{2} \Big) \, \frac{|H|^{\frac{1}{2}}}{|A|^{\frac{1}{2}}} \, [s^2]^{-\frac{n-p+2}{2}}.        (6.19)

Thus, the normalizing constant for the joint distribution of \beta, \sigma^2, b_f, and s^2_f becomes

C_3(\sigma) = \frac{|A|^{\frac{1}{2}} \, |H|^{-\frac{1}{2}} \, [s^2]^{\frac{n-p}{2}}}{[2]^{\frac{n+n_f}{2}} \, \pi^{p} \, \Gamma\!\big( \frac{n-p+2}{2} \big) \, \Gamma\!\big( \frac{n_f-p}{2} \big)}.        (6.20)

The marginal density of \beta, b_f, and s^2_f, conditional on y, is derived by integrating out \sigma^2 from the previous joint density. Thus, we have

p(\beta, b_f, s^2_f \mid y) = C_4 \, [s^2_f]^{\frac{n_f-p-2}{2}}
    \big[ s^2 + (\beta - FA^{-1})A(\beta - FA^{-1})' + (b_f - b)H^{-1}(b_f - b)' + s^2_f \big]^{-\frac{n+n_f}{2}},        (6.21)

where C_4 is the normalizing constant. Similarly, the marginal density of b_f and s^2_f is obtained by integrating out \beta over R^p from (6.21). This gives the joint density of b_f and s^2_f, conditional on y, as

p(b_f, s^2_f \mid y) = C_5 \, [s^2_f]^{\frac{n_f-p-2}{2}}
    \big[ s^2 + s^2_f + (b_f - b)H^{-1}(b_f - b)' \big]^{-\frac{n+n_f-p}{2}},        (6.22)

where

C_5 = \frac{|H|^{-\frac{1}{2}} \, \Gamma\!\big( \frac{n+n_f-p}{2} \big) \, [s^2]^{\frac{n-p}{2}}}{\pi^{\frac{p}{2}} \, \Gamma\!\big( \frac{n-p}{2} \big) \, \Gamma\!\big( \frac{n_f-p}{2} \big)}        (6.23)

is the normalizing constant. The prediction distribution of the FRV, b_f = b_f(y_f), can now be obtained by integrating out s^2_f from (6.22).
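The pair (6.22)-(6.23) can be checked numerically. The sketch below (our illustration, not from the paper; the values of n, n_f, s^2, H, and b are arbitrary placeholders for the case p = 1) verifies by two-dimensional quadrature that C_5 normalizes the joint density of b_f and s^2_f.

```python
# Quadrature sanity check (ours) that C_5 in (6.23) normalizes the joint
# density (6.22); illustrative placeholder values, p = 1.
import numpy as np
from math import lgamma, pi

n, nf, p = 12, 9, 1
s2, Hval, b = 2.0, 0.7, 0.4      # placeholders for s^2, H, b(y)

# log C_5 from (6.23)
log_c5 = (-0.5 * np.log(Hval) + lgamma((n + nf - p) / 2)
          + 0.5 * (n - p) * np.log(s2)
          - 0.5 * p * np.log(pi) - lgamma((n - p) / 2) - lgamma((nf - p) / 2))

bf = np.linspace(-40.0, 40.0, 801)       # grid for b_f
s2f = np.linspace(1e-8, 200.0, 2001)     # grid for s_f^2
BF, S2F = np.meshgrid(bf, s2f)

# joint density (6.22)
dens = np.exp(log_c5) * S2F ** ((nf - p - 2) / 2) \
       * (s2 + S2F + (BF - b) ** 2 / Hval) ** (-(n + nf - p) / 2)

# two-dimensional trapezoidal rule: first over b_f, then over s_f^2
inner = ((dens[:, 1:] + dens[:, :-1]) / 2 * np.diff(bf)).sum(axis=1)
total = float(((inner[1:] + inner[:-1]) / 2 * np.diff(s2f)).sum())
print(total)   # close to 1 up to quadrature and truncation error
```

The grid limits are chosen so that the truncated tails of the density are negligible for these degrees of freedom.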


The integration yields

p(b_f \mid y) = C_6 \big[ s^2 + (b_f - b)H^{-1}(b_f - b)' \big]^{-\frac{n}{2}},        (6.24)

where C_6 = C_5 \cdot B\big( \frac{n_f-p}{2}, \frac{n}{2} \big). On simplification, we get

C_6 = \frac{\Gamma\!\big( \frac{n}{2} \big) \, [s^2]^{\frac{n-p}{2}}}{\pi^{\frac{p}{2}} \, \Gamma\!\big( \frac{n-p}{2} \big) \, |H|^{\frac{1}{2}}}.        (6.25)

The prediction distribution of b_f can be written in the usual multivariate Student t distribution form as follows:

p(b_f \mid y) = C_6 \big[ 1 + (b_f - b)(s^2 H)^{-1}(b_f - b)' \big]^{-\frac{n}{2}},        (6.26)

in which n > p. Because the density in (6.26) is a Student t density, the prediction distribution of the FRV, b_f, conditional on the realized responses, follows a multivariate Student t distribution of dimension p with (n - p) degrees of freedom. Thus, [b_f \mid y] \sim t_p(n - p; b, s^2 H), where b is the location vector and s^2 H is the scale matrix. Note that the degrees of freedom parameter of the prediction distribution of b_f depends on the size of the realized sample and on the dimension of the regression parameter vector of the model. The previous prediction distribution can be used to construct the \beta-expectation tolerance region for the future regression parameter.
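For computation, the density (6.24)-(6.25) can be coded directly. The sketch below is our illustration, not part of the paper; the function name and all numeric arguments in the demonstration call are placeholders.

```python
# Sketch (ours): evaluating the prediction density of the FRV in
# (6.24)-(6.25), i.e. the t_p(n - p; b, s^2 H) law.
import numpy as np
from math import lgamma, pi

def frv_prediction_logpdf(bf, b, s2, H, n):
    """Log of p(b_f | y): multivariate Student t of dimension p = len(b)
    with n - p degrees of freedom, location b, and scale matrix s2 * H."""
    b = np.asarray(b, dtype=float)
    H = np.asarray(H, dtype=float)
    p = b.size
    d = np.asarray(bf, dtype=float) - b
    quad = d @ np.linalg.solve(H, d)       # (b_f - b) H^{-1} (b_f - b)'
    # log C_6 from (6.25)
    log_c6 = (lgamma(n / 2) + 0.5 * (n - p) * np.log(s2)
              - 0.5 * p * np.log(pi) - lgamma((n - p) / 2)
              - 0.5 * np.linalg.slogdet(H)[1])
    return log_c6 - 0.5 * n * np.log(s2 + quad)

# the density is maximal at the location vector b (placeholder values):
print(frv_prediction_logpdf([0.5, 1.0], [0.5, 1.0], 2.0, np.eye(2), 20))
```

For p = 1 this reduces to a univariate Student t with n - 1 degrees of freedom, which makes the normalization easy to confirm by quadrature.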

6.2. Distribution of FRSS

The prediction distribution of the FRSS from the future regression, s^2_f(y_f), based on the future responses y_f, conditional on the realized responses y, is obtained by integrating out b_f from (6.22) as follows:

p(s^2_f(y_f) \mid y) = C_7 \big[ s^2_f(y_f) \big]^{\frac{n_f-p-2}{2}} \big[ s^2 + s^2_f(y_f) \big]^{-\frac{n+n_f-2p}{2}}.        (6.27)

The density function in (6.27) can be written in the usual beta distribution form as follows:

p(s^2_f \mid y) = C_7 \, [s^2_f]^{\frac{n_f-p-2}{2}} \big[ 1 + s^{-2} s^2_f \big]^{-\frac{n+n_f-2p}{2}},        (6.28)

where

C_7 = \frac{\Gamma\!\big( \frac{n+n_f-2p}{2} \big) \, [s^2]^{\frac{n-p}{2}}}{\Gamma\!\big( \frac{n-p}{2} \big) \, \Gamma\!\big( \frac{n_f-p}{2} \big)}.


This is the prediction distribution of the FRSS based on the future responses y_f, conditional on the realized responses, from the multiple regression model with normal error variable. The density in (6.28) is a modified form of the beta density of the second kind with (n_f - p) and (n - p) degrees of freedom.
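In code, the FRSS density (6.27) is equally direct. The following sketch is ours, not from the paper; the function name and the values of n, n_f, p, and s^2 in the demonstration call are illustrative placeholders.

```python
# Sketch (ours): the FRSS prediction density (6.27), a scaled beta
# density of the second kind with (nf - p) and (n - p) degrees of freedom.
import numpy as np
from math import lgamma

def frss_logpdf(s2f, s2, n, nf, p):
    """Log of p(s2_f | y) in (6.27)."""
    # log C_7 as given after (6.28)
    log_c7 = (lgamma((n + nf - 2 * p) / 2) + 0.5 * (n - p) * np.log(s2)
              - lgamma((n - p) / 2) - lgamma((nf - p) / 2))
    return (log_c7 + 0.5 * (nf - p - 2) * np.log(s2f)
            - 0.5 * (n + nf - 2 * p) * np.log(s2 + s2f))

# the implied prediction mean of the FRSS is s^2 (nf - p)/(n - p - 2),
# the mean of a scaled beta density of the second kind
print(frss_logpdf(2.0, 3.0, 20, 15, 2))   # placeholder arguments
```

Plotting exp(frss_logpdf(...)) over a grid of s^2_f values for different n and n_f reproduces curves of the kind shown in the bottom panels of Fig. 1.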

7. AN ILLUSTRATION

To illustrate how the method works, we consider a real life data set from Barlev and Levy (1979). The simple regression model fitted to these data is a special case of the model considered in our article. The data set was used in a study of the relationship between accounting rates on stocks and market returns. It provides information on the two variables for 54 large-size companies in the US. Taking the market return as the response and the accounting rate as the explanatory variable, the fitted model becomes \hat{y} = 0.84801 + 0.61033x, with \bar{x} = 12.9322, \bar{y} = 8.7409, \sum_{j=1}^{54} x_j^2 = 10293.10893, \sum_{j=1}^{54} x_j y_j = 6874.4131, and s^2 = 25.864, the mean squared error. The prediction distribution of the regression parameter involves H, which in this special case becomes \big[ \sum_{j=1}^{54} x_j^2 \big]^{-1} + [x_f^2]^{-1}.
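These figures can be checked from the published summary statistics alone (our computation, not part of the paper; because the quoted means are rounded, the reproduced coefficients agree only to three or four decimal places):

```python
# Recomputing the fitted line and H from the summary statistics above.
n = 54
xbar, ybar = 12.9322, 8.7409
sum_x2 = 10293.10893             # sum of x_j^2, j = 1..54
sum_xy = 6874.4131               # sum of x_j * y_j

sxx = sum_x2 - n * xbar ** 2     # centered sum of squares of x
sxy = sum_xy - n * xbar * ybar   # centered cross product
slope = sxy / sxx                # about 0.6104 (paper: 0.61033)
intercept = ybar - slope * xbar  # about 0.848

# H of the prediction distribution for a single future accounting
# rate x_f, here x_f = 5 as used in Fig. 1:
xf = 5.0
H = 1.0 / sum_x2 + 1.0 / xf ** 2
print(slope, intercept, H)
```

The identity \bar{y} = \hat{b}_0 + \hat{b}_1 \bar{x}, which holds exactly for least squares with an intercept, is what ties the quoted means to the fitted coefficients.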

The top two graphs in Fig. 1 display the prediction distributions of future responses for future accounting rates 5 and 25, respectively. Both distributions are Student t distributions, but the one with the higher future accounting rate has a wider spread than the one with the lower value. The prediction distribution of the slope parameter of the regression of future market rate on future accounting rate is given in the middle two graphs of Fig. 1. Both graphs represent Student t distributions with different parameters. Although the shapes of the two distributions are roughly the same, the first graph has slightly more spread, but a lower peak, than the second. The bottom two graphs of Fig. 1 display the prediction distribution of the future sum of squared errors for different sample sizes; these last two graphs represent beta distributions with varying arguments.

8. CONCLUDING REMARKS

The foregoing analyses reveal that, for the multiple regression model with independent normal errors, the SRV and the RSS are independently distributed. This is true for both the error regression and


response regression of the realized model. However, for the future regression model, the predictive distributions of the FRV and the RSS, conditional on the realized responses, are not independent. The SRV of the realized model follows a multivariate normal distribution, but the FRV of the future model follows a multivariate Student t distribution. Thus, every element of the SRV is independently distributed, but the components of the FRV are not independent. Moreover, the RSS of the realized multiple regression model follows a scaled gamma distribution, whereas that of the future regression model, conditional on the realized responses, follows a scaled beta distribution. The RSS based on the error regression and that based on the response regression differ by a constant for the realized regression model.

ACKNOWLEDGMENTS

The author thankfully acknowledges some valuable suggestions from Professor Lehana Thabane, McMaster University, Canada, that improved the quality and content of the article significantly. This work was initiated while the author was visiting the Institute of Statistical Research and Training, University of Dhaka, Bangladesh.

REFERENCES

Aitchison, J. (1964). Bayesian tolerance regions. J. R. Statist. Soc. B 26:161-175.

Aitchison, J., Sculthorpe, D. (1965). Some problems of statistical prediction. Biometrika 52:469-483.

Aitchison, J., Dunsmore, I. R. (1975). Statistical Prediction Analysis. Cambridge: Cambridge University Press.

Barlev, B., Levy, H. (1979). On the variability of accounting income numbers. J. Accounting Res. 16:305-315.

Bishop, J. (1976). Parametric Tolerance Regions. Unpublished Ph.D. thesis, Department of Statistics, University of Toronto, Canada.

Box, G. E. P., Tiao, G. C. (1992). Bayesian Inference in Statistical Analysis. New York: Wiley.

Eaton, M. L. (1983). Multivariate Statistics: A Vector Space Approach. New York: Wiley.

Fraser, D. A. S. (1968). The Structure of Inference. New York: Wiley.

Fraser, D. A. S. (1978). Inference and Linear Models. New York: McGraw-Hill.


Fraser, D. A. S., Haq, M. S. (1969). Structural probability and prediction for the multilinear model. J. R. Statist. Soc. B 31:317-331.

Geisser, S. (1993). Predictive Inference: An Introduction. London: Chapman & Hall.

Guttman, I. (1970). Statistical Tolerance Regions: Classical and Bayesian. London: Griffin.

Haq, M. S. (1982). Structural relation and prediction for multivariate models. Statistische Hefte 23:218-228.

Haq, M. S., Rinco, S. (1976). Beta-expectation tolerance regions for a generalized multilinear model with normal error variables. J. Multivariate Anal. 6:414-421.

Haq, M. S., Khan, S. (1990). Prediction distribution for a linear regression model with multivariate Student-t error distribution. Comm. Statist. - Theory and Methods 19(12):4705-4712.

Khan, S. (1992). Predictive Inference for Multilinear Model with Student-t Errors. Unpublished Ph.D. thesis, Department of Statistics & Actuarial Sciences, University of Western Ontario, Canada.

Khan, S. (1996). Prediction inference for heteroscedastic multiple regression model with class of spherical error. Aligarh J. Statist. 15 & 16:1-17.

Khan, S. (2000). An extension of the generalized beta integral with matrix argument. Pak. J. Statist. 16(2):163-167. Special volume in honour of Saleh and Aly.

Khan, S., Haq, M. S. (1994). Prediction inference for multilinear model with errors having multivariate Student-t distribution and first-order autocorrelation structure. Sankhya, Part B: Indian J. Statist. 56:95-106.

Lieberman, G. J., Miller, R. G. Jr. (1963). Simultaneous tolerance intervals in regression. Biometrika 50:155-168.

Meng, X. L. (1994). Posterior predictive p-values. Ann. Statist. 22(3):1142-1160.

Ng, V. M. (2000). A note on predictive inference for multivariate elliptically contoured distributions. Comm. Statist. - Theory and Methods 29(3):477-483.

Press, S. J. (1989). Bayesian Statistics: Principles, Methods and Applications. New York: Wiley.
