
Econometrica, Vol. 76, No. 4 (July, 2008), 727–761

FISHER'S INFORMATION FOR DISCRETELY SAMPLED LÉVY PROCESSES

YACINE AÏT-SAHALIA
Bendheim Center for Finance, Princeton University, Princeton, NJ 08540-5296, U.S.A. and NBER

JEAN JACOD
Institut de Mathématiques de Jussieu, 75013 Paris, France and Université P. et M. Curie, Paris-6, France

The copyright to this Article is held by the Econometric Society. It may be downloaded, printed and reproduced only for educational or research purposes, including use in course packs. No downloading or copying may be done for any commercial purpose without the explicit permission of the Econometric Society. For such commercial purposes contact the Office of the Econometric Society (contact information may be found at the website http://www.econometricsociety.org or in the back cover of Econometrica). This statement must be included on all copies of this Article that are made available electronically or in any other format.


BY YACINE AÏT-SAHALIA AND JEAN JACOD1

This paper studies the asymptotic behavior of Fisher's information for a Lévy process discretely sampled at an increasing frequency. As a result, we derive the optimal rates of convergence of efficient estimators of the different parameters of the process and show that the rates are often nonstandard and differ across parameters. We also show that it is possible to distinguish the continuous part of the process from its jumps part, and even different types of jumps from one another.

KEYWORDS: Lévy process, jumps, rate of convergence, optimal estimation.

    1. INTRODUCTION

CONTINUOUS TIME MODELS in finance often take the form of a semimartingale, because this is essentially the class of processes that precludes arbitrage. Among semimartingales, models allowing for jumps are becoming increasingly common. There is a relative consensus that, empirically, jumps are prevalent in financial data, especially if one incorporates not just large and infrequent Poisson-type jumps, but also infinite activity jumps that can generate smaller, more frequent, jumps. This evidence comes from direct observation of the time series of various asset prices, studying the distribution of primary asset and that of derivative returns, especially as high frequency data becomes more commonly available for a growing range of assets.

On the theory side, the presence of jumps changes many important properties of financial models. This is the case for option and derivative pricing, where most types of jumps will render markets incomplete, so the standard arbitrage-based delta hedging construction inherited from Brownian-driven models fails, and pricing must often resort to utility-based arguments or perhaps bounds. The implications of jumps are also salient for risk management, where jumps have the potential to alter significantly the tails of the distribution of asset returns. In the context of optimal portfolio allocation, jumps can generate the types of correlation patterns across markets, or contagion, that are often documented in times of financial crises. Even when jumps are not easily observable or even present in a given time series, the mere possibility that they could occur will translate into amended risk premia and asset prices: this is one aspect of the peso problem.

These theoretical differences between models driven by Brownian motions and those driven by processes that can jump are fairly well understood. However, comparatively little is known about the estimation of models driven by jumps, even in the special case where the jump process has independent and identically distributed increments, that is, is a Lévy process (see, e.g., Woerner (2004) and the references cited therein); and among different inference procedures for parametrized Lévy processes, even less is known about their relative efficiency, especially for high frequency data.

1Financial support from the NSF under Grants SBR-0350772 and DMS-0532370 is gratefully acknowledged. We are also grateful for comments from the editor, three anonymous referees, George Tauchen, and seminar and conference participants at Princeton University, The University of Chicago, MIT, Harvard, and the 2006 Econometric Society Winter Meetings.

In this paper, we take a few steps toward a better understanding of the econometric situation, in a parametric model that is far from being fully realistic, but has the advantage of being tractable and yet is rich enough to deliver some surprising results. In particular, we will see that jumps give rise to a nonstandard econometric situation where the rates of convergence of estimators of the various parameters of the model can depart from the usual √n in a number of unexpected ways: another comparable, although unrelated, situation happens as one either stays on or crosses the unit root barrier in classical time series analysis.

We study a class of Lévy processes for a log-asset price X, indexed on three parameters. We split X into the sum of two independent Lévy processes as

X_t = σW_t + θY_t.   (1.1)

Here, we have σ > 0 and θ ∈ ℝ, and W is a standard symmetric stable process with index β ∈ (0, 2]. We are often interested in the classical situation where β = 2 and so W is a Wiener process, hence continuous; σ and θ are scale parameters. In the case where W is a Wiener process, σ is the (Brownian or continuous) volatility of the process. As for Y, it is another Lévy process, viewed as a perturbation of σW and dominated by σW in a sense we will make precise below. In some applications, θY may represent frictions that are due to the mechanics of the trading process or, in the case of compound Poisson jumps, it may represent the infrequent arrival of relevant information related to the asset. In the latter case, σW is then the driving process for the ordinary fluctuations of the asset value. Y is independent of W, and its law is either known or a nuisance parameter.

When β = 2, W has, as already mentioned, continuous paths, and we will then be working with a model that has both continuous and discontinuous components. But when β < 2, W is itself a jump process, and we will be able to study the effect on each other's parameters of including two separate jump components in the model, with different relative levels of jump activity. Our results cover both cases and we will describe the relevant differences.
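In code, discrete observations of model (1.1) are easy to generate. The following is a minimal sketch of ours (not from the paper), in the special case β = 2 (W a Wiener process) and Y a Cauchy process, the pair used in the numerical illustrations below; we assume the normalization of (2.1) below, under which a stable increment over an interval ∆ with index 1 has Cauchy scale ∆/2. All numeric values are our own choices.

```python
# Simulation sketch (ours) of the increments of X = sigma*W + theta*Y, with
# W Brownian (beta = 2) and Y a Cauchy process (alpha = 1). Under the /2
# normalization of (2.1), the Cauchy increment over Delta has scale Delta/2.
import numpy as np

rng = np.random.default_rng(0)

def simulate_increments(n, Delta, sigma, theta):
    dW = rng.normal(0.0, np.sqrt(Delta), size=n)      # Brownian increments over Delta
    dY = (Delta / 2.0) * rng.standard_cauchy(size=n)  # Cauchy increments, scale Delta/2
    return sigma * dW + theta * dY                    # increments of X

# daily sampling with sigma = 30%/year and theta = 0.3 (our illustrative values)
dX = simulate_increments(10000, 1.0 / 252, 0.3, 0.3)
```

The heavy Cauchy tail of dY is what will later make the two components statistically separable at high frequency.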

While there is a large literature devoted to stable processes in finance (starting with the work of Mandelbrot in the 1960s; see, for example, Rachev and Mittnik (2000) for new developments), a model based on stable processes is arguably quite special. Rather than an attempt at a fully realistic model intended to be fitted to the data as is, we have chosen to capture only some of the salient features of financial data, such as the presence of jumps, in a reasonably tractable framework. Inevitably, we leave out some other important features such as stochastic volatility2 or market microstructure noise, but in exchange we are able to derive explicitly some new and surprising econometric results in a fully parametric framework.

The log-asset price X is observed at n discrete instants separated by ∆n units of time. In financial applications, we are often interested in the high frequency limit where ∆n → 0. The length Tn = n∆n of the overall observation period may be a constant T or may go to infinity. With θY viewed as a perturbation of σW, we studied in an earlier paper (Aït-Sahalia and Jacod (2007)) the estimation of the single parameter σ, showing in particular that it can be estimated with the same degree of accuracy as when the process Y is absent, at least asymptotically. We proposed estimators for σ designed to achieve the efficient rate despite the presence of Y.3

We now study the more general problem where the parameter vector of interest is either (σ, β) or the full (σ, β, θ), viewing the law of Y nonparametrically. Our objective is to determine the optimal rate at which these parameters can be estimated. So we are led to consider the behavior of Fisher's information. Through the Cramér–Rao lower bound, this yields the rate at which an optimal sequence of estimators should converge and the lowest possible asymptotic variance.

Unlike the situation where ∆n is fixed, or the estimation of σ we previously studied, we are now in a nonstandard situation where the rates of convergence of optimal estimators of (β, θ) can depart from the usual √n in a number of different and often unexpected ways. Even for such a simple class of models as (1.1), there is a wide range of different asymptotic behaviors for Fisher's information if we wish to estimate the three parameters σ, β, and θ, or even σ and θ only when β is known, for example, when β = 2. We will show that different rates of convergence are achieved for the different parameters and for different types of Lévy processes.

The optimal rate for σ is √n, as seen in Aït-Sahalia and Jacod (2007). Now, for β, the optimal rate will be the faster √n log(1/∆n), and is unaffected by the presence of Y (as was the case for σ). To estimate θ, the optimal rate is not √n and, unlike the one for σ, depends heavily on the specific nature of the Y process. Furthermore, unlike what happens for σ or β, which are immune to the presence of Y, the presence of the other process (σW in this case) strongly affects the rate for θ. We study different examples of that situation; one in particular consists of the case where Y is also a stable process with index α < β; another is the jump-diffusion situation where Y is a compound Poisson process.

2There has been much activity devoted to estimating the integrated volatility of the process in that context, with or without jumps: see, for example, Andersen, Bollerslev, and Diebold (2003), Barndorff-Nielsen and Shephard (2004), and Mykland and Zhang (2006).

3Another string of the recent literature focuses on disentangling the volatility from the jumps: see, for example, Aït-Sahalia (2004) and Mancini (2001). In our case, this is related to the estimation of σ when β = 2.


FIGURE 1.—Convergence as ∆ → 0 of Fisher's information for the parameter σ from the model X = σW + θY, where W is a Brownian motion (β = 2) and Y is a Cauchy process (α = 1), to Fisher's information from the model X = σW, which is 2/σ² as we will show in Theorem 2. Fisher's information for the model X = σW + θY (solid line) is computed numerically using the exact expressions for the densities of the two processes and numerical integration of their convolution. The figure is in log–log scale. The parameter values are σ = 30%/year and θ = 0.3.

To illustrate some of these effects numerically, consider the example where W is a Brownian motion (β = 2) and Y is a Cauchy process (α = 1). We plot in Figures 1 and 2 the convergence, as the sampling interval ∆ decreases, of the (numerically computed) Fisher's information for σ and θ, respectively, to their limits derived in Theorems 2 and 7. While, as illustrated in Figure 1, we can estimate σ as well as if Y were absent when ∆n → 0, the symmetric statement is not true for θ. The limiting behavior in Figure 2 is quite distinct from a convergence to Fisher's information for θ in the model without W, that is, X = θY. The latter is simply 1/(2θ²). In fact, Fisher's information about θ in the presence of σW tends to zero when ∆n → 0. Consequently, any optimal estimator of θ will converge at a rate slower than √n.
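The numerical computation just described can be sketched as follows (our own reconstruction, not the authors' code): for β = 2 and α = 1, the density of X_∆ is the convolution of a N(0, σ²∆) density with a Cauchy density of scale θ∆/2, which is exactly a Voigt profile (available as scipy.special.voigt_profile, scipy ≥ 1.4); Fisher's information is then a finite difference in the parameter plus quadrature in x. Grid sizes and tolerances are our own choices.

```python
# Our sketch of the numerical Fisher information of Figures 1 and 2:
# density of X_Delta = Gaussian (std sigma*sqrt(Delta)) * Cauchy (scale theta*Delta/2)
# convolution = Voigt profile; information by finite differences + quadrature.
import numpy as np
from scipy.special import voigt_profile

def density(x, sigma, theta, Delta):
    return voigt_profile(x, sigma * np.sqrt(Delta), theta * Delta / 2.0)

def fisher(par, sigma, theta, Delta, eps=1e-5, width=15.0, n=200001):
    s = sigma * np.sqrt(Delta)
    x = np.linspace(-width * s, width * s, n)
    dx = x[1] - x[0]
    p = density(x, sigma, theta, Delta)
    if par == "sigma":
        dp = (density(x, sigma + eps, theta, Delta)
              - density(x, sigma - eps, theta, Delta)) / (2.0 * eps)
    else:
        dp = (density(x, sigma, theta + eps, Delta)
              - density(x, sigma, theta - eps, Delta)) / (2.0 * eps)
    return np.sum(dp * dp / p) * dx

sigma, theta = 0.3, 0.3
for Delta in [1e-2, 1e-4, 1e-6]:
    print(Delta,
          fisher("sigma", sigma, theta, Delta),   # approaches 2/sigma^2
          fisher("theta", sigma, theta, Delta))   # tends to 0
```

The two printed columns reproduce the qualitative behavior of Figures 1 and 2: the information for σ stabilizes near 2/σ² while the information for θ vanishes as ∆ → 0.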

The paper is organized as follows. In Section 2, we set up the model. In Section 3, we derive the asymptotic behavior of Fisher's information about the parameters (σ, β, θ). Then we show in Section 4 that the optimal rate for θ varies substantially according to the structure of the process Y, and we illustrate the versatility of the situation through several examples that display the variety of convergence rates that arise. Section 5 concludes. Proofs are in the Appendix.


FIGURE 2.—Convergence as ∆ → 0 of Fisher's information for the parameter θ from the model X = σW + θY, where W is a Brownian motion (β = 2) and Y is a Cauchy process (α = 1), to the theoretical limiting curve, of order ∆^{1/2}[log(1/∆)]^{1/2}, derived in Theorem 7. Fisher's information for the model X = σW + θY (solid line) is computed numerically using the exact expressions for the densities of the two processes, and numerical integration of their convolution. The x-axis is in log scale. The parameter values are identical to those of Figure 1.

    2. THE MODEL

Because the components of X in (1.1) are all Lévy processes, they have an explicit characteristic function. Consider first the component σW. The characteristic function of σWt is

E(e^{iuσWt}) = e^{−tσ^β|u|^β/2}.   (2.1)

The factor 2 above is perhaps unusual for stable processes when β < 2, but we put it here to ensure continuity between the stable and the Gaussian cases. The essential difficulty in this class of problems is the fact that the density of W is not known in closed form, except in special cases (β = 1 or 2).4 Without the density of W in closed form, it is of course hopeless to obtain the density of X_∆, the convolution of the densities of σW_∆ and θY_∆, in closed form. But when ∆ goes to 0, the density of X_∆ and its derivatives degenerate in a way which is controlled by the characteristics of the Lévy process, thus allowing us to explicitly describe, in closed form, the limiting behavior of Fisher's information.

4For fixed β, there exist representations in terms of special functions (see Zolotarev (1995) and Hoffmann-Jørgensen (1993)), whereas numerical approximations of the densities may be found in DuMouchel (1971, 1973b, 1975), Nolan (1997, 2001), or Brockwell and Brown (1980).
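Although h_β has no closed form in general, it is easy to recover numerically from (2.1) by Fourier inversion. The following sketch (ours, not from the paper) inverts exp(−|u|^β/2) by direct quadrature and checks the β = 1 case against the closed-form Cauchy density of scale 1/2 implied by that normalization.

```python
# Sketch (ours): numerical Fourier inversion of the characteristic function
# exp(-|u|^beta / 2) of (2.1), checked against the closed-form beta = 1 case
# (a Cauchy density of scale 1/2).
import numpy as np

def h_beta(x, beta, u_max=200.0, n_u=200001):
    # h(x) = (1/pi) * int_0^inf cos(u*x) * exp(-u^beta/2) du  (even density)
    u = np.linspace(0.0, u_max, n_u)
    du = u[1] - u[0]
    phi = np.exp(-0.5 * u**beta)
    return np.array([np.sum(np.cos(u * xi) * phi) * du / np.pi
                     for xi in np.atleast_1d(x)])

x = np.array([0.0, 0.5, 1.0, 2.0])
cauchy = (0.5 / np.pi) / (x**2 + 0.25)          # closed form for beta = 1
print(np.max(np.abs(h_beta(x, 1.0) - cauchy)))  # small discretization error
```

The same routine gives h_β and, after differencing, its derivatives for any β ∈ (0, 2], which is all that the information functionals below require.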

To understand this behavior, we will need to be able to bound various functionals of the density of X_∆. As is well known, when β < 2, we have E(|Wt|^r) < ∞ if and only if 0 < r < β. The density h_β of W1 is C^∞ (by repeated integration of the characteristic function) for all β ≤ 2, even, and its nth derivative h^{(n)}_β behaves, as |w| → ∞, as

h^{(n)}_β(w) ∼ ( c_β / |w|^{n+1+β} ) ∏_{i=0}^{n−1} (β + i + 1)   if β < 2,
h^{(n)}_β(w) ∼ |w|^n e^{−w²/2} / √(2π)   if β = 2   (2.2)

(in the case n = 0, an empty product is defined to be 1), where c_β is the constant given by

c_β = β(1 − β) / ( 4 Γ(2 − β) cos(βπ/2) )   if β ≠ 1, β < 2,
c_1 = 1/(2π)   if β = 1.   (2.3)

This follows from the series expansion of the density due to Bergström (1952) and the duality property of the stable densities of order β and 1/β (see, e.g., Chapter 2 in Zolotarev (1986)), with an adjustment factor in c_β to reflect our definition of the characteristic function in (2.1). In addition, the cumulative distribution function of W1 for β < 2 behaves according to

∫_w^∞ h_β(x) dx ∼ c_β/(βw^β)   when w → +∞

(and symmetrically as w → −∞).

The second component of the model is Y. Its law as a process is entirely specified by the law G_∆ of the variable Y_∆ for any given ∆ > 0. We write G = G1, and we recall that the characteristic function of G_∆ is given by the Lévy–Khintchine formula

E(e^{ivY_∆}) = exp ∆( ivb − cv²/2 + ∫ F(dx)( e^{ivx} − 1 − ivx 1_{|x|≤1} ) ),   (2.4)

where (b, c, F) is the characteristic triple of G (or, of Y): b ∈ ℝ is the drift of Y, c ≥ 0 is the local variance of the continuous part of Y, and F is the Lévy jump measure of Y, which satisfies ∫ (1 ∧ x²) F(dx) < ∞ (see, e.g., Chapter II.2 in Jacod and Shiryaev (2003)).

We need to put some additional structure on the model. We define the domination of Y by σW by the property that G belongs to one of the classes G(φ, β) defined below, for some φ ∈ Φ. Let first Φ be the class of all nonnegative continuous functions φ on [0, 1] with φ(0) = 0. If φ ∈ Φ, we set

G(φ, β) = the set of all infinitely divisible distributions with c = 0   (2.5)

and, for all x ∈ (0, 1],

x^β F([−x, x]^c) ≤ φ(x)   if β < 2,
x² F([−x, x]^c) ≤ φ(x) and ∫_{|y|≤x} |y|² F(dy) ≤ φ(x)   if β = 2,   (2.6)

G_β = ∪_{φ∈Φ} G(φ, β).

We then have

β ∈ (0, 2) ⟹ G_β = { G : G is infinitely divisible, c = 0, lim_{x→0} x^β F([−x, x]^c) = 0 },
β = 2 ⟹ G_2 = { G : G is infinitely divisible, c = 0 }.   (2.7)

For example, if Y is a compound Poisson process or a gamma process, then G is in ∩_{β∈(0,2]} G_β. If Y is an α-stable process, then G is in ∩_{β>α} G_β, but not in G_α. Recall also that the Blumenthal–Getoor index β(G) of Y (or G, or F) is the infimum of all r such that ∫_{|x|≤1} |x|^r F(dx) < ∞; equivalently, ∑_{s≤t} |Y_s − Y_{s−}|^r is almost surely finite for all t. If H_β denotes the set of all G such that c = 0 and β(G) ≤ β, one may show that G_β ⊂ H_β ⊂ ∩_{β′>β} G_{β′}. Hence saying that G ∈ G_β is a little bit more restrictive than saying that G ∈ H_β.

Our purpose is to understand the properties of the best possible estimators of the parameters of the model. The variables under consideration in the model (1.1) have densities which depend smoothly on the parameters, so in light of the Cramér–Rao lower bound, Fisher's information is an appropriate tool for studying the optimality of estimators. X is observed at n times ∆n, 2∆n, ..., n∆n. Recalling that X_0 = 0, this amounts to observing the n increments X_{i∆n} − X_{(i−1)∆n}. So when ∆n = ∆ is fixed, we observe n independent and identically distributed variables distributed as X_∆. If, further, these variables have a density depending smoothly on a parameter ζ, a very weak assumption, we are on known grounds: Fisher's information at stage n has the form I_n(ζ) = nI(ζ), where I(ζ) > 0 is the Fisher's information (an invertible matrix if ζ is multidimensional) of the model based on the observation of the single variable X_∆ − X_0; we have the locally asymptotically normal property; the asymptotically efficient estimators ζ̂n are those for which √n(ζ̂n − ζ) converges in law to the normal distribution N(0, I(ζ)^{−1}), and the maximum likelihood estimator (MLE) does the job (see, e.g., DuMouchel (1973a)).
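For intuition, the fixed-∆ statement I_n(ζ) = nI(ζ) and the efficiency of the MLE are easy to verify by simulation in the simplest case β = 2, Y = 0, where the increments are i.i.d. N(0, σ²∆) and the information per increment for σ is 2/σ². This is our own illustration, with our own parameter choices.

```python
# Sketch (ours): fixed-Delta i.i.d. setting with beta = 2 and Y = 0.
# Increments are N(0, sigma^2*Delta); the MLE of sigma should satisfy
# sqrt(n)*(sigma_hat - sigma) ~ N(0, sigma^2/2), since I = 2/sigma^2.
import numpy as np

rng = np.random.default_rng(1)
sigma, Delta, n, reps = 0.3, 1.0 / 252, 400, 2000

est = np.empty(reps)
for k in range(reps):
    dx = rng.normal(0.0, sigma * np.sqrt(Delta), size=n)
    est[k] = np.sqrt(np.sum(dx * dx) / (n * Delta))   # MLE of sigma

print(np.sqrt(n) * np.std(est))   # close to sigma/sqrt(2)
```

The printed value approximates σ/√2 ≈ 0.212, the Cramér–Rao standard deviation 1/√I scaled by √n.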


For high frequency data, though, things become more complicated since the time lag ∆n becomes small. The corresponding asymptotics require ∆n → 0 as the number n of observations goes to infinity. Fisher's information at stage n still has the form I_{n,∆n}(ζ) = nI_{∆n}(ζ), but the asymptotic behavior of the information I_{∆n}(ζ), not to speak about the behavior of the MLE for example, is now far from obvious: the essential difficulty comes from the fact that the law of X_{∆n} becomes degenerate as n → ∞.

In the model (1.1), the law of the observed process X depends on the three parameters (σ, β, θ) to be estimated, plus on the law of Y, which is summarized by G. The law of the variable X_∆ has a density which depends smoothly on σ and θ, so that the 2 × 2 Fisher's information matrix (relative to σ and θ) of our experiment exists; it also depends smoothly on β when β < 2, so in this case the 3 × 3 Fisher's information matrix exists. In all cases we denote the information by I_{n,∆n}(σ, β, θ, G), and it has the form

I_{n,∆n}(σ, β, θ, G) = n I_{∆n}(σ, β, θ, G),

where I_∆(σ, β, θ, G) is the Fisher's information matrix associated with the observation of a single variable X_∆. We denote the elements of the matrix I_∆(σ, β, θ, G) as I^{σσ}_∆(σ, β, θ, G), I^{σβ}_∆(σ, β, θ, G), etcetera. G may appear as a nuisance parameter in the model, in which case we may wish to have estimates for the Fisher's information that are uniform in G, at least on some reasonable class of Gs. Let us also mention that in many cases the parameter β is indeed known: this is particularly true when W is a Brownian motion, β = 2.

    3. THE GENERAL CASE

We are first interested in determining the optimal rates (and constants) at which one can estimate (σ, β), while leaving the distribution G ∈ G_β as unspecified as possible. As we will see, the optimal rate for the estimation of θ is heavily dependent on the precise nature of G.

3.1. Inference on (σ, β)

It turns out that the limiting behavior of Fisher's information for (σ, β) is given by the corresponding limits in the situation where Y = 0, that is, we directly observe X = σW. In our general framework, this corresponds to setting G = δ0, a Dirac mass at 0, and we set the (now unidentified) parameter θ to 0, or for that matter to any arbitrary value.

We start by studying whether the limiting behavior of I^{σσ}_∆(σ, 2, θ, G) when β = 2, and of the (σ, β) block of the matrix I_∆(σ, β, θ, G) when β < 2, is affected by the presence of Y. We have the intuitively obvious majoration of Fisher's information in the presence of Y by the one for which Y = 0. Note that in this result no assumption whatsoever is made on Y (except of course that it is independent of W):

THEOREM 1: For any ∆ > 0, we have

I^{σσ}_∆(σ, 2, θ, G) ≤ I^{σσ}_∆(σ, 2, 0, δ0)   (3.1)


and, when β < 2, the difference

  [ I^{σσ}_∆(σ, β, 0, δ0)   I^{σβ}_∆(σ, β, 0, δ0) ]     [ I^{σσ}_∆(σ, β, θ, G)   I^{σβ}_∆(σ, β, θ, G) ]
  [ I^{βσ}_∆(σ, β, 0, δ0)   I^{ββ}_∆(σ, β, 0, δ0) ]  −  [ I^{βσ}_∆(σ, β, θ, G)   I^{ββ}_∆(σ, β, θ, G) ]

is a positive semidefinite matrix; in particular we have

I^{ββ}_∆(σ, β, θ, G) ≤ I^{ββ}_∆(σ, β, 0, δ0).   (3.2)

Next, how does the limit as ∆ → 0 of I_∆(σ, β, θ, G) compare to that of I_∆(σ, β, 0, δ0)? To answer this question, we need some further notation. For the first two derivatives of the density h_β of W1, we write h′_β and h″_β, and we also introduce the functions

h̃_β(w) = h_β(w) + w h′_β(w),   ĥ_β(w) = h̃_β(w)²/h_β(w).   (3.3)

Then ĥ_β is positive, even, continuous, and ĥ_β(w) = O(1/|w|^{1+β}) as |w| → ∞; hence ĥ_β is Lebesgue-integrable.

Let

I(β) = ∫ ĥ_β(w) dw.   (3.4)

This is a positive number, which takes the value I(β) = 2 when β = 2.

When Y = 0, we have p(x | σ, β, 0, δ0) = (1/(σ∆^{1/β})) h_β(x/(σ∆^{1/β})) and, therefore,

I^{σσ}_∆(σ, β, 0, δ0) = I(β)/σ²,   (3.5)

which does not depend on ∆. I(β) is simply the Fisher's information at the point σ = 1 for the statistical model in which we observe σW1.
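As a quick numerical sanity check of (3.3)–(3.5) (ours, not from the paper), I(β) can be computed by quadrature in the two closed-form cases: for β = 2, h_2 is the standard normal density and I(2) = 2; for β = 1, h_1 is the Cauchy density of scale 1/2, and the classical Cauchy scale information gives I(1) = 1/2 (this value is our addition, not stated in the text).

```python
# Sketch (ours): compute I(beta) of (3.4) from closed-form densities.
import numpy as np

def I_from_density(h, dh, x):
    dx = x[1] - x[0]
    htilde = h(x) + x * dh(x)               # h_tilde_beta of (3.3)
    return np.sum(htilde**2 / h(x)) * dx    # I(beta) of (3.4)

h2 = lambda w: np.exp(-w**2 / 2.0) / np.sqrt(2.0 * np.pi)   # beta = 2
dh2 = lambda w: -w * h2(w)
h1 = lambda w: (0.5 / np.pi) / (w**2 + 0.25)                # beta = 1, scale 1/2
dh1 = lambda w: -(w / np.pi) / (w**2 + 0.25)**2

I2 = I_from_density(h2, dh2, np.linspace(-10.0, 10.0, 200001))
I1 = I_from_density(h1, dh1, np.linspace(-4000.0, 4000.0, 2000001))
print(I2, I1)   # ~2.0 and ~0.5
```

The wide grid for β = 1 is needed because the integrand ĥ_1 decays only like 1/w².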

The limiting behavior of the (σ, β) block of Fisher's information is given by the following theorem:

THEOREM 2: (a) If G ∈ G_β, we have, as ∆ → 0,

I^{σσ}_∆(σ, β, θ, G) → I(β)/σ²   (3.6)

and also, when β < 2,

I^{ββ}_∆(σ, β, θ, G) / (log(1/∆))² → I(β)/β⁴,   I^{σβ}_∆(σ, β, θ, G) / log(1/∆) → I(β)/(σβ²).   (3.7)


(b) For any φ ∈ Φ and β′ ∈ (0, β], and K > 0, we have as ∆ → 0,

sup_{G∈G(φ,β′), |θ|≤K} | I^{σσ}_∆(σ, β, θ, G) − I(β)/σ² | → 0   (3.8)

and, when β < 2,

sup_{G∈G(φ,β′), |θ|≤K} | I^{ββ}_∆(σ, β, θ, G)/(log(1/∆))² − I(β)/β⁴ | → 0,
sup_{G∈G(φ,β′), |θ|≤K} | I^{σβ}_∆(σ, β, θ, G)/log(1/∆) − I(β)/(σβ²) | → 0.

(c) For each n, let Gn be the standard symmetric stable law of index αn, with αn a sequence strictly increasing to β. Then for any sequence ∆n → 0 such that (β − αn) log ∆n → 0 (i.e., the rate at which ∆n → 0 is slow enough), the sequence of numbers I^{σσ}_{∆n}(σ, β, θ, Gn) (resp. I^{ββ}_{∆n}(σ, β, θ, Gn)/(log(1/∆n))² when further β < 2) converges to a limit which is strictly less than I(β)/σ² (resp. I(β)/β⁴).

Since the result (a) is valid for all G ∈ G_β, it is valid in particular when G = δ0, that is, when Y = 0. In other words, at their respective leading orders in ∆, the presence of Y has no impact on the information terms I^{σσ}_∆, I^{σβ}_∆, and I^{ββ}_∆, as soon as Y is dominated by σW: so, in the limit where ∆n → 0, the parameters σ and β can be estimated with the exact same degree of precision whether Y is present or not. Moreover, part (b) states that the convergence of Fisher's information is uniform on the set G(φ, β′) and |θ| ≤ K; this settles the case where G and θ are considered as nuisance parameters when we estimate σ and β. But as the index of Y tends to β, the convergence disappears, as stated in part (c).

The part of the theorem related to the term I^{σσ}_∆ alone is discussed in Aït-Sahalia and Jacod (2007), where we use it to construct explicit √n-converging estimators of the parameter σ that are strongly efficient in the sense that not only do they converge at rate √n, but they also have the same asymptotic variance when Y is present as when it is not: √n(σ̂n − σ) converges in law to N(0, σ²/I(β)) as soon as ∆n → 0. This is the case even in the semiparametric case where nothing is known about Y, other than the fact that Y is sufficiently away from σW, meaning that G ∈ G_{β′} for β′ small enough: β′ < β is not enough and the precise condition on β′ depends on the behavior of the pair (n, ∆n); see Theorem 5 of that paper.

When it comes to estimating β, with σ known, things are different. When ∆n → 0 and the true value is β < 2, we can hope for estimators converging to β at the faster rate √n log(1/∆n), in view of the limit for I^{ββ}_∆ given in (3.7). In fact, for a single observation, say X_∆, there exists of course no consistent estimator for σ, but there is one for β when ∆ → 0: namely β̂_∆ = log(1/∆)/log(1/|X_∆|), which converges in probability to β as ∆ → 0, and the rate of convergence is log(1/∆); all these follow from the scaling property of W.
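A sketch (ours) of this single-observation estimator, in the case β = 2 with Y = 0, so that X_∆ = σ√∆ Z for Z standard normal; the log(1/∆) rate makes the convergence visibly slow. The sample sizes and seed are our own choices.

```python
# Sketch (ours): beta_hat = log(1/Delta) / log(1/|X_Delta|) from a single
# observation, demonstrated for beta = 2, Y = 0. The median over many draws
# creeps toward beta = 2 only as Delta becomes extremely small.
import numpy as np

rng = np.random.default_rng(3)
sigma = 0.3

meds = []
for Delta in [1e-4, 1e-8, 1e-12]:
    X = sigma * np.sqrt(Delta) * rng.normal(size=5000)      # draws of X_Delta
    est = np.log(1.0 / Delta) / np.log(1.0 / np.abs(X))     # beta_hat per draw
    meds.append(float(np.median(est)))
print(meds)   # increasing toward beta = 2, still visibly below it
```

Even at ∆ = 10⁻¹², the median estimate is still well below 2, consistent with the logarithmic rate.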


Estimators for β could be constructed using the jumps of the process of a size greater than some threshold (see Höpfner and Jacod (1994)). Estimating β was studied by DuMouchel (1973a), who computed numerically the term I^{ββ}_∆(σ, β, 0, δ0), including also an asymmetry parameter. Note that if one suspects that β = 2, then one should perform a test, as advised by DuMouchel (1973a), and in that case the behavior of the Fisher's information about β does not provide much insight.

The introduction of β as a parameter to be estimated makes the joint estimation of the pair (σ, β) a singular problem. The asymptotic (normalized) 2 × 2 Fisher's information matrix given by (3.6) and (3.7) is not invertible. In particular, there is no guarantee that the joint MLE for the pair (σ, β), for example, behaves well. However, the estimation of either parameter knowing the other remains a regular problem, albeit with a rate faster than √n for β.

3.2. Inference on θ

For the entries of Fisher's information matrix involving the parameter θ, things are more complicated. First, observe that I^{θθ}_∆(0, β, θ, G) (that is, the Fisher's information for the model X = θY) does not necessarily exist, but of course if it does we have an inequality similar to (3.1), for all ∆:

I^{θθ}_∆(σ, β, θ, G) ≤ I^{θθ}_∆(0, β, θ, G).   (3.9)

Contrary to (3.1), however, this is a very rough estimate, which does not take into account the properties of W. The (θ, θ) Fisher's information is usually much smaller than what the right side above suggests, and we give below a more accurate estimate when Y has second moments, but without the domination assumption that G ∈ G_β.

For this, we see another information appear, namely the Fisher's information associated with the estimation of the real number a for the model where one observes the single variable W1 + a. This Fisher's information is

J(β) = ∫ h′_β(w)²/h_β(w) dw.   (3.10)

Observe in particular that J(β) = 1 when β = 2.

THEOREM 3: If Y1 has finite variance v and mean m, we have

I^{θθ}_∆(σ, β, θ, G) ≤ (J(β)/σ²) ( m²∆^{2−2/β} + v∆^{1−2/β} ).   (3.11)

This estimate holds for all ∆ > 0. The asymptotic variant, which says that

limsup_{∆→0} ∆^{2/β−1} I^{θθ}_∆(σ, β, θ, G) ≤ vJ(β)/σ²,   (3.12)

is sharp in some cases and not in others, as we will see in the examples below. These examples will also illustrate how the translation Fisher's information J(β) comes into the picture here.

Theorem 3 does not solve the problem in full generality: for a generic Y, we only obtain a bound. For some specific Y's, we will be able to give a full result in Section 4. It is indeed a feature of this problem that the estimation of θ depends heavily on the specific nature of the Y process, including the fact that the optimal rate at which θ can be estimated depends on the process Y. Hence it is not surprising that any general result regarding θ, such as Theorem 3, can at best state a bound: there is no single common rate that applies to all Y's.
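A quick numerical check (ours, not from the paper) of the translation information J(β) of (3.10) in the two closed-form cases: J(2) = 1, as noted after (3.10), and J(1) = 2, the classical location information of a Cauchy density of scale 1/2 (the value J(1) = 2 is our addition for illustration).

```python
# Sketch (ours): compute J(beta) of (3.10) from closed-form densities.
import numpy as np

def J_from_density(h, dh, x):
    dx = x[1] - x[0]
    return np.sum(dh(x)**2 / h(x)) * dx   # J(beta) of (3.10)

h2 = lambda w: np.exp(-w**2 / 2.0) / np.sqrt(2.0 * np.pi)   # beta = 2
dh2 = lambda w: -w * h2(w)
h1 = lambda w: (0.5 / np.pi) / (w**2 + 0.25)                # beta = 1, scale 1/2
dh1 = lambda w: -(w / np.pi) / (w**2 + 0.25)**2

J2 = J_from_density(h2, dh2, np.linspace(-10.0, 10.0, 200001))
J1 = J_from_density(h1, dh1, np.linspace(-2000.0, 2000.0, 2000001))
print(J2, J1)   # ~1.0 and ~2.0
```

These are the constants that appear in the limiting variances of the drift and Poisson examples of Section 4.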

4. SPECIFIC EXAMPLES OF LÉVY PROCESSES

The calculations of the previous section involving the parameter θ can be made fully explicit if we specify the distribution of the process Y. We can then illustrate the surprisingly wide range of situations that can occur for the rates of convergence. For simplicity, let us consider β as known in these examples and focus on the joint estimation of (σ, θ).

    4.1. Stable Process Plus Drift

Here we assume that Yt = t, so G = δ1 and θ is a drift parameter:

THEOREM 4: The 2 × 2 Fisher's information matrix for estimating (σ, θ) is

  [ I^{σσ}_∆(σ, β, θ, δ1)   I^{σθ}_∆(σ, β, θ, δ1) ]            [ I(β)   0              ]
  [ I^{θσ}_∆(σ, β, θ, δ1)   I^{θθ}_∆(σ, β, θ, δ1) ]  =  (1/σ²) [ 0      ∆^{2−2/β} J(β) ].   (4.1)

This has several interesting consequences (we denote by Tn = n∆n the length of the observation window):

1. If θ is known, (4.1) suggests the existence of estimators σ̂n satisfying √n(σ̂n − σ) →d N(0, σ²/I(β)). This is the case and, in fact, since θ is known, observing X_{i∆n} and observing σW_{i∆n} are equivalent.

2. If σ is known, one may hope for estimators θ̂n satisfying √n ∆n^{1−1/β} (θ̂n − θ) →d N(0, σ²/J(β)). If β = 2, the rate is thus √Tn: this is in accordance with the classical fact that for a diffusion the rate for estimating the drift coefficient is the square root of the total observation window, that is, √Tn here. Moreover, in this case, the variable X_{Tn}/Tn is N(θ, σ²/Tn), so θ̂n = X_{Tn}/Tn is an asymptotically efficient estimator for θ (recall that J(β) = 1 when β = 2). When β < 2, we have 1 − 1/β < 1/2, so the rate is bigger than √Tn and it increases when β decreases; when β < 1, this rate is even bigger than √n.


3. Observe that here Y1 has mean m = 1 and variance v = 0, so the bound (3.11) in Theorem 3 is indeed an equality. The fact that the translation Fisher's information J(β) appears here is transparent.

4. If both σ and θ are unknown, one may hope for estimators σ̂n and θ̂n such that the pairs (√n(σ̂n − σ), √n ∆n^{1−1/β}(θ̂n − θ)) converge in law to the product N(0, σ²/I(β)) ⊗ N(0, σ²/J(β)).

    4.2. Stable Process Plus Poisson Process

Here we assume that Y is a standard Poisson process (for simplicity, jumps of size 1 and intensity 1), whose law we write as G = P. We can describe the limiting behavior of the (σ, θ) block of the matrix I_∆(σ, β, θ, P) as ∆ → 0.

THEOREM 5: If Y is a standard Poisson process, we have, as ∆ → 0,

I^{σσ}_∆(σ, β, θ, P) → I(β)/σ²,   (4.2)

∆^{1/β−1/2} I^{σθ}_∆(σ, β, θ, P) → 0,   (4.3)

∆^{2/β−1} I^{θθ}_∆(σ, β, θ, P) → J(β)/σ².   (4.4)

Since P ∈ G_β, (4.2) is nothing else than the first part of (3.6). One could prove more than (4.3), namely that sup_∆ ∆^{1/β−1} |I^{σθ}_∆(σ, β, θ, P)| < ∞. Since Y1 has mean m = 1 and variance v = 1, the asymptotic estimate (3.12) in Theorem 3 is sharp in view of (4.4). Here again, we deduce some interesting consequences for the estimation:

1. If σ is known, one may hope for estimators θ̂n satisfying √(n∆n^{1−2/β}) (θ̂n − θ) →d N(0, σ²/J(β)). So the rate is faster than √n, except when β = 2. More generally, if both σ and θ are unknown, one may hope for estimators σ̂n and θ̂n such that the pairs (√n(σ̂n − σ), √(n∆n^{1−2/β})(θ̂n − θ)) converge in law to the product N(0, σ²/I(β)) ⊗ N(0, σ²/J(β)).

2. However, the above-described behavior of any estimator θ̂n cannot be true when Tn = n∆n does not go to infinity, because in this case there is a positive probability that Y has no jump on the whole observed interval, and so no information about θ can be drawn from the observations in that case. It is true, though, when Tn → ∞, because Y will eventually have infinitely many jumps on the observed intervals. There is a large literature on this subject, starting with the base case where the Poisson process is directly observed, for which the same problem occurs.

    4.3. Stable Process Plus Compound Poisson Process

Here we assume that Y is a compound Poisson process with arrival rate λ and law of jumps μ: that is, the characteristics of G are b = λ ∫_{|x|≤1} x μ(dx), c = 0, and F = λμ. We then write G = P_{λ,μ}, which belongs to G_β. If β = 2, we are then in the classical context of jump-diffusions in finance.

    We will further assume that has a density f satisfying

    lim|u|

    uf(u) = 0 supu

    |f(u)|(1 + |u|)< (4.5)We also suppose that the multiplicative Fishers information associated with (that is, the Fishers information for estimating in the model when oneobserves a single variable U with U distributed according to ) exists. It thenhas the form

    L=

    (xf(x) + f(x))2f(x)

    dx(4.6)

    We can describe the limiting behavior of the () block of the matrixI(P) as 0.

    THEOREM 6: IfY is a compound Poisson process satisfying (4.5) and such thatL in (4.6) is finite, we have, as 0,

    I (P) 1

    2I()(4.7)

    1

    I

    (P) 0(4.8)2

    2

    (xf(x) + f(x))2f (x) + c

    |x|1+dx liminf1

    I (P)

    12L(4.9)

    and also, when = 2,1

    I ( 2P) 1

    2L(4.10)

    As for the previous theorem, (4.7) is nothing else than the first part of (3.6).We could prove more than (4.8), namely that sup

    1|I (P)| < .

    Here again, we draw some interesting consequences:1. Equations (4.9) and (4.10) suggest that there are estimators n such

    that

    Tn(n ) is tight (the rate is the same as for the case Yt = t) and iseven asymptotically normal when = 2. But this is of course wrong when Tndoes not go to infinity, for the same reason as in the previous theorem.

    2. When the measure has a second order moment, the right side of(3.11) is larger than the result of the previous theorem, so the estimate in The-orem 3 is not sharp.

  • 8/8/2019 ajfisher

    16/36

    SAMPLED LEVY PROCESSES 741

    The rates for estimating in the two previous theorems, and the limitingFishers information as well, can be explained as follows (supposing that isknown and that we have n observations and that Tn ):

    1. For Theorem 5: comes into the picture whenever the Poisson processhas a jump. On the interval [0 Tn] we have an average ofTn jumps, most ofthem being isolated in an interval (in (i + 1)n]. So it essentially amounts toobserving Tn (or rather the integer part [Tn]) independent variables, all distrib-uted as 1/n W1 +. The Fishers information for each of those (for estimating) is J()/22/, and the global Fishers information, namely nIn , is ap-proximately TnJ()/22/n J()/22/1n .

    2. For Theorem 6: Again comes into the picture whenever the com-pound Poisson process has a jump. We have an average of Tn jumps, so itessentially amounts to observing Tn independent variables, all distributed as

    1/

    n W1 + V, where V has the distribution . The Fishers information foreach of those (for estimating ) is approximately L/2 (because the variable1/n W1 is negligible), and the global Fishers information nI

    n

    is approxi-mately TnL/2 nnL/2. This explains the rate in (4.9), and is an indicationthat (4.10) may be true even when < 2, although we have been unable toprove it thus far.

    4.4. Two Stable Processes

    Our last example is about the case where Y is also a symmetric stable processwith index , < . We write G = S. Surprisingly, the results are quite in-volved, in the sense that for estimating we have different situations accordingto the relative values of and . We obviously still have (4.7), so we concen-trateontheterm I and ignore the cross term in the statement of the followingtheorem. The constant c below is the one occurring in (2.2).

    THEOREM 7: IfY is a standard symmetric stable process with index < , wehave the following situations as 0:

    (a) If = 2,

    (log(1/))/2

    ()/I (S)

    2c/2

    2(2( ))/2 (4.11)

    (b) If < 2 and > 2 ,

    (2())/I (S)(4.12)

    2c2

    22

    2

    (R|y|1 dy10 (1 v)h(x yv)dv)2

    h(y)dx

  • 8/8/2019 ajfisher

    17/36

    742 Y. AT-SAHALIA AND J. JACOD

    (c) If < 2 and = 2 ,

    (2())/ log1

    I (S) 2( )c222

    c2(4.13)

    (d) If < 2 and < 2 ,

    1

    I (S) 2c2

    22

    2

    dz

    c|z|1+2 + c|z|1+/(4.14)

    Then if is known, one may hope to find estimators

    n with un(

    n ) d

    N(0 V ), with

    un =

    n(2)/4n /(log(1/n))/4 if = 2,

    n()/n if < 2 > /2,n1/2n

    log(1/n) if < 2 = /2,

    nn if < 2 < /2,

    and of course the asymptotic variance Vshould be the inverse of the right-handsides in (4.11)(4.14).

    5. CONCLUSIONS

    We studied in this paper a tractable parametric model with jumps, also in-cluding a continuous component, and examined the impact of the presence ofthe different components of the model (continuous part vs. jump part and/orjumps with different relative levels of activity) on the various parameters ofthe model. We determined how optimal estimators should behave where theparameter vector of interest is (), viewing the law ofY nonparametri-cally. While optimal estimators for should converge at rate

    n and have the

    same asymptotic variance with or without Y, the optimal rate for is faster

    and given by n log(1/n), and is also unaffected by the presence of Y. Butthe rate of convergence of optimal estimators of is very dependent on thelaw ofY and is affected by the presence ofW.

    Given that we now know what can be achieved by efficient estimators, thenatural next step is to exhibit actual, and tractable, estimators with those op-timal properties. We did not attempt here to construct estimators of theseparameters, and leave that question to future work. As a partial step in thatdirection, we studied in At-Sahalia and Jacod (2007) estimation procedures,for the parameter only, that satisfy the property that those estimators areasymptotically as efficient as when the process Y is absent.

  • 8/8/2019 ajfisher

    18/36

    SAMPLED LEVY PROCESSES 743

    Dept. of Economics, Bendheim Center for Finance, Princeton University, 26 Prospect Avenue, Princeton, NJ 08540-5296, U.S.A. and NBER; [email protected]

    and Institut de Mathmatiques de Jussieu, CNRSUMR 7586, 175 Rue dueChevaleret, 75013 Paris, France and Universit P. et M. Curie, Paris-6, France;

    [email protected].

    Manuscript received May, 2006; final revision received October, 2007.

    APPENDIX: PROOFS

    A. Fishers Information When X= W + Y

    By independence ofW and Y the density ofX in (1.1) is the convolution(recall that G is the law ofY)

    p(x|G) =1

    1/

    G(dy)h

    x y1/

    (A.1)

    We now seek to characterize the entries of the full Fishers information ma-trix. Since h, h, and h are continuous and bounded, we can differentiateunder the integral in (A.1) to get

    p(x|G) = 121/ G(dy)hx y1/ (A.2)p(x|G) = v(x|G)

    log 2

    p(x|G)(A.3)

    p(x|G) = 1

    22/

    G(dy)yh

    x y1/

    (A.4)

    where

    v(x|G) =1

    1/ G(dy)hx y1/ (A.5)The entries of the () block of the Fishers information matrix are (leavingimplicit the dependence on (G))

    I =

    p(x)2

    p(x)dx I =

    p(x)p(x)

    p(x)dx(A.6)

    I = p(x)

    2

    p(x)dx

    mailto:[email protected]:[email protected]:[email protected]:[email protected]:[email protected]:[email protected]
  • 8/8/2019 ajfisher

    19/36

    744 Y. AT-SAHALIA AND J. JACOD

    When < 2, the other entries are

    I

    =J

    log

    2

    I I

    =J

    log

    2

    I (A.7)

    I

    = J 2log

    2J

    +2(log )2

    4I (A.8)

    where

    J

    =

    p(x)v(x)

    p(x)dx J

    =

    v(x)2

    p(x)dx(A.9)

    J

    = v(x)p(x)

    p(x)dx

    B. Proof of Theorem 1

    The proof is standard and given for completeness, and given only in the casewhere < 2 (when = 2 take v = 0 below). What we need to prove is that, forany u v R, we have

    (up(x|G) + vp(x|G))2

    p(x

    |G)

    dx(B.1)

    (up(x| 0 0) + vp(x| 0 0))2

    p(x| 0 0)dx

    We set

    q(x) = p(x|G) q0(x) = p(x| 0 0)r(x) = up(x|G) + vp(x|G)r0(x) = up(x| 0 0) + vp(x| 0 0)

    Observe that by (A.1), q(x) = G(dy)q0(x y) hence r(x) = G(dy) r0(x y) as well. Apply the CauchySchwarz inequality to G with r0 =q0(r0/

    q0) to get

    r(x)2 q(x)

    G(dy)r0(x y)2q0(x y)

    Then

    r(x)2

    q(x)dx

    dx

    G(dy)

    r0(x y)2q0(x

    y)

    =

    r0(z)2

    q0(z)dz

  • 8/8/2019 ajfisher

    20/36

    SAMPLED LEVY PROCESSES 745

    by Fubinis theorem and a change of variable: this is exactly (B.1).

    C. Proof of Theorem 2

    It is easily seen from the characteristic function that h(w) is differ-entiable on (0 2], and we denote by h(w) its derivative. However, instead of(2.2), one has

    |h(w)|

    c log |w||w|1+ if < 2,1

    |w|3 if = 2(C.1)

    as|w

    | , by differentiation of the series expansion for the stable density.

    Therefore the quantity

    K() =

    h(w)2

    h(w)dw(C.2)

    is finite when < 2 and infinite for = 2. This is the Fishers information forestimating , upon observing the single variable W1.

    First, part (b) of the theorem implies (a). Second, the claim about the term

    In (n G

    n

    ) I()

    2(C.3)

    and its uniformity are proved in At-Sahalia and Jacod (2007), so we only needto prove the claims about I and I in (b) and (c).

    We assume < 2. If (b) fails, one can find , (0 ], and > 0,and also a sequence n 0 and a sequence Gn of measures in G() and asequence of numbers n converging to a limit , such that

    I

    n(n G

    n)

    (log(1/n))2 I()

    4

    +

    In (n Gn)

    log(1/n) I()

    2

    for all n. In other words, we will get a contradiction and (b) will be proved assoon as we show that

    I

    n(n G

    n)

    (log(1/))2 I()

    4

    I

    n(n G

    n)

    log(1/) I()

    2(C.4)

    Recall (A.7) and (A.8): with the notation J (G) of (A.9), wesee that

    J

    (G)

    I (G)J

    (G)(C.5)

  • 8/8/2019 ajfisher

    21/36

    746 Y. AT-SAHALIA AND J. JACOD

    by a first application of the CauchySchwarz inequality. A second applicationof the same plus (A.1) and (A.5) yields

    v(x|G)2 p(x|G)1/

    G(dy) h2hx y

    1/

    Then Fubinis theorem and the change of variable x (xy)/1/ in (A.9)leads to

    J

    (G) K()(C.6)

    Next, (C.4) readily follows from (C.5), (C.6), and (C.3), and also (A.7) and(A.8).

    Finally, it remains to prove the part of (c) concerning I. We alreadyknow that the sequence In (G

    n) converges to a limit which is strictlyless than I()/2. Then by (A.8), (C.5), (C.6), and (C.3), it is clear thatI

    n(Gn)/(log(1/n))2 also converges to a limit which is strictly less than

    I()/4.

    D. Proof of Theorem 3

    The CauchySchwarz inequality gives us, by (A.1) and (A.4),

    |p(x|G)|2

    133/

    p(x|G)

    G(dy)y2[h((x y)/1/)]2

    h((x y)/1/)

    Plugging this into (A.6), applying Fubinis theorem, and doing the change ofvariable x (x y)/1/ leads to

    I

    1

    2

    2/ G(dy)y

    2

    h(x)

    2

    h(x)

    dx(D.1)

    Since E(Y2) = m2 + v, we readily deduce (3.11).

    E. Proof of Theorem 4

    When Yt = t we gave G = . Then (4.1) follows directly from applying theformulae (A.2) and (A.4), and from the change of variable x (x )/1/in (A.6), after observing that the function hh/ h is integrable and odd,hence has a vanishing Lebesgue integral.

  • 8/8/2019 ajfisher

    22/36

    SAMPLED LEVY PROCESSES 747

    F. Proofs of Theorems 5 and 6

    F.1. Preliminaries

    We suppose that Y is a compound Poisson process with arrival rate andlaw of jumps , and that k is the kth fold convolution of. So we have

    G = e

    k=0

    ()k

    k! k

    Set

    (1) (kx) =

    11/

    k(du)h

    x u1/

    (F.1)

    (2) (kx) = 121/k(du)hx u

    1/(F.2)

    (3) (kx) =1

    22/

    k(du)uh

    x u1/

    (F.3)

    We have (recall that 0 = 0)

    p(x|G) = e

    k=0()k

    k! (1) (kx)(F.4)

    p(x|G) = e

    k=0

    ()k

    k! (2) (kx)(F.5)

    p(x|G) = e

    k=1

    ()k

    k! (3) (kx)(F.6)

    Omitting the mention of(G), we also set

    i = 2 3 (i)

    (kk

    ) = (i) (kx)

    (i) (k

    x)

    p(x) dx(F.7)

    (4) (kk) =

    (2) (kx)

    (3) (k

    x)

    p(x)dx(F.8)

    By the CauchySchwarz inequality, we have

    i = 2 3(i) (kk)(i) (kk)(i) (k k)(F.9)

    (4) (kk

    )(2)

    (kk)(3)

    (k k)(F.10)

  • 8/8/2019 ajfisher

    23/36

    748 Y. AT-SAHALIA AND J. JACOD

    For any k 0 we have p(x) e(()k)/k!(1) (kx). Therefore,

    i

    =2 3 (i) (kk)

    ek!

    ()k

    (i) (kx)2

    (1) (kx)

    dx(F.11)

    Finally, if we plug (F.5) and (F.6) into (A.6), we get

    I = e2

    k=0

    l=1

    ()k+l

    k!l! (4)

    (kl)(F.12)

    I = e2

    k=1

    l=1

    ()k+l

    k!l! (3)

    (kl)(F.13)

    F.2. Proof of Theorem 5

    When Y is a standard Poisson process, we have = 1 and k = k. There-fore, we get

    (1) (kx) =

    11/

    h

    x k1/

    (F.14)

    (2) (kx) =1

    21/h

    x k1/

    (F.15)

    (3) (kx)

    =k

    22/h

    x k1/ (F.16)

    Plugging this into (F.11) yields

    (2) (kk) ek!2k

    I() (3) (kk) ek2k!

    2k+2/J()(F.17)

    Recall that (4.2) follows from Theorem 2, so we need to prove (4.3) and(4.4). In view of (F.12) and (F.13), this amounts to proving the two properties

    k=0

    l=1k+l+1/1/2

    k!l!

    (4) (kl)

    0

    k=1

    l=1

    k+l+2/1

    k!l! (3)

    (kl) 1

    2J()

    If we use (F.9) and (F.17), it is easily seen that the sum of all summands in thefirst (resp. second) left side above, except the one for k = 0 and l = 1 (resp.k = l = 1), goes to 0. So we are left to prove

    1/2+1/(4) (0 1) 0 1+2/(3) (1 1) 1

    2J()(F.18)

  • 8/8/2019 ajfisher

    24/36

    SAMPLED LEVY PROCESSES 749

    Let = 12(1+) , so (1 + 1 )(1 ) = 1 + 1/2. Assume first that < 2. Thenif|x| |/1/|, we have for some constant C (0 ), possibly dependingon , , and , that changes from one occurrence to the other and provided

    (2/)

    ,h(x) C(1+1/) h

    x + r/1/ C1+1/

    when r Z\{0}, and thus|x|

    /1/ h

    x + r/1/ Ch(x)(1+1/)(1) = Ch(x)1+1/2

    When = 2, a simple computation on the normal density shows that the aboveproperty also holds. Therefore, in view of (F.4) and (F.14) we deduce that in all

    cases x 1/ 1/

    implies

    p(x) e

    1/h

    x 1/

    + C1+/2

    1 +

    k=2

    k

    k!

    e

    1/1 h x 1/1 + C/2By (F.7) it follows that

    (3)

    (1 1) e

    31+3/

    {|(x)/1/|(/1/)}

    h((x )/1/)2h((x )/1/)

    dx

    e

    21+2/

    {|x|(/1/)}

    h(x)2

    h(x)dx

    We readily deduce that liminf0 1+

    2/

    (3)

    (1 1) J()/2

    . On the otherhand, (F.17) yields limsup0 1+2/(3) (1 1) J()/2, and thus the second

    part of (F.18) is proved.Finally h/ h is bounded, so (F.15), (F.16), and the fact that

    p(x) eh

    x/1/

    /1/

    yield

    (4) (0 1)

    e

    32/ h

    x 1/

    dx e

    21/ |h(x)| dx

  • 8/8/2019 ajfisher

    25/36

    750 Y. AT-SAHALIA AND J. JACOD

    and the first part of (F.18) readily follows.

    F.3. Proof of Theorem 6

    We use the same notation as in the previous proof, but here the measure khas a density fk for all k 1, which further is differentiable and satisfies (4.5)uniformly in k, while we still have 0 = 0. Exactly as in (3.3), we set

    fk(u) = ufk(u) + fk(u)Recall the Fishers information L defined in (4.6), which corresponds to esti-mating in a model where one observes a variable U, with U having the law. Now if we have n independent variables Ui with the same law , the Fishersinformation associated with the observation ofUi for i = 1 n is of coursenL; if instead one observes only (U

    1 + +Un), one gets a smaller Fishers

    information Ln nL. In other words, we have

    Ln :=

    fn(u)2

    fn(u)du nL(F.19)

    Taking advantage of the fact that k has a density, for all k 1, we canrewrite (i) (kx) as (using further an integration by parts when i = 3 and thefact that each fk satisfies (4.5))

    (1)

    (kx)=

    1

    h(y)fkx y1/

    dy(F.20)(2) (kx) =

    1

    h(y)fk

    x y1/

    dy(F.21)

    (3) (kx) =1

    2

    h(y)fk

    x y1/

    dy(F.22)

    Since the fks satisfy (4.5) uniformly in k, we readily deduce that

    k

    1 i

    =1 2 3

    (i) (kx) C(F.23)

    (1) (1 x)

    1

    fx

    (3) (1 x)

    12

    fx

    (F.24)

    Let us start with the lower bound. Since (1) (0 x) = (1/1/)h(x/1/)we deduce from (2.2), (F.23), and (F.4) that

    1

    p(x) c

    |x|1+ +

    f

    x

    (F.25)

    as soon as x=

    0, and with the convention c2=

    0. In a similar way, we deduce

  • 8/8/2019 ajfisher

    26/36

    SAMPLED LEVY PROCESSES 751

    from (F.23) and (F.6) that

    1

    p(x)

    2f

    x

    (F.26)Then plugging (F.25) and (F.26) into the last equation in (A.6), we conclude byFatous lemma and after a change of variable that

    liminf0

    I

    2

    2

    f(x)2

    f (x) + c/|x|1+dx

    It remains to prove (4.8) and the upper bound in (4.9) (including when = 2). By the CauchySchwarz inequality, we get (using successively the twoequivalent versions for (i) (kx))

    (2) (kx)2 1

    3(1) (kx)

    k(du)h(x u)/1/

    (3) (kx)

    2 13

    (1) (kx)

    h(y)

    fk((x y1/)/)2fk((x y1/)/)

    dy

    Then it follows from (F.11) and (F.19) that

    (2)

    (kk) ek!

    2()kI()

    (3) (kk)

    ekk!2()k

    L(F.27)

    We also need an estimate for (4) (0 1). By (F.1), (F.2), and (F.4) we obtainp(x) eh(x/1/)/1/ and (2) (0 x) = h(x/1/)/21/. Thenuse (F.22) and the definition (F.7) to get

    (4) (0 1) 12 hh

    x

    1/

    dx h(y)fx y1/dy(F.28)

    = 12

    h(y)dy

    h

    h

    x

    1/

    f

    x y1/

    dx

    Cwhere the last inequality comes from the facts that h/ h is bounded and thatf is integrable (due to (4.5)).

    At this stage we use (F.9) together with (F.12) and (F.13), and the fact that2|xy| ax2 + y2/a for all a > 0. Taking arbitrary constants akl > 0, we deducefrom (F.27) that

    I L

    2e

    k=1

    ()kk

    k

    !

    + 2

    k=1

    l=k+1()k+l

    k

    !l

    !

    kk!

    ()kll!

    ()l

    1/2

  • 8/8/2019 ajfisher

    27/36

    752 Y. AT-SAHALIA AND J. JACOD

    L2

    +

    k=1

    l=k+1

    ()lk

    l! akl +

    k=1

    l=k+1

    ()kl

    k!1

    akl

    Then if we take akl = ()(kl)/2 for l > k, a simple computation shows thatindeed

    I L

    2

    + C3/2

    for some constant C, and thus we get the upper bound in (4.9). In a similarway, replacing L/2 above by the supremum between L/2 and I()/2, wesee that in (F.12) the sum of the absolute values of all summands except theone for k = 0 and l = 1 is smaller than a constant times . Finally, the sameholds for the term for k

    =0 and l

    =1, because of (F.28), and this proves (4.8).

    G. Proof of Theorem 7

    G.1. Preliminaries

    In the setting of Theorem 7, the measure G admits the density y h(y/

    1/)/1/. For simplicity, we set

    u = 1/1/(G.1)

    (so u 0 as 0), and a change of variable allows to write (A.1) asp(x|G) =

    11/

    h

    x

    1/ y

    h

    y

    u

    dy

    Therefore,

    p(x|G) = 1

    21/

    h

    x

    1/ y

    h

    y

    u

    dy

    and another change of variable in (A.6) leads to

    I =22u2

    2Ju(G.2)

    where

    Ju =

    Ru(x)2

    Su(x)dx(G.3)

    Ru(x) =

    u1+

    h(x y)h

    y

    udy

  • 8/8/2019 ajfisher

    28/36

    SAMPLED LEVY PROCESSES 753

    =

    u

    h

    x yu

    h(y)dy

    Su(x) =

    u h(x y)hyudy= hx yu h(y)dyBelow, we denote by K a constant and by a continuous function on R+

    with (0) = 0, both of them changing from line to line and possibly dependingon the parameters ; if they depend on another parameter , we writethem as K or . Recalling (2.2), we have, as |x| ,

    h(x) c

    |x|1+

    {|y|>|x|}h(y)dy

    2c|x| (G.4)

    h(x) c

    |x|1+ {|y|>|x|} h(y)dy 2c|x| Another application of (2.2) when < 2 and of the explicit form ofh2 gives

    |y| 1 |h(x y)| h(x) :=

    Kh(x) if < 2

    K(1 + x2)ex2/3 if = 2(G.5)

    We will now introduce an auxiliary function

    D(x) = {|y|>} h(x y) 1|y|1+ dy(G.6)which will help us control the behavior ofRu and Su as stated in the followinglemma.

    LEMMA 1: We haveSu(x)

    h(x) + c

    (u)

    D1(x)

    (G.7)

    (h(x)

    +uD1(x))(u)

    +Kuh(x)Ru(x) c2h(x) D(x)

    h(x)

    + D(x)

    (u) + Kh(x)2

    with

    D(x) 1

    |x

    |1+ as |x| (G.8)

  • 8/8/2019 ajfisher

    29/36

    754 Y. AT-SAHALIA AND J. JACOD

    PROOF: We split the first two integrals in Ru and Su into sums of integralson the two domains {|y| } and {|y| > } for some (0 1] to be chosenlater. We have |h(x y) h(x) + h(x)y| h(x)y2 as soon as |y| 1, sothe fact that both f = h and f = h are even functions gives

    {|y|}h(x y)f

    y

    u

    dy h(x)

    {|y|}

    f

    y

    u

    dy

    h(x)

    {|y|}

    y2fyu

    dyOn the one hand we have with f = h or f = h, and in view of (G.4),

    {|y|}

    y2fyudy= 3u33 {|z|/u} z2|f(z)| dz Ku1+2On the other hand, the integrability of h and h, and the fact that

    h(y)dy= 0 yield{|y|1}

    h

    y

    u

    dy= u

    {|z|/u}

    h(z)dz =u

    (1 + (u))

    {|y|} hyudy= u {|z|/u} h(z)dz= u

    {|z|>/u}

    h(z)dz

    = 2c(u)1+

    1+(1 + (u))

    Putting all these facts together yields

    {|y|1} h(x y)hyudy u h(x)(G.9) u(u)h(x) + Ku1+h(x)

    and {|y|}

    h(x y)h

    y

    u

    dy 2c

    (u)1+

    1+h(x)

    (G.10) u1+

    h(x)

    (u)

    + Kh(x)2

  • 8/8/2019 ajfisher

    30/36

    SAMPLED LEVY PROCESSES 755

    For the integrals on {|y| > } we observe that by (G.4) we have

    h(y/u) c(u/|y|)1+

    (u/y)1+(u)

    and the same for h except that c is substituted with c. Given the defini-tion ofD we readily get

    {|y|>1}h(x y)h

    y

    u

    dy c

    (u)1+

    1+D1(x)

    (G.11) D1(x)u1+(u)

    {|y|>}h(x y)h

    y

    u

    dy+ c

    (u)1+

    1+D(x)

    D(x)u1+(u)

    and if we put together (G.9) and (G.11), we obtain (G.7).Finally, we prove (G.8). We split the integral in (G.6) into the sum of the

    integrals, say D(1) and D(2) , over the two domains {|y| > |y x| |x|} and

    {|y| > |y x| > |x|}, where = 4/5 if = 2 and ((1 + )/(1 + ) 1) if < 2. On the one hand, D(2) (x) Kh(|x|), so with our choice of we obvi-ously have |x|1+D(2) (x) 0. On the other hand, |x|1+D(1) (x) is clearly equiv-alent, as |x| , to

    {|yx||x|} h(x y)dy, which equals

    {|z||x|} h(z)dz,

    which in turn goes to 1. Hence we get (G.8) for all > 0. This concludes the

    proof of the lemma. Q.E.D.

    Using the lemma, we can now obtain the behavior of Ru and Su asu 0. First, an application of (G.4)tothelastformulain(G.3) and Lebesguestheorem readily give

    limu0

    Su(x) = h(x)(G.12)

    Also, by (G.4), (G.5), (G.7), and (G.8), we get

    Su(x)

    C

    1 + |x|1+ if < 2,

    C

    ex

    2/2 + u

    1 + |x|1+

    if = 2,(G.13)

    for some C > 0 depending on the parameters and all u small enough. In thesame way, we see that uRu(x) 0, but this not enough. However, if r =2h/ D, we deduce from (G.7) that for all (0 1],

    limsupu

    0

    |Ru(x) cr(x)| Kh(x)2(G.14)

  • 8/8/2019 ajfisher

    31/36

    756 Y. AT-SAHALIA AND J. JACOD

    A simple computation and the second order Taylor expansion with integralremainder for h yield

    r(x) = {|y|>}

    h(x) h(x y)|y|1+ dy

    =

    {|y|>}|y|1 dy

    10

    (1 v)h(x yv)dv

    By Lebesgues theorem, r converges pointwise as 0 to the function rgiven by

    r(x)=

    R |y|1 dy1

    0

    (1

    v)h(x

    yv)dv

    Then, taking into account (G.14) and using once more (G.7) together with(G.4), (G.5), and (G.8), and also < , we get

    limu0

    Ru(x) = cr(x) |Ru(x)| K

    1 + |x|1+ (G.15)

    G.2. The Case (b) in Theorem 7

    We are now in a position to prove (4.12), where < 2 and > /2 (andof course < ). Since > /2, we see that (G.12), (G.13), and (G.15)allow us to apply Lebesgues theorem in the definition (G.3) to get thatJu

    dx(cr(x))

    2/ h(x): this is (4.12) (obviously |r(x)| K/(1 + |x|1+),while h(x) > C/(1 + |x|1+) for some C > 0; hence the integral in (4.12) con-verges).

    G.3. The Other Cases

    The other cases are a bit more involved, because Lebesgues theorem does

    not apply and we will see that Ju goes to infinity. First, we introduce the func-tions

    R(x) = c|x|1+ Su(x) =

    c

    |x|1+ +c

    u

    |x|1+ if < 2,

    ex2/2

    2

    + cu

    |x|1+ if = 2.

    Below we denote by (u ) for u (0 1] and 1 the sum (u) + (1/ )for any two functions and changing from line to line, that are continuous

  • 8/8/2019 ajfisher

    32/36

    SAMPLED LEVY PROCESSES 757

    on R+ with (0) = (0) = 0. We deduce from (G.4), (G.5), (G.7) for = 1,and (G.8) that

    |x|

    > (G.16)

    |Ru(x) + R(x)| (u )R(x) +

    K

    |x|1+ if < 2,

    Kx2ex2/3 if = 2

    In a similar way, we get

    |x| >

    |Su(x)

    Su(x)

    | (u )Su(x) if < 2,

    (u )Su(x) + K0u

    x2ex2/

    3 if = 2where K0 is some constant. Then for any > 0, we denote by the smallestnumber bigger than 1, such that K0x2ex

    2/3 c|x|1+ for all |x| > . The last

    estimate above for = 2 reads as |Su Su| Su((u ) + ), so in all caseswe have, for some fixed function 0 as above,

    Su(x) = Su(x)(1 + u(x))(G.17)where

    |u(x)| 0(u ) if < 2 |x| > ,0(u ) + if = 2 |x| > > 0

    At this stage, we set

    Ju =

    {|x|>}

    R(x)2

    Su(x)dx(G.18)

    Observe that Ju = Ju +

    4i=1 J

    (i)u, where

    J(1)u = {|x|}

    Ru(x)2

    Su(x)dx

    J(2)u =

    {|x|>}

    (Ru(x) + R(x))2Su(x)

    dx

    J(3)u = 2

    {|x|>}

    R(x)(Ru(x) + R(x))Su(x)

    dx

    J(4)u = {|x|>}

    R(x)2

    Su(x)dx

    {|x|>}R(x)2

    Su(x)dx

  • 8/8/2019 ajfisher

    33/36

    758 Y. AT-SAHALIA AND J. JACOD

    In light of (G.13), (G.15), and (G.16), we get, for some u0 > 0,

    supu(0u0] J(1)

    u < u (0 u0] J(2)u K + (u )

    J

    (4)u + Ju

    The CauchySchwarz inequality yieldsJ(3)u 2J(2)uJ(4)u + Ju1/2Finally, (G.13) and (G.17), and the definition ofR yield (with 0 as in (G.17))

    J(4)u

    20(u )Ju if < 2 0( u ) < 1/2,

    2(0(u ) + )Ju if = 2 > 0(u ) + < 1/2.

    Therefore, we get

    Ju

    Ju 1 K

    Ju+ (u ) + 2(0(u ) + )

    as soon as 0(u ) + < 1/2 and > , and with the convention that = 0and 0 = 1 when < 2. Then, remembering that limu0 lim (u ) = 0,and the same for 0, and that = 0 when < 2 and is arbitrarily small when = 2, we readily deduce the following fact: suppose that for some functionu (u) going to + as u 0 and independent of we have proved that

    Ju (u) as u 0 > 1(G.19)

    Then Ju (u) and, therefore, by (G.2) we get

    I 22u2(u)

    2 (G.20)

    G.3.1. The Case (d) in Theorem 7: Now let < 2 and < /2. The changeof variables z = xu/() in (G.18) yields

    Ju = 2c2

    {|x|>}

    1c|x|1+2 + cu|x|1+/

    dx(G.21)

    = 2c2

    u((2))/() {|z|> u/()}1

    c|z

    |1+2

    +c

    |z

    |1+/

    dz

  • 8/8/2019 ajfisher

    34/36

    SAMPLED LEVY PROCESSES 759

    and we have (G.19) with

    (u)

    =

    2c2

    u((

    2))/(

    )

    1

    c|z|1+

    2

    + c

    |z|1+

    /

    dz(G.22)

    So if we combine this with (G.20), we get (4.14).

    G.3.2. The Case (c) in Theorem 7: Suppose now that 2 = < 2. Then

    Ju = 22c2

    1x(c + cux/)

    dx(G.23)

    For v > 0 we let Hv be the unique point x > 0 such that cvx/ = c, soin fact Hv = /v for some > 0. We have

    Ju 22c2

    c

    Hu

    1x

    dx + Ku

    Hu

    1x1+

    dx 22c2c

    log 1u

    + KOn the other hand, if > 0 and < x < Hu, we have c + cux/ 22c2

    c

    Hu

    1x(1 + 1/) dx

    22c2c

    11 + 1/ log

    1u

    K

    Putting together these two estimates and choosing big gives (G.19) with

    (u) = 22c2

    clog 1

    u

    and we readily deduce (4.13).

    G.3.3. The Case (a) in Theorem 7: Finally, consider the case = 2. Wehave

    Ju = 22c2

    1

    x1+(x1+ex2/2/

    2

    +cu/)

    dx(G.24)

    Suppose that is large enough for x x2+2ex2/2 to be decreasing on[ ). For v > 0 small enough, there is a unique number Hv = x > suchthat cv/ = x1+ex2/2/

    2, so in fact Hv

    2 log(1/v) when v 0.

    Then

    Ju KHu

    ex2/2

    x2+2dx + 2

    2c

    u

    Hu

    1x1+

    dx(G.25)

    KHu

    + 2c

    uHu 2c

    u(2)/2(log(1/u))/2(1 + o(1))

  • 8/8/2019 ajfisher

    35/36

    760 Y. AT-SAHALIA AND J. JACOD

    On the other hand, if > 1 and x > Hu, we have x1+ex2/2/

    2 +

    cu/ < +cu(1 + )/, hence,

    Ju >22c

    u Hu 1x1+(1 + ) dx(G.26) 2c

    u(2)/2(log(1/u))/2(1 + ) (1 + o(1))

    So again we see, by choosing close to 1, that the desired result holdswith

    (u) = 2c

    u(2)/2(log(1/u))/2(G.27)

    and we deduce (4.11).

    REFERENCES

    AT-SAHALIA, Y. (2004): Disentangling Diffusion From Jumps, Journal of Financial Eco-nomics, 74, 487528.

    AT-SAHALIA, Y., AND J. JACOD (2007): Volatility Estimators for Discretely Sampled LvyProcesses, The Annals of Statistics, 35, 355392.

    ANDERSEN, T. G., T. BOLLERSLEV, F. X. DIEBOLD, AND P. LABYS (2003): Modeling and Fore-casting Realized Volatility, Econometrica, 71, 579625.

    BARNDORFF-NIELSEN, O. E., AND N. SHEPHARD (2004): Power and Bipower Variation With

    Stochastic Volatility and Jumps (With Discussion), Journal of Financial Econometrics, 2, 148.BERGSTRM, H. (1952): On Some Expansions of Stable Distribution Functions, Arkiv frMathematik, 2, 375378.

    BROCKWELL, P., AND B. BROWN (1980): High-Efficiency Estimation for the Positive StableLaws, Journal of the American Statistical Association, 76, 626631.

    DUMOUCHEL, W. H. (1971): Stable Distributions in Statistical Inference, Ph.D. Thesis, De-partment of Statistics, Yale University.

    (1973a): On the Asymptotic Normality of the Maximum-Likelihood Estimator WhenSampling From a Stable Distribution, The Annals of Statistics, 1, 948957.

    (1973b): Stable Distributions in Statistical Inference: 1. Symmetric Stable DistributionsCompared to Other Symmetric Long-Tailed Distributions, Journal of the American StatisticalAssociation, 68, 469477.

    (1975): Stable Distributions in Statistical Inference: 2. Information From Stably Dis-tributed Samples, Journal of the American Statistical Association, 70, 386393.

    HOFFMANN-JRGENSEN, J. (1993): Stable Densities, Theory of Probability and Its Applications,38, 350355.

    HPFNER, R., AND J. JACOD (1994): Some Remarks on the Joint Estimation of the Index andthe Scale Parameters for Stable Processes, in Proceedings of the Fifth Prague Symposium onAsymptotic Statistics, ed. by P. Mandl and M. Huskova. Heidelberg: Physica-Verlag, 273284.

    JACOD, J., AND A. N. SHIRYAEV (2003): Limit Theorems for Stochastic Processes (Second Ed.).New York: Springer-Verlag.

    MANCINI, C. (2001): Disentangling the Jumps of the Diffusion in a Geometric Jumping Brown-ian Motion, Giornale dellIstituto Italiano degli Attuari, LXIV, 1947.

    MYKLAND, P. A., AND L. ZHANG (2006): ANOVA for Diffusions and It, Processes, The Annalsof Statistics, 34, 19311963.

    NOLAN, J. P. (1997): "Numerical Computation of Stable Densities and Distribution Functions," Communications in Statistics: Stochastic Models, 13, 759–774.

    ——— (2001): "Maximum Likelihood Estimation and Diagnostics for Stable Distributions," in Lévy Processes: Theory and Applications, ed. by O. E. Barndorff-Nielsen, T. Mikosch, and S. I. Resnick. Boston: Birkhäuser, 379–400.

    RACHEV, S., AND S. MITTNIK (2000): Stable Paretian Models in Finance. Chichester, U.K.: Wiley.

    WOERNER, J. H. (2004): "Estimating the Skewness in Discretely Observed Lévy Processes," Econometric Theory, 20, 927–942.

    ZOLOTAREV, V. M. (1986): One-Dimensional Stable Distributions, Vol. 65 of Translations of Mathematical Monographs. Providence, RI: American Mathematical Society.

    ——— (1995): "On Representation of Densities of Stable Laws by Special Functions," Theory of Probability and Its Applications, 39, 354–362.
