Continuous Local Martingale Mt ≥ C1 Supermartingale

A stochastic process for which there exists a sequence of stopping times such that each stopped process is a martingale.

In mathematics, a local martingale is a type of stochastic process, satisfying the localized version of the martingale property. Every martingale is a local martingale; every bounded local martingale is a martingale; in particular, every local martingale that is bounded from below is a supermartingale, and every local martingale that is bounded from above is a submartingale; however, in general a local martingale is not a martingale, because its expectation can be distorted by large values of small probability. For example, a driftless diffusion process is a local martingale, but not necessarily a martingale.

Local martingales are essential in stochastic analysis (see Itō calculus, semimartingale, and Girsanov theorem).

Definition

Let $(\Omega, F, P)$ be a probability space; let $F_* = \{F_t \mid t \ge 0\}$ be a filtration of $F$; let $X \colon [0,\infty) \times \Omega \to S$ be an $F_*$-adapted stochastic process on the set $S$. Then $X$ is called an $F_*$-local martingale if there exists a sequence of $F_*$-stopping times $\tau_k \colon \Omega \to [0,\infty)$ such that

  • the $\tau_k$ are almost surely increasing: $P(\tau_k < \tau_{k+1}) = 1$;
  • the $\tau_k$ diverge almost surely: $P(\tau_k \to \infty \text{ as } k \to \infty) = 1$;
  • the stopped process $X_{t \wedge \tau_k}$ is an $F_*$-martingale for every $k$.

Examples

Example 1

Let $W_t$ be the Wiener process and $T = \min\{t : W_t = -1\}$ the time of first hit of $-1$. The stopped process $W_{\min\{t,T\}}$ is a martingale; its expectation is 0 at all times, nevertheless its limit (as $t \to \infty$) is equal to $-1$ almost surely (a kind of gambler's ruin). A time change leads to a process

$$X_t = \begin{cases} W_{\min\left(\frac{t}{1-t},\,T\right)} & \text{for } 0 \le t < 1, \\ -1 & \text{for } 1 \le t < \infty. \end{cases}$$

The process $X_t$ is continuous almost surely; nevertheless, its expectation is discontinuous,

$$\operatorname{E} X_t = \begin{cases} 0 & \text{for } 0 \le t < 1, \\ -1 & \text{for } 1 \le t < \infty. \end{cases}$$

This process is not a martingale. However, it is a local martingale. A localizing sequence may be chosen as $\tau_k = \min\{t : X_t = k\}$ if there is such a $t$, otherwise $\tau_k = k$. This sequence diverges almost surely, since $\tau_k = k$ for all $k$ large enough (namely, for all $k$ that exceed the maximal value of the process $X$). The process stopped at $\tau_k$ is a martingale.[details 1]
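The jump of $\operatorname{E} X_t$ at $t = 1$ is easy to see numerically. Below is a minimal Python sketch (not part of the original example; the helper name, path counts and the grid-based detection of the hitting time are illustrative choices, and the discretisation introduces a small bias) that estimates $\operatorname{E} X_t = \operatorname{E} W_{\min(t/(1-t),\,T)}$ by Monte Carlo for a few $t < 1$; for $t \ge 1$ the process equals $-1$ by definition.

```python
import numpy as np

rng = np.random.default_rng(0)

def stopped_bm_mean(s, n_paths=20_000, n_steps=4_000):
    """Monte Carlo estimate of E[W_{min(s, T)}], T = first hitting time of -1.

    Hitting is detected only on the time grid, so paths may overshoot -1
    between grid points; this gives a small bias.  Parameters are arbitrary.
    """
    dt = s / n_steps
    w = np.zeros(n_paths)
    stopped = np.zeros(n_paths, dtype=bool)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)
        w = np.where(stopped, w, w + dw)      # frozen paths no longer move
        hit = (~stopped) & (w <= -1.0)
        w[hit] = -1.0                         # clamp to the barrier
        stopped |= hit
    return w.mean()

# X_t = W_{min(t/(1-t), T)} for t < 1 and X_t = -1 for t >= 1.
for t in (0.25, 0.5, 0.9):
    print(f"t = {t}:  E[X_t] ≈ {stopped_bm_mean(t / (1.0 - t)): .3f}  (exact value: 0)")
print("t >= 1:  X_t = -1, hence E[X_t] = -1")
```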

Example 2

Let $W_t$ be the Wiener process and $f$ a measurable function such that $\operatorname{E}|f(W_1)| < \infty$. Then the following process is a martingale:

$$X_t = \operatorname{E}\bigl(f(W_1) \mid F_t\bigr) = \begin{cases} f_{1-t}(W_t) & \text{for } 0 \le t < 1, \\ f(W_1) & \text{for } 1 \le t < \infty; \end{cases}$$

here

$$f_s(x) = \operatorname{E} f(x + W_s) = \int f(x+y)\,\frac{1}{\sqrt{2\pi s}}\,\mathrm{e}^{-y^2/(2s)}\,dy.$$
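Since $X_t$ is a martingale, $\operatorname{E} X_t = \operatorname{E} f(W_1)$ is constant in $t$. As a quick numerical sanity check (the choice $f(x) = |x|$, the integration grid and the simple Riemann sums below are ours, purely for illustration), one can evaluate $f_{1-t}$ by integrating against the heat kernel and then average it against the law of $W_t$:

```python
import numpy as np

y = np.linspace(-10.0, 10.0, 2001)   # truncated integration grid
dy = y[1] - y[0]

def gauss(z, s):
    """Density of N(0, s) at z."""
    return np.exp(-z**2 / (2.0 * s)) / np.sqrt(2.0 * np.pi * s)

f = np.abs    # any f with E|f(W_1)| < infinity; f(x) = |x| is just an example

def f_s(x, s):
    """f_s(x) = E f(x + W_s), computed by a Riemann sum against the heat kernel."""
    return (f(x[:, None] + y[None, :]) * gauss(y, s)).sum(axis=1) * dy

E_f_W1 = (f(y) * gauss(y, 1.0)).sum() * dy                # E f(W_1) = sqrt(2/pi) ≈ 0.798 here
for t in (0.2, 0.5, 0.9):
    E_X_t = (f_s(y, 1.0 - t) * gauss(y, t)).sum() * dy    # E X_t = E f_{1-t}(W_t)
    print(f"t = {t}:  E[X_t] ≈ {E_X_t:.4f}   (E[f(W_1)] ≈ {E_f_W1:.4f})")
```

Constancy of the expectation is of course only a necessary condition; the martingale property itself follows from the tower property of conditional expectation.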

The Dirac delta function $\delta$ (strictly speaking, not a function), being used in place of $f$, leads to a process defined informally as $Y_t = \operatorname{E}(\delta(W_1) \mid F_t)$ and formally as

$$Y_t = \begin{cases} \delta_{1-t}(W_t) & \text{for } 0 \le t < 1, \\ 0 & \text{for } 1 \le t < \infty, \end{cases}$$

where

$$\delta_s(x) = \frac{1}{\sqrt{2\pi s}}\,\mathrm{e}^{-x^2/(2s)}.$$

The process $Y_t$ is continuous almost surely (since $W_1 \neq 0$ almost surely); nevertheless, its expectation is discontinuous,

$$\operatorname{E} Y_t = \begin{cases} 1/\sqrt{2\pi} & \text{for } 0 \le t < 1, \\ 0 & \text{for } 1 \le t < \infty. \end{cases}$$

This process is not a martingale. However, it is a local martingale. A localizing sequence may be chosen as $\tau_k = \min\{t : Y_t = k\}$.
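The drop of $\operatorname{E} Y_t$ from $1/\sqrt{2\pi} \approx 0.399$ to 0 can also be checked by direct sampling, since $W_t \sim N(0,t)$ and $Y_t = \delta_{1-t}(W_t)$ has an explicit formula. A short Python sketch (sample sizes and names are arbitrary, not from the original text):

```python
import numpy as np

rng = np.random.default_rng(1)

def delta_s(x, s):
    """Heat kernel: delta_s(x) = exp(-x^2 / (2 s)) / sqrt(2 pi s)."""
    return np.exp(-x**2 / (2.0 * s)) / np.sqrt(2.0 * np.pi * s)

n_samples = 200_000
for t in (0.2, 0.5, 0.9):
    w_t = rng.normal(0.0, np.sqrt(t), n_samples)   # W_t ~ N(0, t)
    y_t = delta_s(w_t, 1.0 - t)                    # Y_t = delta_{1-t}(W_t)
    print(f"t = {t}:  E[Y_t] ≈ {y_t.mean():.4f}  (exact: {1.0 / np.sqrt(2.0 * np.pi):.4f})")
print("t >= 1:  Y_t = 0, hence E[Y_t] = 0")
```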

Example 3

Let $Z_t$ be the complex-valued Wiener process, and

$$X_t = \ln |Z_t - 1|.$$

The process $X_t$ is continuous almost surely (since $Z_t$ does not hit 1, almost surely), and is a local martingale, since the function $u \mapsto \ln|u-1|$ is harmonic (on the complex plane without the point 1). A localizing sequence may be chosen as $\tau_k = \min\{t : X_t = -k\}$. Nevertheless, the expectation of this process is non-constant; moreover,

$\operatorname{E} X_t \to \infty$ as $t \to \infty$,

which can be deduced from the fact that the mean value of $\ln|u-1|$ over the circle $|u| = r$ tends to infinity as $r \to \infty$. (In fact, it is equal to $\ln r$ for $r \ge 1$ but to 0 for $r \le 1$.)
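This parenthetical claim (a consequence of the mean value property of the harmonic function $u \mapsto \ln|u-1|$ for $r \le 1$, and of Jensen's formula for $r \ge 1$) is easy to verify numerically. The sketch below, with an arbitrary function name and grid size, compares a simple Riemann sum over the circle $|u| = r$ with $\max(\ln r, 0)$:

```python
import numpy as np

def circle_average(r, n=200_000):
    """Average of ln|u - 1| over the circle |u| = r, by a simple Riemann sum.

    Avoid r = 1 exactly: the integrand then has an (integrable) logarithmic
    singularity at u = 1 that a naive sum handles poorly.
    """
    theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    u = r * np.exp(1j * theta)
    return np.log(np.abs(u - 1.0)).mean()

for r in (0.5, 0.9, 2.0, 10.0, 100.0):
    exact = max(np.log(r), 0.0)
    print(f"r = {r:6}:  average ≈ {circle_average(r): .4f},   max(ln r, 0) = {exact: .4f}")
```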

Martingales via local martingales

Let $M_t$ be a local martingale. In order to prove that it is a martingale, it is sufficient to prove that $M_t^{\tau_k} \to M_t$ in $L^1$ (as $k \to \infty$) for every $t$, that is, $\operatorname{E}|M_t^{\tau_k} - M_t| \to 0$; here $M_t^{\tau_k} = M_{t \wedge \tau_k}$ is the stopped process. The given relation $\tau_k \to \infty$ implies that $M_t^{\tau_k} \to M_t$ almost surely. The dominated convergence theorem ensures the convergence in $L^1$ provided that

$$(*)\qquad \operatorname{E}\sup_k |M_t^{\tau_k}| < \infty \quad \text{for every } t.$$

Thus, condition (*) is sufficient for a local martingale $M_t$ to be a martingale. A stronger condition

$$(**)\qquad \operatorname{E}\sup_{s\in[0,t]} |M_s| < \infty \quad \text{for every } t$$

is also sufficient, since $|M_t^{\tau_k}| = |M_{t \wedge \tau_k}| \le \sup_{s \in [0,t]} |M_s|$, so (**) implies (*).

Caution. The weaker condition

$$\sup_{s\in[0,t]} \operatorname{E}|M_s| < \infty \quad \text{for every } t$$

is not sufficient. Moreover, the condition

$$\sup_{t\in[0,\infty)} \operatorname{E}\,\mathrm{e}^{|M_t|} < \infty$$

is still not sufficient; for a counterexample see Example 3 above.

A special case:

$$M_t = f(t, W_t),$$

where $W_t$ is the Wiener process, and $f \colon [0,\infty) \times \mathbb{R} \to \mathbb{R}$ is twice continuously differentiable. The process $M_t$ is a local martingale if and only if $f$ satisfies the PDE

$$\left(\frac{\partial}{\partial t} + \frac{1}{2}\,\frac{\partial^2}{\partial x^2}\right) f(t,x) = 0.$$

However, this PDE itself does not ensure that $M_t$ is a martingale. In order to apply (**) the following condition on $f$ is sufficient: for every $\varepsilon > 0$ and $t$ there exists $C = C(\varepsilon, t)$ such that

$$|f(s,x)| \le C\,\mathrm{e}^{\varepsilon x^2}$$

for all $s \in [0,t]$ and $x \in \mathbb{R}$.
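Both parts of this criterion are easy to check for standard examples. For instance, $f(t,x) = x^2 - t$ and $f(t,x) = \mathrm{e}^{x - t/2}$ satisfy the PDE (giving the classical martingales $W_t^2 - t$ and $\mathrm{e}^{W_t - t/2}$), and both also satisfy the growth bound $|f(s,x)| \le C\,\mathrm{e}^{\varepsilon x^2}$ for every $\varepsilon > 0$. A small symbolic sketch in Python/sympy (the helper name is ours, not from the text):

```python
import sympy as sp

t, x = sp.symbols("t x", real=True)

def solves_heat_pde(f):
    """Check symbolically whether f(t, x) satisfies f_t + (1/2) f_xx = 0."""
    return sp.simplify(sp.diff(f, t) + sp.Rational(1, 2) * sp.diff(f, x, 2)) == 0

# Two classical space-time harmonic functions; f(t, W_t) is then a local
# martingale, and the growth condition above makes it a true martingale.
for f in (x**2 - t, sp.exp(x - t / 2)):
    print(f, "->", solves_heat_pde(f))
```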

Technical details

  1. ^ For the times before 1 it is a martingale since a stopped Brownian motion is. After the instant 1 it is constant. It remains to check it at the instant 1. By the bounded convergence theorem the expectation at 1 is the limit of the expectation at (n-1)/n (as n tends to infinity), and the latter does not depend on n. The same argument applies to the conditional expectation.



Source: https://en.wikipedia.org/wiki/Local_martingale
