This is a review of chapter 1 of Capinski’s book; the overview, however, will be slightly different.
definition: A measurable space is a pair $(\Omega, \mathcal{F})$ where $\Omega$ is a set and $\mathcal{F}$ is a sigma-algebra on $\Omega$.
The role of the sigma-algebra $\mathcal{F}$ is to model the information available through observation. It is a sigma-algebra to enable attachment of a measure. An element of $\mathcal{F}$ is called a measurable set.
If $\Omega = \mathbb{R}^n$, then $\mathcal{F} = \mathcal{B}(\mathbb{R}^n)$, the sigma-algebra generated from the open sets of the standard metric induced by the Euclidean norm.
definition: A measurable map is a function $f : (\Omega_1, \mathcal{F}_1) \to (\Omega_2, \mathcal{F}_2)$ on measurable spaces such that $f^{-1}(A) \in \mathcal{F}_1$ for every $A \in \mathcal{F}_2$.
The composition of measurable maps is a measurable map. A map $f$ is measurable with respect to information $\mathcal{G}$, or $\mathcal{G}$-measurable, if $\sigma(f) \subseteq \mathcal{G}$. In other words, the information $f$ provides is contained in $\mathcal{G}$.
definition: A random variable is a real-valued measurable map.
definition: A probability space is a measurable space equipped with a probability measure defined on the measurable sets of the space.
definition: A real discrete-time process is a real-valued sequence. A random discrete-time process is a sequence $(X_n)_{n \geq 0}$ of random variables.
For a given outcome $\omega$ of a measurable space, a random process $(X_n)$ is realized as a real sequence $(X_n(\omega))$. The value of the process at time $n$ is $X_n(\omega)$.
definition: A filtration is a non-decreasing sequence $(\mathcal{F}_n)_{n \geq 0}$ of sub-sigma-algebras of $\mathcal{F}$.
If $(\mathcal{F}_n)$ is a filtration, then $\mathcal{F}_n$ is the information that can be measured at time $n$.
definition: The natural filtration of a random process $(X_n)$ is the filtration generated from the $X_k$ for $k \leq n$: $\mathcal{F}_n = \sigma(X_0, \dots, X_n)$.
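On a finite sample space a sigma-algebra corresponds to a partition of outcomes, so the natural filtration can be made concrete by computing, at each time, the partition of outcomes that are indistinguishable from the observations so far. A minimal sketch, assuming a 3-flip fair-coin space with $X_n$ the running head count (illustrative choices, not from the text):

```python
from itertools import product

# Sample space: all length-3 coin-flip paths; X_n = number of heads so far.
# On a finite space, the sigma-algebra generated by X_0..X_n corresponds to
# the partition grouping outcomes with identical observed values.
omega = list(product([0, 1], repeat=3))

def X(n, w):
    return sum(w[:n])  # head count after n flips (X_0 = 0)

def partition(n):
    """Blocks of outcomes indistinguishable from observations X_0..X_n."""
    blocks = {}
    for w in omega:
        key = tuple(X(k, w) for k in range(n + 1))
        blocks.setdefault(key, set()).add(w)
    return set(frozenset(b) for b in blocks.values())

def refines(p_fine, p_coarse):
    """Every block of the finer partition sits inside a coarse block."""
    return all(any(b <= c for c in p_coarse) for b in p_fine)

# The natural filtration is non-decreasing: each partition refines the last.
for n in range(3):
    assert refines(partition(n + 1), partition(n))
```

The check confirms that observing more of the process only splits information blocks, never merges them.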
definition: A random process $(X_n)$ is adapted to a filtration $(\mathcal{F}_n)$ if for all $n$, $X_n$ is $\mathcal{F}_n$-measurable.
definition: A filtered measurable space is a measurable space equipped with a filtration of the measurable sets of the space.
This structure models the causal flow of information. Here information will be taken to be synonymous with sigma-algebra, and information flow with a filtration.
definition: The conditional expectation of a random variable $X$ with respect to information $\mathcal{G}$ is an integrable random variable $Y$ such that $Y$ is $\mathcal{G}$-measurable, and for any $\mathcal{G}$-measurable set $A$, $\int_A Y \, dP = \int_A X \, dP$.
Since it can be shown that all such random variables are identical to each other almost everywhere, it is customary to ignore the difference and denote the conditional expectation by $E[X \mid \mathcal{G}]$ or $E_{\mathcal{G}}[X]$.
A consequence of the definition is that conditional expectations taken along a filtration satisfy the so-called tower law:
lemma: when $(\mathcal{F}_n)$ is a filtration and $m \leq n$, $E[\,E[X \mid \mathcal{F}_n] \mid \mathcal{F}_m] = E[X \mid \mathcal{F}_m]$.
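On a finite probability space the tower law can be checked by direct computation: conditioning on $\mathcal{F}_n$ amounts to averaging over the outcomes that share the same first $n$ observations. A minimal sketch, assuming a 3-flip fair-coin space with $X$ the total head count (illustrative choices, not from the text):

```python
from itertools import product

# Exact check of the tower law on a finite space: 3 fair coin flips,
# X = total number of heads. E[X | F_n] averages X over the outcomes
# that share the same first n flips (each path has probability 1/8).
omega = list(product([0, 1], repeat=3))

def cond_exp(f, n):
    """E[f | F_n] as a dict: outcome -> block average of f."""
    out = {}
    for w in omega:
        block = [v for v in omega if v[:n] == w[:n]]
        out[w] = sum(f(v) for v in block) / len(block)
    return out

X = lambda w: sum(w)
inner = cond_exp(X, 2)                     # E[X | F_2]
towered = cond_exp(lambda w: inner[w], 1)  # E[E[X | F_2] | F_1]
direct = cond_exp(X, 1)                    # E[X | F_1]
assert all(abs(towered[w] - direct[w]) < 1e-12 for w in omega)
```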
It is convenient to simplify notation and write $E_n[X]$ when it is clear the conditional expectation is with respect to the filtration $(\mathcal{F}_n)$ of a filtered space. Also, instead of saying $X$ is $\mathcal{F}_n$-measurable, it is convenient to say that $X$ is measurable at time $n$.
definition: A random process $(X_n)$ is adapted to a filtered space if for any time $n$, $X_n$ is measurable at time $n$.
definition: A [sub- | super- | ] martingale of a filtration is an adapted, integrable random process $(X_n)$ such that for any $n$, $E_n[X_{n+1}] \; [\geq \mid \leq \mid =] \; X_n$.
definition: a process $(X_n)$ is predictable if for any time $n$, $X_{n+1}$ is measurable at time $n$.
In other words, the value of the process at the next time instant is determined by the information available at the current time.
proposition: a predictable martingale is constant.
Since $X$ is a martingale, $E_n[X_{n+1}] = X_n$; but since $X_{n+1}$ is measurable at time $n$, $E_n[X_{n+1}] = X_{n+1}$, and so $X_{n+1} = X_n$. By induction, $X_n = X_0$ for any time $n$. And so for any outcome $\omega$, the realization $n \mapsto X_n(\omega)$ is constant.
theorem: Let $(X_n)$ be a martingale and $(H_n)$ be a predictable process. Provided $H$ is bounded or $H_n, X_n$ are both square-integrable, the process defined by $Y_0 = 0$ and, for $n \geq 1$, $Y_n = \sum_{k=1}^{n} H_k \, \Delta X_k$ (where $\Delta X_k = X_k - X_{k-1}$) is a martingale.
Note that $E_n[Y_{n+1}] = Y_n + H_{n+1} E_n[\Delta X_{n+1}] = Y_n$, with the added technical conditions ensuring integrability of each $H_k \, \Delta X_k$.
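The martingale transform can be verified exactly by enumeration on a small space. The sketch below assumes a fair $\pm 1$ walk as the martingale and a hypothetical double-after-a-loss betting rule as the predictable process $H$ (both illustrative choices, not from the text):

```python
from itertools import product

# Exact check that the martingale transform Y_n = sum H_k (X_k - X_{k-1})
# is a martingale, over 3 fair +/-1 steps. The strategy H doubles after a
# loss (H_k depends only on the first k-1 steps, hence predictable).
omega = list(product([-1, 1], repeat=3))

def H(k, w):                    # bet size for step k, k = 1..3
    return 2 if k > 1 and w[k - 2] == -1 else 1

def Y(n, w):                    # transform after n steps
    return sum(H(k, w) * w[k - 1] for k in range(1, n + 1))

for n in range(3):
    for w in omega:
        block = [v for v in omega if v[:n] == w[:n]]    # atom of F_n
        cond = sum(Y(n + 1, v) for v in block) / len(block)
        assert abs(cond - Y(n, w)) < 1e-12              # E_n[Y_{n+1}] = Y_n
```

The loop checks the one-step martingale property on every atom of the filtration, which is exactly the conditional-expectation identity in the note above.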
lemma: (Jensen’s inequality) Let $\phi$ be a convex real function such that for a random variable $X$ defined on a probability space, $\phi(X)$ is integrable. Then for any information structure $\mathcal{G}$ of the space, $\phi(E[X \mid \mathcal{G}]) \leq E[\phi(X) \mid \mathcal{G}]$.
lemma: if $(X_n)$ is a square-integrable martingale, then $(X_n^2)$ is a submartingale.
By Jensen, $E_n[X_{n+1}^2] \geq (E_n[X_{n+1}])^2 = X_n^2$.
definition: a process $(X_n)$ is non-decreasing if for any $\omega$, $n \mapsto X_n(\omega)$ is a non-decreasing sequence.
proposition: a non-decreasing martingale is constant.
Since $X$ is non-decreasing, $X_{n+1} - X_n \geq 0$; and since $X$ is a martingale, $E_n[X_{n+1} - X_n] = 0$ for all $n$. A non-negative random variable with zero expectation is zero almost everywhere, which implies $X_{n+1} = X_n$, and so $X$ is constant.
theorem: (Doob) If $(X_n)$ is a submartingale of a filtered space, then there is a martingale $(M_n)$ and a non-decreasing predictable process $(A_n)$ defined on the space such that $X_n = M_n + A_n$ and $A_0 = 0$. Furthermore, the representation is (a.e.) unique.
Doob’s proof demonstrates that one can always convert a submartingale into a martingale by subtracting its compensator, the process $(A_n)$. Of particular interest is the square of a martingale.
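The decomposition can be exhibited concretely: for a fair $\pm 1$ walk $S_n$, the submartingale $S_n^2$ has compensator $A_n = n$, so $M_n = S_n^2 - n$ should be a martingale. A sketch checking this by exact enumeration (the 4-step walk is an illustrative assumption):

```python
from itertools import product

# Doob decomposition of the submartingale S_n^2, where S_n is a fair +/-1
# walk: the compensator is A_n = n, so M_n = S_n^2 - n is a martingale.
omega = list(product([-1, 1], repeat=4))

def S(n, w):
    return sum(w[:n])

def M(n, w):
    return S(n, w) ** 2 - n

for n in range(4):
    for w in omega:
        block = [v for v in omega if v[:n] == w[:n]]    # atom of F_n
        cond = sum(M(n + 1, v) for v in block) / len(block)
        assert abs(cond - M(n, w)) < 1e-12              # E_n[M_{n+1}] = M_n
```

Here $A_n = n$ is predictable (deterministic) and non-decreasing, as the theorem requires.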
proposition: If $(X_n)$ is a square-integrable martingale such that $X_0 = 0$, then when $m \leq n$, $E_m[(X_n - X_m)^2] = E_m[X_n^2] - X_m^2 = E_m\!\left[\sum_{k=m+1}^{n} (\Delta X_k)^2\right]$.
This is a key result for discrete Ito integrals.
proposition: (Ito Isometry) If $(X_n)$ is a square-integrable martingale, $(H_n)$ is predictable and bounded, and $Y_n = \sum_{k=1}^{n} H_k \, \Delta X_k$, then $E[Y_n^2] = E\!\left[\sum_{k=1}^{n} H_k^2 \, \Delta \langle X \rangle_k\right]$, where $\langle X \rangle$ is the compensator of $(X_n^2)$.
This replaces the stochastic integrator $X$ with a predictable integrator $\langle X \rangle$ under appropriate technical conditions.
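The isometry can be checked exactly for a fair $\pm 1$ walk, where each increment contributes $\Delta\langle S\rangle_k = 1$, so $E[Y_n^2]$ should equal $E[\sum_k H_k^2]$. A sketch with a hypothetical predictable $H$ (all choices illustrative, not from the text):

```python
from itertools import product

# Exact check of the discrete Ito isometry for a fair +/-1 walk: the
# compensator of S_n^2 is <S>_n = n (each increment contributes 1), so
# E[Y_n^2] should equal E[sum_k H_k^2].
omega = list(product([-1, 1], repeat=3))

def H(k, w):                    # predictable: depends only on steps < k
    return 1 + (w[k - 2] == 1 if k > 1 else 0)

def Y(n, w):                    # discrete stochastic integral
    return sum(H(k, w) * w[k - 1] for k in range(1, n + 1))

lhs = sum(Y(3, w) ** 2 for w in omega) / len(omega)
rhs = sum(sum(H(k, w) ** 2 for k in range(1, 4)) for w in omega) / len(omega)
assert abs(lhs - rhs) < 1e-12
```

The cross terms $E[H_j H_k \, \Delta X_j \Delta X_k]$ for $j < k$ vanish by predictability, which is why only the squared terms survive.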
Stopping Times
definition: A stopping time is a random variable $\tau$ whose range is $\{0, 1, 2, \dots\} \cup \{\infty\}$, defined on a filtered space, such that for each $n$ the event $\{\tau \leq n\}$ is $\mathcal{F}_n$-measurable.
The role of a stopping time is to model an adapted process that announces when something occurs at a given time (or, if infinite, never occurs).
proposition: Constant stopping times, the max/min of stopping times, and the limit of a convergent sequence of stopping times are stopping times.
definition: Let $(X_n)$ be an adapted process defined on a filtered space and $\tau$ be a stopping time that is (almost surely) finite. Then the random variable $X_\tau(\omega) = X_{\tau(\omega)}(\omega)$ is the value of the stopped process at the stopping time.
definition: for a finite stopping time $\tau$, the information known at time $\tau$ is $\mathcal{F}_\tau = \{ A \in \mathcal{F} : A \cap \{\tau \leq n\} \in \mathcal{F}_n \text{ for all } n \}$.
It can be shown that $\mathcal{F}_\tau$ is sub-information of $\mathcal{F}$; that when $\sigma, \tau$ are stopping times such that $\sigma \leq \tau$ almost surely, then $\mathcal{F}_\sigma$ is coarser than $\mathcal{F}_\tau$; and that any stopping time is measurable with respect to its information ($\tau$ is $\mathcal{F}_\tau$-measurable).
definition: the process $X^\tau_n = X_{\tau \wedge n}$ (where $\tau \wedge n = \min(\tau, n)$) is called the process $X$ stopped at $\tau$.
proposition: $X^\tau$ is adapted.
theorem: if $(X_n)$ is a martingale and $\tau$ is a stopping time, then the stopped process $(X_{\tau \wedge n})$ is a martingale.
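This can be checked by exact enumeration: stop a fair $\pm 1$ walk at the first hit of $+1$ and verify the one-step martingale property of the stopped process on every atom of the filtration (the walk and the 4-step horizon are illustrative assumptions):

```python
from itertools import product

# Exact check that a stopped martingale is a martingale: S_n is a fair
# +/-1 walk and tau is the first time S hits +1 (a stopping time, since
# {tau <= n} depends only on the first n steps).
omega = list(product([-1, 1], repeat=4))

def S(n, w):
    return sum(w[:n])

def tau(w):
    for n in range(1, 5):
        if S(n, w) == 1:
            return n
    return 10 ** 9  # "infinite": never hit within the horizon

def stopped(n, w):
    return S(min(n, tau(w)), w)

for n in range(4):
    for w in omega:
        block = [v for v in omega if v[:n] == w[:n]]    # atom of F_n
        cond = sum(stopped(n + 1, v) for v in block) / len(block)
        assert abs(cond - stopped(n, w)) < 1e-12
```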
The next result is a generalization of the martingale definition that applies to ordered stopping times.
theorem: (Doob’s optional sampling theorem) If $\sigma, \tau$ are bounded stopping times such that $\sigma \leq \tau$, then for any martingale $(X_n)$, $E[X_\tau \mid \mathcal{F}_\sigma] = X_\sigma$.
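A sketch checking the statement by enumeration, assuming a fair $\pm 1$ walk with the bounded stopping times $\sigma \equiv 1$ (constant) and $\tau$ the first hit of $|S| = 2$, capped at the horizon (all illustrative choices, not from the text):

```python
from itertools import product

# Exact check of optional sampling for a fair +/-1 walk with bounded
# stopping times sigma = 1 (constant) and tau = first hit of |S| = 2,
# capped at 4: E[S_tau | F_sigma] should equal S_sigma.
omega = list(product([-1, 1], repeat=4))

def S(n, w):
    return sum(w[:n])

def tau(w):
    for n in range(1, 5):
        if abs(S(n, w)) == 2:
            return n
    return 4  # cap at the horizon so tau is bounded

for w in omega:
    block = [v for v in omega if v[0] == w[0]]   # an atom of F_1 = F_sigma
    cond = sum(S(tau(v), v) for v in block) / len(block)
    assert abs(cond - S(1, w)) < 1e-12           # E[S_tau | F_sigma] = S_sigma
```

Since $|S_1| = 1 < 2$, the ordering $\sigma \leq \tau$ holds on every path.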
Doob’s Inequalities and Martingale Convergence
The first inequality relates the probability of a submartingale surmounting a barrier over some time interval to the terminal expectation of the submartingale. The second inequality relates the expectation of the square of the maximum of a martingale over some time interval to the terminal expectation of the square of the martingale. These are technical results that can be used to simplify analysis in the proof of the martingale convergence theorem.
theorem: If $(X_n)$ is a non-negative submartingale, then for any time $n$ and positive value $\lambda$, $\lambda \, P\!\left(\max_{k \leq n} X_k \geq \lambda\right) \leq E[X_n]$.
nd: interpret $P$ as a measure on functions where appropriate.
theorem: if $(X_n)$ is a square-integrable non-negative submartingale then $E\!\left[\left(\max_{k \leq n} X_k\right)^2\right] \leq 4\, E[X_n^2]$. Furthermore, if $E[X_n^2]$ is uniformly bounded then $E\!\left[\left(\sup_n X_n\right)^2\right] \leq 4 \sup_n E[X_n^2]$.
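Both inequalities can be checked exactly on a small example: $|S_n|$, for $S_n$ a fair $\pm 1$ walk, is a non-negative square-integrable submartingale (by Jensen). A sketch by enumeration over a 4-step horizon (an illustrative assumption):

```python
from itertools import product

# Exact check of Doob's inequalities for the non-negative submartingale
# |S_n| (S a fair +/-1 walk, 4 steps):
#   lambda * P(max_{k<=n} |S_k| >= lambda) <= E|S_n|   (maximal inequality)
#   E[(max_{k<=n} |S_k|)^2] <= 4 E[S_n^2]              (L2 inequality)
omega = list(product([-1, 1], repeat=4))
p = 1 / len(omega)

def absS(n, w):
    return abs(sum(w[:n]))

n = 4
maxes = [max(absS(k, w) for k in range(n + 1)) for w in omega]
E_Xn = sum(absS(n, w) for w in omega) * p
E_Xn2 = sum(absS(n, w) ** 2 for w in omega) * p

for lam in (1, 2, 3, 4):
    prob = sum(p for m in maxes if m >= lam)
    assert lam * prob <= E_Xn + 1e-12               # maximal inequality

assert sum(m * m for m in maxes) * p <= 4 * E_Xn2   # L2 inequality
```

At $\lambda = 2$ the maximal inequality is tight on this example, which shows the barrier bound cannot be improved in general.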
The main result of the section is a martingale convergence theorem, which asserts that under appropriate technical conditions, the limit of a martingale is a well-defined random variable such that the martingale can be equivalently represented by the sequence of conditional expectations of that random variable.
theorem: let $(X_n)$ be an $L^2$-bounded martingale (uniformly bounded expectation of $X_n^2$). Then there is a random variable $X_\infty$ such that
- $X_n \to X_\infty$ almost surely and in $L^2$, and
- for all $n$, $X_n = E_n[X_\infty]$.
The limitation that stopping times be bounded can be relaxed to extend the optional sampling theorem.
theorem: Let $(X_n)$ be an $L^2$-bounded martingale and $X_\infty$ be a random variable such that $X_n = E_n[X_\infty]$. Then for any pair of stopping times $\sigma \leq \tau$ (with the convention $X_\tau = X_\infty$ on $\{\tau = \infty\}$), $E[X_\tau \mid \mathcal{F}_\sigma] = X_\sigma$.
Markov Process
A Markov process is a process whose next value depends only on the information determined from the current value.
definition: A random process $(X_n)$ is a Markov process on a space equipped with the filtration generated from $(X_n)$, if for any bounded Borel function $f$, $E_n[f(X_{n+1})] = E[f(X_{n+1}) \mid X_n]$.
nd: the information determined by $X_n$ is the information $\sigma(X_n)$.
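The defining property can be checked by enumeration: for a fair $\pm 1$ walk, conditioning $f(S_{n+1})$ on the full history should agree with conditioning on $S_n$ alone. A sketch with the test function $f(x) = x^2$ (the walk and $f$ are illustrative assumptions, not from the text):

```python
from itertools import product

# Exact check of the Markov property for a fair +/-1 walk: conditioning
# f(S_{n+1}) on the whole history (the path prefix) gives the same value
# as conditioning on S_n alone. f(x) = x^2 serves as a bounded test map
# on the (finite) range of the walk.
omega = list(product([-1, 1], repeat=3))

def S(n, w):
    return sum(w[:n])

f = lambda x: x * x

for n in range(3):
    for w in omega:
        hist = [v for v in omega if v[:n] == w[:n]]      # atom of F_n
        same = [v for v in omega if S(n, v) == S(n, w)]  # atom of sigma(S_n)
        e_hist = sum(f(S(n + 1, v)) for v in hist) / len(hist)
        e_same = sum(f(S(n + 1, v)) for v in same) / len(same)
        assert abs(e_hist - e_same) < 1e-12
```

Both conditionings reduce to $\tfrac{1}{2}(f(s+1) + f(s-1))$ where $s$ is the current value, which is why they agree.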
[to be completed]
