The Way of the great learning involves manifesting virtue, renovating the people, and abiding by the highest good.

Friday, December 26, 2008

Markov process

A Markov chain, named after Andrey Markov (A. A. Markov, 1856-1922), is a discrete-time stochastic process in mathematics that has the Markov property: given the present knowledge or information, the past (the history of states before the current time) is irrelevant for predicting the future (the states after the current time).

A Markov chain is a sequence of random variables X_1, X_2, X_3, .... The range of these variables, that is, the set of all their possible values, is called the "state space", and the value of X_n is the state at time n. If the conditional probability distribution of X_{n+1} given the past states is a function of X_n alone, then

P(X_{n+1} = x | X_0, X_1, X_2, \ldots, X_n) = P(X_{n+1} = x | X_n),

where x is some state of the process. This identity can be regarded as the Markov property.

Markov first produced this class of processes in 1906; the generalization to countably infinite state spaces was given by Kolmogorov in 1936. Markov chains are connected with Brownian motion and the ergodic hypothesis, two important topics of early twentieth-century physics, but Markov seems to have been driven by more than mathematical motives alone: nominally, his aim was to extend the law of large numbers to dependent events.

Markov process
From Wikipedia, the free encyclopedia
A Markov process, named after the Russian mathematician Andrey Markov, is a mathematical model for the random evolution of a memoryless system, that is, one for which the likelihood of a given future state, at any given moment, depends only on its present state, and not on any past states.
In a common description, a stochastic process with the Markov property, or memorylessness, is one for which conditional on the present state of the system, its future and past are independent.
Often, the term Markov chain is used to mean a discrete-time Markov process. Also see continuous-time Markov process.
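The discrete-time case can be sketched in code: the next state is drawn from a distribution that depends only on the current state. The two-state "weather" chain and its transition probabilities below are illustrative assumptions, not taken from the article.

```python
import random

# Illustrative two-state chain; each row of P is a conditional
# distribution over the next state given the current one.
STATES = ["sunny", "rainy"]
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state):
    """Draw the next state using only the current state (memorylessness)."""
    r = random.random()
    total = 0.0
    for nxt, prob in P[state].items():
        total += prob
        if r < total:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n):
    """Generate a trajectory X_0, X_1, ..., X_{n-1} of length n."""
    x = start
    path = [x]
    for _ in range(n - 1):
        x = step(x)
        path.append(x)
    return path
```

Because `step` looks only at its argument and never at earlier states, any process generated this way has the Markov property by construction.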

Formal definition
A stochastic process whose state at time t is X(t), for t > 0, and whose history of states is given by x(s) for times s < t is a Markov process if
P[X(t+h) = y | X(s) = x(s), \forall s \leq t] = P[X(t+h) = y | X(t) = x(t)], \forall h > 0.
That is, the probability of its having state y at time t+h, conditioned on having the particular state x(t) at time t, is equal to the conditional probability of its having that same state y but conditioned on its value for all previous times before t. This captures the idea that its future state is independent of its past states.
Markov processes are typically termed (time-) homogeneous if
P[X(t+h) = y | X(t) = x] = P[X(h) = y | X(0) = x], \forall h > 0,
and otherwise are termed (time-) inhomogeneous (or (time-) nonhomogeneous). Homogeneous Markov processes, usually being simpler than inhomogeneous ones, form the most important class of Markov processes.
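For a time-homogeneous chain on a finite state space, this means the one-step transition probabilities form a single matrix P that does not depend on t, and the n-step transition probabilities are simply matrix powers of P. A minimal sketch (the matrix values are illustrative assumptions):

```python
def matmul(a, b):
    """Multiply two square matrices given as lists of rows."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def n_step(p, n):
    """Return the n-step transition matrix P^n of a homogeneous chain."""
    size = len(p)
    result = [[float(i == j) for j in range(size)] for i in range(size)]  # identity
    for _ in range(n):
        result = matmul(result, p)
    return result

# Illustrative transition matrix; each row is a conditional
# distribution and sums to 1.
P = [[0.9, 0.1],
     [0.5, 0.5]]
```

For an inhomogeneous chain this collapse is unavailable: a different matrix P(t) would be needed at each step.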

Markovian representations
In some cases, apparently non-Markovian processes may still have Markovian representations, constructed by expanding the concept of the 'current' and 'future' states. For example, let X be a non-Markovian process. Then define a process Y, such that each state of Y represents a time-interval of states of X, i.e. mathematically,
Y(t) = {X(s) : s \in [a(t), b(t)]}.
If Y has the Markov property, then it is a Markovian representation of X. In this case, X is also called a second-order Markov process. Higher-order Markov processes are defined analogously.
An example of a non-Markovian process with a Markovian representation is a moving average time series.
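The state-expansion trick described above can be sketched concretely. Below, `step_second_order` is a hypothetical rule (an assumption for illustration) in which the next value depends on the two most recent values, so the process X alone is not Markov; tracking the pair Y_n = (X_{n-1}, X_n) as a single state restores the Markov property.

```python
import random

def step_second_order(x_prev, x_curr):
    """Hypothetical second-order rule: the next value depends on the
    TWO most recent values, so X on its own is not Markov."""
    return (x_prev + x_curr + random.choice([0, 1])) % 10

def lift_step(y):
    """One step of Y: the next pair depends only on the current pair,
    so Y is a (first-order) Markov process."""
    x_prev, x_curr = y
    return (x_curr, step_second_order(x_prev, x_curr))

def simulate_lifted(y0, n):
    """Run n steps of the lifted chain Y starting from the pair y0."""
    path = [y0]
    for _ in range(n):
        path.append(lift_step(path[-1]))
    return path
```

Consecutive Y-states overlap by one X-value, which is exactly how the "time-interval of states of X" becomes a single state of Y.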

See also
Examples of Markov chains
Memorylessness
Semi-Markov process
Markov chain
Markov decision process
Dynamics of Markovian particles
Conditional Probability

Retrieved from "http://en.wikipedia.org/wiki/Markov_process"
