
Markov chain definition

As we said, a Markov chain is a mathematical model of a random phenomenon that evolves over time in such a way that the past influences the future only through the present. In other words, it is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. This is why Markov chains are said to have the memorylessness property.

Let's consider a random process described by a sequence of random variables, X = X_0, ..., X_n, which can take values in a set of states {j_0, j_1, ..., j_n}. We say that it has the Markov property if the evolution of the process depends on the past only through the present, that is, through the state reached after n steps. This can be defined as follows:

P(X_{n+1} = j | X_0 = j_0, X_1 = j_1, ..., X_{n-1} = j_{n-1}, X_n = i) = P(X_{n+1} = j | X_n = i)

This relation must hold for every n and for every choice of states for which the conditional probabilities are well defined. A discrete-time stochastic process X that has the Markov property is called a Markov chain. A Markov chain is said to be homogeneous if the following transition probabilities do not depend on n, but only on i and j:

p_ij = P(X_{n+1} = j | X_n = i), for all n

When this happens, the Markov property can be rewritten in terms of these time-independent quantities:

P(X_{n+1} = j | X_0 = j_0, ..., X_{n-1} = j_{n-1}, X_n = i) = p_ij
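
To make the idea concrete, the following Python sketch simulates a small homogeneous chain (the state labels, the transition matrix P, and the seed are purely hypothetical, chosen only for illustration). The entry P[i, j] plays the role of p_ij, and the next state is drawn using only the row of the current state, which is exactly the memorylessness property:

import numpy as np

# Hypothetical three-state homogeneous Markov chain: P[i, j] is the
# transition probability p_ij of moving from state i to state j in one
# time step, so each row sums to 1.
states = ["sunny", "cloudy", "rainy"]   # illustrative labels only
P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.5, 0.3],
])

rng = np.random.default_rng(seed=0)

def simulate(initial_state, n_steps):
    """Sample a trajectory; each step depends only on the current state."""
    path = [initial_state]
    for _ in range(n_steps):
        current = path[-1]
        next_state = rng.choice(len(states), p=P[current])
        path.append(int(next_state))
    return path

trajectory = simulate(initial_state=0, n_steps=10)
print([states[s] for s in trajectory])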

Given this, we can calculate all the joint probabilities of the process by knowing the numbers p_ij, together with the following initial distribution:

π_i = P(X_0 = i)

This probability is called the distribution of the process at time zero. The p_ij values are called transition probabilities; to be specific, p_ij is the probability of a transition from i to j in one time step.
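
Because each step depends only on the current state, the joint probability of an entire path factorizes into the initial distribution times a product of one-step transition probabilities: P(X_0 = j_0, X_1 = j_1, ..., X_n = j_n) = π_{j_0} p_{j_0 j_1} p_{j_1 j_2} ... p_{j_{n-1} j_n}. As a minimal sketch, reusing the hypothetical transition matrix from the previous example together with an arbitrary initial distribution pi, such a path probability could be computed like this:

import numpy as np

# Same hypothetical transition matrix as above; pi is an arbitrary
# distribution of the process at time zero (its entries sum to 1).
P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.5, 0.3],
])
pi = np.array([0.5, 0.3, 0.2])

def path_probability(path):
    # pi[j0] * p[j0, j1] * p[j1, j2] * ... * p[j_{n-1}, j_n]
    prob = pi[path[0]]
    for i, j in zip(path[:-1], path[1:]):
        prob *= P[i, j]
    return prob

# Probability of the hypothetical path 0 -> 0 -> 1 -> 2:
# 0.5 * 0.7 * 0.2 * 0.3 = 0.021
print(path_probability([0, 0, 1, 2]))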
