A random process (Markov process) in which the probabilities of discrete states in a series depend only on the properties of the immediately preceding state (or the next preceding state), independent of the path by which that state was reached. It differs from the more general Markov process in that the states of a Markov chain are discrete rather than continuous. Certain physical processes, such as the diffusion of a molecule in a fluid, are modelled as Markov chains. See also random walk.

A finite state machine with probabilities attached to each transition, that is, a probability that the next state is s_j given that the current state is s_i.
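This transition-probability view can be sketched directly: a row-stochastic matrix gives, for each current state, the distribution over next states, and the chain is simulated by repeatedly sampling from the current row. A minimal Python sketch, in which the state names and matrix values are invented for illustration:

```python
import random

# Hypothetical two-state weather chain. P[i][j] is the probability of
# moving from state i to state j; each row sums to 1.
STATES = ["sunny", "rainy"]
P = [[0.8, 0.2],   # transitions out of "sunny"
     [0.4, 0.6]]   # transitions out of "rainy"

def simulate(start, steps, rng=random.random):
    """Walk the chain for `steps` transitions, returning the visited states."""
    i = STATES.index(start)
    path = [start]
    for _ in range(steps):
        r, cum = rng(), 0.0
        for j, p in enumerate(P[i]):
            cum += p
            if r < cum:        # sample the next state from row i
                i = j
                break
        path.append(STATES[i])
    return path
```

Only the current row of P is ever consulted, which is exactly the "depends only on the current state" property the definitions above describe.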

Any multivariate probability density whose independence diagram is a chain. The variables are ordered, and each variable "depends" only on its neighbors in the sense of being conditionally independent of the others. Markov chains are an integral component of hidden Markov models.

A sequence of random variables in which the distribution of each random variable depends only on the value of its predecessor.

a Markov process for which the time parameter takes discrete values

a class of stochastic process in which events are transitions between states, and the probability of a transition between two states is determined solely by those two states

a collection of states where the probability of moving from one state to another depends only upon the state that the system is currently residing in, independent of how the system got into that current state

a discrete stochastic process with discrete states and discrete transformations between them

a discrete time random process

a finite set of states and for each pair of states x and y a transition probability m(x,y)

a mathematical model for describing a certain type of stochastic process that moves in a sequence of phases through a set of states

a mathematical model that can be thought of as being in exactly one of a number of states at any time

a probability matrix in which the state of some system is sequentially followed by other states with a given probability

a sequence of possible states, or random variables that take on certain discrete values

a sequence of random values whose probabilities are linked in a very special way

a sequence of random values whose probability at each time interval depends upon the value at the preceding time interval

a sequence of random variables in which the distribution of each element depends only on the value of the previous one

a sequence of random variables in which the future variable is determined by the present variable but is independent of the way in which the present state arose from its predecessors

a sequence of symbols (words in this case) generated according to a table of probabilities
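The word-generation reading above has a standard concrete form: count which word follows which in a corpus, then generate new text by repeatedly sampling a follower of the current word. A small sketch, assuming a bigram (order-1) table; the corpus and function names are illustrative:

```python
import random
from collections import defaultdict

def train(words):
    """Build a table mapping each word to the list of words observed after it."""
    table = defaultdict(list)
    for cur, nxt in zip(words, words[1:]):
        table[cur].append(nxt)
    return table

def generate(table, start, max_len, rng=random.choice):
    """Emit up to `max_len` words, each sampled from the followers of the last."""
    out = [start]
    while len(out) < max_len:
        followers = table.get(out[-1])
        if not followers:   # dead end: the last word never had a successor
            break
        out.append(rng(followers))
    return out
```

Storing followers as a list (with repeats) makes uniform sampling from the list equivalent to sampling with the empirical transition probabilities.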

a special case of a Markov process, which itself is a special case of a random or stochastic process

a special form of probabilistic finite state machine, in which each node is an element in a sequence, and the arcs are labelled with transition probabilities

a special type of stochastic process

a stochastic process that can be used to predict the probability of a system being in a state at a specific time and the time taken until a state is first reached

a system which has a set S of states and changes randomly between these states in a sequence of discrete steps

a thoroughly studied mathematical model for which a standard set of statistics exists

Technique used to estimate the transition probabilities between various ranges or categories in a time series; for example, the probability that a cyclone moving in the speed range 5-10 m s^-1 will accelerate to 15-20 m s^-1 in the next 12 hours (see Leslie et al., 1992).
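The estimation step described above amounts to counting observed category-to-category moves and normalising by how often each category occurred as the current state. A minimal sketch (category labels are placeholders, not the categories from the cited study):

```python
from collections import Counter

def transition_probabilities(series):
    """Estimate P(next | current) from a sequence of discrete categories."""
    pair_counts = Counter(zip(series, series[1:]))   # observed (current, next) moves
    state_counts = Counter(series[:-1])              # times each state was "current"
    return {(a, b): n / state_counts[a] for (a, b), n in pair_counts.items()}
```

Each row of the resulting (sparse) table sums to 1 over the next states actually observed from that category.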

A method for calculating the probability of an event based on the previous event in a sequence without knowing the state that is generating the sequence of events.

A set of processes where the probabilities for the next state are dependent on the present state.

stochastic process with a finite number of states in which the probability of occurrence of a future state is conditional only upon the current state; past states are inconsequential. In meteorology, Markov chains have been used to describe a raindrop size distribution in which the state at time step t + 1 is determined only by collisions between pairs of drops comprising the size distribution at time step t.

In mathematics, a Markov chain, named after Andrey Markov, is a discrete-time stochastic process with the Markov property.