
Markov chains and Markov Process | BSc.CSIT | Simulation and Modeling | 5th Semester


Markov chains and Markov Process
Simulation and Modeling Reference Notes
Fifth Semester | Third year
BSc.CSIT | Tribhuvan University (TU)

Markov chains and Markov Process
Markov chains and Markov processes are important classes of stochastic processes. A Markov chain is a discrete-time process for which the future behavior, given the past and the present, depends only on the present and not on the past. A Markov process is the continuous-time version of a Markov chain. Many queuing models are in fact Markov processes. This chapter gives a short introduction to Markov chains and Markov processes, focusing on those characteristics that are needed for the modeling and analysis of queuing problems.

A Markov chain
A Markov chain, named after Andrey Markov, is a mathematical system that undergoes transitions from one state to another among a finite or countable number of possible states. It is a random process characterized as memoryless: the next state depends only on the current state and not on the sequence of events that preceded it. This specific kind of “memorylessness” is called the Markov property. Markov chains have many applications as statistical models of real-world processes.

Formally
A Markov chain is a sequence of random variables X1, X2, X3, … with the Markov property, namely that, given the present state, the future and past states are independent, i.e.,
\Pr(X_{n+1} = x \mid X_1 = x_1, X_2 = x_2, \ldots, X_n = x_n) = \Pr(X_{n+1} = x \mid X_n = x_n)
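
In simulation terms, the Markov property means the next state can be sampled from the one-step transition probabilities of the current state alone. The following is a minimal Python sketch of this idea (the function name simulate_chain and the example matrix are illustrative, not part of the original notes):

```python
import numpy as np

def simulate_chain(P, start, steps, seed=None):
    """Simulate a Markov chain with transition matrix P (each row sums to 1),
    starting from state index `start`, for `steps` transitions."""
    rng = np.random.default_rng(seed)
    P = np.asarray(P)
    states = [start]
    for _ in range(steps):
        # The next state is drawn from the row of P for the *current* state only;
        # the earlier history stored in `states` is never consulted.
        states.append(rng.choice(len(P), p=P[states[-1]]))
    return states

# Example: a two-state chain (states 0 and 1).
print(simulate_chain([[0.9, 0.1], [0.5, 0.5]], start=0, steps=10, seed=42))
```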

Example: A simple weather model
The probabilities of weather conditions (modeled as either rainy or sunny), given the weather on the preceding day, can be represented by a transition matrix:
P = \begin{bmatrix} 0.9 & 0.1 \\ 0.5 & 0.5 \end{bmatrix}

The matrix P represents the weather model in which a sunny day is 90% likely to be followed by another sunny day, and a rainy day is 50% likely to be followed by another rainy day. The columns can be labeled “sunny” and “rainy” respectively, and the rows can be labeled in the same order.

(P)_{ij} is the probability that, if a given day is of type i, it will be followed by a day of type j.

Notice that the rows of P sum to 1: this is the defining property of a stochastic matrix.
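
For reference, the matrix can be written down and its row sums checked numerically; a minimal sketch assuming NumPy is available (the variable name P mirrors the notation above):

```python
import numpy as np

# Transition matrix of the weather model:
# row = today's weather, column = tomorrow's weather, in the order (sunny, rainy).
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Every row of a stochastic matrix sums to 1.
print(P.sum(axis=1))                    # [1. 1.]
assert np.allclose(P.sum(axis=1), 1.0)
```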

The weather on day 0 is known to be sunny. This is represented by a vector in which the “sunny” entry is 100%, and the “rainy” entry is 0%:
x^{(0)} = \begin{bmatrix} 1 & 0 \end{bmatrix}

The weather on day 1 can be predicted by:
x^{(1)} = x^{(0)} P = \begin{bmatrix} 1 & 0 \end{bmatrix} \begin{bmatrix} 0.9 & 0.1 \\ 0.5 & 0.5 \end{bmatrix} = \begin{bmatrix} 0.9 & 0.1 \end{bmatrix}

The weather on day 2 can be predicted in the same way:
x^{(2)} = x^{(1)} P = x^{(0)} P^{2} = \begin{bmatrix} 0.9 & 0.1 \end{bmatrix} \begin{bmatrix} 0.9 & 0.1 \\ 0.5 & 0.5 \end{bmatrix} = \begin{bmatrix} 0.86 & 0.14 \end{bmatrix}

The general rule for day n is:
x^{(n)} = x^{(n-1)} P = x^{(0)} P^{n}
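
The calculations above can be reproduced directly; a minimal sketch assuming NumPy, with x0, x1, x2 as illustrative names for the state vectors x^{(0)}, x^{(1)}, x^{(2)}:

```python
import numpy as np

P = np.array([[0.9, 0.1],    # sunny -> (sunny, rainy)
              [0.5, 0.5]])   # rainy -> (sunny, rainy)

x0 = np.array([1.0, 0.0])    # day 0 is known to be sunny

x1 = x0 @ P                               # day 1: [0.9, 0.1]
x2 = x1 @ P                               # day 2: [0.86, 0.14]
xn = x0 @ np.linalg.matrix_power(P, 7)    # day 7, using x(n) = x(0) P^n

print(x1, x2, xn)
```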
