Generate data using Markov Chain models in Python

Amir Ali Hashemi
Nov 18, 2023

In this short blog post, you will see one way of producing data from a discrete-time Markov chain in Python. I will not discuss what Markov chains are, so if you are unfamiliar with them, please take a quick look on the internet first. The only library used in the code is NumPy.

Now, consider the following Markov chain {X_t, t = 0, 1, 2, ...} with the transition probability matrix:

P = [[0.2, 0.3, 0.5],
     [0.0, 0.3, 0.7],
     [0.5, 0.4, 0.1]]

on the state space {0, 1, 2}, with the transition diagram depicted below.
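Each row of P holds the probabilities of leaving one state, so every entry must be non-negative and every row must sum to 1. Here is a minimal sanity check with NumPy, using the same array P that appears in the snippet further down:

import numpy as np

P = np.array([[0.2, 0.3, 0.5], [0, 0.3, 0.7], [0.5, 0.4, 0.1]])

# all transition probabilities must be non-negative
assert np.all(P >= 0)
# each row must sum to 1, i.e. P must be a stochastic matrix
assert np.allclose(P.sum(axis=1), 1.0)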

The following Python code synthesizes data by simulating this Markov chain.

import numpy as np

# initialize the Markov chain transition matrix
P = np.array([[0.2, 0.3, 0.5], [0, 0.3, 0.7], [0.5, 0.4, 0.1]])
# initialize the states
states = [0, 1, 2]
# select a random state to begin with
state_now = np.random.choice(states)
# initialize the previous state
state_prev = None
# number of steps to walk through the Markov chain diagram
n_steps = 10

for i in range(n_steps):
    # get the vector of probabilities to go from the current state to all states
    prob = P[state_now]
    # select the next state based on the probability vector above
    next_state = np.random.choice(states, p=prob)
    # update the previous state and the current state
    state_prev = state_now
    state_now = next_state
    # print the transition
    print(state_prev, '->', state_now)
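
As a quick sanity check, you can run a much longer simulation, count the observed transitions, and compare the empirical frequencies with P. Below is a rough sketch along the same lines as the code above (the step count of 100,000 is an arbitrary choice):

import numpy as np

P = np.array([[0.2, 0.3, 0.5], [0, 0.3, 0.7], [0.5, 0.4, 0.1]])
states = [0, 1, 2]

n_steps = 100_000
counts = np.zeros((3, 3))
state_now = np.random.choice(states)

for _ in range(n_steps):
    # draw the next state and record the observed transition
    next_state = np.random.choice(states, p=P[state_now])
    counts[state_now, next_state] += 1
    state_now = next_state

# normalize each row of the counts to get the empirical transition matrix
empirical_P = counts / counts.sum(axis=1, keepdims=True)
print(np.round(empirical_P, 2))  # should be close to P

With this many steps, the estimated probabilities usually agree with P to roughly two decimal places.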

