What Defines a Step in a Markov System or Process?

This information is helpful for homework assignments and can be found at the linked pages. In summary, one step in a Markov system or process corresponds to one day (or one unit of time), and the number of days or units to analyze is chosen by the programmer or user.
  • #1
madmax2006
It's actually just a general question about the MS/P (Markov System and/or Process; I'm not sure what to call it).

In the MS/P, is 1 "step" the same as 1 day (or unit), or is the number of days (or units) decided by the programmer/user/instructor, etc.?

I know this isn't "homework", but it's going to help me with my homework.
In case you don't remember it off the top of your head (link):

http://www.me.utexas.edu/~jensen/ORMM/models/unit/markchain/subunits/example/index.html

Page 2 link is at the bottom right of the page or:

http://www.me.utexas.edu/~jensen/ORMM/models/unit/markchain/subunits/example/analysis.html

Under "Transient Analysis"

It says "After one day the state probabilities are as shown in the row labeled 1. The display shows the probabilities for 20 days or steps."

I'm assuming 1 step = 1 day = 1 unit ?

Thanks in advance
 
  • #2
Yes, in the MS/P one step is equal to one day (or unit). The number of days (or units) is usually decided by the programmer or user, depending on the application.
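
If a worked example is useful, here is a minimal sketch (Python with NumPy, using a made-up 2-state transition matrix rather than the one from the ORMM page) of how the transient-analysis rows labeled 1, 2, ... are produced: each step is one multiplication of the state-probability row vector by the transition matrix, and the programmer chooses how many steps (days/units) to run.

```python
import numpy as np

# Hypothetical 2-state transition matrix (NOT the one from the ORMM example);
# P[i][j] = probability of moving from state i to state j in one step.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Initial state probabilities: start in state 0 with certainty.
pi = np.array([1.0, 0.0])

# One "step" is one application of P, i.e. one day/unit of the model.
for day in range(1, 21):          # 20 steps, matching the 20-day display
    pi = pi @ P                   # pi_{n+1} = pi_n * P
    print(f"day {day}: {pi}")
```

Running it prints one row of state probabilities per step, which is exactly what the "day/step" rows in the transient analysis table show.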
 

FAQ: What Defines a Step in a Markov System or Process?

What is a Markov system?

A Markov system is a mathematical model that describes a system that transitions between different states over time. The probability of transitioning from one state to another is dependent only on the current state and not on any previous states.

What is a Markov process?

A Markov process is a stochastic process that follows the Markov property, where the probability of transitioning to a new state is only dependent on the current state and not on any previous states. It is often used to model random events or systems that evolve over time.
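
If it helps to see the Markov property in code, the following is a minimal sketch (Python, with a made-up three-state weather chain): each new state is drawn using only the current state, and no history is kept.

```python
import random

# A hypothetical 3-state chain for illustration; transitions[s] lists the
# possible next states and their probabilities given the current state s.
transitions = {
    "sunny":  [("sunny", 0.7), ("cloudy", 0.2), ("rainy", 0.1)],
    "cloudy": [("sunny", 0.3), ("cloudy", 0.4), ("rainy", 0.3)],
    "rainy":  [("sunny", 0.2), ("cloudy", 0.4), ("rainy", 0.4)],
}

def next_state(current):
    # The next state depends only on `current` -- the Markov property.
    states, probs = zip(*transitions[current])
    return random.choices(states, weights=probs)[0]

state = "sunny"
for step in range(10):
    state = next_state(state)
    print(step + 1, state)
```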

What are the applications of Markov systems and processes?

Markov systems and processes have many applications in various fields such as finance, economics, biology, and physics. They are commonly used for predicting stock market trends, analyzing DNA sequences, and modeling chemical reactions.

What are the limitations of Markov systems and processes?

Markov systems and processes assume that the future state depends only on the current state, which may not hold in real-world scenarios. Estimating the transition probabilities can also require a large amount of data, and the models may not be suitable for systems with complex or nonlinear behavior.

What is the difference between a discrete-time and a continuous-time Markov process?

A discrete-time Markov process transitions between states only at fixed time steps (for example, once per day), while a continuous-time Markov process can transition at any point in time, typically after a randomly distributed holding time in each state. The difference concerns how time advances, not the state space: both variants are commonly applied to systems with a finite number of states.
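
For illustration, here is a hedged sketch in Python of that timing difference; the transition structure P, the rates, and the number of steps are made-up placeholders rather than anything from the linked example.

```python
import random

# Discrete-time: the clock advances in fixed steps (e.g. one per day).
def discrete_time_path(P, state, n_steps):
    path = [(0, state)]
    for t in range(1, n_steps + 1):
        states, probs = zip(*P[state])
        state = random.choices(states, weights=probs)[0]
        path.append((t, state))
    return path

# Continuous-time: the chain stays in a state for an exponentially
# distributed holding time (rate = rates[state]) before jumping.
def continuous_time_path(P, rates, state, t_end):
    t, path = 0.0, [(0.0, state)]
    while t < t_end:
        t += random.expovariate(rates[state])   # random holding time
        states, probs = zip(*P[state])
        state = random.choices(states, weights=probs)[0]
        path.append((t, state))
    return path
```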
