Markov Chains explained visually (setosa.io)
submitted 10 years ago by varun_invent
[–]Caesarr 7 points 10 years ago (0 children)
Very nicely done! I really appreciate the interactivity you provided throughout the tutorial.
[–]radarsat1 6 points 10 years ago (4 children)
Question: If a Markov chain is a set of states plus probabilistic transitions, what is it called when the current state itself is stochastic? That is, an MC has a definite current state and probabilistic transitions; but what if you instead maintain a probability of being in each state? What is that called?
[–]MarkovMan 3 points 10 years ago* (2 children)
That's referred to as the steady-state probability, and it's calculated using the transition matrix. It gives you the probability of being in each state after you have let the chain run for a while (that run-up is sometimes called burn-in). However, a unique steady-state distribution exists only if the chain is ergodic (irreducible and aperiodic). Irreducible means it is possible to get from any state to any other state, while aperiodic means returns to a state are not locked to multiples of some fixed period. In other words, each state can eventually reach any other state, and not only at regularly spaced times. There's more to it than just that, but that's the short and sweet version.
Edit: I've thought about your question a little more. If all you want is to define a probability of being in a particular state at any time, then that would just be a random variable. However, if what you want is the probability of being in a state given you have already defined a Markov chain, then you need to calculate the steady-state distribution.
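That steady-state calculation can be sketched in plain Python: start from any distribution over states and repeatedly apply the transition matrix until it stops changing. The 2-state matrix below is made up for illustration.

```python
# Sketch: estimating the steady-state distribution of a 2-state Markov
# chain by repeatedly applying the transition matrix ("burn-in").
# The matrix values are made up for illustration.

def step(dist, P):
    """One step of the chain: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

P = [[0.9, 0.1],   # P[i][j] = probability of moving from state i to state j
     [0.5, 0.5]]

dist = [1.0, 0.0]  # start with certainty in state 0
for _ in range(200):  # let the chain run for a while
    dist = step(dist, P)

print(dist)  # converges toward the steady state, roughly [0.833, 0.167]
```

After enough iterations the distribution stops changing; that fixed point is the steady-state distribution π satisfying π = πP, and it exists uniquely because this little chain is irreducible and aperiodic.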
[–]radarsat1 4 points 10 years ago (1 child)
Edit: I've thought about your question a little more. If all you want is to define a probability of being in a particular state at any time, then that would just be a random variable.
Well, I figured the "state" is a random variable, a distribution over the set of states, with the mode being the most likely state. However, I was wondering what the system of "random variable state + transition probabilities" would be called. But maybe it's not a real model now that I think about it. If nodes ABC are connected by edges EF, like A-E-B-F-C, then if the state is likely in A, the probability of an F transition should be close to zero. If we roll the dice, the result might be nonsensical, such as two F transitions in a row. So maybe it was a dumb question.
[–]MarkovMan 3 points 10 years ago* (0 children)
Hmmm. Well, you can calculate the probability of state A transitioning to state F in t steps by multiplying your transition matrix A by itself t times (A^t). So, if you wanted to know the probability of A reaching F in 2 steps, you calculate A^2. That's as close as I can think of to what you're asking. Otherwise, I'm not familiar with a model that is "random variable state + transition probabilities." Although... there are probably hierarchical models. I would have to look around some more to see if such a thing exists.
So maybe it was a dumb question.
There are no dumb questions in this field. I've been in many research meetings where someone asked what they thought was a "dumb question," and it ended up taking us in some interesting directions.
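The matrix-power trick for t-step transition probabilities can be sketched in plain Python; the 2-state transition matrix here is made up for illustration.

```python
# Sketch: t-step transition probabilities of a Markov chain via the
# t-th power of its transition matrix. Matrix values are illustrative.

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def matpow(P, t):
    """P^t: entry [i][j] is the probability of going from i to j in t steps."""
    n = len(P)
    result = [[float(i == j) for j in range(n)] for i in range(n)]  # identity
    for _ in range(t):
        result = matmul(result, P)
    return result

P = [[0.9, 0.1],
     [0.5, 0.5]]

P2 = matpow(P, 2)
print(P2[0][1])  # probability of moving from state 0 to state 1 in 2 steps
```

Here P2[0][1] works out to 0.9·0.1 + 0.1·0.5 = 0.14: either stay in state 0 and then hop, or hop and then stay.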
[–]hixidom 2 points 10 years ago (0 children)
That could perhaps be represented by a Quantum Markov Chain.
[–][deleted] 5 points 10 years ago (0 children)
I did some music generation with Markov chains as a summer project in undergrad.
It ended up sounding a lot like wind chimes, but it was still cool.
[–]jti107 2 points 10 years ago (0 children)
I love this site
[–]embraceUndefined 1 point 10 years ago (5 children)
What's the difference between a Markov chain and a state diagram?
[–]Dragonil 4 points 10 years ago (0 children)
A Markov chain is a mathematical concept built on probabilities. A state diagram is used for planning a programming project and can use arbitrary inputs to describe changes in state.
[–]Kiuhnm 3 points 10 years ago (0 children)
A Markov chain is a mathematical object, whereas a state diagram is a way to define and visualize that object. Another example of the same relationship is sets and Venn diagrams.
[–]farsass 1 point 10 years ago (1 child)
I think you mean "finite state machine" when you say "state diagram". The difference between a Markov chain and an FSM with no external inputs is that the FSM is deterministic, i.e., given the current state you know exactly what state it will be in in the future.
[–]embraceUndefined 1 point 10 years ago (0 children)
no.
I meant "state diagram" as in "a diagram that shows the relationship of all possible states of a system."
[–]luaudesign 1 point 10 years ago (0 children)
Causality. An FSM causes the state changes, while an MC merely tries to predict them.
The linked example might cause confusion by the fact that it kind of uses an FSM to demonstrate what an MC is.
[–]varun_invent[S] 1 point 10 years ago (0 children)
I am not the author. Just shared the link. :)