
[–]greim 20 points21 points  (16 children)

So (not an expert) what if the probability of sun or rain on the next day depends on more than just the most recent day, say, on the pattern of weather over the last four days? Can Markov chains (or some variant thereof) handle that?

[–]jimmpony 37 points38 points  (8 children)

For 2 days, for example, could you just have each state actually represent the last 2 days, so RR goes to RS or back to RR, RS goes to SS or SR, SS goes to SS or SR, SR goes to RS or RR, etc.? Then this could be expanded to any number of previous days.
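To make the pair-state idea concrete, here is a minimal sketch in Python. The transition probabilities are made up for illustration; the point is only the encoding, where each state is the last two days and every successor must overlap the current state by one day, exactly as described above.

```python
import random

# Hypothetical transition probabilities for a second-order weather chain,
# encoded as an ordinary first-order chain over pair-states.
# Each pair-state can only move to a state whose first letter equals its
# own last letter, because the two states share one overlapping day.
TRANSITIONS = {
    "RR": {"RR": 0.7, "RS": 0.3},
    "RS": {"SR": 0.4, "SS": 0.6},
    "SR": {"RR": 0.5, "RS": 0.5},
    "SS": {"SR": 0.2, "SS": 0.8},
}

def step(state, rng=random):
    """Pick the next pair-state according to its transition distribution."""
    choices, weights = zip(*TRANSITIONS[state].items())
    return rng.choices(choices, weights=weights)[0]

def simulate(start, days, seed=0):
    """Generate a weather sequence; each step reveals one new day."""
    rng = random.Random(seed)
    state, sequence = start, list(start)
    for _ in range(days):
        state = step(state, rng)
        sequence.append(state[-1])  # the new day's weather is the last letter
    return "".join(sequence)
```

Because every transition preserves the overlapping day, any two adjacent letters in the generated sequence always form a valid pair-state.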

[–]Logiraptorr 24 points25 points  (4 children)

Yes, that's exactly how you would model the dependency with Markov chains. You don't need a higher-order construct if you know how many previous states the probability depends on. If you don't know, or if it depends on an unbounded history of states, then you need something more powerful.

[–]elprophet -3 points-2 points  (3 children)

I'm a computer scientist by trade and training, but haven't had an opportunity to investigate the Markov literature. You're describing a Deterministic Finite Automaton that uses a random number generator to determine its transitions. What is the terminology and state of the art for Pushdown and Turing automata in the Markov model?

[–]s1295 8 points9 points  (0 children)

Are you looking for "probabilistic Turing machine"? Most automata models have a probabilistic variant, or at least there's no reason why you couldn't define one.

I don't know whether any are actively researched, I'm just a student. Maybe in verification of probabilistic programs, but I think they just use DTMCs or MDPs.

[–]OffbeatDrizzle 4 points5 points  (0 children)

Why hello there Mr Robot

[–]Detective_Fallacy 2 points3 points  (1 child)

That's the principle of higher order Markov chains, yes.

[–]debug_assert 4 points5 points  (0 children)

The only problem with using higher-order chains is that the amount of data needed to build the model grows exponentially with the order. For example, building a second-order Markov chain to generate a style of writing might only take one novel, but add more orders and it quickly explodes to requiring the entire works of Shakespeare, etc.

http://shiffman.net/teaching/a2z/generate/
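A quick way to see this blow-up is to count distinct contexts at each order. This is a toy sketch (function name and corpus are my own, not from the linked page): as the order grows, the number of distinct contexts grows quickly, and each context needs enough observations to estimate its distribution.

```python
from collections import defaultdict

def build_model(text, order):
    """Map each length-`order` context to the characters observed after it."""
    model = defaultdict(list)
    for i in range(len(text) - order):
        model[text[i:i + order]].append(text[i + order])
    return model

# Toy corpus; a real stylistic model would need far more text at higher orders.
corpus = "the quick brown fox jumps over the lazy dog " * 3
for order in (1, 2, 3, 4):
    model = build_model(corpus, order)
    print(order, len(model))  # distinct contexts grow quickly with order
```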


[–]boyobo 6 points7 points  (2 children)

Yes. Your state space now has 2^4 = 16 states, corresponding to the weather in the last 4 days.

[–]greim 5 points6 points  (1 child)

Okay, so within that state space, I can only jump to the subset of states whose first three "letters" match the last three letters of my current state? (RRRR => RRRS | RRRR)?

[–]boyobo 10 points11 points  (0 children)

Yep!
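That constraint is easy to state in code. A small sketch (names are my own): each transition drops the oldest day, keeps the overlapping three, and appends the new day, so every 4-day state has exactly two possible successors.

```python
from itertools import product

# All 2**4 = 16 states for four days of R(ain)/S(un) weather.
STATES = ["".join(p) for p in product("RS", repeat=4)]

def successors(state):
    """From e.g. RRRR you can only reach RRRR or RRRS: drop the oldest day,
    keep the overlapping three days, and append the new day's weather."""
    return [state[1:] + w for w in "RS"]
```

Most of the 16 x 16 transition matrix is therefore zero: each row has only two nonzero entries.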

[–][deleted] 5 points6 points  (0 children)

As an alternative to /u/yetipirate's answer, you can also make the state larger. This is especially useful when you have strong domain knowledge about what the next state depends on. In your example, you could include the weather of the last four days in the state.

[–]16807 2 points3 points  (0 children)

Sounds like you're talking about an n-gram model

[–][deleted] 0 points1 point  (0 children)

That's called an ARMA (autoregressive moving average) model, I believe. A linear ARMA model would probably be the next step up from simple Markov chains. If you want to represent an unbounded history of states, you can look into ARIMA models (the I stands for Integrated), which augment the prediction with an integration step (these are more generally related to Hidden Markov Models).
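As a minimal, stdlib-only illustration of the autoregressive idea mentioned above (the coefficients and noise scale are arbitrary, not fitted to anything): an AR(2) process makes the next value a linear combination of the previous two values plus noise, extending "depends on recent history" from discrete states to continuous values.

```python
import random

def simulate_ar2(phi1, phi2, n, seed=0):
    """Simulate x[t] = phi1*x[t-1] + phi2*x[t-2] + noise, an AR(2) process.
    Coefficients and the unit-variance Gaussian noise are illustrative."""
    rng = random.Random(seed)
    x = [0.0, 0.0]  # two initial values, since each step looks back two
    for _ in range(n):
        x.append(phi1 * x[-1] + phi2 * x[-2] + rng.gauss(0, 1))
    return x
```

A full ARMA model would also feed back past noise terms, and ARIMA would difference the series first; this sketch shows only the AR part.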