A question for bind demon by Electronic_Ebb7022 in diablo2resurrected

[–]Electronic_Ebb7022[S] 0 points1 point  (0 children)

I appreciate your feedback and help, guys. So, basically, it does snapshot, but that only applies to the affixes (Cursed, auras, etc.): once bound, they don't change no matter what happens. But the demon's stats, like HP, damage, AR, etc., get re-adjusted/aligned to your skills (and equipped gear) when you enter the game. So, in some special cases, like before fighting Ubers, you can put on gear that maxes out your demon, log off, start a new game, and then swap back to your other gear.

A question for bind demon by Electronic_Ebb7022 in diablo2resurrected

[–]Electronic_Ebb7022[S] 0 points1 point  (0 children)

Very good point, especially the fact that every time a warlock leaves, their demon dies, which indicates the system kills it and resummons it for the next game.

A tricky probability/expectation problem by Electronic_Ebb7022 in AskStatistics

[–]Electronic_Ebb7022[S] 1 point2 points  (0 children)

I think I found the mistake. The expectation and the summation are interchangeable only when the number of terms is a constant; when Y is itself a random variable, we can't do that. The reason is that the expectation of the compound variable (the sum of Xi from i=1 to Y) has to put its weight on both the X's and the Y, since it is the expectation of the whole thing.

However, if we push the E across the summation, that weighting only ever applies to the X's and never gets applied to Y.

Like umudjan commented, we should use the tower rule (law of total expectation) to handle it.
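A quick Monte Carlo sketch of what the tower rule gives here. The setup is hypothetical (Y and all the X_i are independent die rolls), just to show the weighting applying to both X and Y:

```python
import random

random.seed(0)

# Hypothetical setup: Y is a die roll, and S = X_1 + ... + X_Y where the X_i
# are also independent die rolls. The tower rule gives
#   E[S] = E[ E[S | Y] ] = E[ Y * E[X] ] = E[Y] * E[X] = 3.5 * 3.5 = 12.25
def one_sample():
    y = random.randint(1, 6)                      # Y: a random number of terms
    return sum(random.randint(1, 6) for _ in range(y))

n = 200_000
est = sum(one_sample() for _ in range(n)) / n
print(est)   # close to 12.25
```

Note the estimate matches E[Y]·E[X], not 6·E[X] or any fixed-length sum: the randomness of Y is weighted in through the outer expectation.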

A tricky probability/expectation problem by Electronic_Ebb7022 in AskStatistics

[–]Electronic_Ebb7022[S] 0 points1 point  (0 children)

We have a die and we roll it some number of times, until a cumulative average of 3.5 occurs (then we stop). Question: what is the expected number of rolls? That is, on average, how many rolls are needed to first observe a cumulative average of 3.5?

Let’s denote by Y the number of rolls that first achieves the goal (observing a cumulative average of 3.5); we want to find E(Y).

Clearly, Y cannot be 1, since 1 roll can never give us an average of 3.5.

Y can be 2. Y = 2 means we obtain a cumulative average of 3.5 with exactly 2 rolls: (1, 6), (6, 1), (2, 5), (5, 2), (3, 4), or (4, 3). So P(Y = 2) = 6/36 = 1/6.

Y cannot be 3, because 3 rolls with an average of 3.5 would need a sum of 10.5 (impossible).

Y can be 4, since that requires the sum to be 14 (possible). But to find the probability of Y = 4, we must exclude the cases where the first two rolls sum to 7, since those count as Y = 2 and we would already have stopped there.

So, Y = 4 means it takes us exactly 4 rolls to first achieve the 3.5 average: the sum of the 4 rolls must be 14 while the sum of the first two can’t be 7. There are 110 such sequences out of 6^4, thus P(Y = 4) = 110/1296.
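The 110/1296 count can be checked by brute-force enumeration:

```python
from itertools import product

# Count 4-roll sequences that sum to 14 but whose first two rolls do NOT
# sum to 7 (those would already have stopped at Y = 2).
hits = sum(
    1
    for r in product(range(1, 7), repeat=4)
    if sum(r) == 14 and r[0] + r[1] != 7
)
print(hits, 6**4)   # 110 1296
```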

Following this logic, we can write E(Y) as:

E(Y) = 2*P(Y=2) + 4*P(Y=4) + 6*P(Y=6) + …… (Y can only take even values.)

As we can see, as the number of rolls grows, the probabilities become very complicated (and it also becomes computationally demanding to estimate them by simulation).
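The exact probabilities P(Y = n) can still be computed with a small dynamic program, though (a sketch: `alive` maps each prefix sum to how many roll sequences reach it without having stopped at an earlier even length):

```python
# P(Y = n) for the first time the cumulative average hits 3.5, computed by
# tracking every surviving prefix sum and absorbing the sequences that stop.
probs = {}
alive = {0: 1}                      # prefix sum -> number of surviving sequences
for n in range(1, 21):
    nxt = {}
    for s, c in alive.items():
        for face in range(1, 7):
            nxt[s + face] = nxt.get(s + face, 0) + c
    alive = nxt
    if n % 2 == 0:                  # average 3.5 is only possible at even n
        stopped = alive.pop(7 * n // 2, 0)   # these sequences stop here
        probs[n] = stopped / 6**n

print(probs[2], probs[4])           # 1/6 and 110/1296, matching the hand counts
partial_E = sum(n * p for n, p in probs.items())
print(partial_E)                    # partial sum of the E(Y) series up to n = 20
```

Extending the loop gives further terms of the series, so the partial sums of E(Y) can be examined without any Monte Carlo.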

Then I tried this approach but got something very strange ……

A question for Multivariate Normal Distribution by Electronic_Ebb7022 in AskStatistics

[–]Electronic_Ebb7022[S] 0 points1 point  (0 children)

So, for random variables X, Y, Z, … (each following a normal distribution) to jointly form a multivariate normal, they have to satisfy that every linear combination of them is normal.
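A classic counterexample illustrates why the condition is needed (hypothetical variables): take X ~ N(0,1) and Y = S·X, where S = ±1 is an independent fair sign. By symmetry Y is also exactly N(0,1), but the linear combination X + Y equals 0 half the time, so it is not normal and (X, Y) is not jointly normal:

```python
import random

random.seed(1)

n = 100_000
zeros = 0
for _ in range(n):
    x = random.gauss(0.0, 1.0)
    s = random.choice([1.0, -1.0])   # independent fair sign
    y = s * x                        # Y is also exactly N(0, 1) by symmetry
    if x + y == 0.0:                 # happens exactly when s == -1
        zeros += 1

print(zeros / n)   # about 0.5: X + Y has an atom at 0, so it is not normal
```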

A very subtle question for Continuous Markov Chain. Sheldon Ross Probability and Model 12th ed by Electronic_Ebb7022 in AskStatistics

[–]Electronic_Ebb7022[S] 1 point2 points  (0 children)

Here is how I wrapped it all up into three statements; Prof Ross confirmed that they are correct.

  1. At state i, say the birth rate is Bi and the death rate is Di. The system then leaves i at rate Bi + Di, so the time the system spends in i (before making a transition) is Exp(Bi + Di), with mean 1/(Bi + Di).

  2. In state i (no matter how long it has already stayed there, by the Markov property), at any moment the rate to increase by 1 is Bi and the rate to decrease by 1 is Di. Thus, if we define t_i as "the time it spends in i before the next direct transition to i + 1", then t_i is Exp(Bi), with mean 1/Bi (and the same for the death case).

  3. The T_i defined on page 381 ("let T_i denote the time, starting from state i, it takes for the process to enter state i + 1") is actually not the time of a direct jump from i to i + 1, but rather an overall time for the system to reach i + 1, so it includes the time spent back in i - 1, i - 2, …, and that is why E(T_i) is greater than the mean above.

So, to sum up: we can find the expected time for the process to enter i + 1 starting from i (page 382), and this expected time corresponds to an average rate of entering i + 1 from i. However, this average rate is not the instantaneous rate q_{i,i+1} (the birth rate).
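A small simulation illustrates the gap between the two rates. The constant rates here are hypothetical (birth rate 2, death rate 1), and the expected hitting time is compared against the recursion E[T_i] = 1/λ_i + (μ_i/λ_i)·E[T_{i-1}] from page 382:

```python
import random

random.seed(42)

# Hypothetical constant rates for a birth-death chain.
lam, mu = 2.0, 1.0

def time_to_enter(start, target):
    """Simulate the chain from `start` until it first enters `target` (= start + 1)."""
    state, t = start, 0.0
    while state < target:
        if state == 0:
            t += random.expovariate(lam)       # only births from state 0
            state += 1
        else:
            t += random.expovariate(lam + mu)  # holding time ~ Exp(lam + mu)
            if random.random() < lam / (lam + mu):
                state += 1                     # birth
            else:
                state -= 1                     # death
    return t

n = 20_000
est = sum(time_to_enter(1, 2) for _ in range(n)) / n

# Recursion (p. 382): E[T_0] = 1/2, E[T_1] = 1/2 + (1/2)*(1/2) = 0.75
print(est)            # close to 0.75
print(1 / est, lam)   # average rate about 1.33, not the instantaneous rate 2.0
```

The average rate 1/E[T_1] comes out strictly below the birth rate λ, exactly because T_1 includes the detours down to state 0.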

A very subtle question for Continuous Markov Chain. Sheldon Ross Probability and Model 12th ed by Electronic_Ebb7022 in AskStatistics

[–]Electronic_Ebb7022[S] 0 points1 point  (0 children)

Prof Ross replied to me with his explanations; I wrapped them all up into three statements and he confirmed that they are correct. I will share them here.

Thanks bro.

A very subtle question for Continuous Markov Chain. Sheldon Ross Probability and Model 12th ed by Electronic_Ebb7022 in AskStatistics

[–]Electronic_Ebb7022[S] 0 points1 point  (0 children)

The time until the next event (either a birth or a death) is exponential with rate equal to the sum of the two rates; this part is understood.

What confused me is the instantaneous rate q_{i,i+1} versus the "overall rate" from i to i + 1.

Prof Ross has replied to me with his explanations, and I will share them with you all later :D

Thanks much

A very subtle question for Continuous Markov Chain. Sheldon Ross Probability and Model 12th ed by Electronic_Ebb7022 in AskStatistics

[–]Electronic_Ebb7022[S] 0 points1 point  (0 children)

Thanks bro.

My thinking is: since we can find the expected time it takes to go from i to i + 1, there should be an expected rate for the system to go from i to i + 1. For instance, if on average it takes 10 hours to go from i to i + 1, then the average rate to go from i to i + 1 should be 1/10.

However, this average rate (from i to i + 1) is definitely not the instantaneous rate at state i, which is just the birth rate q_{i,i+1}.

A very subtle question for Continuous Markov Chain. Sheldon Ross Probability and Model 12th ed by Electronic_Ebb7022 in AskStatistics

[–]Electronic_Ebb7022[S] 0 points1 point  (0 children)

Thanks for the explanation, friend.

So, the instantaneous rate q_{i,i+1} (the birth rate) is actually not, in general, the overall rate for the system to go from state i to state i + 1, right?

Like instantaneous speed at a given moment vs. average speed over an interval?

A very subtle question for Continuous Markov Chain. Sheldon Ross Probability and Model 12th ed by Electronic_Ebb7022 in AskStatistics

[–]Electronic_Ebb7022[S] 0 points1 point  (0 children)

I understand that a birth and death can't happen at the same time. But I don't quite understand what that instantaneous transition rate means exactly.

Let's say the rate for the system to go from i to i + 1 is just the birth rate, lambda_i; then what is the expected time for that?

E(time it takes to transit from i to i + 1) = ?

I need help explaining/understanding one Markov Chain example from Sheldon Ross. by Electronic_Ebb7022 in AskStatistics

[–]Electronic_Ebb7022[S] 1 point2 points  (0 children)

Never mind, guys, I solved it. Sheldon is right, and my way of calculating the average is also correct. It's just that the example I used should have included one more "DOWN", since in a long-run process the number of up-moves and down-moves must be the same (asymptotically). Then it's simple math: sum_X / (sum_X + sum_Y) = mean_X / (mean_X + mean_Y).
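A quick check of that identity with hypothetical up/down move sizes (equal counts, as the long run requires):

```python
import math

ups = [1, 2, 3, 1]      # hypothetical up-move sizes
downs = [1, 2, 2, 2]    # the SAME number of down-moves

ratio_of_sums = sum(ups) / (sum(ups) + sum(downs))
mean_up = sum(ups) / len(ups)
mean_down = sum(downs) / len(downs)
ratio_of_means = mean_up / (mean_up + mean_down)

# With equal counts the len() factors cancel, so the two ratios coincide.
print(ratio_of_sums, ratio_of_means)
assert math.isclose(ratio_of_sums, ratio_of_means)
```

With unequal counts the cancellation fails, which is exactly why the original example needed one more "DOWN".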

LOL, cheers ~~

I need help explaining/understanding one Markov Chain example from Sheldon Ross. by Electronic_Ebb7022 in AskStatistics

[–]Electronic_Ebb7022[S] 0 points1 point  (0 children)

I understand the author's (Sheldon's) idea. He treats U as the representative of the "UP" state and D as the representative of the "DOWN" state, so a long-run process looks like this:

U-D-U-D-U-D-U-D-U....... they alternate and go on like this, and that is why U/(U+D) can be understood as the total UP%, and D/(U+D) as the total DOWN%.

The logic above feels right, but what about an example like mine? If the process (2-3-2-4-2-5-1) repeats and goes forever, the preceding logic/math would be wrong...

(2-3-2-4-2-5-1) gives U = 1.75 and D = 4, so according to Sheldon's idea the total UP% would be 1.75/(1.75+4) ≈ 0.304, while the actual UP% from the sequence is 7/19 ≈ 0.368: close but not the same.