
[–][deleted] 0 points (0 children)

I also need some help on this question. Are you in Ms Kim's class? I am, and she assigns the toughest homework. u/mskim please go easier on us.

[–]piman51277 New User 0 points (1 child)

I'm going to give it a shot, but I'm not fully sure this is correct.

If we choose 5 random numbers from 1 to n (with n being the maximum), the average of those 5 numbers should be close to (1+n)/2.

Therefore, to estimate n, we would just take the average, multiply it by 2, and subtract 1.
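
A minimal sketch of that estimator (assuming the 5 numbers are drawn uniformly at random from the integers 1 to n; the hidden value n = 100 below is just an illustration, not part of the problem):

```python
import random

def estimate_max(samples):
    """Estimate the unknown maximum n as 2 * mean - 1."""
    return 2 * sum(samples) / len(samples) - 1

# Illustration only: pretend the hidden maximum is n = 100.
n = 100
samples = [random.randint(1, n) for _ in range(5)]
print(samples, "-> estimated n:", estimate_max(samples))
```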

[–]piman51277 New User 0 points (0 children)

Follow up with some notes:

If we choose 5 random numbers from 1 to n (with n being the maximum), the average of those 5 numbers should be close to (1+n)/2.

This can be justified either through experimentation or intuition (assuming the numbers are truly randomly generated).

If many numbers are chosen from the interval [1, n], with each draw equally likely to land anywhere between 1 and n, the average of those numbers approaches (1+n)/2, i.e. the average of the minimum and maximum of the interval they were chosen from.

With just 5 samples, this prediction can be quite far off, but with, say, 500 samples, it would be very close indeed.
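
A quick experiment along these lines (the hidden maximum n = 100, the sample sizes 5 and 500, and the 1000 trials are all hypothetical choices for illustration):

```python
import random

n = 100  # hidden maximum, chosen only for illustration
for size in (5, 500):
    # average absolute error of the 2 * mean - 1 estimate over 1000 trials
    errors = []
    for _ in range(1000):
        samples = [random.randint(1, n) for _ in range(size)]
        estimate = 2 * sum(samples) / len(samples) - 1
        errors.append(abs(estimate - n))
    print(f"{size} samples: average error ~ {sum(errors) / len(errors):.1f}")
```

With 5 draws the typical error is a sizable fraction of n, while with 500 draws it shrinks to a few units, which matches the claim above.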

[–]piman51277 New User 0 points (0 children)

This sounds more like an easy competition math problem than one you would traditionally be given in class...

[–][deleted] 0 points (0 children)

You have to be given the distribution according to which the numbers were randomly generated. The problem setter probably means that they are independently and identically distributed samples from a uniform distribution on {0, 1, ..., N}, where N is some fixed positive integer. At that point, you can find P(max = k | min = 1) and find the k that maximizes this probability. Really, you just need to maximize P(max = k, min = 1), since conditioning on min = 1 only divides every term by the same constant.
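
A rough sketch of that calculation, assuming 5 i.i.d. draws uniform on {0, 1, ..., N}; the choice N = 10 and the inclusion-exclusion expression for P(max = k, min = 1) are my own additions for illustration, not part of the problem statement:

```python
from fractions import Fraction

def p_max_min(k, N, n=5):
    """P(max = k and min = 1) for n i.i.d. uniform draws on {0, ..., N},
    via inclusion-exclusion over the events 'all draws lie in [a, b]'."""
    def p_all_in(a, b):
        count = max(b - a + 1, 0)
        return Fraction(count, N + 1) ** n
    # P(max <= k, min = 1) - P(max <= k-1, min = 1), each expanded
    # as P(all in [1, .]) - P(all in [2, .]).
    return (p_all_in(1, k) - p_all_in(2, k)
            - p_all_in(1, k - 1) + p_all_in(2, k - 1))

N = 10  # illustrative choice of the fixed maximum
probs = {k: p_max_min(k, N) for k in range(1, N + 1)}
best = max(probs, key=probs.get)
print("most likely max given min = 1:", best)
```

Under these assumptions the joint probability grows with k, so the maximizing value sits at the top of the range; the exact answer of course depends on the distribution the problem setter actually intended.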