Thread: Let's talk about Maths!!!
alcibiades


Honorable
Undefeatable Hero
of Gold Dragons
posted April 05, 2011 05:04 PM

I must admit I have a hard time grasping the full details of this.

On the one hand I'm comforted by the notion that it should be completely random, because that's what my intuition tells me.

On the other hand, it seems to me that there is a 50% chance that the other box holds the lower amount and a 50% chance that it holds the higher amount, and since we know the ratio is 1:2, that would imply that the other box holds on average 125% of the current box - an observation that could be made even before we make the pick and have the contents of the other box revealed.

But maybe the clue is that this argument is symmetrical between the boxes, so that if one box holds X (statistically), the total amount of money in the boxes is (X + 2X + X + 0.5X)/2 (namely a 50% chance that the box holds the lower amount and a 50% chance that it holds the higher amount), or 2.25X. Thus, if one box holds X, the other box holds 1.25X (as above).

This means that the average content of a box would be 1.125X, so our $100 is actually 1.125X, or X = $100/1.125 ≈ $88.9. This makes sense to me because X + 1.25X is now $200, which should be - statistically - the total amount of money in the two boxes. Thus, if our box holds $100, statistically the other box should also hold $100 - i.e. it makes no difference if we switch.

Geez, I'm not sure if I managed to explain my thoughts, but I actually think that what I think makes sense.
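
(As a quick sanity check of the arithmetic above, here is a tiny Python sketch; it only reproduces the numbers in this post, with names chosen just for illustration.)

# Reproduce the arithmetic in the post: if our $100 box is an "average"
# box holding 1.125*X, then X = 100/1.125 and the two boxes together
# hold X + 1.25*X = 2.25*X, i.e. about $200.
observed = 100.0
x = observed / 1.125        # about 88.89
total = x + 1.25 * x        # about 200.0
print(round(x, 2), round(total, 2))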
____________
What will happen now?

Corribus

Hero of Order
The Abyss Staring Back at You
posted April 05, 2011 05:45 PM

Well Alci I started writing out a reply but now I find I'm confusing myself, so I have to let it sit for a little bit.  This is what happens when I reply off the cuff without thinking things through.

I'm beginning to think that the answer to this problem is that ihor is just messing around with us.

Ecoris


Promising
Supreme Hero
posted April 05, 2011 11:55 PM

There is nothing wrong with the argument showing that the expected amount of money in the other box is 1.25 times the amount in the box you picked. The problem has to do with the premises:
Some guy chose a random number X and put $X in one box and $2X in the other. We then implicitly assume that X is picked according to a uniform distribution. But if there are no bounds on the set from which we pick X (say we only require that X is a natural number), there is no such uniform distribution; you can't "pick a natural number at random" if "at random" is supposed to mean that every number has the same probability of being chosen. (And the same goes for the rationals and the reals.)

friendofgunnar


Honorable
Legendary Hero
able to speed up time
posted April 06, 2011 12:18 PM
Edited by friendofgunnar at 12:22, 06 Apr 2011.

nah Corribus, you're right.  This isn't quantum mechanics, where what's in the second box depends on what you find in the first.  The second box has already been decided.
Say there's $50 in the first and $100 in the second.  If you pick the first, there's a 100% chance of doubling by switching.  If you pick the second, there's a 100% chance of halving by switching.  The probabilities cancel each other out.

dimis


Responsible
Supreme Hero
Digitally signed by FoG
posted April 06, 2011 11:54 PM

Ok, let's forget probabilities entirely.

Can we prove:

When we switch, the amount of money that we gain, if we actually gain, is greater than the amount of money that we lose, if we do lose.

?
____________
The empty set

Ecoris


Promising
Supreme Hero
posted April 07, 2011 11:51 AM
Edited by Ecoris at 11:56, 07 Apr 2011.

Yes, of course. If we switch and win, we go from X to 2X, a net gain of X. If we switch and lose, we go from X to 0.5X, a net loss of 0.5X.

But that doesn't mean that the expected net gain can't be 0 (which it obviously has to be). Let the random variable G be the amount of money gained by switching, and let the random variable M be the amount of money in the box containing the smaller amount (the other then contains 2M).
Then the expected value of G is 0, for any distribution of M (M could be discrete or continuous). You can verify that E(G) = 0 by computation, without appealing to the symmetry of the situation (if E(G) > 0, then we would also gain money by switching back, but then we could switch once more and so on, which is clearly absurd).

We can illustrate what is going on with an example. Let's say M has to be $1, $2, $4, $8, $16 or $32 (and each amount is equally likely). Then the other box contains $2, $4, $8, $16, $32 or $64 respectively.
We choose a box (we don't know the distribution of M, and whether or not we get to see the contents of the box before we decide to switch is irrelevant). If the chosen box contains $2 and we switch, we lose $1 50% of the time and win $2 the other 50% of the time. If it contains $4 and we switch, we lose $2 50% of the time and win $4 the other 50% of the time. And so on. But if it contains $1 we always win $1, and if it contains $64 we always lose $32.
So if we switch we expect to gain
($1 + (-$1 + $2) + (-$2 + $4) + (-$4 + $8) + (-$8 + $16) + (-$16 + $32) - $32)/12 = $0

If the value of M is bounded above, you always lose by switching whenever the box you initially choose contains a sufficiently large amount of money. This makes up for the money you expect to gain whenever the box contains an amount that allows you to both win and lose by switching.

In general M need not be bounded above, but the density function of M has to converge to 0 as the value of M increases (and it has to converge so "fast" that its integral is 1). This is just a slight generalization of the situation described above. In this case, when the amount of money in the box you choose is sufficiently large, you are more likely to lose by switching than to win. So even though you either gain X or lose 0.5X, you eventually become so much more likely to lose that you should not switch.


The paradox only arises because one is not aware of this behaviour for large values of M. You can't just say "let M be a random integer" and thereby mean that every integer should have the same probability of being chosen.
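
(If anyone wants to verify the $1-$32 example, here is a minimal Python sketch of exactly the computation above; each 1/12 weight is one of the six equally likely values of M times the two equally likely box choices.)

from fractions import Fraction

# Six equally likely values for the smaller amount M; the other box holds 2*M.
amounts = [1, 2, 4, 8, 16, 32]

expected_gain = Fraction(0)
for m in amounts:
    expected_gain += Fraction(1, 12) * m      # picked the smaller box: switching gains m
    expected_gain += Fraction(1, 12) * (-m)   # picked the larger box: switching loses m

print(expected_gain)   # 0, matching E(G) = 0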

dimis


Responsible
Supreme Hero
Digitally signed by FoG
posted April 07, 2011 03:01 PM

I agree with the problems that arise with densities and calculations involving probabilities. I really meant it as a hint, but I didn't know how to say it without making myself look like a smarta$$, since I obviously already know the problem. So, can we then prove the following statement, without probabilities (it clearly contradicts the one above):

When we switch, the amount of money that we win or lose is the same.

?
____________
The empty set

Ecoris


Promising
Supreme Hero
posted April 08, 2011 10:16 AM
Edited by Ecoris at 10:17, 08 Apr 2011.

Well, you can just appeal to the symmetry: if we expect to win money by switching to the other box, we would also expect to win money by switching once more if we were given the choice. But that is clearly absurd. By the same argument we can't expect to lose money.

I'm not sure where you want to go with this. Everyone is aware of the above argument in some form; the problem is that it is in conflict with the 1.25X argument, which is very appealing in itself. The real problem, as I see it, lies in understanding why the 1.25X argument is flawed.

ihor


Supreme Hero
Accidental Hero
posted April 08, 2011 12:31 PM

Who said there is a flaw?
What dimis asked is just to prove the following two statements without involving any probabilities.

1) When we switch, the amount of money that we gain, if we actually gain, is greater than the amount of money that we lose, if we do lose.

2) When we switch, the amount of money that we win or lose is the same.


Corribus

Hero of Order
The Abyss Staring Back at You
posted April 08, 2011 11:02 PM
Edited by Corribus at 23:07, 08 Apr 2011.

May I pose a new problem?

Consider the following game:

In front of you are three buckets, labeled A, B and C. In bucket A are 3 balls which are identical except for the fact that they are labeled by consecutive numbers: 1, 2 and 3.  The other two buckets are empty.  The goal of the game is to get all of the balls into bucket C.

The game has three rules:

1. You can only move one ball at a time.

2. You can only move a ball from a bucket if it is the highest numbered ball in that bucket.

3. You can only move a ball to a bucket if its number is higher than every other ball in that bucket. [Note: this implies you can always move a ball to an empty bucket.]

Question: What is the minimum number of moves you can use to move all the balls from bucket A to bucket C?

Bonus problem: What is the minimum number of moves you can use to move n balls, numbered consecutively 1 to n?


Smithey


Promising
Supreme Hero
Yes im red, choke on it !!!
posted April 08, 2011 11:29 PM
Edited by Smithey at 23:29, 08 Apr 2011.

Quote:
May I pose a new problem? [...]

Question: What is the minimum number of moves you can use to move all the balls from bucket A to bucket C?

Bonus problem: What is the minimum number of moves you can use to move n balls, numbered consecutively 1 to n?

Answer 1. Swap the labels of buckets C and A... zero moves and all balls are in bucket C.
Answer 2. Ball 3 into C, ball 2 into B, ball 3 into B, ball 1 into C, ball 3 into A, ball 2 into C, ball 3 into C... done in... hmm, 7 moves I think...
Bonus answer - I don't ever bother with those, but I'll make an educated guess: 3 balls went in with 3!+1, so I would first try out n!+1... but as stated before, I'm just guessing...

Ecoris


Promising
Supreme Hero
posted April 09, 2011 11:35 AM

It's the Towers of Hanoi (the link contains spoilers).

Corribus

Hero of Order
The Abyss Staring Back at You
posted April 09, 2011 06:04 PM

Lol, that it is.  I was hoping to disguise it by translating it to balls and buckets, but I guess I should have known better than to try to trick you guys.

The answer to the bonus I figured out to be (if my math is right):

# of moves, f, as a function of # of balls, n:

f(n) = SUM[k = 1 to n] 2^(k-1)

By the way, smithey, your answer of 7 is correct.  But how can you be sure it is the minimum number of moves?  (I.e., can you prove it?)
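
(For anyone who wants to see where the count comes from, a minimal recursive sketch in Python, assuming the usual Hanoi-style strategy: move the other n-1 balls aside, move ball 1, then move the n-1 balls again.)

def moves(n):
    # To move n balls from one bucket to another: first move balls 2..n
    # to the spare bucket, then move ball 1 to the target, then move
    # balls 2..n into the target as well.
    if n == 0:
        return 0
    return 2 * moves(n - 1) + 1

print([moves(n) for n in range(1, 6)])   # [1, 3, 7, 15, 31], i.e. 2^n - 1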
____________
I'm sick of following my dreams. I'm just going to ask them where they're goin', and hook up with them later. -Mitch Hedberg

Warmonger


Promising
Legendary Hero
fallen artist
posted April 09, 2011 06:13 PM

It's the simplest iterative algorithm presented in most algorithms & data structures books and computer science courses.
____________
The future of Heroes 3 is here!

Corribus

Hero of Order
The Abyss Staring Back at You
posted April 09, 2011 06:41 PM
Edited by Corribus at 18:42, 09 Apr 2011.

Ok, new problem.

I like probability problems and here's one I made up while taking a shower.  (Yeah, some people sing in the shower; I think about math problems.  I'm strange.)  I don't know the answer, so I'll be trying to figure it out, too.

Consider a thoroughly shuffled standard deck of 52 cards (no jokers).  You turn one card over and observe it.

Here are the rules of the game.

1. You pick one card at a time.

2. If the card you pick is higher than or equal to the previous card, you get to pick another card.

3. If the card you pick is lower than the previous card, you lose the game.

4. If you successfully turn over 4 cards consecutively (including the first one!) you win the game.

5. Aces are high (though I don't think it should matter).  All suits are considered equivalent.

What is the probability of winning the game?

Does it help or hinder your chances if you draw from an infinitely large pile of cards?

For a bigger challenge, what is the probability of successfully turning over n cards in an infinitely large deck?

Here's an example of the game in case the rules aren't clear:

Example of a win: 3 then 7 then 7 then K
Example of a loss: 2 then 9 then 4
 
____________
I'm sick of following my dreams. I'm just going to ask them where they're goin', and hook up with them later. -Mitch Hedberg

Ecoris


Promising
Supreme Hero
posted April 09, 2011 10:44 PM
Edited by Ecoris at 22:46, 09 Apr 2011.

Quote:
f(n) = SUM[k = 1 to n] 2^(k-1)
Aka 2^n - 1. One can prove that this is the minimum number of moves by induction on n.
Using your terminology, the ball labeled "1" can only be moved from bucket A to another bucket if all other balls are in the third bucket. We have to move the ball from bucket A at some point. We also have to move it to bucket C at some point. Every solution - optimal or not - must therefore contain the following configurations in order:
1. All balls are in bucket A (the initial configuration).
2. Bucket B or bucket C contains all balls except ball 1 which is in bucket A.
3. Bucket A or bucket B contains all balls except ball 1 which is in bucket C.
4. All balls are in bucket C (the final configuration).

The induction hypothesis tells us that it takes at least 2^(n-1) - 1 moves to go from 1. to 2.
It takes at least one move to go from 2. to 3.
And finally, it takes at least 2^(n-1) - 1 moves to go from 3. to 4.

So it takes at least 2^(n-1) - 1 + 1 + 2^(n-1) - 1 = 2^n - 1 moves to solve the problem with n balls. But we also know that it can be solved in that many moves.
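
(As a brute-force complement to the induction argument, a small Python sketch that searches the state space directly and confirms 2^n - 1 for small n; the state representation and bucket numbering are just illustrative choices.)

from collections import deque

def min_moves(n):
    # State: tuple where entry i is the bucket (0=A, 1=B, 2=C) of ball i+1.
    start, goal = (0,) * n, (2,) * n
    dist = {start: 0}
    queue = deque([start])
    while queue:
        state = queue.popleft()
        if state == goal:
            return dist[state]
        for ball in range(n):
            src = state[ball]
            # Rule 2: only the highest-numbered ball in a bucket may leave it.
            if any(state[other] == src for other in range(ball + 1, n)):
                continue
            for dst in range(3):
                if dst == src:
                    continue
                # Rule 3: the destination may not hold a higher-numbered ball.
                if any(state[other] == dst for other in range(ball + 1, n)):
                    continue
                new = list(state)
                new[ball] = dst
                new = tuple(new)
                if new not in dist:
                    dist[new] = dist[state] + 1
                    queue.append(new)

print([min_moves(n) for n in range(1, 6)])   # [1, 3, 7, 15, 31]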

smithey


Promising
Supreme Hero
Yes im red, choke on it !!!
posted April 10, 2011 12:36 PM

Quote:
Ok, new problem. [...]

What is the probability of winning the game?


hmm... to the previous Q: nope, I don't do proving, just solving. New one, hmmm...

4/52 (chance of the first card being a 2) * 1 (always a win since the first one was a 2) + 4/52 (card being a 3) * (1 - 4/51)*(1 - 3/50)*(1 - 2/49) (anything besides a card labeled 2) + ... you get the picture, although there is an easier way with N, K and !, but I prefer long and painful, and I'm not the only one here.

Ecoris


Promising
Supreme Hero
posted April 10, 2011 01:02 PM

That is a very good problem, Corribus. For now, I'll just post my results.
Probability of winning with a 52-card deck: 5.76%
Probability of winning with an infinite deck: 6.37%

I only have some ugly recursive formulas for the n-card, infinite-deck problem. I think I've found a non-recursive one. More on that later.
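
(A short Python sketch for checking these numbers; it assumes the game is won exactly when the first four cards come out in nondecreasing rank order, with 13 ranks and 4 copies of each, and models the infinite deck as independent uniformly random ranks, which gives C(n+12, n)/13^n for n cards.)

from itertools import combinations_with_replacement
from math import comb, perm

RANKS, COPIES, N = 13, 4, 4

# 52-card deck: count ordered 4-card draws whose ranks are nondecreasing.
favourable = 0
for seq in combinations_with_replacement(range(RANKS), N):
    ways = 1
    for rank in set(seq):
        ways *= perm(COPIES, seq.count(rank))   # ordered pick of distinct cards of this rank
    favourable += ways
print(favourable / perm(52, N))                  # ~0.0576

# Infinite deck: probability that N independent uniform ranks are nondecreasing.
print(comb(RANKS + N - 1, N) / RANKS ** N)       # ~0.0637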

OhforfSake


Promising
Legendary Hero
Initiate
posted April 12, 2011 07:22 AM

Happy birthday dimis!
____________
Living time backwards

Warmonger


Promising
Legendary Hero
fallen artist
posted April 12, 2011 02:15 PM

I just googled "deviation latex", only to find out that not everyone in this world is into maths.
____________
The future of Heroes 3 is here!
