2450 = 1 * 2 * 5 * 5 * 7 * 7
so list all the possible sets (x, y, z) such that xyz = 2450;
some sets are (1,49,50); (2,25,49); (5,10,49); (7,7,50) and so on...
Now the bartender says that the sum of their ages is equal to the other person's age. Since that person knows his own age but still cannot get the answer, 'x + y + z' must be the same for multiple sets, say (x1, y1, z1) and (x2, y2, z2)...
Only one such collision of sets exists:
(5,10,49) and (7,7,50), both of which sum to 64.
Now we have to choose between these two sets...
The bartender then says that the oldest lady is older than Peterson, which rules out one of them: the set must be (5,10,49).
So the ages must be 5, 10, and 49.
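The enumeration above is easy to check by brute force. A minimal sketch (the product 2450 and the grouping-by-sum step are exactly as described above):

```python
from collections import defaultdict

def triples_with_product(n):
    """All triples x <= y <= z of positive integers with x * y * z == n."""
    result = []
    for x in range(1, n + 1):
        if n % x:
            continue
        for y in range(x, n // x + 1):
            if (n // x) % y == 0:
                z = n // (x * y)
                if z >= y:
                    result.append((x, y, z))
    return result

# Group the triples by their sum; an "ambiguous" sum is one shared by
# more than one triple -- the only situation where knowing the sum
# still leaves the answer undetermined.
by_sum = defaultdict(list)
for t in triples_with_product(2450):
    by_sum[sum(t)].append(t)

ambiguous = {s: ts for s, ts in by_sum.items() if len(ts) > 1}
print(ambiguous)  # {64: [(5, 10, 49), (7, 7, 50)]}
```

Running this confirms that 64 is the only sum shared by two factor triples, which is why the sum alone wasn't enough.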
Solution for Let's Think!
Sunday, July 17, 2011
Solution to The flippant number
Answer: 17
Solution: We use the notation “|” to mean “divides.”
There is only one flippant 2-digit number, namely 77. Indeed, if 10a + b is flippant
(where a, b are integers 1–9), then 7 | 10a + b and 7 | 10b + a. Thus,
7 | 3(10a + b) − (10b + a) = 29a − 7b = a + 7(4a − b),
so that 7 | a, and similarly 7 | b, so we must have a = b = 7.
There are 16 flippant 3-digit numbers. First consider the 12 palindromic ones (ones
where the hundreds and units digits are the same): 161, 252, 343, 434, 525, 595, 616,
686, 707, 777, 868, and 959. Now consider the general case: suppose 100a + 10b + c is
flippant, where a, b, c are integers 1–9. Then 7 | 100a + 10b + c and 7 | 100c + 10b + a,
so 7 | (100a + 10b + c) − (100c + 10b + a) = 99(a − c), and so 7 | a − c. In order for this
not to result in a palindromic integer, we must have a − c = ±7. Moreover, since
7 | c − a gives 100a + 10b + c ≡ 100a + 10b + a (mod 7), the number 100a + 10b + c is
flippant exactly when both 100a + 10b + a and 100c + 10b + c are palindromic flippant
integers. Consulting our list above, we find 4 more flippant integers: 168, 259, 861, and 952.
Solution to funbit
The answer is: a die. An explanation: "it's always 1 to 6" refers to the numbers on the faces of the die; "it's always 15 to 20" is the sum of the exposed faces when the die comes to rest after being thrown (21 minus the hidden bottom face, which is between 1 and 6); "it's always 5" is the number of exposed faces when the die is at rest; "but it's never 21" because the sum of the exposed faces can never be 21 while the die is at rest; "unless it's flying", in which case all six faces are exposed and their sum is 21 (1 + 2 + 3 + 4 + 5 + 6).
Monday, July 11, 2011
Solution to Monty Hall problem
The solution presented shows the three possible arrangements of one car and two goats behind three doors and the result of switching or staying after initially picking Door 1 in each case:
Door 1 | Door 2 | Door 3 | Result if switching | Result if staying
-------|--------|--------|---------------------|------------------
Car    | Goat   | Goat   | Goat                | Car
Goat   | Car    | Goat   | Car                 | Goat
Goat   | Goat   | Car    | Car                 | Goat
A player who stays with the initial choice wins in only one out of three of these equally likely possibilities, while a player who switches wins in two out of three. The probability of winning by staying with the initial choice is therefore 1/3, while the probability of winning by switching is 2/3.
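The 1/3 vs. 2/3 split is also easy to check by simulation. A minimal sketch under the standard rules stated above (the host always opens a goat door other than the player's pick):

```python
import random

def play(switch, rng):
    """One round of the standard Monty Hall game; returns True if the player wins."""
    doors = [0, 1, 2]
    car = rng.choice(doors)
    pick = rng.choice(doors)
    # The host opens a door that is neither the player's pick nor the car.
    opened = rng.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

rng = random.Random(0)
trials = 100_000
stay = sum(play(False, rng) for _ in range(trials)) / trials
swap = sum(play(True, rng) for _ in range(trials)) / trials
print(stay, swap)  # stay comes out near 1/3, swap near 2/3
```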
Increasing the number of doors
That switching has a probability of 2/3 runs counter to many people's intuition. If there are two doors left, then why isn't each door 1/2? The intuition may be aided by generalizing the problem to have a large number of doors so that the player's initial choice has a small chance of winning.
It may be easier to appreciate the solution by considering the same problem with 1,000,000 doors instead of just three. In this case there are 999,999 doors with goats behind them and one door with a prize. The player picks a door; his initial probability of winning is 1 in 1,000,000. The host then opens 999,998 of the other doors, revealing 999,998 goats. (Imagine the host starting with the first door and going down a line of 1,000,000 doors, opening each one, skipping over only the player's door and one other door.) The host then offers the player the chance to switch to the only other unopened door. In 999,999 out of 1,000,000 cases the other door will contain the prize, since in 999,999 out of 1,000,000 cases the player first picked a door with a goat. A rational player should switch. Intuitively speaking, the player should ask how likely it is that, given a million doors, he or she managed to pick the right one initially. The example shows that the likelihood of success by switching equals 1 minus the likelihood of picking correctly the first time, for any given number of doors; the chance that the player's original door is correct hasn't changed. It is important to remember, however, that this rests on the assumption that the host knows where the prize is and must not open the door containing it, randomly selecting which other door to leave closed if the contestant happens to select the prize door initially.
To extend the above, it's as if Monty gives you the chance to keep your one door, or open all 999,999 of the other doors, of which he kindly opens the first 999,998 of them for you, leaving, deliberately, the one with the prize. Clearly, one would choose to open the other 999,999 doors rather than keep the one.
This example can also be used to illustrate the opposite situation, in which the host does not know where the prize is and opens doors at random. There is a 999,999/1,000,000 probability that the contestant selects wrong initially, so the prize is behind one of the other doors. If the host opens doors at random, not knowing where the prize is, it is overwhelmingly likely that he will reveal the prize before only two doors are left (the contestant's choice and one other) to switch between. If the host happens not to reveal the car, then both of the remaining doors have an equal probability of containing it. This is analogous to the game play on another game show, Deal or No Deal; in that game, the contestant chooses a numbered briefcase and then randomly opens the other cases one at a time.
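The ignorant-host variant can be simulated too. A sketch with three doors: conditioning on the rounds where the random host happens not to reveal the car, staying and switching each win half the time:

```python
import random

rng = random.Random(1)
stay_wins = switch_wins = valid = 0
for _ in range(300_000):
    car = rng.randrange(3)
    pick = rng.randrange(3)
    # An ignorant host opens one of the two other doors uniformly at random.
    opened = rng.choice([d for d in range(3) if d != pick])
    if opened == car:
        continue  # car revealed: this round never reaches the switch/stay decision
    valid += 1
    other = next(d for d in range(3) if d not in (pick, opened))
    stay_wins += (pick == car)
    switch_wins += (other == car)

print(stay_wins / valid, switch_wins / valid)  # both come out near 0.5
```

The unconditional chance of picking right initially is still 1/3, but discarding the rounds where the host exposes the car renormalizes both surviving doors to 1/2.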
Monday, July 4, 2011
Solution to CARD GAME
If we approximate the problem by assuming each card has 1/13 chance for each value 1 to 13, we can compute the expected return based on the number of cards remaining, like in the dice problem towr mentioned.
With the last card, the expected return is (1+13)/2 = 7.
With 2 cards remaining, we keep the card if its value is higher than 7. The return is (7*7 + 6*(8+13)/2)/13 = 8.6154 (7 chances to continue and get 7 on average, 6 chances to stop and earn (8+13)/2 = 10.5 on average).
With 3 cards remaining, you keep anything above 8.6. The return is (8*8.6154 + 5*(9+13)/2)/13 = 9.5325.
With 4 cards remaining, you keep 10 or more, the return is (9*9.5325 + 4*(10+13)/2)/13 = 10.13791534. That is what you can earn on average.
If you take into account that the cards don't repeat it changes these values. But I don't think it changes them to the point where the strategy should be changed. The average return should be slightly higher due to the fact that when you discard cards, these are likely to be low values, so the return after discarding a card is often higher than computed.
Just for the last card: if you are holding a 7, you can check whether the first 2 cards average to more than 7. If yes, you had better keep your seven; if not, you should draw the last card.
But if I do the same calculations with the rule that you always keep a value of 10 or above, I find a result of 10.0068. I think the real value is slightly higher.
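The backward induction above (under the same simplifying assumption that every draw is uniform on 1–13) can be written out directly; `expected[k]` is the expected final value with k cards still available:

```python
expected = [0.0] * 5
expected[1] = sum(range(1, 14)) / 13  # forced to keep the last card: 7.0
for k in range(2, 5):
    # Keep the current card only if it beats the expected value of continuing.
    threshold = expected[k - 1]
    keep = [v for v in range(1, 14) if v > threshold]
    cont = 13 - len(keep)  # values low enough that we discard and draw again
    expected[k] = (cont * threshold + sum(keep)) / 13

print([round(e, 4) for e in expected[1:]])  # [7.0, 8.6154, 9.5325, 10.1379]
```

This reproduces the thresholds in the text: with 3 cards left keep anything above 8.6154, with 4 left keep 10 or more, for an overall expected return of about 10.14.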
Sunday, July 3, 2011
Solution to ENVELOPE GAMBLE I
Here are the possible outcomes in tabular form:
             | you hold x | you hold 2x
-------------|------------|-------------
don't change | x          | 2x
change       | 2x         | x
So if I don't change, my "profit" is obviously going to be 0%.
But if I change, my "profit" is going to be +100% in one case and -50% in the other, which makes the average profit 25%.
I am a bit confused about this approach, as intuitively it feels the same as what the question suggests, so it might be the wrong approach.
What do you think?
Update: another suggested solution asks why we should consider the percentage gain at all (initially the amount is not with us; we are given some amount and an option to exchange it, even though percentage-wise there appears to be a gain if we change). From the table above, there is an equal probability of ending up with x or 2x either way. Hence, in this case, whether we change or don't change does not matter.
Another observation: the answer might change if we say (a) one amount is 1.5 times the other, or (b) one amount is 3 times the other.
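A simulation with a fixed envelope pair (x, 2x) makes the update concrete: the expected amount you walk away with is the same whether you switch or not, even though the expected percentage gain from switching is +25%. A sketch (x = 100 is an arbitrary choice for illustration):

```python
import random

rng = random.Random(2)
x = 100  # arbitrary; the pair of envelopes is (x, 2x)
keep_total = swap_total = pct_gain = 0.0
trials = 100_000
for _ in range(trials):
    # You are handed one of the two envelopes uniformly at random.
    first, other = rng.sample([x, 2 * x], 2)
    keep_total += first
    swap_total += other
    pct_gain += (other - first) / first  # +100% or -50%

print(keep_total / trials, swap_total / trials)  # both near 150.0
print(pct_gain / trials)                         # near 0.25
```

Both strategies average 1.5x, so the "25% average profit" is an artifact of measuring gains relative to whichever amount you happen to hold, not a real advantage.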
Solution to BUTTON TRAP ROOM
Here is the solution. At the end of any step you may win, otherwise proceed to the next step.
1) Push opposite buttons.
2) Push adjacent buttons.
3) Push opposite buttons (if you haven't won by this step, you know you have a 3/1 situation).
4) Push any one button (either winning or resulting in a 2/2 situation).
5) Push opposite buttons (either winning or resulting in a UUDD situation).
6) Push adjacent buttons.
7) Push opposite buttons.
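The seven-step sequence can be verified exhaustively: track every button configuration that is still possible, apply each press under every rotation the room might have made (the adversarial worst case), and check that nothing survives. A sketch, modeling the four buttons as bits with "win" meaning all equal:

```python
from itertools import product

WIN = {(0, 0, 0, 0), (1, 1, 1, 1)}

def rotations(mask):
    """All four rotations of a press pattern around the circular table."""
    return {tuple(mask[(i + k) % 4] for i in range(4)) for k in range(4)}

def press(states, mask):
    """Toggle `mask` under every possible rotation; drop configurations that win."""
    nxt = set()
    for c in states:
        for m in rotations(mask):
            d = tuple(a ^ b for a, b in zip(c, m))
            if d not in WIN:
                nxt.add(d)
    return nxt

OPP, ADJ, ONE = (1, 0, 1, 0), (1, 1, 0, 0), (1, 0, 0, 0)
states = set(product((0, 1), repeat=4)) - WIN  # every non-winning start
for mask in (OPP, ADJ, OPP, ONE, OPP, ADJ, OPP):
    states = press(states, mask)

print(states)  # set(): no starting configuration survives all seven presses
```

The intermediate sets also match the annotations in the list above: after step 3 only the 3/1 (odd-parity) configurations remain, and after step 5 only the UUDD-type adjacent pairs.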