

04/16/2017 

BRIDGE HANDS
Also see http://www.rpbridge.net/7z77.htm and http://www.rpbridge.net/7z68.htm
Hand Distributions:
1. There are 39 suit patterns among the 635,013,559,600 {i.e. COMBIN(52, 13)} possible hands dealt to one player. See Durango Bill http://durangobill.com/BrSuitStats.html
The 26 patterns not shown, such as 9-3-1-0, each occur less than 1% of the time. The top 5 patterns occur 71% of the time, and the top 13 occur over 95% of the time.
2. There are 104 patterns associated with the 495,918,532,948,104 {i.e. COMBIN(52, 26)} hands dealt to two players. See Durango Bill http://durangobill.com/BrSuitStats.html
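The one-player figures are easy to verify directly. The short Python sketch below (an illustration of the counting argument, not code from Durango Bill's site) enumerates all 39 patterns and their probabilities:

```python
from itertools import combinations_with_replacement, permutations
from math import comb

def pattern_probability(pattern):
    """Probability that a random 13-card hand has the given suit pattern,
    counting all assignments of the four lengths to the four suits."""
    ways = sum(
        comb(13, a) * comb(13, b) * comb(13, c) * comb(13, d)
        for (a, b, c, d) in set(permutations(pattern))
    )
    return ways / comb(52, 13)  # comb(52, 13) = 635,013,559,600

# The 39 patterns are the partitions of 13 into 4 parts, each part 0..13.
patterns = [p for p in combinations_with_replacement(range(14), 4) if sum(p) == 13]
probs = sorted((pattern_probability(p) for p in patterns), reverse=True)

print(len(patterns))             # 39
print(round(sum(probs[:5]), 3))  # 0.711 -- the top 5 patterns cover ~71%
```

The most common pattern is 4-4-3-2 at about 21.6%, and the probabilities of all 39 patterns sum to 1, matching the figures quoted above.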
The four most common two-player patterns occur almost half of the time.

High Card Points in your hand:
Trump Splits:
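Trump-split probabilities follow from the hypergeometric distribution. A minimal sketch, assuming the 26 unseen cards are dealt 13-13 at random between the two hidden hands:

```python
from math import comb

def defender_holds(missing, i):
    """Probability that one particular defender holds exactly i of the
    `missing` cards in a suit (hypergeometric: 26 unseen cards, 13 each)."""
    return comb(missing, i) * comb(26 - missing, 13 - i) / comb(26, 13)

def break_probability(missing, i):
    """Probability the missing cards break i with one defender and
    missing - i with the other, either way around."""
    j = missing - i
    p = defender_holds(missing, i)
    return p if i == j else p + defender_holds(missing, j)

# With 5 trumps missing: 3-2 about 67.8%, 4-1 about 28.3%, 5-0 about 3.9%
for i in (3, 4, 5):
    print(i, 5 - i, round(break_probability(5, i), 4))
```

The same function reproduces the other standard entries, e.g. a 2-2 break of 4 missing trumps at about 40.7%.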
Durango Bill's Website on probabilities: see Bill Butler's page http://www.durangobill.com/Bridge.html

No Trump Hands http://www.bridgehands.com/S/Strong_Notrump.htm
If you open a 15-17 1NT, the distribution of 15, 16, and 17 point hands is:
If partner has a flat 8 count and invites with 2NT, opener will pass 43% of the time (holding 15) and bid 3NT 57% of the time (holding 16 or 17). So the auction reaches 3NT 57% of the time, but it should not be in 3NT 33/(33 + 24) = 58% of the time when the invitation is made on a flat 8 count, so I don't invite with 8 flat.

CHEATING IMPROVES THE ODDS
Blubaugh http://www.greatbridgelinks.com/GBLArchives/GBL010706.html
ITALY http://en.wikipedia.org/wiki/Blue_Team_(bridge)
1984 Boston http://www.nytimes.com/1984/07/31/nyregion/fivebridgeplayerstakenoutofplaycheatingsuspected.html
1995 http://cam.bridgeblogging.com/2011/03/06/watchinthedetectives/
2007 Gee http://tedmuller.us/Bridge/Esoterica/Ted03TheKenGeeCheatingCoverup.htm
2014 Coughing German Doctors http://www.independent.co.uk/news/world/europe/coughsthewordbridgeworldchampionsfoundguiltyofcheating9224319.htm

THE HAT CHECK PROBLEM
Thirteen men go to a restaurant and check their hats. Upon leaving they blindly grab hats. What is the probability that no one receives the correct hat back? What if 1000 men entered a restaurant, checked their hats, and simply grabbed hats upon leaving? The answer is hardly affected by the number of men. It is about 1 divided by e, the base of the natural logarithm: 1/e = 1/(2.71828...), about 37%. This probability was determined by the French mathematician Nicolas Bernoulli in the early 1700s while analyzing an old French card game, Treize (French for thirteen, as the original game involved only 13 cards). It was similar to the following game: Using two 13-card decks (e.g. all spades), we each shuffle our decks and place them face down on the table. We turn cards over one at a time and see whether our cards match. If we find a match before all the cards are compared, I pay you 37 cents. If we compare all 13 cards and there is no match, you pay me 63 cents. The probability of no matches is 1/e, about .37, which makes the game fair.
The number e is an important mathematical constant that is the base of the natural logarithm. It is approximately equal to 2.71828 and is the limit of (1 + 1/n)^n as n approaches infinity, an expression that arises in the study of compound interest. It can also be calculated as the sum of the infinite series:
e = 1/0! + 1/1! + 1/2! + 1/3! + ...
Consider the two decks of spades described above. The following table contains the probability of exactly 0, 1, 2, 3, ..., 12, 13 matches. These probabilities are the same as for the 13-hat problem.
P(r,N) = the probability of exactly r matches out of N cards or hats
P(r,13) = {1 - 1/1! + 1/2! - 1/3! + ... ± 1/(13-r)!} / r!
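The alternating series above is easy to evaluate; a short sketch (the function name is illustrative):

```python
from math import factorial, e

def p_matches(r, n):
    """P(r, n): probability of exactly r matches among n cards or hats,
    via the alternating series {1 - 1/1! + 1/2! - ... ± 1/(n-r)!} / r!."""
    series = sum((-1) ** k / factorial(k) for k in range(n - r + 1))
    return series / factorial(r)

print(round(p_matches(0, 13), 4))  # 0.3679, i.e. about 1/e
```

Two sanity checks fall out immediately: the probabilities for r = 0..13 sum to 1, and exactly 12 matches is impossible (if 12 hats are right, so is the 13th).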
See http://www.mathnet.or.kr/mathnet/paper_file/Elizabethtown/Gabriela/Monmort.pdf and info on "Derangements" at http://en.wikipedia.org/wiki/Derangement and http://mathworld.wolfram.com/Derangement.html and finally http://math.stanford.edu/~thiem/courses/108S/hw1solns.pdf and "Combinatorics" at http://en.wikipedia.org/wiki/Combinatorics

BIRTHDAY PROBLEM
This problem was posed by Richard von Mises in 1939. How many people, n, must be in a room before the probability is at least 50% that someone shares a birthday? To figure out the exact probability of finding two people with the same birthday in a given group, it turns out to be easier to ask the opposite question: what is the probability that NO two will share a birthday, i.e., that they will all have different birthdays? With just two people, the probability that they have different birthdays is 364/365, or about .997. If a third person joins them, the probability that this new person has a different birthday from those two (i.e., the probability that all three will have different birthdays) is (364/365) x (363/365), about .992. With a fourth person, the probability that all four have different birthdays is (364/365) x (363/365) x (362/365), which comes out at around .983. And so on. The answers to these multiplications get steadily smaller. When a twenty-third person enters the room, the final fraction that you multiply by is 343/365, and the answer you get drops below .5 for the first time, being approximately .493. This is the probability that all 23 people have different birthdays. So the probability that at least two people share a birthday is 1 - .493 = .507, just greater than 1/2. See the table below: p(n) is the probability that, with n people in a room, at least two share a birthday. It equals 1 minus the probability that all n people have different birthdays.
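The step-by-step multiplication described above translates directly into code; a minimal sketch, assuming 365 equally likely birthdays and ignoring leap years:

```python
def p_shared_birthday(n):
    """Probability that at least two of n people share a birthday,
    assuming 365 equally likely days (leap years ignored)."""
    p_all_different = 1.0
    for k in range(n):
        p_all_different *= (365 - k) / 365  # k-th person avoids the first k
    return 1 - p_all_different

print(round(p_shared_birthday(23), 3))  # 0.507 -- first n above 50%
```

As the text says, n = 22 falls just short of 50% while n = 23 crosses it.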
PIG GAME
Pig is a simple dice game first described in print by John Scarne in 1945. It is sometimes used by math teachers to teach probability concepts. Pig is one of a family of dice games described by Reiner Knizia as "jeopardy dice games": the dominant type of decision is whether or not to jeopardize previous gains by rolling for potentially greater gains. Durango Bill, who has calculated many probabilities regarding bridge hands, has a web page describing the game together with the winning strategy. See http://www.durangobill.com/Pig.html

PARRONDO'S PARADOX
A coin-tossing example: Consider playing two games, Game A and Game B, with the following rules. For convenience, define C(t) to be our capital at time t, immediately before we play a game. In Game A, we toss a biased coin (Coin 1) and win $1 with probability 1/2 - ε, otherwise losing $1. In Game B, we first check whether C(t) is a multiple of some integer M. If it is, we toss Coin 2, winning with probability 1/10 - ε; if it is not, we toss Coin 3, winning with probability 3/4 - ε. Again we win $1 on a win and lose $1 on a loss, and ε is a small positive number such as 0.005.
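A simulation makes the effect visible. The sketch below assumes the standard parameters from the literature (ε = 0.005, M = 3, $1 stakes); the code itself is an illustration, not taken from the cited analysis:

```python
import random

def parrondo(sequence, rounds=500_000, eps=0.005, M=3, seed=7):
    """Simulate a repeating schedule of the coin-tossing games.
    Game A: win $1 with probability 1/2 - eps, else lose $1.
    Game B: if capital is a multiple of M, win with probability 1/10 - eps;
            otherwise win with probability 3/4 - eps."""
    rng = random.Random(seed)
    capital = 0
    for t in range(rounds):
        if sequence[t % len(sequence)] == 'A':
            p = 0.5 - eps
        else:
            p = (0.1 - eps) if capital % M == 0 else (0.75 - eps)
        capital += 1 if rng.random() < p else -1
    return capital

# Both games lose on their own, yet the AABB schedule wins:
# parrondo('A') < 0, parrondo('B') < 0, parrondo('AABB') > 0
```

Over 500,000 rounds the drifts (roughly -0.01 per round for A, about -0.009 for B, and positive for AABB) dominate the random noise, so the signs are stable across seeds.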
It is clear that by playing Game A we will almost surely lose in the long run. Harmer and Abbott show via simulation that with M = 3 and ε = 0.005, Game B is an almost surely losing game as well. In fact, Game B is a Markov chain, and an analysis of its state transition matrix (again with M = 3) shows that the steady-state probability of using Coin 2 is 0.3836, and that of using Coin 3 is 0.6164. As Coin 2 is selected nearly 40% of the time, it has a disproportionate influence on the payoff from Game B, and results in it being a losing game. However, when these two losing games are played in some alternating sequence (e.g. two games of A followed by two games of B: AABBAABB...), the combination of the two games is, paradoxically, a winning game. Not all alternating sequences of A and B result in winning games. For example, one game of A followed by one game of B (ABABAB...) is a losing game, while one game of A followed by two games of B (ABBABB...) is a winning game. This coin-tossing example has become the canonical illustration of Parrondo's paradox: two games, both losing when played individually, become a winning game when played in a particular alternating sequence. The apparent paradox has been explained using a number of sophisticated approaches, including Markov chains and flashing Brownian ratchets. One way to explain the apparent paradox is as follows:
While Game B is a losing game under the distribution of C(t) modulo M that results when it is played by itself, it can be a winning game under other distributions, since Coin 3 has a positive expectation. The role of the capital dependence in Game B now comes into sharp focus: it serves solely to induce a dependence between Games A and B, so that a player is more likely to enter states in which Game B has a positive expectation, allowing it to overcome the losses from Game A. With this understanding, the paradox resolves itself: the individual games are losing only under a distribution that differs from the one actually encountered when playing the compound game. In summary, Parrondo's paradox is an example of how dependence can wreak havoc with probabilistic computations made under a naive assumption of independence. A more detailed exposition of this point, along with several related examples, can be found in Philips and Feldman. For a simpler example of how and why the paradox works, again consider two games, Game A and Game B, this time with the following rules: In Game A, you simply lose $1 every time you play. In Game B, you count how much money you have left; if it is an even number you win $3, otherwise you lose $5.
Say you begin with $100 in your pocket. If you play Game A exclusively, you will obviously lose all your money in 100 rounds. Similarly, if you play Game B exclusively, you will also lose all your money in 100 rounds. However, consider playing the games alternately, starting with Game B, followed by A, then by B, and so on (BABABA...). It should be easy to see that you will steadily earn a total of $2 for every two games. Thus, even though each game is a losing proposition if played alone, because the results of Game B are affected by Game A, the sequence in which the games are played can affect how often Game B earns you money, and consequently the result differs from the case where either game is played by itself. Parrondo's paradox is used extensively in game theory, and its applications in engineering, population dynamics, financial risk, etc. are also being investigated, as demonstrated by the reading lists below. Parrondo's games are of little direct practical use, such as for investing in stock markets, as the original games require the payoff from at least one of the interacting games to depend on the player's capital. However, the games need not be restricted to their original form, and work continues on generalizing the phenomenon. Similarly, a model often used to illustrate optimal betting rules has been used to prove that splitting bets between multiple games can turn a negative median long-term return into a positive one. In the early literature on Parrondo's paradox, it was debated whether the word 'paradox' is an appropriate description, given that the Parrondo effect can be understood in mathematical terms. The 'paradoxical' effect can be mathematically explained in terms of a convex linear combination.
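The $100 example can be checked in a few lines (a toy sketch of the even/odd rules just described):

```python
def simple_parrondo(sequence, capital=100, rounds=100):
    """Game A always loses $1; Game B wins $3 if the current capital is
    even, otherwise loses $5.  `sequence` repeats, e.g. 'BA' = BABABA..."""
    for t in range(rounds):
        if sequence[t % len(sequence)] == 'A':
            capital -= 1
        else:
            capital += 3 if capital % 2 == 0 else -5
    return capital

print(simple_parrondo('A'), simple_parrondo('B'), simple_parrondo('BA'))
# 0 0 200 -- each game alone goes broke in 100 rounds, while alternating
# BABABA... gains $2 per two games, ending with $200
```

Starting with $100 (even), the B game in the alternation always finds an even capital and wins $3, after which A's $1 loss restores evenness: that is the induced dependence in miniature.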




