Hacker News with Generative AI: Probability

Dice, (De)Convolution and Generating Functions (demofox.org)
I stumbled across a very cool YouTube video today that talks about how to look for alternate ways of labeling sides of two dice to give you the same random number distribution as if you added two standard dice together.
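The alternate labeling the video builds toward is presumably the classic Sicherman dice. A minimal sketch (assuming the standard Sicherman faces 1-2-2-3-3-4 and 1-3-4-5-6-8) checking that their sum distribution matches two ordinary dice:

```python
from collections import Counter
from itertools import product

standard = [1, 2, 3, 4, 5, 6]
# Sicherman dice: the only other positive-integer labeling whose sums
# are distributed exactly like two standard dice
die_a = [1, 2, 2, 3, 3, 4]
die_b = [1, 3, 4, 5, 6, 8]

dist_standard = Counter(a + b for a, b in product(standard, standard))
dist_sicherman = Counter(a + b for a, b in product(die_a, die_b))

print(dist_standard == dist_sicherman)  # True
```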
Formally Modeling Dreidel, the Sequel (buttondown.com)
Chanukah's next week, and that means my favorite pastime: complaining about how Dreidel is a bad game. Last year I formally modeled it in PRISM to prove the game's not fun. But because I limited the model to only a small case, I couldn't prove the game was truly bad.
Why probability probably doesn't exist (but it is useful to act like it does) (nature.com)
All of statistics and much of science depends on probability — an astonishing achievement, considering no one’s really sure what it is.
The Birthday Paradox Experiment (2018) (pudding.cool)
The chance that two people in the same room have the same birthday — that is the Birthday Paradox 🎉. And according to fancy math, there is a 50.7% chance when there are just 23 people in a room. (This is in a hypothetical world: in reality, people aren't born evenly throughout the year, and leap years are excluded. However, the numbers should still be pretty close. More on this in the appendix.)
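The 50.7% figure can be checked directly via the complement, the probability that all birthdays are distinct, assuming 365 equally likely birthdays:

```python
def birthday_collision_prob(n, days=365):
    # P(at least one shared birthday) = 1 - P(all n birthdays distinct)
    p_distinct = 1.0
    for k in range(n):
        p_distinct *= (days - k) / days
    return 1 - p_distinct

print(round(birthday_collision_prob(23), 3))  # 0.507
```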
Coincidences that make our existence possible (bigthink.com)
There are a few small cosmic details that, had they been even slightly different, would have made our existence impossible.
A boy girl paradox – or maybe not? (shankwiler.com)
The most-liked comment answers 1/2, but its author was worn down by the others until he recanted and agreed that the answer is 1/3. Many other comments give lengthy explanations of why the answer is 1/3.
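Both answers correspond to a precise conditioning, which is part of why such threads never converge. A small simulation (assuming independent, equally likely sexes) shows the two readings side by side:

```python
import random
random.seed(0)

N = 100_000
families = [(random.choice("BG"), random.choice("BG")) for _ in range(N)]

# Reading 1: condition on "at least one child is a boy" -> P(both boys) ≈ 1/3
at_least_one = [f for f in families if "B" in f]
# Reading 2: condition on "a specific child (say the first) is a boy" -> ≈ 1/2
first_is_boy = [f for f in families if f[0] == "B"]

p1 = sum(f == ("B", "B") for f in at_least_one) / len(at_least_one)
p2 = sum(f == ("B", "B") for f in first_is_boy) / len(first_is_boy)
print(round(p1, 2), round(p2, 2))
```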
An alternative construction of Shannon entropy (rkp.science)
TL;DR: Shannon’s entropy formula is usually justified by showing it satisfies key mathematical criteria, or by computing how much space is needed to encode a variable. But one can also construct Shannon’s formula starting purely from the simpler notion of entropy as a (logarithm of a) count—of how many different ways a distribution could have emerged from a sequence of samples.
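The count-based construction can be sanity-checked numerically: the log of the multinomial coefficient counting the ways a sample of size n can realize given frequencies, divided by n, approaches the Shannon entropy of the distribution. A sketch:

```python
import math

def entropy(p):
    # Shannon entropy in bits
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def log2_multinomial(counts):
    # log2 of n! / (c1! * c2! * ...), via lgamma to avoid huge integers
    n = sum(counts)
    log2fact = lambda m: math.lgamma(m + 1) / math.log(2)
    return log2fact(n) - sum(log2fact(c) for c in counts)

p = [0.5, 0.25, 0.25]
n = 100_000
counts = [int(pi * n) for pi in p]
# Per-sample log-count vs. entropy: they agree to within O(log n / n)
print(round(log2_multinomial(counts) / n, 3), round(entropy(p), 3))
```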
Adventures in Probability (buttondown.com)
I hope everyone had a good weekend. I went on a hike. It was great.
Universe would die before monkey with keyboard writes Shakespeare, study finds (theguardian.com)
Mathematicians have called into question the old adage that a monkey typing randomly at a keyboard for long enough would eventually produce the complete works of Shakespeare.
Monkeys Will Never Type Shakespeare (bbc.co.uk)
Two Australian mathematicians have called into question the old adage that, given an infinite amount of time, a monkey pressing keys on a typewriter would eventually write the complete works of William Shakespeare.
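The scale involved is easy to illustrate with a back-of-envelope sketch (assuming a 30-key keyboard and one keystroke per second, both illustrative numbers, and ignoring pattern-overlap corrections):

```python
keys = 30                      # assumed keyboard size (illustrative)
target = "to be or not to be"  # 18 characters
# For a uniform random typist, the expected wait before a given string
# first appears is roughly keys ** len(target) keystrokes.
expected_keystrokes = keys ** len(target)

age_of_universe_s = 4.4e17     # rough age of the universe in seconds
# Universe-lifetimes needed just for this one line, at 1 keystroke/second:
print(f"{expected_keystrokes / age_of_universe_s:.1e}")
```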
Probability-generating functions (entropicthoughts.com)
I have long struggled with understanding what probability-generating functions are and how to intuit them. There were two pieces of the puzzle missing for me, and we’ll go through both in this article.
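One intuition worth having up front is that multiplying PGFs convolves distributions: the PGF of a sum of independent variables is the product of their PGFs. A minimal sketch, representing a PGF by its coefficient list (index = outcome, value = probability):

```python
from fractions import Fraction

# PGF of a fair die: coefficient of s^k is P(X = k)
die = [Fraction(0)] + [Fraction(1, 6)] * 6  # P(X = 1..6) = 1/6

def pgf_multiply(p, q):
    # Polynomial multiplication of PGFs = convolution of distributions
    out = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

two_dice = pgf_multiply(die, die)
print(two_dice[7])  # P(sum = 7) = 1/6
```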
Understanding Gaussians (gestalt.ink)
The Gaussian distribution, or normal distribution, is a key subject in statistics, machine learning, physics, and pretty much any other field that deals with data and probability. It’s one of those subjects, like $\pi$ or Bayes’ rule, that is so fundamental that people treat it like an icon.
Deriving the Kelly Criterion to Maximise Profits (obrhubr.org)
In a fictional casino that offers even odds on a fair coin-toss game, how much of your money should you invest? If you said anything other than 0, you'll leave broke at the end of the night.
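The answer comes out of the Kelly criterion, f* = p - (1 - p)/b for win probability p and net odds b; for a fair coin at even odds it gives exactly 0. A sketch:

```python
def kelly_fraction(p, b):
    # Kelly criterion: optimal fraction of bankroll to stake per bet,
    # p = win probability, b = net odds (profit per unit staked)
    return p - (1 - p) / b

print(kelly_fraction(0.5, 1.0))           # fair coin, even odds -> 0.0
print(round(kelly_fraction(0.6, 1.0), 2)) # biased coin -> stake 20% of bankroll
```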
Jane Street Probability Guide [pdf] (janestreet.com)
Mega Millions tickets will climb to $5, but officials promise better odds (apnews.com)
The cost of buying a Mega Millions jackpot dream will soon more than double, but lottery officials said they’re confident players won’t mind paying more after changes that will lead to larger prizes and more frequent winners.
St. Petersburg Paradox (wikipedia.org)
The St. Petersburg paradox or St. Petersburg lottery is a paradox involving a coin-flipping game whose expected payoff is infinite, but which nevertheless seems to be worth only a very small amount to the participants.
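The tension is visible in simulation: the expected payoff, sum over k of 2^k * 2^{-k}, diverges, yet sample means stay modest and creep upward only on the order of log2(n). A sketch of the game:

```python
import random
random.seed(42)

def st_petersburg():
    # Flip until tails; the payoff doubles on each head: 2, 4, 8, ...
    payoff = 2
    while random.random() < 0.5:
        payoff *= 2
    return payoff

# Despite the infinite expectation, observed means grow very slowly
for n in (100, 10_000, 1_000_000):
    mean = sum(st_petersburg() for _ in range(n)) / n
    print(n, round(mean, 1))
```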
Randomness extractors: making fair coins out of biased coins (bytepawn.com)
In a previous post titled Fair coin from biased coin, I looked at the problem of creating a uniform random coin given access to a biased coin. I looked at multiple approaches, and determined that they're actually all the same in some sense.
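The best-known extractor of this kind is von Neumann's trick: pair up flips, map (heads, tails) to one output bit and (tails, heads) to the other, and discard equal pairs. Both kept outcomes have probability p(1 - p), so the output is unbiased for any fixed bias p. A sketch:

```python
import random
random.seed(1)

def von_neumann(bits):
    # Pair consecutive flips; keep the first bit of each unequal pair.
    # P(1,0) = P(0,1) = p * (1 - p), so the kept bits are fair.
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

biased = [1 if random.random() < 0.8 else 0 for _ in range(100_000)]
fair = von_neumann(biased)
print(round(sum(fair) / len(fair), 2))  # close to 0.5
```

Note the cost: only about 2·p(1 - p) of the input pairs survive, which is why the original post compares this against more efficient extractors.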
Lottery Simulator (2023) (perthirtysix.com)
Every so often, a lottery jackpot will get so high that I'll hear about it on the news or from a friend.
Perplexing the Web, One Probability Puzzle at a Time (quantamagazine.org)
Three crew investigated over Bayesian yacht sinking (bbc.com)
The Law of Large Numbers or Why It Is a Bad Idea to Go to the Casino (easylang.online)
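The law of large numbers is exactly why the house wins: the sample mean of a negative-expectation bet converges to its (negative) expected value. A sketch, assuming a European-roulette red bet (win probability 18/37, even payout, expected value -1/37 ≈ -0.027 per spin):

```python
import random
random.seed(7)

def spin():
    # Bet one unit on red: win 1 with probability 18/37, else lose 1
    return 1 if random.random() < 18 / 37 else -1

# The per-spin average drifts toward the house edge as n grows
for n in (100, 10_000, 1_000_000):
    mean = sum(spin() for _ in range(n)) / n
    print(n, round(mean, 3))
```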
The Math of Card Shuffling (fredhohman.com)
Do not confuse a random variable with its distribution (bookdown.org)
Rare things become common at scale (2014) (asmartbear.com)
A Tale of Two Sieves (1996) [pdf] (ams.org)
Is the largest root of a random real polynomial more likely real than complex? (mathoverflow.net)
LLMs can't do probability (brainsteam.co.uk)
The Gambler's Fallacy Is Not a Fallacy (kevindorst.com)