Hacker News with Generative AI: Information Theory

An alternative construction of Shannon entropy (rkp.science)
TL;DR: Shannon’s entropy formula is usually justified by showing it satisfies key mathematical criteria, or by computing how much space is needed to encode a variable. But one can also construct Shannon’s formula starting purely from the simpler notion of entropy as a (logarithm of a) count—of how many different ways a distribution could have emerged from a sequence of samples.
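The counting construction can be checked numerically: by Stirling's approximation, (1/N) times the log of the multinomial coefficient (the number of distinct sample sequences producing given counts) converges to Shannon's H(p) as N grows. A minimal sketch in Python (illustrative only, not code from the linked post):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in nats: H = -sum p_i ln p_i."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def log_count_entropy(counts):
    """Entropy as a log of a count: (1/N) * ln(N! / prod n_i!),
    the log of how many distinct sample sequences yield these counts."""
    n = sum(counts)
    log_ways = math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)
    return log_ways / n

# As N grows, (1/N) ln(multinomial coefficient) -> H(p).
probs = [0.5, 0.25, 0.25]
for n in (100, 10_000, 1_000_000):
    counts = [int(p * n) for p in probs]
    print(n, log_count_entropy(counts), shannon_entropy(probs))
```

Running this shows the per-sample log-count approaching H(p) ≈ 1.0397 nats as N increases.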
Leveraging the lightcone around the source of truth with Postgres (benoitessiambre.com)
To summarize, the postgres-centrism hypothesis says that integrated, entropy-minimizing systems that reduce dependency distances and anchor logic near the source of truth, leveraging the efficiencies of proximity, often align better with the fundamentals of computing, physics, information theory, thermodynamics, and intelligence.
Machine learning and information theory concepts towards an AI Mathematician (arxiv.org)
The current state-of-the-art in artificial intelligence is impressive, especially in terms of mastery of language, but not so much in terms of mathematical reasoning.
Introduction to Information Theory – Edward Witten [video] (youtube.com)
What Is Entropy? (wordpress.com)
Data Compression Explained (2011) (mattmahoney.net)
A Mathematical Theory of Communication [pdf] (math.harvard.edu)