Hacker News with Generative AI: Computing

Apple joins UALink consortium with Intel and AMD to take on Nvidia AI dominance (9to5mac.com)
Apple has officially gained a board seat on the Ultra Accelerator Link Consortium, a group of more than 65 members developing next-generation AI accelerator architecture.
Getting an all-optical AI to handle non-linear math (arstechnica.com)
A team of MIT researchers figured that if you had a chip that could process photons directly, you could skip the entire digitization step and perform calculations with the photons themselves. It has the potential to be mind-bogglingly faster.
Nvidia CEO says his AI chips are improving faster than Moore's Law (techcrunch.com)
Nvidia CEO Jensen Huang says the performance of his company’s AI chips is advancing faster than historical rates set by Moore’s Law, the rubric that drove computing progress for decades.
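The claim is ultimately exponential arithmetic: a faster doubling period compounds into a dramatically larger cumulative speedup. A minimal sketch of that comparison — the specific doubling periods below are illustrative assumptions, not figures from the article:

```python
def speedup(years: float, doubling_period_years: float) -> float:
    """Cumulative performance multiplier after `years`,
    assuming performance doubles every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

# A Moore's-Law-style pace: doubling roughly every two years.
moore_10yr = speedup(10, 2.0)   # 2**5 = 32x over a decade

# A hypothetical faster pace (doubling every year) for contrast;
# this period is an illustration, not a number Huang quoted.
fast_10yr = speedup(10, 1.0)    # 2**10 = 1024x over a decade

print(moore_10yr, fast_10yr)
```

The point of the sketch is that even a modest reduction in doubling time compounds into orders of magnitude over a decade, which is why the comparison to Moore's Law is the headline.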
Kids can't use computers and this is why it should worry you (2013) (coding2learn.org)
The truth is, kids can't use general purpose computers, and neither can most of the adults I know.
Reversible Computing Escapes the Lab in 2025 (ieee.org)
This weird information-theory concept has become a power-saving chip
I was wrong about the ethics crisis (cacm.acm.org)
The ethics crisis in computing was “launched” in 2018. In March of that year, The Boston Globe asserted, “Computer science faces an ethics crisis. The Cambridge Analytica scandal proves it!” This was in response to the Techlash, where Wall Street Journal columnist Peggy Noonan described Silicon Valley executives as “moral Martians who operate on some weird new postmodern ethical wavelength” and Niall Ferguson, a Hoover Institution historian, described cyberspace as “cyberia, a dark and lawless realm where malevolent actors range.”
LLMs are everything that is wrong in computing (crys.site)
For decades corporations have been doing anything in their power to make computers worse.
Why Did Early CD-ROM Drives Rely on Awkward Plastic Caddies? (hackaday.com)
These days, very few of us use optical media on the regular. If we do, it’s generally with a slot-loading console or car stereo, or an old-school tray-loader in a desktop or laptop. This has been the dominant way of using consumer optical media for some time.
CUDA Moat Still Alive (semianalysis.com)
SemiAnalysis has been on a five-month long quest to settle the reality of MI300X. In theory, the MI300X should be at a huge advantage over Nvidia’s H100 and H200 in terms of specifications and Total Cost of Ownership (TCO). However, the reality is that the on paper specs as given below are not representative of performance that can be expected in a real-world environment.
Million GPU clusters, gigawatts of power – the scale of AI defies logic (theregister.com)
Classic Computer Magazines (archive.org)
Measuring hardware overhang (2020) (lesswrong.com)
How can we measure a potential AI or hardware overhang? For the problem of chess, modern algorithms gained two orders of magnitude in compute (or ten years in time) compared to older versions.
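The conversion behind that figure is simple: a gain in algorithmic efficiency can be expressed as the number of hardware doublings it replaces, times the doubling period. A minimal sketch — the 1.5-year doubling period is an assumed Moore's-Law-style rate, not a figure stated in the excerpt:

```python
import math

def overhang_years(compute_factor: float,
                   doubling_period_years: float = 1.5) -> float:
    """Convert an algorithmic compute gain into equivalent 'years of
    hardware progress', assuming hardware compute doubles every
    `doubling_period_years` (an assumed Moore's-Law-style rate)."""
    return math.log2(compute_factor) * doubling_period_years

# Two orders of magnitude (100x), as reported for chess engines:
print(round(overhang_years(100), 1))  # ~10 years, matching the article
```

With a 100x gain, log2(100) ≈ 6.6 doublings at 1.5 years each comes out to roughly ten years, consistent with the "two orders of magnitude, or ten years" equivalence in the summary.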
What did Ada Lovelace's program actually do? (2018) (twobithistory.org)
The story of Microsoft’s founding is one of the most famous episodes in computing history. In 1975, Paul Allen flew out to Albuquerque to demonstrate the BASIC interpreter that he and Bill Gates had written for the Altair microcomputer.
Apple Working on Giant Foldable iPad (bloomberg.com)
Apple’s new vision for the future of computing is a giant, iPad-like foldable device. Also: The company rethinks the mouse, its next AirTag will have a new chip that lets you find items from farther away, and a shift to in-house Wi-Fi chips will kick off next year. Apple also is planning satellite and health upgrades for its smartwatch.
The saga of the color brown in the early years of the PC (2023) (blogspot.com)
Lightmatter – The photonic (super)computer company (lightmatter.co)
Raspberry Pi 500 makes an 8GB Pi 5 into a compact, inexpensive desktop PC (arstechnica.com)
One of the selling points of the Raspberry Pi 5 (released in October 2023) is that it was fast enough and had enough memory to be a credible general-purpose desktop PC, if not an especially fast one. For Pi-as-desktop enthusiasts, the company has a couple of new pre-holiday announcements. The biggest is the Raspberry Pi 500, which fits the components of an 8GB Pi 5 into a small keyboard-shaped case for $90.
Raspberry Pi 500 review with Raspberry Pi Monitor and teardown (cnx-software.com)
The Raspberry Pi 500 keyboard PC is just out along with the 15.6-inch Raspberry Pi Monitor, and I received samples from Raspberry Pi for review a few days ago. I’ve had time to play with both, so in this review, I’ll go through an unboxing of the kit I received and report my experience with both the keyboard PC and monitor.
Raspberry Pi 500 Review: The keyboard is the computer, again (tomshardware.com)
Pocket 4 with 8.8″ High-Refresh LTPS Screen, 64GB RAM, 2TB SSD, and 45Wh Battery (linuxgizmos.com)
GPD recently introduced the Pocket 4 on Indiegogo, a compact PC powered by AMD’s latest processors, including the Ryzen AI 9 HX 370. It features up to 64GB of LPDDR5x RAM, an M.2 NVMe port, Gigabit Ethernet, Wi-Fi 6E, Bluetooth 5.3, and more.
Photonic processor could enable fast AI computations with energy efficiency (news.mit.edu)
The deep neural network models that power today’s most demanding machine-learning applications have grown so large and complex that they are pushing the limits of traditional electronic computing hardware.
Raspberry Pi CM5 is a faster, drop-in upgrade (jeffgeerling.com)
The Raspberry Pi Compute Module 5 is smaller than a credit card, and I already have it gaming in 4K with an eGPU, running a Kubernetes cluster, and I even upgraded my NEC Commercial display from a CM4 to CM5, just swapping the Compute Modules!
Apple iMac M4 review: who is this for, exactly? (theverge.com)
The M4 iMac is a beautiful computer that feels more and more like it fell out of a universe where laptops never took off.
Computing Industry Doesn't Care about Performance: how I made things faster (deviantabstraction.com)
These days, my day job is all about optimizing email deliverability, but I am fascinated by making computers faster. Originally, that was even the focus of the company I cofounded (until we pivoted to emails—got to pay the rent!). I still believe that the way we’ve built the computing ecosystem is fundamentally flawed and, in many ways, disempowering.
M4 Mac Mini Cluster [video] (youtube.com)
Ancient Computers in Use Today (2012) (pcworld.com)
It’s easy to wax nostalgic about old technology, to remember fondly our first Apple IIe or marvel at the old mainframes that ran on punched cards. But no one in their right mind would use those outdated, underpowered dinosaurs to run a contemporary business, let alone a modern weapons system, right?
Ubitium is developing 'universal' processor combining CPU, GPU, DSP, and FPGA (tomshardware.com)
AI PCs make users less productive (theregister.com)
Those using personal computers with built-in AI services are less productive than those using traditional PCs, according to a study conducted by Intel.
How the ZX Spectrum became a 1980s icon (bbc.com)
The ZX Spectrum was a 1980s icon which played a starring role in the revolution that brought computers into the UK’s homes for the first time.
The end of ChromeOS is a new dawn for cheap Android laptops (zdnet.com)
It's the beginning of the end for ChromeOS as Google faces a pivotal challenge: compete with Apple's Arm dominance while leveraging AI and custom silicon to redefine affordable computing.