It was a Saturday in 2015, perhaps 2016. I was still “normal” back then, still convinced that technology was inherently positive, potentially revolutionary, still naive enough to believe that the internet liberated by definition. I was browsing books at Waterstones on Sauchiehall Street in Glasgow—one of my little guilty pleasures since I landed in Scotland—when I came across “The Net Delusion: The Dark Side of Internet Freedom” by Evgeny Morozov. I picked up the book, went downstairs, sat in the in-house café and started reading. And it threw me into a crisis. His thesis demolished, piece by piece, the narrative of the “Twitter Revolution” of 2009 in Iran. In the book, Morozov cited an analysis by Golnaz Esfandiari, an Iranian journalist writing in Foreign Policy, who had done something simple but, these days, almost revolutionary: journalism (if you're laughing at this point, you're good people...). She had looked at where the tweets with #iranelection actually came from during the 2009 protests. And the answer? From the West. Not from Iran. Wait, what? Yes, exactly. It was theater. Western self-celebration masquerading as solidarity.
Read more...
3:00 AM. Another one of those nights where my brain decided sleep was overrated. After my usual nocturnal walk through the streets of a remote Scottish town—where even a fox observed me with that “humans are weird” look—I sat back down at my server. Just a quick scan of my RSS feeds, I told myself, then I can start work. When...
We backed up Spotify (metadata and music files). It's distributed in bulk torrents (~300TB), grouped by popularity.
This release includes the largest publicly available music metadata database with 256 million tracks and 186 million unique ISRCs.
It's the world's first “preservation archive” for music which is fully open (meaning it can easily be mirrored by anyone with enough disk space), with 86 million music files, representing around 99.6% of listens.
The news came from Anna's Archive—the world's largest pirate library—which had just scraped Spotify's entire catalog. Not just metadata, but also the audio files. 86 million tracks, 300 terabytes. I stopped to reread those numbers, then thought: holy shit, how big is this thing?
Read more...
A hackmeeting, many years ago. A conference on various open-source projects. They were talking about Kiwix. The audience seemed interested, nodding, asking questions. I sat in the back of the room with a doubt that seemed legitimate but that I didn't dare express out loud: “what's the point of offline Wikipedia?” I mean: the internet is everywhere. If you need to look something up on Wikipedia, you open your browser, search, read. Done. Why would anyone download gigabytes of data to consult an encyclopedia offline? It seemed like a solution in search of a problem. Something for nerds nostalgic for CD-ROM encyclopedias.
It took me years to understand how naive I'd been.
Read more...
There's a moment in the history of technology when everything changes. We don't always recognise it. Sometimes it takes years to understand that a small spark, an apparently insignificant detail, ignited a revolution that would forever change the way we live, communicate, and consume culture. In 1987, an American singer-songwriter named Suzanne Vega released a minimalist track called “Tom's Diner”. Two minutes and nine seconds of a cappella vocals, no instrumental accompaniment, no special effects. Just a voice telling the story of an ordinary morning in a New York diner. A song so essential, so pure in its simplicity, that someone on the other side of the world – a German engineer obsessed with #audio compression – would use it as a benchmark to create a technology that would shake the global music industry to its core. That technology was called #MP3. And that voice, that “warm a cappella voice” as Karlheinz Brandenburg would later describe it, would become the ultimate test to determine whether a compression algorithm actually worked or not.
Read more...
From time to time, to completely disconnect from everything and everyone, I turn back into a kid and immerse myself in video games. I'm slow, I admit it: a game that would normally take 4-5 hours, I finish in at least quadruple the time. But every now and then, in the depths of Steam, I come across genuine gems. And last night I finally completed Planet of Lana, a 2023 indie game that had been sitting in my library for months. The plot is straightforward but effective: Lana and Elo, presumably brother and sister, live in a peaceful fishing village built on stilts, where life flows serenely in harmony with nature. But this peace is shattered when a group of robots assaults the village, kidnapping some inhabitants including Elo himself. From here begins Lana's odyssey: a journey to the edges of the known world to find and save her brother.
Read more...
When the world woke up astonished in November 2022 to this “magical” chatbot, few realized that this magic was the result of decades of research. The history of artificial intelligence begins in 1943, when Warren McCulloch and Walter Pitts proposed the first mathematical model of an artificial neuron. In 1956, at the Dartmouth Conference, John McCarthy coined the term “Artificial Intelligence” and the discipline was officially born.
The '60s and '70s were characterized by excessive optimism: people thought strong AI was just around the corner. Two “AI winters” followed – periods when funding disappeared and research slowed – because promises weren't materializing. But some continued working in the shadows. Geoffrey Hinton, Yann LeCun, Yoshua Bengio – those we now call the “godfathers of deep learning” – continued their studies on neural networks when no one believed in them anymore.
Read more...
In the silence of dozens of sleepless nights, in the solitude of my keyboard and laptop, I have imagined worlds. Fantastic scenarios inspired by hundreds of books, perhaps read too hastily, that have embedded themselves in my mind like small precious memories. The blue glow of my screen became a portal to these universes as my fingers translated thoughts into digital existence, each keystroke bringing new realities to life. As a longtime passionate reader of Cyberpunk, and only recently of Solarpunk, I have patiently imagined a story. Cyberpunk's dystopias in a Post-Apocalyptic world and Solarpunk's hopeful ecological futures have merged in my creative space, forming a unique vision that explores both technological power and environmental harmony. I build it unhurriedly, without a deadline, shaping the characters one at a time. I grow attached to them, explore them, abandon them, return to them, weep, and begin again. Each character carries fragments of real lives, observed emotions, and contemplated philosophies – becoming more real to me with every written line.
Read more...
Imagine a sleepless night, much like mine, where a single term – perceptron – catches your eye. It’s a word that seems both familiar and foreign, yet it opens a door to a world of ideas that have shaped our understanding of artificial intelligence. This is the story of the perceptron – a concept that began with promise, faced a significant setback, and then paved the way for the AI revolution we experience today.
The idea of thinking machines isn’t new. Ancient myths and philosophical debates have pondered the possibility of machines mimicking human thought. Yet, it wasn’t until the 20th century that these ideas took concrete form. Alan Turing, in his 1950 paper Computing Machinery and Intelligence, sparked a revolution by suggesting that machines could simulate human thought. He even proposed the Turing Test—a way to measure machine intelligence by their ability to converse like humans. This was a bold vision, and it set the stage for the birth of artificial intelligence as a field of study.
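To make the perceptron less abstract: at its heart it is just a weighted sum, a threshold, and Rosenblatt's simple error-driven update rule. A minimal sketch in Python (the function names and the OR-gate training data here are illustrative, not from the original post):

```python
# Minimal sketch of Rosenblatt's perceptron learning rule.
# train_perceptron and OR_GATE are illustrative names, not from the post.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn weights w and bias b for two binary inputs."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # Fire (output 1) if the weighted sum crosses the threshold
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            # Rosenblatt's rule: nudge weights toward the correct answer
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# OR is linearly separable, so the perceptron converges on it
OR_GATE = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(OR_GATE)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
```

The same loop run on XOR would never converge, no matter how many epochs you give it: XOR is not linearly separable, which is exactly the setback Minsky and Papert pointed out, and why the field had to wait for multi-layer networks.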
Read more...
I am a regular user of AbeBooks. Often, when I’m searching for books, I turn to the second-hand market, and AbeBooks, acting as an aggregator, is just perfect for my needs. Recently, among the recommended books, I came across a work by Richard Dawkins that I hadn’t read yet: The Magic of Reality. It’s a very accessible book that explains how to apply the scientific method to analyze the world around us. Interestingly, in my humble opinion, it pairs well with The Demon-Haunted World by Carl Sagan, a work that significantly propelled the modern skepticism movement.
In one of the final chapters, Dawkins focuses on the concept of evil and how the human mind struggles to understand that there is no “blame” or “natural justice” behind negative events. Rather, everything results from a series of coincidences, only partially controllable by humans. How often do we ask ourselves: “Why did this happen to me? It’s so unfair!”
Read more...
In recent months, I’ve been making a concerted effort to reconnect with the world of video games—a phrase that might make some smile, but let me explain. Back in 1999, when I made the definitive switch from Windows to Linux, I also bid farewell to gaming. As a devoted Dungeons & Dragons enthusiast, I had fallen in love with Diablo and, later, its much-criticized expansion Hellfire, where I played as the Monk. A few years later, my aunt gifted me a PlayStation 2, and I dabbled in Kingdom Hearts. However, I quickly lost interest. That was a difficult period in my life, marked by my first major bout of depression, and I couldn’t even find solace in gaming. As the saying goes, the first time is never forgotten.
Recently, I rediscovered the joy of gaming, thanks to an Italian creator, Phenrir, whose work I discovered through her review of the TV series The Man in the High Castle. I started with a few indie games—To the Moon, Machinarium, Nocturnal, The Bridge, The Swapper, and Sheepy. Then, through recommendations and the ever-reliable Steam (bless it!), I stumbled upon a game that left me utterly breathless: Inside. Released in 2016 and developed by Playdead, this game features a minimalist yet profoundly impactful narrative.
Read more...