The Beat

Neuromancers, Unthinkable Complexity, and Creative Cultural Osmosis

Welcome to The Beat, Decential’s weekly exploration of music, culture and the new Internet – featuring all the friends we’ve met along the way.

On the culture-tech byway, things move at breakneck speeds. From web3 to AI, copyright to collective ownership, art to psychedelics, The Beat is an exercise in association. We all contain multitudes, and within them, vast differences. But there is some connective, fundamental essence to be found.

The Beat is dedicated to that essence; the rest, as Alex Ross wrote, is noise.

Creative Cultural Osmosis

“Cyberspace. A consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts...

A graphic representation of data abstracted from banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding...”

William Gibson, Neuromancer

That’s a passage from William Gibson’s iconic 1984 novel, Neuromancer. Regarded as the progenitor of the cyberpunk genre, it was the first novel to win the Nebula, the Hugo, and the Philip K. Dick Award.

Gibson was formative to the way we think about “cyberspace.” He coined the term himself two years earlier, in his novelette Burning Chrome. In the afterword to a 2000 reissue of Neuromancer, fellow author Jack Womack went so far as to suggest that Gibson’s writing was the catalyst for the World Wide Web. “What if the act of writing it down, in fact, brought it about?” Womack asked.

It’s a Matrix-like thought. And it’s well known that the film The Matrix drew heavily from the novel. “I was, as you can probably imagine, prepared not to like The Matrix,” Gibson wrote on his blog after seeing it. “I liked it a lot. I even went back to see it a second time in theatrical release, which is unusual for me… It’s thematically gnostic, something Neuromancer isn’t. Whatever of my work may be there, it seems to me to have gotten there by exactly the kind of creative cultural osmosis I’ve always depended on myself.”

Today, as AI continues to permeate our lives, what kinds of “creative cultural osmosis” should we depend on? What will be AI’s ultimate form? And will we survive it?

The Turing Police

In Neuromancer, there’s a multinational law enforcement agency called the Turing Police. They’re tasked with containing AI development to prevent it from becoming too powerful or self-aware.

That agency came to mind when Trump promised to nix Biden’s 2023 executive order on “the safe, secure, and trustworthy development and use of artificial intelligence.”

And, surely, he’s less informed than the 2,600+ tech CEOs and leaders – Elon Musk and Steve Wozniak amongst them – who signed a petition in March 2023 calling for a temporary pause on AI development.

It was around then, too, that Dr. Geoffrey Hinton, the “Godfather of AI,” quit Google and warned of dangers ahead, expressing regret over his role in creating the technology. And those dangers are beginning to manifest.

Autonomous weaponry is perhaps the most egregious form of clear and present danger – and it’s already on the battlefield. Last week, NPR shared several firsthand accounts of Palestinian civilians being picked off and killed by Israeli “quadcopter” drones. It’s Black Mirror-esque, and a damning sign that the world has thrown caution to the wind – with no Turing Police to keep us in check, and few ethical guardrails to prevent those in power from reducing people to disposable data points.

As I wrote a couple of Beats ago, much of Trump’s Silicon Valley support came from the folks banking on AI deregulation. Not all Silicon Valley concoctions are as vile as the quadcopter, but even on something as relatively innocuous as creator platforms, there’s a tendency to reduce human beings to ones and zeros.

But “these platforms will collapse without those creators,” wrote Ted Gioia in a recent edition of his Honest Broker newsletter, “so the corporate bosses need to treat them well.”

According to a recent study, 14 percent of the United States’ working-age population is a creator on a web platform (and 44 percent of those creators do that work full-time).

Gioia continues:

“Very few CEOs in Silicon Valley have figured this out yet. They are tech people, and tend to focus on apps and clicks and software upgrades. But they are really in the business of managing human talent.

Sure, they would like to replace all these human creators with AI. But that’s not happening anytime soon – or maybe not anytime at all. The consumers of creative works are skeptical of bot-generated work.

This represents a huge shift in power. Just a few years ago, Mark Zuckerberg or Elon Musk or Jeff Bezos could rule their platforms like dictators, and there were no negative consequences.

That’s changing.”

One harbinger of change is a recent “exodus” from Twitter, Gioia observed. “Everybody I know on the platform has seen a significant erosion in followers since the election,” he wrote. I’ve noticed that, too.

Power, Gioia suggests, is “shifting rapidly to indie creators.” It’s a hopeful development, and I wonder, what does AI look like in that world? When platform dictators cannot replace us without consequence? When we have more agency in our cultural osmosis?

As thousands were urging regulators to pause AI development, LAION, the Large-scale Artificial Intelligence Open Network, was trying to democratize it – and that feels just as important for cultivating a healthy relationship between people and machines.

In their petition, LAION called for a CERN-like global organization to coordinate efforts on the transparency of large-scale AI research – a sort of Turing Police, if you will, that exists outside the purview of Big Tech and corporate fiduciary duty.

Because until there are viable, more cooperative alternatives – which are, thankfully, emerging – Big Tech has little incentive not to keep us in algorithmic bubbles, isolated in our own feedback loops, primed to consume.

To borrow another line from Neuromancer: “We have sealed ourselves away behind our money, growing inward, generating a seamless universe of self.”

AI and Intimacy

Last week, at the London venue Made by Many, I attended an evening on AI and intimacy. Three people spoke at the event: Jenny Kleeman, a journalist, broadcaster and author (Sex Robots & Vegan Meat, The Price of Life); Stephen O’Farrell, a senior machine learning engineer at Bumble and the lead for their recently launched AI Discovery team; and Margarita Popova, Chief Product Officer at Replika – an AI companion app with over 30 million active users.

Here was a different perspective. When used responsibly and with more connective intention, they asked, can AI fill our gaps? Can it mitigate a pandemic of loneliness that’s been exacerbated by Big Tech and social media?

Because today, nearly half of all adults feel less confident about their self-image after using social media. In the UK, 7 in 10 young people report that social media harms their mental health. Australia just banned social media for everyone under 16. Can we use technology to ameliorate the woes – that “seamless universe of self” – that technology has created?

Next Wednesday, I’m hosting my own event, Meridians, at Made by Many. I’ve written here before about my stutter, and about losing trust in my voice, and about Moon Man, the alter ego my subconscious created when bullies forced me to turn inward.

In my journey, I realized that everyone has a Moon Man. Each of us has a “meridian,” that liminal space between the “true self” and the image we project to the world. And through my music, I realized I could invite people to join the healing process. 

I conceived an interactive prototype using AI that bridges live performance with digital co-creation. As I play, the audience is invited to sit with their “Moon Man,” and they’re encouraged to express what comes forth on a custom leaflet.

Afterward, everyone’s expressions are filtered through a generative AI program. They’re then collectively transformed into a character from Braneworld, an emergent digital realm and storytelling container where our various selves collide.

In this way, the performance remains intimate and human-centered, and the technology acts as a bridge between our inner and outer selves. It transforms individual struggles into collective strength, becoming a shared mythology that we can all draw from.

And if it helps us approach a more democratic, more indie, less isolated reservoir for cultural osmosis, will we look back and ask: what if the act of writing it down, in fact, brought it about?

As Octavia Butler wrote in her 1993 novel, Parable of the Sower, “All that you touch, you change. All that you change, changes you.”

Coda

This week I’m back home in Minnesota for Thanksgiving (happy Thanksgiving to all who celebrate). It’s my first time back in a couple years, and I’m soaking up the quietude of my very small hometown. As I write this, I’m looking out the window at an empty cornfield, plowed and dormant, steadily freezing into the long northern winter.

Mourning doves sing. From time to time, cardinals fly by and drink from the bird bath. The trees have all lost their leaves, and they stand quietly in the cold air. Still, life moves on. Unthinkable complexity. The city lights receding quickly from my mind.

Now go outside and listen to music – it’s a beautiful day.

My name is MacEagon Voyce. For more music and less noise, consider subscribing to The Beat. And if you already do, consider sharing with a friend. Thanks for being here.