My post-punk media-theory book, The Medium Picture, is now available for preorder from all of your favorite places: UGA Press, Bookshop, Barnes & Noble, and even Amazon! Preorders mean more than you think. They’re very important for the life and success of the book. If you know you’re going to buy it, please consider snagging a copy early.
Preorders serve as an early indicator of a book’s potential success. They signal to publishers and retailers that there’s interest in the book, which can lead to bigger marketing efforts and larger print runs. For authors, preorders are crucial for boosting a book’s visibility: on platforms like Amazon, they count toward the sales ranking before release, which can improve exposure and influence the platform’s own promotional efforts. If you’re not sure, read on! Thank you!
Of all of my books, this is the one I’ve worked on longest and hardest. It’s the closest to my heart.
Here’s what other people are saying about it:
“Exactly the sort of contemporary cultural analysis to yield unnerving flashes of the future.” — William Gibson
“Like a skateboarder repurposing the utilitarian textures of the urban terrain for sport, Roy Christopher reclaims the content and technologies of the media environment as a landscape to be navigated and explored. The Medium Picture is both a highly personal yet revelatory chronicle of a decades-long encounter with mediated popular culture.” — Douglas Rushkoff
“A synthesis of theory and thesis, research and personal recollection, The Medium Picture is a work of rangy intelligence and wandering curiosity. Thought-provoking and a pleasure to read.” — Charles Yu
If you’re so inclined, you can post one of these on the social medium of your liking. Link ‘em to your favorite online book outlet or just to http://www.themediumpicture.com
After looking back at the unified election map from 1984 and griping about advertising again, I arrived this week at their intersection: Apple’s 1984 Super Bowl commercial introducing the Macintosh. It launched not only the home-computer revolution but also the Super Bowl advertising frenzy and phenomenon.
The commercial burned itself right into my brain and everyone else’s who saw it. It was something truly different during something completely routine, stark innovation cutting through the middle of tightly-held tradition. I wasn’t old enough to understand the Orwell references, including the concept of Big Brother, but I got the meaning immediately: The underdog was now armed with something more powerful than the establishment. Apple was going to help us win.
Apple has of course become the biggest company in the world in the past 40 years, but reclaiming the dominant metaphors of a given time is an act of magical resistance. Feigning immunity from advertising isn’t a solution; it provides a deeper diagnosis of the problem. Appropriating language, mining affordances, and misusing technology and other cultural artifacts create the space for resistance not only to exist but to thrive. Aggressively defying the metaphors of control, the anarchist poet Hakim Bey termed the extreme version of these appropriations “poetic terrorism.” He wrote,
The audience reaction or aesthetic-shock produced by [poetic terrorism] ought to be at least as strong as the emotion of terror—powerful disgust, sexual arousal, superstitious awe, sudden intuitive breakthrough, dada-esque angst—no matter whether the [poetic terrorism] is aimed at one person or many, no matter whether it is “signed” or anonymous, if it does not change someone’s life (aside from the artist) it fails.
Echoing Bey, the artist Konrad Becker suggests that dominant metaphors are in place to maintain control, writing,
The development in electronic communication and digital media allows for a global telepresence of values and behavioral norms and provides increasing possibilities of controlling public opinion by accelerating the flow of persuasive communication. Information is increasingly indistinguishable from propaganda, defined as “the manipulation of symbols as a means of influencing attitudes.” Whoever controls the metaphors controls thought.
In a much broader sense, so-called “culture jamming” is any attempt to reclaim the dominant metaphors from the media. Gareth Branwyn writes, “In our wired age, the media has become a great amplifier for acts of poetic terrorism and culture jamming. A well-crafted media hoax or report of a prank uploaded to the Internet can quickly gain a life of its own.” Culture jammers, using tactics as simple as modifying phrases on billboards and as extensive as impersonating leaders of industry on major media outlets, expose the ways in which corporate and political interests manipulate the masses via the media. In the spirit of the Situationist International, culture jammers employ any creative crime that can disrupt the dominant narrative of the spectacle and devalue its currency.
“If you want a picture of the future, imagine a boot stamping on a human face—forever.” — George Orwell, 1984
“It’s clearly an allegory. Most commercials aren’t allegorical,” OG Macintosh engineer Andy Hertzfeld says of Apple’s “1984” commercial. “I’ve always looked at each commercial as a film, as a little filmlet,” says the director Ridley Scott. Fresh off of directing Blade Runner, which is based on a book he infamously claims never to have read, he adds, “From a filmic point of view, it was terrific, and I knew exactly how to do a kind of pastiche on what 1984 maybe was like in dramatic terms rather than factual terms.”
David Hoffman once summarized Orwell’s 1984, writing that “during times of universal deceit, telling the truth becomes a revolutionary act.” As surveillance has expanded from mounted cameras to wireless taps (what Scott calls “good dramatic bullshit”; cf. Orwell’s “Big Brother”), hackers have evolved from phone phreaking to secret leaking. It’s a ratcheting up of tactics and attacks on both sides. Andy Greenberg quotes Hunter S. Thompson, saying that the weird are turning pro. It’s a thought that evokes the last line of Bruce Sterling’s The Hacker Crackdown which, after deftly chronicling the early history of computer hacker activity, investigation, and incarceration, states ominously, “It is the End of the Amateurs.”
These quips could be applied to either side.
The Hacker Ethic—as popularized by Steven Levy’s Hackers (Anchor, 1984)—states that access to computers “and anything which might teach you something about the way the world works should be unlimited and total” (p. 40). Hackers seek to understand, not to undermine. And they tolerate no constraints. Tactical media, so-called to avoid the semiotic baggage of related labels, exploits the asymmetry of knowledge gained via hacking. In a passage that reads like an account of recent events, Geert Lovink, purveyor of the term, writes, “Tactical networks are all about an imaginary exchange of concepts outbidding and overlaying each other. Necessary illusions. What circulates are models and rumors, arguments and experiences of how to organize cultural and political activities, get projects financed, infrastructure up and running and create informal networks of trust which make living in Babylon bearable.”
If you want a picture of the future now, imagine a sledgehammer shattering a screen—forever.
Following Matt Blaze, Neal Stephenson states “it’s best in the long run, for all concerned, if vulnerabilities are exposed in public.” Informal groups of information insurgents like the crews behind Wikileaks and Anonymous keep open tabs on the powers that would be. Again, hackers are easy to defend when they’re on your side. Wires may be wormholes, as Stephenson says, but that can be dangerous when they flow both ways. Once you get locked out of all your accounts and the contents of your hard drive end up on the wrong screen, hackers aren’t your friends anymore, academic or otherwise.
Hackers of every kind behave as if they understand that “[p]ostmodernity is no longer a strategy or style, it is the natural condition of today’s network society,” as Lovink puts it. In a hyper-connected world, disconnection is power. The ability to become untraceable is the ability to become invisible. We need to unite and become hackers ourselves now more than ever against what Kevin DeLuca calls the acronyms of the apocalypse (e.g., WTO, NAFTA, GATT, etc.). The original Hacker Ethic isn’t enough. We need more of those nameless nerds, nodes in undulating networks of cyber disobedience. “Information moves, or we move to it,” writes Stephenson, like a hacker motto of “digital micro-politics.” Hackers need to appear, swarm, attack, and then disappear again into the dark fiber of the Deep Web.
Who was it that said Orwell was 40 years off? Lovink continues: “The world is crazy enough. There is not much reason to opt for the illusion.” It only takes a generation for the underdog to become the overlord. Sledgehammers and screens notwithstanding, we still need to watch the ones watching us.
Growing up watching cartoons and slapstick comedies made it seem like rare one-off events like getting stuck in quicksand, slipping on banana peels, and anvils falling from the sky were persistent problems in the world. Not only that, but primetime dramas made it seem like adults could get arrested for anything, and they might never even know the reason! The world seemed dangerous in ways that it really wasn’t.
Posited by George Gerbner in the early 1970s, cultivation theory states that among heavy television viewers, there is a tendency to view the world outside as similar to the world the way the television depicts it. That is, heavy media consumption tends to skew the general views of the media consumer.
Around the turn of the millennium there was a major push in certain underground circles to subvert consensus reality. The internet had connected people according to their esoteric interests (“find the others” as one popular site put it at the time), and it had evolved to a place where they could launch campaigns against the larger culture. Rabble-rousers came together in temporary autonomous zones to jam culture and pull pranks on the squares.
Josh Keyes, “Drift” (2020).
A lot has changed since those heady days of us vs. them. The squares are all online now, and the mainstream has split into a million tributaries. Online, our media diets are now directed by search results that base what we see on what we’ve seen before and on what’s popular among others like us. In other words, the algorithm-driven internet is a similarity engine, producing a shameless sameness around our interests and beliefs, cocooning each of us in an impervious reflective armor. This can create what Eli Pariser calls a filter bubble, an echo chamber of news, information, and entertainment that distorts our view of the real world. As Pariser told Lynn Parramore at The Atlantic,
Since December 4, 2009, Google has been personalized for everyone. So when I had two friends this spring Google “BP,” one of them got a set of links that was about investment opportunities in BP. The other one got information about the oil spill. Presumably that was based on the kinds of searches that they had done in the past.
Combine Gerbner’s cultivation theory and Pariser’s filter bubble, and you’ve got a simple recipe for media-enabled solipsism. “Participatory fiction. Choose your own adventure,” the conspiracy theory chronicler Robert Guffey writes. “Virtual reality, but with no goggles necessary.” False microrealities like the Deep State, PizzaGate, and QAnon come alive in this environment. A limited ecosystem produces limited results.
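The “similarity engine” dynamic described above can be sketched in a few lines of Python. This is a hypothetical toy model — the catalog, tags, and scoring below are my own invention, not any platform’s actual algorithm — but it shows the feedback loop: each recommendation is scored by overlap with the user’s history, so two users who start in different places keep drifting apart.

```python
from collections import Counter

# Hypothetical toy "similarity engine": recommend whatever most resembles
# what the user has already consumed, then fold the recommendation back
# into their history. Catalog and tags are invented for illustration.
CATALOG = {
    "oil-spill-news": {"bp", "environment", "news"},
    "bp-investing":   {"bp", "finance", "markets"},
    "market-tips":    {"finance", "markets", "money"},
    "climate-report": {"environment", "science", "news"},
}

def recommend(history):
    # Score each unseen item by tag overlap with everything already seen.
    seen_tags = Counter(tag for item in history for tag in CATALOG[item])
    candidates = [i for i in CATALOG if i not in history]
    return max(candidates, key=lambda i: sum(seen_tags[t] for t in CATALOG[i]))

# Two users effectively "search" the same term but arrive with different pasts.
investor, activist = ["market-tips"], ["climate-report"]
investor.append(recommend(investor))  # pulled toward finance
activist.append(recommend(activist))  # pulled toward the environment
print(investor, activist)
```

Run it and the investor is served the investment angle while the activist is served the spill — the same divergence Pariser saw with his two friends, reproduced by nothing more than weighted overlap with past behavior.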
It’s not farming and it’s not agriculture, it’s gardening: each of us hoeing a row, working a plot to grow only the food we want, regardless of what everyone else is eating.
This fragmentation in the United States has never been more evident than during the last few presidential elections. Above is the electoral map from the last one. As the nightly network news spread out into 24-hour cable coverage, so did its audience and its intentions. In his book, After the Mass-Age, Chris Riley writes that instead of trying to get the majority to watch, each network preferred a dedicated minority: “Now you didn’t win the ratings war by being objective; you won by being subjective, by segmenting the audience, not uniting them.” And we met them in the middle, seeking out the news that presented the world more the way we wanted to see it than the way it really was. With the further splintering of social media, we choose the news that fits us best. If we’re all watching broadcast network news, we’re all seeing the same story. If we’re all on the same social network, no two of us are seeing the same thing.
Rewind: Above is the electoral map from the 1984 US presidential election. Republican incumbent Ronald Reagan carried 49 of the 50 states, while Walter Mondale pulled only his home state of Minnesota and the District of Columbia. The year 1984 stands as the most united these states have ever been behind a president.
This map is the product of broadcast and print media: one-to-many, mass media like television, radio, newspapers, and magazines. Over the past 40 years those platforms have divided and splintered further and further into unique, individual experiences. The 2020 map above is a product of the internet and social media: many-to-many, multiple sources and viewpoints, and fewer shared mediated experiences.
The medium is only the message at a certain scale, and that scale is diminished.
Reality doesn’t scale in the way that our media depicts it. Nietzsche once called any truth a “useful fiction.” Now that’s all we have, but a lot of them aren’t useful, and none of them are sustainable. A temporary autonomous zone is just that — temporary. There is no longer a consensus to subvert, but we need to know what everyone else is eating if we’re ever going to eat together again.
This is only one of the results of our media gardening. If we share fewer and fewer mediated experiences, some of those disconnections are going to have consequences. Tucked away in the alleys and valleys of our own interests, we stay entrenched in our own tribes, utterly outraged at any other tribe’s dis, disdain, or destruction of one of our own’s preciously held beliefs. The internet has exacerbated these conditions. Instead of more connection, there is a sense of more dis-connection. Where we are promised diversity, we get division. We burrow so deep in our own dirt that we can’t see the world as it really is: a spinning blue ball covered with tiny cells, passive plants, and dumb meat, each just trying to make its own way. Starting from such focus, we can find ourselves in a place. We can belong at a certain level. It just feels like now we never seem to zoom out far enough to see the whole. Instead of giving us the tools to see the bigger picture, the algorithmic biases of our media feed our own individual biases.
Retreat is not the answer, retreat is the problem. We need more connection, not less — real connection. We need to eat at the same table once in a while. We need to engage more with those who aren’t like us. Lift the little ones, help the ones who need it, and learn as much about each other as we can.
A new year typically brings renewal and hope. I will admit to struggling to find it in these first couple of weeks of 2024. There are too many things we need to get out from under first. Satisficing, resigning oneself to the first workable option as sufficient (the word itself a workable but unwieldy portmanteau of “satisfy” and “suffice”), is often considered a good thing, saving one from the needless pursuit of an elusive better or optimal solution. Too much of this good thing leads to the same old thing.
After writing about unintended outcomes and technology not solving problems a few weeks ago, I seem to have closed something else off. Now those unintended outcomes are all I see. Greatness is never achieved through satisficing. The road to mediocrity is paved with just good enough. Now more than ever, we need more than that.
There’s a story under there somewhere, I think.
When you watch a video clip on YouTube, it is typically preceded (and often interrupted) by some sort of advertising. They give you a countdown clock to when the ad is over or to when you can click “skip” and get on with your initial purpose. The very existence of this click-clock indicates that the people at YouTube know that you don’t want to see the ad(s) on their site! They’ve been cracking down on plug-ins that block such ads, and they, along with other such “services,” offer premium packages where you can eschew all ads for an additional monthly fee (Gee, thanks!).
I mentioned direct mail in the preamble to my previous list, writing that a successful direct-mail advertising campaign has a response rate of 2% and what a waste that is for all involved (98%!). How much of your mail goes straight into the recycling compared to actual correspondence? Mail seems like an antiquated example, until you go online.
It’s global, yet it’s local.
It’s the next thing in Social.
Hip-hop, rockin’, or microbloggin’ —
You get updates every time you log in.
So, come on in, we’re open,
And we’re hopin’ to rope in
All your Facebook friends and Twitter memories.
There’s a brand-new place for all of your frenemies.
You don’t really care about piracy or privacy.
You just want music and friends as far as the eye can see.
So, sign up, sign in, put in your information.
It’s the new online destination for a bored, boring nation.
Tell your friends, your sister, and your mom.
It’s time for justgoodenough.com
When you log into Instagram and check your notifications (or your other accounts or even your email), how many of them are from people you follow and how many are from spam accounts? Mine are fairly even. That is, I spend as much time on these platforms deleting junk as I do “interacting” with friends and colleagues. I’m sure you have similar experiences.
Where is the break boundary? Where is the point when enough of us have had enough to actually ditch these platforms? I abandon my accounts every other month. None of them are essential after all. YouTube and Instagram are toys at best, amusements for brains trained to seek such tiny nuggets of validation and entertainment, but these same inconvenient priorities spill over into things that do matter. All noise and very little signal. All soggy vegetables and very little pudding.
We’re starving, but… Everything is okay.
Everything is just okay. And it won’t get better until we all demand something else. It won’t get better until we stop satisficing and give each other more of what we want and less of what they want us to have.
Unintended outcomes are the furniture of our uncertain age. Decades of short-term thinking, election cycles, and bottom lines assessed quarterly have wound us into a loop we can’t unwind. In addition, our technologies have coopted our desires in ways we didn’t foresee. The internet promised us diversity and gave us division. Social media promised to bring us together; instead it fomented frustration and rage between friends and among family. We know the net result is bad, but we won’t abandon these poisonous platforms.
As straw-person an argument as it might be, direct mail is my favorite example. Successful direct-mail advertising has a response rate of 2%. That means that in a successful campaign, 98% of the effort is wasted. In any other field, if 98% of what you’re doing is ineffective, you would scrap it and start over.
I’ve been thinking about case studies of ineffective efforts and unintended outcomes, and I came up with five for your consideration — IRL: Idea, Reality, Lesson.
“Shadow Play,” Sharpie on paper, 2005.
Idea: AI as a tool for creativity. Reality: Training large-language models (and the other software that currently passes as artificial intelligence) to be “creative” requires the unpaid labor of many writers and artists, potentially violating copyright laws and relegating the creative class to the service of the machines and the people who use them. Lesson: Every leap in technology’s evolution has winners and losers.
Idea: Self-driving cars will solve our transportation problems. Reality: Now you can be stuck in traffic without even having to drive. Lesson: We don’t need more cars with fewer drivers. We need fewer cars with more people in them.
Idea: Put unused resources to use. Reality: The underlying concept of companies like Uber and AirBnB—taking unused resources (e.g., vehicles, rooms, houses, etc.) and redistributing them to others in need—is brilliant and needed in our age of abundance and disparity. Instead of using what’s there, a boutique industry of rental car partnerships for ride-share drivers and homes bought specifically for use as AirBnB rentals sprang up around these app-enabled services. Those are fine, but they don’t solve the problem the original idea set out to leverage. Lesson: You cannot disrupt capitalism. Ultimately, it eats everything.
Idea: Content is King. Reality: When you can call yourself a “Digital Content Creator” just because you have a front-facing camera on your phone, then content is the lowest form. To stay with the analogy, Content is a peasant at best. Getting it out there is King. Getting and maintaining people’s attention is Queen. Lesson: Distribution and Attention are the real monarchy.
Idea: Print is dead. Reality: People have been claiming the death of print since the dawn of the web—over 30 years now—and it’s still patently untrue. Print is different, but it’s far from dead. Books abound! People who say this don’t read them anyway. Just because they want synopses and summaries instead of leisurely long reads doesn’t mean that everyone wants that. Lesson: Never underestimate people’s appetite for excuses.
If more of what you’re doing is wasteful rather than effective, then you should rethink what you’re doing. Attitudes about technology are often incongruent with their realities, and the way we talk about its evolution matters. Moreover, while many recent innovations seem to be helping, there are adjacent problems they’re not solving. Don’t be dazzled by stopgap technologies that don’t actually solve real problems.
No one reads. People say this all the time, and as a writer, it’s very hard to hear. If I’m ever forced to start a podcast, that will be the reason, and it might be the name. If no one reads, why are we outsourcing writing? According to a recent article on Futurism, the sports magazine Sports Illustrated allegedly published reviews generated by artificial intelligence. Not only that, but the bylines on those articles belonged to writers who weren’t real either.
Drew Ortiz, a “Product Reviews Team Member” for Sports Illustrated.
Meet Drew Ortiz, a “neutral white young-adult male with short brown hair and blue eyes” (likely on purpose), and a “Product Reviews Team Member” for Sports Illustrated. One of Drew’s many articles for SI claims that volleyball “can be a little tricky to get into, especially without an actual ball to practice with.” True enough, Drew, but it’s also tricky to get into if you don’t have an actual body to practice with either.
Look, Drew is just like you and me.
Drew was eventually replaced briefly by Sora Tanaka, a “joyful asian young-adult female with long brown hair and brown eyes.” Futurism also notes Jim Cramer’s TheStreet hosting articles by Domino Abrams, Nicole Merrifield, and Denise McNamera — all pseudonyms for AI-generated pseudoscribes.
Sora Tanaka, a “joyful asian young-adult female with long brown hair and brown eyes.”
Given that this path was paved when we first outsourced our thinking to written language, it’s perhaps most fitting that what passes for artificial intelligence these days are large language models, none of which can play volleyball but can write about it. The computer scientists Allen Newell and Herbert A. Simon defined thinking in just such terms, writing, “A physical symbol system has the necessary and sufficient means for general intelligent action.” The externalization of human knowledge has largely been achieved through text — a physical symbol system. Cave paintings, scrolls, books, the internet. Even with the broadening of bandwidth enabling sound and video, all of these media are still heavily text-based.
In a paper from 1936 titled “On Computable Numbers, with an Application to the Entscheidungsproblem,” the mathematician and computer scientist Alan Turing posited that humans compute by manipulating symbols that are external to the human brain and that computers do the same. The paper serves as the basis for his own Universal Turing Machine, algorithms, and the fields of computer science and AI.
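Turing’s abstraction is small enough to sketch directly. The toy machine below — its rule table and state names are my own illustration, not Turing’s — computes by reading and writing symbols on an external tape, exactly the kind of symbol manipulation the paper describes. This one increments a binary number.

```python
# A minimal sketch of Turing's idea: computation as the manipulation of
# symbols on an external tape. (Rules and states invented for illustration.)
def run(tape, rules, state="start", pos=0):
    tape = dict(enumerate(tape))      # tape as a sparse dict of cells
    while state != "halt":
        symbol = tape.get(pos, "_")   # "_" stands for a blank cell
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape)).strip("_")

# Rules for binary increment: scan right past the number,
# then carry 1s back toward the left.
rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "L", "halt"),
    ("carry", "_"): ("1", "L", "halt"),
}
print(run("1011", rules))  # 1011 (eleven) + 1 -> 1100 (twelve)
```

Nothing here “knows” arithmetic; the machine only matches symbols against a table and moves left or right. That gap between symbol shuffling and understanding is the whole question.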
I am admittedly a lapsed student of AI, having dropped out of the University of Georgia’s Artificial Intelligence masters program midway through my first semester there in the late 1990s. My interest in AI lies in the weird ways that consciousness and creation butt heads in the midst of such advanced technologies. As Al Burian sings on the Milemarker song “Frigid Forms Sell You Warmth,” “We keep waiting for the robots to crush us from the sky. They sneak in through our fingertips and bleed our fingers dry.” If humans have indeed always been part technology, where do the machines end and we begin? As the literary critic N. Katherine Hayles told me years ago,
In the twenty-first century, text and materiality will be seen as inextricably entwined. Materiality and text, words and their physical embodiments, are always already a unity rather than a duality. Appreciating the complexities of that unity is the important task that lies before us.
“Manufacturing Dissent” multimedia on canvas by me, c. 2003.
In his book Expanded Cinema (1970), the media theorist and critic Gene Youngblood conceived of television as the “software of the world,” and nearly a decade later, in their book Media Logic (1979), David Altheide and Robert Snow argued that television not only represents the totality of our culture but also gives us a “false sense of participation.” With even a marginally immersive medium like television, one needn’t upload one’s brain into a machine to feel it extending into the world, even as the medium itself encroaches on the mind.
A medium is anything that extends the senses or the body of humans according to Marshall McLuhan in his classic Understanding Media: The Extensions of Man (1964). More specifically, McLuhan saw the “electronic media” of the time — radio, telephone, television — as extensions of our nervous system. Jussi Parikka writes that we must stop thinking about bodies as closed systems and realize that they are open and constituted by their environment, what Humberto Maturana and Francisco J. Varela call “structural coupling.” Our skin is not a boundary; it is a periphery: permeable, vulnerable, and fallibly open to external flows and forces through our senses. Parikka adds, “[W]e do not so much have media as we are media and of media; media are brains that contract forces of the cosmos, cast a plane over the chaos.” We can no longer do without, if we ever could.
Our extensions have coerced our attentions and intentions. We are now the pathological appendages of our technological assemblages.
Desire is where our media and our bodies meet. It’s where our human wants blur with our technologies. It is the inertia of their meeting and their melding; whether that happens inside or outside our bodies is less relevant than whether or not we want to involve ourselves in the first place. Think about the behaviors that our communication technology affords and the ones we find appropriate. They’re not the same. Access is the medium. Desire is the message.
Crash-testing intelligence [Sharpies and Photoshop by me, 2023].
Steve Jobs once said that the personal computer and the television would never converge because we choose one when we want to engage and the other when we want to turn off. It’s a disparity of desire. A large percentage of people, given the opportunity or not, do not want to post things online, create a social-media profile, or do any of a number of other web-enabled sharing activities. For example, I do not like baseball. I don’t like watching it, much less playing it. If all of a sudden baseballs, gloves, and bats were free, and every home were equipped with a baseball diamond, my desire to play baseball would not increase. Most people do not want to comment on blog posts, video clips, or news stories, much less create their own, regardless of the tools or opportunities made available to them. Cognitive surplus or not, its potential is just that without the collective desire to put it into action. Just because we can doesn’t mean we want to.
The Turing Test, another of Alan Turing’s foundational contributions to the fields of computer science and artificial intelligence, is more accurately a test of the human who’s interacting with the machine. The test, as outlined in Turing’s 1950 article “Computing Machinery and Intelligence,” holds that a machine can be considered to be truly thinking like a human if it can fool a human into believing it is (a.k.a. “The Imitation Game”). So, according to the language and the lore, artificial intelligence doesn’t have to be real; it just has to be convincing. Now that Drew Ortiz, Sora Tanaka, and the other machines can do these symbol-manipulation tasks for us, we’ve outsourced not only our knowledge via text but now the writing of that knowledge, not quite the thoughts themselves but the articulation thereof.
One of the since-faded early concerns of the internet was “information overload.” The worry was that given the onset of abundant connectivity and content, we were being inundated with so much information that we’d never be able to process it all. Now we limit the flow in our feeds and find just what we need. The real danger of filter bubbles and echo chambers is a cultivated myopia: a limited view of a world of sameness and an inability to see beyond the barriers we’ve erected for ourselves. As Jay Ogilvy once said, “If it’s not different, it’s not information.”
My rendition of “The Strength of Weak Ties” by Mark Granovetter, (1973).
In the late 1960s, Mark Granovetter was studying how people found jobs. His 1973 article in the American Journal of Sociology, “The Strength of Weak Ties,” states that each person in a close social network is likely to have the same information as everyone else in that network. It’s the weak ties to other networks that lead to the new stuff. That is, weak ties are a more likely source of novel ideas and information—regarding jobs, mates, and other opportunities—than strong ones.
Granovetter says, “I put the theory of weak ties together from a number of things. I learned about hydrogen bonding in AP Chemistry in high school and that image always stuck with me—these weak hydrogen bonds were holding together huge molecules precisely because they were so weak. That was still in my head when I started thinking about networks.”
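Granovetter’s observation can be illustrated with a toy social graph. The names, clusters, and “job leads” below are invented for illustration, not his data, but the structure is his: within a dense cluster, everyone’s information is redundant; the lone weak tie is the only conduit for anything new.

```python
# Two tight clusters joined by a single weak tie (cal-dee).
# Names and "leads" are invented for illustration.
friends = {
    "ann": {"bob", "cal"},          # dense cluster 1
    "bob": {"ann", "cal"},
    "cal": {"ann", "bob", "dee"},   # cal-dee is the lone weak tie
    "dee": {"cal", "eve", "fay"},   # dense cluster 2
    "eve": {"dee", "fay"},
    "fay": {"dee", "eve"},
}
knows = {
    "ann": {"job at the mill"}, "bob": {"job at the mill"},
    "cal": {"job at the mill"}, "dee": {"job at the lab"},
    "eve": {"job at the lab"},  "fay": {"job at the lab"},
}

def novel_leads(person):
    # Everything a person's contacts could pass along, minus what
    # that person has already heard.
    return {tip for f in friends[person] for tip in knows[f]} - knows[person]

print(novel_leads("ann"))  # strong ties only: nothing new
print(novel_leads("cal"))  # the weak tie to dee delivers the novel lead
```

Ann, wired only into her own cluster, hears nothing she didn’t already know; cal, holding the bridge, is the one who learns about the job at the lab. Strong ties bind; weak ties inform.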
Like most of my research interests, I first noticed these thresholds in music. I was looking at the CDs I had on hand one day, and I noticed that most of my favorite bands didn’t fit into established genres. They tended to straddle the lines between genres. In nature, these interstitial spaces are called edge realms. In her book When Plants Dream (Watkins Media, 2019), Sophia Rokhlin describes them as follows:
The edge describes the place where two distinct ecosystems meet. These are places of tension and unfamiliarity, territories of confrontation, where different ecosystems overlap and merge. The edge is found where a grassland meets a forest, where oceans reach the shore, where wetlands mediate between river basins and fields. Edges are hot spots of biodiversity that invite innovation, intermingling, and new forms of cooperation from various species. Edge realms are thresholds of potential and fecundity.
Mutations inside Area X as seen in Alex Garland’s ‘Annihilation’ (2018).
An edge realm is a wilderness, a mutant space ripe for new forms. In Jeff VanderMeer’s Southern Reach Trilogy, the mysterious Area X is just such a space. Its pollinations cross well-established boundaries, mixing into ever-new breeds and combinations. In his book about VanderMeer’s work, None of This Is Normal (University of Minnesota Press, 2018), Ben Robertson writes,
Area X is something else, what has always already disrupted the processes by which borders are established between that and this, between one space or time and another space or time, between the human and whatever its other happens to be.
My pencil portrait of Brian Eno from ‘Follow for Now, Vol. 2’.
The fertile ground is in between the established crops of others. The new stuff happens at the edges, in between the codified categories. Any old boring story from history can be made more interesting by varying viewpoints. In his 1996 memoir, A Year with Swollen Appendices (faber & faber), Brian Eno proposes the idea of edge culture, which is based on the premise that
If you abandon the idea that culture has a single center, and imagine that there is instead a network of active nodes, which may or may not be included in a particular journey across the field, you also abandon the idea that those nodes have absolute value. Their value changes according to which story they’re included in, and how prominently.
Each of us tells our own story, including the cultural artifacts relevant to the narrative we’ve chosen. The long tail is an ironic attempt to depict a big picture that no longer exists. With its emphasis on the individual narrative, edge culture more accurately illustrates the current, fragmented state of mediated culture, subcultures, and the way that edge realms and social networks define them.
My Sharpie sketch of a Boundary Object in use among 3 communities of practice.
The members and fans of subcultures—groups united by similar goals, practices, and vocabularies—represent what Etienne Wenger calls communities of practice. To translate differences and aid communication between these communities, they use what Susan Leigh Star and James Griesemer (1989) called boundary objects. A boundary object can be a word, concept, metaphor, allusion, artifact, map, or other node around which communities organize their overlaps and interconnections. These connective terms emphasize groups’ similarities rather than their differences. Boundary objects between different communities of practice open borders once inaccessible, circulating ideas into new territories.
Allusions, references, quotations, metaphors, and other figurative expressions provide the points at which multiple texts, genres, and groups connect and collaborate. They are where textual communities compare notes. “What I see instead of there being one line, many lines,” Eno explains in a lecture from 1992, “lots of ways of looking at this field of objects that we call culture. Lines that we may individually choose to change every day.” Hunting and gathering, picking and choosing, we can each make our own individual mongrel culture.
Mark Granovetter conceived the edge realms of these cultural networks way before we were all connected online, but his insight is all the more relevant today. With our personal media, ubiquitous screens, and invisible, wireless networks, we live in a world of weak ties. You just have to reach out to find the new stuff.
I had the pleasure of talking with Alex Kuchma of the New Books Network podcast about my recent edited collection, Boogie Down Predictions, as well as my books Dead Precedents and The Medium Picture. A student of hip-hop culture like me, Alex is steeped in the stuff. He came to the discussion with sharp questions and insight. It was a pleasure.
Check it out here, on Apple, or wherever you listen to podcasts. Many thanks to Alex and the New Books Network for the interest and the opportunity.
The biologist Robin Wall Kimmerer writes that when botanists go out in fields and forests looking for flora, they call it a foray, and when writers do the same, it should be called a metaphoray.[1] Marshall and Eric McLuhan open their book Laws of Media: The New Science with the claim that each of our technological artifacts is “a kind of word, a metaphor that translates experience from one form to another.”[2] That is, each new technological advance transforms us by changing our relationship to our environment, just as a metaphor does with our knowledge.
“Each person is his [sic] own central metaphor,” wrote Mary Catherine Bateson.[3] Bateson saw the perceptual processes of the organism as a metaphor for the complexities of the world outside it. With the spread and adoption of personal media and the internet, the network metaphor has crept further and further into our thinking.[4] The center thins out to the edges as the network becomes central.[5] Extending it inward, Michael Schandorf writes that “every ‘node’ is a network all its own, each with its own very fuzzy boundaries and interpenetrations.”[6] Expanding it outward, Barry Brummett writes that texts are “nodal: what one experiences here and now is a text, but it may well be a part of a larger text extending into time and space. Texts tend to grow nodes off themselves that develop into larger, more complex but related texts.”[7] And Steven Shaviro adds, “The network is shaped like a fractal. That is to say, it is self-similar across all scales, no matter how far down you go. Any portion of the network has the same structure as the network as a whole. Neurons connect with each other across synapses in much the same way that Web sites are linked on the World Wide Web.”[8] From texts to networks, our minds are permeable.[9] I belabor the point here because we are complicit in the use of these metaphors.
The connections of a network are what give it its power. In turn, the network gives each node its power as well. For example, a telephone is only as valuable as its connection to other telephones. Where value normally derives from scarcity, here it comes from abundance. If you own the only telephone or your phone loses service, it’s worthless. In addition, each new phone connected to the network adds value to every other phone.[10] At a certain point, fatigue sets in. Connectivity is great until you’re connected to people you’d rather avoid. Each new communication channel is eventually overrun by marketers and scammers, leveraging the links to sell or shill, forcing us to filter, screen, buffer, or otherwise close ourselves off from the network.[11] There is a threshold, a break boundary, beyond which connectivity becomes a bad thing and the network starts to lose its value.
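The fax effect is often formalized as Metcalfe’s law: a network of n compatible devices has n(n-1)/2 possible connections, so its potential value grows roughly with the square of its size, and each new device adds value to every device already there. A quick sketch of the arithmetic, assuming that simple pairwise-connection model:

```python
def metcalfe_value(n):
    """Potential pairwise connections in a network of n nodes:
    n * (n - 1) / 2, the usual statement of Metcalfe's law."""
    return n * (n - 1) // 2

# One phone alone is worthless.
print(metcalfe_value(1))   # 0

# Each new phone adds a connection to every existing phone:
# the 11th phone adds 10 new connections all by itself.
print(metcalfe_value(10))  # 45
print(metcalfe_value(11))  # 55
```

The break boundary is what the model leaves out: past a certain size, some share of those connections are marketers and scammers, and the cost of filtering them eats into the value the connections create.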
Fig 6.1: A tetrad of the network. It enhances connectivity, obsolesces isolation, retrieves word of mouth, and reverses into fatigue.
In their Laws of Media, Marshall and Eric McLuhan outlined the ramifications of these media through their tetrad of media effects, which states that every new medium enhances something, makes something obsolete, retrieves a previous something, and reverses into something else once pushed past a certain threshold.[12] In Fig. 6.1, I’ve applied this metaphorical framework to the network.
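Because the tetrad always asks the same four questions of a medium, it can be expressed as a small four-slot record. The sketch below instantiates the tetrad of the network from Fig. 6.1; the class and field names are my own shorthand, not the McLuhans’.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Tetrad:
    """The McLuhans' four laws of media, applied to one medium."""
    medium: str
    enhances: str       # what the medium amplifies or intensifies
    obsolesces: str     # what it pushes out of prominence
    retrieves: str      # what older form it brings back
    reverses_into: str  # what it flips into past its break boundary

# Fig. 6.1: the tetrad of the network.
network = Tetrad(
    medium="the network",
    enhances="connectivity",
    obsolesces="isolation",
    retrieves="word of mouth",
    reverses_into="fatigue",
)

print(network.reverses_into)  # fatigue
```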
When we buy into these infrastructures—networks or otherwise—we’re buying into their metaphors. Moreover, we’re buying into the idea that metaphors are an effective way to represent the world.[13] “A metaphor is always a framework for thinking, using knowledge of this to think about that,” Bateson once said.[14] The word metaphor means “carrying over,” and that’s just what their meanings do. Nodes and networks: eventually, we forget all of these are metaphors. “That is the real danger,” Robert Swigart writes, “unless we pause from time to time to consider how these metaphors work to create boundaries… they will control us without our knowledge.”[15] As long as we’re paying attention though, we can always defy them.
Notes:
[1] Robin Wall Kimmerer, Braiding Sweetgrass, Minneapolis, MN: Milkweed Editions, 2013, 46; With thanks to Michael Schandorf.
[2] Marshall McLuhan & Eric McLuhan, Laws of Media: The New Science, Toronto: University of Toronto Press, 1988, 3; As Eric McLuhan notes in The Lost Tetrads of Marshall McLuhan, “Metaphor and the tetrad on metaphor are the very heart of Laws of Media.”; Marshall McLuhan & Eric McLuhan, The Lost Tetrads of Marshall McLuhan, New York: O/R Books, 2017, 200n; So much of Marshall McLuhan’s work was done with metaphors. As he wrote, interpolating Robert Browning, “A man’s reach must exceed his grasp or what’s a metaphor.”; Marshall McLuhan, Understanding Media: The Extensions of Man, New York: McGraw-Hill, 1964, 64; See also, Yoni Van Den Eede, “Exceeding Our Grasp: McLuhan’s All-Metaphorical Outlook,” in Jaqueline McLeod Rogers, Tracy Whalen & Catherine G. Taylor (eds.), Finding McLuhan: The Mind, The Man, The Message, Regina, Canada: University of Regina Press, 2015, 43-61; Robert K. Logan, McLuhan Misunderstood, Toronto: The Key Publishing House, 2013, 39-40.
[3] Mary Catherine Bateson, Our Own Metaphor, New York: Alfred A. Knopf, 1972, 284; See also, Gregory Bateson, “Our Own Metaphor: Nine Years After,” in A Sacred Unity: Further Steps to an Ecology of Mind, New York: HarperCollins, 1991, p. 285.
[4] This is a trend that John Naisbitt spotted in newspapers in the late 1970s; See Chapter 8, “From Hierarchies to Networks,” in John Naisbitt, Megatrends: Ten New Directions Transforming Our Lives, New York: Warner Books, 1982, 189-205.
[5] Alexander R. Galloway, Eugene Thacker, and McKenzie Wark, Excommunication: Three Inquiries in Media and Mediation. Chicago: University of Chicago Press, 2014, 2.
[6] Michael Schandorf, Communication as Gesture, Bingley, UK: Emerald Publishing, 2019, 108; Latour defines the black box similarly: “Each of the parts inside the black box is itself a black box full of parts.”; Bruno Latour, Pandora’s Hope: Essays on the Reality of Science Studies. Cambridge, MA: Harvard University Press, 1999, 185.
[7] Barry Brummett, A Rhetoric of Style, Carbondale, IL: Southern Illinois University Press, 2008, 118.
[8] Shaviro goes on to flip McLuhan’s claim that electronic networks are an extension of the central nervous system to write, “every individual brain is a miniaturized replica of the global communications network.”; Shaviro, 2003, 12.
[9] Nicholas Carr writes, “Those who celebrate the ‘outsourcing’ of memory to the Web have been misled by a metaphor. They overlook the fundamentally organic nature of biological memory. What gives real memory its richness and its character, not to mention its mystery and fragility, is its contingency. It exists in time, changing as the body changes. Indeed the very act of recalling a memory appears to restart the entire process of consolidation, including the generation of proteins to form new synaptic terminals.”; Nicholas Carr, The Shallows: What the Internet Is Doing to Our Brains, New York: W.W. Norton & Co., 2010, 191.
[10] Kevin Kelly calls this “the fax effect.”; Kevin Kelly, New Rules for the New Economy, New York: Viking, 1998, 39-49.
[11] Using metaphors from epidemiology, Malcolm Gladwell calls this fatigue “immunity.”; Malcolm Gladwell, The Tipping Point, New York: Little, Brown, 2000, 271-275.
[12] Marshall McLuhan & Eric McLuhan, Laws of Media: The New Science, Toronto: University of Toronto Press, 1988.
[13] Neal Stephenson, In the Beginning was the Command Line, New York: William Morrow, 1999.
[14] Mary Catherine Bateson, How to Be a Systems Thinker: A Conversation with Mary Catherine Bateson, Edge, April 17, 2018: https://www.edge.org/conversation/mary_catherine_bateson-how-to-be-a-systems-thinker
[15] Robert Swigart, “A Writer’s Desktop,” In Brenda Laurel (ed.), The Art of Human-Computer Interface Design, Reading, MA: Addison-Wesley Professional, 1990, 140-141.
I’m on Talk Your Talk with my man Alaska this week. I’m the first guest on this spin-off from his usual show, Call Out Culture with Curly Castro and Zilla Rocca, on which I was also the first guest. I did the artwork for their Michael Myers/Nas-themed “Killmatic” episode, too.
In this new one, we talk about my books, new, old, and not-out-yet, as well as a few high-minded social-science theories… and the raps, of course.