Imagine 40 or so years from now, if political and economic policies maintain their current trajectory. Mark Zuckerberg has put his Caesar-worship where his mouth is and Facebook has achieved something akin to statehood, lording over a global network of citizens from the stronghold of Menlo Park. Assuming the interests of future-Facebook remain what they are now, the company-state has its own currency, a standing military and political sway on par with the world’s hyperpowers. It also maintains its monopoly on social functions, having absorbed most other major platforms. In this future, the platform by necessity becomes the infrastructure of modern life: it’s the most reliable source of commerce, communication, ideas, security, entertainment, and social credit. No one else has the reach. The social network is the fabric of society.
I will be old by this point, and likely you will be too. The digital migrants will have been outclassed by generations who grew up fully immersed in social media as a lifestyle, and were not along for the ride during the experimental transition from Web 1.0 to Web 2.0. In this future, we elderly are savvy enough as social media users, having retained our edge from the turn of the millennium, but we have held onto an antiquated mistrust of the state. We miss the world as it was, when the platform wasn’t something we vitally depended on. It was just an entertainment system we willingly sold our information to in exchange for the lols and the occasional mass radicalization movement.
I’m imagining, then, the launch of a sequestered and simplified version of the site called Facebook Classic, created as a panacea for the older generations to keep them as willing and pacified users. This is essentially a group where we all pretend to be boomers but for real, or the Jitterbug phone for aged millennials – a place for our memes and cringe personal posts without the transparency of the company-state as a meddler in all our affairs. We can pretend the world functions as it once did; they can continue to pocket our data. The reach of the empire seems less insidious, and we willingly log on.
In that vein, I fully believe that Facebook will launch a children’s version as well. Given how YouTube has already sequestered its children’s programming, I can picture Facebook going the same route with a more tailored content stream to set itself apart from procedurally generated nursery rhyme hell – Facebook Jumpstart. Facebook Sprout. Facebook One. The logo in a rounded sans serif font. The fabric of society comes rebranded as an education aid, a social tool. “Give your kid the head start they deserve.” Get them savvy young, so they’ll be good users in the future.
The core concept here is that no other website has managed to encapsulate such a large population within its user base, and that population is going to be broken into age-based subgroups with wildly varying usage patterns. I would venture that Facebook is currently the longest-running major social network with an active audience – all of its old industry peers, the MySpaces and the Friendsters and the Yahoos, have fallen by the wayside and are either ghost towns or nonexistent. To keep its relevance in a world where its users are separated by their unique generational experiences, Facebook will have to diversify its content to keep all of its audiences engaged. Its current user base ranges from Gen Z to boomers, but extend it outward and you get Facebook from birth until death. Facebook for life. Facebook Terminus.
I recently had the chance to wander the Barbican Estate in London for a few hours. Constructed from the mid-60s to the mid-70s, the sprawling brutalist development is a dream, and a maze. Around every corner is a magnificent retrofuturist vista, or a charmingly overgrown plant installation, or a subterranean breezeway leading nowhere, or an entire disused exhibition hall with signage and fake plants frozen in the 1980s. You don’t get to know in advance what you’ll find there. You just have to stumble into it.
While there doesn’t seem to be any concrete (heh) evidence that J.G. Ballard was inspired by this imposing brutalist mecca while penning High Rise, it would make sense if he was. As he was writing he would have witnessed three of the tallest concrete towers in London shoot up on the horizon, and in the way that brutalism often looks shocking before it looks beautiful, I imagine that the towers would have seemed domineering and sinister, like invaders on the skyline. Filmmaker Ben Wheatley certainly alluded to the Barbican Estates heavily in his film adaptation of High Rise, having borrowed the buildings’ scooped balconies and hand-textured concrete for his tower blocks. In his version he’s slanted the top section of flats along an ominous cantilever, making the buildings look as if they’re peering down at the viewer, stooping to examine an ant on the sidewalk.
High Rise is perhaps the most vicious critique of utopian architecture in fiction. In broad strokes, hundreds of professionals move into a tower community that is intended to act as an enclosed city. The building is designed to provide uniform access to food, recreational and educational facilities within the tower itself; in this vision, utopia means never needing to rely on the outside world for anything other than your income. As the residents settle in, it becomes clear that the wealthier community members have access to social and economic privileges within the estate that others do not. The classes within the tower become more and more stratified, eventually leading to an all-out war, but notably it is not a war for resources. There is no concept of organized action in High Rise beyond roving gangs of opportunists. The violence doesn’t have an objective. The residents are driven mad, attacking each other indiscriminately over slights real or imagined. No one entertains the idea of leaving the tower complex, or banding together to overthrow those in power. Instead, the residents seem to desire the conquest of the tower for themselves, as individuals, in a contest of primal supremacy.
The impression I get from the plight of the residents in High Rise is this: if you expect hundreds of different people to live in strict utopian equilibrium, but within a capitalist framework, they will stratify themselves, hoarding resources where they can and creating exclusivity where there wasn’t designed to be any. You can’t expect utopian housing to negate the dystopia it exists in. This appears to be the problem with building utopian housing schemes under capitalism in general. Note that interest in Le Corbusier’s “machines for living in” or Soviet housing blocks is generally limited to Atlas Obscura-fed curiosity these days – we don’t think of them as useful frameworks for public life, only semi-inhabited ruin porn. And when we do become interested in them, we are often most interested in how the residents individualize their spaces. Examples include the endless parade of Le Corbusier loft remodels on architecture websites, and the endless photos of cluttered plattenbau balconies that get traction in dystopian and cyberpunk aesthetic circles.
The Barbican Estates mitigate this phenomenon by allowing people to live according to both their means and their desires. The housing options at the estates are plentiful. There are the high-rise towers, there are lower-slung apartment blocks, there is a row of secluded townhouses with private entries, there are terrace homes aimed at families with a playground at the center, there are waterfront lofts for the single well-to-do – people are allowed to live vastly different lifestyles while maintaining access to the same cultural resources within the estate grounds. And did I mention that there are many cultural resources: there’s a library, a movie theater, a concert hall, a cocktail bar, a massive greenhouse, multiple schools, a launderette, a cafeteria! The fact that no one ever expected the residents to be equals in every way is the saving grace of the estates. In fact, if the Barbican were to function as an enclosed city (though it would need massive expansion of its food services to do so), I imagine it would be the success where High Rise is the failure. Until a truly utopian society can be created, even the most socialist of housing schemes must have a hint of capitalist individualism in them to succeed.
Unfortunately, the area surrounding the Barbican Estates these days is pushing perhaps the ugliest version of the capitalist aesthetic, which seems intentionally designed to dominate the humble concrete of the estate. The neighborhood is a WeWork Disneyland – green glass everything, worker canteens with novelty neon installations, and immense street closures to break ground on more of the same. It feels like less of a neighborhood and more of a hive, all industry with no grace. While I suspect that the developers of these new buildings would rather the estates fade into obscurity so their workers don’t have to remember that sedate lifestyles under capitalism are possible, the Barbican Estates still seem like an oasis in this swarm of worker bees, and the first glimpse of that distinctive hand-textured concrete through the glass felt something like relief.
(my thanks to Real Life, who I shamelessly stole this format from)
I’m currently staying in an Airbnb in Soho, London. It’s one of those Airbnbs that is managed from afar by someone with quite a bit of money – the decor includes an unused Squier guitar, a pair of stork statuettes in the bathroom draped in strands of what appear to be real pearls, and many empty Louis Vuitton shoeboxes lined up on a bookshelf like trophies. It has also clearly never been lived in. The fixtures are pristine, the plants fake, the art uninspired and matchy-matchy like it was all procured from the same store at once. You can tick off the cliches on your fingers – vinyl wall decals of origami cranes and crystals, prints featuring macro close-ups of leaves, a coffee menu in the kitchen with a curlicue font from a cafe that never existed. The typography triptych on the wall in my room spells out LAK, which I don’t think is anyone’s initials and appears to be the result of a random bulk order. There are no extra rolls of toilet paper to be found anywhere in the building. This is peak AirSpace, the well-documented phenomenon of identical spaces all over the world catering to the affluent, aesthetics-oriented traveler in the same way a Starbucks or a McDonald’s does – the same wherever you go.
Last night, the friend I am traveling with met me in the hall to hand off the house keys, and a strange mania overtook us. Spotting a row of decorative tins on the windowsill, each of us quickly grabbed one, then another, to pop off their tops and take a look inside. We found a gum wrapper and a spare screw. These discoveries were unsatisfying in that they were things, but they weren’t significant things. They weren’t real. They weren’t things that anyone would claim as a belonging.
There is a certain kind of rummaging that happens in places like this, a checking of the pulse for signs of life. Small containers are opened, surveyed for odds and ends. Fridges are raided. Drawers are pulled open, slammed shut, one after another. We want mess. Detritus. Personal effects. Did someone ever live here? Was this space designed for the comfort of real people, or was the intention always a facsimile of human life, the veneer of familiarity with nothing inside? Were these books (always design or self-help) put here to convey personality, or did someone actually read them? Did someone select these weird little twine balls because they enjoyed them, or because they saw them in other Airbnbs attractively taking up tabletop space without being functional or interesting in any way? Does anything in this space exist because someone liked it, or because the space itself was just based on the meme of other spaces like it?
The point of AirSpace is that the decor never says anything about its curator. The house is one step removed from home, presided over by investors and housekeepers. It’s house-as-waystation, comfortable enough but not cozy, personable but lacking personality. When you enter AirSpace you’re entering the uncanny valley of interior design. Not the kind of place where one could settle in, amass belongings. You aren’t supposed to get too attached to AirSpace. You’re only supposed to move on.
What does it say about us as creators and consumers, that we saw that a place was monied, white and sterile, and decided to perpetuate it ad nauseam with algorithmic precision? Why are we proposing entire districts based on this model when existing districts have already become perpetual, barely-navigable AirSpaces? Perhaps maintaining a sense of place and purpose betrays our decorum as privileged travelers, because it forces us to feel like guests in someone else’s space. It’s the notion that another resident has been displaced to accommodate us – something that is often the case with homeshare schemes. Maybe this discomfort should be the price we pay as travelers – forced to live amongst the belongings of someone not present, to consider the human cost of our leisure.
At a recent rationalist gathering, someone I don’t know very well looked at my outfit and told me, in utter seriousness, “if I dressed like you, I would be undeniably evil.” He changed the topic a moment later, but I proceeded to think about this comment for the next few weeks. It would be hypocritical to be angry about it, because he was correct: the way I look deliberately connotes at least otherness, if not deviance depending on the audience. It is, however, rare for someone to call it outright.
I have to thank the schlocky murder-erotica TV drama Hannibal for bringing me the phrase “ethics become aesthetics” (a condensed version of a theory put forth in Susan Sontag’s On Style). Inextricably, the way one looks is tied to the way one constructs their worldview. How much can you judge about how someone moves through the world, without ever talking to them? Who are they performing for? What is it they care about? What do they want you to know?
In my case, I took great care to weaponize my appearance. I have what could be characterized as an actively dangerous aesthetic. Lots of black, harsh angles, militaristic connotations. I don’t characterize myself as evil, but I’m happy to look it, in that I actively don’t want to look good. I think it’s important to establish yourself as someone who exists outside of the prevailing moral dogma. This should be something people know about you from the start. In the era of fake news, cancel culture (groan) and a guilty-until-proven-innocent structure of discourse, it is important not to be an easy mark.
This is the guide for why you should want to look evil.
The most common morality systems are faulty.
I used to be close with someone who could be called an ineffective altruist. They were nationally recognized for their charitable work, which was driven by a solemn religiosity. They were also so hung up on the idea of repairing the world, of playing the long game, that they came to regard the people around them as collateral damage. They could absolutely demoralize you and then say, “I’m doing this for the greater good.” This was a person I later cast in a space opera as the planet-destroying villain, driven by delusions of godlike justice. It was very easy to think, if this is what goodness is, I don’t want to be good. I started thinking that morally it would be better to be the kind of person they would despise.
Around this time I was also attending a middling Jesuit university with a squeaky-clean facade. I quickly found myself entirely uncomfortable with Catholics – the wheeling and dealing of sin and confession; the professors brazenly shoehorning their own faith into their coursework; the trite pro-LGBTQ signage around campus at odds with the deacon who banned gay marriage in the chapel, convinced that the local archdiocese would pull their support and disrupt the university’s cash flow; the useless storefront of Campus Psychological Services, which redirected you to the priests stationed in the dorms as counsel. Guilt and secrecy were the backbones of the culture. Many of the students wore matching necklaces from spiritual retreats; the whole thing was culty. So I set myself up as someone unrecruitable, someone who could never be one of ‘those people’. I acquired a leather jacket, got more piercings, all the ’80s movie trappings of someone who doesn’t care what you think, man. It might have made me moderately obnoxious (I once got suckered into a trip to the chapel wearing a t-shirt that said HELL IS SO HOT RIGHT NOW) but it also made me a lightning rod for discourse in my classes because I was the de facto contrarian. People expected my opinions to be novel, which made my work better. It was productive.
The moral here is that it’s effective to cast yourself as someone who isn’t playing by the rules. Quite literally wearing your ethics on your sleeve can do a lot of the legwork in saying I am not like you, I don’t believe what you believe, and can more accurately set the tone for the interactions you want to be having in these morally-governed spaces.
Much of what we think of as morality is just play-acting.
One malady of late capitalism is that none of us want to look like we’re doing too little. Public performance has always been an aspect of morality (think self-flagellating monks, or Victorians in their funerary black), and the demands of looking like a morally upstanding citizen today are painstakingly specific. The information age has given us a cornucopia of choices regarding who we are allowed to be in society, but we cherrypick certain behaviors to put our best foot forward. Just a sampling: wear a smile, even when there is nothing to smile about, because you need to look and sound authentic (happiness is of course taken to be the default authentic state). Sit up straight, even when exhausted; maybe get a standing desk so you can lord your able-bodiedness over your peers. You may talk to your officemates about certain preapproved topics when allowed time to be lax (definitely don’t mention that it feels like the world is ending), but don’t be too lax. Perform self-care, but not too much. Repost the correct current events to your newsfeed so everyone can know you did your part by raising awareness. Always be aware, but stay optimistic, like someone who doesn’t read the news. Do meatless Mondays. Make a show of riding your bike. Act like your small acts save the world, even though a handful of people with more power than you can dream of are actively trying to kill it. These rituals are distracting, exhausting virtue signaling. They barely do anything for the net good of the planet other than telegraphing please like me to your immediate peers. They are not the kind of actions that, when everyone does them, will make a difference; they are bare-minimum nods to environmentalism that don’t change the fact that the people at the top of the heap won’t do anything that matters in time to fix the state of this planet (also your affinity for quinoa is decimating indigenous populations in South America).
I’m not saying we should all give up. I’m saying that many of the behaviors we partake in to increase our social worth are inefficiencies, and it is possible to care deeply about justice without constructing a shell of empty symbols to do it. The mask can slip off with little consequence. Speedrunning the impractical parts of these social contracts may at worst lead to some meaningless name-calling about being uncaring. It is worthwhile to construct yourself as the kind of person who doesn’t care about these things, who has convictions that exist outside of hollow late capitalist ethics. To save yourself some energy, look like someone who wants to bring down the system, not like someone who wants to earn a gold star from it.
Any system that defines good and evil as immutable, all-or-nothing states of being is not a just system.
Hell was the worst idea humans ever had. Hell makes everything a moral imperative – you’re either 100% saved or 100% damned, no matter how complicated the human experience. Even in secular circles, this sort of thinking is carried out in paradigms like cancel culture, in which a community decides to cast members out based on often small infractions which are then extrapolated into rigid judgments about the person’s character as a whole.
I participate in some social justice work. It was never an active decision to do so; it felt like a reflex, like some primordial decision-making structure told me I had to do it. I help edit a list of known offenders within a certain community, people prone to abuses of power or coercive business practices. It is left up to the readers to decide whether they can handle working with the people documented on the list. Some mistake the existence of a list like this as evidence of rampant cancel culture, and assume me to be its figurehead, but my only goal is to make sure instances of abuse are documented in a field that has no formal accountability. The list doesn’t tell anyone who they should shun. But if someone wants to, based on information provided in the list, that is their choice.
In working on a project like this, embracing underhandedness is necessary. The list is effective because it acts as a proverbial anvil hovering over the heads of people who do deplorable things, and embracing the role of a villain made dealing with these tough subjects tolerable. There is a vicious kind of joy in singling out what makes a person dangerous, and then documenting it dispassionately, like a hunter hanging a game trophy. Being a threat can be thrilling. Power can be balanced swiftly and gracefully from afar, if the need arises.
Being a villain, however, requires being an outsider. I have had opportunities to parlay this project into restorative justice, to become a compassionate advocate instead of a slightly rude fly on the wall, and it was always imperative that I decline these. The list works because it doesn’t provide any definitive judgments. The moment I take a stand for the goodness of certain people over others, presuming the innocence of some but not others, the whole endeavor becomes moot. It would cease to be an unbiased instrument of justice and would instead be a popularity contest. It only functions if my personal judgments are entirely out of the picture, if I maintain a moral otherness as someone on the outside of the system looking in.
On looking vs. being:
I’ll admit that I went through about five separate drafts of this essay because I didn’t want to come off as an absolute asshole. Some of the ideas here are what I would typify as “dark side theologies” (and I’m sure there’s a real term for this that I don’t know): ideas that immediately threaten group security by reducing rule adherence. Or, more simply: ideas that would make me an absolute misanthrope depending on context. To round this out and to rescue my ego, I’d like to loop back around to the comment that started this screed off, with my emphasis: “if I dressed like you, I would be undeniably evil.” The sentence structure seems to indicate that while I only look like I should be evil, there is some ineffable factor that keeps me from actually being evil (something that the person who said this doesn’t seem to think he has). This conversation happened in a place where I clearly look like an iconoclast, but where ethically, I don’t seem to be one. I’m happy to interpret this as reinforcement that what you appear to be isn’t what you are.
I am obligated as both a musician and a very pessimistic futurist to write about the Twitter fight that musicians Grimes and Zola Jesus had recently. This snippet of drama is so tailor-made for me that in another world I’d have thought it was spat out by an algorithm designed to generate entertainment based on the contents of my brain.
I’d recommend looking through the tweets compiled in the Stereogum article linked above if you’re not familiar with this particular slab of fresh beef. To provide a brief summary, noted Elon Musk paramour Grimes posited that flesh-and-blood musicians would soon be obsolete in the face of music-making AI, and Zola Jesus retorted that to so glibly speak of one’s own redundancy, one must be sure that they will be sheltered by the technocratic elite when this Kurzweilian singularity eventually occurs.
I agree with Zola Jesus’ assessment here – it is a callous thing for a musician, especially one who is also intimately familiar with grueling tours and the pervasive poverty among creatives, to say that the end not only of the music industry but of relevant human creativity itself is simply an interesting diversion along the way to some techno-utopia where humankind will live blissfully amongst the technologies that will also put it out to pasture. However, callousness isn’t what I’m concerned with here.
On a personal level, I’ve been worried about Grimes for a while.
A friend of mine, another cultural critic, has a theory that Grimes has been conceptualizing herself as the protagonist of an imagined ’90s anime. She’s the heroine who crawls into the maw of danger, in this case the laps of the elite, with the intention of staging a revolution against the forces of evil from the inside out. She’s Utena, and Usagi, and every other chipper young woman with colorful hair who was called upon by prophecy to lead humanity into a world of light. However, somewhere along the way she lost the thread. The belly of the beast became a safe enclave. Protected by power and influence, removed from the drudgery of her origins, she decided to save herself instead of everyone else, and now exists as a novelty for her compatriot-captors, a whimsical thing who reads Sun Tzu (likely at their behest – I know the type of book that gets recommended at parties hosted by people in that community) and makes a music video about it only semi-ironically. She knows she has followed a point-for-point villain origin story, and builds this into her mythology. She brands herself as a product of big tech, with Musk at the center. She assumes his logos, she calls him her creator. She is soaking up their ideologies and parroting their rhetoric. When she says “I’m not dying on a hill, just having a good time” followed by “Seems weird to withhold ideas, and even weirder that suggesting potential futures can cause so much rage” she is gaslighting in the way they do, implying that everyone who hasn’t drunk their particular flavor of Kool-Aid is small-minded, that they haven’t yet left Plato’s cave. She is using the same patronizing language that a troll on r/askphilosophy would use to explain why he is smarter and more enlightened than you, a tactic which is designed to provoke rage, thus rendering you “irrational” and deserving of scorn if you take the bait. She might as well have just written “triggered”.
And yet, I feel for her. When I was introduced to rationalism it made me feel powerful and clever too, like I had entered some agora of true knowledge, unencumbered by the virtue signaling of the left, that would reveal the secrets of the future to me as well. I spent a month absorbing their texts like a sponge and then the next one wringing myself out and discarding the irrelevant or harmful ideas. r/sneerclub was helpful for targeting toxic ideologies, and I relied on the works of James Hillman and the guidance of a local occultist to keep my wits about me. Importantly, I made sure to do these things in isolation. I only started hanging out in the actual rationalist community once I was sure I had mastered the language and understood the discourse, and had studied the damage that could be done from eating it up wholesale. Grimes’ assessments lack ideological nuance and discipline. She sounds like someone who is in the thick of it, parroting things she heard at the last meetup because they sound cool and provocative, dog-whistling the accelerationists and the Dark Enlightenment in the process, which, as we all read recently, is a dangerous game to play. But if cozying up to them nets her more power, more influence, more safety? Win-win.
In I Wear The Black Hat, author Chuck Klosterman provided the most functional definition of villainy I’ve seen recently, wherein he states that a villain is the person who, in the face of an atrocity, knows the most but cares the least. Grimes meets the criteria here. She is on the inside, with access to resources most musicians can only dream of, and trade secrets that could damn large swaths of the tech industry. It is rather unrealistic to expect her to share those resources – after all, this isn’t actually an anime, and we don’t actually do socialism like that. However, her blatant indulgences, her “scientific” talking points that bulldoze not just her former contemporaries but all human ingenuity itself, her wet dream vision of the apocalypse – that’s textbook. That’s character acting.
I recently finished Freeze-Frame Revolution by Peter Watts, a novella about a generation ship that has been traveling through uncharted space for eons, with a powerful AI at its helm. Its cargo is a crew of humans the size of a small city, a mere half-dozen of whom are spun out of suspended animation every few millennia to aid the AI with wormhole navigation. The crew members wonder why the AI would ever need them, and it is tacitly decided that human creativity, or even just impulsivity, cannot be replicated even by the most advanced artificial entities. A human’s knack for novel decision-making might be the only thing that ensures the ship’s survival, and the survival of the last remnants of the human race slumbering within its hull. I much prefer this sort of speculative future to one that disregards human agency, whimsy and unpredictability, writing off what makes us unique as irrelevant. We are messy animals, and geniuses, and I still have some hope for a future where we can stare down the barrel of AI supremacy and say “betcha can’t do this.”
When I was in high school I was in a band with a scene queen. She had feathered hair and a bedroom full of Hello Kitty merchandise. She would listen to Mindless Self Indulgence in the car. Every week when I would show up to practice, her earlobes would be stretched yet larger. I was not cool enough to be active on MySpace, but I had no doubt that she was hugely popular there. Our guitarist was in love with her, and I was in love with him. I wanted to be her more than I wanted to be myself. I braided a blue extension into my hair and amassed a collection of Day-Glo American Apparel t-shirts in the hopes that maybe some of her magic would transfer to me if I classed myself as part of the same cargo cult.
This year, when I discovered what egirls are, I felt the exact same feeling I had about the drummer. An incandescent envy that other people were allowed to make a career out of being decorative and childlike, while I was clearly cut from a different material that had to grow up quickly and excel in school and work. But what concern does a twenty[REDACTED] year-old have with the teenagers who populate TikTok? Only that those teenagers look exactly like the teenagers I was jealous of when I was a teenager myself.
Everything old is seemingly new again. It only took ten years for the exact visual lexicon of MySpace-era decora-punk to come back into fashion. The sideswept bangs and Sharpie eyeliner are very much still in effect, as are the low-res selfies cluttered with glittering gifs. The difference is that this time, they’re entirely divorced from the images we painstakingly crafted on our parents’ Gateway desktops. Instagram lets you imitate the exact aesthetic of 2006 in an easy menu format, complete with glitchy, artifacted gifs to select from. Similarly, the glossy, futuristic aesthetics of the Y2K era have resurfaced and peaked a mere decade and a half after their genesis. Pop stars dress like shiny aliens again. We’re even getting an utterly improbable Matrix sequel (good luck with the whole redpill thing these days).
The thing about aesthetics that originated online is that they never seem to truly die. They are omnipresent to us. If everything functions as it should, the internet acts as a cultural archive where nothing is forgotten as long as the servers stay running (of course, we know what happens when they don’t – and even then, the news cycle is so quick and enough of the archive is squirreled away in other corners of the internet that it doesn’t register as much of a loss). In this space, the artistic and cultural movements that found homes online endlessly sample each other in referential feedback loops, creating shorter and shorter recursive cycles of trends that capitalize on the nostalgias of recent history. Unless we someday enter a “great ravine” (à la Liu Cixin) that forces us to reckon with a loss of digital media altogether, I would posit that internet-based design movements will become so recycled and self-referential that we may well hit a wall, or an “end of historicity” – an eternal present of quasi-ironic self-sampling in which the broad descriptor of ‘internet culture’ is the whole of the culture itself, and the same nostalgia can pine for both 30 years ago and one week ago.
To see how we got here, it may be helpful to look at vaporwave, which functions both as a musical genre and as an aesthetic sensibility. The origins of vaporwave are arguably cemented in the mall culture of the early 1990s, particularly the blandly soothing consumerism of piped-in smooth jazz and frozen yogurt shops. However, new generations of vaporwave composers, often in their teens, have very few malls of this style left to reference, as most have been buried under a few layers of remodels or have closed down altogether. These new composers have constructed a nostalgia removed from any physical basis, one which is only rooted in online archives and has taken on a cyberpunk-lite flavor in relation to its current surroundings. At this point in time, the music and graphics being produced by this subculture have very little to do with the actual sights and sounds of the era it purports to reference (note the anachronistic Greco-Roman busts and Arizona Iced Tea cans that dominate the imagery nowadays) and more to do with a nostalgia for aesthetic precursors within the genre itself.
The thing about nostalgia for dead places is that it has to segue into a nostalgia for dead internet ecosystems, being that public spaces are increasingly just transitions between instances of being online. The aforementioned egirl is a mishmash of early Tumblr and MySpace tropes. Vaporwave aesthetics are closer kin to Geocities than they are to the actual, physical consumer spaces of the 1990s. However, our app ecosystem is much more stable than the ones that spawned these aesthetic subgroups. Interfaces are flat, minimal and uncustomizable, or users simply don’t see a need to customize them even if the option is available (I recently made a Tumblr post about how the website’s HTML themes have been largely forgotten by its users, and it racked up a solid 60,000 notes in agreement – on a similar note, when was the last time you saw one of your friends update their Facebook cover image?). As a whole, our current ecosystem is a series of clean, white spaces where content lives, but the space itself does not serve as content. To denote coolness or knowingness, users reference the aesthetics of past networks, as if to say “I was there.” In ten years, I imagine it will be difficult to find someone nostalgic for the aesthetics of 20-aughts Twitter or Facebook, because notably there aren’t any. The nostalgia will be for earlier design tropes, recycled in meme form to become present ones.
It is worth noting that I don’t see this culture-blending as a bad thing, but I do see it as the future of design. In my imagining, the only thing that could pull the plug on these recursions is a force strong enough to yank large swaths of humanity offline altogether. It could be electrical grids collapsing due to solar storms, pervasive wildfires, a failure of undersea fiber optics, or a worldwide shutdown due to political unrest or nuclear devastation. It could even be backlash against Singularity accelerationists should they somehow gain control of the zeitgeist. Whatever the case may be, in the 21st century, I don’t think that offline gets to be cool again. I think for it to be relevant, it has to be inevitable. In the end, I keep coming back to just two options for the aesthetics of the future: utopia, everything existing all at once, in perpetuity – or dystopia, mass deletion, a forced shutdown.
I spend a lot of time looking at apartments for rent in my area. The architecture interests me, and looking for listings that are in my price point is perversely satisfying. I have no intention of moving at the moment, but in a city where dumpy older buildings like mine are often sold out from under their tenants, it’s good to have my eye on a few exit strategies anyway.
However, there is an issue. Specifically, in the newer studio builds, something seems to be missing.
Where are all the fucking closets?
Look, I know this is a standard tradeoff of studio living, but when you look at the history of architecture in Seattle it’s pretty obvious that something has changed. The studios from the 1910s and 1920s have closets, some so big I know people who opt to use them as bedrooms. Builds from the 1930s-1960s have big, beautiful built-ins. I’m in a 1980s building right now, and while the storage is impractical (the water heater lives in my closet), it does exist. But from then on, the storage starts disappearing.
I was talking about this with someone at a party, and they told me that per most zoning ordinances, any room that is classified as a bedroom must have a closet. I haven’t really been able to corroborate this in Seattle’s zoning code, but it makes sense that since a studio apartment is more of an all-in-one deal, you don’t technically have to zone for a closet at all (I recently saw a studio listing with the one-room floor plan divided into areas labeled ‘rest’, ‘relaxation’ and ‘relief’ as opposed to the less abstract concepts of bedroom, living room and bathroom). A closetless apartment is also going to be a cheaper and more space-efficient build. So, if you’re a greedy developer, you can line up all of your identical one-room apartments like shoeboxes on a shelf, without having to worry about purchasing doors or altering the floor plan so the closet entrance sits flush with the wall. This is disappointing behavior from a developer, but lazy builds are not at all surprising in a rapidly gentrifying city with a housing shortage.
However, there is definitely a more insidious explanation than cost-cutting.
Seattle used to be a city built for families. Most of the land here is zoned for single-family homes. Then the tech boom happened, and the local economy became almost entirely dependent on young contract workers at the local tech companies, specifically Amazon. These workers are wooed into the city with competitive salaries and subsidized moving costs, spend a year or so at their positions, and then move out of Seattle when their contracts are up (this has also created a booming market for secondhand, barely used West Elm furniture – insert joke about trickle-down economics here). This influx of new residents has resulted in a construction boom in the areas surrounding Amazon’s downtown hub.
The working conditions at Amazon are, of course, not great. Morale is low and attrition is high. In fact, the company relies on a high turnover rate to keep their spending in line. So, what is the easiest way of getting a large subsection of the population to continually leave their jobs? Make sure they’re never allowed to get comfortable, at work or at home. Make them feel like they are constantly struggling to eke out a stable existence, and they will go of their own volition.
Let’s create a hypothetical tech worker. Jason, 29, is a white college graduate who has recently moved to Seattle from Iowa for a dev gig. He works in one of the new office towers in South Lake Union, which has an open floor plan that requires hotdesking. Each day, Jason shows up early so he doesn’t have to jockey for space with the other devs. It is loud in the office, and someone is always glancing at his screen or eavesdropping on his phone calls. What looked like a breezy agora in the architectural renderings is more of a cattle pen, where cramped, impersonal conditions and sensory overload are the norm.
After a long day of work in the code mines of South Lake Union, Jason returns to his Capitol Hill studio. He only has a minifridge, and the stove is in a shared common area, so he’ll probably just order in. He would love to cook, but he doesn’t have enough space to store food or cookware (Facebook keeps pushing him ads for Soylent, suggesting that he can optimize eating out of his life altogether). In fact, he doesn’t have enough space to store much of anything. By necessity his wardrobe has become minimal, consisting of t-shirts and packable windbreakers. He had a dresser, but it took up too much space in his kitchen area. He got rid of his car when he moved to the city, because his building does not offer parking, and street parking is near impossible to find. He got rid of a lot of his leisure items too, as things like snowboards or guitars just aren’t priorities when there is nowhere to put them. Jason would like to feel as if he has a life outside of his job, but there is nowhere to put a life outside of his job. His existence is optimized for staring at screens in tight, shared spaces and utilizing the gig economy to rush-deliver his few creature comforts. If he meets a girl before his contract is up, he dreams of splitting the lease on one of those stately brick one-bedrooms. He might even jump the gun on an imperfect relationship just to do it. Otherwise, he’s worried he’ll get stuck living in what is essentially the company dorm until he leaves the city altogether.
Yes, this is slightly dramatized. It’s also entirely likely.
The battle for fair housing in Seattle has been a long and brutal one. Outdated single-family zoning means that the majority of land in the city can’t be used for high-density builds. The areas that can accommodate new apartments are typically in commercial districts with transportation hubs, and many complain that all the new, cheaply produced buildings have destroyed the historical character of their districts. In addition, many of these new buildings are financed by hedge funds and shadow investors (once even caught blackmailing the city council!), which has led not only to the drudgery of identical, unergonomic microapartments that stifle the growth and security of their residents by design, but also to some legal corner cutting. The great irony is that many of these new units sit vacant because the rent is simply too high. This is how investments work in Seattle now – these buildings will be decent liquid assets, and in about 40 years, when they have depreciated just enough to be worthless, they will be sold, razed and redeveloped again. They are built just long enough to store money, not to provide stable housing on which to build the future of the city.
I’ve been thinking about the apartment a friend of mine once rented in a sparkling, much-lauded LEED-certified building. It was maybe 75 square feet at most, clearly not the legally required 150. There was enough room for two stools and a foam pad that would get leaned against the wall during the daytime. The floor plan was a series of tight, zigzagging corners that opened into a 5′ x 5′ room, which indicated that this unit was likely sandwiched into the negative space between two regular-sized, humane units. It was clearly illegal, and also a steal. And in 50 years it will probably be gone.