Don’t Be Like Ajay

There’s a lot of bad career advice out there, but the worst of it comes from people who’ve been successful at private-sector social climbing. Blind to their own privilege, and invested in the perverse mythology of corporate meritocracy, they are the least equipped to perceive the truth– not to mention that they have no incentive to share it, in the off chance that they discover it. At the same time, these people can say anything and get it into print, so desperate are the rest of us, the proles, to hear the inside corporate secrets they purport to have.

There are no secrets. The corporate system is corrupt; it is not a conspiracy. It is exactly what it looks like: the powerful abuse the powerless, the rich get richer, and people who speak the truth about it are punished.

This pestilent article, “What College Grads Could Learn From My Former Intern”, comes from Zillow CEO Spencer Rascoff. Now, I have no personal knowledge of the author, and I know even less about “Ajay”– that may or may not be his real name; it doesn’t matter– so I’m going to stick to the merits of the article itself.

This I will say: venture-funded startup CEOs are the worst when it comes to self-deception and the profligate evangelization of nonsense.

Venture capital, at least in the technology industry, has become a mechanism for the replication of privilege. Well-connected families create the appearance of their progeny having built businesses from scratch when, in fact, they had all sorts of hidden advantages: tighter sales relationships, fawning press coverage, and most importantly, the privilege not to worry about personal financial risk. (If their businesses tanked, they’d fail up into cushy executive jobs, often as venture capitalists.) It’s money laundering, plain and simple, and it’s not even well hidden, since it’s technically not illegal.

The corporate system is a resource extraction culture, not unlike the ones in culturally impoverished, oil-rich societies that never needed to grow or innovate, because they could pump wealth out of the ground. In this case, though, the depleting resource is the good faith of the American middle class– an earnest belief in hard work, an affinity for technology, an acceptance of authority. The purpose of the ruse is to make it look like “this time it’s different” and that today’s elite, unlike the warlords and viscounts of the past, actually earned it.


Ajay, the protagonist of this second-rate Horatio Alger story, was a hard worker, eager to please, by the author’s description (emphasis mine):

Ajay did [difficult, unpleasant work] eagerly and with a smile; he worked incredibly hard and because of that, built a reputation for himself as someone who would pitch in to help with anything you asked and give it his best effort. People liked that.

I almost retched when I came upon “and with a smile”. Gross.

My thoughts, for the rising generation? Yes, work hard when it’s worth it to work hard. In fact, I would not try to give advice to the young about “work-life balance” or tell them that they should backpack around Australia for two years. It’s hard enough to achieve something significant during peacetime; it’s much harder in 2018, when the rich have made it so much harder for anyone to get a chance. One cannot produce significant work in any field and also have the Instagram party life.

This said, there is difficult, unpleasant work worth doing; there are other tasks that are pure waste. If one has to do the job with a goddamn smile to get credit for it, then it’s almost certainly in the latter category.


Bosses might like, on a personal level, those who do unpleasant work with a smile. That doesn’t mean it leads to career success. It’s never good to be disliked by a manager, but bosses don’t get to promote everyone they like. If one is well-liked only because one has made oneself the path of least resistance for unpleasant, career-incoherent work, then one is in a state sustained only by one’s own suffering– a state that can almost never be turned into career advancement.

I’d also like to point out the author’s corporate weasel terminology. He says, “People liked that.” He liked it. There’s nothing sinister or surprising about a boss liking someone who’s preternaturally “easy to manage”. What’s galling is that, like most corporate bosses, he felt entitled to superimpose his opinion onto the entire company. It’s like when managers fire people but want to avoid taking responsibility, so they say “the team decided”.

I would guess that many people disliked Ajay. They saw what he was doing, and they cringed.

Of course, if Ajay succeeded, then their opinions didn’t matter; those people didn’t win. Still, it’s generally not useful to be disliked by one’s colleagues, and no one likes ass-kissers.

Ajay was also a serial networker, even all the way up to me, the CEO.

It’s funny how blind CEOs are to the politics that exist all around them. Since they get everything they want, there’s “no politics” in the organization. I suppose that’s true. The ultimate solution for someone who wishes to abolish politics is despotism– the degenerate but nominally apolitical arrangement. Most of us don’t want that, of course.

At any rate, if Ajay’s colleagues and managers tolerated “a serial networker”, it’s because they never saw him as a threat until he was fully ensconced in the managerial sun. Perhaps they were wrong and got blindsided. Like I said, I don’t know these people.


In general, though, the idea that a 22-year-old can try to rub elbows with a CEO, in a competitive environment like a startup or investment bank, and not get shanked by someone at or above his own level, is laughable. The people with the training to pull this off are those with inherited wealth and social resources, who have the least need for “internal networking” because of the extensive external networks their Daddies gave them.

When Ajay left to finish school and go on to various startups, he continued to build upon his brand and kept in touch—essentially marketing himself through his networks.

Emphases mine. There’s nothing incorrect about “essentially”; I just wanted to highlight an unnecessary adverb that really, totally, very badly, irritatingly weakened the prose.

I want to focus more on “build upon his brand”. (The author could have taken out “upon” and nothing would have been lost, but there’s no actual incorrectness there, so I shan’t dwell on it.) See, what got me to write this response is not that the author’s giving misguided career advice. To be honest, I couldn’t give better advice that Forbes readers (if my estimation of its demographic is correct) would want to hear. I’d offer the truth– the game is rigged and most people will lose no matter what they do– and that’s not a charismatic message. No, I’m writing this response because the notion of “personal brand” is, to me, sickening.

I am not a brand. There are not five hundred of me stacked on a shelf in a grocery store, all in neat order like the rectangular boxes they put toothpaste tubes in. You, dear reader, are not a brand either. If you don’t cringe when you hear the words “personal brand”, then wake up.

People who use the term “personal brand” without dripping contempt are a special breed of douchebag. What’s amusing is that, while they identify “personal brand” with their desperate claims of uniqueness, these people are pretty much all the same.

It is bad advice. The truth is that people who focus on “building their brand” are assumed by their colleagues not to be doing the work, and they’re the first ones to get shanked when things get difficult. Perhaps Ajay succeeded. Perhaps he’s in a corporate jet, still smiling. Or perhaps he used his bonus on plastic surgery to fix that frozen-face smile after getting kicked out of a funeral for the goddamn last time.

You want to be remembered, whether you’re joining a company of five or 500, because remembered people get opportunities; anonymous ones don’t.

Remembered people get denied opportunities.

I’ve been involved with the antifascist cause since 2011. I’ve been turned down for jobs because of a somewhat public (and, in some cases, adversarially publicized) track record of having the backbone to stand up for what’s right.

When it comes to social media, employment references, and personal uniqueness, we live in a 500-mile world. As in: follow any driver for 500 miles, and you’ll find a reason to write him up. It used to be difficult (literally and in metaphor) and time-consuming to follow one person so far; technology and surveillance have made it easier.

I’ve been a hiring manager. I was always sympathetic to people with controversial online histories, for obvious reasons, but a controversial history is the most common reason for denying a job to someone good enough to make it to the final round. No, these people aren’t alt-right psychopaths or proud, public drug users. Usually, they’re normal people who just happen to hold opinions. It’s assumed that they’ll get bored, or that they’ll react badly when authority makes mistakes. I did, on one occasion, cringe when a startup executive commented on a black woman’s natural hair being “political”.

The people who rise in the corporate system are boring. The best odds, in the corporate game, come from becoming the most bland, inoffensive, socially useless person one can. The problem with this truth– the reason it lacks business-magazine charisma– is that its odds are still poor. There are a lot of perfunctory losers out there, and they don’t all get executive jobs. Most of them get the same shitty treatment and outcomes as everyone else.

Not being boring, though, means that someone only has to follow you for 25 miles to find a reason to screw you over, damage your reputation, or deny you a job.

The optimal strategy is to be boring, to ingratiate oneself to powerful people over time, and to become intertwined enough with an organization’s powerful people that one is perceived to have undocumented leverage, and therefore gets what one wants out of the organization. Does this strategy work for everyone, all the time? No. The odds are depressing– most social climbers fail. But the odds are even worse for all the other strategies.


“How do you effectively brand yourself without being a peacock or a sycophant?” There are two ways: intentionally constructing it and being patient.

There are several ways to brand yourself. The classic approach is to apply pressure with an iron heated in a fire. At high enough temperatures, permanent scars can be achieved in two or three seconds. Electric arcs are sometimes used for this process. An alternative to thermal burns is “cold branding”, often using liquid nitrogen. There seems to be no risk-free option, since branding literally is skin damage.


The same should be true for you: “Work with Sophia—she has a great attitude, big ideas, and is really hard-working.”

This guy must be getting paid per word. The Hemingway editor yells at me; I use adverbs. They’re not always unnecessary, and replacing one with a clunky adverb-free adverbial phrase isn’t my way. Still, not only is the “really” unnecessary, but the author could have said “works hard”.

Whatever you decide to pursue as your personal brand, make sure it has a strong purpose behind it. If you do that, the rest is just packaging.

“Just packaging.” A product’s brand is literally that: packaging. Brand is the use of identical-looking boxes to convince buyers that a minimum standard of quality has been met. A Hershey Bar isn’t going to blow me away, but it’s perfectly adequate. I know that when I buy one, I’m unlikely to find a severed housefly wing in it.

If you want “perfectly adequate” on your tombstone, then consider being like Ajay– a brand. That said, you might want to pull that smile down. Do your job and do it well, of course, but if you smile so much, you’ll make everyone hate you. No one wants to compete for attention with an ass-kisser.

The Truth

As I said, I found the article harmless till I got to the “personal brand” bit.

There’s a lot of bad career advice out there from successful people (most of whom lucked into, or were born into, what they have). There’s also a lot of bad career advice from unsuccessful people who’ve found success selling the “inside secrets” of a corporate game they never actually won– now that is personal brand. The well-meaning self-deception will never go away, nor will the intentionally deceptive sleaze. There are many gamblers who “have a system” for beating roulette wheels and slot machines. Many books have been written on their systems. They do not work. The house wins in the long term. That’s why it’s the house.

The house is smart enough to keep people coming in. So it offers intermittent small wins, and a few big ones that generate publicity. It’s very hard for lottery winners to keep their windfalls private; lotteries discourage it. In these corrupt career lotteries, though, the system doesn’t have to discourage privacy among its winners. They shout in the open air; they never shut up.

Is “be like Ajay” good advice? I don’t know, because I don’t know who Ajay is. Perhaps he was a ruthless political operator, fully aware of the resentments his supplicating smiles generated, and he used them for some sort of eleven-dimensional manifold socio-economic judo so brilliant it’s beyond my comprehension. Perhaps Ajay’s reading this blog post on Trump’s golden toilet, laughing at me. For the average schmuck, though, it’s not good advice. Of course, don’t be incompetent. Don’t be too grumpy. Be the “go to” guy or girl for work you genuinely enjoy and are good at. But, as a favor to yourself, don’t become a dumpster for career-incoherent work. Also, don’t smile all the time; it’s creepy.

I would love to advise authenticity, but that is also not a good approach for someone who needs to squeeze money out of the corporate system– and most people have no other choice.


There’s no path I can sell for the individual. The situation, in truth, is quite dire. In Boomer times, the corporate system seduced people with greed: $500 executive lunches, business-class travel all over the world, and seven-figure bonuses just for showing up. Today, it runs on fear. Fear’s cheap. Most Ajays won’t succeed; I can say that with confidence. I can also say that most anti-Ajays won’t succeed. Most people won’t succeed. The corporate game is rigged and anyone who says otherwise is trying to sell something toxic. I have no elixir of socioeconomic invulnerability; I’ll admit that. There’s a massive market for false hope. I will not sell into it. I am better than that.

For the world– if, sadly, not always the individual– it would be better if we woke up, tore down the corporate system brick-by-brick like the Bastille, and replaced it with a fairer, more sensible, pro-intellectual style of society worth caring about. If enough of us had the courage to live in truth, consequences be damned, the whole corporate edifice would crumble and we’d all be better off for it.

It’s not easy to live in truth. It’s downright hard to change a world whose most powerful people loathe any change at all. A first step, though, might be for us, unhindered by mercy, to mock anyone and everyone who says “personal brand” without vehement contempt for the concept. If we work together, we can make such people shut up. That would be a start.


Why I’m not using a traditional publisher to launch Farisa’s Crossing.

As I write this sentence, it’s June 30, 2018– 300 days before I launch Farisa’s Crossing, on April 26, 2019.

A few months ago, I decided to self publish the book. I realized that I wasn’t even going to try traditional publishing. I have no doubts about my ability to get in. The process is harrowing and random, and even the best writers can expect to be shot down more often than anyone likes to think about, but that wasn’t the problem I realized I had with it. In the end, it came down to time. It’s finite. I’m 35; I’ll be almost 36 in April 2019. Anyone who plans to explore all options before doing anything will end up achieving nothing. I had to knock some things off the calendar. I’m not going to skimp on the writing itself, nor research, nor editing. What can I cut that doesn’t affect the quality of the book? Writing a bunch of silly query letters landed high on that list.

Self publishing isn’t for every author or every book; nor is traditional publishing. Each has its advantages and drawbacks. There are books where I would eagerly use a traditional publisher, in spite of the drawbacks.

I thought it would be worthwhile to go through my reasoning here: why I decided not to use traditional publishing for Farisa’s Crossing.

1. I don’t need it– Farisa is fiction.

A friend of mine writes biographies. Of all the genres, I think biography is the one best suited to traditional publishing. Generalist copy editors aren’t equipped to copy edit biographies, which require extensive fact checking and removal of bias. Traditional publishing, in this genre, is invaluable.

Opinionated nonfiction, I would argue, is also well served by traditional publishing– at least at book length and in print. Author credibility is huge, and it can be manufactured if it isn’t there. Here, a self publisher is a guy with opinions; backed by a traditional publisher that’ll line up national TV spots, he’s a world-renowned expert. (Actual expertise optional.) Topical nonfiction– say, a book about a current election– has a short half-life; it will sell quickly or never. New York publishers have the resources to publicize it quickly; self publishers, in general, do not.


Memoir, if it’s at risk of being controversial, needs a traditional publisher. The author puts her personal reputation on the line. She needs a full-time publicist to fend off attacks.


Finally, we have business books. Those aren’t written to sell copies. It doesn’t hurt if they do, but few books make large sums of money, especially by business executives’ standards. Rather, these books are written to advance their authors’ careers. Middle-aged managers can reinvent themselves as “successful executives” and get better jobs– or, if they’re tired of being employees, lucrative speaking opportunities. Prestige, in that game, is everything. Substance, as anyone who’s read a business book or few knows, is not.

From the above, it should be obvious that I do not think traditional publishing is a dinosaur on the brink of its own extinction. Will its retreat from fiction continue? Yes. Is it dead? No. In fact, it’s exactly where it wants to be. It has decided that new author discovery, at least in fiction, costs too much. In the 1970s, fiction editors read manuscripts (“slush”). In the 1990s, they pushed that job to literary agents. In 2018, unpaid 19-year-old interns do it. A reader is a reader, so I don’t mean to disparage these interns as people; but I would always bet on a larger crowd when it comes to discovery. A hundred strangers versus one Ivy Leaguer? I’m betting on the hundred doing a better job. So long as self publishers can get their work read in the first place, the gatekeepers will be unnecessary.

Nonfiction demands external credibility, because it makes truth claims. I’m more inclined to trust an opinion essay from an expert writing acceptable prose than a stranger who writes beautifully.

As for fiction, the traditional publisher is far more optional. Farisa’s Crossing will be no better and no worse than the 200,000-or-so words I write because it will literally be the 200,000-or-so words I write.

Authors don’t need external credibility to write successful fiction. A good novelist disappears. The reader should get so involved in the story that she forgets that she’s reading one in the first place. The ability to induce this feeling is rare, quite difficult to teach, and does not come from advanced degrees, an author platform, or a reputation built by a Manhattan publicist. It comes from good writing.

2. Thinking about agents led to bad artistic decisions.

Self publishing is hard. Traditional publishing, if the stars align, is easy– seductively easy. Every single one of us humans is prone to the “Prince Charming” mentality, at least a little bit. We’d like the basics to be taken care of.

The traditional publishing fantasy goes like so: you get the first and best agent you query, he snaps together a lead-title deal, your book is reviewed by the New York Times, then the New Yorker offers to publish a chapter (and your publishing house doesn’t object) and it goes viral like that “Cat Person” story, so you sell 2 million copies and you’re set for life. You can literally think (and type) your way to the life you want– if you get the words right. That’s the promise; that’s the dream.

Of course, you can also win the lottery– if you get the numbers right.

One can put limits on the time cost of querying. I’m 35 and I’m starting a series that I expect to take at least 10 years to finish. My health is better than it has been in a long time (ten years ago, I didn’t expect to be here today) but my life hasn’t been a no-damage speed run. If I thought the expense of 6 more months were worth it, I might put querying on the schedule. No harm in that.


We are all humans, though. When we see something that looks easy– a path of least resistance that seems to go where we are trying to get– we’re built to focus on it.

This becomes a problem if you start to think about agents rather than readers. This ruins a book. One of the major reasons for literary fiction’s decline, if not the main one, is that many of these stories are written to score agents. And not all agents are created equal. In any genre, there’ll be no more than a dozen “power agents” who can snap together serious deals with large print runs, demand aggressive marketing from major publishing houses, and sell screenplays. There’s a lot of terrible fiction written to appeal to the tastes of a small number of people.

An experiment has been performed several times in which an award-winning novel is queried to literary agents and shut out entirely. It’s not that agents are stupid or don’t understand good literature. (I think their tastes are as valid as anyone else’s.) To some degree, it’s just the sheer randomness of the process that produces this outcome. Being read at 9:00 am will produce different results from being read at 3:30 pm– or, worst of all, right before lunch. No one can control that.

Furthermore, great novels take risks. (So do many terrible novels.) Agents pick up heuristics that one must heed in order to get published. An exhaustive list of “agent rules” is not the purpose of this essay, but I’ll give a couple examples.

One of those agent rules is not to use exclamation points, ever. (Some agents allow 1 per 50,000 words.) Are they overused by mediocre writers? Yes. Can they be obnoxious? Of course! Used skillfully and in character, they’re quite useful. In dialogue, they differentiate hot anger from cold anger– there’s a difference between “Get out!” and “Get out.” Likewise, an author using deep POV in the voice of a seven-year-old girl might use exclamation points for weather (“It was hot!”) while a septuagenarian probably wouldn’t.

Another agent rule is never to use back story in the first chapter. Now, like all of these agent-level prejudices, this principle is not without merit. First-chapter time jumps are very difficult to get right. They tend either to bore or confuse readers. If back story is relevant in a first chapter, it should be limited to a sentence or two here or there, and it should be told rather than shown. (Showing costs words; words equal time; always but especially in the first chapter, milliseconds matter.) Why do I hate this as a hard rule? The first chapter, in well-told linear narrative, is always back story… to the rest of the book. In truth, there are times when it’s artistically valid to open at 120 miles per hour, and times when it’s not.

You write differently to get an agent than you do to write a good novel. If querying is on your mind, you’ll find yourself writing for the 19-year-old unpaid intern who’s been throat-deep in slush since 9:56 am and who’ll decide in eight seconds whether to read beyond the first paragraph. You’ll put that explosion that belongs on Page 32 on Page 1. You’ll find yourself writing for people trying to mirror their bosses’ opinions rather than readers who want to get lost in a story. You’ll write a hook-laden, confusing opening– flash and no substance– at the expense of the rest of the book.

Writing for agents is easier than writing for readers– the former is paint-by-numbers, while the latter takes genuine artistic commitment– but it pollutes the work. Writing for both is impossible. Sometimes an author will hit both targets– a novel written for readers will land a power agent– but it’s so rare that it’s not worth obsessing over.

I had an agent-friendly opening, for more than one drafting cycle, that I knew was wrong. I found it subtly corrupting other, later chapters. Readers found it intriguing but pretentious and confusing– which it was. They were right. So, eventually, I decided: “Fuck that agent game; I’m going to write for readers.”

3. Farisa is long.

Speaking of agent prejudices….

What is the right word count for a novel?

The question is similar to: what is the correct weight for an airplane? The answer: as light as possible while still doing the job.

In truth, the answer is less satisfactory for stories than for airplanes, because an airplane’s duties are, at least, well defined. The metaphor works this far, though: airplane weights range all over the place, because airplanes serve different purposes.

Novels range from about 25,000 words (which would, today, be classified as a novella) to well over 500,000. What number is right is story-specific; a book can be overweight at 100,000 words or underweight at 200,000. An average traditionally published novel might weigh in at 85,000 words. The sweet spot for contemporary literary fiction seems to be 125,000 – 250,000 words, which is longer than average.

My guess is that Farisa’s final word count– in revision, word counts go up, then down– will land in the 175,000 – 225,000 range.

How much do readers care about word count? They don’t. They care about pacing. They care about price– which can make a big book hard to sell on paper. Editors care, but will make exceptions for good books. Agents? You will not get one over 150,000 words. They’ll sometimes represent a long (or short) book as a favor to an existing client, but not a first-time novelist. Acceptable word counts, as determined by literary agents, tend to fall into a tight range: a genre-specific target, plus or minus 10,000 – 15,000 words. For example, first-time literary novels are expected to be between 80,000 and 100,000 words; epic fantasy should be 90,000 – 120,000.

It’s hard to land an agent with a big book because the book has to be sold up a chain of bosses, several times over. The intern has to sell the book to his boss (the agent). The agent has to sell it to an editor at a publishing house. The editor has to sell it to executives who control marketing budgets. Only established, big-name authors can get through at 200,000 words, even if that’s the right length for the story.

An option, with a big book, is to split it. Both publisher and author stand to make more money this way. Sometimes this is the right artistic decision. For Farisa’s Crossing, it’s not, although an explanation of why would spoil the plot.

4. Farisa is a genre-crosser: literary fantasy.

What on earth is literary fiction? What is genre? Can a book be both? This is a fun topic. I could write thousands of words on that alone, but I’ll spare the reader.

Conventional wisdom, in some literary circles, is that there’s “real literature” and then there’s “genre fiction”. Literary novels transcend; genre novels merely entertain. This is, I shan’t hesitate to say, complete bollocks.

All literature has genre. What is usually called “literary fiction” is, in fact, another genre. I call it metrorealism. Actually, literary (as often defined) and mainstream fiction are two sub-branches of metrorealism that otherwise have little to do with each other. Metrorealism takes place in the real world and focuses on ordinary characters. If kings and queens, heroes and villains, or geniuses and fools are featured, the treatment is usually ironic, in a way that humanizes the subject and equalizes him with the reader. Character-driven metrorealism with high-quality prose tends to be received (and marketed) as literary, while plot-driven metrorealism with adequate prose tends to be presented as mainstream fiction.

There’s a lot to be said for metrorealism. It’s a fine genre– especially the literary subtype. I read a lot of it. I’ve written a few short stories in that genre (that I’ll probably try to get published around April, when I launch Farisa). I have nothing against it. It’s not what Farisa’s Crossing is, is all. The Antipodes is an epic fantasy series– with literary style and aspirations.


The meaningful distinction, to me, has nought to do with genre. A novel is not “genre” or “not genre” because all work has genre. (Technically speaking, “novel” is a genre and “fantasy novel” is a subgenre.) Rather, the distinction is between literary and commercial fiction. So, just as commercial metrorealism (mainstream fiction) exists, so can literary fantasy.

I don’t intend to say that commercial fiction is inferior. This is a distinction of purpose, not value. Most commercial writing is perfectly adequate, and I don’t believe the reading public wants substandard dreck. People buy books for all sorts of reasons, and shoddy writing is not a deal-breaker when it comes to commercial (or critical) success, but I don’t think the first wave of readers for 50 Shades bought the books because they were badly written. (The hate readers came after its commercial success.) Would the book have sold better if it were polished to a literary standard? Perhaps it would have sold 100,000 more copies. Compared to the 125+ million it actually sold, that’s a rounding error.

There doesn’t seem to be much evidence that literary novels sell worse than commercial ones, if one compares like against like. There’s an apex fallacy by which literary writers look at the outcomes for commercial bestsellers, rather than the hangers-on, and conclude that commercial writers are all rolling in money. I’d actually bet that improving the writing, characterization, and relevance of a commercial novel, up to a literary standard, will only improve sales. The problem? It takes 10 times as much work, and I highly doubt that it increases sales by a factor of 10.


Literary writing is intensive in writing time, calendar time, and life experience. The characters form over years in the writer’s mind. Sentences are revised several times before going into print. Every decision is questioned over and over again. The second draft is nearly a complete rewrite, now that the author understands the characters more fully. A seasoned commercial writer is about 50 percent done after writing “The End” on the first draft; the literary writer is lucky if she’s 10 percent done.

Like I said, the difference is not in value or quality so much as in purpose and process. The commercial writer, once the prose is adequate enough that an editor can take the book from there, stops working on that story and begins the next one. The literary author line edits her own work and often has tens of thousands of words of unused back story for each of the main characters.

Commercial authors aren’t necessarily bad writers (some are, but that’s true of literary authors as well) and sometimes they’re the best storytellers. They iterate. They publish more often and get quicker feedback, so they can get more experience with a wider array of story formats. They usually have a stronger sense of the average person’s psychology– and let’s be honest, every one of us is average in almost all ways; the exceptional are usually extraordinary in only a few ways– than the literary writers (who tend, in turn, to have a stronger grasp of deep characterization, language, and atypical psychology).

Farisa’s Crossing is literary fantasy. Agents tend not to like literary fantasy (or literary science fiction). Why is that? Any answer would be speculative (pun intended) insofar as I’m not one. The polite guess is that they must believe they’re hard to market– and they might be right about that. The impolite guess isn’t relevant here.

5. I’m writing a series.

Traditional publishing carries risks. One does not sell “a book”; one sells rights to a book. This is important. Most traditionally-published authors rely on their agents to navigate their contracts. They do not use lawyers (they often cannot afford lawyers) and are discouraged by their agents from doing so. Lawyers kill deals, they say. (It may be true, but that says more about the deals than the attorneys.) If they killed so many deals, then why do publishing houses employ them?

Bad things sometimes happen in publishing. Authors get dumped. Editors change houses or quit entirely. Agents burn out and leave the industry. Someone in a distant corner of the world might say the wrong thing and burn a bridge three degrees separated from the author– zeroing the marketing budget and turning that enviable advance into a festering zombie albatross. An author might leave his publishing house after learning that he’s been under-published for years because the house hired an executive who really, really hates Ohio– and the author is from Ohio. Getting rights back, when leaving (or fired by) a publisher, can be a nightmare.

The value of book rights is book-dependent, of course. If you’re writing a book about the 2018 election, the rights are unlikely to be valuable in 2038 unless the title achieves lasting cultural relevance now. If the publisher fumbles, it’s a lost opportunity, but the loss of rights is irrelevant.

For a series, giving up the wrong rights can be deadly. Many authors cannot publish using their world or characters without permission of the publishing house. Even without that, though, taking a series to a new publisher is difficult. No publisher wants to buy Books 3–7 of a series when a rival house owns the first two books, and won’t give them up.

Books used to go out of print when the publisher stopped printing and selling copies, and rights reverted to the author. If a book was ahead of its time, or would have fared better as a $4 e-book than as a $20 block of paper in the bookstore (the author makes about the same money on each), it could be republished.

No one wants to think about their book selling poorly, or their series being dumped by a publisher, but these things can happen and not always to bad books. Good series can be trashed for all sorts of reasons. A self-publisher can try again. In traditional publishing, retries are rare– and if the book fares poorly, it’s always taken to be the author’s fault.

6. Trade publishing takes too long.

Good things take time, and books are no exception.

I could write a 100,000-word rough draft in an 80-hour week. It wouldn’t be worth reading. I’d need to spend significant time on revision. Lining up editors and cover art shouldn’t be rushed, either, and the people doing this work need time, of course. Traditional publishing requires additional lead time, due to the emphasis placed by bookstores on each title’s performance in its first eight weeks; if it doesn’t sell well in the short term, it might not have a long term.

Much of the delay in trade publishing is legitimate. Some of it is not– there is status-driven waiting, too. A literary agent’s turnaround time can exceed six months. At my age, I can’t treat it as nothing to spend a year waiting for a “power agent” to grace me with… the right to offer him a job. I’d rather spend the time writing.

7. Control.

Title and cover art are artistic and commercial decisions; pricing is mostly commercial. Guesswork and intuition come into play.

Traditional publishing houses have expertise, and the short-term winning bet, I think, is to hand those duties over. The problem is that, since the author signs over so many rights, he loses control completely. I’ve known several authors whose books were ruined by bad titles and cover art.

Of course, if a book flops due to bad marketing or a terrible cover, the author’s in no position to ask for it to be released again with better efforts. The publisher will consider itself generous if it offers him the chance to write another book for them.

Self publishers, at least, can iterate and learn. This, I think, is one of the major reasons why self publishing will become the usual way in for fiction. Trade publishers will continue to work with nonfiction, public domain work, and the top hundred or so bestselling fiction authors. For those novelists– the ones who need to negotiate foreign-language rights and screenplays before the book even comes out– it’ll be a victory lap rather than a career.

By 2030, the vast majority of important novelists– including, to the establishment’s surprise, the best literary authors– will not use traditional publishing. Why? Sheer numbers. Talent seems uncorrelated with hereditary social class. For every would-be writer whose parents get him representation by a power agent as a 21st-birthday present, there are 1,000 writers who don’t.


8. I want to learn about the business.

By American standards, my politics are left-wing, so it might surprise some people that I’m saying this: I’m not ideologically against capitalism. Business is natural and necessary. I don’t view commerce as inherently dirty, and I think that academics’ outmoded, knee-jerk, leftist pearl-clutching about the material world (in fact, often a social-class humble-brag that reinforces power structures) hurts everyone. The left’s dislike for all things business means that the best people shrink from it– and dirty people disproportionately go into (and end up dominating) the game. It doesn’t have to be that way.

The publishing business isn’t a massive money-maker but, for better or worse, it influences culture.

Our culture is in peril. The danger is not immigration (which refreshes it) or gender equality (on the contrary, gender justice is the strongest indicator of cultural health I know) or scientific advances (again, beneficial, at least when used well). Rather, the threat to our culture is atrocious leadership, both from the perceived right (corporate executives) and left (connected coastal tastemakers). Border walls won’t solve this problem; we did it to ourselves, and the enemy is our own elite.

Right now, too many good people sit on the sidelines. Too many people on the left would rather make a performance art out of being offended than get out there and start doing. We can’t let this happen. Good people need to enter tough, competitive worlds like business and politics– and stand up for intellect, morality, and culture.

9. I wanted to learn editing.

Editing is hard. It can be a slog.

Here’s a dirty secret about writing: quite a few people who are good at it, whether we’re talking about bestselling commercial authors or acclaimed literary voices, don’t especially enjoy it. This is something they rarely admit (and I’m not about to out anyone) and I’m not entirely sure why. I guess they have to keep up the “dream job” image, but for many of them, it has become merely a job. They’re good enough to stay relevant and get paid, but the passion’s gone.

I don’t think they should be ashamed of this. Writing’s hard. It’s not for everyone. It’s not for the vast majority of people. The world needs more readers, more than it needs more writers.

There are probably 50 million people in the United States who want to “be a writer” and will publish their novel “someday”. Not a small number of them have 300-page manuscripts. Some will self publish unready work. Others will query agents and find themselves quoted on Twitter with the annotation, #queryfail. Very few of them will actually write a solid book. Divergent creativity (branching) isn’t all that rare. It’s the fun part. Kids have it. Convergent creativity (pruning) requires taste and skill. It’s painful and detail-oriented. In corporate management, there’s a separation between the “creative work” (which is not all that creative) and the detailed “grunt work”, but that mentality carries over badly to the arts. It’s all about the details. Few people have the grit necessary to write a complete, publishable novel– much less a significant literary work.

I’d guess that 40–60 percent of successful writers still enjoy writing– and, again, I’m not denigrating those who don’t. It’s not a sin that they enjoy Manhattan cocktail parties more than 6:00am writing sessions; it means they’re normal. (I’m not normal.) I’d guess that less than 10 percent enjoy editing.

I didn’t think I would at first, but as my skills improved, I found myself enjoying editing as well. It’s a different pleasure from 120-mile-an-hour rough-draft writing, but it’s a lot of fun in its own right. I studied characterization; scene construction; nuances of grammar; line editing; story structure; and rhetorical devices and when (and when not) to use them. There’s something liberating about going deep into detail, without fear. Not many people do that after college (if even then).

When I finished my first draft of Farisa, it weighed in at 134,159 words. (I remember the number because it’s one transposition away from the approximation of π, 3.14159.) The number intimidated me, and over the next month I discovered plot holes, missed opportunities, dangling story threads and far too much telling. The more I learned about craftsmanship, the more I spotted and improved. For every 500-word info dump I could cut (kill, kill, kill those things), I found that a 2,500-word scene was needed to strengthen a connection between events that, in my first writing, I had assumed but never stated or shown. Some of the edges I drew to tighten the story became nodes (scenes, even characters) in their own right. If my sum total, after a bit of line editing to take the word count down, comes in under 200,000, I’ll be happy.

Revising a 130,000-plus word manuscript is a big task. I was apprehensive. “Shit, I’ve got to edit this thing. Maybe twice, even.” (Hahahaha.) I found out, though, that I like it. I’m not a perfectionist– I went through that phase of life, and it’s crippling– but there is a ludic element, a game almost, of seeing how tight I can make a sentence or how good I can make a story.

The inclination to edit well and enjoy it, I think, is rare. Age and life history have a lot to do with it. If I succeed with Farisa (or a later work) I’ll be glad that it happened late. Many writers are ruined by early success; they write a great book at 25, but are useless by 30, because the Manhattan cocktail party scene takes them in and they stop having original ideas. It could have happened to me, and probably would have, had things gone a different way. I’m different, but I’m not morally superior.


At 35, half of my biblical three-score-and-ten, I find that as I get older, I get simpler in most ways. If somehow I beat all odds and sold a million copies of my first book, I wouldn’t hang around the Manhattan book buzz people. I’d move to the mountains and focus entirely on the second book (and the third, and so on).


10. I’m realistic.

Outsiders to traditional publishing think that it comes with six-figure advances, national radio and TV spots, reviews in the New York Times, and full-time publicists pushing each other out of the way to line up one’s speaking calendar.

Those deals are rare, but they also have very little to do with literary merit. It may be true that “good writing gets found”, but what makes or breaks a career in traditional publishing is how well a book performs in its first eight weeks, and that has everything to do with how the book gets treated by its publisher, which in turn is driven almost entirely by agent clout. What favors can (and will) he call in? Will someone’s kid not get in to a preschool if the New York Times declines to review an author’s book? Book buzz is like sausage and laws; some things, it is best not to see them made.

The sausage-making component requires more than “an agent”. Querying still works (given enough time) if one’s goal is just to “get in”. The agents who have the power and connections to drive the sort of treatment that makes traditional publishing worthwhile, however, are extremely rare. It’s not enough to sign such an agent; one must rank among his favored clients. That outcome is inaccessible without pre-existing social class or extraordinary luck.

Most authors of reasonable talent can get into traditional publishing, even in 2018, even without inherited social connections, if they give it enough time. Their outcomes, though, are uninspiring: mediocre deals with no publicity, which they’re pressed to take because their agents will fire them if they back out, and which lead to lackluster launches that harm their careers in the long run. Querying, of course, isn’t free. It no longer costs postage, but time is the most valuable resource we have, and querying takes too much of it compared to what it can actually deliver.


I don’t think it’s worthwhile to be bitter about the changes in traditional publishing. Industries evolve. So long as the self-publishing infrastructure continues to grow, literature will improve with time. The few dozen power agents in Manhattan (even if augmented by the thousands who wish to join them) were always a tiny fraction of the reading population, and their proportion is even smaller if one steps up to a global perspective. As for bitterness, of which there’s a lot in publishing, the problem (as I’ve learned, by being embittered in a different career) is that it leads, paradoxically, to magical thinking. Bitter people want to be not-bitter; they want someone (like a literary agent) to come along and solve their problems. This is why they’re so easy to swindle. Bitter people fall for sweet talk– the narrative wherein someone riding higher stops for someone special, just because– and that’s a dangerous weak spot to have in business. There are cases in which it makes sense to use traditional publishers, and others in which they’re unnecessary. Realism, not bitterness, is what an author needs.


11. Experimentation / flexibility.

No one knows what sells books. It constantly changes. There’s a lot of guesswork and iteration. Traditional publishers get a bad rap for how often they get it wrong, but most self publishers aren’t any better.

Marketing is especially hard for books, because the book’s main advantage over other media is its reputation for (and, because books are less expensive, true advantage in) authenticity. The production values of a film or television show come at a price: executives who control budgets, focus groups, the need to manage an average attention span. People understand this. Popular visual media tend to establish value using social proof: special effects, wide releases, and famous actors. Novels establish value through the quality of writing, characterization, plotting and world-building. The proof-of-value isn’t $30 million but 3 years of a talented writer’s time. The issue is that a reader must spend considerable time with the writing to see these production-like values; they don’t come through in a two-minute trailer. Even for the writer to get a shot, readers must know that the book exists in the first place. Marketing matters.

No one expects authenticity from a summer blockbuster– it may be there, but it’s not mandatory– but we absolutely expect it from literary novels (and, to a lesser extent, high-grade commercial works). Authenticity and marketing/publicity go against each other. If readers knew how much Manhattan favor trading and sausage making went into “book buzz”, they’d trust it even less. For light summer entertainment, the inauthenticity of marketing is not so self-destructive. Getting people to come to the theaters is, in comparison, straightforward. For books? Most publicity efforts go nowhere, because the nature of public relations is its irreducible inauthenticity.

A publicity strategy that drives sales today might fall flat in 2019. What a publishing house thinks, for good reason, is genius, might pull a zero and take a good book down with it.

In traditional publishing, recovery is next to impossible; each book gets one shot. One way to recover would be to reduce the price, give copies away, and publish chapters either for free or in magazines, but traditional publishers rarely do any of this. Once a book is deemed a flop (or worse, a mediocre performer– too expensive for the publisher to give away, even though that might be the best move for the next one) the publisher loses interest in its fate, although the author doesn’t.

A self publisher, when a publicity effort fails, can try another approach. There’s more experimentation available.

12. Not to be an employee.

I’ve said before that more people want to “be a writer” than actually want to write (much less write well) and one of the reasons for this is that people, eventually, want to escape the oppressive stupidity of office life. They think they’ll be their own boss. I’ll admit that this is a contributing motivation for me, as well.

I’m good at many things. I believe writing is one of them. I’m also bad at many things. Because I have an architect’s knack for how things could be or ought to be, my mind under-attunes itself to parochial details of the broken way things really are at any specific point in space and time. As a result, arbitrary authority– like bad legacy software, just another form of sloppy writing– isn’t something I handle skillfully. I’m not good at tolerating bad decisions or managing the childlike needs of people in power. If I could change such traits, perhaps I would. On one hand, I would be a less virtuous person and my life’s total value to the world would decrease. On the other, it has cost me jobs and a lot of money to be less-than-perfect at the less-than-virtuous skill of navigating less-than-excellence.

I was, at one time, in the top 1 percent or so of software engineers. Perhaps I still am, although I’m not as current. These days, I prefer management and data science roles. I had a period in which I hated writing code; I could do it, but it was a struggle, because every keystroke felt like an injection of nonsense into the world. Programming did become fun again, but it took considerable time.

Sometimes it is right and prudent to follow orders (operational subordination) but organizations often demand personal subordination. If you have a backbone, and do anything in that context– anything at all– you will grow to hate it. Writing, programming, speaking… if you do the job in a context of personal subordination, you will ruin it for yourself. You may find excuses not to do it. You might complete the work, but poorly. Perhaps you’ll power through and do it well enough, but nothing you produce will be authentic. For factory-floor corporate work, this isn’t such a tragedy to the product; mediocrity and inauthenticity are not merely survivable but expected and commonplace. For literary fiction, it’s fatal.


I know plenty of people who’ve used traditional publishing: successes and failures; people who defend it and others who loathe it. I know people who’ve been dumped by their agents and fallen to pieces; I know people who’ve been failed by traditional publishing and still defend it; I know people who’ve succeeded but would self publish if they were to do it again; I know bestselling authors with exceptional agents who love what traditional publishing does for them and have no regrets. There seem, at first, to be few similarities between the outlier successes and the horror stories, but there is, in fact, one theme that connects them all.


That theme is: traditionally published authors are employees.

For example, often they give their publishers the right of first refusal, which means they can’t shop work around unless their “home” has already rejected it. Most authors cannot publish, even short stories and bonus chapters, in a world they used without the publisher’s permission. Of course, publishing has elements of a feudal reputation economy, and an author dumped by a publisher or editor will likely find it harder to acquire another one than he did for his debut. And, as bad as it is for an author to lose a publisher, to be dumped by an agent is almost always fatal.

For example, authors who demand that their publishers do their job– market their books– are deemed “difficult”. Those who turn down career-damaging deals with onerous contractual terms get pressure from their agents to acquiesce and, eventually, will be tossed back in the slush pile if their agents get sick of waiting for ‘dat commission. It’s shockingly easy for a writer to end up worse off than pre-debut, and that leaves authors out of power.

Agents don’t fear being dumped by authors, because there are thousands more submitting queries every day. Authors know that if they get dumped, their careers in traditional publishing are over.


These aren’t theoretical concerns. I know of talented writers being dumped (and blacklisted) by their agents for turning down crappy deals. I’ve heard of publishers reneging on promised marketing when the author complained about an ill-chosen title. The old system, under which authors knew that their publishers truly backed them, and that after getting published once, they’d continue to get book deals and competent marketing, is gone.

Of course, people who leave traditional publishing can still self publish, but if that were their plan, they ought not to have wasted time and rights on a different game. It would have been better for them to spend those years self publishing.

It is not always bad to be an employee. I want to make that clear. Nor is there anything sinister about employment. I’d like to have my own show in time, but that’s not everyone’s way; done morally right, employment is a risk transfer. It only becomes immoral when the trade is misrepresented (i.e., the risk reduction is not commensurate with what the employee gives up). I’ll leave it to others to decide, for themselves, whether traditional publishing offers more than it takes away. On an individual level, it depends more on the book and the deal than anything else.

As for being an employee, there are tiers of it. There are seven-figure executives who write their own performance reviews, fly in corporate jets, and have limitless resources for any projects they might imagine, and they are employees; there are also miserable, underpaid, precarious employees. Some people enjoy organizational mechanics, either as a spectator sport or for live-action play; others consider it nonsense and a distraction. Some people excel at the game; others are either bad or, at best, inauthentic when they play. There are as many approaches that can be taken as stories that can be written.

What story do I exist to write? I don’t have a fully-formed answer but, on my own question, I’m further along than anyone else. Clearly no one else knows; I’ve lived half a life to learn that much. My job becomes to figure out the rest.

Why 95 Percent of Software Engineers Lose Nothing By Unionizing

Should software engineers unionize?

I can’t give a simple answer to this. There are advantages and disadvantages to enrolling in a collective bargaining arrangement. If the disadvantages didn’t exist, or weren’t considerable in some situations, everyone would unionize. So, we need to take both sides seriously.

The upsides of collective bargaining are: better compensation on average, better job security, better working conditions, and more protection against managerial adversity. There are a lot of improvements to employment that can only be made with collective negotiation. An individual employee who requested guaranteed severance, the right to appeal performance reviews, transparency in reference-checking and internal transfer, and waiving of onerous (and effectively nonconsensual) but common terms in contracts– e.g., mandatory arbitration provisions, non-competition and non-solicitation agreements, anti-moonlighting provisions– would be laughed out of the building. No individual can negotiate against these terms– it is, for example, embarrassing for an individual to discuss what rights she has if a manager gives a negative performance review– but unions can.

So what are the downsides of unionization? Possible losses of autonomy. Often, an increase in bureaucracy (but most often a tolerable one). Union dues, though usually those are minimal in comparison to the wage gains the unions achieve. Possible declines in upper-tier salaries as compensation moves toward the middle– however, not all unions regulate compensation; for example, unions for athletes, actors, and screenwriters do not seem to have this problem.

There are a small number of individuals in software who would not benefit from unions, and there are a few firms (mostly small, or outside of the for-profit sector) that do not need them.

For instance, if you’re a high-frequency trader making $1 million per year, you probably do not need a union– free agency is working well for you– and you may not want one.

And, if you work in a federally-funded research lab that pays for your graduate education, and that allows you to publish papers, attend conferences, and perform original research on working time, then you probably don’t need a union.

If you’re a Principal Engineer at a “Big N” technology company, making $500,000 per year, who picks and chooses his projects– you’ve never even heard of Jira– and wakes up every morning excited to implement the ideas he dreamt about over night… you may not need a union.

If your boss is personally invested in your career, so much so that the only thing that could prevent you from making senior management within 5 years would be to commit some grievous crime… then you might not want to unionize.

If you’re anyone else– if you’re part of that other 95+ percent, probably 99+ percent; the IT peons– then, chances are, you lose nothing by unionizing.

For example: if you have to justify weeks or days of your working time; if you work on Jira tickets rather than choosing and defining your own projects; if you know for sure that you’re never going to be promoted; if your work is business-driven and you have little or no working time to spend on your own technical interests… then you are hopelessly nuts if you are not in favor of unionization.

Here’s why I say that. If you’re the typical, low-status, open-plan programmer, forced to interview for his own job every morning in “Daily Scrum”, then all the bad things that unions can bring have already happened at your job. Whatever negatives unions might bring– bureaucracy, reduced autonomy, lower status of the profession– have already occurred and are therefore moot.

Is there a risk that a union will introduce bureaucracy and reduce worker autonomy? Yes; sometimes that happens. But, engineers under Jira, Scrum, and Agile (technological surveillance) already have so little autonomy that there’s nothing to lose.

Might a union create an adversarial climate between management and the work force? Sure. But most software engineers are low-status workers whose jobs their bosses would gladly ship overseas, and who live under the surveillance described above. They’ll be fired as soon as their performance dips, or a cheaper worker comes on the market, or they piss the wrong person off. The adversarial climate already exists. Again, nothing to lose.

Do unions tend to pull compensation toward the middle (or, more accurately, the upper middle)? Of course, they do. Software engineers making $500,000 per year might not see a use for unions. That said, any engineer who works on “user stories” is highly unlikely to be anywhere close to that number, and within her current company, never will be. The same applies: nothing to lose.

What do unions do? For good and bad, they commoditize work. The technician, artisan, or engineer, once a union comes in, is no longer fully a creative, unique, lover-of-the-trade (amateur, in the original sense) valued for his intangible, cultural, and long-term (looking back and forward) importance to the organization. Nope, he’s a worker, selling time or labor for money. If both you and your employer believe your work is not a commodity– this attitude still exists in some corners of academia, and in some government agencies– then you might not want to involve a union, since unions are designed to negotiate commodity work.

Let’s be honest, though. If you’re the typical software engineer, then your work has already been commoditized. Your bosses are comparing your salaries to those in countries where drinking water is a luxury. Commoditizing your work is, quite often, your employer’s job. Middle managers are there to reduce risk, and that includes diminishing reliance on singular, high-value individuals. Running a company, if possible, on “commodity” (average) talent isn’t good for us highly-capable people; but it is, when possible, good middle management.

Chances are, you don’t get to pick and choose your projects because “product managers” have better ideas than you (so says the company) about how you should spend your time. You’re told that “story points” and “velocity” aren’t used as performance measures, but when times get tough, they very much are. Open your eyes; when middle managers say that Agile is there to “spot impediments”, what they mean is that it makes it easier and quicker for them to fire people.

A union will also commoditize your work– this lies behind all the objections to them– but it will try to do so in a fair way. Most employers– in private-sector technology, the vast majority of them– will commoditize your work just as readily, but in an unfair way. Which one wins? I think it’s obvious.

If you’ve been indoctrinated, you might think that unions are only valuable for the stragglers and the unambitious, and that the services they offer to workers are useless to average and high performers. False. “I’ve never been fired,” you say. “I could get another job next week,” you say. “The working world is just,” you say.

Most people hope never to face managerial adversity. I have, so I know how it works. When it develops, things start happening fast. The worker is usually unprepared. In fact, he’s at a disadvantage. The manager has the right to use “working time” to wage the political fight– because “managing people out” is literally part of his job– while the worker has to sustain a 40-hour effort in addition to playing the political side-game of fighting the adversity or PIP. It’s the sort of ugly, brutal fight that managers understand from experience (although even most managers dislike the process) and, because managers choose the time and place of each confrontation, they have every possible advantage. The worker thinks it’s a “catch up” meeting because that’s what the calendar says. A stranger from HR is there: it’s an ambush. Two witnesses against one, and because corporate fascism-lite is under-regulated in our country, the employee does not have the right to an attorney, nor to remain silent.

What might be able to counterbalance such disadvantages? Oh, right. A union.

What, though, if you’re happy with your compensation and don’t consider yourself a low performer? Do you still need a union?

Saying “I don’t need a union because I’m a high performer” is like saying “I don’t need to know about self-defense, because I’m so good-looking no one would ever attack me.” Real talk: that meth-addicted, drunk scumbag does not care one whit for your pretty face, buddy. Run if you at all can; avoid the fight if he’ll listen to reason; but, defend yourself if you must.

Have you, dear reader, been in a street fight? I don’t mean a boxing match, a prize fight where there are still rules, or a childhood or middle-school fight that ends once one person has won. I’m talking about a real adult fistfight– also known as: for the attacker, an assault; for the defender, a self-defense situation– where multiple assailants, deadly weapons, and continued (and possibly lethal) violence after defeat are serious possibilities. I, personally, have not.

Most people haven’t. I’ve studied combat enough to know that most people (including, quite possibly, me) have no idea what the fuck to do when such a situation emerges. Many victims freeze. Given that an average street fight is over in about ten seconds– after that point, it’s more of a one-sided beatdown of the loser– that’s deadly. But it’s something that untrained humans are not well-equipped to handle.

Even people with excellent self-defense training avoid street fights– there are too many bad things that can happen, and nothing good. Sometimes, they lose. Why? Because their training, mostly oriented around friendly sparring, has them primed to stop short of hurting the assailant. That’s noble, but against someone who will bite and eye-gouge and resort to murder, this is a disadvantage.

What sorts of people are experienced with street fights (not sparring)? Criminals, reprobates, psychopaths…. Thugs. They’ve been in a few. Pain that would stall or incapacitate the uninitiated (that is, most of us) doesn’t faze them; they may be on drugs. They’ll do anything to win. They’ve stomped on necks and heads; they’ve pulled knives and guns; they’ve possibly committed sexual assaults against their victims. They know and choose the venue. They select the target and the time. They may have friends waiting to get in on the action. They may have weapons. They know almost everything about the situation they’re about to enter and, most of the time, their target knows nothing.

The odds for an untrained defender, in an unanticipated self-defense situation, are extremely poor.

It’s the same in the corporate world, when it comes to managerial adversity. Most workers think they’re decent performers– and, quite often, they are– and when they’re hit out of the blue with a PIP, they don’t know what’s going on. Was it a performance problem? Often, no. Perhaps the manager found a 2013 blog post and disliked the employee’s political views or religion. Perhaps, as is usual in private-sector technology, the company dishonestly represented a layoff as a rash of performance-based firings. Perhaps the employee is working in good faith, but performing poorly for reasons that aren’t her fault: poor project/person fit, or life events like health issues, sick parents, or divorce. Perhaps some stranger three levels up made the call, to free up a spot for his nephew, and the hapless middle manager got stuck doing the paperwork.

The corporate world is a might-makes-right system with no sense of ethics. To those on top, there is no line between power and the abuse of power; what we plebeians call “abuse”, they call “power”. What use would power have, they ask, if there were rules put on it?

People suffer all sorts of career punishments– PIPs, firings, bad references, damaged reputations– for reasons that aren’t their fault. The idea that only bad workers end up in this situation is analogous to the idea that the only people who can be assaulted on the streets are those who asked for it.

As in a street fight, the odds are overwhelmingly bad for an employee under managerial adversity. The other side has more information, more power, and more experience. Management and HR have done this before. The worker? It’s likely her first or second time.

In a non-union, private-sector organization like the typical technology company, to be an employee is to walk down the streets, alone, at 2:30 in the morning.

For everything one can learn in a self-defense class– proper fighting techniques improve one’s chances from impossible to merely undesirable– the best defense is to avoid dangerous places altogether. In the corporate world, that’s not possible. This is a country where at-will employment is the law of the land, so every time and every place is dangerous. Every street should be considered a slum; it’s always 2:30 in the morning.

If one must go into a dangerous place, what’s the best means of defense? The same rules that apply in bear country: don’t go alone. Wild animals rarely attack humans in groups, and criminals tend to be similar. But the corporate system is designed to isolate those it wishes to target. In the meetings that unfold under managerial adversity, the boss can bring in whoever he wants– HR, higher-level bosses, “Scrum Masters” and miscellaneous enforcers, even his 9-year-old son to laugh at the poor worker– while the target can bring in… only himself.

I do not intend to peddle illusions. Unions aren’t perfect. They aren’t good in all situations. However, most of private-sector technology needs them. Why? Because they allow the worker to exercise his right not to go alone. The HR tactics (e.g., stack ranking, performance surveillance, constructive dismissal) that have become so common in technology companies as to be accepted practice would simply not survive under a decent union.

The average non-managerial white-collar worker has never been in the street fight of managerial adversity. Unions have. They know exactly what to do– and what not to do– when a situation turns nasty. Fights, albeit for the side of good, are much of what they do.

Again, if you’re in that elite cadre of software programmers who get to work on whatever they want, who find $400/hour consulting work just by asking for it in a tweet, and whose bosses see them as future leaders of the company… then you’re probably not reading my blog for career advice. On the other hand, if you’re in that other 95-plus (to be honest, it’s probably 99-plus) percent, you should unionize. All the bureaucracy and commoditization that you fear might come from a union is already around you; you can’t make it go away, so the best thing to do is to make it fair.

Incel: the Strange Identity That Became a Weapon Against Feminism

The incels are coming. Hide the socks.

The word incel means different things to different people, which makes for dangerous discussions. On the surface, all it takes to qualify as an incel is to be involuntarily celibate, a fairly common turn of fate that most people experience at least once. And yet a community of homegrown extremists and terrorists has taken up the label incel to describe something darker: a defeatist mentality asserting that women (and especially feminists) have doomed a large percentage of men to implacable misery.

If by “incel” one means a misogynist or extremist, then nothing is acceptable but a determination to end that culture. Of course, to attack incels as people risks association with one of the oldest pillars of patriarchy: virgin shaming. This is why I don’t like the term incel: the extremists began using it to gain sympathy, but also to recruit, because although pathological misogynists are uncommon, people suffering annoying dry spells (and at ages 15–25, when people are most susceptible to propaganda, they are mostly men) are not.

Make no mistake about the incel identity, though: whatever the word meant once, it has lately been used as a self-identification by a culture and ideology so frightening, retrogressive, misogynistic, and downright insane that it takes a strong stomach to look at it square-on.

In an age of proliferating identities, where personality traits become labels, and we have terms like demisexual, otherkin, wagecuck and NEET flying about, the identity of incel is perhaps the strangest, because it fixates on what is, for almost everyone in fact, a transient frustration. Sexually speaking, there’s an order of magnitude more demand for young (18–23) women than men of that age, and so this period of time is unpleasant for most men. So much so that societies have had to invent ways to deal with it: prostitution is an old one, and martial culture (giving young men a source of worth) is another. College is yet another technique that tries to handle it, by culturally and geographically isolating 18–22 year olds so young men have a chance. Mostly, though, this problem is managed privately using wealth transfer, especially around social and cultural capital where there’s enough ambiguity to make it socially acceptable. Young men from privilege get set up, by their parents and inherited networks, with precocious career advancement to give them esteem, build their confidence, and maximize their “eligibility” when they hit the golden years of male sexual attractiveness (25 to 44) and look for marital partners. The rest of the young men can go die, as far as conservative patriarchal societies like our late-stage corporate capitalism are concerned.

That’s what’s so weird about incel rage. These men are blaming women for something that patriarchy did to them. Women didn’t create the Hollywood narrative under which only young sex counts (quite the opposite of the truth) and a man is a loser if a virgin at 25. Women didn’t crash the job market. Women didn’t drive up college tuitions. Patriarchy– and about 90 percent of the people running it are men– did that.

Inherent in the incel worldview is the notion that this transient state– an unfavorable sexual power balance, since women reach high levels of sexual attractiveness so much earlier than men– will last forever. Average- and even above-average-looking incels declare themselves “ugly” based on facial bone structure traits that haven’t been fetishized this heavily since the racist pseudoscience of the late 19th century. Male grievance culture isn’t new; it’s been around forever. What is new is the degree of despair and violence, which couldn’t have hit a critical mass until recently. Male grievance culture– from mainstream sexism in the 1950s and ’60s, to the rakish porn-star chauvinism of the 1970s and ’80s, to the pickup artistry of the 1990s and ’00s, to the raving misogyny of incels today– has become more cult-like with each iteration. What gives a true cult its ultimate hard-on? Apocalypse. What did it take to bring the incel phenomenon about? Socioeconomic collapse.

The economic changes of 2008 were managed well enough to protect wealthier and older people by keeping asset prices up. Socioeconomically, however, they were cataclysmic, and most of society underappreciates the damage that has been done. We’ll be reeling from this fifty years from now. The shithole we let our society become will kill people in the future even if we fix everything now. For example, people will die in 2060 because of anti-medical prejudices and bad habits developed, right now, in this era of unaffordable, lousy care, atrocious coverage, and adversarial behavior by employers and insurers who’ll break a social contract (and often a legal one) as soon as there’s a dollar in it. The world changed; an apocalypse actually happened.

What does this have to do with incels? Well, they are a post-apocalyptic creature. What makes them unnerving and sometimes disgusting is their complete lack of insight into the nature of the apocalypse inflicted upon them. They blame women for a social calamity– one that has left them hurting and miserable– that was, in fact, caused by corporate capitalism.

Incels believe that priapic creatures like “Chads” (the male entitlement figure of yesteryear) and “Tyrones” (an offensive African-American stereotype) scour the wasteland of modern human sexuality and fight over the last remaining “pure” women like junkyard animals. Names that Chicagoans and Twin Cities residents used to describe less-sophisticated Midwesterners who gave the region a bad name– Trixie, Chad, Becky and Cam– have mutated into supposed creatures one would expect to fight in the 2300 AD world of Chrono Trigger. Anyway, in this post-apocalyptic, over-fucked and semen-drenched world, everyone’s having lots of sex– frequent, amazing sex, because that totally happens at 18, and also inexplicably stops around 23– except them.

I’ve spent months studying human sexuality, in part as background research for Farisa’s Crossing, since I’m having to build characters with sexualities different from my own. You’d think there’d be data to support this sexual apocalypse, if one were going on. Nope. For example, infidelity and marital failure are becoming less common, as is participation in high school and college casual sex. The culture’s healing, not falling apart. What’s driving it? It turns out that feminism is a good thing, especially for so-called “beta” males who lack the glib charm, aggressive presence, irresponsible risk-seeking and financial resources to succeed at high-frequency casual promiscuity.

It’s patriarchy that drives women into the arms of boorish alpha males– the sorts who climb corporate hierarchies– not feminism. When women don’t have to marry early out of economic necessity, and when they choose their husbands instead of having those choices made by their fathers or economic forces, so-called “beta” males win (contrary to “Chad” phobia) more often than the aggressive, boorish men our society deems “alpha”.

Incels and MRAs halfway acknowledge female maturation, but because they’re so obsessed with casual sex, they’ve built up another toxic narrative to explain it.

The worst men seem to win at casual sex. No one disputes this. Even if decent men are having casual sex– and one must be careful about terminology here: does it count as casual if it becomes a legitimate relationship? what about if it happens between two close friends?– it is most often the case that the indecent are loud about it. Perhaps normal people are doing all kinds of stuff only they know about, but the loudest cultural narrative one sees in casual sex is that of macho, entitled men taking advantage of women with low self-esteem (often, victims of abuse) with copious alcohol in the mix. This is unhealthy; it’s hideous. Is it the sexual mainstream? No.

There is no sexual apocalypse. Terminological debates aside, casual sex and in particular stranger sex (to which incel fantasies about hyper-aggressive demonic men absconding with superficial women might apply) seem to be going down. This variety of sexuality, perhaps deserving of its vilification for its superficiality and tendency to spread disease, isn’t common at all. Most women have no casual encounters; or they have one, don’t like it, and never do it again; or they only have them when led to believe (often by unscrupulous men such as “pickup artists”) that romantic relationships are forming, which is not their fault. Few women knowingly have casual sex, and those who do mostly don’t enjoy it: only about 10 percent of women orgasm during a one-night stand. In this light, the incel mythology about women pining for selfish “Chads” is a bit absurd to anyone who understands sex. Monogamous relationships, which women overwhelmingly prefer (with a few exceptions), are very much in, and feminism is no threat to them.

There’s a difference, of course, between a dry spell and an apocalypse: between weather and climate collapse. Like most 20-year-old men, I was socially and romantically unsuccessful compared to what I wanted to be at that age, but I knew it would get better. At that age, women have all the options and men have maximal competition; it improves. Incels, on the other hand, have tied themselves to the mast with an extreme notion that there’s no hope. Most of these guys aren’t unattractive or seriously disabled (except, perhaps, for often-treatable mental illnesses) and they come overwhelmingly from the middle class of the English-speaking world. They live in diverse countries where they could easily meet women from all sorts of cultural backgrounds. It is not hopeless, at least sexually speaking, for them at all; if they evicted the misogynistic, cultish garbage from their heads, they’d be fine.

They just need something better to do with themselves. See, the way we handled dry spells, back before the 0.1 percent trashed the economy, was to focus on our careers. Those existed back then. There was a time– in 2018, it’s hard to imagine this– when applying for jobs actually worked. Transiently sexless men had something to do other than stew about experiences they weren’t having.

We are not in a sexual apocalypse caused by feminism. We are in a socioeconomic apocalypse caused by corporate capitalism– also known sometimes as “the patriarchy”, though I dislike this term because it demonizes fatherhood, and it gives too much credit to an oppressive system. We must know who our true enemies really are. Incels have allowed themselves to become useful idiots, who blame the malfeasance of corporate patriarchy on women.

Incels aren’t miserable because of women. They’re not miserable because they aren’t getting sex: it has always been hard for young men to get sex, and sexlessness doesn’t always lead to such rage. To wit, most 60-year-old widows become involuntarily celibate, but don’t fall into rage. These men are miserable because society has subjected them to a long con. They’ve been swindled. Society has stuffed their minds full of rotten ideas that are leading them down a bad road.

When a corporate-capitalist society such as Mussolini’s Italy or Corporate America perceives peace, it does not go out of its way to differentiate gender roles: for example, in overt governmental fascism, men and women are both told they must support the state; in our covert employer-nucleated fascism, the directive is to support a manager’s career and hope to be invited to ride his coattails. When such a society perceives war, though, gender roles emerge: the woman becomes a soldier factory, favored for her ability to produce children of the master race; men become sacrificial and are told to accept posthumous glory, for not all will survive what the society decides it must do.

Our time is unique, in that peace and war have become one, like the gas and liquid phases of a supercritical fluid. Most Americans do not sacrifice, as we would in war– meat and sugar are not rationed, gas prices of $3 per gallon are cause for complaint– and the wealthiest quarter of us can live in peaceful prosperity. At the same time, war surrounds us: two campaigns we started (with unclear intention) last decade still rage in the Middle East; our appetite for drugs has financed violence and upheaval from Juarez to Medellin; and social media drama can render an individual unemployable (and blacklisting is, I would argue, an act of war). In what state are we? A peace with pockets of war, or a war that looks like peace? If war, who is fighting whom?

Patriarchal and fascistic societies ramp up toxic masculinity in preparation for war, especially when they intend to be the aggressor, and wind it down (into smoldering chauvinism, as in the 1950s) during peace. So what’s our state today? We live in a mostly-peaceful but tenuous time of asymmetric economic war. Overt acts of aggression (health insurance denials; negative employment references and blacklisting; social-media harassment campaigns; poisoning of public water resources) are fairly uncommon, but terrifying, rapid in their onset, and hard to prevent or control. We live in a time where 140 characters from a powerful person can send fifty unconnected strangers to harass any target in the world. We live in a time when workers get fired, quite literally by computers whose sole purpose is performance surveillance; the manager’s only function is to read the monthly print-out and deliver the bad news.

It’s important to understand that, while this war is different from any other, it is a real war. The 0.1 percent has not been waging “class war”, as if that were some lesser category of conflict, against the rest of us. It is, and has been for a long time, waging an actual war. People have died because of it.

Incels get the nature of this calamity wrong. They’re too young to know about health insurance, and they haven’t gotten into the corporate world yet, which is why they think it’s only women who are capable of maltreating people. It’s impossible to sympathize with the militant incels, because they lash out at innocents, but they are perceptive of the fact that an apocalypse is underway. Their mistake is that they mischaracterize it. Millennials really have been fucked over by previous generations.

Had the upper class not stabbed us in the back, we know what society would look like, and we know because this is what it was like, forty years ago: if you had a car and a college education, you could talk your way on to a job anywhere in the country. You’d call an executive on Thursday, have an hour-long lunch with him on Friday, and start on Monday. If you were 27 or older, you’d get a management-level job. If you were 32 or older, you’d get an executive job. If the job required an advanced degree, the company would send you back to school. If it was 1:30 in the afternoon and you were still working, you were a go-getter who’d get every promotion. This is the country we used to have, and our elite took it from us, and we should be willing to fight them– to die, and even to kill, if necessary– if we stand a chance of getting it back.

What killed our society, starting in the 1970s? The right wing wants people to believe that social advances (feminism, gender liberalism) had something to do with the economic degradation that began around the same time. That could not be farther from the truth. In fact, the situation for women and racial minorities has been declining of late, specifically because of worsening economic inequality and job prospects. So what did go to hell in the late 1970s? Again, the culprit is toxic masculinity.

The elite of the 1940s–70s saw themselves as a national elite and took pride in making the country better: building libraries and museums, supporting progressive causes, and making education more available. To the extent that this can be gendered (and, at the time, it was), this was a productive masculinity that brought society (and, over time, women) forward. Toxic masculinity never took a break, but in economics, it was on defense for a solid forty years, only to rage back into focus in the 1980s. Why?

One might be tempted to pin our society’s self-created decline on “the Reagan Era”, but I don’t think one conservative politician can be blamed for everything that happened. Rather, as we became increasingly connected, our national elite re-polarized. This ties in to our hatred for Baby Boomers. Most Baby Boomers aren’t the privileged assholes we love to hate on– the traditional Boomer narrative ignores black Boomers, gay Boomers, dead-in-Vietnam Boomers, and Boomers who fought for the rights of minorities or engaged in the (alas, losing) battle against corporate supremacy. But the Boomer 1% deserves its horrible reputation. These were the guys who compared themselves to oil sheikhs, third-world despots, narcotraficantes, and (after 1990) post-Soviet kleptocrats and decided that the American CEO– making $400,000 per year, and having to follow his country’s laws– was the short man in the group.

The lesson from the Boomer 1% is to forget Milton’s comparison of reigning in hell versus serving in heaven. From a material perspective, it is even superior to reign in hell over reigning in heaven. The 1980s is the decade when our elite began intentionally de-civilizing us in order to join the slurry of kleptocratic garbage that is the global elite.

It is hard to imagine reversing the above. The national elite, as it once was, is dead. After selling us out, it was subsumed into the malevolent global one. Toxic masculinity runs the world again– to everyone’s detriment. It’s the force that drives a man with $1 billion to want $10 billion, or a man with a beautiful wife to cheat because he has decided that the world owes him 10 (and then 100) beautiful women. It is not enough for him to drink and enjoy his milkshake. He must drink all the milkshakes, even if he throws up afterward.

Incels are not the men running the world, of course. They’re not drinking any milkshakes. In fact, they’re triple-threat losers. They’re sexual losers because of their social alienation and self-sabotaging tendencies, perhaps inherited from our puritanical culture’s views of sex as dirty (amplified by an envy of the mature and less inhibited). They’re social losers because toxic masculinity says in no uncertain terms that low-status, unsuccessful men are worth less than garbage and ought to be viewed with suspicion. They’re economic losers because the high-autonomy middle-class jobs (which would be fantastic plum positions by today’s standards) have been replaced by technologically surveilled and menial subordinate work. They exhibit toxic masculinity in their odious attitudes toward women, but they’ve also been crushed by it.

The logical fallacies of the male grievance culture are too numerous to list– each one could get an essay of its own– but the most prominent (no pun intended) is the apex fallacy. An apex fallacy exists when one compares the most successful or fortunate of one group against the average-case performance or outcomes of one’s own. Reactionaries and nostalgists often indulge in apex fallacies, comparing their lot as average people today to those of kings, knights and ladies– not peasants who die at 33 of typhoid. Likewise, incels believe that women drown in male attention because they’re hyperfocused on the white, blonde, young “Stacies” that so many other men are chasing. Apex fallacies likely exist because it is advantageous to observe the most successful individuals. When the pinnacle of a society is corrupt, calamity is likely to follow for that reason– bad examples are being set– and we should be scared for that reason. Incels look at the top of society and see people devoid of virtue– the unaccountable, unscrupulous, self-indulgent “Chads”, almost always from well-connected families– winning. Their most noted reaction, “Why can’t that be me?”, is hardly sympathetic, but their problem is. In terms of male role models, our society is in dissolution.

Corporate capitalism, and other forms of dysfunctional patriarchy, cannot keep themselves afloat without using various narratives to manipulate people’s desires and therefore allay the resentment that would otherwise accrue to the corrupt top. For example, it is patriarchy (not feminism) that tells men they are worthless if they cannot support a family on one income. To be a “basement dweller”, under patriarchy, is to be less than human. The system tells men to derive their sense of worth from capability, especially as expressed in competitive endeavors– even if those contests are dehumanizing or stupid. In high school and college, one of the most fetishized (but also most detrimental to personal growth) competence metrics is the ability to procure sex when one wants it. (And, further according to this narrative, men always want sex; or else there is something wrong with them.) When incels struggle with a normal, benign thing– that it is difficult for men under 25 to find sexual partners– they begin to see themselves as useless incompetents, doomed to fail in all other areas of life. They shut down; they lose contact with their friends, their grades drop, and they become addicted to video games and internet trolling– living out their power fantasies behind a keyboard.

What does patriarchy think of this massive waste of male talent? Patriarchy couldn’t be happier. See, virgin shaming is what keeps men going into work, in order to procure those pictures of dead people, that can be traded for social experiences like overpriced meals and recreational neurotoxins, that may on occasion lead to sexual access.

There’s a response I can imagine coming from incels and MRAs, which is that women, as much as men, can participate in virgin shaming, gold digging, and various other behaviors that keep toxic masculinity in place. Of course, that’s true. See, feminism doesn’t require a conviction that women are innately morally superior to men. I am a feminist and hold no such belief. I think the distributions of moral character are most likely equivalent across genders. And just as there are good men aligned with feminist causes, there are plenty of women who lend their support to patriarchy, who enforce its doctrines, and even who prefer to live within it. Women actually exist who uphold toxic values by making themselves available to the sorts of malignant, aggressive men running our civilization into the ground. It is not acknowledgment of their existence that makes MRAs and incels problematic; it is their inaccurate belief that immature, damaged women are somehow representative of the gender (they are not) that makes this dangerous. The truth is that, in a world of billions of people, you’re bound to find everything.

What is feminism? I think it has two components. One is the belief that women ought to have equal political and economic rights to men. That, in itself, doesn’t need to be called feminism. If this were all there were to the feminist cause, I’d have no issue with people who say, “I’m not a feminist; I’m an equalist”. The second component pertains not to biological femaleness but to femininity. This gets tricky, because it’s not clear that any of the differences between “masculine” and “feminine” nature exist in any innate way. Any discussion of masculinity and femininity must be relative to a cultural frame. There’s a lot of virtue– compassion, judgment, quiet competence, collaboration over competition, sexual restraint– that lives in what our culture construes as feminine. What makes toxic masculinity so virulent is that it’s built to destroy the feminine. It does not necessarily hate females; it hates femininity in women, but especially in men. What we’re learning, as our late-stage corporate capitalism destroys the planet ecologically, culturally, and socially as well as economically, is that in order to survive for another century, we’re going to have to become more feminine. It is not about women as superior to men (I do not think they are) but the need for us, as humans, to evolve in a more feminine direction and, while retaining masculinity’s virtues, purge it of its aggressive and toxic elements.

Feminism also has tons of historical support. Making things better for women also makes the world better for men. Gender is not a zero-sum game.

Self-indulgence is often painted by misogynists (most likely, a case of projection) as a female vice, but it’s actually the core of toxic masculinity. This is not to say that female self-indulgence and toxic femininity don’t exist– every woman who demands an expensive carbon crystal before she’ll marry is engaging in an instance of toxic femininity (manufactured by toxic men in the diamond industry)– but it seems to be toxic masculinity that is most capable of metastasis. Toxic masculinity says: one must grow up and acquire, acquire, acquire; one must do it fast; and one who acquires less than other men is inferior and not really a man at all. Accrued wealth and paid work become the sole, numerical metric of male value, even though family contacts account for almost all of what actually happens in the career game. One cannot criticize the might-makes-right corporate system, either, unless one wants to risk being called “whiny”, “weak”, “a snowflake”, or (who can forget this classic?) “a fag”.

Corporate capitalism and toxic masculinity are cruel, and there’s no moral justification for shoehorning 50 percent of the population into it (and forcing the other 50 percent to clean up). But is this brand of masculinity a con? I don’t think it always was. In the 1950s, there was real work to be done, and people could make a living by doing it. Competence and merit actually mattered: there were more small businesses, it was easier for a skilled person to escape a reputation problem and reinvent himself, and there was high federal investment in R&D, resulting in 4–6 percent annual economic growth. For all the flaws of that era– I can’t think of anyone sane who’d want to restore 1950s gender or race relations– it was a time when work worked.

Keynes predicted that, by now, we’d be working about 5–10 hours per week. About the amount of work that’s actually necessary, he turned out to be right. So where’s our leisure society? Nowhere, because of the Graeberian imperative to hold position. People now spend 10 hours to work a 2-hour day, passing the rest of the time in useless anxiety in open-plan offices that exist largely to humiliate them. If the bosses figure out how little work is necessary, they’ll cut jobs and workers will lose, so it must be hidden. The work being done almost never matters; it is mostly a commodity, and little respect accrues to people who do actual work. Instead, we’re a nation of professional reputation managers. If you’re not disgusted by the notion, you’re not human. Of course, this means that the winners of the new economy are those people (mostly, physically imposing men, because even though violent confrontations have been rare for thousands of years– it’s not how we like to do business– it is just easier to ask for favors when one could physically end the other’s life) who can force others to manicure their reputations for them. Neofeudalism sets in: those who have permanent staffs of reputation managers (of course, the firms that employ them fully believe real work is being done, and occasionally it is) become lords, and those who support their campaigns for relevance in a blandly decadent, pointless economic system become the vassals.

One can see this most prominently in that people do things that are more work-like for their hobbies– gardening, hunting, hiking, learning new fields, writing– than the stupid, sedentary, humiliating subordinate bullshit they endure under the proto-fascist corporate regime of status reports about status reports they call “work”. Men (and women) used to go to work and do things, but now they go to work and subordinate to other, almost always completely useless, men.

Isn’t this ancient, though? Hasn’t work always been about subordination? Well, yes and no. This topic requires more words than I can give it, but complex endeavors always require operational subordination. That is, some people have to take direction from others, and apprentices need more direction than seasoned masters. There’s nothing wrong with operational subordination; we do it every day, to our benefit, when we stop at a red traffic signal. It is better to follow a sound order and wait two minutes than to disobey it and possibly die in a preventable traffic accident. Operational subordination isn’t humiliating; it’s just something we need to do. In today’s corporate climate, though, the demand has gone beyond lawful operational subordination into personal subordination. It is not enough for the worker to take direction; he must fully accept the total superiority of the manager. It is not enough to do the job well; he must pretend to like it, he must ask for more grunt work when he is underutilized, and he can never for a second allow anyone to hold the suspicion that he might be smarter than the mediocre apparatchik doling out the tasks.

Here is where I offend some leftists: it may be entirely due to socialization, but men and women are different. Women are, to put it bluntly, better actors. They learn how to be pleasant to people they dislike, to mirror emotions without feeling them, and to engage in the ceremony of personal subordination while, in fact, avoiding major compromise. They’re socialized to put a crumple zone between them and abuse that is coming from uphill. Perhaps that’s why, even though corporate culture is terrible for women, it’s devastating to men. Women can play a humiliating, stupid game– powdering the bottoms and attending the whims of adult babies called “executives”– without total personal collapse, whereas men seem unable to do so. I don’t think the explanation is that men are weaker; I think we are not socialized as well to be actors– to be able to play a humiliating, subordinate role for 8 hours per day without internalizing it– and that we are also pushed to identify with paid work (a problem, in an economy where humiliation is the only thing left most people will pay for) more than women are.

If you tell men that the highest expression of masculinity is to go into a workplace and subordinate to other men– not the temporary operational subordination of the apprentice, but a permanent personal subordination to better-placed, my-daddy-made-a-call mediocrity– you’re going to have a masculine crisis on your hands. And we do. While I won’t get into detail about Jordan Peterson, his appeal seems to derive from his willingness to address the masculine crisis head-on, without fear. (This is not to say that he knows how to solve it.) But here’s the truth: our masculine crisis will not be solved until we eradicate artificial scarcities (which exist to manipulate men into working hard, on the promise that those proxies for female sexual attention– job titles, higher salaries– actually mean something) and corporate capitalism itself. To kill corporate capitalism, we’ll need to institute a more compassionate society– one that takes care of people, sending them to school if they wish, paying favors forward without expecting immediate return– and that would be, traditionally, more feminine. So we have the odd-sounding-but-true conclusion that the solution to our masculine crisis is (in part) feminism.

What was done to these incels was not done by women. It was done to them by patriarchy: a system that has inculcated the notion of women as sexual objects and rewards for participating in an economic system that professes to be a meritocracy but that, on closer inspection, is no further along an evolutionary journey than might-makes-right barbarism. They are not merely entitled-men-the-enemy. They have been infected by terrible ideas and they are suffering intensely. And while their expressions of rage, both on and off the internet, are often unacceptable, we must raise our focus away from this particular element, and smash the woman-hating, racist, elitist, proto-fascist corporate system that created them in the first place.

The Green Pill: the Case for Doing the Right Thing; Why Feminism Is Good for Men, Too; and What to Learn from “Incels”.

Fifteen years ago, I got taken in by the male grievance cult and swallowed its nonsense whole: pickup artists, Chads, dual mating strategies… all that garbage, though we had different names for the stuff. It was the same ugly culture, though it seems to have gotten worse. I was what would today be called an “incel”: unsuccessful with women, and seething with rage. I’m ashamed of my participation in that world, and in the then-fledgling art of internet trolling, seeing what all the nonsense has led to.

Today, by contrast, I’m happily married to a feminist woman, and I’m writing Farisa’s Crossing, a novel with a female protagonist. What changed? Well, the time in between has been quite interesting, and I think there’s something one could learn from my own zero’s journey into (and, later, out of) the “red pill” world of male grievance culture. Yet, every time I sit down to write “that” essay… I just fucking can’t. I don’t like reliving it. Today’s incel phenomenon hits too close to home. I read delusional, angry screeds on the “braincels” subreddit or various other incel forums, and I remember a time when I could have believed (or even said) such things. It gives me a headache.

Forgive me if this is raw. I’m not a saint. I don’t judge the male grievance community– increasingly like a cult in its commitment to a set of incorrect, self-defeating, and misogynistic beliefs with no basis in reality– from a place of superiority. I was there once. I got taken in, and I got out. I know how it operates, and I know why it appeals to some young men.

What is it that drives young men, while they endure that oppressively quotidian and not especially harmful problem of early-adulthood sexual infrequency, into such rage and despair? Well, I think everyone should watch this video about charismatic anger. Rage spreads. Fear sells a story. Angry memes stick in the mind, regardless of truth. Add to this some confirmation bias, apex fallacies, and ready-made excuses for one’s own sexual infrequency– “it’s not me; it’s all women”– and you get a self-defeating complex that takes years to evict from one’s head. As with a cult’s illogic, smart and otherwise rational people don’t seem to be immune to this.

Most of the guys who get taken in by male grievance culture are like me around age 20: decent men in a vulnerable, difficult time where the rules are unclear, it’s hard to know what’s going on, and everyone else seems to be doing better. That said, the luminaries of this culture seem to be an assortment of loathsome creatures, such as: white nationalists (who argue that multiculturalism and miscegenation have caused the incel’s problems), domestic abusers, pedophiles (who wish to normalize their perversion by demonizing adult female sexuality), and the sex addicts who call themselves “pickup artists”.

The male grievance cult, in other words, draws its strength from the worst of the male gender.

Blue, Red, Black and Green Pills

Mainstream American culture doesn’t indulge in the overt misogyny of incels or pickup artists; it indulges in one of politically correct hypocrisy. We claim to be liberal and vote conservative. We support a might-makes-right economic system, corporate capitalism, in which economically successful men (until 2017) were able to maltreat women with impunity; it was one of the perks of being an executive. This corporatized, paper-thin, dishonest culture I call chauvalry: a combination of chivalry and chauvinism. It’s what the male grievance culture calls the blue pill.

The blue pill’s not feminist. It’s the worldview of the Hollywood movie where being “a nice guy” and working hard for his boss is enough that a man “ought to” get sex any time he wants it. In romantic comedies, it shows us male behaviors that would actually put someone in jail: ticketless airport runs, stalker-level displays of singular attention at the Act-2/Act-3 transition, punching guys in the face who look at one’s girlfriend the wrong way. It tells men that if they do the right thing, two hours of cat saving ought to be enough to attract women– even if a man is still in high school. That’s not how it works. An exercise montage stands in for the hundreds of hours it takes to fix or improve an ill-cared-for body. Adults know this, but adolescents might not fully get it.

The blue pill, “nice guy” worldview is casually misogynistic. It indulges in just world fallacies that suit our corporate masters. Do what you’re told for fifty weeks, it says, and your beautiful wife will give you hot sex on your two-week vacation. Be the hard-working “all-American” guy, and you’ll get laid, no problem. It presents sex as the ultimate validation of male virtue, and women as a sort of “insert compliments and free dinners, receive blowjobs and nookie” vending machine. You don’t have to go to the gym and become an attractive person, or read books and become an interesting person; just show up at your job, and a pretty girl will come by and touch your dick, we promise.

Of course, sex is not (nor should it be) the measure of male virtue. Men are not owed sex for being productive members of society; they are not owed sex at all from anyone who does not want to have sex with them.

Usually by the first or second year of college, men realize that the blue-pill story is fraudulent. They see useless men getting ample sexual activity in high school and they’re told that it’s different in college. Whether they go to a state school or to Harvard, it isn’t. Actually, I don’t think that useless men are getting more sex on average than anyone else; they’re just the ones who make a trophy out of it. The decent people are having sex, too; they’re just not talking about it.

In The Matrix, the protagonist is offered a choice: take the blue pill and persist in self-deception, or take the red one and engage reality, starting on a hero’s journey. The male grievance community co-opted this metaphor, and started using the term “red pill” to describe their alternative, less corporate but more vicious, misogyny. Of course, what they call “red pill” isn’t any more reflective of reality than the blue pill worldview they reject (and that all thinking adults know to be a facade). But, they took that term first, and we’re stuck with it that way.

The red pill view of women and relationships is much more dismal. It views all of us (male and female) as selfish, hypersexual, narcissistic and obsessed with physical appearances. Coming from a mix of failed providers who fared poorly in divorces– most divorces aren’t “won” by the woman, but impair both parties’ finances– and sex-addicted pickup artists, the red-pill view that dominates male grievance culture is intensely negative. For example, the typical red-pill view of women is that they all secretly long for domineering men (“Chad”, in incel lore) who will degrade them.

In the early 2000s, the process was called “speed seduction” or “Game”; now it’s known as pickup artistry. The truth about pickup artists is that they’re often insecure, disease-ridden, broken men. Their lives aren’t enviable. Their high-frequency promiscuity is mostly made possible by lowering of standards. A bona fide sex addict doesn’t care if she’s a “9/10” marriage-worthy chemical engineer or a “3/10” disease-ridden drunk party girl, and relationally-impaired men often can’t sustain the effort necessary to attract the former. However, their unhealthy lifestyles have given them hypertrophic social ability. They know what a certain subclass of women, selected for rapid sexual availability, want.

What makes pickup artistry so dangerous an art for a young man to learn is that (as with cults) the first courses and modules focus on what 97% of people (i.e., the ones who know it) would call common sense– basic social skills and grooming: don’t talk about sex on the first date, wear dark colors to seem more masculine, do between 25 and 40 percent of the talking. All of this advice actually works, contradicting that two-word blue pill myth, Be Yourself. It’s the later material in the pickup world that’s more disturbing. Ultimately, pickup artists’ views of women are not based on the best sample, but on the small percentage of women on whom cheap tricks work. Since pickup artists are rarely able to achieve long-term, mutually enriching relationships, they deny their possibility. Run Game forever, they say. Never let your guard down, they say. Don’t be vulnerable, they say. All girls are basically the same, they say. It’s best not to listen to that shit. None of it’s true.

Pickup artistry doesn’t work, not as advertised. The high-pressure sales tactics that lead to quick lays will undermine genuine relationships. “Dread game” is abuse. Finally, having sex with a lot of different women never cures insecurity. Sex is amazing when it exists on its own, in the context of a loving relationship, but sex rarely solves problems. It does have a fascinating history of creating them, though.

Where the blue-pill lie ignores corruption, the red-pill lie exaggerates it, and advises one to manipulate it for personal benefit. This doesn’t work as most people hope, because few people get away with aggressive non-virtue for very long. Con artistry is a great way to get a one-night stand, and a shitty way to find relationships. In the long run, most people don’t find it fulfilling; men who indulge in pornified casual sex lose interest in “7/10” women, in the same way that porn addicts tire of vanilla scenes and gravitate toward the extreme, and obsess over the “9/10” they can never have.

It shouldn’t be surprising that men selling the secrets of how to con women into reluctant sex will also swindle the men buying their services. They overpromise. Bed Models In 21 Days, only $26.99. Those who dip into these corrupt games and win find that they’re not winning enough; those who lose become enraged and disgusted. Despair sets in, and that’s what incels call the black pill.

Blue pillers view the world as just and indulge in hypocrisy. Red pillers view the world as corrupt and seek personal benefit, making the world a little worse with each move. Black pillers see the world as hopeless; it must be destroyed. That’s what produces the Elliot Rodgers and Alex Minassians.

When I was a redpilled guy trying to up my dating game, I learned about “negging”– an exaggerated refutation of the blue-pill notion that disingenuous compliments lead anywhere– and how to deal with those pesky interlopers called “AMOGs”. I learned how to play hot-and-cold games. I don’t like that I indulged in this, but the blackpill incel discussions of today’s world are worse: one finds martyr worship for murderers like Elliot Rodger, fantasies involving Westworld-esque sexbots, and female sexual slavery from the “pro” side of a debate that should not exist. These blackpilled men are, of course, deranged and need psychological help. I doubt many of them, in a country without universal healthcare, will get it.

Red-pill pickup artistry, at least, had better exit options than black-pill incel rage. In the mid-2000s, I used cheap tricks to get dates and make-out sessions, and to improve my confidence. It may have improved my life, though I doubt it. Eventually, with enough dating and relationship experience, I learned (spoiler alert) that women are people. Even when women I dated rejected me, they didn’t seem like terrible people; they were only quicker to perceive that it wouldn’t work out. Successes kept me going: the odds are always low, but the payoff is high.

With black-pill misery, though, there seems to be no exit. If I believed I lived in the world that these guys think is the real world, I’d be just as enraged. See, they believe that all women secretly want to sleep with their high school bullies, mythologized as “Chad”. In their view, romantic relationships are impossible, because they’ve used stunted male sexuality for their model of the adult female. Women who show genuine sexual interest in non-Chads, they believe, are settling for a “betabux” provider and, like cats in heat, will do anything to get alpha sperm when they’re ready to have children. The red pill turned men into pickup artists; the black pill is turning them into suicides or, worse yet, murderers.

Most cults have an eschatological narrative, because cultish behavior is unsustainable and therefore the world’s imminent end must be, at least, wished-for. The recent turns in male grievance culture show us belief in a post-apocalyptic landscape. It says that women like their mothers and grandmothers no longer exist. It says that today, every “Stacy” has been fucked by 100 Chads before age 15. Ask a blackpiller about adult sexuality, and you get a picture of junkyard animals fighting over scraps of meat, ten weeks after the end of the world.

Fuck the blue pill. Fuck the red pill. Especially fuck the black pill.

There is corruption in the world; it has always been there. But, it is rarely so hopeless as to merit nihilism and destruction (black pill) and one need not get involved and add to the corruption (red pill). Are there women so damaged that they’ll sleep with men who deploy cheap tricks? Of course there are. That doesn’t mean that one has to take part.

I advocate what I’d call, in response, the green pill: to acknowledge reality as it is, with no self-deception, but then to do the right thing, rather than the easy thing that everyone else seems to be doing, anyway. If the world’s dirty, be clean. Plant a fucking tree. This is the approach of the ancient cynic or stoic, whose wisdom has not decayed with age. Don’t support women’s rights because you think it’ll get you laid (it might, or it might not); support women’s rights because it’s the moral thing to do.

In our might-makes-right, corporatized society, the green pill isn’t fashionable. We see bad guys winning, all over the place. Our president bragged about (and, almost certainly, has actually committed) sexual assault, and still got elected. Am I really going to make a case for virtue– as an end in itself, with no expectation of reward– when corporate capitalism reigns and so many bad people seem to be winning? Yes.

No one tells young men this, but being male sucks from ages 18 to 23 for most people. Men at that age have a higher sex drive than their female peers; moreover, the women in that cohort are attractive to the full age range of men, whereas the men can barely attract their own counterparts. Women have all the options; men have lots of competition. It gets better for men as the years pass, but the male initiation period has never been a positive experience. I went through it as well. Most men do.

Here’s what I’ve learned. The sex I didn’t have at 17, 18, 19, 20, 21… never mattered. In the long run, whether one loses virginity at 16 versus 26 is unimportant. That said, what I do remember, and not in a good way, is how often I acted like an asshole. I’d be lying if I said I was fully over that.

The results of actions fade in importance over time; the actions themselves stick around in one’s mind, and a person has to live with them forever.

Perhaps that’s the strongest case I can make for virtue on its own. I don’t know that I need a stronger one.


The blue pill is the chauvalry of disingenuous “nice guys” in a patriarchal corporate system where sex is a reward for good male behavior, rather than an immensely pleasurable expression of deep love. Blue pillers are just-worlders who go to work and believe their companies are “meritocracies”. Red pillers are men who’ve internalized how capitalism’s alpha males actually behave– they’ll lie, cheat, and steal to glamorize themselves– and apply it to sexuality. The black pill prescribes aggressive nihilism– humanity is hopeless; best to kill everyone. None of these are virtuous attitudes, and they’re all expressions of our right-wing corporate society. I advocate something else entirely: let’s be honest with ourselves about the world’s corruption– and fix it.

Why the Black Pill’s So Relevant

Red- and black-pill thinking, two branches of the male grievance culture, have always existed, but the red pill was always much more prevalent. People would rather believe they can manipulate their way to a better life than convince themselves that they’re terminally fucked. When I was taken in by the male grievance cult, wannabe pickup artists were common and no one identified as a permanent incel, or discussed murdering innocent people (as Rodger, Cruz, and Minassian have) to prove one’s point. So what changed? I’m going to borrow a dumb 1990s catchphrase: “It’s the Economy, Stupid.”

Back in my day, when we walked uphill (both ways) nine miles through the snow (now get off my lawn) we had something that doesn’t exist anymore: a functioning labor market. As I said, it has always been difficult to be an 18- to 23-year-old man– the supply/demand imbalances leave a lot of young men single. But, there were other things to do than stew about not getting laid. Let me explain how the whole “work” thing used to work.

In the 1970s, a college degree and a car was all it took to talk your way on to a job– if you were 27 or older, a management job; if you were 32 or older, an executive job– anywhere in the country. So, what did you do, if you were a decent 22-year-old who couldn’t get laid? You worked, because these things called “careers” existed back then, and because if you actually showed up on time at your job and were still working at 1:30 in the afternoon, you’d be flying business-class by age 35 and running a company by your mid-40s. If you were what is now called an incel– back then, we just called it “not getting laid”– you could invest time and energy at work and distract yourself from the sex you weren’t having. That economic world doesn’t exist anymore; the 1% took it away from us. Thanks to offshoring and automation, the only thing left to do in the corporate world, except for those born into elite connections, is subordinate make-work with no career value, creative fulfillment, or redeeming social value.

It shouldn’t be surprising that our society would be throat-deep in a dangerous masculine crisis. Work sucks for women as much as it does for men– actually, it sucks even more for them– but men have been told for hundreds of years to identify themselves with their paid work, and that success in business is the ultimate expression of masculinity. With corporate consolidation and an imploded job market– one that hasn’t recovered from any of the recessions we’ve had since 1973, although the stock market has and property prices are sky-high– this setup has produced an untenable situation where men are told that masculinity is to be found by… subordinating to other men. Of course the system would collapse. Ex falso quodlibet.

When I was an unsuccessful 22-year-old man, in 2005, I said a lot of shit that I now regret, but I held out hope (and was right) that my lack of social and sexual success was transient. The difference, in the post-2008 world, where housing is unaffordable in places where there are jobs, and where third-world corruption has become the norm in the “lean” private-sector company, is that “incel” now stands as a permanent identity. Not knowing it, young people have conflated their permanent (unless we overthrow corporate capitalism) economic misery with their transient socio-sexual difficulties and become hopeless black-pillers. I don’t blame this on the women who are exercising their right to turn down men; I blame this on the 1% for stealing everything– for wrecking our economy and culture, and for perpetuating the simmering (blue pill, mostly) misogyny that makes these rages possible. The Elliot Rodgers of the world are like Japan’s hikikomori, but with the misogyny of an emerging fascist movement, and (scariest of all) the guns of America.

So what are we going to do about it? We must overthrow corporate capitalism– a might-makes-right system of cancerous masculinity– before its corruption spreads further and the masculine crisis becomes an all-out war. We need to overthrow the red-pill corporate executives– the ones who perpetuate corruption for personal benefit– and the blue-pill establishment enablers, before this black-pill psychosis festers and the shit really hits the fan.

In the Meantime

Angry young men such as today’s incels do not tend to believe facts put in front of them. If they listen to reason, they’ll still find their way to extreme interpretations. It’s very hard to change a mind in the moment. That doesn’t mean one shouldn’t blast bad ideas. People will eventually come around. It’s good to plant the good ones, even knowing they’ll be rejected at the time.

I’ve argued for feminism with incels. What these men don’t realize yet (and won’t be convinced of any time soon, but there’s hope) is that feminism is actually good for the decent man, the “beta male” who’d rather play with his own kids than do 3 extra hours of work for a company that poisons someone else’s kids in Brazil. When I explain why feminism’s good for average beta males like me, the reaction tends to be either “well, obviously” (from the leftist progressives) or shock (from the zero-sum-thinking incels). Either way, hear me out.

The incels have mythologized their high-school bullies, with a bit of male porn actor thrown in, as “Chad Thundercock”, the priapic Norse god of white male douchebaggery. The rest of us just call this loathsome creature “frat boy” or “bro”. The thing is, Chad is the guy who wins out under misogynistic structures like corporate capitalism. In societies where women’s fathers and economic forces parcel women out as a sexual commodity, Chads rule. Meanwhile, the more feminist a society is, the better results will come to the patient, caring, and less-macho men that women are more likely to choose– because they’re better fathers and far superior lovers than the Chads.

Incels tend to come, I’d guess, from the earnest lower-middle-class– the ones who once believed in all-American mythologies about corporate meritocracy, and who bought into the blue pill worldview– so they have a sense that the men women choose when directed by economic forces (or, worse, economic necessity) are somehow superior to the brutish or monstrous men– there may be racial attitudes here– that women will choose if left to their own devices. There’s no evidence that bears that out, though.

Some women, of course, choose skid-row rotten men, just as some men choose rotten women. To the extent that there is a “Chad pattern” in some women, I think it comes from our puritanical attitudes about sex (paradoxically?) more than anything else. Tell young girls that sex is a disgusting thing, and they’ll do it with disgusting men: the abusive frat boys of the world.

There is one painful but beneficial result of feminism for men: we get rejected more, because women have more options in life. It may be counterintuitive, but all this rejection is a good thing. In the long term, relationships are symmetric: a marriage that’s good for one person is good for the other. Women are choosier not because they’re mean, but because they’re quicker to perceive mutual non-matches, while men are prone to “we can make it work” quixotry. Female choosiness, in the end, saves us time and emotional energy. A polite, respectful turn-down is a favor.

Why does rejection hurt so much, then? My guess is that it has to do with our evolutionary environment: small tribes of about 100 people. In such a small world, being rejected means being humiliated in front of one’s whole social world. Rejection and breakups feel like major, life-ending events because, twenty thousand years ago, they were pretty close to it.

In a world with 4 billion sexually active adults, though, rejection is harmless. You can get rejected 300 times and nothing bad happens. Our society is capable of discovering exceptional matches that would never have been found decades ago– I grew up in Appalachia, my wife grew up in the Philippines, and we met in New York– but the price of this is an ultra-high rejection rate.

Under the old patriarchal systems, women were pushed into marriage because they needed economic security, and as a way for their families to improve their social standing or achieve political goals. Love was optional, and long-term marital love seems to have been the exception rather than the rule. True, middle-class (and above) men were assured wives and the loss of their virginity, but “dead bedrooms” were pretty common after the baby-making stage ended.

Although lasting romantic love was the hoped-for marital ideal, it didn’t happen often in the old world that incels seem to want to return to. Old-style family sitcoms where the goofy dad always wants (and almost never gets) sex once seemed that’s-life funny and now, from a 2018 perspective, seem pathetic. Perhaps this is a cultural change more than one in reality, but when I look at the Everybody Loves Raymond marriage and its lost sexual chemistry, it’s not what I think people want, or will settle for. Feminism is forcing men to up their play if they want to get married, and almost everyone seems to be winning.

In sum, pairings that can generate long-term romantic love– the kind where a couple still want to jump each other’s bones after 10 years of marriage– are rare, and female choice isn’t 100% accurate in finding them, but it seems to be doing a better job than corporate patriarchy (“take it from your father: this boy’ll be a good provider.”)

It seems counterintuitive that heterosexual female choosiness– a source of extreme frustration in the short term– would lead to benefits for heterosexual men. If we accept, though, that relational health is, in the long term, mostly symmetric, it shouldn’t surprise us that much.

This would require another essay to explore, but I call it the Control Paradox. Relinquishing control can lead to better results, especially when that control is illusory. What’s magical about mature female sexuality, from a male perspective, is that it can’t be controlled. Nothing’s better than a woman going after what she wants.

Our economic system is brittle– corporate capitalism is breaking down already, and will probably shatter in the next fifty years– in the face of control paradoxes. See, we work in firms built on zero-sum thinking. The Graeberian bullshit jobs exist largely because corporate executives believe their subordinates’ happiness equals their misery. The more (stereotypically, at least) feminine open allocation approach to corporate governance results in more innovation than toxically masculine zero-sum command-and-control regimes… but the working world hasn’t learned that, and won’t until corporate capitalism has been overthrown.

It’s extremely counterintuitive that a system where men get rejected far more than ever before would be, in truth, better than all those other mating regimes we’ve discarded. The teens and early 20s are absolutely brutal. But these things are becoming rarer: divorce, loveless marriages, infidelity, and dead bedrooms. It’s sometimes amazing to me that, after all the rejection and false starts and misbehavior (on male and female sides) that I endured, I ended up getting what I wanted all along: marriage to the best woman, at least for me, I’ve ever met.

In Sum

When the blue-pill, politically correct lies fall away, it’s a vulnerable moment for any man– much like what we are all going through as corporate capitalism’s total failure reaches a point we can’t ignore. Red-pill contempt and black-pill despair can set in, but I’d like to make the case for something else: to acknowledge the world’s corruption, without self-deception or undue negative emotion (for emotions themselves are useful data, but not recreation). I’d like to argue for the green pill: to do, in the face of corruption, the one thing that’s truly rebellious: to be better than that.

Nine Selves and the Hero’s Journey

Staged models of human development occur throughout philosophy, psychology, and religion. Kierkegaard said we were aesthetic, then ethical, then religious. Piaget focused on a child’s cognitive development. Freud gave us the terminology of “oral fixation” and “anal retention” as he mapped out his own psychosexual theory. Kohlberg focused on moral development. Timothy Leary put forward an 8-circuit model of consciousness that, while a bit loopy, inspired a lot of future work. I’m not smarter than these guys, and I’m not a trained psychologist, but looking through their work as well as my own experiences, I’ve come to the conclusion that there are about nine stages of consciousness, or nine selves, if you will.

Leary’s model connected specific circuits to drugs and argued that psychoactive chemicals like sugar (oral; 1st circuit), alcohol (2nd circuit), cannabis (5th circuit), and ketamine (8th circuit) were literal neurotransmitters; today, we know that to be false. Moreover, experiences do not impact us on only one of the circuits (e.g., a “4th circuit” experience), and I suspect that all circuits or selves or stages exist, to some degree, in adults. Like the higher harmonics on a musical instrument, they’re all there, just in varying amounts.

I don’t have the biological knowledge necessary to defend the term “circuit”, either, so I’ll use the looser term of “self”. I don’t intend this to be a rigorous scientific model– I don’t know how one would test it– but it’s a meta-analytical tool I find useful.

I posit that there are (at least) nine selves that unlock during a person’s development. The first three– sensory, positional, and cognitive– emerge during childhood and seem to be universal in terms of how they are interpreted and the ages at which they emerge: birth, age 2, and about 4.

The middle three– socio-sexual, aesthetic, and truth-seeking– emerge in adulthood, and tend to correspond most strongly to adolescence, middle age, and elderhood. Unlike the first three stages, these do not emerge by necessity with age; a cognitive self emerges in almost everyone, but there are many older adults for whom the socio-sexual self still dominates.

The last three– self-revealing, supremely vulnerable, and meta-mortal– tend to occur in mystical experiences, and are the hardest to describe. These tend toward the speculative, since we really don’t know, in any scientific sense, whether there is a why to conscious experience, or what really occurs to us after we die. Experiences involving these final three selves tend to leave people with strong sensations of divinity or an afterlife– or, perhaps, an absence thereof– but, on these topics, present science tells us little either way.

1. The Sensory Self (Infants and Cells)

In the first two years of life, we’re a helpless animal. In fact, we’re lower in functionality than animals, since we can’t survive on our own, which most creatures can. None of us remember this stage, which has historically led to incorrect notions about it. We used to think an infant’s vision was blurry– our own memories of childhood being so blurred– but that’s incorrect. Still, there doesn’t seem to be much more going on in a baby than in any other organism: sensors and movement.

The sensory self dominates infancy. It doesn’t go away with age; we just develop, as we age, the tools to understand what it is doing to us. We have some ability to moderate its influence, but not as much as we’d think. Hunger, extreme cold or heat, or physical pain dominate us at a certain point; if they are intense enough, the other selves are muted. Lower selves, except possibly in people who’ve mastered self-disciplines often considered spiritual (e.g., the yogi’s ability to bear intense pain), can override the higher ones, and that’s not a bad thing. It keeps us alive, that these lower selves “break through” in desperate situations.

We can idealize maturation or spiritual progress by assuming circuits to be “complete”; for example, most of us don’t spend much time on the sensory self’s concerns. Our brains do a lot of work– as someone who’s worked on AI programming, I can say: it is a lot of work– to regulate breathing and balance, and the visual processing our eyes and brains do is fascinating, but most of us can rely on the sensory self without thinking about it.

Yet, as someone who experiences anxiety and depression, I’m aware that the lower selves can get out of whack. Just as all selves are activated by complex experience, they are all connected and can all be thrown out of whack by disorder.

The sensory self is best at ease when one feels cozy and safe: drinking a warm cup of coffee on a winter day; or sleeping in late; or relief when pain subsides. However, chronic pain as well as dysfunction of the next self “up”– the positional self– can lead it to disorder: lethargic depression. When the sensory self is unhappy– possibly due to overload from the demands of its siblings, e.g., after the positional self throws a panic tantrum– the body feels heavy and one wants to do… as little as possible. To burrow into the warm folds of the earth and sleep, perhaps forever. This may be the only self that doesn’t get bored.

2. The Positional Self (Toddlers and Lobsters)

Leary, I think, got it right when he said that the toddler self– the positional self, as I call it– is one whose dysfunctions are activated by alcohol. The positional self’s unresolved issues come out roaring when a person’s drunk. He named it the emotional circuit, and with that I disagree; we feel emotions on all the circuits. This is where what we might call our “lower” emotions live: anger, fear, triumph, and disgust.

The sensory self seeks nourishment and comfort; it avoids pain, and though it does not know why, it avoids sensations that evince danger: loud noises, pungent odors, extreme temperatures. It reacts; it wants to be safe.

The positional self emerged, most likely, along with complex organisms, sexual reproduction, and competition for resources with other creatures (including conspecific ones). Dominance hierarchies exist (Jordan Peterson argues that they go back to lobsters), and the positional self develops as we realize that conspecific individuals are a source of danger.

It’s traumatic, when this happens: the first bite of fruit, perhaps, from the Tree of Knowledge. We become envious of adults, who are seven times our size. We size each other up, too. We get envious of older, bigger children but also younger ones who get more attention. We stand halfway between that boring world of infant survival– reacting to things, being taken care of– and this complicated adult one that makes no sense to us. It’s also easy to punish us: to be yelled at is like physical pain, and a five-minute timeout or recess detention seems eternal. We haven’t separated physical pain and psychological emotion, so falling down and skinning a knee (which we do a lot) makes us cry, to an extent that an adult only would if someone died.

We get our first inklings of sickness and death in this age, though we don’t understand them fully. We learn that not all humans are good: some will kick us, just to hear us cry. We may even find ourselves doing such things– I was a bit of a bully, I’m afraid, and I’m far from proud of it– and not know why. We do a lot of terrible things (for which we feel later remorse, sometimes 15 seconds later and sometimes 15 years later) to test limits and see where we stand in the territorial and dominance structures that exist all around us.

This time becomes hazy in memory, the thoughts of that time rarely thought in words, more like a shuffled stack of postcards than a narrative. Higher selves emerge and we become smart enough to deal with it– most of the time. It always lives within us, though. People who are “bad drunks” usually have positional-self issues that come out when the higher selves are blunted with alcohol. The positional self doesn’t like being told to wait in line like everyone else, and it can’t stand to see a personal or professional rival driving a more expensive car than the perfectly fine one we have. Moreover, what is an open-plan office but an attempt to manipulate a worker via his positional self’s weaknesses?

The positional self’s dysfunction takes the form of anxiety, which everyone has; it runs the gamut from the low-grade fear that keeps us alive to a full-scale, debilitating panic attack that takes days to recover from: the sensory self, also beaten, falls into lethargy.


Alcohol can unveil existing issues in a person’s positional self, but almost anyone can develop positional-self dysfunctions. Low social position has that effect, especially if it leads to uncontrollable changes in a person’s environment.

Humans are, for the most part, resilient. We can go through a lot without lasting mental illness. Grief, after a parent dies, is normal, but few of us become total wrecks. Natural disasters, accidents, and life-threatening animal encounters can cause specific phobias, but rarely break the mind outright. What seems limitless in its capacity to destroy the human mind is not danger (in fact, we seek danger out) but stigma: not mere unsafeness but the kind that comes from low social position. For example, one of the few non-biological factors that seems to cause frank psychosis is eviction in childhood; moving and poverty are bad enough, but combine them into a constant chase, and it’s a disaster.

High social status, on the other hand, can induce or reveal the positional self’s dysfunction just as easily. Often, the positional self uses the tools of other, higher, selves to get what it wants. For example, the corporate executive shows some strength (in general, IQ between 120 and 130) in the cognitive self, and his motivations are usually socio-sexual, but his day-to-day behaviors are toddler-like. This is sometimes called “acquired situational narcissism” and it’s also common in promising artists, academics, and writers who get famous too young, too quickly, and turn into blithering idiots.

High-status positional upset also emerges in the sort of acquisitive sex addict who has unfortunately found a niche selling the skills that come from his glib social hypertrophy (“pickup artistry”). These men, who have put themselves at the top of a certain hierarchy, go beyond normal socio-sexuality into positional obsession.

The positional self always wants more; to level up, to get to the top, to have the shiniest toys. If it wins, it wants a bigger win. If it loses, its impulse is to try again. You see this in socially inept men who continue to chase the same sort of woman– popular, self-involved– that rejected them in high school, while failing to develop adult tastes or expectations; to them, it’s like a video game where they have to beat “that damn level”. Because they must win, they lose. Their insecurity is such a turn-off that even when these women “hit the wall”– a fantasy of such men in which aging reduces the women’s socio-sexual value– it does them no good (and I’m not terribly sympathetic).

Our higher selves enable us to drop out of the positional nonsense– on reflection, most of it is absurd; there is nothing dangerous about being seventh in line at the grocery store– most of the time.


Now, it is often said that bullies (in the schoolyard or the workplace) are insecure and have low self-esteem, and that’s not accurate. I wish it were true; I wish bullies suffered as that narrative suggests; but, in truth, they tend to have high self-esteem, and while they are insecure in fact (since a monster is driving them that will not be satisfied until they have destroyed everything) they do not feel insecure in the moment.

This said, the bully’s ultimate destination is misery, since he’s feeding something that gets hungrier the more it is fed. We note here that the very poor and very rich are miserable for the same reason: their lives are all about money. The very-poor person deals with debt collectors; the very-rich person deals with hundreds of sudden “best friends” who’ll revolutionize parking garage technology and just need half a million to do it. It’s not a good life, one that’s all about money, and of course money isn’t evil (and neither is sex, which we’ll cover later); the problem is with people.

It’s probable that all selves emerge, though in muted form, around age two; the toddler has enough of a cognitive self to acquire language, and a socio-sexual instinct to want to spend time with attractive people, even though those selves have yet to mature.

3. The Cognitive Self (Children and Primates)

It wouldn’t surprise me if humans reach peak rationality just before puberty. We’ve aged out of the toddler’s tantrums, but we haven’t yet developed a fixation– regressive, for many people– on sexuality. We can make jokes as equals to adults, recognizing the absurdity of the “selves” below and above this one: the positional squabbling of toddlers, the socio-sexual drama of teenagers and immature adults, and the Seinfeldian universality of these impulses even in mature, intelligent adults. The 10-year-old boy “gets it” that his 15-year-old sister’s obsession with makeup is a bit silly… until two or three years later, when he has his own similar nonsense.

Language develops earlier, but cognition becomes an end in itself in early childhood; the third self emerges. What was once a loosely connected set of concepts useful to the lower selves– “hot”, “cold”, “feed”, “me”– now has rules; the words have relationships. “I want milk” is less confusing than “milk me”.

In this self, humans differentiate vastly, not only from animals, but from each other. The lowest tier of cognitive behavior– unexamined motivations, rooted in the positional self’s wants– we liken to rodents, e.g. “the rat race”. The sociality we develop as children to please or disgust others– but not to manipulate them, since socio-sexual games haven’t begun yet– we liken either to dogs (pleasing) or pigs (disgusting). Next up, humor starts and we get a grasp of language and its limitations, tools and their unintended uses, and the ever-present terror of boredom. Primates have enough of this that we call a hyper-curious, humorous person “a monkey”. Buddhists call the chattering, grasping, often internally verbal (as if it were explaining itself to someone) element of our minds “the monkey mind”, and I find that apropos.

What differentiates us from animals is that our cognitive selves are much better. They are nearly limitless. Unlike every other animal, we hit a critical point, something akin to Turing completeness. We can reason about reasoning. We can dissect our own programs.

Plenty of animals can do quick cognitive feats we can’t; for example, a falcon can fly through dense forest at 100 miles per hour– if we were in control, we’d hit a tree and die. It is said that dogs solve partial differential equations (obviously, not consciously) when they catch a frisbee. Neither animal knows that 34 plus 55 equals 89; neither animal knows the basic concepts (or cares). We, however, can do anything, computationally speaking– the only limitation is that we can be slow. We can formulate problems that no amount of effort can solve. We have made ourselves an apex predator, but we’ve also learned that we will die, no matter what we do. That thing that the positional self tortures us in order to avoid will, one day, happen.

The cognitive self works in symbols like words, characters, and numbers. It lacks nuance, and while this is a social hindrance (that our next selves must overcome) it is from this that notions of mathematical and scientific precision have emerged.

There seem to be few limits to what can be done with the symbolic and cognitive self. It may not have taste– we’ll need to use the socio-sexual, aesthetic, and truth-seeking selves if we want to write novels or essays worth reading– but it gets the mechanics of communication down. Furthermore, this is where we start to see limitations. I suspect that my sensory and positional selves– the first a reactive agent, the second a bit of a mindless bully if left on his own– can do no more or less than anyone else’s. Some people are better at positional games than others, but because of differences in higher selves, it seems.

The symbolic self, though, can always do more than whatever it does. There are theorems in mathematics that have never been proven and that I’m not smart enough to prove. If we agree that a thing is either true or not true (excluded middle) then we cannot prove all things that are true (Gödel); we cannot tell whether two computational processes are equivalent (Church); there is no general way to decide whether a program terminates (Turing); we cannot decide any nontrivial property of an arbitrary computation (Rice); there are probably truths that are verifiable within minutes but that one could not derive with all the time and matter in the universe (the unsolved P ?= NP problem).
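Turing’s halting result, in particular, admits a compact diagonal argument: assume a halting oracle exists, then construct a program the oracle must be wrong about. A minimal Python sketch– `make_contrarian` and the sample guess are illustrative constructions, not any real API:

```python
def make_contrarian(halts):
    """Given a claimed halting oracle halts(prog, arg) -> bool, build a
    program that does the opposite of whatever the oracle predicts
    about that program applied to itself."""
    def contrarian(prog):
        if halts(prog, prog):      # oracle predicts prog(prog) halts...
            while True:            # ...so loop forever
                pass
        return "halted"            # oracle predicts it loops, so halt at once
    return contrarian

# Any concrete guess at an oracle is refuted by its own contrarian.
# Example: an "oracle" that always answers "loops forever":
contrarian = make_contrarian(lambda prog, arg: False)
print(contrarian(contrarian))  # prints "halted": the oracle was wrong about this very call
```

Whichever answer a purported `halts` gives about `contrarian(contrarian)`, the program does the opposite; so no total, correct `halts` can exist.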

Above are formal results. Those apply to machines much faster than we are, and to anything we might call replicable intelligence. The part of us that thinks in symbols– of course, the brain does millions of other things– can handle no more than 100 symbols per second, even under the best-case assumptions. For raw computation, we’re way slower (and more error prone) than computers. In many ways, we’re limited, some of us much more than others.

Then, there are differences between us. In practice, the difference between an IQ 100 mind and IQ 80 is significant; that between IQ 120 and IQ 100 is notable; that between IQ 140 and IQ 120 is relevant in the most abstract academic disciplines, but in most of life immaterial. Measurable intelligence doesn’t tell us everything, but it tells us a lot. I don’t mean to be cruel– in fact, we should acknowledge that since intelligence isn’t chosen, the leftist impulse to moralize intellect is misguided– but only about 5% of the population can become professional mathematicians; the other 95%, no matter what they do, won’t make it. Furthermore, most of that 5% will require a lot of work to do it, just to be professionally adequate; and all of them will need to put forth substantial effort if they want to be significant mathematicians. Creativity is harder to measure, but true creativity seems at least as rare as high intelligence (with which it is not always correlated) and general creative superiority (at a substantial level) may not even exist. What differentiates creative people is a Drake equation of sorts, each specific to a given field; native measurable intelligence is rarely the most important factor.

Why couldn’t a person with a 100 IQ become a professional mathematician? Prima facie, it doesn’t seem impossible; people with 100 IQs can do most things that people with 140 IQs can do: they can run businesses, lead others, and learn most academic subjects if they really need to– and if they put in a lot of work. It just takes longer to learn them. In practice, though, limitations of time and resources make it very unlikely that it’ll ever happen. Society is unlikely to invest the resources necessary to get a 100 IQ person to the fore in mathematics (nor is the person himself likely to), even if it were theoretically possible. There are scientifically accomplished people with “low” (below 130, so not actually low) measured IQs, but even this is rare enough that one might question the tests rather than the notion of intelligence itself.

We sort ourselves into cognitive classes; we need to do so. Jordan Peterson talks about competence versus dominance hierarchies, and the distinction is relevant. If we don’t figure out who’s smart, then whoever is most forceful will end up in charge. There is a tendency for people who excel with their positional and socio-sexual selves to be deficient in their cognitive, aesthetic, and truth-seeking selves; this puts society at peril when the “strong man” wins.

Let’s talk about politics. I don’t enjoy the leftist tendency to moralize intelligence– the attitude that people of average capability could be “like us” if they worked harder, which implies there is something wrong with them when they are not– and although basic income will be one of many needed tools in the future, I don’t think it’s enough to give people a sense of dignity. In any case, though it is necessary for us to have cognitive classes, this tendency of ours creates messes.

First, no one should live an inferior life because of low intelligence. It’s a bit inconsistent that we (rightly) view it as tasteless to spit on people with retardation (IQ 0–69) but have no problem trashing the merely “stupid” from 70 to 99. (I recognize that there is such a thing as elective moral stupidity– also known as ignorance– but that, my friends, occurs often in high-IQ people, too.) This said, our society is at a level of complexity that requires smart people to make the decisions. Preference aggregation (e.g., voting and market economies) serves two separate purposes. One is to make decisions; the other is to hold those in power accountable. The latter is why voting matters: an individual vote almost never sways an election; but, when a group can vote, politicians are more accountable to that group.

What do we want? Well, we want complex decisions to be made by a cognitive elite (based on skill and taste, as innate intelligence isn’t worth much on its own) but we need for people in that elite to be accountable to everyone else.

That, on its own, is a hard problem. How do we establish a cognitive elite that can make complex decisions quickly, without them becoming self-serving or unaccountable to those with less genetic fortune?

Worse, we’re not even at the point where that is our problem. Representative democracy is the purported solution, but it doesn’t always work that way. Business bureaucracies (which have become, in the past 50 years, more powerful and important than governments) don’t even try to come close.

The cognitive self may approach a child’s perfect rationality– smart, not yet corrupted by socio-sexual impulses– but it also has the child’s timidity. The selves just below and above it dominate the culture. Those who master positional dynamics run the business world. The socio-sexual winners become the in-crowd and run the culture. We have a rapacious toddler elite running us from the economic right; we have a self-involved adolescent elite running us from the cultural left.

Beyond this, one questions whether being supremely cognitive provides real merit. There are millions of math problems one could solve, or books one could write, or pieces of music one could compose; which ones matter? We have to peer into the higher selves to find a why for all this glucose- or electricity-consuming cognition.

One could, in principle, ask whether the middle digit of the 10^237th prime is even or odd. (Pedant note: if the number’s length is even, break the tie at left.) In every universe, the answer is the same. It would take eons to answer this (from what we know now) and there seems to be no purpose in doing so.
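The question is perfectly well-posed, and for modest indices it’s mechanical to answer; only the scale puts it out of reach. A sketch using the 1,000th prime as a tractable stand-in (trial division– hopelessly slow for anything like the index above):

```python
def nth_prime(n):
    """Return the n-th prime (1-indexed) by simple trial division."""
    primes = []
    candidate = 2
    while len(primes) < n:
        # Divisibility only needs checking against primes up to sqrt(candidate).
        if all(candidate % p for p in primes if p * p <= candidate):
            primes.append(candidate)
        candidate += 1
    return primes[-1]

def middle_digit(x):
    """Middle digit of x; for an even number of digits, take the left of the two."""
    s = str(x)
    return int(s[(len(s) - 1) // 2])

p = nth_prime(1000)  # 7919
print(p, "odd" if middle_digit(p) % 2 else "even")  # middle digit 9, so: odd
```

The answer is fixed once the index is fixed; nothing about it varies between universes, only our ability to reach it.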

I mentioned that dysfunction and irresolution of the sensory self seems to be a lethargic depression. In the positional self, it’s anxiety. In the cognitive self, it’s obsession. One can see the downward cascade of dysfunction, I’d imagine: obsession often leads to anxiety, which can spill over into lethargic depression.

4. The Socio-Sexual Self (Adolescence)

Sexuality exists in most creatures, but socio-sexuality seems limited to the most advanced animals: humans, cetaceans, and primates. Sex becomes a tool to avoid conflict– two same-sex conspecifics might prefer each other, rather than fighting over a mate– and to build intense relationships of trust (notably, pair bonding).

In less advanced animals, sexuality seems to occur as-needed, in response to a drive that only wells up on occasion– because sex is dangerous, and nature built animals to take as few unnecessary risks as possible. This is why it’s not cruel to neuter pets; sex is not something they always want or see as a necessary part of life. It’s something they get an irritating, dangerous drive to do on occasion.

Socio-sexuality re-introduces the irrationality that the cognitive self tried to stifle. When we’re 11, we find it ridiculous that a 16-year-old would react so strongly to a transient facial blemish. Our intelligence is not refined at 11, but we may be at our most rational. At that age, we are on the younger “mountains are mountains” side of the socio-sexual drama on most television; we see the humor in people acting absurd but lack the experience to understand their motivations.

Socio-sexuality hits us like a wrecking ball. I don’t think animals, except when in heat, care much about sex. In general, most mammals aren’t choosy. Two males might fight over a female, but the female would be happy to go off with either one (or a third male who is smart enough to come after the fight starts, and not fight at all). Humans, on the other hand, are choosy, and it’s this choosiness (in both of the main genders, despite stereotypes) that leads us to take a renewed interest in social hierarchies.

The socio-sexual self can be nasty: it understands the lower selves and can use them, in a person and in others. It can combine the primal meanness of the positional self with the infinite calculation of the human cognitive self to devise all kinds of creative punishments: even non-physical ones (induced depression, ostracism) that leave no marks. Almost every animal fights; the socio-sexual self, however, can come up with torture. It can lead the cognitive self on a chain; it can make a smart person wrong in hyper-intelligent ways that even she is not smart enough to think her way out of; that “four-wheel drive” problem of getting stuck in inaccessible places.

Though I suspect that the higher animals have flickers of socio-sexuality, it dominates humans. Quite a few people reach this level of maturity and stop. The positional and cognitive selves give them methods to go about the world; the socio-sexual self gives them motivation; it’s the why. The cognitive self learns that paper pictures of dead people are a fantastic way to acquire needed things without positional conflict; the socio-sexual self falls so far in love with dead-people-paper (and the sexual access it provides) that it’ll destroy the whole world, just to acquire more.

Socio-sexuality is also where genders diverge most. This is probably socially constructed. The sensory, positional, and cognitive selves seem very similar, despite society’s differing reactions to each. Socio-sexuality is gendered.

The toddler realizes, to great fear, that not all humans are friendly and that an authority/status ranking, as well as an obvious size/strength/can-kill-you ranking, exist– and that they’re at the bottom of both. The cognitive self provides distraction and a quest for self-improvement, and it is perhaps for this reason that cognitively under-satisfied (i.e., bored) children fall into positional agony. Cognition is an end of its own for a while, until we experience intense impulses (often negative ones, often ones we know we can’t act on) and start to wonder what we are cogitating toward. We are bigger and stronger and (unlike toddlers) starting to be sexually appealing; we can climb those hierarchies now… maybe?

Those hierarchies traumatized us at 2; we threw tantrums when we learned that we couldn’t just take candy from the store. From 3 to 11 or so, we decided that adult stuff was not only not for us, but seemed a lot more boring than reading, writing, running around outside, trying to hit a road sign with a slingshot. Then, we’re 12, and we want to climb those adult hierarchies… but we’re absolutely pathetic as adults. We still need guidance. Gendered socialization seems to come in. Boys are usually told to resolve this re-emergent positional trauma by keeping with the cognitive stuff: gain skills and competencies, learn how to survive, learn how to defend others. Girls are told they are beautiful (or, in some sad cases, not) based on their innate traits. Even into adulthood, society seems to rate men based on what they do and women based on what (it thinks) they are. This is why there can be a celebrity “It girl” but never an “It guy”.

The masculine regime seems harsher: who wouldn’t rather be than have to do, to earn one’s keep? In some ways it is. For one: it’s punishing in adolescence. 20-year-old women are beautiful (not to say older women aren’t) but the vast majority of 20-year-old men haven’t accomplished a damn thing; they can’t do much at all. It forces men to learn and demonstrate a set of domain-specific (and somewhat icky) social skills (“game”) that women don’t need if they’re even average looking. Yet, in most societies, the feminine regime is a lot more repressive. Why? Forget the idealized societal notion of valuing women for what they are as opposed to men for what they do. Society will get a woman’s (or a man’s) what-you-are (reputation) wrong and not care; sometimes it’s deliberate, extortive, or even malevolent. Men suffer from this but, in most societies, women suffer more.

Socio-sexuality brings the old positional dreads back, in men and women. We men realize that some are more competent (in athletics, or in academics, or in the slimy reptilian positional game of real-world “work”) than we’ll ever be. Women realize that some women are more beautiful. We do resolve this, later on: self-definition. We drop out of “the Great Game” and find a game we can win.

Middle school, in the U.S., seems to be the worst. There is one hierarchy and everyone wants to know where they stand. I remember estimating my popularity, at age 12, by drawing names from the yearbook and taking a weighted average (based on other peoples’ popularity, as I guessed) of how much (I thought) they liked me. Basically, I invented PageRank before Google, though I only did one iteration, because I didn’t know what eigenvectors were and my programming skills were limited to QBasic. And, of course, the numbers were made up and probably wrong. Still, middle school is a time where there’s one popularity hierarchy and everyone has a position on it.
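That yearbook estimate generalizes: weight each person’s liking of you by their own popularity, then iterate until the numbers stop moving, and you have the dominant eigenvector– the essence of PageRank, minus the damping factor. A toy sketch with made-up likability data:

```python
# Hypothetical yearbook data: likes[i][j] = how much person i likes person j.
likes = [
    [0.0, 0.8, 0.1],
    [0.5, 0.0, 0.9],
    [0.2, 0.7, 0.0],
]

def popularity(likes, iterations=200):
    """Popularity = likes received, weighted by each liker's own popularity.
    One pass is the single-iteration yearbook estimate; iterating to a
    fixed point yields the dominant eigenvector, as in PageRank."""
    n = len(likes)
    # Normalize so each person's outgoing liking sums to 1.
    row = [[w / sum(r) for w in r] for r in likes]
    p = [1.0 / n] * n
    for _ in range(iterations):
        p = [sum(p[i] * row[i][j] for i in range(n)) for j in range(n)]
    return p

print(popularity(likes))  # person 1 comes out most popular in this toy data
```

The one-iteration version is exactly the twelve-year-old’s estimate; the fixed point is what Google later computed over the web’s link graph.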

In high school, people diverge. One can have value without playing the terrifying Great Game (to use Tyrion Lannister’s phrase) and do something else. Bad at football? Play chess. Or learn a musical instrument. That won’t get you laid in high school, but it’ll get you respected, and (they tell you… haha) it’ll get you laid in college.

Socio-sexual ranking systems affect our self-image much more than a purely positional ranking. We know that positional rankings are volatile and dangerous. If an idiot with a sharp piece of metal (e.g., a knife) comes at me, demanding my wallet, he outranks me in the moment (being able to end my earthly existence) and I will hand it over. I’ll be shaken, I’m sure, but I’m not going to think less of myself for giving the wallet up. Socio-sexual rankings, on the other hand, imprint themselves on memory. Worse yet, they generate feedback loops that can be impossible to get out of. A person with poor grades for three years is unlikely to get into an elite college with a senior-year turnaround; a 25-year-old male virgin is unattractive, for that reason alone, to most women his age; disliked or low-ranking people are excluded from future opportunities.

So, when the One Hierarchy of middle school fades, as the high schoolers develop cliques, that’s actually a good thing. People whose physical features are unattractive to most can often find partners who seek those traits out. We decide what we value and what we don’t; this is why adolescence is a time with the strongest focus on self-definition and expression. Even if we are not globally remarkable, we can be unique and un-rankable.

Unfortunately, this dissembling only goes so far. We see some of it in high school, and more in college, but then when we leave college… a new One Hierarchy emerges: money. We’re on the bottom, mostly. Whether we want to be or not, we’re back in the Great Game with everyone. To make it even worse, it’s a rigged Great Game; undeserving rich kids, whom we defeated in academic competitions where their parents couldn’t buy scores, rocket to the top of the corporate world. People of merit, meanwhile, get stuck trying to figure out what the fuck happened.

College, in terms of the message it sends to young people, is a lie. It implodes in late adolescence and it’s painful. It gives 25 percent of the population a leadership education, for which it charges them an immense price, while failing to divulge that only 1 percent of them will get leadership positions (and that those were mostly pre-selected at birth).

Here’s how I’ve come to view it. Society creates “cells” within which intellectual merit– talent and hard work– matter. Middle class kids who go to Harvard are one cell; the blue-blooded kids who got in with 1250 SATs because “we’ve gone here for generations” are in a different cell, even though they attend the same university. Minority students at elite colleges often end up in their own cells and this is called “self-segregation”, though I’m afraid it’s not entirely elective. At any rate, at age 21, we are maximally invested in the value system of our cell and ranked according to what we value: one might be the #3 physics major but the best in the lab; or a mediocre student but an excellent poet.

When college ends, most of us go into the working world– graduate school is another topic, for another time, but not all that different– and all those cell walls dissolve, and the shuffling continues, and most of us are back around what could have been predicted when we were born, and we don’t know what happened. We learn then that the animalistic positional nonsense never went away. Uh, oops.

There’s one place on a college campus where the terrible old world– ancient, vicious impulses and desires– breaks through: “hookup culture”, which is a polite term for an acquisitive, alcohol-fueled, and combative sexuality that leaves everyone but sociopaths dissatisfied. Men struggle to get sex; women get sex easily but struggle just as hard as men to get respect. It leads to “rape culture”. Right-wing psychopathy (though it is apolitical and for historical reasons associated with the left) tears a hole in the college utopia’s walls, and the miasma breaks through.

At least one can opt out of that. The depiction of college life suggests that dystopian casual socio-sexuality is the norm on campuses, but here’s the good news: it’s a few loud (and usually quite damaged) people making most of the noise. Pair bonding and healthy monogamous relationships are still very much “in”; they never went away.

The common currencies of American society seem to be socio-sexual. We work to rank ourselves; when young, so we can fuck more; when older, so our little fuck-trophies can get into better schools than others’ little fuck-trophies, and grow up to be executives who boss around others’ now-adult fuck-trophies, and then go on to send their own little fuck-trophies (fuck-trophies of fuck-trophies) to better schools so… you get the point.

It’s an ugly thing to realize, that this is the heartbeat of commercial life. It’s not that sex is bad. Of course, it’s not. The zero-sum socio-sexuality that seems to live in human society is disgusting; sex itself isn’t disgusting at all.

We could do so much better. Sex isn’t a zero-sum commodity, after all. Economic inequality causes rich and poor people both to have less sex (and to have less stable sexual relationships)– some overwork themselves, while those in low positions lose desire. A few people’s unresolved (and perhaps infinite) socio-sexual desires have created a dismal society that’s obsessed with sex but, in fact, has little of it.

I believe that many European societies have progressed into the next phase of maturation, while the U.S. fumbles about in an adolescence that lasts too long.

I mentioned the failure modes of the lower selves: lethargy, for the sensory self; anxiety, for the positional self; obsession, for the cognitive self. What does failure in the socio-sexual self look like? There are two kinds that seem different, but might be similar: ill-gotten or perverse socio-sexual success leads to narcissism. Socio-sexual failure leads to a form of depression that’s higher-functioning but more profound than lethargy: despair. Lethargic depression (e.g., after a panic attack or illness) comes from defeat; despair and dread linger when defeat continues without respite.

Most of us, thank God, do not live in socio-sexual defeat forever. We learn how to relate to other people, we better ourselves, we start dating, we get married. We move on from socio-sexual squabbles and start asking deeper questions: what is the good life?

Most of us are capable of formulating that question. It’s only an artifact of how our society is constructed that we, in the U.S., answer to the avaricious, perennially socio-sexually irresolute, tasteless and unethical (i.e., lacking the aesthetic and truth-seeking selves) half-men who become corporate executives. We should become more like Europe, cast those executives aside, and dedicate our lives to things that really matter.

4B. Aside on Corporate Misery and Conflict Between the Socio-Sexual and Aesthetic Worlds.

Growing old is mandatory; growing up is optional. I obviously am not the first to say this, but it applies here. Quite a large number of people never develop the fifth (aesthetic) or sixth (truth-seeking) self, much less the higher ones.

Most of us grow out of the socio-sexual contest. Why? Because, after 30, it’s not hard to “get sex”. One can pay for it. An average-looking (or even below-average-looking) person has typically developed the social skills necessary to find casual hookups. Or, one can go a more traditional route: get married.

Furthermore, most people learn either through direct experience or others’ tales that the socio-sexual hierarchy is bunk, at least as a predictor of the quality of an experience. Or, to put it bluntly: most people at its apex (in both genders) are too selfish to be decent lovers. Emotional intimacy matters so much more than socio-sexual ranking; people realize that and say: to hell with the latter. Thus, they enter full adulthood.

So why would a person, now married and 30 years old, care about the Great Game at all?

Well, we still need this stupid little thing, a socio-sexual token called “a job”, to have an income. That’s non-negotiable for most of us. If we have children, we need to line up schools and connections and (in today’s world) first jobs to keep them from falling into society’s blender blades.

So, we end up contending with socio-sexual machinery that other people created and that has no value. We have to go to work, and most of us do that in the corporate world.

The dismal socio-sexuality of the workplace could merit its own 10,000-word essay: how it is aggressively patriarchal and heterosexual, even when run by women or gay people; how it sublimates male bisexuality into homosadism, with executives inclined to abuse men because they are ashamed of their sexual feelings for them; and, finally, how it has emasculated high-performing men by forcing them to contend with harem dynamics.

The main issue I’ll cover is this. Most of us, by 30 if not before, have settled down into a life where sexuality is (as is much preferred, thank you) a private matter. It’s distasteful to talk about one’s sex life, if one is supposed to be having sex with only one person. Public socio-sexuality becomes a bit shameful. It probably should be. What good did it ever do? Sex is a good thing; socio-sexuality? Not so much.

I disagree with those who attribute all creativity to sublimated sexuality. The best creativity comes from all of the selves: aesthetic and truth-seeking as well. This said, the crass minuscule creativity of the corporate world– the approval-seeking, inoffensive humor; the slight variations on sickening conformity that somehow stand out– is probably socio-sexual in its origins.

One thing about the corporate workplace that makes it so hostile to life is that, while it professes to be a cognitive hierarchy, it is a weird hybrid of a positional and socio-sexual hierarchy. Let’s focus on the socio-sexual component first: the people at the top are effete half-men, exhausted by decades of subordination, humiliation, creative atrophy, and idiocy. One cannot exceed them in creativity and survive, although they have so little creativity– note the irony, in that the self-appointed socio-sexual winners are obese, self-indulgent men with broken families– that self-deletion is often the only viable approach. On its own socio-sexual terms, the corporate world is a miasma of failure.

The corporate world has no aesthetic merits: it’s aggressively tasteless. It has no truth-seeking value: it actively works to destroy truth, lest it disturb the lies that prop it up. It does not have much cognitive merit– smart people do not succeed in it. Yet, when the supposed socio-sexual winners are effete, obese half-men whose families despise them, it’s hard to swallow that component either. What’s left, to dominate daily activity, is positional nonsense: open-plan offices and status checks and pointless deadlines– psychiatric warfare to remind the arbitrarily unfortunate that the arbitrarily fortunate people in charge are in charge.


5. The Aesthetic Self (Maturity; the Artist)

We’re supposed to realize that there’s more to life than incessant competition. Most of us do, especially when we lose at rigged competitions over and over. We grow to value experiences more than possessions, and processes rather than results.

We might paint or write something, sell it to no one, or even destroy it.

The aesthetic self can be selfish, or it can be altruistic, but it seems to be the first self that considers other people for its own reasons. We learn, much earlier with our positional self, to be “nice” to avoid punishment, and we develop finer social skills to ascend socio-sexual hierarchies, but we learn in adulthood that beautiful experiences can have more value– or, at least, a different kind of value– when shared.

True beauty– the first crack of snowy air in November, the first kiss with a new lover– is often fleeting. If you stare at it, it’s gone, just as any word becomes absurd if you repeat it a million times in your head. The only long-term strategy for achieving consistent beauty (as opposed to boring, repetitive indulgence) is to work to give beautiful experiences to others. The aesthetic self seeks the profound and eternal; it also seeks intimacy in the now.

The aesthetic self isn’t always a wondrous or virtuous thing. It can be repugnant, depending on what it finds beautiful. De Sade, I would surmise, found others’ misery and humiliation to be beautiful. The aesthetic self’s tolerance for discomfort can lead it to cause pain for others, too. It would probably steal another airline passenger’s first-class ticket to avoid the unpleasantness of coach, if it could get away with it.

The traditional result of sex is children, and few experiences are so profound as those involving kids. This is good and bad. The aesthetic self drives adults (some of them, anyway) to care about their kids and want positive experiences– and, one imagines, a good life– for them. But there’s much evil that comes from private child-raising, too. The intense bonds people feel with their own progeny– and, often, something on the spectrum between apathy and competitive rivalry toward others’ children– are, no doubt, a driving force behind the proto-fascist nightmare we call “Corporate America”.

One of the deepest dysfunctions of the aesthetic self occurs in the world of recreational drug use. I’m not talking about addiction here, because addiction crushes all selves. LSD, cannabis, and psilocybin are not addictive, and they’re not the evil terror drugs they’re made out to be. (They are dangerous, but the probability of long-term psychiatric casualty from one-time use is probably closer to 1-in-100 than what mainstream society seems to think.) Yet, I’ve seen users of these drugs get wrapped up in “experience chasing”. There’s no physical addiction, but they have to have the next bigger and more powerful experience, to trip harder and longer than they ever have before. It becomes a form of escapism.

Toward the end of that, the psychonaut-cum-burnout has usually been hospitalized at least once, and is likely to suffer panic attacks and HPPD, at which point the drug of abuse is often alcohol– possibly self-medication for acquired anxiety. Alcoholism is never good, but when it’s combined with the medicines one needs to manage panic attacks, it can be fatal.

The aesthetic self’s limitations should be known. Immediate pleasure isn’t always wanted. I don’t know from experience whether heroin is pleasurable (having never done it) but the reasons not to do it are obvious.

This said, the aesthetic self isn’t bad on its own. Religious anti-hedonism is, and we should be thankful for this, dying out. Beauty, joy, and love are what the aesthetic self seeks. The child experiences them but does not know why; the adult has more of a sense of where to find them.

To all of our chagrin, beautiful experiences are rare. There is so much ugliness in the world, and most of it’s unnecessary. Most people spend fifty weeks on pointless, subordinate activity in order to enjoy two weeks of vacation (that they don’t much enjoy because they’re just “recovering”). We live in a world where dysfunction of the aesthetic self– tastelessness– is the norm.

There’s another issue. The first three selves are gender-neutral. Male or female, we all have senses, we’re all animals, we all use the same words. The socio-sexual self incorporates gender into its identity, almost by definition. The aesthetic self need not be gendered, but in our society, it’s treated as feminine– while our society overvalues what it perceives as masculine. The aesthetic self, it says, is useless, chthonic, subjective, and self-indulgent. This is wrong. Beauty and joy are reasons to live; increasing a factory’s efficiency by 2 percent is not one.


One who becomes fully adult will be tasteful and altruistic, I’d argue. It is meaningless to “be happy” if not virtuous; it is pointless to have beautiful experiences but not share them. One ought to wish to make the world better not only for oneself, but for everyone, and for generations to come. This requires insight, knowledge, and care. One must become a truth seeker.

6. The Truth-Seeking Self (Elderhood; The Judge)

Archetypically, truth-seeking is the job of an elder, but by this point, these selves have diverged from chronological age. There are 25-year-olds who are deep truth seekers, and there are lecherous, socio-sexual old men.

Perhaps it’s the experience of facing death that activates the truth-seeking self, but we all face death. Not one of us shall escape it.

Erik Erikson described the midlife conundrum as one between generativity and stagnation, and I think he’s mostly right.

This is one form of the midlife crisis, which seems to occur when a person’s lowest unsatisfied self charges to the fore.

The pathetic, self-indulgent midlife crises we love to mock (cf. American Beauty) are at the socio-sexual level. But, it’s possible to have a midlife crisis on an aesthetic or truth-seeking level. I’m pretty sure my midlife crisis started around my mother’s death (when I was 29) and it has led me to nobler goals: I want to write beautiful prose, and (though I write fantasy, whose reputation is of being purely aesthetic, not literary) I want to delve into deep issues of the human condition, as well as I can.

Lest it seem that I am ranking truth over beauty, contra Keats, that’s not my objective. They are interconnected. Truth can be a source of beauty, or a mechanism that allows us to find it.

All of our selves have value. Our sensory self tells us where we are in the world. Our positional self exists to keep us out of danger. Our cognitive self gives us language; our socio-sexual self gives us motivation until the aesthetic self is mature enough to take over. Truth that does not, in some way, provide beautiful experiences or prevent ugly ones, at least to someone, is mere cognitive formalism. The objective of a truth-seeker isn’t to discover things that are true, because there are myriad meaningless truths, but to reach the best– deepest, most resonant, most useful– truths.

We realize that we’re going to die. (That is the Truth before which other truths bow.) We realize that none of us know what comes afterward, not even the people (on both sides) who think they do. But we can give our life maximal value and purpose, and that seems to be the best preparation for death.

This truth-seeking self can take us to ethical heights. It can lead us to build better societies. However, in corrupt people, its tools can become calamitous. True truth is charismatic, but equally so are many false truths. That’s what fascism and religious cults (charismatic, well-structured false truths) teach us. Hitler and other high-ranking Nazis were drug-abusing mystics, and it’s likely that they activated these higher selves– the truth-seeking self, and possibly the higher, mystical ones– though they did great evil.

Each self has a different notion of conflict and what is “bad”: painful for the sensory self; threatening for the positional self; stupid for the cognitive self; disgusting for the socio-sexual self; ugly for the aesthetic self. The truth-seeking self confronts all manners of bad: the unethical, which is distinct from the immoral, which is distinct from evil. It even recognizes the favored conflicts of literary fiction: good versus good (Little Fires Everywhere) and bad versus bad (Gone Girl).

This said, while the truth-seeking self might learn of good and evil, nothing compels it to choose good. We bet, in line with Martin Luther King’s theory of history, that a certain Law of Large Numbers applies: most of us choose good. Even though only a small percentage of people ever activate the truth-seeking self– we still call it “visionary”, though it would be mundane if we were more evolved– it seems that enough do so, and enough of those break the right way, that human civilization advances, if slowly.

The truth-seeking self is rational, wise, and kind. It knows many things. It knows why ugly things are sometimes beautiful and why beauty becomes stale. It understands socio-sexual manipulations, and it gives a purpose to cognition. It seems as far as we can go– the highest adult self– and communicate with others. Beyond its threshold, we seem to go alone.

Yet, there’s much it can’t hack. The truth-seeking self must admit that it does not know what it does not know. What happens after we die, we ask it. It cannot answer.

The next three selves are the hardest to explore, or even define. They’re dangerous, but important; and that they are dangerous does not mean we should shun them. After all, our lives end with a transition that we regard as supremely hazardous. If we can safely integrate these selves, it might be worth a try.

7. The Self-Revealing Self (The Shaman; the Hero)

The strangest and most powerful experiences confound our aesthetic principles, and our notions of truth. Lucid dreaming suggests that we could, in principle, have any experience we want. Could we create a dream world in which 1 + 1 = 3? How long could we live there? In fact, “drug dreams” by former so-called “psychonauts” suggest that the humble, safe dream state is more powerful (and almost certainly less dangerous) than many psychoactive chemicals.

The subconscious and the underworld live here. The masculine parts of the woman, the feminine parts of the man, and alternative answers to questions one thought were resolved long ago, live in this deep ocean. We and it communicate, but sparsely. When we throw a problem into its basement, go to bed, and solve it the next morning, the work was done in a place that exists in all of us, but that few activate as a self.

The self-revealing self is a reality smasher, just like the socio-sexual one. Our mid-childhood rationality fell to pieces when we discovered orgasm; but we find ways to integrate that. Late-adulthood rationality struggles at the world’s edges, but those edges come all the same. We think strange thoughts as we go to sleep. We know we’ll die.

What, in the end, is good? What’s bad? Can’t a well-intended action cause misery? (Obvious answer: yes.) Why can a drug like LSD give some people an immensely positive experience and lead others into hell? How can we make ourselves into artists (aesthetic) or judges (truth-seeking) if we don’t know the first damn thing about the world we live in, or who we are?

The self-revealing self’s name suggests recursion. That’s probably intentional. It’s a weird space to explore. Let’s talk about one (and not the most common) usage of the term psychedelic.

The term psychedelic means “mind-revealing”, but psychedelic drugs are one of the less important (and more dangerous) ways to bring this aspect to the fore. The drugs have shown utility in controlled usage; however, uncontrolled hedonistic use, especially by experience-chasing young people, seems to do far more harm than good. It’s probably better, for most of us, to take the slow route: to meditate, and get there when we get there, rather than strapping chemical rockets to our asses.

In fact, most people associate psychedelia with visual hallucinations, overwhelming emotions, or intense introspection. Only the last of these seems like it might belong to the self-revealing circuit. The “trippy” psychedelia is merely hedonic/aesthetic, and there are safer ways to activate that self. If visuals are your thing, go to an art museum. If you want to explore consciousness safely, consider brainwave entrainment (e.g., binaural beats, isochronic tones). We still don’t understand these drugs well enough for anyone to say they aren’t dangerous. The drugs seem to be reasonably safe under controlled settings, and can produce higher-self experiences if used meditatively, but casual use seems to bring risk without profundity, and anyone who uses LSD as a party drug is begging the gods for disaster.

Having discussed what it’s not, what is the self-revealing self? Oh, this is hard to say. Let me first haul out a metaphor: language and writing.

The sensory self knows only sounds: a tiger’s roar. It does not stop for words, and words do not kindly stop for it.

The positional self understands commands and very simple sentences (usually with a first- or second-person subject.) Stop! Milk me! (This is a toddler requesting milk, not to be milked.) Go! I’m cold!

The cognitive self has the full faculty of any modern language, but little nuance. That’s enough for a proof or technical writing.

The socio-sexual self will indulge, and can write perfectly salable erotica, and one who desperately needs to sell writing could stop there.

Commercial novelists engage with the aesthetic self: they provide an emotional response, whether it’s a form of intimacy (romance novels, which, contrary to their reputation, are more about the emotional bond of the characters than the lurid aspects) or excitement (thrillers, which keep the reader in life-or-death suspense) or fear (horror, leveraging aesthetic paradoxes) or intellectual achievement (mystery, leveraging the “Aha!” sensation) or wonder (fantasy, bringing us back to a child’s sense of a bigger world). Provoking an emotional, aesthetic response in the reader is the writer’s goal: a few do it well.


There’s nothing wrong with commercial writing, though I prefer to write literary fantasy. To be honest about it, the best storytellers are often commercial writers; because they spend less time per book, they get more experience writing stories.

Literary novelists– the next tier up in difficulty and prestige, though I don’t intend to knock what commercial writers do– seek more than an emotional response. Aesthetics are important to them, but psychological and social truths are even more important. The goal might be to capture, in full accuracy, what it was like to live in Seattle in 1995, or to have been a 40-year-old Buddhist priest who joined a hedge fund in 2006. Or, perhaps the goal is to explore human nature under adversity, with a fable set among animals in the California woods. The story must be engaging, and the prose must be very good, but an additional objective to the aesthetic one is theme: to give what could be a dry, 20,000-word essay, instead, the human power of a 100,000-word story. These novels reach a truth-seeking height.

(As an aside, I am not saying that thrillers, science fiction, or fantasy novels can’t be literary. Metrorealism, or literary-the-genre, is a good genre, but there’s plenty of great stuff outside of it.)

Something very few writers excel at, and it requires a light touch, but an incredible amount of work, is to go to the next level: to master self-revealing detail (and remember that the self being revealed is a character, not the author). Aesthetic, commercial writers focus on the story and characters– the art of life. Truth-seeking, literary writers focus on these elements, but also the sentences and diction– the art of life and the art of writing. Beyond that, though, there is something that is hard to put a name on, that comes out of the most precise writing. You see it in a million micro-decisions. Just now, I wrote “hard to put a name on”, not “on which it’s hard to put a name”. I wanted German bluntness, not Latinate refinement. One must know the rules, but know exactly when it is in character to break them.

For example, literary agents (most of whom are failed editors that real editors use as gatekeepers) despise exclamation points. Hate, hate, hate! You will not get one if you use two or more. Of course, that mark is overused by mediocre writers, but there are times when it is in order. It differentiates hot anger from cold anger. (Villains who don’t use exclamation points are scarier.) If you’re writing in character as a six-year-old, you should use it (and you should lay off the complex sentences). If you’re writing a 30-year-old, of course, the weather is probably not “sunny!”

Now, anything that would be called “psychedelic writing”… I would do everything I could not to read, much less write. But, what the best writing does, that no other medium matches, is the ability not just to “get inside a character”, like some kind of alien invader, but to walk pace-by-pace with her. Commercial writing (aesthetic) lets us vet stories that Hollywood may one day pick up. Middle literary writing (truth-seeking) lets an essay be told in an engaging way, but a documentary can do that. There’s a self-revealing level that’s a tad bit higher. No film can capture that wretch, Humbert Humbert, the way Nabokov’s words do. The magic telepathy of fiction (which is absent in mediocre fiction that can still be entertaining) requires a writer to have the talent, experience, and sheer masochism to fuss over commas and pronouns and even information-theoretic concerns. I allow Farisa to use more complex sentences than less intelligent characters, just as Othello’s and Hamlet’s vocabularies exceeded those of bit characters. I cannot promise that what I’m writing will be “high literary” fiction, but I will try. Farisa would put a comma where another character might not.

There’s a metaphor. I hope it helps. It may not. The self-revealing mind tends to poke out in minute details. Why does green have different connotations from red? Why does the female hero have a mark on her shoulder? Why does order beget chaos, and chaos order? What is this world and what are we trying to do here? What awaits us in the underworld or the hereafter?

One of the problems the self-revealing self presents is that people seem to gain strange beliefs when they activate it. I don’t know why that is. I don’t bristle when people claim to remember past lives or understand the afterlife, because I don’t see those topics as necessarily supernatural. If we exist after we die, and have the same terminology, we’ll regard the afterlife as “natural”. I am skeptical when people claim their near-death experiences have given them the ability to predict the future, and that Japan will sink into the ocean in 1997 (which, thank God, it never did). I don’t believe in psychic powers and so, no, I don’t think the self-revealing (or higher) selves can unlock them.

What the self-revealing self can give us is a certain ability to program ourselves; to examine our motivations and thought processes, and retrain them. One might call this “metaprogramming”, but the term has been used by so many unsavory characters that I’m hesitant to use it.

So much misguided nonsense has been said about “the psychedelic”– and the irresponsible 1960s drug culture did far more harm than good, as recreational drug use seems to be the easiest but worst way to access this– that I’m hesitant to risk adding to the nonsense pile. I hope I’ve done what I can to make the self-revealing self clearer. Most people experience it, but only in dreams, and the few moments before and after sleep. It’s there, and it’s mostly subconscious; it’s worth listening to, but most of us, probably, don’t need to yell at it.

8. The Supremely Vulnerable Self (Atonement; 0)

Let’s say that one mastered the true psychedelic world, turning life into a 24/7 lucid dream. One could have whole-body orgasms with a single thought, decide to perceive orange and blue, and drop out of all human misery. One could burn to death, and still achieve the deep bliss of meditation.

Would that be the final goal? Or would it get boring? We do not seem to be satisfied with any final goal. Is that a result of our human clinging, or our infinite spirit?

Let’s just talk about our experiences as earthbound creatures who know they are going to die. We are stuck between two terrible notions: annihilation and eternity. We can conceive of the possibility of, but could never construct with the tools we have now, an eternal existence that we would want. Our physical brains would run out of space and we’d lose treasured memories (and, besides, the universe will die out some day). A non-physical afterlife is plausible, but we can’t imagine what its principles are. Does it also obey thermodynamic principles? Eternity is distressing; eternal anything is unsettling. At that scale, we don’t even understand time. Has the universe existed for 13.7 billion years, or did a universe already 10-ish billion years old come into existence when there was first conscious life to observe it? How can we talk about life after death when our notions of before and after are bound to such a pedestrian notion of time?

Yet, we also hate the idea that consciousness might end; not the feared (non-)experience but the lack of completion, and the sheer injustice– some people are born, live in pain for a few hours, then die– that it would imply.

If we experience something again and again, it eventually loses meaning. Its hedonic value seems to go to zero. Of course, if there is no experience after death, the hedonic payoff of that is zero. Can we even conceive of nothing? What does nothing even mean? If we let go of everything and float in the void, do we know that we are still there? If we are not there, then what perceives the void? Is there a void? If there is not a void, then did we ever disappear?

Sensory deprivation and deep meditation can lead this way. Some people have achieved it with drugs, but I don’t recommend that route; dissociative drugs especially are dangerous. A severe panic attack induces this, and I don’t recommend that either. Some call this ego death, although I think there are varieties that are terrifyingly with-ego, e.g. “What is happening to me?” A panic attack feels fatal and final; in fact, that terrifying experience is arguably when one is most alive, obsessed to a hot point with survival. There is a state of chaos in which anything seems possible, in which it is as plausible that one is the chair one sits on as that one is oneself. I can’t put that feeling into words; it’s deeply weird. This experience can multiply by zero, producing nothing; it can divide by zero, producing anything.

All of this– the proliferation and therefore devaluation of experience, or the absence of true feeling– can lead to despondency. I believe it is related to the “dark night of the soul” that St. John of the Cross wrote about. It can be a state of peace, or one of profound insecurity.

One might ask what I think happens after death. Obviously, I don’t know the answer, any more than anyone else does. My guess: the terrifying answer is freedom. The good news: I don’t think it’s liable to last that long, hence reincarnation. I do not think most of us here (and I include myself) are advanced to a state where we can tolerate such freedom indefinitely; rather, we bind to something new, and are reborn.

Just as sensory deprivation frees us, to the point where we experience psychedelia and possible ego loss (even without drugs), death liberates us from a body and a brain. We may not be able to think without a brain– we know that in this world, we do all our thinking with it– so even there, we’re at a loss to imagine what goes on. We might just drift.

I would guess that there is some beneficial-entropic process– a “heat” death we’d actually want– of consciousness toward a final state that mixes enlightenment with annihilation. The good parts of us continue; the bad parts are burned away by a force one might call “God”. That course, I imagine, is long: millions or billions of lives. As for the space between them, I wonder if a not-yet enlightened person like me can do much there. The tendency to take form and be born again may be irresistible.

I could describe my attempts to meditate into the eighth self, but I could not express them to anyone without comparable experience (who, for that matter, would have no use for such an explanation). I think Zen koans, as objects of meditation, help us get a sense of what we’re dealing with. What is the sound of one hand clapping? The answer doesn’t matter; if an answer existed, it would be nonsense. Think on the question, for an hour. Maybe two. The process of answering the koan is what matters; that there is no answer lets the process (helpfully) avoid completion.

This is a trip I’ll take for real in about 40–50 years. Whatever I say about it from this side, using the occasional experience that feels like it might be somehow like that, is still a guess.

The space of this self is very dark. Not bad, not evil, any more than light is one or the other. It’s a whale’s song in midnight depths of the ocean. Farisa, in the spring-day forest, closes her eyes and nothing bites her. Dark.

And then, there seems to be light.

9. The Meta-Mortal Self (Return With the Elixir; 1)

There is a saying that there are no atheists in foxholes. It is offensive, and it is not true. There are brave, heroic people who are atheists and a close encounter with death need not convince someone to believe in gods or an afterlife.

This said, I doubt there are any nihilists in foxholes. The struggle to survive would, in that moment, provide meaning. The sense of relief, if nothing else, would be pleasant enough that the soldier would feel, or learn, something. Nature abhors a vacuum, and nihilism would implode if it were a thing at all.

That’s why nihilism’s so dangerous. If nihilism stayed nihilistic, it wouldn’t be militant, and therefore it wouldn’t be bad. All of us experience transient personal nihilism; the self described above does, in the extreme. Nihilism’s danger comes from the high likelihood of this void being filled with something we don’t want: avarice, lust, vengeance, or mayhem.

I’ve spoken out against cultural and economic nihilism. As I said, personal nihilism isn’t evil; it seems to be an inevitable stage. There is, therefore, no value in defining “nihilist” as a tribe– we should not; that must be clear by now– as if we were not all prone to nihilism, because we are. Some religious people attempt to “other” their own doubts by lashing out at nonbelievers. It should be obvious that I’m against that. It’s cruel. It does not make sense to call a person “a nihilist” as if it were a permanent commitment; it is something all of us struggle with: how do we find purpose in what seems to be an oppressively short life, in a world that seems godless and, in the larger scale, mindless?

This is hard to answer. The existentialist view is that meaning emerges from subjectivity and will. It need not be eternal to have value. I tend to agree with this. Even if existence is eternal, it’s not guaranteed to be meaningful. There could be one miserable person in heaven who sits around and says, “This is bullshit”. Perhaps heaven and hell are the same place– the allegory of the long spoons– and heaven is therefore conditioned upon humanist ethics (as opposed to the might-makes-right, anti-humanist, predestination-heavy theology of, say, the founders of our country’s so-called “Protestant work ethic”).

This is something I fight with in Farisa’s Crossing. There is a fantasy race that, in addition to having long lifespans, has developed the ability to recall past lives in their entirety. They know that there’s an afterlife. But God? Meaning? They’re as confused as we are.

I use the term meta-mortal because it’s antithetical to immortal. I see a reincarnation model as most likely to be true. The one-life Christian model, if we view 0 as total depravity and 100 as perfect salvation, seems to send 49.99 to hell and 50.01 to heaven; it seems more likely that 50.01 comes back as 50.01– Brownian motion, not Lévy jumps– but, of course, I don’t know. Let’s say that reincarnation is true. This does not make us immortal. It makes us more mortal. It means we die all the time! The tragedy of death is not swept away; it is multiplied.

Don’t get me wrong: I’d rather go to heaven than be reincarnated, especially if there are no reunions in the time– is there such a notion of time?– between lives, because we all pop in and out at different times. My cats will probably die before me, and I’ll want to see them again, along with a large number of people.

If reincarnation is true, then I’ve died before. To be honest, I don’t fear death– in this life. But I am sure, in such a case, that I have died, in pain and fear, before. It is also possible, then, that I will experience future deaths that I (with different religious or philosophical beliefs) will fear greatly.

And, even if we only live once– either the Christian model, or we face annihilation– there are still little deaths we face constantly… of thoughts, of ways of life, of relationships, of identities. What emerges from all this tragedy? What keeps us going?

As opposed to meta-mortal, I considered the term trans-mortal. In a different world, that would be better. The problem is that I want to avoid association with “trans-humanism”, a tech-industry religion that capitalist atheist nerds invented to come to terms with the fact that they’ve done their 20s wrong. The Singularity (nerd rapture) is something they believe in because, otherwise, they would recognize it as tragic that they wasted their youth for the profit of the super-rich (whom they are exceedingly unlikely to join). It is tragic. Place no hope in robot gods who’ll buy us out of the fact of mortality; to me it seems ludicrous. One should not use religion, of any kind, to blunt the moral stakes of reality. If there is a God, it is not fair to put it on Her to “let God sort ’em out”; clearly, She intended that we not do that.

Given that, it seems obvious to me why, if there is a God, She would create a world that seems to be so godless. Death means there are stakes. Life matters. Living in a world that presents the possibility of death being the end means that we do not take life (ours, but more importantly, those of others) for granted. Why, one might ask, would a God “create” us imperfect and force us to go through all this suffering? The answer, I believe, is that God is not omnipotent– or that, if one means God to be an omnipotent being, there is no God. The concept itself breaks down easily. Could an omnipotent being take away its own omnipotence, forever? If no, then such a being is not omnipotent; if yes, then such a being’s omnipotence can be taken away and therefore the being is not omnipotent. It gets weird– weird enough that I’ve rejected the notion of omnipotence wholly. Not only do we not need it to find meaning in life, but it seems to steamroll any attempts we’ve made at the endeavor.

These questions are hard to answer. Heroes return from death, in their archetypical journeys, with an elixir; but our final journey– not necessarily heroic, insofar as it’s completely involuntary– is the one we don’t come back from.

What emerges from the abyss?

I don’t know.

I’m going to go play with my cats, now.