Greed versus sadism

I’ve spent a fair amount of time reading Advocatus Diaboli, and his view on human nature is interesting. He argues that sadism is a prevailing human trait. In an essay on human nature, he states:

They all clearly demonstrate a deep-seated and widespread human tendency to be deceitful, cruel, abusive and murderous for reasons that have almost nothing to do with material or monetary gain. It is as if most human beings are actively driven by an unscratchable itch to hurt, abuse, enslave and kill others even if they stand to gain very little from it. Human beings as a species will spend their own time, effort and resources to hurt other living creatures just for the joy of doing so.

This is a harsh statement, and far from socially acceptable. Is sadism a defining human characteristic, rather than a perversion? To put my own position forward: I don’t agree that sadism is nearly as prevalent as AD suggests. However, it’s an order of magnitude more prevalent than most people want to admit. Economists ignore it and focus on self-interest: the economic agent may be greedy (that is, focused on narrow self-interest) but he’s not trying to hurt anyone. Psychology treats sadism as pathological, and limited to a small set of broken people called psychopaths, then tries to figure out what material cause created such a monster. The liberal, scientific, philosophically charitable view is that sadistic people are an aberration. People want sex and material comfort and esteem, it holds, but not to inflict pain on others. Humans can be ruthless in their greed, but they are not held to be sadistic. What if that isn’t true? We should certainly entertain the notion.

The Marquis de Sade– more of a pervert than a philosopher, and a writer of insufferably boring, yet disturbing, material– earned his place in history by making this exact argument. In the Enlightenment, the prevailing view was that human nature was not evil, but neutral-leaning-good. Corrupt states and wayward religion and unjust aristocracies perverted human nature, but the fundamental human drive was not perverse. De Sade was one of the few to challenge this notion. To de Sade, inflicting harm on others for sexual pleasure was the defining human trait. This makes the human problem fundamentally insoluble. If self-interest and greed are the problem, society can align people’s self-interests by prohibiting harmful behaviors and rewarding mutually beneficial ones. If, however, inflicting pain on others is a fundamental human desire, then it is impossible for any desirable state of human affairs to be remotely stable; people will destroy it, just to watch others suffer.

For my part, I do not consider sadism to be the defining human trait. It exists. It’s real. It’s a motivation behind actions that are otherwise inexplicable. Psychology asserts it to be a pathological trait of about 1 to 2 percent of the population. I think it’s closer to 20 percent. The sadistic impulse can overrun a society, for sure. Look at World War II: Hitler invaded other countries to eradicate an ethnic group for no rational reason. Or, the sadists can be swept to the side and their desires ignored. Refusing to acknowledge that it exists, however, is not a solution, and I’ll get to why that is the case.

Paul Graham writes about the zero-sum mentality that emerges in imprisoned or institutionalized populations. He argues that the malicious and pointless cruelty seen in U.S. high schools, in prisons, and among high-society wives is of a kind that emerges from boredom. When people don’t have something to do– and are institutionalized or constrained by others’ low regard for them (teenagers are seen as economically useless, high-society wives are made subservient, prisoners are seen as moral scum)– they create senseless and degrading societies. He’s right about all this. Where he is wrong is in his assertion that “the adult world” (work) is better. For him, working on his own startup in the mid-1990s Valley, it was. For the 99%, it’s not. Office politics is the same damn thing. Confine and restrain people, and reinforce their low status with attendance policies and arbitrary orders, and you get some horrendous behavior. Humans are mostly context. Almost all of us will become cruel and violent if circumstances demand it. Okay, but is that the norm? Is there an innate sadism to humans, or is it rare except when induced by poor institutional design? The prevailing liberal mentality is that most human cruelty is either the fault of uncommon biological aberration (mental illness) or incompetent (but not malicious) design in social systems. The socially unacceptable (but not entirely false) counterargument is that sadism is a fundamental attribute of us (or, at least, many of us) as humans.

What is greed?

The prevailing liberal attitude is that greed is the source of much human evil. The thing about greed is that it’s not all that bad. In computer science, we call an optimization algorithm “greedy” if it is short-sighted (i.e. at each step it takes the locally best option, without considering the whole solution space) and these greedy algorithms often work. Sometimes, they’re the only option because anything else requires too much in the way of computational resources. “Greed” can simplify. Greedy people want to eat well, to travel, and for their children to be well-educated. Since that’s what most people want, they’re relatable. They aren’t malignant. They’re ruthless and short-sighted and often arrogant, but they (just like anyone else) are just trying to have good lives. What’s wrong with that? Nothing, most would argue. Most importantly, they’re reasonable. If society can be restructured and regulated so that doing the right thing is rewarded, and doing the wrong thing is punished or forbidden, greedy people can be used for good. Unlike the case with sadism, the problem can be solved with design.
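To make this concrete, here’s a minimal sketch in Python (my choice of language for illustration) of the textbook greedy approach to making change: take the largest coin that fits, repeat, never reconsider. It’s cheap, usually good enough, and sometimes wrong; that is roughly the point.

```python
def greedy_change(amount, denominations):
    """Greedy change-making: grab the largest coin that fits, never backtrack."""
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            coins.append(d)
            amount -= d
    return coins

# Optimal for canonical coin systems like US currency...
print(greedy_change(63, [25, 10, 5, 1]))  # [25, 25, 10, 1, 1, 1]
# ...but short-sighted in general: the optimal answer here is [3, 3].
print(greedy_change(6, [4, 3, 1]))        # [4, 1, 1]
```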

Is greed good? It depends on how the word is defined. We use the word ambition positively and greed negatively, but if we compare what the two words actually denote, I’m not sure this makes a lot of sense. Generally, I view people who want power more negatively than those who want wealth (in absolute, rather than relative, terms) alone. As a society, we admire ambition because the ambitious person has a long-term strategy– the word comes from the Latin ambire, which means to walk around gathering support– whereas greed has connotations of being short-sighted and petty. We conflate long-range thinking with virtue, ignoring the fact that vicious and sadistic people are capable of long-term thought as well. At any rate, I don’t think greed is good. However, greed might be, in certain contexts, the best thing left.

To explain this, note the rather obvious fact that corporate boardrooms aren’t representative samples of humanity. For each person in a decision-making role in a large business organization, there’s a reason why he’s there and, if you think it comes down to “hard work” or “merit”, you’re either an idiot or painfully naive. Society is not run by entrepreneurs, visionaries, or creators. It’s run by private-sector social climbers. Who succeeds in such a world? What types of people can push themselves to the top? Two kinds. The greedy, and the sadistic. No one else can make it up there, and I’ll explain why, later in this post.

This fact is what, in relative terms, makes greed good. It’s a lot better than sadism.

The greedy person may not value other concerns (say, human rights or environmental conservation) enough, but he’s not out to actively destroy good things either. The sadist is actively malicious and must be rooted out and destroyed. It is better, from the point of view of a violence-averse liberal, that the people in charge be merely greedy. Then it is possible to reason with them, especially because technology makes rapid economic growth (5 to 20+ percent per year) possible. What prevents that from happening now is poor leadership, not malignant obstruction, and if we can share the wealth with them while pushing them aside, that might work well for everyone. If the leaders are sadistic, the only way forward is over their dead bodies.

“The vision thing”

Corporate executives do not like to acknowledge that the vast majority of them are motivated either by greed or by sadism. Instead, they talk a great game about vision. They concoct elaborate narratives about the past, the future, and their organization’s place in the world. It makes greed more socially acceptable. Yes, I want power and wealth; and here is what I plan to do with it. In the corporate world, however, vision is almost entirely a lie, and there’s a solid technical reason why that is the case.

We have a term in software engineering called “bikeshedding”, which refers to the narcissism of petty differences. Forget all that complicated stuff; what color are we going to paint the bike shed? The issue quickly becomes one that has nothing to do with aesthetics. It’s a referendum on the status of the people in the group. You see these sorts of things in mergers often. In one company, software teams are named after James Bond villains; in the other, they’re named after 1980s hair bands. If the merger isn’t going well, you’ll see one team try to obliterate the memetic cultural marks of the other. “If you refer to Mötley Crüe in another commit message, or put umlauts where they don’t belong for any reason, I will fucking cut you.”

Bikeshedding gets ugly, because it’s a fundamental human impulse (and one that is especially strong in males) to lash out against unskilled creativity (or the perception of unskilled creativity, because the perceiver may be the defective one). You see this in software flamewars, or in stand-up comedy (with hecklers pestering comics, and the swift comics brutally insulting their adversaries). This impulse toward denial is not sadistic or even a bad thing at its root. It’s fundamentally conservative: inflicting brutal social punishments on incompetent wannabe chieftains is what kept early humans from walking into lions’ dens.

As a result of the very strong anti-bikeshedding impulse, creativity and vision are punished, because (a) even those with talent and vision come under brutal attack and are drawn into lose-lose ego wars, and (b) almost never are there creatively competent adults in charge who can resolve conflicts, consistently, on the right side. The end result is that creativity and vision are driven out of organizations. If you stand for something– anything, even something obviously good for the organization– the probability that you’ll take a career-ending punch approaches one as you climb the ladder. If you want to be a visionary, Corporate America is not the place for it. If you want to be seen as a visionary in Corporate America, the best strategy is to discern what the group wants before a consensus has been reached, and espouse the viewpoint that is going to win– before anyone else has figured that out. What this means is that corporate decisions are actually made “by committee”, and that the committee is usually made up of clever but creatively weak individuals. In the same way as mixing too many pigments produces an uninspiring blah-brown color, an end result of increasing entropy, the decisions that come from such committees are usually depressing ones. They can’t agree on a long-term vision, and to propose one is to leave oneself politically exposed and be termed a “bikeshedder”. The only thing they can agree upon is short-term profit improvement. However, increasing revenue is itself a problem that requires some creativity. If the money were easy to make, it’d already be had. Cutting costs is easier; any dumbass can do that. Most often, the “cut” costs are actually only externalized. Cutting health benefits, for one example, means work time is lost to arguments with health insurance companies, reducing productivity in the long run and making the cut a net negative on the whole. But because those with vision are so easily called out as bikeshedding, impractical narcissists, the only thing left is McKinsey-style cost externalization and looting.

Hence, two kinds of people remain in the boardroom, after the rest have been denied entry or demoted out of the way: the ruthlessly greedy, and the sadistic.

Greedy people will do what it takes to win, but they don’t enjoy hurting people. On the contrary, they’re probably deeply conflicted about what they have to do to get the kind of life they want. The dumber ones probably believe that success in business requires ruthless harm to others. The smarter ones see deference to the mean-spirited cost-cutting culture as a necessary, politically expedient, evil. If you oppose it, you risk appearing “soft” and effeminate and impractical and “too nice to succeed”. So you go along with the reduction of health benefits, the imposition of stack ranking, the artificial scarcities inherent in systems like closed allocation, just to avoid being seen that way. That’s how greed works. Greedy people figure out what the group wants and don’t fight it, but front-run that preference as it emerges. So what influences go into that group preference? Even without sadism, the result of the entropy-increasing committee effect seems to be “cost cutting” (because no one will ever agree on how to increase revenue). With sadism in the mix, convergence on that sort of idea happens faster, and externalized costs are ignored all the more readily.

The sadist has an advantage in the corporate game that is unmatched. The more typical greedy-but-decent person will make decisions that harm others, but is drained by doing so. Telling people that they don’t have jobs anymore, and that they won’t get a decent severance because that would have been a losing fight against HR, and that they have to be sent out by security “by policy”, makes them pretty miserable. They’ll play office politics, and they play to win, but they don’t enjoy it. Sadists, on the other hand, are energized by harm. Sadists love office politics. They can play malicious games forever. One trait that gives them an advantage over the merely greedy is that, not only are they energized by their wins, but they don’t lose force in their losses. Greedy people hate discomfort, low status, and loss of opportunity. Sadists don’t care what happens to them, as long as someone else is burning.

This is why, while sadists are probably a minority of the general population, they make up a sizeable fraction of the upper ranks in Corporate America. Their power is bolstered by the fact that most business organizations have ceased to stand for anything. They’re patterns of behavior that have literally no purpose. This is because the decision-making derives from a committee of greedy people with no long-term plans, and sadistic people with harmful long-term plans (that, in time, destroy the organization).

Sadists are not a majority contingent in the human population. However, we generally refuse to admit that sadism exists at all. We treat it as the province of criminals and perverts; surely, we assume, these upstanding businessmen have their reasons (if short-sighted ones, chalked up to a failure of regulation) for their bad behaviors. I would argue that, by refusing to admit to sadism’s prevalence and commonality, we actually give it more power. When people confront frank sadism either in the workplace or in public, they’re generally shocked. Against an assailant, whether we’re talking about a mugger or a manager presenting a “performance improvement plan”, most people freeze. It’s easy to say, “I would knee him in the nuts, gouge out his eyeballs, and break his fingers in order to get away.” Very few people, when battle visits them unprepared, do so. Mostly, the reaction is, I can’t believe this is happening to me. It’s catatonic panic. Refusing to admit that sadism is real and that it must be fought, we instead give it power by ignoring its existence, thus allowing it to ambush us. In a street fight, this is observed in the few seconds of paralytic shock that can mean losing the fight and being killed. In HR/corporate matters, it’s the tendency of the PIP’d employee to feel intense personal shame and terror, instead of righteous anger, when blindsided by managerial adversity.

The bigger problem

Why do I write? I write because I want people in my generation to learn how to fight. The average 25-year-old software engineer has no idea what to do when office politics turn against him (and that, my friends, can happen to anyone; overperformance is more dangerous than underperformance, but that’s a topic for another essay). I also want them to learn “Work Game”. It’s bizarre to me that learning a set of canned social skills to exploit 20-year-old women with self-esteem problems (pickup artistry) is borderline socially acceptable, while career advice is always of the nice-guy “never lie on your resume, no exceptions” variety. (Actually, that advice is technically correct: everyone who succeeds in the corporate game has lied to advance his career, but has never put an objectively refutable claim in writing.) Few people have the courage to discuss how the game is actually played. If men can participate in a “pickup artist” culture designed to exploit women with low self-respect and be considered “baller” for it, and raise millions in venture funding… then why is it career-damaging to be honest about what one has to do in the workplace just to maintain, much less advance, one’s position? Why do we have to pretend to uphold this “nice guy”/AFC belief in office meritocracy?

I write because I want the good to learn how to fight. We need to be more ruthless, more aggressive, and sometimes even more political. If we want anything remotely resembling a “meritocracy”, we’re going to have to fight for it and it’s going to get fucking nasty.

However, helping people hack broken organizations isn’t that noble of a goal. Don’t get me wrong. I’d love to see the current owners of Corporate America get a shock to the system. I’d enjoy taking them down (that’s not sadism, but a strong– perhaps pathologically strong, but that’s another debate– sense of justice.) Nonetheless, we as a society can do better. This isn’t a movie or video game in which beating the bad guys “saves the world”. What’s important, if less theatric and more humbling, is the step after that: building a new and better world after killing off the old one.

Here we address a cultural problem. Why do companies get to a point where the ultimate power is held by sadists, who can dress up their malignant desires as hard-nosed cost-cutting? What causes the organization to reach the high-entropy state in which the only self-interested decision it can make is to externalize a cost, when there are plenty of overlooked self-interested decisions that are beneficial to the world as a whole? The answer is the “tallest nail” phenomenon. The tallest nail gets hammered down. As a society, that’s how we work. Abstractly, we admire people who “put themselves out there” and propose ideas that might make their organizations and the world much better. Concretely, those people are torn down as “bikeshedders”, by (a) their ideological opponents, who usually have no malicious intent but don’t want their adversaries to succeed (at least, not on that issue); (b) sadists relishing the opportunity to deny someone a good thing; (c) personal political rivals, which any creative person will acquire over time; and (d) greedy self-interested people who perceive the whim of the group as it is emerging and issue the final “No”. We have a society that rewards deference to authority and punishes creativity, brutally. And capitalism’s private sector, which is supposed to be an antidote to that, and which is supposed to innovate in spite of itself, is where we see that tendency in the worst way.

Greed (meaning self-interest) can be good, if directed properly by those with a bit of long-term vision and an ironclad dedication to fairness. Sadism is not. The combination of the two, which is the norm in corporate boardrooms, is toxic. Ultimately, we need something else. We need true creativity. That’s not Silicon Valley’s “make the world a better place” bullshit either, but a genuine creative drive that comes from a humble acknowledgement of just how fucking hard it is to make the world a tolerable, much less “better”, place. It isn’t easy to make genuine improvements to the world. (Mean-spirited cost-cutting, sadistic game-playing, and cost externalization are much easier ways to make money. Ask any management consultant.) It’s brutally fucking difficult. Yet millions of people every day, just like me, go out and try. I don’t know why I do it, given the harm that even my mild public cynicism has brought to my career, but I keep on fighting. Maybe I’ll win something, some day.

As a culture, we need to start to value that creative courage again, instead of tearing people down over petty differences.


What’s a mid-career software engineer actually worth? Try $779,000 per year as a lower bound.

Currently, people who either have bad intentions or a lack of knowledge are claiming that software engineer salaries are “ridiculous”. Now, I’ll readily admit that programmers are, relative to the general population, quite well paid. I’m not about to complain about the money I make; I’m doing quite well, in a time and society where many people aren’t. The software industry has many problems, but low pay for engineers (at least, for junior and mid-career engineers; senior engineers are underpaid but that’s an issue for another time) doesn’t crack the top 5. Software engineers are underpaid, relative to the massive amount of value (if given proper projects, rather than mismanaged as is typical) they are capable of delivering. In comparison to the rest of society, they do quite well.

So what should a software engineer be paid? There’s a wide disparity in skill level, so it’s hard to say. I’m going to focus on a competent, mid-career engineer. This is someone with between 5 and 10 years of experience, with continual investment in skill, and probably around 1.6 on this software engineering scale. He’s not a hack or the stereotypical “5:01” programmer who stopped learning new skills at 24, but he’s not a celebrity either. He’s good and persistent and experienced, but probably not an expert. In the late 1990s, that person was just cracking into six-figure territory: $100,000 per year. No one thought that number was “ridiculous”. Adjusted for inflation, that’s $142,300 per year today. That’s probably not far off what an engineer at that level actually makes, at least in New York and the Bay Area.
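For transparency, here’s that adjustment spelled out, assuming a cumulative CPI factor of roughly 1.42 between 1999 and 2014 (a rough approximation on my part):

```python
# Inflation adjustment behind the $142,300 figure (assumed CPI factor, 1999 -> 2014).
salary_1999 = 100000
cpi_factor = 1.423               # assumed cumulative inflation over those 15 years
print(salary_1999 * cpi_factor)  # ~142,300 in 2014 dollars
```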

Software engineers look “ridiculous” to people who haven’t been software engineers in 20 years (or ever) and whose numbers are way out of date. If you’re a Baby Boomer whose last line of code was in 1985, you’re probably still thinking that $60,000 is a princely sum for a programmer to earn. Programmer salaries are only “at a record high” in nominal terms, because inflation is an exponential process. Adjust for it, and they’re right about where history says they should be.

I would argue, even, that programmer salaries are low when taking a historical perspective. The trend is flat, adjusting for inflation, but the jobs are worse. Thirty years ago, programming was an R&D job. Programmers had a lot of autonomy: the kind of autonomy that it takes if one is going to invent C or Unix or the Internet or a new neural network architecture. Programmers controlled how they worked and what they worked on, and either answered to other programmers or to well-read scientists, rather than anti-intellectual businessmen who regard them as cost centers. Historically, companies were sincerely committed to their employees’ careers and training. You didn’t have to change jobs every 2 years just to keep getting good projects and stay employable. The nature of the programming job, over the past couple decades, has become more stressful (open-plan offices) and careers have become shorter (ageism). Job volatility (unexpected layoffs and, even, phony “performance-based” firings in lieu of proper layoffs, in order to skimp on severance because that’s “the startup way”) has increased. Given all the negatives associated with a programming job in 2014 that just didn’t exist in the 1970s and ’80s, flat performance on the salary curve is disappointing. Finally, salaries in the Bay Area and New York have kept abreast of general inflation, but the costs of living have skyrocketed in those “star cities”, while the economies of the still-affordable second-tier cities have declined. In the 1980s and ’90s, there were more locations in which a person could have a proper career, and that kept housing prices down. In 2014, that $142,000 doesn’t even enable one to buy a house in a place where there are jobs.

All of those factors are subjective, however, so I’ll discard them. We have sufficient data to know that $142,000 for a mid-career programmer is not ridiculous. It’s a lower bound for the business value of a software engineer (in 1999); we know that employers did pay that; they might have been willing to pay more. This information already gives us victory over the assclowns claiming that software engineer salaries are “ridiculous” right now.

Now, I’ll take it a step further and introduce Yannis’s Law: programmer productivity doubles every 6 years. Is it true? I would say that the answer is a resounding “yes”. For sure, there are plenty of mediocre programmers writing buggy, slow websites and abusing Javascript in truly awful ways. On the other hand, a good programmer who wants quality has more recourse; rather than committing to commercial software, she can peruse the open-source world. There’s no evidence for a broad-based decline in programmer ability over the years. It’s also easy to claim that the software career “isn’t fun anymore” because so much time is spent gluing existing components together, and accounting for failures of legacy systems. I don’t think these gripes are new, and I think tools are improving, and a 12% per year rate (doubling every 6 years) sounds about right. Put another way, one who programs exactly as was done in 1999 is only about 18 percent as productive as one using modern tools. And yet that programmer, only 18% as productive as his counterpart today, was worth $142,000 (2014 dollars) back then!
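The back-of-the-envelope arithmetic, under the assumptions I’m using here (1999 to 2014 treated as 15 years):

```python
# Yannis's Law as applied above: doubling every 6 years is roughly 12% per year.
annual_rate = 2 ** (1 / 6) - 1     # ~0.122
multiplier = 1.12 ** 15            # ~5.47x productivity gain from 1999 to 2014
print(annual_rate, multiplier, 1 / multiplier)  # 1/multiplier ~ 0.18, i.e. "about 18 percent"
```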

Does this mean that we should throw old tools away (and older programmers under the bus)? Absolutely not. On the contrary, it’s the ability to stand on the shoulders of giants that makes us able to grow (as a class) at such a rate. Improved tools and accumulated knowledge deliver exponential value, but there’s a lot of knowledge that is rarely learned except over a decades-long career. Most fresh Stanford PhDs wouldn’t be able to implement a performant, scalable support vector machine from scratch, although they could recite the theory behind one. Your gray-haired badasses would be rusty on the theory but, with a quick refresh, stand a much greater chance of building it right. Moreover, the best old ideas tend to recur and long-standing familiarity is an advantage. The most exciting new programming language right now is Clojure, a Lisp that runs on the Java Virtual Machine. Lisp, as an idea, is over 50 years old. And Clojure simply couldn’t have been designed by a 25-year-old in Palo Alto. For programmers, the general trend is a 12% per year increase in productivity; but individuals can reliably do 30 percent or more, for periods spanning decades.

If the business value of a mid-level programmer in 1999 was $142,000 in today’s dollars, then one can argue that today, with programmers roughly 5.5 times more productive, the true value is $779,000 per year at minimum. It might be more. For the highly competent and for more senior programmers, it certainly is higher. And here’s another thing: investors and managers and VPs of marketing didn’t create that surplus. We did. We are more than five times as productive as we were in the 1990s not because they got better at their jobs (they haven’t) but because we built the tools to make ourselves (and our successors) better at what we do. By rights, it’s ours.
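Putting those two numbers together, under the same assumptions:

```python
# The headline lower bound: 1999 value, in 2014 dollars, times the productivity multiplier.
value_1999 = 142300          # mid-career business value in 1999, expressed in 2014 dollars
multiplier = 1.12 ** 15      # ~5.47
print(value_1999 * multiplier)   # ~779,000 per year
```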

Is it reasonable, or realistic, to argue that mid-career software engineers ought to be earning close to a million dollars per year? Probably not. It seems to be inevitable, and also better for society, that productivity gains are shared. We ought to meet in the middle. That we don’t capture all of the value we create is a good thing. It would be awful, for example, if sending an email cost as much as sending a letter by post or, worse yet, as much as using the 19th-century Pony Express, because the producers of progress had captured all of the value for themselves. So, although that $779,000 figure adequately represents the value of a decent mid-career engineer to the business, I wouldn’t go so far as to claim that we “deserve” to be paid that much. Most of us would be ecstatic with real equity (not that 0.05% VC-istan bullshit) and a quarter of that number– and with the autonomy to deliver that kind of value.

Meritocracy is the software engineer’s Prince Charming (and why that’s harmful).

One of the more harmful ideas peddled to women by romance novels and the older Disney movies is the concept of “Prince Charming”, the man who finds a young girl, sweeps her off her feet, and takes care of her for the rest of her days. It’s not a healthy concept, insofar as it encourages passivity as well as perfectionism in mates. But it also encourages women to make excuses for bad (and often abusive) men. Because the archetype is so unreasonable, men who can make themselves seem to fulfill it are the manipulative and sometimes abusive ones, not genuinely good (but flawed) men. I’d argue that software engineers have a similar Prince Charming.

It might begin as a search for “a mentor”. Savvy software engineers take knowledge and favor from multiple people, but every Wall Street or Silicon Valley movie showcases a mentor/protege relationship as the path to success. Meet this magical person, and he’ll take care of your career from there on out. That doesn’t exist for most people, either, and most software engineers learn that around age 25. Their counterreaction is to develop a bizarre self-reliance in which they start refusing help, wanting to work alone, and denigrating those who advance their careers based on “politics” or “connections”. Having too much dignity to wait for a magical mentor to rescue them from mediocrity, they insist on their new Prince Charming, an impersonal force that will recognize and elevate talent: meritocracy.

The problem with meritocracy is that every organization claims to be one, yet almost all organizations are deeply political. Software engineers are not a subtle breed, so I imagine they assume that non-meritocracies recognize themselves as such, and would admit as much; that’s clearly not true. Oil companies, banks, startups and dysfunctional academic bureaucracies all have this in common: they believe in their own meritocracy. Otherwise, they wouldn’t be self-consistent and stable. “We’re a meritocracy” means nothing. And what is “merit”? Organizations make promotion decisions not to recognize some abstract principle of “merit”, but based on what is perceived to be in the short-term, narrow interest of the organization. It’s not what software engineers mean when they use the term merit, but one could argue that political acumen is organizational merit. The people who are promoted within and end up dominating organizations are… those most able to convince organizations to promote them, whether through delivering objective value or by trickery and intimidation. It’s a self-referential, Darwinian sense of “merit” akin to “fitness”. Darwinian fitness is not a matter of good or bad, or of anything other than the ability to self-replicate.

Of course, I know what software engineers mean when they say they want to live in a “meritocracy”. They want important decisions that affect their working lives to be made by the right people. The problem is that the ability to make good executive decisions is almost impossible to measure, reliably, especially on a timeframe that businesses would consider acceptable. Political machinations can happen, on the other hand, in split seconds. Saying something stupid in a meeting can end someone’s career, even if that person is, in general, a good decision-maker. It takes too long to select leaders based on the quality of their decisions, so organizations develop political side games that end up consuming more energy, time and attention (especially at high levels) than the actual work or purpose of the organization. Generally, this side game takes on the feeling of a war of attrition. Nonsensical pressures and busywork are added until people embarrass themselves out of contention, or their health fails, or they leave to pursue better options, leaving one person standing. Software isn’t different from that, with the long hours and posturing machismo and general disregard for health.

By believing in meritocracy, software engineers trick themselves into making excuses for awful companies and bad bosses that hurt their careers, destroy their confidence, and unapologetically exploit them. When they enter organizations, they tend (at least, when young) to want to believe in the self-professed “meritocracy”, and it’s hard to let such an idea go even in the face of adverse evidence. When these engineers are betrayed, it’s practically an ambush.

Older, savvier engineers know that few workplaces are meritocracies. In general, the claim of “meritocracy” is nothing more than a referendum on the leadership of the company. For this reason, it’s only in the midst of an open morale crisis (in which firing the obviously unhappy people isn’t viable because almost everyone is obviously unhappy) that one can admit to the organization’s non-meritocracy.

The expensiveness of it all

Software engineers’ belief in meritocracy costs them money and career advancement. Because they conflate their organizational position (low, usually) with personal merit, their confidence falls to zero. Computer programming, if marketed properly, ought to be “the golden skill” that allows a person unlimited mobility within industry. However, we’ve allowed the businessmen who’ve colonized us to siloize us with terms like DBA, operations, data scientist, etc., and use those to deny opportunities, e.g. “you can’t take on that project, you’re not a real NLP programmer”. As a class, we’ve let these assholes whittle our confidence down to such a low level that our professional aura is one either of clueless youth or depressive resignation. When they beat us down, we tend to blame ourselves.

Our belief in meritocracy hurts us in another way, in that we justify things being unduly hard on us. We hate the idea of political promotion. Perhaps, on first principles, we should. What this means is that engineers are promoted “confirmationally” rather than “aspirationally”. In HR-speak, confirmational promotion means that people are given formal recognition (and the organizational permission to operate at the level they have been) once they’re already working at the level signified by the title. Aspirational promotion means that people are promoted based on potential, but this opens the door for a host of clearly political promotions. On paper, confirmational promotion is superior, if infuriatingly slow. (To work at the next level before being recognized, people have to blow off their assigned duties and take unrequested risks.) Engineers, of course, prefer confirmational regimes. And what’s wrong with that?

Engineers don’t like to negotiate, they don’t like politics, and they’re against favoritism. Most have a proud self-reliance that would leave them uncomfortable even if personally favored. They’re also, in general, irreverent toward title as long as they believe they’re fairly paid. To them, confirmational promotion is right. The problem? Everyone but engineers is promoted aspirationally. Engineers need long, completed, successful projects to get bumped to the next level. What, pray tell, does it take to become VP of Product or Senior Manager as opposed to Manager, or to rise on just about any of the nontechnical tracks, in most tech companies? Absolutely nothing. There is no fucking magic there. You have to convince someone to “see something” in you. That is, you have to play politics.

To the engineer’s chagrin, playing politics comes easily to most ambitious people. It sure isn’t rocket science. Getting over one’s own moral objections is, for most people, the hardest part. The result of this is that nontechnical tracks, including management tracks that often cross over engineers, are characterized by easy aspirational promotion driven by favoritism and politics. The “meritocratic” engineering track, characterized by confirmational promotion, is clearly much more difficult. There are people over 35, with IQs over 140, who haven’t made “senior engineer”, for fuck’s sake. (At a “mere” 125 IQ, you’re smarter than 97% of the nontechnical VPs at most tech companies.) And this is a point of pride for software engineers: it’s really hard to climb the ladder, because one is competing with the smartest people in the organization, and because while favoritism exists, political promotions are much rarer on the engineering track than on non-technical tracks (largely because promotions in general are rarer).

This is something that software engineers don’t really get. What do job titles actually mean in organizations? Companies will say that “Director” means one thing and “VP” means another, with some jargon about “the big picture” and a person’s responsibilities within the organization. The truth is that they mean very little, other than serving as political tokens that prove the person was able to get them. “Director” means, “he was able to negotiate a salary between $X and $Y from HR”. Not more.

Where it leads

If you ask an engineer whether he thinks he’s ready to be VP of Engineering or CTO, you’ll get a half-hearted, self-deprecating answer. “You know, I might be ready to lead a small team, but I’m not sure I’m at the VP/Eng level yet.” Cluelessly, he believes that “the VP/Eng level” exists objectively rather than politically. On the other hand, if you ask a nontech the same question, he’ll take it without hesitation. Even if he’s terrible at the job, he gets a generous severance (he’s a VP) and will fail up into a better job. The relevant concept here is the effort thermocline, or the level in an organization where jobs stop being harder with increasing rank, but become easier (although more political). It can be politically difficult to get a job above the effort thermocline, but it’s ridiculously easy to keep it. At that point, one has power and credibility within the organization sufficient that one cannot, personally, fail due to a lack of effort.

Nontechs, except for clueless people in their 20s who haven’t figured out what they want to do, go into work with one purpose: to get promoted beyond the effort thermocline. That’s not to say that they’re all unambitious or lazy. They’re just realistic about how the game works. Even if you want to work hard, you don’t want hard work to be expected of you. If you’re an SVP and you show up for work every day and put in an honest effort, you get credit for it. If you’re a worker bee, you get nothing for your 8-or-more hours per day. It’s just what you’re expected to do.

Above the effort thermocline, promotion is political, and people stop pretending otherwise. When you get “into the club”, you’re permitted to speak frankly (and hear frank speech) about how the organization actually works. The issue with the engineer’s mind is that it clings to a belief in right and wrong. It’s moralistic. It struggles to accept what people really are. Engineers don’t want to contend with the basic fact of most organizations, which is that they’re politically corrupt and dysfunctional, because most people are lazy, greedy, and weak. I’d likewise argue that this is connected to the low levels of acquired social skills in people like software engineers. It’s not a neurological disability for most. They never learn to read cues beyond a subconscious and juvenile level, because they hate what they see, which is that humans are mostly defective and that many are horrible.

Engineers don’t like the concept of the effort thermocline, or of political promotion in general. As much as they can, they refuse to have it within their ranks. I’d tend to side with the engineers. Who wouldn’t, from first principles, prefer a meritocracy over a political rat’s nest? The business responds by turning off political promotions for most engineers– while the rest of the organization continues to get them. The result is that, while they start off well in terms of pay and occupational dignity, engineers are being surpassed by the nontechs (who gleefully accept political promotions and feel none the worse for it) by age 30 and, by 40, are undervalued and way underpaid relative to their worth to their companies.

Engineering tracks in organizations are notoriously title-deflating, in comparison to the rest of the business world. Most software engineers would be appalled by how little talent and work ethic are required to become a non-technical VP at even the most esteemed tech companies. Many of these people are lazy (11-to-3 with 90-minute lunches) and just plain dumb. And by dumb, I don’t mean programmer dumb (understands the theory behind neural networks, but has never put one in production) but actual shame-to-the-family, village-idiot stupid. You know how towns in the Midwest used to bus their “defectives” to San Francisco in the mid-20th century? Well, so does the corporate world, and they end up as nontechs and upper management in tech companies.

Conclusion?

Meritocracy is the Prince Charming of the software engineer. It doesn’t exist. It never has, and it never will. Some have asked me to comment on recent HR issues occurring at open-allocation technology companies. The only thing I can say is that, yes, open-allocation companies have serious political issues; but closed-allocation companies have those same issues and more. Open allocation is strictly superior, but not a panacea. When there are people, there is politics. The best an organization can do is to be fair and open about what is going on, and hope to achieve eventual consistency.

Every organization defines itself as a meritocracy, and most engineers (at first, until they are disillusioned with a company) will tend to believe it. They aren’t stupid, so they don’t believe their companies to be perfect in that regard, but they (cluelessly) tend to believe that meritocracy is a core value of the leadership. Almost never is that the case. “We’re a meritocracy” is code for, “don’t question promotions around here”.

The Prince Charming delusion of meritocracy is dangerous because it leads people to make excuses for bad actors. Every company has to lay off or fire people, and frequently these choices are made with imperfect information and under time pressure (one large layoff is less damaging to morale than several small, measured layoffs) so often the wrong people are let go. A self-aware organization understands this and lets them go gracefully: with severance, outplacement assistance, and a positive reference. A delusional “meritocracy” has to cook the books, create psychotic policies that impede internal mobility for everyone, and generate useless process in order to build phony performance cases. In practice, just as many people are let go as in established (and less delusional) companies, but their reputations have to be demolished first, with bad project assignments and hilariously disingenuous “performance improvement plans”. Personally, I’d rather see the honest, times-are-tough layoff than the tech company’s dishonest “low performer initiatives”, much less the permanent (and destructive) rolling layoff of stack ranking.

The biggest casualty, however, of the typical engineer’s head-in-sand attitude toward political promotion is that political promotions never stop happening to everyone else. Engineers just make themselves ineligible. Engineers want promotion to be confirmational (that is, resulting from demonstrated merit) rather than aspirational (that is, based on potential and, therefore, subjective, personal, and political). The problem with this is that, after 10 to 20 years, most engineers haven’t been able to demonstrate even 20% of what they’re capable of. They kept getting crappy projects, were never allowed to finish anything, were rushed to produce work that broke under strain, and their lack of finished accomplishment (due to political forces often not their fault) left them ineligible for promotion to more senior roles, but too old to even pretend in the junior roles (hence, the age discrimination problem). After that gauntlet of false starts and misery, they’re still answering to nontechnical people and executives who had the benefit of aspirational, political promotion. By refusing to play politics and believing in the false god of meritocracy, they deprived themselves of the full spectrum of causes for advancement. Politics, however, went on regardless of whether they believed in it.

This false meritocracy is very clever when it comes to reinventing itself. Few expect a large company like, say, Alcoa or Exxon-Mobil to be a meritocracy. Engineers have figured out, as a group, that “big companies” become political. The response? Startups! Venture capital! The actual result of this has been to replace well-oiled and stable (if inefficient) corporate non-meritocracies with the mean-spirited and psychotic non-meritocracy of the VC-funded ecosystem and the feudalistic reputation economy that the leading investors, through collusion, self-dealing, and note-sharing, have created. The cheerleading of intellectually charismatic figures like Paul Graham and Marc Andreessen has managed to create a sense of meritocracy in that world, but belief in those idols also seems to be waning, and I’m proud to say that I contributed to that loss of faith.

If meritocracy is impossible, what should we do? As individuals, we need to learn to fight for ourselves. It’s not undignified or desperate or “pushy” to look out for our own interests. It’s what everyone else is doing, and we should get on board. As a collective, we need to have honest introspection on what we value and how best to achieve it. Perfect meritocracy within any organization is impossible. It is good to strive for that, but bad to believe it has been achieved anywhere. Eventual consistency and technical excellence are achievable, and we should aim for those.

Before we do anything, though, we need to learn how to fight for ourselves. Bringing frank knowledge to the good, in that fight, is what I’ve been striving to do all along.

Psychopathy and superficial reliability

Lord Acton says: judge talent at its best and character at its worst. This is a wise principle, yet it fails us miserably when misapplied, as it often is in modern society. Why is that? The world is large, so our knowledge of each other is extremely sparse. We often lack the information necessary to judge either talent or character well. The consequence of information sparsity in the judgment of talent is the existence of celebrity. It’s better to have everyone know that you’re a 6, than to be a 10 in secret. This itself is not so dangerous, but the contest for visibility, even in supposed meritocracies like the software industry, gets destructive quickly. Even in small companies, more effort is often expended to gain control of the division of labor (thus, one’s own visibility and reputation) than is spent actually completing the work. The fact that awful people are excellent at office politics is so well-known that it requires no documentation. It becomes visible within the first 6 months of one’s working life. This makes assessment of character as important as the judgment of skill and talent. Is the guy with the flashy resume a legitimate 99.99th-percentile talent, or a degenerate politicker and credit-taker who managed to acquire credibility? Reference checking is supposed to solve that, and it doesn’t work. I’ll get to that, a little bit later.

Information sparsity in the assessment of talent is a known danger, but I tend to see it as a short-term and minor threat. There’s probably an eventual consistency to it. Over time, people should converge to levels of challenge, responsibility, and influence commensurate with their ability. More dangerous, and infinitely more intractable, is the information sparsity that pertains to character. People tend to overestimate, by far, their ability to judge other people’s ethical mettle. In fact, the vast majority of them are easy to hack, and their excessive confidence in their own assessment is, in truth, easily used against them by the bad actors.

This problem is pretty much impossible to solve. Most people know from experience that the worst people– the psychopaths– are superficially charming, which means that personal impressions are of low value. What about getting access to the person’s history? In employment, that’s what reference checks are for, but shady characters often have great references. Why? Because they lie, extort, and manipulate people until their histories become not only socially acceptable but outright attractive. They hack people with as much skill and malice as the worst black-hat “crackers”. The people who are harmed by intensive reference checks are honest people with difficult histories, not the degenerate and dishonest who are the real threat.

My experience is that people lack the tools to judge others for character, at least at scale. Any fair punitive structure is predictable, and the most skilled of the bad actors will adapt. Any unpredictable punitive structure will be unfair, and rely on decisions made by influential humans, who are more likely than average to be psychopaths, and will certainly have psychopathic courtiers (whom the powerful person has not yet detected). The best one can do is to judge people by their actions, and to punish bad deeds swiftly and objectively. This is not a trivial art, of course.

Laws and imprisonment serve this punitive purpose, but most of the people in our jails are impulsive people of low social class, with only moderate overlap between the imprisoned population and the psychopaths. In employment, there’s a naive hope that, while psychopaths can climb high within corporations, they will eventually be unable to escape their histories and be flushed out of respectable careers. It never happens that way. Moral degenerates don’t get blacklisted. They acquire power and do the blacklisting.

One acquired strategy for dealing with such people is “Distrust everyone”. That’s how most seasoned managers and executives, having been robbed a couple times by dishonest subordinates, tend to view the people below them– with implicit, prevailing distrust. That strategy fails especially badly. Why? First, there are degrees of trust and distrust. Becoming a managerial favorite (managers are not always psychopaths, but managerial favorites almost always are) simply requires superiority in relative trust, not any level of absolute trust. Second, it’s functionally impossible to get a complex job done (much less lead a team) with prevailing total distrust of everyone, so people who “distrust everyone” are desperate for people they can give partial trust. Psychopaths play people with that attitude quite easily. It’s not even work for them. A boss who thinks his subordinates are all morons is surprisingly easy to hack.

The conclusion of all this is that, in defending scalable institutions such as corporations against psychopaths, we’re basically helpless. We don’t have the tools to detect them based on affability or social proof, and any strategy that we devise to deal with them, they will subvert to their own ends. We can’t “beat” them when they look exactly like us and will be undetected until it’s too late. Our best shot is not to attract them, and to avoid engaging in behaviors that make our institutions and patterns most easily hackable.

Despite our complete lack of ability to assess individuals for character at scale, we develop metrics for doing so that often not only fail us, but become tools of the psychopath. A going assumption that people make is that the small is indicative of the large. If Fairbanks is chilly in the summer, it must be frigid in the winter. (This applies to most climates, but not to San Francisco.) People who make occasional misspellings in email must be stupid. People who have mediocre accomplishments (by adult standards) at young ages are destined for adult brilliance. People who regularly take 75-minute lunches are “time-stealing” thieves.

Talent is judged in the workplace based on minor accomplishments, largely because there are so few opportunities for major accomplishment, and those are only available to the well-established. The guy who reliably hits a “6” is judged to be capable of the “9” (see: Peter Principle) while the one who gets bored and starts dropping “5”s is flushed out. Character is judged, similarly, based on useless and minor signals. The person who regularly arrives at 9:00, never says the wrong thing, and projects the image of a “team player” (whatever the fuck that means) gets ahead. What takes the place of character– which, I contend, cannot be assessed at scale and amid the extreme information sparsity of modern society– is superficial reliability. The people who pass what a company thinks are character and “culture fit” assessments are, rather than those of pristine character, the superficially reliable.

Who wins at this game? I wouldn’t say that it’s only psychopaths who win, but the best are going to be the psychopaths. The earnestly honest will break rules (formal and informal) to get work done. They care more about doing the right thing than being perceived the right way. Psychopaths are not by-the-word rule-followers with regard to formal policies, but they always follow the informal social rules (even to the breach of formal and less-powerful informal rules). They figure them out the quickest, have few distractions (since they rarely do actual work; that’s not what the office game is about!) and, fairly early on, find themselves in the position to make those rules. 

Superficial reliability works in favor of the worst people. Why? It evolves into a competition. Once everyone is in the office from 9:00 to 6:00, the new standard becomes 8:00 to 7:00. Then it’s 7:00 to 8:00, with expected email checking to 11:00. People start to fail. The noncompliant are the first to drop away and are judged by the organism (the team, management) to have been the least dedicated, so it’s not seen as a loss. The next wave of failures are the enervated compliant, who meet the increasingly difficult standards but underperform in other ways. They spend their 13 hours at the office, but start making mistakes. They turn into temporary incompetents, and are flushed out as well. They’re not seen as a loss either. “We have a tough culture here.” As those burn off, people who were formerly at the center of the bell curve (in reliability, status, and performance) are now on the fringe, which means that there’s an atypically large set of people on the bubble, generating a culture of anxiety. They become catty and cutthroat now that the middle is no longer a safe place to be. People fight, and some come out of it looking so terrible that their reputations are ruined. They leave. Psychopaths rarely enter these contests directly, but evolve into puppet masters and arms dealers, ensuring that they win regardless of each battle’s outcome. Soon, the psychopath has entrenched himself as a key player in the organization. He’s not doing most of the work, but he’s feared by the actual stars, enough that they’ll let him take credit for their work and (in management’s eyes) become one.

Most reliability contests work this way. There’s some performance metric where the bottom few percent justly deserve to be fired. As a limited measure, such a “sweep” is not a bad idea. (“Let’s stop paying the people who never show up.”) Management, however, is not measured or limited. It’s faddish, impulsive, absolute, and excessive. Whatever policy is used to separate out the true underperformers (about 2%) must also be used to “stack rank” the other 98 percent. It’s no longer enough to enforce an honest 8-hour day; we must require an 11-hour day. This overkill damages the work environment and culture, and psychopaths thrive in damaged, opaque, and miserable environments.

Another example is reference checking in employment. The real purpose of the reference check is to discourage the morally average from lying about their histories, and it works. The moral “middle man” at the center of the ethical bell curve would probably lie on his resume given the right incentives, but would stop short of asking 3 friends to support the lie by posing as peers at jobs the person did not hold. Most people won’t make that kind of demand of people who aren’t close to them, but few people want to be seen as unethical by close colleagues. That is the point where the average person says, “Wait a minute, this might be wrong.” The classic, three-reference check also filters out the honest but undesirable candidates who just can’t find three people to recommend their work. It’s a reliability test, in that anyone who can’t find 3 people from his last 5 years to say good things about him is probably in that bottom 2% who are undesirable hires for that reason alone. Yet, at senior ranks in large companies, reference checking becomes a reliability contest, with 10 to 20 references– including “back channel” references not furnished by the candidate– being required. At that point, you’re selecting in favor of psychopaths. Why? Most honest people, playing fair, can’t come up with 20 references, nor have they engaged in the intimidation and extortion necessary to pass an intensive “back-channel” reference check in a world where even a modestly positive reference means no-hire. It’s those honest people who fail those cavity searches. A psychopath with no qualms about dishonesty and extortion can furnish 50 references. Beyond the “classic 3”, reference checking actually selects for psychopathy.

Why do psychopaths never fail out, even of reliability contests designed to cull those of low character? The answer is that they have a limited emotional spectrum, and don’t feel most varieties of emotional pain, which makes them exceptionally good at such contests. They don’t become upset with themselves when they produce shoddy work– instead, they plan a blame strategy– so they don’t mind 15-hour days. (Office politics is a game for them, and one they love to play, so long hours don’t bother them.) They are emotionally immune to criticism as well. While they care immensely about their status and image, they have no reason to fear being dressed down by people they respect– because they don’t actually respect anyone. Psychopaths seem to despise losing, given the awful things they will do to avoid a minimal loss, but even defeat doesn’t faze them for long. (The perception that they desperately hate losing is erroneous; when we see psychopaths doing awful things to avoid minor losses, we assume they must hate losing because we would require extreme circumstances in order to do such bad things. In truth, the difference is that they have no internal resistance against bad action.) Losses do not depress or hamper them. They pop right back up. Psychopaths are unbeatable. You can’t find them out until it’s too late, and whatever you try to kill them with is just as likely to hit someone innocent. Indeed, they thrive on our efforts to defeat them. When they are finally caught and brought down, our punishments are often useless. There is truly “no there there” to a psychopath, and they have nothing to lose.

As an aside, I am not saying that we are powerless to curtail, punish, or rehabilitate the larger category of “bad actors”. Laws, social norms, and traditional incentives work well for normal people. Petty theft, for example, is rare because it is punished. Plenty of non-psychopaths would– out of weakness, desperation, curiosity, or even boredom– steal if they could get away with it. Jail time deters them. Prison is an environment to which normal people adapt poorly, and therefore an undesirable place to be. Psychopaths are different in many ways, one of which is that they are extremely adaptive. They love environments that others cannot stand, including prisons and “tough” workplace cultures. Punishing a psychopath is very hard, given his imperviousness to emotional pain. You could inflict physical pain or even kill him, but there would be no point. He would suffer, but he would not change.

Why does psychopathy exist? Answering this question helps us understand what psychopathy is. My best guess is that it emerged out of the tension between two reproductive pressures– r- and K-selection– that existed in our evolutionary environment. An r-selective strategy is one that maximizes gross reproductive yield, or “spray and pray”. K-selective strategies are focused more on quality– fewer, more successful, offspring. The r-selective “alpha” male has a harem of 20 women and 200 children, most neglected and unhealthy. The K-selective “beta” has one wife and “only” 8 or 9 offspring, and invests heavily in their health. Neither is innately superior to the other; r-selective strategies repopulate quickly after a crisis, while K-selective, quality-focused strategies perform well in stability. Human civilization has been the gradual process of the K-strategist “betas” taking over, first with monogamy and expected paternal investment, later extended to political and economic equality (because high-quality offspring will fare better in a stable and just world than in a damaged one). Almost certainly, all humans possess a mix of “alpha” and “beta” genes and carry impulses from both evolutionary patterns, with the more civilized beta strategy winning over time, but not without a fight. Indeed, what we view as morally “good” in many societies is intimately connected with beta patterns– sexual restraint, nonviolence, positive-sum gradualism– while our concept of “sin” is tied to our alpha heritage. Psychopathy seems to be an adaptation in which the beta, or K-selective, tendencies of the mind are not expressed, allowing the alpha to run unchecked. In evolutionary terms, this made the individual more fit, although often at the expense of society.

Psychopaths (for obvious evolutionary reasons) like sex, status, and resources, but that alone doesn’t identify them, since almost everyone does. What differentiates the psychopath is an extreme present-time orientation, as well as the willingness to make ethical compromises to get those things. The future-oriented, positive-sum mentality is absent in the psychopath. Unhampered by conscience, psychopaths quickly acquire resources and power, these being key (at least, throughout most of our evolutionary history) to reproductive proliferation. In business, their sexual appetites are not of major interest. What’s most relevant to our problem is their attraction to power and status. That is what they want. It’s only about money for them insofar as money confers social status.

If we cannot defeat psychopaths, then what should we do? This turns out not to be a new problem– not in the least. Why, for example, do American elected officials draw such mediocre salaries? Why do we need all the checks and balances that make even the presidency so much damn work? Making power less attractive is one of the first principles of rational government, as the concept was developed during the Age of Reason. The reactionary clergies and hereditary aristocracies had to go– that much was clear– but how could one prevent a worse and more brutal lord from filling the vacuum? The idea was to compensate for power’s natural attractiveness by limiting it and attaching responsibilities to it. In the U.S., this even extended to location, with the nation’s capital deliberately placed in an undesirable climate. In elected politics, I would say that this has mostly worked. We’ve had some downright awful political leaders, but a surprisingly low number (by corporate comparison) of psychopaths in top political positions. I wouldn’t go so far as to say that elected office doesn’t attract them, but other positions of power attract them much more. With the first-rate psychopaths making millions in the corporate world, the psychopaths who are attracted to elected political positions are the C-students in psychopath school.

Taking a macroscopic perspective, psychopathy is a very hard problem to solve. A closed system such as a nation-state has a roughly invariant population of psychopaths who, inevitably, will be attracted to some variety of social status and dominance over other people. Flush them out of politics, and they end up in business. Yet if business were made unattractive by an overpowered state (e.g. left-wing authoritarianism), they would end up back in government. They have to go somewhere, and it is impossible to identify them until they’ve done their damage (and, often, not even then). Yet the microeconomic problem for an individual firm is much easier– don’t attract psychopaths.

In technology, one strategy is Valve-style open allocation, under which employees are permitted to work for the firm directly rather than requiring managerial approval. Want to change projects? Move your desk and start. The typical extortion that middle managers use to build their careers– work for me or you don’t work here at all– doesn’t exist, because no one has that authority. Managerial authority attracts psychopaths like little else– more than money or prestige– and if one can do without it, one should consider doing so.

Much of the appeal of startups in technology is the perception (sometimes an inaccurate one) that small technology companies haven’t yet been corroded and politicized by managerial extortions. In the ideal case, a startup operates under a constrained open allocation. It’s not yet “work on whatever you want”, because the startup requires intense focus on solving a specific problem, but employees are trusted to manage their own contribution. When do those companies go to closed allocation? Often, “hot” companies lose their cultural integrity in the process of hiring executives. The flashy career office-politician with impressive titles and “a track record” demands authority from the get-go, and it’s given to him. Five direct reports is not enough; he demands ten. He gets fifteen. Over time, employees lose status and autonomy as both are chipped away to feed these people.

Most of the cultural losses that companies endure as they grow are suffered in the quest to hire executives from the outside, but what kind of person are you going to attract if you’re immediately willing to sell off your employees’ autonomy to “close a deal”? The people you’re most likely to get are those who enjoy power over people. Not all of these are psychopaths (some are mere narcissists or control freaks) but many are. Your culture will disappear rapidly.

If you’re running a typical VC-funded, build-to-flip operation, then hiring power-hungry external executives might be the way to go. A great way to buy an important decision-maker (an investor, an executive at an acquirer) is to give his underperforming friend an executive position at your company. You might take on a psychopath or few, but you’re not going to be in the company for very long, so it’s not your concern. On the other hand, if you want to build a stable company whose culture and values will still be worth a damn in 20 years, then you can’t do that. To the extent that your organization needs positions of power to function, you need to make them undesirable to the psychopath. This is one of the major reasons why you need intrinsic limits (checks and balances) on power.

Unfortunately for corporate executives, making a company less psychopath-friendly means equalizing the distribution of power and reward within companies. It means moving away from the CEO-as-king model and the eight-figure pay packages. Over the past forty years, we’ve been paying more and getting less when it comes to corporate management. Flushing out the psychopaths requires that we pay less, both financially and in terms of authority over other people, for managerial positions. The whole concept of what it means to be an “executive” will require reinvention as radical as the replacement of hereditary monarchs by elected legislators.

The stupid, superficial reliability contests that corporations use to assess character and protect themselves against psychopaths don’t work. In fact, they do the opposite, becoming the psychopath’s favorite tools. Companies that want to avoid being invaded and controlled by such people will have to reinvent themselves in a form radically unlike the traditional, hierarchical corporation.

We should pay people not to subordinate

In the very long term, technological society will need to implement a basic income, because full employment will eventually become untenable due to the need for ongoing, intensive, and traditionally unpaid training. Basic income (BI) is an income paid to all people, with no conditions. Alaska already has a small one, derived from its oil wealth.

Today, I’m not going to talk about basic income, because we’re probably a couple of decades away from society absolutely needing one, and even farther from one being implemented, given the monumental political hurdles such an effort would encounter. Instead, I’m going to talk about right now– January 7, 2013– and something society needs to do in order to maintain our capacity to innovate and to prevent a pointless and extreme destruction of human capital.

Peter Thiel has created a program (“20 Under 20”) that pays high-potential young people to skip college, but the entry-level grunt work most people spend the first few years of their careers on is, in my opinion, much more damaging, especially given its indefinite duration. (I don’t think undergraduate college is that damaging at all, but that’s another debate.) There is some busywork in college, and there are a few (but very rare) incompetent professors, but more creativity is lost during the typical workplace’s years-long dues-paying period, which habituates people to subordination, than is lost to any educational program. I do not intend to say that there aren’t problems with schools, but the institutions for which the schools prepare people are worse. At least grading in school is fair. A professor as corrupt and partial in grading as the typical corporate manager would be fired– and professors don’t get fired often.

In terms of expected value (that is, the average performance one would observe given an indefinite number of attempts) the market rewards creativity, which is insubordinate. However, when it comes to personal income, expectancy is completely meaningless, at least for us poors who need a month-to-month income to pay rent. Most people would rather have a guaranteed $100,000 per year than a 1-in-1000 shot (every year) at $500 million, with a 99.9% chance of no income, even though the latter deal has more expectancy in it. Risk-adjusted, people of average means are rewarded for taking stable jobs, which often require subordination.
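To put rough numbers on that example: the lottery-style deal has five times the expected value, yet almost any plausible degree of risk aversion favors the guaranteed salary. The sketch below is a minimal illustration only; the log-utility model and the $20,000 subsistence floor are my own assumptions, not claims from this essay.

```python
# Worked numbers for the salary-vs-lottery example above.
import math

guaranteed = 100_000
p_win, jackpot = 1 / 1000, 500_000_000

ev_lottery = p_win * jackpot          # $500,000 per year in expectation, > $100,000
print(ev_lottery)

# Under a concave (risk-averse) utility, the safe salary wins easily.
base = 20_000                         # hypothetical subsistence floor, an assumption
u_safe = math.log(base + guaranteed)
u_lottery = p_win * math.log(base + jackpot) + (1 - p_win) * math.log(base)
print(u_safe > u_lottery)             # True: risk-adjusted, the stable job dominates
```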

Technically speaking, people are paid for work, not subordination, but the process that exists to evaluate the work is so corrupt and rife with abuse that it devolves into a game that requires subordination. For a thought experiment, consider what would happen to a typical office worker who, without subversion or deception to hide her priorities, did the following:

  • worked on projects she considers most important, regardless of her manager’s priorities,
  • prioritized her long-term career growth over short-term assignments, and
  • expressed high-yield, creative ideas regardless of their political ramifications.

These activities are good for society (she becomes better at her job), obviously good for her, and even good for her company. However, this course of action is likely to get her fired. Certainly, there’s enough risk of that to invalidate the major benefit of being an employee, which is stability.

So, in truth, society pays people to be subordinate, and that’s a real problem. In theory, capitalist society pays for valuable work, but the people trusted to evaluate the work inevitably become a source of corruption as they demand personal loyalty (which is rarely repaid in kind) rather than productivity itself. Worse, the long-term effect of subordination is creative atrophy. To quote Paul Graham, in “You Weren’t Meant to Have a Boss”:

If you’re not allowed to implement new ideas, you stop having them. And vice versa: when you can do whatever you want, you have more ideas about what to do. So working for yourself makes your brain more powerful in the same way a low-restriction exhaust system makes an engine more powerful.

I would take this even farther. I believe that, after a certain age and constellation of conditions, creativity can be lost effectively forever. People who keep their creativity up don’t lose it– and lifelong creative people seem to peak in their 50s or later, which should kill the notion that it’s a property of the young only– but people who fall into the typical corporate slog develop a mindset and conditioning that render them irreversibly dismal. It only seems to take a few years for this to happen. Protecting one’s creativity practically demands insubordination, making it almost impossible to win the corporate ladder and remain creative. This should explain quite clearly the lack of decent leadership our society exhibits.

We should offset this by finding a way to reward people for not subordinating. To be clear, I’m not saying we should pay people not to work. In fact, that’s a terrible idea. Instead, we should find a repeatable, robust, and eventually universal way to reward people who work in non-subordinate, creative ways, thereby rewarding the skills that our society actually needs, instead of the mindless subordination that complacent corporations have come to expect. By doing this, we can forestall the silent but catastrophic loss that is the wholesale destruction of human creative capital.

Why I wiped my LinkedIn profile

I wiped my LinkedIn profile recently. It now says:

I don’t reveal history without a reason, so my past jobs summary is blank.

I’m a New York-based software engineer who specializes in functional programming, machine learning, and language design.

This might not be the best move for my career. I’m mulling over whether I should delete the profile outright, rather than leaving a short note that appears cagey. I have a valid point– it really isn’t the rest of the world’s business what companies I have worked for– but I’m taking an unusual position that leaves me looking like a “tinfoiler”. I’m honestly not one, but I do believe in personal privacy. Privacy’s value is as insurance against low-probability, high-impact harms. I don’t consider it likely that I’ll ever damage myself by publicly airing my past employment history. It’s actually very unlikely. But why take the chance? I am old enough to know that not all people in the world are good, and that fact requires caution in the sharing of information, no matter how innocuous it might seem.

Consistency risk

My personal belief is that more people will damage their careers through respectable avenues such as LinkedIn than on Facebook, the more classic “digital dirt” culprit. For most jobs, no one is going to care what a now-35-year-old software engineer said when he was 19 about getting drunk. Breaking news: all adults were teenagers, and teenagers are sometimes stupid! On the other hand, people can be burned by inconsistencies between two accounts of their career histories. Let’s say that someone’s CV says “March 2003 – February 2009” while his LinkedIn profile says “March 2003 – November 2008”. Uh-oh. HR catches the discrepancy, flags it, and brings the candidate in for a follow-up interview, where the candidate discloses that he was on severance (technically employed, but with no responsibilities) for 3 months. There was no lie. It was a benign difference of accounting. Still, the candidate has now disclosed receipt of a severance payment. There’s a story there. Whoops. In a superficial world, that could mean losing the job offer.

This isn’t a made-up story. The dates were different, but I know someone who ended up having to disclose a termination because of an inconsistency of this kind. (In the case I’m aware of, LinkedIn wasn’t the culprit.) So consistency risk is real.

Because the white-collar corporate world has so little in the way of actual ethics, the appearance of being ethical is extremely important. Even minor inconsistencies invite a kind of scrutiny that no one wishes to tolerate. The career oversharing that a lot of young people engage in is something I find quite dangerous. Not everything that can damage a person’s reputation is a drunk picture. Most threats and mistakes are more subtle than that, and consistency risk is a big deal.

Replicating a broken system

My ideological issue with LinkedIn, however, isn’t the risk involved. I’ll readily concede that those risks are very mild for the vast majority of people, and the benefits of using such a service quite possibly outweigh them. The bigger problem I have with it is that it exists to replicate broken ways of doing things.

In 2013, the employment market is extremely inefficient in almost all domains, whether we’re talking about full-time jobs, consulting gigs, or startup funding. It’s a system so broken that no one trusts it, and when people distrust front-door channels or find them clogged and unusable, they retreat to back-door elitism and nepotism. Too much trust is given to word-of-mouth references (that are slow to travel, unreliable, and often an artifact of a legal settlement) and low-quality signals such as educational degrees, prestige of prior employers, and durations of employment. Local influences have a pernicious effect, the result of which is unaffordable real estate in virtually any location where a career can be built. Highly-qualified people struggle to find jobs– especially their first engagements– while companies complain of a dearth of appropriate talent. They’re both right, in a way. This is a matching problem related to the “curse of dimensionality“. We have a broken system that no one seems to know how to fix.

LinkedIn, at least in this incarnation, is an online implementation of the old-style, inefficient way of doing things. If you want an impressive profile, you have to troll for recommendations and endorsements, trade them, and, if you’ve had a bad separation, use the legal system to demand them in a settlement. You list the companies where you worked, job titles, and dates of employment, even if you honestly fucking hate some of those companies. We’ve used the Internet to give wings to an antiquated set of mechanics for evaluating other people, when we should be trying to do something better.

None of this is intended as a slight against LinkedIn itself. It’s a good product, and I’m sure they’re a great company. I just have an ideological dislike– and I realize that I hold a minority opinion– for the archaic and inefficient way we match people to jobs. It doesn’t even work anymore, seeing as most resumes are read for a few seconds then discarded.

Resumes are broken in an especially irritating way, because they often require people to retain a lasting association with an organization that may have behaved in a tasteless way. I have, most would say, a “good” resume. It’s better than what 98 percent of people my age have: reputable companies, increasing scope of responsibility. Yet it’s a document through which I associate my name with a variety of organizations. Some of these I like, and some I despise. There is one I would prefer the world never know I was associated with. Of course, if I’m asked, “Tell me about your experience at <X>” in a job interview, for certain execrable values of X, social protocol forbids me from telling the truth.

I’ll play by the rules, when I’m job searching. I’ll send a resume, because it’s part of the process. Currently, however, I’m not searching. This leaves me with little interest in building an online “brand” in a regime vested in the old, archaic protocols. Trolling for endorsements, in my free time, when I’m employed? Are you kidding me?

The legitimacy problem

Why do I so hate these “old, archaic protocols”? It’s not that I have a personal problem with them. I have a good resume, strong accomplishments for someone of my age, and I can easily get solid recommendations. I have no personal gripe here. What bothers me is something else, something philosophical that doesn’t anger a person until she thinks of it in the right way. It’s this: any matching system between employers and employees has to answer questions of legitimacy, and the existing one gets some core bits seriously wrong.

What are the most important features of a person’s resume? For this exercise, let’s assume that we’re talking about a typical white-collar office worker, at least 5 years out of school. Then I would say that “work experience” trumps education, even if that person has a Harvard Ph.D. What constitutes “work experience”? There’s some degree of “buzzword compliance”, but that factor I’m willing to treat as noise. Sometimes, that aspect will go in a candidate’s favor, and sometimes it won’t, but I don’t see it conferring a systemic advantage. I’m also going to say that workplace accomplishments mean very little. Why? Because an unverifiable line on a resume (“built awesome top-secret system you’ve never heard of”) is going to be assumed, by most evaluators, to be inflated and possibly dishonest. So the only bits of a resume that will be taken seriously are the objectively verifiable ones. This leaves:

  • Company prestige. That’s the big one, but it’s also ridiculously meaningless, because prestigious companies hire idiots all the time. 
  • Job titles. This is the trusted metric of professional accomplishment. If you weren’t promoted for it, it didn’t happen.
  • Length of tenure. This one’s nonlinear, because short tenures are embarrassing, but long stints without promotions are equally bad.
  • Gaps in employment. Related to the above, large gaps in job history make a candidate unattractive.
  • Salary history, if a person is stupid enough to reveal it.
  • Recommendations, preferably from management.

There are other things that matter, such as overlap between stated skills and what a particular company needs, but when it comes to “grading” people, look no farther than the above. Those factors determine where a person’s social status starts in the negotiation. Social status isn’t, of course, the only thing that companies care about in hiring… but it’s always advantageous to have it in one’s favor.

What’s disgusting and wrong about this regime is that all of these accolades come from a morally illegitimate source: corporate management. That’s where job titles, for example, come from. They come from a caste of high priests called “managers” who are anointed by a higher caste called “executives” who derive their legitimacy from a pseudo-democracy of shareholders who (while their financial needs and rights deserve respect) honestly haven’t a clue how to run a company. Now, I wouldn’t advise people to let most corporate executives around their kids, because I’ve known enough in my life to know that most of them aren’t good people. So why are we assigning legitimacy to evaluations coming from such an unreliable and often corrupt source? It makes no sense. It’s a slave mentality.

First scratch at a solution

I don’t think resumes scale. They provide low-signal data, and that fails us in a world where there are just so many of the damn things around that a sub-1% acceptance rate is inevitable. I’m not faulting companies for discarding most resumes that they get. What else would they be expected to do? Most resumes come from unqualified candidates who bulk-mail them. Now that it’s free to send a resume anywhere in the world, a lot of people (and recruiters) spam, and that clogs the channels for everyone. The truth, I think, is that we need to do away with resumes– at least of the current form– altogether.

That’s essentially what has happened in New York and Silicon Valley. You don’t look for jobs by sending cold resumes. You can try it, but it’s usually ineffective, even if you’re one of those “rock star” engineers who is always in demand. Instead, you go to meetups and conferences and meet people in-person. That approach works well, and it’s really the only reliable way to get leads. This is less of an option for someone in Anchorage or Tbilisi, however. What we should be trying to do with technology is to build these “post-resume” search avenues on the Internet– not the same old shit that doesn’t work.

So, all of this said, what are resumes good for? I’ve come to the conclusion that there is one very strong purpose for resumes, and one that justifies not discarding the concept altogether. A resume is a list of things one is willing to be asked about in the context of a job interview. If you put Scala on your resume, you’re making it clear that you’re confident enough in your knowledge of that language to take questions about it, and possibly lose a job offer if you actually don’t know anything about it. I think the “Ask me about <X>” feature of resumes is probably the single saving grace of this otherwise uninformative piece of paper.

If I were to make a naive first scratch at solving this problem, here’s how I’d “futurize” the resume. Companies, titles, and dates all become irrelevant. Leave that clutter off. Likewise, I’d ask that companies drop the requirement nonsense where they list 5 years of experience in a 3-year-old technology as a “must have” bullet point. Since requirement sprawl is “free”, it happens, and because few people meet any sufficiently long requirement list to the letter, such lists end up selecting against the people who actually read them. Instead, here’s the lightweight solution: allocate 20 points. (The reason for the number 20 is to impose granularity; fractional points are not allowed.) For example, an engineering candidate might put herself forward like so:

  • Machine learning: 6
  • Functional programming: 5
  • Clojure: 3
  • Project management: 3
  • R: 2
  • Python: 1

These points might seem “meaningless”, because there’s no natural unit for them, but they’re not. What they show, clearly, is that the candidate has a strong interest (and is willing to be grilled for knowledge) in machine learning and functional programming, moderate experience in project management and with Clojure, and a little bit of experience in Python and R. There’s a lot of information there, as long as the allocation of points is done in good faith; if it isn’t, that person won’t pass many interviews. Job requirements would be published in the same way: assign importance to each skill according to how much it really matters, and keep the total at 20 points.

Since the points have different meanings on each side– for the employee, they represent fractions of experience; for the company, they represent relative importance– it goes without saying that a person who self-assigns 5 points in a technology isn’t ineligible for a job posting that places an importance of 6 for that technology. Rather, it indicates that there’s a rough match in how much weight each party assigns to that competency. This data could be mined to match employees to job listings for initial interviews and, quite likely, this approach (while imperfect) would perform better than the existing resume-driven regime. What used to involve overwhelmed gatekeepers is now a “simple matter” of unsupervised learning.
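As a rough illustration of how that matching might work, here is a minimal sketch. The scoring rule (cosine similarity over the shared skills in the two 20-point allocations) and every name in it are my own assumptions, not part of the proposal above; a real system would also need to handle skill synonyms, relatedness between skills, and adversarial point allocation.

```python
# Hypothetical sketch: score the fit between a candidate's 20-point
# allocation and a job posting's 20-point importance weights.
from math import sqrt

def match_score(candidate: dict, posting: dict) -> float:
    """Cosine similarity over the skills the two allocations share."""
    shared = set(candidate) & set(posting)
    dot = sum(candidate[s] * posting[s] for s in shared)
    norm = sqrt(sum(v * v for v in candidate.values())) * \
           sqrt(sum(v * v for v in posting.values()))
    return dot / norm if norm else 0.0

candidate = {"machine learning": 6, "functional programming": 5,
             "clojure": 3, "project management": 3, "r": 2, "python": 1}
posting = {"machine learning": 8, "python": 6, "project management": 6}

print(round(match_score(candidate, posting), 2))  # about 0.67: a rough but plausible fit
```

The exact metric matters less than the principle: both sides publish sparse, comparable vectors, and the first-pass screen becomes a computation rather than a gatekeeper’s snap judgment.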

There is, of course, an obvious problem with this, which is that some people have more industry experience and “deserve” more points. An out-of-college candidate might only deserve 10 points, while a seasoned veteran should get 40 or 50. I’ll admit that I haven’t come up with a good solution for that. It’s a hard problem, because (a) one wants to avoid ageism, while (b) the objective here is sparseness in presentation, and I can’t think of a quick solution that doesn’t clutter the process up with distracting details. What I will concede is that, while some people clearly deserve more points than others do, there’s no fair way to perform that evaluation at an individual level. The job market is a distributed system with numerous adversarial agents, and any attempt to impose a global social status over it will fail, both practically and morally speaking.

Indeed, if there’s something that I find specifically despicable about the current resume-and-referral-driven job search culture, it’s in the attempt to create a global social status when there’s absolutely no good reason for one to exist.

The unbearable B-ness of software

I’m not Jack Welch’s biggest fan. For one thing, he invented the “rank-and-yank” HR policies that literally decimate companies. I don’t disagree with the idea that companies would improve their health by letting go 5-10 percent of their people per year, but I think the discovery process involved is impossible and often politically toxic. There is, however, one thing he’s said that I think has a lot of value: “A players hire A players; B players hire C players“. His point is that if you have mediocre management, you’ll actually end up with terrible employees. I would say it’s not limited to hiring only. A players make more A players. They teach each other how to be better. Not only that, but they raise the potential for what an A player can do. B players don’t have the foresight or “ownership” mentality to mentor others, and produce non-productive C players.

The insight I had recently is that this applies to software as well. “B” architectural decisions or tooling choices, which seem mediocre but tolerable at the time, create “C” (ugly, defective, or unusable) software. Software contributions often have, not always intentionally, a multiplicative (or divisive) effect across a company, in which case a single programmer can become “manager-equivalent” in impact. This is something that most companies, especially outside of technology, fail to realize about software. They miss this fact to their peril.

I’m increasingly convinced that it’s hard to be a great programmer and not be, at least, a half-decent leader. This doesn’t mean that one needs to be a “people manager” or even have any interest in taking that direction. However, a great software engineer is:

  • a decision-maker, because engineers frequently have to choose technologies and make infrastructural choices under conditions of incomplete information.
  • a creative problem-solver, because for the hardest problems there is no “canned” solution. In fact, for some problems (as in machine learning) the best solution may not be known, and simply designing the experiments is a non-trivial problem.
  • a designer, because the ability to create attractive, robust and maintainable software systems is uncommon, and the work is non-trivial.
  • a teacher, because good engineers don’t just leave a wad of undocumented code lying around, but take responsibility for making sure that other people can use it.

How is all of this, when done right, not a leadership role?

Of course, software engineers are not in general treated like leaders in any large company that I know of, and a fair proportion of the people who are hired into software positions just aren’t capable of being leaders. Still, there’s an overwhelming and self-perpetuating culture of B-ness in software, with engineers not trusted to pick their projects or choose their own tools. This culture of mediocrity is the one from which what I called “Java Shop Politics” emerges. I regret the name, however. It’s not fair to single out Java, especially when it was Microsoft, with Visual Basic and the first scratches at IDE-culture, that first attempted to create the commodity-programmer world. A better name would be “Big Software Politics”.

I would replace Welch’s A-, B-, and C-player language with a four-class system of dividers, subtracters, adders, and multipliers. I’ve split the “C” category into the people who are truly toxic and hurt others’ productivity (dividers) and the more harmless people who just don’t get much done (subtracters). Dividers, I think, should be fired quickly if they don’t improve. The only crime of subtracters is to draw more in salary than they produce, but it’s worse (for morale) to fire them, so they should be mentored into adders and (eventually) multipliers whenever possible, and gently let go if that seems not to be working. Ultimately, no company should retain an employee who doesn’t have the drive and capability to become a multiplier, but it takes time for a person to get there, and firing is an extremely blunt instrument. In general, I wouldn’t fire anyone but a divider.

“B-player” and “adder” seem to correspond neatly, as do “A-player” and “multiplier”. People in the first category can crank out CRUD apps just fine, and write software to spec, but lack the architectural or design skill to build much on their own. Adders are the workhorses who are capable of implementing others’ decisions but unready to make their own, while multipliers deliver growth by making others (and themselves) more productive through their (often far-reaching) contributions.

Management is especially challenging and dangerous because it becomes impossible, almost by definition, for a manager to be a mere adder. A manager’s job is to alter the way people work, and as with stock traders, “random” moves have negative expectancy. The percentage of people who have multiplier-level knowledge, talent, or skill is small– maybe 10 to 15 percent, in a typical company. Managers who don’t have that capacity become dividers who add noise and entropy.

Programming, the art of managing machines, is much the same way. There are always a few junior-level, self-contained additive projects in every company, but the core infrastructural work that will be used by a large number of people is multiplicative– if done well. Done poorly, it reduces capability and has a dividing effect. How is it that software– typically construed as an asset– can have such a divisive effect? The problem is management. When people don’t freely choose what tools they use, and what software they rewrite as opposed to what they deal with “as-is”, low-quality software combined with managerial blessing will lead to unproductive and unhappy programmers.

At the beginning of this year, I developed a scale for assessing the capability of a software engineer, and I’d formalize it a bit more with a model that first separates software work into three levels:

Level 1: Additive work, such as scripts to generate reports or CRUD business apps. This will typically be written once and read only by the original code author. Relevant question: can you code?

Level 2: Multiplicative work, such as tool development and critical production infrastructure in which performance, scalability, design and code quality matter, because large numbers of people rely on the work. Most of the “sexy” problems fit in this category. Relevant questions: does your work make others more productive? Can they use it? Do they enjoy using it?

Level 3: Globally multiplicative work, such as the design of new general-purpose languages. A level-3 accomplishment needs to be “best in class”, in some way, on a worldwide basis because its purpose is to push forward the state of the entire industry. Linux and Clojure are some examples of level-3 achievements. Most of this work is R&D that few companies are willing to pay for, these days. Relevant question: are you doing original work that increases capability globally?

As with any model, this one is chock full of simplifications. In reality, there are hard L1 tasks that might be rated above 1.5, as well as easy L2 tasks that might be “only” 1.7 in difficulty, but for simplicity I’ll assume that tasks can be neatly “bucketed” into one of these three categories. The going assumption is that a programmer’s level represents the level at which she will make the right calls 95% of the time. For a level-2 task, the 2.0 programmer will succeed 95% of the time, the 1.5 programmer will get 50%, and the 1.0 will get 5%, with an “S-shaped” logistic interpolation giving meaning to the fractional levels (e.g. 1.1, 1.9); a quick sketch of that curve follows the list below. In practice, these concepts are too difficult to define for formal measurement (making it useless even to attempt to measure beyond one decimal place) and the bulk of professional software engineers are between 1.0 and 2.0. While it’s difficult to apply percentiles to software engineering, the population being ill-defined, I’d estimate that:

  • the median full-time professional software engineer is about 1.1. Many senior (20+ years experience) engineers never crack 1.2.
  • graduates of elite computer science programs are about 1.3.
  • about 1 in 10 professional software engineers are at 1.5 or above.
  • about 1 in 100 software engineers are 2.0 or above.
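For concreteness, here is the sketch of the S-shaped interpolation promised above. The anchor points (95% at level, 50% at half a level below, 5% at a full level below) come from the model itself; the particular logistic curve and the constant that fits those anchors are my own assumptions.

```python
# Hypothetical sketch: probability that an engineer at `engineer_level`
# makes the right calls on a task at `task_level`, using a logistic curve
# fitted to the three anchor points described in the text.
import math

def p_success(engineer_level: float, task_level: float) -> float:
    gap = engineer_level - task_level   # 0.0 when at level, -0.5 when half a level below
    k = 2 * math.log(19)                # chosen so the anchor points below hold exactly
    return 1 / (1 + math.exp(-k * (gap + 0.5)))

for level in (1.0, 1.5, 2.0):
    print(level, round(p_success(level, 2.0), 2))   # 0.05, 0.5, 0.95 on a level-2 task
```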

In the corporate world, level-3 software considerations are generally irrelevant. Such efforts tend to have an R&D flavor, and there’s rarely the budget (the limiting resource being the time of extremely high-power people) or risk-tolerance for a company to attempt them, so levels 1 and 2 are where the action is. You could safely say that level-1 work can usually be done by an adder or “B player”, while level-2 projects require an “A player”, “10X-er”, or multiplier.

Companies and software managers know, from extremely painful (read: expensive) experience, that level-2 work is hard and risky, that most professional software engineers lack the competence to do it well, and that even fewer are capable of supervising such work. The result is that they try to minimize the amount of such work, and the degree to which they’re willing to rely on it. If one “unit” of level-2 work can be replaced with four “units” of level-1 work, that seems like the prudent choice, because it’s astronomically easier to hire 1.0-1.3 programmers than to vie for the 1.8-2.1 range, who can only be detected and assessed by other great programmers. This is the essence of the “commodity programmer” culture: create a world in which as much of the work as possible is level 1, and allocate the level-2 work only to people with a “track record”, an assessment that often has more to do with politics and social position than capability.

What goes wrong? Well, the first problem with commodity developer culture is that the bulk of engineers living within it never improve. They stop learning, because there’s no need for them to progress. When companies staff people on subordinate, bite-sized work below their level of capability, they get bored and often decline as time goes on. If you don’t have 1.5+ level work, you’re not going to have many 1.5+ engineers, and you won’t keep any of them for very long. If what high-level work you have is jealously guarded and allocated politically, the best engineers won’t stick around for years to prove themselves adequate for it. The ones without other options will.

The second problem is that projects often become level-2 by accident, and also that level-2 needs tend to emerge once the complexity load of all the level-1 work reaches a critical mass. This is akin to Greenspun’s Tenth Rule, which essentially states that when low-level languages are applied to complex problems that require a more “high-level” approach, people implement the features that already exist in high-level languages like Lisp. This shouldn’t actually be taken as a slight against C: for low-level (here, “low-level” pertains to the degree of abstraction that is used, and is non-pejorative) problems where memory management is critical (yes, plenty of those still exist) it is often the best language, but you wouldn’t want to write a complex, modern web application entirely in C. In any case, what this “Rule” is really about is the emergence of complexity, driven by need. Lisp is a well-thought-out high-level language (a level-3 accomplishment) and almost guaranteed to be superior to the domain-specific language (DSL) that a typical corporate programmer, constrained by deadlines and managers and low autonomy, would “inject” into a C project to add high-level functionality (on an ad-hoc basis) to it.

I think Greenspun’s Tenth Rule can be generalized. At some point, the complexity load induced by level-1 requirements and work requires level-2 insights and capability. The problem with the commodity developer world is that, because level-2 work is effectively not allowed, the stuff that does happen at such level occurs accidentally in an ad-hoc way. A manager might see a neat script (a level-1 accomplishment) written by a junior developer and say, “This whole company should have that. By next Friday.” However, the developer lacks both the engineering skills and the political clout to recognize bad requirements and reject them, the result being an overblown, late, and unmaintainable system that serves many needs poorly instead of doing a few things well. Such are the good intentions that pave the road to hell.

All of this, I think, explains the sorry state of the software industry today. The business world understands software’s problems at a high level– most software is shitty, late, and over budget– and (correctly) concludes that the majority of software developers lack the skills to attack problems at the multiplicative (level-2) scope, while most executives are incapable of assessing software talent at the individual level. The result is a bureaucratic, creativity-killing culture that is tailored to software engineers in the 1.1-1.3 range, a world designed to protect itself against B players. The long-term problem is that this makes it nearly impossible for most software engineers to become 1.5+ A-players. One almost certainly won’t achieve it during one’s day job. Mediocrity self-perpetuates. What companies really need to do is give software engineers a creative “sandbox” (an R&D-like environment) in which they can attempt 1.5+ projects, and if they fail, no one gets hurt. However, I wouldn’t hold my breath for this, because most managers are going to see this as “not doing real work”.

The result is a culture of “B”-ness in software. That would be acceptable, but the reality is that level-2, “A player” needs emerge anyway as a result of the need to manage (and, preferably, reduce) the accumulating complexity that additive software generates. Unfortunately, this commodity developer culture is utterly unprepared to approach such problems, and fails miserably at them. The result of this is the slew of “C” (failed) software projects that will litter any corporate environment given enough time.

Fundamental Subordinate Dishonesty

This essay has two parts. The first is about the ethics of lying in the context of a job search, whether on a resume or in an interview. The second focuses on a falsification that is not only massively common but socially accepted: the Fundamental Subordinate Dishonesty.

I realize that this opinion is not without controversy, but I don’t find myself up in arms about most resume lies. In the professional world, lying on a resume seems to be considered one of the worst things a person can do, leading to immediate termination without any other context. In my opinion, much of this attitude is overblown: a high-horse position taken because there is so little in the way of actual ethics in the white-collar corporate world that the appearance of ethical “above-boardness” becomes critically important.

Let me make myself clear on three things, however. First, I don’t lie on my resume. Why? I don’t need to, and the consequences of being caught are severe. There are annoyances in my career history, but nothing so damaging that I have to hide it. Second, I’m using “resume lie” as a shorthand for dishonesty that occurs in a job search, so lies on an interview would count. On that, it’s tactically a lot better to keep that resume factual and truthful and deliver whatever inflations you need to make verbally, if you can. Third, there are two categories of this type of lie, one of which I consider extremely unethical, and the other of which I don’t care about. Feigning competences that a person doesn’t have is charlatanry and a form of fraud. That can be extremely harmful. A person who lies about having a medical degree (a “quack” doctor) is hurting people, committing a dangerous fraud, and deserves to go to prison. There’s no question that charlatanry is unethical. It’s completely unacceptable. The other category of career inflation is what I call cosmetic inflation. When a person represents work experience honestly but inflates the title, or alters dates of employment by a couple months to cover an embarrassing gap, he’s not trying to get a job he can’t perform. He’s just inflating his own social status– in truth, mostly lying about performance reviews, which falls into the “who gives a fuck?” category as far as I am concerned– in order to bring his image in line with his capability.

On when such cosmetic inflations– fudging titles, knitting dates, representing a termination as a voluntary resignation, and upgrading “performance-based” compensation to the top bracket– are advisable, I can’t say. I don’t do them, and I have no data. The downside is obvious– you can get just as fired for an ethical lie as for an unethical one– and the upside is often unclear. Is someone really going to be treated better when he self-assigns the promotion from AVP to VP? I have no earthly idea. What’s obvious to me is that there’s nothing ethically wrong with this sort of thing. People aren’t faking capabilities they lack with such falsehoods; they’re improving their stated political trajectories in the direction of what’s socially optimal. If it’s unethical to lie about social status and past political success, then the whole world is guilty.

Companies worry a lot about resume lies, and understandably so, because I imagine they’re common given the stakes. So I asked myself: do companies actually lose money to these lies? How much? That said, I’m focusing only on the cosmetic brand of lie: upgraded job titles and date-fudging, not actual fraud. I’m not talking about the objectively unethical and fraudulent lies (charlatanry) because (I would hope) only a very small percentage of the population is depraved enough to attempt them.

Perversely, I can actually see companies winning more than they lose from cosmetic inflations. Why? One of the major causes of corporate inefficiency is the lack of trust. Most people are stuck in junior roles below their level of ability (and, therefore, producing less value for the company than they otherwise could) because they aren’t trusted. They have the capability but not the credibility. The existence of outright fraud is obviously a part of this problem, even though psychopaths are often skilled at office politics and can easily win these cosmetic awards (such as job titles). Cosmetic dishonesty, perversely, might be the cure. It sounds ridiculous that I would be advising outright (if harmless) lying as a remedy to a trust problem (although I think this is an absurdly common social behavior) so let me give a concrete example. Bob is a 34-year-old banker who was an Associate for 5 years, and then laid off when his bank hit a rough patch. In his job search, he represents his work experience, and strengths and weaknesses, honestly but upgrades his political success by claiming to be a VP (in banking, “VP” is a middling rank, not an executive) who is still with the firm in good standing, and simply looking for new challenges. He gets another job, and performs well there. How much damage does he do? He might receive $20,000 more in salary from his next employer on account of his falsification. Big deal: if he’s actually incompetent, they can fire him and cut their losses. Chances are, he’ll deliver over $300,000 of additional value to his new employer on account of being trusted to perform better work. He is lying to his future employer, and making a huge return on investment for them.

What Bob is actually correcting is a cognitive flaw of humans whereby mediocrity in past social status is conflated with ethical depravity. This made sense in evolutionary times, because people (at least, men) of low social status had to subvert the existing order, often in violent ways, in order to reproduce. The alpha ape needed to watch out for the gammas, who might compensate for their lack of physical superiority with trickery and kill him. It’s less relevant in modern times, when there is a nonviolent and socially positive solution to low social status: move somewhere else and reinvent yourself. If anything, I think people who do this are more likely to be ethical. Rather than do whatever it takes (read: cheat) to win where they are, they walk away and find another game.

As an aside, most of these cosmetic lies aren’t dishonesty so much as insubordination. When someone upgrades his title to reflect a promotion that he was passed over for, is he lying? One could argue so, but one could equally convincingly argue that he’s merely de-legitimizing the managerial authorities that delivered a negative assessment of him. They called him a loser and he’s saying, “No, I’m not”. Most job titles don’t reflect objective accomplishment but political success in any case, and who’s to say that the inflated title isn’t a more accurate reflection of who he is? He’s clearly showing a lack of subordination to his ex-managers, but why shouldn’t he? What he is really doing by inflating his title is counterfeiting a social currency that he believes to be illegitimate anyway. Very little harm is done.

So what is the costliest of the cosmetic lies? Are there any that lose money for employers? The answer I’ve come to is that there is one, but it’s (a) not a traditional resume lie, and (b) so common that it is not conventionally considered unethical: the Fundamental Subordinate Dishonesty.

Subordinate dishonesty is a term I use for the often omissive deceptions that people have to use in order to survive a subordinate role in an organizational hierarchy. These aren’t ethical faults, because they’re often adaptations required for survival. For example, I learned early in my career that it is never acceptable to voice the opinion that your assigned project will not pan out, even if that is obviously true. If I hold that opinion, I keep it to myself. Even if true, “This project is never going to work, and here’s why” is a one-way ticket to Firedville. The basic, underlying principle of subordinate dishonesty is “never bear bad news“. You have to know your boss extremely well, and usually this requires a pre-existing friendly relationship, before you can trust him enough to associate your face with negative news.

There is one variety of subordinate dishonesty that is especially common, and I’ve given it the name above: the Fundamental Subordinate Dishonesty. This is when a person represents himself as being happy to take a subordinate role. This is very common on job interviews. Most people wait too long to look for jobs, and are semi-desperate by the time they “officially” hit the market. By this point, they exaggerate their willingness to take on junior, low-impact roles or positions far outside of their desired specialty. They’re just happy “to be a part of something great”. This isn’t an intentional dishonesty, so much as a self-deception repeated often, because it’s socially attractive. The problem is that it doesn’t last. After a few months in a new job, the “honeymoon period” ends and people will no longer be happy in a role where their credibility is below their capability. However, when people are either desperate or (more commonly) think they are, they will frequently overestimate and, thereby, overrepresent their willingness to fill such a role “as long as it’s needed of me” in order to “just close” an offer. But once they are settled in and no longer in fear of unemployment, they become agitated in a way that they didn’t predict (at the time, they were just happy to get out of the mess they were in) but that should have been predictable.

If I’m looking for a big-ticket loss in terms of resume lies, I don’t think inflated titles or padded dates do much damage. At their ethical worst, they’re a zero-sum game that is to the disadvantage of truthful nonparticipants, like me. (I have no ethical problem with cosmetic inflation. My decision not to take part is strategic. My career history is above-average so I gain little and risk too much.) Being in that category, I’m OK with this “loss”, because the companies where I might not get hired are those that value things (like past job titles) that I find to be imbecilic noise. It’s better, for the world, for that company to hire an (ethical) liar and trust him so he can get his job done than for it to hire me and trust me less because my (truthfully represented) social signals are not as strong. This “victimless” crime that the liar is committing doesn’t bother me much.

Instead, the trillion-dollar killer is the Fundamental Subordinate Dishonesty. It creates massive unhappiness and turnover.

For all this, I don’t want to make it sound like employees are dirty and companies are clean on this one, because that’s not true. In fact, employers have a history of overrepresenting the autonomy associated with a position and the challenge of the work just as brazenly as job applicants overrepresent their will to subordinate. Sometimes this is unintentional, and sometimes it’s overt. In general, I’d say that it’s aspirational. A manager’s sense of time is distorted by the need to deal with multiple concurrent and often conflicting story lines, which is why an employee who asks for more interesting work at the 4-month mark is doing so “way too soon”. So managers often present a job role, in terms of authority and creativity, based on where the employee “should be” after a “short time” (read: 2 years on busy-work that’s mostly evaluative because it’s not critical to the business) if that employee “proves himself” (read: doesn’t piss anyone off). The problem is that very few people are willing to be outright bored for two whole years. This begins as benign aspirational hiring, but grows into something else once a company goes into full-on recruiting mode. I’ve seen companies (mostly crappy startups) that literally cannot function because over 80 percent of their people were promised leadership roles and nobody is willing to follow.

This is also why, as I’ve gotten older, I advise people to pay close attention to salary when comparing job offers. I used to say that it was worth it to take a low-salary job for “interesting work”. I’m not so sure anymore. If you’re going to be a founder of a startup, with real (5% or more) equity and standing in the business, then this may be an exception. Otherwise, unless you have solid data, you’re often better off looking at the hard number to size up what the company really thinks of you. Don’t trust intangibles on a “just-said” basis, because employers lie about those all the fucking time. This is the mirror image of the Fundamental Subordinate Dishonesty. The employer represents a low-level position within the hierarchy as being better than it actually is, and the employee represents herself as being content with a lower level of autonomy than she’ll actually accept. Eventually, both sides find themselves dissatisfied with the relationship because the other party “should be” picking up on their real desires. It should be no wonder that this commonly leads to a theatrical crash.

So what is the solution? I think the answer has several parts. First of all, what actually makes people happy at work isn’t the salary. Compensation makes people happy outside of work. At work, what matters is social status. People put an optimistic spin on this in accord with what’s socially acceptable, but the reality is that people don’t like jobs where they can’t get anything done, and it’s impossible to have real achievements in a position of low status. So an organizational hierarchy that leaves most people with low status is going to make a lot of people unhappy. This was not a major issue with traditional industrial work, where unhappy workers were only 20 or 40 percent less productive than happy ones, but for modern technological work, that difference is at least an order of magnitude. Hierarchical companies are losing their ability to perform at the highest levels.

I see no conclusion other than that corporate hierarchy is obsolete and increasingly incapable of solving modern technical problems. The subordination on which it relies is fundamentally dishonest, because no one is content to be held artificially at such a low level of impact. People will follow orders in the context of a symbiotic mentor/protege relationship that advances their careers, for sure. They will also follow another’s lead on a specific project when they believe that other person has a better grasp of the problem than they, themselves, do. What they’re not willing to do anymore is be truly subordinate: to follow rank because it is rank, and to let their own long-term career goals take a backseat to professed organizational needs. Why? Because when you accept this, you reduce your long-term career prospects, and you are effectively paying to keep your own job. That era is finished. Yet the social protocols remain and require a certain dishonest signaling about the willingness to subordinate, leading to failed communication, overblown expectations and festering unhappiness.

If there’s a job search lie that’s a billion- or trillion-dollar killer, I would say that the Fundamental Subordinate Dishonesty is it.

MOOCs will disrupt the corporate world

One of the largest, but also most admirable, threats to the existing social norms of the corporate workplace is Mooks.

Wait, no… that’s not it. I mean MOOCs: massive open online courses, such as those offered by Coursera, Udacity, and edX. MOOCs, which bring a modularity and freshness into education, are not a fad. Their use will grow, and the rising generation will be so comfortable with the concept that these courses will become a regular component of educational and working life. Within 10 years, self-directed continuing education will be as important a component of career progress as traditional “resume” metrics like job titles and workplace accomplishments.

Much has been said about the potential for MOOCs to disrupt education, although I think the threat is overstated. While the current extortion (a choke-hold on the middle-class job market) that justifies enormous college tuitions will go away, as it should, there’s still an immense social value to the college experience, much of which occurs outside of the classroom. That won’t change. So if “disruption” occurs, it will be in the direction of making “the Real World” more like college. It won’t kill off the university. That topic, however, is too big for me to attack in one morning. I’ll focus on one subtopic, which is the effect that MOOCs will have on the modern workplace. They’re a serious threat to the entrenched corporate leadership, because they provide a path to alternative credibility that is independent of a single workplace or manager. The current corporate workplace runs on a credibility drought. Artificially scarce job titles, project allocations, and referrals are used to motivate people to put years into dues-paying grunt work. The alternative credibility that mature MOOCs will offer disrupts that, because people can learn faster than they acquire credibility in an artificially slowed-down institution.

Open-source projects have a similar potency, but there’s one difference. An indignant employer, when it discovers that an employee has favored an open-source project on working time, can attempt to claim ownership of the code– that it was “work for hire”. This limits the willingness that most people have to subversively pursue such projects on work time. With MOOCs, firms won’t have that recourse. No court will invalidate an educational credential simply because it was earned on “working time”.

The MOOC generation

When the Baby Boomers went to work, the rules were simple. Do what’s asked of you, don’t complain, and when you have time to spare, ask for more work. That was how one showed ambition and the potential for leadership: always being “done” ahead of schedule, always being “caught up”, always wanting more work to do. If you were lucky, you eventually started getting a higher quality of work and (after years) began to acquire credibility within the organization, possibly being tapped for a leadership role. If you weren’t, you got more grunt work, took on as much as you could handle and plateaued. Most people weren’t especially lucky, and the dues-paying period lasted for 3 to 10 years– longer than the average job lasts for us.

That model doesn’t work for Millennials. As a generation, we’re poorly paid. (Software engineers are adequately, but not exceptionally, paid.) We’ve seen the corporate ladder disintegrate, so we have no faith in it. We’re not willing to sacrifice the now in favor of a promised future when we’ve seen such promises discarded for convenience. However, we have one thing the Boomers didn’t. There seems to be more variety in the kind of work that’s out there, which opens more avenues toward success. These alternatives take time and focus, but they’re there: one doesn’t have to climb an institutional ladder to be successful. My general impression is that, while the corporate world “proper” has become worse, there’s a wider variety of interesting jobs out there now than existed for my parents. Consequently, if we start investing in our careers very early and do so aggressively, we can grow our earning potential by 10-30 percent per year for many years, and eventually work our way to a position of high pay, autonomy, and flexibility. Boomer managers complain about us being “entitled” or expecting rapid career progression. We’re not. “Entitled” is the last thing we are. Rather, we work very hard, and we’re extremely loyal when we believe that loyalty is deserved. We do, however, tend to prioritize our own career goals well above those of our companies or managers– and we’re nakedly obvious about it, which is something that Boomer managers aren’t used to. Why? Because the future pays us. We knew, at 22, that corporate loyalty was done for and that we are our own bosses.
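To give a sense of how those annual growth rates compound if one can actually sustain them, here’s a minimal sketch; the ten-year horizon and the $80,000 starting salary are assumptions chosen only for illustration, not figures from anything above.

    # Rough illustration of compounding career growth at the 10-30 percent
    # annual rates mentioned above. The starting salary and ten-year horizon
    # are hypothetical, chosen only to make the arithmetic concrete.

    def compounded(start, annual_rate, years):
        """Earning potential after `years` of steady growth at `annual_rate`."""
        return start * (1 + annual_rate) ** years

    start_salary = 80_000  # hypothetical starting point
    for rate in (0.10, 0.20, 0.30):
        final = compounded(start_salary, rate, years=10)
        print(f"{rate:.0%} per year for 10 years: ${final:,.0f} "
              f"(about {final / start_salary:.1f}x)")

Even the low end roughly 2.6x’s earning potential over a decade, and the high end is more than 13x, which is why impatience with slow institutional ladders isn’t irrational.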

Boomers had a “company man”, one-firm-for-life model. At least, that was the ideal. Leaving your firm for a promotion was considered disloyal and couldn’t be done too often, and getting fired could ruin your career. Generation X, which followed, entered a winner-take-all world that emerged as the corporate social contract disintegrated. A few people got great mentors and became millionaire options traders, or connected venture capital with a good idea and managed to protect themselves well enough from investors to strike dot-com gold. Others (most of them) languished on busy-work they had to endure to pay the bills, and their career trajectories were nasty, brutish, and short. We, the Millennials, are the keep-learning-and-carry-on crew. We saw what happened to the laid-off Boomers and the less successful Xers and refuse to let it happen to us. We’re not going to let our creativity be sapped by subordinate roles. We don’t stop learning. When there’s slack in the schedule at work, we don’t ask how we can “help out”. We log in to our Kindle reader or Coursera and learn the skills that we’ll need to get where we want to go, and to perform well once we’re there.

I’ll also note that most of us have no ethical problem with doing this, and I, personally, agree with that stance. We may call it “stealing an education” from a boss, to give this practice an anarchist flair, but the reality is that we do it with the intention of becoming better at our jobs, which is a valid use of work time. It may be insubordinate, but it’s not unethical. We believe that we don’t have to ask for permission to dedicate half our working time to what we think is important, and when we’re young and unestablished, that’s often the development of new skills. We just go off and do it rather than ask for clearance; asking would be an awkward conversation, and the manager might say no. Channeling Grace Hopper, we’d rather ask forgiveness.

What’s coming into form is a generation that will prove itself extremely capable, but also extraordinarily difficult, if not impossible, to manage in the traditional sense of the word. The old approach won’t work anymore.

What’s changing?

First, the traditional mechanisms for evaluating leadership potential are done. Old-style companies allocated new hires to low-relevance grunt work and measured their potential for leadership based on how eagerly they took it on, and how fast they completed it. This worked in 1990, when there wasn’t much to do at work but the assigned work. Rather than face hours of boredom with nothing to do, the most competent would “give back” any slack in their schedule by asking for more work. The current generation won’t. Opening a CS textbook at work may be a faux pas outside of research– this guy’s reading a book at work?– but Kindle and PDF versions solve that. What this means is that the rising leaders are hard to detect from performance on assigned work, because they manage that performance to the middle in order to free time for self-directed learning.

What will happen when an employee is “caught” putting half her working time into online education? Well, she could be fired, but that would mean terminating someone who is more skilled and competent than the person who was originally hired– an inconsistent decision. That might make sense if it raised questions about integrity, but the rising generation doesn’t view this behavior as unethical, so I see that extreme response as unlikely. Blocking MOOCs is a non-starter, since many departments and managers will want their reports to have access to them, or to use them themselves, and people won’t gladly work for a company that gets a reputation for blocking Coursera. This is not a change to be fought; it can only be embraced.

So what might this mean, at the macroscopic level? Companies won’t be able to prevent the rising generation from putting the bulk of their working time into self-directed learning and projects, and the more progressive ones won’t even want to do so. Companies will have to accept the self-executive mindset of the most capable people. Thus, work will be redefined, and the relationship will be more like one of sponsorship than subordination. On the whole, this will be a positive change. One of the nastiest tensions in the modern corporation surrounds the lack of career coherency in work as presently defined. Career incoherency refers to the fact that a worker’s assigned tasks may not be what is best for her long-term desirability in the workforce. Present-day young workers are often forced to balance competing job and career needs, and are afraid of catching the “not a team player” label if they serve the latter too obviously. Hence, the current need to be furtive in “MOOCing”. That will change under the sponsorship system, where young people are expected to serve their career needs and their employers are there to fund them and collect the products they produce along the way. That will be superior and, in the long run, produce better technical work. The proliferation of high-quality, free, online education is the start.

I, for one, welcome our new MOOC overlords.

Discredit, don’t humiliate

I wrote yesterday on the rhetorical power of “I don’t know”. I want to take this into a more aggressive arena. Yesterday, I wrote about debate, which can be stately and high-minded. When you debate someone, you often like that person. You don’t want to hurt or embarrass her; you just disagree with her. Today, I’m going to talk about battle. Battle is when there are people who want to hurt you, and you often have to hurt them first.

In a debate, you can bring the discussion back to facts by saying “I don’t know”, implying that no one has preternatural knowledge, and then presenting a compelling argument. In battle, you have to know. It’s a different game. You must present the appearance that you know what is going on and the reasons why, and you must present the cause of the problem in a way that is favorable to you, in order to win support.

To take one example, most people will face managerial adversity at least once in their careers. Verbal criticism isn’t managerial adversity, nor is a bad review if that review is confidential between manager and employee. However, if the review will limit the employee’s project allocation and transfer opportunities, or make it easier to fire that employee later, then it is managerial adversity, and the employee needs to be prepared for a fight. Decent managers will criticize in private, verbally, and confidentially, but give glowing reviews to keep the employee’s reputation strong. They will never make a move that damages the employee’s reputation within the firm. But there are a lot of indecent bosses out there. Managerial adversity exists when a boss disfavors one’s career needs, and it’s one of the most common kinds of battle that adults face. The warning sign is a manager who begins delivering criticism in writing. Once anything negative is put in email, battle has begun.

So how should one fight it? Managerial adversity is not the only kind of battle, but there’s a reason I’m focusing on it. It’s humiliating, and that creates a tendency to overreact– to humiliate the manager in kind. A common approach is to go “over his head” or, worse yet, to HR in a vain attempt to reverse the power dynamic. It almost never works. The boss’s boss picked the boss and is not going to toss him aside to favor some peon. People would rather be wrong but consistent than become inconsistent for the rather useless purpose of being right, and this preference for imbecilic consistency is a species of idiocy that becomes more prominent in powerful individuals. Going “to HR” is worse, since HR’s job is to help the company fire people cheaply. What works a lot better is to get out of the boss’s way: to work out a transfer (note: you should have someone who already wants you before you notify your manager or officially enter the transfer process) and leave. Still, people who make this move are often expected to explain a bad review or an atypical transfer. So how?

Three words: discredit, don’t humiliate. There is a strong temptation to make one’s manager appear to be incompetent and unethical. In fact, he probably is. It doesn’t matter. The goal is the least harm necessary. Embarrass him enough to discredit his assessment outright, but no more. Assert that what he is saying is untrue, and back it up, but don’t use the word “unethical”. Let the other parties decide how they feel about him. Why? I’m not saying this because I think people should handle bad bosses with kid gloves. In fact, we’d have a better world if it were more socially acceptable to throw an unethical manager under the bus, but for now we have to live with the world we’ve got. I’m saying it because the last thing a person needs when under managerial adversity is volatility. A mildly discredited adversary will not fight as hard as one facing humiliation.

One of the most important career principles to learn is not to try to outrun the bear. Every company has an employee who works very hard, 70 hours per week, for about six months, but he doesn’t get the recognition he expected, and so he burns out hard and stalls. Then he gets fired “for performance” because, although he had six good months, his 7th was a bit slack. What did he do wrong? He tried to outrun the bear. You can’t outrun a bear, but you don’t have to. You just have to outrun the other guy. The same applies to managerial adversity. If you mildly discredit your boss’s assessment of you, enough to get back into good standing and have another chance, you’re pitting 100% of yourself against 10% of him, because he has other battles to fight. You stand a good chance of winning, because he won’t know what’s happening until it’s too late, and after you’ve successfully moved on, he won’t care. On the other hand, if you attempt to ruin your boss’s career, you’re going to be fighting 100% of him.

This also applies to the social expectation that people refrain from “badmouthing” ex-employers when they interview for other jobs and instead use the “seeking more challenging work” explanation for the departure. This advice doesn’t always work. People are expected to stick with a small company for at least 9 months, and a large one (where internal transfer is a possibility) for at least 18, and the going assumption is that stable people won’t leave jobs sooner than that (in part, because it looks bad; it’s a recursive phenomenon). If the job tenure is too short, someone looks bad. Make sure it’s not you. A better policy might be stated as “don’t badmouth unless asked” or “don’t appear vengeful”. Divulge negative information (truthful if possible, creative if necessary) in just a large enough dose to discredit the negative performance assessment implied by the short tenure, and not one iota more.

The best discreditation that I can come up with, for explaining away a bad review, is this: “Because of my qualifications (list them) I was overqualified for the work I was being asked to do, and so not a lot of time was invested in me because they thought I’d move on to something more glamorous, even though I gave them no reason to suspect that. And so, while I worked hard and loyally, no one ever took the time to clarify for me what the team’s priorities were. Hence, the review before you. I was deeply surprised and hurt by this, and so now I am looking for better work.” Then move back to a discussion of your qualifications and why you are a fit for the team you are trying to join. You’re not saying, “My boss and co-workers were unethical and they screwed me.” That never works. You’re saying, “Shit happens but I’m good.”

Perversely, the “discredit, don’t humiliate” policy often means that you’re working on behalf of the bad guys. Often, what makes it so tempting to humiliate a boss when under managerial adversity is the fact that the truth actually is humiliating to him. It feels more ethical to go to the extreme and air dirty laundry. Many people naively believe that bringing their unethical managers to the attention of HR will actually get the boss disciplined and improve the company. I used to think this way, and have made a few mistakes in that vein (*cough* eng-misc), but enough experience has convinced me that it’s more ethical to take the milder approach. Why? Because if you shoot past refutation and into humiliation territory, your boss will come to the battle better-armed, greatly increasing the chances that you lose. And if you lose, your credibility goes to zero and you are in no position to discipline, much less humiliate, anyone. It’s best to move on and leave the vigilante justice to someone else.