Technology’s Loser Problem

I’m angry. The full back story isn’t worth getting into, but there was a company where I applied for a job in the spring of 2013: to build its machine learning infrastructure from scratch. It was a position of technical leadership (Director-equivalent, but writing code, with no reports) and I would have been able to use Clojure. As it was, I didn’t get it. They were looking for someone more experienced, who’d built those kinds of systems before and wouldn’t take 6 months to train up to the job. That, itself, is not worth getting angry about. Being turned down happens, especially at high levels.

I found out, just now, that the position was not filled. Not then. Not 6 months later. Not to this day, more than a year later. It has taken them longer to fill the role than it would have taken for me to grow into it.

When they turned me down, it didn’t faze me. I thought they’d found a better candidate. That happens; the only thing I can do is make myself better. I was, however, a bit irked to find out that the position has sat unfilled for longer than it would have taken me to gain the necessary experience. I lost, and so did they.

That’s not what makes me angry. Rationally, I realize that most companies aren’t going to call back a pretty-good candidate they rejected because they had just opened the position and thought they could do better (if you’re among the first 37% of candidates for a job, optimal-stopping logic says it makes sense for them not to choose you; empirically, the first and second applicants for a high-level position rarely get it). That’s the sort of potentially beneficial but extremely awkward social process that just won’t happen. What makes me angry is the realization of how common a certain sort of decision is in the technology world. We make a lot of lose-lose decisions that hurt all of us. Extremely specific hiring requirements (that, in bulk, cost the company more in waiting time than training a 90% match up to the role) are just the tip of the iceberg.
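That 37% figure alludes to the classic “secretary problem” in optimal stopping theory: observe the first n/e (about 37%) of candidates without committing, then hire the first one who beats everyone seen so far. A quick simulation (a sketch; the candidate and trial counts are arbitrary choices of mine) shows why rejecting early applicants is individually rational for a company, even as the same logic, applied everywhere, produces exactly the lose-lose waiting described above:

```python
import random

def secretary_trial(n, rng):
    """One run of the classic secretary problem with n candidates.

    Candidates arrive in random order. We observe (and reject) the first
    ~n/e of them, then hire the first candidate who beats everyone seen
    so far. Returns True if the hire was the single best candidate.
    """
    order = list(range(n))          # 0 = worst candidate, n - 1 = best
    rng.shuffle(order)
    cutoff = int(n / 2.718281828)   # observe-only phase: first ~37%
    best_seen = max(order[:cutoff], default=-1)
    for score in order[cutoff:]:
        if score > best_seen:       # first candidate to beat the sample
            return score == n - 1
    return order[-1] == n - 1       # nobody qualified; stuck with the last

rng = random.Random(42)
trials = 20_000
wins = sum(secretary_trial(100, rng) for _ in range(trials))
print(f"hired the best candidate in {wins / trials:.1%} of trials")
```

Run it and the success rate hovers around 37%, which is the theoretical optimum; the point is that no strategy with a better hit rate exists, so “never call back an early reject” is locally sensible and globally wasteful.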

You know those people who complain about the lack of decent <gender of sexual interest> but (a) reject people for the shallowest, stupidest reasons, (b) aren’t much of a prize and don’t work to better themselves, and (c) generally refuse to acknowledge that the problem is rooted in their own inflated perception of their market value? That’s how I feel every time I hear some corporate asswipe complain about a “talent shortage” in technology. No, there isn’t one. You’re either too stingy or too picky or completely inept at recruiting, because there’s a ton of underemployed talent out there.

Few of us, as programmers, call the initial shots. We’ve done a poor job of making The Business listen to us. However, when we do have power, we tend to fuck it up. One of the problems is that we over-comply with what The Business tells us it wants. For example, when a nontechnical CEO says, “I only want you to hire absolute rock stars”, what he actually means is, “Don’t hire an idiot just to have a warm body or plug a hole”. However, because they tend to be literal-minded and over-compliant, programmers will interpret that to mean, “Reject any candidate who isn’t 3 standard deviations above the mean.” This leads to positions not being filled, because The Business is rarely willing to pay what one standard deviation above the mean costs, let alone three.

Both sides now

I’ve been on both sides of the interviewing and hiring process. I’ve seen programmers’ code samples described with the most vicious language over the most trivial mistakes, or even stylistic differences. I’ve seen job candidates rejected for the most god-awful stupid reasons. In one case, the interviewer clearly screwed up (he misstated the problem in a way that made it impossible) but, refusing to lose face by admitting the problem was on his end, he claimed the candidate failed the question. Another candidate was dinged on a back-channel reference (don’t get me started on that sleazy practice, which ought to be illegal) claiming, without any evidence, that “he didn’t do much” on a notable project four years ago. I once saw an intern denied a full-time offer because he lived in an unstylish neighborhood. (The justification was that one had to be “hungry”, mandating Manhattan.) Many of us programmers are so butthurt about not being allowed to sit at the cool kids’ table that, when given the petty power associated with interviewing other programmers, the bitch-claws come out in a major way.

Having been involved in interviewing and recruiting, I’ll concur that there are a significant number of untalented applicants. If it’s 99.5 percent, you’re doing a lot of things wrong, but most resumes do come from people way out of their depth. Moreover, as with dating, there’s an adverse weighting in play. Most people aren’t broken, but broken people go on orders of magnitude more dates than everyone else, which is why most people’s dating histories have a disproportionate representation of horror stories, losers, and weirdos. It’s the same with hiring, but phone screening should filter against that. If you’re at all good at it, about half of the people brought in-office will be solid candidates.

Of course, each requirement cuts down the pool. Plenty of companies (in finance, some officially) have a “no job hoppers” or “no unemployeds” rule. Many mandate high levels of experience in new technologies (even though learning new technologies is what we’re good at). Then, there are those who are hung up on reference checking in weird and creepy ways. I know of one person who proudly admits that his reference-checking protocol is to cold-call a random person (again, back-channel) in the candidate’s network and ask the question, without context, “Who is the best person you’ve ever worked with?” If anyone other than the candidate is named, the candidate is rejected. That’s not being selective. That’s being an invasive, narcissistic idiot. Since each requirement shrinks the pool of qualified candidates, it doesn’t take long before the prejudices winnow an applicant pool down to zero.

Programmers? Let’s be real here, we kinda suck…

As programmers, we’re not very well-respected, and when we’re finally paid moderately well, we let useless business executives (who work 10-to-3 and think HashMap is a pot-finding app) claim that “programmer salaries are ridiculous”. (Not so.) Sometimes (to my horror) you’ll hear a programmer even agree that our salaries are “ridiculous”. Fuck that bullshit; it’s factually untrue. The Business is, in general, pretty horrible to us. We suffer under closed allocation, deal with arbitrary deadlines, and if we don’t answer to an idiot, we usually answer to someone else who does. Where does the low status of programmers come from? Why are we treated as cost centers instead of partners in the business? Honestly… much of the problem is us. We’ve failed to manage The Business, and the result is that it takes ownership of us.

Most of the time, when a group of people is disproportionately successful, the cause isn’t any superiority of the average individual, but a trait of the group: they help each other out. People tend to call these formations “<X> Mafia” where X might be an ethnicity, a school, or a company. Y Combinator is an explicit, pre-planned attempt to create a similar network; time will tell if it succeeds. True professions have it. Doctors look out for the profession. With programmers, we don’t see this. There isn’t a collective spirit: just long email flamewars about tabs versus spaces. We don’t look out for each other. We beat each other down. We sell each other out to non-technical management (outsiders) for a shockingly low bounty, or for no reason at all.

In many investment banks, there’s an established status hierarchy in which traders and soft-skills operators (“true bankers”) are at the top, quants are in the middle, and programmers (non-quant programmers are called “IT”) are even lower. I asked a high-ranking quant why it was this way, and he explained it in terms of the “360 degree” performance reviews. Bankers and traders all gave each other top ratings, and wrote glowing feedback for minor favors. They were savvy enough to figure out that it was best for them to give great reviews up, down, and sideways, regardless of their actual opinions. Quants tended to give above-average ratings and occasionally wrote positive feedback. IT gave average ratings for average work and plenty of negative feedback. The programmers were being the most honest, but hurting each other in the process. The bankers and traders were being political, and that’s a good thing. They were savvy enough to know that it didn’t benefit them to sell each other out to HR and upper management. Instead, they arranged it so they all got good ratings and the business had to, at baseline, appreciate and reward all of them. While it might seem that this hurt top performers, it had the opposite effect. If everyone got a 50 percent bonus and 20% raise, management had to give the top people (and, in trading, it’s pretty obvious who those are) even more.

Management loves to turn high performers against the weak, because this enables management to be stingy on both sides. The low performers are fired (they’re never mentored or reassigned) and the high performers can be paid a pittance and still have a huge bonus in relative terms (not being fired vs. being fired). What the bankers were smart enough to realize (and programmers, in general, are not) is that performance is highly context-driven. Put eight people of exactly equal ability on a team to do a task and there will be one leader, two or three contributors, and the rest will be marginal or stragglers. It’s just more efficient to have the key knowledge in a small number of heads. Open source projects work this way. What this means is that, even if you have excellent people and no bad hires, you’ll probably have some who end up with not much to show for their time (which is why open allocation is superior; they can reassign themselves until they end up in a high-impact role). If management can see who is in what role, it can fire the stragglers and under-reward the key players (who, because they’re already high performers, are probably motivated by things other than money… at least, for now). The bankers and traders (and, to a lesser extent, the quants) had the social savvy and sense to realize that it was best that upper management not know exactly who was doing what. They protected each other, and it worked for them. The programmers, on the other hand, did not, and this hurt top performers as well as those on the bottom.

Let’s say that an investment bank tried to impose tech-company stack ranking on its employees, associate level and higher. (Analyst programs are another matter, not to be discussed here.) Realizing the mutual benefit in protecting each other, the bankers would find a way to sabotage the process by giving everyone top ratings, ranking the worst employees highly, or simply refusing to do the paperwork. And good for them! Far from being unethical, this is what they should do: collectively work The Business to get what they’re actually worth. Only a programmer would be clueless enough to go along with that nonsense.

In my more pessimistic moods, I tend to think that we, as programmers, deserve our low status and subordination. As much as we love to hate those “business douchebags”, there’s one thing I will say for them. They tend to help each other out a lot more than we do. Why is this? Because they’re more political and, again, that might not be a bad thing. Ask a programmer to rate the performance of a completely average colleague and you’ll get an honest answer: he was mediocre, we could have done without him. These are factual statements about average workers, but devastating when put into words. Ask a product manager or an executive about an average colleague and you’ll hear nothing but praise: he was indispensable, a world-class player, the best hire in ten years. They realize that it’s politically better for them, individually and as a group, to keep their real opinions to themselves and never say anything that could remotely endanger another’s career. Even if that person’s performance was only average, why make an enemy when one can make a friend?

“Bad code”

Let’s get to another thing that we do, as programmers, that really keeps us down. We bash the shit out of each other’s code and technical decision-making, often over minutiae.

I hate bad code. I really do. I’ve seen plenty of it. (I’ve written some, but I won’t talk about that.) I understand why programmers complain about each other’s code. Everyone seems to have an independent (and poorly documented) in-head culture that informs how he or she writes code, and reading another person’s code induces a certain “culture shock”. Even good code can be difficult to read, especially under time pressure. And yes, most large codebases have a lot of code in them that’s truly shitty, sometimes to the point of being nearly impossible to reason about. Businesses have failed because of code quality problems, although (to tell the whole story) it’s rare that one bad programmer can do that much damage. The worst software out there isn’t the result of one inept author, but the result of code having too many authors, often over years. It doesn’t help that most companies assign maintenance work either to junior programmers or to demoted (and disengaged) senior ones, neither category having the power to do it right.

I’d be the last one to come out and defend bad code. That said, I think we spend too much time complaining about each other’s code– and, worse yet, we tend toward the unforgivable sin of complaining to the wrong people. A technical manager has, at least, the experience and perspective to know that, at some level, every programmer hates other people’s code. But if that programmer snitches to a non-technical manager or executive, well… you’ve just invited a 5-year-old with a gun to the party. Someone might get fired because “tabs versus spaces” went telephone-game into “Tom does shoddy work and is going to destroy the business”. Because executives are politically savvy enough to protect the group, and only sell each other out in extreme circumstances, what started out as a stylistic disagreement sounds (to the executive ear) like Tom (who used his girlfriend’s computer to fix a production problem at 11:45 on a Friday night, the tabs/spaces issue being for want of an .emacs.d) is deliberately destroying the codebase and putting the whole company at risk.

As programmers, we sell each other out all the time. If we want to advance beyond reasonable but merely upper-working class salaries, and be more respected by The Business, we have to be more careful about this kind of shit. I’ve heard a great number of software engineers say things like, “Half of all programmers should just be fired.” Now, I’ll readily agree that there are a lot of badly-trained programmers out there whose lack of skill causes a lot of pain. But I’m old enough to know that people come to a specific point from a multitude of paths and that it’s not useful to personalize this sort of thing. Also, regardless of what we may think as individuals, almost no doctor or banker would ever say, to someone outside his profession, “half of us should be fired”. They’re savvy enough to realize the value of protecting the group, and handling competence and disciplinary matters internally. Whether to fire, censure, mentor or praise is too important a decision to let it happen outside of our walls.

There are two observations about low-quality code, one minor and one major. The minor one is that code has an “all of us is worse than any of us” dynamic. As more hands pass over code, it tends to get worse. People hack in the specific features they need, never tending to the slow growth of complexity, and the program evolves over time into something that nobody understands, because too many people were involved in it. Most software systems fall to pieces not because of incompetent individuals, but because of unmanaged growth of complexity. The major point on code quality is: it’s almost always management’s fault.

Bad code comes from a multitude of causes, only one of which is low skill in programmers. Others include unreasonable deadlines, unwillingness to attack technical debt (a poor metaphor, because the interest rate on technical “debt” is both usurious and unpredictable), bad architecture and tooling choices, and poor matching of programmers to projects. Being stingy, management wants to hire the cheapest people it can find and give them the least time possible in which to do the work. That produces a lot of awful code, even if the individual programmers are capable. Most of the things that would improve code quality (and, in the long term, the health and performance of the business) are things that management won’t let the programmers have: more competitive salaries, more autonomy, longer timeframes, time for refactoring. The only thing that management and the engineers can agree on is firing (or demoting, because their work is often still in use and The Business needs someone who understands it) those who wrote bad code in the past.

One thing I’ve noticed is that technology companies do a horrible job of internal promotion. Why is that? Because launching anything will typically involve compromises with the business on timeframe and headcount, resulting in bad code. Any internal candidate for a promotion has left too many angles for attack. Somewhere out there, someone dislikes a line of code he wrote (or, if he’s a manager, something about a project he oversaw). Unsullied external candidates win, because no one can say anything bad about them. Hence, programming has the culture of mandatory (but, still, somewhat stigmatized) job hopping we know and love.

What’s really at the heart of angry programmers and their raging against all that low-quality code? Dishonest attribution. The programmer can’t do shit about the dickhead executive who set the unreasonable deadlines, or the penny-pinching asswipe managers who wouldn’t allow enough salary to hire anyone good. Nor can he do much about the product managers or “architects” who sit above him and make his life hell on a daily basis. But he can attack Tom, his same-rank colleague, over that commit that really should have been split into two. Because they’re socially unskilled and will generally gleefully swallow whatever ration of shit is fed to them by management, most programmers can very easily be made to blame each other for “bad code” before blaming the management that set the conditions under which the bad code was written in the first place.


As a group, software engineers are losers. In this usage, I’m not using the MacLeod definition (which is more nuanced) and my usage is halfway pejorative. I generally dislike calling someone a loser, because the pejorative, colloquial meaning of that word conflates unfortunate circumstance (one who loses) with deserved failure. Here, however, it applies. Why do we lose? Because we play against each other, instead of working together to beat the outside world. As a group, we create our own source of loss.

Often, we engage in zero- or negative-sum plays just to beat the other guy. It’s stupid. It’s why we can’t have nice things. We slug each other in the office and wonder why external hires get placed over us. We get into flamewars about minutiae of programming languages, spread FUD, and eventually some snot-nosed dipshit gets the “brilliant” idea to invite nontechnical management to weigh in. The end result is that The Business comes in, mushroom-stamps all participants, and says, “Everything has to be Java”.

Part of the problem is that we’re too honest, and we impute honesty in others when it isn’t there. We actually believe in the corporate meritocracy. When executives claim that “low performers” are more of a threat to the company than their astronomical, undeserved salaries and their doomed-from-the-start pet projects, programmers are the only people stupid enough to believe them, and will often gleefully implement those “performance-based” witch hunts that bankers would be smart enough to evade (by looking for better jobs, and arranging for axes to fall on people planning exits anyway). Programmers attempt to be apolitical, but that ends up being very political, because the stance of not getting political means that one accepts the status quo. That’s radically conservative, whether one admits it or not.

Of course, the bankers and traders realize the necessity of appearing to speak from a stance of professional apolitical-ness. Every corporation claims itself to be an apolitical meritocracy, and it’s not socially acceptable to admit otherwise. Only a software engineer would believe in that nonsense. Programmers hear “Tom’s not delivering” or “Andrea’s not a team player” and conceive of it as an objective fact, failing to recognize that, 99% of the time, it means absolutely nothing more or less than “I don’t like that person”.

Because we’re so easily swayed, misled, and divided, The Business can very easily take advantage of us. So, of course, it does. It knows that we’ll sell each other out for even a chance at a seat at the table. I know a software engineer who committed felony perjury against his colleagues just to get a middle-management position and the right to sit in on a couple of investor meetings. Given that this is how little we respect each other, ourselves, and our work, is it any wonder that software engineers have such low status?

Our gender issues

I’m going to talk, just briefly, about our issues with women. Whatever the ultimate cause of our lack of gender diversity– possibly sexism, possibly that the career ain’t so great– it’s a major indictment of us. My best guess? I think sexism is a part of it, but I think that most of it is general hostility. Women often enter programming and find their colleagues hostile, arrogant, and condescending. They attribute that to their gender, and I’m sure that it’s a small factor, but men experience all of that nonsense as well. To call it “professional hazing” would be too kind. There’s often nothing professional about it. I’ve dealt with rotten personalities, fanaticism about technical preference or style, and condescension and, honestly, don’t think there’s a programmer out there who hasn’t. When you get into private-sector technology, one of the first things you learn is that it’s full of assholes, especially at higher levels.

Women who are brave enough to get into this unfriendly industry take a look and, I would argue, most decide that it’s not worth it to put up with the bullshit. Law and medicine offer higher pay and status, more job security, fewer obnoxious colleagues, and enough professional structure in place that the guy who cracks rape jokes at work isn’t retained just because he’s a “rockstar ninja”.

“I thought we were the good guys?”

I’ve often written from a perspective that makes me seem pro-tech. Originally, I approached the satirical MacLeod pyramid with the belief that “Technocrat” should be used to distinguish positive high-performers (apart from Sociopaths). I’ve talked about how we are a colonized people, as technologists. It might seem that I’m making businesspeople out to be “the bad guys” and treating programmers as “the good guys”. Often, I’m biased in that very direction. But I also have to be objective. There are good business people out there, obviously. (They’re just rare in Silicon Valley, and I’ll get to that.) Likewise, software engineers aren’t all great people, either. I don’t think either “tribe” has a monopoly on moral superiority. As in Lost, “we’re the good guys” doesn’t mean much.

We do get the worst (in terms of ethics and competence) of the management/business tribe in the startup world. That’s been discussed at length, in the essay linked above. The people who run Silicon Valley aren’t technologists or “nerds” but Machiavellian businessmen who’ve swooped into the Valley to take advantage of said nerds. The appeal of the Valley, for the venture capitalists and non-technical bro executives who run it, isn’t technology or the creation of value, but the unparalleled opportunity to take advantage of too-smart, earnest hard workers (often foreign) who are so competent technically that they often unintentionally generate value, but don’t know the first thing about how to fight for their own interests.

It’s easy to think ourselves morally superior, just because the specific subset of business people who end up in our game tends to be the worst of that crowd. It’s also a trap. We have a lot to learn from the traders and bankers of the world about how to defend ourselves politically, how to stand a chance of capturing some of the value we create, and how to prevent ourselves from being robbed blind by people who may have lower IQs, but have been hacking humans for longer than we could have possibly been using computers. Besides, we’re not all good. Many of us aren’t much better than our non-technical overlords. Plenty of software engineers would gladly join the bad guys if invited to their table. The Valley is full of turncoat software engineers who don’t give a shit about the greater mission of technology (using knowledge to make peoples’ lives better) and who’d gladly sell their colleagues out to cost-cutting assholes in management.

Then there are the losers. Losers aren’t “the bad guys”. They don’t have the focus or originality that would enable them to pull off anything complicated. Their preferred sin is typically sloth. They’ll fail you when you need them the most, and that’s what makes them infuriating. They just want to put their heads down and work, and the problem is that they can’t be trusted to “get political” when that’s exactly what’s needed. The danger of losers is in numbers. The problem is that so many software engineers are clueless, willing losers who’ll gladly let political operators take everything from them.

When you’re young and don’t know any better, one of the appeals of software engineering is that it appears, superficially, to tolerate people of low social ability. To people used to artificial competition against their peers, this seems like an attractive trait of the industry; it’s not full of those “smooth assholes” and “alpha jocks”. After several years observing various industries, I’ve come to the conclusion that this attitude is not merely misguided, but counterproductive. You want socially skilled colleagues. Being the biggest fish in a small pond just means that there are no big fish to protect you when the sharks come in. Most of those “alpha jocks” aren’t assholes or idiots (talk to them, nerds; you’ll be surprised) and, when The Business comes in and is looking for a fight, it’s always best to have strong colleagues who’ve got your back.

Here’s an alternate, and quite plausible, hypothesis: maybe The Business isn’t actually full of bad guys. One thing that I’ve realized is that people tend to push blame upward. For example, the reputation of venture capitalists has been harmed by founders blaming “the VCs” for their own greed and mismanagement. It gives the grunt workers an external enemy, and the clueless can be tricked into working harder than they should (“they don’t really like us and haven’t given us much, but if we kill it on this project and prove them wrong, maybe they’ll change their minds!”). It actually often seems that most of the awfulness of the software industry doesn’t come directly from The Business, but from turncoat engineers (and ex-engineers) trying to impress The Business. In the same way that young gang members are more prone to violence than elder dons, the most creative forms of evil seem to come from ex-programmers who’ve changed their colors.

The common enemy

So long as software engineers can easily be divided against each other on trivial matters like tabs versus spaces and scrum versus kanban, we’ll never get the respect (and, more importantly, the compensation) that we’re due. These issues distract us from what we really need to do, which is figure out how to work The Business. Clawing at each other, each trying to become the favored harem queen of the capitalist, is suboptimal compared to the higher goal of getting out of the harem.

I’ve spoken of “The Business” as if it were a faceless, malevolent entity. It might sound like I’m anti-business, and I’m not. Business is just a kind of process. Good people, and bad people, start businesses and some add great value to the world. The enemy isn’t private enterprise itself, but the short-term thinking and harem-queen politics of the established corporation. Business organizations get to a point where they cease having a real reason to exist, and all that’s left is the degenerate social contest for high-ranking positions. We, as programmers, seem to lack the skill to prevent that style of closed-allocation degeneracy from happening. In fact, we seem to unintentionally encourage it.

The evil isn’t that software is a business, but that technical excellence has long since been subordinated entirely to the effectively random emotional ups and downs of non-technical executives who lack the ability to evaluate our work. It’s that our weird ideology of “never get political” is actually intensely political and renders us easy to abuse. Business naturally seems to be at risk of anti-intellectual tendencies and, rather than fight back against this process, we’ve amplified it just to enjoy the illusion of being on the inside, among the “cool kids”, part of The Business. Not only does our lack of will to fight for our own interests leave us at the mercy of more skilled business operators, but it attracts an especially bad kind of them. Most business people, actually, aren’t the sorts of corporate assholes we’re used to seeing run companies. It’s just that our lack of social skill appeals to the worst of that set: people who come into technology to take advantage of all the clueless, loser nerds who won’t fight for themselves. If we forced ourselves to be more discerning judges of character, and started focusing on ethics and creativity instead of fucking tabs-versus-spaces, we might attract a better sort of business person, and have an industry where stack ranking and bastardized “Agile” micromanagement aren’t even considered.

If we want to improve our situation, we have to do the “unthinkable” (which is, as I’ve argued, actually quite thinkable). We have to get political.

VC-istan 8: the Damaso Effect

Padre Damaso, one of the villains of the Filipino national novel, Noli me Tangere, is one of the most detestable literary characters, as a symbol of both colonial arrogance and severe theological incompetence. One of the novel’s remarks about colonialism is that it’s worsened by the specific types of people who implement colonial rule: those who failed in their mother country, and are taking part in a dangerous, isolating, and morally questionable project that is their last hope at acquiring authority. Colonizers tend to be people who have no justification for superior social status left but their national identity. One of the great and probably intractable tensions within the colonization process is that it forces the best (along with the rest) of the conquered society to subordinate to the worst of the conquering society. The total incompetence of the corrupt Spanish friars in Noli is just one example of this.

In 2014, the private-sector technology world is in a state of crisis, and it’s easy to see why. For all our purported progressivism and meritocracy, the reality of our industry is that it’s sliding backward into feudalism. Age discrimination, sexism, and classism are returning, undermining our claims of being a merit-based economy. Thanks to the clubby, collusive nature of venture capital, to secure financing for a new technology business requires tapping into a feudal reputation economy that funds people like Lucas Duplan, while almost no one backs anything truly ambitious. Finally, there’s the pernicious resurgence of location (thanks to VCs’ disinterest in funding anything more than 30 miles away from them) as a career-dominating factor, driving housing prices in the few still-viable metropolitan areas into the stratosphere. In so many ways, American society is going back in time, and private-sector technology is a driving force rather than a counterweight. What the fuck, pray tell, is going on? And how does this relate to the Damaso Effect?

Lawyers and doctors did something, purely out of self-interest, to prevent their work from being commoditized as American culture became increasingly commercial in the late 19th century. They professionalized. They invented ethical rules and processes that allowed them to work for businessmen (and the public) without subordinating. How this all works is covered in another essay, but it served a few purposes. First, the profession could maintain standards of education, keeping membership in the profession a form of credibility independent of managerial or client review. Second, by ensuring basic credibility (and, much more importantly, employability) for good-faith members, it enabled professionals to meet ethical obligations (e.g. don’t kill patients) that supersede managerial or corporate authority. Third, it ensured some control over wages, although that was never its entire goal. In fact, the difference between unionization and professionalization seems to be as follows. Unions accept that the labor is a commodity, but ensure that the commoditization happens on fair terms (without collective bargaining, and in the absence of a society-wide basic income, that never occurs). Professionalization exists when there is some prevailing reason (usually an ethical one, as in medicine) to prevent full commoditization. If it seems like I’m whitewashing history here, let me point out that the American Medical Association, to name one example, has done some atrocious things in its history. It originally opposed universal healthcare; it has received some karma, insofar as the inventively mean-spirited U.S. health insurance system has not only commoditized medical services, but done so on terms unfavorable to physician and patient alike. I don’t mean to say that the professions have always been on the right side of history, because that’s clearly not the case; professionalization is a good idea, often poorly realized.

The ideal behind professionalization is to separate two senses of what it means to “work for” someone: (1) to provide services, versus (2) to subordinate fully. Its goal is to allow a set of highly intelligent, skilled people to deliver services on a fair market without having to subordinate inappropriately (such as providing personal services unrelated to the work, because of the power relationship that exists) as is the norm in mainstream business culture.

As a tribe, software professionals failed in this. We did not professionalize, nor did we unionize. In the Silicon Valley of the 1960s and ’70s, it was probably impossible to see the need for doing so: technologists were fully off the radar of the mainstream business culture, mostly lived on cheap land no one cared about, and had the autonomy to manage themselves and answer to their own. Hewlett-Packard, back in its heyday, was run by engineers, and for the benefit of engineers. Over time, that changed in the Valley. Technologists and mainstream, corporate businessmen were forced to come together. It became a colonial relationship quickly; the technologists, by failing to fight for themselves and their independence, became the conquered tribe.

Now it’s 2014, and the common sentiment is that software engineers are overpaid, entitled crybabies. I demolished this perception here. Mostly, the “software engineers are overpaid” whining is propaganda from those who pay software engineers, and who have a vested interest. It has lately been joined by leftist agitators, angry at the harmful effects of technology wealth in the Bay Area, who have failed thus far to grasp that the housing problem has more to do with $3-million-per-year, 11-to-3 product executives (and their trophy spouses, who have nothing to do but fight for the NIMBY regulations that keep housing overpriced) than with $120,000-per-year software engineers. There are good software jobs out there (I have one, for now) but, weighed against the negatives of the software industry in general (low autonomy relative to intellectual ability, frequent job changes necessitated by employers’ indifference to employee career needs, bad management), the vast majority of software engineers are, if anything, underpaid. Unless they move into management, their incomes plateau at a level far below the cost of a house in the Bay Area. The truth is that almost none of the economic value created in the recent technology bubble has gone to software engineers or lifelong technologists. Almost all of it has gone to investors, to well-connected do-nothings able to win sinecures from reputable investors and “advisors”, and to management. This should surprise no one. Technology professionals and software engineers are, in general, a conquered tribe, and the great social resource that is their brains is being mined for someone else’s benefit.

Here’s the Damaso Effect. Where do those Silicon Valley elites come from? I nailed this in this Quora answer. They come from the colonizing power: the mainstream business culture. This is the society that favors pedigree over (dangerous, subversive) creativity and true intellect, the one whose narcissism brought back age discrimination and makes sexism so hard to kick, even in software, which should, by rights, be a meritocracy. That mainstream business world is one in which work isn’t about building things or adding value to the world, but is purely an avenue through which to dominate others. Okay, I’ll admit that’s an uncharitable depiction. In fact, corporate capitalism and its massive companies have solved quite a few problems well. And Wall Street, the capital of that world, is morally quite a bit better than its (execrable) reputation might suggest. It may seem very un-me-like to say this, but there are a lot of intelligent, forward-thinking, very good people in the mainstream business culture (“MBA culture”). However, those are not the ones who get sent to Silicon Valley by our colonial masters. The failures are the ones sent into VC firms and TechCrunch-approved startups to manage nerds. Not only have they failed out of the MBA culture, but they’re bitter as hell about it, too. MBA school told them that they’d be working on $50-billion private-equity deals and buying Manhattan penthouses, and instead they’re stuck bossing nerds around in Mountain View. They’re pissed.

Let me bring Zed Shaw in on this. His essay on NYC’s startup scene (and its inability to get off the ground) is brilliant and should be read in full (seriously, go read it and come back when you’re done), but the basic point is that, compared to the sums of money that real financiers encounter, startups are puny and meaningless. A couple of quotes I’ll pull in:

During the course of our meetings I asked him how much his “small” hedge fund was worth.

He told me:


That’s right. His little hedge fund was worth more money than thousands of Silicon Valley startups combined on a good day. (Emphasis mine.) He wasn’t being modest either. It was “only” worth 30 billion dollars.

Zed has a strong point. The startup scene has the feeling of academic politics: vicious intrigue, because the stakes are so small. The complete lack of ethics seen in current-day technology executives is also a result of this. It’s the False Poverty Effect. When people feel poor, despite objective privilege and power, they’re more inclined to do unethical things because, goddammit, life owes them a break. That startup CEO whose investor buddies allowed him to pay himself $200,000 per year is probably the poorest person in his Harvard Business School class, and feels deeply inferior to the hedge-fund guys and MD-level bankers he drank with in MBA school.

This also gets into why hedge funds get better people (even, in NYC, for pure programming roles) than technology startups. Venture capitalists give you $5 million and manage you; they pay to manage. Hedge fund investors pay you to manage (their money). As long as you’re delivering returns, they stay out of your hair. It seems obvious that this would push the best business people into high finance, not VC-funded technology.

The lack of high-quality businessmen in the VC-funded tech scene hurts all of us. For all my railing against that ecosystem, I’d consider doing a technology startup (as a founder) if I could find a business co-founder who was genuinely at my level. For founders, it’s got to be code (tech co-founder) or contacts (business co-founder), and I bring the code. At my current age and level of development, I’m a Tech 8. A typical graduate of Harvard Business School might be a Biz 5. (I’m a harsh grader; that’s why I gave myself an 8.) Biz 6 means that a person comes with connections to partners at top VC firms and resources (namely, funding) in hand. The Biz 7s go skiing at Tahoe with the top kingmakers in the Valley, and count a billionaire or two in their social circles. If I were to take a business co-founder (noting that he’d become CEO and my boss), I’d be inclined to hold out for an 8 or 9, but (at least in New York) I never seemed to meet Biz 8s or 9s in VC-funded technology, and I think I’ve got a grasp on why. Business 8s just aren’t interested in asking some 33-year-old California man-child for a piddling few million bucks (which comes along with nasty strings, like counterproductive upper management). They have better options. To the Business 8+ out there, whatever the VCs are doing in Silicon Valley is a miserable sideshow.

It’s actually weird and jarring to see how bad the “dating scene”, in the startup world, is between technical and business people. Lifelong technologists, who are deeply passionate about building great technology, don’t have many other places to go. So a lot of the Tech 9s and 10s stick around, while their business counterparts leave, and a Biz 7 is the darling of the ball. I’m not a fan of Peter Shih, but I must thank him for giving us the term “49ers” (4s who act like 9s). The “soft” side, the business world of investors and well-connected people who think their modest connections deserve to trade at an exorbitant price against your talent, is full of 49ers, because Business 9s know to go nowhere near the piddling stakes of the VC-funded world. Like a Midwestern town busing its criminal element to San Francisco (yes, that actually happened), the mainstream business culture sends its worst and its failures into VC-funded tech. Have an MBA, but not smart enough for statistical arbitrage? Your lack of mathematical intelligence means you must have “soft skills” and be a whiz at evaluating companies; Sand Hill Road is hiring!

The venture-funded startup world, then, has the best of one world (passionate lifelong technologists) answering to the people who failed out of their mother country: mainstream corporate culture.

The question is: what should be done about this? Is there a solution? Since the Tech 8s, 9s, and 10s can’t find appropriate matches in the VC-funded world (and, for their part, most Tech 8+ go into hedge funds or large companies by their mid-30s: not bad places, but far away from new-business formation), where ought they to go? Is there a more natural home for Tech 8+? What might it look like? The answer is surprising, but it’s the mid-risk / mid-growth business that venture capitalists have been deriding for years as “lifestyle businesses”. The natural home of the top-tier technologist is not the flash-in-the-pan world of VC, but the get-rich-slowly world of steady, 20-to-40-percent-per-year growth driven by technical enhancement (not the rapid personnel growth and creepy publicity plays that VCs prefer).

Is there a way to reliably institutionalize that mid-risk / mid-growth space, which currently must resort to personal savings (“bootstrapping”), a scarce resource given that engineers are systematically underpaid, just as venture capital has institutionalized the high-risk / get-big-or-die region of the risk/growth spectrum? Can it be done with a K-strategic emphasis that forges high-quality businesses in addition to high-value ones? Well, the answer to that one is: I’m not sure. I think so. It’s certainly worth trying. Doing so would be good for technology, good for the world, and quite possibly very lucrative. The real birth of the future is going to come from a fleet of a few thousand highly autonomous “lifestyle” businesses, not from VC-managed get-huge-or-die gambits.

VC-istan 7: solving the wrong problem

I’ve written at length about VC-istan, its poor performance and its bigotries. What, however, is VC-istan’s “original sin”? Why is it so dysfunctional? Is there a foundational reason for its pattern of across-the-board moral and financial failure? I think the answer is obviously, “yes”. There’s a simple root cause: it’s solving the wrong problem. This requires two investigations: what problem should venture capital be solving, and what is it actually doing?

What’s the purpose of venture capital?

This is an easy one. The purpose of venture capital is to finance endeavors that require substantial backing in an all-or-nothing transaction. A biotechnology firm that requires $100 million to develop, and put into clinical trial, a new drug or device would be one example of this. With $10 million, it produces nothing salable; with ten times that, it has a chance. Others exist around infrastructure and in more deeply industrial pursuits like clean energy. Venture capitalists do invest in these spaces, and that’s outside of what I’d call “VC-istan”. Not everything that venture capitalists do is ugly, of course, and not all of it is VC-istan.

Venture capital, in a way, was originally intended as the “capital of last resort” for high-risk, capital-intensive businesses that would never qualify for more traditional financing. Why? Because when the proper way to invest is all-or-nothing, that has (unavoidable) negative consequences for all sides. It means that most people won’t get funded, and it’ll be extremely competitive to get capital, and dilution of founder equity will be severe. It’s not ideal, but if all you have is an idea, your product is 3 years away from the market in the best-case scenario, and you’re asking for $125 million to get started, those are the terms you have to take. It is, of course, quite a noisy process. The best ideas might not get funded, because there is literally no one able to assess what the best ideas are.

Venture capital for biotechnology and infrastructure has its own rules and culture. I’m not an expert on that, but it’s not what I consider “VC-istan”. From my perspective, which may be limited, venture capitalists in that space are trying to act in good faith and invest in viable businesses. To be blunt, I don’t think the “cool kids” nonsense (see: TechCrunch) matters so much in those sectors, because the science has to be sound. If you’re trying to turn algae into diesel fuel, Mike Arrington’s half-baked opinion of you matters a billion times less than the chemistry inside your lab.

What problem is VC-istan solving?

VC-istan is a subset of “all venture capital”, focused on the “hip” stuff that can be likened to “reality TV”. To explain this analogy, ask: why are “reality TV” shows so prolific? It’s not about their quality. Cheap production is often cited, but it’s not just about the numbers in the accounting ledger. The reality-show formula is one that admits perfect commoditization. Writers and actors, at high levels of talent, resist commoditization. They won’t work on shows that are bad for their careers, and they have agents whose full-time job is to represent their interests. This makes them non-fungible. At the highest levels of talent, labor may be able to push back against its own commoditization, because there are few enough people at those levels to make the market discrete rather than continuous– or, in other words, illiquid. Reality TV does away with those “prima donna” creatives and celebrities: the writing demands are minimal and can be met by a mediocre staff, and the actors are nonentities. This enables the production studio to iterate quickly with half-baked concepts, without needing to concern itself with the career needs of the parties involved.

VC-istan loves social media, and consumer-web marketing experiments, which are like reality television in that they can be produced with mediocre, “commodity-grade” inputs. To launch a biotech firm, one actually needs to have a strong grounding in science. Assessing founders for scientific literacy is hard, and private equity people are rarely up to the task. But any idiot can come up with, and hire someone good enough to implement, a Snapchat or a Clinkle. In the soft space of marketing experiments using technology, as opposed to the much harder sector that is technology proper, commodity founders and engineers suffice, and because the space is a gigantic bike shed, every investor feels entitled to have strong opinions. If genuine technical talent is needed for “scaling” down the road, it can be hired once the company has been covered by TechCrunch and appears legitimate.

Ultimately, the purpose of VC-istan’s “tech” companies is not to innovate or to solve hard problems. It’s to flip teams that have been validated by three to six rounds of venture funding, and possibly by success in the market (but that’s optional). Occasionally there’s an IPO, but those are about as common as big-company spinoffs. More typical is the “acqui-hire”, whose purpose can only be understood in the broader context of corporate dysfunction.

M&A has replaced R&D

A company’s need for top talent tends to be intermittent or subtle, and most often both. An example of the first (intermittent need) is a short-term crisis that only a small percentage of people have the insight, creativity, experience, or work ethic necessary to surmount. The second pertains to the long-term existential need for innovation: if the company doesn’t have some engine that produces the occasional positive-impact black swan, it will be torn to shreds by the bad kind, crises that no amount of talent or effort can resolve. While every company pays lip service to its need for top talent, the truth is that most companies don’t need top talent for their day-to-day operations. If they did, that would be irresponsible design: a dependency on something that is somewhere between a highly volatile commodity and an outright non-commodity. The need for top talent tends to be a long-term issue.

Top talent is difficult to truly employ; one merely sponsors it. Old-style corporations understood that, and invested in R&D. When the rare, truly existential crisis emerged, talent could be borrowed from the R&D pool. Additionally, while R&D could focus on basic research of general benefit to the world, not necessarily in the firm’s immediate, parochial interests, the corporation’s proximity to that research gave it enough of an edge in practical innovation to pay for itself several times over.

Unfortunately, basic research was one of the first casualties of the private equity invasion that began in the 1980s. The old R&D labs that built C, Unix, Smalltalk, and the internet weren’t scrapped outright, but they were reduced to a fraction of their former size and forced to take a next-quarter focus. Conditions weren’t actually made bad enough to flush existing talent out, but positions became scarce enough that new talent couldn’t get in. The executives of those companies weren’t all short-sighted idiots, though. They knew that high-autonomy, R&D-oriented work was the only thing keeping top talent in place. With corporate R&D near obliteration, that was threatened, so they knew they needed a solution to that talent-intake problem. What did private iniquity propose as a solution? More private equity.

Enter venture capital, formerly a subsector of private equity that was generally avoided by those with other career options, due to its infuriatingly intermittent performance. What would it mean, however, if venture capital could be made less “venture”, by filling a need created by the disintegration of another part of the economy? Companies shutting down the blue-sky, high-autonomy R&D work had to get talent somehow. Explicitly paying for it proved to be too expensive, except in investment banking, due to hedonic adaptation– people who are performing at a high level, if their needs for autonomy are not met, require 25-50% per year raises to be content. Tapping high-talent people for managerial ranks proved fruitless as well, because many of these people (while exceptional as individual contributors) had neither the desire nor the ability to manage (and, additionally, middle-management positions were also cut during the private equity invasion). The remaining solution to the talent problem became one that private equity men found extremely attractive, given the premium they collect on deals– companies must buy it.

I don’t intend to insult the low-level employees of the Googles and Yahoos of the world by saying that those companies have “no talent” at the bottom. That’s clearly untrue. Companies don’t acqui-hire (which is far more expensive than internal promotion) because they have no top talent in their ranks. They have plenty, but they acqui-hire because they have lost the ability to discover what they have. It’s a malfunction of the middle-management layer. These companies are like hoarders that buy new coats every winter not for a lack of coats, but because their houses are so out of order that a new purchase is preferable to sorting the old place out.

Moreover, a company cannot, in general, adequately commoditize its highest levels of talent. The best will always put their own career goals first, and perform at their peak only when there is coherency between their long-term personal goals and the work assigned to them. There are also, to put it bluntly, not enough such people to merit any explicit managerial correction to this problem: an executive focused on the career-coherency issues of the most talented 5% is ignoring the day-to-day work done by the other 95%. Two (problematic) solutions end up emerging. The first is for the company to ignore the high-talent problem and treat its top 5% like everyone else: closed allocation, low autonomy, etc. Then it loses them, plain and simple, and becomes dysfunctional after a few years of brain drain. The second is to leave them alone and effectively let them work on whatever they want. That’s great in the short term, but it can be politically messy. Others (who may belong in the top 5%, but haven’t been recognized) may resent their higher level of autonomy; or the favored group may lose sight of its need to continually market itself and justify its favorable conditions, and then be crushed (not for a lack of value to the organization, but for failing to sell that value) when there is a management or market change.

So what is the “problem” that VC-istan exists to solve? It’s there to commoditize top talent. Although a specific company cannot commoditize its top 5%, the conceit is that an army of dedicated specialists– a mix of investors, corporate biz-dev executives, and “tech press”– can do so. In the consumer web space, venture capitalists have become a sort of high-end headhunter, but one that follows different rules.

Consider one major difference between the old corporate ladder and the acqui-hire system: employers are not allowed to discriminate explicitly on age, pregnancy status, health issues, race, or gender. Investors can. Middle managers are too busy to conduct invasive “back channel” reference checks that, in truth, constitute civil harassment and would support blacklisting claims if they ever interfered with employment (thus, risk-averse companies prefer not to conduct them). Investors can, and in such a way as to work through people who will keep their secrets (preventing lawsuits). This is a wet dream of the new right wing: an Uberization of executive hiring. The old system, with decades of regulation built into it because those rules were actually necessary, has been supplanted by a premium, rule-breaking, and vicious new one. The people who need the old system’s regulations (i.e. women, minorities, people with health problems, people over 40, people with kids) are simply judged unfit to compete.

So, how well is VC-istan actually doing, on its own terms? The first question is: what does it mean to “commoditize top talent”? While that sounds like something I might be against, I can’t actually say it’s a bad thing, not even for top-talent people. When something is commoditized, a fair price (one that may fluctuate, but is fair relative to published market conditions) is established, and it’s very easy to buy or sell near that price. Currently, the compensation for top (2.0-level) engineering talent swings between about $75,000 and $10+ million per year (there is clear uncertainty about what it is worth), with a median around $150,000. If that level of talent were adequately and fairly commoditized, the range would be more like $300,000 to $500,000, which would give most of them a hefty pay bump. The truth about the commoditization of labor is that labor generally finds it unobjectionable when the terms are fair. In fact, one effect of labor unions is to explicitly commoditize labor while attempting to ensure fairness (whereas professions, in general, oppose commoditization regardless of terms). The murky issue in technology is that “top talent” is very hard to detect, because the people with the requisite skill have better things to do. Those who can, do; those who can’t, evaluate others’ work.

VC-istan, then, is built on the record-company model. Founders and engineers are treated as commodities (and generally, for reasons I won’t get into here, don’t get fair terms) but there is a hope that, thanks to the law of large numbers, top talent will be detected and validated by the outside market.

Where VC-istan went wrong is that it never figured out what top talent might look like, so the resources were thrown behind those who were either best at self-promotion or (increasingly, over time) those who could pull inherited connections. As a mechanism for detecting the rising generation’s top marketing talent, it might not be doing so badly. For picking out the best technical talent, especially as pertains to long-term R&D, it’s worse than abysmal; it’s doubtful that it’s picking up any signal at all. Companies that have a genuine need for R&D talent will be poorly served if they source it through acqui-hires.

VC-istan exists to commoditize top talent, but it has also erected a feudalistic reputation economy in which investors hold the cards. Founders hold few; engineers hold none. The highest levels of technical talent have been rendered effectively irrelevant by this new economy, which deprives them of any leverage whatsoever. So the terms are made bad, so bad that top engineering talent is rarely delivered. Whether this will strip VC-istan of credibility in the long run remains to be seen.

The point I’ve made here is that it’s “solving” an ugly problem in a bad way.

What can venture capital do for technology?

Venture capital’s purpose is to build companies that, if successful, will become massive corporate behemoths. On a fundamental level, it’s stuck in the 20th-century mentality where a gigantic organization is the only acceptable prize for winning. Startup life is sold (by founders, and rarely by investors directly) to talented, usually clueless, engineers as an antidote to the ills of “MegaCorp” when, in truth, the explicit purpose of the VC-funded startup is to become exactly that: a new MegaCorp, but usually with crappier health benefits, longer hours, and faster firing.

What the best engineers actually tend to want is high autonomy, so they can deliver exceptional work. They’d prefer ownership over it, all else being equal, but as long as they’re fairly compensated, they’re generally happy whether they work for a 20,000-person company or for themselves. When corporate R&D was sold for parts, venture-funded startups were proposed as the solution, the new way forward. Don’t like what happened to your old job? Create a new job for yourself! The lie here is that founding a VC-funded company provides the autonomy associated with true ownership. In truth, venture capitalists become full owners of the company (de facto, if not de jure, due to the power afforded them by VC’s feudal reputation economy) even when they hold a minority stake. Working for VCs is not fundamentally better than working for a boss; in many ways, it’s worse, because the social distance is greater. Most bosses don’t consider themselves inherently superior by virtue of favorable birth; venture capitalists, as a class, often do.

Several critical misses have become evident as venture capital has attempted to replace more traditional venues for innovation. One is that it has proven not to be a valid replacement for internal R&D: nothing that VC-istan has coughed up is anywhere near the order of magnitude of Bell Labs or Microsoft Research. The second is that it has failed to be an engine of small-business generation, which is necessary for economic growth. It hasn’t connected top talent with the autonomy that comes from ownership. Rather, it has abandoned top talent in its pursuit of commodity startups run by commodity founders and commodity engineers. Over time, one might hope that top talent will abandon it. That trend seems to be emerging, but I have no idea when or how (or, even at this stage, if) it will mature.

There is, additionally, a fundamental technical flaw in VC-istan, and I’ll focus on it because it might lead us in the direction of a solution. If we consider the risk/reward profile of businesses, we see an underserved middle of the spectrum. Low-risk businesses can take bank loans, but those require personal liability, so it’s not wise to use them for anything that might actually fail. High-risk gambits with above-90% chances of failure, but capable of returning 20-50x on success, are what VCs love. The mid-risk / mid-growth space (targeting 15 to 50% annual growth, with a low but nonzero chance of business failure) is inappropriate for bank loans (too risky) but unpalatable to venture capitalists (not risky enough). Unfortunately, I don’t see an easy fix. Venture capital could become very profitable by funding the 15-50% range, but investment decisions aren’t driven by profits so much as by the career needs of the investors. Returning a steady profit (say, 25% per year, with a bit of variance) by investing in a number of solid but moderately-sized businesses is not career-making; having been in on Facebook (even as a minor and late investor) is. The name-dropping world of Sand Hill Road cannot be expected to change, and as long as it does not, the focus will be less on building quality businesses and more on taking insane risks in the hope of hitting a career-making blockbuster.
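The incentive problem above can be made concrete with a little arithmetic. Here's a minimal sketch (every parameter is hypothetical, chosen only to mirror the ranges discussed in this essay: ~90% failure with 20-50x wins on one side, steady 15-50% annual growth on the other) comparing expected multiples on invested capital over a ten-year horizon:

```python
# Illustrative comparison of two financing models over a 10-year horizon.
# All parameters are made up, chosen only to mirror the ranges in the text.

def vc_portfolio_multiple(p_success=0.10, win_multiple=30.0, loss_recovery=0.0):
    """Expected multiple on capital for a 'get-big-or-die' bet:
    ~90% of companies fail (returning nothing), winners return 20-50x."""
    return p_success * win_multiple + (1 - p_success) * loss_recovery

def mid_risk_multiple(annual_growth=0.25, years=10, p_failure=0.15):
    """Expected multiple for a steady 25%-per-year business with a low
    but nonzero chance of outright failure over the whole period."""
    return (1 - p_failure) * (1 + annual_growth) ** years

print(f"VC-style expected multiple:  {vc_portfolio_multiple():.2f}x")
print(f"Mid-risk expected multiple:  {mid_risk_multiple():.2f}x")
# With these made-up numbers, the steady compounder wins in expectation.
```

Under these assumptions, the steady business comes out well ahead in expectation; the point is that nothing in the arithmetic keeps capital out of the middle of the spectrum. What keeps it out, as argued above, is the career calculus of the investors themselves.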

This is problematic because the mid-growth / mid-risk space is exactly where true technologists live. They do not become machine learning experts or compiler gurus in an overnight episode of “virality”, and whether Mike Arrington or Paul Graham owes them a favor is irrelevant to whether they can actually code. They get good (and, if possible, rich) slowly. In terms of abstract value-added capacity, 15 to 50% per year seems to be about the natural rate (although most engineers would be thrilled to have salary increases at even half that rate). Technologists are extremely good at delivering those 20-to-40-percent-per-year improvements. What lies outside their interest (and, usually, their ability) is engineering the social conditions that admit 100x “viral” growth (or, far more often, abysmal failure). It’s just not where they live; they weren’t born in casinos.

The future

VC-istan is not about to die, any more than recording labels have ceased to exist. As a method of shaving 15 years off a rich kid’s corporate-ladder climb via “acqui-hire”, it will persist. As a machine that produces commodity startups run by commodity entrepreneurs, it will persist and probably be profitable for quite some time. As a way of enabling companies to discriminate on age, health, pregnancy status, and other illegal factors at upper levels (filled through acqui-hires, while rendering internal promotion rare) while keeping the discrimination off their books, it will hold that niche quite well. How relevant will VC-istan remain to true top talent? There, its lifespan may be limited; in that territory, it’s “ripe for disruption”.

So what shall be built to bring the disruption?

An alternate theory of shark-jumping

I’ve watched a fair amount of TV in my life, seen quite a few movies, and read a large number of books. A recurring theme in creative work is “jumping the shark”: the decline in quality that occurs when a series (of TV seasons, or sequential movies) seems to run out of creative steam and begins “grasping” desperately at new ideas– often, ideas that are not necessarily bad but completely incoherent with the flavor of the series– as it tries to stay relevant. I’m going to address shark-jumping: why it happens, and whether there is a way to prevent it.

In the abstract

There are a number of reasons why a series might decline in quality with age, a phenomenon most prominently seen in TV series with undefined length. Why does it happen? The most common explanation given is that the show’s originators “run out of ideas”, as if there were a finite supply of them that each person gets for one lifetime. I don’t think this is adequate, for two reasons. The first is that not all creative people “jump”. Some novelists run out of ideas and peak early; others keep getting better into old age. It doesn’t seem to be that common for a person to actually “run out of ideas”; some creative people become complacent once they’re lifted into upper-middle-class social acceptance (which is hard to attain for a creative person!) but that’s a change of context rather than a natural decline, and it doesn’t happen to everyone. The second is that it’s not a sufficient explanation, in light of the first point. Specific creative people can remain fresh for 15 years, no problem. But almost no fictional TV series, no matter how skilled its people, can stay fresh for that long. Most don’t keep quality for a third of that time.

In fact, the more people and money involved in a creative production, the faster the shark-jumping process– which is the opposite of what you’d expect if it were merely a problem of people running out of ideas. Novelists can stay fresh for a lifetime, while TV series tend to jump the shark after 3-6 years on average. Movies tend to jump even more quickly than that– in the first sequel, except in planned series (e.g. those that were designed to be trilogies from the outset). Magic: the Gathering (which requires a large design team) jumped, in terms of thematic quality, when I was half my current age, but Richard Garfield’s new games are still good.

This suggests strongly that shark-jumping is about teams, not individuals. That makes sense. The “idea person” might remain brilliant, but if her team is full of hacks, she’ll be inclined to stick to the tried-and-true. That’s one pattern of shark-jumping, but probably not the most common. Equally or more common is a pattern of taking more risks, with the new creativity feeling gimmicky and forced. When The Office jumped, it began taking more risks, but was incoherent and haphazard in doing so. When House jumped, the characters’ personal lives became more unusual. Whether more risks or fewer are taken, quality declines either way.

If shark-jumping is about teams, then why not fire the old team and start with an entirely fresh set of people? Most often, that will only make things worse. Even if the people on the new team are paid four times as well, and even if they’re individually quite creative, I maintain that their output will (on average) be worse than if the old team had stayed (in which case decline would still occur). As a TV or movie series matures, the set of constraints laid down upon future creativity increases. That isn’t always bad. More rigid poetic forms like the sonnet (as opposed to free verse) often encourage creativity because the poet has to spend a lot more time thinking about words, and how they sound and flow together, than in typical prose. The same, I think, goes with serial creative work. The increasing constraint load, for some time, actually improves the product. In TV, Season 2 is typically better than Season 1. There is a point, however, when those constraints become a burden. Reasonable avenues of exploration become fewer as the story moves along. That’s not unnatural. In drama, we see that in the tragic arc: eventually, the protagonist reaches a point where the only remaining option is to surrender to the forces that have multiplied against him; the mortal dies, the gods win. In a television series intent on prolonging its life, however, this results in increasingly ridiculous ploys to get the main characters out of whatever final state– whether a positive one like marriage for a lothario, or a negative one like imprisonment or terminal illness– they’ve arrived at. This should also explain why series designed with a finite life in mind (such as Breaking Bad) rarely jump the shark. They’re programmed to end before that would happen.

As much as shark-jumping is about the increasing constraint load and the inverted-U shape of its effect on creative output, it’s also about people. Would the same calibre of people sign up to work on Inception II as worked on the original? I doubt it. It’d be possible to get good people, yes, but the best people would prefer to work on something more original than a sequel. You’d get more people who are there to burnish their resumes and fewer who are there to do the best creative work of their lives. Mature brands tend to draw people in with careerist rather than creative impulses: ability to lead a large group, attach one’s name to a known entity, etc. The average credibility (in terms of on-paper accomplishment and social status) goes up as the brand matures, and this might also improve the mean output, but it reduces variance. Thus, peak creative output is almost always lower in the brand’s later phases.

Therefore, a “fire the old team” strategy is likely to accelerate the shark-jumping problem, which is about the type of team that a series will attract more than the individuals themselves. The old-timers who had the vision are gone, and they’ve been replaced by people who are on the job for careerist reasons. In general, I’d say there’s nothing wrong with this– most people take most jobs for careerist reasons– but it’s not conducive to the highest levels of creative output. If there are still a couple of clever ways out of no-credible-options-left shark-jump territory, a fresh team of mercenaries is not likely to find them. They’re likely to barge through walls, strain credibility, and make shark-jumping palpable in the final product.

It’s not that people “run out of ideas”. They don’t. Teams, however, lose members and gain new ones constantly. That’s inevitable. And if there’s one thing I’ll say confidently about creative people as an entire set, it’s that we’re intermittent. Something like a TV series requiring 600 minutes of show time (using the industry-standard 100:1 multiplier, that’s 1000 hours of production time) requires a creative team, because even the best individuals can’t hit every note right over that duration without some help. So, at least in television, even the best of visionary creators needs the support of (and challenges from) a strong team to keep going. And no matter what, that team will evolve in a direction that’s likely to be sharkward. The new team might be paid four times as much as the old one but, by Season 7, almost no one’s focus is on the work in front of them. Rather, they’re more interested in Season 1 of their next project, where they’ll have more input and opportunity to shine. This career incoherency (disparity between what’s good for their jobs vs. their careers) doesn’t actually cause them to “run out of ideas”. More often, it’s the reverse. They (probably subconsciously, for the most part) take risks that may confer personal career benefits, but that go against the grain of what the series is really about.

In software

That this applies to software, also, should not surprise anyone. Like a television series, software is designed with an indefinite lifespan in mind. There is one difference, though: software doesn’t always jump the shark. In the open-source world, it’s far less likely to do so. However, I think that commercial software obeys the same principle of shark gravity. After a certain point, a corporate software module will be in maintenance mode and struggle to attract a high calibre of people.

There are people who will hack the Linux Kernel or Postgres or Clojure because they use those products and care about them deeply. Open-source software is, in truth, a brilliant solution to the career-coherency problem: people can benefit their careers and add value to the world. Such software can jump the shark, but I don’t think it’s guaranteed to do so, and the best software products seem never to jump. There are career benefits to maintaining a respected open-source product, and the fact that the maintainer is also typically a user (and, therefore, aware of existing context) prevents the bad creative risks for which post-shark teams are known.

In-house or commercial software, on the other hand, seems always to jump. Within most companies, the career payoff of maintenance work is almost invariably inferior to that of new invention. Open-source software solves the career-coherency problem, but internal products almost never become respected enough for that to happen. Software’s shark-jumping dynamic is, in many ways, actually more severe than that of a TV series. In television, the people who join a mature series aren’t necessarily less qualified or worse at their jobs– they have different objectives that are less conducive to doing their best work, but they’re not across-the-board less qualified people. In closed-allocation software companies, however, maintenance usually becomes the ghetto for people who can’t fight their way to something meatier, and very few people who are any good will stay with it for very long.

Rarely, if ever, is a closed-allocation software company able to solve this problem. When the company recognizes that a legacy module is important, it will commit resources to its upkeep, usually in two ways. The first is to increase headcount, and the second is to increase salaries of the people doing the work. The first tends to attract less capable people, for two reasons. The incompetent like larger teams because they can “hide within the herd”. But this applies to middle managers as well as front-line workers. Mediocre managers also prefer large teams because it inflates their headcount statistics; they’re more likely to make Director if they can say they had ten reports than if they had three. Good managers generally want to lead teams of high average competence and achieve something tangible; mediocre and bad managers usually want to lead large teams (with minimal concern over whether they get good reports or bad) to improve their stats. So that first solution fails to have the desired effect. What about the second, which is increasing the pay of the maintenance staff? That rarely works, either. The truth is that a savvy, capable software engineer can’t be motivated to do career-incoherent work with a one-time 20 percent– or even 50 percent– bonus. The opportunity cost for her (in not doing work that will advance her career) is too great. She might be appeased with a permanent 30% salary bump, for a year or two, but then that will become “the new normal” for her compensation and she’ll need another bump. But HR is not about to let the salary for a lowly “Software Engineer III” go that far out of band, and promoting her (to compensate for unpleasant work, regardless of whether she meets the typical criteria for promotion) will often annoy engineers who will (accurately) perceive the promotion as “political”.
Even if engineers abstractly agree that undesirable work deserves a reward, they’ll usually oppose a promotion (especially if it is over them) that appears to be given for doing grunt work that is (and because it is) unpleasant rather than technically impressive. So that’s untenable, too. How does the typical closed-allocation software company solve that maintenance problem? The rewards generally all go to the “heroic” middle manager (who usually takes the project on for a second or third chance, after failing at new invention) for “rescuing” the ailing legacy module. In the large closed-allocation software titans, these awards (again, to the managers of the maintenance projects, and rarely to the teams) can reach six or seven figures. The peons get nothing; they’re just doing their jobs and, in the typical closed-allocation hellhole, their managers can easily prevent them from having other options.

That the above doesn’t work, at all, shouldn’t surprise anyone. I’ve already said a lot about that topic here and here, so I won’t lengthen this particular essay by reviewing it.

In sum, shark-jumping (whether in television, or in software) occurs for two reasons, neither of which requires an individual to “run out of ideas” (we know that that doesn’t always happen). The first pertains to the constraints imposed by the project’s history. At first, constraint is conducive to superior creativity– that’s why most poets are better in rigid forms than in free verse– but, at some point, the complexity load gets to a point where high-quality options have been exhausted. The second, and probably more inexorable, factor is the change in team dynamics. As a brand matures or a software module goes into maintenance mode, the evolution in the motivational profile (that is, why the team is there) is enough to bring on the shark.

What is the solution? For television, the best solution seems to be to let the narrative arc tend toward its natural close– and not to senselessly prolong the life of the series. Breaking Bad did that and never jumped, but with another season, it probably wouldn’t have been as good. Software doesn’t have that option, because it’s designed to be infrastructural in nature. It should mature to a point where it “just works” from the perspective of 99+ percent of those who interact with it. The issue is that someone will have to maintain it. In software, the only incentive system that seems to work– i.e. the only one that can solve the otherwise-crippling career-coherency issues of maintenance work– is the open-source economy.

Software engineer salaries aren’t inflated– at least, not for the 99%

It’s autumn 2013, and there’s a lot of discussion around the current bubble (now obviously one) in the VC-funded technology world and how it will end. Business Insider acknowledges that a bubble exists, but gets some crucial details wrong. Let’s talk about one that most of us actually care about. Business Insider claims: “It’s not just tech asset prices that are high. Salaries are high, too.” Them’s fighting words. Is it true? Well, sort-of. Here’s the evidence, from tech recruiter Matt Allen:

Instead, we’re seeing sign-on bonuses for individuals five-years out of school in the $60,000 range. Candidates queuing-up six, eight or more offers and haggling over a few thousand-dollar differences among the offers. Engineers accepting offers and then fifteen minutes before they’re supposed to start on a Monday, emailing (not calling) to explain they found something better elsewhere.

Ok, let’s dissect this. One: a few people (and it’s not clear that they’re engineers) are getting huge signing bonuses. $60,000 isn’t a number to sneeze at, but it’s not that extreme. Management-level hires typically get signing/relocation bonuses that cover the cost of a cross-country move (easily over $20,000, for people with families) and there’s no reason software engineers shouldn’t get the same. Additionally, signing bonuses usually have clawback provisions if the employee leaves (even involuntarily) in the first year, penalizing the job-hopping for which the worst of our generation is known. Given the tax penalty associated with receiving a bonus and risking having to pay it back, I’m not sure I’d want a $60,000 bonus under typical terms. Two: some candidates are queuing up 6 to 8 job offers. I call bullshit on that one, if only because of the scheduling difficulties in a startup ecosystem where 7-day exploding offers are the norm. I’m sure there are people getting 6-8 offers in the course of an entire job search (I’ve had that) and that people are claiming to have portfolios of excellent offers in negotiation, but the logistics of getting 6 active, credible job offers at one time are unfavorable, to say the least. Three: people are being unprofessional dickbags, pulling out of accepted offers on their start date. I’m sure that that is happening, but how is an occasional episode in which a privileged young hotshot acts like a jackass newsworthy, much less the sign of a bubble? It’s not.
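
The logistics claim can be made concrete with a toy simulation. All parameters below (length of the search, number of offers) are hypothetical; the point is that with 7-day exploding offers, holding six live offers at once requires six arrivals inside a single week:

```python
import random

def avg_max_live_offers(n_offers, search_days, window_days, trials=2000, seed=42):
    """Toy model: offer arrival times are uniform over a job search, and each
    offer 'explodes' after window_days. Returns the average, over many trials,
    of the maximum number of offers live at the same moment."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        arrivals = sorted(rng.uniform(0, search_days) for _ in range(n_offers))
        best = 0
        for i, t in enumerate(arrivals):
            # offers live at time t: already arrived, not yet expired
            live = sum(1 for a in arrivals[: i + 1] if t - a < window_days)
            best = max(best, live)
        total += best
    return total / trials

# Eight offers over a two-month search, each with a 7-day fuse:
print(avg_max_live_offers(8, 60, 7))  # far below 8; most offers never overlap
```

Getting 6-8 offers over a whole search is plausible; having them all active and negotiable at once, under exploding-offer norms, mostly isn’t.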

Managers and product executives are making a killing in the present-day startup economy, no doubt, and some of those people might be able to pass as programmers due to some PHP scripts they wrote in their teens, but when one actually studies the contemporary startup economy, there are not a lot of software engineers making over $200,000 per year outside of finance– and those who are tend to be either very good or unusually lucky. For a VC-funded startup to offer $200,000 to an engineer would be incredibly rare, even in the Bay Area, and equity allotments after VCs are involved are notoriously stingy.

Twenty years ago, when startups were underdogs almost by definition, the scene had a “Revenge of the Nerds” feel. A bunch of ragtag computer aficionados, typically from middle-class backgrounds and far away from the East Coast’s financial and corporate elite, were showing up the old guard. New, powerful technologies were being developed, and power shifted (temporarily, perhaps) to those few who understood them at a deep level. There was slight subversion of the 1%; they weren’t destroyed or even harmed, but they were visibly outperformed. For a contrast, the hot properties of the current VC-funded world almost entirely come from the 1%. Behind almost every single one of the VC darlings, there’s a series of strings pulled by powerful people repaying favors to the rich daddies of the founders. There’s no meritocracy in it. It’s not a challenge to the established and rich; it’s a sideshow for the supercapitalists. In a surprising reversal, the old-style corporate world (and the enterprise companies existing and being formed to serve their needs) has a much more middle-class culture, because the current-day rich find it boring.

Software engineer salaries in the VC-funded world are not especially low (nor are they high). They’re usually 75 to 95 percent of what more typical employers are offering. Equity distributions, on the other hand, are extremely lopsided. I worked for a company once where the board refused to allow more than 0.04% to go to an engineer. (Why? Because fuck the people doing the work, that’s why.) There’s something that needs to be discussed here, because it applies to the age-old question of why people who do actual work are modestly compensated, while vacuous celebrity types take the lion’s share. It’s the Teacher-Executive Problem.

The Teacher-Executive Problem

As a society, we need teachers, police officers, park rangers, and other such people who are modestly compensated. We don’t need celebrities, business executives, or professional athletes. I’m not going to argue that the latter are overpaid, insofar as it’s difficult and competitive to get to the top ranks in any field. That would be a subjective argument; all I intend to say is that, objectively, the need for the latter class of labor is smaller. If we didn’t have teachers or police, society would fall apart. If we didn’t have corporate executives, companies would find other ways to survive. So why are the more necessary people paid less? Because being necessary means that more workers will be drawn into the field, and that limits individual compensation. We probably pay more, as a society, for teachers and police than we do for corporate executives (as we should) but the individual slices for the larger, more necessary, job categories are smaller.

We have 3 million teachers in the US, and we need that large number of them, because individual attention per student is important. The functioning of society would be greatly impaired if that number dropped to 2 or 1 million. One might argue that competent teachers are “worth” $200,000 (or much more) per year– and I’d say that the best are worth several times that– but can society afford to pay that much for teaching? Three million $200,000 paychecks is a $600-billion annual liability. Taxes would go up substantially– in a time when the base of political power is (unfortunately) divided between a structurally disadvantaged (read: mostly fucked) emerging-adult cohort and retiring Boomers whose children are out of school– and society would likely determine that $200,000 annual paychecks for teachers “can’t be afforded” (especially given the claim that “they get off work at 3:00”). $200,000 isn’t a large amount of money for a single person, but for people who are actually needed in significant numbers, the multiplier of 3 million makes it seem unacceptable. (I am not arguing that teachers don’t deserve $200,000 salaries; only that it would be politically impossible to get them there.)
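
The multiplier at the heart of this problem is simple arithmetic. A quick sketch (the executive headcount and salary below are hypothetical round numbers, not data):

```python
def aggregate_payroll(workers, salary):
    """Total annual cost of paying `workers` people `salary` each."""
    return workers * salary

teachers = aggregate_payroll(3_000_000, 200_000)   # 3M teachers at $200k each
executives = aggregate_payroll(10_000, 5_000_000)  # 10k executives at $5M each

print(f"${teachers / 1e9:.0f}B")    # $600B per year: politically "unaffordable"
print(f"${executives / 1e9:.0f}B")  # $50B per year: barely noticed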

For a contrast, the social need for corporate executives (excluding entrepreneurs) is pretty minimal, and society recognizes this in a rational way: there aren’t a large number of slots: title inflation aside, there might be ten thousand truly executive roles in powerful companies. However, when the number of people performing a job is low, gigantic salaries (if those people control the distribution of resources) become socially affordable. Three million somewhat high salaries is a problem; ten thousand enormous ones are not. This is paradoxical because the middle-class conceit is that the way to become wealthy is to make oneself valuable (or, better yet, necessary) to society. What the Teacher-Executive Problem shows us is that there’s more potential for outlier compensation in doing things that aren’t necessary, because asking for more compensation doesn’t carry the implicit multiplier based on the size of the labor base. Society “can’t afford” to pay the 3 million teachers such high salaries, but it can afford the huge salaries of corporate executives, and the $850-million acquisition that enriches the top executives of a single company barely registers.

Why do so few software engineers get a fair shake in the VC-funded world? They’re on the wrong side of the Teacher-Executive Problem. They’re actually necessary. They’re required in order for technology firms to function.

What about 10X?

The general consensus (even among software engineers) is that average programmers aren’t very valuable. They write all that buggy, hideous legacy code. There’s little that software engineers and business executives agree upon, but the low status of the average programmer is probably not a point of disagreement. I don’t care to speculate on what the “average” software engineer is like, because while I have seen a ton of incompetents (and a smaller number of good engineers) out there in the world, I don’t have a representative sample. I also think that most of the engineering incompetence comes not from a lack of ability, but from an anti-intellectual culture originating in business culture at large, as well as nonexistent mentoring, so it’s not programmers who are mostly at fault. However, I will agree readily that the bulk of software engineers don’t deserve high ($200,000) salaries. They might have the talent, but few have that level of skill.

However, there is the concept of the “10x” software engineer, one who is 10 times as productive as an average engineer. It reflects a truth of software engineering, which is that excellence and peak productivity are tens to hundreds of times more powerful than the average-case output. (In fact, often that ratio is infinite because there are problems that require top talent to solve them.) Moreover, groups of engineers often scale poorly, so a team of 10 isn’t really (most of the time) 10 times as productive as an individual, but maybe 2 or 3 times as strong. So it’s not surprising that a great engineer would be 10 times as valuable. The degree to which “10x” is real depends on the type of work, the context, project-person fit, and the competence of the engineer. It’s highly context-dependent, it’s not always the same people, and it’s quite unpredictable. The national average salary for a software engineer is about $90,000. The 10x-ers are not earning 10 times that and, to be honest about it, they probably shouldn’t. You can’t know, when hiring someone, whether the context that supports 10x output for that person is going to exist in the role. The bona fide 10x engineers typically earn 1.5 to 2 times that amount ($135,000 to $180,000) in the U.S. I’m not going to argue that they’re underpaid at this level– although, at least in comparison to MBA graduates earning twice that before age 30, I think they clearly are– but they’re far from overpaid at that level.
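
Using the essay’s own figures, the gap between output multiples and pay multiples reduces to a few lines of arithmetic:

```python
avg_salary = 90_000             # rough national average for a software engineer
tenx_band = (135_000, 180_000)  # what bona fide 10x engineers typically earn

# Pay multiple vs. output multiple:
pay_multiples = tuple(s / avg_salary for s in tenx_band)
print(pay_multiples)  # (1.5, 2.0), against a 10x output multiple

# Team scaling from the same paragraph: ten average engineers deliver
# maybe 2-3x one individual's output, yet cost 10x the payroll.
team_payroll = 10 * avg_salary  # $900,000 for perhaps 2-3x output
tenx_pay = tenx_band[1]         # $180,000 for up to 10x output
print(team_payroll / tenx_pay)  # 5.0: the team costs five times as much
```

So even at the top of the band, the 10x engineer is a bargain relative to the team she can replace.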

Why don’t 10x engineers get paid astronomical sums? For a large part, I think it’s because of the context-dependent nature of “10x”. It doesn’t require only a good engineer, but a good engineer connected with the right kind of work. Companies can’t afford (obviously) to pay $900,000 salaries to senior engineers just on the hunch that those (typically highly specialized) talents will find a use. When engineers do find environments in which they can deliver 10x output, they’re happy– and they’re not liable to clamor for huge raises, or to move quickly and risk starting over in a defective environment. This isn’t especially wrong; engineers would rather have interesting work at “merely” 1.5x salaries than risk happiness and growth for a chance at more. It’s just worth pointing out to establish that, in general, software engineers (and especially the most capable ones) are not overpaid. Moreover, the people commanding $500,000+ salaries in technology are typically not real engineers, but managers who might occasionally “drop down” and hack on one of the sexier projects to keep their skills sharp. Finally, the few (very few) software engineers making that kind of money are generally worth it: we’re not talking about top-1% talent at that level, but top-0.05% (and a level almost never achieved before age 40). There are plenty of people drawing undeserved high salaries in the Valley, but they aren’t the ones writing code.

Why must I point this out?

This bubble, too, shall pass. The era when a well-connected rich kid can raise a $1-million seed round rather than taking his lumps in an investment bank’s analyst program will end. I don’t think that that’s a controversial assumption. Timing the end of a bubble is nearly impossible, and I don’t think anyone has shown reliability in that particular skill; but predicting that it will die off is trivial– they always do. When this happens, there will be a lot of job losses and belt-tightening. There always is. It’ll get ugly, and that’s fine. Most of these businesses getting funded and acquired don’t deserve to exist, and the economy will inevitably purge them. What I don’t want to see is the bubble made into an argument against the middle-class, hard-working software engineers. When the bubble ends, there will be losses to eat and austerity to go around, and it ought to go right to the people who reaped the benefits while the bubble existed. The end of the bubble should not be used to reduce the compensation of software engineers as a whole, whose pay is currently (I would argue) not quite unfair, but on the low side of the fair interval.

For the 99 percent, there is no software engineer salary bubble.

Blub vs. engineer empowerment

No, I’m not quitting the Gervais / MacLeod Series. Part 23, which will actually be the final one (because I want to get back to spending my spare time on technology), is half-done. However, I am going to take a break from it to write about something else.

I’ve written about my distaste for language and framework wars, at least when held for their own sake. I’m not fading from my position on that. If you go off and tell someone that her favorite language is a U+1F4A9 because it’s (statically|dynamically) typed, then you’re just being a jerk. There are a few terrible languages out there (especially most corporate internal DSLs) but C, Python, Scala, Lisp and Haskell were all designed by very smart people and they all have their places. I’ve seen enough to know that. There isn’t one language to rule them all. Trust me.

Yet, I contend that there is a problem of Blub in our industry. What’s Blub? Well, it’s often used as an epithet for an inferior language, coined in this essay by Paul Graham. As tiring as language wars are, Blubness is real. I contend, however, that it’s not only about the language. There’s much more to Blub.

Let’s start with the original essay and use Graham’s description of Blub:

Programmers get very attached to their favorite languages, and I don’t want to hurt anyone’s feelings, so to explain this point I’m going to use a hypothetical language called Blub. Blub falls right in the middle of the abstractness continuum. It is not the most powerful language, but it is more powerful than Cobol or machine language.

And in fact, our hypothetical Blub programmer wouldn’t use either of them. Of course he wouldn’t program in machine language. That’s what compilers are for. And as for Cobol, he doesn’t know how anyone can get anything done with it. It doesn’t even have x (Blub feature of your choice).

As long as our hypothetical Blub programmer is looking down the power continuum, he knows he’s looking down. Languages less powerful than Blub are obviously less powerful, because they’re missing some feature he’s used to. But when our hypothetical Blub programmer looks in the other direction, up the power continuum, he doesn’t realize he’s looking up. What he sees are merely weird languages. He probably considers them about equivalent in power to Blub, but with all this other hairy stuff thrown in as well. Blub is good enough for him, because he thinks in Blub.

When we switch to the point of view of a programmer using any of the languages higher up the power continuum, however, we find that he in turn looks down upon Blub. How can you get anything done in Blub? It doesn’t even have y.

By induction, the only programmers in a position to see all the differences in power between the various languages are those who understand the most powerful one. (This is probably what Eric Raymond meant about Lisp making you a better programmer.) You can’t trust the opinions of the others, because of the Blub paradox: they’re satisfied with whatever language they happen to use, because it dictates the way they think about programs.

So what is Blub? Well, some might read that description and say that it sounds like Java (has garbage collection, but not lambdas). So is Java Blub? Well, not quite. Sometimes (although rarely) Java is the right language to use. As a general-purpose language, Java is a terrible choice; but for high-performance Android development, Java’s the best. It is not James Gosling’s fault that it became the go-to language for clueless corporate managers and a tool-of-choice for mediocre “commodity developers”. That fact may or may not be related to weaknesses of the language, but it doesn’t make the language itself inferior.

Paul Graham looks at languages from a language-designer’s viewpoint, and with an emphasis on aesthetics. Given that he’s an amateur painter whose original passion was art, that shouldn’t surprise us. And in my opinion, Lisp is the closest thing out there to an aesthetically beautiful language. (You get used to the parentheses. Trust me. You start to like them, because they’re invisible when you don’t want to see them and highlight structure when you do.) Does this mean that Lisp is right for everything? Of course not. If nothing else, there are cases when you don’t want to be working in a garbage-collected language, or when performance requirements make C the only game in town. Paul Graham seems to be focused on level of abstraction, equating the middle territory (Java and C# would hold that ground, today) with mediocrity. Is that a fair view?

Well, the low and high ends of the language-power spectrum tend to harbor a lot of great programmers, while the mediocre developers tend to be Java (or C#, or VB) monoglots. Good engineers are not afraid to go close to the metal, or far away from it into design-your-own-language land, if the problem calls for it. They’re comfortable in the whole space, so you’re more likely to find great people at the fringes. Those guys who write low-latency trading algorithms that run on GPUs have no time to hear about “POJOs”, and the gals who blow your mind with elegant Lisp macros have no taste for SingletonVisitorFactories. That said, great programmers will also operate at middling levels of abstraction when that is the right thing to do.

The problem of Blubness isn’t about a single language or level of abstraction. Sometimes the C++/Java level of abstraction is the right one to work at, so there certainly are good programmers using those languages– quite a large number of them, in fact. I worked at Google, so I met plenty of good programmers using these generally unloved languages.

IDEs are another hot topic in the 10xers-versus-commodity-engineers flamewar, and I have mixed feelings about them. When I see a 22-year-old settling into his first corporate job and having to use the mouse, that “how the other half programs” instinct flares up and I feel compelled to tell him that, yes, you can still write code using emacs and the command line. My honest appraisal of IDEs? They’re a useful tool, sometimes, and with the right configuration they can be pretty neat. My issue is that they tend to be symptomatic: IDEs really shine when you have to read large amounts of other people’s poorly-written code. Now, I would rather have an IDE for that task than not have one (trust me; I’ve gone both ways on that) but I would really prefer a job that didn’t involve trudging through bad legacy code on a daily basis. When someone tells me “you have to use an IDE around here”, I take it as a bad sign, because it means the code quality is devastatingly bad, and the IDE’s benefit will be to reduce Bad Code’s consumption of my time from 98% to 90%– still unacceptable.

What do IDEs have to do with Blub? Well, IDEs seem to be used often to support Blubby development practices. They make XML and Maven slightly less hideous, and code navigation (a valuable feature, no disagreement) can compensate, for a little while, for bad management practices that result in low code quality. I don’t think that IDEs are inherently bad, but I’ve seen them take the most hold in environments of damaged legacy code and low engineer empowerment.

I’ve thought a lot about language design and languages. I’ve used several. I’ve been in a number of corporate environments. I’ve seen good languages turn bad and bad languages become almost tolerable. I’ve seen the whole spectrum of code quality. I’ve concluded that it’s not generally useful to yell at people about their choices of languages. You won’t change their minds, nor will they change yours, and I’d rather work with good code in less-favored languages than bad code in any language. Let’s focus on what’s really at stake. Blub is not a specific language, but it is a common enemy: engineer disempowerment.

As technologists, we’re inclined toward hyperrationality, so we often ignore people problems and dress them up as technical ones. Instead of admitting that our company hired a bunch of terrible programmers who refuse to improve, we blame Java, as if the language itself (rather than years of terrible management, shitty projects, and nonexistent mentorship) somehow jammed their brains. That doesn’t hold up, because not every Java programmer is brain-damaged. When something goes to shit in production, people jump to the conclusion that it wouldn’t have happened in a statically-typed language. Sorry, but that’s not true. Things break in horrible ways in all kinds of languages. Or, alternatively, when development is so slow that every top-25% engineer quits, people argue that it wouldn’t have happened in a fast-prototyping, dynamically-typed language. Wrong again. Bad management is the problem, not Scala or Python or even Java.

Even terrible code isn’t deserving of the anger that’s directed at it. Hell, I’ve written terrible code, especially early in my career. Who hasn’t? That anger should be directed at the manager who is making the engineer use shitty code (because the person who wrote it is the manager’s favorite) and not at the code itself. Terrible romance novels are written every day, but they don’t anger me, because I never read them. But if I were forced to read Danielle Steel novels for 8 hours per day, I would fucking explode.

Ok, that’s enough negativity for a while…

I had a bit of a crisis recently. I enjoy computer science and I love solving hard problems. I enjoy programming. That said, the software industry has been wearing me down, these past couple of years. The bad code, low autonomy, and lack of respect for what we do are appalling. We have the potential to add millions of dollars per year in economic value, but we tend to get stuck with fourth-quadrant work that we lack the power to refuse. I’ve seen enough of startups to know that most of them aren’t any better. The majority of those so-called “tech startups” are marketing experiments that happen to involve technology because, in the 21st century, everything does. I recently got to a point where I was considering leaving software for good. Computer science is fine and I have no problem with coding, but the corporate shit (again, just as bad in many startups) fries the brain and weakens the soul.

For some positivity, I went to the New York Clojure Meetup last night. I’ve been to a lot of technology Meetups, but this one had a distinct feel: the energy was more positive than at most technical gatherings. The crowd was very strong, but that’s true of many technical Meetups; here, there was a flavor of “cleaner burning” in addition to the usual high intelligence. People weren’t touting one corporate technology at the expense of another, and there was real code– good code, in fact– in a couple of the presentations. The quality of discussion was high, in addition to the quality of the people.

I’d made this observation before about certain language communities: the differences between the communities are much greater than the differences between the languages. People who intend to be lifelong programmers aren’t happy having New Java Despondency Infarction Framework X thrown at them every two years by some process-touting manager. They want more. They want a language that actually improves their understanding of the deep principles pertaining to how humans solve problems. It’s not that functional programming is inherently and universally superior. Pure functional programming has strong merits, and is often (but not always) the right approach, but most of what makes FP great is the community it has generated. It’s a community of engineers who want to be lifelong programmers or scientists, and who are used to firing up a REPL and trying out a new library. It’s a community of people who still use the command line and who still believe that to program is a virtue. The object-oriented world, by contrast, is one in which every programmer wants to be a manager, because object-orientation is how “big picture guys” think.

I’m very impressed with Clojure as a language, and that community has made phenomenally good decisions over the past few years. I started using it in 2008, and the evolution has been very positive. It’s not that I find Clojure (or Lisp) to be inerrant, but the community (and some others, like Haskell’s) stands in stark contrast to the anti-intellectualism of corporate software development, and I admire that immensely. It’s a real sacrifice that we 1.5+ engineers make on an ongoing basis when we demand that we keep learning, do things right, and build on sound principles. It doesn’t come easy. It can demand unusual hours, cost us jobs, and put us in the ghetto, but there it is.

In the meantime, though, I don’t think it’s useful to mistake language choice for the prevailing or most important issue. If we do, we’re just as guilty of cargo cultism as the stereotypical Java-happy IT managers. No, the real issue is engineer empowerment, and we need to keep up our culture around that.

Gervais / MacLeod 18: more on trust, Square Root Syndrome, Brownian and Progressive Time

In Part 17, I discussed the financial considerations of starting a technology company financed by passive equity-holders. In that model, investors enjoy the high rate of return associated with human creative risk, but do not take an active management role. I used the term “lifestyle business”, but I’ve since realized that I’m talking about something more specific: mid-growth businesses. “Lifestyle”, as I’m using it, isn’t about size, but about intended growth rate. It refers to businesses that prioritize long-term cultural health over rapid expansion, but that still have a clear interest in growth. It’s not headcount or revenue growth that the mid-growth business optimizes for, but healthy growth: growth that doesn’t compromise the culture.

A low-growth business might be a restaurant with inherent scale limitations, or a “4-hour work week” business intended to run itself later in time. The ceiling is fairly low, and consequently there’s not a lot of interest in passive equity financing. Banks will make loans (debt financing) and usually require personal liability. Failure rates are considerable (business is always risky) but not so high as to make this completely untenable. By contrast, a high-growth business is insistent on rapid growth– in headcount, revenues, footprint, and market dominance. 100 percent per year is barely socially acceptable; 150-200% is expected. Investors take an active managerial role and lose interest if growth falls short of 10% per month. The major downside of the high-growth business is that the vast majority run out of money and fail.

I’ve been trying to figure out a way to address the needs of the 1%- and 2%-per-month growth businesses. That kind of growth doesn’t deserve to be sneezed at! If one could invest $10,000 into a business whose value grew reliably at 1.5% per month, that would turn into about $356,000 after 20 years. That’s not the kind of thing that screams “fuck-you money” to thrill-seeking prospectors, but it certainly would make most passive investors happy. The conclusion I’ve come to is that we need passive equity financing of a very large number (a “fleet”) of highly capable but slow-growing (by VC-istan standards) businesses. Right now, regulations exist to keep “dumb money” away from such “lifestyle” businesses, judging small investors incapable of getting a decent deal, given the principal-agent problem involved. My methodology (in the previous essay) fixes that by making compensation and profit sharing extremely simple and transparent, while handling “HR expediencies” in such a way that they don’t compound over time. Good. Solved that problem, at least on a technical front. It was one of the major obstacles to my vision of connecting passive capital with human creativity in a way that doesn’t involve the career volatility and ethical compromises of VC-istan. It’s not the only obstacle, but it’s the biggest one.
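The compounding arithmetic above is easy to check. Here’s a minimal sketch (the function name is mine, purely illustrative):

```python
# Compound a stake at a fixed monthly growth rate, as in the
# hypothetical 1.5%-per-month "mid-growth" business above.
def future_value(principal, monthly_rate, years):
    return principal * (1 + monthly_rate) ** (12 * years)

print(round(future_value(10_000, 0.015, 20)))  # about 356,000
```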

Compensation is the harder of the two trust problems in organizations: do the owners (principals) trust the workers (agents) not to steal from them by overcharging or through various devious manipulations (e.g. “holding the company hostage” with key information)? Extreme transparency about compensation and culture helps a lot there. The more openness there is about how decisions are made, and the clearer it is that devious play-the-company-against-itself tricks won’t yield outsized personal gain, the less likely defection becomes. There’s a second question of trust, which is traditionally left to managers (owners don’t get involved): can we trust people to get the work done? That’s a fun one, too. Why do so many projects fail to ship? Why are so many people seemingly incompetent at self-executivity (including many actual executives)? Answering that requires introducing the phenomenon of wasted time in an interesting context: Brownian Time.

Brownian Time

I once found myself in a discussion about the value of an hour of time. A friend and I were trying to determine whether it was possible to value work on an hour-by-hour basis. We realized that, for most work days, only about 3 hours (the square root of 9) actually mattered. The other 5-6 were spent in meetings, goofing off, or general “zoned out” low productivity, for most people. This seems to be the norm, and it’s not an intentional or morally bad thing. People just can’t hold intense concentration over an 8-hour contiguous block of time that someone else picked, five days in a row at the exact same time. It’s not possible.

We realized that this idea (“Square Root Syndrome”) applied to more than just hours in the day; it was visible at larger scales. Three hours of the workday really matter; the rest is wasted. Days in a week? It seems typical to have real victories on 2 out of the 5; the other 3 are unstellar and see minimal useful work. Stuff gets done, but rarely important stuff. Apply this to the 49 weeks in a typical work-year: 7 weeks pertain to real highlights– macroscopic achievements, lines on the resume– and the other 42 are forgettable. Then look at a 36-year software engineering career. Six years of the typical engineer’s career are spent on jobs that really deliver– lead to promotions, “fuck you money”, interesting contributions to humanity. The other 30? Wasted on executive pet projects, startups that go nowhere, ingrate bosses, and bad ideas. That’s not how it is for everyone, but those numbers seem pretty typical.

The depressing conclusion of this is that out of a whole career, only a little bit counts: 3 hours/day times 2 days/week times 7 weeks/year times 6 years gives us 252 hours that are really worth a damn. Of course, that’s not how actual careers work. It could be zero hours in a person’s career that count, meaning that there’s no progress and it isn’t really a career. It could be several thousand that matter. I’ll get to that later on.
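Treating each scale as contributing roughly its square root, the nested arithmetic above can be sketched in a few lines (the counts are the rounded figures used in the text):

```python
import math

# Units at each scale of the nested "Square Root Syndrome" estimate:
# roughly sqrt(total) of them actually count at each level.
scales = [9, 5, 49, 36]   # hours/day, days/week, weeks/year, years/career

hours_that_count = 1.0
for total in scales:
    hours_that_count *= math.sqrt(total)

# Pure square roots give ~282 hours; the rounded counts used above
# (3 * 2 * 7 * 6) give the 252 figure.
print(round(hours_that_count))  # 282
```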

This recursive “square root” relationship is what I call Brownian Time. It shows us the downside of unstructured, chaotic behaviors. If there’s no feedback or conscious work at using time properly, you get square-root scaling: you get only twice as much out of a 4-hour meeting as you get out of a 1-hour meeting. (That’s generous; from most 4-hour meetings, one gets less.) I don’t know that the actual human pattern follows an exact power of 0.5, but it’s not far off. Why is it Brownian? It pertains to a fractal pattern called Brownian Motion, which is a model for a variety of random processes including stock prices. If a stock has volatility (standard deviation of daily returns) of 1% per day, its volatility over a 256-day year is 16%: most of the ups and downs cancel each other out. If one could call the good days in advance (and, of course, one can’t) then one could hold the stock for only a few days that year and realize all of its gains (or losses) in that small slice of the year. Brownian Motion is random “drift” that scales with the square root of time: if your goal is twice as far away, it’ll take 4 times as much drifting to get there.
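A quick Monte Carlo sketch (illustrative only; the walk lengths and trial count are arbitrary choices of mine) shows this square-root scaling of random drift:

```python
import random
import statistics

random.seed(0)

def dispersion(steps, trials=10_000):
    """Standard deviation of the endpoint of a +/-1 random walk."""
    ends = [sum(random.choice((-1, 1)) for _ in range(steps))
            for _ in range(trials)]
    return statistics.pstdev(ends)

# Quadrupling the number of steps only doubles the typical drift:
print(round(dispersion(16)))   # ~4
print(round(dispersion(64)))   # ~8
```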

In practice, I don’t care much for “Agile” software development practices, which often become the opposite of agility. If there’s mutual trust between managers, software engineers, and customers, then Agile can work well, but it also isn’t needed. Without that trust, Agile breaks down horribly and becomes not only a justification for intense micromanagement that borders on emotional bullying (several status checks per day) but also a source of rigidity that generates undesirable process complexity. Good teams already do the things that look like Agile without necessarily calling it that: keeping each other informed, taking full ownership of quality issues, and prioritizing important issues over silly and egotistical ones, without going nuts if (oh, my God!) something goes into version control without an associated “story”.

Yet I decided to read into “the dark side” and found a good talk about Agile by Mike Cohn, and I got a sense of what Agile really is and why it exists. Cohn isn’t trying to give evil managers a cudgel or mire teams in 45-minute “standups” where the managers get to sit down. He’s trying to fix the Brownian Time problem. Between the burndown charts and the time-management protocols, he’s trying to create a framework in which a team’s use of time shows linear productivity rather than a square-root relationship. That, I’d say, is admirable.

Related to “Brownian Time”, of course, is Brownian Management. Requirements accumulate with no discrimination regarding which are the real requirements, which are “nice-to-haves”, and which are fourth-quadrant bullshit laid on because it’s free to tell someone what to do. Managers squabble over headcount and people are moved. Authority topologies change and expectations (always poorly laid out) shift. People are pushed left, then right. Ninety percent of work exists to counteract side effects of other work. Up close, people seem busy; from a macroscopic perspective, nothing’s getting done, and from the outside it looks infuriating: ten thousand people are being paid to accomplish what looks like it should be done by 100. In other words, Square-Root Syndrome seems to apply to groups of people just as it does to time.

Progressive Time

If all the value in a creative person’s career could be condensed into 252 hours, that would be quite an unhappy conclusion: an incredible amount of time would be wasted. Yet if we graph the per-hour economic yield of a typical person’s career (which rarely tells the whole story, because there are interactions between one hour and the next) we’ll probably find something like that pattern. Nassim Taleb made almost all of his lifetime earnings as a trader (over 98% of his $40 million lifetime P&L, if I recall correctly) in one day: the October 1987 stock market crash (“black swan”). This is why narratives work for us: they capture the small number of high-impact moments; Rocky music plays over a three-minute “practice montage” that stands in for 3 years of intense training. Sure, one can put the most critical parts of a drama unfolding over five years into a 3-hour movie, and that tells the whole story. Yet we know, from experience, that you can’t get to any level of readiness for the critical moments with a measly 3 hours of preparation. It takes deliberate practice. It requires progress, and that involves making sure future hours are informed by the past and present, so that the right decisions are made, and the best ideas are had, in those 252 critical hours.

One neat thing about learning, as opposed to economic “doing”, is that it scales better. You don’t get Square Root Syndrome when building up a knowledge base. In fact, you probably get synergy: faster-than-linear scaling of economic value. However, economic value itself is not what I intend to measure. Here I’m just talking about productivity: the pushing forward of a project (which might be to learn a new concept). With well-structured learning processes, people continue to push forward at an approximately linear rate, rather than experiencing the Square Root Syndrome of Brownian Time.

Most individual productivity strategies (such as the Pomodoro Technique) are designed to bring peoples’ awareness and planning up to a level where linear Progressive Time is common, rather than square-root Brownian Time. The idea behind Agile, executed well, is the same: to put enough microscopic consciousness of time into the process to remove the drift that causes Brownian Time. If a team is getting bogged down with Brownian Management, escalating technical debt, or other scaling problems, it should show up on the burndown chart.

I still don’t like Agile, because it’s built on fundamentally closed-allocation assumptions. I dislike the idea of a totalitarian Product Owner with unilateral priority-setting authority, on the assumption that engineers will “just go do it”. Totalitarian “get it done” management is appropriate for existential threats, but those are rare and shouldn’t be assumed in normal planning. It would also be better if engineers were empowered to push for Progressive Time on their own terms (self-executivity). I think there are some good ideas in “Agile” that deserve further inspection, but I wouldn’t buy the thing wholesale, and I’ve seen it become a disaster in practice.

Square-Root Syndrome and Hierarchy’s Role

I’ve already stated my hypothesis that something like a Square-Root Syndrome applies to people. If there are 100 people in an organization, then it’s probably doing the work of 10 people. Again, I don’t know that 0.5 is the exact right power to apply, but it’s not a far-off guess for a start. I’ll get back to that.

Why is “bigness” maladaptive? Why aren’t biological cells 20 meters in diameter? The answer is simple: a cell that size would starve. Surface area grows quadratically in the diameter of the cell, while mass and need for nourishment grow as the cube. In other words, a cell’s ability to nourish itself grows as roughly the 2/3 power of its mass. The same seems to hold with organizations, although we’re no longer talking about a 3-dimensional physical space, but an N-dimensional abstract space– ideas, information, social connections, business strategies. I’m keeping this hand-wavy intentionally, but let’s focus on the N (it works as metaphor, at least) and talk about dimensionality.

Let’s say that we’ve hired four people whose fluencies (0 to 10) in various programming languages are as follows:

Person | Java | Python | Haskell | C
Alan   |    6 |      2 |       0 | 7
Barb   |    7 |      6 |       0 | 3
Carl   |    5 |      5 |       5 | 4
Diana  |    0 |      7 |      10 | 5

Who is the best programmer? Clearly, there’s no good way to answer that. Alan is the best at C, Barb is the best at Java, and Diana is the best at Haskell and Python. What about Carl? He’s not especially strong in any of the languages, but if there’s a project that requires Java and Haskell, he’s the only one who is ready (non-zero fluency) to do it! At 4 dimensions, we already have a world in which there’s no well-defined concept of the “best” or “worst” of these four programmers.
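The claim that there’s no well-defined “best” can be made precise as a Pareto-dominance check over the table above (a sketch; `dominates` is my own helper name):

```python
# Fluency table from above. With multiple dimensions there is only a
# partial order: nobody dominates anybody, so "best" is undefined.
skills = {
    "Alan":  {"Java": 6, "Python": 2, "Haskell": 0,  "C": 7},
    "Barb":  {"Java": 7, "Python": 6, "Haskell": 0,  "C": 3},
    "Carl":  {"Java": 5, "Python": 5, "Haskell": 5,  "C": 4},
    "Diana": {"Java": 0, "Python": 7, "Haskell": 10, "C": 5},
}

def dominates(a, b):
    """True if a is at least as fluent everywhere and more fluent somewhere."""
    return all(a[k] >= b[k] for k in a) and any(a[k] > b[k] for k in a)

pairs = [(x, y) for x in skills for y in skills
         if x != y and dominates(skills[x], skills[y])]
print(pairs)  # [] -- all four are on the Pareto front
```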

Dimensionality is relevant to organizations because, even though organizational dimensionality isn’t well-defined (there isn’t a clear set of “4 meaningful dimensions” that exist platonically, because which dimensions are relevant is somewhat subjective) it pertains to the optimal size of an organization. The more dimensionality there is in a business problem, the more it favors larger organizations with more specialized players. At least as metaphor, the idea of a cell in N-dimensional space works. For an organization of size p, capacity for nourishment grows as p^((N-1)/N) while need for it grows linearly in p, so overall organizational productivity grows as p^((N-1)/N) and per-person productivity evolves in proportion to p^(-1/N)– it decreases.

What is the appropriate N for a typical corporation? It’s disappointingly low. Businesses need a lot of different skill sets to operate, so one might expect this to make the case for high underlying dimensionality. If there are 10 dimensions on which people are evaluated for fit, then we get N = 10 and productivity scales as p^0.9, meaning we only get about 7% less efficient per person for each doubling in size. However, let’s consider two things.

First, the proper value for N might not be an integer; it could be something like 2.35. This is a “fuzzy logic” situation in which it’s subjective which dimensions matter, and how much. (This is an abstract fractal space, not a clean geometric one.) Does it matter whether Carl speaks German (a candidate 5th dimension)? It depends on what the company is doing and what it needs. So the matter of which dimensions are included and excluded (a social phenomenon, not an explicit mathematical one) is unclear, and many dimensions “possibly matter, but intermittently and not all that much”. The effective N is much lower than the number of candidate dimensions (which is in the hundreds, at least, and arguably infinite).

Second, organizational decision-making is executed by humans, who can’t easily visualize more than 2 dimensions. Three is possible, but a stretch. Four is just way outside of our experience. People making important “big company” decisions are not going to take stock of all the possible candidate dimensions. Everything gets collapsed into 2 dimensions: vertical (social status, importance, proximity to decision makers) and horizontal or “lateral” (all that other crap). Then N = 2, and one gets exactly the square-root scaling. Since the “lateral” dimension is treated as inherently inferior (anything important would live in the vertical dimension) it might be more reasonable to treat the effective N as some lower value: 1.9? 1.85? 1.1? I won’t even begin to claim what the right number is, but it’s between 1 and 2 for most companies, and that induces something worse than square-root scaling.
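The scaling claims above are easy to tabulate for a few candidate values of N (a rough sketch with my own function name; the exponents are taken as stated):

```python
# Per-person productivity ~ p**(-1/N) for organization size p and
# effective dimensionality N, per the cell metaphor above.
def per_person_efficiency(size, n_dim):
    return size ** (-1.0 / n_dim)

for n in (10, 2, 1.5):
    loss = 1 - 2 ** (-1.0 / n)
    print(f"N={n}: ~{loss:.0%} per-person loss per doubling; "
          f"at 100x size, relative efficiency {per_person_efficiency(100, n):.0%}")
```

With N = 10 the loss is the ~7% per doubling quoted above; at N = 2 (square-root scaling) a 100x-larger organization runs at a tenth of its small-scale per-person efficiency.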

If one finds this fractalized organizational pseudomathematics to be “hand wavy”, I’ll agree that it is, but there’s an important message in it. The more hierarchical and social-status driven an organization is (i.e. the lower the effective dimensionality, or the more social forces there are that collapse the organization into an org-chart or a “ladder”) the worse its capacity for nourishment (in this case, information rather than physical food) will fall behind its need. It will starve.

This is one of the inherent problems with big organizations. Their high underlying (“true”) dimensionality of needs requires size, but humans can only visualize two dimensions well as they work out their social plans, and this low effective dimensionality leads to information starvation, opacity, and inefficiency.

Management as a factor

The metaphor above discusses the biological cell, which does not scale to enormous size because its surface-area-to-volume ratio would become too low to sustain it. Organizations have this issue as they get big: a few important players are on the surface, while most people sit in the starving interior. This is made worse by the exertion of hierarchy, whose effect is to prioritize one point on the surface– the executive apex. (That’s where the low effective dimensionality, above, comes from.) How does this pathetic scaling relate to Brownian Management? It comes down to the MacLeod Clueless.

Losers sit away from the information surface area. They like the interior– it’s warm and comfortable, and someone is always closer to the outside in any direction– and they avoid the edge. Clueless tend to be nearer to that edge, but they are starved of important knowledge, or lack the competence to get it. Incidentally, they’re also the culprits in the bumbling, non-strategic, inconsistent direction of others’ time that becomes Brownian Management. They play a major role in the duplication of efforts, the go-nowhere projects, and the overall waste of such a large amount of time. They get information handed to them, which is rarely what they’d need to be properly strategic, and if the Sociopaths at the surface are engaged in zero-sum squabbling, the directives will cancel each other out. Why don’t Losers, who tend to be more strategic, fight back against the waste of Brownian Time? The answer is that it won’t get them anything. They’re more likely to get fired than noticed in a good way, so they keep their heads down and implement ideas they know to be bad. Only when an ill-conceived project starts demanding above-board personal sacrifice (i.e. it becomes a “death march”) do they push back, and usually by leaving.

Solving It

Discussions of organizational efficiency usually come down to percentages: “I was only at half speed today.” That’s not the right way to understand this particular problem. First, there’s the matter of faster-than-linear returns on performance (convexity). Even without that, though, we see that organizational inefficiency often isn’t some fixed percentage cut. That would be tolerable. Eighty percent efficiency, meaning 20 percent is dropped on the floor? That’s a cost of doing business. Square-root scaling, however, means that efficiency goes to zero with growth. You might start out at an acceptable 80% efficiency, but find yourself at 8% after scaling up by two orders of magnitude.

Preventing personnel congestion is a matter of conservative hiring. Only hire multipliers who will make the whole group more productive. It’s not that mere adders are unacceptable; for commodity labor, that’s perfectly fine. However, if the work is a commodity, you can specify it contractually and hire it on the market. Why bring a new person on board (and increase communication complexity) for that? I’m loath to use the word synergy, because it has become such an MBA buzzword, but that’s exactly what I’m talking about.

I believe that a company that grows conservatively can avoid Square-Root Syndrome in its people. Communication topologies and political complexities will get more complicated, but that can be offset by sharing of ideas and collaboration. So long as growth is slow enough to remain strategic and cooperative, it’s a good thing and will probably improve per-person efficiency. The problem that VC-istan companies seem to inflict on themselves is that they grow so fast that internal competition (for larger equity shares, executive roles) emerges and the whole thing implodes.

However, if you hire for synergy and avoid the Square-Root Syndrome of rapid expansion and turnover, you get to a point where the second trust problem (investors’ ability to trust the organization to do the work) solves itself. Hire great software engineers and give them just enough direction to outline the problem, and just enough incentive (profit sharing, not equity in some far-off liquidation that might involve horrible investor preferences that wipe out common stock) to care about the profit motive, and they’ll get their work done.

Thus, most important on a day-to-day level is avoiding Square-Root Syndrome in time: getting employees to work in Progressive Time. That doesn’t mean that every idea has to come to fruition or that failure won’t be tolerated. Instead, it’s the opposite. It’s okay to fail so long as you can affirmatively answer the question: did you learn something? The difference between Brownian and Progressive Time is that the latter has a memory. The first is bumbling blindly and retracing worn paths, usually under (MacLeod Clueless) managerial dictation. The second is exploration that builds a knowledge base and enables future explorations to be more successful.

VC-istan, by the way, lives in Brownian Time. Now that M&A has replaced R&D, institutional knowledge of failures just dissipates, resulting in massive duplications of effort that swell up every few years. There is progress, but it’s at the Brownian drift rate (with selection imposing macroscopic forward movement; in other words, mindless evolution) rather than anything that could legitimately be considered deliberate forward progress.
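The sqrt-vs-linear distinction here isn’t just a metaphor; it’s the standard scaling result for a random walk. A toy simulation (my own sketch, with made-up function names) makes it concrete: a memoryless walker’s expected net displacement after n steps grows like √n, while a walker whose every step builds on the last covers distance n.

```python
import random

def brownian_displacement(steps: int, trials: int, seed: int = 0) -> float:
    """Average absolute displacement of a +/-1 random walk after `steps` steps.

    Models "Brownian Time": each step is independent of history, so net
    progress grows only like sqrt(steps) -- about sqrt(2*steps/pi) on average.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        pos = 0
        for _ in range(steps):
            pos += rng.choice((-1, 1))  # forget everything; stumble either way
        total += abs(pos)
    return total / trials

def progressive_displacement(steps: int) -> int:
    """Models "Progressive Time": every step builds on the last, so progress is linear."""
    return steps

steps = 10_000
brownian = brownian_displacement(steps, trials=500)      # roughly sqrt(2*10000/pi), i.e. around 80
progressive = progressive_displacement(steps)            # exactly 10,000
```

Same number of steps, two orders of magnitude difference in ground covered: that gap between √n and n is the cost of an institution with no memory.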

What’s the practical way to do all of this? How does one inject these principles into a software company?

  1. Self-Executivity in Progressive Time. Personnel reviews aren’t about “How loyal were you to your boss’s career goals this year?” No, they’re about: did you work in Progressive Time? What did you learn? What did you teach? What multiplier effects did you have that made the whole company better? Why is this a better place to work in 2013 than it was in 2012? Why are you better in 2013 than in 2012?
    • By the way, I fucking hate the term performance review. If I were running a company, there’d be no such thing. There’d be regular impact reviews. You’re assumed to be performing. You’re trusted (and those who prove unworthy of trust are packaged out) to be working hard and in good faith. The impact meeting is to discuss unintended effects of a person’s work and behavior on the company, effects that he or she might not see. Very low (or negative) impact doesn’t mean you’re a horrible person who deserves to be humiliated; it’s assumed to be no-fault, but it means that you need to do things differently.
  2. “10-Year Fit” / Invest in Employees. I will confess that I’m somewhat of a “job hopper”, and I’m shameless about it. Most companies (and even many managers in the more progressive firms) don’t invest in their people’s careers and don’t deserve loyalty. Progressive Time is not compatible with a head-down-and-following-orders attitude toward work. However, I am personally tiring of the job-hopping lifestyle. So instead of the typical corporation where one has to be a lucky protege to have a real career, I’d build a company around the concept of the 10-year fit, and aim to invest in employees and get progressive returns amid convexity. Fuck aiming for 10 years, let’s make it 50. You can leave and I’ll make sure that you have a great title and reference, but my job is to make things so great that you never want that option.
  3. Agile that Doesn’t Suck. The good thing about Agile is that it exists to coerce time into a linear, progressive march rather than the haphazard, Brownian stumbling of managed work when it isn’t monitored. The problem with it is that it involves closed-allocation assumptions and limitations of self-organization. Perhaps Agile could be adapted to an open-allocation world, however. That deserves a lot more investigation.
  4. Three-hour Workday. Employees are expected to work full-time in spirit, not in hours. Project plans will be based on 3 dedicated hours: that’s three hours of “metered work”, certainly not including goofing off, eating, and water-cooler chat. Three is intended as the minimum obligation; of course, no one will be tracking hours, and a person who delivers the typical corporate workday (9 hours at 33% efficiency) rather than three solid hours is still in good standing. Three hours is also a right: 180 minutes of uninterrupted “building time” per day during daylight hours without meetings or those god-awful impromptu status pings. Employees should ideally spend the other 4-7 “off-meter” hours each day learning new skills (Progressive Time), experimenting, using Coursera, or sharing ideas with each other.
    • In practice, few people would be able to get away with a strict 3-hour day. It’s somewhat of a planning fiction that accounts for the difficulty of estimation and the extreme long-term importance of off-meter time. In this model firm that I’m building up, I can’t see anyone working less than 6 hours per day and fulfilling the softer, off-meter obligations such as continuing education, and the average would be the standard 8.
  5. Culture of Mentoring. The most junior engineers are expected to use their off-meter hours to learn from more senior people. The most highly-compensated people, if they wish to remain so, are expected to share knowledge and multiply their expertise across the company. This expectation of mentoring (for senior hires) and progress (for junior hires) would be the only interference with self-executive culture. If we are to stay self-executive, we must be competitive in the marketplace; to be competitive, we must be progressive in the growth of internal skill.
  6. Well-defined Audit Cycle. With the overall goal being to have each employee in Progressive Time, there’d need to be some sort of incremental work monitoring. As much as “status” meetings are disliked, there’d need to be an understanding of how often a person or team is expected to ship or demo something. Demos would invite the whole company (not one manager). I think I’d have junior hires on a 3-week audit cycle (in which “here’s what I’ve learned” is perfectly acceptable) and senior engineers expected to demo once every 8 weeks (as a minimum; I’d encourage the same 3-week cycle). The most senior, fellow-level engineers wouldn’t have an audit cycle; since they’d be expected to be continuously multiplying their expertise across the company and mentoring new people, such a thing would be irrelevant.

So, that solves the second trust problem: how does one ensure people get their work done? You need the right structure. It’s not about Agile or gamification or anything out of management books, and self-executivity is a necessary but not sufficient condition. You also need to put everyone in linear (Progressive) rather than square-root (Brownian) time. You need to make that a cultural pillar: not working hard, but working mindfully.