Impact Shares For Speculative Projects

Introduction

Recently I founded a new project with Jasen Murray, a close friend of several years. At founding, the project was extremely amorphous (“preparadigmatic science: how does it work?”) and was going to exit that state slowly, if at all. This made it a bad fit for traditional “apply for a grant, receive money, do work” style funding. The obvious answer is impact certificates, but the current state of the art there wasn’t an easy fit either. In addition to the object-level project, I’m interested in advancing the social tech of funding. With that in mind, Jasen and I negotiated a new system for allocating credit and funding.

This system is extremely experimental, so we have chosen not to make it binding. If we decide to do something different in a few months or a few years, we do not consider ourselves to have broken any promises. 

In the interest of advancing the overall tech, I wanted to share the considerations we thought through and our tentative conclusions.

DALL-E rendering of impact shares

Considerations

All of the following made traditional grant-based funding a bad fit:

  • Our project is currently very speculative and its outcomes are poorly defined. I expect it to be still speculative but at least a little more defined in a few months.
  • I have something that could be called integrity and could be called scrupulosity issues, which makes me feel strongly bound to follow plans I have written down and people have paid me for, to the point it can corrupt my epistemics. This makes accepting money while the project is so amorphous potentially quite harmful, even if the funders are on board with lots of uncertainty. 
  • When we started, I didn’t think I could put more than a few hours in per week, even if I had the time free, so I’m working more or less my regular freelancing hours and am not cash-constrained. 
  • The combination of my not being locally cash-constrained, money not speeding me up, and the high risk of corrupting my epistemics, makes me not want to accept money at this stage. But I would still like to get paid for the work eventually.
  • Jasen is more cash-constrained and is giving up hours at his regular work in order to further the project, so it would be very beneficial for him to get paid.
  • Jasen is much more resistant to epistemic pressure than I am, although still averse to making commitments about outcomes at this stage.

Why Not Impact Certificates?

Impact certificates have been discussed within Effective Altruism for several years, first by Paul Christiano and Katja Grace, who pitched it as “accepting money to metaphorically erase your impact”. Ben Hoffman made a really valuable addition by framing impact certificate sales as selling funder credits, rather than all of the credit. There is currently a project attempting to get impact certificates off the ground, but it’s aimed at people outside funding trust networks doing very defined work, which is basically the opposite of my problem.

What my co-founder and I needed is something more like startup equity, where you are given a percentage credit for the project, and that percentage can be sold later, and the price is expected to change as the project bears fruit or fails to do so. If six months from now someone thinks my work is super valuable they are welcome to pay us, but we have not obligated ourselves to a particular person to produce a particular result.

Completely separate from this, I have always found the startup practice of denominating stock grants in “% of company”, distributing all the equity at the beginning but having it vest over time, and being able to dilute it at any time, kind of bullshit. What I consider more honest is distributing shares as you go, with everyone recognizing that they don’t know what the total number of shares will be. This still provides a clean metric for comparing yourself to others and arguing about relative contributions, without any of the shadiness around percentages. This is mathematically identical to the standard system, but I find the legibility preferable.

The System

In Short

  • Every week Jasen and I accrue n impact shares in the project (“impact shares” is better than the first name we came up with, but probably a better name is out there). n is currently 50 because 100 is a very round number. 1000 felt too big, and 10 made anything we gave to anyone else feel too small. This is entirely a sop to human psychology; mathematically it makes no difference.
  • Our advisor/first customer accrues a much smaller number, less than 1 per week, although we are still figuring out the exact number. 
  • Future funders will also receive impact shares, although this is an even more theoretical exercise than the rest of it because we don’t expect them to care about our system or negotiate on it. Funding going to just one of us comes out of that person’s share, funding going to both of us or the project at large, probably gets issued new shares. 
  • Future employees can negotiate payment in money and impact shares as they choose.
  • In the unlikely event we take on a co-founder level collaborator in the future, probably they will accrue impact shares at the same rate we do but will not get retroactive shares. 

Details

Founder Shares

One issue we had to deal with was that Jasen would benefit from a salary right away, while I found a salary actively harmful but wouldn’t mind having funding for expenses (this is not logical, but it wasn’t worth the effort to fight it). We have decided that funding that pays a salary is paid for with impact shares of the person receiving the salary, while funding for project expenses will be paid for either evenly out of both of our share pools or with new impact shares.

We are allowed to have our impact shares go negative, so we can log salary payments in a lump sum, rather than having to deal with it each week.

Initially, we weren’t sure how we should split impact shares between the two of us. Eventually, we decided to fall back on the YCombinator advice that uneven splits between cofounders are always more trouble than they’re worth. But before then we did some thought experiments about what the project would look like with only one of us. I had initially wanted to give him more shares because he was putting in more time than me, but the thought experiments convinced us both that I was more counterfactually crucial, and we agreed on 60/40 in my favor before reverting to a YC even split at my suggestion.

My additional value came primarily from being more practical/applied. Applied work without theory is more useful than theory without application, so that’s one point for me. Additionally all the value comes from convincing people to use our suggestions, and I’m the one with the reputation and connections to do that. That’s in part because I’m more applied, but also because I’ve spent a long time working in public and Jasen had to be coaxed to allow his name on this document at all. I also know and am trusted by more funders, but I feel gross including that in the equation, especially when working with a close friend. 

We both felt like that exercise was very useful and grounding in assessing the project, even if we ultimately didn’t use its results. Jasen and I are very close friends and the relationship could handle the measuring of credit like that. I imagine many can’t, although it seems like a bad sign for a partnership overall. Or maybe we’re both too willing to give credit to other people and that’s easier to solve than wanting too much for ourselves. I think what I recommend is to do the exercise and unless you discover something really weird still split credit evenly, but that feels like a concession to practicality humanity will hopefully overcome. 

We initially discussed being able to give each other impact shares for particular pieces of work (one blog post, one insight, one meeting, etc). Eventually, we decided this was a terrible idea. It’s really easy to picture how we might have the same assessment of each other’s overall or average contribution but still vary widely in how we assess an individual contribution. For me, Jasen thinking one thing was 50% more valuable than I expected did not feel good enough to make up for how bad it would be for him to think another contribution was half as valuable as I expected. For Jasen it was even worse, because having his work overestimated felt almost as bad as having it underestimated. Plus it’s just a lot of friction and assessment of idea seeds when the whole point of this funding system is getting to wait and see how things turn out. So we agreed we would do occasional reassessments with months in between them, and of course we’re giving each other feedback constantly, but we will not do quantified assessments at smaller intervals.

Neither of us wanted to track the hours we were putting into the project, that just seemed very annoying. 

So ultimately we decided to give ourselves the same number of impact shares each week, with the ability to retroactively gift shares or negotiate for a change in distribution going forward, but those should be spaced out by months at a minimum. 

Funding Shares

When we receive funding we credit the funder with impact shares. This will work roughly like startup equity: you assess how valuable the project is now, divide that by the number of outstanding shares, and that gets you a price per share. So if the project is currently valued at $10,000 and we have 100 shares outstanding, the collaborator would have to give up 1 share to get $100.
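The arithmetic here is just a valuation divided by shares outstanding. A minimal sketch (the function names are mine, not part of our actual system):

```python
def share_price(project_value, shares_outstanding):
    """Price of one impact share: current project valuation / shares outstanding."""
    return project_value / shares_outstanding

def shares_for_funding(amount, project_value, shares_outstanding):
    """Impact shares a funder receives in exchange for a payment."""
    return amount / share_price(project_value, shares_outstanding)

# The example from the text: a $10,000 valuation with 100 shares
# outstanding gives a $100 share price, so $100 of funding costs 1 share.
print(share_price(10_000, 100))              # 100.0
print(shares_for_funding(100, 10_000, 100))  # 1.0
```

Note this is the undiluted version: new funding can either transfer existing shares or mint new ones, as described above, and the price moves as the valuation changes.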

Of course, startup equity works because the investors are making informed estimates of the value of the startup. We don’t expect initial funders to be very interested in that process with us, so probably we’ll be assessing ourselves on the honor system, maybe polling some other people. This is a pretty big flaw in the plan, but I think overall still a step forward in developing the coordination tech. 

In addition to the lack of outside evaluation, the equity system misses the concept of funder’s credit from Ben Hoffman’s blog post which I think is otherwise very valuable.  Ultimately we decided that impact shares are no worse than the current startup equity model, and that works pretty well. “No worse than startup equity” was a theme in much of our decision-making around this system. 

Advisor Shares

We are still figuring out how many impact shares to give our advisor/first customer. YC has standard advice for this (0.25%-1%), but YC’s advice assumes you will be diluting shares later, so the number is not directly applicable. The advisor mostly doesn’t care right now, because he doesn’t feel that this is taking much effort from him.

It was very important to Jasen to give credit to people who got him to the starting line of this project, even if they were not directly involved in it. Recognizing them by giving them some of his impact shares felt really good to him, way more tangible than thanking mom after spiking a touchdown.

Closing

This is extremely experimental. I expect both the conventions around this to improve over time and for me and Jasen to improve our personal model as we work.  Some of that improvement will come from saying our current ideas and hearing the response, and I didn’t want to wait on starting that conversation, so here we are. 

Thanks to several people, especially Austin Chen and Raymond Arnold, for discussion on this topic.

The Oil Crisis of 1973

Last month I investigated commonalities between recessions of the last 50 years or so. But of course this recession will be different, because (among other things) we will simultaneously have a labor shortage and a lot of people out of work. That’s really weird, and there’s almost no historical precedent- the 1918 pandemic took place during a war, and neither 1957 nor 1968 left enough of an impression to have a single book dedicated to them.

So I expanded out from pandemics, and started looking for recessions that were caused by any kind of exogenous shock. The best one I found was the 1973 Oil Crisis. That was kicked off by Arab nations refusing to ship oil to allies who had assisted Israel during the Yom Kippur war- as close as you can get to an economic impact without an economic cause. I started to investigate the 1973 crisis as the one example I could find of a recession caused by a sudden decrease in a basic component of production, for reasons other than economic games.

Spoiler alert: that recession was not caused by a sudden decrease in a basic component of production either.

Why am I so sure of this? Here’s a short list of little things,

 

But here’s the big one: we measure the price of oil in USD. That’s understandable, since oil sales are legally required to be denominated in dollars. But the US dollar underwent a massive overhaul in 1971, when America decided it was tired of some parts of the Bretton Woods Agreement. Previously, the US, Japan, Canada, Australia and many European countries maintained pegs (set exchange rates) between their currencies and USD, which was itself pegged to gold. In 1971 the US decided not to bother with the gold part anymore, causing other countries to break their pegs. I’m sure why we did this is also an interesting story, but I haven’t dug into it yet, because what came after 1971 is interesting enough. The currency of several countries appreciated noticeably (Germany, Switzerland, Japan, France, Belgium, Holland, and Sweden)…

 

(I apologize for the inconsistent axes, they’re the best I could do)


…but as I keep harping on, oil prices were denominated in dollars. This meant that oil producing countries, from their own perspective, were constantly taking a pay cut. Denominated in USD, 1/1/74 saw a huge increase in the price of oil. Denominated in gold, 1/1/74 saw a return to the historic average after an unprecedented low.


(apologies for these axes too- the spike in this graph means oil was worth less, because you could buy more with the same amount of gold)

 

This is a little confusing, so here’s a timeline:

  • 1956: Failed attempt at oil embargo
  • 1967: Failed attempt at oil embargo
  • 1971, August: US leaves the gold standard
  • 1972: Oil prices begin to fall, relative to gold
  • 1972, December: US food price increases begin to accelerate.
  • 1973, January: US Stock market begins 2-year crash
  • 1973, August: US food prices begin to go up *really* fast
  • 1973, October, 6: Several nearby countries invade Israel
  • 1973, October, 17: Several Arab oil producing countries declare an embargo against Israeli allies, and a production decrease. Price of oil goes up a little (in USD).
  • 1974, January, 1: Effective date of declared price increase from $5.12 to $11.65/barrel. Oil returns to historically normal price measured in gold.

This is not the timeline you’d expect to see if the Yom Kippur war caused a supply shock in oil, leading to a recession.

My best guess is that something was going wrong in the US and world economy well before 1971, but the market was not being allowed to adjust. Breaking Bretton Woods pulled the finger out of the dike and everything fluctuated wildly for a few years until the world reached a new equilibrium (including some new and different economic games). The Yom Kippur war was a catalyst or excuse for raising the price of oil, but not the cause.

 

Thanks to my Patreon subscribers for funding this research, and several reviewers for checking my research and writing.

 

The Tallest Pygmy Effect

Status: I thought this was a common economics term, but when I google it I get either unrelated results or references using it the way I expect without defining it. It’s a really useful term, so I’m going to attempt to make it a thing.

“Tallest Pygmy Effect” is when you benefit not from absolute skill or value at a thing, but from being better at it than anyone else.  For example, the US dollar is not that great a currency and the US economy is not that great an economy. However, the dollar is more stable than other currencies, so it becomes the currency of choice when you want stability. This high volume makes USD more stable and is in general good for the US economy (because e.g. US companies don’t have to take on currency risk when they borrow money).

Tallest pygmy effects are fragile, especially when they are reliant on self-fulfilling prophecies or network effects. If everyone suddenly thought the Euro was the most stable currency, the resulting switch would destabilize the dollar and hurt both its value and the US economy as a whole.

Better Pay(ment System) for Pro Athletes

Pro athletes as a group are terrible with money. It’s not merely that they’re bankrupt within a few years of leaving their sport; many don’t save enough cash to make it through the off season.  You could blame the athletes, but the system is really set up to create this problem. You’re taking mostly poor 22-year-olds, selected for their ability to take risks and disregard odds, telling them they’ve won their ultimate dream, and giving them more money than anyone they know has ever had.  Of course that goes poorly.  I’ve watched programmers from middle class families go kind of nuts their first year working, and that’s a much smaller transition they’re much better prepared for.

The money goes to a few major places:

  1. Status competitions with other athletes
  2. Helping out people the athlete genuinely wants to help- his loving mother, the little league he got started in.
  3. Helping out people the athlete doesn’t want to help, but can’t figure out how to say no to- abusive parent, cousin’s former neighbor’s boyfriend
  4. Child support
  5. Terrible investments.  It’s hard to sort out good investments from bad when you’re 22 and everyone you know is in debt.  Also overlaps with 3 a lot.
  6. Being 22.

I have a potential solution: hold back most of their salaries.  Pay each athlete the same amount (say, whatever the lowest paid person makes now), and put the rest in a trust, invested in index funds or even bonds.  After they retire, gradually shift more and more of the money into their control.  Here’s why I think this would work:

Status competition is a zero sum game, so nothing is lost if you handicap everyone equally.  3 and 5 are essentially taxes on people knowing you have money, so they go away if you don’t have access to it.    Child support is here to stay, but the current calculation is stupid: it’s based on current income only, which means athletes pay through the nose while they’re working and then need to go to court to get it lowered when they retire.  Income smoothing for the athlete means income smoothing for the child as well, which is ultimately better for them.  And being 22 will definitely be fixed with time.  After retirement, when they’re a little older, a little more experienced with money, and have the time to learn how to invest, they’ll make better choices.  Turning over the money gradually gives them space to learn without a single mistake ruining their lives.
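The “gradually shift more and more of the money into their control” piece could be as simple as a linear schedule. A toy sketch (the ten-year window is my invention, not part of any real league proposal):

```python
def trust_release_fraction(years_since_retirement, release_period_years=10):
    """Fraction of the trust under the athlete's direct control,
    phased in linearly over release_period_years after retirement."""
    return min(1.0, max(0.0, years_since_retirement / release_period_years))

# Year 0: nothing released; year 5: half; year 10 and beyond: everything.
print(trust_release_fraction(0))   # 0.0
print(trust_release_fraction(5))   # 0.5
print(trust_release_fraction(12))  # 1.0
```

A slower ramp at the start (e.g. exponential rather than linear) would give even more room to recover from early mistakes; the point is just that control arrives in stages rather than all at once.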

This will slow down their ability to help the people they love.  OTOH, a few years of largesse followed by a return to poverty isn’t very fun for the recipients either.  We’d get fewer “I bought my mom a house” draft stories but also fewer “athlete’s entire extended family facing foreclosure on 9 different houses” retirement stories.

It would be paternalistic for a league to impose this on athletes, but I see no reason a players’ union couldn’t demand it.  There’s precedent for solving player collective action problems with union demands (no one in the NHL wore helmets when they were optional because they reduce visibility and make you look like a wuss, but their union demanded they be mandatory).  They might even be able to demand some portion of endorsement money go into the trust.

Also potentially useful tactic: focus athletes on what they will do after retirement.  Warning them they probably won’t play that long and money doesn’t last forever doesn’t work, because people who believe odds apply to them don’t become professional athletes in the first place.  But “there will be time after football” isn’t a matter of odds, it’s a fact.  If we redirected children and college students to view sports as generating the seed money for their real life goal, they’d develop more skills and think a little harder about spending money.  Bonus: the 99.99999% of aspiring athletes who don’t become pro athletes will have useful skills to fall back on.

World War Technology

[Content Warning: both World Wars, the Holocaust]

I have a very vivid memory of reading Cryptonomicon, where a character explains that the Allies won World War 2 because they worshipped Athena (technology, strategy), and the Germans worshipped Ares (brute strength, physical and moral).

[Some of you may be thinking “But German craftsmanship was better, right?  It took 5-10 American tanks to take down 1 German tank?”  I thought so too, but apparently no.  To the extent it was true, it was craftsmanship, not technology.]

The Axis did do better in encryption originally, but by the end we were reading much more of their mail than they were reading of ours.  Although it’s important to give credit for this to Polish Intelligence, who broke the Enigma code early on, enabling them to keep up as Germany increased its complexity.  If they hadn’t sent their results to Britain just as Germany invaded, Alan Turing et al. might never have been able to crack it.  That was some high-leverage work there.

Anyways, I’m reading The Alchemy of Air (Thomas Hager) now, which is about the history of nitrogen chemistry, which played a much larger role in World War 1 than I would have guessed.  Fritz Haber’s invention of a way to transform atmospheric nitrogen into a usable form, something previously only accomplished by lightning and a handful of bacteria, is estimated to have prolonged the war by at least one year, possibly two. We’ll get into why in a later post.  That’s Athena.  According to the book, a lot of what the Allies wanted in reparations was not actually money, but German technology, especially chemistry.

[I do not entirely trust Hager on the relative importance of chemistry and money here.  He’s spent the entire book waxing lyrical about the importance and beauty of nitrogen.  The internet was not terribly helpful; I’ve confirmed that dyes and pharmaceuticals were among goods taken as reparations, but not the amounts.  Some guy on Quora says the US, Britain, and Germany were equally competitive in technology in 1914.]

Even if Alchemy is overestimating German dominance in chemistry, I think it’s safe to say that technology was a major force behind German military power in World War 1.  And by World War 2, it wasn’t.  They made some advances and would have done worse without them, but no one ended the war thinking “man, getting access to this German technology will save us 20 years in research”.  But 60 years later, Germany is again a leader in technology, and has one of the more functional economies in the world.

This was going to be a “me wondering about a mystery” post, but once I thought about it the answer to “what changed?” is obvious.  Germany exiled or killed 25% of its scientists.  Fritz Haber, the guy who added years to the war with one invention and went on to pioneer chemical warfare?  Jewish.  “Germany hurt itself while killing several million people” is not exactly news, but I think it’s important to note individual stories of how.

Although this puts me in the weird position of honoring the guy who more-or-less created chemical warfare.  But that’s maybe okay, because the same process that made Germany gunpowder is also feeding half the world right now.  Utilitarian morality is complicated.

 

 

“But they’ve repaid the debt several times over”

This gets repeated a lot in Debt (David Graeber), and in the world in general.  It annoys me as a criticism of lenders or lending.

Would you rather have $100 now, or in a month?  I’m guessing now, unless your tax circumstances are about to change drastically.  How much additional money would it take for you to prefer payment in a month?  $10?  $15?  What if there were significant transactions costs to receive payment?  What if there was risk involved?  The fact that you would rather have money sooner than later is known as the time value of money.

This is the principle behind interest on a loan: you’re compensating the lender for them not having the money until later.

How much of an increase would you need to agree to delay receiving money by 50 years, instead of taking $100 now?  I’m guessing it’s a lot.  Many times the original $100.  The implication of the phrase “but they’ve repaid the debt several times over” is that this is morally wrong.  But if you’re not referencing the timespan over which that repayment took place, the statement is meaningless.  To compare apples to apples you need to do a present value calculation, which tells you the equivalent of what they paid if it had been delivered as a lump sum at the beginning.
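A present value calculation discounts each payment by how far in the future it arrives. A minimal sketch, assuming a single payment and a flat 5% annual discount rate (the rate is illustrative, not a claim about any historical loan):

```python
def present_value(payment, annual_rate, years):
    """Value today of a single payment received `years` from now."""
    return payment / (1 + annual_rate) ** years

# $1,000 received 50 years from now, discounted at 5%/year, is worth
# about $87 today -- so "repaid several times over" across decades can
# still be an ordinary return in present value terms.
print(round(present_value(1_000, 0.05, 50), 2))
```

For a stream of repayments you would sum the present value of each installment, which is exactly the apples-to-apples comparison the paragraph above calls for.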

This statement often gets entangled with the idea of usury (unfairly or immorally high interest rates).  I am not a big fan of the usury taboo: you’re not hurting someone by giving them the option to take a loan.  The counterargument is that the deal was opaque (which is a fair criticism) or that the borrower’s circumstances were so bad they had no choice.  Which is definitely a thing, but… maybe we should fix the problem at that end?  Much like debt forgiveness, this appears to be a call to give poor countries/people more money, with a layer of obfuscation added by debt.  I am extremely curious why this seems to be more attractive than my solution, “just give them money”.

Debt: The First 5000 Years (David Graeber)

This book seriously changed my thinking when I first read it, and I’ve shared many cool ideas from it, but I’ve found that when the ideas are challenged I don’t know enough to defend them.  So I’m going to reread the book and really dig in, with the following goals:

  1. Understand and be able to articulate Graeber’s ideas without ambiguity
  2. Look up the data he cites and opposing arguments
  3. Update my beliefs based on what I learn

And I’m going to publish it here, probably chapter by chapter but if I need to break it down smaller I will.

What I publish will be a mix of “my understanding of his arguments”, “steelmen of his arguments”, “his argument updated by other things I know” and “things this made me think about”.  I will try to make it obvious what’s my opinion and what is his, but the application of the principle of charity is inevitably biased by what I consider charitable.

A few people have expressed interest in doing a small group chat over Whatever, in response to my “talk to me for an hour” offer.  If there’s enough interest, this strikes me as a good topic for that, so let me know if you’re interested.

And now, Debt: The Introduction.

You know what would be helpful?  A definition of debt.  Here is my idealized definition of debt:

Person A has a way to spend money to make more money later, but not the initial starting money (capital).  Person B has money, but no way to spend it to make more money.  Person B gives Person A the money, and A pays B back on a set schedule, up to a certain amount.  Everyone is better off.  Hurray.  The difference between debt and investment is that debts are owed no matter what, whereas in investment the risk is shared.

Graeber definitely isn’t using that definition.  There are a number of examples he gives that make me want to scream “the chronological distribution of payment is not the issue here.”  E.g.:

  • France billed Madagascar for their own invasion, and for the building of infrastructure they didn’t want.  Madagascar not having the cash on hand to pay them, this became a debt paid by onerous taxes.  Graeber claims Madagascar is still paying France, but I don’t trust him that this is the same bill.  He provides no source for this claim and I couldn’t find one.  But the Wikipedia article on the subject makes it sound like France had a bit of a dust-up and somehow found itself running Madagascar, so I’m not convinced it’s unbiased.
  • France billed Haiti for the property damaged and confiscated during the Haitian slave rebellion, and convinced the rest of the world to embargo Haiti (unclear how long this lasted).  Haiti finished paying this in 1947.  No seriously, they had to pay France for no longer being slaves.
  • A Japanese legend about a woman who committed various commercial misdeeds, including loaning rice with a small cup and reclaiming it with a large cup.  The problem here is theft by deception.
  • Also in Madagascar: in the early 80s Madagascar had a resurgence of malaria, after almost wiping it out, because they couldn’t pay for their anti-malaria programs any more.  Graeber blames the IMF, which imposed austerity in order to refinance loans made by first world banks to Madagascar.  He makes no mention of whether Madagascar would have been able to pay for mosquito programs absent the loans.
  • As late as the 1970s, moneylenders in the Himalayas would take borrowers’ daughters as collateral and rape them as interest payments.  (source: “Galey 1983”, which probably exists because google scholar found other citations to it, but not the piece itself).  No one would have been happier if fathers had the ability to compel their daughters into prostitution proactively.
  • Graeber’s strongest point is that much of the debt owed by third world countries was taken by dictators and used for either personal enrichment or to repress the populace that is now forced to pay it.  Which is an extremely fair point, but still not any worse than repressive taxation in general.

So that’s a whole bunch of times the economic concept of debt was not the problem.  But… maybe the social constructs around debt let humans do things they wouldn’t otherwise do (this seems especially likely in the dictator case).  This seems curiously tied up with the concept of quantification (which is how he distinguishes between a debt and an obligation).  The way this makes sense to me is that this is an anthropology of debt, not an exploration of the economics.

 

This is not a comprehensive summary of the chapter; the rest is odds and ends, and I don’t want this to turn into liveblogging, so they’ll wait for their own chapters.

Special Tax Status for Non-Profits?

Non-profits in America get several tax benefits:

  • Contributions to them can be deducted by donors.
  • They do not have to pay sales or property tax.
  • They are exempt from corporate income tax.

Should they?

There’s a number of problems with this.  One, it makes deciding what is and is not a non-profit really important.  I’m fine with government subsidies (which is what tax breaks are) to help poor people eat, but not for rich people’s entertainment.  Those two are easy to distinguish, but there’s a lot of room between homeless shelters and operas, and I’m uncomfortable with the government drawing the line. Or what about charities that have beliefs you find abhorrent?  Bob Jones University lost its tax-exempt status in 1983 due to its ban on interracial dating, and fear of a repeat apparently drives a lot of the religious objection to same sex marriage.  I think people who oppose interracial dating or same sex marriage are wrong and should be shamed, but I’m really uncomfortable having that much money riding on values judgments by the government.

The sales tax thing isn’t that big a deal, as witnessed by the fact that a lot of charities don’t even bother with it.  But property tax is.  It starves the tax base of municipalities with a large percentage of land occupied by non-profits.  It’s something of a problem in DC and an enormous one in some university towns, especially if the surrounding area is poor.  That’s hard to stomach when prestigious universities have endowments in the billions and a good chunk of their work is making the rich richer.  Lack of property taxes pushes charities to buy property when they would otherwise rent (which means it benefits only those charities with consistent funding), and to occupy more valuable real estate than they otherwise would.

I like that I get a tax deduction for my donations, and I’d probably donate less without it. OTOH, it creates a distinct gap between “people organizing to do some good things” and Official Charity, which creates a barrier to entry. It’s not a trivial barrier either- I’ve served in the leadership of both official (my old dojo) and unofficial (Seattle EA) charities. Among other things, official recognition forces a fairly specific kind of hierarchy on you. In Utopia of Rules, David Graeber talks about the strain his autonomous non-hierarchical collective experienced when someone had the gall to give them a car. It ends with them destroying the car with a sledgehammer.

Now we see the violence inherent in the system.

Removing the benefits of official incorporation would let more organizations find their natural structure, rather than a one-size-fits-all, government-imposed one, and also lead to fewer car-destruction parties.

My opinion on the corporate income tax is a post in and of itself, so let’s put that aside for now.  I think there’s a very good case for not exempting non-profits from property tax, and not making charitable contributions tax deductible.  I also think it would be extremely disruptive to abruptly switch, so we should ease over gradually, and the change should be revenue neutral.*

*People say I’m cynical but then I write things like “the government should raise this tax and lower another one so they get the same amount of money” so I don’t know what they’re talking about.

In Defense Of The Sunk Cost Fallacy

Dutch disease is the economic concept that if a country is too rich in one thing, especially a natural resource, every other sector of the economy will rot because all available money and talent flows towards that sector. Moreover, that sector dominates the exchange rate, making all other exports uncompetitive.* It comes up in foreign development a lot because charitable aid can cause Dutch disease: by paying what the funders would consider a “fair wage”, charities position themselves as by far the best employers in the area. The best and brightest African citizens end up chauffeuring foreigners rather than starting their own businesses, which keeps the society dependent on outside help. Nothing good comes from having poverty as your chief export.

I posit that a similar process takes place in corporations.  Once they are making too much money off a few major things (Windows, Office, AdWords, SUVs), even an exceptionally profitable project in a small market is too small to notice.  Add in the risk of reputation damage and the fact that all projects have a certain amount of overhead regardless of size, and it makes perfect sense for large companies to discard projects a start up would kill for (RIP Reader).**

That’s a fine policy in moderation, but there are problems with applying it too early.  Namely, you never know what something is going to grow into.  Google search originally arose as a way to calculate impact for academic papers. The market for SUVs (and for that matter, cars) was 0 until someone created it.  If you insist on only going after projects that directly address an existing large market, the best you’ll ever be is a fast follower.***

Simultaneously, going from zero to an enormous, productive project is really, really hard (see: Fire Phone, Google+, Facebook’s not-an-operating-system).  Even if you have an end goal in mind, it often makes sense to start small and iterate.  Little Bets covers this in great detail.  And if you don’t have a signed card from G-d confirming your end goal is correct, progressing in small iterative steps gives you more information and more room to pivot.

More than one keynote at EA Global talked about the importance of picking the most important thing, and of being willing to switch if you find something better. That’s obviously great in some cases, but I worry that this hyperfocusing will cause the same problems for us that it does at large companies: a lack of room to surprise ourselves. For example, take the post I did on interpretive labor. I was really proud of that post. I worked hard on it. I had visions of it helping many people in their relationships. But if you’d asked at the time, I would have predicted that the Most Effective use of my time was learning programming skills to increase my wage or my value in direct work, and that that post was an indulgence. It never in my wildest dreams occurred to me that it would be read by someone in a far better position than me to do something about existential risk, and be useful to them in connecting two key groups that weren’t currently talking to each other, but apparently it did. I’m not saying that I definitely saved us from papercliptopia, but it is technically possible that that post (along with millions of other flaps of butterfly wings) will make the marginal difference. And I would never have even known it did so except that the person in question reached out to me at EA Global.****

Intervention effectiveness may vary by several orders of magnitude, but if the confidence intervals are just as big it pays to add a little wiggle to your selection. Moreover, constant project churn has its own cost: it’s better to finish the third-best thing than to have two half-finished attempts at different best things. And you never know what a third-best project will teach you that will help an upcoming best project- most new technological innovations come from combining things from two different spheres (source), so hyperfocus will eventually cripple you.

In light of all that, I think we need to stop being quite so hard on the sunk cost fallacy. No, you should not throw good money after bad, but constantly re-evaluating your choices is costly and (jujitsu flip) will not always be the most efficient use of your resources. In the absence of a signed piece of paper from G-d, biasing some of your effort towards things you enjoy and have comparative advantage in may in fact be the optimal strategy.

Using your own efficiency against you

My hesitation is that I don’t know how far you can take this before it stops being effective altruism and starts being “feel smug and virtuous about doing whatever it is you already wanted to do”- a thing we’re already accused of doing.  Could someone please solve this and report back?  Thanks.

* The term comes from the Dutch economic crash following the discovery of natural gas in The Netherlands.  Current thought is that was not actually Dutch disease, but that renaming the phenomenon after some third world country currently being devastated by it would be mean.

**Simultaneously, developers have become worse predictors of the market in general. It used to be that nerds were the early adopters, and if they loved something everyone would be using it in a year (e.g. gmail, smart phones). As technology, and particularly mobile, advances, this is no longer true. Nerds aren’t power users for tablets because we need laptops, but the tablet power user is a powerful and predictive market. Companies now force devs to experience the world like users (Facebook’s order to use Android) or just outright tell them what to do (Google+). This makes their ideas inherently less valuable than they were. I don’t blame companies for shifting to a more user-driven decision making process, but it does make things less fun.

***Which, to be fair, is Microsoft’s actual strategy.

****It’s also possible it accomplished nothing, or made things worse. But the ceiling of effectiveness is higher than I ever imagined, and the uncertainty only makes my point stronger.

IQ Tests and Poverty

Recently I read Poor Economics, which is excellent at doing what it promises: explaining the experimental data we have for what works and does not work in alleviating third world poverty, with some theorizing as to why.  If that sounds interesting to you, I heartily recommend it.  I don’t have much to add to most of it, but one thing that caught my eye was their section on education and IQ tests.

In Africa and India, adults believe that the return to education is S-shaped (meaning each additional unit of education is more valuable than the one before, at least up to a point).  This leads them to concentrate their efforts on the children that are already doing the best.  This happens at multiple levels- poor parents pick one child to receive an education and put the rest to work much earlier, teachers put more of their energy into their best students.  Due to a combination of confirmation bias and active maneuvering, the children of rich parents are much more likely to be picked as The Best, regardless of their actual ability.   Not only does this get them more education, but education is viewed as proof one is smart, so they’re double winners.  This leaves some very smart children of poor parents operating well below their potential.

One solution to this is IQ tests.  Infosys, an Indian IT contractor, managed to get excellent workers very cheaply by giving IQ tests to adults and hiring those who scored well, regardless of education.  The authors describe experiments in Africa giving IQ tests to young children so that teachers will invest more in the smart but poor children.  This was one of the original uses of the SATs in America- identifying children who were very bright but didn’t have the money or connections to go to Ivy League feeder high schools.

This is more or less the opposite of how critics view standardized testing in the US. They believe the tests are culturally biased such that a small sliver of Americans will always do better, and that basing resource distribution on those tests disenfranchises the poor and people outside the white suburban subculture. What’s going on here?

One possible explanation is that one group or the other is wrong, but both sides actually have pretty good evidence.  The IQ tests are obviously being used for the benefit of very smart poor children in the 3rd world.  And even tests without language can’t get around the fact that being poor takes up brainspace, and so any test will systematically underestimate poor children. So let’s assume both groups are right at least some of the time.

Maybe it’s the difference in educational style that matters? In the 3rd world, teachers are evaluated based on their best student. In the US, No Child Left Behind codified the existing emphasis on getting everyone to a minimum benchmark. Kids evaluated as having lower potential than they actually do may receive less education than they should, but they still get some, and in many districts gifted kids get the fewest resources of any point on the bell curve.

Or it could be because the tests are trying to do very different things. The African and Indian tests are trying to pick out the extremely intelligent who would otherwise be overlooked. The modern US tests are trying to evaluate every single student and track them accordingly. When the SATs were invented they had a job much like the African tests; as more and more people go to college, their job is increasingly to evaluate the middle of the curve. It may be that these are fundamentally different problems.

This has to say something interesting about the meaning of intelligence or usefulness of education, but I’m not sure what.