Cliff Notes - What We Owe The Future 💸

In a pre-Sam Bankman-Fried past, longtermism held more intrigue for me. I’ve since developed an aversion to an ideology that places the preservation of human life at the forefront of all environmental decision-making. Nevertheless, I am intrigued by the thought of always keeping “the most good for the most amount of people” at the forefront of decision-making, as well as longtermism’s efforts to evaluate the likelihood of various environmental futures. Here is a summary of my takeaways from William MacAskill’s book “What We Owe the Future.”

Positively influencing the future is the key moral priority of our time

  • Longtermist beliefs echo those of native nations such as the Iroquois Confederacy, whose “seventh-generation” principle holds that every key decision should consider the well-being of the seventh generation to come.
  • If we last as long as the typical mammalian species (around 1 million years), there will be a million future people for every one person alive today.
  • The earth will remain habitable for hundreds of millions of years.
  • Natural processes will return carbon dioxide concentrations to preindustrial levels only after hundreds of thousands of years.
  • We are more connected now than at any point in our history, which gives us an outsized opportunity to positively impact the future.
  • Society is currently still malleable and can take many shapes, but longtermists believe that the choices we make in this era might soon become solidified for a long period of time, which is why it is important to act now.
  • Economic growth is driven by technological progress, but productivity growth has been slowing for the last 50 years.
  • The most important action you can take personally is your choice of career. You will spend 80,000 hours of your life on your career. If you could find a career path twice as impactful as your current one, it would be worth spending up to half your career searching for it.
  • Working on problems together will allow us to achieve far more than working individually. A community gets to learn from the mistakes and triumphs of its members.
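The career-hours figure and the "half your career" claim above follow from simple arithmetic. Here is a minimal sketch; the 40-year/50-week/40-hour work-year breakdown is the standard rough assumption behind the 80,000-hours figure, and the break-even function is my own illustration of the trade-off, not a formula from the book.

```python
# Rough arithmetic behind the 80,000-hours figure:
# 40 working years * 50 weeks/year * 40 hours/week.
career_hours = 40 * 50 * 40
print(career_hours)  # 80000

# If a new career path is `multiplier` times as impactful, spending a
# fraction f of your career searching for it beats staying put whenever
# (1 - f) * multiplier > 1, i.e. f < 1 - 1/multiplier.
def breakeven_search_fraction(multiplier):
    """Largest fraction of a career worth spending on the search."""
    return 1 - 1 / multiplier

print(breakeven_search_fraction(2))  # 0.5 -> up to half your career
```

For a path twice as impactful, the break-even point is exactly half a career, which matches the book's claim.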

Climate Change

  • We need to decarbonize within 50 years, even as energy demand triples over that time.
  • Our decarbonization efforts could stall due to factors such as a breakdown in international coordination, resulting in warming of as much as 7 degrees Celsius.
  • Climate change is long-lasting: conditions would remain much the same ten thousand years later, only returning to “normalcy” after hundreds of thousands of years.
  • Solar and wind cannot provide the high-temperature heat necessary for industries like cement, steel, brick, and glass, so we will likely need to invent, in a short amount of time, methods of providing sustainable energy that do not currently exist.
  • Working on more neglected problems can allow us to have an outsized impact on climate change, yet we are plagued by our inability to prioritize amongst competing courses of climate action.
  • Donating to effective charities could have a 10X greater impact on carbon reduction than a personal action (e.g. buying Fair Trade products).

A Framework For Thinking About the Future

  • A decision’s “expected value” should be weighed against three factors:
    • significance: the value of bringing about a given state of affairs
    • persistence: how long that state of affairs lasts
    • contingency: the extent to which it depends on a small number of specific actions that would not otherwise have occurred
  • Cultural evolution can be described by three principles:
    • variation: cultural traits vary
    • differential fitness: different characteristics have different rates of survival
    • inheritance: cultural traits can be transmitted
  • As an example of cultural evolution, 1 in 5 Asians say they are vegetarian, while 1 in 20 North Americans do. If the industrial revolution had occurred in vegetarian-friendly India, how would that have changed factory farming?
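The significance/persistence/contingency framework above can be sketched as a simple calculation. This is my own illustrative rendering, not MacAskill's: the numeric scores are made up, and the multiplicative combination (plus a probability weight for "expected") is an assumption about how the factors compose.

```python
# Illustrative sketch of the significance-persistence-contingency framework.
# All scores and the multiplicative combination are assumptions for the
# sake of the example, not values from the book.

def expected_value(significance, persistence_years, contingency, probability):
    """How good the resulting state is, times how long it lasts, times how
    much the action itself is responsible for it, weighted by the chance
    the action succeeds."""
    return significance * persistence_years * contingency * probability

# Hypothetical comparison: a reform that only happens if we act (highly
# contingent) vs. an equally good reform that would have happened anyway
# (low contingency shrinks its expected value).
contingent_reform = expected_value(significance=5, persistence_years=10,
                                   contingency=0.9, probability=0.5)
inevitable_reform = expected_value(significance=5, persistence_years=10,
                                   contingency=0.1, probability=0.9)

print(contingent_reform)  # 22.5
print(inevitable_reform)  # 4.5
```

The point of the example is that contingency dominates: even with a lower chance of success, the action that would not otherwise occur carries the larger expected value.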

Longtermist views on AI

  • Longtermists hold a weird fascination with AI, specifically Artificial General Intelligence (AGI), the point at which artificial intelligence reaches parity with human intelligence. They theorize that AGI could codify our current values into systems that carry forward far into the future, potentially long after the human species is gone.
  • Longtermists cite a 50% chance of AGI by 2050.
  • Whoever successfully develops AGI could dictate the shape it takes, including:
    • who has power - e.g. the military or a large corporation
    • how morally exploratory it can be (e.g. labs for economic policies)