

Let's say you're hiking, and you drop a piece of glass on the trail. Eventually someone will walk along the trail and might cut themselves on the glass. You'd be really sorry to hear it happened to someone you know in a week. But what if the victim lived thousands, even millions of years in the future?

Philosopher William MacAskill, 35, likes to bring up this scenario to drive home a point: "If you're thinking about the possibility of harming someone, it doesn't really matter whether that person will be harmed next week or next year, or even in a hundred or a thousand years."

That's MacAskill's argument behind longtermism, a term he coined to describe the idea that humans have a moral responsibility to protect the future of humanity, prevent it from going extinct - and create a better future for many generations to come. He outlines this concept in his new book, What We Owe the Future.

In the book, MacAskill explains why today's humans need to figure out how to minimize the harm that global threats such as pandemics, biowarfare, climate change or nuclear disaster could have on future humans. And he encourages readers to think outside the box about what a sustainable far future could look like. Perhaps humans could find a way to live on other planets, or prevent the sun from expanding and burning up Earth in half a billion years. But MacAskill, an associate professor in philosophy and a senior research fellow at the Global Priorities Institute at the University of Oxford, is dead serious.

In a conversation with NPR ranging from the earliest hunter-gatherers to space flight, he talks about what we can do to ensure that humanity lasts trillions of years.

This interview has been edited for length and clarity.

In your book, you urge people to protect the "future of humanity." How many years into the future are you talking about?

Well, we don't know, because we don't know how long human civilization will last. Typical mammal species last for a million years. Homo sapiens have existed for 300,000 years. That would give us 700,000 years to come. The Earth, meanwhile, will remain habitable for hundreds of millions of years, and if we one day escaped Earth and took to the stars, then we could live for hundreds of trillions of years.

How do we know that the things we do can have a long-term impact? Can you share an example?

Even looking at early hunter-gatherers, the world today is very different as a result of their actions. There used to be a wide variety of megafauna: glyptodonts (armadillos the size of small cars), giant ground sloths weighing up to half a ton, dire wolves. The evidence is clear that it was human beings who killed off many beautiful creatures by overhunting or environmental change. And that means there are many fewer species of large animals around today.

In the past 250 years, we've made incredible advances in science and technology. And you argue that currently, we're in a unique position to make a difference for future humans.

We're still living through an era of fast technological progress. That means we're encountering new technology that could be very good for humanity but also risky. We're able to create new vaccines much more quickly, but we're also developing the ability to create novel pathogens with greater destructive power than existing pathogens - pathogens that could cause the extinction of the human race.

Artificial intelligence is another one. Right now, AI models have the computational power of something like an insect brain. Over the course of our lifetimes, that will change. They will start to have the computational power of the human brain, or even significantly greater. If we get to the point of developing human-level AI, that will be one of the most important inventions of all time. It could be enormously beneficial in creating abundance and prosperity for everyone.
