When it comes to the long-term future of humanity, I'm fundamentally an optimist. But The Atlantic has an interview with Nick Bostrom, director of Oxford University's Future of Humanity Institute, who is a little more downbeat than I am: "We're Underestimating the Risk of Human Extinction".
In one of your papers on this topic you note that experts have estimated our total existential risk for this century to be somewhere around 10-20%. I know I can't be alone in thinking that is high. What's driving that?
Bostrom: I think what's driving it is the sense that humans are developing these very potent capabilities---we are doing unprecedented things, and there is a risk that something could go wrong. Even with nuclear weapons, if you rewind the tape you notice that it turned out that in order to make a nuclear weapon you had to have these very rare raw materials like highly enriched uranium or plutonium, which are very difficult to get. But suppose it had turned out that there was some technological technique that allowed you to make a nuclear weapon by baking sand in a microwave oven or something like that. If it had turned out that way then where would we be now? Presumably once that discovery had been made civilization would have been doomed.
Later in the interview there are some interesting passages on machine intelligence and the Kardashev Scale.