Howard, thank you for this articulate and insightful look at a tremendously difficult question: how to guide the influence of technology on society, politics, culture, and the environment both fairly and effectively. You rightly question our ability to halt research into new technologies such as artificial intelligence, synthetic biology, and nanotechnology, especially on a global scale, and yet the dangers of deliberate or accidental misuse these technologies pose are terrifying, particularly given how easily some of them can be exploited.

I especially liked your point that “[q]uestions about the threats of technology often come down to the nature of capitalism.” As you know, this is why Naomi Klein and others argue that the threat of global warming should logically spell the end of capitalism. Our seeming inability even to contemplate action of sufficient scope in that direction reflects the condition I describe in my book as “de facto technological autonomy”: we find ourselves unable to give up the technologies that are killing us because the technological societies we’ve constructed have left us fatally dependent on them. Addressing that dilemma is perhaps the single greatest challenge facing humankind today, and whether we will find a way to do so without horrendous upheaval and violence is an open question; it’s hard to be optimistic.

Congress, as you mention, discarded the Office of Technology Assessment and has shown not the slightest inclination since to take responsibility for regulating technology development; whether it will do anything about the hacking of our democracy remains to be seen. (See my piece on technology assessment in the Boston Globe, “Staying Ahead of Technology’s Curves.”) The general public, meanwhile, is too distracted by the devices technology provides to play any active role in the discussion. Of the positive steps I see on the horizon, the degrowth movement is the one I find most hopeful.