I am glad an insider is pointing out the risks. Keep in mind he is a chief driver of those risks. He is acknowledging major but lesser risks to keep attention away from the extinction-level risks. Even nuclear-weapon proliferation and use is a far smaller problem than superintelligence. A new species will have goals far different from humanity's and massive powers beyond our understanding to achieve them. Humanity used its intelligence to dominate the planet and drove thousands of species to extinction without really noticing. What happens when, next to a superintelligence, we are the algae?
The only real option, which looks completely infeasible, would be a one-world government halting advances until far stronger safety measures are in place. And I say that as a right-wing libertarian who hates government. But by the time enough people recognize how big the threat is, it will be too late.