I’m not a natural “doomsayer.” But unfortunately, part of my job as an AI safety researcher is to think about the more troubling scenarios.
1) It is impossible for humans to create a biological weapon without knowing they did it.
2) A global pandemic caused by an extreme pathogen appearing out of nowhere would direct all suspicion toward the ASI, so it would be destroyed by an enraged public.
3) 3% of humanity's population won't be able to maintain all the required industrial complexes - the ASI would stop functioning in short order.
Great job. You are closer to realistic bio threats than you think!