Your AI would need to be prescient and possibly omnipotent, and at that point, the only thing stopping the AI would be what, a virus, maybe? It'd probably be easier to just drop a nuke or peddle fentanyl if you didn't like a nation, anyway.
This is a bit lazy. We already have CRISPR: the tech is undeniably getting more and more accessible.
Of course it's lazy: why work harder on some maladapted theory of future warfare and viral bioweapons when literally anything else is easier? With a little luck, we're not likely to see bloodshed on a warfare scale among superpowers again, and with a little more hope, we'll see fewer nations used as pawns in proxy wars. You can make a nation suffer with tariffs and cyberweapons without bloodshed or a declaration of violence; we're seeing this already.
Seems you agree anyway, since prescient/omnipotent AI would effectively be 'singularity tech,' or else you're trying to steer the agreement towards it to appeal to your concerns. You're communicating in vagueness about how the implications have been 'thrown into sharp relief' and how we need to come to terms with them, because Covid/pandemics/AI. Yes, pandemics are bad. Yes, Covid has helped us understand a lot about pandemics. No, AI is not going to magically fix our shortcomings in handling pandemics or outright stop the next bug. We don't even have the computational power to simulate protein folding in a scalable way; that work is literally crowdsourced to the public. No, quantum computation isn't going to spontaneously fix this, either.
Call me a Luddite if you'd like, but in my opinion you're jumping at shadows to steer the conversation towards a specific conclusion of yours.