What, exactly, is a "singularity level" AI?
I mean... I've heard the term "singularity" plenty of times before. But it's got kind of a nebulous definition.
Depending on who you talk to, that's either the point where technology takes off so fast that we can't predict what's going to happen next, or the point where the line between man and machine becomes invisible, or something else entirely.
I guess I'm going to assume you just mean that it's an insanely smart, insanely powerful, Skynet-type AI.
So... Why do they need to defeat it? What horrible things is it doing? Are they actually bad things? Or do people just not like having an AI in control?
Was it supposed to be in control? Was it supposed to be under control? Did it break some kind of safeguards? Did humanity just underestimate its creation?
What kind of creation are we talking about? Just a big computer in a single location? A network of fairly distinct computers? A planet-spanning distributed intelligence? Is it even recognizable as a computer any longer?
Is it self-aware? Is it rational? Can it be reasoned with?
I mean... Ultimately, it's your story. You can make things as (im)possible as you want...
But the options are just about endless:

- You could have your humans convince the AI to give up the fight.
- Maybe the AI gets bored and takes a spaceship out for a cruise around the universe.
- Maybe there's a killswitch that everyone forgot about, and some plucky band of heroes stumbles across it.
- An elite hacker shuts the thing down.
- The AI is dependent on some specialized bit of hardware that can be destroyed.
- Some kind of memory leak slowly kills it.
- Or maybe it's just misunderstood and didn't realize that you couldn't actually hug people with nuclear arms.