Not a decent comparison. The AI prediction, for most scientists and experts, isn't a matter of if, but of when, while the Mayan calendar thing was mostly a combination of seasonal fearmongering + excessive Dan Brown-ism.
I'll admit that was a shoddy comparison, but that "when" is still a long way off. Technology, as quickly as it advances, is still maturing when it comes to robotics. Maybe in a while we'll have that lovely apocalypse all those edgy 14-year-old YouTube commenters have been dreaming about, but speculation at this early point doesn't lead to much beyond fear.
Incidentally, one of the arguments for the imminence of the singularity is that because technology builds on itself and advances exponentially, humanity will achieve human-level general intelligence sooner than most people would predict, and as that first human-level general AI learns more and more, its capabilities will grow exponentially until, boom, singularity. Coincidentally, this singularity is always predicted to occur within the singularity proponent's lifetime.
I'm all for a cool science-fiction future happening now, but I'm not saying this is a concrete argument for or against how far off a human-level AI is. It's more of a theory based on trends observed across human civilization's history, and lots of futurist predictions don't work out.
Also, YouTube comments are brain-bendingly stupid. I was watching a bunch of cool videos from Boston Dynamics earlier, and on every video there were people complaining about how the robots will obviously be used as killer robots for big bad mister government. Jesus Christo, it's just science, calm your caveman tits, people. Stop freaking out about the unavoidable horrors of future warfare and think about how cool this shit is.