Russia and China are very much at the forefront of organized social media misinformation campaigns. That doesn't necessarily mean they're the most dangerous vectors. Arguably, a much bigger problem is that a lot of misinformation is spread not by deliberate "push this narrative" decisions by anyone in power, but by the way social media algorithms boost "engagement". Fringe crap and conspiracy theories tend to get a lot of views and a lot of reactions (largely negative... at first), which leads Facebook, Twitter, and the like to conclude that people want to see more content like that, which drives more engagement, which gets more such content pushed, in an escalating spiral.

And over time, the garbage starts to look less and less crazy through sheer exposure. The first time you hear about Evil Pedo Tunnels, you dismiss it entirely because it is so obviously bullshit. But you keep seeing it, until your reaction shifts from "what are these people smoking" to "ah, this nonsense again". Then it gradually shifts again, from "ah, this nonsense again" to "you know, I've been hearing a lot about this; am I sure there's nothing to it?". Especially if you're also seeing a lot of other, crazier stories that have popped up and are shifting your credulity window. This isn't a conscious process, but a slow mental erosion that is far too easy to sink into, especially when it aligns with your existing biases.
This is made worse by grifters, politicians, and foreign agents catching on and deliberately exploiting it. (Circling back around: Russia has provably dumped a lot of garbage into the datastream designed to appeal to every political faction in the US, for the sole purpose of increasing partisanship. Its alignment with the far right has been known for a while, and a number of leftist personalities were just arrested as Russian agents.) But the pure mechanical grind is the really scary part.