A couple of things to consider here. For one, there's more to it than just "good graphics vs. bad graphics". You've got High graphics, Low graphics, Good graphics, and Bad graphics.
High graphics are the bleeding-edge innovations of the industry, showering you with the most detailed and fancy digital magic this day and age has to offer. Low graphics, not so much.
Good graphics are graphics that are just innately pleasing, fitting, or easy on the eyes. People like to look at good graphics because they simply look good. Bad graphics, not so much.
There's really no relation between the two. A game can run on ancient (or just bad) graphics technology and still look beautiful, which is why we're sometimes surprised by how good some old games look. On the flip side, a game can have the absolute best graphical enhancements the industry has to offer (and the hardware to run them) and still just look like hi-def crap.
The "hate on graphics" comes from certain gaming companies putting most (sometimes all) of their eggs in the "High graphics" basket. This causes the other areas of that game to suffer due to resource constraints (time not being the least).
High-graphics games are bought by people who are impressed by shiny graphics. Hell, to some degree, just about all of us fall into that category. Even if something doesn't look particularly "good", we can still be impressed by incredibly detailed graphics.
The problem comes when people buy lots and lots of these games, which causes the industry to react by making more High-graphics games. The result is an industry imbalance that leaves good gameplay and solid plots behind in favor of the latest and greatest HDR lighting update.
So it's not really disdain for High graphics; it's disdain for the commercial culture that springs up around a graphics-centric industry. We want games that are actually fun to play. If we want something pretty to look at, we'll buy a painting or rent Baywatch.