I'm not intimately familiar with the Bible, but I don't believe there's any part where it says 'don't bother trying to heal the sick; God will take care of it'. If you take religious texts literally, things tend to go poorly. Some people say that's because taking any metaphor literally tends to go poorly; others say it's because religions tend to be products of their times, and we're past those times.
I mean, a lot of people seem to read those mandates as made for their times, applying nowadays only where morality and closeness to God are concerned, rather than as practical warnings like, say, 'eat shellfish and you'll probably get sick and die'. Or 'STDs spread a lot faster when people aren't faithful'. Or 'stop screwing around with that guy's stab wound, you're making it worse, you incompetent fuckwits'.
Whether gender roles are right and true and proper can be debated, blah blah blah. Mostly I object because 'it's counterproductive to the worldview I want to create' is an argument anyone can use, so it isn't all that valid a criticism: traditionalists can just as easily complain about the progressive "muckery regarding gender roles and relationships...sexual practices"...you see where I'm going with this?
Also, sustainability? Isn't there a whole movement around being the stewards of the earth?
Religion isn't anything but how people interpret it. If they interpret it in a way that leads to Bad Stuff, it will lead to Bad Stuff. If they interpret it in a way that leads to Good Stuff, it will lead to Good Stuff.