The world simulation often gets blamed for the slowdowns in .40.24, but most of the actual evidence I've seen people present seems to suggest that it's at best a minor contributor to the issue. One experiment to try (certainly not conclusive, but suggestive of an upper limit to the amount of lag world simulation might cause) would be to do this:
1. Set up a fort with FPS problems such that it can run for 2 weeks without any pausing or input from you, then let it run for 2 weeks and time how long that takes.
2. Abandon/retire your fort and start a new one (or reclaim) in the same world. Time how long the calendar at the beginning takes to run 2 weeks' worth of world simulation.
3. Double that pre-game time, to account for the fact that switching between the simulation and the fort might add extra lag.
4. Divide the doubled pre-game time by the total 2-week time in-game. That ratio, call it x, is a reasonable upper limit on the proportion of game time DF was spending on world simulation in your fortress. It follows that the lower limit on the proportion of game time spent on your actual fortress, rather than world-sim overhead, is 1 - x.
5. Take the average FPS during the 2 weeks you were measuring (FPSavg). Then FPSavg / (1 - x) is what your FPS would have been with the world-sim share of each frame removed, so the formula FPSavg / (1 - x) - FPSavg (equivalently, FPSavg * x / (1 - x)) gives a good guess at an upper bound on how much FPS world simulation was costing you.
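The arithmetic above can be sketched in a few lines of Python. The function name and the numbers in the example are mine, purely illustrative; you'd plug in your own timings.

```python
def world_sim_fps_cost(pregame_seconds, ingame_seconds, fps_avg):
    """Upper-bound estimate of FPS lost to world simulation.

    pregame_seconds: wall-clock time the pre-game calendar took to run
                     2 weeks' worth of world simulation
    ingame_seconds:  wall-clock time the fortress took to run 2 in-game weeks
    fps_avg:         average FPS observed during those 2 in-game weeks
    """
    # Double the pre-game time to allow for extra lag from switching
    # between the simulation and the fort.
    x = 2 * pregame_seconds / ingame_seconds  # upper limit on world-sim share
    fortress_share = 1 - x                    # lower limit on fortress share
    # Hypothetical FPS with world sim removed, minus the FPS you actually saw.
    return fps_avg / fortress_share - fps_avg

# Hypothetical example: 30 s of pre-game simulation, 1200 s of in-game
# time for the same 2 weeks, averaging 40 FPS.
cost = world_sim_fps_cost(30, 1200, 40)
print(round(cost, 2))  # -> 2.11, i.e. world sim cost at most ~2 FPS here
```

Note the result is deliberately pessimistic twice over: the pre-game time is doubled, and the pre-game calendar has no fortress competing for CPU, so the real cost is likely well below this bound.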