There is an unavoidable performance degradation that occurs, apparently just because of items having ever been on your map.
I can only speculate, but it might just be a giant array or vector of items that never really gets cleaned up. As the game goes on, this vector keeps filling up with items that, even when they are destroyed or permanently removed from the map, still take up some sort of reference slot in memory, and the game keeps checking past them.
I strongly suspect that this is the case as well. Are there any utilities that will allow you to view the entire item list and permanently destroy items? Can DFHack do this?
It's not an ideal solution, but might be something you could try if you're desperate.
I also noticed this problem, so now I am trying to produce a bug report with:
*a save from before the event, with decent FPS
*a save from about a season later, with the same number of items and creatures and half the FPS
This should help Toady track this problem down. It may be related to clothing (even in games without clothing items), so maybe the upcoming fixes will end this.
That's not the way this works; it's a structural programming issue.
If this works the way I think it does, Toady has to be using either a vector or a custom-made data structure that behaves in vector-like ways.
For those with no programming knowledge: normally, you declare a variable in a program as a chunk of memory designated to hold a value. In the case of objects in the fortress, this would be either a reference to another chunk of data that tells you what that object is, or a container object that holds multiple references: where the object is, what the object type is, what it's made of, what state it happens to be in, and whatever other map data needs to be contained.
An array is basically just a way to declare a large chunk of variables all at once and reference them by position. So, instead of having Number, you could have the array Number[10], which is ten Numbers in a row, and asking for Number[4] would give you the fifth Number's value (because 0 counts as the first).
The problem with arrays is that an array needs to have a defined, specific number of chunks of data that cannot ever be changed once defined without deleting the whole array. (That is, you can change what is in the specific data fields of the array, but you can't change the number of fields that the array contains.) So, you can't decide at first that you only need an array that holds 10 objects, then decide you really needed 12; you have to create an entirely new array to add more values.
So, there is something called a vector, which is basically a special advanced data structure (as opposed to the basic data structure that an array is) that grows as need be, so that you can have a theoretically unlimited number of data fields. (To be precise, a C++ std::vector does this by keeping one contiguous array and copying everything into a bigger one whenever it fills up; the "linked list of arrays" version I describe below is closer to a deque or a custom block list, which is the other possibility I mentioned.)
Sounds like that solves all the problems with arrays, right? Not at all. The linked-list flavor is the bane of many programmers, because a linked structure is made of blocks of arbitrary length that simply tell you, at the end of each block, where to find the next one. If you have a single array with 100 values in it, the computer only has to do one indexed lookup to find the 100th data value. If you have a linked structure, you might have to follow a trail of a dozen pointers all over your memory until you finally find the data you were looking for... EVERY TIME YOU SEARCH. And DF has to search 100,000 times every frame once you get going.
The probable problem is that what we are dealing with is a very, very large list inside of a very, very large vector.
When a programmer deals with a large list in an array or vector, even one where they have to continually add or remove values, they tend to avoid sorting it like the plague. (Think of it this way: if you have a list with 100,000 entries, and you erase something from it, and think the blank space looks ugly, are you going to copy the entry below into the blank spot, erase the entry you just copied, then copy the entry below that one up, over and over and over again, every single time you mark something off the list? No; it's a massive waste of time and resources, especially when you are constantly keeping track of things like a tree being chopped down into a log, then that log being turned into a barrel.)
What this would result in, however, is that eventually you have a list a million entries long where almost all the entries are blank. And in order to actually find anything in your list, you have to search through all the blank pages just to find the few pages that still have some information on them.
Hence, the problem I am speculating might be happening is a structural problem that could only really be solved by an occasional sorting of this list, no matter how long that sorting might take.
This would probably be best done on the year change, when all the other heavy lifting and "spring cleaning" is taking place, anyway, and would basically result in a "don't bother waiting, just go make yourself dinner and come back when you're done" sort of year-end wait, but it would sweep up the gradual FPS death that occurs after several years of a fortress running.
EDIT:
Ah, damnit, I just realized there's a faster way to do the same thing as that previous sorting pass, since there's no need for the data fields to be in any particular order.
You could simply have a pair of pointers: one starting at the front of the list and stopping when it hits a blank space that needs to be filled, and the other starting at the end of the list, working backwards, stopping at any memory field that still has data and transferring it forward (and garbage-collecting completely emptied arrays).
However, that depends on actually being able to traverse the list backwards somehow, whether via a doubly linked list or by giving the second pointer a memory of how it got to where it is.