Finally figured out how to clean up the jobs.
I first walked each FellTree job to see if its tile was still marked for tree felling. None were, so these jobs existed without a corresponding designated tile.
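For anyone wanting to repeat that check, something like this sketch should list each FellTree job and the dig designation of its tile. It walks the same global job list as the deletion loop below; `dfhack.maps.getTileFlags` returns the tile's designation bitfield, or nil for an off-map position:

```lua
-- Sketch: report each FellTree job's position and tile dig designation.
local link = df.global.world.job_list.next
while link do
    local job = link.item
    if job and job.job_type == df.job_type.FellTree then
        local des = dfhack.maps.getTileFlags(job.pos)
        local dig = des and df.tile_dig_designation[des.dig] or "off-map"
        print(("FellTree job at (%d,%d,%d): dig designation = %s"):format(
            job.pos.x, job.pos.y, job.pos.z, dig))
    end
    link = link.next
end
```

In my case every job printed a designation of None, confirming the jobs were orphaned.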
I cobbled together the following, with some of the code taken from advFort.lua:
function smart_job_delete(job)
    local gref_types = df.general_ref_type
    --TODO: unmark items as in job
    for _, ref in ipairs(job.general_refs) do
        if ref:getType() == gref_types.BUILDING_HOLDER then
            local b = ref:getBuilding()
            if b then
                --remove the job from its building's job list
                for i, bjob in ipairs(b.jobs) do
                    if bjob == job then
                        b.jobs:erase(i)
                        break
                    end
                end
            else
                print("Warning: building holder ref was invalid while deleting job")
            end
        elseif ref:getType() == gref_types.UNIT_WORKER then
            local u = ref:getUnit()
            if u then
                u.job.current_job = nil
            else
                print("Warning: unit worker ref was invalid while deleting job")
            end
        else
            print("Warning: failed to remove link from job with type:", gref_types[ref:getType()])
        end
    end
    --unlink the job from the global doubly linked job list
    local link = job.list_link
    if link.prev then
        link.prev.next = link.next
    end
    if link.next then
        link.next.prev = link.prev
    end
    -- Don't delete the pointers, because I think something is still hanging on to them.
    --link:delete()
    --finally delete the job
    --job:delete()
end
local job_link = df.global.world.job_list.next
while job_link do
    local job = job_link.item
    job_link = job_link.next  -- advance before deleting so the link stays valid
    if job and job.job_type == df.job_type.FellTree then
        dfhack.job.printJobDetails(job)
        smart_job_delete(job)
    end
end
I had to comment out the parts that actually freed the memory, or the save would crash -- my guess is there was a double free or something similar. As long as the doubly linked list was trimmed, the save process wrote out the trimmed list.
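Worth noting: if your DFHack build is recent enough to provide dfhack.job.removeJob, that function is meant to cancel a job and clean up its references through DF's own code paths, which would sidestep the manual unlinking (and the double-free question) entirely. A sketch under that assumption:

```lua
-- Sketch, assuming a DFHack build that provides dfhack.job.removeJob.
local link = df.global.world.job_list.next
while link do
    local job = link.item
    link = link.next  -- advance first; removeJob invalidates the job's link
    if job and job.job_type == df.job_type.FellTree then
        dfhack.job.removeJob(job)
    end
end
```

I haven't verified this on the same save, so treat it as an alternative to try rather than a known fix.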