I was poking around in the git repository, and it looks like things have gotten quite a bit more complex since the days of just having to update a memory offset to make a particular tool work. Is that accurate?
Yeah. The way old DFHack did things, all the addresses (which give the location of statically-allocated structs, including pointers to dynamically-allocated stuff) and offsets (which give the location of fields within dynamically-allocated structs) were loaded at runtime from one big XML file.
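To make the old scheme concrete, the runtime-loaded XML would have contained entries along these lines — the tag and field names here are hypothetical, not DFHack's actual schema:

```xml
<!-- Sketch of the old runtime-loaded layout data (hypothetical schema). -->
<version name="v0.31.xx">
  <!-- address: where a statically-allocated struct (or a pointer to a
       dynamically-allocated one) lives in DF's address space -->
  <address name="creature_vector" value="0x92f5a8"/>
  <!-- offset: where a field sits inside a dynamically-allocated struct -->
  <offset name="creature.happiness" value="0x4c8"/>
</version>
```

Updating for a new DF build meant finding the new numbers with a memory scanner and editing values like these by hand, which is why a text editor and Cheat Engine were enough.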
Tools needed to update for a typical new version of DF:
- Cheat Engine (or similar)
- a decent text editor
In the new DFHack, all the offsets (and a fair number of addresses) are gone; all structs are now described using a system of XML files, which get translated into C++ header files and compiled into the DFHack binary.
Tools needed to update for a typical new version of DF:
- git
- cmake
- perl with XML::LibXML and XML::LibXSLT
- MSVC++ 2010 (on Windows) or GCC (on Linux)
I guess the new system is meant to improve speed (nice for stuff like stonesense) and memory usage (very important, since the new SDL fakery means DFHack now lives in the same memory space as DF), but I'm glad it's not my job to update it.