The "tool" I'm using to do it is the one developed for DF itself. Basically, I'm trying to see whether it is feasible to build a tool library that allows optional external specification of key memory offsets (the crc32 computation/checking is all embedded in the lib). The tool itself just handles the logic and is version neutral. If it finds a binary it can't identify via the signature, it outputs the signature to prompt people to look for/make a new memory mapping it can take in, and also to add the signature and version string somewhere it can be loaded from, so that next time it will recognize that binary (periodically, I guess all the external data for older versions can be packed into a new release of the library to make things more convenient).
As for getting the full path name and such: no manual specification needed. If you do memory hacking on Windows and already use stuff like GetModuleBaseName, then I don't see any problem. The companion GetModuleFileName(A/W) will get you the full path. You don't need a helper DLL; the OS handles that. I've already written a proof-of-concept tool that automatically finds the DF process, gets its module, checks its signature, and outputs the version if it is recognized, or the signature itself if not (that's how I got the above crc32 values in the first place).
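For reference, a Win32-only sketch of that find-the-process step, using EnumProcesses/GetModuleBaseNameA/GetModuleFileNameExA (the Ex variants from psapi are needed for another process's modules). The executable name "dwarfort.exe" is an assumption; adjust to whatever the current DF binary is called. Link with psapi.lib:

```c
#include <windows.h>
#include <psapi.h>
#include <stdio.h>

int main(void)
{
    DWORD pids[1024], bytes;
    if (!EnumProcesses(pids, sizeof pids, &bytes))
        return 1;

    for (DWORD i = 0; i < bytes / sizeof(DWORD); i++) {
        HANDLE h = OpenProcess(PROCESS_QUERY_INFORMATION | PROCESS_VM_READ,
                               FALSE, pids[i]);
        if (!h)
            continue;

        HMODULE mod;
        DWORD needed;
        char base[MAX_PATH], full[MAX_PATH];
        /* First module returned is the executable itself. */
        if (EnumProcessModules(h, &mod, sizeof mod, &needed) &&
            GetModuleBaseNameA(h, mod, base, sizeof base) &&
            _stricmp(base, "dwarfort.exe") == 0 &&        /* assumed name */
            GetModuleFileNameExA(h, mod, full, sizeof full)) {
            printf("found pid %lu: %s\n", pids[i], full);
        }
        CloseHandle(h);
    }
    return 0;
}
```

Once you have the full path, you can open the file and feed it to the crc32 routine.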
@Jifodus
Using the PE header has just one dependency: the checksum value has to be correctly set in the first place. I have not verified it, but somebody in the wiki discussion mentioned that the checksum for the DF binaries is unset, meaning all zeros, meaning it can't be used to differentiate between versions. They DID mention reading the timestamp, though. That could work in most cases, unless somebody deliberately messes with the timestamps, or Toady's machine suffers a case of a broken CPU clock (which would render the machine unusable in most cases anyway). The crc32 is the "sure kill" way, but that doesn't necessarily make it the "minimal" way. Other cheaper methods (like the timestamp) could work; crc32 just adds a perhaps-not-so-needed robustness.

My main reason for starting this thread isn't to push for the use of my method. Rather, it's to throw this into open discussion so that memory hackers know what methods are out there to identify versions (previous to this, there was almost nil info on this; everybody just did mysterious constructions in their own forge). By having this discussion thread, and perhaps later documenting the useful stuff in the wiki, hopefully we can fill the version-identification gap and make life easier for tool makers in the future. Also, hopefully, all tools will be somewhat backward compatible, instead of having to keep a separate binary for each version.
I'll see if I can make my testing program do something actually useful and release it as a beta tool to showcase crc32 versioning. I'm just not sure whether I want to wait until I put in the optional external memory maps. Currently, although the logic of the tool is version independent, only the mapping for 33b, which I am using for testing, is built in. I guess I'll either have to add in the mappings for later versions, or add the external maps so that people can specify new mappings via files, before it can actually be tested. Currently, when I'm playing in between coding, I use the tool to list dwarves who are unhappy or injured (showing the most serious injury level). There is also a list-all-creatures mode, which is used more for exploring values in the creature structure.
edit: Btw, I'm using this guy's assembly implementation for the crc32 computation.
[ December 10, 2007: Message edited by: sphr ]