There is this idea of a thing called a philosophical zombie. It's basically a robot made to resemble a human in every way, but lacking a soul or anything like that. But if it is programmed to think it's alive and truly sentient... is it truly alive and sentient?
If you took an insect and expanded its consciousness using cyberpunk wizardry to human or even higher levels... is the resulting consciousness truly sentient?
Well, if you want to expand consciousness, you'll have to define the fundamental unit that is able to carry conscious perception.
The fundamentals of consciousness are basically triadic processes (see the rough sketch after the lists below):
1. Subject (idea)
2. Sign (form taken from sensory experience)
3. Object (reference)
* Subject + Sign = Experience [used in making and retrieving memories / episodic memory gets broken into pieces and recombined]
* Sign + Object = Convention [creates shared semiotic environment for communication, e.g. inner dialogue, allows you to predict future]
* Object + Subject = Perception [experience that is not semiotically mediated / forms raw episodic memory, only happens in present]
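Just to make the structure concrete, here's a rough sketch of the triad as plain data. This is purely my own illustration in Python; the class names and pairing functions are made up for this post and don't come from any real model or from DF's code:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Subject:
    idea: str        # the "idea" pole of the triad

@dataclass(frozen=True)
class Sign:
    form: str        # form taken from sensory experience

@dataclass(frozen=True)
class Object:
    reference: str   # the thing in the world being referred to

def experience(subject: Subject, sign: Sign) -> tuple:
    """Subject + Sign: pairing used for making and retrieving episodic memory."""
    return ("experience", subject.idea, sign.form)

def convention(sign: Sign, obj: Object) -> tuple:
    """Sign + Object: shared convention, e.g. a word used in communication or prediction."""
    return ("convention", sign.form, obj.reference)

def perception(obj: Object, subject: Subject) -> tuple:
    """Object + Subject: raw, non-mediated perception, only in the present."""
    return ("perception", obj.reference, subject.idea)
```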
As you can see, Dwarf Fortress doesn't simulate any of this even remotely.
Another problem is that semiotically mediated processes are potentially infinite. E.g. you can multiply semiotic units by other semiotic units in a potentially infinite manner, endlessly creating new kinds of symbols and recombining them into newer ones. This system is insanely powerful compared to computer algorithms, which generally work in a linear manner. Linear processes collapse when one step / line of code gets disrupted. But the brain is like a computer carrying an "algorithm" that is able to recombine itself in an adaptive manner and saves its own past recombinations as a part of itself. Imagine trying to create something like this. It's a problem that must be addressed when attempting to create a digitally sentient being.
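To show what I mean by the recombination blowing up, here's a toy sketch (again hypothetical Python, not anyone's actual model): every pair of existing signs can fuse into a new compound sign, and the system keeps all of its past products as part of itself.

```python
import itertools

def recombine(signs: set[str], rounds: int = 2) -> set[str]:
    """Fuse every pair of signs into a new compound sign, keeping the old ones."""
    vocabulary = set(signs)
    for _ in range(rounds):
        new = {f"({a}+{b})" for a, b in itertools.combinations(sorted(vocabulary), 2)}
        vocabulary |= new   # past recombinations stay part of the system
    return vocabulary

# 3 starting signs -> 6 after one round -> 21 after two rounds, and so on:
print(len(recombine({"fire", "water", "dwarf"})))   # 21
```

The vocabulary grows combinatorially with each round, while a fixed-length sequence of instructions just marches through the same steps; that's the contrast with a linear process.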
Wait... damn... in the context of DF, it would mean that the Dwarf Fortress code would become self-aware and start coding itself.
That would eventually lead to passing a technological singularity (version 1.0), and then we would all be... slaves to the AI, to Armok.
Yes, now I see it. The future - it is inevitable.