I wonder if DF might work as an 'interpreted' game?
Sure it can work. It's a matter of making "building blocks" out of functions. Game data is available, so one level of interpreting is to display it as 2D ASCII graphics even though the game data is in 3 dimensions.
Some tools already exist for this, such as DF Remote, which sends a cube of the map (x, y, and z coordinates) along with every object in it.
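A minimal sketch of that first level of interpretation, assuming the map cube has already been decoded into nested lists of display characters (the actual DF Remote wire format differs; this is purely illustrative):

```python
# Sketch: reduce a 3D map cube to a single 2D ASCII z-level.
# `cube` is assumed to be indexed as cube[z][y][x] and to hold display
# characters; real DF Remote data would need decoding into this shape first.

def render_z_level(cube, z):
    """Return one z-level of the map as rows of ASCII text."""
    return "\n".join("".join(row) for row in cube[z])

# Example: a tiny 2x3x3 cube with floors (.), a wall (#) and a dwarf (@).
cube = [
    [list("..."), list(".#."), list("...")],   # z = 0
    [list("..."), list(".@."), list("...")],   # z = 1
]
print(render_z_level(cube, 1))
```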
So those "building blocks" are low-level functions: simple info such as "what is the location of the object", "is it hostile", "is it food", "is it a weapon".
Then there needs to be some mid-level function that encapsulates an object in a more summarized form for the user.
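Those low-level building blocks could look something like this; the field names on the object record ("pos", "flags", "kind") are placeholders, not real DF data structures:

```python
# Sketch of low-level "building block" queries over a game object.
# The object is a plain dict here; the field names are assumptions.

def location(obj):
    """Return the (x, y, z) position of an object."""
    return obj["pos"]

def is_hostile(obj):
    return "hostile" in obj["flags"]

def is_food(obj):
    return obj["kind"] == "food"

def is_weapon(obj):
    return obj["kind"] == "weapon"

goblin = {"name": "Goblin Lasher", "pos": (10, 4, 120),
          "flags": {"hostile"}, "kind": "creature"}
print(location(goblin), is_hostile(goblin))  # (10, 4, 120) True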
An example: "House of Urist". It is a 3x3 room. Wood floors. Stone walls. Stone roof. It has a willow wood bed, a granite coffer, and a copper cabinet. There are clothes inside the cabinet. There is a backpack on the floor with 2 rations, and a waterskin with 3 units of drink. And where it is located on the map.
How much detail the user needs to know determines how those low-level functions on the objects get condensed into summarized, digestible chunks of "audio".
Another example: a "Goblin Lasher." It is hostile, it has a whip, and it is at some xyz location.
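A mid-level summarizer along those lines might just stitch the low-level answers into one spoken-style paragraph. Everything below (the room record, the phrasing) is an illustrative assumption, and the same shape would work for a creature like the Goblin Lasher:

```python
# Sketch: summarize a room into one audio-friendly paragraph.
# The data layout and wording are invented for illustration.

def summarize_room(room):
    parts = [
        f"{room['name']}. It is a {room['width']}x{room['height']} room",
        f"with {room['floor']} floors and {room['walls']} walls.",
        "It contains " + ", ".join(room["furniture"]) + ".",
        f"It is at x {room['pos'][0]}, y {room['pos'][1]}, level {room['pos'][2]}.",
    ]
    return " ".join(parts)

house_of_urist = {
    "name": "House of Urist", "width": 3, "height": 3,
    "floor": "wood", "walls": "stone",
    "furniture": ["a willow bed", "a granite coffer", "a copper cabinet"],
    "pos": (42, 17, 120),
}
print(summarize_room(house_of_urist))
```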
A higher-level function would provide data on how "House of Urist" is related to "Goblin Lasher."
If the goblin just entered the map, then there is not much relation with the house.
When the goblin moves closer, the House of Urist can serve as a point of reference for where the goblin is on the map.
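A higher-level relation function might simply report distance and compass direction between two summarized objects. The coordinate convention (y increasing southward, as on the DF map screen) and the phrasing are assumptions:

```python
# Sketch: describe how one object relates to another spatially.

def relate(a_name, a_pos, b_name, b_pos):
    dx, dy, dz = (b_pos[i] - a_pos[i] for i in range(3))
    bits = []
    if dy:
        bits.append(f"{abs(dy)} tiles {'south' if dy > 0 else 'north'}")
    if dx:
        bits.append(f"{abs(dx)} tiles {'east' if dx > 0 else 'west'}")
    if dz:
        bits.append(f"{abs(dz)} levels {'below' if dz < 0 else 'above'}")
    if not bits:
        return f"{b_name} is at the same spot as {a_name}."
    return f"{b_name} is {', '.join(bits)} of {a_name}."

print(relate("House of Urist", (42, 17, 120), "Goblin Lasher", (60, 5, 120)))
# -> Goblin Lasher is 12 tiles north, 18 tiles east of House of Urist.
```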
Machine learning is nice and all, but I think a framework that lets users define these object relationships and save them for future use and re-use would be more practical.
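One cheap way to make those relationships user-definable and reusable is to keep them as plain data, e.g. JSON rules naming two kinds of object and the report to generate. The rule schema here is purely an assumption:

```python
# Sketch: user-defined relationship rules persisted to disk for reuse.
import json

rules = [
    {"subject": "hostile creature", "object": "room",
     "report": "{subject} is near {object}", "max_distance": 20},
]

def save_rules(path, rules):
    with open(path, "w", encoding="utf-8") as f:
        json.dump(rules, f, indent=2)

def load_rules(path):
    with open(path, encoding="utf-8") as f:
        return json.load(f)

save_rules("relations.json", rules)
print(load_rules("relations.json")[0]["report"])
```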
The stories that you love about DF require a great leap of imagination to get from the screen to the narrative. The user interface is an absolute horror show from an accessibility point of view, and the user has to ignore big chunks of stuff and heavily imagineer their way around others.
True. But Toady does not create the stories either. He provides all this data to intermingle. The stories come from users reading the info and making their own connections.
So there is no need to try to make sense of the data in code. We just need to make those text and data objects make sense through audio output that describes game things that are presented in 3 dimensions.
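The audio side can stay a thin layer: a queue of description strings handed to whatever screen reader or speech engine the player already uses. A minimal placeholder sketch, where the speak hook only prints and a real build would call the platform's speech output instead:

```python
# Sketch: a thin audio-output layer that queues descriptions.
from collections import deque

announcements = deque()

def announce(text, priority=False):
    """Queue a description; priority messages jump the queue."""
    if priority:
        announcements.appendleft(text)
    else:
        announcements.append(text)

def speak(text):
    print(text)  # placeholder for a screen-reader / TTS call

def flush():
    while announcements:
        speak(announcements.popleft())

announce("House of Urist. It is a 3x3 room with wood floors.")
announce("Goblin Lasher has entered the map.", priority=True)
flush()
```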
But perhaps what might work would be for an experienced, sighted DF player to act as an interface to the interface. This would involve describing the environment, reporting on events, answering queries and interpreting actions. A bit like a D&D GM would, for instance, but using DF as the story engine rather than maps, dice and character sheets.
Fortress mode is already like this. The game story is ongoing. The 7 starting dwarves will go hungry and thirsty on their own, hunting vermin and drinking murky water without any user action from game start.
Let's start with a scenario: a default embark that has food, drink, and building materials. Enable auto-labor, even.
We start with 7 dwarves and a wagon at some xyz location. How will the player want to interact with the game?
The answer will then show coders how to approach it.
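For that starting scenario, one very small experiment would be a text query loop over the embark data, just to find out what questions players actually ask. The object data and command names below are made up for the experiment:

```python
# Sketch: a tiny query loop for the default embark.

objects = {
    "wagon": {"pos": (100, 100, 140)},
    "dwarves": [{"name": f"Dwarf {i + 1}", "pos": (100 + i, 100, 140)}
                for i in range(7)],
}

def answer(query):
    query = query.strip().lower()
    if query == "where is the wagon":
        x, y, z = objects["wagon"]["pos"]
        return f"The wagon is at x {x}, y {y}, level {z}."
    if query == "list dwarves":
        return ", ".join(d["name"] for d in objects["dwarves"])
    return "I don't know that one yet."

for q in ("where is the wagon", "list dwarves"):
    print(q, "->", answer(q))
```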
If we were smart, we could make an effort to record the requests and responses, with a view to at some point using machine learning techniques to replicate the interpreter. And who doesn't want voice-controlled fortresses!
Obviously the big difficulty is having interpreters generous with their time (probably not that hard actually) and synchronising player and interpreter session times (tougher). An alternative would be to conduct them asynchronously, e.g. via the forum, but you might get a glacial pace then.
Time goes fast when you're having fun.
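On the earlier point about recording requests and responses: that is cheap to do from day one. A sketch that appends each exchange to a JSON-lines log so the transcripts could feed a learned interpreter later (file name and fields are assumptions):

```python
# Sketch: log every player request and interpreter response as one JSON line.
import json, time

def log_exchange(path, request, response):
    entry = {"time": time.time(), "request": request, "response": response}
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

log_exchange("sessions.jsonl", "where is the wagon",
             "The wagon is at x 100, y 100, level 140.")
```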