No worries, I really didn't follow the tech stuff. As it turns out, I ended up looking at the highest price tag on the shelf *FIRST*, then looked at the difference between that and the next few down the line. This one offered the most memory, and I knew that for what I do, I'd lean hard on memory. I'm also running a higher end graphics card. I do graphics editing, and
http://www.filterforge.com/ can make my higher end graphics card and everything else scream, cry, and whine. Actually, it just throttles the hardware to max, where normally it runs whisper quiet. As far as using dual channel, Ganged, or Holo-emitter, I wouldn't know. It reports 9GB. My next upgrade (if I bother to upgrade this 'old' hardware) will be for Blu-ray, THEN for a 3D monitor. Wouldn't it be bitchen if someone made an actual 3D display for DF, still using only the character graphics display? Something like StoneSense, only top down instead of isometric? Well, a guy can dream.
But yeah, I have a beefy machine compared to the folks who get the $250 to $500 Wal-Mart machine. I paid roughly $1200 several years ago, and it would please me if Dwarf Fortress could take advantage of the iron I bring to the table.
The problem, in this case, is not that a large embark is 'unstable due to my hardware'; it is 'unstable due to allocating memory beyond the 2GB limit this was compiled with'. If the software is compiled incapable of allocating more than 2GB, it shouldn't offer the option, or at the very least it should have exception handling to prevent a crash-to-desktop. CTDs because the game tries to allocate memory beyond a limit compiled into the software are bad form. An error along the lines of "I cocked it up, let me save your game so you can recover" isn't best, but it's better than a CTD. "I could allocate more memory if you had it, and you don't" or "You have the memory, but I'm limited to 2GB" is even better. Internally caching 'slow use' data to disk... such as legends that can be read as needed by engravers, or... HELL, I don't know, but Toady may have an idea... using the hard drive to stand in for memory for data that is of more marginal use most of the time. Or creating a means for DFHack/StoneSense to access their data without them being in the same memory footprint and processor core, as I initially suggested; that would make me happy.
Look five years out from today... I have a 9GB machine. The reason I won't likely 'upgrade' this one much is that I *might* be buying a 12 or 16GB computer for $1200 in a few years. At which point, if nothing changes in memory handling, DF will *STILL* be running in less than 2GB of it, still on one core of a virtual 16- or 32-core machine, and still doing a CTD because of 'not enough memory'.
If there are zero plans to let DF *access* the memory required for a 16 x 16 embark, it shouldn't offer one in the first damn place. So much the better if DF could detect it was running on a 1GB machine even when it could address 'slightly less than 2', and respond accordingly WITHOUT a CTD. It's just my opinion, of course; Toady's current roadmap is the opinion that really counts. But if he could put a cost estimate on destroying memory-related CTD issues, he *could* float the cost in a 'donate to CTD destruction' fund and see if people will chip into it. I don't have much money I could chip in, but I would. Maybe others feel the same way: that they'd like to max out what is *possible* with DF on their hardware.
The point, to me, is not so much expanding horizontally vs. along the Z axis for a fortress. The point is, I have a 9GB machine with a 64-bit OS, and DF *allows* me to set up a 16 x 16 embark, so why the CTD? All the rest is just a matter of preference, and perhaps that *is* too big an embark. But DF invites me to do a big embark, then CTDs when I attempt it. WTF, man? Or I manage to get my 13 x 13 embark, I run StoneSense to admire it, expand the Z-axis buffer... and it blows up. WTF, man? Or I do a save because immigrants just arrived, and it hangs on the save because it hits the 2GB limit. "FUN" on this level is clearly out of my league.
As my current hardware becomes that $200 machine being sold at Wal-Mart, with 9 GB and a 64-bit OS, more people attempting to play DF for the first time are going to be saying, "WTF, man?" I really hope Toady takes the rapid pace of hardware expansion into consideration for his DF roadmap. I can see people with their 12 GB smartphones in the next 5 to 10 years trying to play DF... "WTF, man? Out of memory!?"