Bay 12 Games Forum

Author Topic: System Memory issues to consider on a 'refactoring' of DF memory handling  (Read 3624 times)

Romaq

  • Bay Watcher
    • View Profile

http://www.bay12games.com/dwarves/mantisbt/view.php?id=136 is the bug report and carries the details, but an actual resolution may take some time going through the code, and perhaps some heavy hitting.

  • Data storage in memory needs to be retooled. I have 9G of memory, but I can only use less than 2G for DF. If I try too big an embark, crap blows up. If I try to use the latest DFHack + StoneSense (with sdl.dll), crap blows up. I can watch this happen in Task Manager; it works every time I find a way to push memory requests in DF.
  • Instead of the hacked sdl.dll for DFHack + StoneSense, please consider a data port that can be switched on to offer the same data sdl.dll provides, but a) without the hack to sdl.dll and b) without StoneSense & DFHack having to wedge into that same tight 2G memory area and onto the same core.
  • Said data port *could* be a simple web server where all the data and maps people ask for could be 'read' off the port. http://localhost:4319/ gives a quick link to all maps and appropriate data in .xml, including 'legends maps' and whatnot. Port 4319 is 0x10DF in hex, for what it's worth. The point being, this would make 'live' data available to all the utilities that can easily read it as a 'web page', with those utilities sitting outside of DF's memory space and single CPU thread, a drawback of the sdl.dll hack. This could also report the current date, season, weather and other game states useful to SoundSense. If SoundSense and DF's "web" server had simple streaming, DF could push the current game state into the stream, and SoundSense could catch changes more easily than by grepping a "tail -f" of the game log.
  • Along with memory storage being abstracted, hopefully with a built-in means to avoid crashes and with 64-bit multi-core hardware in mind, perhaps game data could be put into an actual database, or at least there could be some long-term plan for trading the capacity and speed of relatively fast newer hard drives against holding infrequently used blocks of data in memory. Sure, I'd love to have DF in a 64-bit Windows compile, but to some extent I'd be very happy to allow DF to use some disk space, perhaps to have 'FUN on HOLD' sit in a disk file ready to use, but until that point *NOT* taking up memory. When I hit the magic tile that includes FUN, it would be good to SAVE the freaking game, then bring the inactive information hot into memory, and if the game crashes, at least have it report that more memory is needed. Perhaps that specific example would not work in DF's game design, but the general idea is 'if it mostly sits & waits, it should sit & wait on the hard drive'. Perhaps that is already mostly being done... I'm just trying to think of ways to get more DF without CTD due to the 2G memory barrier.
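The 'data port' idea in the second and third bullets can be sketched quickly. Below is a minimal illustration in Python (not DF's own C++): a tiny HTTP server bound to the loopback interface, serving game state to outside tools. The port number comes from the post; the state fields, the `/state` path, and the JSON format are invented for the sketch.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical game state; a real implementation would read DF's
# internal structures instead of this stand-in dict.
GAME_STATE = {
    "date": "105-03-12",
    "season": "Early Spring",
    "weather": "Rain",
}

def state_payload():
    """Serialize the current game state for external tools."""
    return json.dumps(GAME_STATE).encode("utf-8")

class DataPortHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/state":
            body = state_payload()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep the console quiet

def serve(port=4319):
    """Run the data port on its own thread, leaving the game loop alone."""
    server = HTTPServer(("127.0.0.1", port), DataPortHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A viewer like StoneSense or SoundSense would then poll http://localhost:4319/state from its own process and its own core, never touching DF's address space.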

Anyway, those are my thoughts and reflections as I tuck DF away for a while again. I'll come back again at some point, I have before. But repeatedly crashing out because I cut too close to the 2G limit is more FUN than I can work with at the moment. I'll try again after another major revision or two. Great stuff! Just... the http://www.bay12games.com/dwarves/mantisbt/view.php?id=136 bug needs repairing, or at least long-term plans made in that direction. Please. Something, anything, other than CTD. Could a donation fund be set up just for people to pay for time dedicated towards quashing 2G memory CTD issues? Perhaps for the purchase of a compiler library or kit for improved memory management? If little money is put into a dedicated fund, the vote must mean it isn't a big enough deal for folks. But if people are willing to throw money at a 'make memory CTD go away' fund, perhaps that could get some priority time and needed tools? (shrugs) Won't get told 'no' if I don't ask. :)

BTW, one other happy-fun thought... if 'frame rate / display frame rate' means the display code is heavily abstracted from the rest of the game, it would be a joy to know display is on a separate thread, and thus could run on a separate core of my multi-core machine. That won't help any with memory CTDs, but it would be a happy step towards making DF support multi-core hardware. :)
Logged

knutor

  • Bay Watcher
  • ..to hear the lamentation of the elves!
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #1 on: December 03, 2011, 10:42:03 am »

How does your dual-channel controller handle an odd allotment of RAM?  My mobo manual says I gotta stick in even G's.  Of course my PC's mobo is maybe 3 yrs old.  Does the BIOS up in there even recognize your extra 7G?  Mine doesn't.  Do you get a cycle at bootup that shows all 18G of dual-channeled memory?

I don't question that the memory needs retooling.  You're right.  Windows should not be showing "Not Responding".  But if crap blows up on you so easily, it may just be a result of how your virtual memory is being handled, and nothing else.  Mine's on a RAID0 array on a separate array of HDs.  My video seldom tears, and I have my minimum FPS at 20; what is yours set to?  This vidcard on here is so old, it's got gray circuits.  Often I see my FPS being reported at 6/7 in the game, and I wonder how it's going below my 20 limit.  That is why I feel like you do, some retooling is necessary.  Some measure of pixel scaling has to be done.

Not sure I understood the web server data suggestion, but if you're recommending Toady force us to load save games off the web, like that latest over-copy-protected release of Settlers, I gotta say, Hell NO!  I'll not be loading single-player saves off the web, at launch, at reload, or at any time.

Sincerely,
Knutor
« Last Edit: December 03, 2011, 10:50:14 am by knutor »
Logged
"I don't often drink Mead, but when I do... I prefer Dee Eef's." -The most interesting Dwarf in the World. Stay thirsty, my friend.
Shark Dentistry, looking in the Raws.

Artanis00

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #2 on: December 03, 2011, 01:45:32 pm »

Quote from: knutor
Not sure I understood the web server data suggestion, but if you're recommending Toady force us to load save games off the web, like that latest over-copy-protected release of Settlers, I gotta say, Hell NO!  I'll not be loading single-player saves off the web, at launch, at reload, or at any time.

Sincerely,
Knutor

Dunno about the rest of that post, but the web server suggestion was not for the game to be hosted online. The localhost domain is a loopback to your own machine. He's suggesting that DF essentially be a local client-server architecture, which is a reasonably simple way to separate the UI code from the game code by putting them in different processes that speak over HTTP or another transfer protocol; he's also suggesting that third-party applications be allowed access to game data.

This will likely never happen, however. I hope something like it comes to pass myself, but not until DF hits 1.0 at least. Toady has a hard stance against providing API access to game data, since whenever he changes it, third-party apps would break and he does not want those complaints.
Logged
Git - fast, efficient, distributed version control system
Github - Free public repositories, issue tracking, wikis, downloads...

Romaq

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #3 on: December 03, 2011, 07:01:56 pm »

@knutor, I don't know how the motherboard does 9 GB of memory. "My Computer" reports 9GB, taskman reports 9207 MB. It's all off-the-rack the best I could get at the time. I'm not prepared to dig up what ASUS motherboard it is.

I'm not recommending Toady force us to load save games off the web. My primary desire is to look at ways of 'doing the same thing without CTD when DF asks for more than the 2GB limit'. Currently, DFHack + StoneSense use SDL.DLL to interact with DF, and they have to 'go along for the ride' within the same 2GB limit and on the same processor core thread. Not having crap blow up is my desire. Offering what the SDL.DLL hack does without forcing StoneSense to occupy the same memory and core, by providing a 'thin server', is one approach. It's just an off-the-cuff, shot-in-the-dark guess at a way of not having crap blow up CTD on memory.

Perhaps the solution is to look at 64-bit windows compiles, and stop supporting 32-bit windows versions. That may take some more work, and some people still running 32-bit Windows OS will be unhappy. Supporting 2 flavors of Windows is going to be a pain.

Fixing memory requests to "Don't freakin' CTD on memory requests" is the *real* solution, but far more involved and not as sexy. People who don't attempt 16 x 16 embarks won't notice or care. Once proper memory handling is implemented, people who pig out on 16 x 16 embarks "won't notice or care" either, it will either just work or they will get a proper exception error telling them, "Your machine can only handle a 100 tile embark, such as 10 x 10. Upgrade your memory or live within your means!" They may not like it that their reach exceeds their grasp, but it's not a CTD with loss of everything JUST because you added one too many visible z levels in StoneSense, or something in the game in progress happened (immigrants? a siege?) that pushed the memory just a little too far.
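That 'proper exception error instead of CTD' behavior is ordinary defensive allocation. A sketch in Python (in DF's C++ the equivalent would be checking the request against a budget and catching std::bad_alloc); the function name, byte counts, and messages are all invented for illustration:

```python
def try_embark(width, height, bytes_per_tile, budget):
    """Refuse an oversized embark up front instead of crashing later.

    budget stands in for whatever the process can actually address
    (just under 2 GiB for DF's 32-bit Windows build).
    """
    needed = width * height * bytes_per_tile
    if needed > budget:
        max_tiles = budget // bytes_per_tile
        side = int(max_tiles ** 0.5)
        raise ValueError(
            f"A {width} x {height} embark needs {needed} bytes, over the "
            f"{budget}-byte budget. Try {side} x {side} or smaller."
        )
    try:
        return bytearray(needed)  # the actual allocation
    except MemoryError:
        # Even a within-budget request can fail; report it, don't CTD.
        raise ValueError("Allocation failed: save the game and retry smaller.")
```

The player who asks for too much gets a message and keeps their fortress; nobody else notices the check exists.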

As to my virtual memory settings, it would not matter. I have plenty of free, real memory. 16 x 16 embarks are simply no-go, I get the crash-to-desktop. I can get 13 x 13 embarks, but that's really pushing things. Sometimes during a save (just before memory clean-up) I get hangs that require I shut down the game, but at least I have a save. Task Manager shows "Memory (Private Working Set)" with how much memory Dwarf Fortress.exe is using. Every time I get a CTD, it's at 1.9G something, and it is related to use of StoneSense, DFHack, I *think* a game-save from a season change (just after the save but before cleaning up memory), I'm not sure beyond that. I just know if 'something' requires more memory than 1.9G by my taskmanager view, I *WILL* have CTD. According to http://www.bay12games.com/dwarves/mantisbt/view.php?id=136 this is a compiled limit. It is possible for a 32-bit program to use 3 to 3.5 GB with a compiler switch, which would be fine. The basic CTD because of hitting *ANY* memory limit is ... less than desirable.

It should not matter if my hardware has 1 GB, 10 GB or 100 GB, or if the compiler has a 2 GB, 3.5 GB or 50 GB hard limit. DF is a memory-bound process. It should a) know its limits, and know how to live within them without CTD, and b) use every possible means to trade use of real memory for virtualization to disk where it makes sense to do so, possibly with settings in the config. Word processors do this so you can edit huge documents you can't possibly fit into real memory. GIMP allows you to edit huge images you can't possibly fit into real memory. DF is much more complex than either word processing or image editing, but the concept is the same. Those programs at least allow you to configure the 'speed for disk caching' trade-off, and neither does a CTD that I'm aware of just because you bit off more than was wise to chew. And usually, if you can live with 'disk-bound slow', you can at least proceed as planned while working on a way to carve up the project into more manageable chunks. CTD because of memory requests = "Screw you! You're a pig, and I'm not even going to talk to you!" Which is particularly odd, since the game allows for 16 x 16 embarks it cannot possibly deliver.

@Artanis00, the API ideas are just a stab at solving the larger CTD issue. If I did not have CTD, I wouldn't care so much. Perhaps the hacked SDL.DLL can be retooled to offer a 'thin server', and thus provide the end result without Toady being responsible for any code changes to DF *beyond CTD on memory requests*. The DFHack folks have adapted quite nicely with the memory.xml file that can be more easily updated. I do appreciate Toady's interest in not breaking third party tools by not providing an API. He does not claim responsibility for what he is not providing, so if stuff breaks, it falls on those who have to figure out how to hack their own API into the game. That's cool.

CTD's are not so cool. That's more FUN than what I can buy into at the moment. I'm game for whatever error handling will prevent CTD's on memory requests first, and second reasonable approaches to having things like DFHack/ StoneSense not have to share the same core and memory footprint as DF. Third for reasonable ways for Toady to visualize DF's memory consumption in a configurable way to trade speed for disk space. Fourth, getting DF beyond the 32-bit addressable memory limit of 2 GB within a single monolithic core thread.

The http based server API was just a shot-in-the-dark hope of getting towards that. Might be the wrong tool, or the wrong timing on the right tool for much later in the process. CTD because the game allows me to ask for more than it can possibly deliver on embark is... poor form.
Logged

knutor

  • Bay Watcher
  • ..to hear the lamentation of the elves!
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #4 on: December 04, 2011, 03:54:20 am »

I'm not sure.  I don't do embarks that big.  The game itself suggests not doing that if it's unstable.  I see no gain in large embarks, since hunting is goofy and that's just about all I do up top.  Most of my time is spent going vertical, not horizontal.  If hunting were reworked, and ambush kept the lil fella alive and out of sight during sieges, that might be a different story.

Wow, your post made me learn a thing or two.  Here I was thinking Ganged mode was dual-channel mode.  It's somewhat different.  Says on the wiki, dual-channel mode is old tech now.  HA!  Man, these things evolve so fast, now they are selling us Unganged-mode designs.  It's all very confusing.  I guess it's important to know what chipset is in there.

Sorry to suggest an odd number was bad; you're running higher tech than me, it would appear.  64-bit systems don't dual-channel, they triple-channel.  You got 3 sticks of matching DDR3 3G in there, I betcha.  Here's a good link with pics of what I'm talking about.  My setup has 4 sticks, dual channel.  Yours must be that more advanced one, below mine.

http://www.intel.com/support/motherboards/desktop/sb/cs-011965.htm

Sincerely,
Knutor

Logged

Romaq

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #5 on: December 04, 2011, 07:29:11 am »

No worries, I really didn't follow the tech stuff. As it turns out, I ended up looking at the highest price tag on the shelf *FIRST*, then looked to see the difference between that and the next few down the line. This offered the most memory, and I knew for what I do I'd spank on memory. I'm also running a higher-end graphics card. I do graphics editing, and http://www.filterforge.com/ can make my higher-end graphics card and everything else scream, cry and whine. Actually, it just throttles the hardware to max, where normally it runs whisper quiet. As far as using dual channel, Ganged or Holo-emitter, I wouldn't know. It reports 9G. My next upgrade (if I bother to upgrade this 'old' hardware) will be for Blu-ray, THEN for a 3D monitor. Wouldn't it be bitchin' if someone made an actual 3D display for DF still only using the character graphics display? Something like StoneSense, only top-down instead of isometric? Well, a guy can dream. :)

But yeah, I have a beefy machine compared to the folks who get the $250 to $500 Wal-mart machine. I paid roughly $1200 several years ago, and it would please me if Dwarf Fortress could take advantage of the iron I bring to the table.

The problem, in this case, is not that a large embark is 'unstable due to my hardware'; it is 'unstable due to allocation of memory beyond the 2G realm this was compiled in'. If the software is compiled incapable of allocating more than 2G of memory, it shouldn't offer embarks that big, or at the very least it should have exception handling to prevent the crash-to-desktop. CTDs because it tries to allocate memory beyond a limit compiled into the software are bad form. An error along the lines of "I cocked it up, let me save your game so you can recover" isn't best, but is better than CTD. "I could allocate more memory if you had it, and you don't" or "You have the memory, but I'm limited to 2G" is even better. Internally caching 'slow use' data to disk... such as legends that can be read as needed by engravers, or... HELL, I don't know, but Toady may have an idea... using the hard drive to fake data in memory that is of more marginal use for most things. Creating a means for DFHack/StoneSense to access their data without them being in the same memory footprint and processor core, as I initially suggested, that would be happy.

Look at 5 years from today... I have a 9GB machine. The reason I won't likely 'upgrade' this one much is I *might* be buying a 12 or 16GB computer for $1200 in a few years or so. For which, if nothing changes in memory handling, DF will *STILL be running in less than 2GB of it, still on one core of a virtual 16 or 32 core machine, and still doing CTD because of 'not enough memory'.

If there are zero plans to allow for DF to *access* the memory required for a 16 x 16 embark, it shouldn't allow it in the first damn place. So much the better if DF could detect it was running on a 1G machine even if it could do 'slightly less than 2', and respond accordingly WITHOUT a CTD. It's just my opinion, of course. Toady's current roadmap is really the opinion that counts. But if he could calculate a cost estimate on destroying memory related CTD issues, he *could* float the cost in a 'donate to CTD destruction' fund and see if people will chip into it. I don't have much money I could chip into it, but I would. Maybe others would feel the same way, that they'd like to max out what is *possible* with DF on their hardware.

The point, to me, is not so much expanding horizontally vs. along the Z axis for a fortress. The point to me is, I have a 9G 64-bit OS machine, and DF *allows* me to have a 16 x 16 embark, so why the CTD? All the rest is just a matter of preference, and perhaps that *is* too big an embark. DF invites me to do a big embark, then CTD's when I attempt to do so. WTF, man? Or I manage to get my 13 x 13 embark, I run StoneSense to admire it, expand the Z-axis buffer... and it blows up. WTF, man? Or I do a save because immigrants just came, and it hangs on the save because it touches the 2G limit. "FUN" on this level is clearly out of my league. :(

As my current hardware becomes that $200 machine being sold at Wal-Mart with 9 GB 64-bit OS, more people attempting to play DF for the first time are going to be saying, "WTF, man?" I really hope Toady takes the rapid changes in hardware expansion over time into consideration for his DF roadmap. I can see people with their 12 GB smart-phones in the next 5 to 10 years trying to play DF... "WTF, man? Out of memory!?"
Logged

Dwarf

  • Bay Watcher
  • The Light shall take us
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #6 on: December 04, 2011, 07:46:24 am »

Out of curiosity, care to list some specs? These 9 GB are bugging me.
Logged
Quote from: Akura
Now, if we could only mod Giant War Eagles to carry crossbows, we could do strafing runs on the elves who sold the eagles to us in the first place.

Romaq

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #7 on: December 04, 2011, 07:00:22 pm »

http://reviews.cnet.com/4504-4_7-0.html?id=33703094&tag=compare might be my machine, or at least in the same line. Same case anyway. Note 9 GB (installed) / 24 GB (max) - DDR3 SDRAM on the web page. But I'd rather not have system spec issues in this thread cloud the larger issue I intended to address:

In 10 years if you don't have 64 GB on your machine, it isn't a serious game machine, and people will be able to play DF on their smartphones. Will DF *still* not be able to tell if a machine has only 1 GB of memory and safely limit memory requests to fit within that 1 GB without CTD? If DF *STILL* has to fit within just shy of 2 GB, will it be able to do so without CTD? When people wish to play DF on their 64 GB game machine in 10 years, why not allow DF to take advantage of it?

As a quick exercise, make a 256 x 256 world using a 16 by 16 font. That creates a 65536 by 65536 map. Now enlarge that by the 48 x 48 of each embark tile, and you have a 3145728 by 3145728 image. Drop your 8 x 8 embark map on the appropriate spot on that image, and you can get an idea of the amount of rich detail the world *potentially* has on just one Z level. That's a LOT of data, and it blew my mind when I first went through the trouble of overlaying my embark map on top of a proper scaled world map image.
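Taking the post's numbers at face value, the multiplication checks out, and it puts a concrete scale on how much potential detail one z-level holds. A quick sanity check (the 1-byte-per-tile figure at the end is purely hypothetical):

```python
# The post's scaling, taken at face value: a 256-tile-wide world,
# expanded by 16 twice (the "16 by 16" step), then by 48 local tiles
# per embark tile.
world_side = 256
map_side = world_side * 16 * 16   # the post's 65536 x 65536 map
local_side = map_side * 48        # the post's 3145728 x 3145728 image

assert map_side == 65536
assert local_side == 3145728

# Hypothetically, at just 1 byte per local tile, one full z-level of
# that map would need 9 TiB -- far beyond any 2 GB budget.
bytes_per_level = local_side ** 2
print(bytes_per_level / 1024**4, "TiB")  # 9.0 TiB
```

Which is the point: DF only ever holds the embark's slice of that detail, and even the slice strains a 2 GB address space.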

And my above data may be wrong, but AGAIN, please let's not get lost on the details of overlaying an embark image on top of a world image. Let's just say Toady has done a fantastic job of putting very rich *POTENTIAL* detail within DF, and he managed to cram all that within a program that must fit within less than 2 GB.

It STILL CTDs if it touches that 2 GB limit. The CTD must die. Breaking DF out of the 2 GB limit is even better. Suppose DF supported the larger memory, and suppose it dropped your 8 x 8 embark in the middle of a 16 x 16 'embark' area it kept track of? Suppose you could see armies form outside your 8 x 8 embark area? Suppose you had neighbors to negotiate with just on the other side of your embark area? What if you dug outside your embark area *and got caught*, in which case you could start a war with your neighbors? Again, I don't want to get side-tracked with the merits of such proposals. Without fixing memory so there are no more CTDs on hitting the 2 GB limit, nothing like that can seriously even be on the radar.

I'm just saying, DF has a HUGE amount of very rich textured detail that it packs away inside that 2 GB of single-core running memory. Once memory handling is fixed, Toady could examine things like having a separate core thread continue running 'legend' history so events continue, including battles and such, and having that outside world occasionally spill into the working fortress. But none of those ideas fly if the CTD isn't fixed and memory can't break beyond the < 2 GB. :(
Logged

knutor

  • Bay Watcher
  • ..to hear the lamentation of the elves!
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #8 on: December 05, 2011, 10:36:32 pm »

I am not sure how you come up with the figure of DF needing to fit within 2GB of RAM.  Is that just a guesstimate?  I never heard of games having an upper cap on memory usage.  They gobble it up like a pig.  'Til they have it all, then they move on and on, to other memory.  Or if it's a dual channel, they go back across the stick, reaccessing.  When that RAM fills, doesn't it access HD memory?  Virtual memory.  Is this game not like the others?  If it is like other games, it could quite possibly take up 100s of GBs, and still keep going on some folks' machines, if enough virtual memory were available.  It'd be slow, but it wouldn't crash, unless some quirky Windows parameter crashed it, as a favor to us. LOL!

Like, my virtual memory is 40G; I have 2 Raptor drives (20G ea) in a RAID0 array, set aside just for virtual memory and temporary internet files.  It kicks in and compensates for when the RAM and the video card RAM fill up.  Games consume tons of memory; I'd hate to see a 2GB cap be the limiting factor for my entertainment.  I'd love to see a stopper to plug these memory leaks, which have gotta be present in DF, too.  DF for me gets slower as it progresses in its myriad of different possibilities.  I've set up some large embarks, and never crashed.  Unless I wheel-mouse in and out too fast.  But I think that's just because of the GUI.

At the start my blips are flying around, sometimes I can't even keep track of where the seven dwarfs are all going.

Yep.  DDR3 is in that box you linked, Romaq.  Hope you didn't get that PC tho, those bastards didn't even tell you the name brand of the vid card in that box.  Hell!  That's as bad as a console vendor.  For the money, we should get rock-hard specs.  Feds should enforce that.  But Feds are too busy chasing terrorists, I guess.  Ya don't even get the 'kind' of vid card memory, DDR3, DDR, etc., just some vague chipset, NVIDIA GeForce GTX 260, and a memory size of 896MB.  That's kinda weak for a description.  I hate when they do that.

So your tweaks there from that site, are they overclocks?  HEHE!  I love making video cards churn a helling!  That is so cool.  I'd love to hear that.  If you're into throttling up to game, put a custom fan in there, or a snorkel hole in the top of your case.  I used to go crazy over cooling, back when the arthritis wasn't so bad and I could twitch-game with the pros.  Now I'm older and slower.

I feel a leak too.  I am not sure if it's in the code or in the memory mechanics, and I have absolutely no way of knowing.  I can't read memory dump files.  This PC here doesn't slow down in other apps as time passes and they stay open in a window.  Only DF.  Which leads me to agree with you, that it's something up with DF, and not us.  *shrug*

Oh, and I know you don't wanna talk about specs.  But in my mobo, only the two sticks in slots A and B are ganged.  The other two are just draining electric.  They are full, but I don't think they do anything, because the memory controller is old-tech crap.  It's an interesting thing to look up if you got the manual for your mobo.  Why do I have them slotted, if they do nothing?  I figure I'm wasting all this electric powering empty USB ports, what's it matter about sticks.  That and I'm lazy.  I wish I knew more about it all, but I think sometimes they keep me in the dark for my own good.

Oh, I know you didn't wanna post stats, but how much free space ya got on your HDs?  I'm looking at 757G of free space on here.  I have a JBOD setup.  Just a Bunch Of Drives.  And the 2 40G Raptors.  There is absolutely no reason for it to become 'Not Responding' with that much space.

Sincerely,
Knutor
« Last Edit: December 05, 2011, 10:40:20 pm by knutor »
Logged

Romaq

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #9 on: December 06, 2011, 06:50:16 am »

As far as the 2 GB limit for DF, that is from http://www.bay12games.com/dwarves/mantisbt/view.php?id=136 but I initially noticed that the crash-to-desktop would happen around 1.9 GB as reported by TaskManager. I happened to have it up at the time. The only real way to get a good look at a potential site is to embark, then kill the DF process. According to the link above, that's a 32-bit 'feature' of the compiler.

Commercial games tend to recognize their memory limits. Even if they are 32-bit games, there are ways to 'cheat', but CTDs are a huge no-no and much effort is put into quashing those. Even if DF were released as a 64-bit compile, memory allocation would still have to be aware of its limits so it can stay within them and avoid CTDs.
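One cheap piece of that limit-awareness is checking the process's pointer width at startup and picking a budget from it. A sketch in Python (the budget figure is the default 32-bit Windows number discussed in this thread; a real game would refine it with an OS query, and the function name is made up):

```python
import sys

def default_memory_budget():
    """Pick a conservative allocation budget from pointer width alone.

    A 32-bit process on Windows gets 2 GiB of user address space by
    default (more with /LARGEADDRESSAWARE); a 64-bit process has no
    practical address-space cap, so physical RAM becomes the limit.
    """
    if sys.maxsize > 2**32:   # 64-bit build
        return None           # no hard address-space budget needed
    return 2 * 1024**3        # stay inside the 2 GiB wall
```

Every allocation path would then check requests against this budget and refuse gracefully rather than letting the OS kill the process.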

Having DFHack & StoneSense wedged into the same memory space outside of DF's control is *NOT* going to help matters. Toady really has zero control over StoneSense busting the memory limit through the SDL.DLL hack. :( The long-term roadmap needs a better plan.

I'll respond privately about the system spec stuff, at least as best as I can scare it up from the info I have. It's around somewhere, I just don't keep my spec sheet handy.
Logged

King Mir

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #10 on: December 06, 2011, 04:31:31 pm »

2GB is an OS limit on a single 32-bit application's address space. 64-bit apps have a far larger limit.

knutor

  • Bay Watcher
  • ..to hear the lamentation of the elves!
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #11 on: December 07, 2011, 07:30:56 am »

2GB is an OS limit on single 32bit applications. 64bit apps have a larger limit.

Never heard of that 2GB limit before.  I had to read up on it.  I know many BIOSes set a physical limit on RAM that a controller cannot exceed, but that's different with each chipset.  I did hear that Vista accessed RAM in a more efficient manner; it was one of its selling points.  But I can't remember what that was called.  Here is a nice link on limits.

http://msdn.microsoft.com/en-us/library/windows/desktop/aa366778(v=vs.85).aspx

DF accesses no virtual memory?  Bummer.  *sigh*  I guess there is a good reason for it, although I can't understand why.  Most RTS games access virtual memory, as a result of their continuously expanding design needs and demand for it.

Knutor
Logged
"I don't often drink Mead, but when I do... I prefer Dee Eef's.  -The most interesting Dwarf in the World.  Stay thirsty, my friend.
Shark Dentistry, looking in the Raws.

thisisjimmy

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #12 on: December 07, 2011, 11:43:51 pm »

Programs don't choose whether to access virtual memory, and it has nothing to do with the 2GB limit.  Memory management is handled by the OS.  The OS will try to swap pages that haven't been used recently out of physical memory and into the page file to make room for more recently used data in physical memory.  The program is not aware of this process.

The 2GB limit has to do with the address space addressable by 32-bit pointers and some OS limitations.
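Putting numbers on that: the limit falls out of pointer width plus how Windows splits the address space. The split and the /LARGEADDRESSAWARE figures below are the standard documented values for 32-bit Windows processes:

```python
# Total bytes a 32-bit pointer can address:
full_32bit = 2 ** 32
assert full_32bit == 4 * 1024**3          # 4 GiB of address space

# Windows reserves half for the kernel by default, leaving the
# process the ~2 GB wall that DF keeps hitting:
default_user_space = full_32bit // 2
assert default_user_space == 2 * 1024**3  # 2 GiB

# A 32-bit binary linked with /LARGEADDRESSAWARE can get 3 GiB on a
# 32-bit OS booted with /3GB, or the full 4 GiB of user address space
# when run on 64-bit Windows.
laa_on_32bit_os = 3 * 1024**3
laa_on_64bit_os = 4 * 1024**3
```

That /LARGEADDRESSAWARE flag is the 'compiler switch' mentioned earlier in the thread for pushing a 32-bit build to 3-4 GB.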
Logged

Romaq

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #13 on: December 08, 2011, 06:50:34 am »

Yeah, what thisisjimmy said. :)

What a program *can* do is write files in a format which allows it to do more things. That would be GIMP using cached info to edit image files much larger than what can fit in your memory, or a word processor creating temp files to store parts of a huge document that you are not looking at at the moment, but all of which needs to be kept track of for document editing. But that would be deliberate action taken by the software, and does not count as 'virtual memory'. It *does* count as smart programming, depending upon how well this is done.

Roughly speaking, disk is 1000 times slower than memory. VERY roughly speaking. Your mileage may vary. For the purpose of this 'suggestion' thread, CTD due to memory handling has to go away. Please. :)
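That application-managed caching (GIMP's tile cache, a word processor's temp files) is easy to sketch with Python's stdlib shelve module; the class name, the 'legends' key, and the record contents are all invented for the example:

```python
import os
import shelve
import tempfile

class ColdStore:
    """Park rarely-used records on disk, loading them only on demand.

    The application-managed analogue of GIMP's tile cache or a word
    processor's temp files: a deliberate memory-for-disk trade,
    separate from the OS's own paging.
    """
    def __init__(self, path):
        self._db = shelve.open(path)

    def put(self, key, value):
        self._db[key] = value  # persisted to disk, not held in RAM
        self._db.sync()

    def get(self, key):
        return self._db[key]   # read back from disk only when asked

    def close(self):
        self._db.close()

# Demo: stash a (made-up) legends entry, then fetch it on demand.
store = ColdStore(os.path.join(tempfile.mkdtemp(), "legends"))
store.put("legend:first_siege", {"year": 105, "casualties": 3})
fetched = store.get("legend:first_siege")
```

The roughly-1000x disk penalty only hits when a cold record is actually touched, which is exactly the trade the thread is asking for.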
Logged

astaldaran

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #14 on: December 08, 2011, 08:34:47 pm »

There is some cool techy term for this... but I remember when Master and Commander came out, it crashed on me because I was running 32-bit and it just wanted too many memory locations.

I agree, though, that especially if the development of this game is going to cover the next decade, some long-term plan needs to be adopted. Though for all we know, maybe Toady has one.
Logged