Bay 12 Games Forum


Author Topic: System Memory issues to consider on a 'refactoring' of DF memory handling  (Read 3577 times)

King Mir

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #15 on: December 09, 2011, 09:14:34 pm »

Quote from: Knutor
Quote
2GB is an OS limit on single 32bit applications. 64bit apps have a larger limit.
Never heard of that 2GB limit before. I had to read up on it. I know many BIOSes set a physical cap on RAM that the memory controller cannot exceed, but that differs with each chipset. I did hear that Vista accessed RAM in a more efficient manner; it was one of its selling points, but I can't remember what it was called. Here is a nice link to the limits.

http://msdn.microsoft.com/en-us/library/windows/desktop/aa366778(v=vs.85).aspx

DF accesses no virtual memory? Bummer. *sigh* I guess there is a good reason for it, although I can't understand why. Most RTS games access virtual memory, as their continuously expanding designs demand it.
The link is correct, but you're misunderstanding it. DF does use virtual memory as needed (in fact, it doesn't have much choice not to). But the OS limits the (virtual) address space for a single 32-bit app to 2GB. Strictly speaking, a 32-bit address on x86 can cover a 4GB address space, which is why a 32-bit OS can still benefit from 4GB of RAM.

thisisjimmy is right, but the clincher is the OS limit, not the pointer limit, since the OS limit is smaller. The reason for the OS limit, though, has to do with those address-space limits.
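To make the two limits concrete, here's a minimal sketch (Windows-only, and nothing to do with DF's own code) that prints the per-process virtual address space next to the machine's physical RAM:

Code:
// Toy illustration: the address space a single process gets is a
// different number from the RAM in the box.
#include <windows.h>
#include <cstdio>

int main() {
    MEMORYSTATUSEX status;
    status.dwLength = sizeof(status);
    if (GlobalMemoryStatusEx(&status)) {
        // ullTotalVirtual: user-mode address space for THIS process --
        // about 2GB for an ordinary 32-bit app, whatever the RAM.
        printf("Virtual address space: %llu MB\n",
               (unsigned long long)status.ullTotalVirtual / (1024 * 1024));
        // ullTotalPhys: physical RAM installed in the machine.
        printf("Physical RAM:          %llu MB\n",
               (unsigned long long)status.ullTotalPhys / (1024 * 1024));
    }
    return 0;
}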
« Last Edit: December 09, 2011, 09:26:58 pm by King Mir »
Logged

King Mir

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #16 on: December 09, 2011, 09:34:20 pm »

Quote
There is some cool techy term for this... but I remember when Master and Commander came out that it crashed on me because I was running 32-bit and it just wanted too many memory locations.

I agree, though, that especially if the development of this game is going to cover the next decade, some long-term plan needs to be adopted. Though for all we know, maybe Toady has one.
Yes, you're making me cringe at "too many memory locations." "Want too much memory" would be correct. So would "run out of memory."

There is really only one reasonable solution to the problem: release a 64 bit version.
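To see that ceiling directly, here's a toy probe (not anything DF does, and on Linux memory overcommit can distort the result) that simply allocates until it can't:

Code:
// Allocate 64MB chunks until allocation fails. Built 32-bit, this
// stops near the ~2GB address-space cap; built 64-bit, it keeps going
// until RAM + swap run out (so run it with care).
#include <cstdio>
#include <cstdlib>
#include <vector>

int main() {
    std::vector<void*> blocks;
    const size_t chunk = 64u * 1024 * 1024;   // 64 MB per allocation
    size_t total_mb = 0;
    while (void* p = malloc(chunk)) {
        blocks.push_back(p);
        total_mb += 64;
    }
    printf("Allocated %zu MB before malloc failed\n", total_mb);
    for (void* p : blocks) free(p);
    return 0;
}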

Romaq

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #17 on: December 10, 2011, 02:25:42 am »

Quote
There is really only one reasonable solution to the problem: release a 64 bit version.

Not so good a solution without prep work, in my opinion.
  • If you have a 64-bit OS with only 4 GB including virtual memory, and DF wants 4.1 GB (well before that, really, because of the OS's own needs, but by way of example...), then if you have Crash-To-Desktop (CTD) now, you'll have it then too. Memory handling that avoids the CTD is required, or a 64-bit release would be agony *anyway*.
  • Toady either has to abandon the 32-bit 'flavor' or support *two* 'flavors' on each platform (Linux, Windows). It would be best to have him focus on making DF better, since simply fixing the CTD would let you nearly double usable memory anyway and stay 32-bit.
  • Multi-threading would be more useful than a 64-bit compile, in my opinion. Right now, DF is locked to a single core. If DF could properly handle the nightmare of forking multiple threads, my OS could use up to 8 CPU cores. You could have display and interface on one core, so even if your simulation rate drops to 5 or even 1 frame per second, you could still have a 50 fps display and a game that responds (see the sketch after this list). How fast you choose to display would not need to impact how fast the 'do stuff' frame process runs, as it would be on a different core.
  • I've been thinking of how DF could be bound to a database such as MySQL instead of keeping everything in internal memory. Suppose you no longer had to choose how much embark space to take: the size of the world would be however much space you permitted that world to occupy on your hard drive, and how large your fort was within that world would be a function of how much land you can grab AND HOLD. No more 'embark size' issues. You may find a smaller fort easier to manage while keeping the frame rate where you can tolerate it, not to mention your ability to deal with HFS, sieges, and so on. I've not quite fleshed out the notion, but breaking DF out of memory into a disk-based database might take DF up to the next level the same way breaking DF out of 2D took it up a notch. I'm still fleshing out ideas for a proposal to suggest on that one.
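Here's the shape of that display/simulation split as a minimal sketch (hypothetical, not DF's actual design):

Code:
// The simulation ticks as fast as it can on one thread while the
// display thread redraws ~50 times a second on another core.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

std::atomic<bool> running{true};
std::atomic<long> sim_tick{0};

void simulation() {                       // the slow 'do stuff' loop
    while (running)
        ++sim_tick;                       // stand-in for one world step
}

void display() {                          // stays responsive regardless
    while (running) {
        printf("drawing world at tick %ld\n", sim_tick.load());
        std::this_thread::sleep_for(std::chrono::milliseconds(20));
    }
}

int main() {
    std::thread sim(simulation), gfx(display);
    std::this_thread::sleep_for(std::chrono::seconds(2));
    running = false;
    sim.join();
    gfx.join();
    return 0;
}

The hard part, of course, is that in a real game the display thread has to read world state the simulation thread is busy mutating, which is exactly the synchronization nightmare referred to above.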

I don't think a 64-bit compile is a 'quick/easy' thing to do. I would not mind having a 64-bit compile; I can run it. But a 64-bit compile that still CTDs, or that takes away from the Toady One's focus and passion for DF, would be undesirable, in my opinion.
Logged

King Mir

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #18 on: December 10, 2011, 05:15:35 pm »

Quote
Not so good a solution without prep work, in my opinion.
  • If you have a 64-bit OS with only 4 GB including virtual memory, and DF wants 4.1 GB (well before that, really, because of the OS's own needs, but by way of example...), then if you have Crash-To-Desktop (CTD) now, you'll have it then too. Memory handling that avoids the CTD is required, or a 64-bit release would be agony *anyway*.
I'm not aware of any 64-bit system that so strangely limits virtual memory to 4GB. Switching to 64-bit will mean the CTD occurs much later. 8TB later.

Quote
  • Toady either has to abandon the 32-bit 'flavor' or support *two* 'flavors' on each platform (Linux, Windows). It would be best to have him focus on making DF better, since simply fixing the CTD would let you nearly double usable memory anyway and stay 32-bit.
I agree that supporting multiple versions does open up complications. But it's likely that the program would not need to be very different when compiled in 64-bit instead of 32-bit mode. It may even be as trivial as recompiling it under a different setting.
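For what it's worth, with gcc that setting really is just -m32 versus -m64, and the most visible change is pointer width. A trivial check:

Code:
// Prints 4/4 when built 32-bit (g++ -m32), 8/8 when built 64-bit
// (g++ -m64) -- the address-space ceiling follows the pointer size.
#include <cstdio>

int main() {
    printf("sizeof(void*)  = %zu\n", sizeof(void*));
    printf("sizeof(size_t) = %zu\n", sizeof(size_t));
    return 0;
}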

Quote
  • Multi-threading would be more useful than a 64-bit compile, in my opinion. Right now, DF is locked to a single core. If DF could properly handle the nightmare of forking multiple threads, my OS could use up to 8 CPU cores. You could have display and interface on one core, so even if your simulation rate drops to 5 or even 1 frame per second, you could still have a 50 fps display and a game that responds. How fast you choose to display would not need to impact how fast the 'do stuff' frame process runs, as it would be on a different core.
I agree, but proper multi-threading is much harder to do than a 64-bit compile. It requires changes in the logical flow of the program and in how memory access is managed. I expect a 64-bit compile sooner than multi-threading support, because a 64-bit compile is much easier to do.

Quote
  • I've been thinking of how DF could be bound to a database such as MySQL instead of keeping everything in internal memory. Suppose you no longer had to choose how much embark space to take: the size of the world would be however much space you permitted that world to occupy on your hard drive, and how large your fort was within that world would be a function of how much land you can grab AND HOLD. No more 'embark size' issues. You may find a smaller fort easier to manage while keeping the frame rate where you can tolerate it, not to mention your ability to deal with HFS, sieges, and so on. I've not quite fleshed out the notion, but breaking DF out of memory into a disk-based database might take DF up to the next level the same way breaking DF out of 2D took it up a notch. I'm still fleshing out ideas for a proposal to suggest on that one.
You post this after you said that Toady should be focusing on multithreading? It's a sound idea, but it's harder than a 64-bit compile. Perhaps not as hard as multi-threading, though. I'm also not sure that the fps bottleneck is CPU bound. If it's bound by memory access time, then switching to a database would make the problem worse.

Quote
I don't think a 64-bit compile is a 'quick/easy' thing to do. I would not mind having a 64-bit compile; I can run it. But a 64-bit compile that still CTDs, or that takes away from the Toady One's focus and passion for DF, would be undesirable, in my opinion.
I don't know about quick and easy, but it's quicker and easier than any alternatives. CTD is not a problem if it only occurs at 8TB.

Romaq

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #19 on: December 10, 2011, 08:37:46 pm »

All this is very definitely out of my area of expertise.

I just know that CTDs are a category of HFS that is way more fun than I care to allow myself. :) The only warning when trying to do a 16 x 16 embark is about 'lag'. Options that are physically impossible due to the lack of available memory should be blocked with a warning: "Only X amount of space is available; alter Y and/or Z for more of X." CTDs, especially those due to memory, should be exception-handled, hopefully with a crash save in /crash for a hope-in-hell recovery.

Everything else is in 'if wishes were fishes' territory.

A 64-bit compile, even if it were considered, only pushes the problem down the road rather than resolving it. Pushing the CTD issue down the road another 5 years works for me, so long as there is the understanding that it is not a good long-term solution.

If the problem can be isolated, a dollar figure put on it, and a pot set up for us to throw money into, I will commit to throwing $50 into the pot in two payments. I know it's not enough money, but it's what I can do. Any problem you can solve with money you have at your disposal is no problem. Hopefully memory CTDs are annoying enough that people will throw sufficient money at them for Toady to make them go away, and stay gone. 4 GB, 8 GB, 16 TB, 256-bit OS platforms with 1024 cores... whatever. CTDs, be gone! :D
Logged

thisisjimmy

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #20 on: December 11, 2011, 02:26:16 am »

I'm kinda surprised you aren't suffering from very low frame rates on such large maps.  I typically have issues with FPS death even on small maps.

As a quick fix, I would consider it acceptable to simply disallow embark zones larger than 12x12, or whatever size crashes.

For a more long-term solution, loading the map in chunks from the disk, as Romaq suggested, would be really nice. It could potentially allow us to load arbitrarily large maps without using extra RAM. On the other hand, it's not that easy to implement well, and Toady would still have to improve the FPS on large maps.
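The usual shape of that idea is a fixed-size cache of map chunks with least-recently-used eviction. A hypothetical sketch (every name in it is made up; this is not how DF stores its map):

Code:
// Keep only recently used map chunks in RAM; load the rest from disk
// on demand, evicting the least recently used chunk when full.
#include <list>
#include <unordered_map>
#include <utility>

struct Chunk { /* tiles, creatures, items... */ };

class ChunkCache {
    size_t capacity;
    std::list<int> lru;   // chunk ids, most recently used at the front
    std::unordered_map<int,
        std::pair<Chunk, std::list<int>::iterator>> cache;

    Chunk load_from_disk(int id) { return Chunk{}; }  // stub
public:
    explicit ChunkCache(size_t cap) : capacity(cap) {}

    Chunk& get(int id) {
        auto it = cache.find(id);
        if (it != cache.end()) {          // hit: bump to most-recent
            lru.splice(lru.begin(), lru, it->second.second);
            return it->second.first;
        }
        if (cache.size() >= capacity) {   // full: evict least-recent
            cache.erase(lru.back());      // (a real game writes it back first)
            lru.pop_back();
        }
        lru.push_front(id);
        auto ins = cache.emplace(id,
            std::make_pair(load_from_disk(id), lru.begin()));
        return ins.first->second.first;
    }
};

int main() {
    ChunkCache cache(256);                // keep at most 256 chunks in RAM
    Chunk& c = cache.get(42);             // loads from disk on first touch
    (void)c;
    return 0;
}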

A 64-bit compile would be easier, and would benefit those of us with good machines. It should stop the crashes for even the largest embark areas. However, for those of us who don't have much RAM, large maps will probably be so slow that they'd be considered unplayable. Of course, that's still better than crashing.

Both solutions have their advantages.  I'm curious though, how many people actually want to use embark zones larger than 12x12?
Logged

Romaq

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #21 on: December 11, 2011, 03:43:43 am »

It sounds as if the only concern with a 16 x 16 embark, were it possible, is FPS death. Ever faster hardware will keep pushing FPS death back, as will playing so as to keep the mobile count down. FPS death 'of a sort' is also a problem for Civilization, including Civ V: you have to wait for the other Civs to 'take their turn'. So at some point you decide to cut back on the number of Civs and the size of your world to make playing the game tolerable.

But if you have a $2000 'game machine' or a cheap used Sun Blackbox emulating Windows, it ain't braggin' if you can swing it. :)

Civ V does not bother attempting to stop you from playing 'max size world + max number of civs'. It certainly doesn't CTD when you attempt it. It simply has you wait while the many other Civs plot your death behind your back. Some sort of 'test mode' in DF wouldn't hurt.

"Run System Test" could be a new menu option. The system test returns a recommended 'max setting' and offers to write that max setting into your system's config file. The system test simply has a stock fortress and it spends a minute running that fortress from a known start point. A group of dwarves are committed to battle, foes are ready to destroy them. Other dwarves have tasks assigned to them, and the test runs for, say, 500 frames and then ends. The system can then figure out a 'score' for your system, and based on that score suggest you can run a 16 x 16 embark (after ensuring you have the memory to support it, and DF won't just CTD on you) or it may suggest you are pushing it if you do more than a 4 x 4 embark.

That would give the player a tool to evaluate how best to plan their DF gameplay experience without the CTD being a part of it.

Who would want to have a 16 x 16 embark? Well... if I have the hardware and the memory to support it, why not? Well, FPS death. Yeah? Well... what tools do I have to evaluate how big I can make it without FPS death in an established fortress? Crash-To-Desktop. Oh yeah, if you get a CTD, your fortress is 'likely' too big. K'thanks.

CTD is NOT a reasonable means to suggest you are biting off more than you can chew on an embark. A 'standardized test run process returning useful info' would be far more useful than CTD. Hrm... 'standardized test run package' would be another suggestion I ought to post...
Logged

King Mir

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #22 on: December 11, 2011, 08:03:51 am »

Quote
All this is very definitely out of my area of expertise.

I just know that CTDs are a category of HFS that is way more fun than I care to allow myself. :) The only warning when trying to do a 16 x 16 embark is about 'lag'. Options that are physically impossible due to the lack of available memory should be blocked with a warning: "Only X amount of space is available; alter Y and/or Z for more of X." CTDs, especially those due to memory, should be exception-handled, hopefully with a crash save in /crash for a hope-in-hell recovery.

Everything else is in 'if wishes were fishes' territory.

A 64-bit compile, even if it were considered, only pushes the problem down the road rather than resolving it. Pushing the CTD issue down the road another 5 years works for me, so long as there is the understanding that it is not a good long-term solution.

If the problem can be isolated, a dollar figure put on it, and a pot set up for us to throw money into, I will commit to throwing $50 into the pot in two payments. I know it's not enough money, but it's what I can do. Any problem you can solve with money you have at your disposal is no problem. Hopefully memory CTDs are annoying enough that people will throw sufficient money at them for Toady to make them go away, and stay gone. 4 GB, 8 GB, 16 TB, 256-bit OS platforms with 1024 cores... whatever. CTDs, be gone! :D
Well, it would be relatively easy to change the CTD into a crash-to-title-screen. Anything more would be non-trivial. Trying to save the problem map may not be possible, or useful, since the map would use the same amount of memory when loaded back up.
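The standard C++ idiom for that is to catch the out-of-memory exception at the top of the loop; a minimal sketch of the idea (DF's real main loop is of course not public):

Code:
// Catch the out-of-memory exception at the top of the game loop and
// unwind to the title screen instead of letting the process die.
#include <cstdio>
#include <new>
#include <vector>

void run_fortress() {
    std::vector<char> world;
    for (;;)                              // stand-in: grows until OOM
        world.resize(world.size() + (1 << 20));
}

int main() {
    try {
        run_fortress();
    } catch (const std::bad_alloc&) {
        // The fortress's memory was released during stack unwinding;
        // we're back on solid ground and can show the title screen.
        printf("Out of memory -- returning to title screen.\n");
    }
    return 0;
}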

It would be possible for Toady to guess when an embark is large enough to run into the memory issue and include a warning message, but it would be no more than what the community already knows. It would be helpful to newcomers, but since it's only as good a guess as you can already make yourself, it won't help you.

As for pushing the problem down the road: the fact is, 8TB is about 4000 times more than 2GB, and in theory 64-bit addresses could reach exabytes of memory. If DF ever manages to use that much memory, the other problems will come much, much sooner than running out of address space, even with the ever faster and larger memories promised by Moore's law. Out-of-memory CTDs will not be a problem on a 64-bit version of DF.

thisisjimmy

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #23 on: December 11, 2011, 12:57:52 pm »

Quote
Ever faster hardware will keep pushing FPS death back

I wouldn't count on this.  Processor speeds haven't increased much in the past 7 years.  Most of the performance improvements have come from adding more cores.  However, DF is single threaded, so it can only use one core.
Logged

Romaq

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #24 on: December 11, 2011, 11:44:33 pm »

You are correct. I dunno. I think I'll just keep enjoying the game as best I can and let other people sort it out. I'll just go on record as thinking that CTDs caused by memory ought to go away, one way or another.
Logged

Telgin

  • Bay Watcher
  • Professional Programmer
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #25 on: December 12, 2011, 01:32:51 am »

Quote
Quote
Ever faster hardware will keep pushing FPS death back

I wouldn't count on this.  Processor speeds haven't increased much in the past 7 years.  Most of the performance improvements have come from adding more cores.  However, DF is single threaded, so it can only use one core.

The number of cores has increased, yes, but there have also been architectural improvements that make processors much faster even at the same clock rate. Compare a 4GHz Pentium 4 to a Core 2 at 2GHz.

This is a doomed thing too, of course, but processors are still getting faster per core.  Slowly.
Logged
Through pain, I find wisdom.

knutor

  • Bay Watcher
  • ..to hear the lamentation of the elves!
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #26 on: December 12, 2011, 06:17:57 pm »

Quote
processors are still getting faster per core.  Slowly.

There is speed, and then there is capacity. I would think DF requires more capacity than speed, plus a relatively acceptable fetch speed. The calculations in the GPU, are they all that breathtaking? Compared to an FPS that is gonna drive the system to drink with its pixel mapping?

I wish I had known about this 2G memory cap before I set up my 20G of virtual memory. I've got 18G of wasted space in this 32-bit Vista system. HA. Well, it's not completely wasted; I'm sending Internet temp files over there too. But geez! So many parameters to remember, and so little time and money... HA! Thank you for the pro tips, King and Jimmy.

Somewhere in some window I saw that 2G limit, but it wasn't greyed out like limits usually are. Are you sure it's a 32-bit hard cap, and not just a default that can be tweaked up? Gonna see if I can remember where I found it; it was somewhere in the Vista Control Panel.

« Last Edit: December 12, 2011, 06:23:19 pm by knutor »
Logged
"I don't often drink Mead, but when I do... I prefer Dee Eef's.  -The most interesting Dwarf in the World.  Stay thirsty, my friend.
Shark Dentistry, looking in the Raws.

Telgin

  • Bay Watcher
  • Professional Programmer
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #27 on: December 12, 2011, 07:05:49 pm »

Speeding up processor cores in general will benefit DF, but I'm betting it's memory bound. Almost everything is memory bound. Many modern processor optimizations are ways of finding useful work to do while waiting for the memory system to return the data you asked for. As for how DF could be reworked to improve this... heaven alone knows. Cache behavior is going to be a problem, and there's probably a point where performance goes kablooie: when one of the game's many internal memory tables gets too large to fit nicely into the processor cache.
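A toy demonstration of what "memory bound" means in practice: the two passes below do identical work, but the shuffled pass defeats the cache and typically runs several times slower.

Code:
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

int main() {
    const size_t n = 1u << 24;            // 16M ints (~64MB), way past cache
    std::vector<int> data(n, 1);
    std::vector<size_t> order(n);
    std::iota(order.begin(), order.end(), (size_t)0);

    auto timed_sum = [&](const char* label) {
        auto t0 = std::chrono::steady_clock::now();
        long long sum = 0;
        for (size_t i : order) sum += data[i];
        double dt = std::chrono::duration<double>(
            std::chrono::steady_clock::now() - t0).count();
        printf("%s: sum=%lld in %.3f s\n", label, sum, dt);
    };

    timed_sum("sequential");              // walks memory in order
    std::shuffle(order.begin(), order.end(), std::mt19937{42});
    timed_sum("random    ");              // same adds, cache-hostile order
    return 0;
}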

And you're not wasting that 18GB of memory. The limit is 2GB per process, so you could run up to 10 programs, each using 2GB of memory, without running into trouble.

The reasons for this are somewhat arcane, but basically, in a 32-bit address space you can only count up to about 4 billion addresses, or 4GB. When the OS maps memory for a process, it has to map part of that address space for itself (for various uses), and the rest is allocated to the process. The actual physical addresses can be anything, but the process can only use up to 2GB of the virtual address space. So the OS ends up mapping the same 2GB for itself into each process, and hands out slabs of the physical space, translated into virtual space, to the individual processes.

This is how Windows used to work, anyway. There's no specific reason for it to be 2GB, and I'm pretty sure Linux at least can change this to a 3/1GB split or other variations. Modern versions of Windows may allow this to be adjusted as well, but there might be issues if you don't give the OS enough address space per process.

I actually have no idea how PAE (Physical Address Extension) gets around this on 32-bit systems to allow addressing past 4GB, but I'm betting it's some sort of address translation in the processor's MMU. It probably has no effect on the 2GB address-space limit. Getting kind of off topic now, though...

Anyway, rewriting DF for 64-bit is going to become necessary eventually. I only wish it were as simple as setting the compiler target to 64-bit and hitting the build button. I doubt Toady has many 32-bit-specific hacks or inline ASM or anything of the sort, but I'm going to bet he'll run into issues somewhere. Storing pointers in 32-bit integers, for example.
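That particular habit looks like this (a contrived illustration, not DF code):

Code:
// The classic 32-bit habit that breaks a 64-bit build: stashing a
// pointer in a 32-bit integer silently drops the high address bits.
#include <cstdint>
#include <cstdio>

int main() {
    int x = 7;
    int* p = &x;

    uint32_t bad = (uint32_t)(uintptr_t)p;   // fine on 32-bit, lossy on 64-bit
    uintptr_t good = (uintptr_t)p;           // sized to hold a pointer on both

    printf("round-trips intact: %s\n",
           (int*)good == p ? "yes" : "no");  // 'bad' may not survive the trip
    (void)bad;
    return 0;
}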

I wish it were practical to offload the internal storage into an external database system or server of some sort, but if you think performance is bad now...
Logged
Through pain, I find wisdom.

thisisjimmy

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #28 on: December 12, 2011, 09:16:08 pm »

There's a linker option, "large address aware", that allows a 32-bit program to access 3GB on 32-bit Windows (when booted with the /3GB switch) and 4GB on 64-bit Windows. That's potentially another quick fix Toady could add.
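For reference, on MSVC that's the /LARGEADDRESSAWARE linker switch; it can even be stamped onto an already-built executable with editbin (the exe name below is just an example):

Code:
editbin /LARGEADDRESSAWARE "Dwarf Fortress.exe"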
Logged

King Mir

  • Bay Watcher
    • View Profile
Re: System Memory issues to consider on a 'refactoring' of DF memory handling
« Reply #29 on: December 12, 2011, 10:24:29 pm »

Quote
Speeding up processor cores in general will benefit DF, but I'm betting it's memory bound. Almost everything is memory bound. Many modern processor optimizations are ways of finding useful work to do while waiting for the memory system to return the data you asked for. As for how DF could be reworked to improve this... heaven alone knows. Cache behavior is going to be a problem, and there's probably a point where performance goes kablooie: when one of the game's many internal memory tables gets too large to fit nicely into the processor cache.
Yeah, there are several levels of cache, and an application can potentially be bound by any one of them, or by the speed of the memory itself (and the motherboard chipset, which needs to support high-speed memory). I suspect DF is bound by the latter. It depends on the locality of its memory accesses: whether DF tends to jump all over its memory or to access blocks at a time.

But I do think there would be some benefit from multithreading, despite the heavy dependence on memory access time. Enough to justify considering multithreading as a performance priority.