Bay 12 Games Forum

Author Topic: Dwarf Fortress GPU acceleration discussion  (Read 3550 times)

Megaman3321

  • Bay Watcher
  • I'm everywhere but this Universe
Dwarf Fortress GPU acceleration discussion
« on: June 17, 2016, 04:16:14 pm »

So, lately I've been thinking about something I find very interesting. In the wild world of high-volume number crunching, GPU acceleration is used to vastly improve performance on repetitive, computationally simple, but cycle-intensive tasks such as weather modeling, aerofoil simulation, and tectonic modeling. Dwarf Fortress, from what I've read and know of it, is one of these cycle-intensive, computationally simple tasks (like say, rendering a 3D scene in a game). So, how much do you nerds think having GPU acceleration support in DF would benefit the game's runtime? I think it'd be a fairly dramatic shift: given the levels of improvement you see from involving even a basic GPU in video rendering, it's reasonable to infer that GPU acceleration in Dwarf Fortress would yield similar results with some optimization. (What I'm talking about, if you're unfamiliar: https://www.youtube.com/watch?v=g7cQK8jFPzo )

Thoughts?
Logged
Glacial on dwarves being assigned socks:
Quote
You see, here's how I think this works:
Overseer: Welcome to the military! You need to wear socks! Dorf: Oh, I should get military socks. My socks are civilian socks. Dorf discards socks Dorf: You know, I need a whole lot of gear now. I should get socks... last. Oh, but these steel boots with the white goo on them are nice!
I know you can pick up water, then throw said water, while underwater, to kill a fish -He_Silent_H

Sizik

  • Bay Watcher
Re: Dwarf Fortress GPU acceleration discussion
« Reply #1 on: June 17, 2016, 05:44:33 pm »

GPU acceleration is just massive parallelization. Dwarf Fortress would have to be multithreaded first before seriously considering it.
Logged
Skyscrapes, the Tower-Fortress, finally complete!
Skyscrapes 2, repelling the zombie horde!

Shonai_Dweller

  • Bay Watcher
Re: Dwarf Fortress GPU acceleration discussion
« Reply #2 on: June 17, 2016, 05:51:50 pm »

Quote
GPU acceleration is just massive parallelization. Dwarf Fortress would have to be multithreaded first before seriously considering it.
And since Tarn talked seriously about multithreading just the other day, can we please at least discuss the idea without writing it off with a simplistic "without X, why bother talking about Y"?

I'd be upset momentarily if DF actually required a gaming graphics card. Then I'd shrug and get myself a new computer to play DF.
Logged

Bumber

  • Bay Watcher
  • REMOVE KOBOLD
Re: Dwarf Fortress GPU acceleration discussion
« Reply #3 on: June 17, 2016, 06:41:34 pm »

I'm pretty sure this was brought up before in another thread. Someone said that GPU acceleration is good for stuff whose state can live on the GPU, with only changes in state sent back and forth. Shuttling bulk data between the CPU and GPU is slow enough to eat the gains.

I'm not sure which parts of DF fit that description, but it would all be just as well served by CPU multithreading.
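Roughly the shape of the problem, in a throwaway CUDA sketch (made-up sizes, nothing from DF's actual data): for cheap per-element work, the two copies across the PCIe bus usually cost more than the kernel they surround, which is why you want state resident on the GPU with only deltas going back and forth.

```cuda
#include <cstdlib>
#include <cuda_runtime.h>

__global__ void scale(float* data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= 0.5f;               // trivial per-element work
}

int main() {
    const int n = 1 << 24;                    // ~16M floats, ~64 MB
    const size_t bytes = n * sizeof(float);
    float* host = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) host[i] = 1.0f;

    float* dev;
    cudaMalloc(&dev, bytes);

    // These two transfers typically dominate the trivial kernel between them.
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);   // CPU -> GPU
    scale<<<(n + 255) / 256, 256>>>(dev, n);
    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);   // GPU -> CPU

    cudaFree(dev);
    free(host);
    return 0;
}
```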
Logged
Reading his name would trigger it. Thinking of him would trigger it. No other circumstances would trigger it- it was strictly related to the concept of Bill Clinton entering the conscious mind.

THE xTROLL FUR SOCKx RUSE WAS A........... DISTACTION        the carp HAVE the wagon

A wizard has turned you into a wagon. This was inevitable (Y/y)?

Telgin

  • Bay Watcher
  • Professional Programmer
Re: Dwarf Fortress GPU acceleration discussion
« Reply #4 on: June 17, 2016, 07:04:40 pm »

Quote from: Megaman3321
Dwarf Fortress, from what I've read and know of it, is one of these cycle-intensive, computationally simple tasks (like say, rendering a 3D scene in a game).

Unfortunately, it's not computationally simple from a GPU's point of view.

The deal breaker is that GPUs ideally work under the following conditions:

1. You can break your problem up into thousands of instances of small, identical problems.
2. Each instance should access memory in a regular pattern related to its neighbors (i.e. thread 1 reads memory address 1, thread 2 reads address 2, etc.).
3. As many threads should follow the same control path as possible (that is, every time you have an if statement in your code, you're introducing a potential performance problem).
4. You should have as little synchronization as possible, and ideally none.

Some of those restrictions have probably been relaxed since I last did serious GPU work a few years back, but likely not by much.
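To make conditions 2 and 3 concrete, here's a toy sketch (all names hypothetical, nothing resembling DF's code). The first kernel is what GPUs love; the second breaks both conditions at once:

```cuda
__device__ float expensive(float v) { return sqrtf(v) * v; }   // stand-in work
__device__ float cheap(float v)     { return v + 1.0f; }       // stand-in work

// GPU-friendly: one thread per element, neighboring threads read
// neighboring addresses (coalesced), and every thread takes the same path.
__global__ void good_kernel(const float* in, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = in[i] * 2.0f;
}

// GPU-hostile: scattered reads break condition 2, and the data-dependent
// branch breaks condition 3.
__global__ void bad_kernel(const float* in, const int* idx, float* out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float v = in[idx[i]];              // irregular access pattern
    if (v > 0.0f) out[i] = expensive(v);   // threads that disagree here
    else          out[i] = cheap(v);       //   serialize within a warp
}
```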

So, you can't readily do things like creature or item updates with it (way, way too much synchronization is needed), or background world updates.  Pathfinding might be possible, and I'm sure someone has implemented A* on a GPU.  Temperature updates might be possible, but it's tricky to do right because of the need to interact with neighboring cells.  The weather simulation is similar.  Fluids, a major source of FPS drain, would probably be very difficult to implement.
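For the temperature/neighbor-cell case, the usual workaround is double buffering: read from one grid, write to a second, and swap pointers each tick, so no cell is ever written while a neighbor is reading it. A generic diffusion sketch (not DF's actual temperature model, and the 0.1f rate is made up):

```cuda
__global__ void diffuse(const float* cur, float* next, int w, int h) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x < 1 || y < 1 || x > w - 2 || y > h - 2) return;   // skip the border

    int i = y * w + x;
    // Move each cell toward the average of its four neighbors.
    float avg = 0.25f * (cur[i - 1] + cur[i + 1] + cur[i - w] + cur[i + w]);
    next[i] = cur[i] + 0.1f * (avg - cur[i]);
}
// Host side: launch, then swap the cur/next pointers instead of copying.
```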

If anything, pathfinding is probably the only thing that would really be worth the effort, but as Bumber suggested it would be much simpler to do it with multithreading on the CPU and probably for similar gain.  In order for Toady to make pathfinding work on GPUs, he'd need to rewrite the pathfinding code using CUDA (only works on Nvidia cards) and/or OpenCL (supported less well on Nvidia cards but works on AMD cards), then write code that is able to detect if your system supports the accelerators (on multiple platforms) and transfer the path information back and forth to the GPU.  Not super hard to do, but it's daunting compared to something like OpenMP (just adding extra compiler pragmas for the most part) or pthreads (mostly adding extra function calls here and there).
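For contrast, here's roughly what the OpenMP route looks like; everything here is a hypothetical stand-in (Creature, find_path), not Toady's code. The pragma is essentially the entire change to an existing serial loop:

```c++
#include <vector>

struct Creature { int x, y, dest_x, dest_y; std::vector<int> path; };

// Stand-in for an ordinary serial A* search.
std::vector<int> find_path(const Creature& c) { return {}; }

void update_paths(std::vector<Creature>& creatures) {
    // One line added; each creature's path is computed on its own thread.
    // Safe because iterations don't touch each other's data.
    #pragma omp parallel for
    for (int i = 0; i < (int)creatures.size(); ++i)
        creatures[i].path = find_path(creatures[i]);
}
```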
Logged
Through pain, I find wisdom.

Megaman3321

  • Bay Watcher
  • I'm everywhere but this Universe
Re: Dwarf Fortress GPU acceleration discussion
« Reply #5 on: June 18, 2016, 02:55:22 am »

Quote from: Telgin
Quote from: Megaman3321
Dwarf Fortress, from what I've read and know of it, is one of these cycle-intensive, computationally simple tasks (like say, rendering a 3D scene in a game).

Unfortunately, it's not computationally simple from a GPU's point of view.

The deal breaker is that GPUs ideally work under the following conditions:

1. You can break your problem up into thousands of instances of small, identical problems.
2. Each instance should access memory in a regular pattern related to its neighbors (i.e. thread 1 reads memory address 1, thread 2 reads address 2, etc.).
3. As many threads should follow the same control path as possible (that is, every time you have an if statement in your code, you're introducing a potential performance problem).
4. You should have as little synchronization as possible, and ideally none.

Haha, I didn't know any of that going into this! DF's pathfinding would definitely be doable on a GPU, if my own prior (limited) experience coding nodal pathfinding is anything to go by. Now that I think about it, though, it'd make more sense for DF to support multithreading before even considering GPU acceleration. Once multithreading comes around, it'd definitely make sense to start working on the GPU code!

In the vein of "thousands of similar tasks," though, wouldn't worldgen fit that bill? Considering that the world is procedurally generated, shouldn't it meet the criteria you listed? Or am I misunderstanding something here?
Logged
Glacial on dwarves being assigned socks:
Quote
You see, here's how I think this works:
Overseer: Welcome to the military! You need to wear socks! Dorf: Oh, I should get military socks. My socks are civilian socks. Dorf discards socks Dorf: You know, I need a whole lot of gear now. I should get socks... last. Oh, but these steel boots with the white goo on them are nice!
I know you can pick up water, then throw said water, while underwater, to kill a fish -He_Silent_H

Telgin

  • Bay Watcher
  • Professional Programmer
Re: Dwarf Fortress GPU acceleration discussion
« Reply #6 on: June 18, 2016, 02:29:23 pm »

Some parts of worldgen might be suitable for running on a GPU, but it's hard to say without knowing the intricate details of the algorithms.  If Toady is just evaluating a simple function across each worldgen tile, for example, then that would work well, as long as the data doesn't have to be shuttled back and forth between the GPU and the host system repeatedly.

My guess is that things like terrain generation and mineral veins are probably mostly noise functions, so that would work.  Laying out rivers... maybe.  Erosion, rain maps and things like that are also maybes.
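Something like this is what I mean: a hypothetical per-tile hash-noise kernel, not anything from actual worldgen. Every tile is independent, so it maps cleanly onto thousands of threads, provided the heightmap stays on the device between passes:

```cuda
__global__ void gen_heightmap(float* height, int w, int h, unsigned seed) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    // Cheap integer hash -> pseudo-random value in [0, 1) per tile.
    unsigned n = x * 374761393u + y * 668265263u + seed * 2654435761u;
    n = (n ^ (n >> 13)) * 1274126177u;
    height[y * w + x] = (n & 0xFFFFFF) / 16777216.0f;
}
```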

The biggest thing that makes worldgen slow, though, is generating history, which unfortunately wouldn't really work on a GPU.  It depends on interactions between lots of independent agents, which would require tons of synchronization and tank GPU performance.
Logged
Through pain, I find wisdom.

dennislp3

  • Bay Watcher
Re: Dwarf Fortress GPU acceleration discussion
« Reply #7 on: June 19, 2016, 03:03:07 pm »

I think it might be more useful in the future, after multithreading is actually implemented (they're still working on the 64-bit stuff, so it's a ways off), assuming they can work it out so that it functions more seamlessly with the processor... until then, multithreading is the "holy grail" of performance optimization/boosting for this game.
Logged