Bay 12 Games Forum


Author Topic: So I heard talk..  (Read 2761 times)

Hoborobo234

  • Bay Watcher
  • For the revolution!
    • View Profile
So I heard talk..
« on: April 25, 2009, 09:03:18 am »

Of fortresses not being spread out over the multiple processors that people have. With the next release, will it be possible for fortresses to be spread out, causing almost no trouble with the relatively small catsplosions we have today? Could this be a cure for not setting a population cap and having thousands of dwarves in a fort? Have you got any info you could give me?
Logged
Rather than having them directly force you to mine adamantine, I would suggest that they give you strange moods that require adamantine. "Dig out the adamantine or Urist here goes insane and dies" is suitably vicious.

(It occurs to me that you can probably get "Lovecraft" as the random name of your fortress. That's when you know you're screwed.)

Taritus

  • Bay Watcher
    • View Profile
Re: So I heard talk..
« Reply #1 on: April 25, 2009, 09:19:38 am »

Your question barely makes sense, but I think the answers you're looking for are as follows:  no, DF will not be multithreaded; no, there is currently no work being done to multithread pathfinding.
Logged



Greiger

  • Bay Watcher
  • Reptilian Illuminati member. Keep it secret.
    • View Profile
Re: So I heard talk..
« Reply #2 on: April 25, 2009, 09:21:22 am »

I believe the OpenGL optimizations will allow the graphical stuff to run on another core (or be handled by the graphics card or something, which it apparently doesn't do now), but as for full multicore support, that is a long way away.

It is a touchy topic though.  Many folks want it done as soon as possible despite the time it will take and there being essentially no new features to show for it other than the multicore support; others believe that DF can still run the way it is now and isn't worth a rewrite, despite all signs pointing to it needing to be done eventually.

P.S. Intentionally tried to stay neutral.  I have my opinion, but I want to avoid another argument about this.
« Last Edit: April 25, 2009, 09:25:04 am by Greiger »
Logged
Disclaimer: Not responsible for dwarven deaths from the use or misuse of this post.
Quote
I don't need friends!! I've got knives!!!

dreiche2

  • Bay Watcher
    • View Profile
Re: So I heard talk..
« Reply #3 on: April 25, 2009, 09:27:13 am »

I'm for multithreading, but please, does it have to come up in every second thread? I'd suggest that the OP locks this thread, or otherwise that we let it die.
Logged

azrael4h

  • Bay Watcher
    • View Profile
    • My Dwarf Fortress-centric YouTube videos, part of my nominally vintage gaming channel.
Re: So I heard talk..
« Reply #4 on: April 25, 2009, 09:28:37 am »

Actually, if we wait long enough, 3.x+ GHz quad cores will become the standard in low-end computers. Maybe 2 years. Probably by the time I replace this laptop, 3.x GHz dual cores will be standard on lower-end computers.

So soon enough, you'll be able to buy or build a $500-600 computer that can run a 200-dwarf fort, even without multithreading. You may already be able to build one, but I haven't priced stuff in about 8 months. I should.

Edit

Built a comp at Newegg for $357, not counting a few rebates and shipping. Didn't really try to price shop either. I could probably get the pre-shipping total down below $300, easily, if I used more than one website instead of just Newegg:
3.0 GHz AMD dual core
4 GB RAM
512 MB nVidia 8400
Sound Blaster
Samsung DVD/CD burner
plus case, power supply, fans, and other stuff.
« Last Edit: April 25, 2009, 09:46:57 am by azrael4h »
Logged

codezero

  • Bay Watcher
    • View Profile
Re: So I heard talk..
« Reply #5 on: April 25, 2009, 09:53:47 am »

It's still possible CPUs with multiple cores will be a fad, because some things run better in serial and some in parallel, and parallel is what GPUs currently do. With the growing popularity of programmers writing software that uses the GPU for stuff the CPU used to do, graphics cards might be rebadged as 'parallel processors', running alongside the 'serial processor' (your old CPU).
Logged

Kardos

  • Bay Watcher
    • View Profile
Re: So I heard talk..
« Reply #6 on: April 25, 2009, 10:02:47 am »

There's actually some research into building mobos that support two processor sockets.  One for your CPU, and the other for the GPU.  Don't ask where I saw that; I can't for the life of me remember.
I think it's about time, really.  Way back when the video card was first implemented, video was an accessory to a terminal machine.  Now, video is as important as being able to use Excel or a word processor.
Logged

Volfram

  • Bay Watcher
  • hate you all.
    • View Profile
Re: So I heard talk..
« Reply #7 on: April 25, 2009, 10:10:01 am »

It's still possible CPUs with multiple cores will be a fad, because some things run better in serial and some in parallel, and parallel is what GPUs currently do. With the growing popularity of programmers writing software that uses the GPU for stuff the CPU used to do, graphics cards might be rebadged as 'parallel processors', running alongside the 'serial processor' (your old CPU).
Not likely, in my opinion.  We're currently approaching the physical limitations of clock speed, which was beginning to look like a roadblock for Moore's law, but multiple cores, along with better instruction sets and larger bit widths, have allowed chipmakers to circumvent those limitations, and Moore's law still holds true.
Logged
Andir and Roxorius "should" die.

Yes, actually, I am trying to get myself banned.  I wish Toady would quit working on this worthless piece of junk and go back to teaching math.

codezero

  • Bay Watcher
    • View Profile
Re: So I heard talk..
« Reply #8 on: April 25, 2009, 10:21:28 am »

Yeah, now that I think about it, multicore CPUs are probably a huge scam. I bet a $200 GPU could run threaded stuff (with some help from CUDA and ATI's version) 100 times faster (literally) than an $800 iCore.

This is after I read the latest post:
I still think it will be a fad, 'cause someone will probably find a way to make multicores run serially. That might seem like a silly argument, but it would fulfil the need for serial processing power, whilst a GPU already takes care of multi-threading.

EDIT: No matter how many cores you throw on a CPU, you're still gonna be better off running some stuff through the GPU, and to put all those extra cores on, you probably have to reduce the frequency.
« Last Edit: April 25, 2009, 10:24:45 am by codezero »
Logged

CynicalRyan

  • Bay Watcher
    • View Profile
Re: So I heard talk..
« Reply #9 on: April 25, 2009, 10:43:38 am »

Yeah, now that I think about it, multicore CPUs are probably a huge scam. I bet a $200 GPU could run threaded stuff (with some help from CUDA and ATI's version) 100 times faster (literally) than an $800 iCore.

Not true if you look at processor architecture. It's a trade-off: a GPU is great at doing complicated floating-point calculations and other tasks that can be easily parallelized (cracking WPA keys, for example, or other cryptographic problems which thrive on floating point and large primes).

However, Google's MapReduce (the way the search engine works, more or less) wouldn't be able to run on these beasts, since it's essentially really, really fast data lookup. For that, you need a general-purpose CPU (that's how Google does it, for example), or other specialized hardware. But who owns a PhysX board (it's to physics engines what the old Voodoo cards were to 3D acceleration)?

And multicores don't just provide a benefit to one multithreaded application: you can also run multiple single-threaded applications at the same time. For example, I could run two instances of Dwarf Fortress without performance loss (all else being equal).

Quote
I still think it will be a fad, 'cause someone will probably find a way to make multicores run serially. That might seem like a silly argument, but it would fulfil the need for serial processing power, whilst a GPU already takes care of multi-threading.

Somebody said the same thing about personal computers, too: that they were a fad, and only interesting for businesses.

Quote
EDIT: No matter how many cores you throw on a CPU, you're still gonna be better off running some stuff through the GPU, and to put all those extra cores on, you probably have to reduce the frequency.

The first part is true (see above), but not the second part. The only time you lose a bit of performance is if a thread has to be shifted from one core to another core. However, multicores are, essentially, a proven technology: servers and workstations (like the SPARCstation, or the Mac Pro) have used multiple CPUs for ages now.

Multicore simplifies the architecture (shared L2 cache, shared memory pipelines) to lower cost, at the trade-off that you lose maybe one or two cycles of processing power, which doesn't matter with clock rates in the gigahertz range.

The trick is to exploit the benefits of multicores, and it will take a while until software developers are able to solve that. Concurrency is hard, but necessary to fully exploit the hardware. It's akin to the advent of structured programming languages, or object-oriented programming: it takes a while to catch on.
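
Just to illustrate what exploiting multicores can look like (a rough sketch of my own, with a made-up workload, using C++ threads; nothing to do with how DF actually works): split an independent loop across however many cores the machine reports, then combine the partial results.

Code:
// Sketch: divide an independent workload across hardware threads.
#include <thread>
#include <vector>
#include <numeric>
#include <iostream>
#include <cstddef>

int main() {
    std::vector<double> data(1000000, 1.5);              // made-up workload
    unsigned n = std::thread::hardware_concurrency();    // how many cores we have
    if (n == 0) n = 1;
    std::vector<double> partial(n, 0.0);
    std::vector<std::thread> workers;

    std::size_t chunk = data.size() / n;
    for (unsigned t = 0; t < n; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end = (t + 1 == n) ? data.size() : begin + chunk;
        workers.emplace_back([&, t, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                partial[t] += data[i] * data[i];          // each thread writes only its own slot, no locks
        });
    }
    for (std::size_t i = 0; i < workers.size(); ++i)
        workers[i].join();

    std::cout << std::accumulate(partial.begin(), partial.end(), 0.0) << "\n";
}

The splitting itself is the easy part; the hard part is that most interesting game state isn't independent like this, which is exactly why the rework is major.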

To answer the OP's question: I doubt that DF will be multithreaded any time soon (at least as a playable version). Multithreaded apps are a whole different approach to programming, and require major(-ish) reworking of how DF does things currently.
Logged

codezero

  • Bay Watcher
    • View Profile
Re: So I heard talk..
« Reply #10 on: April 25, 2009, 11:35:48 am »

Fair enough, I'm glad it's around, so people become more aware of parallelisation, but any software that needs to run serially won't benefit at all from multicores, and any software that can be made parallel could just utilise the GPU. Though there are some exceptions, according to you. So I think they're essentially useless except as a spur to more efficient coding practices. Alright, I can run two programs at the same time, but that's about as useful as buying another computer and having a switch on my monitor.
Logged

G-Flex

  • Bay Watcher
    • View Profile
Re: So I heard talk..
« Reply #11 on: April 25, 2009, 11:39:37 am »

Fair enough, I'm glad it's around, so people become more aware of parallelisation, but any software that needs to run serially won't benefit at all from multicores, and any software that can be made parallel could just utilise the GPU. Though there are some exceptions, according to you.

Reading the guy's post *again*, it seems like the GPU being better for certain tasks is the exception, not the rule. There's a reason they aren't general-purpose CPUs, and it's the same reason they can afford to be better than them at certain things.
Logged
There are 2 types of people in the world: Those who understand hexadecimal, and those who don't.
Visit the #Bay12Games IRC channel on NewNet
== Human Renovation: My Deus Ex mod/fan patch (v1.30, updated 5/31/2012) ==

Volfram

  • Bay Watcher
  • hate you all.
    • View Profile
Re: So I heard talk..
« Reply #12 on: April 25, 2009, 12:46:58 pm »

Hardware vs. software implementation.  GPUs have a very specific instruction set which is very good for converting raw data into an image to display onscreen.  They're not so good for general number crunching.  You wouldn't want to run your AI through your GPU, for example.  It wouldn't work (or at the very least, it would crawl in comparison to the CPU).

The CPU is much more versatile and has a much broader instruction set, but this means that it isn't as efficient at doing specific tasks.  The GPU trades versatility for specialization; the CPU trades specialization for versatility.

As a point of reference, your average CPU currently runs what, in the 3 GHz area? (And they have for the past two or three years, because of physical limitations to clock speed.)  A high-end GPU runs between 400 and 800 MHz, but hardware-implemented graphics acceleration is considerably faster than software, even though the processor doing the calculations runs 3-8x slower.

The GPU is really, really wimpy in comparison to the CPU in terms of raw power, but it's absurdly good at what it's designed for.  It's like comparing a mine cart to a pickup truck.  Sure, the truck can go more places, carry more, and probably moves faster in general, but if you're transporting ore through a mine on a cart track, the mine cart is a far better choice.

Guess what my major is.
Logged
Andir and Roxorius "should" die.

Yes, actually, I am trying to get myself banned.  I wish Toady would quit working on this worthless piece of junk and go back to teaching math.

macdonellba

  • Bay Watcher
    • View Profile
Re: So I heard talk..
« Reply #13 on: April 25, 2009, 01:17:31 pm »

Guess what my major is.
Automotive engineering? :P Any good computer engineering major knows that the clock speed of a massively parallel architecture (like a GPU) means something so different from the clock speed of a strictly serial, instruction-reordering architecture that it's completely irrelevant for comparison's sake. In fact, if you're discussing 'raw power' in GFLOPS, most modern GPUs take the cake. While I realize this remains an apples-to-oranges comparison, I prefer it to the apples-to-Apples (the Swiss municipality) explanation you gave. ;)

Anyway, just for the sake of completeness: modern GPUs aren't so much specialized for 'converting data to an image onscreen' as for massively parallel manipulation of vectors and matrices to support 3D geometry calculations such as shaders. This makes them awesome for 'general number crunching' (see: CUBLAS), but terrible for the kinds of indirection- and memory-intensive work that CPUs usually spend so much time on.
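
To make that concrete (just a toy sketch I'm making up, names and all; not any real GPU API): the first function below is the independent, floating-point kind of work a GPU is built around, and the second is the pointer-chasing, branchy kind of work where a CPU's caches and general-purpose smarts matter far more than raw FLOPS.

Code:
#include <vector>
#include <list>
#include <cstddef>

// Data-parallel: every element is independent, pure floating-point math.
// This shape of loop maps straight onto GPU/SIMD hardware.
void saxpy(float a, std::vector<float>& y, const std::vector<float>& x) {
    for (std::size_t i = 0; i < y.size() && i < x.size(); ++i)
        y[i] = a * x[i] + y[i];
}

// Indirection-heavy and serial: each step chases a pointer and branches,
// so memory latency and caches matter far more than arithmetic throughput.
long countPositive(const std::list<int>& xs) {
    long n = 0;
    for (std::list<int>::const_iterator it = xs.begin(); it != xs.end(); ++it)
        if (*it > 0) ++n;
    return n;
}

int main() {
    std::vector<float> x(4, 2.0f), y(4, 1.0f);
    saxpy(0.5f, y, x);                        // y becomes {2, 2, 2, 2}
    std::list<int> xs;
    xs.push_back(-1); xs.push_back(3); xs.push_back(7);
    return countPositive(xs) == 2 ? 0 : 1;    // exit code 0 if the count is right
}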
« Last Edit: April 25, 2009, 01:34:54 pm by macdonellba »
Logged

CynicalRyan

  • Bay Watcher
    • View Profile
Re: So I heard talk..
« Reply #14 on: April 25, 2009, 03:17:33 pm »

Reading the guy's post *again*, it seems like the GPU being better for certain tasks is the exception, not the rule. There's a reason they aren't general-purpose CPUs, and it's the same reason they can afford to be better than them at certain things.

Pretty much, yes. I wasn't clear enough in that regard, my apologies.

To sum it up: if you need to crunch ginormous amounts of numbers, use a RISC (Reduced Instruction Set Computer) (pun very much not corrected), like a modern GPU. If you need to do anything else, use a plain old CISC (Complex Instruction Set Computer).
Logged