Isn't there a performance loss of a Multicore game forced to use a single core processor?
First, a clarification: There is no such thing as a "multicore game". There can be, however, such a thing as a multi-*threaded* game. To explain: A program can, in theory, be broken down into multiple parts that can do stuff independently. In the example of Dwarf Fortress: You could have a thread drawing the UI, another thread doing the AI, another thread doing world stuff (like migrations, or armies moving around in the background).
My apologies for my anal-retentiveness in this regard, but it is an important difference.
Now to answer your question:
Short answer: No.
Optional, long answer: It depends.
Multi-threaded applications, and in particular games, require a lot of concurrency (meaning that the threads do stuff on their own while sharing data with other threads). If you do that wrong, you get all sorts of interesting effects, mostly manifesting as crashes. Concurrency is *hard*. It is even harder in languages like C/C++, which, IIRC, is what Dwarf Fortress is written in.
Then, there is the problem of scheduling. For simplicity's sake, let's count each core as its own CPU. A machine with "n" cores can run n threads truly in parallel, and a few extra can usually be tolerated without noticeable performance loss. But once you run many more threads than you have cores, you get more cache misses (meaning that data has to be fetched from slower memory, which takes longer), more page faults (data isn't in memory at all, but paged to disk, taking very long to load, even multiple seconds, which is *ages* even for a Pentium II CPU), and of course scheduling conflicts (i.e. the OS can't give every thread the time it actually needs; a thread gets thrown off the CPU, another thread runs for a while and needs the cache refilled, and so on).
Of course, *any* and *all* modern operating systems are able to do multi-tasking, and can thus handle many threads at once. So, as long as the programs are halfway decently written, there should be no noticeable effect on performance, while a computer with a multi-core CPU enjoys the benefit of the extra cores.
Now, here is the really strange issue: Even if you have a single-core CPU, a *lack* of threads can make an application seem slower. You can notice that with Dwarf Fortress: During saving and loading, the screen sometimes stops updating. That makes it *seem* like the application is slower, when in fact it just means the application is very busy.
Whence the advice to application developers to use two threads when they write their spreadsheet application: a UI thread, and a so-called worker thread. The UI thread keeps the screen updated and semi-responsive, while the worker thread is, well, working.
Clear as mud, innit?
Edit: Clarity and an apology for possible perceived rudeness/condescending attitude.