My computer runs Morrowind, Minecraft, and Dark Souls 2 perfectly on their highest settings. Can barely handle Dwarf Fortress at 60 dwarves.
Morrowind is old, and there's a significant difference between those games and DF: they tax the GPU, not the CPU.
Sorry man, but I know a good deal about how Morrowind runs under the hood: it uses the CPU very aggressively, even for graphical functions. It's a DX7 title with DX8 vertex and pixel shading tacked on top. It also uses the CPU for atmospheric sound-effect processing by default.
Almost all of the menu transparency effects are done using software overlay rendering, not hardware-based methods. (This is likely due to the issues with transparency in hardware implementations at the time, where AMD cards and nVidia cards did things in radically different ways, which would have been tricky to deal with. That, and the Gamebryo/NetImmerse engine is total crap.)
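If it helps picture what "software overlay rendering" means: the CPU blends the menu into the framebuffer itself, one pixel at a time, every frame. Here's a minimal sketch of the textbook per-pixel alpha blend in C; the names and pixel layout are mine, not pulled from the actual engine:

[code]
#include <stdint.h>
#include <stdio.h>

/* Generic "src over dst" alpha blend, done entirely on the CPU.
 * This is the standard technique, not Morrowind's literal routine:
 * software menu transparency boils down to loops of this over every
 * covered pixel, every frame. Pixels are 0xAARRGGBB. */
static uint32_t blend_pixel(uint32_t dst, uint32_t src)
{
    uint32_t a  = (src >> 24) & 0xFF;  /* overlay alpha, 0..255 */
    uint32_t sr = (src >> 16) & 0xFF, sg = (src >> 8) & 0xFF, sb = src & 0xFF;
    uint32_t dr = (dst >> 16) & 0xFF, dg = (dst >> 8) & 0xFF, db = dst & 0xFF;

    /* out = src*a + dst*(1-a), per channel */
    uint32_t r = (sr * a + dr * (255 - a)) / 255;
    uint32_t g = (sg * a + dg * (255 - a)) / 255;
    uint32_t b = (sb * a + db * (255 - a)) / 255;
    return (r << 16) | (g << 8) | b;
}

int main(void)
{
    /* 50% opaque white menu pixel over a mid-grey scene pixel. */
    uint32_t out = blend_pixel(0x00808080u, 0x80FFFFFFu);
    printf("blended pixel: 0x%06X\n", out);  /* prints 0xBFBFBF */
    return 0;
}
[/code]

Do that for every pixel a fullscreen menu covers, every frame, and you can see why the CPU takes a beating when the hardware path isn't used.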
Things like FPS Optimizer and pals actually patch the in-memory game executable's code to force it to run in hardware-accelerated modes!
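For anyone wondering what "patching the in-memory executable" looks like in practice, it's the standard Win32 process-patching trick. This is only a generic sketch; I don't know FPS Optimizer's actual patch addresses or byte sequences, so everything specific here is a placeholder:

[code]
#include <windows.h>
#include <stdio.h>

/* Sketch of how tools like FPS Optimizer can patch a running game.
 * The PID, address, and bytes used below are hypothetical placeholders. */
static int patch_process(DWORD pid, LPVOID address, const BYTE *bytes, SIZE_T len)
{
    HANDLE proc = OpenProcess(PROCESS_VM_WRITE | PROCESS_VM_OPERATION, FALSE, pid);
    if (!proc) return 0;

    DWORD old_protect;
    SIZE_T written = 0;
    /* Make the code page writable, overwrite the instructions, restore. */
    VirtualProtectEx(proc, address, len, PAGE_EXECUTE_READWRITE, &old_protect);
    WriteProcessMemory(proc, address, bytes, len, &written);
    VirtualProtectEx(proc, address, len, old_protect, &old_protect);

    CloseHandle(proc);
    return written == len;
}

int main(void)
{
    /* Placeholder values only: a real tool would find the game's PID and
     * the exact instructions to replace. 0x90 is the x86 NOP opcode. */
    BYTE nops[2] = { 0x90, 0x90 };
    if (!patch_process(1234, (LPVOID)0x00401000, nops, sizeof nops))
        fprintf(stderr, "patch failed (expected with a placeholder PID)\n");
    return 0;
}
[/code]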
Sorry to nitpick on that. Carry on.
That said, I recently came into possession of a dual-socket (each sporting a dual-core Xeon) rackmount enterprise server, an HP ProLiant DL380 G4. It's an older model intended for use with Windows Server 2003 (basically WinXP, but for servers), with 16 GB of RAM. I'm not sure what the memory bus width is for memory accesses, but this is still a server-grade appliance (dual PSUs, SCSI RAID disk array, and all). I recently decided to try running DF on it, but was... disappointed... with the outcome.
The baked-in graphics chip has absolutely NO ACCELERATION AT ALL. No, not even 2D. I suspect that this is the issue. Even with the CPU basically doing everything and the graphics being little more than a dumb framebuffer, it gets about 25 FPS on a large embark. Not too bad, but I would have expected better.
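Out of curiosity, you can put a rough number on what a dumb framebuffer costs the CPU. A quick-and-dirty C sketch like this one (the resolution and depth are my own guesses, not measurements from the DL380) times how fast the machine can shove whole frames around in plain memory, which is roughly what every DF frame demands when there's no 2D blitter helping:

[code]
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>

int main(void)
{
    /* One 1280x1024 32-bit frame, about 5 MB. With zero acceleration,
     * the CPU has to compose and copy something like this every frame. */
    const size_t frame = 1280 * 1024 * 4;
    char *src = malloc(frame), *dst = malloc(frame);
    if (!src || !dst) return 1;
    memset(src, 0xAB, frame);

    const int copies = 200;
    clock_t t0 = clock();
    for (int i = 0; i < copies; i++)
        memcpy(dst, src, frame);
    double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;

    /* Copies/sec is an optimistic ceiling: real rendering also has to
     * draw the tiles, not just move finished frames around. */
    printf("%d frame copies in %.2fs -> %.1f copies/sec\n",
           copies, secs, copies / secs);
    free(src);
    free(dst);
    return 0;
}
[/code]

If the raw copy rate on old server hardware isn't far above the frame rate you want, the "dumb framebuffer" theory looks pretty plausible.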