Huh? You probably meant $100 at least.
No, really -
here is a $34 card that should be able to run shader mode. There are probably cheaper cards out there; that was just the one I found in 30 seconds.
The real problem - and the reason most commercial games still have some legacy support for DX9 hardware - is laptops and all-in-ones. Of course, DF has legacy support for older hardware as well...
Back on topic: Ironhand, could you throw us some graphical bone, or we'll have another few pages of off-topic posts.
How about this? This is a mockup in RenderMonkey (ATI's shader editor) of depth-blending, which Artanis and I were talking about before the derail of the derail. Please excuse the cheesy Photoshop Clouds texture I used as noise:
As I see it, next job is to get some prettiness up and running to justify this semi-fancy hardware.
In 3D, you have to do all this crazy spherical math to do depth-blending, but in 2D it's dead simple. Here's the shader:
// GLSL 1.20-style fragment shader (varying / gl_FragColor, so texture2D, not texture)
uniform sampler2D depthTex;
uniform sampler2D colorTex;
uniform sampler2D miasmaTex;

varying vec2 texCoord;

void main(void)
{
    float depth  = texture2D( depthTex,  texCoord ).r;
    vec3  color  = texture2D( colorTex,  texCoord ).rgb;
    vec3  miasma = texture2D( miasmaTex, texCoord ).rgb;

    // made-up formula to get a blend I like
    float thinning = (miasma.r - 0.1) * 0.5;
    float alpha = clamp(thinning + 1.2 * depth, 0.0, 1.0);

    // set alpha explicitly so the output isn't undefined
    gl_FragColor = vec4(color * alpha + miasma * (1.0 - alpha), 1.0);
}
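If anyone wants to play with the blend curve without firing up RenderMonkey, here's the same made-up formula as a quick Python sketch. The texture fetches are replaced by plain floats; the function name and the 0..1 depth convention are just my assumptions for illustration, the constants are the ones from the shader:

```python
def miasma_blend(depth, color, miasma):
    """Blend a tile color toward a miasma/cloud color based on depth.

    depth:  0.0 (deep) .. 1.0 (near the surface), the depth texture's red channel
    color:  (r, g, b) tile color, components in 0.0..1.0
    miasma: (r, g, b) cloud color, components in 0.0..1.0
    """
    # made-up formula to get a blend I like (same constants as the shader)
    thinning = (miasma[0] - 0.1) * 0.5
    # clamp(x, 0, 1), like GLSL's clamp()
    alpha = min(max(thinning + 1.2 * depth, 0.0), 1.0)
    # lerp between miasma and tile color; alpha=1 means pure tile color
    return tuple(c * alpha + m * (1.0 - alpha) for c, m in zip(color, miasma))
```

At depth 1.0 the alpha clamps to 1 and you get the tile color unchanged; as depth drops toward 0 the cloud color takes over, with the miasma's own brightness thinning it slightly.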
We don't have all the tile info, though, so blending cloud effects like that is kinda silly. At least until we can talk Toady into doing multiple passes, as mentioned earlier.
Pretty much, although as WormSlayer said, it might be possible to do something neat around the edges. And Baughn foolishly said that multiple passes might not be unreasonable!