From my experience with Apple products, I occasionally have the displeasure of working with a Mac. Now, the thing about Macs is that they're advertised as machines that "just work." That kind of thinking makes software developers lazy, and thus more prone to shipping crashes from unexpected actions. For example, dragging a playlist between two separate panes: the drag itself works, but the panes don't update without restarting the program, and if you try to interact with where the playlist once was, the program locks up. Furthermore, the Mac has no obvious way to deal with frozen software other than rebooting. (With Windows, after a certain timeout, the hung window is replaced with a dummy that can be moved, resized, etc., and trying to close it offers an option to force it to close. Advanced users can go straight to the Task Manager and end the application from there. Really advanced users may opt to terminate the process directly, for an instantaneous close.)
Therefore, Macs inspire a culture of lazy programming and half-assed (when present at all) handling of unexpected situations: bugs, crashes, freezes, and other unpleasant facts of software life.
With Windows, there is a basic understanding that nothing is perfect, so it is a good idea to check pointers against null even when they should never be null, actually respond to error codes, validate basic user input thoroughly, and, most importantly, have your program be the one to catch those errors rather than the OS. Because, you know, someday someone may run your software on a Mac, and then Windows will not be around to cleanly deal with the problem. All you are left with is the Mac, scratching its head, puzzling over a program that suddenly stopped working because of a heisenbug that was never noticed before, completely unable to decide what to do, because it had never considered that it might be, somehow, even remotely, flawed.