At this point, isn't it better (performance/cost-wise) to just buy the "better" options from the start, instead of trying to push the lesser ones to their limits?
I suppose looking at it rationally, it's not particularly likely that you'll end up with an i5 that runs at i9 speeds stably, but overclockers like the thrill of it. They like being able to a) save money, and b) stick it to the man by showing off these overclocked CPUs. Hell, they call the tendency for CPUs to vary in how far you can push them the "silicon lottery". If you buy an i5 that reaches i9 speeds when overclocked, you've basically "won" the "lottery".
This is way before my time, but there was this CPU called the Celeron 300A, which ran at 300 MHz. You hear "Celeron" now and you associate it with pain. Back then, though, it turned out you could push these things to 450 MHz (a 50% overclock, just by running the 66 MHz bus at 100 MHz) 9 times out of 10. What does that get you? Well, the Pentium II 450 MHz was a thing. In workstation applications, sure, the Celeron would lose with its smaller cache. In gaming? You got the performance of a 450 MHz Pentium II, and you paid Celeron prices.
That's sort of the dream of overclocking: to buy a weaker product and push it to its absolute limits to get the performance of a stronger product, while only paying for the weaker one.
Omigosh, this is why I try to tell my mom I don't understand computer hardware! Just because I can diagnose and replace a failed power supply doesn't mean I understand all the RAM/CPU numbers and compatibility issues. Part of me still wants to think "More clock speed has to be gooder, even across instruction sets and architectures!"
Intel Pentium 4, AMD Bulldozer. Prime examples of higher clock speed != higher performance. The main problem with them is that, yes, they reach stupid-high clock speeds, but they sacrifice per-clock efficiency (instructions per cycle) to get there, to the point that their long pipelines leave them literally sitting around stalling half the time.
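To put toy numbers on that: throughput is roughly clock speed times instructions per cycle (IPC), so a high-clocked chip that stalls a lot can lose to a slower-clocked one that keeps its execution units busy. A quick Python sketch, with completely made-up figures just to show the arithmetic:

```python
# Rough model: throughput ~ clock frequency x instructions per cycle (IPC).
# All figures below are hypothetical, purely for illustration.

def effective_mips(clock_mhz: float, ipc: float) -> float:
    """Millions of instructions retired per second, under this toy model."""
    return clock_mhz * ipc

# Deep-pipeline design (P4-style): clocked high, but stalls drag IPC down.
high_clock = effective_mips(clock_mhz=3000, ipc=0.5)

# Shorter-pipeline design: lower clock, but the units stay busy.
high_ipc = effective_mips(clock_mhz=2000, ipc=1.2)

print(f"3000 MHz at 0.5 IPC: {high_clock:.0f} MIPS")  # 1500 MIPS
print(f"2000 MHz at 1.2 IPC: {high_ipc:.0f} MIPS")    # 2400 MIPS
```

The slower-clocked chip wins here despite a 1000 MHz deficit, which is exactly the trap of judging CPUs by clock speed alone.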
It gets even better with the Pentium 4-based Celerons. They literally cut down the cache, on an architecture that was already stalling constantly for other reasons. You know what Intel did to "compensate"? Cranked the clock speeds even higher, which pushed heat output up further. Yeah, people say P4s are room heaters, but the Celerons of the era... Jesus Christ. I can't properly corroborate the numbers, but a Celeron D (as they were called) would end up consuming more power than a Pentium 4 of the same era and performance. It's a jump from 65W to 84W, something in that ballpark. The Intel stock coolers of the era reflected that; the P4 ones were copper-cored just so they could dissipate the ludicrous amount of heat those chips generated.
Of course, now you have CPUs that easily go above 125W, given sufficient cooling. The difference is that you actually get performance when you invest in a good cooling system. It's not that modern chips are inefficient; it's that the sheer number of cores you get in high-end processors needs that much power.