Most software developers are just lagging too far behind the curve right now, and only a handful are even starting to consider native 64-bit multi-threaded support.
I don't think that's a fair thing to say.
Most of the software people use on their computers is I/O bound, just sitting there waiting for you to make an input. And oftentimes it isn't even using enough RAM to justify going 64-bit, which has only two primary benefits: natively handling 64-bit data types and expanded memory access. Going 64-bit doesn't automatically translate into better performance. In my last job, I rarely had to use anything beyond 16-bit data, on a 32-bit processor no less.
Multi-threaded support is another matter, but again, since most programs we use are I/O bound, it doesn't matter much. As for games, considering that everyone is pushing to lower the CPU requirement bar even further (because most of the CPU cost is API overhead)... well, there you go.
What I believe is going on is a combination of running out of hardware techniques (there hasn't been a major breakthrough in processor architecture since multi-core design) and there being enough eggheads to create well-optimized software that's reached a "good enough" point, where optimizing further costs more resources than it's worth. Like the whole "what the fuck" fast inverse square root algorithm. It's fast, it's accurate, and while I'm sure you can find something even better, it does what it does well.
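For anyone who hasn't seen it, that's the famous trick from the Quake III source. Here's a sketch in C using the published magic constant (with `memcpy` instead of the original's pointer punning, to keep it well-defined):

```c
#include <stdint.h>
#include <string.h>

// Quake III's fast inverse square root: approximates 1/sqrt(x).
// The magic constant 0x5f3759df is straight from the published source.
static float q_rsqrt(float number) {
    float x2 = number * 0.5f;
    float y  = number;
    uint32_t i;
    memcpy(&i, &y, sizeof i);     // reinterpret the float's bits as an integer
    i = 0x5f3759df - (i >> 1);    // the "what the fuck" line: cheap initial guess
    memcpy(&y, &i, sizeof y);
    y = y * (1.5f - x2 * y * y);  // one Newton-Raphson step to refine it
    return y;
}
```

One Newton step gets the result within roughly 0.2% of the true value, which was plenty for lighting calculations, and it was far cheaper than a divide plus `sqrt` on the hardware of the day.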
Or it's like how we could use gold or silver instead of copper for electrical and thermal applications, but they cost too much to justify the extra "performance" they'd give.