In the computer graphics industry, there's a concept called the "uncanny valley". The idea is that there's a major perceptual threshold you hit when things get VERY realistic looking. For a while, renderings are more and more convincing as you get more photo-realistic, and more and more pleasing, until the graphics get so realistic that every little thing that's just slightly off jumps out at you.
And because it's otherwise so realistic (and most people register these defects only subconsciously), this can create disbelief, and even revulsion. The gap in believability actually WIDENS in this "uncanny valley" as you approach photorealism, at least until you can iron out these previously unimportant kinks.
I think that's more or less where we are with compute power on the desktop.
The trivial summation of Moore's law is: "Computers get twice as fast every 18 months." There's more subtlety there (the original observation was an empirical one about trends in transistor counts on integrated circuits, not speed per se), but it's a fair summation, I think.
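That 18-month shorthand compounds fast, which is what makes the current plateau so striking. A minimal back-of-the-envelope sketch (the function name and the 18-month figure are just the popular summation, not Moore's precise observation):

```python
def relative_speed(years, doubling_period_months=18):
    """Relative compute power after `years`, assuming a doubling
    every `doubling_period_months` months (the popular shorthand)."""
    return 2 ** (years * 12 / doubling_period_months)

# Under the shorthand, a decade compounds to roughly a 100x increase:
print(round(relative_speed(10)))  # prints 102
```

A hundredfold increase per decade is the kind of growth that should be transformative, which is exactly why its apparent invisibility in everyday desktop experiences is worth remarking on.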
Unfortunately, we're not seeing the product and consumer experiences that really benefit from Moore's Law anymore (games aside). We're in the "uncanny valley" of application experiences, at least from a desktop compute power perspective. (OK, so it's more of a plateau than a valley, but you get the idea...)
What's interesting is this: it's not clear to me whether this current experience gap, and our (the industry's) ability to clear it, reflects a failure of compute power to grow fast enough to enable these new experiences, or a failure of imagination.
I suspect the latter.