I can't say for sure; it's difficult to predict. Moore's Law rests on assumptions that are starting to break down, and it's widely accepted that the trend is nearing its end. On one hand, computing power is approaching a plateau; on the other, media files keep getting larger and hard drives keep getting bigger. Our ability to access and process ever-growing amounts of information won't necessarily keep pace with the growth of that information itself. Unless we're careful about this, some processes may take longer to complete, or at least appear to. Searching an entire hard drive is one example. A simple compute-bound algorithm such as a prime tester would run as fast as the processor allows, while loading a movie file or defragmenting a hard drive could take longer as the data grows.
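To make that last distinction concrete, here's a rough Python sketch (the path `big_file.bin` is just a placeholder, not anything standard): the prime test's runtime depends only on processor speed, while the file scan's runtime grows with the amount of data on disk, no matter how fast the CPU gets.

```python
import os
import time

def is_prime(n):
    """CPU-bound: cost depends on the processor, not on any stored data."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def scan_file(path, needle=b"needle"):
    """Data-bound: cost grows with the size of the file being searched."""
    with open(path, "rb") as f:
        # Read in 1 MiB chunks so memory use stays flat regardless of file size.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            if needle in chunk:
                return True
    return False

if __name__ == "__main__":
    start = time.perf_counter()
    is_prime(2_147_483_647)  # a large (Mersenne) prime
    print("prime test:", time.perf_counter() - start, "s")

    # Point this at any large local file to see the data-bound cost.
    path = "big_file.bin"
    if os.path.exists(path):
        start = time.perf_counter()
        scan_file(path)
        print("file scan :", time.perf_counter() - start, "s")
```

No matter how much faster processors get, the second number keeps scaling with the size of the file, which is the sense in which bigger data can make things feel slower.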
Then again, algorithms keep becoming more efficient and innovative, and computers need defragmenting less often because they handle data more efficiently. More and more data is being kept in the cloud rather than on local drives. Movie files do keep getting bigger, with an ultra-high-definition version taking up 1 TB, but the audio and visual improvement is so marginal that most people don't care and the rest don't notice, so we're satisfied with a 300 MB lower-quality equivalent.
I think the next great revolutions in computing won't be in microprocessor speed, but in algorithm design and infrastructure. A multitude of other technologies are being worked on, too.