If I could get a dollar each time a marketing guy comes up with a new buzzword, I'd be a rich man by now.
"Cloud"-environments have existed for a long time. But we didn't know that they were "clouds" so we called them "fail-tolerant clusters". That was run on the big iron. In last decade, x86 hardware got faster and storage got cheaper which allowed you to build clusters with run-off-the-mill components. VMWare greatly improved this so many companies are virtualizing their servers into big VMWare-clusters. (this is very much a step back to "dumb-terminal-to-mainframe" architecture of 40 years ago).
So essentially, nothing has changed. This has been around forever. The only difference is that Internet connections got faster and web browsers can now run code internally. That allows certain applications to be "outsourced": home users run them remotely on big clusters built from cheap Wintel hardware.
"Clouds" are already here, you just don't notice them. Your Google search is done in the "cloud", your bank account is handled by a cluster of computers...heck, I bet even this forum is probably hosted on couple of load-balanced servers an sucking it's data from clustered database servers.
A personal observation: things are just as slow nowadays as they were 20 years ago. A web-based "thin" client needs as much time, or more, to render all that fluff, connect to the Interwebs, parse the HTML and deliver the data as an old Telnet terminal took 20 years ago.
So don't worry, you will still need to buy the newest Wintel CPU with a gazillion GHz in order to run your spreadsheet or write a simple document. Otherwise the whole industry would grind to a halt.