Every once in a while I really consider how much, say, my parents and grandparents paid for computer processing power (grandfather's C64 at ~$500, our first PC running a 386 at $1,100) and compare it to the torrent of memory, processing speed, etc. that's shooting out of the Earth now.
It's astonishing, Moore's Law (and variants) notwithstanding.
We know there is some limit to Moore's Law, but I have a hard time really wrapping my head around what kind of nano computing we'll have available for pennies in a decade.
> Every once in a while I really consider how much, say, my parents and grandparents paid for computer processing power (grandfather's C64 at ~$500, our first PC running a 386 at $1,100) and compare it to the torrent of memory, processing speed, etc. that's shooting out of the Earth now.
It's such a simple way of looking at it, but I've always found this amazing:
There was as much progress between 1965 and 2012 as between 2012 and now. That's just what a fixed doubling period means: each new doubling adds as much raw capacity as all the previous doublings combined.
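A quick sanity check of that claim, assuming an idealized two-year doubling period and taking 2014 as "now" (both figures are my assumptions, not from the comment above):

```python
# Idealized Moore's Law: capacity doubles every 2 years.
DOUBLING_YEARS = 2

def capacity(year, base_year=1965):
    """Relative capacity, normalized to 1 at the base year."""
    return 2 ** ((year - base_year) / DOUBLING_YEARS)

# Absolute gain over each period.
gain_1965_to_2012 = capacity(2012) - capacity(1965)
gain_2012_to_2014 = capacity(2014) - capacity(2012)

print(f"{gain_1965_to_2012:,.0f}")  # ~11.9 million
print(f"{gain_2012_to_2014:,.0f}")  # ~11.9 million -- one doubling matches all prior progress
```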
Back in the late '60s, I worked at a place that had an IBM mainframe, with a big refrigerator-sized expansion that contained some add-on memory: an entire megabyte. They got it on sale for less than a million dollars (not much less, but less). What's that, about a factor of 100 million in cost, and probably a thousandfold increase in bandwidth? It's mindblowing to contemplate.
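For what it's worth, the arithmetic there roughly checks out. A back-of-the-envelope sketch (the modern price per megabyte of RAM is my rough assumption, not a figure from the comment):

```python
# Back-of-the-envelope check on the "factor of 100 million" cost claim.
cost_per_mb_late_60s = 1_000_000.0  # ~$1M for that 1 MB expansion (from the comment)
cost_per_mb_today = 0.005           # rough assumption: ~$5/GB commodity DRAM

factor = cost_per_mb_late_60s / cost_per_mb_today
print(f"{factor:,.0f}")  # 200,000,000 -- same order of magnitude as "100 million"
```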
Video and games are the only consumer tasks (that I can think of) that still drive consumer technology forward. Pretty much everything else already runs fast on previous-gen hardware.
When it comes to video, even my wife's slow netbook can play 1080p MKVs without stuttering, and current-gen hardware can already play and record 4K.
So it looks like games will drive the hardware improvements.
At the moment, maybe, but look at the recent post about a browser plugin that does automatic background OCR on every image (the sketch below gives a feel for the idea).
That kind of "processing the ambient environment all the time, just in case you might need some part of it" is going to thrive when computing power allows it.
Permanent background face recognition, just in case there's a face in any picture. Permanent background face identity checking just in case you know them. Permanent audio speech recognition. Permanent searching of all sorts of things to find contextually relevant information. Permanent local analysis trying to work out "what's going on" and "what mood are you in" and "what are you working on" and "is now a bad time to interrupt?".
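To make that concrete, here's a minimal sketch of the always-on, "just in case" pattern applied to OCR. The use of pytesseract with a thread-based worker queue is my assumption for illustration; the plugin mentioned above may work entirely differently:

```python
# Minimal sketch: OCR every image we encounter in the background, just in
# case the text is wanted later. Assumes Pillow and pytesseract (a wrapper
# around the Tesseract OCR engine) are installed.
import queue
import threading

import pytesseract
from PIL import Image

ocr_results = {}            # image path -> recognized text
work_queue = queue.Queue()  # images waiting to be processed

def ocr_worker():
    """Drain the queue forever, OCRing each image as it arrives."""
    while True:
        path = work_queue.get()
        try:
            ocr_results[path] = pytesseract.image_to_string(Image.open(path))
        except Exception:
            pass            # unreadable image or OCR failure; just skip it
        finally:
            work_queue.task_done()

# A single daemon thread chews through images whenever cycles are available.
threading.Thread(target=ocr_worker, daemon=True).start()

def on_image_seen(path):
    """Call whenever a new image shows up; results appear eventually."""
    work_queue.put(path)
```

With cheap enough compute, the same shape of loop works for face detection, speech recognition, or any of the other "permanent background" tasks above.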