Two weeks ago, at USENIX Security, I banged on a whole lot about the implications of cheap bandwidth and cheap data storage, by way of lifelogging using devices descended from today's smartphones.
But I am currently thinking that I over-narrowed my focus.
Here's the thing: let us postulate that by 2021 we will have hit the buffers of current microlithography techniques on CMOS -- say at a resolution of 5nm (compared to today's 22nm process). (Below 10nm, our integrated circuits experience interesting quantum effects due to electron tunnelling, not necessarily in a good way.) At that point we're well into the realm of nanolithography. Today's Intel Westmere Xeon server CPU has on the order of 5 million transistors per square millimetre (on a 512mm² die) using a 32nm production process; my back-of-the-envelope calculation suggests 80 million transistors per mm² is likely by the time we get to 5-6nm resolution, giving full-sized chips with up to 40 billion transistors.
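(If you want to sanity-check that back-of-the-envelope figure, the arithmetic runs roughly as sketched below. The 0.5 derating factor is an illustrative assumption to cover non-ideal scaling of interconnect, caches, and so on; it isn't a process roadmap.)

```python
# Rough back-of-the-envelope transistor density scaling.
# Assumptions: ideal area scaling with feature size, plus a ~0.5
# derating factor (my guess, not a foundry figure).

current_node_nm = 32          # today's Westmere-class production process
future_node_nm  = 5.5         # somewhere in the 5-6nm range
current_density = 5e6         # transistors per mm^2 on a Westmere-class Xeon
die_area_mm2    = 512         # full-sized server die

ideal_scaling = (current_node_nm / future_node_nm) ** 2   # ~34x if everything shrank perfectly
derating      = 0.5                                        # assumed shortfall from ideal scaling

future_density      = current_density * ideal_scaling * derating
transistors_per_die = future_density * die_area_mm2

print(f"~{future_density / 1e6:.0f} million transistors per mm^2")   # ~85 million
print(f"~{transistors_per_die / 1e9:.0f} billion per full-sized die") # ~43 billion
```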
What applications are going to hit mass consumer adoption in the wake of us reaching a point where a first-rank CPU of some 40 billion transistors (equal to, say, 16 × 10-core i7s) costs US $250, and low-power CPUs (an n'th-generation ARM descendant with, say, 2.6 billion transistors -- a thousand times the component count of today's Cortex A-9 ARM architecture) can deliver the clout of a 10-core i7 on a TDP of around 10mW for a component cost of around $1-2?
Years ago, a couple of eminent computer scientists (if I remember the story correctly one of them was Danny Hillis; I forget who the other was) were discussing trends in chip production around 1980, and one of them objected to the other's extrapolation with, "But there's no market for such cheap chips! What are you going to do, embed them in door handles?" And five years later, checking into a hotel, he suddenly realized that he was using a magstripe card to open his hotel room door because there was indeed a microprocessor in the door handle.
But it doesn't take much in the way of embedded logic to operate a magstripe reader and a deadbolt. So what are the doorhandle applications that become practical when low-cost embedded devices are as powerful as today's high end servers?
One trivial possibility is widespread adoption of biometric authentication based on mixed parameters that take quite a lot of processing: for example, that hypothetical hotel room door might open for you by recognizing your facial bone structure and gait pattern as you approach. Similarly, your car won't have a key; it will "simply" recognize you by your face, your voice, and more subtle cues such as your pressure distribution as you sit in the driver's seat.
But that's a gimmick. By which I mean yes, it's convenient, but it's not a game-changer: we already have ways of achieving these objectives (hotel room keys — or magstripe cards — and car keys with immobilizer chips). It doesn't fundamentally change the way we live, in the way that, say, mobile phones did or lifeloggers would, by bringing about basic behavioural changes.
What are the consequences of powerful microprocessors getting really ridiculously cheap — applications that just aren't practical today? Things like the library digitizer from Vernor Vinge's "Rainbows End" (which I shall not describe, because it's both a spoiler for the book and a thing of horror to bibliophiles), or infinite focal depth cameras, or giving your lifelogger real-time ubiquitous text recognition (as in, everything textual in your field of vision is scanned, digitized, and indexed immediately). What am I missing that isn't possible today and doesn't substitute for an existing process or technique? Alternative formulation: if spimes are artefacts which are the physical instantiation of an entity with a trackable history on the internet, what happens when spimes acquire enough on-board processing power to act as the container for their own virtual existence?