Back to: How I got here in the end, part nine: the little start-up that could | Forward to: How I got here in the end, part ten: head-first into the Singularity

Hiatus ...

There are a couple more chunks of the tech sector autobiography still to come. However, writing it is turning out to be a time-consuming process (17,000 words so far — about a fifth of a novel), and I'm off for an extended weekend in London on Thursday. (If I'd known about the ghastly heat wave when I booked the trip I'd have picked another week; as it is, I'm just glad I picked a hotel with air conditioning.) To add to the fun, when I get home I'm expecting to trip over the copy-edits to "The Trade of Queens", which will need turning around ASAP (they're already overdue, having been held up in production). Oh, and some time this year I've got to write the sequel to "Halting State", and doubtless I'm going to be interrupted during that process by the copy edits to the third Laundry novel, "The Fuller Memorandum", which I delivered to Ace and Orbit a few days ago, to say nothing of going to the Worldcon in Montreal and the Danish national SF convention Fantasticon in Copenhagen, where I'm guest of honour this year.

Which is by way of saying: I'm going to try to squeeze out one more article before I head off, but then there'll be a bit of a gap, and I expect to be too busy to blog at this kind of length and frequency in July and August.

Meanwhile, from the HALTING STATE department:

* China bans Gold Farming

* What a 2009 13-year-old makes of a Sony Walkman

* How badly can you get it wrong? Forbes magazine, circa 2000, took this stab at predicting where computers would be by 2010: it's so full of fail that it's funny, but the reason why it's full of fail is a fertile field for meditation.



Great Forbes link, thanks, gave me a good chuckle.

Whenever I read any forward-looking statements about technology I get the same feeling. It reminds me that earlier this year (or late last) the BBC ran a story with someone insisting in all seriousness that in five years' time no one would be using a mouse with their computer, and no computer would come with one.

That will probably happen about the same time I get my flying car. (Honest, we'll all be flying them in 20 years' time; this time it's for real.)

Btw, the bio posts have been very entertaining, but I'm struggling to find time to read them, so I've no idea how you've had time to write them with your schedule. :)


Hi - I purchased a copy of 'Halting State' today, looking forward to reading it :) I hope you don't mind me stumbling across your website.


"17,000 words so far — about a fifth of a novel..."

Have you considered publication? Your autobiography might be quite saleable, appealing to a broad readership. You could add a few chapters about your personal take on life, the universe, and everything.


Ah yes, the "Every neat-sounding idea, no matter how useless or annoying, will go into commercial production and be successful" school of futurology.


I liked that Forbes article. It's so easy to be wrong that I prefer to focus on what these articles get right. From what I read, the only thing they were right about was available disk space increasing -- except that a terabyte doesn't seem so large any more. They also mention fingerprint security, which is fairly standard.

I remember seeing a film in college from the 50s about the future of cable television (when it was still experimental) and all the things you'd be able to do with it. One appliance they imagined was a sort of computer (screen + keyboard) that could connect to the cable.

The computer was hilariously gigantic, of course, and everyone in class thought it was really funny. There was one scene where the housewife was looking at the big screen, choosing a lawn sprinkler to buy. My thought was: well, they basically predicted shopping over cable internet, even if they got some of the details wrong.

That aside, why does every article written about the future of computers, from the 1950s onward, predict that future computers will turn on the lights and tell us when the milk is bad? Is it just a failure of imagination, or are magazine writers bad at checking expiration dates?


My understanding is that China hasn't banned Gold Farming; they're trying to preserve their control of the Yuan by banning people using virtual currencies outside of their virtual worlds. If they don't, then there is a real danger of the money supply in China being completely privatised.


Thanks for the update. I've truly appreciated those 'diary of a startup-geek' entries. I was 6 years younger, so I only heard/experienced the fringes of what you are talking about, but lots of things make (more) sense now. If or when you are ready to continue writing them, I'll be reading with avid attention!


Owen @2, My personal pet peeve is the undying "your main interface method with the computer is going to be spoken language" meme (hint: try to guide another human over the phone).


Oh my. That Forbes article really did make me smile. As Tassos said @8, why do these articles invariably wheel out the spoken language interface concept? Is this because everyone thinks that HAL9000 is the end-result of all our computing efforts?

Those sorts of embarrassments explain why I refuse to offer predictions on anything more distant than what I'm having for my supper. Even then I'm often wrong.

Enjoy your break in London. I can assure you that it is currently really rather sticky here. Do your best to avoid the tube and you should be okay, though.


That description of token coins for virtual money turning into a real-world medium of exchange in China sounds startlingly like Terry Pratchett's account of postage stamps becoming a circulating medium in "Going Postal" and "Making Money."


Are you, are you, are you coming to OpenTech on Saturday?

Regarding Forbes: strange, there's no mention of the SOFTWARE that will run on these incredible computers of the future. Not at all. In that sense, it reads like the 1950s. Also, hardly anything about networking or internetworking - in 2000!

I suppose if you re-shaped a netbook or a Mac Mini/appletv you'd have something roughly like a frisbee. And we do a lot of clever optical stuff, in routers and switchen.

But the real fail is that they expected the future would have companies called Nortel Networks and Lucent Technologies.


OpenTech? What's that, and where is it?


How do you dig up these things, Charlie? And I do mean dig up, wrt the Forbes article.


I was in London until just last Saturday and it was, indeed, quite a bit warm. This happens to me wherever I go. I went to Ireland in .. 2001 or thereabouts -> heatwave. Went to Canada the next year -> heatwave.

Now, 2 weeks ago, I had to go to London for work (stuff at Imperial College) and what happens: it's hot as hell, and of course the hotel (since it was paid for by my university and therefore cheap) didn't have air conditioning. I spent as little time as possible in my room, and the time I did spend there I was pretty motionless ... Hooray for pubs with free wifi! ("Duke of Kendal" on the corner of Kendal Street and Connaught Street (Westminster) as an example. Nice place, that.)


It's the joint ORG-UKUUG-No2ID-MySociety shindig that used to be Notcon/Festival of Inappropriate Technology. It's being held on Saturday at the University of London Union. The schedule is here.


@11 The Forbes article did mention pay-per-view webcasts, which isn't too far off.

It also got screens right in its less pie-in-the-sky prediction: that we would just use better versions of the screen tech they already had. Which is exactly what happened.

I think, in all, the Forbes people got it mostly correct. You don't plug your computer into your house, but you CAN. You don't use voice recognition software for everything, but you CAN. Their hard drive estimate was off by less than 6 months, and 1TB will be getting common right about 2010. And you can get tabletop displays. See Surface.

They didn't get the optoelectronics, but those got stuck in the 3-5 years stage. You know that stage: 5 years ago, "It'll be ready in 3-5 years." Today, "It'll be ready in 3-5 years." 5 years from now, same thing.


Seems to me that Forbes got some things right:

Frisbee-sized - check - Mac Mini, netbooks. Hard drive size: 1TB - check - right on time.

Also-rans - desktop screens - Microsoft has shown a prototype. We've seen them in movies since, oh, Tron in 1982? Touch-screen technology seems to make that viable.

Connecting the house to the computer. A perennial dream, mainly stymied by people probably not actually wanting the technology. The complexity of central control is hard to overcome too, better to have individual smart components.

Whizzy new technology, like the described opto-electronics hasn't happened because industry was able to keep squeezing more out of Silicon. The current whizzy tech is graphene as a substrate - we'll see where that is in 10 years.

Let's be reasonable, Charlie, you angst over political and social events overtaking your stories. Why laugh at Forbes for writing a story that looks like it is promoting some of its advertisers?


Alex: I'll believe in houses being hooked up to the personal computer when we see houses being re-wired on a five year cycle instead of a thirty-year one. (As rewiring in, say, this particular house, involves digging trenches in the plasterwork and ripping up floorboards, it doesn't happen that often.) Wireless might change things, natch, but there's still the cost to think of. One controller box per room adds up pretty fast; one per socket adds up even faster.

Oh, and I reckon computer control of household peripherals is going to get a very bad name thanks to the security headaches inherent in smart meters. It's bad enough in SCADA systems, but: hijacking your fridge to send spam? Avaunt ye!

What I'm really interested in is: why did Forbes get so much wrong? What exactly is wrong with their vision of the future? Hint: it's not the individual predictions so much as the sum of the parts ...


Most of the Forbes predictions were really just technology extrapolations. Three orders of magnitude for speed, storage (and apparently power reduction, but we can blame the optoelectronics path for that). There was little mention of the "why" - what are we using this computer for, why are people buying them. What can you really do with it that you couldn't do in 2000. I'm pretty sure there was no mention of video games....


Displays not being flat on the desk is, for most tasks, a feature not a bug.

Even back when people mostly worked with paper on desks there were pieces of furniture and accessories which attempted to tilt documents so they were more vertical for easier viewing and reaching. And many of those relied on gravity to hold things in place so could not tilt up very strongly.

Drafting tables, reading stands, book holders, etc.

The idea is permanently in prototype because it isn't very good.


Charlie. Your point about the wiring cycle is partially valid. As you know, houses in the western US are mostly timber frame and plasterboard. Rewiring is not easy, but also not that hard, compared to plaster on brick. In addition, many custom homes come pre-wired for ethernet and this is also done on remodels as it is so inexpensive. Nevertheless, smart homes are still an extremely tiny minority. The cost issue of the controllers is more important, but frankly I don't yet see the benefits to offset this.

As for appliances being hijacked, didn't the BBC do a piece on that in the form of a documentary from the future about a decade ago? One piece was hackers tormenting people by controlling their vacuum cleaners.

As to why Forbes got so much wrong, just look at how wrong even computer and popular-science magazines get these sorts of predictions. A prosaic article saying computers will be much the same but smaller, faster, cheaper won't sell many copies compared to one describing novel concepts that might happen. In other words, why beat up on Forbes? PC Magazine, Discover, Popular Mechanics, etc., etc. make the same errors, year in, year out.

I appreciate that you are suggesting that the individual components would not make sense when aggregated, again not exactly an uncommon mistake. But as I suggested, I doubt the purpose was to actually describe a computer of 2010 so much as support the companies doing the research. Forbes is well known to tailor article content to please advertisers. No chance that they wanted to sell full page space for corporate ads to those companies?


A very early (1946) story about computers of the future: "A Logic Named Joe."


@8 & @9: It seems to me that the biggest problem with voice control is the noise it would generate in the average office. I find it hard enough to concentrate with the general hubbub of office conversation (and it's even worse when sales put their music on; anyone got the PRS's email address?). I'm sure I wouldn't be alone in being unable to work if everyone was using voice to control their PCs.


Roy: voice control as a menace in cubicle farms is nothing compared to what happens in universities. Think in terms of drunken frat boys running through the library shouting: "COMPUTER! START COMMAND.EXE! FORMAT C COLON BACKSLASH! RETURN! RETURN!" (or whatever the current equivalent Micro$oftish incantation might be).


I think that Forbes mistook two very different things for the same thing. There is established, well-understood technology on the one hand, which only needs to be refined and mass-produced for economies of scale to kick in and melt prices to almost nothing.

And there is new technology that is just being understood on a fundamental level and is entering basic engineering, with lots of problems that no one expected cropping up all the time and commercial production being still pretty far off.

Now, when it comes to technology, Forbes is your typical newspaper that doesn't know what it is reporting about. And knowing what you are reporting about is kind of important if you want to tell the one from the other.

The possibility of optical computers was just being explored; the very first logic components were demonstrated on a scale of centimeters, a mere 6 orders of magnitude in one dimension above integrated circuits, meaning a density 12 orders of magnitude below those, not to mention that speed was probably at least another 4 orders of magnitude below ICs. Had they known this, they would never have made their prediction.
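The jump from "6 orders of magnitude in one dimension" to "12 orders of magnitude in density" follows because planar component density scales as the inverse square of the linear feature size. A quick sanity check of that arithmetic (the centimetre figure is from the comment; the IC feature scale is simply whatever makes its "6 orders" claim hold, so treat both as illustrative):

```python
import math

# Planar density goes as 1 / (feature size)^2, so a linear gap of
# 10^6 between optical logic and ICs implies a density gap of 10^12.
optical_feature_m = 1e-2   # demonstrated optical logic: centimetre scale
ic_feature_m = 1e-8        # IC feature scale implied by the "6 orders" claim

linear_gap = optical_feature_m / ic_feature_m
density_gap = linear_gap ** 2

print(f"linear gap:  10^{math.log10(linear_gap):.0f}")   # 10^6
print(f"density gap: 10^{math.log10(density_gap):.0f}")  # 10^12
```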

So, let's call this one unawareness of the actual technological state of the art (in contrast to the hype from people who try to sell stuff that they don't understand either, but who have hardware and software monkeys working for them who explained it to them, albeit without much success), unawareness of the development processes in research and engineering, and finally a complete lack, and refusal, of understanding of natural science by the newspapers and much of the media in general.

This lack of understanding is so complete that it is futile for them to ask experts, because a) they can't distinguish experts from people-with-an-academic-title and b) they can't hope to know what questions to ask, much less understand the answers. In the end, they write whatever they understood in a process more akin to Chinese whispers than actual reporting, amplified by the necessity of writing articles that sell newspapers with big headlines.


Charlie @ 19, Alex @ 22: The high cost and difficulty of home re-wiring in many areas is largely due to a combination of less-stringent building codes for homes than for office buildings, and design philosophies which emphasize minimizing initial cost without regard to possible future upgrades.

Particularly in timber frame and drywall (plasterboard) home construction, standard U.S. practice is to run electrical and electronic cables directly through the walls of the building. Most cable runs are not through conduit of any kind; cables are typically held in position by being stapled directly to the framing timbers. Thus, existing cables generally cannot be removed (or re-routed) without tearing out large chunks of at least some of the interior walls. With older style lath-and-plaster interior construction, and even more so with other classic construction methods, the costs and difficulties of major wiring changes are even greater.

Contrast this with current commercial office building codes and practices, which normally require at least all of the electrical cables to run in conduit (or armored cable). For equivalent complexities of layout, the cost of running wiring in conduit is typically a small multiple of the cost of home-style wiring, but still a very modest percentage of total building cost. When done right (e.g., using large enough diameter conduit, suitable cable branching designs, sufficient junction boxes and outlet boxes in the right places, etc.) this style of wiring can provide an enormous capacity for relatively low-cost modifications to the type and layout of power distribution within the building. Although generally not required by commercial building codes, also running electronic control cables in their own sets of conduit costs only a little more, and similarly provides vastly more flexibility for future modifications of those circuits.

Since modern office buildings are typically designed with an expectation that tenants (and interior layout details) will change from time to time, more often than the ownership of the building, there is in this type of commercial construction a financial motive to make the net cost of interior layout changes relatively low, even if initial construction costs are fractionally greater.

Since neither most home builders nor most new home buyers are accustomed to thinking in these terms (let alone making systems upgrade capabilities a significant part of their cost evaluations), this aspect generally gets ignored in the residential building sector. Thus, there is no meaningful financial incentive to build in this type of upgrade capability, in most residential construction.


Leroy, if I live in my apartment for another fourteen years, I will throw a bicentennial party for it. The walls are stone -- in some cases, five feet thick -- and the joists under the floorboards are lumps of oak more than six inches thick. It wasn't merely built before indoor wiring; it was built before indoor plumbing.

In case that sounds a little weird (not to mention eccentric) to you, the mean age of the UK's housing stock is 75 years. This is despite the Luftwaffe doing their best to force the pace of change in 1939-45 by knocking holes in it. (My previous apartment was a mere 120-130 years old or thereabouts. I have never lived in a building that was less than 15 years old.)

In my experience, dwellings aren't made with upgrades in mind; they're made with permanence in mind.


I agree with #3, reading your autobiographical posts has been quite entertaining, and considering the time investment that you've clearly put into it already, maybe you should consider filling it out and potentially publishing it...


Agreed -- and entirely consistent with my own observations. Although a substantial percentage of U.S. housing stock is relatively new (say, less than 50 years old) there are still a lot of inhabited older buildings. They're just not evenly distributed, nor are many of them (at first glance, from the outside) all that conspicuously different from houses a half-century or century newer.

I recall visiting one of my great-aunts several years ago, at her home about 50 miles north of New York City. The house had gone through a couple of significant upgrades since it was built some time in the first half of the 19th century: first the gaslight fixtures were installed, then a few decades later (shortly after Mr. Edison had grudgingly accepted that alternating current might actually be practical) they were fitted with adapters to hold incandescent bulbs. The light switches and fixtures appeared to be those from the original wiring job, apparently done late in the 19th century. All of them still worked just fine, so long as the user's lighting and power needs were not significantly different from those for which the system was designed.


There was a book called Tog on Software Design which came out of a project Sun had -- "Starfire" -- projecting the computer of the future (similar timeframe) from the early 1990's using tech already in the labs. The overall picture was similar to that of the Forbes article, except that it was far more focussed on "the network is the computer", so that all that you'd carry around was a key. Computing was document-centric rather than application-centric, with touchscreens making up actual desktops. I hadn't thought of it in years.

There are a number of factors which tend to make these projections badly off: one of the more obvious ones is the tendency of people to be conservative in their computing styles: once they learn how to do something (say wordprocessing, or using a WIMP interface) they stick with it. Developers, and technical writers, tend to be neophiles, so they miss that trait: systems tend to resemble their predecessors relatively closely, with few sudden jumps.

It isn't just infrastructure, or the tyranny of the installed base, which prevents radical change. It's the fact that the users will actively resist anything very different unless it delivers so many obvious benefits that the advantages are overwhelming -- even when offset against the learning curve many people have learning a new interface.


James: yup, the tyranny of the installed base sums it all up.

The iPhone, I will contend, is a breakthrough technology -- but it's not about the hardware underpinnings (which are boringly similar to those of any Windows Mobile smartphone), it's about the user interface. Apple turned the existing mobile phone paradigm on its head and demonstrated just how crap everything else on the market was, mainly by bringing superior UI design skills to the table and then strong-arming one major vendor into taking them up on it. Along the way they took a 20 year old technology idea -- multi-touch -- and finally put it in a product that (a) needed it and (b) had a huge potential market.

The rest of this stuff ...? Holographic storage? Optical memory? This is of no interest to the end users. And a voice-driven UI is just plain silly (except in situations when some sort of high-bandwidth graphical interface is impractical).


A totally voice-driven UI is silly. One for limited purposes can be a pretty convenient idea. "Call Fred" or "Directions to 112 Main Street"

Whether it's a good idea to have a chitchat with your computer in your car or not is another matter, but it's still better than a keyboard/keypad.
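The gap between a "totally voice-driven UI" and the limited kind is really a gap in grammar size: a small fixed command set can be matched reliably, where open-ended dictation can't. A minimal sketch of the dispatcher half of such a system (the patterns and handler names here are invented for illustration; a real system would sit behind a speech recogniser):

```python
import re

# Toy dispatcher for a tiny fixed command grammar, of the
# "Call Fred" / "Directions to 112 Main Street" variety.
COMMANDS = [
    (re.compile(r"^call (?P<name>\w+)$", re.I), "dial"),
    (re.compile(r"^directions to (?P<place>.+)$", re.I), "navigate"),
]

def handle(utterance: str):
    """Return (action, args) for a recognised utterance, else None."""
    for pattern, action in COMMANDS:
        m = pattern.match(utterance.strip())
        if m:
            return action, m.groupdict()
    return None  # unrecognised: fall back to the keypad

print(handle("Call Fred"))                      # ('dial', {'name': 'Fred'})
print(handle("Directions to 112 Main Street"))  # ('navigate', {'place': '112 Main Street'})
print(handle("open the pod bay doors"))         # None
```

The point of the sketch is that everything outside the grammar is rejected outright, which is exactly what makes the limited version usable and the general version hard.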


Voice-driven 'Solitaire' in a cubicle farm, how would anyone within earshot stay sane?


When I first spotted that BBC article, I was particularly drawn to this quote:

"Personally, I'm relieved I live in the digital age, with bigger choice, more functions and smaller devices. I'm relieved that the majority of technological advancement happened before I was born, as I can't imagine having to use such basic equipment every day."

It reminded me of some comments I think you made in the 'extras' for Halting State: that people find it hard to project technological (or social) progress into the future, and thus fail to recognise that today's cutting edge will look as dated as that walkman in 2039.


I remember attending a panel where the guy who worked on the interfaces in the Gates mansion (damn interesting stuff) complained bitterly about the nightmare of getting any existing SVHS deck to interface with the house, because EVERY piece of electronics short of the kids' toys had to do so. (When a gossip snippet had Martha Stewart making nasty comments about Bill Gates being so old-fashioned that he used Cat 5 in the house rather than WiFi, I decided I didn't want Martha handling my home IT, but that's another matter entirely...)


As regards the age of housing, parts of my parents' house will be 600 years old next year. It appears to have been "modernised" approximately once every two hundred years, and the electrical systems have been added to selected rooms on a fifty-year timescale, so half the house still uses round-pin plugs. Given the pace of technological change, and particularly in the UK, as Charlie pointed out, having a computer built into your house seems remarkably foolish, especially since trying to ensure ease of upgrade to unknown future technologies is a very difficult task.


Regarding the Forbes future, we were discussing the ubiquity of the IP-enabled fridge concept in ideas of "the future" at work yesterday; one of the things that turns me off all "digital home" ideas is that a couple of years ago I saw the demo house at BT Martlesham Heath Research Centre and it's exactly the same stuff I remember watching on Tomorrow's World in the 80s.

But why the fridge obsession? It's at least 40 years old, after all. I can only theorise it links up with a huge well of unconscious sexism.


I'd say the main problem with the Forbes article is that it has no understanding of, or interest in, what people actually want to do with computers. The vast majority of people have little interest in computers as such; when they think about them it's the software, the applications (Sky+ is a computer, after all), or even the websites (Google, rather than the browser).

The other problem is the total inability to understand ergonomics. Touch-typing on a virtual keyboard... not fun. Where form matters is when it provides convenience (which has a lot to do with the success of laptops) or new forms of use (iPhones).


I think part of the problem with Forbes' prediction - and others like it - is that they treat it as a discontinuity. They ignore the intermediate steps.

As to the fetish for voice command - I've used voice on my laptop - it's not worth the effort.


Well, a voice UI can be practical, but only if you pair it with a fairly intelligent AI. Without AI, it has only marginal usefulness, in situations where the user can't use their hands.


Oh, about the article on the Sony Walkman: one thing that's lacking in the modern iPod and other digital music players is the old romantic idea of giving, as a gift, a tape with a personalized soundtrack. The act of choosing the most significant songs, the time required to dub them, and then writing out the cover rigorously by hand... Nowadays I guess you could create a playlist file, maybe put the songs on a USB drive/SD card, but it's not the same thing. Even burning a CD is rapidly becoming obsolete...



About this Entry

This page contains a single entry by Charlie Stross published on June 30, 2009 12:13 PM.
