
Unpacking the Zeitgeist

future shock in a nutshell

I'm trying to work out how I'd go about explaining this news item from WOWinsider to someone thirty years ago, in 1977, and it is making my head hurt because there are too many prior assumptions nested recursively inside it to unpack easily. (Unless the person in 1977 who I'm trying to explain it to is John Brunner, who I think would get it first time.)

Okay, let's take it from the top:

There exists a vast, global data network for exchanging information between computers. It's called the internet. It's used by corporations and governments and other groups such as people who like to dress up as furry animals to keep tabs on us.

These computers aren't just big mainframes; most of them are small brightly coloured consumer items. Some of them are disguised as pocket radio telephones that play music and double as television cameras. (Yes, TV cameras the size of a pocket calculator.)

People use their personal computers for playing games. (Some people have more than one computer.) Many of the games run over this "internet" and let people play against, or with, each other in teams in imaginary cartoonish worlds where they can take on the character of mighty-thewed barbarian heroes or dress up as furry animals. (Yes, the personal computers have flat colour television screens to display data. Why do you ask?) They can also chat to each other by typing on their computer keyboards.

One of the more popular multi-person internet games is called "World of Warcraft". When you join, you start out with limited resources, and you need to collect gold and magic weapons and kill monsters and go on quests to acquire loot and gain higher levels (which come with whizzy new abilities). A bit like that new-fangled Dungeons and Dragons game everyone's talking about, except using a computer instead of dice and rule books and lead figurines. (Yes, there are several million people doing this right now. This isn't rocket science.)

Grinding your way up to higher levels is boring, so some enterprising eastern sweat-shop owners have come up with a new business scheme; they fill offices with low-paid staff sitting at computers who go on quests, acquire loot and gold, and then sell these for real-world money to impatient gamers. This practice is known as "gold farming" and is frowned upon because it takes a lot of the fun out of the game for those people who're playing it as a game.

Gold farmers need to advertise where potential customers can see them.

There is a common practice on this "internet" called spamming — sending out huge volumes of advertisements via electronic mail and other media. Because the cost of delivering electronic mail is nearly zero, and the recipient pays the fees, spammers can deluge mailboxes and send out millions of junk messages. Indeed, ninety percent of the electronic mail conveyed over the vast intercontinental data network consists of offers of pornography, drugs for erectile dysfunction, and attempts to con recipients out of their bank account details.

Advertisers in a game world annoy the players; it's a form of spamming. So the corporation who run World of Warcraft have built robot filters that destroy spam messages in chat sessions.

So ...

Being unable to stand on a soap box using a megaphone to yell "buy our gold!" one particular gold farming company decided that to get their message across, they'd create hundreds of new characters in the game — all gnomes, all identically outfitted — place them at precise locations, and drop them from a very great height, so that their splattered corpses would spell out the address of the firm's shop front on the internet.

Got that? Good!

Your question: at which step in this narrative would my 1977-era audience first say "you've got to be shitting me!" ... and when would they start moaning and holding their head in their hands?

There are thirty years' worth of future shock condensed into this one news item. And the reason I'm writing about it is that I don't think I could get away with putting such a conceptually overloaded incident into one of my novels; it would take too much set-up and require so much infodumping that many readers would lose interest. This Russian doll of a news item contains some rather scary pointers to where we're going, and a harsh warning about the difficulty of accurately portraying plausible futures in literature.

(In the meantime, just one warning: I'm going to slam the comment thread on this posting shut after no more than seven days because it is going to draw the WOWgold spammers like flies to a honeypot.)




I was 1 in 1977, so I'm imagining explaining this to my mum, who still knows as much about the Internet and computers as she did in 1977.

I think framing the WOW backstory is the issue.

I think they would ask some money related questions, then some social ones.
First, are these machines expensive? (Relatively, of course.) Can anyone have them? Do they require much education to play the game, like a degree or something similar?
Are the gnomes 'alive', are they 'hurt' in any way? Seeing as the economies are different, 'eastern sweat-shop owners' might need explaining.

I think the big question would be, has this replaced 'normal life', do most people play this, or a minority, or one particular segment of society, like white English speaking males?

Seeing as many people I know now (not all) who were adults in 1977 see WOW and related MMORPGs as stunning wastes of money, time & opportunity (and the player's life), I think the hard part would be convincing the listener that people would voluntarily spend so much time just grinding away till they got to play the game in a serious way. It might be like telling them you are only allowed to play kick-about in the park with your mates if you first run a marathon to show you are fit and committed.


Finally! Someone's explained this stuff to me in words I can understand... ;-)


I love your articles like this... it's so easy to take for granted all the spiffy gizmos and gadgets without realizing just how far we've really come.
(frog, pot of boiling water...)


This way of explaining technology (go back 30 years and explain the present) might be used to explain it to those who have been left out of it, people who have never touched a computer before. But then again it would get boring for them probably.

I like it, though!

PS: I was 10 in 1977.


As a 9yo at the time, I'd have been most baffled about the furry animals people and the erectile dysfunction...

When discussing how much the world changes, people forget that they themselves change hugely over time as they pass through the seven ages of man. You need to ask this question of 55+ people who were knowledgeable adults in 1977; the rest of us have too much difficulty differentiating how the world was then from how we were then.


I think I can hear Phil Dick laughing even now. Or is it Kurt Vonnegut?

The problem with intelligent people writing about the future is they forget how dumb most of their fellow human beings are.


I wonder what would happen if a computer user in 1977 had some sort of magical modem that connected him to the present-day internet? What would the internet look like to someone using a 1977 computer?


I don't know, have you ever read Keith Laumer's short stories from the sixties? I think he had all of this stuff. His question then would have been "what? it's not immersive? lame!"

He had a really good run of short stories for a while - I don't know if you can even get them anymore. I don't think he had any idea what the technology would look like, but he had fully immersive virtual reality, with humans as lab subjects. He had reality TV. He nearly had the Truman show, only he missed the part about the character who wasn't on TV. Or did he? It's been a long time. And of course he had the purple, carnivorous, sexy space aliens with four breasts (they just don't *make* acid like that anymore).

I don't know if you can even get any of these short stories anymore - it looks like a few of them are available on webscriptions, but I don't see most of it even in the amazon used books list. Which is too bad - if you judged him by what you see on Amazon, you'd think he was a hack.

Anyway, the point is that there were easily two or three thousand science fiction fans to whom you would have had no trouble conveying the zeitgeist of 2007 in 1977. And that point probably extrapolates to modern fandom. Trust us - we're quicker on the uptake than you imagine.


I was -1 in 1977, so it's difficult for me to imagine their mindset regarding computers. :)

I think the key would be who you were telling this to. If you told it to some Electrical Engineering grads, or folks doing homebrew computers, they'd just get excited.

If you were writing about it in Time Magazine, a deluge of confused and angry mail calling for better editorial work to keep out the "kooks" would be the result.

I think the hardest thing for people to swallow would be that there are low cost personal computers that are very powerful and cheap enough that most families have more than one. Cost would probably be the part that's hardest to believe. If they could get past that, they'd dismiss the rest about WoW as "kid's stuff".

Now, if you did it 5 years later it would be much different. They'd see it as something like TRON, most likely. The late 70s were a cusp in our worldview about computers and IT.


I think that a hail of gnomes may have been mentioned in the Book of Revelation. In a way, though, it's strangely beautiful.


This sounds like a case of my latest favorite phrase, a "Time Abyss". Unfortunately the definition only exists in a book called "The Encyclopedia of Fantasy" -- I haven't found a webified definition of it yet.

My sister summarized a Time Abyss thusly: "a moment where you're almost outside of yourself, and have a larger perception of the passing of time." Her example was googling for an image of a Nantes cathedral in France. She suddenly realized that fifteen years ago the same task would have required a great deal more effort, possibly a trip to the library.

People become accustomed to the future very quickly. One year it's "information superhighway!", and the next it's "worthless porn-drenched spam-clogged hell-hole!" At the same time, they still have utopian dreams about the next product, the next step up.

I think the moment a 1977 person would stop you would be the part where you mention personal computers and a global network, and then treat it like it's inconsequential. What they would say is, "wait a minute, millions of people have machines thousands of times better than our supercomputers, and they're all connected together... and people use them to play Dungeons and Dragons???? The future must be ruled by nerds!"


Owen @ 11:
"The future must be ruled by nerds!"

They wouldn't be far wrong. Who hasn't felt the power of the dark side running through them as they fix aunty's PC while normal people just stare in awe at your keyboard skilz?


I think all you really need to do is list the specs for the video iPod -- a device that fits in your pocket with more memory capacity than all the computers in the US at the time ... used as a portable jukebox? C'mon, surely you can come up with a better use for the technology. A good sci-fi writer would construct an entire story around the device, not give it some mundane purpose as a background deal. Sheesh.


Actually, it's a good point. There may be things lurking thirty years down the pike that we just don't have a vocabulary for yet, but as a writer you have to extrapolate and write about them in terms readers today will understand. It would be really interesting to see what kind of science fiction was being written in 2084; I have a feeling some of it would be almost unintelligible.


I'm glad I'm not the only one who plays this game. Er, not WoW (I don't play, too many people I know have become WoWZombies) -- the game of trying to explain the modern world to someone from the past, reexamining both the facts and assumptions and trying not to explode on fractally-recursive backstory.
Admittedly, I usually find myself doing this (while doing the washing-up or something equally trivial, and when my mind's burned out on more pressing issues) with someone from way earlier -- the Enlightenment, perhaps, or just around the development of flight.

I usually get bogged down around the mid-twen-cen. I try and at least get something in about Turing before it all goes pear-shaped, just for form's sake.

PS I couldn't help but predict the "infodump" Wikipedia link would point to Neal Stephenson's entry... I guess it's true that too much "suck.com"-style facetious linking is bad for the soul :P


Akiyama @7: The text-heavy internet of the early '90s would be comprehensible enough. The main thing is that it was very texty compared to information services of the 300 bps modem era. I suppose a very efficient program could be written to filter out the unintelligible content from a modern webpage, though what's left might not be much in a lot of cases. (The process would be interminable, but so was loading programs off of audio cassette tapes.)
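(A minimal sketch of the kind of strip-everything-to-text filter I mean, written in Python with its standard-library HTML parser; this is purely illustrative of the idea, not anything a 1977 machine could run, and the tag list is just a guess at what counts as "unintelligible":)

```python
from html.parser import HTMLParser

class TextOnly(HTMLParser):
    """Collect visible text from a page, skipping script/style blocks."""
    def __init__(self):
        super().__init__()
        self.skip_depth = 0   # >0 while inside a script or style element
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        # Keep only non-blank text that isn't inside script/style
        if not self.skip_depth and data.strip():
            self.chunks.append(data.strip())

def strip_to_text(html):
    """Reduce an HTML page to its visible text, one chunk per line."""
    parser = TextOnly()
    parser.feed(html)
    return "\n".join(parser.chunks)

# Hypothetical usage:
print(strip_to_text(
    "<html><head><style>body{}</style></head>"
    "<body><h1>Hello</h1><script>x=1</script><p>world</p></body></html>"
))
```

What's left after a pass like this over a modern, script-heavy page might indeed not be much, which was rather the point.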

To the questions, the "shitting me" would be no later than the portable communicator and information can opener stage -- not necessarily the concept (as Ted notes @8) so much as the "they will be cheap enough that university students and not-necessarily-rich people from LDCs will own them in large numbers" part (yet there will not be a single supersonic airplane in civilian service).

The moaning stage would likely vary a lot with the audience. There are economists who would take the "eastern gold-farming sweatshops" news and say, "Cool! Markets in everything!!" (In fact, it would say something about the level of development in countries where labor is cheap enough to make gold farming profitable.)

You'd probably need to explain that the "electronic mail" will have killed off the postal letter, that indeed the mail will consist largely of solicitations for credit cards and credit-card bills (and that many people will have several, not carry cash or use checks, and often owe the equivalent of hundreds or thousands of dollars at usurious interest rates), which will probably be disappointing.

Quantifying the volume of spam e-mail would probably do it for the moaning, if nothing else, though there would be reassurance that there were also computer programs that will automatically get rid of most of it while rarely also throwing away letters from your mom.


To be fair, if you tried to explain role playing to today's uninitiated, you'd have a hell of a challenge on your hands. I don't believe there've been enough WoW-related news bytes in the mass media to create an osmosis effect, which is how most of us find out about stuff outside our immediate circle of interest.


I think a 1977 audience would have accepted the story well enough as science fiction; Vinge didn't have all that much trouble selling a crime-infested, globe-spanning MMORPG in True Names (1981)...


I think it was 1978, when I was 12 or 13, when I first sat down at a computer and tapped out a 10-line BASIC program my math teacher had written on the blackboard. This was on a DEC PDP-11 time-sharing system, so the idea of locally-connected computers, at least, wouldn't have been too much of a stretch for my flexible young mind to take in.

But I have future-shock when I think about gold-farming and all the attendant behaviors even today. Apparently the gold farmers pay thugs to protect them from getting offed before their characters are fully ripe, and when they do sell the character, the buyer can go in-game and kill the character in a canned hunt, Cheney-style.

The technology is interesting and not entirely surprising. But the social and economic ramifications are, well, too ramified to fully appreciate when surrounded by it, much less to predict.


I would remind everyone that the Internet was invented back in the late sixties. I would also remind everyone that the Apple ][ existed back in 1977 as well. I believe there were a few networked computer games back then, text based. Graphical games existed as well. Spam did not exist, but conventional junk mail did.

Now for the average person, it would have been hard, but for me back in 1977 (when I was ten) I don't believe that it would have been that hard as I was fooling around with personal computers then.


I wonder how much people knew about Play-By-Mail games in 1977.

Start with postal chess: what the internet does is take out the postal delays. What the computers do is allow games with more than two players, and the mix of cooperation and competition that the new D&D thingy allows. (The first issue of White Dwarf has just appeared.) It's a postal game and penfriend club all in one.

But "The Adventure Game" on TV didn't start until 1980. Pity, it's what you could point at to show the general idea.

I would say that the early Eighties are the time when role-playing games had enough clout to be appearing in some ordinary shops: not just the side-street nerd-traps with trains and tanks and planes and things.

And I think you got a little diverted from the core of the story: you don't need mobile phones and flat-screen TV to explain this. I think you maybe need more space given to the idea of computer graphics.


"There's a game that people all over the world can play together. Some people like to amass lots of play money in the game and then try to sell it to other players for real money. One group of them pulled an advertising stunt where they killed off a bunch of their game characters so the pictures of their bodies spelled out the name of the group."

You don't need to explain *all* the details. Just enough to make things make sense.


I think I discovered D&D in 1977 or early 1978, back when I was 13 years old -- a bunch of us at school were doing table-top wargaming when the Basic D&D rulebook showed up in the local games shop, and then the original D&D manuals and then bits of AD&D 1st edition.

I first got my hands on a computer at school in, um, 1980? When they acquired a lab with three Apple IIs and a Systime-505 Concurrent-CP/M box. A year later I got to buy my first computer, a Sinclair ZX-81.

I first met the internet in 1985 (although I'd been hearing about it before then), first logged onto a MUD in 1986, and first got my own account (via Bradford University CS department) in 1989. I've been on the net ever since, except for a nine-month outage in 1990/91.

I first saw spam back in, um, 1994? Or was it 1993? When Canter & Siegel spewed the first ever Green Card spam all over usenet.

Explaining the news item to someone who was computer literate back in 1997 would be a no-brainer. Explaining it to someone back in 1987 -- most computer literate folks would get it. 1977 seems to me to be the cusp, as only a very few people had personal computers back then, connectivity via dialup modem at 300 baud was still pretty rare. (IIRC the WELL was up and running by then, but it was the very dawn of the BBS era -- Hayes weren't in business yet and hadn't standardized modem control codes -- and many of the mainframe networks that later glommed together into the early pre-TCP/IP internet were still being developed. Hmm ...)

The ingredients to extrapolate a future and predict this incident were all there by 1977, and indeed arguably by 1967 (if you were a serious visionary who read papers by Ivan Sutherland and were keeping an eye on ARPA's research projects), but the probability of nailing it dead centre would be quite low ... there are too many branchpoints where different design/tech choices, not all of which have obvious advantages, might divert you one way or another. (For example, why is email effectively free? Why is it possible to transfer in-game objects? Why are networks not centralized so we can simply block off spammers? And so on.)


Well, you could explain the cell-phone-with-camera as being something like a Star Trek communicator, only better. *That* people could get.

Small computers were becoming noticeable in 1976 - Altair and Imsai advertised in Scientific American, and I met an IBM cassette-based machine at a bicentennial festival at my junior college (they had it playing - what else? - Star Trek).

I still have a DOS-based machine from 1991. (I need to pull it out and run it, because it has the 5-inch floppy drive. (Copy from the 5-inch to the 3-inch, then from the 3-inch to CD/DVD. Archives. I'm glad I don't have to deal with 8-inch, because then I'd have to find another machine.))


As a lot of others have noted, the news isn't that the technology is here; it's that it never got completed. All the 60s, 70s, and 80s sci fi assumes full-immersion environments, which haven't even come close to happening outside of the occasional university or military base. Assistive HUDs are more common, but have no application in personal computing. Hardly anyone even uses voice control, despite it being at a workable stage of technology (at least as good as predictive text, which people put up with).

That sort of gap in the technology is a hallmark of good sci fi -- realizing that the stuff is built by people in societies, and they don't build what they don't want. The Iain M. Banks Culture novels are a good example; sure, ships and drones could plug straight into your brain, but they don't because it's nasty and slow in there, so voice and visual interfaces are used. In those novels again, or the Vernor Vinge book A Fire Upon The Deep, long-distance communication is constrained to text-only email instead of magical full-immersion space where the avatars meet to discuss Matters of Importance (as in Justina Robson's Natural History).

I'm also reminded of David Foster Wallace's Infinite Jest, in which video phones inspire an unintended sub-industry of "professionally interested and composed" masks to wear during calls. There are several other interesting accidents of social technology in there, but even writing in the mid-90's, the killer video of the title isn't transmitted nearly as quickly as a youtube meme travels now.


Charlie, I ran your post past my 82-year-old father. He snorted and replied, "Bullshit. Another f--kin' kid. Coulda told me that story in 1948 and I woulda f--kin' gotten it." He went on to discuss a putative early-1960s article on the social implications of miniature cameras and his own late-Seventies work on early attempts to use computers in film animation, before the conversation turned to his analysis of my experiences in the Army and just what a pain-in-the-ass it is to plan a wedding.

Now, it is very possible that my father's attempt to simulate the reaction of his 24-year-old self is an impossible enterprise. But I would also state that my own opinion is that you are greatly exaggerating the degree of future shock encapsulated in that story. The only moaning that I can see would come in when the audience realizes that the technological wonder that is the Internet is mostly used for the utterly banal purpose of separating people from their money without providing a corresponding service.

Or pornography.

Which isn't unpredictable as much as it's depressing.

In short, I must say that I am in agreement with most of the above posters on this thread. While I am very open to being persuaded otherwise, I think you hit a foul ball with this one.



And this is a relatively simple MMO story. Trying to sell the story of, say, Goonfleet would fill a reasonable-sized book on its own.

Thing is, 30 years ago...yes, it'd be a problem. 29 years ago, MUD1 was written. The modern MMO is, in essence, a graphical MUD.

For myself, I had my first real PC in 1991. 386 SX/40, 2 MB RAM, 41MB hard disk. I had a Spectrum +2 before that, but didn't use it much. It was '93 before I found out what a modem did, and '95 before I used it for more than a local BBS.


Free email would have gone unquestioned in 1977 -- every single electronic communication system researchers built had free, global, easy communications built in. Think of the "wall" command in Unix; that's a vestigial remnant of the default 70s attitude. Spam came as a real shock to a lot of people -- remember the giant flamewars about the green card spam on Usenet? The expectation of malicious communication on the network is only newly-entered into the design zeitgeist.


Noel, my gripe is not so much about telling the 2007 story to someone in 1977, as about writing a story today about events in 2037 and getting it in the ball-park.

As a conceit, it defies plausibility.


at which step in this narrative would my 1977-era audience first say "you've got to be shitting me!"

"Yes, the personal computers have flat colour television screens to display data."

Color televisions circa 1977 were big and expensive - a new one was a major event for a middle-class family and it came complete with a technician or two to roll the thing in and get it set up correctly.

Sorry for the history lesson but it's been my experience that if you have not experienced mundane stuff like this it's hard to believe.

... and when would they start moaning and holding their head in their hands?

"yes, there are several million people doing this right now. This isn't rocket science."

C'mon, Stross (says the hypothetical me in 1977). In your average high school there are - maybe - a half-dozen kids who play with computers. That's if the high school has access to a mainframe. Where are millions of people going to come from to play this jumped-up D&D game?


@7 - Akiyama
What would the internet look like to someone using a 1977 computer?

I do not think you could see much of what we think of as 'the internet': no web browser.

Email you could get.


I think a lot of the posters above are assuming that you're at least talking to a science-fiction fan. I think the average person would probably lose suspension of disbelief at "These computers aren't just big mainframes; most of them are small brightly coloured consumer items."


I generally imagine just the opposite. Someone like Isaac Asimov in the 1960s getting a vision of World of Warcraft and trying to translate it into terms he could pass on to his readers:

Lije Borkon was an ether cop. He reached for his datalink and jacked into the ethersphere. His first stop was a game world called World of Warcraft. His data link connected with the Computron 850, the only computer powerful enough to host Warcraft and its thousands of players.

The world materialized on the datalink’s screen. When Borkon saw the dead gnomes he knew he’d be putting in some overtime.


Bjorn gets it.

Bluntly, the majority of people are not on the bleeding edge. They're not even a couple of years behind it. To most people, PCs were invented some time around 1990 and the world wide web showed up around 1998-99. (My parents didn't even have colour TV until 1986 and cellphones until 2002.)

convincing the listener that people would voluntarily spend so much time just grinding away till they got to play the game in a serious way. It might be like telling them you are only allowed play kick-about in the park with your mates if you first run a marathon to show you are fit and committed.

Stephen, do you think you'd also have trouble convincing people that there are folks who are willing to commit hundreds of hours to learning how to throw, catch, and hit before they can play baseball in "a serious way"?


My Dad constructed his own colour TV from a Heathkit box in 1967 or so... he was always big into new technology, an early adopter I guess. I remember he bought a "handheld" calculator when they were hefty $75 thingies with four functions and red numbers.

Anyway, I was 14 in 1978 when I saw my first computer, a TRS-80 from Radio Shack owned by my uncle. I remember being fascinated by what it could do, but also dissatisfied. I think most people I knew then would not have had much trouble digesting the story above - most of us were thinking well this is cool, but how come the resolution is so low? Why is it so slow/crashy/complicated? Why is the hardware so big and heavy? Hrmph! And so on.

My lingering feeling of disgruntlement that computers were not yet up to snuff was only recently dispelled when I converted from PC to Mac and got their latest and greatest. *Sigh of relief*


Bjorn: I don't think most people put any thought at all into computers. There was an assumption that they were big and clunky (as in 2001 or any other SF movie). But at the same time they had robots with small computers built in that gave human-level or above-human-level intelligence (Star Wars, Buck Rogers, Forbidden Planet).

Often the same source would have computers that were both room sized monstrosities and tiny devices that can run a human-analog.

Bluntly, most people in 1977 had no idea what a computer really was or what it was capable of.


My family got our first computer in 1984, an Apple IIc with a 12" monochrome monitor. My dad tended to upgrade every few years though, so that by 1991 I got my first PC - a beige box 286 handed down from my dad.

So for me, it's a pretty much built in part of my life that computers are there and they get more powerful every year. What I would find hard to believe is if someone from 2037 came and told me that computers weren't much more powerful than they are now and no one used them much but that most of our automation and IT was replaced by engineered monkeys.


Thanks for making me feel awful young for a change. ;-) In 1977 I still had 6 years of blissful nonexistence ahead of me.

The point in the story that I would find - with a detached mind - most inconsistent would be using keyboards to chat with each other.

With all that great technology, why don't they just *talk* to each other?

Your explanation would be too much of a stretch for me. "Oh, well, they could, but for the most part it is too much of a hassle." "Why?" "People couldn't quite agree on just one standard that would make voice transmission easy to implement for everyone. And anyway, people don't like to have their voice heard by all the other guys." No, that would be too much of a stretch for me.


Charlie@28: ah, now I understand. Still, for the uninitiated --- I may be the only non-fan who regularly reads your blog --- it was entirely unclear that you were taking the point of view of a man from 1977 being asked to predict the future, rather than the POV of a man from 1977 trying to understand an account of the future.

The comment @32, with which you agreed @34, implies the former, not the latter.

Few people in 1977 would have turned around and said, "I don't believe that computers will ever be that small," which is what Bjorn and you imply in your comments. Susan Bridges McKay is right: your average Joe didn't know enough about how computers worked to demonstrate any such disbelief. They might not have been able to predict that computers would become pocket size, and they certainly would not have been able to predict the consequences that would flow from that, but you are both quite wrong about the ability of the average person in 1977 to believe that they would grow that small, or fail to appreciate the attractiveness of the videogames and portable phones that such shrinkage would make possible.

To make this more concrete, an average person would be a Puerto Rican immigrant living in Brooklyn and working as a runner on Wall Street or as an auto mechanic.

While Charlie's correct that most people like to avoid sharp blood-drawing edges, the implied inference that they cannot therefore imagine or understand future technological change is both unwarranted and contrary to my own experience.

I'm not sure that either of you intended to disparage the imaginative capacity of the average person --- I'm pretty sure, in fact, that Charlie didn't --- but that is what you both did.

Brian@30: you exaggerate quite a bit. A technician to install? A major event for a middle-class family? The average price of a color television in 1977 was $1,073 in 2006 dollars. Pricey, yes, but we shouldn't exaggerate. I watched cartoons on a color television in 1977, and my family income at the time came from AFDC.

If you're not American I completely withdraw the above comment, with appropriate contrition.


I was a Dungeons & Dragons player in 1977, was already aware of power gamers, and had also written computer programs (games) on networked computers (OK, it was a teletype with a 30bps link to the local university, but the principle is the same). I think the only thing that would have been startling to me was the "people who like to dress up as furry animals" - I would strongly suspect you of making that bit up.

Other than that I think I might have been disappointed there were no significant scientific breakthroughs in the scenario (such as holographic displays which were common enough SF). In fact that might have made me suspicious of the overall story.


Noel Maurer @ 40

Brian@30: you exaggerate quite a bit. A technician to install? A major event for a middle-class family? The average price of a color television in 1977 was $1,073 in 2006 dollars. Pricey, yes, but we shouldn't exaggerate. I watched cartoons on a color television in 1977, and my family income at the time came from AFDC.

I grew up in Oklahoma, my parents are middle-class. Spending $1,000 at a throw was a big deal, at least for my dad (who is an old-school geek) and my mom (who is frugal). They spent a few weeks shopping for the thing, and this involved some schlepping back and forth between retailers. The technicians were the guys who brought it in from the truck and plugged it in to the antenna.

It seemed like a major event - big truck in the drive, strange guys rolling in this huge console.

This was 1978, and it replaced a set they had bought when they were first married, a decade plus before.

I exaggerated a bit, yes. My apologies.

And I didn't get to watch cartoons on it, or not very often. Cartoons - well kids TV viewing - was only for the portable B+W TV in the rumpus room.


Noel: I'm reminded of a story, circa 1985, of an American academic visiting Moscow, who went to a dinner party where one of the other guests took him to task for a magazine article (it may not have been his) saying that Russians didn't have personal computers. "Of course we have personal computers in the USSR!" he exclaimed. "Look! Here is my personal computer!" ... At which point he shoved his pocket calculator under his guest's nose.

Rewind to 1981 and my first school vacation job, in a pharmaceutical laboratory. They had a programmable desktop computer for running statistical regression tests on samples they were working on. The machine -- a Wang lab box of some description -- had 1024 bytes of programmable memory (in machine code), a paper tape printer, a magnetic card reader for programs, and a display consisting of 24 nixie tubes. It had cost them £2500 about four years earlier. It already looked long in the tooth to me (having seen Commodore PETs and Apple IIs by then), but until about 1980 it was what most people associated with the term "personal computer" -- a machine for doing mathematical and statistical analysis, like a jumped-up desktop calculator.


When I took a sliderule to school in 1979-82ish, rather than an electronic calculator, I _knew_ I was being retro.

I think that the WoW VR stuff is probably easy to explain to yr '77er. The difficult bit is why it isn't all organised like Prestel.


I had my first brush with computers in the mid-70s when our school got a terminal connected to the mainframe at Sheffield University. This thing filled a small room (and it was only the terminal, remember) and didn't even have a monitor - it communicated using an IBM golfball printer the size of a steamer trunk. And now I think about it, even then all we used it for was playing games.
Incidentally, I have no idea why I said 2084 earlier. I meant 2037.


I was born in 1973. When I played my first computer game in 1979/1980 I started asking questions and dreaming of things that would turn out to be the evolution of what Role Playing Gaming would be.

See, I played D&D too and waited for the first computer versions of D&D to come out. But when I played them, they lacked the fun of the games with my friends. I knew someday someone was going to make that possible - when modems became mass-marketed, I knew it was coming.

I played The Realm. I played AC. I played EQ. I played them all into the evolution of what is now WoW. And it's only going to get better or worse depending on how you view it.

Eventually, it won't be a nerd-fad. Eventually, it will be something more immersive. Eventually, it will be more portable. Eventually....

Well, eventually, if we don't blow ourselves up, I have every reason to believe that either through intra-gaming (biotech based) or extra-gaming (hologram based) we will get more and more immersed. Look at the Wii. Look at games like DDR and Guitar Hero. These are the first pioneers into those arenas.

It's your imagination made reality. What will an article in 30 years from NOW read like?


So THAT'S what George W. Bush means when he says that Coalition troops are sending a message to terrorists everywhere. He's trying to ensure that "splattered corpses would spell out" a message, and he just hasn't quite perfected his calligraphy.

On the other hand, that's what Sunni and Shiite and other groups are also trying to do.

As the mathematician Serge Lang puts it:

"Papers should also be required to be neat and legible. They should not look as if a stoned fly has just crawled out of an inkwell."

Tigris, Tigris, Burning Bright
Jonathan Vos Post

Come sail with me
to the coast of the Pentagon
our epiphany
whatever it meant is gone

Sail across the sea
to the coast of the Pentagon
escape to ecstasy
mounting the mastodon

Come sail away
to the coast of the Pentagon
beyond the Beltway
failure's not in my lexicon

The tide is going out
to the coast of the Pentagon
ride the waterspout
to the kingdom of Prester John

There's a chance we're lost
on the coast of the Pentagon
pentecost, holocaust
crossing the Rubicon

Sail with me
sick of the cyclotron
weep with me
on the coast of the Pentagon
fallen is Babylon
whatever it meant is gone

4 July 2007


(The Well was founded in 1985, but the software it ran (& still runs) was based on other social-software systems from the 70s.)

Now this has me thinking of Life on Mars again - since I was born in 1977, I don't really remember anything until the 80s were well underway. But to me, when I look at media from (or set in) the 70s, it seems like the 70s were more like the 60s than they were like the 80s, at least in terms of technological progress apparent to the average person. Maybe someone who lived through them both can say better than I can what it was like, though.

On the other hand, miniaturization was already a well-known trend by 1977, the global telephone (and telex) network existed and were quite visible, and things like Star Trek had probably already put expectations for miniature communications devices into the popular imagination. So I don't think - in broad terms - the technology would be such a surprise. TVs were giant appliances then, but I don't think anyone doubted that they were going to get smaller and cheaper and better.

And by 1982 Tron was out, and that wasn't a million miles from WoW. Now, the gold farmers, and other things related to virtual assets, that might have taken a little longer to explain, but for anyone who'd played D&D I think it'd come naturally.


As a number of people have pointed out, the answer largely depends on who you are talking to. I wonder how many people today would have trouble understanding this? Now extrapolate that forward thirty years...

Sitting here in the 21st century, the social implications seem more unbelievable than the technological gadgets. Gosh-whiz gizmos would be fairly comprehensible, I think. The social aspects would not be.


Regarding the reason we aren't using voice technology:

Text is better. Well, it's better for most cases of remote communication involving computers. One reason it's taking so long for telephone and computer network technology to integrate effectively is that the two paradigms of communication serve distinctly different purposes and solve distinctly different problems.

The reason something like Teamspeak is better than text for dungeon crawls in WoW is that communication is mostly emoting -- you exclaim "I'm getting creamed! Heal me!" -- while much of Internet communication outside of gaming and similarly "unimportant" (speaking as a WoW player here) leisure/entertainment venues has a much higher signal-to-noise ratio. People use the Internet, for instance, to convey business information, argue about politics (with all the attendant sourcing and supporting argumentation that implies), and provide help with gnarly technical problems. Such purposes are far more suited to text than voice communication.

Imagine trying to write a search utility for skimming through voice archives to find a specific factoid, heuristically doing fuzzy comparisons of sound waveforms to word comparison templates (which would themselves have to be a form of algorithm). Now compare that with the relative ease of writing a program that allows you to do plain-text searches in your text-based communication archives (which are, by the way, less than one hundredth the size of equivalent voice archives), using standardized character sets that always look the same to your search software regardless of the font used for display. It's even easier to code your way around typos than variations in tonal patterns and other hairy problems of voice archive searching.
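To make that comparison concrete: the text side really is only a few lines of code. This is a hypothetical sketch (nothing like it appears in the comment above) of a plain-text search over a chat archive, case-insensitive, using nothing but a standard regex library:

```python
import re

def search_log(lines, query):
    """Return (line_number, line) pairs whose text contains the query,
    case-insensitively. This is essentially the whole 'search utility'
    for a text archive -- no waveform matching required."""
    pattern = re.compile(re.escape(query), re.IGNORECASE)
    return [(n, line) for n, line in enumerate(lines, 1) if pattern.search(line)]

# A toy chat archive, the kind of log text-based communication leaves behind.
log = [
    "<tank> pulling the next group",
    "<healer> oom, wait a sec",
    "<mage> I'm getting creamed! Heal me!",
]

print(search_log(log, "heal"))
```

The equivalent over voice archives would need speech recognition or fuzzy waveform matching before any search could even begin, which is the asymmetry the paragraph above is pointing at.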

That's not all, though. Even given "perfect" searchability for voice, text has its advantages. For instance, you don't say "um" and "err" unless you want to, you can correct glaring errors in what you say before hitting "send", you don't look like some weird obsessive that doesn't understand etiquette if you pause in the middle of composing a reply to research something, it's easier when reading to skim back a sentence or two if you're trying to figure out what someone said, interruption isn't really an issue the way it is with voice, and so on. More people can have a conversation in the same channel, room, whatever, without losing all ability to communicate effectively: compare a busy IRC channel with the idea of thirty people (a fraction of what constitutes "busy" in IRC) all talking on the same telephone line (or in the same Yahoo! voice chat) at the same time.

Much of what makes voice communication valuable is actually lost in transit when done remotely. Context is lost; facial expressions and gestures are lost entirely, or rendered problematic by low-quality webcams; et cetera. It's easier to attach context hints to text-based communication with full knowledge of what's missing than to voice communication. While a one-on-one discussion may be higher-bandwidth in some ways with voice (in terms of conveying information to the person at the other end), in most others its bandwidth is far lower. For quick alerts, a sound is more grabbing and can make your point faster (thus the value of Teamspeak in WoW), but for longer discussions -- actual "conversations" -- you're much better off with text, generally.

I could go on in this vein for hours. The point, ultimately, is that people often find it difficult to predict the future with regard to technology and its uses not because the technology is unpredictable, but because people try to predict based on what technology they think is easily achieved as an expected development path. They don't think as much about what people will want out of it, or how useful it actually is. Text is, for the most part, still far more useful for network communications than voice -- thus, it's still king, regardless of the fact that we had the basis of networked voice communication in the days of Alexander Graham Bell.


I remember playing on my Spectrum in the mid 80's. You'd insert a tape cassette and then sit there for about 30 minutes as horizontal bars of lurid colors danced up and down your TV screen while the speakers shrieked at you. Then, finally, you would get to play Frogger for a bit.

30 minutes of maniacal banshee shrieking to play Frogger, and I thought it was the very bleeding edge of computer cool. If you had told me about a virtual realm depicted with gorgeous graphics, a world with its own economy, millions of players across the globe playing at any time, all the time, I would have scoffed.

And then gone back to watching my shrieking TV set for another 20 minutes.


You forgot that 30 years ago, "erectile dysfunction" was called "impotence" and all the "drugs" for it were called "Spanish fly".


This might be easier to explain from the other direction-- "A free-enterprise organization resorted to unusual advertising to attempt to break the gold-trade monopoly of another organization. But it all happened on computers."
Just replace Spain with Blizzard and Sir Francis Drake with some kid in Myanmar.



I'd point out that in WoW the raiding guilds do tend to use voice. Regardless, the emphasis is largely on "learning" encounters - that is, knowing how to react on a rote basis to the things a boss does, without the need for more coordination than calling off stages.

If you take a MMO with far more fluid content like Eve-Online, voice in combat tends to be far more critical and used in different ways, to react to unexpected changes because of the ability to hear and react in-game without taking your focus off the controls.

And most of the long-time guilds I've been in, for various games, do indeed use voice as a chat medium. When you're playing, especially when you're playing different games, it can still forge a community.

Voice is used in multiple ways :)


Apotheon gets it right, I think, and in a similar vein, I predict the iPhone will be a flop. Yes, it would be nice if all the features worked as advertised, but the fact of the matter is that for humans, a certain minimum screen size is, well, not necessary, perhaps, but desired. The same with the keyboard feature. The work-around is clever and incites my admiration, but in the end . . . it's just not good enough. Anyone have any good info on the latest in chording keyboards?

And another retrodiction: these same denizens of 1977 after hearing all of that WoW stuff as background would find it hard to believe that videophones were never a serious retail item.


Philip@51, c'mon. I don't believe that you would have scoffed, any more than I would have scoffed. I believe that you would have been jealous. At least I hope so ... you're talking about the 1980s, for crying out loud.

But that brings up a question that will be ignored, I'm sure, because it's such a village idiot question. So here goes.

What exactly are the social aspects of computer technology that we're talking about here?

Seriously. Sure, it's easier for fuzzies to find each other ... but when I go looking around for mainstream social changes that I can unambiguously pin to the Internet, I have trouble finding them.

Campaign finance, maybe. But other than that?

Myspace, cool and all that, but what are 17-year-olds really doing differently? Terrorism, you'd think, easier to network, but have we seen anything not entirely explicable or imaginable in the pre-Internet days? Commerce? Sure, but what's not glorified mail-order? Pornography? Well, yes, but it wasn't exactly unavailable before the Web, and have we really seen any changes in sexual behavior as a result?

There have been lots of behind-the-scenes changes that have added up to a productivity revolution in myriad industries, of course, but that's not what I'm asking about. I'm asking about social changes. I don't mean anecdotes and I don't mean subcultures (which, believe it or not, existed about as much in America circa 1990 as they do now) --- I mean major quantifiable social changes that can truly be pinned on computer technology. What's out there?


Charlie, I think trying to write something _realistic_ set in 2037, particularly if you're dealing with cutting-edge tech, is somewhere between impossible and hubristic.

It can't be done and you'll drive yourself mad trying, and no matter how well you do it it'll still look silly in 2037, or even 2027. That's why I don't write in-this-timeline near-future SF.

OTOH, if you avoid the high-tech stuff, it wouldn't be too hard -- most everyday technology (subways, elevators, cars, bathtubs and showers, airplanes, guns) won't be all that different from what we have today, just as 1977's tech isn't much different from 2007's at that level.

What'll really getcha is the politics and social trends. I mean, you can do _some_ large-scale prediction with a reasonable degree of confidence -- I'll bet any money you care to name that China will be in deep do-do in 2037, and Japan and Italy will be even worse -- but the details... gevalt.

1977... I bought my first computer with the advance for selling my first book, in the early 1980's. Even early word processing beat manual typewriters all to hell; cut and paste meant _cut and paste_.

I think I'd have followed a description of WoW fairly easily. I'd have thought "wow, that would be boring compared to reading a book or structured daydreaming", but then, I think that now... 8-). I tried D&D at the time, but it was just too limited compared to what I could do in my own head.


I think War Games might also be important - while it is not about VR or RPGs, it shows a teenager owning a PC (OK, a home computer) and doing things on a computer network. Both movies are from the early 1980s, probably no accident.

At least that was when I learned about such things. Only three years later, owning a C64 myself, I had no problems at all to imagine something like WoW.
And a friend of mine was using his VIC20 (3.5 kb RAM) for online games.


If you look at the hierarchy of boggle on the part of the person you're trying to explain this stuff to, I think you'll find that the tech is relatively easy, especially if you're talking to a technical professional or an SF reader. But the social changes and the implications for the way the average life is lived are much harder to explain and much less believable because that's where most of the visible change has taken place. The number of people who own more than one computer is much more evident to the proverbial observer from Mars than the number of electronic devices on a chip in the current IC generation.

And I think a lot of people are overestimating the imaginative reach of the average person, even the average engineer. In 1977 I was working in Silicon Valley. I had just spent some time at one of the companies that invented the personal computer, and was working at Intel, which invented and developed the technology that made the internet at its current scale possible. I think I could see some of what was happening then (it helped that I had read "Shockwave Rider"); it's one of the reasons I chose to be working there: I wanted to help create it. And I'd been playing with computer graphics for several years at that point. But from the conversations I had with my colleagues at the time, I think I can be pretty confident that very few of them, even there at the heart of the engine of change, could even get their heads around the kind of technical and social changes we're talking about here; and few of those who could would believe in them ever happening.

In terms of writing SF, Charlie, I'd say that you can fulfill your obligation to your story and your reader if you can see past the obvious application of new technology to the way things are done now, and produce some reasonable speculation about things that aren't done now, but could be. Not as prediction, but as a way to make concrete the magnitude of the kinds of change we face. Even today, a lot of people who give lip service to the notion of exponential change don't really believe or understand that this means that the way we live will be drastically different.


Noel @56: the key behavioural changes are subtle.

One example: geographically dispersed interest groups can network now. If you're into some weird hobby that only one in half a million people share, the odds are good that you're the only guy in your town who's into it. There might be a small magazine for it in an area the size of the USA, and monthly meet-ups in a bar in New York, but that's about it. But with the internet, you can seek out your fellow freaks and communicate. So we find that marginal, weird, outre interests are suddenly out there and dragged into public view, because hundreds of people suddenly get together and start talking about them for the first time ever.

And then there's the delocalization effect of mobile phones. Mobile phones don't link addresses, they link people. This is clearest when you look at young folks who've grown up with them: they do ad-hoc social networking, agreeing to converge on a given location without prior planning, because they can.

David Brin, in Earth, predicted the near-death of the snail mail postal service in, um, 1994? ... But I thought he was going a bit far. Now I turn around and discover that I haven't actually written anyone a letter on paper (other than a covering note for business purposes) in over a decade.

These things creep up on us, and they're like the (apocryphal) frog-boiling process: you don't notice them because they're a gradual progression.


Charlie, @60, and businesses are a long way behind people. Unless you're in a high-tech industry, they're still wedded to paper mail.

And for some things, seeing how good their security is, that's probably still the best choice.


I am seriously astonished at the posters here who assume that most people in 1977 would have expected the miniaturised & networked computers that we have now. I certainly didn't. I remember a BBC Horizon programme in the late 1970s that predicted that computers would get small, cheap and *useful* enough that everyone would have one at work - I was sceptical. And yes, I read sci-fi, I had learnt to program at school - via an acoustic coupler linked to *the* computer at our local polytechnic - and I watched Tomorrow's World, Horizon, etc.

As far as I recall, computers in sci-fi were still large, expensive things. The characters often used them, but the assumptions about the technology were completely wrong. Apart from "The Shockwave Rider", I hadn't read any book with a computing infrastructure even remotely like today's. The future then was full of flying cars and personal jet packs, not Treos and iPhones. (FWIW, I went on to study maths and computer science at Cambridge & Warwick universities.)

Charlie, looking through your article, I notice that you didn't explain "electronic mail" - I didn't encounter that until 1981, my second year at university. Also, the phrase "robot filter" wouldn't have made any sense back then; the word "robot" always meant machines. The only question I'd have about the article is whether WoW really requires all the technology you mention. After all, it is basically hosted on a few big central "computers" (= server farms).


Steve @ 57: You raise an interesting point -- for many people in 1977 technology might be the easiest thing to swallow.

But tell people in 1977 that the USSR collapsed peacefully without a fight, that many of the former Warsaw Pact nations are now part of a supranational EU with western Europe, and that Communist China is a major player in international capitalism and produces most of the world's high tech consumer products along with South Korea. Then tell them that the US's main enemies are ill-defined terrorists in multiple nations, and that some people are pushing for war with the *Islamic Republic* of Iran. Tell them that most industrial nations are starting to worry about underpopulation and that pollution is mostly a solved problem in the first world -- the big pollutant people are upset about is CO2.

And to top it off, there have been no major advancements in space travel -- we haven't been back to the moon or gone to Mars. We have a space station, but it's not that much better than Skylab; we still use the Space Shuttle and are thinking of replacing it with something based on Apollo.

See how much of that they believe. Then think what might have happened by 2037...


Dave @ 62: I think it would be possible for a game like WoW to be created even if we didn't have personal computers. Some sort of multi-million dollar mainframe running the game, with people connecting from home terminals.

The technology certainly could have gone that way, if there had been something preventing the development of personal computers. (National security concerns?)


My boss was telling me yesterday that the one area where Ireland has an advantage over, say, India or China is the ability of the Irish to understand idiomatic American English, which allows an Irish software engineer to communicate over the telephone (the first virtual environment, and one that had been around since the early 1900s, perhaps?) for business purposes with his/her American counterpart in a way the other lot couldn't. That's a social change that was utterly unpredictable from the standpoint of the Ireland of 1977, yet also strangely predictable - if you had factored into the equation not only an Irish economic advance from the conditions of the 1970s (when in some rural areas there were still landless agricultural labourers who had to sleep in barns with animals) to those of today (where a high-tech urbanised society is sustained by a brittle property boom), but also the retention of Irish cultural links with its diaspora, not only in the neighbouring island but also across the Atlantic. I'm not sure what (if anything) this might mean for the original post, but give me a moment and maybe something will come to me.


Dave Berry wrote: 'I am seriously astonished at the posters here who assume that most people in 1977 would have expected the miniaturised & networked computers that we have now....Apart from "The Shockwave Rider", I hadn't read any book with a computing infrastructure even remotely like today's.'

You should have read more. MICHAELMAS by Algis Budrys got it all in 1977 -- and Budrys wrote much of the first draft in the late 1960s. Aside from his novel's predictive aspects, Budrys was a better writer -- in terms of literary style -- than 99 percent of SF authors then or now.


I'm with those who would expect that people wouldn't have a problem with the idea of smaller computers. Miniaturization of electric gizmos in fiction goes back far beyond 1977 (look at Dick Tracy's wrist radio and the already-shrinking size of calculators), and to most people, a computer was an electric gizmo. The Apple II was introduced in 1977, and "minicomputers" had existed before that.

I doubt that explaining that people play games on them would really be THAT much more difficult, considering that 1977 was the very year that Atari released its first console, and if I recall correctly, the Apple II was a pretty nifty little gaming system in its own right.

The hardest part might be explaining online gaming, but even there, it's not that tricky. You just tell people that everybody plays a D&D-style game online, together, and that you amass money in the game in order to buy new weapons and armor for your little in-game knight. It's against the rules to buy or sell in-game money for real money, but people do it anyway. One seller pulled a stunt where a lot of the little knights died in an arrangement advertising his wares.

AndrewG at #63 called it: the hard part would be explaining geopolitics nowadays.


Andrew G,

Certainly. The ideal MMO client *is* a dumb terminal. All the computer on your side does, ideally, is the interface and graphics. As Raph Koster put it:

"The client is in the hands of the enemy"

For that matter, look at something like Runescape which is precisely that - you run it in a browser...


Re social networking, there was a 19th century magazine that in many ways operated similarly to a modern discussion board.

I think the moral of that story is not to overvalue the big trends of history. The past must be full of strange, forgotten subcultures, which didn't make it into the history books, but which prefigured trends that technological and cultural circumstances have made more viable today. Or maybe not even subcultures, just lone individuals with no outlet for their strange ideas. If you were born into 19th century London with the mind of what today would be a programmer, how would that affect your choice (if any) of work, interests, hobbies, social circles? If we could look at all the unpublished novels of the past, would we find any that are eerily prescient, and just too weird to be appreciated in their time?

Correspondingly, think of the social experiences and dynamics one could have a few hundred years ago, that have become unviable today, (for instance the experience of reading in a mostly illiterate world with few and expensive books). I imagine some of these experiences would sound as strange to us today as WOW to someone from the past, (which is where historical fiction comes in, to make the forgotten real).


Bjørn: and then there were APAs, back in the pre-BBS pre-weblog day. Been there, did that in the eighties and early nineties; some of 'em are still going.

The key point is that the message response time in an APA was typically a month, and the scope for discussion was limited -- usually to no more than fifty or sixty people, max, for logistical reasons. Printing off fifty copies of your monthly contribution and mailing them to a coordinator who'd collate them and mail them out again was hard work, especially before cheap photocopying: I harbour a number of unfond memories of the insides of a Gestetner stencil duplicator.

Basically, the investment of time and effort it took to participate, and the limitations on the scope and scale and speed of debate, meant that it was simply inaccessible to most people. A web board (like this very blog!) compares with such pre-computer communications media as a printing press compares with a scriptorium full of clerks copying books in longhand.

Sometimes adding a new technology to a pre-existing task results in a qualitative change, rather than merely a quantitative one.


For those of you who are interested in the "email from the future", Michael Swanwick recently received an email from his future self of 2107. This communication was published in a recent issue of Asimov's magazine.

Swanwick's future self didn't go into much detail about the technological advances of the future, but he did mention that Charlie was turned into a giant blue lobster in the year 2076. How ironic...!


57: Actually, I bet cars will have radically different social effects in 2037. For one thing, people won't drive them -- they'll drive themselves. This will radically alter how cars interact with urban centers, because now you don't need to put parking within a short walk of where the people want to go -- parking only needs to be within a short drive of where people want to go, because a car can drop you off where you want to go, and then drive itself to a parking spot that can be up to several miles away. This pretty much reverses most of the deurbanizing effects of the 20th century automobile.


Neel: see also HALTING STATE (when it comes out).

Interesting statistic: at peak rush hour on a weekday, about 95% of the UK's car fleet is ... parked. In fact, the load factor of the automobile industry, if compared directly to the airline industry -- or any other form of public transport -- is desperately bad.

I suspect that a lot of the attraction of the personal automobile comes from two factors, plus marketing: the two factors being (a) territoriality (people don't like sharing space with strangers) and (b) convenience (it's at your beck and call). These feed into the industry's use of aspirational values like freedom and independence as marketing tools. But imagine for a moment that you've got a magic taxi account. Want to go somewhere? A taxi will pull up and take you there within three minutes, guaranteed or your money back. Any distance, no problem -- you pay by the mile. No human driver in the front seat to listen in on your conversation, and no worries about servicing and maintaining and cleaning and refueling the vehicle: it's all part of the service.

Would you use such a service? I sure would. (But then again, I don't enjoy driving; nor do I ride a horse or fly a light plane.)


In my neck of what once were Live Oak woods, mostly chopped down, replaced by ostrich farms and orange groves and dairy farms, it would seem that cars have the vote, and humans do not. Cars and coyotes and cacti and rats and mountain lions and bears, all of whom show open contempt (or, at best, indifference) for people. But, then, Ray Bradbury wrote about that a long time ago in his short story "The Pedestrian" (1951).

I wrote myself a poem from my future, but it makes me somewhat unhappy to read now, due to it being uncomfortably accurate.

A decade or two from now, many critics will evaluate your stories for predictive accuracy, which was not (I suspect) really your point.

"If this goes on..." as Robert Heinlein wrote (Happy 100th birthday, Bob!) is a basic SF story approach. The trend is shown, not for prediction, as such, but to attract or repel, and perhaps allow us to choose utopia over dystopia.

Happy Heinlein Day!


Andrew Crystall @54:

I think you missed the fact that I pointed out MMORPGs constitute an exception to the general rule. Raids and the like, as I clearly stated, rely more on emoting than on intellectual discourse. There's little actual conversation going on -- voice is just a quick and easy way to alert others to changes in a fast-paced, highly variable situation. When you're not calling for a healer or warning about an approaching level sixty elite bearing down on the group, and when there are a great many people in a single channel or "room", text is by far the more effective option.

Quoting myself, I said: "something like Teamspeak is better than text for dungeon crawls in WoW". Thus, your statement doesn't disagree with me -- it just uses a single, limited example of where an exception is true to imply that voice is better in general.

ScentOfViolets @55:

It's interesting that you bring up chording keyboards. That and HUD-type technology are more in line with the direction things need to go if you want truly improved portability and miniaturization for general-purpose, network-capable computers. Until we can get keyboard-equivalent functionality with one hand (or using some kind of direct mental input) and a wearable "screen" of some sort, laptops are going to continue to be irreplaceable.

Well . . . there's another option, but it basically requires that everybody stop touch-typing and become far less productive in their habitual use of computers. When we all become unskilled hunt-and-peck typists, maybe then something like a stylus-based PDA interface will prove more popular for "serious" computer use, or those atrocious little miniature keyboards that just beg to be used for fat-fingering typos every third word at a rate of six words per minute.

The real win for touch-typing, and the thing we need for greater miniaturization and portability to catch on, is the fact that full-size keyboards can be used without having to stare at the input device, and without having to think about every single letter we want to input. This is the same reason that optical keyboards, really shallow-keystroke keyboards whose key action is too mushy, and keyboards miniaturized to the point where you have to very carefully pick out every single letter visually all tend to fail as replacements for a full-size keyboard. It's also the reason that, despite the best efforts of Apple in the '90s and Microsoft this century, the mouse has never replaced the keyboard and, with luck, never will. And it's the reason that "multi-touch" interfaces like the iPhone's will never replace something with real tactile feedback and a clear, standardized arrangement like a QWERTY keyboard.

I think the single biggest problem with the iPhone's design is that it cannot be efficiently operated without looking at it.


You can use stranger futures in humorous stories, I think. See Murray Leinster's 1946 "A Logic Named Joe," which got a fair amount right about the Net. (He didn't foresee how decentralized it would be, though.) Also William Tenn's 1957 "Winthrop Was Stubborn"/"Time Waits for Winthrop" in which the future includes such obviously-absurd things as women shaving their heads.


Charlie@73: I'm on record as being willing to bet large sums of money that most people will still be driving their cars manually in 2037. Let's say $10,000, adjusted for inflation. Any takers?


Avedon pointed at this set of local reactions to terrorism in Scotland.

You don't think you might have made the characters in "Halting State" a little bit wimpish?


There are still agricultural laborers in Ireland who sleep in barns with the livestock.

The difference between 1977 and 2007 is that now they're Polish.

Try telling someone in 1977 that in 2007 6% of the population of Ireland would be Slavic migrant workers. _That_ would boggle the mind.

For that matter, tell him that... oh, that in 2007 France would have a higher birth-rate than Algeria or Iran (which it does, incidentally).

Or that British troops would be fighting Pathans in Afghanistan again.

Or that apartheid would be dismantled peacefully and largely voluntarily -- _I_ wouldn't have believed that.

OTOH, some things wouldn't be surprising. E.g., the changes in gender roles we've seen since 1977 were fairly predictable then.


Charlie@73: I'm on record as being willing to bet large sums of money that most people will still be driving their cars manually in 2037. Let's say $10,000, adjusted for inflation. Any takers?

-- no, you're probably right about that.

I'd be willing to bet that the average speed of air travel will be within 10% of the 2007 level, too.


This is a great article, I love WoW. Gratuitous mass Gnomeslaughter is always funny.


Future of automobiles: a very large majority of cars in the US drive less than 60 miles a day.

60 miles is well within the range of an electric car, even with current battery technology.

The problem is that you occasionally need to be able to drive much further, and you can't keep a separate IC-engined car just for that.

That's why I think plug-in hybrids are the answer in the short to medium term.

They have a whole range of good features; you don't need the huge battery pack you do for an attempt at a serious all-electric car; they charge mostly at night (which evens out load on power stations and reduces costs); you don't have to kick in the IC engine for most of your ordinary daily travel; but you can seamlessly transition to ordinary hybrid operation when you need to.

And they don't need an expensive new infrastructure; they use the power lines and system of liquid fuel distribution and marketing that we already have. No tech breakthroughs needed, either, and mass-production models would be only slightly more expensive than a conventional hybrid.

Liquid-fuel MPG would be very high for an ordinary urban car -- in the 250 mpg range, ten times the current average.

Make the IC engine flex-fuel, in the Brazilian manner, and you can painlessly feed in ethanol as cellulosic conversion becomes available.

That would mean we could have ten times the number of cars with the current input of liquid fuel, or the same number of cars with only one tenth the petroleum.
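The arithmetic above can be sanity-checked in a few lines. This is an illustrative sketch with made-up fleet numbers (one million cars, 12,000 miles per year each are my assumptions, not figures from the comment):

```python
# Sketch: fleet fuel consumption scales as miles driven divided by mpg,
# so a tenfold mpg improvement supports tenfold the cars (or miles)
# on the same fuel. The fleet numbers below are made up for illustration.

def fleet_fuel_gallons(cars, miles_per_car, mpg):
    """Total gallons burned by a fleet in a year."""
    return cars * miles_per_car / mpg

baseline = fleet_fuel_gallons(cars=1_000_000, miles_per_car=12_000, mpg=25)
plug_in = fleet_fuel_gallons(cars=1_000_000, miles_per_car=12_000, mpg=250)

print(plug_in / baseline)  # 0.1 -- one tenth the petroleum for the same fleet
```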


I think there's still some stuff that needs explaining; for example, the concept of URLs. Obviously you attempted to simplify the idea by saying "the store front's address," but that might just lead to further confusion, as they might draw some unforeseen parallel between the Internet and a physical store's address. You might also need to mention something about the input devices used on personal computers and how they allow characters in the game world to be manipulated directly and such.


Unfortunately Stross' assumptions are 100% dead wrong. Like the rest of the kooks and cranks and crackpots who believe in the mythical Singularity, Stross glances at the recent past and ill-advisedly projects the technological and social changes straight into the future. That's not only desperately foolish, it's fundamentally ignorant. No exponential tech curve continues for long, and instead of exponential curves, we always get Gompertz curves.
Let's take some analogies with other predictions made in other areas to see how foolishly false and amazingly ignorant this singularitarian projection is.
[1] From 1900 to 1969, we went from the fastest mode of human transportation being a train (roughly 100 mph) to the fastest mode of human transportation being a Saturn V rocket (25,000 mph). Projecting that exponential growth in speed forward from 1970 to 2040, we arrive at the foolishly ignorant conclusion that the fastest mode of human transportation today should be 250*25,000 mph, or 6.25 million miles per hour, or 0.93 percent of the speed of light. Call it 1% of the speed of light as a round number. Yeah! Where's my jet pack, I want to whip around town at 1% of the speed of light! Heck of a whiplash when you stop one of those things, eh?
[2] From 1970 to 2000, personal computing CPU speed rose from about 0.06 MIPS (Intel 4004 running at 108 kHz with 4 bits, www.xnumber.com/xnumber/intel_4004.htm) to the Intel P4 2.4 GHz (4464 MIPS). Projecting this speed increase into the future, a foolishly ignorant futurist would obtain a rate of increase of 145.34% per year, which would put our CPU speeds in 2007 at 2068 GHz, or a little over 2 terahertz. Obviously this hasn't happened. In fact, the fastest Wintel CPUs right now run at 3.8 GHz with multiple cores -- but the parallelism is largely useless, since most of the tasks for which PCs get used today do not benefit from parallelization. Obvious examples include downloading video (parallelization doesn't make it run faster), word processing (ditto), and playing World Of Warcraft (ditto). The Germans have a sardonic motto -- "Sleep faster! Be more efficient!" No one can sleep faster. The process is serially limited. Such is the case with most personal computer operations. Some Photoshop filters run faster on multi-core CPUs and some spreadsheets recalculate quicker, but that's all. Nothing else speeds up.
CPU parallelization has turned out to be a bust, and Moore's Law has stalled out and run into the ditch since around the year 2000. There is no sign of any CPU breakthrough on the horizon that will speed things up, either.
[3] If we look at the speed increase from the 707 to the Concorde, a naively foolish futurist would conclude that we'd all be travelling around the world in rockets by now. Of course, what actually happened is that the Concorde got shut down because of the pollution and safety and noise problems. The fastest intercontinental jet today is slower than the fastest intercontinental jet in 1980. Moreover, it's likely to remain that way for the foreseeable future. Heinlein's semi-ballistic rocket travel was a pipe dream, like the Singularity, genuine "hard" AI, nanotechnology, and genetic engineering to increase human intelligence. (No one even knows what human intelligence is, much less how to increase it. Two future Nobel prize winners were excluded from the high-IQ group in Terman's study. It's a good rule of thumb that if you can't define it or measure it, you can't reliably use technology to increase it. www.theconglomerate.org/2006/07/forecasting_exc.html)
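For what it's worth, the back-of-envelope numbers in [1] and [2] are easy to reproduce. A quick sketch of the naive extrapolations being mocked (not an endorsement of them):

```python
# Reproduce the naive straight-line extrapolations from [1] and [2].
# The point is how absurd they become, not that they're good models.

SPEED_OF_LIGHT_MPH = 670_616_629

# [1] Transportation: 100 mph (1900) -> 25,000 mph (1969), a 250x jump.
growth_factor = 25_000 / 100
projected_2040_mph = 25_000 * growth_factor  # apply the same factor again
print(f"{projected_2040_mph:,.0f} mph "
      f"= {projected_2040_mph / SPEED_OF_LIGHT_MPH:.2%} of c")

# [2] CPU throughput: 0.06 MIPS (1970) -> 4464 MIPS (2000),
# compounded over 30 years.
annual_factor = (4464 / 0.06) ** (1 / 30)
print(f"each year is {annual_factor:.2%} of the one before")
```

The first projection gives 6,250,000 mph (about 0.93% of lightspeed) and the second gives the 145.34%-per-year figure quoted above.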
It's obvious even to a small child why these predictions based on the recent past are all foolishly wrong, and would have been ridiculous even in 1970: the period from 1900 to 1969 was a statistical outlier in transportation technology, just as the period from 1960 to 2000 was a statistical outlier in information technology. If we take the period from 1840 to 1900, for instance, we don't see anywhere near the 250-fold increase in speed that we see from 1900 to 1969. Likewise, prior to 1960 or after 2000 we see nothing like the startling increase in computing speeds we saw during the aberrant period, regardless of Ray Kurzweil's ignorant and foolishly false statistical gamesmanship.
Likewise, if we take the period from 1970 to the current day, or from, say, 1700 to 1770, we see nothing like that bizarrely unusual 250-fold speed increase. The historical period in question is wildly unusual.
Likewise, the period from 1977 (dawn of the personal computer era) to today is just as bizarrely unusual. Prior to 1978 there existed no personal computer. Prior to the 1980s there was no large-scale connected internet. These innovations only occur once, just as the Wright Brothers' first flight can only happen once and just as the first satellite launch into orbit will only happen once. To continue projecting future change based on their bizarrely unusual one-time-only leaps is not just weird and ignorant, it's frankly stupid.
The pace of change in computer technology has not only dropped, it's nearly slowed to a crawl. Despite endless hype, there's no sign whatever that AI is going anywhere, and despite the frenzied nonsense spewed by nanotechnology boosters, that field remains confined to the same realm of fantasy as alchemy or Steorn's "free energy device." As for biotechnology, that field shows more promise -- but so much remains to be learnt before we can even begin to sensibly modify organisms in useful ways that it's safe to say at least a century of hard work lies ahead of us, probably much more.
Of course, that's the argument from the available evidence, and as we know, evidence and logic seldom prove convincing to the human animal. Another parallel argument against the sort of absurdly rapid continuing exponential social change based on ever-increasing rates of computer change is simple common sense. Look around. Vista represents a huge step backwards. Linux desktops, as glitzy as they are with compiz and beryl, have pretty much hit a brick wall. There's no sign of any significant progress beyond compiz, nor has there been for quite a while -- and bear in mind that compiz is basically the Borg visual interface from the Star Trek: The Next Generation episode "The Best Of Both Worlds, Part 2." COmpiz is a 1980s vision of the future of computer interfaces, and it's nothing but eye candy.
Genuine improvements in the computer desktop, such as intelligent "find what I mean" search using agents, have hit a brick wall. yesterday I searched for a German movie title involving the word "ein" (Ein Einsatz Kammer Zuruck) and Google ads spat out a bunch of offers to file my employer identification number -- my EIN. This is the disastrous state of computer user interfaces today, and it's not getting any better. In fact, as spammers abuse and misuse contextual adverts with splogs, it's getting worse. Today, so-called "AI" still translates the proverb "Out of sight, out of mind" as "blind and insane" and still translates the motto "The spirit is willing but the flesh is weak" as "the liquor is good but the meat is rotten." AI has crashed and burned, and languishes in the same crackpot backwater as free energy perpetual motion machines and the search for the Philosopher's Stone.

Incidentally, notice that the state of the art of computer technology and programming remains so shockingly crude that no web browser exists that can recognize and flag as incorrect typos like "yesterday" instead of "Yesterday" and "COmpiz" instead of "Compiz" above. I left these typos in as brutal examples of the lack of progress in computer technology, despite all the hype.

K. Eric Drexler has never produced a single shred of actual new science to back up his wild claims, and serious materials scientists judge him to belong to the same category as Uri Geller and George Adamski. Drexler has made his name as a PR promoter along the lines of Don King, not as a scientist:

And the total net result of gene therapy efforts to date has been to kill one patient:
Indeed, the evidence now suggests that various forms of RNA prove more basic to the mechanisms of heredity than DNA, and that the so-called "junk DNA" in the genomes isn't:

The idea that over the next 30 years society will experience the same kinds of changes we've encountered as a result of the introduction of the personal computer + internet is as foolish and ignorant as the delusion that we'll be travelling around with personal jet packs at 1% of the speed of light based on projections from the Kitty-Hawk-to-Apollo-11-moon-landing period. On the contrary -- current trends in web censorship and the collapse of broadband internet in the United States, where dialup access at 56K still costs $20 to $30 and "high speed" internet is a laughable 768 kbps, tell us that the future of the internet and computer technology is likely to be:
[1] Slower internet speeds and less internet access, not faster and more, thanks to gov't censorship and corporate monopolist c(r)apitalism which systematically strangles the free market and puts giant telecom providers in malevolent monopoly positions;
[2] More sluggish and less useful computer interfaces, not faster and more futuristic ones (my Ubuntu Linux desktop boots up 2 minutes faster than my Windows 2000 Pro machine, which itself boots up a minute faster than my Windows XP machine -- but all of them pale by comparison with my Apple II, which boots up in 15 seconds; Wirth's Law: software slows down faster than hardware speeds up);
[3] Less useful and slower searches, as splogs and other spam garbage continue to clog the net and destroy the usefulness of Google, which is already so low that a typical search yields mostly junk instead of useful information.

By the way, I'm sending this on 56K dialup. I can't afford $100 U.S. a month for crappy piss-slow 2 megabit broadband service, just like 50% of the adult American population, and this is not changing in the foreseeable future. Indeed, another round of rate increases has just been announced in my area, and my dialup costs are soon going to rise.

(Yes, I realize that Charles Stross and other inhabitants of the sceptred isles enjoy considerably faster internet speeds -- but please note that Stross and his countrymen live in a civilized country. I do not. I'm stuck in "Burkina Faso with rockets," as Bruce Sterling has described it, the Benighted States of Amnesia. And there are 330 million co-inmates stuck here along with me inside this giant open-air prison, which should actually give up its seat in the U.N. and be turned into a giant theme park called "Medieval Land," complete with Creation Museums and gay lynchings and people running Windows 98SE on dialup to access the net using Pentium II machines. What do you expect in Burkina Faso? We're just thankful no one in my town has burned anyone at the stake as a witch...though our valiant American police are engaged in a fierce battle against those terrifying gangs of pistol-packin' lesbians:

For a truer projection of the near future, see Tom Wolfe's essay "The Great Relearning."

None of this will of course make any impact, since Charles Stross is a computer programmer, and programmers remain the absolute bottom of the food chain in the sciences and technology. A programmer doesn't actually know anything, and has never had any contact with the scientific method. If a programmer's code fails, he can just run it on a larger machine, or tell the end user it's going to take longer -- physicists or molecular biologists or materials scientists don't have the luxury of changing their basic constraints, but programmers do. A physicist can't just say, "Okay, let's change the constant of universal gravitation to 1/10 of its current value, then re-run the experiment." But programmers can indulge in this sort of technological onanism by fiddling and twiddling with either the hardware or the software until their kludgy crufty junk code runs at an almost-acceptable rate. For examples, see any version of Windows, or, for that matter, any dependency-hell Linux upgrade.
Because they've never had any exposure to actual science and know nothing about the scientific method and have never had to deal with genuine engineering problems, programmers like Stross remain the very worst candidates for writing about the future of science. This may explain why the laughably foolish pipedreams of the Singularity have been promoted primarily by programmers -- Ray Kurzweil, Vernor Vinge, Charles Stross, Cory Doctorow -- rather than actual scientists or engineers. Programming is to real science as alchemy is to chemistry. Real scientists aren't taken in by this kind of singularitarian twaddle.
Fortunately these sorts of mass delusions riptide through science fiction periodically and never amount to much. In the 50s the fad was aliens landing in Central Park (credible, eh?) and post-thermonuclear-war stories (fortunately, no nuclear winter in novels like Pangborn's Davy -- how convenient); in the 60s, the fad was torchships accelerating at 1G using fusion engines to whiz around the solar system (good thing no terrorist ever aimed one at Earth, and no extortionist ever threatened to); in the 70s, the fad was O'Neill colonies at the L5 point and giant solar power satellites in orbit (at 150 billion U.S. Dollars to construct an ISS for a grand total of 3 occupants, it is left as an exercise for the reader to calculate the putative cost of an O'Neill colony with a population of 50,000 if 'twere built today); in the 80s, the fad was human-computer neural interfaces (great thing those William Gibson Ono-Sendai decks never crashed with bluescreens and gave their users acute embolic infarcts); and in the 90s we got the latest and most foolish fad, the Singularity, where anything can happen and consequently the rules of fiction get thrown out the window. (Hamlet in the Singularity version: Act I, Scene I, Hamlet rebuilds his dead dad's body and downloads his consciousness back into his body. End of play. Not quite as impressive as the original, is it? This is your fiction...this is your fiction on the Singularity. Any questions?)
By the 2010s, science fiction will probably have moved on to 3-eyed bunny rabbits from Mars -- a fad slightly less ludicrous than the Singularity, and with more basis in scientific reality.


mclaren @84: You seem to have mistaken me for a hostile figment of your imagination.

Please note that in addition to a background in computing -- and I'm quite aware that applied computing tends to suffer from the same problem vis-a-vis ontology that afflicts engineering disciplines in general -- I have a prior background in another science-based profession, and I'm fully aware of the faddish nature of SF and the risks of extrapolating a sigmoid curve indefinitely. (See this entry for a discussion of the prospects for space colonization, pace the 1960s fad, and this one for a non-singularitarian look at some of the likely consequences of the current sigmoid development curve in computing, then feel free to get back to me.)
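The sigmoid point is easy to illustrate numerically: over its early rise a logistic curve is nearly indistinguishable from an exponential, which is exactly why extrapolating the exponential indefinitely goes so badly wrong. A minimal sketch (the curve parameters are chosen purely for illustration):

```python
import math

def logistic(t, ceiling=1000.0, rate=0.5, midpoint=20.0):
    """S-curve: looks exponential at first, flattens at the ceiling later."""
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

def exponential(t, rate=0.5):
    """Exponential matched to the logistic's early behaviour."""
    return logistic(0) * math.exp(rate * t)

for t in (0, 5, 10, 30, 40):
    print(f"t={t:2d}  logistic={logistic(t):10.2f}  "
          f"exponential={exponential(t):14.2f}")
# Through t=10 the two curves agree to within about 1%;
# by t=40 the logistic has flattened near its ceiling of 1000
# while the exponential has shot past twenty million.
```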

Please also note that this is a discussion forum. If you want to stick around and discuss things, fine; but if all you want is a soapbox, feel free to get your own blog.

(Somewhat more constructively, I think your dismissal of gene therapy is premature and your comment about the collapse of the Central Dogma in genetics is well-taken, but I'd consider it a sign of progress, rather than the opposite. And while I'd take this as highly suggestive that the route to synthetic life forms using toolkits derived from study of existing organisms is being explored faster than, say, Eric Drexler's mechanosynthesis hypothesis, I'd like to note that there seems to be a market developing for materials structured on a nanometric scale -- which was, IIRC, one of Drexler's more conservative projections back in the mid-eighties.)

I suspect your pessimism is just American fin-de-siècle withdrawal in the face of the retreat from empire and turning away from science that's endemic in the USA this century. Don't sweat it; the rest of the world has other plans.


Noel @73, I'm pretty sure I've said elsewhere that I don't gamble. (It's one of the aspects of my low-key puritanical religious upbringing that stuck with me.)


Steve @ 82: plug-in hybrids are a possible future for personal transport but they do in fact require a lot of new infrastructure if they become a mass-market phenomenon. Base load night-time power generation is already spoken for (a lot of industrial processes like metalcasting use it) so new generating capacity will have to be built. It will also require new distribution capacity -- charging a hybrid in your garage will take 5-10 kW of extra power that wasn't designed in when the house was built. It also assumes that when the car is parked up, a charging station is available -- our car is parked in the street about 400 yards from where I'm typing this. A lot of people (including Charlie) live in similar high-density urban circumstances.

In addition a battery-powered car takes at least a couple of hours to charge up from near-empty, unlike a gasoline car which can be totally refuelled in five minutes. This puts a severe crimp on how such a car can be used, especially if an emergency crops up.
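The charging-time arithmetic is just energy over power. A sketch with a hypothetical pack size and efficiency figure (my assumptions, not numbers from the comment):

```python
# Hours to recharge a near-empty pack: energy to replace / charging power.
# The 16 kWh pack size and 90% efficiency below are illustrative assumptions.

def charge_hours(battery_kwh, charger_kw, efficiency=0.9):
    """Approximate hours to charge, allowing for charger losses."""
    return battery_kwh / (charger_kw * efficiency)

# A hypothetical 16 kWh pack on a ~3 kW domestic socket:
print(round(charge_hours(16, 3.0), 1))   # about 5.9 hours
# The same pack on a dedicated 10 kW feed:
print(round(charge_hours(16, 10.0), 1))  # about 1.8 hours
```

Either way it is hours rather than the five minutes a petrol pump takes, which is the crimp on usage mentioned above.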


Personal computers not surprising in '77. That's three years after the Altair 8800 kit, and just about when the announcement of a PDP-11 on a chip was a signal to a beginning programmer like me that things were moving into high gear. They were already in the culture, under the banners of "mini-", "electronic", or "printed circuit". Consider (Jack Kirby introduced the New Gods around '71):


But along the lines of unpredictable social phenomena that you're thinking of, there's the following. Which I can imagine appearing verbatim in New Worlds or F&SF in '77, but which I would have called a clever gag, though not really believable:

Mousetrap: A brief history of the events of Fall 1995, the last great Agora scam, and a view of Agora beyond the game.
by S. Andrew Swann (with thanks to Players Steve and elJefe.)


If I was going to write about a situation like this in a book, I'd put it in as a mix of dialogue and protagonist's thoughts. Info-dump tends to work a lot better that way, because people should already care about your characters, so they understand that the characters need to know. It also gives you an excuse to cut the explanation off with a "not now, we have to do X" when you've had the story-relevant information explained. Recently, I've been reading The Dresden Files book series by Jim Butcher, and he uses this technique to explain the magic stuff. Both the uninitiated reader and the characters get to know the setting at the same time.


Burkina Faso with rockets, eh? If what's meant by that phrase is the contemporary United States, it should be recalled that before Thomas Sankara came to power in 1984, the name of the country was Upper Volta, and the phrase 'Upper Volta with rockets' was applied to the dear old USSR. And if the suggestion is that the US will go the way of the SU, I'd say we'd best not be too overoptimistic on that one. As for the actually existing Burkina Faso, it has at least got computers. . . my evidence for this is that most of the 419 scam emails I get originate from the Burkinabe capital Ouagadougou. Which is kind of ironic, given that the literal meaning of 'burkina faso' is 'the land of honest men'.

S.M. Stirling - agriculture in Ireland is pretty much with O'Leary in the grave. But if it makes you feel better to believe that we're all still riding around on donkeys and knocking lumps out of each other with shillelaghs, don't let me stop you.


My understanding of farming in Ireland is that it's to a large degree semi-professional small farms with an emphasis on livestock. Farms supplement their income with other work and with CAP checks (did I say that right?).

I suppose there might be some types of farming that rely heavily on immigrant labor, as do vegetable and fruit farmers here in the US.


A personal electric car may not be suitable for occasional longer journeys, but you don't have to own your own IC car as well. E.g. I use a car club (www.citycarclub.co.uk) for short trips and hire cars (= rental cars in the US) for longer trips. (In my case, this supplements cycling & buses for everyday commuting, but the same principle could hold for electric cars).


As an avid player of computer games in 1977, I can assure you that my memory of trying to explain to the average person the concept of a game that you play using a computer is clear as a bell, and it was as futile then as trying to explain I/O monads in Haskell to the average person is now.

I'm pretty sure the point where the average person in 1977 would say "you have got to be shitting me" is this: "...other groups such as people who like to dress up as furry animals..." The idea of anybody other than corporations and governments having access to computers, much less a global network of them, would have been like being asked to believe in Atlantis. The point where they'd clutch their head like a stunned monkey would come not much later, at the mention of far-east sweatshops selling imaginary gold to impatient gamers. By the time you got to the gnome corpses spelling out a web address, your audience would think you were on drugs -- very bad drugs.


mclaren @84, Stross @85:

I agree more with Mr. Stross, having written a book manuscript "Success Curve" on the sigmoid applied to technology (including transportation), science, art, my own autobiography, and history.

The most influential and important book on the subject is:

Conquering Uncertainty: Understanding Corporate Cycles and Positioning Your Company to Survive the Changing Environment (BusinessWeek Books), by Theodore Modis

Reviewer David Rouse said in Booklist:

Modis is a physicist turned management consultant who argued in Predictions: Society's Telltale Signature Reveals the Past and Predicts the Future (1992) that cycle theory and mathematical tools such as the S curve can be used to forecast natural and social phenomena. Here he applies the same notion to the business world. Modis adapts such concepts as equilibrium, competition, feedback, and survival of the fittest to suggest that product life cycles can be treated like those of natural species. He reintroduces the S curve and builds chaos theory into his forecasting model. Modis claims that chaos is seasonal and shows how to use the cyclical swings from chaos to order and back to prepare for future events. He spends much time "proving" his model by applying it to historical events as diverse as mobile-telephone sales in Greece, the pricing of fountain pens, and Ernest Hemingway's publishing output.

(McGraw-Hill, April 1998; hardcover, 198 pages; ISBN-10: 0070434050; ISBN-13: 978-0070434059)

Similarly, I've commented in various blogs about the death of the simplistic notion of the "gene" based on the ENCODE project's amazing results. I am having to rewrite several papers-in-progress already. As with Mr. Stross, I see this as a breakthrough. This may or may not be the death of an epicycle-laden old paradigm, but it does seem to be progress, in the form of a nascent new paradigm.

I am far too close to the K. Eric Drexler story, having played a key role as the elder and more-published statesman in the field, a protege of Richard Feynman (great-grandfather of nanotechnology), and as the man who got Drexler his early publicity by introducing him to Dr. Stanley Schmidt (editor of Analog), getting Omni magazine to run a story on Drexler, and so forth.

Drexler is wrong on some points, in my opinion (I emphasized wet floppy nano when he pushed rigid diamondoid nano) but he is, in the main, significantly right.

Nano plays a role in the potential Singularity, whether via nanocomputing and/or via seriously redesigning humanity.

By the way, the man who introduced me to Theodore Modis is the brilliant Economics/International Business professor Philip V. Fellman at Southern New Hampshire University. He, I, and some coauthors have two very recent papers on the arXiv, but this is neither the time nor the place to flog them, nor the open-source science in which they are embedded.

Change. Live with it, or die without it.


Charlie: "David Brin, in Earth, predicted the near-death of the snail mail postal service in, um, 1994? ... But I thought he was going a bit far [...] These things creep up on us, and they're like the (apocryphal) frog-boiling process: you don't notice them because they're a gradual progression."

I think it depends on when you were born, relative to a given technological advance, and -- more importantly, perhaps -- how old (& naive :)) you were when you first read scifi discussing it :)

For example, I think for many that's been the case with space travel -- for my parents, it's astonishing that we went from not having heavier-than-air flight to landing on the moon in such a relatively short time; while for others -- those who grew up reading Golden Age scifi and watching the incredible progress whooshing by -- it's deeply frustrating that manned exploration of the stars isn't getting any closer (cf. people's rejection of your "High Frontier" post despite its factual soundness).

Those of us who grow up watching a particular technology progress rapidly, and reading scifi extrapolating that, instead tend to accept it quite easily, and in fact have to tone down our expectations to reality.

Well, the Golden Age of scifi was before my time, really, and air and space travel developments had pretty much levelled off by the time I was paying attention. But computers and networking? I grew up immersed in that, watched progress flying by, and yeah, read the scifi, and spent my teenage years irritated by my lack of connectivity, and trying to persuade high-school physics teachers to help me build microwave transceivers so I could network my home computer to my friends' in neighbouring villages. (This didn't happen -- I have zero hardware chops -- but I did write my own email software, interactive animated vector-graphics BBS software (think Flash, only on a 286 running a b0rken DOS clone) and took several stabs at networked hypertext -- in the period from 1987 to 1993, when I finally managed to sneak a modem into the house... :P )

Another point about people from the 70s disbelieving (e.g.) miniaturised computers is ... why not? Presented with a technology I knew nothing about, why would I assume they couldn't make it faster/better/smaller? With no measuring-sticks, I find that beyond a certain level of Clarkian Sufficiently Advanced Technology, people just sort of switch off their astonishment and say, "My my, the things they can do these days...".

Whereas someone who did grok the technology at the time might be more surprised, because they're aware of the technical limitations (of their time). I've read earnest Usenet posts, archived for posterity, in which people learnedly argue the unlikeliness of technology we presently take for granted (such as gigs of RAM and terabytes of disk space), even as little as ten years ago (I'm trying to dig up an example but can't find it right now).

As for the future...? Maybe I've just inhaled too much Doctorow and Sterling, but ignoring the fiction and just looking at current technical trends, I still think ubicomp and small-scale fabbing becoming cost-effective are the next two big things. Where that takes you in 30 years, I have no idea. Not trying to write fiction then, I don't have to care, fortunately :) Long-term plans don't really work in this context, so for dealing with reality, I just need to look ahead far enough to jump when the tram changes tracks...

Incidentally, I think one of the reasons the iPhone has a lot of potential is that it is likely to be the first ubicomp platform anyone really cares about. Phones have been able to do nearly every single feature it has for years now, but it didn't matter: because the user experience was so dire, no one cared and no one bothered.

What the iPhone does that's genuinely novel is step away from the "checkbox mentality" of current "feature phones" (i.e. "we need feature X -- but only enough so that we can check the box on the feature comparison chart") and actually make it usable for Real People. At which point, network effects and economies of scale start to kick in, people who aren't technology-focused find new 'humane' uses for it, and we see shifts in the market.

I used to listen to mp3s on my computer, back in the mid-90s when WinAmp first came out. I remember clearly a friend laughing at the suggestion that this ridiculously arcane technology would ever replace record shops... "It's great for people like you, but my mum & dad will never use it...". Napster (supply), iTunes (management) and iPod (playback) changed that for the general population. Both my parents now have little mp3 players they bought from the bargain bins at Lidl. That's pretty much the future of any technology ;)


You have inspired me to blog about this!
[snip] This is the trouble with "time travel thinking." This post reminds me of that old picture of the guy with a HUGE room-sized "computer of 2004" --- 30 years ago, in 1977, when the original Space Shuttle was still being conceived, would anyone believe we'd still be using the same old design in 2007?
The number of gizmos, widgets and services available for bloggers in 2007 never ceases to amaze! Many Asian bloggers load up on the eye candy... Take a look at "Picturetrail" at the head of the sidebar on ?e?.?


Dave @ 92: You're correct about the various alternatives to a private conventional IC car -- I know people right now who mainly use mass transit or walk, and rent a car when they want to take a long road trip (a couple times a year).

However, I think Steve's scenario is more likely to occur due to social/cultural factors - here in the US, at least. Europeans and Asians in 2037 might be shaking their heads, wondering why Americans go to such lengths to hold on to privately owned, large, expensive cars when there are much more affordable alternatives in use in Europe and Japan (and maybe China, Korea, etc...)


Like #23, I guess it's not so hard. The ARPA-net is from the 70s, the first games (D&D and some line-art asteroids game) are from that time, and video consoles are also more or less from that time. Take a person from the 1940s, and it becomes much harder.

I don't know if someone else has linked to it, but this ad about electronic mail (from 1977) shows me that the cultural background needed to explain something like the internet and gold farming started there. And if the person isn't John Brunner, but has read The Shockwave Rider, it will also be quite easy to explain everything.


A second comment (the first is still in needs-moderation limbo): from a narrative point of view, I would do it the other way round: start with "there are gold farmers dumping gnomes to spam their message in the MMORPG World of Warcraft", and work conceptually backward from there. Or whatever the 2037 example is.


Oh yes, and a quick addition before I forget: "CPU parallellization has turned out to be a bust" -- bzzt. If you think that, you haven't used Google lately. Or Amazon. Or recent games consoles. Or ... well, anyway, trust me, it's no bust. Look up MapReduce, or EC2.

It's not a linear mapping (2x CPUs does not equal 2x speed) but lots of people are parallelising the crap out of tasks -- and upending various markets in the process.

The main hold-up is simply that people find it hard to reason about parallel processing and tend to find themselves in deep pain as a result. Those who are winning, tend to have found simple, useful abstractions for dealing with the complexity (this applies to many areas of tech, of course).
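For anyone who does go and look up MapReduce: the abstraction itself is tiny, which is exactly why it works as a "simple, useful abstraction" for parallelism. Here's a minimal single-machine sketch in Python (illustrative only, and the function names are my own -- the real thing distributes these same two phases across thousands of machines with fault tolerance on top):

```python
from collections import Counter
from functools import reduce
from multiprocessing import Pool

def map_phase(chunk):
    # "map": each worker independently counts words in its own chunk;
    # no worker needs to know about any other worker
    return Counter(chunk.split())

def merge_counts(a, b):
    # "reduce": merge two partial counts; Counter addition is associative,
    # which is what lets a framework combine results in any order
    return a + b

def mapreduce_wordcount(chunks):
    with Pool() as pool:
        partials = pool.map(map_phase, chunks)  # the parallel part
    return reduce(merge_counts, partials, Counter())

if __name__ == "__main__":
    chunks = ["the quick brown fox", "jumps over the lazy dog", "the fox again"]
    print(mapreduce_wordcount(chunks)["the"])  # prints 3
```

The point is the shape, not the word-counting: as long as the map step is independent per chunk and the reduce step is associative, the programmer never has to reason about locks or shared state -- which is precisely the "simple abstraction" dodge around the pain described above.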

As for the rest of the doomsaying: I can't be bothered addressing all of it, since there are so many obviously-refutable parts. I still find Google useful, and that article, I note, is dated 2005, so two years down the line it's still not a serious issue (unless you're Matt Cutts). Much of the rest mostly looks like artifacts of your ambient politics; or is self-contradictory (if your Ubuntu box is that much faster, clearly things are improving, not getting worse, right? You just need to make sensible choices about what you use, much like anything else); or outright wrong ("If a programmer's code fails, he can just run it on a larger machine, or tell the end user it's going to take longer" -- faster computers don't fix bugs, they just get you to the point of failure quicker).


SM Stirling,

No bet on air travel speeds. The economics of this are pretty clearly worked out. Still, there might well be a low-volume, high-speed, high-price option again. (Concorde was profitable, after all, and aircraft design has moved on considerably in the last few decades.)


"Hamlet in the Singularity version"

Not so - see _Altered Carbon_.


Charlie@73: Note that in some major US cities, car-sharing clubs have taken off again. These are automated systems that lease cars at specific locations by the hour. I'll typically sign in to the website, reserve a car for a few hours, walk to the car's reserved parking space, swipe my smart card over the windshield (unlocking the doors and the ignition), retrieve the key, and be on my way. Reservations are communicated to the car by satellite. Maintenance, fuel, and insurance are wrapped up in the hourly rate, which fluctuates according to high- and low-demand times.

It's not the autotaxi that we'd like to have, but combined with mass transit it's allowed us to avoid owning a car in the city (and carrying expensive personal liability insurance, mandatory for automobile owners in the US) for four years now.


It might be easier to generalize this-- if every development either transmits information (if by no other means than putting a person on a plane or rocket and sending him out to talk at a destination), or increases the useful complexity of matter (computer chips beating out silicate rocks, for example), we could get a useful result that doesn't depend on any one technology.


I'm an avid reader of sci-fi, run a network of websites for a living, and own three game consoles. I've written more than 20 published computer books.

Nonetheless, I found your post very useful because otherwise I wouldn't have understood this story at all, not being a WOW player. (The bits about flat TV screens and tiny computers wouldn't be needed, but the rest helped.)

Context is everything.


"Like the rest of the kooks and cranks and crackpots who believe in the mythical Singularity, Stross glances at the recent past and ill-advisedly projects the technological and social changes straight into the future."

-- y'know -- and I speak as a man who's had his share of arguments, cue understatement reflex -- that's not a way to start a productive dialogue.

The odd thing is that I think you're right, as far as the Singularity goes, and the general S-curving of the rate of advance of specific technologies.

But even compared to _me_, you're tact-challenged.

Charlie isn't stupid, to put it mildly. It is possible for intelligent, well-informed people to disagree.


Charlie: "I suspect your pessimism is just American fin-de-siecle withdrawal in the face of the retreat from empire and turning away from science that's endemic in the USA this century. Don't sweat it; the rest of the world has other plans."

-- ah... you _are_ aware that the US spends 50% of the world's R&D, and that nearly 500,000 European science workers are employed here?

Really, Charlie, I know you'd _like_ the US to be in decline, but there's simply not one iota of evidence for it. Our economy is growing faster than the rest of the developed world, and so is our population.

This seems very unlikely to change in the immediate future, so you'll just have to endure the second American century.


O'Kane@90: S.M. Stirling - agriculture in Ireland is pretty much with O'Leary in the grave. But if it makes you feel better to believe that we're all still riding around on donkeys and knocking lumps out of each other with shillelaghs, don't let me stop you.

-- are you seriously under the impression that there aren't any Poles working on Irish farms? Or that they don't often face lousy conditions?

Because if that's so, apparently Irish newspapers are engaging in systematic falsification.

As for the rest... you're tapping postings from an alternate timeline, perhaps?


Robert@87: In addition a battery-powered car takes at least a couple of hours to charge up from near-empty, unlike a gasoline car which can be totally refuelled in five minutes. This puts a severe crimp on how such a car can be used, especially if an emergency crops up.

-- well, yeah, but that was exactly my point; plug-in hybrids don't have that drawback. In an emergency, you just unplug, turn 'em on, and drive off. Same-same if you want to go beyond your usual less-than-60-miles daily trip. The IC engine just kicks in.


Dave@92: "A personal electric car may not be suitable for occasional longer journeys, but you don't have to own your own IC car as well."

-- In theory, no. In practice, there's a major advantage to having a combo electric/IC like a plug-in hybrid.

It's not significantly more expensive than either a pure electric or pure IC car, but it can do everything or nearly everything _either_ can do.

This frees you from the inconvenience and scheduling problems of renting cars; your own vehicle is right there whenever you want it.

And it does so _without much increasing your costs_, either in terms of money or storage space. A hybrid doesn't take up much more space than a pure electric, after all.

As for bicycles and so forth, there are cities in the US where you can live without a car. But there aren't that many of them, and they are _all_ losing population to the much less dense suburbs, exurbs, and edge cities.

Those older, denser cities are increasingly dominated by young singles and impoverished recent immigrants, and both move out when they establish families/get some money. (Leaving aside some of the very wealthy, who have more options.)

People here just don't want to live at those densities. I strongly suspect, from (e.g.) the way the housing markets in Britain behave, that a majority of people in Europe would prefer a spread-out, single-family-home residential pattern too, if only they could get it.

(Britain would be one large suburb from Portsmouth to Caithness if it was built at the housing densities of, say, Kansas City.)


NB: people from the eastern side of the Pond tend to get a very distorted idea of the US if they spend most of their visiting time in the big coastal cities. In particular, they tend to think of America as much more like Europe than it actually is.

Those old, densely populated cities _are_ somewhat more like Europe, at least on first acquaintance, but they're not where most Americans live and the population has been shifting away from them for some time. They punch above their weight institutionally and culturally, but that's also changing.

Of course, it's an understandable error -- native-born Americans who live in the big coastal cities are often astonishingly parochial about the rest of the country too. I've talked to some who had really, really severe culture shock when they moved from NYC to the Raleigh triangle in North Carolina (they worked for a publisher who switched HQ's).


S. M. Stirling @ 105

-- ah... you _are_ aware that the US spends 50% of the world's R&D, and that nearly 500,000 European science workers are employed here?

Steve, the quality and quantity of specifically technological research in the US, both corporate and university has been declining for more than 20 years. I worked for a corporate lab that was shut down (not entirely coincidentally the corporation, which used to be a giant among technology companies, is an also-ran now), and know a number of people who've been at or are still at research centers from IBM to Stanford. Less and less long-term or basic research is being done, and agencies like DARPA and NASA which used to sponsor lots of cheap, but potentially useful projects have scaled back or eliminated this sponsorship.

Even military research spending is largely focused on incredibly expensive weapons systems for fighting the Cold War, which make little sense when we're fighting insurgents and terrorists.


Two comments:

- SF authors of the past always tended to forget the difference between the practical, the achievable and the desirable. For example, videophones are perfectly achievable, but most people consider them impractical; they prefer not to be seen while chatting. Electric short-range cars are achievable and practical, but most people prefer IC cars because they give them a sense of individual freedom (and power?) they wouldn't get from electric cars. A collective car service providing an automatic car to carry you anywhere will certainly be achievable some years in the future (IMHO it would be achievable now if we were willing to spend the money), and it would be practical, but would it be desirable? As mentioned, one of the bonuses a car provides is satisfying our territorial instinct -- I think many men are more attached to their car than to their home (at least if they are married!) -- and besides, in our egalitarian epoch the car is the most evident sign of our social status...

- We all tend to assume unconsciously that tech is more or less the same all over the world, or at least I think so. But I did travel to the USSR in the late 80s, and I can assure you that using a pocket calculator in a shop aroused a shocking wave of public interest -- everyone wanted to see that ultra-advanced device! This can be a simple truism, but one aspect of the future in 2037 will be that... if we are lucky, most of humanity will by then live as we Westerners do in 2007.


@112: but hopefully without the ecological footprint of us Westerners in 2007.


I'm going to take the unprecedented step of agreeing with Mr Stirling! The hybrid option is probably going to be the way to go -- massively efficient for short inner-city hops, with the IC engine for long journeys etc.

If you manage to use a biofuel engine (I read somewhere that diesel engines can actually run on vegetable oil, but I'm not sure if that was a (crack)pipe dream on the poster's part) you could really start to cut down on emissions and the current 'carbon footprint' bugbear.

Would hopefully be cheaper too -- especially if you put a wind turbine atop your charging station to reduce the draw on the grid.

I believe the Toyota Prius is the forerunner of commercially available hybrid vehicles. But I'm not sure how efficient the thing actually is.


"We all tend to assume inconsciously that tech is more or less the same all over the world"
Weeeelllll, William Gibson said "The future is already here, it's just unevenly distributed". I think we're mostly aware of that, it's just that for the purpose of this discussion (comparing the knowledge/views of someone in the 70s to today and attempting to project forward into the future in order to write fiction) we're assuming someone of a fairly similar background, since I imagine Charlie's primary book market is, indeed, The First World(tm). :)


S.M. Stirling - the Irish economic boom does indeed involve hyper-exploitation of immigrant labour. But this is very far from being a mere continuation of the survivals from Ireland's pre-industrial past which I alluded to in my original post referring to the landless labourers and their unattractive accommodation options.

The kind of Irish society we have now would have seemed utterly and laughably bizarre to any denizen of the year 1977 -- and if you went on to talk about a Northern Irish government being jointly led by Ian Paisley and PIRA quartermaster Martin McGuinness, you'd be accused of making a joke in very bad taste indeed.


Andrew Crystall #101: Concorde was only profitable if, as happened to BA and Air France, you were given the aircraft and spares effectively for free. This, and the Halo effect, were why the airlines kept flying her.


Jakob, Air France didn't make a profit. BA did. And BA did pay a considerable amount of the purchase price of the aircraft. As I said, though, we could design and build a replacement for Concorde for a lot less than, well, Concorde.

Richard Branson's probably thinking about it (heck, he tried to buy the Concordes).



Oh, consciously we are aware of it... but until you are actually there you don't fully understand the implications of those differences (for example, some visionaries foresaw MMORPGs, but no one foresaw Chinese gold farmers, who exist because those differences exist but are not so extreme that the Chinese can't access the web -- no one complains about Somali or Afghan gold farmers).

A problem I usually see in SF is that the futures described almost always include a marked degree of uniformity and convergence.

The future as it appears in SF is usually (I'm tempted to say always) far more uniform than the real world is. Even alien races are usually described as extremely uniform: one nation with only one culture, one tech level, one society, all belonging to one religion or all of them atheists... and, in relation to that uniformity, those futures seem to assume a marked degree of convergence. Like those economic forecasts which always seem to predict that the countries growing faster will decelerate and those growing slower will accelerate...

Regarding SF and the impossibility of including a realistic future world because the number of changes would be excessive, IMHO that's true, but almost unavoidable. Not completely unavoidable (good historical novels must face the same problem and succeed at it) but almost.


Steve @109: if the British Isles were blanketed in population at the same density as Greater Los Angeles ... we'd have to deport 30% of our population to fit.

(This includes the Lake District, Scottish Highlands, and other bits of picturesque but basically inhospitable terrain -- inhospitable because it may not be high in altitude-above-sea-level terms, but it's sufficiently crinkle-cut to make putting in roads and sewers and infrastructure extremely headache-worthy.)

Andrew @118: I gather that BA wanted to keep flying Concorde too. They were willing to put a chunk of money into keeping just one flying, as a display/flagship aircraft. However, Airbus -- the notional manufacturer -- declared that they wanted an insane sum in return for continuing type certification past the 30 year mark: IIRC something like €200 million per airframe. If this is true, it looks like they deliberately killed off Concorde -- I'm not sure why, but I can make some guesses involving plans that never got off the drawing board for a rival to a Boeing paper plane that never got off the drawing board (the "Sonic Cruiser").

Alastriste @115: the uniformity you mention is sometimes jokingly referred to as "small farming planet syndrome" (there's no such thing as a small planet, they're all enormous -- and planets don't do monocultures any more than entire countries do). When you see it, it's usually a sign that the author (a) hasn't traveled much outside their home culture, or (b) hasn't really thought things through.


If US corporate R&D is so great, where are the products? Why the huge trade deficit?


American R&D can be as good as ever or better, but in relative terms things don't look so rosy, especially regarding the 'R' part. The rest of the world has advanced a lot and very fast, and in consequence measures like the percentage of American scientific articles published relative to the world total register a decline, and have done so for many years now.


First computer video game, AFAIK, was 'Spacewar', PDP-1, 1962. The background was real - and moving! - constellations. (I've seen it played. Some of the settings can produce really interesting effects.)


Dave @ 62: I think it would be possible for a game like WoW to be created even if we didn't have personal computers. Some sort of multi-million dollar mainframe running the game, with people connecting from home terminals.

Compuserve and Infocom investigated doing precisely this in the early 80s.

Previously, one of the authors of Zork and some friends did this with a simple FPS game called Maze, connecting TWO players(!) in Cambridge, MA and San Diego.

The first MUDs did something similar as well, also in the early 80s. I remember people who tried them complaining, "There's nothing to do but kill each other."

So times haven't changed that much in 30 years.


So many people seem to believe that the average 70s resident would be fine with all of this -- but as Canis said, you're thinking of yourself (and you're already here).

My mother in law, father in law and brother in law - all 'around and active' in the 70's still flip out over the fact I can use my internet connection to make phone calls.

As a matter of fact, in my home I can do so wirelessly -- plugging a Skype phone in via my tablet PC and having a meander, letting me chat/browse/work at the same time, whilst my wife uses another wireless device to browse for clothes (very cliché, I know...).

Trying to explain to them that educated, rational people will pay real money for a commodity that has no 'real' value or counterpart would likely cause one of them to suffer some kind of seizure.

The point of SF, and even to an extent fantasy, is to try and suggest a future. Of course various authors pick a point and work from there; I'm sure some folk would love to get it all right -- but getting something close is good enough.

(Like me predicting that in 30 years we'll nearly all be working from home, using instant messenger/video conference systems with some form of thin-client tech to access our employers' software/in-house systems.

If we've not been uploaded, that is.)


Can anyone point me to any comparative data which shows a correlation between R&D spend and economic growth? I was under the impression that there wasn't any.


Charles, et al:

I started working with computers in 1972. Talk about the dark ages! The first computer I worked with had 16K (that's right, K) of memory and 16 5-megabyte hard drives, each of which took up the space of a large modern-day microwave oven. Oh, and with many separate processors, the computer filled the floor of a fair-sized telephone switching building in Lower Manhattan. And it was state of the art.

In order to run the system we first had to learn manual processing, by toggling in instructions from the console. Then we had to learn assembly language and finally, Job Control Language. Other than toggling in, all programs were on unsequenced punch cards. God forgive you if you dropped a box of cards.

Some folks in the business were already interested in what were then called mini and micro computers. The latter being close to what we now have, but still very large.

In spite of all that background, I still continued to be awed by the advances in miniaturization and sheer power. For all its multiple discontents, and all the contradictory comments above, the technology of today is wondrous and amazing. One need not believe in Singularities (about which I remain skeptical, but hopeful) to recognize the technological progress of the last 35 years.

Of course there are many promises left unfulfilled. I saw 2001: A Space Odyssey the day it opened. And I believed that was our future.

However, at 62 I can honestly say I remain a deep lover of science fiction, and a skeptical optimist.

Richard York


I wish my wife's father and grandfather were still alive; I'd love to be able to ask them about their background. (Granddad worked on the Manchester Mark One for Ferranti, and probably knew Alan Turing personally; dad worked at Ferranti, ICL, and then Microvitec.)

Incidentally, one of my favourite job interview memories is of the one that ended when I stood up and said "don't call me, I'll call you", then walked out. Note that I was unemployed at the time ...

It wasn't just that the prospective employer wanted me to sit an aptitude test to see if I was suited for a career in computing (although, with a master's degree in CS, I figured that their approach to inducting new graduate intake hires was just a little inflexible), and then send me on a three month training course to learn about binary and hexadecimal and the basics of what computers are and how to use them.

Rather, it was the immortal phrase, uttered by one of my interviewers: "we upgrade to COBOL next year".

That was in 1990.

(As William Gibson said, "the future is already here, it's just unevenly distributed". What he forgot to add was, "and so is the past.")


Re: plug-in hybrid cars.

Right now, no regular car manufacturers are building plug-in hybrid cars. There are logistical reasons for this, and they can be seen in the private plug-in conversions of existing hybrid designs being performed by enthusiasts. Getting a plug-in with any sort of reasonable range (100-150 km, say) under battery power requires pretty much filling the car with spare batteries. All that is usually left is space for two humans and zero luggage. It also requires upgrades to the brakes and suspension to handle the extra mass -- quite often the converters don't do this, resulting in a vehicle that handles like a pig and has a very rough ride.

The solution is to recover the extra mass and volume by removing the diesel/petrol engine and going all-electric, at which point you get a better electric range but lose the hybrid advantage.

Hybrids right now are like the diesel-electric locos of the recent past, carrying a bulky, heavy and complex engine/generator system to deliver torque via an electric motor to the road wheels for propulsion. There are control advantages to doing it this way that make it worthwhile in terms of fuel economy but it's putting lipstick on a pig, engineering-wise. It's also high-maintenance as there are a lot of parts to go wrong and a lot of ways for the car to stop working suddenly, a lot more than even a modern IC direct-drive car has. With hybrids, the era of the shade-tree mechanic has passed.


What I'd like to see, automotive propulsion wise, is a cheap fuel cell that can reform diesel (biodiesel, even) to hydrogen and carbon monoxide then run on those. Get rid of those annoying whirly things with their moving parts that are prone to metal fatigue -- whatchumacallits, internal combustion engines.

Alas, reformers add weight, complexity, and inefficiency to hydrogen fuel cells. While hydrogen adds weight, complexity, and inefficiency to the fuel distribution side of the economy.


Yeah, but how cool is this?

Or, indeed, in a rather more vapourish manner, this:

We're getting there.

[ObBook for the uneven spread of technology: D. Edgerton's _The Shock of the Old_]


Still, Charlie, the reformers of today are advancing quite quickly. And once you've got one in a vehicle, it's easier to adjust it to take various sorts of fuel than it is to get it in there in the first place.

A hydrogen distribution economy... well, that's expensive well beyond reformers. My father works for a company that services a fair proportion of the petrol pumps in the UK, and it's a vast infrastructure...

And while petrol's nasty stuff, it's a lot less nasty than highly compressed hydrogen.

"we upgrade to COBOL next year".

Oh MY.


Charlie @120

Airbus basically didn't want to run what was, to all intents and purposes, a bespoke engineering fabrication facility for a fleet of a dozen aircraft for 2 airlines.

They were gearing up for the A380 and didn't want anything getting in the way. It was a dull and predictable commercial decision. It wouldn't have made any difference if anybody had let Branson take them either, Airbus had spoken. The Sonic Cruiser didn't really come into it, IIRC the view inside the industry at the time was it was yet another Boeing paper aeroplane while they tried to remember why they were moving the HQ to Chicago again.


Very much in the same vein as Charlie's original post here, I just ran across the wikipedia article on Basshunter (check out one of his music videos if you haven't already) and thought "now that, that would be tough to explain to 1977 me." A guy singing a song about playing a user-created map for an MMORPG, uploading the music video to a site anyone can access (and then someone else taking it upon himself to subtitle the Swedish lyrics and upload that version), and achieving some level of worldwide notoriety as a result. And having these facts dryly recounted in an encyclopedia that anyone can edit.

Yeah. That sounds kind of weird too. It touches on some of the same stuff that's been discussed here. The way that small communities can band together over the Internet and gain more exposure as a result. The real weirdness being in the social consequences. The instantaneous and universal quality of communications. And also the fact that creativity and authorship can be decentralized (despite all the media concentration we see).


Hey, COBOL pays for me to write Java. There's a lot of money in COBOL. And there are plenty of sites still involved in migrations to things like DB2 or CICS from something even more ancient (like Charlie's would-be employers back in the day, although I haven't heard of anyone switching to COBOL from another language, in my experience in the peripheries of the mainframe world over the last couple of years).

That computers handling your medical, insurance, financial and government records will still be running COBOL programs in 2037 would be one of your safer guesses, of course - not an original point, but until recently I hadn't had an opportunity to see just how true it is.


The other thing to remember is what sticks out as important or major developments is different from person to person. We tend to focus on computers and IT, because that's the development in the past 30 years that's meant the most to us personally.

My grandmother probably had only a vague idea of what computers were, even in the 1990s. However, she once told me that the greatest invention of the 20th century was the microwave oven. That was what affected her the most -- preparing meals went from taking several hours each day to taking a few minutes within her lifetime. By 1977 they were pretty common, but if you went back to 1957 and told people then that by 1987 most homes would have an oven that could cook a meal in a few minutes, how many would believe you? And if they did, they'd probably wonder what women did with all their spare time around the house...


Andrew Crystall @ 131

If memory serves we had a long thread on the hydrogen economy a few months back. Nothing has appeared since then to change my opinion that using hydrogen makes no sense, even in the rather safe form of metallic hydrides or nano-engineered chelates. Even compressed or liquefied hydrogen isn't dense enough energetically to replace hydrocarbons economically, and the cost of the infrastructure is ridiculously high. Metallic hydrogen might work, but we have no clue how to create it, and no idea if it can be stabilized under reasonable conditions. We've already got an electricity distribution system that has cost at least 10E12 € worldwide to install and operate; a new hydrogen system would cost far more per joule distributed because right of way for pipelines costs so much more now.

The hydrogen economy is a mirage; using hydrogen as a local secondary storage medium is possible, but it makes no sense as a primary fuel. Hydrocarbons like propane synthesized using distributed electricity or local solar power make sense, but the technology isn't efficient enough yet by at least an order of magnitude.
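The energy-density point is easy to check with rough numbers. The figures below are approximate lower-heating-value and density values rounded from commonly cited sources; the point survives any reasonable choice of them:

```python
# Approximate volumetric energy densities (lower heating values).
MJ_PER_KG = {"diesel": 43.0, "hydrogen": 120.0}   # gravimetric: H2 wins
KG_PER_L = {"diesel": 0.84, "h2_liquid": 0.071, "h2_700bar": 0.042}

diesel_mj_per_l = MJ_PER_KG["diesel"] * KG_PER_L["diesel"]        # ~36 MJ/L
lh2_mj_per_l = MJ_PER_KG["hydrogen"] * KG_PER_L["h2_liquid"]      # ~8.5 MJ/L
ch2_mj_per_l = MJ_PER_KG["hydrogen"] * KG_PER_L["h2_700bar"]      # ~5 MJ/L

print(f"diesel:     {diesel_mj_per_l:.1f} MJ/L")
print(f"liquid H2:  {lh2_mj_per_l:.1f} MJ/L")
print(f"700-bar H2: {ch2_mj_per_l:.1f} MJ/L")
```

Hydrogen beats diesel handily per kilogram, but per litre (the thing that sizes your tank, your pipeline, and your tanker fleet) it's roughly 4x worse as a cryogenic liquid and 7x worse as 700-bar gas.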


Remember what we were saying about software archeology upthread? We're seeing that right now, as integrators wrap COBOL applications in remote procedure call or even remote message send wrappers that make them look like C or Java or Python services (or OLE, or COM, or SOAP, or ...). The result is going to be big globs of completely opaque code that's buried under layers of interface and protocol-matching. And if it fails, they're going to have to thaw out the COBOL maintainer who got the short straw and had to go into the liquid helium tank for on-call system maintenance. Still want to be a legacy maintainer?
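The wrapping pattern described above can be sketched in a few lines. Everything specific here is invented for illustration: the `legacy-posting` binary, the account/amount fields, and the fixed-width record layout are hypothetical stand-ins for a real COBOL copybook. The trick is that the modern service never touches COBOL; it just speaks the old program's record format over stdin/stdout:

```python
import subprocess

def build_record(account_id: str, amount_pence: int) -> str:
    """Pack arguments into the fixed-width record the legacy program expects
    (imagined layout: PIC X(10) account id, PIC 9(8) amount in pence)."""
    return f"{account_id:<10.10}{amount_pence:08d}"

def call_legacy(account_id: str, amount_pence: int) -> str:
    """Shell out to the (hypothetical) compiled COBOL batch program and
    return its reply, so callers see an ordinary Python function."""
    record = build_record(account_id, amount_pence)
    result = subprocess.run(
        ["./legacy-posting"],                 # imaginary COBOL executable
        input=record, capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()
```

Expose `call_legacy` over XML-RPC or SOAP and you have exactly the opaque glob described above: layers of interface with forty-year-old business logic buried at the bottom.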


The movie Probe gives another example of how much things have changed in just over three decades.
The two central technologies posited in this "Pilot for sci-fi detective series Search" are:

  • A "scanner" which transmits video & audio from an agent to headquarters, plus an earjack for audio reception in the other direction;

  • Cybernetic support back at HQ, essentially "a room full of experts, monitoring [the agent's] actions and vital signs, and supplying him with encyclopedic information."

These advanced and fantastic-bordering-on-mythical (in 1972!) technologies are scarcely distinguishable from today's cellphones, Bluetooth earpieces, and Google / Wikipedia. Plus ça change...

(Back then, anyone who admitted to watching "that futuristic crap" was either branded a nerd [now 'geek'], or was quick to use the "But it's got Elke Sommer and Burgess Meredith in the cast!" excuse.)

- Chris


American R&D?

Looks like it's doing pretty good - especially if you include things like medical and materials research.

111: The Cold War? Have you kept up on what the Pentagon is actually buying? Not so many big ships, lots more stealthy and high-information-use machines. Yes, the planes are expensive as hell, but ALL modern planes are expensive as hell... and they're very, very good at going after some of the folks who are going to be the next-gen targets. Even the next generation of Big Tech like aircraft carriers make it possible to extend power over long distances - and don't pretend that it's no longer needed. A Cold War strategy would be leaning a lot more towards large forces with simpler equipment instead of smaller forces with more sophisticated weapons and data handling.

...and no, even with the European delusion of "the American Empire," you don't want us to take our toys and go home to leave the rest of you to deal with things. You can't afford it, for one, and WW IV would break out in less than a year in Asia.

121: "Where are the products?"
All over the place. Wait a couple of months, and see Europe freak out over the iPhone when they start selling it there. A big chunk of computing technology is created in the US (and built in factories all over the world). Trade deficits are a lousy way of measuring this, since a lot of the money movement isn't measured as "trade." Also look at stuff like pharmaceuticals (more than half of those cheap drugs the rest of the world needs so badly are created and tested in the US - by some measures as much as 2/3), and medical hardware (things like MRI machines - which were invented in England, but made available at moderate prices by US companies).

Ditto for articles published. Much of the R&D that's happening here now isn't coming from universities any more, but from companies that aren't really interested in having their highly-paid engineers tell everyone how their research departments work. On the other hand, you have some really prestigious European journals which have apparently given up on that whole "science" thing in order to get enough articles (the Lancet and MMR, for a really blatant example).


Charles Stross @ 2^7

(Granddad worked on the Manchester Mark One for Ferranti, and probably knew Alan Turing personally; dad worked at Ferranti, ICL, and then Microvitec.)

I grew up in Philadelphia, where Eckert and Mauchly built ENIAC. The company they formed became the Univac division of Sperry Rand. When I was about 12 or 13 I went on a tour of a Univac 1 installation. The machine took up something like 10 or 20,000 square feet of a huge room. The memory, a few Kbits, consisted of acoustical delay lines using mercury. The tour guide said, "Here's what we're going to be using in the next model for memory", and held up a big frame strung across with wires, on which were threaded ferrite cores. This was the first solid-state memory; I think there were 256 bits on each frame.


"very, very good at going after some of the folks who are going to be the next-gen targets."

Every so often, the targets are foolish enough to fight you in the way that you want (1991 springs to mind). Mainly, not being stupid, they will fight you in ways that play to their strengths and your weaknesses. If the blue team is so sure about who the red team's going to be, then it's likely that the reverse is true and they are planning accordingly.


"Granddad worked on the Manchester Mark One"
I went to Manchester Uni (for a while, before fleeing to do something useful). They still make students program (a simulation of) that thing!


"Every so often, the targets are foolish enough to fight you in the way that you want (1991 springs to mind). Mainly, not being stupid, they will fight you in ways that play to their strengths and your weaknesses."

Actually, mainly, they do fight you in ways that let you kill them in droves. It's a very small number of them who do the actual productive warfighting. By using the big, high-dollar, high-tech weapons to knock down most of the bad guy population, you give them a much, much smaller population of effective troops to pull their high-quality guys from.

Another very, very important thing to remember is that, for the most part, the folks we'd face still use old-style, professional military forces, or act in ways that are easily countered by the use of real, heavy-duty firepower.

Even for the "small wars," having (for example) a long-range, stealthy, high-endurance fighter-bomber on station gives you some tactics that the other guys just can't match. Yes, they're "insurgents," but in much of the world, they're insurgents with government intel and support from neighboring countries. Which is part of the reason the US is spending so much money on things like drones and stealth tech, along with military comm and data processing.

This isn't even getting into the China/Russia situation. As resources like oil get harder to extract, it'll be cheaper for some political groups to pay for an invasion of a smaller (but resource-rich) country than to try and buy the resources on the open market. So China would take a shot at Indonesia, for example (their current build rates and force structure for amphibious landing ships seem to point at just that scenario). Unless someone like the US has a much bigger (or disproportionately effective) countering force, they could get away with it, too.

Actually, WOULD get away with it. Sure, they'd get some sternly-worded letters from Europe, and a few trade embargoes, but after a short time, most of the world would just let it slide. Which is why the US spends all of that cash on those big old aircraft carriers.

(Remember those talks with Iran that suddenly had some success? Oddly enough, they stopped being hardasses about a lot of things on the exact same day the US parked a couple of aircraft carriers off the Iranian coast. What a coincidence.)


Wait a couple of months, and see Europe freak out over the iPhone when they start selling it there.

That would be the iPhone that does mostly the same stuff Nokia Communicators from five years ago do, just more expensively, and all the same stuff Nokia N91s from three years ago do, just more expensively?

The same one that has a South Korean-made (Samsung), British-designed (ARM Holdings) CPU, a South Korean (Samsung) mass storage unit, a German RF baseband chip (Infineon), a British audio interface (Wolfson Microelectronics), a British Bluetooth stack (Cambridge Silicon Radio), and a large touchscreen from Germany (Balda AG)?

The one based on GSM? As in Groupe Spécial Mobile de la Conférence Européenne des Postes et Télécommunications?

Literally the only distinctively Apple contribution is the pretty box, and that would be the work of Jonathan Ive, who the last time I checked was...ahem, un-American.

Riiight. Next time, get an example that supports your argument.



If winning the big-army war means so much, how come the fighting continues in Iraq, and makes it difficult for the winning big army to go and break things somewhere else?

I don't think the Chinese amphibs make sense for an occupation of Indonesia. Forget American carriers, you need to be able to project air power to protect the attacking fleet.

Invading Taiwan doesn't make much more sense, but a plausible threat has political advantages.

And the Italians didn't need battleships to put British battleships on the bottom of Alexandria harbour.


Interesting stuff - and a spot of nostalgia for me.

Charlie is quite right about most people not living at the bleeding edge.

Ironically, back in 1977 (when I was aged 15) I was happily hacking my way into the South Western Universities computer network at Exeter Uni (SWURC was the forerunner to MIST, I guess). What I was up to then was utterly incomprehensible to my peers, let alone the poor buggers nominally watching over me.

Computer security then consisted of backtyping in black over the passwords on the teletype printouts, but this never obscured the indents in the paper. So there I was, raiding the litter bins with a 2B pencil. Passwords were also initialised sequentially, so I could steal whole blocks of IDs at a time.

Now that's one method I used to "hack" 30 years ago - and it would still be a bastard to explain to most people today (albeit perhaps for different reasons).

And for much the same reasons those phishing emails, stock tips, etc work and suck in people..........


Sorry for the fawning fanboyism, but this story illustrates precisely why I was so impressed with "Accelerando". It did a beautiful job of staying just one step ahead of where my own imagination immediately jumped, and thus encouraging me to take that next step knowing that I'd not be too deep in unfamiliar territory. Then, as soon as I started to get comfortable, it was time for another step... It got dizzying, but managed to seem at least partly comprehensible, with *just* enough recognisable "humanity" (for lack of a better word) for my mind to accept it.

Perhaps the medium of (serious) 1977 science fiction novels would be the best way of telling this tale to audiences past, partly due to the audience's expectations (and their willingness to be led into unfamiliar realms), and partly due to the scorn and doubt you'd receive if you tried initially to present this as anything other than fiction. Offer the story, let them absorb and understand, and then *afterwards* tell them that it's all true, once they have already realised how plausible it sounds if they can accept a few leaps forward.

Of course, if you turn around and tell me that Accelerando's all true, and you and John Titor (*) had already reached this conclusion, then all bets are off! ;-)

(*) http://en.wikipedia.org/wiki/John_Titor



(Like me predicting that in 30 years we'll nearly all be working from home using instant messenger/video conference systems, with some form of thin-client tech to access our employers' software/in-house systems

If we've not been uploaded)

Ah - now that illustrates where the unexpected happens.

The technology did indeed advance so that my work could be done remotely and I could work from home.

At which point they outsourced the work to people in India who would do it for less than I'd get on the dole......


How to explain the World Of Warcraft article to a person living in 1977...?

(Reminds me of a similar problem, my favorite: "How do you survive if you're suddenly zapped into the Middle Ages?" ;-))

Let's approach the issue from another angle. Don't get too wrapped up in the tech details: most people in 1977 couldn't explain how a TV set worked, but they were happy to use one.

Appeal to the familiar, the very familiar, to broad principles, and work your way up from there:

"Imagine what games we'll have 30 years from now. Like board games and those new role-playing games, only much better."


"Imagine how the games of the future are hooked up to computers, even more powerful than today, to create images and sounds, almost as good as reality."


"Like, for instance, a game based on Tolkien's novel 'The Lord of the Rings'. You play characters from the book, and the computer creates three-dimensional images of the landscape and characters on the screen. And the players can guide them!"


"You can see this vast Midgard-style landscape spread out on the screen. Other players are hooked up with the same landscape electronically, like a shared game-board -- they don't even have to be in the same room."


It's much harder to explain social change. People have rigid concepts about how they should behave, vote, and relate to each other.

The 70s person would start to groan at People Who Like To Dress Up As Furry Animals: "Man, now you're going too far! Computer users dressing up like furry animals? Freaks like that would be committed to insane asylums before anyone would let them get their hands on expensive, delicate computer equipment!"




In 1985 many good (high) schools in big cities in the USSR already had computer labs with personal computers. Lucky schools had imported machines; unlucky ones, locally produced. We had a classroom with about 20 computers tied into a LAN, each sporting a fancy Yamaha logo and 64k of RAM.

When I went to college in 1987, they had many labs with IBM PC clones and older locally produced PCs based on the PDP-11 architecture.

Either this American academic was making the story up or his host was making fun of him. Of course, he could have met an especially obtuse one...

Of course, almost no one had a personal computer at home. But quite a lot of people already had access to one in 1985.


In 1977 I was a 15 year old high school student who spent a lot of time reading science fiction. At the time, I was certainly aware of computer gaming at the level of Adventure, and I was used to logging in to a time sharing system using a 300 baud modem and a terminal. I was also aware of the idea of a "personal computer" and busy bugging my parents for money to buy a kit so that I could build a system. I was aware of how rapidly computer technology was changing and I honestly thought that by the year 2007 we would be well on our way to a world of science fiction technology.

So, I don't think that I would have been that amazed to hear this story in 1977. If anything, I would have been more interested in questions like: Where are the supersonic transport aircraft? Why haven't humans travelled to Mars yet?


Alexey @43: yeah, but I think the story was told of the early 1980s. (Or you might be right about their hosts making fun of them.)


In 1970, the people who worked with the IBM mainframe at Cornell University were routinely playing a game based on the lunar lander simulation. You maneuvered the descent of the lander by punching direction keys (just like the real one).

The object was to land directly in front of a McDonald's.
If you did it right, a little spaceman got out and entered the McDonald's for a burger.

If you landed on the McDonald's, destroying it, the screen went blank and then huge letters appeared saying "You idiot! You have destroyed the only McDonald's on the moon!" Your play was penalized from then on.

Not only was this game available to those at Cornell, but it was actually hosted on a mainframe at the University of Illinois and played over a special wide-area data link.

So in 1977, the reaction of the folks who knew about this would have been, "It's about time we got a better game! What have you been waiting for?"

It's used by corporations and governments and other groups such as people who like to dress up as furry animals to keep tabs on us.

Waitaminute! The Furries are watching me? That creeps me out more than the idea that Marks & Spencer's and the government are watching me. After all, I know more or less what the government's and M&S's agendas are, but the Furries? Brrr!


A late entry in the Where Were You in 1977 collection:

I was sixteen, just finishing my Computer Science O-Level by working on a project to convert one of Simulations Publications' simpler wargames to computer-aided play. (It was 'Oil War', ironic as that is to my present self; an invasion of the Persian Gulf by the US and Israel.) A vastly ambitious project, especially considering that the program was written in BASIC, and literally written on modified graph paper and posted to Plymouth Polytechnic to be typed in and run on their mainframe, there being no other available computers in the education system within 30 miles. It never ran properly, I don't think, but I got a B on my O-Level and an invitation to visit an MoD computer installation where they ran war simulations.

I think Shockwave Rider and True Names were still quite a few years into my reading future in 1977. As far as computer games go, I was still imagining something like those cardboard counters and two-colour maps on a colour TV screen. I'm still surprised by how much games now rely on spiffy 2.5D graphics and have so little sophistication in their play.


I like the chain of thought involved here, but for Mr 1977 could we just call it an electronic advert of the future? I must say discussing the death of electronic gnomes is a nice way of showing how much change has happened in just 30 years.
We may not fully understand our future, but isn't this also true for the past? I don't know how the pyramids were built, but I bet most of the hieroglyphs of that era were still promoting herbs for erectile dysfunction and the like...
If we are at 'Virtual Gnomes' now, then what will it be in another 30 years of accelerated change?!
(Shrug... Maybe we should keep the worldwide EMP on standby just in case we need to pull the plug.)


"That would be the iPhone that does mostly the same stuff Nokia Communicators from five years ago do, just more expensively, and all the same stuff Nokia N91s from three years ago do, just more expensively?"

Actually, no.

It does a lot of that stuff, but it does it WELL, and is damned easy to use. Your comment is like someone claiming that a DOS computer would do as much as a late-1980s Macintosh, because it would let you edit documents and save files.

So far, most of the complaints like yours come from people who haven't actually seen an iPhone yet. I had my iPhone out the other day, and a guy with the newest, hottest, top-line Nokia he'd just bought was sitting next to me. The first thing he said was "it makes my new one look like a toy!" I was using it yesterday, and another guy had that nice new Blackberry Pearl - no comparison, starting with the screen size and working down the list...

Yes, the components are made all over the place, but that's true for EVERYTHING nowadays - and if it's so simple to put these things together to get an iPhone, then you're going to need a helluva explanation for why it does all of that stuff so much better than the "list the features but don't worry if they actually work well together" phones. A number of the "it won't do that" lists also ignore the fact that the iPhone is a real computer under the skin, and most of the issues will be easily solved with software updates.

A lot of the tech that's "made" in places like China is built from designs that come from American companies, and is licensed (or pirated, in many cases) from US sources.

Look outside of the computer and telecom industry, too. Medicine, materials, et bloody cetera. It slips by (as I mentioned before) because a lot of US companies either invest in those foreign companies that are producing the end product, or get royalties from those companies from licenses. Once again - doesn't show up in many "trade deficit" measures.


"If winning the big-army war means so much, how come the fighting continues in Iraq, and makes it difficult for the winning big army to go and break things somewhere else?"

It really doesn't. You need to realize that a lot of the reason the US has only a certain number of troops in Iraq (and not that much equipment, overall- very few aircraft other than rotorcraft and such) is that we're holding back a LOT of troops and capability for just such an occasion. At worst, we'd have time to withdraw our Iraq forces in case of the outbreak of WWIV.

"I don't think the Chinese amphibs make sense for an occupation of Indonesia. Forget American carriers, you need to be able to project air power to protect the attacking fleet."

Look at the distances and travel times. A good-sized fleet of ships could make the run in much less time than the US could put a fleet on duty - and the Chinese sub fleet could make a US President hesitate just enough to let them get a beachhead.

Consider also that the first move by China would probably be something like a handful of large container ships full of troops - most people don't realize that most of the Chinese "merchant fleet" are what everyone else considers "active duty navy."

They could get upwards of 100,000 soldiers on the ground (plus equipment and ammo) with ONE large container ship - ten guys plus equipment per container... the landing craft would be for the more-isolated places, like oil fields.

"And the Italians didn't need battleships to put British battleships on the bottom of Alexandria harbour."

A couple of dozen minelayers, plus the Chinese sub fleet, and the US Navy would (at best) be slow to get on-site. Which would be the time for our OTHER hardware to get involved. Those "Cold War" B-1s and B-2s, along with our other strategic assets and long-range fighters (the F-22 would be able to work from existing bases, while older fighters would be out of the picture) would give us a shot at stopping things before they got entrenched too much.

If the Chinese didn't manage to keep the US carrier fleets out of the war, they might as well not try - but with politics nowadays, much of Europe would be bitching about the US, not China, in this situation.


I think the Chinese leadership are far happier to let the example of the successful repatriation of Hong Kong to China from Britain, without it breaking economically or socially, percolate through to the Taiwan situation. Most of the intemperate remarks about the area have come from military sources, not the party.

They probably believe that they have a long-term winning hand vs the US as long as the game doesn't get thrown on the floor too early.

-- Andrew


Cirby @ 159: Is that the new official explanation for why things are so fucked up in Iraq: because the Maladministration didn't take it seriously enough? Because they were holding back the reserves to intimidate China?

That's an absurd suggestion. Though that doesn't prevent it from being true, alas.


It occurs to me that if you were describing this scenario to someone in 1977, they might blithely assume that the dead gnomes were all AIs, wonder whether they were really dead or whether they could be restored from back-up, and want to get into the ethics of creating sapient beings just to slaughter them, especially for such a trivial purpose.


having read through these (seemingly) endless comments, the one thing that jumped out and yanked my chain was the "amurka -- we're bigger, badder, better than everbuddy els" meme.

i'm amurkin myself and wishing desperately i could get the hell out of this place ... i remember 1977 ... i spent that summer in saudi arabaia (dhahran north, to be precise), computers were nowhere on my horizon and i was still innocently hopeful that people everywhere were making positive forward progress ... i have since grown up, having my innocent hopefulness ripped off and trampled by people who should have known better, done better, wanted better.

and this WOW thing? i have the same opinion of that as i do/did D&D ... it's the way some people choose to fool themselves into thinking they actually have a life. but then again, who am i to cast aspersions on the methods and means by which others conceal their flayed lives from themselves and those around them?

but c'mon, boys, keep the testosterone in check -- let's cool it with the "amurka furst" crap, it makes everything else that comes out of your fingertips only slightly less turgid in comparison. otherwise, an interesting and in many ways enlightening discussion.


Charlie - Just started reading Glasshouse - love the "past shock" "Reeve" is experiencing - never thought of it working backwards (Although coming home to AU from the US is similar - my wifi "netlink" is dead here most of the time too).




Cirby, I have a Nokia 73, which has been out for two years now, and it runs cut-down Office apps; a web browser that renders HTML, CSS and JavaScript correctly, with a cache, that does the zooming-in-on-a-large-virtual-page thing all the pundits are raving about; an RSS reader (in fact I read this blog on it this morning); a faster radio data link; and a 3.2 megapixel camera with a real lens. And it can be programmed in Python.

Before that I had a Hewlett Packard HW6915w, which has a 400MHz or so processor, MS Windows Mobile 5, Pocket Office, WLAN, GSM/EDGE (so the same radio config as the iPhone - 2 years earlier), and GPS.


Another: "If winning the big-army war means so much, how come the fighting continues in Iraq, and makes it difficult for the winning big army to go and break things somewhere else?"

Cirby: "It really doesn't. You need to realize that a lot of the reason the US has only a certain number of troops in Iraq (and not that much equipment, overall- very few aircraft other than rotorcraft and such) is that we're holding back a LOT of troops and capability for just such an occasion. At worst, we'd have time to withdraw our Iraq forces in case of the outbreak of WWIV."

Cirby, you really should understand that you've got to decant the BS from the jug labeled 'Genuine 100% Certified Bull Sh*t' before passing it around here.

However, thanks for a preview of the latest argument going around among the 26% dead-enders.


57, SM Stirling: "OTOH, if you avoid the high-tech stuff, it wouldn't be too hard -- most everyday technology (subways, elevators, cars, bathtubs and showers, airplanes, guns) won't be all that different from what we have today, just as 1977's tech isn't much different from 2007's at that level."

Considering that subways are generally far, far more surveilled now than they were in 1977, I think that you might want to reconsider. With guns I can think of several important things, ranging from widespread smart-locks to standard equipment of night vision sights.


@167: Details details. That's the sort of stuff you tweak anyway to build your future.


I think I saw advances in computers and communication technology; what I didn't see was the vast popularization. Computers? Isn't that too geeky to live? My first *positive* thought after seeing "toyota.com" on a Boston subway ad was, "well, maybe a temporary fad will build out comms for us geeks and make it *really* cheap after the telecoms bust." Little did I know....

Another thing it would have been hard to explain to me is the commercial value of botnets and hence 0days, since the real bootstrap for them was spam, and I of course thought the spam problem would be mostly solved by 1998 or so. Trojan keyloggers, that I could believe.

Re videophones: I use them at least weekly for business. They've been here in a big way for well over a decade. For personal use, well, porn led the way again....


Charlie, just out of interest, has this thread attracted much spam from the gnome-bombers?


Dave: weirdly, none at all (yet).


Barry @ 166:

I'm not sure what you're trying to say in your post: that the US military has most of its troops in Iraq?

We have close to 170,000 troops in Iraq, compared to the 116,000 we have pacifying Europe. Plus the 80,000 or so occupying Japan and Korea right now. :)

And another 20,000 in Afghanistan, but they always get overlooked because of Iraq.

But all of that pales in comparison to the over 1 million active troops we keep in the US.


Andrew G, Barry: this is not a let's-re-fight-the-cold-war topic.

Any more postings on this subject will be deleted (and you'll be banned).


Ahem. If this gets banned, I could certainly see why, but . . . my own two cents is that the 21st Century will be known as the Century of Consolidation. All that stuff about AI, (more) intensive manned space travel, genetic engineering, nanotech, etc. will have a lot of spadework done now, but it won't happen until the _next_ century. No, the big thing will be that everyone catches up to the standard of living Usians enjoy. Kids in Nigeria will be spoiled rotten, playing WoW, and cursing Mom & Dad for haranguing them to either go to college or get a real job. Iraq? Hard to believe that there was a war on and no constant electricity in their grandparents' time. Oh, there will be some pockets of old-fashioned 20th C. poverty, but they'll be really in the backwoods.

IOW, the future is now in some privileged places, and 83 years hence everyone else will have caught up.

_That's_ the big story of the coming century.


I've just come across this via www.crikey.com.au, which mostly talks about Australian politics, business and media. {No, I don't own shares in it; or any other connection}

Apart from the great article, what impresses me most is the quality, thought and insight in so many of the responses. Not to mention the small % of flame ... or have you deleted those already?

I was just finishing high school in 1977. I was one of the lucky ones with a Texas Instruments Programmable calculator! I went to a meeting where a guy talked about a thing called an Altair and a thing called an Apple, but I didn't really see anything exciting about programming microprocessors :(

I've been saying for the last 10 years to anyone who'll listen "Our grandchildren will ask "You mean you didn't always just TALK to computers???""

But reading through all these posts, perhaps I'm too optimistic. WoW has never really appealed to me that much. But I still waste 2-10 hours a week on a football manager simulation called Hattrick; 10 years old and with almost a million users.

BTW, I want to buy some gold. Where are all those spammers you promised at the start ??? :)))


ScentOfViolets: you're dead right, that's one of the coming big stories of the 21st century. The other is that nobody is going to be living like 2007 middle-class America. The definition of affluence changes over time, and a lot of things that look desirable right now are going to seem laughable in 2107 -- the trouble is, we don't know which things.

I'm guessing that having to drive your own automobile and fuel it with horrendously expensive petrochemicals will seem nonsensical by 2107. (But there'll be some equivalent niche for personal autonomy/travel/freedom, and we'll still be getting about, and as with horse riding today, there'll still be plenty of hobbyists maintaining their manual transmission historic gas burners.) And I'm guessing the big news in dealing with climate change will be civil engineering projects on scales we can't even envisage right now, driven by much better environmental models than any we've had in the past. And there will be more people from 2007 surviving into 2107 than anyone now expects. But there are too many long-term variables to make any solid predictions ...

Charlie K: weirdly, I haven't had to bring down the hammer on anyone yet. (Although I've got a fairly efficient system for dealing with blog spammers, hence the lack of v**gr* and offshire gambling ads).
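(Charlie doesn't describe his actual system, but as a purely hypothetical sketch of the kind of obfuscation-aware keyword filter 2007-era blogs ran against "v**gr*" comments; the blocklist and the leetspeak mapping below are my own assumptions:)

```python
import re

# Hypothetical illustration only -- not this blog's actual filter.
# Undo common digit-for-letter substitutions ("v1agra" -> "viagra").
LEET = str.maketrans("10437", "ioaet")

# Assumed blocklist; a real one would be far longer and weighted.
BLOCKLIST = ("viagra", "casino", "wow gold")

def looks_like_spam(comment: str) -> bool:
    """Return True if the comment matches a blocklisted term, even when the
    term is disguised with digits, dots, or spaces ("v.1.a.g.r.a")."""
    text = comment.lower().translate(LEET)
    text = re.sub(r"[^a-z]", "", text)  # strip punctuation and spaces
    return any(term.replace(" ", "") in text for term in BLOCKLIST)
```

Real tools of the period (Akismet, MT-Blacklist) combined lists like this with link counting and submission-rate checks; a pure keyword filter is easy to evade.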


Yes, I should have clarified. Most everyone will have the signifiers of the equivalent of 20th C. Usian middle class. I don't see the automobile as being one of those . . . but it's possible.

In fact, that's a whole 'nother topic, one that I don't see much exploited in sf: what will be the signifiers of a typical affluent middle-class person?

Right now, my daughter just _has_ to have a cell phone, a personal computer, and an iPod, and those fulfill a typical 13-year-old's definition of being up to the minute with the current status objects (I'm a 21st-century affluent middle-class parent, so of course she has all of those things). What would a 13-year-old covet in 2107?


Charlie @ 176:

"Offshire" gambling? Is that when hobbits sneak out of the Shire and visit Bree? I wondered why there were so many people at The Prancing Pony, and how Bree folk were so prosperous in what's basically the middle of wilderness.


ScentOfViolets@177 " ... What would a 13-year old covet in 2107?"

I seriously fear it will be food and a dry place to live. :(

Andrew G.@ 178 Gambling in the Shire has been banned since Morgroths Pony won the Hobbiton Cup in 4723. :)

Cmon, you've never done a tpyo on your intire life?


Oh, I do plenty of typos. I just thought that was a funny one. I pictured hobbits lined up at slot machines... :)


I'm just wondering how many of the "the US is declining and doesn't do anything any more" folks are using Windows or OS X machines, or are taking drugs that were invented and tested by US drug companies...


Cirby: you are beginning to annoy me.

Go polish your marching band's brass on someone else's blog.


ScentOfViolets@177 makes an excellent point, I think. Instead of trying to predict what governments and multinationals will be doing thirty or a hundred years from now, we ought to be wondering what's going to ring the teenagers' bells. I might be wrong, but it seems to me that the mobile phone revolution has been driven in quite large part by teenagers taking to text messaging in colossal numbers. Was that predictable back in 1977? If it was, I don't remember hearing about it.

One problem is that you can't predict the future down to the last decimal point; you can only make educated guesses, and even then there will always be something that comes out of left field and changes things in ways nobody ever expected.
Another problem is that change is not uniform. I recently interviewed an old chap who built his own television set so he could watch the Coronation back in 1953. He got the instructions from Wireless World or some magazine like that, saved up a few bob each week and bought the components cheap from a mate who ran a television rental shop. You couldn't do that today. (Now someone's going to tell me you could.)
On the other hand, apart from some bells and whistles, the internal combustion engine hasn't changed all that much in quite a long time, and someone who tinkered with engines back in 1953 wouldn't need too much conversion time to work on one of today's engines. A television engineer from 1953 might have more difficulty with today's flatscreen LCD sets.
Charlie's hypothetical bloke from 1977 should easily be able to follow his briefing about WOW. Someone from 1967 might have more difficulty. Someone from 1957 would have absolutely no idea what you were talking about. But he'd still be able to build you a tv.
Sorry, went on a bit there.


@176 "The other is that nobody is going to be living like 2007 middle-class America. The definition of affluence changes over time..."

As Charles mentioned a few days ago, Agatha Christie once said "when I was young I never expected to be so poor that I couldn't afford a servant, or so rich that I could afford a motor car".

For things that everyone reasonably affluent has now, but won't in the future, water is a good candidate ("When I was young taps didn't have photocells", "I never expected to be so poor I couldn't afford a lawn and a swimming pool"), followed by meat ("back in my time I ate steaks and burgers every day") and, most especially, human labour ("shops used to have attendants, and supermarkets had cashiers, human cashiers, in those days", "everyone went to the hairdresser's to get a haircut"). Space has become ever more expensive in the last generation, but I wonder whether telework won't mean millions of families leaving for villages and small towns, and houses becoming bigger and cheaper.

Upper-class and expensive articles that could become common in the near future include individually customized/tailored garments and shoes instead of prêt-à-porter; GPS everywhere (you'll never walk alone, literally, but you'll never again lose an umbrella; you could even conceive of a simple device with two buttons, where the blue one calls the police and the red one summons a medical emergency team); alarm systems connected to the nearest police/private security station in every home, and security guards in every block; genetic therapy for everyone's children (apart from the 'feel good' value, states will probably decide it saves money in the long run); your own personal reserves of blood, tissues, bone marrow and stem cells, started from the very moment you are born...

@183 "we ought to be wondering what's going to ring the teenagers' bells..."

Have you watched the movie 'Sweet Liberty'...? Excellent comedy; the archetypal Hollywood mogul keeps insisting that teens are attracted by three things, namely (in my words, not his):

- Rebellion, individualism, freedom
- Property destruction and big explosions
- Sex, sex and more sex

He was speaking about moviemaking, of course, but he had a point. Mobile phones make private conversations easier and give teens more freedom (because their parents can always contact them), and consequently they should have been expected to be an instant hit (and perhaps their makers actually did foresee that success...)

In the future I would expect, for example, teens to like having GPS implants allowing them to be located in case of need or emergency, because, just as with mobile phones, these would increase their actual freedom of action by allowing police and parents to monitor them more easily... a bit paradoxical? Perhaps.


Isidro@185, I have seen Sweet Liberty, although it was a while ago, and enjoyed it. Oddly enough, I'm attracted to those things, too...


The idea that the Nigerian masses will sometime this century find themselves enjoying a standard of living equivalent to that of today's western suburbs is not completely fanciful, but unless there are serious changes in Africa's political relationships to the outside world I'd place that idea somewhere towards the 'highly implausible' end of the scale.


D. O'Kane @ 187:

I disagree -- I don't think that Africa will have to clean up its act politically to enjoy 2007 first-world standards of living this century.

Now, they certainly will to reach contemporary first world standards of living. Nigeria in 2057 might have the standard of living of the US in 2007, but not of the US in 2057...

Take phones as an example. For years there were wait lists, high costs, poor service, minimal coverage. Then cellphones came along, and in many places increased phone usage many times over. We'll probably see the same sort of thing with computers in the next decade.

If cheap solar power gets developed, Africa can take advantage of it without the need for the same sort of stability you need to build a conventional power plant.

And personal water purification tech developed for hikers, campers, and soldiers can be used in regions where the political situation makes water purification plants and distribution systems impossible.


Never mind that the primitive from 1977 wouldn't be able to understand adults spending hundreds of hours playing a video game, I can't either and I've been computering for over 20 years now.


Andrew G., post 188. What I'm about to say may sound snide and hostile, but I honestly, genuinely, don't mean it like that. . . but what prior experience (first or second hand) of Africa do you have?

There are already people in almost all African countries who enjoy western standards of living. They are all in the upper echelons of their societies, however, which means that they have access to power and wealth which are denied to the broad masses.

Look at the problem of food security, for example. We know from the work of Amartya Sen and Alex de Waal that famine is not a natural but a political phenomenon, and that it is avoided when the masses who would otherwise be at risk of famine can bring political power to bear on their rulers. I'd say the same principle can be generalised to other areas of access to subsistence and living standards.

The Niger Delta is in ferment because the local population receives literally ZERO benefit from the extraction of its oil wealth, and this occurs because the Nigerian elites are able to monopolise political power and exclude the masses in the delta and elsewhere from access to that power and therefore from access to economic power as well.

The nature of that elite is best expressed in the case of the Nigerian space programme. . . it does exist, and there's at least one Nigerian satellite up there right now. But the Nigerian academic I spoke to about it said that it's basically a prestige thing, an exercise in window dressing.

*Maybe* technological change will bring things like water purification or electricity power to African communities in the ways that the cell phone revolution has done. But I wouldn't *assume* that this is inevitable, and I wouldn't isolate that process of technological change from the social and political changes which will be necessary if the masses are to benefit from it.


D. O'Kane: I haven't been to Africa personally, though I have spoken with Africans and people who visited Africa beyond going there as tourists. I've also studied the history of the continent, though that's relatively broad. So I do know something of the sad reality of the place, and a century of mismanagement and squandered opportunity.

You're right that the most serious obstacle to development there is political, rather than financial and technical. I just think that where technology brings the cost of something down to a level where the government can be bypassed, the average African will use that to make his/her life better. Cellphones are the best modern example of that. If things like the OLPC and similar actually work and get produced, I expect African PC ownership will reach levels close to that of the rest of the world -- even if their computers aren't much like those used in the US by most people. The convergence of cellphones and computers is another trend that holds promise for Africa -- they may do all of their computing on cellphones with external keyboards and screens, connected to the internet via the cellular network.

I think that's inevitable unless a government actively opposes it, above and beyond the usual level of corruption. Which is possible, but I doubt it will happen for the whole continent.

And the nice thing about computing, cellphones, and internet access is that they allow Africans to learn more about the rest of the world, and draw on the resources & knowledge of the West. There are all sorts of things, from medical information, to how to build a windmill to power your home, to better ways to farm organically, that are available for free if only you can find them.

And that's just the tip of the iceberg. Right now Google and others are digitizing books; eventually every book in the world will be available online. There are free scientific journals and scholarly articles available right now as well. Imagine if anyone with a small computer and internet connection anywhere in the world had full access to the 10+ million volumes in the libraries of Yale, Harvard, Oxford, etc. Or if they could talk with people fighting government corruption in Mexico, Poland, India, the US...


O'Kane: - the Irish economic boom does indeed involve hyper-exploitation of immigrant labour. But this is very far from being a mere continuation of the survivals from Ireland's pre-industrial past which I alluded to in my original post referring to the landless labourers and their unattractive accommodation options.

-- I'm aware of that; my post was _ironic_. This is hard to convey online. Let's split the responsibility for the misunderstanding equally... 8-).


Alatriste: The future as it appears in SF is usually (I'm tempted to say always) far more uniform than the real world is. Even alien races are usually described as extremely uniform, one nation with only one culture, one tech level, one society, all belonging to one religion or all of them atheists...

-- I agree that is unlikely in the short term, and unlikely on any world that's preindustrial or which has only recently gone through its Scientific and Industrial Revolutions.

OTOH, in the long term (and barring catastrophe) it's quite likely IMHO.

Social and, even more importantly, geographical distance are the main explanations for the cultural diversity of our species.

Modern communications, mass migrations, and the breakdown of social barriers to cultural and biological mingling (racial prejudices, etc.) are reducing the cultural differences on Earth. We're not usually aware of it because the same factors bring different cultures into closer contact, so it _feels_ as if the world is growing more "diverse". The world as a whole isn't -- it's just that most _parts_ of it are.

Eg., England now has lots of people from the West Indies, South Asia, and Eastern Europe whereas 50 years ago nearly everyone there was British-born, tho' of course with some exceptions.

But the West Indies, South Asia, China, etc., are much more Westernized now than they were 50 years ago -- the end of the Western colonial empires has _increased_ the degree of cultural Westernization.

For example, in 1948, something like 5% of people in India could speak English. It's now over 20% and growing fast; when I was in Bangalore this summer, every third building seemed to have a sign for an "English academy".

And look at the way small languages have been going extinct at ever-accelerating speed.

So eventually, if civilization persists (and barring a Singularity) Earth would probably become pretty monocultural.

Everyone will wear the same clothes, be about the same toast-brown color, speak the same language (English, probably), have roughly the same social-political ideology, etc.

Mind you, this would take a long time, and there are conceivable developments which could stop it. But I'd say it's the maximum probability.


Alex121: If US corporate R&D is so great, where are the products? Why the huge trade deficit?

-- there are lots of products, and US worker productivity has also been growing faster than the developed-country average.

(Europe's grew faster from the late 40's through the 70's, then started falling behind again in the 80's. This is why the richer EU nations have per-capita GDPs at about the level of the poorer US states. That's not counting mini-me's like Luxembourg.)

The US has been running a merchandise trade deficit for... hell, for most of the time since 1964, IIRC.

And it ran a deficit for most of the period before 1914, too, particularly in manufactured goods. The period between WWI and the mid-60's was an anomaly, as the rest of the manufacturing world were bollixed up by the aftermath of WWI, and then for a generation after 1945 they were still recovering from WWII.

"Merchandise trade deficit" is another way of saying "attracts a lot of capital inflow."

There's a vulgar economic error which treats a big merchandise trade surplus as evidence of a "strong" economy, and ditto an appreciating currency -- as some sort of weird virility symbol. I suppose it's a leftover ideological afflatus from mercantilism or something.

It isn't. The reason, say, Germany and Japan export a lot is basically because of oversaving, and depressed domestic demand. Depressed partially by the way their economies are structured, and increasingly by their deathbed demographics.


Chris Williams 126: Can anyone point me to any comparative data which shows a correlation between R+D spend and economic growth? I was under the impression that there wasn't any.

-- depends. Sometimes there's a correlation, and sometimes there isn't.

Eg., the Japanese often spend a lot on basic research, but get very little out of it. Japanese scientists are notoriously more productive outside Japan than inside it, probably due to the local academic culture. The same holds for much of Europe, but to a much smaller degree of course. The Japanese are, however, very good at _commercializing_ research. Their engineers are second to none.

The US has shown very consistent economic growth rates over the last generation; around 3%-3.5%. This is in line with the long-term national trend level and it's rather higher than the 1st-world average.

Something happened in the 70's-80's to reverse the post-WWII trends. It's complicated, but it's there.


Dave Bell146: And the Italians didn't need battleships to put British battleships on the bottom of Alexandria harbour.

-- ah... that was a one-off, and the RN defeated the Italians rather comprehensively in the Med in WWII. The Italians were often brilliant and daring on an individual or small-unit level, as in that raid on Alexandria, but collectively they were abysmal to an almost cartoonish extent.

Just before they changed sides, a German military magazine published a joke article which started: "An Italian armored division has attacked a British bicyclist on the southern front. After a day of intense combat, our allies are in firm possession of the front wheel, and fierce hand-to-hand combat is raging for the handlebars."

The Chinese aren't going to invade Taiwan as long as they know the USN would intervene. In conventional terms they'd get licked, and no nuclear exchange would or could be limited.

That's why the "how many cities are you willing to trade for Taiwan? We will trade two or three" thing is nonsense. If they used _any_ nuclear weapons on us, they'd get a massive response, not a limited one.

This is why nuclear-armed countries can't fight large conventional wars with each other.

The Soviet Union bankrupted itself buying tens of thousands of tanks for a war that couldn't happen(*). I think the Chinese will be a lot smarter than that -- they'd figure out other ways to use force and/or intimidation to get what they want.

(*) back in the 80's, someone who had connections at the Pentagon told me that the real US strategy for WWIII in Europe was: "Fight with conventional weapons until we lose, then fight with tactical nuclear weapons until we lose, then blow up the world."

"Tactical" nuclear weapons being defined as "nukes that go off in Germany".


167: Considering that subways are generally far, far more surveilled now than they were in 1977, I think that you might want to reconsider. With guns I can think of several important things, ranging from widespread smart-locks to standard equipment of night vision sights.

-- nope. Getting on the subway is about the same as it was in 1977. Details have changed (charging the fare to a card, frex); the experience itself is much the same.

Ditto with guns. Basically, firearms changed rapidly between the 1880's and 1900; since then, it's been fiddling with the details. Cars became a mature technology in the 1940's.

Aiming technology has begun to seriously improve recently (optics getting cheaper, red dot projectors), but the _gun_ is pretty much the same old thing.

The US army's assault rifle and machine-guns are all 50's designs... except for the Ma Deuce, which is a _1919_ design.

In 1919, a 50-year old rifle would have been totally obsolete and a 50-year-old machine gun would have been a black-powder cranked Gatling.


Now, _tanks_ are not a mature technology yet. Tanks have been improving quite rapidly over the past generation.

The current cutting-edge MBT's are much more resistant to damage than their immediate predecessors -- the gun-antitank weapon/armor race has been swinging towards the armor side.

1945 tanks were much more vulnerable to available man-portable antitank weapons than 2007 ones, tho' of course they're not invulnerable. Active-intercept and electromagnetic protection systems are at the proof-of-concept stage and prototypes are being deployed.

Dave Drake's far-future supertanks in the HAMMER'S SLAMMERS series are already outclassed, in some respects (armor protection, tho' of course not the fusion powerplants!)

And the IT input in the latest tanks is really important to their performance, not just in the 'nice to have' category.

360-degree video with very good low-light and smoke-and-fog penetration, and surveillance programs that identify movement, weapons signatures, etc. A buttoned-up tank used to be nearly blind and an invitation to an infantry antitank hunting team. That's ceasing to be true. The ability to share data extensively is a real force multiplier, too, and it's a lot easier to provide for a tank than for a foot-soldier, though they're trying hard.

Take a look at a Merkava Mk 4. The gunner just designates the target on a screen and punches the firing control. The systems do _all_ the rest, even at night and high speed, and you get a near 100% hit rate.

I would really, really not like to have to fight one of those from a T-72.


Incidentally, Bangalore is a fascinating place and very impressive, not least for its sheer energy.

You get the impression that if three rocks dropped off a truck, within an hour someone would have built a shop over them, hired a couple of stone-carvers, and the discoverer would be outside selling like hell.

I was there for a wedding -- between a Russian-American (via Uzbekistan) convert to Hinduism who was marrying an Indian IT type currently working in Denver.

Now, THAT'S globalization.

As the only gora-log there, we got a quick immersion course in life in an upper-middle-class Hindu family, early 21st century style.

Very impressive people, and very hospitable. I hope they all immigrate to the US, like Amit. They'd certainly be an asset, and they _really_ know how to throw a 5-day nonstop wedding!


Somewhere above was mentioned the idea that everybody in 2037 would have the lifestyle and wealth of the US today. I commented about the problems with sustainability, but not only this -- first of all, I'm not so sure the implicit theory of modernization is right (cf. @199 -- multiple modernities), and second, I would much prefer the benchmark for wealth everywhere to be one of the top nations on the human development index (HDI; the US is only in place #8), adjusted for ecological impact. The HDI also gives a good idea that wealth in the sense of happiness doesn't have to follow the US path; the other top-runners of the HDI are Norway, Iceland and Sweden, Australia and Ireland, Canada and Switzerland, and also Japan. That's quite a spectrum of systems and life-styles.


SM Stirling - well, there's been a shift since the 70's to bullpup designs for rifles, but yes.

As for active intercept systems, well, the Merkavas are being refitted with them later this year as I understand it. The priority got increased after the Lebanon war, where it would have helped in over a dozen incidents.

The concept is apparently very similar to the techniques worked out after the Yom Kippur war, but it doesn't need to use the tank's main gun for beehive rounds, so the speed of response and reload time are much shorter.


i'm seeing an information lockdown on the horizon: video surveillance now entering the public sphere, "paperless" electronic voting, cell-tower trails recording one's whereabouts, legal narcotics prescribed to suppress normal human behaviors which have been deemed unacceptable or out of line, the enormously expanded "recordkeeping" that charlie spoke of earlier ... etc, etc. and all of the electronic surveilling (direct and indirect) propounded as "objective," therefore, most people won't complain, and even if people did complain, because the "error" came about through objective means, no one is held accountable -- a repressive society built on the theory of plausible deniability.

the western world, one day at a time, but inexorably, becomes stranger and more frightening than most of us could conjure in our wildest dreams.

i think the social repercussions are far more interesting to consider than what potential new gizmos there might be.


SMS@ 195, (on whether or not there's any actual, you know, evidence, that economic growth correlates with R&D spending) 'sometimes there's a correlation, sometimes there isn't'.

There's a technical term for when there's a correlation only once you've taken out all the data points that don't show a correlation. That term is 'yeah, right'. Anyone - can you point me to any credible research showing such a link?


Globalization makes such a _direct_ link R&D -> Growth highly unlikely. Intel, for example, obviously has superb R&D, but that R&D is not located exclusively in America; the company makes a lot of money from it, but its factories are spread more or less all over the developed world (and new ones will be even more widely scattered, in China, Vietnam... )

Thus, Intel R&D does not benefit only one country in terms of economic growth. However, the profits Intel gets from that technology do reduce the US trade deficit.

I read somewhere that the US economy was actually relying increasingly on three exporting sectors to at least partially balance the deficit: aerospace (military and civilian), IT (software even more than hardware) and entertainment (mainly Hollywood movies, but also TV series, etc., etc.)


Ramos: 204: Globalization makes such a _direct_ link R&D -> Growth highly unlikely. Intel, for example, obviously has superb R&D, but that R&D is not located exclusively in America; the company makes a lot of money from it, but its factories are spread more or less all over the developed world (and new ones will be even more widely scattered, in China, Vietnam... )

-- yup.

The determining factor now, nationally, is how receptive the economic structures are; how flexible, etc.

This is particularly true now that capital has become so mobile.

I remember some years ago a German businessman being asked why German business wasn't creating any jobs. He replied (to paraphrase) that they were creating lots of jobs, just not jobs in Germany. And that when it was worth their while, they'd create jobs in Germany as well, and until it was, the working class could kiss his ass.


The Internet, you say? Ok, that sounds pretty cool.

Personal computers? Heck, I know people who have their own computers at home. How fast did you say they were? Not bad. And people use them to play games?

Spam? Yeah, I can see how that would happen. You'd think there'd be some way to stop it, but it still sounds plausible. Ok, I'm with you so far.

Hang on, did you say "drugs for erectile dysfunction"? Are you kidding me? Who's going to invent these things, and when can I buy stock?


I lived through the 70s. I don't think it would be that difficult. The conversation would go (something) like this:

1977 DISCO DAN: "I don't understand this article about gnomes. Can you explain it to me?"
2007 RAPPIN' RANDY: "Sure. You know the videogame Space Invaders?"
77: "Yeah, I love that game! Awesome graphics."

07: "Um. Yeah. So now imagine if you could rewire the machine so you could move the space invaders to any place on the screen."
77: "Okay."
07: "And you laid the invaders in a specific order, so they could spell your name."
77: "Far Out! Everyone would know it's me!"

07: "Exactly. So back to the article. That's what this guy did. He 'rewired' the video game (shown in the picture) to drop a bunch of 'invaders' on the screen to spell his name. But instead of invaders he used gnomes (kinda like dwarves)."
77: "And he got in trouble for it."
07: "Right."
77: "Because you're not allowed to rewire video games & use them for advertising."
07: "Right."
77: "Yeah now I understand. Boy I don't know what ludes that guy was smoking, but I want some."

07: "What's a lude?"


197: Um, Steve, the point about subways is that extensive surveillance *makes a difference* to many things which might happen there, in real life or in a novel. The character's ability to use the subways for a quick, anonymous meeting or hand-off could be quite compromised (depending on whom they fear is watching them).

As for guns, an example I mentioned was cheaper and more common night vision sights. Which, just in case you've never seen night before, would make a huge difference in things like the ability to hit somebody in darkness.



Karen #202: Some aspects of computer technology make it possible to enforce very strictly certain laws that were put on the books when they were unenforceable. It's one thing to have fornication illegal on paper but never prosecuted unless the mayor catches you with his daughter, and quite another to have fornication illegal and surveillance sufficient to enforce that law. One creepy endpoint of that is what Vinge called "ubiquitous law enforcement," and the creepy part is that it's a state from which there may not be any recovery.


"Wait. Let me interrupt you here. What is 'erectile dysfunction?'"


"They cured impotence!"

"Basically, for most guys."

"Truly you live in a time of miracles."


I have a friend who is an archaeologist who works on native sites. On occasion she has to transport human remains - humans who were alive 300 years ago. Not really a routine part of her line of work. She says she tries to explain what she's doing - driving on a highway to a museum - to the box containing the bones. Makes her feel better about dealing with human remains.


What's a "chat session"?


In 1977 people already had Atari. I recall that a couple of years earlier, my Dad had lugged home from work a "portable" desktop computer at weekends; it took up most of the boot of his car. On its tiny screen he programmed it to play hangman and other text-only games. Anyway, by 1977 he was starting to work on TV-delivered infotext games (teletext, Ceefax, Oracle, that kind of thing). At the time they thought that the "free" bandwidth in the TV signal's vertical blanking interval was the key to multimedia.


Also, they'd had the *idea* of "flat TVs" since the 1950s, and the early Casio and Sinclair "pocket TVs" were in development by the late 70s. The Dick Tracy Watch (aka Sprint Power Vision) was about to become possible...



I was also thinking about the idea of SF "predicting" the future, the desirability of such a thing posited as a criterion of success for the work involved. The problem with looking back on this sort of thing is that we have both confirmation bias and survivor bias going on. Go back to 1977 (or '67, or '57) and there were *some* people who got it "right". But 99% of the SF published then got it spectacularly wrong.

After it began to get all the kudos in the late 1980s/early 1990s (post-Neuromancer), I searched for "True Names" and read it. Or re-read it... I realised I'd read it much earlier but forgotten it - it just didn't resonate with anything I saw then. Charlie also mentioned this earlier when he pointed out that so many technological paths were closed off, or not taken. In the 1980s, we could have gone to a teletext-based networking system, Minitel on steroids. In the 1990s, we could have gone to a huge, balkanised collection of "information superhighway" networks (Marvel, AOL, Apple's cheesy town thing of which I forget the name). But we didn't. So we tend to see some writers as prescient, or some works as singularly prophetic, when in fact a lot of what we're seeing is a chance collection of lucky hunches.

Mind you, I think if Heinlein's zippy sidewalk future in "The Roads Must Roll" had come true, that would be hella sweet. And we had to wait for the goofy boosterism of the Segway hypesters before saying "See, Heinlein predicted *Segways*!".


As for active intercept systems, well, the Merkavas are being refitted with them later this year, as I understand.

I googled for Merkava "active intercept" and the only link I got was straight back to your post. What *are* these things? I mean, I'm presuming some kind of *very* limber gimbal-mounted rapid-fire gun with a dinky phased-array radar. But you were talking about hitting antitank *shells* IIRC. Any links? I don't even like to think about "using the tank's main gun for the beehive rounds" - flechettes for point defense? WTF?


Somewhere in my files at work, I have a copy of a short story from the magazine OMNI about a gal playing a game that netted her money. Sorry I don't have better details at hand, but if someone was industrious....


JenP 211: She says she tries to explain what she's doing - driving on a highway to a museum - to the box containing the bones. Makes her feel better about dealing with human remains.

-- It may make her feel better, but I doubt it does the bones any good.

I understand the emotions most people feel about bodies, but I don't share them. As the mock-Persian poet said:

"Some to the earth do give their dead
And some to the cleansing fire
But since I won't be here to see
You can use my guts to string a lyre".


Adrian 216:


Short form: "The Trophy active protection system creates a hemispheric protected zone around the vehicle where incoming threats are intercepted and defeated. It has three elements providing Threat Detection and Tracking, Launching and Intercept functions. The Threat Detection and Warning subsystem consists of several sensors, including flat-panel radars, placed at strategic locations around the protected vehicle, to provide full hemispherical coverage. Once an incoming threat is detected, identified and verified, the Countermeasure Assembly is opened and the countermeasure device is positioned in the direction where it can effectively intercept the threat. Then, it is launched automatically into a ballistic trajectory to intercept the incoming threat at a relatively long distance."

It's being retrofitted to existing Mk 4 Merkavas and all new production, and to various APC's as well.

Some sections of the American military want it, but there's a Not Invented Here resistance.


@216, @219: it's touted as being good for rockets and ATGMs. (I'd have questioned their sanity if they claimed it also worked for APFSDS rounds ...)

Two consequences:

1) Infantry ain't going to be riding on top of those tanks.

2) Systems like Trophy will gradually become standard on MBTs and APCs, but they can be defeated by switching from low velocity guided shaped charge carriers to high velocity rounds, and I suspect they've just created the ideal market conditions for Steyr to start selling this and its larger cousins. Picture something like a larger version of the IWS on one of those tracked bomb disposal robots as the future of infantry anti-tank systems. (The separate carrier being so that the counter-battery fire from the friends of the tank it just nailed doesn't necessarily kill the gunner, who's in a fox-hole a hundred metres away from the gun.)


(I'd have questioned their sanity if they claimed it also worked for APFSDS rounds ...)

"The Trophy development roadmap considers an enhanced countermeasure unit to be available in the future, and protect against kinetic energy (KE) threats." I suppose "considers" is a bit weaker than "envisions".

I'm skeptical, oddly. "Specific details about the composition and mechanism of this explosive interceptor device are vague." Hmm. The whole slant of the thing looks to me like a technological solution to a behavioural problem (the need to mind one's own business, stay out of Lebanon etc.) Any urban environment anticipating Israeli (or US) attention is likely to go the tried-and-tested low-tech route of big-ass emplaced shaped charges. Good luck deflecting those with yer beams of fragments.

Systems like Trophy will gradually become standard on MBTs and APCs, but they can be defeated by switching from low velocity guided shaped charge carriers to high velocity rounds

High velocity rounds on their lonesome don't tend to go through modern armor AFAIK, you want tandem-charge warheads like the RPG29 for that. OTOH, the Russians have some experience with missiles that dodge in larger airframes - who's to say how small those could go? Harder to stay focussed on a tank than a carrier, admittedly.


It's 1977. We've all grown up with Star Trek. We expect that there will in the Near Future be instant transmission even across vast spatial differences, that computers will have all the answers instantly, and that video displays will be huge and full color and as clear as TV - and display things instantly, without breaking up unless the Enemies Are Jamming Transmissions.

(How to annoy a hardcore computer geek in 1987? Say "16 colors? Is this what you're so excited about? That doesn't look very impressive to me. They had much better computers on Star Trek." Worked like a *charm* to deflate people boasting about their OMGVGA!!!! back when.)

It's 1977. Most everyone's seen 2001: A Space Odyssey - it's so ubiquitous that it's been parodied on Sesame Street, for crying out loud. (Nowadays I have to explain to kids about the Big V From Space because it's been forgotten.) We expect sentience - possibly dangerous autonomy - from computers in the near future, even if now all we have are punch cards.

We expect computers to be able to handle everything, right up to driving the spaceships, in the near future. (My friends and I "invented" the concept of the handheld scanner that year, trying to think of sfnal ways that we could get out of doing homework, studying, or punishment lines, in school: it would be a pen that you would run across a line of text, and then it would write it out for you without you doing anything. We had no idea how such a thing would work, but vague ideas of however copy machines did it. We were in 3rd grade and watched Lost In Space and Gatchaman dubs religiously.)

It's 1977. Star Wars has become the surprise smash word-of-mouth hit of the year, runs are extended in theatres, nobody's talking about anything else. The high-tech game presently available, which two years earlier created lines before the demo model at Sears, is Pong, which is really disappointing compared to the color/lights/flashy/dimensional gameplay you can get at a good pinball arcade in 1977 - but we know what the future will bring, we just saw it on the big screen. It's going to be *amazing.*

3D realtime strategy games projected by the computer in the round - and autonomous sentient computers capable of playing them against organic opponents, and winning.

And remember, pretty much everyone in the country has either read or heard of Orwell by 1977, too. Just add "It's like Big Brother, only you can switch it off and you control what goes out," and presto, there's your network concept. All you'd have to say is "computers the size of R2's head," "computers that work like on Star Trek, not like the gawdawful tape stuff you have now" and
"games like the one in Star Wars only not holographs, we don't have them yet" and "computer games that respond like pinball, only with more bells and whistles and control," and you'd be able to make a mint selling time-travel tickets in 1977.

And that's just mainstream, mundane awareness. In the 1930s, Stanley Weinbaum conceived of rebuilt postapocalyptic New York being completely networked with big-brother levels of instant surveillance and telecommunication (The Black Flame.) So did A. Merritt with his hollow-earth advanced civilization, same time frame (The Moon Pool.)

Thus, old-time fans should have even less trouble accepting it, in 1977 - unless they actually *worked* in computers and had a real sense of how hard it was going to be to get from the relatively-new printed circuits to micro-miniaturization without meltdowns.


Oh, and we *all* played "Let's pretend we're police/cowboys/pirates/explorers/spacemen/soldiers/archeologists/shipwrecked on Atlantis" back in the day, so "you use the computer to let you play make-believe with your friends when they're at home and can't come over, on a fancy TV screen, something like the chess game in Star Wars where the aliens move around and beat each other up by themselves" would not have been a difficult concept to get across - or sell - to your average eight-year-old in 1977...

I mean, we also knew that realtime videophones were going to be standard household equipment by the 21st century, after all - that's what *all* the TV shows, books, and movies told us!


[[ A bit like that new-fangled Dungeons and Dragons game everyone's talking about, except using a computer instead of dice and rule books and lead figurines. (Yes, there are several million people doing this right now. This isn't rocket science.) ]]

What's the antecedent of 'this'? Do you mean several million people were playing D&D in 1977 -- sounds high.

We built our first Heathkit computers in 1979 iirc, but had been watching for an affordable kit for a while. Got into D&D in 1980 iirc, though friends had been into it for several years.

You lose me now at the gnomes' bodies spelling out something. I don't play WoW, so I don't know what screen view you'd see that in, or who would see it before the staff caught it.

As for explaining something to someone lower-tech, there was great thing online about explaining to Tarzan how to use paper mail.


Regarding the hoary old "computer translates 'out of sight, out of mind' as 'blind idiot'" apocrypha at 84:

I just tried asking Google Translate to convert several proverbs to Japanese, namely "out of sight, out of mind", "the spirit is willing, but the flesh is weak", and "a bird in the hand is worth two in the bush". It converted each one into flawless idiomatic Japanese, to the best of my ability to tell. (I briefly suspected it of entirely flubbing the first one, before I realized that it was almost too idiomatic for my own Japanese skills.)

Proverbs are easy. All it takes is a database.
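The "database" approach is easy to picture: a lookup table keyed on the normalized source phrase, with a miss handed off to general machine translation. A minimal sketch in Python - the table entries and function names here are purely illustrative, not taken from Google Translate or any real MT system, and the Japanese renderings are commonly cited idiomatic equivalents rather than authoritative translations:

```python
# Proverb "translation" by table lookup, as described above.
# A real system would hold many thousands of entries and fall back to
# general machine translation when no match is found.

# Illustrative entries only; renderings are common idiomatic equivalents.
PROVERB_TABLE = {
    "out of sight, out of mind": "去る者は日々に疎し",
    "the spirit is willing, but the flesh is weak": "心は熱すれども肉体は弱し",
    "a bird in the hand is worth two in the bush": "明日の百より今日の五十",
}

def normalize(phrase):
    """Lowercase, trim surrounding punctuation, and collapse whitespace
    so near-identical inputs hit the same table key."""
    return " ".join(phrase.lower().strip(" .!?\"'").split())

def translate_proverb(phrase):
    """Return the idiomatic equivalent if known, else None
    (the caller would then fall back to general MT)."""
    return PROVERB_TABLE.get(normalize(phrase))

print(translate_proverb("Out of sight, out of mind."))  # table hit
print(translate_proverb("time flies"))                  # miss: prints None
```

The point being that none of this requires the machine to "understand" the proverb at all - which is why an almost-too-idiomatic result is exactly what a lookup table would produce.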


has this thread attracted much spam from the gnome-bombers?

Now that's a John Brunner line.


My dad would have probably understood what you were talking about in 1977 if you were speaking in 2007 terms. Of course, that's kind of unfair considering my dad helped pioneer the computer age at Control Data. Plus, around that time, he had already been involved in building multi-processor computers and probably designed the first removable hard drive, among other so-called "recent" inventions.

Most people have no idea how old the technology they use today really is. (Thank marketing, honestly. Marketing's contribution to the legacy of the human civilization is the perception that "progress" is coming along faster than it really is. Yeah, futurism is almost entirely driven by marketing.)

Take USB for instance. Wikipedia says the technology was created in 1996. Sorry, guys, that's when the technology was merely packaged into a form called "USB." The technology was in use under a different name 20 years prior to 1996. But don't ask me what it was called. I try to retain my dad's technology history lessons. Unfortunately, I'm not that technically oriented. But maybe someday I'll write his biography--unauthorized, of course, because he would never approve of being placed under any sort of spotlight.

There's a ton of unwritten history that we'll never know because the people who made it aren't vocal enough to tell us. Not everyone likes to talk about their work. Most people who do truly amazing work keep quiet because being involved in amazing work is what they enjoy most. That's probably hard to understand in today's world where it seems like everyone has a blog.


This sort of reminds me of the thought that occurs to me anytime I watch a horror film. You know how we chuckle at the erstwhile horror films of the 50s? You know, The Blob and whatnot? Presumably, people of that era were actually frightened by them. Well, what would happen if you transported those people to the present day and showed them what passes for a horror flick now? Would their hearts simply explode?


In 1977, I had been playing wargames and RPGs for several years. My first computer was six years in the future, but I had fooled around on DECs and even Apples and read THE SPACE GAMER and CREATIVE COMPUTING and knew how things might shake out. Hell, I really looked forward to systems like Ben Bova described in "The Dueling Machine."

As for the rest of it.

In the early 1980s, I played a Massive Multiplayer Play by Mail game called Star Master.

It was a "4X" game: Explore, Expand, Exploit, Exterminate. Or something like that. You designed an alien species, got a homeworld to start off from, and a certain level of technology.

It was run entirely by hand, using minimum wage teenyboppers. You filled out forms telling them what your world(s) built, where ships would go, and what your scientists were up to.

I played the shabby thing obsessively for too long. When I was ready to bow out, I sought a buyer for my empire. This is when I discovered that the real power players were way more obsessive than I was. They started empires and played them for a while specifically so their main empire could conquer and loot them.

My lovingly set up empire was destined for that.

So. I don't think it would be too much of a stretch for the 1977 me to comprehend gold mining. The notion of dead gnomes spelling out advertisements would probably strike me as hilarious.

OTOH, I suspect I'm too much of a fogey now to come to grips with whatever is in store in 2037.


228: I was watching the DVD of "The Two Towers" the other day, and had a similar thought.

What would be the reaction of an audience of moviegoers circa 1950 to Jackson's trilogy? I suspect it would seriously phuq their minds. If you slipped "Fellowship" into the lineup of a Saturday morning kiddie matinee, the aisles would be awash in urine shortly after the balrog appeared . . . although the ones who weren't traumatized would probably be seriously jonesing for the sequel.


@217: "Somewhere in my files at work, I have a copy of a short story from the magazine OMNI about a gal playing a game that netted her money. Sorry I don't have better details at hand, but if someone was industrious...."

Not in OMNI (I was an avid OMNI reader until it went bad in the mid-late '80s, too), but in Dragon #97 (May 1985) there was a short story "Catacomb" by Henry Melton.

It seemed very slightly dated to me when I read it, as I was then using the PLATO network and playing a graphical MMORPG, Avatar, which was invented in the late '70s (though I didn't play until the early '80s), and required the same kind of teamwork to play as a modern MMORPG or tabletop RPG. I was also active in playing pencil-and-paper RPGs online on BBS's, using messages during the week to "blue-book" story action and then chat on the weekends to handle action scenes and direct interaction. But the premise of using a game to make money seemed entirely reasonable but non-obvious, and I was glad to be exposed to it.

Fast-forward to 2007, and I make a small second income making bespoke gadgets and scripts in Second Life, and could easily scale up to doing it full-time if I didn't mind ending up hating it.

The big development in computer tech for me isn't the actual capabilities, most of which I could do at least 20 years ago and in some cases over 30, but that the hardware is much smaller and more portable, and I can do many things all at once. I've got a couple dozen browser tabs, two terminals to different Unix systems, iChat (who says videophones aren't popular? I use video chat all the time!), an editor, email, listening to a tech news podcast, and I'm downloading files for later work. 20 years ago, my state-of-the-art computer had 2MB RAM, and I could have done one of those activities at a time (generally the terminal so I could get to a Unix box to do multiple tasks...)


Nice piece. "chat room" slipped in without explanation.


I am disappointed!

It took until yesterday (July 27th) for the first WOWGold spam to show up in this thread.


The earliest novel about a simulated virtual world that I can think of, off the top of my head, was Simulacron-3 by Daniel F. Galouye in 1964.

The earliest story about a simulated virtual world that you played video games in on a worldwide computer network? The movie Tron came out in 1982, two years before the novel Neuromancer.

Shockwave Rider in 1975 had the computer network and games, but no VR.

The Apple II came out in 1977, and had the TV screen attached. It was $1500 - not cheap enough to come in your breakfast cereal, but people were already predicting that: Robert Anton Wilson either said it or quoted Leary as saying it.

I wrote my first text-based virtual-dungeon style BBS in 1983. Bartle was probably already thinking about MUD in 1977, since it was running in 1978.

I think more people than just Brunner could have kept up with this in 1977, but I suspect most of them would at least know who Brunner was. :)


And I'm so late to the thread!

One thing that pretty much everyone here totally misses is demographics.

SM Stirling can brag all he wants about advanced tech in tanks and airplanes. It wouldn't matter. If it did, the Saudis wouldn't have the joke of a military they do. If it did, the Israelis would have won that little tête-à-tête in Lebanon recently.

Look, advanced tech and advanced societies are *hugely* talent-hungry on the back end. All those cool gizmos you can put on a tank? They require armies of PhD'ed engineers to maintain and debug. They require massive amounts of education of the end users (which also uses up a substantial proportion of the active force... think what happened to the Japanese and German air forces as experienced pilots died off). To do all that, you've gotta have a growing population from which to harvest talent. A growing population that is getting fed, sheltered and educated adequately.

It is not a coincidence that much of the basic infrastructure of US technology was developed by the late '70s, as this represented a peak supply of young, highly educated talent (all those smart women who could only be teachers, remember?). It's not a coincidence that Israel lost the last engagement with Hezbollah: both Patrick Lang and Gary Brecher indicated that a primary reason for defeat was undertrained people who weren't especially committed to Israel, or as Gary puts it...

"But we're talking demographics again, dude. Passage of time, plus difference in birthrate, means that by now the IDF has a thin, real thin, crust of Ashkenazi brains'n'brawn on top and a bunch of flabby mama's boys under them. See, those whitleather-tough survivors wasted their genes on the whole socialist kibbutz commune experiment, had a kid or two, or none. Their kids are old now. Meanwhile, Israel admitted every loser from Russia or Ukraine or Yemen who could claim a grandpa who liked carp or a grandma who carried the overprotective gene or whatever, anything that could make them look Jewish. Half of them were just lying to get out of their native Hellholes, and none of them were willing to die for Israel the way that kick-ass first generation was. Look at the news pictures up close, or just look at the pictures of that schmuck who got kidnapped in Gaza, Shalit, and you'll see what I mean: the weak and the freeloaders outbred the strong. Hell, that loser's name says it plain enough. What kind of soldier would anybody with the same name as that loudmouth ugly prick Gene Shalit be?"

Demographics are one of the primary drivers of technical and social expansion. It would be nice to see something like this outside of general apocalyptic literature...


Andrew @63: I think you could have told some people what happened without them falling over in shock. Have you read Ursula Le Guin's The Lathe of Heaven, published in 1971? Among other things, it contains an ill-defined war with Islamic nations, and the aftermath of runaway global warming caused by CO2. Oh, and no space travel to speak of. The book is set in 2006. Plot apart, this aspect of it is faintly spooky. I'm only glad it doesn't get more things right, or it would be _very_ spooky.

I'm glad that Charlie brought this up (128): I think someone in 1977 would be very surprised that most people using computers today aren't programmers in any way. I was born in 1979, and at least until the age of eight or so I was sure that I'd have to learn to write programs, like my grandpa (who wrote games for me and my sister), if I was going to be able to do much more than play Pacman on computers. My husband would agree (he was, in fact, already writing programs by the time he was seven). I never actually learnt to.

I'm not suggesting that programming was a universal skill in 1977, but that the non-computer-using multitude probably would then have thought of it as an indivisible part of using a computer.

A pity the discussions about voice versus keyboard seem to have died. For me, the main reason not to use voice is the speed! I can touch-type faster than I can speak, and then edit so the message conveys my meaning as accurately as possible. I couldn't do that with speech.

By the way, Charlie - more sort-of nostalgia: if your wife's grandfather and father worked for Ferranti, they probably knew my above-mentioned grandfather, who also did, and was in the same field. I'm about to go and sort through his papers, and am hoping to find some interesting stuff about the early days of computers.