More thoughts on worldbuilding in fiction.
The first thing to note is that there's more than one way to do it. Which is to say: worldbuilding in SF and fantasy is by definition a divergent process, because no two people are going to come up with the same visualization even if you give them the same goal ("you're going to write a space opera in which our hero, raised by poor but honest folks on a small farming planet, goes forth to discover his destiny ...") — give me that brief and I'll come up with something utterly different from George Lucas, I promise!
So here are some rules of thumb I use, tending towards an increasingly narrow focus. (Sorry if you were expecting me to address the broader uses of confabulation as a fictional tool; this is very much a set of practical guidelines rather than an examination of the theory behind the activity.)
1. Humans are interested in reading fiction about humans.
Constraint #1 on any work of fiction is that it needs to provide an environment in which recognizable human protagonists can exist. If they're not human (e.g. "Diaspora", by Greg Egan; "Saturn's Children", by me) you need to provide some sort of continuity with the human condition and give the reader reasons to feel concerned for them. Or you can go for the "they're not human, don't look human, and have no connection with us" approach, but what you get is either borderline-unreadable at best, or it suffers from human-mind-in-a-giant-land-snail-body syndrome (which risks demolishing the reader's willing suspension of disbelief).
So I'm going to focus on providing a human environment ...
2. In general, High Fantasy steals its dress from pre-modern history; Urban Fantasy buys off the shelf in TK Maxx; and Science Fiction goes for that bold futurist look.
Which is to say, if you're going to write a trilogy with a young soldier on the rise and a throne and an evil emperor, you can do a lot worse than plunder the decline and fall of the Roman Empire for your social background. Note, however, that you'll do a lot better if you read some social history texts rather than believing what you see in the movies.
It's quite common for future-oriented SF stories to loot historical backgrounds and settings. I think this is in general a huge mistake unless there's an explicit reason for our future society to have suppressed a raft of useful technologies and social trends. Suppression of the modern does happen (look at Iran, 1979-84, or the abandonment of firearms within the Tokugawa Shogunate) but it usually seems to follow some kind of massive social trauma (like 200 years of civil war, in the case of Japan) and requires draconian enforcement, because people don't like giving up the triple-ply quilted toilet paper and BitTorrent downloads.
3. Any sufficiently advanced technology is indistinguishable from magic, and vice versa.
The first chunk of Clarke's third law (aside from my three-word extension at the end) is familiar and hard to argue with. Show an iPhone (working, in its natural internetty ecosystem) to a scientist from the 1930s and they will boggle. The components inside the glass-and-steel shell are so small that the individual circuit tracks on the ICs are barely visible to an electron microscope of the day, and the rare earths that contaminate the chips are present in such low concentrations that the analytical equipment of the pre-war period would probably be unable to detect them. An iPhone 4 is, in fact, way ahead of the 1960s SFnal vision of a Star Trek Communicator. By extrapolation: the future will be full of stuff that works by means that we probably don't have the theory background to understand — seamless exotica. (For a good fictional depiction of this, I cite "Rainbows End" by Vernor Vinge.)
The extra clause I added is, I think, less often explored in fantasy: we generally treat our technologies as if they are reliable, predictable, cheap pieces of magic, so it behooves the author of a fantasy to ask — why wouldn't their protagonists treat readily available magic the way we treat technology? Either magic needs to be rare, unreliable, erratic, and a bit feeble ... or it's going to be everywhere in chains.
Now I'm going to focus on near-future SF ...
(Near-future SF is SF set in the future, but close enough to the present that the author risks a serious ribbing when the date rolls around and their more bizarre predictions haven't shown up.)
4. The future is going to be like the present, only with extra layers.
Let's face it, history doesn't get simpler with time (unless you live in very interesting times indeed, such as those of the First Emperor). Current events take place against a backdrop of old assumptions and grievances, and add their own framing context to tomorrow's events. A story set in Egypt in 2031 is going to inevitably reflect the echoes of the overthrow of Mubarak, and also of today's elections (and the military counter-revolution in train), not to mention tomorrow's subsequent events, which will take place within the frame created by the Arab Spring.
The same goes for technologies. New technologies almost never destroy old ones — unless they accomplish the same task in a manner that is clearly better. DVDs replaced VHS video cassettes once writable DVDs became commonplace because VHS tapes are bulky, more expensive to manufacture, and don't provide random access (youngsters: after you watched a movie you had to press the "rewind" button and wait for a couple of minutes as the tape rewound before you could start watching it again). But TV/video/DVDs didn't kill off cinema — the big screen in a theatre remains a different social experience. And cinema didn't kill off the ancient art of the theatrical performance. Musicals didn't kill off opera, either. All of these older entertainment methods exist today (albeit in reduced niches, because we have so many more alternatives) because they are non-interchangeable.
Only technologies with directly substitutable replacements that are clearly superior go to the wall (for example: pocket calculators killed the slide rule market due to improved precision, DVDs replaced video tape due to size and quality and random access, diesel and electric locomotives supplanted steam traction on the railways due to being more fuel-efficient and not requiring huge amounts of water resupply infrastructure).
5. People evaluate the new using the cognitive toolkit they acquired in the past.
We train our children. In particular, we (or our neighbours, or our schools) train our children early to aspire to a consensus vision of the Good Life. This is based on the assumption that what was achievable in the past is achievable and desirable in the future: grow up, do well at school, get a degree, get a job, get married, buy a house, have children, work, enjoy a long retirement and decline, hand the torch of heredity on to the next generation. This isn't necessarily a bad vision (it worked for the ancestors) but it's an example of a backward-looking one. Break it down into its component sub-stages and they reflect what was necessary to achieve a comfortable/successful life a generation ago.
If you drop a futurist stone in this reflecting pool, the ripples it produces will bounce off the surface of those historic aspirations. For example, if you posit a cheap and effective cure for the ageing process that gives everyone indefinite youth prolongation for the cost of an aspirin a day, it's unreasonable to expect most people to suddenly abandon the milestones by which they and their parents measure life progress. Most of them will continue to dance the degree/job/marriage/home ownership/work fandango because it takes a lot of effort to interrogate one's unquestioned axioms, and it's even harder to let go of them if they are found lacking: and people don't like externally imposed change. Again: the global uptake of the internet didn't, for the most part, change human behaviour. What it did achieve was to break down geographical barriers so that isolated people with outlying interests could interact for good or ill, and to act as an amplifier for some types of social activity. (It also turned out to be the equivalent of hydrofluoric acid for supply chains, but that's another matter.)
Yes, there are outliers. In every society, there are some people just waiting to throw all the old ways out in the trash and experiment with new and exciting ways of organizing their lives. But there's also a similar proportion of stick-in-the-mud reactionaries, who see history as being on their side. Both factions are right and wrong: the ratio changes depending on external circumstances.
TL;DR version of this axiom: people are people. You're welcome to write a near-future story populated with New Soviet Men or frictionless and perfectly spherical libertarians; don't expect it to be convincing.
6. The shiny bright City of Tomorrow is also full of slums and favelas.
We get stuff wrong. Property development magnates build gated communities for billionaires that open for business just as the real estate market crashes. The office buildings of a booming middle eastern emirate go up so fast that the municipal sewage system can't cope so skyscrapers end up being serviced by huge queues of sewage trucks. Country dwellers migrate to cities that can't expand fast enough to give them adequate housing, so they end up in favelas and shanty towns. And people keep driving ancient automobiles long after Ford or General Motors would like to have sold them a new one.
This isn't just a subset of rule 4: rule 4 is about the tendency of the present to embed the past. Rule 6 is about the tendency of the present to embed the past's mistakes.
Arbitrary business or design decisions made in the early stages of a boom or a new technology field get locked in as the field expands. Consider which side of the road vehicles drive on: in some countries (the UK, India, Japan, various African states) drivers use the left, while in others (the USA, most of Europe) they use the right. While it might seem sensible to standardize on one side, everywhere, in practice changing over is a really big deal. Again, other early mistakes have given us much grief: the alleged opposition of the NSA to transport-level encryption in TCP/IP during the 1980s bequeathed us an insecure internet, and the decision to use null-terminated strings in the C programming language opened the door to any number of buffer overrun attacks (see the sketch just below). On a large, culture-wide scale, the decision to criminalize and persecute the use of some intoxicating substances (opiates, cannabis, hallucinogens) while regulating and tolerating others despite their arguably deadlier side-effects (alcohol, tobacco) is locked in just as firmly.
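The C-strings point is worth making concrete. Here's a minimal sketch of the classic failure mode (the greet function and its buffer are invented for illustration, not taken from any real codebase): because a C string carries no length, only a terminating NUL byte, a copying routine has no way to know when the destination runs out of room.

```c
#include <stdio.h>
#include <string.h>

/* A C string is just a run of bytes ending in '\0'. It carries no
   length, so a copying routine cannot tell how much room the caller
   actually allocated. */
void greet(const char *name) {
    char buf[16];               /* fixed-size buffer on the stack */
    strcpy(buf, name);          /* copies until '\0', with no bounds check */
    printf("Hello, %s!\n", buf);
}

int main(void) {
    greet("world");             /* fits in 16 bytes: fine */
    greet("aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa");
    /* 33 bytes (32 'a's plus the terminating '\0') into a 16-byte
       buffer: undefined behaviour, and historically the raw material
       of stack-smashing exploits */
    return 0;
}
```

Bounds-checked alternatives (snprintf, length-carrying string types) have existed for decades; what keeps the foot-gun loaded is the mountain of deployed code built on top of the original decision, which is exactly the point of the next paragraph.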
Part of the problem is that we build rafts of infrastructure on top of existing design decisions. Which means that fixing a bad decision requires the abandonment of lots of stuff that depends on it. In the case of the war on drugs, the gigantic police and punishment industries would be hard-pressed to justify their existence without Prohibition. In the case of driving on the left, all the road signs would need to be replaced, markings at junctions revised, traffic flow around gyratory and roundabout systems reversed, and all the drivers would need to be re-educated. A study proving conclusively that driving on the other side led to a reduction in traffic accidents could come out tomorrow and it would still be very difficult to make such a far-reaching infrastructure change.
7. The near future is 90% just like today, 9% stuff that's on the drawing boards, and 1% unutterably strange and alien and unexpected.
Consider this: it takes time for new technologies and products to make their way from the design agency to the production line, and longer still for them to make an impact in the wider world. The gap is usually measured in single-digit years; if a new design stays in development hell for a decade, it's highly likely because turning it into a product isn't profitable enough to repay the development costs, or because it's incompatible with some pre-existing infrastructure. Even when there's a huge sales draw (consider high-definition flat-screen TVs), whatever it replaces (tube TVs) may linger for many years. So although it was possible to buy a plasma flat-screen TV in the late 1990s, it took over a decade for the big flat screens to become ubiquitous, and even today you probably own, or know someone who owns, a 20-year-old CRT. Again: in the year 2030 there will almost certainly be some cars on the road that were built in 2010, or earlier. So the near future mostly resembles the present, with added inclusions of stuff that's come out of the technology pages of the fishwrap.
And then again, sometimes there's stuff that simply wasn't predictable. If our horizon for near-future extrapolation and worldbuilding is 25 years, then looking back to 1986, some aspects of the world of 2011 are simply not obvious. Oil shortages, climate change, revivals of 1970s or 1980s fashions, genetic engineering — those would have struck a 1986 reader as being properly science-fictional and reasonable in that time frame. But other changes were less obvious. The internet in 1986 existed as a tool for connecting large corporations and universities, but computer communications were still largely modem and bulletin board based, and the web hadn't been invented. The idea that by 2011 around 50% of adults would meet their sexual partners via the internet would have been flat-out ridiculous, never mind the idea that the internet would eat the retail supply chain. And the proposal that by 2011 new automobiles would leave the factory gate with over a dozen computers and ten million lines of code aboard would indeed have seemed science-fictional, but not in a good way: "why on earth would anyone do that?" the reader of 1986 would complain. The colour revolutions of the early noughties, the War on Terror (triggered indirectly by the Soviet withdrawal from Afghanistan and the subsequent end of the West's use for Saddam's regime as a proxy tool in the cold war), not to mention the Arab Spring of 2011 ... all would seem alarming and arbitrary and anarchic to a reader conditioned by forty years of Cold War duopoly. Finally, the collapse of the Soviet Union took almost everyone in the west by surprise. As William Gibson remarked, if he'd tried to sell "Zero History" in 1985 (a novel set in 2010), the only part of that mainstream-ish novel that would really have caused his editor to question his sanity would be the collapse of the USSR.
Anyway, that's it for now. I'll try and continue this thread when I have something new to say, but that's enough for one morning ...